Earlier this week, Google launched AMP, or what they call 'Accelerated Mobile Pages'. As a concept, it's pretty nifty. It's their answer to Facebook Instant Articles and Apple News, but instead of publishing 'on Google', Google is trying to make publishers' sites faster without moving the content. It's a brilliant and much better idea, but it's also hugely problematic.
But I won't talk about how AMP works in this article. Instead, I suggest you head over to NiemanLab, where Joshua Benton has already written an absolutely brilliant article about it.
What I will talk about here is the concept of it. And I will start with a bit of nostalgia and go back to the 1990s, when I first started using the internet.
Back in the 1990s, when my local business school became the first place in my city to have 'the internet', I was one of the few who became really obsessed with it. I would stay at school throughout the long evenings, just browsing around. It was an amazing time.
But it was also mind-bogglingly slow. Loading a big page could take minutes, while you watched the images appear one line at a time. And as you sat there waiting and waiting and waiting, you started to wonder why it was so slow.
For instance, why is it reloading the menu bar every time I click on a link? The menu bar hasn't changed, so why does it have to load it again? It's the same with the sidebar and all the other elements, like the scripts that are the same for every page.
It seemed kind of silly. Why couldn't we find a way to ...I don't know ...only load the things that have changed, and maybe cache all the common elements?
And, in fact, web developers found a way to do just that, by inventing frames. Frames were brilliant, in theory. They took your site and divided it up into sections.
There was the 'parent' section, which contained all the code for the site as a whole. So all the scripts, all the functionality, and the structure of your site would be loaded with that.
Then there were the site element frames. Below, for instance, you see the website of the 'Seeley Swan Pathfinder'. You will notice how it is divided up into five sections (or frames). There are the logo frame, the sidebar/menu frame, the advertising frame and the footer frame. And right in the middle is the content frame.
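For those who never built one: a frameset page from that era looked roughly like this (the file names and layout here are a made-up sketch, not the Pathfinder's actual code). Each frame loads its own HTML document, and links in the menu use a `target` attribute so that only the content frame reloads.

```html
<!-- A sketch of a 1990s-style frameset page (all file names hypothetical) -->
<html>
  <frameset rows="80,*,60">
    <frame src="logo.html" name="logo" noresize>
    <frameset cols="150,*,120">
      <frame src="menu.html" name="menu">
      <!-- Menu links point here, e.g. <a href="story2.html" target="content">,
           so only this frame is replaced when you click -->
      <frame src="story1.html" name="content">
      <frame src="ads.html" name="ads">
    </frameset>
    <frame src="footer.html" name="footer">
  </frameset>
</html>
```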
This was rather brilliant, because now when you clicked on a link, instead of reloading everything on the page (including all the scripts), only the content frame would be changed.
This dramatically sped up the time it took to load the content. In fact, most sites would see a 10x speed increase, which was a really big deal when it took a couple of minutes to load a site.
Of course, it also came with a shitload of flaws and limitations. To name a few: frames severely limited how you could design a page, they dramatically reduced your ability to interact with a page dynamically, they forced you to keep things quite basic, and they only worked if people wanted to look at more than one page.
On top of this, there were quite a lot of usability problems. Frames prevented deep-linking, forcing you to only link to the front page (which was rather hopeless if you were a newspaper or a webshop and wanted to send someone a link). They prevented indexing by search engines, messed up third-party tools, were almost impossible to track accurately with analytics, and pretty much stopped the back button from working the way it was supposed to.
So, when we started getting faster and faster internet connections, we quickly got rid of these pesky frames altogether.
The concept of frames was brilliant, but it clearly wasn't the right way to solve the problem. The right way was to focus on faster internet connections, combined with more optimized and user friendly sites.
Why am I telling you this? Well, because it's kind of the same thing as what we see today.
When Facebook talks about Instant Articles, when Apple talks about Apple News, and when Google now talks about Accelerated Mobile Pages, it's like we are straight back in the world of frames ... or worse, it feels like the old days of WAP pages.
The problem they are trying to solve is the same as in 1998, when people were complaining about sites being too slow to load. And the solution is the same as well: reduce or limit loading to only the part that is the article itself.
Take Facebook Instant Articles. It's basically a frame that only loads the content from a newspaper, turning the rest of the navigation into a static element of the 'parent' app.
Google's Accelerated Mobile Pages are slightly different, in that they aren't frames loaded within another page. But the concept is just like frames: take away all the things that aren't specifically about the content, and voila, everything loads fast again.
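To make that concrete, here is a rough sketch of what an AMP page looks like (simplified from the AMP HTML spec; the URLs and content are placeholders). There is no custom JavaScript at all, only the AMP runtime plus the stripped-down article content:

```html
<!-- A simplified sketch of an AMP page: the AMP runtime, the article,
     and nothing else (details, like the required boilerplate CSS, omitted) -->
<!doctype html>
<html amp>
  <head>
    <meta charset="utf-8">
    <link rel="canonical" href="https://example.com/article.html">
    <meta name="viewport" content="width=device-width,minimum-scale=1">
    <!-- AMP also requires a standard inline 'amp-boilerplate' style here -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
  </head>
  <body>
    <h1>Article headline</h1>
    <p>The article text itself.</p>
    <!-- Images use the managed <amp-img> tag instead of a plain <img> -->
    <amp-img src="photo.jpg" width="600" height="400"></amp-img>
  </body>
</html>
```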
And just like in 1998, it's not really the right solution, which is made even worse in that publishers are giving up their 'owned relationships' and long term loyalty, in exchange for a short term focus on reach on someone else's platform.
But the question is, why do we even have this problem again? Didn't we solve it already? I mean, we have super-fast internet, even super-fast mobile internet. Our smartphones are at or very near desktop performance. Surely, speed couldn't possibly be a problem anymore? Right?
Well, sadly, publishers have completely failed to do their job when it comes to designing their sites. Instead of a super-fast internet experience, publishers have added so much junk to their pages that it is, quite frankly, ridiculous.
Here is an example of how many things TMZ loads into your browser.
What you see here is 803 files being sent to your browser, totalling almost 20 MB of data. It's insane!
"But," you say, "we need all these things. And even if we cleaned it up a bit, we still couldn't get anywhere close to the mobile performance that we see from Facebook Instant Articles or Google Accelerated Mobile Pages. They load instantly. We could never do that ourselves!"
One thing publishers don't seem to understand is how fast the internet already is. We don't have a speed problem on the internet. It's already plenty fast.
Let me illustrate this by showing you how fast this site really is.
Everything you see on this site was custom built by me, since I learned how to build websites back in the 1990s. And I have spent quite some time making sure that it loads exceptionally fast on any device.
What I have done is three things:
First, I have made sure that my site isn't loading anything it doesn't have to, and I have minimized the use of third-party services. The site still loads quite a lot of files: at a minimum around 70 or so for the free articles, and about 150 files for the much bigger Plus articles (depending on how much video content I have embedded in them).
Second, all the static content, like the images and the menu graphics, is loaded via Cloudflare's CDN, which serves these files from a cache close to you, rather than from my server in Chicago, USA.
Finally, I have built a system that pre-builds every static element of the page (in memory) before you even arrive. The sidebar, related articles, and the footer have already been loaded and rendered by the server before you even click on anything. This server-side caching makes the site at least 3-4 times faster. It makes a huge difference.
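The idea behind that third step can be sketched in a few lines. This is not my actual code, just a minimal illustration of server-side fragment caching (all names are hypothetical): static elements are rendered once and kept in memory, so each request only has to render the article body itself.

```python
# Minimal sketch of server-side fragment caching (illustrative only).
# Static page elements are rendered once and reused from memory, so a
# request only pays for rendering the article body.
fragment_cache = {}

def render_fragment(name, render_fn):
    """Return a cached fragment, rendering it only on first use."""
    if name not in fragment_cache:
        fragment_cache[name] = render_fn()
    return fragment_cache[name]

def render_page(article_html):
    # The sidebar and footer come from the in-memory cache after the
    # first request; only the article body changes per page.
    sidebar = render_fragment("sidebar", lambda: "<aside>Related articles</aside>")
    footer = render_fragment("footer", lambda: "<footer>About</footer>")
    return f"{sidebar}<main>{article_html}</main>{footer}"

print(render_page("<h1>Hello</h1>"))
```

A real implementation would also invalidate cached fragments when the underlying content changes, but the principle is the same: do the repetitive work once, before anyone clicks.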
The result of this is a load time of 0.8 seconds according to Pingdom, which I use to monitor my site.
But I'm not just going to give you the numbers. Let me show you how this works, for real. Earlier today, I reached for my iPad Air (the old one) and recorded this:
Notice how fast everything loads from the time that I click on something to when the new page shows up.
So, when Facebook, Google and all the others start to talk about 'instant' whatever, I simply do not have any need for it. My site is already super fast. So why would I move my content to a rented channel on someone else's platform?
"But wait a minute," you say. "I have tried clicking on a link to your site, and it didn't load in 0.8 seconds. It took several seconds."
Yeah... you are right about that. It's true that, when you visit this site from the Facebook app, it loads a lot slower than what I illustrated above. But let me show you why.
Below is another video of what happens when I click on a link from my Facebook page to this site.
Did you notice how slow it was?
It's slow because, when you click on a link from inside the Facebook app, Facebook first has to initialize a 'WebView' (basically a browser within the app), and only then does it start to load the link that you wanted to see.
And this is essentially what Facebook has solved with Instant Articles. The Instant Articles 'view' is preloaded before you even click on anything, which means that, when you click, you don't have to wait for it to initialize.
But if Facebook can do this with their Instant Articles 'view', why can't it also pre-load the WebView, so that I don't have to wait for the browser to start up?
It's not my site that is slow. It's how Facebook transitions from the app to the web. As Google explains it:
App developers face a difficult tradeoff when it comes to showing web content in their apps. Opening links in the browser is familiar for users and easy to implement, but results in a heavy-weight transition between the app and the web.
So, the real problem that we have on Facebook (and in other apps) is not the web, but the transition between them. Why don't we solve that instead of all this nonsense about 'instant articles'?
What if we could load the web as fast (or nearly as fast) as we could load content within an app?
Well, Google is actually working on that too. They call it 'Chrome Custom Tabs', and what it does is fix the transition cost between apps and the web.
Check this comparison of how fast a page can load in Chrome Custom Tabs (from within an app), by switching to the browser, and in a WebView.
Notice how slow WebView is (which is what all apps use today).
This is the real future. This is the trend that we need to work towards, not all these other trends of building weird separate versions of a site for Facebook Instant Articles, Google Accelerated Mobile Pages, Apple News, or whatever other social sites that suddenly want to get all our content on their platforms.
We actually have a way to make the web work the way it should work. We have the ability to achieve 'instant articles' without using Facebook or Google, and without handing our future to their platforms.
We can get the best of both worlds, the distribution of the social channels, combined with a link that brings people directly back to us.
Of course, to make this work, we need to do two things: we need to build super-fast websites ourselves, and we need apps to link to those sites directly, without the transition cost. We already have the ability to do both. We just have to implement it.
Let's focus on solving the right problem. Facebook Instant Articles and Google's Accelerated Mobile Pages are great concepts, and we can learn a lot from them, but they are also like the frames idea of the 1990s.
Founder, media analyst, author, and publisher.
"Thomas Baekdal is one of Scandinavia's most sought-after experts in the digitization of media companies. He has made himself known for his analysis of how digitization has changed the way we consume media."
Swedish business magazine, Resumé