The problem is the cache and specifically the preloading of it. This gives AMP an unfair advantage of multiple seconds over anything else.
Still, even without the AMP cache, mobile sites were loading far too much JS, even after Google penalized them. AMP showing that sites could load as fast as native Apple News/Facebook Instant Articles has finally gotten publishers to strip down their sites. You might not like the way it played out, but the end result is that end users not only get AMP-cached fast loading, they also end up downloading far less data, because the sites themselves have been pared down.
It won't fix this. The only thing it will do is let browsers show the original link instead of the AMP link, fixing the UI. The problems described in the article will not go away.
> But before you have a general purpose spec that fixes something, you need a specific embodiment that does
AMP isn't that spec though. It does nothing special. And the only reason it's fast is because Google aggressively preloads it.
> but the end result is that end users not only get AMP-cached fast loading, they also end up downloading far less data,
Are they though? When for every search Google preloads tens of AMP pages just to make them "fast"?
No, it does more than that: it does away with the need to use iframes, which break scrolling, and it allows all sites to use preloading without violating privacy. See https://redfin.engineering/how-to-fix-googles-amp-without-sl...
"If other browsers accepted the Web Packaging standard, the web might look rather different in the future, since basically any site that links to a lot of external sites (Reddit? Twitter? Facebook?) could start linking to prerendered Web Packages, rather than the original site. Those sites would appear to just load faster. Web-Packaged pages could one day eliminate the Reddit “hug of death,” where Reddit’s overenthusiastic visitors overwhelm sites hosting original content.
Despite cries that Google is trying to subvert the open web, the result could be a more open web, a web open to copying, sharing, and archiving web sites."
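To make the quoted idea concrete, here's a toy sketch of the trust model Web Packaging is after: the origin signs its content once, any untrusted cache can redistribute the bytes verbatim, and the client verifies the signature before attributing the content to the origin. This is my own simplification, not the spec: real Signed HTTP Exchanges use certificates and public-key signatures over a CBOR structure, whereas this sketch uses a made-up shared key just to stay self-contained.

```python
# Toy sketch of the Web Packaging trust model: the origin signs content,
# any untrusted cache can redistribute it, and the client verifies the
# signature before attributing the bytes to the origin.
# NOTE: real Signed Exchanges use certificate-based public-key crypto;
# the shared HMAC key here is purely illustrative.
import hashlib
import hmac

ORIGIN_KEY = b"example.com-signing-key"  # hypothetical; stands in for a cert key

def package(url: str, body: bytes) -> dict:
    """The origin produces a 'package': the content plus a signature over it."""
    sig = hmac.new(ORIGIN_KEY, url.encode() + b"\0" + body, hashlib.sha256).hexdigest()
    return {"url": url, "body": body, "sig": sig}

def verify(pkg: dict) -> bool:
    """The client checks the signature, no matter which cache served the bytes."""
    expected = hmac.new(ORIGIN_KEY, pkg["url"].encode() + b"\0" + pkg["body"],
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, pkg["sig"])

# The origin signs once; Reddit/Google/any CDN can then serve `pkg` verbatim.
pkg = package("https://example.com/article", b"<html>...</html>")
assert verify(pkg)

# Tampering by the cache is detectable by the client.
tampered = dict(pkg, body=b"<html>injected ad</html>")
assert not verify(tampered)
```

That detectability is the whole point of the "more open web" claim: a cache can copy and serve your pages, but it can't silently alter them.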
>Are they though? When for every search Google preloads tens of AMP pages just to make them "fast"?
TheVerge.com non-AMP loads 3 MB of data across 289 HTTP requests and executes 1.5 MB of JS. Going to Google.com and searching for Verge stories produces a carousel of 10 Verge stories, and according to Chrome DevTools only 377 KB was loaded. That number seems suspiciously low, but I doubt prefetching AMP stories will exceed the shitty bloat of non-AMP pages.
WashingtonPost non-AMP homepage is 6 MB+
NYT non-AMP is 4 MB+
WSJ non-AMP is 5.7 MB
And by non-AMP, I mean the mobile web version. The desktop versions are even larger.
Can you see the problem?
It's prerendered (via a static site generator). In total, it loads 692 KB (I didn't do anything to optimize it; the images are quite large, etc.). It loads from a small server, and images are loaded from Twitter, meme.com, etc.
Here's an AMP page: https://www.google.se/amp/s/www.usmagazine.com/celebrity-new...
It loads a whopping 2.9 MB, and keeps loading as you scroll down. If you open it from Google's search, it opens instantly, because parts of it were already preloaded on the search page. And the page itself (including almost all images) is served by a ridiculously powerful, geographically distributed CDN.
1. How is that fair to people who actually build their pages and host them on their servers?
2. What is open about this web?
3. How will Web Packaging solve this issue if I can't afford to build a geographically-distributed CDN on par with Google's for my own cache?
It actually changes on every reload. The lowest number I've seen is 1.6 MB, but then, in a second or two, it starts loading additional stuff, going up to at least 2.2 MB.
So much for "small AMP pages". Actually, as I'm clicking around, rarely is a page below 1 MB, even for pages that are not that different from mine: only images and text.
For some reason you think that the solution to that is "let's do a standards-incompatible aggressively preloaded slimmed down page that will live on our ultra-fast CDN/cache servers".
Also, can you see why web packages don't solve the problem (hint to start you thinking: not everyone can run their pre-rendered pages off of Google's CDN. Even Google's own AMP isn't fast if it's not preloaded from Google's cache)?
How can it be standards incompatible if it works in existing standards compatible browsers?
> Also, can you see why web packages don't solve the problem (hint to start you thinking: not everyone can run their pre-rendered pages off of Google's CDN. Even Google's own AMP isn't fast if it's not preloaded from Google's cache)?
Did you read the Redfin article? The point isn't for you to run the CDN or do the prefetching. The point is: how do people find your site and articles? Either through Google/Bing/Baidu/etc., social networks (Twitter/Facebook), or aggregation sites (Reddit, Hacker News, etc.). The idea is for large aggregation sites with a lot of traffic to roll out preloading on CDNs. For example, Cloudflare already supports an AMP cache, and Reddit could roll out prefetching if desired.
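The aggregator side of that is simple in principle: the link list also emits prefetch hints pointing at cache-hosted copies, so the browser fetches them before a click without the destination site learning who merely saw the link. A minimal sketch of generating such markup (the cache URL scheme here is made up, and real browsers apply their own policies to `rel="prefetch"` hints):

```python
# Sketch of an aggregator page that emits prefetch hints pointing at
# cache-hosted copies of the linked articles. The browser can fetch the
# cached copy ahead of a click; the origin site never sees the prefetch,
# which is the privacy property the Redfin article leans on.
# CACHE_PREFIX and its URL scheme are hypothetical.
from urllib.parse import quote

CACHE_PREFIX = "https://cache.example-aggregator.com/pkg/"

def render_links(urls):
    """Return HTML: one prefetch hint per cached copy, then the visible links."""
    hints, anchors = [], []
    for url in urls:
        cached = CACHE_PREFIX + quote(url, safe="")  # percent-encode the target URL
        hints.append(f'<link rel="prefetch" href="{cached}">')
        anchors.append(f'<a href="{url}">{url}</a>')
    return "\n".join(hints + anchors)

print(render_links(["https://example.com/story"]))
```

The anchors still point at the original URLs; only the prefetch hints go through the cache, which is what Web Packaging would let the browser verify and then display under the original address.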
And you completely missed the point that getting publishers to adopt AMP gets them to slim down their sites even if you don't use the AMP cache or preloading. That's something everyone has been trying to get them to do for years, including Google, which has been penalizing slow sites for years (https://www.linkedin.com/pulse/20140827025406-126344576-goog...)
So hurray for you making a slimmed-down page, but you're not the target audience. The target is the huge number of other sites that have bloated the web for years and haven't responded to previous attempts to force them onto a diet.
You really have no idea how the web works, do you? Browsers make a best effort to display any page. Even if the HTML is totally, absolutely invalid, the browser will go out of its way to display at least something.
The mere fact that something is displayed by a browser doesn't make it standards-compliant.
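You can see that error tolerance even in an ordinary parser, never mind a browser's far more forgiving one. Python's stdlib `HTMLParser`, for example, walks straight through markup with unclosed, mismatched, and made-up tags instead of rejecting it:

```python
# Demonstration that HTML parsers tolerate invalid markup rather than
# reject it: Python's stdlib HTMLParser emits events for unclosed tags,
# stray closing tags, and unknown elements without raising an error.
# Browsers go much further, building a full DOM out of such soup.
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.events = []

    def handle_starttag(self, tag, attrs):
        self.events.append(("start", tag))

    def handle_endtag(self, tag):
        self.events.append(("end", tag))

    def handle_data(self, data):
        if data.strip():
            self.events.append(("text", data.strip()))

# Invalid HTML: unclosed <p>, an unknown element, a stray </div>.
broken = "<p>hello <custom-tag>world</div>"
parser = TagCollector()
parser.feed(broken)
print(parser.events)  # every piece is dutifully reported, nothing rejected
```

So "it renders in a browser" tells you nothing about whether the markup is valid, which is exactly the point about AMP.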
AMP is standards incompatible because:
- its HTML is not valid HTML 5 (just a few examples here: https://news.ycombinator.com/item?id=16467873)
- whatever extensions to HTML 5 they bring are not a part of any HTML standard, past or present. And it doesn't look like Google is interested in making them a part of any future standard.
> So hurray for you making a slimmed down page, but you're not the target audience, the huge number of other sites that have for years
That's not the point, is it? Google will still penalise my page even if it's way slimmer than a standard AMP page. And since I cannot afford to run a Google-scale CDN, it will perform worse than an AMP page.
So here's what we have in the end:
- Google (and Google alone) decides what AMP will look like. There are no discussions with the web community at large or the standards committees.
- Google (and Google alone) decides that only AMP pages end up in its own proprietary AMP cache. (Other "big aggregators" may/will also decide that only AMP pages can be in their proprietary caches)
- Even if a web developer follows all of Google's performance tips (https://developers.google.com/speed/docs/insights/rules) the page will still be penalised because it's not an AMP page (i.e.: not a page developed using whatever a big corp has decided, and running from a big corp's CDN/cache)
- Even Google's own page speed tools tell you that AMP is not fast, and yet everyone (even 100% optimised, slimmed-down pages) is penalised if the page isn't running from an overpowered private cache
A lot of mental gymnastics and total ignorance of how the web works goes into calling this an open, extensible web that will benefit everyone.