
Have you even read the article? The only reason AMP is perceived as better than anything out there is that Google aggressively preloads it.

At best it's no better than anything else out there. Solutions to slow-loading pages are well known and have nothing to do with AMP or "other web frameworks". Even Google itself lays out the solutions: https://developers.google.com/speed/docs/insights/rules. Nowhere do those rules mention AMP. And if you had bothered to read the article, you'd see that Google's own tools consider Google's own AMP bad and non-performant.
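For illustration, here's the kind of page those rules describe, with no AMP involved. A minimal sketch; the file names (hero.jpg, app.js) are placeholders:

    <!doctype html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <title>Fast without AMP</title>
      <!-- Inline the small amount of CSS needed for the first paint -->
      <style>body{margin:0;font:16px/1.5 sans-serif}</style>
    </head>
    <body>
      <h1>Article title</h1>
      <!-- Explicit dimensions avoid layout jumps while the image loads -->
      <img src="hero.jpg" width="600" height="400" alt="Hero image">
      <!-- defer keeps the script from blocking the initial render -->
      <script src="app.js" defer></script>
    </body>
    </html>

Nothing here needs a special runtime or a special cache.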

> Less talk and politics, more shipping code.

That's exactly how and why we ended up with AMP.




The article is wrong: no amount of preloading will make the non-AMP Verge fast to load, because it runs too much JS that blocks the initial render.

You could provide a mobile framework and tools that help publishers create pages that render fast by putting them on rails. AMP is that framework, and other people could introduce similar tools. Chrome DevTools and Google's PageSpeed tools have long been available to audit your code for slowness, but curiously, no one seems to use them or care, which is why we ended up with millions of slow-ass mobile sites shoehorned with megabytes of JS.
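For those who haven't seen them, the "rails" look roughly like this. A sketch of a minimal AMP page; the mandatory amp-boilerplate CSS is elided and the URLs are placeholders:

    <!doctype html>
    <html ⚡>
    <head>
      <meta charset="utf-8">
      <meta name="viewport" content="width=device-width,minimum-scale=1">
      <link rel="canonical" href="https://example.com/article.html">
      <!-- The only <script> allowed: the AMP runtime, loaded async -->
      <script async src="https://cdn.ampproject.org/v0.js"></script>
      <!-- Mandatory amp-boilerplate <style> omitted here for brevity -->
    </head>
    <body>
      <!-- Author JS is banned; images must declare their dimensions -->
      <amp-img src="hero.jpg" width="600" height="400" layout="responsive"></amp-img>
    </body>
    </html>

The constraints (one async runtime script, no author JS, declared dimensions) are what make the pages render predictably.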


You're missing the entire point. It is fine to have a framework like AMP that imposes technical constraints leading to performance-friendly web pages.

The problem is the cache, and specifically the preloading from it. This gives AMP an unfair head start of multiple seconds over anything else.
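To be concrete, the head start works something like this on the results page. A sketch using standard resource hints; Google's actual implementation differs in its details, and the URLs are placeholders:

    <!-- Emitted on the search results page, before the user taps anything -->
    <link rel="preconnect" href="https://cdn.ampproject.org">
    <link rel="prefetch"
          href="https://www-example-com.cdn.ampproject.org/c/s/www.example.com/article.amp.html">

A page that starts loading seconds before the click will "win" against any page that only starts loading on the click, no matter how well the latter is built.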


That's why Redfin (https://redfin.engineering/how-to-fix-googles-amp-without-sl...) pointed out that the Web Packaging spec could fix this. But before you have a general purpose spec that fixes something, you need a specific embodiment that does. asm.js came before WASM. SPDY came before HTTP/2. Flash came before HTML5. I didn't like Flash, but would you have suggested that Adobe spend years working with browser vendors to bring the Web up to the needed capabilities without ever shipping Flash?

Still, even without the AMP cache, mobile sites were loading way too much JS, even after Google penalized them. AMP demonstrating that sites could load as fast as native Apple News/Facebook Instant has finally gotten publishers to strip down their sites. You might not like the way it played out, but the end result is that end users not only get AMP-cached fast loading, they also download far less data, because the sites themselves have been pared down.


> Web Packaging spec could fix this

It won't fix this. The only thing it will do is let browsers show the original link instead of the AMP link, fixing the UI. The problems described in the article will not go away.

> But before you have a general purpose spec that fixes something, you need a specific embodiment that does

AMP isn't that spec, though. It does nothing special. And the only reason it's fast is that Google aggressively preloads it.

> but the end result is that end users not only get AMP-cached fast loading, they also download far less data

Are they, though? When for every search Google preloads tens of AMP sites to make them "fast"?


> It won't fix this. The only thing it will do is let browsers show the original link instead of the AMP link, fixing the UI. The problems described in the article will not go away.

No, it does more than that: it does away with the need for iframes, which break scrolling, and it allows all sites to use preloading without violating privacy. See https://redfin.engineering/how-to-fix-googles-amp-without-sl...

"If other browsers accepted the Web Packaging standard, the web might look rather different in the future, since basically any site that links to a lot of external sites (Reddit? Twitter? Facebook?) could start linking to prerendered Web Packages, rather the original site. Those sites would appear to just load faster. Web-Packaged pages could one day eliminate the Reddit “hug of death,” where Reddit’s overenthusiastic visitors overwhelm sites hosting original content.

Despite cries that Google is trying to subvert the open web, the result could be a more open web, a web open to copying, sharing, and archiving web sites."
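As the spec envisions it, an aggregator could do the prefetching against any cache while the browser still attributes the page to the publisher. A sketch with hypothetical URLs; signed exchanges (.sxg files) are one proposed embodiment of Web Packaging:

    <!-- The aggregator links to a snapshot signed by the publisher,
         hosted on any cache... -->
    <a href="https://any-cdn.example.net/article.sxg">Original headline</a>
    <!-- ...and may prefetch it; on click the browser verifies the
         signature and shows https://publisher.example.com/article -->
    <link rel="prefetch" href="https://any-cdn.example.net/article.sxg">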

> Are they, though? When for every search Google preloads tens of AMP sites to make them "fast"?

TheVerge.com non-AMP loads 3 MB of data across 289 HTTP requests and executes 1.5 MB of JS. Going to Google.com and searching for Verge stories produces a carousel of 10 Verge stories, and according to Chrome DevTools, only 377 KB was loaded. That number seems suspiciously low, but I doubt prefetching AMP stories will exceed the shitty bloat of non-AMP pages.

WashingtonPost non-AMP homepage is 6 MB+

NYT non-AMP is 4 MB+

WSJ non-AMP is 5.7 MB

And by non-AMP, I mean the "mobile web version". The desktop versions are even larger.

Can you see the problem?
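If you want to reproduce those numbers, here's a rough way using the standard Resource Timing API. A sketch: cross-origin resources report a transferSize of 0 unless the server sends Timing-Allow-Origin, so this tends to undercount:

    // Paste into the DevTools console once the page has finished loading
    const resources = performance.getEntriesByType('resource');
    const bytes = resources.reduce((sum, r) => sum + (r.transferSize || 0), 0);
    console.log(resources.length + ' requests, ' +
                (bytes / 1024 / 1024).toFixed(1) + ' MB transferred');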


Here's another way for you to understand the problem.

Here's my page: https://dmitriid.com/blog/2016/10/javascript-tools/

It's prerendered (via a static site generator). In total, it loads 692 KB (I didn't do anything to optimize it; the images are quite large, etc.). It loads from a small server, and images are loaded from Twitter, meme.com, etc.

Here's an AMP page: https://www.google.se/amp/s/www.usmagazine.com/celebrity-new...

It loads a whopping 2.9 MB [1] and keeps loading as you scroll down. If you open it from Google's search, it opens instantly, because parts of it were already preloaded on the search page, and the page itself (including almost all images) is served by a ridiculously powerful, geographically distributed CDN.

So, questions/hints:

1. How is that fair to people who actually build their pages and host them on their servers?

2. What is open about this web?

3. How will Web Packaging solve this issue if I can't afford to build a geographically distributed CDN on par with Google's for my own cache?

---

[1] It actually changes on every reload. The lowest number I've seen is 1.6 MB, but then, in a second or two, it starts loading additional stuff, going up to at least 2.2 MB.

So much for "small APM pages". Actually, as I'm clicking around, rarely is a page below 1 MB. Even for pages that are not that different from mine: only images and text.


> TheVerge.com non-AMP > yada yada

For some reason you think that the solution to that is "let's do a standards-incompatible aggressively preloaded slimmed down page that will live on our ultra-fast CDN/cache servers".

Can you see the problem?

Also, can you see why web packages don't solve the problem (hint to start you thinking: not everyone can run their pre-rendered pages off of Google's CDN. Even Google's own AMP isn't fast if it's not preloaded from Google's cache)?


> "let's do a standards-incompatible aggressively preloaded slimmed down page that will live on our ultra-fast CDN/cache servers".

How can it be standards incompatible if it works in existing standards compatible browsers?

> Also, can you see why web packages don't solve the problem (hint to start you thinking: not everyone can run their pre-rendered pages off of Google's CDN. Even Google's own AMP isn't fast if it's not preloaded from Google's cache)?

Did you read the Redfin article? The point isn't for you to run the CDN or do the prefetching. The point is: how do people find your site and articles? Either through Google/Bing/Baidu/etc., social networks (Twitter/Facebook), or aggregation sites (Reddit, Hacker News, etc.). The idea is for large aggregation sites with a lot of traffic to roll out preloading on CDNs. For example, Cloudflare already supports an AMP cache, and Reddit could roll out prefetching if desired.

And you completely missed the point that getting publishers to adopt AMP gets them to slim down their sites even if you don't use the AMP cache or preloading. That's something everyone has been trying to get them to do for years, including Google, which has been penalizing slow sites for years (https://www.linkedin.com/pulse/20140827025406-126344576-goog...).

So hurray for you making a slimmed-down page, but you're not the target audience. The target is the huge number of other sites that have bloated the Web for years and haven't responded to previous attempts to force them to go on a diet.


> How can it be standards incompatible if it works in existing standards compatible browsers?

You really have no idea how the web works, do you? Browsers make a best effort to display any page. Even if the HTML is completely invalid, the browser will go out of its way to display at least something.

The mere fact that something is displayed by a browser doesn't make it standards-compliant.

AMP is standards incompatible because:

- its HTML is not valid HTML 5 (just a few examples here: https://news.ycombinator.com/item?id=16467873; see also the contrast below)

- whatever extensions to HTML 5 it brings are not part of any HTML standard, past or present, and it doesn't look like Google is interested in making them part of any future standard
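To make the first point concrete, a minimal contrast (not an exhaustive list):

    <!-- Valid HTML 5, but rejected by the AMP validator: -->
    <img src="photo.jpg" alt="A photo">

    <!-- Required by AMP, but not part of the HTML 5 spec, much like
         the ⚡ attribute AMP puts on <html>: -->
    <amp-img src="photo.jpg" width="600" height="400" alt="A photo"></amp-img>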

> So hurray for you making a slimmed-down page, but you're not the target audience

That's not the point, is it? Google will still penalise my page even if it's way slimmer than a standard AMP page. And since I cannot afford to run a Google-scale CDN, it will perform worse than an AMP page.

So here's what we have in the end:

- Google (and Google alone) decides what AMP will look like. There are no discussions with the web community at large or the standards committees.

- Google (and Google alone) decides that only AMP pages end up in its own proprietary AMP cache. (Other "big aggregators" may/will also decide that only AMP pages can be in their proprietary caches)

- Even if a web developer follows all of Google's performance tips (https://developers.google.com/speed/docs/insights/rules), the page will still be penalised because it's not an AMP page (i.e., not a page developed using whatever a big corp has decided and running from a big corp's CDN/cache)

- Even Google's own page speed tools tell you that AMP is not fast, and yet every page (even a 100% optimised, slimmed-down one) is penalised if it isn't running from an overpowered private cache

A lot of mental gymnastics and total ignorance of how the web works go into calling this an open, extensible web that will benefit everyone.


> The article is wrong

In what part is the article wrong?

> You could provide a mobile framework and tools that help publishers create pages that render fast by putting them on rails. AMP is that framework

You clearly didn't read the article. AMP is not fast. The moment you load it from anywhere other than Google's cache, it's no faster than any other web page with a similar amount of JavaScript and other code.

Even Google's own performance measuring tools say that AMP isn't fast.

> Chrome DevTools and Google's PageSpeed tools have long been available to audit your code for slowness, but curiously, no one seems to use them or care, which is why we ended up with millions of slow-ass mobile sites shoehorned with megabytes of JS.

That is really beside the point. You bemoan that "there's no mobile framework and tools to help publishers create pages that render fast"? Oh look, there are plenty of such frameworks, and there are tools like Google's own.

And those tools say one thing:

- AMP is not fast

- Google lies about the speed of AMP by aggressively preloading AMP pages from its own overpowered CDN/cache

- It's entirely possible to create fast pages with existing technologies without AMP. Google has extensive documentation on how to do that (and obviously it never mentions AMP). However, Google will penalise those pages even if they are faster than AMP.



