A Report from the AMP Advisory Committee Meeting (shkspr.mobi)
97 points by velmu 4 days ago | 70 comments

I see this more and more recently: a webpage takes ~5 seconds to load if it's an AMP page and I have some sort of ad blocking (cookie restriction) in place. What's interesting is that the page content is already loaded, but my browser (Safari) is waiting for something before rendering it. So if I have Reader Mode pre-activated on the same page, I can read the content before the ~5 seconds are up and actually see the page!

That's because, to create a valid AMP page, you need to add an 8-second blank-page delay for users who disable Google's third-party JS.

You actually have to make it deliberately slow if Google's JS is disabled? That seems a little malicious...

Yup. This snippet adds a CSS animation that keeps the page white for 8s; it's cancelled by Google's JS or by the browser's noscript fallback (which doesn't apply if you only block Google's JS rather than disabling JavaScript for the whole domain):

https://amp.dev/fr/documentation/guides-and-tutorials/learn/...
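
For reference, the mandatory boilerplate looks roughly like this (expanded here for readability, vendor prefixes omitted; the validator expects the exact minified text from the docs):

  <!-- Hide the body behind an 8s animation; amp.js cancels it once it runs -->
  <style amp-boilerplate>
    body { animation: -amp-start 8s steps(1, end) 0s 1 normal both; }
    @keyframes -amp-start {
      from { visibility: hidden; }
      to   { visibility: visible; }
    }
  </style>
  <!-- Only applied if JavaScript is disabled entirely, not if amp.js alone is blocked -->
  <noscript>
    <style amp-boilerplate>
      body { animation: none; }
    </style>
  </noscript>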


Insane. Google has lost all credibility.

So, is there a way I can block this somehow and drop it at load time?

You can add a Stylus stylesheet enabled for the regex .[star]amp.[star] (replace [star] with the star symbol) that contains body{animation:none}, but I don't know if that's good for performance.

Edit: Developers can also add that css in their amp style, and it seems to be accepted by the amp validator. AMP devs, you know what to do! ;)


Go for this on all pages instead:

  html[\26a1] body, html[amp] body {
    animation: none;
  }
(I wrote \26a1 because HN doesn’t allow the literal U+26A1 character.)

That way it applies to all AMP pages, regardless of their URL, and doesn’t clobber any other pages (though I can’t actually imagine any genuine animation on a body element).


Works perfectly, thank you!

I took the liberty to put it on userstyles.org for one-click install: https://userstyles.org/styles/171953/disable-amp-blank-loadi...


Btw you can begin a line with two spaces for code formatting:

  .*amp.*

I assume the idea behind it is to prevent the “flash of unstyled content”.

I wonder who decided that an 8s delay is better than the unstyled flash.

Flashes are distracting. IMO, it's not the delay that's the problem, but the fact that it lasts eight seconds. The limit should be 500 milliseconds, at most.

There is no delay if the script loads properly.

Having the delay in that circumstance is much, much worse than a FOUC.

I agree

Consider a very slow network connection. Is effectively delaying the page loading for 8 seconds worth it?

No, but I didn't say it was a good idea

The original code is there to prevent FOUC; the 8-second delay is sort of a fallback for when the JS hasn't unhidden the body.

Well, that explains it. I constantly hear people say "AMP is about increasing page speed", but that hasn't been the case for me more than 50% of the time.

That's likely due to amp-font, which will wait a certain amount of time for custom fonts to load before continuing with a fallback browser font (see: https://amp.dev/documentation/components/amp-font)
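
Per the linked docs, usage is roughly along these lines (the class names and the 3000 ms timeout here are illustrative):

  <script async custom-element="amp-font"
          src="https://cdn.ampproject.org/v0/amp-font-0.1.js"></script>

  <!-- Wait up to 3s for "My Font"; on load or error, drop the loading class -->
  <amp-font layout="nodisplay"
            font-family="My Font"
            timeout="3000"
            on-error-remove-class="my-font-loading"
            on-load-remove-class="my-font-loading">
  </amp-font>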

8 seconds is not a reasonable delay, and AMP pages work perfectly fine if the JavaScript is blocked. Which is going to annoy users more -- a slight delay in a font's loading or an 8-second delay in the entire page's loading?

I suspect that Google is trying to get most users to not block their JavaScript by making their access to "the Web" unbearable if they do it. (For other types of user punishment, see the other article on that site: https://shkspr.mobi/blog/2016/11/removing-your-site-from-amp... )

I put "the Web" in quotes, because a "standard" that requires you to load JavaScript from Google's servers, lets other sites serve your content while spoofing the URL, attempts to fundamentally change the way navigation on the entire Web works (portals) without input from other browsers, and that most people have to be coerced to use by reducing their traffic if they don't implement it, can't seriously be considered a real standard.


Major appreciation to Terence Eden for putting his own time and expense into representing the interests of the Internet here. I'm extremely comforted that there are people out there that care enough to take our words here on the Internet and bring them forward in person, and I think if you look at the meeting notes, it's clear there was some real impact.

> When a user uses Chrome for Android to search Google, they get AMP results. When a user tries the same search in Firefox, they only get regular results.

Sounds like a win for mobile Firefox.


I am curious how Firefox does this. Is it just a setting to auto-redirect AMP to the real page?

I primarily stick with Safari since it all syncs with iCloud, but that would tempt me to use Firefox.


It's not Firefox doing it, it's Google. Its search results only serve AMP links to whitelisted user agents.

I wasn't aware of this. This sounds excellent, as it means I can avoid AMP by spoofing my user agent string.

Yes, but of course, you miss out on other browser-specific optimizations that way. Also, I believe this is only for Google search results.

You might consider a userscript instead, like https://github.com/bentasker/RemoveAMP.
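
Not that exact script, but the general idea is just to hop back to the canonical URL from the AMP document, something like (sketch, untested):

  // ==UserScript==
  // @name     Leave AMP pages
  // @match    *://*/*
  // @run-at   document-end
  // ==/UserScript==
  (function () {
    // AMP documents carry an "amp" (or U+26A1) attribute on <html>
    var html = document.documentElement;
    if (!html.hasAttribute('amp') && !html.hasAttribute('\u26a1')) return;
    // Follow rel=canonical back to the original article, if it differs
    var canonical = document.querySelector('link[rel="canonical"]');
    if (canonical && canonical.href && canonical.href !== location.href) {
      location.replace(canonical.href);
    }
  })();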


> you miss out on other browser-specific optimizations that way

That idea doesn't actually bother me at all.


Interesting, does Firefox just not have enough mobile marketshare for them to care?

Very possibly, AMP in search is (sadly) enabled in mobile Safari.

For sure, AMP was the straw which finally pushed me to install Firefox. My mobile experience is so much better now without AMP.

Fantastic, a big thanks for doing this. Google probably has the most to gain from AMP, but I suspect there is something more structural leading to poor management decisions.

"Well, AMP was an interesting experiment. Now it is time to shut it down and take the lessons learned back through a proper standards process."


> poor management decisions

You make the mistake of thinking it is a bad decision for Google to make AMP. It suits Google's interests, that's why it exists. It might be selfish, monopolistic, and harmful to the open web, but Google doesn't have to care.


This was in reference to many other things going on at Google.

It was in Google's interest if everyone went along with it. They haven't. Perhaps the strongest opponents now are the webmasters who added AMP support and then lost rankings, users, and ad revenue.

For all that Google has, they have zero control over the platform the majority of the wealthiest internet users in the United States use — iOS. If Apple doesn’t like something, Google has to care.


My only concern is that Google is big enough to force through changes in the standards too. Chrome+Search is so dominant it's able to steamroll objections pretty effectively through sheer momentum.

"Without user research support, there's no acceptable route to creating new AMP components."

I can't think of a way to say this without sounding really cynical... I don't think they care about "users" in the sense that "user research" would support; rather, they care that users click on whatever AMP is designed to do best, which I assume is ads or paid stuff at the top of search results. I seem to remember Google selling people on this by highlighting that it's fast and that more people click on it? I would think that's all that matters to them. I can't blame them; that's their job, to get more people to click on more things that Google makes money from.


This was a very well-rounded analysis and I agree with its points. Even though I like AMP in the general case (making the web less of a hog), its proprietary nature, lack of accessibility, and so on are real dings against it.

Thanks for being the voice of reason over there Terence, well done.

I only want one thing from AMP -- the ability to opt out so that I never get an AMP page.

I have to agree with this, but if Google's goal is to make the web faster, then the AMP mission could still be accomplished almost exactly as they are doing it today. Just publish a profiling tool (expanding Lighthouse, maybe) that reports the numbers back to Google; then you either get to use their cache and edge servers and get ranked higher, or not, based on whatever metric Google wants us to care about. Maybe I'm missing something, but that seems like a much more obvious approach to solving the problem without going through this weird extra web-components library, different URL schemes, and all that.

> use their cache and edge servers

Using the AMP cache requires pre-loading content in the end user's browser. That makes things really fast (there is zero request latency to "load" an AMP page), but it has security implications. If I stick `window.onload(send a bunch of user analytics to my personal server);` in a preloaded page, then anyone who does a Google search where my resource is a result unintentionally loads my page and has data leaked to me, a random nefarious internet person.

So given these two requirements:

1. 0 request latency

2. Prevent data leakage to third parties

Come up with a scheme that is different from AMP. Or you can argue that either of these requirements is wrong: that data leakage is okay, or, more likely, that some low but non-zero level of latency is alright. But my understanding is that the options are either use something AMP-like and get approximately zero latency, or use something non-AMP-like and have latency of 50-100ms minimum on good, wired, HTTP/2 connections for well-designed websites, and that becomes really bad, really fast, if you're on a 2G or 3G HTTP/1 connection in rural $wherever that's dropping some packets.


Perhaps give browsers the ability to load cached content across domains? So, Google would tell my browser to cache this resource, and then when another website requests it, it's already available in my cache.

I'm sure there are some security/privacy implications that would need to be worked through, but they don't seem insurmountable? It can't be worse than letting websites show fake URLs...

(Security-wise, the primary thing that comes to mind is you'd want to store and check a hash of any asset cached by a third party, to make sure they're actually the same file.)
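
Subresource integrity already works that way for individual assets: the page ships a hash, and the browser refuses a fetched (or CDN-served) copy that doesn't match. Something similar could presumably apply to a third-party cache (the digest below is a placeholder):

  <script src="https://cdn.example.com/lib.js"
          integrity="sha256-PLACEHOLDER_BASE64_DIGEST"
          crossorigin="anonymous"></script>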


> I'm sure there are some security/privacy implications that would need to be worked through, but they don't seem insurmountable? It can't be worse than letting websites show fake URLs...

Prefetching cross-domain means that now anyone who loads google.com has the potential to also ping mywebsite.com/trackingendpoint, which is a resource that you have to cache. So I get some information, likely less than if you actually execute the page, but still enough to be worrisome.

> It can't be worse than letting websites show fake URLs...

I really don't get this complaint. I can absolutely understand the unease people have about "fake URLs", but a lot of URLs already lie. How many are really Cloudflare or AWS behind the scenes? It's completely possible I'm missing something, but as far as I can tell, signed HTTP exchanges are more secure than a CDN, since they're signed.

I guess that potentially since I, a nefarious person, can re-host your signed content, there might be data leakage possible there, but given that AMP is mostly static, I think that's mostly mitigated. Could be wrong here, I'm not a security (or AMP) expert, but I don't get the fear.


> Prefetching cross-domain means that now anyone who loads google.com has the potential to also ping mywebsite.com/trackingendpoint.

Sorry, I'm not suggesting prefetching cross-domain. I know for a fact this works already, as I've done it.

I'm suggesting that one domain should be able to load assets cached by another domain. They can check the hash to ensure it's the correct, unmodified asset.


I had a longer reply to this but lost it.

There are some significant security concerns with this (e.g. I start claiming to cache the pornhub CSS file), and it's not at all clear how it works when you have other caches (if Bing refers me, how do I decide whether to use Bing's cached version of my asset or Google's). I guess you might mean that Google can cache something as my site, but I'm pretty sure that would also allow me to maliciously overwrite Google's JS in your cache.

And it still has the latency issues: to connect to your site I now have to establish an initial https connection and load the main html page. That's significantly more than 0 latency.


Shouldn't storing and checking a hash solve most of the security problems? That way, you know the asset hasn't changed at all. It doesn't matter if there are multiple caches, because unless you've managed to break SHA-256, they should all be the same file.

The actual HTML can also be cached, which should make it zero latency. However, I am realizing now that I'm not sure where the checksum gets stored...

(I'll readily admit this was a spur-of-the-moment idea that I haven't actually thought through intensively.)


And isn't that available today with `<link rel="preload">`?

I don't think one domain can preload content for another domain from their own server, but I may well be wrong. If I am, great, Google could do this today.
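
For what it's worth, the markup for a cross-origin preload is just this; whether a copy fetched that way is actually reusable first-party by the other origin is exactly the open question (the URL is made up):

  <link rel="preload" as="fetch"
        href="https://publisher.example/article.html"
        crossorigin="anonymous">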

You didn't solve requirement #2.

How does the third party know the asset was cached if I don't actually visit the third party's page?

The CDN (Google) knows all, but that's true with AMP too.


If that's what you're describing, you've described exactly how AMP works today (though AMP goes one step further and prerenders above the fold, making it instant instead of merely fast).

> if Google's goal is to make the web faster

It's not. It's just a means to an end: lock in publishers and lure consumers.


"Some people felt aggrieved that all the hard work they'd done to speed up their sites was for nothing."

This seems like a backwards way of saying that speed improves for most sites? Isn't that a goal?


I wrote the article, let me clarify.

Imagine you're a newspaper. You spend lots of money and time optimising your site. More people visit and your search ranking increases because Google takes page speed into account.

Your rival newspaper doesn't do any of this. But one day they switch to Google's proprietary language. They now get featured at the top of Google's results page.

All your work was for nothing and you have to spend more time and money on an AMP solution. It's no faster than your previous one, and no improvement in user experience. But you have to, otherwise you don't get the top spot.

There is nothing you can do to differentiate your site from a performance point of view.

So, yes, all sites are now fast. That might be good for the users. But it also means that you don't get the free market benefit of optimised sites attracting more users. Which, of course, lowers the quality of content for the user.


Thanks!

Yes, I can see how it would be unfortunate for them. In the long term, though, not competing on performance (because it has become good enough, for everyone) means that the competition switches to other things like quality journalism. (Or more cynically, clickbait headlines?)

It's not clear that competing over milliseconds like the high-frequency traders do is something we should want.


> It's no faster than your previous one

Impossible. You cannot beat preloaded+prerendered.


I read this as, there is no longer a compelling reason to speed up your site if AMP always wins by default.

How does it compare to Facebook Instant Articles?

What's your solution for instant-loading pages from link aggregation pages? If you have a better solution, the AMP advisory committee will have a reason to change. Otherwise, you're wasting your time and now ours.

Put your site behind a CDN like Cloudflare and cache it aggressively.

It's really not rocket science.
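
For example (values are illustrative), response headers along these lines let the CDN hold a copy for a day, browsers for five minutes, and allow stale copies to be served while the cache revalidates in the background:

  Cache-Control: public, max-age=300, s-maxage=86400, stale-while-revalidate=3600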


That doesn't make it instant. AMP pages load instantly because AMP prerenders just the part of the page above the fold for multiple links efficiently.

If you don't understand the problem AMP solves, you aren't going to be able to offer an alternative solution; and Google, Bing, Baidu, Cloudflare, and the other AMP users are going to ignore you as a crank.


But you have to give up so much for that "instant load". Personally, as a user, I couldn't care less if a page loads instantly or not. I do care that AMP pages tend to be terrible in multiple ways, though.

The solution, of course, is to make it possible for the user to choose whether or not they will get AMP pages. I think that it's telling that Google refuses to address this despite cries for this ability from day 1.


> Personally, as a user, I couldn't care less if a page loads instantly or not.

Google's, Bing's, Baidu's, Yahoo! Japan's, and others' metrics show otherwise. If not, they wouldn't use it. I know I prefer instantly loading pages, so I personally seek out results with the lightning logo.

> I do care that AMP pages tend to be terrible in multiple ways, though.

The people who say this tend to be people who use terrible browsers (i.e., all browsers on iOS) where the instant loading comes at the expense of triggering browser bugs that the user can't avoid by switching to a better browser. For people with non-buggy browsers, the experience is fantastic.

> The solution, of course, is to make it possible for the user to choose whether or not they will get AMP pages.

That already exists. Use the basic HTML search page. All the aforementioned search engines, including Google, provide one.


> The people who say this tend to be people who use terrible browsers (i.e., all browsers on iOS) where the instant loading comes at the expense of triggering browser bugs that the user can't avoid by switching to a better browser. For people with non-buggy browsers, the experience is fantastic.

Yes there are browser issues, but the issue is also the pages themselves. AMP pages are horrible, crippled crap. Take a look at this AMP page on a desktop browser. How is this not a step backwards? http://amp.washingtontimes.com/news/2019/may/16/doug-jones-a...


> Google's, Bing's, Baidu's, Yahoo! Japan's, and others' metrics show otherwise.

I didn't realize that they had metrics about what my personal preference was, let alone that their metrics demonstrate that I'm wrong about what I prefer.

> For people with non-buggy browsers, the experience is fantastic.

Not for me. I don't have a problem with bugs as you describe. I have a problem with the contents of the AMP pages (generally speaking -- I have seen one or two that weren't awful).

> That already exists.

That's insufficient. I do that, but I still manage to get AMP links in various ways. I admit, it's merely an annoyance as I can generally edit the URL to get the real page, but it's annoying nonetheless.


> I'm wrong about what I prefer.

You're certainly in the tiny minority.

> I have a problem with the contents of the AMP pages

And more people have problems with the contents of non-AMP pages. What problems do you have with AMP pages that can't be solved by the AMP publishers?

> I still manage to get AMP links in various ways.

You were specifically talking about Google search. If you get them from other sources and have some (so far, it seems) irrational hatred of them, you can probably find a way to solve that too.


> You're certainly in the tiny minority.

Probably so, but that's irrelevant to my comment.

> have some (so far, it seems) irrational hatred of them

Since I'm expressing my opinion rather than making a technical argument, I'll admit to the "irrational" part. AMP pages are less useful to me, bring unnecessary third parties into the exchange, and I dislike them.

I don't hate them in the sense that you imply, though. I don't care if others see AMP pages. I just don't want them myself. That's no more "hatred" than saying that I prefer not to eat a certain type of food.


Why does it have to be instant? We're talking about the difference of perhaps 100-200ms.

Because the metrics of these search engines show that instant is better for their users and because their non-web proprietary competitors are instant (Facebook Instant Articles and Apple News). Otherwise, they wouldn't bother with the extra infrastructure costs.


