Hacker News
Ask HN: Is it just me or is the AMP project making everything slower?
378 points by gtf21 on Jan 12, 2019 | 182 comments
I have µMatrix in my browser, and by default it blocks the AMP CDN. The result of this seems to be horrendous white-screen times for web-pages, beyond what I would expect it to take for a web-server to load these pages, and the delay in loading is very noticeable across all of them.

I have a vague idea of the point of AMP, which is to speed up the delivery of web pages. However:

1) I'm not convinced the web pages were really _that_ slow to start with, so it feels like an unnecessary project (well, excepting the fact that web-page bloat has massively increased as people use more and more javascript libraries &c.).

2) Perhaps I'm being a bit old fashioned, but I don't really understand what was wrong with a browser serving a web page made out of some HTML, CSS, and Javascript. It feels like we're replacing one technology with another (and that latter one comes from Google, which makes me nervous). Do we really want AMP to be the way we serve the web?

3) I haven't properly investigated, but I'm assuming that this delay is some script which is waiting for the data to load from the AMP CDN, and once that times out it displays the underlying content (I did check, and loading this page [1] is almost instantaneous, yet the content only displays after three seconds). Any insight into why it's seen as acceptable for this to be so slow when it is unnecessary for displaying the page?

[1]: https://www.ehow.co.uk/how_5840711_test-electrical-outlet-digital-multimeter.html

I sadly have to code AMP for work. The reason you see a white page is that the boilerplate code required for any AMP page to be valid includes:

  webkit-animation:-amp-start 8s steps(1,end) 0s 1 normal both;
With visibility set to hidden for every element on the page. This is used to minimize page rendering artifacts for the end user, which is one of AMP's main goals: content should be fixed.

The problem here, however, is that this animation is usually cut short well before the 8s, as soon as the AMP javascript has been served from the CDN. Your adblocker (or something of the sort) is blocking it, which means you have to wait for the hard-coded 8s fallback to see the page, even if the content has already loaded.
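For reference, the required boilerplate looks roughly like this (quoted from memory of the AMP spec, with the `-moz-`/`-ms-` vendor-prefixed duplicates trimmed for brevity):

```html
<style amp-boilerplate>
  /* Hide the whole document; reveal it after 8s as a last-resort fallback. */
  body {
    -webkit-animation: -amp-start 8s steps(1, end) 0s 1 normal both;
    animation: -amp-start 8s steps(1, end) 0s 1 normal both;
  }
  @keyframes -amp-start {
    from { visibility: hidden; }
    to   { visibility: visible; }
  }
</style>
<noscript>
  <!-- With javascript disabled, cancel the animation and render immediately. -->
  <style amp-boilerplate>
    body { -webkit-animation: none; animation: none; }
  </style>
</noscript>
```

In the normal case the AMP runtime cancels this animation as soon as it loads, so the 8s only ever elapses when the runtime never arrives.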

AMP is beautiful.

I, too, sadly have to code on AMP bullshit for work.

To be a valid AMP page (at least, according to the Google overlords) you _must_ use the CDN'd AMP scripts. This is the very antithesis of what the world wide web is supposed to be about. IMHO, it's not at all unreasonable to block stuff from domains you don't trust, especially domains controlled by large surveillance corporations. The web should be more resilient. AMP makes it brittle.

> IMHO, it's not at all unreasonable to block stuff from domains you don't trust, especially domains controlled by large surveillance corporations. The web should be more resilient. AMP makes it brittle.

My thoughts exactly, this is what makes me nervous about AMP, it feels like it's pushing the web in a direction that is alien to its original ideas.

From my reading of https://www.ampproject.org/docs/fundamentals/spec, it looks like you could override this behaviour in uBlock Origin with a style cosmetic filter. Something like...

    ##html[amp] body,html[\26A1] body:style(animation:none!important)
(I can't test it in my current browser, unfortunately, but I think that's right.)


Update: I got to a browser I could test it on. Unfortunately, it turns out uBO doesn't let you apply :style() rules to all pages indiscriminately; you have to specify a domain name pattern. That's pretty annoying, because the principle is evidently sound; this does work on the link in OP:

    ehow.co.uk##html[amp] body,html[\26A1] body:style(animation:none!important)

Alternatively, you could use a CSS injection addon (or userContent.css, in Firefox) to add the rule to all pages:

    html[amp] body, html[\26A1] body {
        animation: none !important;
    }
Or do it with a userscript (i.e. via Greasemonkey or similar).
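A minimal sketch of the userscript approach (the metadata values are placeholders; `@run-at document-start` matters so the override is in place before the boilerplate animation starts):

```javascript
// ==UserScript==
// @name     Unhide AMP pages
// @match    *://*/*
// @run-at   document-start
// ==/UserScript==

// Same rule as the cosmetic filter above: cancel the AMP boilerplate's
// hide-everything animation on <body>. '\\26A1' is the CSS escape for the
// ⚡ attribute that marks AMP documents.
const css = 'html[amp] body, html[\\26A1] body { animation: none !important; }';

// Guard so the snippet can also be loaded outside a browser context.
if (typeof document !== 'undefined') {
  const style = document.createElement('style');
  style.textContent = css;
  document.documentElement.appendChild(style);
}
```

This only undoes the hiding; it doesn't block the AMP runtime itself.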

[edit 2]

It appears uBO is aware of a need for something like this - they provide an injectable "scriptlet" that adds the necessary CSS: https://github.com/gorhill/uBlock/wiki/Resources-Library#amp...

Unfortunately, scriptlet injection also can't be applied to every page indiscriminately. :/

    ! works on OP's link
    ! doesn't work at all
So I'm just mentioning it here for completeness' sake.

And if uBlock needs to be modified to make your Amp-adjacent experience better, it's probably a bug uBlock's team should be notified about.

Has it been figured out which adblock list accidentally blocked AMP? We'd like to contact them.

This is clearly a bug, since blocking AMP is not related to blocking ads (all the normal ad blocking rules should do the job just fine).

µMatrix blocks everything that's not from the primary domain unless I tell it otherwise, which is sometimes annoying, but on the other hand lets me cut out loads of crap (hello AMP-project).

Thanks for the enlightenment, this explains why I no longer visit The Independent's website.

I naively assumed, to begin with, that they'd pushed out something broken. Except it never got resolved, which surprised me considering the length of the delay. So I resolved it by pretending they no longer exist.

I'll likely never know if they ever do resolve it.

8 seconds? I thought it was 3.

That being said, this is most likely the culprit. If OP is blocking AMP CDN, the inlined CSS code will hide the content until the CSS animation completes after whatever the timeout is nowadays.

This CSS animation is just the backup in case the javascript doesn't load at all, really. After 8 seconds, the page gives up trying to prevent the flash of unstyled content and just renders, regardless of how bad the styling is. It also includes a <noscript> block that renders the page immediately if javascript is disabled. The 8s thing is for network issues. Graceful degradation.

Lots of documents (amp or not) use the same "hide the screen until layout is done" trick to avoid multiple relayouts as the initial javascript is running and CSS is being fetched. More often than ideal, they don't have a fallback if the javascript doesn't load at all. AMP mandates it with this CSS animation, which is far better than nothing.

Also when served from the AMP Cache (https://www-ehow-co-uk.cdn.ampproject.org/c/s/www.ehow.co.uk... for the example shown), the layout algorithm that javascript runs is applied by the AMP Cache instead and the 8-second timeout code is removed completely (view source and take a look). This doesn't work on all pages - there are some features that require a client-side context, but it does work on this one. Websites can run the same server-side layout algorithm on their origin using a node library (https://www.npmjs.com/package/amp-toolbox-optimizer). There is also work being done to improve all of this (server side layout on more documents, and making the system easier to run on your own site).

The 3s observation from the original post is interesting. It may just have been an estimate of the 8s, or it could have something to do with how the document is configured. Looking at this document, there are some <amp-font> tags that the document author has added with a 3000ms timeout. These tags instruct the AMP javascript to change the document's CSS class depending on the success or failure of loading a particular font. By default (not AMP-specific), if a document loads a webfont for some text, the browser will not display that text until the font is loaded. <amp-font> provides a CSS hook by which the author can do things like "hide the text for up to 3s, or until the font has loaded, whichever comes first". This page has some <amp-font> tags with a 3s timeout referencing fonts that have not been added to the document, which seems like a mistake by the document author, though I'm unsure. I was not able to reproduce the 3s experience, so this may be incorrect speculation as to what happened.
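For context, an <amp-font> tag looks something like this (attribute names as in the amp-font component docs; the font name and class names here are made up):

```html
<!-- Give "My Custom Font" up to 3s to load; on load or error, swap CSS
     classes so the page's stylesheet can stop hiding the affected text. -->
<amp-font layout="nodisplay"
    font-family="My Custom Font"
    timeout="3000"
    on-load-remove-class="custom-font-loading"
    on-load-add-class="custom-font-loaded"
    on-error-remove-class="custom-font-loading"
    on-error-add-class="custom-font-missing">
</amp-font>
```

The page's own CSS decides what those classes actually do, e.g. hiding text while `custom-font-loading` is present.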

> The 8s thing is for network issues. Graceful degradation.

A mandatory 8s lag for careful users doesn't sound graceful to me :-/

For HNers who weren't building for the web back in 2009: we used to have another term as well, "progressive enhancement", which meant more or less "after we get a baseline working on all supported browsers, we can add the nifty stuff that doesn't work in IE".

Graceful degradation is often handled by changing the timing of expected results. Timing is often a useful unconstrained "fudge factor" for such things. And since disabling non-standardized intermediaries mucking about in your browser's W3C-compliant behavior counts as a solution, I doubt AMP will change to address this failure mode.

> Lots of documents (amp or not) use the same ... trick

Perhaps it's luck, but outside of AMP I have never encountered a site so hostile about it, with nearly a 10s wait. I've occasionally encountered a second or so's delay, but only two or three sites, including a large one, have the massively excessive AMP delay.

So to call it the same trick seems like a stretch from a user annoyance point of view.

But why should they improve the experience of a user who has ad-block enabled and as such isn't creating any value for the website owner?

It annoyed me as well before, though as I stopped going there I realized this is for the best... for both my sanity and the website's bottom line.

Sometimes it's that simple, often it's not. There's still plenty of sites available without such hassles so it's not yet a problem finding alternatives.

On the other hand an ad blocking user is not necessarily producing no value for the site. I've had subs to newspaper sites, all of which I viewed with blocking on, and Independent will never now be one of them. If Guardian did likewise, and added AMP, I'd cancel my current sub and look elsewhere, which would be a definite loss of value for them.

I'm not so wedded to the views of any one outlet that I'd subscribe with this in place, or turn off blocking to subscribe, or feel I must subscribe to the same one indefinitely.

If it's a site I wouldn't have considered a sub or donation to, you're right, nothing is lost.

This sort of assumes that (a) the only possible system is surveillance capitalism (I have no problem unblocking advertisers on sites which don't track me e.g. DuckDuckGo, I also have no problem paying for things directly -- in fact, I prefer it); and (b) that this is the sole reason for using something like µMatrix which blocks _everything_, not just ads.


That claim is pretty shallow.

The content is there, it's just hidden through CSS. If anything, a screen reader has access to the content earlier than unimpaired users do.

Accessibility is about much more than screen readers. If a user experiences problems accessing your site, that’s an accessibility problem, no matter what physical attributes that person possesses.

It seems you've been misinformed by the current generation of web designers. Accessibility is actually limited to people with disabilities, though a few people have started to use it interchangeably with usability. [0]

I agree with the spirit of your message, but that's not what the grandparent post claimed.

> Accessibility is the design of products, devices, services, or environments for people with disabilities.[1] The concept of accessible design and practice of accessible development ensures both "direct access" (i.e. unassisted) and "indirect access" meaning compatibility with a person's assistive technology[2] (for example, computer screen readers).

[0] https://en.wikipedia.org/wiki/Accessibility

There is more to accessibility than screen reader compatibility. (I want to emphasize this point, not because I think you said anything to contradict it, but because I think it is a point lacking from the conversation in this thread.)

For example, I'm autistic and use a "normal" browser, but garish websites (for example those displaying animated ads) are less accessible to me, because they lead to me becoming overstimulated, making me less likely to absorb and remember information presented on the page, as well as being physically and mentally exhausting.

I am perfectly willing to pay for content (and am, through Spotify, Netflix and Patreon), but much of the web is actively hostile to many disabled people. My physical disabilities don't prevent me from using my computer in the standard way but every time I misplace my mouse and try to navigate the web solely using keyboard (which isn't that far from how I usually use my computer, so it's not like I don't know the shortcuts) I am reminded how it must be for people who are unable to use a mouse and have to rely on other input methods. Many websites couldn't have worse UI when it comes to accessibility if they tried.

There's a wide spectrum of accessibility between "content as created by the designer" and screen readers. People with a vision impairment who do not require a screen reader can use the browser's built-in magnifying/zooming capability to make a site usable for them, but only if the content is visible.

> <amp-font> provides a CSS hook by which the author can do stuff like "hide the text for up to 3s, or when the font has loaded, whichever comes first"

Modern browsers let the page author control this with the "font-display" CSS property [1].

[1] https://developer.mozilla.org/en-US/docs/Web/CSS/@font-face/...
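For example, `font-display: swap` tells the browser to render the text in a fallback font immediately and swap the webfont in whenever it arrives (the font name and URL here are placeholders):

```css
@font-face {
  font-family: "MyFont";
  src: url("/fonts/myfont.woff2") format("woff2");
  /* Render fallback text immediately; swap in the webfont once loaded. */
  font-display: swap;
}
```

Other values like `block` and `fallback` trade off flash-of-unstyled-text against flash-of-invisible-text, which is essentially the knob <amp-font> reimplements in javascript.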

> code required for any AMP page to be valid includes: [sleep 8 to avoid rendering artifacts]

That sounds awfully convenient for whoever cooked this up.

8 seconds! I remember browsing the Web in the early 2000s on a good connection and pages loaded fully in a matter of milliseconds. It was often imperceptible. Oh how we've regressed.

>I remember browsing the Web in the early 2000s on a good connection and pages loaded fully in a matter of milliseconds.

That has to be rose-tinted glasses. Networks used to be excruciatingly slow. I would walk away waiting for page loads during the dial-up era. Our expectations have simply been adjusted over time.

If black-tea had a good connection at a time when the rest of us were on dial-up, and all the sites were coded expecting dial-up, they may well have seen excellent load times!

(But yeah, I remember the internet being super slow. JPEGs coming down scanline-by-scanline.)

I don't remember it being milliseconds, but it definitely didn't take _this_ long, especially not if I had had the same connection speed I have now.

>I'm not convinced the web pages were really _that_ slow to start with, so it feels like an unnecessary project (well, excepting the fact that web-page bloat has massively increased as people use more and more javascript libraries &c.).

Right. It's super easy to make lightning fast web pages. In fact, that's the default state of a static HTML page. It actually takes a lot _more_ work to make pages slow by adding tons of javascript, dependencies, server-side applications, etc.

From the article linked elsewhere in this thread:

> The 90th percentile weight for the canonical version is 5,229kb. [...] The 90th percentile request count for the canonical version is 647

There's the problem. Over 5 megabytes and 647 requests for one article? 5MB is equivalent to about 20 full-length novels, or 1,000 pages of text. And 647 requests?! That must be one extremely comprehensive article, right? No: it's 1 request for a brief article and 646 extra requests wasting bandwidth on unnecessary stuff. Then people wonder why it's slow. But someone put in a lot of work to fluff that page up to be that huge, that slow, and that request-heavy.

> The 90th percentile weight for the canonical version is 5,229kb.

I get antsy when I make a page over 100KB (excluding images, and even then anything over 1MB starts to feel excessive unless it's something particularly reliant on images). 5MB feels very high.

I'm with you, but it's the reality. I just searched for "NASA" on google news, clicked the first thing not from JPL (https://www.fool.com/investing/2019/01/12/whos-who-in-nasas-...) and it loaded 2.6MB cold cache.

Fetched the same on the AMP Cache: https://www-fool-com.cdn.ampproject.org/c/s/www.fool.com/amp... 200KB cold cache. Removing the likely cached resources (amp javascript, fontawesome cdn font) and the payload is ~35KB.

It's certainly possible to make a lightweight page, as hacker news is a great example, but people aren't doing it. For whatever reason, AMP seems to be effective at making the web actually follow best practices.

No it's not. All the AMP websites I've seen were just as bad as their non-AMP counterparts. Reddit's AMP pages, in particular, are slow and immensely frustrating to use.

> AMP seems to be effective at making the web actually follow best practices.

Then those best practices aren't very good. In my experience, AMP is, on the whole, a detrimental thing that degrades web pages.

All I want is some way of telling sites to never give me an AMP page.

This is why I have µMatrix -- it's amazing just how many unnecessary requests it has blocked.

I am constantly amazed at just how bloated (and carelessly built) websites have become.

Sure, but from an end-user perspective, preloading is a perfectly valid way to achieve good performance, so it shouldn't be left out of the conversation. The fact that AMP can be preloaded without leaking your sensitive information to the owners of domains that you have not yet navigated to is a primary component of the design, and often gets ignored by armchair analysts.

The problem is that preloading is brittle. I see AMP pages lagging on a daily basis, which is much rarer for sites that follow best practices: that 100KB of render-blocking JavaScript makes AMP incredibly brittle.

The web isn't what it was back in the 90s and early 2000s though. Back then, you wanted a website to last because there wasn't much on the web to begin with. Those of us who were online owned a reasonable amount of web bookmarks that led us to nice solid pages that were designed to load nice and fast over slower connections.

Nowadays, the landscape is way different. There are just so many "normal folk" on the internet now that content is being consumed at an alarming rate. There is so much stuff on the internet now, that even Youtube videos have become disposable. Most of the stuff that people read and watch now is consumed once and then never visited again because there's just not enough time to revisit the insane amount of content we're exposed to.

Nowadays, why does it matter if a website is made "brittle" if the content isn't going to matter in a few months anyways? And if you do want to archive something for later, shouldn't the words on the page matter more than the code behind it? After all, if a user 12 years from now wants to read your article, all they're going to want is your words and pictures. Code is always brittle because new technology makes everything obsolete.

Having been a web developer in that era, performance was definitely a big concern. People were more willing to wait but there were still limits and you had the same tendencies for developers to work on fast systems and forget the experience on slow ones.

AMP is also a worse experience than that was because in the 90s you were usually waiting on images to render and progressive display was usually possible so you could start seeing that fuzzy JPEG fairly quickly and read the rest of an article, whereas AMP by design prevents anything from displaying until it’s loaded and executed correctly so you often have to reload the page to see anything at all when it fails.

This matters because most of where AMP was marketed to are competitive fields and that means it’s training users that they’ll get what they want faster and more reliably somewhere else.

I remember browsing the web on my old Nokia N900 phone and watching everything get progressively slower as javascript started getting more and more memory intensive. Eventually I stopped receiving updates, and the entire web became unusable because lots and lots of websites break without javascript now.

> Nowadays, why does it matter if a website is made "brittle" if the content isn't going to matter in a few months anyways?

One reason is brand perception. If your website is significantly slower and/or brokener than a competitor's then eventually people will stop coming back. Presumably you want your brand to last more than a few months.

I too love single points of failure, and look forward to this Google-Internet.

Can it be preloaded without Google having any pertinent information logged?

No matter what, the site you are navigating from will inevitably know the link was shown to you and whether you clicked it. If that site isn't Google, they can run their own AMP cache or use Cloudflare's.

Some of us still have internet quota, sadly.

My takeaways from that article:

When comparing AMP pages to their associated canonical pages, the most striking difference is that AMP pages are significantly lighter (905 KB vs. 2,762 KB) and load significantly fewer assets (61 vs. 318 requests). (All numbers are median values from testing 50 pages.)

Many people have slow phones and expensive internet, and for them this difference enables them to browse news sites, literally.

What bothers me about this argument is that there are plenty of other ways to make a news site fast...like just using straight static html content and minimizing JS usage.

See http://text.npr.org or http://lite.cnn.io for example

Do you really think that websites can “just minimize JS usage” without consequences?

For example, AMP provides an ad component that is probably by far the most performant way to display ads on the web. Without AMP, the site has to use alternative ad solutions, which probably perform worse and load more JS.

The only way for the site to reduce JS would be to find an ad solution that is similar to AMP in terms of performance, and I fear such a solution just doesn’t exist.

They could have fixed this in a heartbeat by simply having pagerank penalize pages for adding external JS assets. No AMP needed; better experience all around, and with less effort.

That just leads to every page on the web hosting its own copy of jQuery. The whole purpose of external JS is to reduce load times by sharing cacheable libraries (admittedly with the side benefit of less bandwidth for the host).

And if you think a whitelist would work, all it takes is one look at how rapidly new JS libraries come and go to render that unfeasible.

You would use content names rather than location-based identifiers in that case.

It would be really nice if we started using URIs / URNs more instead of URLs for everything, and did a better job of separating "what" from "where"

By ads you mean tracking.

Simple ads can be served statically without JS.

Have you tried to do this? At any substantial volume? If you serve ads completely statically you're going to have massive fraud issues. Distinguishing real views or clicks from fake ones is an arms race, and it's one you have no hope of winning without JS.

(Disclosure: I work on ads at Google, but I don't know much about how our spam detection works and I couldn't talk about it if I did.)

The most frustrating part is that AMP versions of apps/pages/websites are very incomplete; e.g. the AMP version of a reddit post has only 3 comments (and the "load more comments" button is buggy).

Would have been much better to use Lighthouse / PageSpeed as a basis for showing optimized pages.

Ugh, Reddit. Their AMP version is terrible for the exact reason you mention. I know every time I see a Reddit page in Google results that I'll make two page loads instead of one (this does make a difference in my country, where everything is slow). Then, when accessing their pages in a mobile browser, there's a huge banner telling me to download their app. This banner contains a dark UX pattern: the giant call-to-action button is to download the app, and the tiny footer text is "continue to mobile site". And apparently it doesn't matter that I clicked through the banner before; it always displays the same annoying banner.

Sorry for the rant. I'm just genuinely disappointed in what the web has become.

My favorite is that they've recently started animating their "Get the app" button in the header. That's both immediately on page load, _and_ on an interval as you're reading, meaning it's impossible to read for more than a minute or so without an animation distracting you.

Is it just me or has the mobile web taken a nosedive in recent years? Imgur won't even let you upload images from mobile anymore without their app, reddit is getting progressively more annoying, "Get the app" buttons and modals and banners are getting progressively more prominent and dark-pattern-y, and the default for all web pages now seems to be to include <meta name="viewport" content="width=device-width, initial-scale=1"> regardless of whether the developers actually tested that their page works on small screens, meaning pages which would've just fallen back to desktop rendering in the past now simply don't work at all.

The mobile web has always been utter trash, it's just gotten worse in the last five years as more sites have rolled out mobile versions.

Disabling Javascript helps a lot, but the "request desktop site" toggle is still frequently necessary.

Imgur is, however, deserving of special mention for actively degrading their mobile experience - direct image links redirect to the mobile site now, which loads low-resolution images.

Imgur gets another point for loading images with (third-party?) JavaScript, so that it sometimes hangs on the fancy animated loading screen and I never see the picture unless I reload.

It’s unfortunate that so much work goes into UX design but at the same time the web has been going downhill for years.

Imgur's mobile site used to be really good. Now it's awful and I refuse to use it, much less be badgered into installing their damn app.

You can siphon off much more user data from an app than from a webpage; besides, you can pester the user with notifications! Users love that.

I bet there are plugins for Firefox mobile to remove those animations and the nag window. Basically, "continue to mobile site" should just close the banner, since you are already on that site.

And then, after the big modal footer, and the animated button, when you tap on a link you sometimes get another different footer appear asking whether you want to view the link in the app or continue to view it in a browser.

all reader mode everything

Yeah, no; my experience is that reader modes usually remove blocks of code or images or paragraphs at random. I would rather have a bad reading experience than one where I always worry in the back of my mind that some crucial part of the content is missing.

A few too many times, I have read an article, finding it a bit vague and honestly not that good, only to find that all the code blocks or some of the images were gone.

this happens to me maybe 1 or 2% of the time

But how do you know which 1 or 2% of the time it's happening?

If you don't notice that the article is missing some paragraphs, then the article has had a bad editor.

I mostly read random people's programming-related blog posts. Knowing whether the article is just kind of bad at explaining what it's trying to say, or whether important parts of it are missing, is really not obvious.

If paragraphs of text go missing, maybe you notice it instantly, but if it's images or code blocks which aren't explicitly mentioned in the body text and are just there to contextualize or demonstrate what the author is talking about, how will you know that something is missing? More importantly, is it even possible to develop a heuristic which catches almost all cases where parts are missing, without lots of false positives for cases which are just inexperienced authors writing bad articles?

blogs aren't usually amp, i've never seen reader mode screw up wordpress or anything. i'll concede it's probably not ideal for code blocks though.

>Is it just me or has the mobile web taken a nosedive in recent years?

It's because "desktop" web is difficult for newbies.

People like you and I love the "desktop" web because we grew up using desktop computers. We don't mind firing up our web browser, typing in a URL and poking around for tiny menu items using CSS that's optimized for big HD desktop monitors. That's what we're used to and that's what we think is normal.

Nowadays, everybody, even your grandma, is on the internet. Everybody's using an iPhone with a tiny screen and large fonts. Opening up a web browser to browse a website is just something people no longer do.

Grandma would much rather have an app on her home screen that connects her to the world rather than a bookmark in Google Chrome.

Not everybody is a nerd like you and me. The internet has exploded in ways we never imagined, and now we're going to have to deal with everything being optimized for the average user.

I don't know; now and then I seem to hear complaints from relatively non-nerdy people about websites bugging them about apps, or not working, or forcing them to use their computer because the mobile page lacks some necessary feature. Besides, even if everyone is using, say, Reddit, from a dedicated reddit app, links they click will still open in a web view. A link to a medium.com article will still bug you about getting the app; you're there from a dedicated app, but not from _their_ app. Similarly, if an image happens to open in an imgur.com web view instead of a dedicated image viewer, there will be "Get the app" buttons hovering over the image. Again, you're visiting their content from a dedicated app, but not _their_ app.

If mobile web pages did just enough to make it obvious to users that there are apps they can download, I'd believe you that it's just corporations being altruistic and wanting the best user experience. They don't, though. They intentionally break their mobile pages by removing important functionality; they have modals which reappear on every load and try to trick you by making the big orange "Continue" button take you to the app store, while a tiny, easy-to-mis-click link takes you to the page you're actually trying to visit; they put "get the app" buttons _over_ the content, sometimes with no close button; they try to distract you from the content you're trying to read by making their "Get the app" buttons animate around.

This is not just corporations being altruistic and keeping their users' well-being at heart. This is corporations wanting to optimize for user interaction and data collection, and the best way to do that is to make their users use a dedicated app for just their content, where the user will always be reminded that they should check Imgur whenever they unlock their phones, they can send their users a notification about what's trending on /r/AskReddit if they detect their users haven't used their app for a while, they can make sure their users won't leave for a competitor as easily.

reddit user:

> 2) Can you stop the "Download the app" popups showing up so frequently on mobile?

spez (Steve Huffman):

> 2) They've been gone a while, but we are chasing down an issue with incognito users seeing it more often. Please let me know if that's the case, or if you are having a different experience.

Also talking about dark patterns: https://old.reddit.com/r/announcements/comments/9ld746/you_h...

Just went there and there's a pulsing 'use app' button on the top banner and an 'open in the official app' non-scrolling pop-in taking up the bottom 1/5 or so of the screen. So they're, uh, not gone at all.

Imgur is full of coercive patterns too. On mobile it has a permanent "get app" button hovering over the content and you can't see user profiles or other social interaction stuff. It says you need to install the app to use them, even though they work fine on the desktop website.

Why in the world would anybody browse reddit on anything other than the unofficial apps? Their website is awful too.

Unofficial apps make browsing 100x faster and less annoying.

I only visit reddit when friends link off to it - I rather not be giving reddit any traffic at all if I could help it, so downloading an app just doesn't make sense for me.

This issue is compounded by the fact that you need a search engine to find old reddit posts and AMP-free DDG doesn’t have the last year and custom date filters Google does.

The multiple "Open in the Reddit App" banners that load for every AMP result, and do not remember that you've closed them in the past. Awful.

Not only that but tapping the top bar on iOS devices doesn't scroll the page to the top.

Is bringing up reddit fair? Their new design is awful and only getting worse, AMP or not.

AMP is an anti-open-web disaster, and just another example of Google abusing their dominant position.

The open question is would it be better or worse without AMP? It's likely that reddit (for example) would have just deployed an equally frustrating "mobile" page that loads even slower.

I just tried fetching http://www.reddit.com/ in an incognito window (no personalization) on a fast connection and it took 1.2s to get the first byte on the connection. I can't say if this is typical, but the AMP Cache has much faster delivery (~30ms) even before considering the preloading.

> The open question is would it be better or worse without AMP?

https://i.reddit.com loads instantly, no JS required for 'performance'.

I think there are definite drawbacks to the AMP project. For starters, it seems like the main goal is for Google to own more of the time people are interacting with the web, rather than anything else.

A mark in its favor though is local news sites. They are desperate to be at the top of the search page, so when they join AMP and you look at their AMP pages, all of a sudden the loads of dark patterns they usually employ are absent.

Maybe Google could just heavily discount in SEO the popovers, slow sites, huge blobs of JS and generally stuff that we all find slow and irritating (e.g. jumpy images). It used to be the case that Google wanted to solve the problems of their users, but now Google wants to solve the problems of Google.

I can imagine a search engine that heavily penalized popovers and other user hostile patterns to have the effect of giving great content since the authors of such sites respect their users more.

As Google results have gotten progressively worse for common search terms (try searching anything health related), this sort of filter could bring back the web we see to where it used to be.

I'd love to see a nonGoogle search engine like this.

Google could get the same effect by more aggressively factoring page load time & size into page ranking. They don’t because that doesn’t give them control over the ad ecosystem.

They did that for years already, I believe. It didn't work.

I'm not sure if that's true. Even now they claim that page speed is only a small factor in overall page rank, and even _that_ applies to "only the slowest sites." https://webmasters.googleblog.com/2018/01/using-page-speed-i...

They chose not to make it significant, however, so we have no way of saying that it wouldn’t have worked if it’d been as aggressive as AMP’s targets.

I'd wager it's because the most relevant results for a search don't become less relevant just because they load more slowly. I certainly don't want that as a user.

Website performance could be a minor tie-breaker between two similarly ranked websites that have the same information (and it probably is), but I don't see how it could be anything more than a weak signal in a search system where the objective is to find the most relevant results.

I'll wait 30 seconds if it means I'm loading the result I'm looking for. That a less relevant website loads in one millisecond is frankly immaterial to me.

I'm trying to update my site to amp right now. I wrote the thing in django but it doesn't look like django is compatible with amp.

I'm switching over to php because it's the simplest language that lets me set CORS headers without requiring middleware packages. It seems like AMP can't even do something as simple as a contact form without CORS headers. I'm sure an experienced programmer could solve this problem easily, but it makes things harder for amateurs like me.

It also doesn't seem to work with bootstrap so I've had to switch the whole thing over to basscss.

It seems like choosing AMP limits the other technology you can use.

> I wrote the thing in django but it doesn't look like django is compatible with amp.

This doesn't make sense at all. Even if I hadn't previously built a site on django which served AMP pages, I would be willing to bet that no pure backend web framework limits what your frontend can achieve. Also, things are always harder for amateurs, just keep working hard man.

I'm using Django templating language to serve up pages with a base template and also allow server side stuff to show up in the html pages.

Google isn't "owning more of the time"; all ad revenue, analytics, and branding accrues to the original site owner.

I finally deployed code a few days ago to rewrite AMP urls on HN. It's tricky to get right. Fortunately Brian Eno was not playing.


Shouldn't HN be grabbing the canonical URL from submissions anyway?

This is fantastic. Thank you, dang.

> I have µMatrix in my browser, and by default it blocks the AMP CDN.

Maybe an unpopular opinion, but if you're getting a slow load/blank page because you're blocking parts of the content, yeah, "it's just you". Without the fallback mechanism that gets triggered after a few seconds, you'd just be stuck at the white page.

A good site will load faster than AMP, on a decent connection it may even be comparable to preloaded AMP. However, news sites have been pretty universally horrible at making good web sites. If someone stood behind their web developers and management with a whip, forcing them to build decent web sites and disallowing them from adding bloat, AMP would likely be unnecessary.

Part of the benefit of AMP is the possibility to preload, which allows near-instant page opening from the search results page (unless you're blocking content, of course); the other part of the benefit is being that whip.

My overall experience with AMP has been positive.

> If someone stood behind their web developers and management with a whip, forcing them to build decent web sites and disallowing them from adding bloat, AMP would likely be unnecessary.

sadly it's the management, marketers, sales who are standing behind them and holding the whip. Most places measure engineering quality by how many features you can push through your CI/CD pipe (minus the fault-reports, breakage that the pipeline generates).

When everything works perfectly it isn't appreciated as it indicates that engineering isn't running at full capacity. The same problem we have with security and all resilience topics: how do you measure & budget (or in security monetize) something that is supposed to be invisible to the end-user?

As for AMP it's not just a tire-fire it's a clear attempt to break the open web and create lock-in for competitors.

What’s weird to me, though, is that because I browse the internet on my phone a lot, I notice some sites are terribly slow (and prone to those weird “you’ve won an iPad” hacks).

The Huffington Post is one. On an older phone or tablet, the site is unusable on iOS. It will crash, and sometimes freeze. Long articles fill in excruciatingly slowly.

I would think in a data driven business, that there has to be significant numbers of people who actively avoid the site, and that they would want to minimize this.

Yet there seems to be no improvement over time.

I’ve sometimes wondered if it’s because literally every person in engineering and management at a company like the Huffington Post has a fast device, and they don’t notice how bad it is.

I find it paradoxical.

They're likely unrelated issues, but I am not surprised that those outside Google's happy path experience severe performance issues. Gmail has been horrendously slow ever since they launched their terrible redesign, with initial page load taking 10+ seconds and actions like opening up emails taking several seconds. I can't help but think this is because I don't use Chrome.

No, it's slow in chrome, too.

Multi user Hangouts is also horribly broken in chrome (first user can't see or hear anyone).

So... you're blocking the AMP javascript from loading and then complaining that AMP is slow for you.

Have you tried not blocking the script?

Downloading a website sans scripts takes strictly less time than downloading and executing scripts. A website that hides its content for 8 seconds because scripts are disabled is a defective website which deliberately stops users from looking at static content (AMP uses a CSS animation).

The intent of the feature seems to be to hide content until it has been laid out properly, which is generally a positive change to the experience because you can't read stuff while it is jumping around. In that sense, the website is "deliberately" stopping you from seeing static content in the same sense that browsers deliberately don't flash the raw HTML code onscreen before they have finished performing layout operations, but it seems unreasonable to imply it is malicious or defective behavior.

The CSS animation is a fallback to that intended behavior.

It is actually both. If it were just a fallback, the timeout would be at most 5s, more realistically 3s.

The thing is, if you are on an internet connection where loading times of 3s for the _basic_ content are frequent, then you are used to seeing partially loaded pages, and you probably love it when the connection happens to load the content you care about first, potentially leaving the page before it's completely loaded once you notice it's not what you are looking for.

The choice of 8s looks _a lot_ like they were looking for the longest time they could choose without it looking obviously deliberate, but which is long enough to make many "casual" users stop using the script blocker.

I.e. 10+s would look way too obvious as an intentionally long choice. 9s still looks big, as we are used to seeing 9 as nearly 10 from product prices to some degree (e.g. 9.99€). So the highest number which doesn't look obviously high is 8. ¯\_(ツ)_/¯

Eight seconds may seem exceptional on broadband internet, but it is not elsewhere in the world. I recently worked with a website in Brazil that could take over a minute to load over 3G.

It only looks malicious because you're looking for it to be malicious.

Great, then serve AMP pages to people in Brazil. I don’t see why I can’t permanently opt out of AMP considering I’ll always be browsing on either wifi connected to gbit fiber or a high-speed 4G connection.

Better yet, make AMP opt-in.

It’s naive to think AMP has anything to do with improving the UX, it makes it worse in every way.

Like parent said, if a page takes 8s to load, I sure as heck don't mind seeing some partial results so I can start reading in those 8s.

With my crummy internet connection, it's not a better experience.

I really only care about the text on most websites, but I end up waiting for a lot of assets before I can see that. I understand why google wants to avoid having the page redraw the layout as assets arrive, that can also be annoying if done with a lot of relative formatting.

But for me, on average, the experience is worse than non-AMP sites.

Ha. That's like removing a bunch of methods from a program and claiming it's broken when it crashes. What did you expect to happen when blocking half of a site's libraries?

News sites are hypertext, also known as formatted text, and not web-apps or programs which download and render websites. I expect news sites to load properly in the absence of the AMP CDN (not "half the methods"), not deliberately hide content from users. (I feel the same way about pages that block rendering when web-fonts fail to download because their HTTP connections hang.)

EDIT: I just moved to desktop. If I block all scripts in uMatrix, OP's URL loads immediately (albeit with a full page of garbage above page content). If I block external scripts in uMatrix, I have to wait 8 seconds before anything appears.

>News sites are hypertext, also known as formatted text, and not web-apps or programs which download and render websites.

No, there are many thousands of websites that render their content dynamically on the client - news or otherwise. Facebook, Twitter, and Reddit all make use of client-side rendering (and all provide news services).

>not deliberately hide content from users

Your browser also "deliberately hides content". It waits a few hundred milliseconds before triggering a paint, just to keep content from jumping around. AMP pages do the same -- unless you break the mechanism, of course.

> Facebook, Twitter, and Reddit all make use of client-side rendering (and all provide news services)

Oh, come on. You and I both know that those are social media apps which act as news aggregators, and are not themselves sources of textual news articles. jimbo1qaz was clearly talking about the latter.

What is the meaningful distinction between news websites and non-news websites? Client-side rendering is just a technique which isn't specific to any industry.

Completely different.

This is like a car that tries to phone home every time you start it, so either you keep the phone service active and get a salesperson talking to you while driving, or you disable the phone and the car leaves the immobiliser active for a minute each time you try to start it.

Are you upset that the website is using a CDN? That is the standard distribution method for common libraries.

As for the other half of your analogy - stop poking holes in the gas tank and it won't need active service.

Using a CDN isn't the problem on its own; the problem is the (unfortunately common) shoddy programming practice of assuming errors never happen. Good programming almost always involves checking for and properly handling errors. Using the value returned from fopen(3) without checking whether it was NULL probably results in the program crashing, when displaying a simple "file not found" error message would have been more appropriate.

This principle is also true when writing web pages; if you load an external resource, you need to check for and handle the case where that resource might not be available. Failing to load a resource can happen for many different reasons, not just someone blocking it with their client. Properly handling the error depends on what the resource is: display the content that is already available (perhaps in a less than ideal state), or if that isn't possible, display an appropriate error message. This includes cases where the Javascript itself isn't available (for any reason).

Networks are not reliable or universally available. Quality programming understands that and does the best it can when failures happen.
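
As a minimal sketch of that principle (the `loadWithFallback` helper and all names here are invented for illustration, not any real library's API): race the resource load against a timeout, so a blocked or hung CDN degrades to a fallback instead of a blank page.

```javascript
// Treat "resource may be missing" as an expected case, not an exception:
// whichever settles first wins, and a rejected load also yields the fallback.
function loadWithFallback(load, fallback, timeoutMs) {
  const timer = new Promise((resolve) =>
    setTimeout(() => resolve(fallback), timeoutMs)
  );
  return Promise.race([load().catch(() => fallback), timer]);
}

// Usage: if the enhancement script never arrives, show plain content anyway.
const blockedCdn = () => new Promise(() => {}); // never settles
loadWithFallback(blockedCdn, "render unenhanced content", 100)
  .then((result) => console.log(result)); // prints "render unenhanced content"
```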

I agree with you. However understand that AMP actually does have a failsafe here, and it's exactly what the OP is complaining about. After a timeout (to allow for shoddy network conditions as you described) it will load the content as best it can.

All three cases are actually covered (full-script support, partial-script support, and noscript support). That's more resilient than most apps.

noscript support that makes you wait eight seconds, in a technology meant to make things *faster*, isn't really support, is it?

Agree. To put it another way in hope of driving this home:

It is sad when the brightest minds in the world decide that the fix for "flash of unstyled content" is to show no content at all for close to 10 seconds.

Seriously: 8 seconds mandatory waiting would have been considered slow even 10 years ago and the only reason it passes now is because Google got a stranglehold on most of the web.

Edit: improve last paragraph

If you turn off all scripts AMP has a <noscript> block that disables the 8s timeout. The OP is blocking only external scripts, which not surprisingly looks a lot like a very bad network connection.

(Disclosure: I work at Google on making ads be AMP)

Why 8s though? That’s well into “give up on loading this page, close the tab and try somewhere else” territory.

Perhaps a better option is finding ways to prevent content jumping around so much while assets are loading.

> Why 8s though? That’s well into “give up on loading this page, close the tab and try somewhere else” territory.

That 8s timeout is for loading the AMP JS from the CDN. You want a time limit that separates "you're on a slow connection, keep waiting" and "just give up, it's not worth it". I suspect it was set by looking at network graphs, but I don't know.

What the OP is doing, blocking JS and also ignoring <noscript>, is bizarre, and something you should expect to break sites.

> Perhaps a better option is finding ways to prevent content jumping around so much while assets are loading.

AMP does that very well, but only by taking control of the process of rendering, which requires JS.

(Disclosure: I work on making ads be AMP.)

No, the page loads perfectly fine if Javascript is disabled. It's only if you go out of your way to break the page in the most difficult way possible that the script will fall back to having a delay.

Is the only way you can argue via straw man?

My complaint is not the use of CDN, it is the forced delay to load a page when the CDN is not available or certain resources are blocked. This is a direct form of punishment from Google: refuse to let us track everything you read, we will make it uncomfortable for you to read anything.

Disabling the phone-home feature is nothing like poking holes in the fuel tank.

>My complaint is not the use of CDN

Then why the complaint about "phoning home"? Analogies are imprecise and brittle.

>This is a direct form of punishment from Google

If you show me some proof I'll be glad to believe you. Occam's razor says it's a simple fallback for unexpected results.

>Disabling the phone-home feature...

Blocking a required library? Who is straw-manning now?

Phoning home is where a web site uses Google Analytics for example, while a CDN is where the assets you actually want are loaded from a server with large links to Internet backbone near you. Completely different things, which are sometimes conflated by people confused about the nature of the argument.

Calling the phone-home feature a required library is part of the malintent from Google. Most websites I interact with don't need Google Analytics in order to function, for example, yet if I block Google Analytics some refuse to work. The simplest explanation here is that time-short programmers just copy the code presented by Google, who in turn want everything to report back to them so they can monetise third-party sites.

Building your site for AMP and blindly using the 8s timeout from the Google copy and paste archive is exactly the same story. You don’t need that much time to load all the assets, that timeout is there only to punish people who block the AMP resources.

>Calling the phone home feature a required library is part of the malintent from Google.

It is code required for the WebComponents to function. It's also charged with optimizations such as sandboxing of iframes, making requests synchronous, and calculating layout to reduce paints.

Clearly it's not "phoning home". The script provides clear actions, as laid out on the amp project website.

>You don’t need that much time to load all the assets

On American broadband, perhaps. Other countries do not have the same infrastructure.

>that timeout is there only to punish people who block the AMP resources.

Citation, not speculation, needed.

At the very least, I expect it to crash faster after removing methods. It should not pause for a long time and then work.

If scripts are disabled AMP uses <noscript> to skip the 8s timeout, and renders quickly. The OP is getting slow loads because they're (a) silently blocking the AMP JS and (b) not running the <noscript> section.

(Disclosure: I work on AMP ads on non-AMP pages, which I like because it means the ads are declarative)

I’m surprised anyone hangs around a web page for 8 seconds waiting for it to load.

By that time I’d assume the server/connection was down and either refreshed or moved on.

Correct. You've broken the page by blocking only some of the assets. AMP loads fine if you allow all scripts or disable Javascript completely.

This is not really the point, the point is that the page loads anyway, without the AMP scripts so there's really no excuse for having an 8s delay. I don't want this google stuff on every web page I visit, and frankly it's unnecessary -- this article could easily have been written using static HTML / CSS and it would probably have been easier to develop (as well as access).

Disabling javascript doesn't disable css animations.

The CSS animation only comes into play as a fallback mechanism. The AMP script will force a draw sooner otherwise.

Precisely. The exact code is here:


Here it is pretty-printed and simplified to not have the vendor-specific stuff:

  <style amp-boilerplate>
    body{animation:-amp-start 8s steps(1,end) 0s 1 normal both}
    @keyframes -amp-start{from{visibility:hidden}to{visibility:visible}}
  </style>
  <noscript>
    <style amp-boilerplate>body{animation:none}</style>
  </noscript>
The first bit is for browsers with javascript, but where the network is poor and can't load the minimal required, highly cacheable javascript file within 8 seconds. It uses CSS in case javascript really isn't working anyway, such as being disabled per domain or something. The second bit, inside <noscript>, is for browsers without javascript - the page is unblocked immediately.

I wish Google would just heavily weight fast-loading, responsively-designed pages in pagerank and call it a day. There's no need for a weird, problematic proprietary format. It just smacks of a power grab to keep visitors on google.com.

> There's no need for a weird problematic proprietary format.

The reason we need AMP now is that you can't otherwise preload a page in a privacy-preserving way.

Webpackaging will allow doing this without AMP though: https://github.com/WICG/webpackage/blob/master/explainer.md

(Disclosure: I work at Google on making ads be AMP.)

Are you sure this is going to be allowed at Google? Policy and all that. They don't exactly look to be condoning open standards in this regard. Would be happy if I am wrong though.

Google is one of the main groups working on it. For example, the draft https://tools.ietf.org/html/draft-yasskin-dispatch-web-packa... is by a Google employee.

A thousand times yes. If Google actually cared about end-user "experience", the health of the Web, and accessibility, they would do exactly as you've described.

It's a power grab.

AMP pages take forever to load for me with Firefox and uBlock Origin. When I open a news story from Google's recommendations, which always open to AMP pages, it pretty much always takes about 15 to 30 seconds to show anything on screen. I instinctively edit the link manually now to try to go to the non-AMP version, which typically loads the content in under 5 seconds.

If you're using Firefox, you can just use the Redirect AMP to HTML extension.

Or, an alternative approach that also works in Chrome: use DuckDuckGo. Actually it was early AMP, which was extremely broken, that pushed me out of Google completely.

Still have a problem with shared AMP links in that case. I like the Safari approach where AMP URLs get converted back to their canonical URLs when shared.

AMP has always and will always be a garbage idea. Stop implementing AMP.

Over the last couple years Google has acted like a dictator on the web. No more autoplay... except if you're Youtube, or these 100 approved sites. If you use Chrome... oh now you're auto signed in. You opt out of location tracking - we'll still track you, we don't care.

Don't give Google more power.

Implementing your website in AMP is basically handing the keys of the web over to Google. They're going to make it more and more ridiculous to stay AMP-compliant and you're going to be shut out if you don't play their game. Eventually they'll enable/disable features of AMP-based sites arbitrarily... similar to how they determine autoplay policy on their "approved sites list." Do you think Google should be the all-powerful being that determines what features you can have on your website? That's the future you're opting into if you implement AMP.

I don't know about slower, but it's certainly made the web more annoying to use with piecemeal javascript support - which obviously benefits advertisers like google, if it makes more of us cease interfering with tools like noscript and µMatrix.

AMP is a project that benefits only a giant corporation. Please don't support it.

That giant corporation runs the biggest reason for SEO. AMP affects SEO.

The content of [1] displays immediately for me (and I do not have document scripts enabled). There are <amp-...> elements; I don't know what they are, and they do not do anything.

For webpages that do hide everything, I define my own CSS overrides which prevent the <html> and <body> elements from ever being hidden, set to opacity less than 1, or similar. Further rules can be added to prevent <body> from being animated or having transitions.
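
As a sketch of such overrides (the exact selectors and properties are my guess at what that ruleset might look like, e.g. dropped into Firefox's userContent.css or a style-injecting extension):

```css
/* Never let a page keep its root elements hidden while scripts load */
html, body {
  visibility: visible !important;
  opacity: 1 !important;
  animation: none !important;
  transition: none !important;
}
```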

What is wrong with just HTML, without CSS and JavaScript, or in some cases just plain text and not even HTML?

AMP is about guarantees that you can't have with regular JS. You guarantee not to do any sync JS and document.write() stuff when you use AMP. Personally, I plan to use AMP to get better indexing of my app, which I built as a SPA and which seems to be poorly indexed by Google. I'll make a sitemap pointing to AMP pages with content, each with a "see full view" link on it.

Yeah, I actively avoid visiting AMP and sharing AMP sites for this reason whenever possible. They just take so fucking long to load.

In my experience AMP is considerably slower unless I happen to be browsing in Chrome. It's probably an accidental result of how they implemented AMP, that framework is tuned specifically for Chrome in the first place (odd CSS hacks to nudge Chrome's compositor into behaving a certain way, etc.)

We need amp.wexe, fully DOM-based AMP implemented in WASM.


That's the joke.

Hey, where is my bitcoin!?

This is what's keeping the screen white before showing its content https://news.ycombinator.com/item?id=17746886

Have I been under a rock?

I keep reading about AMP on HN, but I have yet (professionally or personally) actually come across AMP in the wild unless I intentionally went looking for it -- or am I consuming AMP and not even realizing it?

I don't believe that Google serves AMP links to Firefox, certainly not to the Mobile version and also not when you have JS disabled from google.com

Plus you might not be browsing the subset of AMPish websites. The only AMP links I see are when people share mainstream news links, and then it's straightforward enough to manually extract the URL.

It's outright blocking pages for me, until the pages retry without AMP

It's junk and I hate it. Google isn't Yahoo, but it's become replaceable in the same way that firm did. Google News has become a case study in what not to do.

I wish I could opt out of AMP; I dislike the reddit AMP experience as it's poorly maintained compared to the web-based mobile experience.

AMP is IMHO the wrong solution to a stupid bloat problem, when simply making decent HTML pages would solve it very well.

amp is slower than full pages for me. No custom configuration.

> which is to speed up the delivery of web pages

...to make up for the slowness of all of the ads Google and others serve. It's amazing how fast pages load with no ad networks involved.

Yes. I absolutely hate when I want to load an article on my iPhone and it loads AMP. I remove that part of the URL to read the original source most of the time.

I refuse to work with AMP simply because it seems like the only reason for its existence is to retain more users permanently in the google "ecosphere". Making performant, fast loading websites has always been a priority for me and my employers, AMP isn't being integrated everywhere because companies want to make their pages load faster, they simply don't want to be penalized by google for not doing AMP.

So TL;DR AMP is rarely implemented for the right reasons, and the implementations reflect this.

why not go all the way and just block websites that use amp? it's what i do. trend riding idiots are not worth it

>I'm not convinced the web pages were really _that_ slow to start with, so it feels like an unnecessary project

That's because you use umatrix and possibly ublock.

AMP is about making the web faster for the common Joe.

It is also likely that you don't spend a significant amount of time browsing on a 2G cellular connection. When ping times are very long, the only way to get web performance anywhere close to native apps is pre-caching, which is what AMP is designed to enable (link-rel="prefetch" is a privacy nightmare). AMP is about the common Joe, but also about the 90th percentile slow performance (like in the developing world or on cellular networks).
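
For reference, the prefetch hint being called a privacy nightmare is a one-liner (`publisher.example` below is a placeholder). Issued from a search results page straight to the publisher's server, it reveals which results a user saw before any click, which is the leak AMP's shared cache is designed to avoid:

```html
<link rel="prefetch" href="https://publisher.example/article.html">
```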

Pre-caching in those cases will cause other things to slow down and needlessly consume bandwidth which the user is paying for.

The real reason for AMP is to allow Google to watch everything your visitors do so they can sell your visitors eyeballs to your competitors.

AMP and PWA are piles of shit designed to lock users in to Google.


Sounds right to me, don't know why you're downvoted? Do something non-standard and then complain things are broke?

AMP is slow as s..t. Messy as f..k.

HTML seems like it has turned into cancer now...

I wonder what would be required for a format to not have this problem, and how it can be prevented.

I have some 12 dev inspect tool for that style scripting this notebook was given to me by my gf and its a starter kit.

It’s not really AMP that delivered the speed improvements that made everyone go WOW back when it started. It was that Google would prefetch AMP pages in search results.

In many cases AMP is negligibly faster than the mobile web page, and sometimes slower.
