
AMP is the result of webmasters being unable to provide any kind of reasonable site performance. Which isn't hard: don't include content from dozens or hundreds of third-party hosts, and don't run crazy amounts of super slow JavaScript.

If people did that, there would be no need for AMP. The whole concept is mostly "Google tells people their webpages suck and puts them into a cage where it can guarantee that they don't suck".




It's worth pointing out, though, that the source of a lot of this performance drain is ads. And Google is the main provider of ads across the internet.

To me, AMP feels like a method for Google to sidestep the real issue at hand.


Are Google ads that slow? When I look at the dozens of requests issued by slow sites, I don't see Google domains. I see every other fucking tracker, suggester, sharer, and optimizer, but not Google.


Ah, but just because you see Google ads, it doesn't mean that's all the website tried to serve.

They probably ran a waterfall, asking AppNexus, Pubmatic, TradeDesk and 40 other ad providers for an ad at $5, $2, $1, etc. before giving up and serving a Google ad...
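
A rough sketch of why that waterfall is slow: each provider is asked in turn at a descending price floor, and every miss adds another network round trip before the fallback ad loads. The provider names, the bid endpoint, and the 500 ms timeout below are illustrative assumptions, not real ad-network APIs.

  // Illustrative sketch of a sequential ad waterfall; endpoint and timeout are assumptions.
  type Bid = { provider: string; creativeUrl: string };

  async function askProvider(provider: string, floorUsd: number): Promise<Bid | null> {
    try {
      // Each call is another blocking network round trip during page load.
      const res = await fetch(`https://ads.example/${provider}/bid?floor=${floorUsd}`, {
        signal: AbortSignal.timeout(500),
      });
      return res.ok ? ((await res.json()) as Bid) : null;
    } catch {
      return null; // timed out or no fill
    }
  }

  async function runWaterfall(): Promise<Bid> {
    const tiers: Array<[string, number]> = [
      ["appnexus", 5], ["pubmatic", 2], ["tradedesk", 1],
      // ...and 40 more, each waiting for the previous one to fail...
    ];
    for (const [provider, floor] of tiers) {
      const bid = await askProvider(provider, floor); // sequential, hence the slowness
      if (bid) return bid;
    }
    return { provider: "google-fallback", creativeUrl: "https://ads.example/house" };
  }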


Google doesn't force you to add ads to your site, though. It's disingenuous to suggest that it's the ad provider's fault that you wanted ads on your site.


Nor does the pusher force his marks to start smoking his crack.


I don't disagree with you. However, I think the best solutions almost always address root issues, so I'd much rather Google more heavily penalized slow sites. That gets to the heart of the issue without harmful side effects. AMP does neither.


The solution is very simple.

Google's spider uses headless Chrome, which runs JavaScript (to index Flash and AJAX content). Why can't they measure how long it takes for your page to render useful content?

What about just penalizing JavaScript?

What about penalizing download size?
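
The first idea is easy to prototype with Puppeteer's headless Chrome. A minimal sketch, assuming "network mostly idle" is an acceptable stand-in for "useful content has rendered":

  // Time how long a page takes until the network goes mostly idle, as a rough
  // proxy for "useful content has rendered".
  import puppeteer from "puppeteer";

  async function timeToUsefulContent(url: string): Promise<number> {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    const start = Date.now();
    await page.goto(url, { waitUntil: "networkidle2", timeout: 30_000 });
    const elapsed = Date.now() - start;
    await browser.close();
    return elapsed;
  }

  timeToUsefulContent("https://example.com").then((ms) =>
    console.log(`Usable after ~${ms} ms`)
  );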


Websites can already detect whether the client on the other side is Google's spider (either by checking the well-known IP ranges or by looking at the User-Agent). So they could serve a fast, cached version to Googlebot and thus appear faster than they are.
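
For illustration only (the reply below points out why you shouldn't do this), a minimal Node sketch of that detection; prerenderedPage and fullPage are hypothetical stand-ins, and a real check would also reverse-DNS the IP against Google's published ranges, since the User-Agent alone is trivially spoofed.

  import { createServer } from "node:http";

  // Hypothetical placeholders so the sketch compiles.
  const prerenderedPage = (path: string) => `<html><body>cached ${path}</body></html>`;
  const fullPage = (path: string) => `<html><body>full ${path}</body></html>`;

  const looksLikeGooglebot = (ua: string) => /Googlebot/i.test(ua);

  createServer((req, res) => {
    res.setHeader("Content-Type", "text/html");
    if (looksLikeGooglebot(req.headers["user-agent"] ?? "")) {
      res.end(prerenderedPage(req.url ?? "/")); // fast, cached version for the crawler
    } else {
      res.end(fullPage(req.url ?? "/"));        // the heavier page everyone else gets
    }
  }).listen(8080);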

I can see why Google would do this AMP thing. It's much easier to detect fast websites through positive rather than negative evidence.


That's cloaking, and doing much of it will get you severe penalties.


Yeah, you'd think that wouldn't be hard, but time has proven that we are quite incapable of writing websites with sane amounts of ads, trackers, and JS libraries.


How much of AMP's speed comes from re-hosting content on very, very fast CDNs with edge nodes everywhere?


I would say that JavaScript can also make *some* aspects of your site faster than pure HTML/CSS.

Why? Because most sites are not just a single landing page -- they have links, tabs, and new pages, and each request could take 0.2 seconds.

So instead of requesting a new page the moment a link is clicked and then reloading the entire page, you fetch information predictively and mutate the DOM, giving a feeling of *instant* that's faster than even an HTML/CSS page behind a CDN.
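
A minimal browser-side sketch of that idea: prefetch same-origin links on hover, then swap the DOM on click. The hover trigger and the same-origin-only selector are assumptions, and there's no error handling, scroll restoration, or <head> diffing.

  const cache = new Map<string, string>();

  const linkFrom = (e: Event) =>
    (e.target as Element | null)?.closest<HTMLAnchorElement>('a[href^="/"]') ?? null;

  // Prefetch the linked page as soon as the user hovers over it.
  document.addEventListener("mouseover", async (e) => {
    const link = linkFrom(e);
    if (link && !cache.has(link.href)) {
      cache.set(link.href, await (await fetch(link.href)).text());
    }
  });

  // On click, mutate the DOM with the prefetched document instead of a full reload.
  document.addEventListener("click", (e) => {
    const link = linkFrom(e);
    const html = link ? cache.get(link.href) : undefined;
    if (link && html) {
      e.preventDefault();
      const doc = new DOMParser().parseFromString(html, "text/html");
      document.body.replaceWith(doc.body);
      history.pushState({}, "", link.href);
    }
    // If the prefetch hasn't finished yet, fall through to normal navigation.
  });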


(I'm the author of the blog post)

I agree with you. My site was already fairly lightweight, but I wanted to see if AMP improved things and made a better user experience.

It made the site slightly faster, but other than that the impact wasn't big enough for me to notice.


According to Google PageSpeed Insights, your speed scores are 59/100 for mobile and 40/100 for desktop. The "optimize images" section says there are 1.3MB of image optimizations to do. That's just the image bloat, not the image content, and it's just one of the improvements to make. If you want to make AMP go away, make slow webpages go away.

https://developers.google.com/speed/pagespeed/insights/


Interesting. I put my own blog (link in my profile) into it, and it scored only 90/100 on mobile. Google now wants to talk me into enabling compression and caching on a website that is literally 4.98 KB (including all assets).


I got 91/100 mobile and 90/100 desktop for a site that is 4.95 KB.

Reasons?

* Apparently the HTML should be minified.

* Apparently I should use gzip, because 4.95 KB is too big.

* The inline styles are below the content so the user isn't shown a blank page while it paints. Google thinks the 594 bytes of styles need to be in a separate request.

I think insights is pretty useless at analysing sites below a certain size threshold.

At these sizes, network latency is the biggest drain on loading a page, which effectively means users don't notice loading times when they click a link.





