I strongly agree. Web developers and app designers should work to build fast, performant web sites that use bandwidth carefully because that's good for end users. But the entire AMP approach to doing this is questionable, and as we have seen over the years it appears to act more like a way to give Google more undeserved and unnecessary control over what should be an open web.
More broadly, I consider this yet another reason to avoid using Google properties where possible. They have shown themselves to be bullies and bad actors who want to control the internet and oppose an open web. I recommend simply not building AMP pages at all, but instead working to build high quality, performant websites which gracefully handle device size changes and lack of javascript.
> I strongly agree. Web developers and app designers should work to build fast, performant web sites that use bandwidth carefully because that's good for end users.
They should, but they didn't. Before AMP most of the web was unusable on slower Android phones, and frontenders just laughed at you and told you to drop $800 on an iPhone if you wanted to see their pages. Is it a surprise that Google shoved a technology to fix the web on their platform down developers' throats?
Nothing else before AMP helped. Why do you think those developers will suddenly wake up and start building lightweight web pages now, instead of ad-bloated, video-playing monstrosities?
Web developers were slothful. This is what purgatory looks like. ;)
AMP is not the savior you describe it to be, and web developers are not against lean websites. The real bloat comes from ads and excessive tracking, and you can test that by installing uBlock Origin in Firefox for Android and see for yourself how the web suddenly becomes fast and lean.
I've downvoted this post because it lacks substance, and the resulting arguments will derail the thread and bury actionable information that was shared below. It's depressing how most of these threads could initiate action but seem to be derailed.
Of course no one is _against_ good performance, but web devs obviously don't care enough to do anything about it. There's no practical difference between the two.
Most web developers do make fairly lean websites, but that is not enough when ads and a dozen tracking scripts, supplied in part by Google, are slapped over their work.
Which is completely 100% irrelevant from the end-user perspective.
Just to be clear, I hate AMP, but I also feel a sort of pleasurable vindication in the pain that developers and companies must now go through because of the horrendously slow trackers and ads they used to fill their pages with.
Google could restrict the content loaded in AdSense iframes and apply AMP restrictions to ad content only. They also have the means to limit the number of ads partners can load on a page, and restrict the overzealous use of Google Tag Manager which slows down sites.
Google offers both the poison and the antidote, and each of their solutions (see what they're doing with request blocking in Chrome) happens to erode user liberties and privacy rights to concentrate power around Google properties.
It's a bad idea to just slap AdSense and analytics onto a page. If they're a requirement, then they need to be properly integrated and thought about. It can be done properly, but hardly anyone does it.
It absolutely does. Improving your page speed (or google's idea of your page speed) is a critical step in optimizing a site's organic google search ranking.
Why do people feel the need to "fight back against Google"? Should their actual fighting energy be going to fighting dictatorships around the world and torture by the CIA? Priorities are really messed up.
No it's not. We're not talking about squeezing every last ounce of performance out of the CPU and hand tuning every query. Just stop bloating your pages with 10,000 dependencies and awful JS frameworks and pretending everything needs to be a SPA.
Again, install Firefox for Android with uBlock Origin, and see your opinion change about the main reason mobile sites are slow. Pages load fast and are responsive even on older phones if you use an ad blocker.
I use uBO and have used AMP in the past. uBO is an excellent ad blocker, but as far as bandwidth savings are concerned, it doesn't come close to AMP. Of course, I use uBO with JS off by default, which is better and arguably more secure. But I still occasionally have to unbreak sites.
Your suggestion to install Firefox for Android for a performant mobile web experience undercuts whatever other arguments you may make.
Firefox for Android is a UX and performance disaster which is likely why Mozilla has chosen to start from scratch and develop a replacement browser for Android.
Install Firefox for Android with uBlock Origin to see how ads and tracking destroy mobile performance. The setup I described loads pages faster than Chrome.
I believe the optimistic read is that in a world where Google manages to measure strictly performance-based metrics and then rewards pages based around that, of course developers will do things right this time. After all, we all want to write good code and produce quality work!
That speed matters to user behavior has been known for a long, long time. This knowledge existed long before AMP did. It had surprisingly little effect on how pages were implemented.
So perhaps our princess is in another castle.
To my thinking, what AMP does is create a political context that enables developers to push back. By setting an unambiguous standard and clear advantages to complying with it, developers have a weapon to push back the next time Marketing wants to add fifteen trackers or whatever. This is leverage that just was not present previously, and it can change decisions.
> To my thinking, what AMP does is create a political context that enables developers to push back. By setting an unambiguous standard and clear advantages to complying with it, developers have a weapon to push back the next time Marketing wants to add fifteen trackers or whatever. This is leverage that just was not present previously, and it can change decisions.
Yeah, I think this is exactly it. Just like web developers didn't care about disabled people until the law threatened penalties, they didn't care about performance until Google threatened penalties.
The question is: who else could provide the same incentives as Google? How could an independent, non-corporate entity create the same pressure?
Normally I would say "That's what standards bodies and governments are for", but in this particular context both have failed. It's been thirty freaking years since the ADA, and most websites are still not accessible. Standards bodies both move slowly and are historically bad at achieving widespread implementation in reasonable timeframes.
The other answer is "Browser makers"... but that's also Google. And maybe Mozilla, which is arguably the "independent, non-corporate entity" you'd like.
Really though, this works because Google has the technical chops to make it work and the positioning to make people want to do it. I cannot think of a single "independent, non-corporate entity" that's both positioned to do this and capable of it.
All Google has to do is reward site improvements in critical metrics. That's it. If my page is going to rank higher because it's faster, I will optimize the hell out of my site. But Google has been really unclear about the amount of impact those improvements have, especially as they compare to building an AMP site that will without question be featured in their carousel.
What metrics are you thinking of? Page size and load speed are the typical ones. There may be some wrinkles to measuring those well, given how dynamic modern pages often are. That would make any such metrics relatively easily gameable. It might also be challenging to turn measured improvements into measurable gains in SERPs, which means the gains in corporate politics are limited.
AMP avoids all of that. It also brings security benefits by getting rid of basically every tag that can be used to mount attacks on the browser.
Also, it's been known for quite a long time that users like faster sites, resulting in much lower bounce rates. Was that not enough for you to optimize the hell out of your site? It's been my experience that in a lot of companies, it isn't enough. Marketing or publishing or whichever department can attach dollar amounts to the tracker or ad or whatever they want to add, and devs can only handwave around experience.
It could only be used as a tie-breaker for search results with the same level of confidence, anyway.
It would be ridiculous to down-rank the exact thing the user is searching for just because the user would have to wait 800ms longer for that information. Or up-rank something the user isn't looking for just because it loads faster.
The best Google can do is bluff about how much perf matters.
The efficacy of the incentive is linked directly to the strength of its effect. If optimizing the hell out of your company's site only matters in extreme cases where it's a tiebreaker among hundreds of other signals, the people who want the things that make pages slow will win. They will be able to point to more tangible and measurable benefits, and the effect of the tiebreaker will be lost in statistical noise.
It may just be unfounded cynicism on my part, but this does not sound like a better web experience. It sounds like the web circa 2009-2015. It sounds, to me, exactly like all the things we'd like to get away from with something less intrusive than AMP.
I've been using the web on mobile connections ever since I got my first iPhone in 2008.
When you say that it was unusable, surely it's hyperbole.
I might be in the minority, but I never had a problem with it, and I've been a heavy user. Especially now that 4G connections are everywhere and smartphones are overpowered.
I mean I watch HD videos on the web while riding the city bus with no interruptions.
Are you telling me that a phone with better performance than the desktop I had 10 years ago, with a 4G connection able to stream hundreds of MB of data on a moving bus isn't capable of loading freaking text content without AMP?
Surely something is missing from this picture. I'm replying to you on Hacker News by loading the website in my browser, no AMP in sight. And I read HN, including all websites listed on HN, from my phone with no AMP.
And sure some websites can take a second or two to load due to crappy ads mostly. I remember a time when I waited for 5 minutes to load a website, when all we had was dial-up. And even that was awesome ;-)
N.b. I avoid AMP on purpose. I started using DuckDuckGo on my mobile to avoid AMP, as I had no other way to turn that shit off.
The iPhone was one of the more expensive phones you could get in 2008, just like it is now. You were not browsing the Web on the "slow Android phones" the parent was talking about.
HN is an exceptionally fast website and not representative of the Web at large.
Compare HN to something like reddit, a website which provides very similar functionality but is an order of magnitude slower. Then ask yourself why reddit has to be so slow.
The Reddit website is working perfectly fine for my purposes. The only thing I'm bothered with are the annoying popups suggesting to try their app.
Also if Reddit is slower than HN, that's probably because they don't care (law of diminishing returns ftw) and I'm sure they'd rather drive people to their mobile app instead. All of this isn't the fault of the web technologies used and neglect can't be solved by AMP.
AMP puts websites under Google's control and nobody asked for it; it's being shoved down people's throats to fix an imaginary problem.
---
> You were not browsing the Web on "slow android phones"
Note that even the shitty, stock Android phones today are better than the iPhone that I had back then. Such is the progress of technology.
I know because we have a ton of low cost Android phones to test with.
The only performance problems we encounter are in the third world countries of Africa and possibly in other emerging markets, but that's only a temporary issue; I predict that 3-4 years from now it will be a non-issue even in those countries, hardly a reason to give up on our web standards. And it's not like you can't design super lean websites anyway.
> I've been using the web on mobile connections ever since I got my first iPhone in 2008.
Okay, great. You had one of the most powerful phones at the time. How was the experience for people with a "feature phone" in 2008? (I'll tell you from experience, it was terrible).
How would the experience be today, with your iPhone from 2008? Terrible. Why? Is the web more powerful as a result? Can you do more things? Nah, it just looks flashier.
Tracking blockers via extensions, and autoplay off by default would have fixed most of the problems while also encouraging site builders to stop doing those things. Firefox makes that possible on Android. Google seems determined to never support those things in mobile Chrome and are slowly removing or crippling the ability to do it on desktop.
IIRC they're also pushing for a new extension standard, for Firefox and stuff too, which is very adblock-crippling... wouldn't be so bad if it was ONLY Chrome...
also, loopback proxy to localhost with a standalone blocker is the next step they'll force us to take ;p
Web standards and traffic being monopolized by a company with... dubious opinions about the role of privacy online is your idea of purgatory?
I'd like to think similar ends could have been achieved by setting and rewarding standards around the number of included scripts, the size of the page load, etc. But that wouldn't have achieved the goal of keeping people on google.com even when clicking search results.
Fighting web bloat is a noble cause. It doesn't require a self-designated centralized gatekeeper. All Google needs to do is reward lightweight sites with better search placement.
To be fair, most web dev practices are based on silly notions of tracking and crappy UI ideas made by idiots. Animating in blocks of text is what I'm mostly referring to, but there's plenty more.
Take Twitter for example. A tweet takes about 10 MB to load, based on something I measured about a year ago. To put that in perspective, information-transfer wise, War and Peace is like 800 KB. The whole book. A single-page tweet of 280 characters or whatever being 10 MB is moronic. Reddit caught the same stupid bug with their redesign.
The biggest problem: everyone is complacent and thinks "this is what progress looks like, and you're a curmudgeon boomer if you think otherwise." Forethought about real sustainability, both environmental and sociological, is looked on as impeding progress. Just like when a small number of devs a decade ago said we need to be careful of big tech companies with our data. They were shot down, and that push for "break things fast" became the name of the game. Now everyone says tax dollars must be spent on 5G because "we need the bandwidth". No, more people need to be less stupid. Mostly consumers. But devs need to start taking a stronger stance in calling out the bullshit tactics these businesses are implementing and quit going down on their knees to pray to the Silicon Valley giants as some great saviors of society whose wealth is an indication of their genius. Ugh... got into a rant...
WordPress is also a platform that encourages bad takes like this. If I have 50 or so plugins that provide me only with tools in the dashboard, logged-out users won't be impacted by any of the fifty.
Compare it to any consumer operating system. It puts a lot of power into the user's hands.
It would be great if these companies had enough good taste and pride in their work to at least try to build something great by default. What we get instead are minimum viable products built in the cheapest way possible and it takes a Google to force them out of their complacency by imposing policies.
On the other hand, Google is at least partially responsible for the current web situation: they normalized advertising and tracking malware on the web. Because of them, publishers think it's totally acceptable to make people download 10 megabytes of ads and javascript to read 10 kilobytes of text. The correct solution is to block all that stuff by default by shipping uBlock Origin pre-installed with browsers.
> Before AMP most of the web was unusable on slower Android phones, and frontenders just laughed at you and told you to drop $800 on an iPhone if you wanted to see their pages.
> Why do you think those developers will suddenly wake up and start building lightweight web pages now, instead of ad-bloated, video-playing monstrosities?
To be fair, I would say a lot of this is a result of marketing/sales trying to push a lot of BS on the page, and managers or devs failing to push back.
Is the developer guilty of creating an "ad bloated, video playing" webpage? Yes, a lot of them don't care and make it bloated, but even if you tried, you can't do much to improve the performance of a bad idea.
> Nothing else before AMP helped. Why do you think those developers will suddenly wake up and start building lightweight web pages now, instead of ad-bloated, video-playing monstrosities?
This has been an ongoing trend for ages, viz. YSlow and the Firebug speed tab.
> They should, but they didn't. Before AMP most of the web was unusable on slower Android phones, and frontenders just laughed at you and told you to drop $800 on an iPhone if you wanted to see their pages. Is it a surprise that Google shoved a technology to fix the web on their platform down developers' throats?
So let me understand this: Google allows OEMs to ship Android on shit hardware with terrible performance, is rightfully complained at for rubber-stamping hardware with no oversight, no standards of quality, and no requirements of suitably good UX, and then Google passes the burden of supporting the shit hardware they by-virtue-of-silence gave permission to onto a ton of unsuspecting content publishers, who now face delisting from the dominant search engine not because their content is bad, but because their website requires resources not met by Google's by-proxy shit hardware? And you're okay with that?
Yes, I'm OK with the world having the ability to buy a smartphone for $50 outside the US. Mobile devices shouldn't be reserved just for rich westerners. Same for the whole web - I don't see why it shouldn't be usable on a dual-core laptop with 2GB of RAM.
I'm fine if supporting people with older and slower devices costs more development time for developers in Silicon Valley.
Years ago the web was fast on a 1 GHz single-core with 512MB of RAM. What changed, other than ads and ad networks like Google becoming far more invasive by wasting more and more memory and CPU?
In the days since 1 GHz CPUs, web pages have also grown from simple HTML/CSS to huge JavaScript frameworks, in which displaying the simplest static content requires a ton of JavaScript.
But if you install a browser add-on such as uMatrix, you can see that surprisingly many web sites will still work just fine if you disable JavaScript (even first-party JavaScript). One example is nytimes.com.
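For anyone who wants to reproduce this, here's a rough sketch of what that looks like in uMatrix's "My rules" pane. The rule format is "source destination type action"; I'm going from memory, so double-check against the uMatrix wiki:

    * * * block
    * * css allow
    * * image allow
    * 1st-party * allow
    * 1st-party script block

The first four lines are roughly the defaults; the last one overrides the first-party allow for scripts specifically, so pages render with styles and images but no JavaScript at all. You can then relax it per-site for the pages that genuinely need JS.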
I should mention that megabytes of JavaScript are slow to download, compile, and execute. While a few seconds may go unnoticed on developer desktops, it will be a lot more on a mobile device or laptop.
I advised a friend to ditch the JS-powered pop-out social media icons which were hovering almost out of sight over on the right. They said quite flatly, "nope, that's staying". That was probably ten years ago. There is a school of public opinion that everyone seems to be attending. The things they learn there are not always logical or justifiable but I get the impression that they all want to secure their piece of the pie and that means meeting everyone's expectations, so they are all doing it to each other, together. Google is "merely" running classes in that school, it seems... and of course helping the school keep running by supplying tons of tech.
I was mildly disgusted when required reading for freshman orientation at Akron U included a book called Nickel and Dimed. The gist was something like "get your education or you're screwed". But people made it that way in the first place! Everyone supposedly needing formal higher education in order to have any decent future isn't something that just happens, it's something the human race is doing to itself. Leave it to a higher education institution to push the idea that "this is just the way it is, do the right thing if you know what's good for you".
edit: obvs I didn't read the book, it's not exactly like I said. I think I bought the book but dropped that "class" anyway
In a similar way, stupid "trends" like social media buttons and Like buttons are just examples of how everyone is ruining the web together. These days it's the aforementioned massive JS frameworks and SPAs and of course the obsession with "analytics." In a way it's nice for me and my workstation because it helps drive up the current average affordable densities of RAM and storage, but ...it's slavery. And Google seems to be less and less bashful about it.
"you are slaves of whatever you submit to by obeying" --that guy
> because it helps drive up the current average affordable densities of RAM and storage
It does, but it also means that RAM and storage isn't available to be used for other things. Think about what you could do if you had current hardware back in the XP days...
We covered it, floor to ceiling, in images and video. Yesteryear's web had a few grainy avatar images and GIFs in footers, todays has nonstop, wall-to-wall, high-definition media.
> Yes, I'm OK with the world having the ability to buy a smartphone for $50
But you apparently aren't okay with getting $50 worth of smartphone, since you're demanding that a ton of companies you erroneously claim to be in California expend thousands of dollars in labor to support a framework they never agreed to support and have little to no say in how it's developed, in the name of a supposedly "open" web, so that you can have a good experience consuming content more than likely for free. That, to me at least, reeks of the worst kind of entitlement.
This is, in my mind, like buying a Tata Nano, which is a perfectly acceptable if limited car, and subsequently demanding all the road ways be limited to 65 mph, so that you don't feel slow. If you want to drive with the pace of traffic, the absolute cheapest car you can possibly buy brand new [1] is probably not what you want.
Yeah, this is ridiculous.
I used to browse the Web (not the WAP!) 13 years ago on my Nokia N70 (Symbian OS, 220 MHz, 32MB) smartphone, on an Internet plan that cost 1€/MB (I have a plan that costs 100,000 times less today), and while it was a bit rough, it was already pretty serviceable!
Most of the content (in time spent on it) is still text (remember what HTTP stands for?), and text takes hardly any processing power!
Not everyone can buy an $800 iPhone or close to that. Being from a third world country, I understand how valuable it was to have a cheap smartphone (umm, laptops were too costly), so my main interest shifted from physics to CS/programming.
If you don't like Google AMP, it is fine.. (of course I too prefer to browse with only HTML & CSS whenever it works).. If you don't like low end hardware standards, it is fine.. But they have solved real world problems, whether first world problems or not. Not everything is black and white..
> Being from a third world country, I understand how valuable it was to have a cheap smartphone
And just because you live in the US doesn't mean you can afford a top tier iPhone. That's why the secondary market is so hot for them.
> If you don't like Google AMP, it is fine..
I don't really care one way or the other.
> If you don't like low end hardware standards, it is fine..
I do take some issues with the fact that Google employs no standards at all for a baseline level of quality with their devices, and then places the burden of supporting those devices on others under threat of delisting.
> But they have solved real world problems, whether first world problems or not.
Ends do not always justify means. Lest we forget that the winner here is not limited to people with low end hardware getting to consume AMP content; it's also Google, who profits directly off of that consumption. And THAT is where I believe the ethical lapse is. Google isn't doing this so people can get content easily on low end hardware, they're doing it under the guise of that, while laughing all the way to the bank as they're breathlessly defended by people who refuse to accept for some reason that Google is a business, and it acts in every way to further its business.
Just like Stadia is not Google setting out so that people who can't afford game consoles can still play the latest games, they are inserting themselves in a user's market so they can be the provider, and get that sweet, sweet engagement.
We're not okay with Google usurping web sites but we don't sympathize with publishers either.
The right thing is to build good web sites. Publishers obviously don't care about doing it right and we ended up with system requirements for web sites as a result. Google is now making it expensive for them to not care. Publishers are not a blameless victim of Google's monopolistic power, they actively contributed to the current state of the web.
People should not need a $1000 phone to read a news article. The only situation where it's acceptable for web sites to not work on "shit hardware" is when it's a WebGL application. In those cases, people know that powerful hardware is required before they even load the page.
If Google had blocked manufacturers from selling cheap Android phones then they would have just found another mobile operating system to use. Maybe Firefox OS or WebOS.
Also yeah I'm pretty happy that cheap smartphones are available for the masses to use. I have zero sympathy for content publishers with bloated websites.
Bloated websites are bloated for a reason: nobody wants to pay money for content, but content gets created by people who get paid for their job. So you're not paying money for content, but you also don't want advertisements. What's the solution? In my mind, it's just not using those websites :)
> Google allows OEM's to ship Android on shit hardware
Well, now there's an interesting complaint in this context. I thought Google was evil because they forced strategies on people, but now they're evil because they don't restrict hardware?
Yes, it’s one thing to promote a cleaner and faster web through better design and implementation. It’s another thing for Google to use its effective monopoly power to enforce that. As the FA says, Google didn’t invent the web or create its content - what gives them the moral right to take it over?
I think the collective web will eventually fix the problems without Google.
The root of the AMP issue is placement in Google’s search engine. Personally, I use DDG, and would be willing to pay a sizable subscription fee to keep it from being more like Google or from being acquired. But, most people probably would not - they are used to the web being “free”.
This is just another “embrace, extend, extinguish” effort, like the ones we have seen in the past. These attacks are transparently self-serving and should be “routed around”. It will require commitment to do so!
> Yes, it’s one thing to promote a cleaner and faster web through better design and implementation. It’s another thing for Google to use its effective monopoly power to enforce that.
AMP is more than just cleaner and faster - it gives Google control. They could discriminate on cleaner and faster without it, but they purposefully don't mention that, since it would undercut the push for AMP.
Firefox for Android seems to solve this "issue" for me completely. After I started using it, Google stopped showing AMP pages to me; even in the News section all the links are direct. So use mobile Firefox. I find it to be very good these days, no regrets. I still have Chrome (just in case), but I haven't used it since.
«the entire AMP approach to doing this is questionable»
Why? AMP is, roughly speaking, a subset of HTML that's somewhat easier to cache, and nothing more. Ideally it should be possible and encouraged to serve most webpages from a cache, to optimize Internet traffic on the global scale. It should be okay to fetch them from a cache without breaking anything.

I don't see why the AMP Cache is hated so much. Publishers shouldn't care whether browsers hit their servers or some third-party cache, as long as they can have proper analytics. And guess what? AMP does provide a way to do proper analytics. You can even send analytics data to an in-house URL: https://amp.dev/documentation/components/amp-analytics/#send...

I think most of the hate against AMP is unjustified. Any search engine could decide to cache AMP content.[1] AMP in and of itself doesn't give search engines "more control" over the web (whatever that means); it just makes the web easier to cache for everyone, all search engines, all end-users.
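To make that concrete, here is roughly what first-party analytics looks like in an AMP page. The endpoint (stats.example.com) is a made-up placeholder, but amp-analytics and the ${canonicalUrl} variable are straight from the docs linked above:

    <!-- in <head>: load the amp-analytics extension from the AMP CDN -->
    <script async custom-element="amp-analytics"
      src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>

    <!-- in <body>: ping your own server on each pageview -->
    <amp-analytics>
      <script type="application/json">
      {
        "requests": {
          "pageview": "https://stats.example.com/ping?url=${canonicalUrl}"
        },
        "triggers": {
          "trackPageview": { "on": "visible", "request": "pageview" }
        }
      }
      </script>
    </amp-analytics>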
> What's in a URL? On the web, a lot - URLs and origins represent, to some extent, trust and ownership of content. When you're reading a New York Times article, a quick glimpse at the URL gives you a level of trust that what you're reading represents the voice of the New York Times. Attribution, brand, and ownership are clear.
> the recent launch of AMP in Google Search [has] blurred this line
Google has inserted itself in the URL. Copy and paste that, submit it to reddit or Hacker News, or just read it to a friend, and what do you get? A connection to Google.
But anybody (Bing, Yahoo, etc) can "insert themselves in the URL" if they decide to cache the AMP content. In fact they could also cache non-AMP pages if they wanted. This isn't a problem created by AMP in and of itself.
You can't even make the argument that AMP degrades privacy, because regardless of whether you click an AMP link or a non-AMP link in the search results, in both cases many search engines will ping back or use a redirect through a search engine-controlled domain, so they will be aware of the URL you click anyway, AMP or non-AMP.
Anyone else who inserts themselves in the URL should be fought as well.
I guess you're making a minor technical point, and it's technically correct. Someone else could do AMP better. But until someone does, why not shorten "Google's implementation of AMP" to simply "AMP"? Is there any other?
I agree that there is a UX problem to solve (the address bar should show the original URL, copying it should preserve the original URL, etc) but whether the webpage got loaded from the original site or from some AMP cache is irrelevant.
Why do you care? You like the address bar to show the original domain name? What if this UX problem was solved by the address bar always showing the original URL, regardless of whether the content was loaded from an AMP cache or not?
I care because I want to know what server I'm hitting up. There are many servers that I don't want to be touching, regardless of whether the bits being delivered are correct or not. If the URL bar is lying to me, then I can't detect if I'm talking to a server I don't want to be talking to.
I also want to avoid AMP pages themselves, and the URL is the easiest way to see if I've hit one or not.
"If Google only cares about a faster, more semantic web, then why not just give an even bigger ranking boost to faster, more semantic websites? Where does the need for a new standard come in, other than to gain more control?"
The above is a comment found in the OP.
Is there a requirement that AMP sites host resources with Google?
If there is, then Google has hijacked the purported goal of promoting websites that consume fewer client resources (and are therefore faster) -- arguably a worthy cause -- in order to promote the use of Google's own web servers,[1] thereby increasing Google's data gathering potential.
If there is no such requirement, then is it practical for any website to host an AMP-compliant site, without using Google web servers?
If not, then AMP sure looks a lot like an effort to get websites to host more resources on Google web servers and help generate more data for Google.
1. When I use the term "web servers" in this comment I mean servers that host any resource, e.g., images, scripts, etc., that is linked to from within a web page (and thus automatically accessed by popular graphical web browsers such as Chrome, Safari, Firefox, Edge, etc.)
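To partially answer my own questions above: as far as I can tell, the AMP document itself can be hosted anywhere, but every valid AMP page must load the AMP runtime script from cdn.ampproject.org, which is a Google-operated server. A minimal AMP page looks roughly like this (the canonical URL is a placeholder, and the mandatory boilerplate CSS is abbreviated):

    <!doctype html>
    <html amp>
      <head>
        <meta charset="utf-8">
        <!-- the required runtime, always served from Google's CDN -->
        <script async src="https://cdn.ampproject.org/v0.js"></script>
        <!-- every AMP page must point back at its canonical version -->
        <link rel="canonical" href="https://example.com/article.html">
        <meta name="viewport" content="width=device-width,minimum-scale=1">
        <style amp-boilerplate>/* mandatory boilerplate CSS omitted */</style>
      </head>
      <body>Hello, AMP.</body>
    </html>

So even a fully self-hosted AMP page still phones Google's CDN for the runtime, which seems relevant to the data-gathering question.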
"AMP is an open ecosystem and the AMP Project actively encourages the development of more AMP Caches. To learn about creating AMP Caches, see the AMP Cache Guidelines.
How do I choose an AMP Cache?
As a publisher, you don't choose an AMP Cache, it's actually the platform that links to your content that chooses the AMP Cache (if any) to use."
The above is from amp.dev, formerly ampproject.org
As the dominant search engine/web portal (excuse me, "platform"), already having the largest web cache and the infrastructure to maintain it, it looks like Google therefore becomes the dominant AMP cache as well.
There is also the Cloudflare AMP cache that can be hosted on any domain, so it is easy to implement a link aggregator that gets instant article loading just like Google and Bing. Compare to the situation prior to AMP where if you wanted instant article loading, you would have to convince publishers to integrate directly with you like Apple News or Facebook Instant Articles.
Dominant AMP cache is a meaningless concept. You as the link aggregator have to have your own AMP cache to implement instant loading.
Yes. If you're a search engine, a Reddit, a Twitter, or some other site that presents links to other pages expecting the user to click through to multiple pages, you can safely prerender AMP pages by implementing your own AMP cache but not by using Google's AMP cache.
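To make the mechanics concrete: a cache serves a rewritten copy of the publisher's document from the cache's own origin. Google's cache, for example, uses a URL scheme along these lines (example.com is a placeholder):

    <!-- the publisher's own AMP page -->
    <a href="https://example.com/article.amp.html">article</a>

    <!-- the same document served from Google's AMP Cache
         ("c" = content document, "s" = the origin was HTTPS) -->
    <a href="https://example-com.cdn.ampproject.org/c/s/example.com/article.amp.html">article</a>

Your own cache would do the same thing under your own domain, which is what lets you prerender the page before the user clicks without handing the visit to a third party.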
> I recommend simply not building AMP pages at all, but instead working to build high quality, performant websites which gracefully handle device size changes and lack of javascript.
Doesn't matter. Google will penalise non-AMP sites. Let's not pretend there's a choice if you want people to find your content.
If you believe Google engages in anti-competitive practices with AMP, you have the power to signal these issues, which may result in an investigation.
You can also share your concerns with a simple email to comp-market-information@ec.europa.eu.
> You can report your concerns by e-mail to comp-market-information@ec.europa.eu. Please indicate your name and address, identify the firms and products concerned and describe the practice you have observed. This will help the Commission to detect problems in the market and be the starting point for an investigation.
And they did nothing. I doubt they even studied the case; nothing related to this was ever put before the parliament. Google has continued to abuse its position and will do more if no one stops them. It's important to complain again and again until they step in.
AMP is ruining the mobile web. I cannot stand it. If it was actually made to be fluid, I'd see the value. But it's such a terrible UX, and so janky with the way it "pops in", and it messes up "browser back" abilities. Out of all the shitty things Google has ever done, AMP is #1 to me on that list.
Orrrrrr just give me a damn option to turn it off, if I want. I will never understand why companies force people into these types of major UX decisions on their behalf. Stop assuming every user is stupid. Sure, make it the default, I don't care about that for the everyday user, but for something as fundamental as the browser, I should have an option to turn off every single Google opinion they bake in.
If you use DuckDuckGo you don't have to deal with AMP at all. Giving it a go on mobile is a good way to see how well it works for you too as searches tend to be less mission-critical compared to desktop-based searches.
Yeah, I use DDG and only get crappy amp links on reddit and other forums. Either don't click or manually remove the amp. Google can [insert insult here].
Unfortunately, it is my experience that DuckDuckGo excels on the desktop (lots of facts and technical questions), and falls short on many mobile use cases ("best cafes in Some City," assistance with shopping or goods, maps, etc).
I use it on my phone anyway, but I wind up using `!g` all the time.
(yes, I made this same complaint on the DDG topic just a couple days ago)
Yeah, Google is far better at location-aware searches. To be honest most of those are usually done in Google Maps (sigh.. please somebody make a decent Google Maps alternative!) anyway, so it's not really an issue.
It's not just location-awareness. They also excel at weird fuzzy searches.
random example: My girlfriend lost her laptop in airport security, and I wanted to find a picture of the specific scratch-and-sniff sticker she put on it for the claim form. Duck Duck Go search for "glossier blackberry sticker" didn't find it; Google Images did, first try.
* this turned out to be a good move, I got a positive response from TSA within minutes
One issue is that "a decent Google Maps alternative" would cost literally billions at this point.
This kind of infrastructure is really the government's job, but they have struggled to keep pace...
You can still try Qwant (pretty good in my experience), Bing or Startpage (which uses Google in the back - but I never had any trouble with AMP when using it). It's not like there is no choice. And of course, there should be no reason to support Chrome either, Firefox is a great alternative.
EDIT: interesting, never thought such a comment would get a downvote... Google brigade?
I've found DuckDuckGo to work very well as my default search on mobile (so well, in fact, that I can't remember the last time I had to go to google for a search).
I still haven't brought myself to use it on my laptop, but I do use Bing on Vivaldi (I use Opera, Vivaldi, FF Dev Edition, Opera Dev Edition, and Chrome Canary - the first two for everyday browsing, and the rest for dev-ing. I use google in Opera, and Bing in Vivaldi).
Using search engines other than Google is a nice change of pace, even if not solely to avoid AMP pages.
This is what I did. When the Google News redesign happened, it made Google News substantially less useful to me. Enough so that I came up with a replacement.
It's not for everyone, as it requires running your own webserver, but I use Tiny Tiny RSS to aggregate the feeds of the various sources I'm interested in, then can read the aggregated feeds (I have multiple, a different feed for each general subject) through the web interface and/or by using an RSS reader. I use an RSS reader (gReader) on my mobile devices to do this.
Every browser does this if you copy the address in the address bar, because it's an AMP link. (I believe "sharing" the link will use the canonical, non-Google one.)
Google browsers on google devices will rewrite the URL bar. IMO this is one of the more egregious offenses — the URL bar no longer accurately reflects the website you are browsing.
This article focuses on what it's like for web developers and for the web ecosystem, which are both important issues. But AMP is also really annoying for end users.
As an end user, AMP gets in my way and complicates my experience. There's extra work to figure out what's going on. This page is from whatever site but "delivered by Google". As an end user, my reaction is basically: what the hell does that mean, why is it here wasting my time and cluttering up my screen, and when can Google cut it out?
Then sometimes I go to share a link with a friend over Slack or whatever, I hit the share button, and the URL comes out all fucked up. I know they're going to look at the URL to figure out what it's about (because in the real world, people do look at URLs), so I feel compelled to fix it, so I have to back up out of there, then dig around in the UI to figure out how to get a real URL. Maybe "open in chrome" will do it, or maybe I need to flip through the page itself to find where it gives a link to itself. I can never remember what works, and I don't want to have to.
I know AMP pages are supposed to load faster, and they probably do a little, but I would gladly trade that for simplicity.
Also, I would turn it off if they would give me the option, which wouldn't be hard but they don't, which tells me they don't want people turning it off.
Yeah, but on the plus side, it loads in a second, isn't sluggish to use, and isn't full of annoying fixed elements. Most non-AMP news websites take many seconds to load and are really slow and annoying once loaded.
I get all the arguments against AMP, but "annoying for users" surely isn't one of them.
Most websites I've personally encountered that use AMP don't actually deliver a usable amount of content/features on the AMP version of the page, and so it usually ends up in an awful user experience where I then have to get the real version of the page before I can do anything.
If I understand AMP correctly, there are 3 alternatives to compare:
(1) Google doesn't intervene at all, web sites are full of bloat
(2) Google requires mobile sites to not suck if they want decent rankings
(3) Google requires mobile sites to not suck and also delivers the bits instead of the site's servers.
I agree that #1 is not a good state of affairs. I'm fine with Google pressuring mobile developers to create sites that perform well. I just prefer #2 over #3.
If as a user you don't want AMP, just don't use Chrome and Google Search. But for a website to miss out on the traffic from Google Search is a really steep price. We need everybody to switch to a different search engine.
Websites WERE building horrible, non-mobile news articles in HTML when AMP started at Google in 2015. The news articles were so slow and wasted so much bandwidth that many news orgs wrote bad apps (think CNN app; BBC app) to replace shit with even worse shit. That's what you get when you skimp on frontend engineers!!!
AMP gives little guys, the ones starting blogs and trying to grow, a shot at freedom of speech. The web was developing in a way that the big players like BBC and CNN would dominate with big budget winner-take-all walled gardens. AMP is one of Google's most anti establishment services, which means I'm sure Ruth will be killing it very soon!
This meant Google search on the mobile web was literally dying. Every year more and more content was being locked inside walled gardens!! I was a maintainer of AMPHTML 2015-2018 at Google. The project is hibernating and loses a ton of money; I know because I worked on the budgets for flash memory for AMP. At the time Facebook and others were proposing proprietary non-HTML news document formats. Google, to keep HTML alive, decided to cache AMP for free, which subsidized hosting costs for ALL news websites. I hate it that now I have to switch browsers 2x to write an article comment, too! But news apps NEVER supported this AT ALL!! News apps NEVER supported a working search feature AT ALL!! News apps NEVER supported a good user experience or global search AT ALL!
If you want to rant, blame the bloatware mess that is HTML; it has almost killed the mobile web, not AMP! AMP is Google's attempt to keep HTML alive on phones...
Seriously, is HTML performance a real issue? Mobile traffic keeps growing and growing and growing, according to Google, who now crawls most sites mobile-first! Phones have 4 cores and download 300-MB games daily. There is absolutely no need for this abomination. If it cared, Google could threaten to derank slow sites for slow phones and the average website size would be slashed in half within a week!
> AMP gives little guys, the ones starting blogs and trying to grow, a shot at freedom of speech.
Yes, it was a huge issue and many websites were unusable on anything but an expensive iPhone for a long time. Especially a few years back.
While this might not be a problem with most Apple-toting frontend engineers, most people of the world can't afford to constantly pay for very expensive phones just to browse the web. And until AMP there just wasn't a way to make anyone care it seems. Even here on HN.
Just to be clear: I dislike AMP. But I dislike the crap attitude towards users that web developers have shown time and time again even more.
I must be crazy, because I never had a catastrophic issue with an iPhone 8 - with an adblocker. If ads are the problem, well, guess who is serving those ads.
AMP doesn't even scale anyway - it will bloat like HTML pages bloat over time, because web people have a bad habit of only adding things to sites, never removing them. What happens then? We invent AMP-HTML 2 to fix AMP? AMP is a very ill-thought-out bandaid for a culture problem that can be solved with simple nudges (have people forgotten what seismic changes happen to the web every time Google rolls out a new SEO algorithm?). AMP is probably the silliest tech idea of the decade.
There are so many better ways that Google could solve this issue other than AMP (derank sites for slow devices / mark them as slow / pass a parameter for slow-phone visitors / create a Chrome version for slow devices). AMP is a dictatorial attempt to keep websites forever bound and limited to what Google is offering.
Yes, Google did whatever maximally benefits Google. They're a corporation and behave as such. Just like Apple won't de-DRM their cable protocols just because it's "right".
The question is: what can the web community do to make AMP redundant, outside of complaint posts?
First, AMP is already redundant. It doesn't offer anything that stripped-down HTML can't do. The primary reason sites choose it is because Google ranks the pages higher! It's purely coercive.
Second, it's not as if AMP has taken over the web. But this coercion has to stop. Third, it's real easy to make a faster website with 10 minutes of work. I'm not sure we need some kind of activism to stop AMP; I do believe it will crash on its own as soon as most sites look exactly alike and start losing revenue. But until then... maybe ban AMP links?
If it's so easy then why have so few websites done it? Google has understood what Google/AMP haters refuse to see: web performance is not an engineering problem, it's a product and marketing problem. Coercion is exactly what's needed to push website owners to prioritize performance, because HN's monthly whinefest isn't cutting it. Here's two basic things AMP offers that stripped-down HTML can't do: a world-class CDN that many website owners won't justify investing in, and a clear, marketable incentive to develop a mobile-efficient website that VPs, marketers, product managers, and other business stakeholders can immediately understand.
Because the vast majority of websites are reasonably fast on mobile? Loading times of 1, 2, or 5 seconds are a non-problem that AMP is addressing. The worst offenders I see are too-high-res images and autoplay videos, but frankly I can't remember seeing any of those recently. Most blogs/news sites are fine. Where is Google sourcing their data that users are desperate for web-breaking solutions that bring them 200ms response times? The purpose of AMP is so that people flick through a website instantly and then go back to Google. That's obviously not in the interest of the publishers. The whinefest is because Google is actively prioritizing AMP publishers, thus forcing it on the web.
> Coercion is exactly what's needed to push
this is not a defensible statement
> a world-class CDN that many website owners
Facebook needs a world-class CDN; blogs don't.
> a clear, marketable incentive to develop a mobile-efficient website
the "marketable incentive" is the de-ranking of the site. It's entirely unnecessary to force amp for that, a simple page speed deranking would do
Google knows load performance is a critical user need from the ample data they collect from Google search users; they've talked about this before. I forget the exact number, but every 100ms less load time drives significantly more traffic and engagement. I have no idea what data you're looking at that implies 5s load times are not a problem. I, for one, am overjoyed that Google is tackling this problem and succeeding at it.
Google has applied performance penalties to sites before and it still does. It's not enough, and there are limits to the penalties they can apply because these websites are ultimately very useful and relevant, it would worsen search quality to derank useful but bloated websites. The carousel is a good balance of incentive and penalty.
It's funny. Google is thinking of marking slow-loading sites, yet when I analyze my sites with their own PageSpeed tool, the biggest blocker is Google/DoubleClick ads. I'm probably going to completely remove AdSense (auto ads are terrible), but can't they optimize their own code?
Seriously, nothing is going to kill the mobile web more than Google continuing to overreach and use bait-and-switch tactics on publishers. Oh, sure, AMP is good for the "google-mobile-web experience", but bad for an open web.
If Google had an option for logged-in users to bypass AMP pages, I would not blame Google. They stubbornly refuse to do this, thus it is Google that is ruining mobile browsing for me.
(I would have written an iOS Safari extension that bypasses AMP years ago if Apple supported such a thing…)
> If Google had an option for logged-in users to bypass AMP pages
Not just for logged in users, for all users. Really, if Google provided some way to avoid getting AMP pages (through a cookie or something), I would have no problems with it.
I tried to find the real URL behind an AMP page to bookmark, but couldn't find it. I think they've added a tiny (i) since then, but they're really trying to hide it.
> AMP gives little guys, the ones starting blogs and trying to grow, a shot at freedom of speech.
> AMP is one of Google's most anti establishment services
You're either writing satire I don't get, or you work for Google.
How exactly does a walled garden give you free speech? Especially when it's provided by those who profit the most from you not leaving said garden? While also forcing you to bypass standard practices?
Utter nonsense, unless it's a joke I'm not getting.
> AMP gives little guys, the ones starting blogs and trying to grow, a shot at freedom of speech
How so?
> AMP is one of Google's most anti establishment services
It looks like the exact opposite of that to me. This is Google's attempt at remaking the web in a way the enhances Google's control and power. That's pretty pro-establishment.
What I think he meant is that most news websites became slow, bad user experiences on mobile, pushing users to download walled-garden native mobile news apps by established news corporations to experience something fast and kind of pleasant.
This is a problem for Google and for "freedom of speech", because you're not googling for news anymore; you go straight to your established native news application, which prevents you from seeing other competing results (like blogs or smaller news websites, for instance).
Pushing them to have cleaner and faster websites makes the user stay on the web. It is a clear benefit for Google, but to his point, to the user too. (At least that was the goal)
AMP absolutely does not give the little guy a leg up.
In fact, it’s only the massive news sites that have the developer time to support AMP, meanwhile the little guy has to play around with terrible Wordpress plugins and spend hours fiddling with it just so Google will properly crawl their site.
And don’t even get me started on static site generators. AMP support is shoddy at best and a giant PITA for 99% of static site generators. Wordpress is one of the main reasons the web is so slow, yet AMP gives power to Wordpress since it’s the only way non-technical blog owners can support AMP.
AMP forces small time blogs and content sites to waste time building two versions of their website to rank alongside the big boys. How does this help the little guy?
> AMP gives little guys, the ones starting blogs and trying to grow, a shot at freedom of speech.
As long as the big guys aren't on AMP yet. But an overlooked tradeoff is that the little guys are forced to play by Google's rules in terms of how and where they display ads, even the ones that aren't sourced by Google's ad network. It creates a completely uniform policy that undeniably benefits the scale of Google. A small publisher simply cannot differentiate their ad offerings. If you view that as a good thing for the end user, that's fine, but it's certainly not in favor of the little guy. Little guys depend on differentiation in every area of their business to effectively carve out a niche against a giant like Google's ad network.
I think you are dead wrong. 2015 didn't mark some ah-ha moment when AMP came along and we were finally able to use the web on mobile. Most of the websites that did and still do have problems are auto-playing video news sites or sites with way more ads than necessary.
AMP is just a step above the top results boxes Google puts on the results page that are scraped from other websites. See the other front page article about Google repeatedly stealing Genius lyrics.
Yes! You're absolutely right that page size, coupled with something like time-to-render metrics, could do that!
Of course, there might be a wrinkle or two. How do you propose to evaluate the size of a page when large amounts of something like a newspaper article are loaded by reference, are dynamic, and depend on third parties making independent run-time decisions? How can you know a page's size won't vary 50% minute-to-minute in a world like ours? And how can you meaningfully measure load time in such a context?
You're absolutely right. Page size and speed could absolutely be better ways to do this! It's just maybe possible that there could be some minor obstacles to doing so.
> How do you propose to evaluate the size of a page when large amounts of something like a newspaper article are loaded by reference, are dynamic, and depend on third parties making independent run-time decisions?
You downrank them immediately because that's slow.
Of course, there might be an issue here because the amount of things that work that way is huge. So now you have a scenario where everyone is angry at Google for trying to dictate how they can build web pages and writing angry digital polemics about how this is an unreasonable standard and abuse of power. Nobody actually wants to re-implement massive chunks of how their website works, so everyone will resent this incredibly artificial imposition.
Which is to say it's a wonderfully straightforward answer, but perhaps not better than AMP in practice.
> everyone is angry at Google for trying to dictate how they can build web pages
But we're already doing that because Google downranks results that don't use AMP. We're generally OK with Google downranking sites on actual metrics (such as HTTPS) but not when they're pushing their "solution" that clearly has a number of issues with conflict of interest.
Are you saying you'd be completely fine with the above scenario, where Google downranks each webpage based on the number of external assets it loads and the amount of dynamic content it has? Instead of using AMP?
Personally, I prefer AMP for security reasons. It's tightly restrictive and does a lot to limit the available space to mount attacks aimed at browsers. But I understand that's far down most people's lists, and tends to fall under the same sentiment as "devs should just write fast websites".
I suspect you may be an outlier, as most people seem to deeply resent the strong incentives to change how they author web pages. Shaping them slightly differently strikes me as unlikely to generate a dramatically different reaction.
You're absolutely correct. Please accept my apologies for being unclear. I was speaking specifically and narrowly of strong incentives to build web pages differently being shaped slightly differently under a hypothetical regime.
Again, please accept my apologies for my failure to communicate my point clearly.
This is a really old article, but as long as we're here: just a quick reminder that the AMP standard still includes platform-specific components that favor individual companies[0] over smaller creators. It's still not clear what will happen to the components when those services disappear[1], and it's still not clear whether Google has the guts to tell someone like Facebook that a new component feature isn't performant enough to be included.
Quick reminder that the only way to do captchas in AMP is to use Google ReCaptcha.
There are a lot of reasons to hate AMP, but one big reason I hope doesn't get drowned out is that it's not just anticompetitive in the sense of handing control of traffic or hosting to Google. It's anticompetitive in the sense of reducing functionality on the web to a handful of large corporations that have every incentive to reduce diversity and place harsher performance restrictions on competitors than they place on themselves.
> "Quick reminder that the only way to do captchas in AMP is to use Google ReCaptcha."
That is terrible. ReCaptcha is the worst. Also, ReCaptcha seems to discriminate against Firefox, and if AMP discriminates against other captchas, this might actually count as monopolistic abuse by EU rules.
I love AMP sites that do it the right way, like Politico. Keeps the real domain, loads fast, clean interface. I wish more sites were like this. I think the first version of AMP where the URL was always "google.com/amp/politico/sdgffsdf" was awful but you can now keep the correct domain and I sometimes prefer it to the regular version of a lot of sites.
It's nicer than the original AMP setup, but still awful for publishers.
For any user that navigates to your AMP page from a Google search...
The publisher gives up the most important piece of screen real estate, and Google hijacks left/right swipes to navigate to your competitors. And they hijack the back button post-swipe too... back equals "back to Google", not back to the page I swiped from.
It is pretty much like early AOL. A semi-walled garden. It offers some speed benefit for users, but way more benefit to Google.
The page you linked takes 8s to display on my browser, even on subsequent reloads, just because I don't allow third-party scripts. It also has no displayed images, for the same reason. I really don't wish more sites were like this.
Correction: It lets anyone cache your page, not just Google. And no "masquerading"; that's what the crypto is designed to prevent. Also it's not specific to AMP; you can use signed exchanges with any data served over HTTPS.
It's effectively just Google since it's not widely supported by browsers other than Chrome. There's also only one CA provider that can create the right certificate for SXG.
Or maybe you have some notable examples of SXG being used in a production non-AMP scenario?
The standard is brand new, and AMP was the motivating factor for its creation, so obviously the majority of existing use cases are AMP-related. That doesn't mean you couldn't go and implement a non-AMP use case in your own production site today.
One interesting use case for SXG is to allow decentralised and offline websites, since the site's data can be tied to a key/certificate/domain without having to be downloaded from a specific server. As an example, the IPFS project is already trialling the technology:
Tying it to a domain name (which is the typical use of the URL) breaks the web, though. I could understand if the key were used to show that the origin is a Twitter account handle or something, but breaking the semantics of the domain by signing the content doesn't make any functional sense. Other than putting lipstick on a pig (AMP), of course.
> oh wow, Signed Exchanges are worse than AMP!
> "make sure you are visiting mybanksite.com" is no longer safe.
Sounds like you don't trust public key based content signing. This is just broadening public key based signatures beyond the domain to include the domain and the content itself, and using signing to make the authenticity of the content independent of the physical infrastructure that served it.
That's what's being used here to verify the authenticity of the content's source, just like PGP/GPG does for signed emails.
That's a far stronger guarantee than "the data is authentic because it came from IP address range X purchased by company Y".
In fact, without such a signature, there is no guarantee that a piece of content is authentic just because it came from a particular server/datacenter.
With signed exchanges, the chain of authenticity is pushed all the way back to the website's content creators - it doesn't stop at the web server. Also, this can't be phished unless you break the content-signing algorithms, and if that happens... we all have bigger problems.
First, it breaks the URL specification, as the "host" is no longer a host. It breaks users' expectation of one of the VERY FEW things that everyday users understand about the internet.
One may manage to upload an HTML file to the bank's server and serve a -signed- page that Google AMP will cache, and then use it to phish customers from within the bank's domain. Or just use a stolen key to make thousands of such pages before the bank finds out. I think, contrary to what you say, it's a brand new, major attack surface.
> First, it breaks the URL specification, as the "host" is no longer a host.
By this definition, "host" hasn't been a host in a long time, since the time it was possible to route DNS traffic to multiple IP addresses, possibly in different datacenters.
> It breaks users' expectation of one of the VERY FEW things that everyday users understand about the internet.
How is signing content directly less authentic than signing only at the web server? Signing content directly at the time of publishing ensures that it was created using the private keys of the entity in question, regardless of the delivery mechanism for the content.
> One may manage to upload an HTML file to the bank's server and serve a -signed- page that Google AMP will cache,
Signed content exchanges specifically limit that by putting the content signing step at the content creator level, not the web server level. Unless you steal the content creator's private keys, you can't represent your content as theirs.
> wouldn't the server sign all http responses by default? all you would need to do is upload a file
No, the content has to be signed when it is created, in the content management system or similar content creation tool, not when the server sends it. The content management system itself must have strong controls on it (ACLs, controlled user accounts, protected private keys stored only on encrypted and access controlled media, regular audits, etc).
Basically the server itself is no longer trusted as the arbiter of content authenticity, the actual content creator is. Concretely, when the editor at a publication approves an article after reviewing it, it is signed for delivery at the moment of publication, not at the moment that the request is served.
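A minimal sketch of that model in TypeScript, using Node's built-in crypto (this illustrates publish-time signing in general, not the actual SXG wire format):

    import { generateKeyPairSync, sign, verify } from "crypto";

    // Key pair held by the publisher's CMS - never by the web server or CDN.
    const { publicKey, privateKey } = generateKeyPairSync("ed25519");

    // At publish time, when the editor approves the article:
    const article = Buffer.from("<html>...approved article...</html>");
    const signature = sign(null, article, privateKey);

    // At the client, after fetching the bytes from *any* server or cache:
    const authentic = verify(null, article, publicKey, signature);
    console.log(authentic); // true, no matter how the bytes got here

Flip a single byte of the article in transit and the verify step returns false, which is the whole point: authenticity travels with the content, not with the server.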
So that means I can sign a page on the editor's computer, take it with me, and serve it to AMP from my website? That sounds even more dangerous, tbh. It delegates security from people who may know a little bit about it (web hosts) to people who likely know nothing about it (writers).
What happens if someone's key is stolen and they need to re-issue it? Are all the previously published copies now invalid?
> First, it breaks the URL specification, as the "host" is no longer a host.
Really, how so? RFC 3986 goes out of its way to make clear that the "host" component doesn't mean DNS, and doesn't even have to denote a host.
"In other cases, the data within the host component identifies a registered name that has nothing to do with an Internet host."
"A URI resolution implementation might use DNS, host tables, yellow pages, NetInfo, WINS, or any other system for lookup of registered names."
> It breaks users' expectation of one of the FEW things that everyday users understand about the internet.
What, exactly and concretely, is that expectation?
> One may manage to upload an HTML file to the bank's server and serve a -signed- page that Google AMP will cache, and then use it to phish customers from within the bank's domain.
If the attacker can upload arbitrary pages to the bank's website, just why would they need signed exchanges? They've already got their phishing page on the correct domain.
The RFC uses the word "host" and not "signer". It also says that the "host" is intended to be looked up in some service registry, and there is no such thing for arbitrary signers.
> exactly and concretely, is that expectation
One piece of common security advice banks used to give is "check your browser address that you are in our server"
> just why would they need signed exchanges
With signed exchanges they can fool AMP into caching the page long after it has been deleted from the server.
The RFC explicitly says that "host" doesn't necessarily mean an actual host and you still insist the opposite. So I don't really know what to say.
> One piece of common security advice banks used to give is "check your browser address that you are in our server"
So you say that everyday users have an expectation that they're "in the bank's server"? That doesn't seem very concrete, since that could mean anything. Surely there is some kind of expectation they have about actual behavior or properties - something that will happen / can't happen right now, but where the opposite holds with signed exchanges.
> Anyone who has the file can intercept the form data from that page now - a complete phishing attack.
Uhh... And just how would they do that? They can't inject anything into the page, and they can't modify the page. How do you figure they force the browser to submit the form to the wrong server?
Assuming someone finds a way to sign a malicious HTML page (e.g. by sneaking into the editor's office), they can serve it from anywhere, and the browser will pretend it's coming from the bank.
> One piece of common security advice banks used to give is "check your browser address that you are in our server"
"In our server" is a simplification of the technical explanation: "signed by our computers using our private keys before delivery to you". That is still maintained in the case of signed content exchanges, except that the transport function is provided by a different server.
It's not much different from, e.g., signing a compiled app with your private keys before uploading it to an app store. Such apps also use hosts to identify themselves and their content, even though they are delivered via app-store mechanisms.
> signed by our computers using our private keys before delivery to you
Please try to explain that to an everyday grandma.
I still don't see how it's an improvement. The file can be masqueraded by an arbitrary server god knows where and still be served as valid to me. Anyone who has the file can intercept the form data from that page now - a complete phishing attack. There are so many things that can go horribly wrong, it just makes one wonder what's wrong with Googlers these days: https://blog.intelx.io/2019/04/15/a-new-type-of-http-client-...
> One may manage to upload an HTML file to the bank's server and serve a -signed- page that Google AMP will cache
Only if you have the bank's private key, and the ability to serve arbitrary content from the bank's domain. In which case... yeah, I don't see how the signed exchanges standard makes that problem significantly worse.
I don't know what the max expiration for AMP's cache is, but I could set a really long expiration date on the file and remove it from the server without the bank ever knowing it existed. SXG doesn't even require an upload - one disgruntled employee could do the same with a stolen key.
Nobody benefits from this shit more than Google. Do we really need more attack surfaces?
I hadn't realized the content was actually signed; I assumed we were simply trusting Google to send us the content they said they were sending (much like we do when using the Google cache).
I'm curious now: would it be possible to use the content/markup intended for use by the AMP cache to view a static/unscripted/readable version of the page's main content? If so, why hasn't anyone built a browser extension to do so?
On a broader note, this also sounds like it could be used to allow caching proxies to work with https; you'd lose the privacy, but you'd gain from being able to cache content on local network if the browser only had to verify the content, and you trusted the cache not to spy on you.
> I'm curious now: would it be possible to use the content/markup intended for use by the AMP cache to view a static/unscripted/readable version of the page's main content? If so, why hasn't anyone built a browser extension to do so?
If the goal is to get around the AMP CDN, you don't even need to read the main page content. The AMP URL contains the original source URL itself [1].
The extension you are describing would just need to capture all requests with the prefix https://www.google.com/amp (or whatever CDN you are trying to get around), parse out the original URL, and then fetch it, and do what you will with it.
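(Roughly, the un-wrapping step could look like the sketch below. It assumes the common Google AMP viewer URL shape, where "s/" marks an https original - treat the exact prefix handling as an assumption rather than a spec quote.)

    // Recover the original URL from a Google AMP cache/viewer URL, e.g.
    //   https://www.google.com/amp/s/example.com/news/story
    //   -> https://example.com/news/story
    function unwrapAmpUrl(ampUrl: string): string | null {
      const prefix = "https://www.google.com/amp/";
      if (!ampUrl.startsWith(prefix)) return null;
      let rest = ampUrl.slice(prefix.length);
      const https = rest.startsWith("s/");
      if (https) rest = rest.slice(2);
      return `${https ? "https" : "http"}://${rest}`;
    }

    // unwrapAmpUrl("https://www.google.com/amp/s/example.com/story")
    // -> "https://example.com/story"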
If the goal is to disable scripting on the AMP CDN delivered content, first note that AMP pages can't contain page-author-written JS [2], and any implicit JS has to run async.
But if that's insufficient, you can disable JS in the browser altogether, which would disable it in the loaded AMP content.
You could also try to parse out the main content from the AMP page in your extension, if you know from the URL that it's an AMP page. Because AMP forces relative terseness and simplicity of HTML content, it is probably easier to parse than the original page's content. Obviously that won't generalize easily given the large variety of possible content representations, but you stand a better chance of achieving this with AMP content than with the original content.
And if you generalize it enough, you will end up with one component of a web crawl / indexing system in an extension ;)
I’m not sure you understand the purpose of https. Ensuring integrity of the document served by the server is only one small piece of it.
The other critical components are:
- encryption so middleboxes can’t see what you’re looking at
- guarantee (via the PKI) that the server you’re about to send your banking credentials to is using a cert that belongs to the domain name in the address bar that you trust sending your credentials to.
> encryption so middleboxes can’t see what you’re looking at
> guarantee (via the PKI) that the server you’re about to send your banking credentials to is using a cert that belongs to the domain name in the address bar that you trust sending your credentials to.
The purpose of SXG is to allow publisher signing of edge-cache accelerated public content - i.e. it's read-only - not to encrypt private information like credentials in transport. HTTPS still handles encrypted transport independently of SXG.
Also, why or how would someone create a system that accepted private info or credentials via signed SXG anyway? There's literally no mechanism in it to achieve that. If you tried to build a password entry field for your bank website and distributed it via SXG, it wouldn't even work in the first place.
No, you can distribute whatever content you want. But the content distribution network can't listen for posts from those forms when the content is rendered.
SXG doesn't answer DNS requests for your domain. It only says that a particular piece of content has been signed using private keys that have been registered with the displayed host. That's it.
In fact, you don't even need a CDN or DNS to distribute SXG content. You could distribute it via USB drives, or code flags, or USB drives attached to messenger pigeons, whatever. The point is that the authenticity of the origin of the content is completely independent of how the content got to you.
When that SXG content, however it is distributed, is rendered, the browser represents that content as originating from your domain, which is in fact exactly where it originated.
There are 100 ways to steal credentials if you manage to convince the user that it’s safe to start typing in the page, since you can serve malicious js that way.
I really don’t understand why the browser would masquerade the URL just because the content is signed. At best it is able to say ‘the content is signed with x’s key’.
> There are 100 ways to steal credentials if you manage to convince the user that it’s safe to start typing in the page, since you can serve malicious js that way.
That's true, but it's completely independent of SXG. There's no way to trick SXG into showing a URL that it's not signed for. You would have to steal the private keys.
> At best it is able to say ‘the content is signed with x’s key’
Remember that x's key is cryptographically associated with their domain - that's how web certs work - so the browser can also say that "this content is signed with domain x's key". That's exactly what happens with https today, but with https, the chain of attribution implied by the signature stops at the webserver, since it holds the private keys for signing the content.
SXG allows the chain of attribution to be completely independent of the transport mechanism, https or otherwise. Of course, you should still use https to encrypt during data transmission over the internet, but that's orthogonal to content signing.
This is also directly analogous to how app stores distribute cryptographically signed apps. For example, it allows an iPhone to open a local native iOS app in response to a URL click in web content [1]: The app and the URL are both cryptographically signed by the same entity, so iOS can conclude that they are from the same origin, and allow the app to handle the URL.
I agree, but I just can't justify the connection between the domain and signed content. The root node here is "X's key" and it is used to sign a domain cert and also a document. It's semantically wrong for the browser to pretend that the document belongs to the domain, and even more wrong when the signed document is being served by another domain with a completely different cert - Google's!
Even app stores don't do that - if you download a signed app from any domain, it won't pretend it's downloaded from apple.com, but it will report that it's signed by Apple Inc. The situation is not analogous anyway because there are very few app stores from 3-4 highly trusted corporates. If any of their app store private keys are stolen, the internet is fucked.
> The root node here is "X's key" and it is used to sign a domain cert and also a document. It's semantically wrong for the browser to pretend that the document belongs to the domain
Browsers "pretend" exactly this every time they download a page via HTTPS. It's how HTTPS works. Did you think that they trust that the content comes from the correct source by just doing a reverse DNS lookup on the IP address? They don't. Instead, they check a signature from the web server against their cert keystore, and if the PKI signature check fails, you get a big scary warning that the connection isn't secure/private. The same thing would happen with SXG based content if the signature didn't match the keystore, except the signature to be checked is carried with the content itself, just like with PGP/GPG.
> Even app stores don't do that - if you download a signed app from any domain, it won't pretend it's downloaded from apple.com, but it will report that it's signed by Apple Inc.
I just checked an iPhone, and they appear to attribute an app to the creator, not Apple, Inc.
But the reason they don't show a download domain is because consumer iOS apps can only be downloaded from Apple, from the App Store, and nowhere else. Adding download source information to the iOS UI would be totally redundant, as the value would always be 'downloads.apple.com' or whatever.
If you look at the actual cert-signing procedure for iOS apps, the configuration step includes the domain, which is why Apple can associate an entity's apps with its https websites. Nonetheless, the apps are still signed by the app's creator, not Apple, and the app's creator is responsible for securing the private keys [1]
> The situation is not analogous anyway because there are very few app stores from 3-4 highly trusted corporates.
Why should the 3-4 big corporates be the only entities who can sign or distribute apps or static web content? They are not the only entities capable of securing private keys. Banks do it all the time, as do individual app developers (note the warnings to app developers about private key management on Apple's website). They are also not the only entities capable of distributing content. App and content stores can provide many other services of added value, like aggregation and curation and payment systems, but signing and distributing content isn't one of the services they can uniquely provide.
You could even argue that distributing the ability to sign and distribute content away from the big corporations reduces single points of failure and makes the whole content distribution ecosystem more robust and fault tolerant.
Well, thanks for your reply. I still think SXG breaks semantics.
> Browsers "pretend" exactly this every time they download a page via HTTPS.
Yeah, and the big scary warnings are for the connection, not the content. Currently browsers tie the URL host to DNS, so the semantics are different: the cert certifies the distributor. I also think this is only true for certs that don't have an organization name; at least I think that for extended-validation SSL they still show this: https://upload.wikimedia.org/wikipedia/commons/6/63/Firefox_...
> and they appear to attribute an app to the creator, not Apple, Inc.
Indeed, I meant that they attribute the app to Apple Inc as the creator, but not to their domain, which is, again, different semantics. (Although I suppose Apple is somehow involved in ensuring that the correct binary is distributed for every developer.)
> Why should the 3-4 big corporates
I'm obviously not saying they should, but that it's not an analogous situation, with their walled gardens and all. The web is nobody's walled garden, and a large part of the content is public domain which doesn't need any signing. That's why app store logic doesn't apply.
> reduces single points of failure
That's what software hosts already do by providing hashes for binaries. And it's great that SXG can verify content through the browser. But it shows where the content was created, not where it was distributed; that's why I think it's wrong to change the URL.
I don't think there's a real phishing risk with them, but I object to Signed Exchanges because they are actively making the browser lie to me about the URL being used.
The URL the browser shows is the one which was cryptographically verified to be correct. I don't see how you can call that a "lie".
If I'm offline and I open an offline cached page in my browser, would you call it a lie if the browser displays the URL I originally downloaded that page from in the URL bar instead of saying it came from "your hard drive"?
It's not just us HN commenters that are concerned. Mozilla, for example, is highly opposed to it in its current state.
"Mozilla has concerns about the shift in the web security model required for handling web-packaged information. Specifically, the ability for an origin to act on behalf of another without a client ever contacting the authoritative server is worrisome, as is the removal of a guarantee of confidentiality from the web security model (the host serving the web package has access to plain text). We recognise that the use cases satisfied by web packaging are useful, and would be likely to support an approach that enabled such use cases so long as the foregoing concerns could be addressed."
> We recognise that the use cases satisfied by web packaging are useful, and would be likely to support an approach that enabled such use cases[...]
That doesn't sound "highly opposed" to me.
Anyway, I read the full report from Mozilla back when they first published it, and while they do have some valid concerns (any new feature introduced to the web will necessarily introduce some new attack surfaces) I believe their concerns are already sufficiently well addressed by the standard.
The paragraph from Mozilla that you quoted is also rather vague and misleading. In particular:
> the ability for an origin to act on behalf of another without a client ever contacting the authoritative server is worrisome
This is super vague. I see no reason why that should be "worrisome". That sort of thing happens all the time in public key cryptography. When you receive a message signed with the private key of a trusted actor, it's perfectly reasonable to trust that the trusted actor authorized that message regardless of where the message itself came from. TLS itself already does exactly that every time you visit a website over HTTPS (your browser trusts certificates signed by a trusted CA, even though those certificates are being presented by an untrusted website, not the CA itself).
> as is the removal of a guarantee of confidentiality from the web security model
This concern is completely unfounded, and I'm surprised Mozilla included it in their summary. The use of the signed exchange standard doesn't reveal any information to any party that would not already have access to that information without the standard (a host serving you a link to a static, public page will necessarily already have access to the plaintext content of that page, regardless of whether they serve you that content themselves or not).
Yes, I know. Again, I read the full report. I don't think "Harmful" is an accurate summary of their position either. (At least in a layman's sense of the term; it may very well be the correct category from the perspective of Mozilla's formal standards position process.)
The more detailed summary in the full report says:
> There is a lot to consider with web packaging. Many of the technical concerns are relatively minor. There are security problems, but most are well managed. There are operational concerns, but those can be overcome. It’s a complex addition to the platform, but we can justify complication in exchange for significant benefits.
> [...]
> Big changes need strong justification and support. This particular change is bigger than most and presents a number of challenges. The increased exposure to security problems and the unknown effects of this on power dynamics is significant enough that we have to regard this as harmful[1] until more information is available.
> We’re actively working to understand this technology better. The Internet Architecture Board are organizing a workshop that aims to gather information about the bigger questions. That workshop is specifically structured to collect input from the publishing community. The technical details of the proposal will also be discussed at upcoming IETF meetings. Based on what we learn through these processes and our own investigation, we might be able to revise this position.
That doesn't sound "harmful" to me, it just sounds like they're skeptical, and possibly a bit confused. The meat of their concerns also seem to be primarily political, not technical.
It's "harmful" in its current form, and Google hasn't yet committed to addressing all of Mozilla's concerns. Mozilla could have chosen a different label than "harmful". They did not. They didn't change it either.
Last I understood, Apple had similar concerns. I find it unlikely that both of those orgs are making noise for no good reason.
There are only 6 labels to choose from. They actually couldn't have picked a different label without making up a new one, or without making their choice of label even more misleading than it already is.
Let's try a different approach. How about this: I've carefully read over both the spec itself and everything Apple and Mozilla have to say on the matter (that I was able to find, anyway), and have come to an informed conclusion: both Apple and Mozilla are wrong. (That's actually a rather poor, oversimplified summary of my position. But no more so than "harmful" is a poor, oversimplified summary of Mozilla's position.)
You are making an argument from authority. I consider myself sufficiently well informed on this particular topic to be making arguments based on facts and reason. I don't find you repeatedly citing a one-word summary of Mozilla's position on the matter (which is actually quite nuanced, and not at all able to be summed up by a single word) to be particularly convincing.
Let's try this...It isn't me anyone needs to convince. An appeal to authority is appropriate when said authorities control the browsers needed for the proposal to succeed.
One of those 6 labels is "non-harmful". If it isn't harmful, that seems right. Here's the legend:
"Mozilla does not see this specification as harmful, but is not convinced that it is a good approach or worth working on."
Mozilla didn't choose that label.
My view is that the proposal was driven by a desire to make AMP less icky. It looks like it could have broader benefit if the concerns Mozilla outlined are addressed. I am skeptical Google will do that.
As for your characterization of yourself as "well informed" and me as, er, something else...really? Was that necessary?
It's a lie because the URL being displayed does not reflect the source of the bits.
> If I'm offline and I open an offline cached page in my browser, would you call it a lie if the browser displays the URL I originally downloaded that page from
That's a bit of a gray area. Yes, it is a lie (the browser should provide an indication of the actual source of the bits). On the other hand, the cache was created by you and exists on your own machine, so it's more of a little white lie in that case.
I had been told (and I have 0 special knowledge here, this is just what a consultant in this space explained to me a few years ago) that AMP boosted your placement specifically because latency was a scored and important factor.
As such, all you needed to do to get similar rankings was use any sort of CDN hosting for your page and you would get similar results to using AMP.
Also, it sorta seems to me like the author is complaining, "I can't just do a minimum effort AMP page for the search juice, I actually have to make a functional AMP offering or not use AMP at all." Strictly as a consumer, I feel like maybe Google is doing me a favor while telling off a publisher.
A CDN helps with the page load/latency variable of Google's PageRank but won't equal AMP.
To get AMP-like speed you'd need: a CDN, no render-blocking JavaScript, minimized image files, "lazy-loaded" assets, and inlined CSS for "above the fold" content. On the server side you want to cache content with something like Varnish and send it over an "edge" network like Akamai or Fastly. Ideally everything is served over HTTP/2 or SPDY.
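(For the "lazy-loaded" part, a minimal sketch - it assumes images are marked up with a data-src placeholder attribute, a common pattern but nothing AMP-specific:)

    // Swap in the real image URL only when the placeholder nears the viewport.
    const observer = new IntersectionObserver((entries, obs) => {
      for (const entry of entries) {
        if (!entry.isIntersecting) continue;
        const img = entry.target as HTMLImageElement;
        img.src = img.dataset.src ?? "";
        obs.unobserve(img); // each image only needs loading once
      }
    }, { rootMargin: "200px" }); // start loading a little before it's visible

    document.querySelectorAll<HTMLImageElement>("img[data-src]")
      .forEach((img) => observer.observe(img));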
That's sort of my point: publishers seem mad that they can't festoon their web pages with tons of stuff and still be put in a category that is meant to be fast.
There are other parts of AMP I am less okay with, but tbh I trust the publishers even less than I trust Google with respect to sketchy tracking bugs and data collection and useless javascript ux.
Not true. Google favors AMP content in “articles for you” on Chrome mobile, as well as featured story carousels on search and inside Google Discover. All of these areas can be a massive firehose of traffic.
If you make a living from a content site, you have to play ball and create AMP versions of all your pages.
OR, you can choose to lose to your competitors. Let’s stop pretending like that’s really a choice, or that any sizable share of users will ever switch to DuckDuckGo.
This is where we need government to step in and regulate Google’s de facto monopoly on search.
Has someone demonstrated that if you make a static page with similar characteristics to AMP on a different CDN, it doesn't get similar placement?
I'd be more inclined to side with publishers here if AMP was the only way to get this. But as an awful lot of content sites are run by folks very mad that they can't run their own invasive tracking and analytics, my sympathy is limited.
The first part of your post is incorrect: AMP is instant because it is prerendered, which just using a CDN can't achieve.
The second part is correct. I hate Reddit AMP results, and I'm happy that Google is telling them to fix it. I'll be even happier when they and other search engines demote Reddit AMP pages that do not match the canonical pages.
It says Google is validating that AMP pages match the content of the canonical pages and warning webmasters when they don't. I deduced that Reddit would get these emails because their AMP pages are the biggest offender that I regularly see.
It is unsafe to deanonymize a user to a website that the user did not even click on. That's why the page is served from the link aggregator's AMP cache.
Have you ever visited a news site without AMP enabled? It's literally impossible to use. Popups flying all over the place, unwanted advertising videos loading, megabytes of tracking JS loaded etc. AMP is forcing publishers to make sites people can actually use.
Yes. A news site, any news site, around 2014 would just smash my phone, making it completely unusable.
The page load would just be completely unpredictable. You'd start reading something and then it would fly down two pages and then up half a page, because some asset container would load but then change its idea of how much space it needed. Then you'd touch to scroll to continue reading and it would register as a tap and open up the ad.
Then the JS hell would make the phone unusable; it felt like the people who wrote late-90s Windows malware went into the business of making local news websites. It was complete and utter trash.
I use duckduckgo with Firefox so I never see any AMP website and it works fine. Firefox allows you to have an ad-blocker though so it makes a difference.
It's all about choice, but if you want to use a Google search engine with a Google browser then indeed you probably have to browse the web the Google way.
As with the former Readability, or Pocket, this offers a simplified page view.
On Firefox or Safari, use Reader Mode.
(Ironically, Chrome has a Reader mode which is 1) automatically enabled, 2) not disableable, and 3) uniformly styled horribly.)
I believe there are now browsers and/or extensions which enable Reader View by default, or at least on specified domains/sites.
I've got a modicum of sympathy for Google here. Yes, the Web is a problem, and HTML's fast-and-loose attitude "be generous in what you accept, conservative in what you emit" has turned out to be a long-term liability.
Allowing Google, and Google alone, to play both sides of the deal in specifying and benefitting from the standards is a flagrantly glaring conflict of interest (and very likely an antitrust violation). But the underlying stated concerns are real. And absent some entity with the ability to tell website publishers "no, your cavalier so-called HTML Does Not Play Here", the descent into further levels of markup and Javascript hell will continue.
I've found the most useful process for re-rendering sane HTML from most websites is to dump to plain text first (w3m or lynx, if they'll handle the site, copy/paste if not), and then re-add whatever minimal markup is actually required (usually via Markdown), then generate clean HTML.
Actual content payload is often only a few single-digit percent of the page's markup, and sometimes far less. And that's excluding additional asset loads (CSS, JS), let alone image and media files.
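(If you want to script the mechanical half of that process, a rough sketch - it assumes lynx is installed and uses the "marked" npm package for the Markdown-to-HTML step; the hand-editing in the middle is where the real judgment happens:)

    import { execFileSync } from "child_process";
    import { writeFileSync } from "fs";
    import { marked } from "marked";

    const url = process.argv[2];
    // 1. Dump the page to plain text; lynx strips all markup, CSS, and JS.
    const text = execFileSync("lynx", ["-dump", "-nolist", url], {
      encoding: "utf8",
    });
    // 2. In practice you'd hand-edit here, re-adding minimal Markdown
    //    headings and emphasis. This sketch treats the dump as Markdown as-is.
    const html = marked.parse(text) as string;
    writeFileSync("clean.html", html);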
Again: until there's a cost to publishers for pulling this crap, we're going to see more of it.
I believe you can disable Chrome's reader mode in flags. I know you can force it to show up on all pages, as that's how I have it set up on my tablet (I use Firefox and Brave primarily, with Chrome or Bromite as a fallback)
You can't load a page, though, and say "hey, please render this in Reader Mode".
And you cannot (at least on Mobile, I've nuked Chrome/Desktop from orbit, it's the only way to be sure) configure the Reader's styling at all.
I basically hate the font choice, size, margins, and about everything else.
For Firefox, I apply a custom stylesheet to _its_ reader mode, but at least I can do that, and it's only once for all the sites encountered through it.
Also, you might be interested in brow.sh, as it should provide easily readable, clean HTML that only contains visible elements. It should be better than dumping pages from links, as it supports scripting.
Sure, I visit non-AMP sites all the time, and here's what happens:
Reader mode comes up and shuts everything else down. I have it on by default, it's great. It blows AMP the hell out of the water and it's the only thing that makes the "unwashed" web (ie. anything that matches common search terms) tolerable.
Can you name such a site that is impossible to use? I'm quite sure it sees so few visitors that it doesn't care about disappointing them. And if they're bloated as hell, why would they care about AMP anyway?
Because their traditional embrace, extend, extinguish strategy no longer worked. Windows Phone and Internet Explorer failed and they were losing so much mindshare switching from one failed API to the next. The monopoly was becoming pretty fragile.
I don't know why people get so bent out of shape over what Google does. They'll just cancel it in two years. ;)
In all seriousness, though, I've been on the "screw Google" train for a while. Everything good they do they throw away (Google Code, Google Reader, Google Plus, Google Inbox, Google Wave... pours one out for OT while I work on my CRDTs), while anything they make which originally held merit they just make more awful, year after year. I don't know if they're aware just how unsustainable a business model screwing over their users might be, but I'm guessing they're willing to find out.
I have a page that I put on the web about 13 years ago, that somehow became the top hit for "how to <X>" for a particular <X>. I have never done anything to promote the page, and do not have any ads on it.
A few years ago, it lost the #1 spot, but is still on the first page.
(I'm specifically not saying what <X> is, or what domain the page is on, because I do not want to do anything that might get people to go there. I want to see how long a simple page with no promotion and no ads can stay on the front page of search).
I just noticed that Google says in the search results "Your page is not mobile-friendly". Would that be because I don't have an AMP version of the page?
Clicking on that notice goes to a Google tool that analyzes the page for mobile friendliness...which tells me that the page IS mobile friendly, so I'm a bit confused.
The page is very simple. Just some paragraphs of text, some h1 and h2 headings, a few tables, and a couple of lists. The only CSS is on the page itself via a <style> tag, and only has one rule, setting the width of th to a certain number of px.
You can definitely have pages that Google considers "mobile-friendly" without using AMP. Do you have a viewport meta tag on the page (e.g. <meta name="viewport" content="width=device-width, initial-scale=1">)? If you have any CSS at all, some mobile browsers will default to a desktop view that requires zooming to read, and that's probably what Google is detecting.
Regarding the analyzer tool, I suspect this is a case of Google's left hand not knowing what the right hand is doing. As I recall, Google is in the process of switching from a static analyzer to a Chrome Lighthouse-based analyzer (or something to that effect) and I'd guess that the mobile-friendly rules are slightly different between the two.
I love these posts - this has been talked to death.
The last thread had some good illustrations comparing media sites' non-AMP pages (the bloat from ads / javascript / etc. was INCREDIBLE) to AMP pages.
Google puts a little icon next to amp pages at least some of the time. These pages usually load VERY quickly in my experience - somehow whatever AMP/Google is doing results in less bloated pages on these AMP pages.
I wouldn't be surprised if users start naturally gravitating to these pages for the better experience. I know I have sometimes just because I know the page is not going to trap me on their site if they are AMP. I can usually get back to search results with AMP, where other sites do a weird thing where they pop up a registration page in front, then even if you fight through that you have to fight through some registration redirects to get back.
I wish Google would push down news sites I don't have memberships to, though - banging on paywalls is annoying. I pay for a few sites already; it'd be great to have those be the ones surfaced most often.
> I wouldn't be surprised if users start naturally gravitating to these pages for the better experience.
If this were true at all, these sites would have lost their mobile traffic. At worst, Google could downrank them. This whole AMP thing is a glaringly terrible idea and such incredible arm-twisting that it should be scaring developers away from Google.
It's not difficult to see AMP for what it is: an evil attempt to turn websites into "web snippet producers" that can only be monetized via Google or die. It's web feudalism.
The rules in AMP designed to speed performance are extensive.
Lots of things you can do in HTML - linked style sheets, synchronous third-party JavaScript, ad frameworks, etc. - are heavily restricted. There are size limits even on the inline CSS, I think, and even on the separate web-worker JavaScript; even animations are restricted so they can be accelerated.
The speed increase is not just because of preload. Turn on dev tools and look at network round trips / page size / CPU usage on an AMP vs non-AMP page.
Thanks for the downvote just because you don't want people to know the truth: all those "parts of the standard" that "make pages faster"? Don't do much at all. Bother to read TFA please - the average AMP page still loads in roughly 8.5 seconds on normal 3G. What makes them seem "fast" is Google preloading their content on the search results page and using Google's CDN to serve what remains.
YOU are spreading misinformation and one might ask why.
> That is incredibly monopolistic behavior because it can't be reproduced any other way.
Monopolistic would be Google making publishers integrate directly with them to enable prerendering like Apple News or Facebook Instant Articles. AMP pages can and are consumed by link aggregators other than Google, so they are very specifically not abusing their monopoly by telling publishers to use AMP.
Also Bing, Yahoo, Baidu, Yandex, and every other link aggregator (unlike Apple News or FBIA, which were designed for the same purpose). What part of that don't you understand?
Ignore this guy - he's just trolling I think. AMP can be loaded (and preloaded) by anyone who wants to. You can have your browser do preloads if you want.
I stopped caring what designers think (and the community that really hates AMP is designers, in addition to the "everything Google does is evil" crowd) when scroll hijacking became popular. Sorry, but the web would be better off without you guys.
I know that AMP is an intensely controversial topic, but I have to provide a counterpoint here because I don't think the web developer community fully appreciates the real state of the web on mobile:
> How many of you think that the amount of time spent on the mobile web as a fraction of total time on device has increased in the last five years? ... it's well below 7% and in a lot of markets it's falling. We have pretty good telemetry inside of Google and what it tells us is that the web is not essential to the future of computing on mobile.
The gist of his argument is that history suggests that when computing platforms drop below 13% of usage (or perhaps it was 10%, I don't remember the specific number), the platform is in a death spiral.
In my 5 years in Web DevRel at Google, I've never heard "it makes the web easier to index" as the main motivation for AMP, I've actually never heard it as a motivation at all. Maybe it is a motivator. The argument definitely makes sense. But I've not heard indexing as a motivator, not even in passing.
As a Googler involved in the web I know that I'm inherently biased and everything I say is likely to be written off, but I'm being 100% transparent with you here. There's a lot of people that care about the open web within Google and see it as the best platform for a lot of the same ideological reasons that you all probably love the web. The open web enthusiasts consider the real competition to be: the web versus whatever other platforms are dominant on the computing devices of the future (right now, that's iOS and Android on mobile devices). AMP is a strategy to incentivize websites to provide user experiences on par with whatever else is out there.
Disclosures: Googler in the "Web DevRel" team. FWIW we don't report to AMP. So my understanding of AMP may not actually align with the AMP team's. Note that I'm not even saying here whether I think AMP is the right approach to tackle the web's big problem, but I do understand why some people felt AMP is necessary right now. This post is definitely my own personal opinion and doesn't even represent Web DevRel's opinion. I may in fact get in trouble for speaking out about this ;)
I strongly dislike AMP pages, and I avoid them. The essay touches on many of the reasons why. Any sites that were entirely in AMP are sites I would not be using.
Google AMP is bad, but I think publishers got what they deserved, at least most of them.
When AMP was launched, news sites were horrible. Bloated, buggy, unintuitive, with intrusive ads all over the place and barely functional on mobile. Something needed to be done. Unfortunately for the open web, the one to find a solution was Google, and it turned a real need for a better user experience into a way to control the web.
And what comforts me in this idea is that many articles criticizing AMP do it for all the wrong reasons. Publishers want to add whatever bloated feature and complain that AMP doesn't let them do it. Guys, that's the part of AMP the world really needs. Complain about the Google bits, not about the efficiency improvements it brings.
Now, if you take a look at AMP, there are actually two parts. One is a very sensible set of rules for making efficient websites, the other is a big blob of JS with plenty of Google stuff inside, as well as a few functions that help you follow said rules.
You want AMP to go to hell? Simple: follow the relevant rules from AMP, ditch the Google turd of a JS blob, and keep only the good stuff. That way, your website will be even more efficient than the AMP version. Show the world that AMP is not what makes the web faster.
275 comments and nobody mentions that AMP is no longer a Google technology, but a Linux Foundation project? Is that good or bad news that nobody seems to notice?
Well, Google is probably one of the bigger sources of income for the LF, and the LF does not say no to money.
Technically AMP is a project in the OpenJS Foundation, one of the dozens of foundations the LF is spawning off. Managed by Robin Ginn, ex-Microsoft. If it's good at anything, it's driving the interests of the strongest.
That makes sense. Since AMP is intended to be a more optimized site structure than the average website, pages adhering to the AMP standard should increase the performance of the web on average, whether or not the site actually gets indexed into Google's localized caches.
We have HNers complaining about bloated JavaScript pages all the time. Maybe Google has actually found a way to tip the market away from those page designs?
If it were just an optional web standard, that might be okay. But it's a system that removes control from webmasters and hands it over to Google, and Google are using their incumbent advantage to coerce you into the AMP system. I can build a site just as fast as AMP without sacrificing the UX of the web itself in the process, and I intend to do so. But there are many businesses jumping on the AMP bandwagon because they think there will be an SEO benefit.
As a user, I hate clicking on AMP links by mistake. It's not even clear where you are: is it a website, is it still Google, how do I get to the website from here? Who even served me the content? It says it's from this website, but it wasn't?
It is an optional web standard. There are other search engines than Google.
I can't help but wonder why none of them have thought to start indexing AMP pages (or pages with some other standard or even no standard) in local cache and serve them faster to their users; seems like a space they could choose to compete on.
I'll admit I'm surprised as well, however if other search engines started doing that I would avoid them also. I want my content to be served by the URL I was shown and clicked on.
Am I the only one who read this piece as a screed against having web standards at all? "The web is a messy, complicated place.... The end result is an enormously diverse and anarchic free-for-all where almost no two websites use the same code."
In any other article I would expect the next paragraph to start with "That's why we're introducing Fribble, a framework for structuring your data so that blah the blah...." But, somehow, the author makes out that well-structured data and appropriate use of semantic tags is a ploy by this one evil company to secretly profit from your work.
I'm sure there are well-reasoned arguments against AMP, ways it could be improved or competing ideas for how to encourage fast page loads, expose content in a machine-friendly format, etc etc, but I haven't heard any of them in this article. It boils down to "We don't want your rules, maaaaannn" and it frankly isn't worthy of HN's attention.
Personally, I'd be okay without AMP if 99.9th-percentile latency on the overall mobile web ecosystem were within 2~3 secs. Unfortunately, it's more like 20~30 secs on non-AMP sites, which made me take an ambivalent stance on AMP. My ISP is obviously not doing a great job, but it's also a publisher-side problem which is nearly un-fixable anytime soon.
I'm sad caching HTTP proxies are all abandoned nowadays...
I bet those would also help noticeably; there's enough stuff browsers don't know to cache...
Also in the proxy vein: ad blockers at the proxy level cannot be blocked, only the convenient in-browser ones can... and uBlock Origin dev edition recently got hit on the Chrome Web Store again ("recently" as in this year), I think... for no reason...
As other people said, AMP is good in that devs get a justified practical reason to push back against the lazy clients who'd happily have 20 MB pages over their fiber connection... but it's bad in every OTHER way...
The Google monopoly really should be broken... they've got like 90% of BOTH the search engine and browser markets... and a lot of people don't even know alternatives EXIST...
Why did people get so stupid nowadays...
I switched to duckduckgo entirely in the beginning of November. Just to see if I'll like it.
I kind of forgot I did. I really don't notice a loss of anything, other than not having to ignore really bullshitty sponsored posts and the Fisher-Price feel of Google telling me what I want because of "trends".
I feel like maybe tech companies need something like the alcohol industry rules in the US. A person can't have ownership in a brewery, a distributor, and an alcohol retailer - they can only have ownership in one. Maybe tech needs to be broken up that way? I'm not sure exactly where the fine lines are. But I know a search engine controlling ads, news, tracking data, phones, and just data in general... well, it's not good.
I wonder how different Google would be if they were able to diversify their revenue to not rely so much on ads.
AMP = faster page loads = more pageviews = more ads served
AMP shouldn't need to be enforced to achieve a faster internet. It's not just Google that knows more pageviews = more ads = more revenue. Publishers know this too, and therefore publishers themselves have the same incentive to reduce page load time as does Google.
It's disorienting to see Google frame AMP as something that's for the good of the user. If it is, it's only a side effect of also being good for Google's business model.
I posted this after googling something along the lines of 'firefox mobile turn off autoplay amp' and coming across this article which was a fun hate read.
For some reason it manages to sneak past autoplay disabling? How?
I know my opinion here is completely self-serving..
But I prefer AMP over pretty much any news article or website.
As a developer, I can't trust operations nor my wider organisation to invest in making things run better.
This is proven by browsing the top 20 news websites in my country. All are slow, bloated, ad-riddled trash. Turning on an ad blocker helps. I've signed up for and trialed just about every semi-reputable news site in this country (AU) and none of them do anything to curb this or make the experience any better - paying just "unlocks" more popular articles or reduces the TTL for me.
This is stupid. If you build an Apple News article with different content from the actual article, Apple users will complain, just as if you build an AMP article that is worse than the normal page, search engine and link aggregator users (not just Google's) will complain. That's why people hate Reddit AMP results. If you ignore these warnings, Google will probably demote your AMP pages, and rightly so because users will hate those results.
The author is basically broadcasting that he is clueless and actively harming his clients' businesses.
Putting aside the merits of AMP or Google entirely, I find this author's outrage suspicious. It says in Google's message, "this issue will not affect your appearance on search...". There is a false equivalency proposed that Google is strong-arming publishers to switch their entire site to AMP by sending these messages.
I am not saying Google is blameless or that AMP is perfect, but reading the author's argument literally, I don't understand why this action is the focus of their outrage. Just ignore it.
It's kind of a "pay no attention to the man behind the curtain" type of thing.
Google had the option of simply highly ranking pages that are performant and lightweight on mobile. They didn't take that option, instead they went with AMP. There's intent there. I'm not going to speculate on the intent, because it isn't relevant: the point is that because there was a simpler technical solution available that they didn't take, there is clearly intent in going the harder path. It's worth asking what that means.
AMP provides all the benefits that a well-built, well-architected site would, and that would have been the "non-evil" option. Use web standards and push people to respect them, imagine that...
After AMP was introduced, I converted to DDG (DuckDuckGo), and after taking some time getting used to it and personalizing its settings, I haven't gone back to Google.
Though I agree with the sentiment of the article, I believe AMP can easily be thwarted by efficient website structuring and use of libraries with mobile efficiency in mind. What is good about the AMP indexing process is that, I think, Google still favors the non-AMP version of the site if the AMP site is less efficient than the standard version.
Should the threshold not be met, a reduced, non-interactive version of the site loads using the same HTML (non-critical HTML can be moved into templates if needed), and the "Critical Rendering Path" is used to form a decision on how the site should download the site's elements using DSO, since images, CSS and JavaScript make up the bulk of the download footprint.
DSO aims at optimising the footprint via the following concepts (a rough sketch of the first one follows the list):
- Offering only the CSS files relevant for breakpoints just for that device;
- Scaling down images and lazy loading them at the lowest possible resolution;
- Blocking out elements of HTML using "display: none" for areas that won't be loaded; and
- Most importantly, downloading only the minimum JavaScript libraries for non-interactive viewing.
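As a rough sketch of the first concept (the file names and breakpoints below are made up for illustration), you could attach a stylesheet only once its media query actually matches:

    // Load a stylesheet the first time its media query matches, so a phone
    // never downloads the desktop CSS and vice versa.
    const breakpoints: Record<string, string> = {
      "(max-width: 600px)": "/css/mobile.css",
      "(min-width: 601px)": "/css/desktop.css",
    };

    for (const [query, href] of Object.entries(breakpoints)) {
      const mql = window.matchMedia(query);
      const load = () => {
        if (!mql.matches) return;
        if (document.querySelector(`link[href="${href}"]`)) return; // once only
        const link = document.createElement("link");
        link.rel = "stylesheet";
        link.href = href;
        document.head.appendChild(link);
      };
      mql.addEventListener("change", load);
      load(); // also handle the initial state
    }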
There are other features DSO has, such as background downloading of the full interactive version after the non-interactive one has loaded, prompting the user to ask whether the interactive version should be downloaded later, and a few other things. However, I haven't completed the entire project, as I was side-tracked with other projects, but building a solution such as this that works alongside Webpack would be the best scenario.
Further, I'm not too sure if there are other solutions out there similar to DSO since I stopped development, but surely sidestepping AMP should be a rather straightforward process.
I thought we were done with seeing old articles complaining about AMP. I feel like I should set up an account that just reposts anti-AMP articles once a day, and see how many points it gets. Clearly HN demands more opportunities to rail against it.
We all know that HTML sucks, but what's wrong with AMP? I like AMP: I'm forced to pay for each byte I use, and 2 MB web pages are killing me. The author doesn't give any technical reason why AMP is bad. And I don't think "it's open source but Google employees contribute more than anyone else" is a good reason: if we're against open-source projects whose contributors come predominantly from one company, then we should consider Java, Scala, R, Kotlin, Swift, MySQL, ReasonML, Haskell, F#, and much much more to be "evil". In short, I don't think we can conclude that "AMP can go to hell".
Real people aren't moving to Firefox or Brave; they are moving (or have moved) to the Facebook mobile app. It has Instant Articles (https://instantarticles.fb.com/), an AMP-like lightweight way to load news stories, all hidden inside Facebook's system. Don't worry about searching for news; Facebook will just give you whatever it feels like.
Then there's Apple's iOS-only "News" app. Who knows how that works.
AMP is clearly a far more open competitor to the above. It is saving news publishers from themselves.
Where every "website" is an app in the Google app store: Google will serve ads automatically as a freemium app (no code required, just agree), or process subscription fees via Google Pay, and own the ecosystems (Chrome, Android, Chromebook, Strada).
I don't think the rewritten title is faithful to the link.
I understand why the original title of "Google AMP Can Go To Hell" was rewritten, as it is unnecessarily inflammatory and clickbaity, but
"Google wants websites to adopt AMP as the default for building webpages"
really isn't the point of the article, at all. It is necessary background for the article, but the point of the article as I read it is that there are specific implications of the AMP effort that are nefarious, insidious, and dangerous to the open web. A rough outline would be:
* Background: Google wants websites to adopt AMP as the default for building webpages
* Contention 1: Adopting AMP is technically difficult
* Contention 2: Adopting AMP will harm non-google web properties in the future
* Call to action: there are several things you can do in order to "...fight back. You could tell them to stuff it, and find ways to undermine their dominance. Use a different search engine, and convince your friends and family to do the same. Write to your elected officials and ask them to investigate Google’s monopoly. Stop using the Chrome browser. Ditch your Android phone. Turn off Google’s tracking of your every move. And, for goodness sake, disable AMP on your website."
I don't think any reader would conclude that the Background statement is the best summary of the post's thesis; it's clearly the call to action. Changing the title to the background statement changes the "meaning" of the article completely.
So "Google AMP Can Go To Hell" is much better than the revised title, and if it's unacceptable, then "Fight back against Google AMP" is both faithful and uses the author's own terminology.
I think the editorial control exerted over the post title in this instance could have been better considered (notwithstanding the same change made on the prior submission; the same mistake was made then, as well).
The main thing we're trying for in title changes is to use representative language from the article itself. Since it does use the phrase "fight back", we can go with your suggestion.
Nice one, HN: by changing the title (to something that is not the title of the actual submission!) you just inverted the meaning of the first sentence of what was, at the time of writing, the top comment [0].
If you are going with the second headline, why not include the "Tell them no" part too?
re: "Google is also the reason AMP sees any kind of adoption at all. Basically, Google has forced websites – specifically news publishers – to create AMP versions of their articles."
Actually, that's not quite right. Publishers have forced Google to force publishers to use AMP. That is, publishers can't control themselves and keep adding more and more bloat to their pages.
I am by no means a Google and/or AMP fan, but the truth is most sites have no respect for the receiving device or connection speed.
How many average people are walking around saying "I've quit the web! It's too slow!"?
Nobody. It's a non-issue because, despite the faults of the web, the average person is still clicking on chum-boxes, sharing clickbait, and using Google services.
Google isn't trying to save the web. They're trying to become the web. And they've already made their money, so what makes you think they care out of the goodness of their hearts?
> How many average people are walking around saying "I've quit the web! It's too slow!"? Nobody.
Everyone did. It's why web apps are dead & buried, and native apps rule the world. Nobody talks about cool web experiences anymore. It's why we're all talking about the experience of viewing a search result instead of the experience of browsing the web.
Entire areas that started exclusively on the web are now dominated by native apps, like social.
The web only exists for long-tail, rarely used content. That's the actual area it occupies now. Google, with AMP, are optimizing for the usage the web has left. It's a realistic approach to the web.
Show me the numbers. Everyone just spouts this rhetoric about the web being "dead" but where's the evidence? Please show me evidence. If people are still using The Google, then I doubt the web is "dead".
I quit sites all the time. Take TripAdvisor: a great website, completely unusable if you are not on a high-end desktop. It's a website, ffs! I refuse to open pages unless I really need them. I find them using the duck, since Google is too slow.
You're right in a sense, but I have changed which local/regional/national news sources I use. In Canada, the Postmedia group of companies has made it almost impossible to actually read the news. I'm actually OK with them having ads, but their ads cause the page to scroll/shift while I'm trying to read, and it seems to be random: it might teleport me further into the article, or earlier, and trying to scroll back to where I was triggers further ad loads that shift the page again. Totally unusable. The CBC, though, has none of that junk. If this keeps frustrating people this much, it'll potentially shift the political discourse left (the CBC seems to have a left-leaning bias).
Completely anecdotal, but my cousin did. She told me Chrome (not the web, Chrome, but she definitely meant the web in general) was "too confusing" and now only uses Facebook and the mobile apps for a few other services.
>Google isn't trying to save the web. They're trying to become the web.
This is 100% the argument I try to explain when people say "I don't support Google, I use Brave" (or some other Chrome/Chromium browser over Firefox). It typically falls on deaf ears.
I wonder how many people supporting the imo absolutely deserved "FUCK GOOGLE" mentality use Chrome to voice their thoughts?
If the respect were for the receiving device or connection speed, the metric used to prioritize results would be payload size and delivery speed, rather than whether the technology/hosting was Google-owned.
As far as I recall they fixed that; now the problem is that the speed measure they use does some weird tricks around partial loads. As long as you display _something_ soon, they're happy, and the fact that a page might be useless for many seconds is weighted less.
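To illustrate (my sketch, not anything from Google's actual methodology): the two standard Performance Timeline entry types below make that gap visible in a browser console. An early first-contentful-paint can coexist with long main-thread tasks that keep the page useless for seconds afterwards:

```typescript
// Log the first contentful paint, the "display something soon" moment.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.name === "first-contentful-paint") {
      console.log(`FCP at ${entry.startTime.toFixed(0)} ms`);
    }
  }
}).observe({ type: "paint", buffered: true });

// Log main-thread tasks over 50 ms, which block interaction even after
// the page has painted.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(
      `Long task: ${entry.duration.toFixed(0)} ms at ${entry.startTime.toFixed(0)} ms`
    );
  }
}).observe({ type: "longtask", buffered: true });
```

A page can report an FCP well under a second and still log long tasks for several seconds after it, which is exactly the "useless for many seconds" case described above.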
But then Google shows up and demands that all that cruft be added back to make the AMP and normal pages the same again. Doesn't this kinda defeat the purpose? I thought I could maybe understand AMP pages as a sort of reader mode, fast and no-bullshit content, but now it looks like they just want to push their kind-of-proprietary tech.
It's frustrating to me that anyone buys that the driver behind AMP was performance. A Trojan horse has to have some plausible reason to let it in the gates. That plausible reason isn't the actual reason it exists.
If performance really mattered to Google, it would influence SERP and Carousel position in a meaningful way.
Sorry, isn't this just another web developer complaining that they have to use someone else's technology? I mean, the first time I looked into supporting AMP I had that same visceral reaction: "f this, it sucks." After spending some time with it, it's not that bad, and I get the value prop: AMP is content served up directly on google.com, or Cloudflare, or any CDN provider; it's essentially the evolution of Google's cache. Pretty cool if you stop to think about it. In the past, accessing Google's cache got you a pretty fast page response, but it was also kind of broken because not all of the essentials would load correctly. Now, with AMP, you have some control over that cached page, which is offloaded for free to Google's servers. I think there are some UX issues, for sure. IMO, it's open source, so you can always open a pull request/issue and try to make it better... or ignore it and continue to build something great. If you have something great, people will find it whether you use AMP or not.
It doesn't matter how good or bad the technology is, or whether it's convenient or profitable to use it. AMP is a brazen attempt by Google to use their monopoly to take control of the web. The current version of AMP is not their intended final destination.
So I remember being upset that I couldn't run my JavaScript code in AMP too... is that maybe what has you feeling it's Google trying to take control of the web? The thing is, Google makes money selling ads, not controlling your web content, so I just don't see how your assertion makes sense.
Just think how YouTube evolved from a seemingly open platform to now, where tons of people are being 'deplatformed' for not toeing Google's line. That's what happens when Google gets 90%+ market share.
Remember when the Chrome team decided that certain sites wouldn't be allowed to autoplay videos, yet other partner sites (and YouTube) were? What if every single f-ing goddamn feature on your website was restricted based on what Google says? If you're not "AMP-compliant" enough, following the arbitrary rules Google sets, they might restrict certain features, like video viewing, on your AMP site. That's the future people are buying into by implementing AMP. Free speech and the open web will die if AMP gets dominant market share.