Fight back against Google AMP (2018) (polemicdigital.com)
895 points by mancerayder 45 days ago | 379 comments



I strongly agree. Web developers and app designers should work to build fast, performant web sites that use bandwidth carefully because that's good for end users. But the entire AMP approach to doing this is questionable, and as we have seen over the years it appears to act more like a way to give Google more undeserved and unnecessary control over what should be an open web.

More broadly, I consider this yet another reason to avoid using Google properties where possible. They have shown themselves to be bullies and bad actors who want to control the internet and oppose an open web. I recommend simply not building AMP pages at all, but instead working to build high quality, performant websites which gracefully handle device size changes and lack of javascript.


> I strongly agree. Web developers and app designers should work to build fast, performant web sites that use bandwidth carefully because that's good for end users.

They should, but they didn't. Before AMP, most of the web was unusable on slower Android phones, and frontend developers just laughed at you and told you to drop $800 on an iPhone if you wanted to see their pages. Is it a surprise that Google shoved a technology to fix the web on their platform down developers' throats?

Nothing else before AMP helped. Why do you think those developers will suddenly wake up and start building lightweight web pages now? Instead of ad bloated, video playing monstrosities?

Web developers were slothful. This is what purgatory looks like. ;)


AMP is not the savior you describe it to be, and web developers are not against lean websites. The real bloat comes from ads and excessive tracking, and you can test that by installing uBlock Origin in Firefox for Android and see for yourself how the web suddenly becomes fast and lean.

I've downvoted this post because it lacks substance, and the resulting arguments will derail the thread and bury actionable information that was shared below. It's depressing how most of these threads could initiate action but get derailed instead.


>and web developers are not against lean websites

Of course no one is _against_ good performance, but web devs obviously don't care enough to do anything about it. There's no practical difference between the two.


Most web developers do make fairly lean websites, but that is not enough when ads and a dozen tracking scripts which in part are supplied by Google are slapped over their work.


Which is completely 100% irrelevant from the end-user perspective.

Just to be clear, I hate AMP, but I also feel a sort of pleasurable vindication in the pain that developers and companies must now go through because of the horrendously slow trackers and ads they used to fill their pages with.


Google could restrict the content loaded in AdSense iframes and apply AMP restrictions to ad content only. They also have the means to limit the number of ads partners can load on a page, and restrict the overzealous use of Google Tag Manager which slows down sites.

Google offers both the poison and the antidote, and each of their solutions (see what they're doing with request blocking in Chrome) happens to erode user liberties and privacy rights while concentrating power around Google properties.


It's a bad idea to just slap AdSense and analytics onto a page. If they're a requirement, then they need to be properly integrated and thought about. It can be done properly, but hardly anyone does it.

FYI Personal opinion not Google's.


Page rank would make them care. I can't understand why Google didn't just down-rank heavy pages...


It absolutely does. Improving your page speed (or google's idea of your page speed) is a critical step in optimizing a site's organic google search ranking.


Which means AMP actually has very little to do with "performance".


Why do people feel the need to "fight back against Google"? Should their actual fighting energy be going to fighting dictatorships around the world and torture by the CIA? Priorities are really messed up.


Why not both?


Web devs care but efficiency is expensive and clients don't want to pay for it.


No it's not. We're not talking about squeezing every last ounce of performance out of the CPU and hand tuning every query. Just stop bloating your pages with 10,000 dependencies and awful JS frameworks and pretending everything needs to be a SPA.


Yeah, but Google wants the excessive tracking and ads.


Should read: "Google wants their excessive tracking and their ads."


[flagged]


Again, install Firefox for Android with uBlock Origin, and see your opinion change about the main reason mobile sites are slow. Pages load fast and are responsive even on older phones if you use an ad blocker.

https://play.google.com/store/apps/details?id=org.mozilla.fi...

https://addons.mozilla.org/en-US/android/addon/ublock-origin...


I use uBO and have used AMP in the past. uBO is an excellent ad blocker, but as far as bandwidth savings are concerned, it doesn't come close to AMP. Of course, I use uBO with JS off by default, which is better and arguably more secure. But I still occasionally have to unbreak sites.


Should installing an ad blocker be common knowledge? The world is not made up of techies.


Your suggestion to install Firefox for Android for a performant mobile web experience undercuts whatever other arguments you may make.

Firefox for Android is a UX and performance disaster which is likely why Mozilla has chosen to start from scratch and develop a replacement browser for Android.


Install Firefox for Android with uBlock Origin to see how ads and tracking destroy mobile performance. The setup I described loads pages faster than Chrome.


The DuckDuckGo browser with Blokada on Android. Not just web ads, but most in-app ads: all gone.


I believe the optimistic read is that in a world where Google manages to measure strictly performance-based metrics and then rewards pages based around that, of course developers will do things right this time. After all, we all want to write good code and produce quality work!

That speed matters to user behavior has been known for a long, long time. This knowledge existed long before AMP did. It had surprisingly little effect on how pages were implemented.

So perhaps our princess is in another castle.

To my thinking, what AMP does is create a political context that enables developers to push back. By setting an unambiguous standard and clear advantages to complying with it, developers have a weapon to push back the next time Marketing wants to add fifteen trackers or whatever. This is leverage that just was not present previously, and it can change decisions.


> To my thinking, what AMP does is create a political context that enables developers to push back. By setting an unambiguous standard and clear advantages to complying with it, developers have a weapon to push back the next time Marketing wants to add fifteen trackers or whatever. This is leverage that just was not present previously, and it can change decisions.

Yeah, I think this is exactly it. Just like web developers don't care about disabled people until the law threatens penalties, they didn't care about performance until Google threatened penalties.

The question is: who else could provide the same incentives as Google? How could an independent, non-corporate entity create the same pressure?


Normally I would say "That's what standards bodies and governments are for", but in this particular context both have failed. It's been thirty freaking years since the ADA, and most websites are still not accessible. Standards bodies both move slowly and are historically bad at achieving widespread implementation in reasonable timeframes.

The other answer is "Browser makers"... but that's also Google. And maybe Mozilla, which is arguably the "independent, non-corporate entity" you'd like.

Really though, this works because Google has the technical chops to make it work and the positioning to make people want to do it. I cannot think of a single "independent, non-corporate entity" that's both positioned to do this and capable of it.


> they didn't care about performance until Google threatened penalties.

But that's not an appropriate role for Google. They aren't, and shouldn't be, the web equivalent of fashion police.


All Google has to do is reward site improvements in critical metrics. That's it. If my page is going to rank higher because it's faster, I will optimize the hell out of my site. But Google has been really unclear about the amount of impact those improvements have, especially as they compare to building an AMP site that will without question be featured in their carousel.


What metrics are you thinking of? Page size and load speed are the typical ones. There may be some wrinkles to measuring those well, given how dynamic modern pages often are. That would make any such metrics relatively easily gameable. It might also be challenging to turn measured improvements into measurable gains in SERPs, which means the gains in corporate politics are limited.

AMP avoids all of that. It also brings security benefits by getting rid of basically every tag that can be used to mount attacks on the browser.

Also, it's been known for quite a long time that users like faster sites, resulting in much lower bounce rates. Was that not enough for you to optimize the hell out of your site? It's been my experience that in a lot of companies, it isn't enough. Marketing or publishing or whichever department can attach dollar amounts to the tracker or ad or whatever they want to add, and devs can only handwave around experience.

It's not a winning proposition.


They did exactly that and you can find several talks about how they prioritize performance. Didn't work.


It could only be used as a tie-breaker for search results with the same level of confidence, anyway.

It would be ridiculous to down-rank the exact thing the user is searching for just because the user would have to wait 800ms longer for that information. Or up-rank something the user isn't looking for just because it loads faster.

The best Google can do is bluff about how much perf matters.


The efficacy of the incentive is linked directly to the strength of its effect. If optimizing the hell out of your company's site only matters in extreme cases where it's a tiebreaker among hundreds of other signals, the people who want the things that make pages slow will win. They will be able to point to more tangible and measurable benefits, and the effect of the tiebreaker will be lost in statistical noise.

It may just be unfounded cynicism on my part, but this does not sound like a better web experience. It sounds like the web circa 2009-2015. It sounds, to me, exactly like all the things we'd like to get away from with something less intrusive than AMP.


I've been using the web on mobile connections ever since I got my first iPhone in 2008.

When you say that it was unusable, surely it's hyperbole.

I might be in the minority, but I never had a problem with it, and I've been a heavy user. Especially now that 4G connections are everywhere and smartphones are overpowered.

I mean I watch HD videos on the web while riding the city bus with no interruptions.

Are you telling me that a phone with better performance than the desktop I had 10 years ago, with a 4G connection able to stream hundreds of MB of data on a moving bus isn't capable of loading freaking text content without AMP?

Surely something is missing from this picture. I'm replying to you on Hacker News by loading the website in my browser, no AMP in sight. And I read HN, including all websites listed on HN, from my phone with no AMP.

And sure some websites can take a second or two to load due to crappy ads mostly. I remember a time when I waited for 5 minutes to load a website, when all we had was dial-up. And even that was awesome ;-)

N.b. I avoid AMP on purpose. I started using DuckDuckGo on my mobile to avoid AMP, as I had no other way to turn that shit off.


The iPhone was one of the more expensive phones you could get in 2008, just like it is now. You were not browsing the web on the "slow Android phones" the parent was talking about.

HN is an exceptionally fast website and not representative of the Web at large.

Compare HN to something like reddit, a website which provides very similar functionality but is an order of magnitude slower. Then ask yourself why reddit has to be so slow.


The Reddit website is working perfectly fine for my purposes. The only thing I'm bothered with are the annoying popups suggesting to try their app.

Also if Reddit is slower than HN, that's probably because they don't care (law of diminishing returns ftw) and I'm sure they'd rather drive people to their mobile app instead. All of this isn't the fault of the web technologies used and neglect can't be solved by AMP.

AMP puts websites under Google's control and nobody asked for it; it's being shoved down people's throats due to an imaginary problem.

---

> You were not browsing the Web on "slow android phones"

Note that even the shitty, stock Android phones today are better than the iPhone that I had back then. Such is the progress of technology.

I know because we have a ton of low cost Android phones to test with.

The only performance problems we encounter are in third-world countries in Africa and possibly in other emerging markets, but that's only a temporary issue, and I predict that 3-4 years from now it will be a non-issue even in those countries; hardly a reason to give up on our web standards. And it's not like you can't design super lean websites anyway.


> it's not like you can't design super lean websites anyway.

Sure, but people don't.


> I've been using the web on mobile connections ever since I got my first iPhone in 2008.

Okay, great. You had one of the most powerful phones at the time. How was the experience for people with a "feature phone" in 2008? (I'll tell you from experience, it was terrible).

How would the experience be today, with your iPhone from 2008? Terrible. Why? Is the web more powerful as a result? Can you do more things? Nah, it just looks flashier.


Tracking blockers via extensions, and autoplay off by default would have fixed most of the problems while also encouraging site builders to stop doing those things. Firefox makes that possible on Android. Google seems determined to never support those things in mobile Chrome and are slowly removing or crippling the ability to do it on desktop.


IIRC they're also pushing a new extension standard, for Firefox and others too, which is very adblock-crippling... It wouldn't be so bad if it were ONLY Chrome... Also, a loopback proxy to localhost with a standalone blocker is the next step they'll force us to take ;p


Web standards and traffic being monopolized by a company with... dubious opinions about the role of privacy online is your idea of purgatory?

I'd like to think similar ends could have been achieved by setting and rewarding standards around #'s of included scripts, size of the page load, etc. But that wouldn't have achieved the goal of keeping people on google.com even when clicking search results.


I would think so too and that would be the perfect solution. But it didn't happen unfortunately.

I'm using the word "purgatory" because it's not too late to get rid of it. But it does demand that web devs get their act together. Will they?


Fighting web bloat is a noble cause. It doesn't require a self-designated centralized gatekeeper. All Google needs to do is reward lightweight sites with better search placement.


To be fair, most web dev practices are based on silly notions of tracking and crappy UI ideas made by idiots. Animating in blocks of text is what I'm mostly referring to, but there's plenty more.

Take Twitter, for example. A tweet takes about 10 MB to load, based on something I measured about a year ago. To put that in perspective, information-transfer-wise, War and Peace is about 800 KB. The whole book. A single-page tweet of 280 characters or whatever being 10 MB is moronic. Reddit caught the same stupid bug with their redesign.
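As a sanity check on those figures (both are the commenter's estimates, not fresh measurements of the current sites), the ratio works out to about an order of magnitude:

```python
# Back-of-the-envelope comparison using the figures quoted above.
# Both numbers are rough estimates, not current measurements.
tweet_page_bytes = 10 * 1024 * 1024   # ~10 MB to render a single tweet
war_and_peace_bytes = 800 * 1024      # ~800 KB for the entire novel as text

ratio = tweet_page_bytes / war_and_peace_bytes
print(f"One tweet page is about {ratio:.0f}x the size of War and Peace")
```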

The biggest problem: everyone is complacent and thinks "this is what progress looks like, and you're a curmudgeon boomer if you think otherwise." Forethought about real sustainability, both environmental and sociological, is looked on as impeding progress.

Just like when a small number of devs a decade ago said we need to be careful of big tech companies with our data. They were shot down, and that push for "break things fast" became the name of the game. Now everyone says tax dollars must be spent on 5G because "we need the bandwidth". No, more people need to be less stupid. Mostly consumers. But devs need to start taking a stronger stance in outing the bullshit tactics these businesses are implementing, and quit going down on their knees to pray to the Silicon Valley giants as some great saviors of society whose wealth is an indication of their genius. Ugh... got into a rant...


As a "millennial" (urgh), that has already been using the Web in the previous millennium, I agree.


> Web developers were slothful. This is how purgatory looks like.

But as a web user, I resent Google's efforts to put me in purgatory as well.


Honestly, a lot of bloat is coming from WordPress, which is a platform that encourages bad development practices.

I had this client with a content-focused WordPress website: 50-odd plugins, and a regular blog post would ship 1 MB of JavaScript.

It's crazy. WordPress is an epic mess and at the same time one of the most fascinating software platforms.

I don't even know what to compare it with.


Some of the slowest sites I go to are news websites, and that's not because they're on WordPress.


There’s a good chance some of them are. Or Drupal, which is very similar.


They're probably on their own proprietary CMS that's probably just as bad or worse than WordPress.


WordPress is also a platform that encourages bad takes like this. If I have 50-odd plugins that only provide tools in the dashboard, logged-out users won't be impacted by any of the fifty.

Compare it to any consumer operating system. It puts a lot of power into the user's hands.


I agree. Another example of this: Google will force ChromeBook hardware manufacturers to use fwupd instead of proprietary solutions:

https://blogs.gnome.org/hughsie/2019/11/18/google-and-fwupd/

It would be great if these companies had enough good taste and pride in their work to at least try to build something great by default. What we get instead are minimum viable products built in the cheapest way possible and it takes a Google to force them out of their complacency by imposing policies.

On the other hand, Google is at least partially responsible for the current web situation: they normalized advertising and tracking malware on the web. Because of them, publishers think it's totally acceptable to make people download 10 megabytes of ads and javascript to read 10 kilobytes of text. The correct solution is to block all that stuff by default by shipping uBlock Origin pre-installed with browsers.


So the answer was for Google to reward fast sites by giving them a SERP boost.


But AMP didn't help either! The new Reddit was rewritten using AMP and it's really slow (ironically, old.reddit.com loads faster on my phone right now).


> Before AMP most of the web was unusable on slower Android phones and frontenders just laughed at you and told you to drop 800$ on an iPhone if you want to see their pages.

> Why do you think those developers will suddenly wake up and start building lightweight web pages now? Instead of ad bloated, video playing monstrosities?

To be fair, I would say a lot of this is a result of marketing/sales pushing a lot of BS onto the page, and managers or devs failing to push back. Is the developer guilty of creating an "ad bloated, video playing" webpage? Yes, a lot of them don't care and make it bloated, but even if you tried, you can't do much to improve the performance of a bad idea.


> Nothing else before AMP helped. Why do you think those developers will suddenly wake up and start building lightweight web pages now? Instead of ad bloated, video playing monstrosities?

This has been an ongoing trend forever, viz. YSlow and Firebug's speed tab.

Fuck Google, fuck AMP.


You realise you have to develop a static page for AMP to cache it correctly. It doesn't cache just any page.

It's more about coping with bad networks than making your phone load a page any better.


> They should, but they didn't. Before AMP most of the web was unusable on slower Android phones and frontenders just laughed at you and told you to drop 800$ on an iPhone if you want to see their pages. Is it a surprise that Google shoved a technology to fix web on their platform down developers throats?

So let me understand this: Google allows OEMs to ship Android on shit hardware with terrible performance, is rightfully criticized for rubber-stamping that hardware with no oversight, no standards of quality, and no requirements of a suitably good UX, and then passes the burden of supporting the shit hardware it tacitly blessed onto a ton of unsuspecting content publishers, who now face delisting from the dominant search engine not because their content is bad, but because their website requires resources that Google's proxy, shit hardware, can't provide? And you're okay with that?


Yes, I'm OK with the world having the ability to buy a smartphone for $50 outside the US. Mobile devices shouldn't be reserved for rich westerners. Same for the whole web: I don't see a reason why it shouldn't be usable on a dual-core laptop with 2 GB of RAM.

I'm fine if supporting people with older and slower devices costs more development time for developers in Silicon Valley.


Years ago the web was fast on a 1 GHz single-core with 512MB of RAM. What changed, other than ads and ad networks like Google becoming far more invasive by wasting more and more memory and CPU?


In the days since 1 GHz CPUs, web pages have also grown from simple HTML/CSS to huge JavaScript frameworks, in which displaying the simplest static content requires a ton of JavaScript.

But if you install a browser add-on such as uMatrix, you can see that surprisingly many web sites will still work just fine if you disable JavaScript (even first-party JavaScript). One example is nytimes.com.
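A crude way to see how script-heavy a page is, without any browser extension, is to count the external `<script>` tags in its HTML. A minimal sketch using only the Python standard library (the sample markup below is made up for illustration):

```python
from html.parser import HTMLParser

class ScriptCounter(HTMLParser):
    """Collects the src attribute of every external <script> tag."""
    def __init__(self):
        super().__init__()
        self.external = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.external.append(src)

# Hypothetical page: two external scripts plus one inline script.
sample = """
<html><head>
<script src="https://cdn.example.com/framework.js"></script>
<script src="https://tracker.example.net/analytics.js"></script>
<script>var inline = true;</script>
</head><body><p>A few KB of actual text</p></body></html>
"""

parser = ScriptCounter()
parser.feed(sample)
print(len(parser.external), "external scripts:", parser.external)
```

On a real article page you would feed in the fetched HTML instead; sites that still render with JavaScript disabled tend to be the ones where most of those scripts are ads and trackers rather than content.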


It should be mentioned that megabytes of JavaScript are slow to download, compile, and execute. While a few seconds may go unnoticed on a developer's desktop, it will be a lot longer on a mobile device or laptop.


I advised a friend to ditch the JS-powered pop-out social media icons which were hovering almost out of sight over on the right. They said quite flatly, "nope, that's staying". That was probably ten years ago. There is a school of public opinion that everyone seems to be attending. The things they learn there are not always logical or justifiable but I get the impression that they all want to secure their piece of the pie and that means meeting everyone's expectations, so they are all doing it to each other, together. Google is "merely" running classes in that school, it seems... and of course helping the school keep running by supplying tons of tech.

I was mildly disgusted when required reading for freshman orientation at Akron U included a book called Nickel and Dimed. The gist was something like "get your education or you're screwed". But people made it that way in the first place! Everyone supposedly needing formal higher education in order to have any decent future isn't something that just happens, it's something the human race is doing to itself. Leave it to a higher education institution to push the idea that "this is just the way it is, do the right thing if you know what's good for you".

edit: obvs I didn't read the book, it's not exactly like I said. I think I bought the book but dropped that "class" anyway

In a similar way, stupid "trends" like social media buttons and Like buttons are just examples of how everyone is ruining the web together. These days it's the aforementioned massive JS frameworks and SPAs and of course the obsession with "analytics." In a way it's nice for me and my workstation because it helps drive up the current average affordable densities of RAM and storage, but ...it's slavery. And Google seems to be less and less bashful about it.

"you are slaves of whatever you submit to by obeying" --that guy


> because it helps drive up the current average affordable densities of RAM and storage

It does, but it also means that RAM and storage aren't available to be used for other things. Think about what you could do if you had current hardware back in the XP days...


We covered it, floor to ceiling, in images and video. Yesteryear's web had a few grainy avatar images and GIFs in footers; today's has nonstop, wall-to-wall, high-definition media.


Not an excuse: those can be loaded on demand. Also, GIFs have basically no compression.


JavaScript, SPAs, animations, pages filled with "pretty" instead of content.


> Yes, I'm OK with the world having the ability to buy a smartphone for $50

But you apparently aren't okay with getting $50 worth of smartphone, since you're demanding that a ton of companies you erroneously claim to be in California expend thousands of dollars in labor to support a framework they never agreed to support and have little to no say in how it's developed, in the name of a supposedly "open" web, so that you can have a good experience consuming content more than likely for free. That, to me at least, reeks of the worst kind of entitlement.

This is, in my mind, like buying a Tata Nano, which is a perfectly acceptable if limited car, and subsequently demanding all the road ways be limited to 65 mph, so that you don't feel slow. If you want to drive with the pace of traffic, the absolute cheapest car you can possibly buy brand new [1] is probably not what you want.

[1] That I'm aware of.


You guys are both way off the rails here

Developers don't need to put in more work to support cheap phones

You just have to install an ad-blocker, and you can surf the web lightning-fast on even an old, slow piece of crap phone


Yeah, this is ridiculous. I used to browse the Web (not the WAP!) 13 years ago on my Nokia N70 (Symbian OS, 220 MHz, 32 MB) smartphone, on an Internet plan that cost 1€/MB (I have a plan that costs 100,000 times less today), and while it was a bit rough, it was already pretty serviceable!

Most of the content (in time spent on it) is still text (remember what HTTP stands for?), and text takes hardly any processing power!


A cheap phone with shit hardware is a feature of Android, not a bug.

The solution to slow web pages isn't AMP, it's Firefox with an ad blocker. Google doesn't like this solution, obviously, but that's not my problem.


In particular, publishers don't like this solution.


That's not our problem either.


Not everyone can buy an $800 iPhone or anything close to that. Being from a third-world country, I understand how valuable it was to have a cheap smartphone (umm, laptops were too costly); that's how my main interest shifted from physics to CS/programming.

If you don't like Google AMP, it is fine.. (of course I too prefer to browse with only HTML & CSS whenever it works).. If you don't like low end hardware standards, it is fine.. But they have solved real world problems, whether first world problems or not. Not everything is black and white..


> Being from a third world country, I understand how valuable it was to have a cheap smartphone

And just because you live in the US doesn't mean you can afford a top tier iPhone. That's why the secondary market is so hot for them.

> If you don't like Google AMP, it is fine..

I don't really care one way or the other.

> If you don't like low end hardware standards, it is fine..

I do take some issues with the fact that Google employs no standards at all for a baseline level of quality with their devices, and then places the burden of supporting those devices on others under threat of delisting.

> But they have solved real world problems, whether first world problems or not.

Ends do not always justify means. Lest we forget, the winner here is not just people with low-end hardware getting to consume AMP content; it's also Google, who profits directly off of that consumption. And THAT is where I believe the ethical lapse is. Google isn't doing this so people can get content easily on low-end hardware; they're doing it under the guise of that while laughing all the way to the bank, breathlessly defended by people who refuse to accept, for some reason, that Google is a business, and it acts in every way to further its business.

Just like Stadia is not Google setting out so that people who can't afford game consoles can still play the latest games, they are inserting themselves in a user's market so they can be the provider, and get that sweet, sweet engagement.


We're not okay with Google usurping web sites but we don't sympathize with publishers either.

The right thing is to build good web sites. Publishers obviously don't care about doing it right and we ended up with system requirements for web sites as a result. Google is now making it expensive for them to not care. Publishers are not a blameless victim of Google's monopolistic power, they actively contributed to the current state of the web.

People should not need a $1000 phone to read a news article. The only situation where it's acceptable for web sites to not work on "shit hardware" is when it's a WebGL application. In those cases, people know that powerful hardware is required before they even load the page.


If Google had blocked manufacturers from selling cheap Android phones then they would have just found another mobile operating system to use. Maybe Firefox OS or WebOS.

Also yeah I'm pretty happy that cheap smartphones are available for the masses to use. I have zero sympathy for content publishers with bloated websites.


Bloated websites exist for a reason: nobody wants to pay money for content, but content gets created by people who get paid for their job. So you are not paying for content, but you also don't want advertisements. What is the solution? In my mind, it's just not using those websites :)

But then don't hate publishers.


> Google allows OEM's to ship Android on shit hardware

Well, now there's an interesting complaint in this context. I thought Google was evil because they forced strategies on people, but now they're evil because they don't restrict hardware?


Yes, it’s one thing to promote a cleaner and faster web through better design and implementation. It’s another thing for Google to use its effective monopoly power to enforce that. As the FA says, Google didn’t invent the web or create its content; what gives them the moral right to take it over?

I think the collective web will eventually fix the problems without Google.

The root of the AMP issue is placement in Google’s search engine. Personally, I use DDG, and would be willing to pay a sizable subscription fee to keep it from being more like Google or from being acquired. But, most people probably would not - they are used to the web being “free”.

This is just another “embrace, extend, extinguish” effort, like the ones we have seen in the past. These attacks are transparently self-serving and should be “routed around”. It will require commitment to do so!


> Yes, it’s one thing to promote a cleaner and faster web through better design and implementation. It’s another thing for Google to use its effective monopoly power to enforce that.

AMP is more than just cleaner and faster - it gives Google control. They could discriminate on cleaner and faster without it, but they purposefully don't mention that, since it would undercut the push for AMP.


Agreed, it is yet another Google data collection method created under the guise of a beneficial offering


Firefox for Android seems to solve this "issue" for me completely. After I started using it, Google stopped showing AMP pages to me; even in the News section all the links are direct. So use mobile Firefox - I find it to be very good these days, no regrets. I still have Chrome (just in case), but I haven't used it since.


«the entire AMP approach to doing this is questionable»

Why? AMP is, roughly speaking, a subset of HTML that's somewhat easier to cache, and nothing more. Ideally it should be possible and encouraged to serve most webpages from a cache, to optimize Internet traffic on the global scale. It should be okay to fetch them from a cache without breaking anything. I don't see why the AMP Cache is hated so much. Publishers shouldn't care whether browsers hit their servers or some third-party cache, as long as they can have proper analytics. And guess what? AMP does provide a way to do proper analytics. You can even send analytics data to an in-house URL: https://amp.dev/documentation/components/amp-analytics/#send... I think most of the hate against AMP is unjustified. Any search engine could decide to cache AMP content.[1] AMP in and of itself doesn't give search engines "more control" over the web (whatever that means), it just makes the web easier to cache for everyone, all search engines, all end-users.

Edit: [1] not only Google caches it, Bing does it too: https://blogs.bing.com/Webmaster-Blog/September-2018/Introdu...
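To illustrate the in-house analytics point: with amp-analytics you define the endpoint yourself. A minimal sketch - the endpoint URL and request name here are made up for the example:

```html
<!-- Assumes the amp-analytics component script is loaded in the page head:
     <script async custom-element="amp-analytics"
             src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script> -->
<amp-analytics>
  <script type="application/json">
  {
    "requests": {
      "pageview": "https://analytics.example.com/collect?url=${canonicalUrl}&title=${title}"
    },
    "triggers": {
      "trackPageview": {
        "on": "visible",
        "request": "pageview"
      }
    }
  }
  </script>
</amp-analytics>
```

The runtime substitutes the variables and the hit goes to your own server, so first-party analytics keep working even when the document is served from an AMP cache.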


I'll let Google explain what's wrong with AMP[1]:

> What's in a URL? On the web, a lot - URLs and origins represent, to some extent, trust and ownership of content. When you're reading a New York Times article, a quick glimpse at the URL gives you a level of trust that what you're reading represents the voice of the New York Times. Attribution, brand, and ownership are clear.

> the recent launch of AMP in Google Search [has] blurred this line

> Google AMP Viewer URL: The document displayed in an AMP viewer (e.g., when rendered on the search result page). https://www.google.com/amp/www.example.com/amp.doc.html

Google has inserted itself in the URL. Copy and paste that, submit it to reddit or Hacker News, or just read it to a friend, and what do you get? A connection to Google.

1: https://developers.googleblog.com/2017/02/whats-in-amp-url.h...


But anybody (Bing, Yahoo, etc) can "insert themselves in the URL" if they decide to cache the AMP content. In fact they could also cache non-AMP pages if they wanted. This isn't a problem created by AMP in and of itself.

You can't even make the argument that AMP degrades privacy, because regardless of whether you click an AMP link or a non-AMP link in the search results, in both cases many search engines will ping back or use a redirect through a search engine-controlled domain, so they will be aware of the URL you click anyway, AMP or non-AMP.


Anyone else who inserts themselves in the URL should be fought as well.

I guess you're making a minor technical point, and it's technically correct. Someone else could do AMP better. But until someone does, why not shorten "Google's implementation of AMP" to simply "AMP"? Is there any other?


Bing also caches AMP content: https://blogs.bing.com/Webmaster-Blog/September-2018/Introdu...

I agree that there is a UX problem to solve (the address bar should show the original URL, copying it should preserve the original URL, etc) but whether the webpage got loaded from the original site or from some AMP cache is irrelevant.


Interesting, thanks for that. I don't really follow what Bing does.

It looks like Bing has the same problem, and serves AMP from bing-amp.com.


> Publishers shouldn't care whether browsers hit their servers or some third-party cache, as long as they can have proper analytics.

Perhaps not, but as a regular web user, I care a lot about this.


Why do you care? You like the address bar to show the original domain name? What if this UX problem was solved by the address bar always showing the original URL, regardless of whether the content was loaded from an AMP cache or not?


I care because I want to know what server I'm hitting up. There are many servers that I don't want to be touching, regardless of whether the bits being delivered are correct or not. If the URL bar is lying to me, then I can't detect if I'm talking to a server I don't want to be talking to.

I also want to avoid AMP pages themselves, and the URL is the easiest way to see if I've hit one or not.


"18. By Keith Devon on September 7, 2018 at 11:04

If Google only cares about a faster, more semantic web, then why not just give an even bigger ranking boost to faster, more semantic websites? Where does the need for a new standard come in, other than to gain more control?"

The above is a comment found in the OP.

Is there a requirement that AMP sites host resources with Google?

If there is, then Google has hijacked the purported goal of promoting websites that consume fewer client resources (and are therefore faster) -- arguably a worthy cause -- in order to promote the use of Google's own web servers,[1] thereby increasing Google's data gathering potential.

If there is no such requirement, then is it practical for any website to host an AMP-compliant site, without using Google web servers?

If not, then AMP sure looks a lot like an effort to get websites to host more resources on Google web servers and help generate more data for Google.

1. When I use the term "web servers" in this comment I mean servers that host any resource, e.g., images, scripts, etc., that is linked to from within a web page (and thus automatically accessed by popular graphical web browsers such as Chrome, Safari, Firefox, Edge, etc.)


> Is there a requirement that AMP sites host resources with Google?

Bing's AMP cache doesn't load any resources from Google.


"What AMP Caches are available?

Currently, there are two AMP Cache providers:

* Google AMP Cache

* Bing AMP Cache

AMP is an open ecosystem and the AMP Project actively encourages the development of more AMP Caches. To learn about creating AMP Caches, see the AMP Cache Guidelines.

How do I choose an AMP Cache?

As a publisher, you don't choose an AMP Cache, it's actually the platform that links to your content that chooses the AMP Cache (if any) to use."

The above is from amp.dev, formerly ampproject.org

As the dominant search engine/web portal (excuse me, "platform"), already having the largest web cache and the infrastructure to maintain it, it looks like Google therefore becomes the dominant AMP cache as well.


There is also the Cloudflare AMP cache that can be hosted on any domain, so it is easy to implement a link aggregator that gets instant article loading just like Google and Bing. Compare to the situation prior to AMP where if you wanted instant article loading, you would have to convince publishers to integrate directly with you like Apple News or Facebook Instant Articles.

Dominant AMP cache is a meaningless concept. You as the link aggregator have to have your own AMP cache to implement instant loading.


"You as the link aggregator have to own your own AMP cache to implement instant loading."

You lost me there. By "link" you mean URL?


Yes. If you're a search engine, a Reddit, a Twitter, or some other site that presents links to other pages expecting the user to click through to multiple pages, you can safely prerender AMP pages by implementing your own AMP cache but not by using Google's AMP cache.


> I recommend simply not building AMP pages at all, but instead working to build high quality, performant websites which gracefully handle device size changes and lack of javascript.

Doesn't matter. Google will penalise non-AMP sites. Let's not pretend there's a choice if you want people to find your content.


There is a choice. It might not be easy but it is right, and no monopoly is forever.


So, I guess there's a one-line rule for robots.txt to prevent Google from crawling my website?
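Technically such a rule exists, though applying it means vanishing from Google Search entirely, which is exactly the problem:

```
User-agent: Googlebot
Disallow: /
```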


If AMP somehow manages to sell quality and performance (whether you use AMP or not), that's mission accomplished!


EU citizens can submit formal complaints to the European Commission for suspected infringements of competition rules.

Here is more information on how to file a complaint: https://ec.europa.eu/competition/contacts/electronic_documen...

If you believe Google engages in anti-competitive practices with AMP, you have the power to signal these issues, which may result in an investigation.

You can also share your concerns with a simple email to comp-market-information@ec.europa.eu.

> You can report your concerns by e-mail to comp-market-information@ec.europa.eu. Please indicate your name and address, identify the firms and products concerned and describe the practice you have observed. This will help the Commission to detect problems in the market and be the starting point for an investigation.


I sent a complaint about AMP to Margrethe Vestager over 2 years ago, when this was relevant:

https://news.ycombinator.com/item?id=13414570

and they did nothing. I doubt they even studied the case; nothing related to this was ever on the table of the parliament. Google has continued to abuse its position and will do more if no one stops it. It's important to complain again and again until they step in.


AMP is ruining the mobile web. I cannot stand it. If it were actually made to be fluid, I'd see the value. But it's such a terrible UX, and so janky with the way it "pops in", and it messes up "browser back" abilities. Out of all the shitty things Google has ever done, AMP is #1 to me on that list.

Orrrrrr just give me a damn option to turn it off, if I want. I will never understand why companies force people into these types of major UX decisions on their behalf. Stop assuming every user is stupid. Sure, make it the default, I don't care about that for the everyday user, but for something as fundamental as the browser, I should have an option to turn off every single Google opinion they bake in.


If you use DuckDuckGo you don't have to deal with AMP at all. Giving it a go on mobile is a good way to see how well it works for you too as searches tend to be less mission-critical compared to desktop-based searches.


Oh, that's why I've never really had any trouble with AMP - I've been using DDG for years.

Posts pasted from other people frequently have AMP - guess I should suggest a better search engine to them.


You can also install a browser extension that redirects AMP -> origin for you.

Chrome: https://chrome.google.com/webstore/detail/redirect-amp-to-ht...

Firefox: https://addons.mozilla.org/en-US/firefox/addon/amp2html/


Yeah, I use DDG and only get crappy amp links on reddit and other forums. Either don't click or manually remove the amp. Google can [insert insult here].


AMP is literally the reason I switched to DuckDuckGo.


Unfortunately, it is my experience that DuckDuckGo excels on the desktop (lots of facts and technical questions), and falls short on many mobile use cases ("best cafes in Some City," assistance with shopping or goods, maps, etc).

I use it on my phone anyway, but I wind up using `!g` all the time.

(yes, I made this same complaint on the DDG topic just a couple days ago)


Yeah, Google is far better at location-aware searches. To be honest most of those are usually done in Google Maps (sigh.. please somebody make a decent Google Maps alternative!) anyway, so it's not really an issue.


It's not just location-awareness. They also excel at weird fuzzy searches.

random example: My girlfriend lost her laptop in airport security, and I wanted to find a picture of the specific scratch-and-sniff sticker she put on it for the claim form. Duck Duck Go search for "glossier blackberry sticker" didn't find it; Google Images did, first try.

* this turned out to be a good move, I got a positive response from TSA within minutes


One issue is that "a decent Google Maps alternative" would cost literally billions at this point. This kind of infrastructure is really the government's job, but they have struggled to keep pace...


> Yeah, Google is far better at location-aware searches.

That's one of the reasons that I'm glad that I stopped using Google search. I've always hated location-aware searches.


> please somebody make a decent Google Maps alternative!

https://wego.here.com/ ?


On DuckDuckGo there's a switch to turn on Region just below the search field, or you can even choose the Region manually. It helps a lot!


You can still try Qwant (pretty good in my experience), Bing or Startpage (which uses Google in the back - but I never had any trouble with AMP when using it). It's not like there is no choice. And of course, there should be no reason to support Chrome either, Firefox is a great alternative.

EDIT: interesting, never thought such a comment would get a downvote... Google brigade?


Startpage was recently bought by an ad company.

https://reclaimthenet.org/startpage-buyout-ad-tech-company/


Thank you very much, I wasn't aware of this! Changing my default search engine... again.


> EDIT: interesting, never thought such a comment would get a downvote... Google brigade?

It breaks more than one of the site guidelines to post like this. Would you mind reviewing them and following them when posting here?

https://news.ycombinator.com/newsguidelines.html


This could be a game changer, I did not know this!!! I will look at that today, thank you so much for the tip!


I've found DuckDuckGo to work very well as my default search on mobile (so well, in fact, that I can't remember the last time I had to go to google for a search).

I still haven't brought myself to use it on my laptop, but I do use Bing on Vivaldi (I use Opera, Vivaldi, FF Dev Edition, Opera Dev Edition, and Chrome Canary - the first two for everyday browsing, and the rest for dev-ing. I use google in Opera, and Bing in Vivaldi).

Using search engines other than Google is a nice change of pace, even if not solely to avoid AMP pages.


Avoiding AMP is not so straightforward. I use DDG too, but I still end up on amp links, Twitter for example uses AMP links by default.


Any good alternative to Google News?



I'm working on building something like this for myself actually. I'm just curious, what would you hope to get out of a replacement tool?


RSS.


This is what I did. When the Google News redesign happened, it made Google News substantially less useful to me. Enough so that I came up with a replacement.

It's not for everyone, as it requires running your own webserver, but I use Tiny Tiny RSS to aggregate the feeds of the various sources I'm interested in, then can read the aggregated feeds (I have multiple, a different feed for each general subject) through the web interface and/or by using an RSS reader. I use an RSS reader (gReader) on my mobile devices to do this.


I'm pretty sure that not all RSS clients require you to run your own webserver? Opera used to have one built in, and maybe Thunderbird?


HN + a regional newspaper is my solution.


It's also really annoying when you want to copy a link to send to people, but it copies the amp link instead.


What on earth? Chrome does this? (I'm a Firefox guy)


Every browser does this if you copy the address in the address bar, because it's an AMP link. (I believe "sharing" the link will use the canonical, non-Google one.)


Google browsers on google devices will rewrite the URL bar. IMO this is one of the more egregious offenses — the URL bar no longer accurately reflects the website you are browsing.


That wouldn't be a surprise; rewriting is.


It does! At least on mobile.


What is an example of a web site I might have used that breaks the Back button because it uses AMP?


This article focuses on what it's like for web developers and for the web ecosystem, which are both important issues. But AMP is also really annoying for end users.

As an end user, AMP gets in my way and complicates my experience. There's extra work to figure out what's going on. This page is from whatever site but "delivered by Google". As an end user, my reaction is basically: what the hell does that mean, why is it here wasting my time and cluttering up my screen, and when can Google cut it out?

Then sometimes I go to share a link with a friend over Slack or whatever, I hit the share button, and the URL comes out all fucked up. I know they're going to look at the URL to figure out what it's about (because in the real world, people do look at URLs), so I feel compelled to fix it, so I have to back up out of there, then dig around in the UI to figure out how to get a real URL. Maybe "open in chrome" will do it, or maybe I need to flip through the page itself to find where it gives a link to itself. I can never remember what works, and I don't want to have to.

I know AMP pages are supposed to load faster, and they probably do a little, but I would gladly trade that for simplicity.

Also, I would turn it off if they would give me the option, which wouldn't be hard but they don't, which tells me they don't want people turning it off.


Yeah, but on the plus side, it loads in a second, isn't sluggish to use, and isn't full of annoying fixed elements. Most non-AMP news websites take many seconds to load and are really slow and annoying once loaded.

I get all the arguments against AMP, but "annoying for users" surely isn't one of them.


Most websites I've personally encountered that use AMP don't actually deliver a usable amount of content/features on the AMP version of the page, and so it usually ends up in an awful user experience where I then have to get the real version of the page before I can do anything.


If I understand AMP correctly, there are 3 alternatives to compare:

(1) Google doesn't intervene at all, web sites are full of bloat

(2) Google requires mobile sites to not suck if they want decent rankings

(3) Google requires mobile sites to not suck and also delivers the bits instead of the site's servers.

I agree that #1 is not a good state of affairs. I'm fine with Google pressuring mobile developers to create sites that perform well. I just prefer #2 over #3.


(4) you remove bloat from websites yourself by disabling JavaScript.


> isn't full of annoying fixed elements

But it is: there's the AMP chrome around the website.


> I get all the arguments against AMP, but "annoying for users" surely isn't one of them.

Sure it is. It's certainly not annoying for all users, but it is annoying for some, including myself.


If as a user you don't want AMP, just don't use Chrome and Google Search. But for a website to miss out on the traffic from Google Search is a really steep price. We need everybody to switch to a different search engine.


Websites WERE building horrible, non-mobile news articles in HTML when AMP started at Google in 2015. The news articles were so slow and wasted so much bandwidth that many news orgs wrote bad apps (think the CNN app; the BBC app) to replace shit with even worse shit. That's what you get when you skimp on frontend engineers!!!

AMP gives little guys, the ones starting blogs and trying to grow, a shot at freedom of speech. The web was developing in a way that the big players like BBC and CNN would dominate with big budget winner-take-all walled gardens. AMP is one of Google's most anti establishment services, which means I'm sure Ruth will be killing it very soon!

This meant Google search on the mobile web was literally dying. Every year more and more content was being locked inside walled gardens!! I was a maintainer of AMPHTML 2015-2018 at Google. The project is hibernating and loses a ton of money; I know because I worked on the budgets for flash memory for AMP. At the time Facebook and others were proposing proprietary non-HTML news document formats. Google, to keep HTML alive, decided to cache AMP for free, which subsidized hosting costs for ALL news websites. I hate that now I have to switch browsers 2x to write an article comment, too! But news apps NEVER supported this AT ALL!! News apps NEVER supported a working search feature AT ALL!! News apps NEVER supported a good user experience or global search AT ALL!

If you want to rant, blame the bloatware mess that is HTML - it has almost killed the mobile web, not AMP! AMP is Google's attempt to keep HTML alive on phones...


Seriously, is HTML performance a real issue? Mobile traffic keeps growing and growing and growing, according to Google, who now crawls most sites mobile-first! Phones have 4 cores and download 300-MB games daily. There is absolutely no need for this abomination. If it cared, Google could threaten to derank slow sites for slow phones, and the average website size would be slashed in half within a week!

> AMP gives little guys, the ones starting blogs and trying to grow, a shot at freedom of speech.

whoosh, I now realize you're joking


Yes, it was a huge issue and many websites were unusable on anything but an expensive iPhone for a long time. Especially a few years back.

While this might not be a problem with most Apple-toting frontend engineers, most people of the world can't afford to constantly pay for very expensive phones just to browse the web. And until AMP there just wasn't a way to make anyone care it seems. Even here on HN.

Just to be clear: I dislike AMP. But I dislike the crap attitude towards users the web developers have shown time and time again more.


> just to browse the web.

I must be crazy, because I never had a catastrophic issue with an iPhone 8 - with an adblocker. If ads are the problem, well, guess who is serving those ads.

AMP doesn't even scale anyway - it will bloat like HTML pages bloat over time, because web people have a bad habit of only adding things to sites, never removing them. What happens then? We invent AMP-HTML2 to fix AMP? AMP is a very ill-thought-out band-aid for a culture problem that could be solved with simple nudges (have people forgotten what seismic changes happen to the web every time Google rolls out a new SEO algorithm?). AMP is probably the silliest tech idea of the decade.


Yes, of course you didn't have an issue with a new Apple device, that's exactly my point.

Did you try browsing the non-AMP web on something like Nexus 4? Motorola Moto E? Oppo and Xiaomi lowend units from 2015?


I don't recall ever noticing any issues with my Nexus 4. In my opinion, this is a better option: https://techcrunch.com/2019/11/11/google-chrome-to-identify-...


There are so many better ways that Google could solve this issue other than AMP (derank sites for slow devices / mark them as slow / pass a parameter for slow-phone visitors / create a Chrome version for slow devices). AMP is a dictatorial attempt to keep websites forever bound and limited to what Google is offering.


Yes, Google did whatever maximally benefits Google. They're a corporation and behave as such. Just like Apple won't de-DRM their cable protocols just because it's "right".

The question is - what can the web community do to make AMP redundant outside of complaint posts?


> to make AMP redundant outside

First, AMP is already redundant. It doesn't offer anything that stripped-down HTML can't do. The primary reason sites choose it is that Google ranks the pages higher! It's purely coercive.

Second, it's not as if AMP has taken over the web. But this coercion has to stop. Third, it's really easy to make a faster website with 10 minutes of work. I'm not sure we need some kind of activism to stop AMP; I do believe it will crash on its own as soon as most sites look exactly alike and start losing revenues. But until then... maybe ban AMP links?


If it's so easy then why have so few websites done it? Google has understood what Google/AMP haters refuse to see: web performance is not an engineering problem, it's a product and marketing problem. Coercion is exactly what's needed to push website owners to prioritize performance, because HN's monthly whinefest isn't cutting it. Here's two basic things AMP offers that stripped-down HTML can't do: a world-class CDN that many website owners won't justify investing in, and a clear, marketable incentive to develop a mobile-efficient website that VPs, marketers, product managers, and other business stakeholders can immediately understand.


> why have so few websites done it?

Because the vast majority of websites are reasonably fast on mobile? Loading times of 1, 2 or 5 seconds are a non-problem that AMP is addressing. The worst offenders I see are too-high-res images and autoplay videos, but frankly I can't remember seeing any of those recently. Most blogs/news sites are fine. Where is Google sourcing their data that users are desperate for web-breaking solutions that bring them 200ms response times? The purpose of AMP is so that people flick through a website instantly and then go back to Google. That's obviously not in the interest of the publishers. The whinefest is because Google is actively prioritizing AMP publishers, thus forcing it on the web.

> Coercion is exactly what's needed to push

this is not a defensible statement

> a world-class CDN that many website owners

Facebook needs a world-class CDN, not blogs.

> a clear, marketable incentive to develop a mobile-efficient website

The "marketable incentive" is the deranking of the site. It's entirely unnecessary to force AMP for that; a simple page-speed deranking would do.


Google knows load performance is a critical user need from the ample data they collect from Google search users; they've talked about this before. I forget the exact number, but every 100ms less load time drives significantly more traffic and engagement. I have no idea what data you're looking at that implies 5s load times are not a problem. I, for one, am overjoyed that Google is tackling this problem and succeeding at it.

Google has applied performance penalties to sites before and it still does. It's not enough, and there are limits to the penalties they can apply because these websites are ultimately very useful and relevant, it would worsen search quality to derank useful but bloated websites. The carousel is a good balance of incentive and penalty.


It's funny. Google is thinking of marking slow-loading sites, yet when I analyze my sites with their own page speed tool, the biggest blocker is Google/DoubleClick ads. I'm probably going to completely remove AdSense (auto ads are terrible), but can't they optimize their own code?


HTML no, Javascript yes.

Javascript for front-end frameworks, and especially for tracking.


Nice card-stacking.

Seriously, nothing is going to kill the mobile web more than Google continuing to overreach and use bait-and-switch tactics on publishers. Oh, sure, AMP is good for the "google-mobile-web experience", but bad for an open web.


If Google had an option for logged-in users to bypass AMP pages, I would not blame Google. They stubbornly refuse to do this, thus it is Google that is ruining mobile browsing for me.

(I would have written an iOS Safari extension that bypasses AMP years ago if Apple supported such a thing…)


> If Google had an option for logged-in users to bypass AMP pages

Not just for logged in users, for all users. Really, if Google provided some way to avoid getting AMP pages (through a cookie or something), I would have no problems with it.


I tried to find the real URL behind an AMP page to bookmark it, but couldn't. I think they've added a tiny (i) since then, but they're really trying to hide it.


Change your user-agent to mimic Firefox.


> AMP gives little guys, the ones starting blogs and trying to grow, a shot at freedom of speech.

> AMP is one of Google's most anti establishment services

You're either writing satire I don't get, or work for Google.

How exactly does a walled garden give you free speech? Especially when it's provided by those who profit the most from you not leaving said garden? While also forcing you to bypass standard practices?

Utter nonsense, unless it's a joke I'm not getting.


> If you want to rant, blame the bloatware mess that is HTML, it has almost killed the mobile web, not AMP

The author of this article is pretty much praising the bloatware mess that we have and wants more. I'm also puzzled.

To an end user, this article just gave the best highlights about AMP.


> AMP gives little guys, the ones starting blogs and trying to grow, a shot at freedom of speech

How so?

> AMP is one of Google's most anti establishment services

It looks like the exact opposite of that to me. This is Google's attempt at remaking the web in a way the enhances Google's control and power. That's pretty pro-establishment.


What I think he meant is that most news websites became slow and offered bad user experiences on mobile, pushing users to download walled-garden native news apps from established news corporations to experience something fast and kind of pleasant. This is a problem for Google and for "freedom of speech", because you're not googling for news anymore; you go straight to your established news app, preventing you from seeing other competing results (like blogs or smaller news websites, for instance).

Pushing them to have cleaner and faster websites makes users stay on the web. It is a clear benefit for Google but, to his point, for the user too. (At least that was the goal.)


Meh.

It was the third-party ad networks that caused performance issues on the news sites, as well as distributing malware.

There were a lot of Flash ads for a while, as well.

Definitely an issue prior to 2015.


AMP absolutely does not give the little guy a leg up.

In fact, it’s only the massive news sites that have the developer time to support AMP, meanwhile the little guy has to play around with terrible Wordpress plugins and spend hours fiddling with it just so Google will properly crawl their site.

And don’t even get me started on static site generators. AMP support is shoddy at best and a giant PITA for 99% of static site generators. Wordpress is one of the main reasons the web is so slow, yet AMP gives power to Wordpress since it’s the only way non-technical blog owners can support AMP.

AMP forces small time blogs and content sites to waste time building two versions of their website to rank alongside the big boys. How does this help the little guy?


> Wordpress is one of the main reasons the web is so slow

That's a funny way to spell advertising.


> AMP gives little guys, the ones starting blogs and trying to grow, a shot at freedom of speech.

As long as the big guys aren't on AMP yet. But an overlooked tradeoff is that the little guys are forced to play by Google's rules in terms of how and where they display ads, even the ones that aren't sourced by Google's ad network. It creates a completely uniform policy that undeniably benefits the scale of Google. A small publisher simply cannot differentiate their ad offerings. If you view that as a good thing for the end user, that's fine, but it's certainly not in favor of the little guy. Little guys depend on differentiation in every area of their business to effectively carve out a niche against a giant like Google's ad network.


I think you are dead wrong. 2015 didn’t mark some ah-ha moment when AMP came along and we were finally able to use the web on mobile. Most of the websites that did and still do have problems are auto-playing video news sites or sites with way too many ads than necessary.

AMP is just a step above the top results boxes Google puts on the results page that are scraped from other websites. See the other front page article about Google repeatedly stealing Genius lyrics.

Google shouldn’t become the new AOL.


Wouldn't ranking results by size of page have pushed sites towards more mobile friendly lightweight pages?


Yes! You're absolutely right that page size, coupled with something like time-to-render metrics, could do that!

Of course, there might be a wrinkle or two. How do you propose to evaluate the size of a page when large amounts of something like a newspaper article are loaded by reference, dynamic, and dependent on third parties making independent run-time decisions? How can you know a page's size won't vary 50% minute-to-minute in a world like ours? And how can you meaningfully measure load time in such a context?

You're absolutely right. Page size and speed could absolutely be better ways to do this! It's just maybe possible that there could be some minor obstacles to doing so.


> How do you propose to evaluate the size of a page when large amounts of something like a newspaper article is loaded by reference, dynamic, and depends on third parties making independent run-time decisions?

You downrank them immediately because that's slow.


Great idea!

Of course, there might be an issue here because the amount of things that work that way is huge. So now you have a scenario where everyone is angry at Google for trying to dictate how they can build web pages and writing angry digital polemics about how this is an unreasonable standard and abuse of power. Nobody actually wants to re-implement massive chunks of how their website works, so everyone will resent this incredibly artificial imposition.

Which is to say it's a wonderfully straightforward answer, but perhaps not better than AMP in practice.


> everyone is angry at Google for trying to dictate how they can build web pages

But we're already doing that because Google downranks results that don't use AMP. We're generally OK with Google downranking sites on actual metrics (such as HTTPS) but not when they're pushing their "solution" that clearly has a number of issues with conflict of interest.


Are you saying you'd be completely fine with the above scenario, where Google downranks each webpage based on the number of external assets it loads and the amount of dynamic content it has? Instead of using AMP?

Personally, I prefer AMP for security reasons. It's tightly restrictive and does a lot to limit the available space to mount attacks aimed at browsers. But I understand that's far down most people's lists, and tends to fall under the same sentiment as "devs should just write fast websites".
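The hypothetical ranking signal being debated here - downranking by the number of external assets a page loads - is easy enough to sketch. This is purely illustrative: the class and function names are invented, and treating every off-host reference (including subdomains and CDNs) as "external" is a deliberate simplification.

```python
# Hypothetical sketch: score a page by counting references to third-party
# resources, one possible "actual metric" a search engine could rank on.
from html.parser import HTMLParser
from urllib.parse import urlparse


class AssetCounter(HTMLParser):
    """Counts tags that reference resources hosted off the page's own host."""

    # Tags that pull in external resources, and the attribute holding the URL.
    ASSET_TAGS = {"script": "src", "img": "src", "link": "href", "iframe": "src"}

    def __init__(self, page_host):
        super().__init__()
        self.page_host = page_host
        self.external = 0

    def handle_starttag(self, tag, attrs):
        attr_name = self.ASSET_TAGS.get(tag)
        if not attr_name:
            return
        url = dict(attrs).get(attr_name, "")
        host = urlparse(url).netloc
        # Relative URLs have an empty netloc; any other host counts as external.
        if host and host != self.page_host:
            self.external += 1


def external_asset_count(html, page_host):
    parser = AssetCounter(page_host)
    parser.feed(html)
    return parser.external
```

For example, `external_asset_count('<script src="https://ads.example.net/a.js"></script><img src="/local.png">', "news.example.com")` counts one external asset: the ad script, but not the locally hosted image.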


Yes, because I think that is a pretty accurate way to measure how much I would hate to go to that page, and it doesn't require AMP to work.


I suspect you may be an outlier, as most people seem to deeply resent the strong incentives to change how they author web pages. Shaping them slightly differently strikes me as unlikely to generate a dramatically different reaction.


Rewriting a page to use amp is not shaping it slightly differently.


You're absolutely correct. Please accept my apologies for being unclear. I was speaking specifically and narrowly of strong incentives to build web pages differently being shaped slightly differently under a hypothetical regime.

Again, please accept my apologies for my failure to communicate my point clearly.


This is a really old article, but as long as we're here: just a quick reminder that the AMP standard still includes platform-specific components that favor individual companies[0] over smaller creators. It's still not clear what will happen to the components when those services disappear[1], and it's still not clear whether Google has the guts to tell someone like Facebook that a new component feature isn't performant enough to be included.

Quick reminder that the only way to do captchas in AMP is to use Google ReCaptcha.

There are a lot of reasons to hate AMP, but one big reason I hope doesn't get drowned out is that it's not just anticompetitive in the sense of handing control of traffic or hosting to Google. It's anticompetitive in the sense of reducing functionality on the web to a handful of large corporations that have every incentive to reduce diversity and place harsher performance restrictions on competitors than they place on themselves.

[0]: https://amp.dev/documentation/components/?format=websites

[1]: https://amp.dev/documentation/components/amp-vine/?format=we...


> "Quick reminder that the only way to do captchas in AMP is to use Google ReCaptcha."

That is terrible. ReCaptcha is the worst. Also, ReCaptcha seems to discriminate against Firefox, and if AMP discriminates against other captchas, this might actually count as monopolistic abuse by EU rules.


A website owner converting to AMP is no longer an owner; they're a gig worker for Google.


I love AMP sites that do it the right way, like Politico. Keeps the real domain, loads fast, clean interface. I wish more sites were like this. I think the first version of AMP where the URL was always "google.com/amp/politico/sdgffsdf" was awful but you can now keep the correct domain and I sometimes prefer it to the regular version of a lot of sites.

https://www.politico.com/amp/news/2019/12/04/trump-impeachme...


It's nicer than the original AMP setup, but still awful for publishers.

For any user that navigates to your AMP page from a Google search...

The publisher gives up the most important piece of screen real estate, and Google hijacks left/right swipes to navigate to your competitors. And they hijack the back button post-swipe too: back equals "back to Google", not back to the page I swiped from.

It is pretty much like early AOL. A semi walled garden. It offers some speed benefit for users, but way more benefit to Google.


What's the most important piece of screen real estate that they're giving up?


The top. Top left is red hot on any heatmap that tracks eyeball movement. Google controls what goes there.


The page you linked takes 8s to display on my browser, even on subsequent reloads, just because I don't allow third-party scripts. It also has no displayed images, for the same reason. I really don't wish more sites were like this.


> I think the first version of AMP where the URL was always "google.com/amp/politico/sdgffsdf" was awful

But that has the advantage of making it easier to find the real page rather than the AMP page.


Is this served from Politico's servers, and how is it different from a stripped-down version of their site?


It's not. It's using "Signed Exchanges", which Chrome supports, but most other browsers do not.

It's just AMP with some crypto that lets Google masquerade as your domain.


Correction: It lets anyone cache your page, not just Google. And no "masquerading"; that's what the crypto is designed to prevent. Also it's not specific to AMP; you can use signed exchanges with any data served over HTTPS.


It's effectively just Google since it's not widely supported by browsers other than Chrome. There's also only one CA provider that can create the right certificate for SXG.

Or maybe you have some notable examples of SXG being used in a production non-AMP scenario?


The standard is brand new, and AMP was the motivating factor for its creation, so obviously the majority of existing use cases are AMP-related. That doesn't mean you couldn't go and implement a non-AMP use case in your own production site today.


One interesting use case for SXG is to allow decentralised and offline websites, since the site's data can be tied to a key/certificate/domain without having to be downloaded from a specific server. As an example, the IPFS project is already trialling the technology:

https://github.com/ipfs/in-web-browsers/issues/121


Tying it to a domain name (which is the typical use of the URL) breaks the web though. I could understand if the key were used to show that the origin is a Twitter account handle or something, but breaking the semantics of the domain by signing the content doesn't make any functional sense. Other than putting lipstick on a pig (AMP), of course.


oh wow, Signed Exchanges are worse than AMP!

"make sure you are visiting mybanksite.com" is no longer safe.


> oh wow, Signed Exchanges are worse than AMP!

> "make sure you are visiting mybanksite.com" is no longer safe.

Sounds like you don't trust public key based content signing. This is just broadening public key based signatures beyond the domain to include the domain and the content itself, and using signing to make the authenticity of the content independent of the physical infrastructure that served it.

That' what's being used here to verify authenticity of content's source, just like PGP/GPG does for signed emails.

That's a far stronger guarantee than "the data is authentic because it came IP address range X purchased by company Y".

In fact, without a such signature, there is no guarantee that just because a piece of content came from a particular server/datacenter, that it is authentic.

With signed exchanges, the chain of authenticity is pushed all the way back to the website's content creators - it doesn't stop at the web server. Also, this can't be phished unless you break the content signing algorithms, and if that happens ... we all have bigger problems.
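The transport-independent trust model described here can be sketched in a few lines. Loud caveat: this is a toy illustration only. Real signed exchanges use asymmetric, certificate-based signatures; since Python's standard library has no asymmetric signing, an HMAC stands in for the publisher's key purely to keep the sketch runnable. The point it demonstrates survives the substitution: verification depends on the content and signature, not on which server handed over the bytes.

```python
# Toy model of content-level authenticity: the verifier checks a signature
# over the content itself, so delivery via an untrusted cache (or a USB
# stick) does not matter. HMAC is a symmetric STAND-IN for real
# certificate-based signing; do not use this pattern for actual SXG.
import hmac
import hashlib

PUBLISHER_KEY = b"mybanksite.com signing key"  # stands in for a private key


def sign(content: bytes) -> bytes:
    """Publisher signs content at publication time."""
    return hmac.new(PUBLISHER_KEY, content, hashlib.sha256).digest()


def verify(content: bytes, signature: bytes) -> bool:
    """Anyone holding the (content, signature) pair can check authenticity,
    regardless of where the pair came from."""
    return hmac.compare_digest(sign(content), signature)


page = b"<h1>Account balance</h1>"
sig = sign(page)

# An untrusted CDN can deliver (page, sig) and verification still succeeds,
# but any tampering with the content breaks it:
assert verify(page, sig)
assert not verify(b"<h1>Send money to attacker</h1>", sig)
```

The design choice this mirrors is the one debated in this thread: trust attaches to the signing key rather than to the serving host, for better or worse.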


First, it breaks the URL specification, as the "host" is no longer a host. It breaks users' expectation of one of the VERY FEW things that everyday users understand about the internet.

One may manage to upload an HTML file to the bank's server and serve a -signed- page that Google AMP will cache, and then use it to phish customers from within the bank's domain. Or just use a stolen key to make thousands of such pages before the bank finds out. I think, contrary to what you say, it's a brand new, major attack surface.


> first, it breaks the URL specification, as the "host" is no longer a host.

By this definition, "host" hasn't been a host in a long time, since the time it was possible to route DNS traffic to multiple IP addresses, possibly in different datacenters.

> it breaks user's expectation of one of the VERY FEW things that everyday users understand about the internet.

How is signing content directly less authentic than signing only at the web server? Signing content directly at the time of publishing ensures that it was created using the private keys of the entity in question, regardless of the delivery mechanism for the content.

> one may manage to upload an html file to the bank's server and serve a -signed- page that google amp will cache,

Signed content exchanges specifically limit that by putting the content signing step at the content creator level, not the web server level. Unless you steal the content creator's private keys, you can't represent your content as theirs.


> "host" hasn't been a host in a long time,

Does SXG make this better or worse?

> ensures that it was created using the private keys

signing at the server ensures that it was created using the key AND served from a host they control. How is that not better?

> you can't represent your content

wouldn't the server sign all http responses by default? all you would need to do is upload a file


> wouldn't the server sign all http responses by default? all you would need to do is upload a file

No, the content has to be signed when it is created, in the content management system or similar content creation tool, not when the server sends it. The content management system itself must have strong controls on it (ACLs, controlled user accounts, protected private keys stored only on encrypted and access controlled media, regular audits, etc).

Basically the server itself is no longer trusted as the arbiter of content authenticity, the actual content creator is. Concretely, when the editor at a publication approves an article after reviewing it, it is signed for delivery at the moment of publication, not at the moment that the request is served.


So that means I can sign a page on the editor's computer, take it with me, and serve it to AMP from my website? That sounds even more dangerous tbh. It delegates security from people who may know a little bit about it (web hosts) to people who likely know nothing about it (writers).

What happens if someone's key is stolen and they need to re-issue it? Are all the previously published copies now invalid?


> first, it breaks the URL specification, as the "host" is no longer a host.

Really, how so? RFC 3986 goes out of its way to make clear that the "host" component doesn't mean DNS, and doesn't even have to denote a host.

"In other cases, the data within the host component identifies a registered name that has nothing to do with an Internet host."

"A URI resolution implementation might use DNS, host tables, yellow pages, NetInfo, WINS, or any other system for lookup of registered names."

> it breaks user's expectation of one of the FEW things that everyday users understand about the internet.

What, exactly and concretely, is that expectation?

> one may manage to upload an html file to the bank's server and serve a -signed- page that google amp will cache, and then use it to phish customers from within the bank's domain.

If the attacker can upload arbitrary pages to the bank's website, just why would they need signed exchanges? They've already got their phishing page on the correct domain.


> RFC 3986 goes out of it's way

The RFC uses the word "host" and not "signer". It also says that the "host" is intended to be looked up in some service registry, and there is no such thing for arbitrary signers.

> exactly and concretely, is that expectation

One of the common security advice banks used to give is "check your browser address that you are in our server"

> just why would they need signed exchanges

With signed exchanges they can fool AMP into caching the page long after it has been deleted from the server.


The RFC explicitly says that "host" doesn't necessarily mean an actual host and you still insist the opposite. So I don't really know what to say.

> One of the common security advice banks used to give is "check your browser address that you are in our server"

So you say that everyday users have an expectation that they're "in the bank's server"? That doesn't seem very concrete, since that could mean anything. Surely there is some kind of expectation they have about actual behavior or property. Something that will happen / can't happen right now, but the opposite with signed exchanges.

> Anyone who has the file can intercept the form data from that page now - a complete phishing attack.

Uhh... And just how would they do that? They can't inject anything into the page, and they can't modify the page. How do you figure they force the browser to submit the form to the wrong server?


> They can't inject anything into the page

Assuming that someone finds a way to sign a malicious HTML page (e.g. by sneaking into the editor's office), they can serve it from anywhere, and the browser will pretend it's coming from the bank.


If someone's able to get the signing key you've already failed at security.


> One of the common security advice banks used to give is "check your browser address that you are in our server"

" in our server" is a simplification of the technical explanation: "signed by our computers using our private keys before delivery to you". That is still maintained in the case of signed content exchange, but instead the transport function is provided by a different server.

It's not much different than, i.e. signing a compiled app with your private keys before uploading it to an app store. Such apps also use hosts to identify themselves and their content, even though they are delivered via app-store mechanisms.


> signed by our computers using our private keys before delivery to you

Please try to explain that to an everyday grandma.

I still don't see how it's an improvement. The file can be masqueraded by an arbitrary server god knows where and still be served as valid to me. Anyone who has the file can intercept the form data from that page now - a complete phishing attack. There are so many things that can go horribly wrong it just makes one wonder what's wrong with googlers these days: https://blog.intelx.io/2019/04/15/a-new-type-of-http-client-...


> one may manage to upload an html file to the bank's server and serve a -signed- page that google amp will cache

Only if you have the bank's private key, and the ability to serve arbitrary content from the bank's domain. In which case... yeah, I don't see how the signed exchanges standard makes that problem significantly worse.


I don't know what the max expiration for AMP's cache is, but I could set a really long expiration date on the file and remove it from the server without the bank ever knowing it existed. SXG doesn't even require an upload; one disgruntled employee could do the same with a stolen key.

Nobody benefits from this shit but Google. Do we really need more attack surfaces?


I hadn't realized the content was actually signed; I assumed we were simply trusting Google to send us the content they said they were sending (much like we do when using the Google cache). I'm curious now: would it be possible to use use the content/markup intended for use by the amp cache to view a static/unscripted/readable version of the page's main content? If so, why hasn't anyone built a browser extension to do so?

On a broader note, this also sounds like it could be used to allow caching proxies to work with https; you'd lose the privacy, but you'd gain from being able to cache content on local network if the browser only had to verify the content, and you trusted the cache not to spy on you.


> I'm curious now: would it be possible to use use the content/markup intended for use by the amp cache to view a static/unscripted/readable version of the page's main content? If so, why hasn't anyone built a browser extension to do so?

If the goal is to get around the AMP CDN, you don't even need to read the main page content. The AMP URL contains the original source URL itself [1].

The extension you are describing would just need to capture all requests with the prefix https://www.google.com/amp (or whatever CDN you are trying to get around), parse out the original URL, and then fetch it, and do what you will with it.

If the goal is to disable scripting on the AMP CDN delivered content, first note that AMP pages can't contain page-author-written JS [2], and any implicit JS has to run async.

But if that's insufficient, you can disable JS in the browser altogether, which would disable it in the loaded AMP content.

You could also try to parse out the main content from your extension if you know from the URL that it's an AMP page. Because AMP forces relative terseness and simplicity of HTML content, it is probably easier to parse than the original page's content. Obviously that won't generalize easily given the large variety of possible content representations, but you stand a better chance of achieving this with AMP content than the original content.

And if you generalize it enough, you will end up with one component of a web crawl / indexing system in an extension ;)

1. https://blog.amp.dev/2017/02/06/whats-in-an-amp-url

2. https://amp.dev/about/how-amp-works/
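The URL-unwrapping step described above - pulling the original source URL out of a Google AMP cache URL - can be sketched in a few lines. Assumptions: the `/amp/s/` segment marks an https origin (per the amp.dev post linked above), only the `google.com` prefix form is handled (not the `*.cdn.ampproject.org` form), and query strings are dropped; the function name is invented.

```python
# Minimal sketch: recover the original source URL from a Google AMP
# cache URL of the form https://www.google.com/amp/s/<host>/<path>.
from urllib.parse import urlparse


def unwrap_amp_url(amp_url: str) -> str:
    parsed = urlparse(amp_url)
    # Only handle the google.com/amp/ prefix form; pass everything else through.
    if parsed.netloc in ("google.com", "www.google.com") and parsed.path.startswith("/amp/"):
        rest = parsed.path[len("/amp/"):]
        # An "s/" segment after /amp/ indicates the original page was https.
        if rest.startswith("s/"):
            return "https://" + rest[len("s/"):]
        return "http://" + rest
    return amp_url  # not a recognized AMP cache URL


print(unwrap_amp_url(
    "https://www.google.com/amp/s/www.politico.com/amp/news/2019/12/04/story"))
# -> https://www.politico.com/amp/news/2019/12/04/story
```

A browser extension along these lines would intercept matching requests and redirect to the unwrapped URL instead of fetching from the cache.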


I’m not sure you understand the purpose of https. Ensuring integrity of the document served by the server is only one small piece of it.

The other critical components are:

- encryption so middleboxes can’t see what you’re looking at

- guarantee (via the PKI) that the server you’re about to send your banking credentials to is using a cert that belongs to the domain name in the address bar that you trust sending your credentials to.


> encryption so middleboxes can’t see what you’re looking at

> guarantee (via the PKI) that the server you’re about to send your banking credentials to is using a cert that belongs to the domain name in the address bar that you trust sending your credentials to.

The purpose of SXG is to allow publisher signing of edge-cache accelerated public content - i.e. it's read-only - not to encrypt private information like credentials in transport. Https still handles encrypted transport independently of SXG.

Also, why or how would someone create a system that accepted private info or credentials via signed SXG anyways? There's literally no mechanism in it to achieve that. If you tried to build a password entry field for your bank website and distributed it via SXG, it wouldn't even work in the first place.


> The purpose of SXG is to allow publisher signing of edge-cache accelerated public content

Is there a rule that SXG content can't contain forms or sth?


No, you can distribute whatever content you want. But the content distribution network can't listen for posts from those forms when the content is rendered.

SXG doesn't answer DNS requests for your domain. It only says that a particular piece of content has been signed using private keys that have been registered with the displayed host. That's it.

In fact, you don't even need a CDN or DNS to distribute SXG content. You could distribute it via USB drives, or code flags, USB drives attached to messenger pigeons, whatever. The point is that authenticity of the origin of the content is completely independent of how the content got to you.

When that SXG content, however it is distributed, is rendered, the browser represents that content as originating from your domain, which is in fact exactly where it originated.


There are 100 ways to steal credentials if you manage to convince the user that it’s safe to start typing in the page, since you can serve malicious js that way.

I really don’t understand why the browser would masquerade the url just because the content is signed. At best it is able to say ‘the content is signed with x’s key’


> There are 100 ways to steal credentials if you manage to convince the user that it’s safe to start typing in the page, since you can serve malicious js that way.

That's true, but it's completely independent of SXG. There's no way to trick SXG into showing a URL that it's not signed for. You would have to steal the private keys.

> At best it is able to say ‘the content is signed with x’s key’

Remember that x's key is cryptographically associated with their domain - that's how web certs work - so the browser can also say that "this content is signed with domain x's key". That's exactly what happens with https today, but with https, the chain of attribution implied by the signature stops at the webserver, since it holds the private keys for signing the content.

SXG allows the chain of attribution to be completely independent of the transport mechanism, https or otherwise. Of course, you should still use https to encrypt during data transmission over the internet, but that's orthogonal to content signing.

This is also directly analogous to how app stores distribute cryptographically signed apps. For example, it allows an iPhone to open a local native iOS app in response to a URL click in web content [1]: The app and the URL are both cryptographically signed by the same entity, so iOS can conclude that they are from the same origin, and allow the app to handle the URL.

1. https://developer.apple.com/library/archive/documentation/Us...


i agree but i just can't justify the connection between the domain and signed content. The root node here is "X's key" and it is used to sign a domain cert and also a document. It's semantically wrong for the browser to pretend that the document belongs to the domain, and even more wrong when the signed document is being served by another domain with a completely different cert, google's!

Even app stores don't do that - if you download a signed app from any domain, it won't pretend it's downloaded from apple.com but it will report that it's signed by Apple Inc. The situation is not analogous anyway because there are very few app stores, from 3-4 highly trusted corporates. If any of their app store private keys are stolen the internet is fucked.


> The root node here is "X's key" and it is used to sign a domain cert and also a document. It's semantically wrong for the browser to pretend that the document belongs to the domain

Browsers "pretend" exactly this every time they download a page via HTTPS. It's how HTTPS works. Did you think that they trust that the content comes from the correct source by just doing a reverse DNS lookup on the IP address? They don't. Instead, they check a signature from the web server against their cert keystore, and if the PKI signature check fails, you get a big scary warning that the connection isn't secure/private. The same thing would happen with SXG based content if the signature didn't match the keystore, except the signature to be checked is carried with the content itself, just like with PGP/GPG.

> Even app stores don't do that - if you download a signed app from any domain, it won't pretend it s downloaded from apple.com but it will report that its signed from Apple Inc.

I just checked an iPhone, and they appear to attribute an app to the creator, not Apple, Inc.

But the reason they don't show a download domain is because consumer iOS apps can only be downloaded from Apple, from the App Store, and nowhere else. Adding download source information to the iOS UI would be totally redundant, as the value would always be 'downloads.apple.com' or whatever.

If you look at the actual cert signing procedure for iOS apps, the configuration step includes the domain, which is why Apple can associate an entity's apps with it's https websites. Nonetheless, the apps are still signed by the app's creator, not Apple, and the app's creator is responsible for securing the private keys [1]

> The situation is not analogous anyway because there are very few app stores from 3-4 highly trusted corporates.

Why should the 3-4 big corporates be the only entities who can sign or distribute apps or static web content? They are not the only entities capable of securing private keys. Banks do it all the time, as do individual app developers (note the warnings to app developers about private key management on Apple's website). They are also not the only entities capable of distributing content. App and content stores can provide many other services of added value, like aggregation and curation and payment systems, but signing and distributing content isn't one of those services they can uniquely provide.

You could even argue that distributing the ability to sign and distribute content away from the big corporations reduces single points of failure and makes the whole content distribution ecosystem more robust and fault tolerant.

1. https://developer.apple.com/support/certificates/


Well, thanks for your reply. I still think SXG breaks semantics.

> Browsers "pretend" exactly this every time they download a page via HTTPS.

Yeah, and the big scary warnings are for the connection, not the content. Currently browsers tie the URL host to DNS, so the semantics are different: the cert certifies the distributor. I also think this is only true for certs that don't have an organization name; at least I think that for extended-validation SSL they still show this: https://upload.wikimedia.org/wikipedia/commons/6/63/Firefox_...

> and they appear to attribute an app to the creator, not Apple, Inc.

Indeed, I meant that they attribute the app to Apple Inc as the creator, but not their domain, which is, again, different semantics. (Although I suppose Apple is somehow involved in ensuring that the correct binary is distributed for every developer.)

> Why should only the 3-4 big corporates

I'm obviously not saying they should, but that it's not an analogous situation, with their walled gardens and all. The web is nobody's walled garden, and a large part of the content is public domain which doesn't need any signing. That's why app store logic doesn't apply.

> reduces single points of failure

That's what software hosts already do by providing hashes for binaries. And it's great that SXG can verify content through the browser. But it shows where the content was created, not where it was distributed; that's why I think it's wrong to change the URL.

there is also a laundry list of dangers that they introduce that seem pretty serious for something that is being pushed forward for basically cosmetic reasons: https://blog.intelx.io/2019/04/15/a-new-type-of-http-client-...


I don't think there's a real phishing risk with them, but I object to Signed Exchanges because they are actively making the browser lie to me about the URL being used.


The URL the browser shows is the one which was cryptographically verified to be correct. I don't see how you can call that a "lie".

If I'm offline and I open an offline cached page in my browser, would you call it a lie if the browser displays the URL I originally downloaded that page from in the URL bar instead of saying it came from "your hard drive"?


It's not just us HN commenters that are concerned. Mozilla, for example, is highly opposed to it in its current state.

"Mozilla has concerns about the shift in the web security model required for handling web-packaged information. Specifically, the ability for an origin to act on behalf of another without a client ever contacting the authoritative server is worrisome, as is the removal of a guarantee of confidentiality from the web security model (the host serving the web package has access to plain text). We recognise that the use cases satisfied by web packaging are useful, and would be likely to support an approach that enabled such use cases so long as the foregoing concerns could be addressed."

Mozilla has the proposal marked as "harmful".

Apple/Webkit have concerns as well: https://news.ycombinator.com/item?id=19679621


> We recognise that the use cases satisfied by web packaging are useful, and would be likely to support an approach that enabled such use cases[...]

That doesn't sound "highly opposed" to me.

Anyway, I read the full report from Mozilla back when they first published it, and while they do have some valid concerns (any new feature introduced to the web will necessarily introduce some new attack surfaces) I believe their concerns are already sufficiently well addressed by the standard.

The paragraph from Mozilla that you quoted is also rather vague and misleading. In particular:

> the ability for an origin to act on behalf of another without a client ever contacting the authoritative server is worrisome

This is super vague. I see no reason why that should be "worrisome". That sort of thing happens all the time in public key cryptography. When you receive a message signed with the private key of a trusted actor, it's perfectly reasonable to trust that the trusted actor authorized that message regardless of where the message itself came from. TLS itself already does exactly that every time you visit a website over HTTPS (your browser trusts certificates signed by a trusted CA, even though those certificates are being presented by an untrusted website, not the CA itself).

> as is the removal of a guarantee of confidentiality from the web security model

This concern is completely unfounded, and I'm surprised Mozilla included it in their summary. The use of the signed exchange standard doesn't reveal any information to any party that would not already have access to that information without the standard (a host serving you a link to a static, public page will necessarily already have access to the plaintext content of that page, regardless of whether they serve you that content themselves or not).


>That doesn't sound "highly opposed" to me

They marked the proposal as "harmful", and it remains marked that way.

I wasn't trying to exaggerate. I could cite other passages that support "highly opposed".

Mozilla did publish a pretty extensive document that explains their position and plans: https://docs.google.com/document/d/1ha00dSGKmjoEh2mRiG8FIA5s...


Yes, I know. Again, I read the full report. I don't think "Harmful" is an accurate summary of their position either. (At least in a layman's sense of the term; it may very well be the correct category from the perspective of Mozilla's formal standards position process.)

The more detailed summary in the full report says:

> There is a lot to consider with web packaging. Many of the technical concerns are relatively minor. There are security problems, but most are well managed. There are operational concerns, but those can be overcome. It’s a complex addition to the platform, but we can justify complication in exchange for significant benefits.

> [...]

> Big changes need strong justification and support. This particular change is bigger than most and presents a number of challenges. The increased exposure to security problems and the unknown effects of this on power dynamics is significant enough that we have to regard this as harmful[1] until more information is available.

> We’re actively working to understand this technology better. The Internet Architecture Board are organizing a workshop that aims to gather information about the bigger questions. That workshop is specifically structured to collect input from the publishing community. The technical details of the proposal will also be discussed at upcoming IETF meetings. Based on what we learn through these processes and our own investigation, we might be able to revise this position.

(Source: https://www.iab.org/wp-content/IAB-uploads/2019/06/mozilla.p...)

That doesn't sound "harmful" to me, it just sounds like they're skeptical, and possibly a bit confused. The meat of their concerns also seem to be primarily political, not technical.

[1]: https://github.com/mozilla/standards-positions


It's "harmful" in its current form, and Google hasn't yet committed to addressing all of Mozilla's concerns. Mozilla could have chosen a different label than "harmful". They did not. They didn't change it either.

Last I understood, Apple had similar concerns. I find it unlikely that both of those orgs are making noise for no good reason.


There are only 6 labels to choose from. They actually couldn't have picked a different label without making up a new one, or without making their choice of label even more misleading than it already is.

Let's try a different approach. How about this: I've carefully read over both the spec itself and everything Apple and Mozilla have to say on the matter (that I was able to find anyway), and have come to an informed conclusion: both Apple and Mozilla are wrong. (That's actually a rather poor, oversimplified summary of my position. But no more so than "harmful" is a poor, oversimplified summary of Mozilla's position.)

You are making an argument from authority. I consider myself sufficiently well informed on this particular topic to be making arguments based on facts and reason. I don't find you repeatedly citing a one-word summary of Mozilla's position on the matter (which is actually quite nuanced, and not at all able to be summed up by a single word) to be particularly convincing.


Let's try this... It isn't me anyone needs to convince. An appeal to authority is appropriate when said authorities control the browsers needed for the proposal to succeed.

One of those 6 labels is "non-harmful". If it isn't harmful, that seems right. Here's the legend:

"Mozilla does not see this specification as harmful, but is not convinced that it is a good approach or worth working on."

Mozilla didn't choose that label.

My view is that the proposal was driven by a desire to make AMP less icky. It looks like it could have broader benefit if the concerns Mozilla outlined are addressed. I am skeptical Google will do that.

As for your characterization of yourself as "well informed" and me as, er, something else...really? Was that necessary?


> I don't see how you can call that a "lie".

It's a lie because the URL being displayed does not reflect the source of the bits.

> If I'm offline and I open an offline cached page in my browser, would you call it a lie if the browser displays the URL I originally downloaded that page from

That's a bit of a gray area. Yes, it is a lie (the browser should provide an indication of the actual source of the bits). On the other hand, the cache was created by you and exists on your own machine, so it's more of a little white lie in that case.


What about something like Cloudflare, would you say they're lying when they return a cached file instead of contacting the origin server?


Yes, because Cloudflare isn't telling me that it's coming from them. However, that's already a lost battle.


How is it no longer safe?


Phishing


They can't alter the content - that's where the 'signed' part comes in. Any forms there would still go to the original source.


I believe it's done via signed exchange. You are free to host it wherever, I think.

https://amp.dev/documentation/guides-and-tutorials/optimize-...


I had been told (and I have 0 special knowledge here, this is just what a consultant in this space explained to me a few years ago) that AMP boosted your placement specifically because latency was a scored and important factor.

As such, all you needed to do to get similar rankings was use any sort of CDN hosting for your page and you would get similar results to using AMP.

Also, it sorta seems to me like the author is complaining, "I can't just do a minimum effort AMP page for the search juice, I actually have to make a functional AMP offering or not use AMP at all." Strictly as a consumer, I feel like maybe Google is doing me a favor while telling off a publisher.


CDN helps with the page load/latency variable of Google's PageRank but won't equal AMP.

To get AMP-like speed you'd need: CDN, no render-blocking javascript, minimized image files, "lazy-loaded" assets & inlined CSS for "Above the Fold" content. On the server side you want to cache content with something like Varnish & send it over an "Edge" network like Akamai or Fastly. Ideally everything is served over HTTP/2 or SPDY.

Doing all that replicates what AMP does.
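For what it's worth, the client-side half of that checklist is just a few lines of markup. A rough sketch (file names made up, and this obviously omits the server-side caching/edge pieces):

```html
<!-- Inline only the CSS needed for above-the-fold content -->
<style>
  header { font: 16px/1.4 sans-serif; }
</style>

<!-- Defer scripts so they never block rendering -->
<script src="/app.js" defer></script>

<!-- Let the browser lazy-load below-the-fold images; explicit
     dimensions avoid layout shifts while they load -->
<img src="/photo.jpg" loading="lazy" width="640" height="360" alt="photo">
```

None of this requires AMP's runtime or markup restrictions, which is the commenter's point.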


That's sort of my point: publishers seem mad at the temerity that they can't festoon their web pages with tons of stuff and then not also be put in a category that is meant to be fast.

There are other parts of AMP I am less okay with, but tbh I trust the publishers even less than I trust Google with respect to sketchy tracking bugs and data collection and useless javascript ux.


Not true. Google favors AMP content in “articles for you” on chrome mobile, as well as featured story carousels on search and inside Google discover. All of these areas can be a massive firehose of traffic.

If you make a living from a content site, you have to play ball and create AMP versions of all your pages.

OR, you can choose to lose to your competitors. Let’s stop pretending like that’s really a choice, or that any sizable share of users will ever switch to DuckDuckGo.

This is where we need government to step in and regulate Google’s de facto monopoly on search.


Has someone demonstrated that if you make a static page with similar characteristics to AMP on a different CDN, that this doesn't get similar placement?

I'd be more inclined to side with publishers here if AMP was the only way to get this. But as an awful lot of content sites are run by folks very mad that they can't run their own invasive tracking and analytics, my sympathy is limited.


Just click on any link inside of chrome's "articles for you" on your phone and look at the url.

I haven't seen a single non-AMP article there, and I've been checking for a year.


The first part of your post is incorrect, AMP is instant because it is prerendered, which just using a CDN can't achieve.

The second part is correct. I hate Reddit AMP results, and I'm happy that Google is telling them to fix it. I'll be even happier when they and other search engines demote Reddit AMP pages that do not match the canonical pages.


> I hate Reddit AMP results, and I'm happy that Google is telling them to fix it.

Is there some kind of real news about Reddit AMP changing, specifically? I don't see that in the link.


It says Google is validating that AMP pages match the content of the canonical pages and warning webmasters when they don't. I deduced that Reddit would get these emails because their AMP pages are the biggest offender that I regularly see.


Can you talk more about the prerendering? I thought that was a thing Google does to many pages, not just AMP, when possible.


It is unsafe to deanonymize a user to a website that the user did not even click on. That's why the page is served from the link aggregator's AMP cache.

