Also, I wonder if I'm the only one that avoids AMP-based sites out of principle. I highly doubt it affects your conversions (I'm not really the easily "convertible" type), but makes one wonder if there can be effective web tech boycotts.
But imagine you've got an advertising department that wants three different ad networks, a couple of managers who want to see stats from a couple of different analytics platforms, and a designer who wants to load one font from Font Squirrel, another from Typekit, and another from Google Web Fonts – and as the developer who wants to keep the site fast, you have to fight them every single time they want to add something else that slows your site down. Having the same fight every time, with everybody else saying "oh, it's just one request, and this one is really critical", it's hard to keep fighting that fight.
It's a lot easier to say "I can't do that, it doesn't work in AMP". If you can find a better way to convince large organizations that page load speed is a valuable metric, and more important than whatever other resource they want to load today, I'd love to hear it. But from what I've seen, AMP is the only thing that's had any success in tackling this problem.
I've been fighting against adding additional tracking forever, but constantly get railroaded by marketing because "they're the ones that know how to make us profitable."
Fundamentally I hate what it means for the internet, but I finally have a little power to say "no we can't do that."
There is also a real herd effect. Many people will do whatever Matt Cutts tells them because they think it will be good for their SEO. Yeah right. Some of the people who are good at SEO either went to work for huge brands or quasi-competitors of Google (like about.com) that might have some ability to bring Google to anti-trust court; most of the others switched to paid advertising once they figured out that Google won't let you win at SEO.
Certainly people who write for Spamium (aka Tedium) are the ones who try herd-following methods of getting traffic and they tend to be impressed when they get 100 hits on their blog.
In SpeedCurve, for around $8/mo per page/site you can set up daily synthetic checks for the business and its competitors, covering multiple profiles: mobile/3G, tablet/4G, desktop/fibre.
You can use both the metrics (Start Render, SpeedIndex, Hero Paint) and the filmstrip videos to literally show them how they compare side-by-side – this is super-powerful as it's so visual.
Disclaimer: I don't work for SpeedCurve, just a fan of the tool.
There are some great stats on bounce rate / abandonment on WPOStats: https://wpostats.com/
This is my favourite:
> 53% of visits to mobile sites are abandoned after 3 seconds according to research from Google's DoubleClick.
One thing to note for stakeholders who follow web traffic stats: if a site uses client-side analytics (e.g. Google Analytics) and the analytics script hasn't loaded by the time a user abandons the site, that visit won't be tracked at all – it'll be as if they were never there.
So when web performance is poor, real bounce/abandonment rates are typically significantly worse than what analytics tools report.
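One way to shrink this blind spot is to fire a tiny beacon from an inline script at the top of the `<head>`, before any heavy analytics bundle has a chance to load. A minimal sketch – the `/hit` endpoint and the helper name are assumptions, not any real analytics API:

```javascript
// Record a pageview before the full analytics bundle has loaded,
// so early abandons are still counted. "/hit" is an assumed endpoint.
function buildHitUrl(endpoint, page, referrer) {
  const params = new URLSearchParams({ p: page, r: referrer || "" });
  return endpoint + "?" + params.toString();
}

// Inline in <head>, guarded so it only runs in a browser:
if (typeof navigator !== "undefined" && navigator.sendBeacon) {
  navigator.sendBeacon(
    buildHitUrl("/hit", location.pathname, document.referrer)
  );
}
```

Because `sendBeacon` is fire-and-forget and the snippet has no dependencies, it costs almost nothing even on a page the user abandons two seconds in.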
Were the filmstrips enough to convince them, or did your bosses want to see numbers/stats, like how many users were dropping off your site for each x seconds of delay?
I've been fortunate to spend my career working at sane companies, but I've talked to so many people who were in situations where management wouldn't listen to their own staff, but then turned around and listened to (and implemented) the exact same recommendations when they were made by a consultant.
I know there are often valid reasons for this, and external validation can be important. But it sure must be frustrating as hell to see your company pay $25k+ to receive a list of recommendations you already made.
Trusting subordinates who are incentivized to fudge the numbers is very difficult. Most people won't outright lie to their boss to get ahead, but they will make a bad business proposition look good to the point they genuinely believe it themselves. Leadership is hard.
That is mostly a problem if the people above the management are the same. For a sane company with good leadership, failures are a learning experience that makes you better.
In my experience it's next to impossible. We have a bunch of inefficient bloated legacy shit that barely works and we spend most of our time bug fixing. Yet we keep adding to it while hiring more and more people to fix the increasing amount of bugs.
How about customer-centric culture? One where UX is #1, and not out of sight?
> Your job is to show why both parties’ incentives are actually aligned, not opposed.
It's your job to do just that, because of your vantage point – you're able to see everything more broadly than anyone else. Understanding that all the stakeholders in your organization are doing something important that allows everyone to have a place to come to work tomorrow is critical for you as the engineer.
It sounds very good on paper but it doesn't really work in practice because devs are generally not recognized as important decision makers in product design, despite said "broad vantage point".
From there, it's a rather thankless uphill battle of the "no good deed will go unpunished" variety. It's a very misaligned situation.
The need for metrics is another way of saying "I don't trust you", which may be understandable to a degree but stakeholders should also understand that not everything can be measured, and that measuring things in a manner that is both accurate and useful is very hard. I'm skeptical that metrics have more value than the word of a seasoned engineer on a project. We don't have very good results with measuring other vague things, why would software development be any different?
That's a rather odd statement to hear from a dev (or developer advocate) because I've always heard fellow engineers talk about how data is more important than opinion. Why is this an exception?
I think UX is the most important thing; without a good one, there is no profit.
In 2018, bosses involved in the web – directly or indirectly – should be well aware of this. At this point in time, the web dev's job should not be to "fix" those above them. There's plenty else to do and keep up with without having to add risking their career to the list.
And few if any freelancers are going to speak up. Which is all the more reason the decision makers should understand the implications of their decisions.
That doesn't sound like it relieves you of maintaining a bogged down version of the site, it just means in addition to maintaining the bogged down version you have to now also maintain the amp version, and keep them consistent.
That's a rather ahistorical way to describe it. The various cookie notifications were implemented with whatever code was already common, since most people didn't want to spend much time on what they saw as a formality.
You're making nothing at all from 20-30% of your visitors, because they're blocking ads. If you're actually following the GDPR, then your average CPM for EU users has tanked because you can't track them and can't serve targeted ads without an explicit opt-in.
Your average revenue per page view is declining, so what do you do? More aggressive ad units and more of them. Popovers and interstitials aren't new, they're just becoming more widespread because many publishers are starting to get desperate. Your bounce rate will increase, users will eventually learn not to click your links, you'll push even more people to use ad blocking, but none of that matters right now because you've got revenue targets to hit.
If companies can't manage to offer their service without exploiting your data and have to blacklist Europe, that's a win to me.
Thankfully, I don't live in the EU, because I would be super pissed about all the websites blocking me now that the EU no longer wants to give me the option to trade some data for a free service. I expect it to get a LOT worse once they start actually enacting fines and every company realizes that they have really been force-opting people into data collection, so it is no longer profitable to serve the EU.
In my opinion, it would be far, far better to force companies to offer a "fair price" paid service in exchange for not collecting data (and I'm 100% for taking them to court if they misrepresent the value your data provides, so they can't overprice it and force people to choose the free option). That way, I can choose to keep getting my free service in exchange for data, and you guys can pay to protect your privacy. Does this option sound reasonable to you?
Pretty much everything you stated here is completely wrong. GDPR states that personal information can only be collected on an opt-in basis. Your entire statement therefore is completely off.
Because consent has to be freely given on an opt-in basis, you cannot just do an opt-out.
> In my opinion, it would be far far better to force companies to offer a "fair price" paid service in exchange for not collecting data
The GDPR doesn't force anyone to use anything for free.
I said "use said service for data", which means all or nothing: no ability to use a service while selectively declining the data exchange that makes money for the company you're dealing with.
> The GDPR doesn't force anyone to use anything for free.
I didn't say anything about that. I just don't like that free online content like the LA Times is now blocking all of the EU because their business model is incompatible with the GDPR. I would prefer if the law still allowed EU users the choice to accept said business model, or to use one that contributes revenue in proportion to the old one, so their business model works while also satisfying your wants. Does that make sense/sound reasonable?
I think you should at least know the basics of the topic you're discussing before leaving nonsensical comments.
Everything you're saying is completely wrong.
> not even having the option to use said service for data
Companies are blocking all of the EU because of GDPR which is denying them the "option to use said service for data". Here is one list so far: https://gdprcasualties.com/
> Thankfully, I don't live in the EU, because I would be super pissed about all the websites blocking me now that the EU no longer wants to give me the option to trade some data for a free service. I expect it to get a LOT worse once they start actually enacting fines and every company realizes that they have really been force-opting people into data collection, so it is no longer profitable to serve the EU.
I don't know of a single company that now gives users the choice to opt into each kind of data collection, so if they actually start throwing down huge fines (which I fully believe they will, based on their history), I expect it to get a lot worse.
Assuming (correctly) that web sites are untrustworthy data collection bandits, why should they behave well only because a user proved their submission (and their gullibility) by paying them?
Didn't know what that was until you said it and still not sure how it applies.
> Assuming (correctly) that web sites are untrustworthy data collection bandits, why should they behave well only because a user proved their submission (and their gullibility) by paying them?
I think you are misunderstanding me. I love everything about the GDPR except one thing: not allowing companies to tie providing their service to an exchange of data so that they can make money. It denies people the CHOICE (key word here) of using said option.
I proposed an option (just pay for the revenue the company no longer makes) so that everyone still gets the choice to use current revenue models, pro-privacy people can pay their fair share for the service, and companies don't blacklist all of the EU because of what a portion of the people want. What does not sound fair about that?
I'm curious, how do you propose to fund the web? Should every site now have paywalls?
You cannot do that not because of the EU but because of basic logic.
Yes... do all the same things, but without requiring that Google proxy it. There are several things orgs will do to optimize SEO; conforming to AMP rules can be another. Gaming SEO is a concern for all SEO requirements, so surely that's a poor excuse for proxying.
EDIT: But it did give me an idea on how to tell websites you aren't happy with something: https://github.com/cretz/software-ideas/issues/91
I hate needlessly heavy sites. I just avoid them. I'm also not a fan of AMP. I especially hate that the URL is wrong and that trying to share the correct URL is tedious.
There's a certain "theory of the firm" that says corporations exist because they reduce the overhead cost of operating certain enterprises.
Things like this -- along with the also oft-seen need to hire consultants to tell some people in your company to do what other people in your company already know needs to be done -- make me wonder if the limits of that theory are well understood.
And then the engineer could point at validator results.
Have Google results punish fat pages. Then we can say "I can do that, but you have to sign this declaration accepting responsibility for our page hits tanking".
Once [cross-origin server push] becomes standardized, regular webpages will be able to take advantage of similar functionality.
There are also privacy concerns – e.g. the remote server could set a cookie in a response without you ever visiting that site. But if you disable cookies by default, users will have to refetch the resource anyway if any session state is involved in page generation. You also have issues like companies wanting to distinguish between traffic from actual page views and the prefetch mechanism.
- Size all resources statically
- AMP uncouples document layout from resource layout. Only one HTTP request is needed to lay out the entire doc.
- All CSS must be inline and size-bound
- Minimize style recalculations
- Only run GPU-accelerated animations
- Prioritize resource loading
- The new preconnect API is used heavily
- When AMP documents get prerendered for instant loading, only resources above the fold are actually downloaded. Resources that might use a lot of CPU (like third-party iframes) do not get downloaded.
- 1K of HTML with 40K of CSS in a file with a long term cache. Clicking a different page on the site downloads another 1K of HTML.
- A 41K file with everything inlined. Clicking a link downloads another 41K.
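Most of the rules above can be approximated in plain HTML without the AMP runtime. A rough sketch, assuming a single third-party font origin as the example:

```html
<head>
  <!-- warm up the connection to a third-party origin early -->
  <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
  <!-- all CSS inline and kept under a size budget, as AMP requires -->
  <style>
    body { margin: 0; font-family: system-ui, sans-serif; }
    img { max-width: 100%; height: auto; }
  </style>
</head>
<body>
  <!-- statically sized media: the layout never shifts while it loads -->
  <img src="hero.jpg" width="640" height="360" alt="Hero image">
</body>
```

The `width`/`height` attributes are what "size all resources statically" means in practice: the browser can lay out the page before a single image byte arrives.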
However, if you have non-render blocking CSS, or CSS that's used for below the fold or generally lower down the page content, "only render critical CSS inline" is usually coupled with "and then have the rest of your CSS in an external stylesheet". Which you are not allowed to do.
Accordingly, it's ALL inline, all the time.
In the AMP world, this whole stylesheet would end up inlined, and you would download it again the next time someone posted a Medium link on HN.
If Google replaced all of their search results with ads, people would easily switch to Bing, as Bing isn't that much worse.
Facebook on the other hand definitely is "pretty much" a monopoly (but not completely) for certain subsets of the social media market due to network effects which is why they shovel ads down your throat on their platform and they are making tons of money.
Their search results are good because they do machine learning on data from all their other users.
A lesser-used search engine has less data, so even if they have smarter people and a better algorithm, the search results will probably be inferior.
... and let me tell you Bing is not nearly as good as Google when it comes to bringing me the right search results first.
But I got tired of Google asking me to “prove that I’m not a robot” by tapping on roads and street signs with every new search. I use incognito mode, and since they can’t track me, they either are punishing me or just automatically assume I’m not human.
So yeah, just me trying to get blessed by G so everything else improves from G for us :)
Tell them that it won't support all their various analytics scripts, tracking pixels, and A/B testing scripts that fill up their charts with vanity metrics they use to impress stakeholders.
Counterfactual: lots of terrible business decisions are made because someone gets on an ideological bent and runs with it. Revenue today is worth more than revenue tomorrow. Balancing growth opportunities against investment risks is the core of commercial decision-making.
Is there a mobile browser that tracks bandwidth used, and can tell me how much money I'm spending on bandwidth per domain?
I'd like to know which sites are parsimonious about page size. Then I'll limit my mobile browsing to those and ditch the ones that don't care.
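No mainstream mobile browser breaks bandwidth down per domain today, but on desktop you can approximate it with the Resource Timing API. A sketch to run in a page's devtools console – note that `transferSize` reads as 0 for cached resources and for cross-origin responses without a `Timing-Allow-Origin` header, so treat the totals as a floor:

```javascript
// Total bytes transferred per origin using the Resource Timing API.
function bytesPerOrigin(entries) {
  const totals = {};
  for (const e of entries) {
    const origin = new URL(e.name).origin;
    totals[origin] = (totals[origin] || 0) + (e.transferSize || 0);
  }
  return totals;
}

// In a browser console:
// console.table(bytesPerOrigin(performance.getEntriesByType("resource")));
```

Multiply each total by your carrier's per-MB rate and you have the per-domain cost the parent comment is asking for.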
Maciej Cegłowski (aka idlewords) recommends exactly this in his (2015!) talk "The Website Obesity Crisis": http://idlewords.com/talks/website_obesity.htm
His specific comments about AMP: "AMP is a special subset of HTML designed to be fast on mobile devices. Why not just serve regular HTML without stuffing it full of useless crap? The question is left unanswered."
The reason why they do it is because of better search page placement in Google. It's unfair and wasteful, since site speed can easily be measured by the most advanced search engine that already has tools and reports doing exactly that.
I think aside from the icon and the special treatment mentioned above, don't AMP pages get served directly from google and get preloaded on the search results page? So that's the other benefit, I guess. Whether a few less milliseconds from preloading on an already fast page really buys anything is another question.
On the technical side, it hardly matters. If the sales and marketing teams have bitten, they sell to management, and tech is informed they will be implementing it.
That's often how these things go, anyway.
Personally, I think it's a bad deal for content creators, because if I visit the website I might read more articles, share, or sign up for the newsletter, and that doesn't happen with any of the AMP implementations I've seen.
You can implement this feature yourself too: https://developer.mozilla.org/en-US/docs/Web/HTTP/Link_prefe...
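Per the MDN page linked above, a single tag hints the browser to fetch a likely next navigation during idle time (URLs here are placeholders):

```html
<!-- fetch a likely next page while the browser is idle -->
<link rel="prefetch" href="/next-article.html">
<!-- or just resolve DNS early for an origin you'll hit soon -->
<link rel="dns-prefetch" href="https://cdn.example.com">
```

These are hints, not commands – the browser is free to ignore them under memory or bandwidth pressure, which is part of why they're safe to add.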
So, lack of UX with the links.
Because you need to convince business folks about stripping down bloat from your webpages. And doing that by citing sticks and carrots from Big-G is much easier than doing it on your own. So we can attribute this to: laziness of developers (who cannot make arguments to reduce bloat), boneheadedness of business folks (who cannot quickly understand why bloat is bad), tragedy of commons (my competitors are gaining valuable traffic by having their articles in special Google carousel at the top of search results, I am missing out so I must do AMP) etc.
The usability of your site just doesn't really matter if your business is to get traffic from Google and monetize it, which is what we are talking about here.
It's much more likely that the changes done to a website to satisfy AMP affected the conversion rate.
So unless Google actually adjusts the ranking, adopting AMP is pointless. I do wish, though, that many websites would chill with including so much JS on their pages.
I think that's the only reason.
So yes, sites can be every bit as fast as AMP. And industrial manufacturers could be every bit as clean as regulators require even if they had no regulation. But they aren't, and they don't.
We need an HTML Lite as a mode in the browser that winnows down the enormous featureset of HTML. Not for all content, but for that content where text is king.
Google is going to get in trouble stewarding their gateway into the internet in this fashion with these rules. Rules about slow page load times can be seen as universal. Rules about using a Google proxy however are just begging for political intervention. As someone with a small business, I'm upset that these companies, under the guise of helping users, are giving reasons for these liberal governments to act. It negatively affects me when these inevitable regulations come down because companies like Google can't remain provider-neutral. Arg!
> We need an HTML Lite as a mode in the browser that winnows down the enormous featureset of HTML. Not for all content, but for that content where text is king.
I concur here and have been thinking about this recently especially in the context of web browsing in the terminal. But it has to be driven by user adoption, not rules written on the walls of a few companies' gardens.
In the meantime, I would ask that the search engine not give preference to their proxy over anyone else's but instead define the guidelines that we can all reasonably meet.
As far as I'm aware, in practice there are a large number of sites which serve different content or different behavior in response to the GoogleBot user-agent.
Pinterest runs its own AMP cache for prerendering pages linked from Twitter. Bing runs its own AMP cache to prerender pages linked from Bing. Yandex runs its own AMP cache to prerender pages linked from Yandex. They can't use each others' AMP caches and get the same benefits.
Honest question: would the author of the linked article have experienced the same issues if the users had actually visited his own site's AMPages?
Or, based on the browser I use when I care about speed, can we design pages around the constraints of Lynx/Links/eLinks? The CSS layouts work when called for, menus work in text-mode, but without images, *.js, and all the other Web annoyances, everything loads quick even on the worst connections.
If I were looking for song lyrics, or a recipe, or schedule information, etc, I would be browsing in HTML Lite mode. I don't want popovers, subscribe now boxes, animated bullshit, etc.
My root post is sitting in the negatives right now, and I own that and embrace it, because it is how discussions about AMP always go on HN. A bunch of web developers herd in to tell us how terrible AMP is while the web gets more and more bloated, more and more of a destructive tragedy of the commons, and we all layer on various shoehorned, half-assed solutions to fight back (e.g. ad blockers, nuisance blockers, tracking blockers, etc.).
In abstract, I also don't like the notion of giving Google that much power. Even if what you say makes sense, and regulation is necessary, a private company doing it is the worst of both worlds - they are not accountable to the rest of us, and they have their own goals that contradict the public good.
Couldn't agree more with this. People approximate it with e.g. noscript, it would be awesome to have a standard which describes a minimal HTML Lite spec though.
Edit: It would be awesome as another option in the link context menu: open in new tab, open in incognito tab, open in lite tab
Right now the AMP CDN enforces the restrictions of AMP (you can't simply pretend it conforms to AMP to Google and then feed the client something else) which is something that is fundamentally misunderstood when this is discussed on HN. AMP is a protection and guarantee of behavior for the user, it is not for the publisher, it is not for the developer. Google has a profound vested interest in trying to maintain people's interest in the web, especially in the face of various alternative walled gardens like Facebook feeds, Apple News, etc.
To me, AMP is just Google trying to turn the open web in its own walled garden.
Yes, right now there is a technical need for an intermediary because it guarantees AMP is actually AMP. It is trivial for a page to say it's AMP, load the AMP standard library, and then do everything disallowed by AMP. There is absolutely nothing preventing that but that intermediary rejecting non-compliant content.
Now of course we've talked about an HTML Light and that would be the browser enforcing that limited sandbox. It could send a relevant cookie and then reject content that steps outside the bounds. But we don't have that right now.
That's because the standard is not well defined. If you had a doctype just for AMP at the top, that would work. I don't see any technical reason why google's servers are necessary. I see a business reason for Google of course but there's nothing technically that makes sense.
> Now of course we've talked about an HTML Light and that would be the browser enforcing that limited sandbox. It could send a relevant cookie and then reject content that steps outside the bounds. But we don't have that right now.
Well, AMP could be just that if it was standardized.
My point is: Making it easier to serve sane and lightweight HTML will not help fix the problem. Publishers don't want to do that, they like all that crap. If we want them to make sane web pages, we must force them.
The article also mentions the impact on conversion rate. We're interested to learn more details surrounding this. Blank pages loading for many users would explain a lower conversion rate but we'd like to figure out if there's any other possible cause since it doesn't currently seem like most users hit a blank page in actual usage. I'll get in touch with the article author to see if there's openness to digging in further.
The solution needs to start in much simpler terms: Google should publish page traits that are rewarded and punished, and tools for seeing where you stand. It should reward/punish all sites based on how they fare in the published areas.
We don’t need new protocols/wrappers/rehosting or extra scripting or whatever else to drag bad sites into the 21st century. These just create additional issues when we have plenty of engineering problems already.
The web has incredible variety in tech stacks, business structures and even views on what a standard means.
But AMP is a standard. Using AMP doesn't solve any of the problems you're talking about. If people's usage is too diverse to be accurately measured, then it's too diverse for AMP to meet everyone's needs. And if it's not important for AMP to meet everyone's needs, then why is it important for a set of page tests to do so?
I agree there might be problems building a small set of tests to check page speed, but that would still be strictly better than AMP. It would still cover at least all of the use-cases that AMP covers now, and it would open the door to cover more use-cases in the future.
If you're ranking search results based on whether or not someone uses a framework, you are implicitly ranking them based on the attributes of that framework. What people are asking is for Google to make those attributes explicit instead, and to directly test them.
AMP's website defines it as a library: https://www.ampproject.org/learn/overview/
I don't particularly care whether anyone thinks that's technically a standard or a framework or a library by the rigid definition or not; at the point we start down that rabbit hole we're just talking about words, not concepts.
What I was trying to get at above was that any problems people bring up around Google profiling websites for search placement are still present in AMP. Forcing developers to use a specific set of technologies is functionally the same as forcing them to conform to a rigid set of benchmarks. For the purposes of this discussion, we might as well call AMP a standard.
It's actually to the point that I have stopped using non-browser google search on my phone. In fact I didn't really notice it was AMP just that results from "Ok Google" were annoying as hell after upgrading to Google Assistant. Reading this article was an "oooooooohhhh that's what's going on" reveal to me.
I don't even know where he's getting this "Info" thing. To me it just looks like some sort of Chrome window that doesn't let me edit the URL. Even when I do "Open in..." to send the link from what I guess is called the AMP browser (which somehow looks like Chrome?) to my browser, it leaves me trapped in AMP with weird URLs. It's extremely confusing.
I just want to escape to the actual website in an actual browser and for whatever reason I end up having to try and re-find the site in the mobile browser. Maybe there's some obvious way to do this but it's just driving me bonkers.
* ublock origin
* view image (google picture search)
* background video playback (eg. YouTube)
* redirector (eg. auto switch to old.reddit)
My non-technical significant other and my dad have repeatedly asked me if those pages are safe to visit: after training them for years on the heuristic of mistrusting a page where the URL doesn't match the expectation, AMP is now breaking it for them and causing unnecessary insecurity.
I'd prefer something that I can install on my phone browser to redirect me automatically to the non-AMP version, I'll do some digging.
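The core of such a redirector is a small URL rewrite. A sketch that handles the common `google.com/amp/` form – real extensions cover more cache layouts, so treat this as illustrative:

```javascript
// Turn a Google AMP cache URL back into the publisher's own URL.
// "/amp/s/<host>/<path>" means the original was https; "/amp/<host>/<path>" http.
function deAmp(url) {
  const u = new URL(url);
  const m = u.pathname.match(/^\/amp\/(s\/)?(.+)$/);
  if (u.hostname.endsWith("google.com") && m) {
    const scheme = m[1] ? "https://" : "http://";
    return scheme + m[2] + u.search;
  }
  return url; // not an AMP cache URL; leave it alone
}
```

A browser extension would run this on navigation events and redirect whenever the result differs from the input.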
Which is Firefox for Android.
Note that "content blocked" looks the same as "resource failed to load", which could always happen.
If, as the article implies, Google will give me the same rankings if I do things to get the same performance as an AMP page then I would rather do that.
Unpopular where? Among the web devs on HN who are the main cause of the website bloat plague?
Even the author couldn't narrow down what exactly was the cause other than maybe:
a) the domain name being different
b) the addition of chrome w/ a link to Google support + a notification "You're now logged in as [user]@gmail.com"
c) minor CSS changes as a result of minimizing static assets
None of these screams an obvious problem, especially given the other benefits this process has given in return (which of course could be accomplished without AMP).
Google has been using indicators like page performance in their ranking algorithm for quite a while now, so to be fair, it doesn't look like it helped that much. A single label and clear prioritisation apparently make it a far easier sell within companies.
(Not that I'm a fan of Google/any single company having to be that warden, but I digress.)
Chrome bug: https://bugs.chromium.org/p/chromium/issues/detail?id=873571
This was the result of a lawsuit by Getty Images - https://arstechnica.com/gadgets/2018/02/internet-rages-after...
Awkward for me. I really liked Wave.
After Wave failed, they doubled down on making Gmail into more of a nonstandard product with reduced interoperability (now requiring Gmail API instead of standard IMAP) and increasingly, embrace & extend functionality such as email expiration dates.
Is something not working with gmail's imap support?
Sure you can still access Gmail through IMAP, but if it works differently enough that using a standard IMAP client feels cumbersome and unfamiliar, is it really anything else than a vehicle to tell people that they should really just use the "better" Google product directly?
That said, my original wording of "requiring" the Gmail API was poor and I should have phrased it more accurately.
Also they're moving away from the icon and going to a full "Not Secure" status. Image: https://i.imgtc.com/9DwDQ6r.png
>do not need to be "secured"
Which is it - is the site secure without SSL, or does the site not need to be secure?
In the former case, I disagree wholeheartedly. In the latter case, you're not blocked from browsing the site - only informed that it is insecure, a factual statement.
HTTP is unsafe in the same way that getting a newspaper delivered to your yard is unsafe.
Oh no. Casual passersby know from looking that I have a newspaper on my lawn. If someone wants to snoop when I'm not looking, they now know that I read a specific newspaper. Someone could even steal it.
It's unsafe in the sense that if you leave your driver license, credit cards, birth certificate, cash, and car keys all in your yard over night, you won't be surprised if at least one of them is gone in the morning.
HTTP is a paper in your yard. A poster on a phone pole. A business card on a broken, smudgy plexiglass subway sign. HTTP is public, and there is absolutely value in putting things out there for everyone to read in public.
You could argue it is bad for publishers. But users? Please.
1) Click on an AMP link by mistake in google results
2) Get annoyed at the less-functional AMP experience to read reddit (limited replies on AMP page, no JS expansion, etc)
It's also super frustrating that AMP prevents hold-tap on mobile to open a link in a new window – it just doesn't work.
3) Press the chain link icon to switch to mobile site
I've been frustrated and delayed in getting to the mobile site directly.
Who are the users you're speaking of? The ones on their mobile devices who click AMP links by and large seem to love them. Who hates them, however, are web developers and web exploiters who see it as a threat, a limitation, etc.
"Make no mistake. AMP is about lock-in for Google. AMP is meant to keep publishers tied to Google."
2) Links no longer point to your own website.
3) Analytics don't work correctly, because the wrong URLs are logged.
4) It centralizes the WWW on Google's domain, keeping users on google.com rather sending them deeper into your site.
5) It restricts the way that you can monetize your site.
6) It causes webpages to load slower when 3rd party scripts are disabled.
7) It restricts how you can build your site.
8) It isn't faster than hand-optimized HTML.
Even if nothing else, people should oppose it because centralization is exactly what the WWW isn't supposed to be.
Really? The main comment I've heard about it (when people mention it at all) is that it messes up the URL. I doubt most people notice anything changed.
Such a frustrating experience for users.
By what mechanism? Because copying the URL from the bar does not copy the correct URL, it copies the AMP url.
This was fixed at the beginning of this year. Now what canonical URL the publisher uses is up to them.
Another comment said that relative links are all messed up on the Reddit AMP version. That's Reddit doing it wrong.
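For reference, the pairing AMP expects is two `<link>` tags: the regular page advertises its AMP version, and the AMP page points back at the canonical original (URLs here are placeholders):

```html
<!-- On the regular page: -->
<link rel="amphtml" href="https://example.com/article.amp.html">
<!-- On the AMP page: -->
<link rel="canonical" href="https://example.com/article.html">
```

Relative links inside the AMP document should resolve against the publisher's own URLs, which is why broken relative links point to a publisher-side mistake rather than an AMP cache one.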
Since AMP uses an overflow technique to handle content/scrolling, the page never triggers a scroll event on the body. So not only do you see a toolbar at the top (which admittedly, slides out when you scroll down), you see the browser toolbar at the bottom (which persists!)
edit: wait, no I don't. I get a header when I visit the page via google search, not when I follow the link directly. I haven't noticed this before because google is the only way I usually end up at AMP pages.
Google link: https://www.google.com/amp/s/techcrunch.com/2018/08/08/samsu...