In practical business terms I would strongly suggest website owners build ultra-light versions of their sites. If you have international aspirations your site should work on Opera Mini. If you have a big audience it is very reasonable and worth it to have desktop/tablet, smartphone, and ultra-light versions of your site.
Reposting part of a comment I made a month back on AMP:
#1 There is a big problem with mobile sites. I'm using a recent iPhone, and many popular news sites, without ad blockers, are as close to unusable as the worst websites I've ever been to, dating back to using Internet Explorer in 1999. Auto-playing inline video ads that slide into view: just insane. These things clearly kill time on site and reader retention. I have theories about why publishers are ignoring this, but who knows.
#2 Google is using AMP to co-opt publishers' traffic. This means users are scrolling to another story from another publisher or easily bouncing back to the Google results when they land on your content. (See the X in the story link on the animated gif example.)
There goes your time on site and long-term user retention. If #1 was a problem for you already, you probably won't even notice.
#3 AMP and Instant Articles are going to put a stranglehold on third-party ad networks, and they represent a very real antitrust issue.
There are a bunch of other privacy implications too, which have been discussed at length. Publishers should be thinking really hard about their future.
- Display advertising makes the web suck; while display ads fund a large share of general market content sites, from a purely technical perspective they kill site performance.
- All major market participants suck equally, rendering no alternatives for end-users or advertisers to defect to; this inherently limits any incentive for real change.
AMP / the mobile index (a new development from Google) breaks the gridlock: divert traffic to sites that deliver a good (fast) mobile experience and (assuming site owners are economically rational, which we are) are supported by something other than old-school display advertising. Solving the second part is left as an exercise for the student...
Interestingly enough, you can make a banner ad load really fast. Just host the images on your site and load them directly from your server, without all the tracking scripts and separate calls from your advertising network (plus others). Or go completely native and render text links in HTML. Granted, this requires you to actually market your content and merchandise an offer (affiliate marketing).
Display advertising actually isn't the highest-CPM option for a website; I did a study on small-site auction data a while back and it was the lowest-CPM model (admittedly, the laziest to implement):
I personally find the gain suspicious; it's already possible to make fast pages.
Soon, we'll all merely be Google content providers.
The more I look at AMP the more I absolutely hate it. This is just another thing to make me not want to use it.
If Google points to a site, they should serve your site. If they're serving from cache, file a DMCA notice and tell them hands off your content.
But you should also support antitrust legislation and judicial action against Google. Don't vote for candidates next month who are receiving large donations from Google (now listed as Alphabet on most donor lists). The long game is requiring open oversight of Google's search ranking methodologies and forbidding them from modifying those methodologies to benefit their own products and services.
Don't get me wrong, your logic is correct. But in order for politicians to play the game they need large donors like Google. So if they don't take money from Google they'll just take it from someone else with an agenda. The real solution is to vote for candidates pushing for campaign finance reform (reading Lawrence Lessig may be insightful).
When you see campaign contributions listed "from" Alphabet, that money is actually all coming from individuals.
Employees of Alphabet who give more than $200 are required to list their employer on the donation form. The campaigns then submit that data to the FEC, which makes it public, which is where OpenSecrets gets it. So even if the Google janitor gives $250 to Clinton, OpenSecrets will list that as coming "from" Alphabet.
OpenSecrets also shows money that went to candidates from political action committees (PACs). But any money that passes through a PAC to a candidate must originate from an individual as well. The corporation can only pay administrative fees, like providing an office and Internet connection.
You may note that I said to avoid voting for candidates with LARGE contributions from Google. This isn't about $250 but, for instance, about the $58,000 Googlers have given to Ro Khanna, which pretty much ensures Ro would never vote against Google's interests if elected.
This is funny considering you think "donations" from Google don't have an agenda.
Let's call it what it is without doublespeak: a fucking bribe.
Therefore, your best bet is to get regulations against Google by electing people paid by someone else, then later get regulations against that someone else by electing people paid by someone entirely different. It's a juggling act of corruption!
While the electorate is bickering over distractions, as it is this cycle, you are correct. However, limiting special-interest influence strikes me as a non-partisan issue, so if the citizens spoke up I think they'd have sufficient influence. But then maybe I'm being an idealist. Regardless, your short-term solution works.
Now Google comes along and says: you can't do it, let us do it. Which is perfectly reasonable from their point of view. And when I surf Google News on my slow mobile connection I'm always happy when I see that a link is going to an AMP target, because then I know it'll load fast.
The AMP symbol tells users that the site loads quickly. Google doesn't indicate the same thing about non-AMP sites that still load quickly. Guess which site the user will tend to choose. I am not sure AMP will be optional for sites that want to get chosen.
What you're suggesting is not a matter of me making my website faster than AMP in isolation. You're talking about every non-AMP site in the world banding together and collectively bringing up the average speed of an internet webpage to be faster than an AMP page.
Without that, there's nothing to make my one faster-than-AMP site stand out from the rest of the slow non-AMP web in Google search results.
Totally disagree. I didn't know what AMP was and I kept seeing it. I avoided it, thinking it was yet another Google adware property like DoubleClick, another cleverly named way to trick users.
For the vast majority of users on 4G connections, AMP loads aren't that much faster (when you download at 8 MB/sec on 4G, a 2 MB page is just not a deal breaker), and I doubt that very many users in America are noticing that dramatic a difference on their devices, enough to subconsciously learn to like it.
Take all those connections and introduce a 1-5% packet loss rate, and suddenly there's a very good chance that any given "page load" (really dozens of requests) takes perceptually "forever". That is what AMP helps with.
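The intuition above can be sketched with a back-of-envelope calculation. The request count and loss rates below are illustrative assumptions, not measurements:

```python
# Back-of-envelope sketch: a "page load" that fans out into dozens of
# requests only feels fast if none of them stalls. If each request hits
# a retransmission timeout with independent probability p, the chance
# that at least one stalls grows quickly with the request count.
# (Independence and the numbers below are simplifying assumptions.)

def p_any_stall(n_requests: int, p_stall: float) -> float:
    """Probability that at least one of n requests stalls."""
    return 1 - (1 - p_stall) ** n_requests

for n in (1, 10, 50):
    print(n, round(p_any_stall(n, 0.02), 3))
# At a 2% per-request stall rate: roughly 0.02, 0.183, 0.636
```

So a page that needs 50 requests on a 2%-loss connection has better-than-even odds of feeling slow, which is exactly the scenario a single small AMP payload avoids.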
Google tests and re-tests the crap out of things like this.
But when you're rewarded by a quick page load for the first time, and the second time, you'll start clicking it more because you know what reward to expect.
This makes a significant difference in traffic.
"we've decided to take site speed into account in our search rankings"
Just like the promised benefits of HTTPS. I have yet to hear about a case where search traffic increased after switching to HTTPS.
The problem isn't that Google "steals" traffic in the sense that it's going against the wishes of the site publisher, but that the author didn't know what he was buying into and was thus surprised.
There is the other thing, which is that the AMP cache is probably more local and faster than your site. Small sites usually don't use CDNs, so there it is.
> Can I stop content from being cached?
> No. By using the AMP format, content producers are making the content in AMP files available to be cached by third parties. For example, Google products use the Google AMP Cache to serve AMP content as fast as possible.
I think it's easier to use a CDN like CDN77 than to convert the average site to AMP.
This is such a trivial optimisation that it makes you wonder about the level of competency of the average web development company.
Our competency on projects where we are being paid a full domestic consulting rate is quite high, thank you.
Now, if you're asking us to compete with cut-rate bids from some online freelancer exchange, we will reduce the level of service and QA to match our competitors in that space....
E.g. you may have Nordstrom's service for a Nordstrom's price, Wal-Mart service for a Wal-Mart price. But don't expect Nordstrom's service for a Wal-Mart price; that usually isn't going to fly...
You can insert ImageMagick somewhere into your production pipeline. It's excellent software.
Like MVG (http://www.imagemagick.org/script/magick-vector-graphics.php) and MSL (http://www.imagemagick.org/script/conjure.php), for example.
For JPEGs: use chroma downsampling, and don't go above 80% quality unless absolutely necessary. If your software has a preview function, use it. Encode with MozJPEG if you can.
For PNGs: compress the hell out of them. There are several tools that can compress them further.
You still need to resize your images appropriately but ImageOptim will do the rest. It passes images through a series of image optimization libs.
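As a sketch of how such a pass might be wired into a build step, here is a helper that builds the command line for each file type. The tool names (mozjpeg's `cjpeg`, `pngquant`) and their flags are assumptions based on the tools mentioned above; substitute whatever your pipeline actually uses:

```python
# Sketch of a per-file optimization step. The tools (mozjpeg's `cjpeg`,
# `pngquant`) and their flags are assumptions; swap in whatever you have
# installed. This only builds the command line, it does not run it.

def optimize_cmd(path: str, jpeg_quality: int = 80) -> list[str]:
    lower = path.lower()
    if lower.endswith((".jpg", ".jpeg")):
        # mozjpeg: cap quality at 80 and use 2x2 chroma subsampling
        return ["cjpeg", "-quality", str(jpeg_quality), "-sample", "2x2",
                "-outfile", path + ".opt", path]
    if lower.endswith(".png"):
        # pngquant: lossy palette reduction, often dramatically smaller
        return ["pngquant", "--quality=65-80", "--output", path + ".opt",
                "--force", path]
    return []  # leave other formats alone

print(optimize_cmd("hero.jpg"))
```

Feeding the returned list to `subprocess.run` from a build script would apply the same policy to every image in the pipeline.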
How can I make this happen on mobile Firefox?
Yes and no. Google ranks AMP-ed pages differently, as well as showing that lightning bolt icon. It is definitely not a level playing field.
It's more like people don't feel like following a weight-loss program until a trusted professional outlines an easy-to-follow, precise method.
Kind of related: I recently switched my blog from Wordpress to Hugo. I found a minimal theme, but the amount of junk it was loading was shocking. I created a stripped down version if anyone is interested: https://github.com/lucaspiller/hugo-privacy-cactus-theme
The whole WP ecosystem is a huge pile of crap; each plugin adds more and more mess to it. I tried Hugo and Joomla, but they were even worse (Joomla had a few security issues that the developers refuse to fix; Hugo was more bloated than WP).
After trying and modifying over 20 acceptable themes I decided to start working on my own blogging platform that will be SEO-friendly and clean.
Hugo is a completely different thing, don't compare it to WP.
They take a different angle, but the goal for both is to allow for easy maintainable websites.
Care to elaborate on this?
If they were doing that, the media companies (and any other site which isn't in the deep web) would be working towards making their content minimal at the core, not branching off with an "optimized" version of the page. This would make the browsing experience better for all devices, not just mobile users.
I'm still undecided on if I like AMP or not, but I do see the reason for the distinction.
AMP is a single 'framework' you can work with to make your site faster: it's just one thing to learn, and you reimplement your article pages from scratch.
This is of course ignoring the "ecosystem" capture Google gets as well from this, and the pressure/persuasiveness they put on media companies to implement AMP.
I don't understand how validation requires the parts of AMP I dislike, though...
I would expect an un-biased search engine to rate pages with and without AMP equal and to not show 'badges' based on whether or not a page uses tech by the same vendor of the search engine.
(Edit: Sure, Sourceforge was improved a lot after the new owners, it was just used as an example)
- whitelisting (AMP bolt); OR
- blacklisting (down-rankings)
So now it depends on your perspective: is the web good or bad by default? Default good means blacklisting, and vice versa.
Alternatively, maybe DDG could blacklist sites based on some criteria (bad practice / malicious / content farm / etc.).
Well there's your problem :-). Google isn't an unbiased search engine, they're a commercial entity with a financial interest in feeding you particular search results. They're absolutely going to be biased in the results they provide you.
Longer, perhaps, if Hillary is elected with Eric Schmidt's help.
An admittedly cynical view I've developed after years of watching partisan prosecutions, impeachment votes, and pardons.
You may intend this to mean that you will never consciously publish your content in AMP, but there is a good chance that you will consume content that is published in AMP.
It is already something I find myself noticing, especially on mobile. If an article from a news site works on mobile and loads fast, it was probably AMP.
You can't. That's probably not an oversight.
I can make pages faster without AMP by stripping the needless bloat and using a proper CDN. Why do I need this tech again, besides the artificial goad of page rank?
You can. In theory. But people don't, as is evident with the web we have today.
So Google comes along and tries to force people to do it. AMP wouldn't exist if people would've optimized their webpages on their own.
(That being said, I dislike it as a content provider and as someone worried Google has too much power)
Plus, who needs to lose even more screen space to that stupid bar at the top? And the browser chrome doesn't hide itself when you scroll like it does on a normal page.
In the long run, it might be bad for users, as it could result in media producing less valuable content, and one big company having more control over distribution.
But yeah, nothing's free. They give you this great technology, free access to their world-wide CDN, and better organic search results, at the cost of having the X button going back to search.
Though I have no idea how the author thinks the "X" button should go back to the home page, honestly... You open an article, you close the article. If you want to go to the homepage, you click the top banner with the logo.
He's not suggesting the "X" leads to the home page, but to the actual "deeplink" page (blogpost, etc).
I'm not the author, but here's my idea:
Normal webpage: click result in Google, page loads (for a long time), I click reader view, read some, close reader view, explore the site.
AMP page: allows me to skip the loading time & the click on 'reader view' and start reading directly (so far, so good), but then I close "(AMP) reader view" and want to explore the site. Instead I am back at square one, exactly where Google wants me to be.
Their success becomes ever more tied to Google for the next hit.
And again, definitely not by pressing X. Pressing X means you're done with the article. And if you're interested, you'd press the websites logo at the top to go home, not X.
Then Facebook started to court publishers and suddenly the best way to read a lot of web content was inside of Facebook which is obviously terrible for Google. So Google came up with Amp as a way for publishers to provide the same experience on the web.
Yep, and that something was that it was an initiative by Google.
They are not a charity. They would not do this if it didn't benefit them (or inconvenience a competitor, though Google seems to err on the side of the former rather than aggressively going for the latter) in some way. The fact that it offers benefits for others is secondary, or at best has equal footing in their collective mind (many individuals within the company may be more altruistic individually, but the company as a single entity won't be).
That techies like me don't rely on them any more and non-techies all share stuff via Facebook must be hurting their page and ad hits.
Makes me wonder if this is more a move of desperation on their part. Food for thought, even if it is a little too forward-thinking.
You're going to have to cite some data for that bold assertion. Lots of techies outside of your bubble still use (and rely on) Google.
And knowing "how and what" to search is really teasing out of the index the magic combo of jargon, phrases, and synonyms through repeated refinements on your query. That approach works just as well on Google or ddg for me and the bang syntax on ddg for domain specific searches has really sped up some of my searches over what I used to do on Google.
Does anyone still have Google search as their home page? And if they do, do they use it?
When was the last time you got past the first page of results? (And by the way, it's now so rare for people to do so that, if you get past the first couple of pages, it checks whether you are a bot.)
I'm contrasting how it used to be (reading Google results made up a good majority of my time online) with how it is now (no more than 1 or 2 pages a day, sometimes not even that in a week).
Google is my home page; I use it so often I don't think about using it. I can't think of one thing I've searched for this morning, but opening my history reveals half a dozen things I've searched for. Most of the time I "knew" my destination but Google gets there quickest.
And I'm not sure I ever remember spending a long time reading Google results; for me Google became my go-to precisely because it was so quick I didn't have to spend long just to get to the results.
It's true that social media provides a lot of traffic to places around the web but search is still huge.
That's not extrapolating anything.
Obviously much less experienced techies still need to look up how many sides a square has at least twice a day.
A data point of one isn't particularly descriptive.
I'm pretty sure I'm not.
Kinda guessed as much from the start (hence the "probably too forward-thinking" conclusion).
But it's probably worth pointing out. The reason I noticed last week that I hardly use Google any more was that I did a search and realised it had been so long since I'd seen Google search results on a desktop that I'd almost forgotten what they look like.
And yeah, I know that is because I spend most of my time now on places like HN, closed networks with all the technical manuals you could ever need on hand, specialist forums, or Tor sites that Google doesn't even know exist, let alone provide search functionality for.
But I am definitely not alone. And it definitely looks like the future (or at least more like what the internet used to be). Infinitely more valuable information. No spam bots or corporate advertising. Just experienced techies sharing knowledge, skills and resources.
But all that is irrelevant to the point that, unlike historically, techies aren't earning Google revenue and non-techies really don't use Google that much.
We went through a time when adsense completely dominated the ad market.
Yet there is another Google "dependency" I haven't seen for a very long time.
And that I share with pretty much everyone techie enough to install an ad blocker.
I still go past the first couple pages somewhat regularly and I don't get any sort of CAPTCHA.
For example, I just did a search for "trump" on mobile and Google search stops offering "next" after 13 pages.
If you've clicked on clickbait then you are part of that problem, especially if you've clicked on clickbait but not other adverts.
The fact that you even think this makes me wonder if you're even a "techie".
I'm really, really curious about how it would be if we had "free" mesh networks with IPFS-like content. It wouldn't need to be high capacity: just text, mail, JSON; enough to communicate.
Citation: The "web" used to be just text and <form>s; we were already there.
> I was expecting it to cause a redirect to the original
> article. Instead it redirects back to Google search
> results. Say What?
If I click on a link and it opens in a new tab, I don't close the tab ('X') expecting to go to the home page - I expect to go back to the page from which I opened it.
But it's a bit strange: why have the 'X' there at all when the browser back button will do exactly the same thing more naturally?
I learned, through a series of usability tests my (former) startup ran amongst its users, that most non-tech people do not click the back button and get very confused if your pages don't have their own back/forward/close/menu navigation.
We moved our apps workflow from "use the browsers back button to go back" to having all navigation including back/close as part of the HTML UI.
I'd be interested in learning what these users do when faced with a site without navigation links. Do they just close the browser and re-type the original URL?
What I am hearing is "the average non-technical person is incapable of using Google."
By non-technical I really meant they weren't developers; they were sales, marketing and management type people.
When asked, it simply didn't occur to most of them to simply press the back button.
Now, it's not apples to apples, of course, and since this was in a web app there may well have been a fear that pressing back would lose the page, like it does in some (badly designed) single-page applications; it's possible that this skewed the results somewhat. Either way, we found that adding in-page navigation improved the workflow for almost everyone.
I'm not surprised by your study's result; I'm just not sure you can extend your conclusion to believe that people won't or can't click the back button to return to Google after clicking a result.
It's like Google is now dictating how one should build/style their websites. Thanks, but no thanks. Stay evil, Google.
If the web were going in a direction to serve that demographic organically, AMP wouldn't be necessary.
Or as a web developer do you need to make sure to adapt to their stupid standards, like renaming your img to amp-img and making sure it works in AMP and non-AMP, so you'd be doing both?
I see it as complete bollocks.
As an end-user or web developer?
As an end-user, I do not care. In fact, it seems that the AMP solution (which isn't an iframe so much as just serving the data from Google's cache at a Google URL) is faster.
> Or as a web developer do you need to make sure to adapt to their stupid standards, like renaming your img to amp-img and making sure it works in AMP and non-AMP, so you'd be doing both?
As a developer, if I care enough about my users' experience to be bothering with AMP in the first place, I should be using a toolchain that makes it pretty straightforward to go from my meta-representation of my content to the render target (HTML / CSS, etc.), because that toolchain should already be doing things like precompiling my CSS, condensing my image data, etc.
Fortunately there's only one site I read regularly that uses AMP - Google News
It even has the advantage of loading inside the Google App, where no adblocking exists, so it might even be good for publishers.
Yes, there's no discovery if you don't design for that; the user will just bounce.
So the bottleneck is really the connection setup time? Seems to me that HTTP/2 largely solves that problem, while HTML already supports prefetch.
Don't kid yourself, the goal here is a better web locked into the google ecosystem.
It might be locking you in the Google ecosystem, but the project is open source, and if there wasn't a monopoly in search/ads it wouldn't be an issue…
Why should they want to read the post again? The only viable option is to have links to other posts on your site, or maybe a link to the comments.
By the way, what's the purpose of the bar at the top? The back button of the browser already does that, even on mobile.
About analytics: I don't know whether those accesses show up in Google Analytics (I really don't), but what if one uses Piwik? Is there any way to get a report of those accesses?
It's to counteract the fact that in some browsers on mobile, it will just show "google.com" in the URL bar, the added bar shows the actual website. (Obviously this is a problem introduced by the whole caching thing but that's why it's there I guess)
Maybe they just want people to get used to this bar on top, and then start A/B testing it with ads and see what reaction they get.
Or if they want to copy-paste a URL into an email (perhaps that's too advanced a feature...), it looks like a "let me google that for you" site, not what they're actually sending. Yay.
It has a slide-out animation to it?
There are page transitions. Considering that Google controls both the source and destination pages, I'm sure they could make one without stealing screen space. That annoys the user more than the owner of the content.
And if the user has a browser that doesn't support page transitions, it's only a waste of screen space.
And why, why would they break the img tag‽
>And why, why would they break the img tag‽
Introducing yet another markup format to serve mobile sites, whose only purpose is to let Google compete better with Facebook, is excessive and adds dev overhead. But it favours Google's interests, and Google basically blackmails content providers by locking search visibility to only those who obey. Additionally, it gives Google an excuse to serve pages via its own domain with backlinking.
There should just be common practices for mobile code simplicity, not another new markup format.
It's not so much a shock as it is a "new normal." Whether you like that new normal, that's up to you.
(For what it's worth, AMP causes a significant bump to my bounce rate, made up for by the higher number of visits that come in via users looking for the AMP symbol.)
It might discourage webmasters from adopting AMP, though, if they expect it to lead the visitor to the homepage or other articles.
This is akin to an ISP injecting their own little optimization toolbar into pages rendered to the user, except that cert pinning isn't even an option here, since Google is doing the content re-writing at the app tier. They even manage to keep the browser security icon green, creating the at-a-glance perception that the data provenance can be trusted!
If Google edge caching really is critical, then I'd rather see an architecture in which the AMP content is signed by the originating site, and the signature is looked up and validated client-side. This might incur a bit more of a performance hit for sites that are being overwhelmed, but really, that's as it should be -- as an end user, it's surprising that my AMP results are essentially a Google cache. And it would seem that Google goes to good lengths to make the look-and-feel appear browser-like, instead of calling out the cached nature of the response.
Why would a cached website be shown looking different from a non-cached one?
Typically, cached web pages come out of edge caches, either maintained by a CDN selected by the content provider, an end-user's ISP, or by a box on the local network. In this case, Google -- the search engine -- is fundamentally changing the nature of how the user obtains the third-party search result content.
IMO this is very different from my company or ISP doing some edge caching. It's different because it is much more of an opt-in sort of relationship, and, more importantly, because it's decentralized -- if Comcast starts to muck with search result content, Verizon users will notice the discrepancy. If Google takes advantage of their monopoly position and alters AMP content, nobody except the content providers will be the wiser.
I'd imagine that end users who don't trust Google aren't using Google.
> There's no real indication to an end user that that data is coming out of Google's caches rather than from the content provider.
Users who care can see the URL and the domain for which the cert is signed. Most users do not care.
> IMO this is very different than my company or ISP doing some edge caching. It's different because it is much more of an opt-in sort of relationship...
I don't see why I can't substitute "Google" and "Bing" for "Comcast" and "Verizon" in your example.
For that matter, at least in the U.S., it's a lot easier to switch out my search engine than my ISP if Comcast starts to muck with result content. FWIW, Verizon already redirects failures of domain name resolution to itself. I don't like it, and there's not much I can do about it if I want Internet around here.
> If Google takes advantage of their monopoly position and alters AMP content, nobody except the content providers will be the wiser.
That's a fair concern, but the day Google does that is the day a media firestorm blows AMP out of the water as a trustworthy tool.
And in particular, it looks (at a quick check) like the google hosted files are in a very predictable location, so it should be easy to write something to check both that version and the hosted version against each other to monitor for discrepancies.
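As a sketch of what such a monitor might look like, here is a helper that derives the cache URL for a page. The cdn.ampproject.org path layout ("/c/" for content, with an extra "/s/" for HTTPS origins) is an assumption based on the cache's documented behaviour; verify it before relying on it:

```python
# Sketch of a monitor that derives the AMP cache URL for a page so that
# both copies can be fetched and diffed. The cdn.ampproject.org layout
# ("/c/" for content, "/s/" for HTTPS origins) is an assumption based on
# the cache's documented behaviour at the time; verify before relying on it.
from urllib.parse import urlparse

def amp_cache_url(page_url: str) -> str:
    """Map an origin page URL to its likely Google AMP Cache URL."""
    u = urlparse(page_url)
    secure = "s/" if u.scheme == "https" else ""
    return f"https://cdn.ampproject.org/c/{secure}{u.netloc}{u.path}"

print(amp_cache_url("https://example.com/news/story.amp.html"))
# → https://cdn.ampproject.org/c/s/example.com/news/story.amp.html
```

Fetching both URLs on a schedule and diffing the normalized HTML would flag any divergence between the cached copy and the origin.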
Except Google used to just guide you to the content; you just had to trust them to give you the best possible answer. Now you have to trust that they have the real one too.
> Users who care can see the URL and the domain for which the cert is signed. Most users do not care.
Most users do not care if their passwords are stored in plain text, or if a bridge they cross has been built by a proper engineer. Consumer ignorance is not a valid reason for shady practices.
AMP's purpose is to deliver content at minimal bandwidth and latency. Requiring the client to side-channel to the originating server (that can be God-knows-where relative to the requesting client) defeats the purpose of creating a system for low-bandwidth users to quickly fetch and display content on mobile devices.
It's hypocritical to bill AMP as an improved mobile experience while sticking a big button at the top. It's also hypocritical to penalize sites for scraped, duplicate content while doing the same themselves.
As I wrote before, it's "more of a mutually-consented handholding with small amounts of arm-twisting".
Most users do not actually care what the URL bar says. We know this because of the number of people who get to Facebook by searching for "Facebook".
>The real cost of a website is not hosting it but filling it with things people want to read. Content creators will want to get benefits such as ad revenue to compensate for their effort.
Google doesn't create AMP pages for you - the content creator does. If the content creator wants to insert ads then they can do so via AMP.
> Google is not stealing traffic but it is stealing
HTML is already fast, it's all the extra resources added that makes pages slow. For publishers, the reason sites are slow is a combination of ad revenue pressure, poor tech skills, lack of time and focus on other priorities like producing content in a saturated market. This is changing slowly so that UX is more important but creating an entirely new proprietary system that only takes time away from the main site (and just affects mobile) is not the right answer.
More interestingly, the #1 most used adserver on the planet is Google's own DoubleClick - which means they could singlehandedly make all sites (desktop + mobile) faster by implementing better tech in their own stack.
This reminds me of sites in the early web that used to link out to other sites and frame them with a banner at the top. Websites countered this with frame-breaker scripts, but publishers really shouldn't have to do that.
I mean - running a "we'll host your page for you in exchange for injecting a toolbar" service isn't necessarily an evil thing to do. But there's no reason to opt people into it just because they're trying to perform well on mobile.
As for the X button, I suppose Google wishes people to use the Google search results like a news reader. I agree that it is redundant with the back button, but the interaction does seem faster (I suspect that the page is actually loaded in an iframe on top of the results page).
Then I realised there is an intermediary providing this service:
It is the worst of all things.
You can also look at the source of a Linkis-infected link to find the original URL.
The spec says nothing about HTTP (otherwise they could've done the right thing and added a "text/html;amp=1" content type).
In theory this should still work: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-...
But I doubt that Google's AMP cache will respect existing headers (the very nature of the cache stuff means that they won't, but what they will do to non-cache headers on an item is undefined AFAIK and so they will probably just strip them).
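To make the stripping concern concrete, here is a toy model (my own simplification, not real browser code, which also consults CSP frame-ancestors) of how X-Frame-Options gates framing, and how a cache dropping the header changes the outcome:

```python
def framing_allowed(headers, site_origin, embedding_origin):
    """Simplified X-Frame-Options check: may embedding_origin frame this page?"""
    value = headers.get("X-Frame-Options", "").strip().upper()
    if value == "DENY":
        return False
    if value == "SAMEORIGIN":
        return site_origin == embedding_origin
    # Header absent or unrecognized: framing is permitted.
    return True

# The publisher forbids framing...
original = {"X-Frame-Options": "DENY"}
print(framing_allowed(original, "https://example.com", "https://google.com"))  # False

# ...but a cache that strips the header silently re-enables it.
cached = {k: v for k, v in original.items() if k != "X-Frame-Options"}
print(framing_allowed(cached, "https://example.com", "https://google.com"))  # True
```

The asymmetry is the point: the header only protects you if every intermediary serving your bytes passes it through.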
The usability downside of providing a link to the actual webpage should be obvious - Google is trying to pretend this is the actual webpage. Why would they want a redundant link just to confuse the users?
That said, ever since Android separated google results view from chrome and added that strange "x->back" button thing I keep getting tripped up from a usability perspective. That's more the Android team being silly again, not just AMP, I think.
I set Google to open links in a new tab anyway, so the X the OP is complaining about actually fits my browsing style exactly: click the X and go back to the search results.
One of the biggest gripes is getting our paywall model onto the AMP sites, as we have very little input into how that is done. It also takes ages to hear back about requests/suggestions, with little feedback as to why they think it's a bad idea.
Hey, this is Malte and I am the tech lead of the AMP Project for Google. While I work on the AMP open source project, I did check back with the Google Search team that is more directly responsible for most of the points mentioned in the post. I personally find it very important to respond, because “stealing traffic” is literally the opposite of what AMP is for. The original idea behind AMP was to allow content to be distributed to platforms (such as Google, Twitter and Pinterest) in a way that retains branding and monetization control for the publisher. AMP traffic is the publisher’s traffic. Period.
I also realize that “just turning on the WordPress plugin” doesn’t get you there. Especially if a WordPress installation is heavily customized, one will need to invest similar effort to get the AMP pages to the same quality. While this may be a lot of work, this is by design: we recommend really optimizing AMP pages and fine-tuning them to your needs. AMP is not a templated format for that reason. While neither the AMP project nor Google is directly responsible for the WordPress plugin, the AMP open source project is working closely with the authors of the plugin(s) to improve its quality and scope. AMP is very flexible and should be capable of providing most features of a typical WordPress site, but this flexibility also requires corresponding work to make custom plugins and development show up in the AMP version.
Getting more literal about “stealing traffic”: there are audience measurement platforms that attribute traffic to publishers. They might in theory wrongly attribute AMP traffic to the AMP Cache (not Google) rather than to a publisher because they primarily use referrer information. That is why we worked with them in worldwide outreach to get this corrected (where it was a problem), so that traffic is correctly attributed to the publisher. If this is still a problem anywhere, AMP treats it as a highest priority to get it resolved.
“Ask Google to give users an easy option to view the original post.”
Let us start by saying that we love URLs as much as everyone else, and we tried hard to make the AMP URL scheme as usable as possible given the technical constraints of web apps.
We’re looking at ways to make the source link more discoverable and will update once that is done. AMP is super flexible in terms of how a publisher can direct traffic to their site. Typical ways to get to a publisher’s homepage (like clicking the logo) should just work and are in no way restricted. Also, make sure to check out amp-sidebar (https://ampbyexample.com/components/amp-sidebar/) for adding a menu to your AMP pages.
If you are not comfortable with traffic on your AMP pages, please do not publish AMP pages. Google Search has 2 types of AMP related features:
- Normal search: AMP does not influence ranking. Your pages will appear in the same spot with or without AMP.
- AMP specific features (such as the “Top Stories Carousel”): For these features, we believe that AMP is the format that currently delivers the best possible user experience on the mobile web. That is because AMP allows for consistent speed, caching, pre-rendering, and enables swiping between full-length pages. This is a big deal for topics where there isn’t “that one best result” that a user might want to look at.
“Google takes away ad revenue on AMP pages”
AMP supports over 60 ad networks (2 of them are owned by Google) with 2-3 coming on board every week and makes absolutely no change to business terms whatsoever. There is no special revenue share for AMP.
“If Google cares so much about the mobile experience, why cover 15% of the small mobile screen with a fat bar at the top?”
Android users may have already noticed that the bar now scrolls out of the way, and the same is coming soon for iOS (we’re just fighting a few jank issues in Safari). Similarly, we’re spearheading a long term effort (https://github.com/bokand/NonDocumentRootScroller) to allow web apps to define how the address bar is hidden on scrolling. It looks like this will land in Chrome soon, providing even more space to web pages.
Google has to respect Cache-Control headers, right? Set your AMP pages to return a restrictive one; then they won't be allowed to cache them.
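Whether any given cache honors this is another question, but the directive logic itself is simple. A rough sketch (my simplification of the HTTP caching rules, not a full RFC 7234 implementation) of how a shared cache decides whether it may store a response:

```python
def shared_cache_may_store(cache_control):
    """Return True if a shared cache may store a response with this
    Cache-Control header value (simplified: ignores Expires, Pragma,
    and authorization-related rules)."""
    directives = {}
    for part in cache_control.split(","):
        part = part.strip().lower()
        if not part:
            continue
        name, _, arg = part.partition("=")
        directives[name] = arg
    # no-store forbids storing anywhere; private forbids shared caches.
    if "no-store" in directives or "private" in directives:
        return False
    return True

print(shared_cache_may_store("public, max-age=600"))  # True
print(shared_cache_may_store("no-store"))             # False
print(shared_cache_may_store("private, max-age=60"))  # False
```

So `Cache-Control: private` or `no-store` is the publisher-side signal that a shared cache like AMP's is supposed to respect.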
Try using www.yandex.com as your search engine; it is surprisingly good, and has fewer ads disguised as "search results", fewer results from blog-spam sites, and similar nonsense.
> Most importantly, I was surprised to find out that instead of redirecting users to an optimized version hosted on my server, Google was actually serving a snapshot of the page from their own cache.
and now you're upset that they really only made it faster via caching it, not actual magic.
Were you perchance born yesterday or are you just very naive...?