This only helps heavy-handed, SEO-optimized sites with deep pockets and time to kill get yet another edge.
The article with the best insight isn't likely to be the one with the perfectly optimized website.
I understand that page speed affects end-user experience when they hit a website, but that's not what I searched for. I did not search for "fastest website with okay knowledge about dogs"; I asked for "website with the best knowledge about dogs".
I want Google to be able to show me the most awesome page about dogs. The most in depth and relevant information. That niche dog blogger who is so passionate that they spend all their time researching dogs. I don't want "10 cool facts about dogs by Buzzfeed".
Let's say the page with the best information about dogs took an infinite amount of time to load (never loaded). I don't think you'd want that page to be ranked highly.
What about if that page took a year to load?
What about if that page took an hour?
What's your cutoff? Let's say, generously, that you're willing to wait 30 minutes for the page to load. Why such a sharp cutoff? Why do pages that take 30 minutes and 1 second to load get penalized, but pages that take 30 minutes are treated as fine? The user experience is basically just as acceptable (by your standards) / just as bad (by my standards).
Perhaps, instead of a sharp cutoff, we need some sort of sliding scale, where pages that are only slightly slower than optimal are penalized only slightly, and pages that are much slower than optimal are penalized more.
Gosh, that sounds familiar.
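A minimal sketch of what such a scale could look like (the threshold and the decay curve are invented for illustration, not anything Google publishes):

    # Hypothetical sliding-scale speed penalty: slightly-slow pages are
    # penalized slightly, much-slower pages much more. Numbers invented.
    def speed_penalty(load_seconds, optimal=1.5):
        """Multiplier in (0, 1] applied to a page's base relevance score."""
        if load_seconds <= optimal:
            return 1.0                  # fast enough: no penalty
        excess = load_seconds - optimal
        return 1.0 / (1.0 + excess)     # smooth decay, no sharp cutoff

    # speed_penalty(2.5) -> 0.5; speed_penalty(1800) -> ~0.0006 (the 30-minute page)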
Also, your example of a year is obtuse. It's like saying you should drive a BMW because you will get killed in a Tesla if you crash at a million mph.
What Google's page speed factor does is differentiate between 1.5 and 2.5 seconds loading time. That has nothing to do with the quality of a page.
If it is truly the best information, it is what I want and I will wait. If I trust that Google can give me the best information then I would be willing to wait. I give up easily now because I can't trust Google to give me quality results. I will not wait for an ad bloated news aggregator.
The tricky thing is also that Google does not inform the user that it is prioritising faster-loading pages over the pages with the most accurate match. That is a little dishonest too.
The same, IMO, can be applied to speed as a factor. A slow site (slow due to depth of content?), badly marked up but providing the correct result, should outrank the fast site providing wrong info.
Pages that should be heavily penalized are those with lots of extraneous JS, bloated CSS files, lots of auto-play videos, and huge image files, supposedly created by "experts" who need a DevOps team to deploy and distribute all the bloat.
Sometimes images make sense.
Art collections are easy to develop with icons and progressive images.
Progressive images... eh. The JS you load for them is 3-4x the size of the images at 720p. Progressive JPEG, that's better.
This sounds plausible, but doesn't translate to reality. Look at the slowest content sites today: they're large, well-resourced news sites. Look at the small, independent individuals running their own personal site: many are very fast.
In reality, I'd say the biggest causes of website performance issues are (a) institutional bureaucracy - managers requiring their website to do something without regard for the overhead - and (b) professional graphic designers defining UI features without technical knowledge of their implementation details. These are both, usually, corporate inefficiencies that have little effect on smaller maintainers.
That is, with fewer bytes in the page, less bandwidth is used; and since less bandwidth is used, less energy is used.
Unfortunately, Google doesn't measure, nor seem to care about, energy consumption, as GoogleBot only cares about response times. (Of course, there are energy concerns there too... but it's only ever about speed with Google, which punishes everyone else for not evangelising it.)
Using Page Size as a metric has important benefits that Page Speed deliberately undermines. It also resurrects the notion that using external cached CSS and JS is a good thing.
Did you know that in order to get 10 urls (the thing that you and I actually want) from google, you need to download 500K? It seems ludicrous to me. Each search result is chock-full of css and js that doesn't change.
Thankfully HN is old-school and uses external files and good-ol' minimalism.
If you can test speed close to the requester geographically, it's possibly an even better indicator of energy usage because it also implies the data doesn't have to travel as far.
Also, Google has no idea what external resources your browser has cached locally.
I'm pretty sure they have enough data to figure that out.
That isn't what search is ranking. It's ranking what users want. And users most definitely want, on average, among other things, fast pages.
As an aside, gaming on satellite internet was a real drag, the only game I had that would handle the full seconds of latency was Star Wars Galaxies. Thankfully that was already a favorite.
But who will measure it "objectively" (leaving aside that no one agrees on how to universally measure load speed)?
Googlebot? From what location and what machine types and how often? Should I get search results based on an average of all possible variables or the closest matching my locale and device? What happens when page content changes? What happens when the page sometimes loads slow content (e.g. ads) and sometimes doesn't? What happens when SEOers start cloaking their ad loads or taking advantage of flaws in the benchmark?
It's good to have a speed signal in search rankings, but this petition shouldn't pretend that's an objective replacement for something that always displays content first and loads media and third-party content async.
One potential solution that fits the theme of the letter could be logic like this:
    MAX = <reasonably fast load time>

    if page loads in > MAX:
        -> punish page rank

    if page uses GOOG proprietary:
        -> show at the very top
        -> trick anyone that clicks
        -> ban website
        -> reach out to sell AdWords
"Reasonably fast" could also be defined more loosely as "page uses good optimization practices" (e.g., like a YSlow grade).
> MAX = <reasonably fast load time>
Privacy reasons mean that it is basically impossible to load the content from the publisher's server until after the user clicks on the result. If you wait until the user click event to load the content, then there is nothing you can do to avoid a full network round trip.
AMP preloads the minimal content before the click, removing the round-trip. That's the whole magic, and the only way to get the 'instant' load.
There are unavoidable technical requirements to do this:
- Loading from Google's (or link provider X's) servers to preserve privacy.
- Constraining the html so that the load itself can't break privacy or add jank to the search result page.
If you aren't willing to accept these constraints, then you are always going to face a network round trip in added latency. You can't have all 3 of:
- Zero perceived latency
- User privacy before the click
- Serving from the publisher's server
You can only pick 2. AMP forces choosing the first 2, which are what the vast majority of users care about.
Then AMP goes way out of its way to give publishers back everything they could possibly miss from the 3rd.
Why this? If the HTML is already loaded from Google's servers, the load shouldn't be able to break privacy even more, and jank shouldn't be affected by the specific HTML that's being served, just by its size. So Google could just preload everything that's small enough.
If you do constrain what can run in your preload to avoid these, you end up designing AMP.
There is another way which enables almost-instant loads: turn off JS completely. Try it yourself before you disregard it. It literally makes page loads instant over a regular home connection.
On some wired connections, a network round trip falls within what users perceive as 'instant', but for many people on mobile devices, it is very noticeable.
Even on the best LTE networks, the 'core network latency' is 40-50ms - the latency for getting a packet from the phone to the tower to the packet gateway. This is before the packet even goes on the internet. And you need to double it for the round trip. You usually need to double the whole thing again for the initial DNS lookup.
Best possible latencies for an html page load on excellent LTE connections without prefetching are around 150-200ms. Most users in the world will experience >1s. A United States major city wired connection is not at all representative of what most people experience on a mobile connection.
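Back-of-the-envelope, using the figures above (one illustrative set of numbers, not a measurement):

    # Best-case uncached page fetch on a good LTE link, per the numbers above.
    core_one_way_ms = 45              # phone -> tower -> packet gateway
    rtt_ms = 2 * core_one_way_ms      # ~90 ms per round trip
    dns_ms = rtt_ms                   # initial DNS lookup: one round trip
    fetch_ms = rtt_ms                 # request + first byte of HTML: one more
    best_case_ms = dns_ms + fetch_ms  # ~180 ms, inside the 150-200 ms range cited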
Just look at Time to First Byte (TTFB); when you pile on SSL negotiation (since most have moved to HTTPS), the advantage is even greater since all AMP pages currently fall under the aegis of Google's domain and all non-AMP pages will require that overhead at first click.
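Counting cold-connection round trips shows how big that overhead is (assumes a full TLS 1.2 handshake with no session resumption, reusing the LTE round-trip estimate above):

    # Network round trips before the first byte on a cold HTTPS connection.
    rtt_ms = 90                # one mobile round trip (see the LTE estimate)
    dns_ms = rtt_ms            # resolve the publisher's hostname
    tcp_ms = rtt_ms            # TCP three-way handshake
    tls_ms = 2 * rtt_ms        # full TLS 1.2 handshake: two round trips
    request_ms = rtt_ms        # the HTTP request/response itself
    ttfb_ms = dns_ms + tcp_ms + tls_ms + request_ms  # ~450 ms before any content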
Increasingly, phones are the dominant computing tool accessing Google search results. Optimizing for phone experience is key to serving the largest growing demographic of searchers.
Google doesn't start preloading 50 documents; it loads the top result or two. Even viewing a hundred AMP pages in a month is going to cost you less data than one 10kx10k unoptimized image that a publisher decides to load, which happens a lot. Note that AMP optimizes images too, taking into account your device resolution, so it's most likely saving you bytes.
Everything else wastes my bandwidth, and I don't care that it is only 100kb; my monthly effective datacap is about 50MB and I always browse without images.
The "100kb or less" works out to only about 500 pages preloaded before my whole cap is gone, not accounting for the size of the Google page itself. I'd rather not have any preload at all.
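The arithmetic behind that figure, using the cap mentioned above:

    # How far a ~50 MB monthly cap stretches at AMP's "100kb or less" per page.
    data_cap_kb = 50 * 1024           # ~50 MB effective monthly cap
    preload_kb = 100                  # per-page preload budget
    pages = data_cap_kb / preload_kb  # -> 512 preloads exhaust the entire cap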
They factor all that stuff into a user-happiness metric which impacts rankings, probably via a bunch of machine-learned models.
The AMP logos and stuff are there to try and force webmasters to take speed seriously. For years people have said it's important, yet big sites like eBay and Amazon, who certainly have the financial means and motivation, still take 10x longer to load than what would maximize both revenue and user happiness.
And speed is way down on the list. Try searching for lyrics to any popular song and tell me it isn't so.
All they really need to do is add more weight to it. But they don't want to because that breaks the lock-in that they are striving for.
(Of course, user permission must be granted, but shouldn't that be feasible?)
This data is a much better indicator of how fast a page is than whether it is AMP or not.
It is a horrible predictor of load times.
Not every city; the top 500 cities would probably cover 90+% of the users.
You're not understanding the scale of Google. They're incredibly good at exactly this stuff.
If a cable is cut between Jakarta and Singapore, it can take months to fix, as Indonesia only allows Indonesian-flagged cable layers to operate in its waters.
Individual datacenters, CDNs, cables are going offline constantly, so you'd need to be continuously testing every site to rank it correctly. Not only that, each page in a site might have a very different structure, so you can't just rank a site, you need to rank individual pages and articles too, which means testing every new page and article from every city.
I guess you're not getting the scale of that...
AMP standardizes the CDN serving part, largely eliminating that issue. AMP articles can't prevent 3rd party CDNs from caching them, which means that they can be served from the closest CDN after the first time they are used.
Which is the reason why Google focuses so much on the number of round trips. For most users, downloading 50 kbytes doesn't cause a lag, but 10 round trips before the site can be rendered do.
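A rough comparison makes the point (round, assumed numbers):

    # Serial round trips vs. raw bytes on a typical mobile connection.
    rtt_ms = 90                                 # one mobile round trip
    blocking_round_trips = 10                   # round trips before first render
    waiting_ms = blocking_round_trips * rtt_ms  # 900 ms of pure waiting

    page_kb = 50
    bandwidth_mbps = 5                          # modest mobile bandwidth
    transfer_ms = page_kb * 8 / bandwidth_mbps  # kilobits / (Mbit/s) = ms -> 80 ms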
A user in Indonesia isn't at the closest datacenter in Singapore. It doesn't matter what you do: you can't, from Singapore, emulate what a user on a particular ISP in a particular city in Indonesia will face.
In case it isn't clear:
Consider 2 websites, both have hosting in Singapore, but one also uses a CDN in Jakarta.
The Google crawler running on the Singapore datacenter connects to both websites, speeds are great because they are sitting just next to the datacenter. Both get great ratings.
A user is in Jakarta, and tries to connect to both websites. One of them takes a century to load, because it has to load everything from Singapore, and international cables from Singapore to Jakarta are expensive, so connections are capped. Experience sucks.
The second website uses a CDN in Jakarta, located at the local colocation facility to which the user's ISP is also connected. The website loads instantly.
CDNs today are located in every single small ISP, in all countries, in all medium-sized cities in the world.
CDNs are a hack that can make websites load a bit faster, but that's about it. It won't make them use less RAM, or consume less of your data cap. Ultimately, performance is about the size of the site, the resources it uses on the client, and the number of requests it makes. All of which is independent of the number and location of CDNs.
If you pretend that all your users have net connections like the US, Europe, or parts of east Asia then you're going to end up as a small player in a balkanized internet. That's why all these companies are making aggressive investments and acquisitions in other parts of the world.
CDNs are also part of the fundamental reality of the internet. Could you imagine Netflix or YouTube without CDNs? The amount of network capacity investment necessary to run the internet as we know it without CDNs would cripple it.
And it's not just the internet, but it's a fundamental fact of computing that you have hierarchical caching layers with some data far away and some data closer.
It's not about privilege or capturing emerging market. It's about laziness and waste, pure and simple. The same bloat that prevents someone from accessing your site on a smartphone in the middle of Africa is also significantly slowing down the computer of someone in western Europe, and eats into battery life too.
And let's not bring streaming video into this as it's completely off-topic.
And that's the privileged position. Bloat is annoying if you're an American reading American pages, or a Spaniard reading Spanish pages, and we like to complain about it because it means the difference between being able to read an article on the subway and having to wait for the next stop. But I'm never in a position where I have to complain about the backhaul network or the backbone.
In the future, please don't try to police "off-topic" conversation.
Public web sites that only work for "first world internet" customers shouldn't be a thing. Sites that load 100MB of ads and trackers and fifty different JS frameworks to display a few paragraphs of text should be getting the search-engine death penalty until they clean up their act. They should be thinking about people who aren't on fast broadband connections. If they aren't, the search engines should be.
YouTube learned this from the "paradoxical" result of a lighter-weight page increasing load times:
(spoiler: it was because the lighter-weight page meant it was suddenly usable by people who had slow connections who couldn't use it previously, and those people began to use it, which skewed the average load time higher)
Let's say your website is hosted in Jakarta, and serves Indonesian customers. You make it lightweight and fast.
Your competitor hosts its website from Singapore, and is much heavier.
Now Google, from its datacenter in Singapore, wants to test the load times for your website and your competitor's. Sadly, the host you chose doesn't pay for a lot of international traffic on the cable to Singapore, since it primarily serves the Indonesian market. Google tests both sites, and your competitor's - based in Singapore, about a mile from the Google datacenter - loads several times faster, despite being heavier.
Google ranks your competitor higher, and you are pissed off, as your customers load your site faster.
Now another case: you and your competitor host your sites in Singapore. You build a lightweight site, and your competitor's is full of heavy assets, with oversized images, etc. Google tests both of your websites and yours loads faster, and is ranked higher. You're happy.
But your competitor signs up for Akamai and replicates the heavy assets in local servers in all the major ISPs in Indonesia.
Customers in Indonesia search and see both your and your competitor's results. You rank higher and get more clicks, but as your traffic is subject to the congestion on the international cables, your site actually loads far slower than your lower-ranked and heavier competitor. Customers are unhappy.
In all of this, simulating a slow connection on Chrome's dev tools is irrelevant, because you don't know where the bottleneck is!
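A toy version of the measurement problem (all latencies invented for illustration):

    # The crawler and the user see opposite orderings of the same two sites.
    latency_ms = {
        ("crawler_in_sg", "light_site_hosted_sg"):     10,
        ("crawler_in_sg", "heavy_site_jakarta_cdn"):   15,
        ("user_in_jkt",   "light_site_hosted_sg"):   1200,  # congested int'l link
        ("user_in_jkt",   "heavy_site_jakarta_cdn"):   60,  # local CDN hit
    }
    # Rank by the crawler's numbers and the light site wins;
    # the user in Jakarta experiences exactly the reverse.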
It does, because the definition of bloat depends on the network speed.
What counts as bloat for a user sitting in Singapore isn't the same as for a user sitting in Jakarta.
> Beyond the fact that the network can be mapped (Google surely does this) and simulated
Network topology is irrelevant in this case. The ISP in Jakarta might have a satellite connection to Singapore, but have a CDN and it will load faster than in another ISP with a 10G international fiber link to Singapore.
> All of which is independent of the number and location of CDNs.
That's completely wrong. Today, the vast majority of internet traffic is served from CDNs, CDNs (and their placements) are critical for the internet as we know it.
No, it doesn't. It depends on the proportion of actual content to the unnecessary cruft. The difference is that what a Singaporean considers bloat might prevent someone in Jakarta from accessing the site at all.
> That's completely wrong. Today, the vast majority of internet traffic is served from CDNs, CDNs (and their placements) are critical for the internet as we know it.
That is if you include streaming video, which is off-topic for this discussion.
Imagine a perfect world with the ultimate CDN solution - something like IPFS, which ensures everyone is a CDN to everyone else (instead of wasting bandwidth with multiple downloads of the same content). The concept of bloat wouldn't disappear in such a world, because bloated websites will still require you to download an unnecessary amount of data to your machine, and will reduce your computer's ability to multiprocess while also using up your battery life.
I stand by what I said before - CDNs are a hack (basically a hand-rolled, rudimentary distributed system) to give you some extra download speed. They don't solve the problems of download size and resource usage.
If the problem is a lack of information about global network conditions, then you can deploy canary machines around the world to gather that information.
> If the problem is a lack of information about global network conditions, then you can deploy canary machines around the world to gather that information.
You'd need to put machines in every single city and every single ISP in the world continuously testing every website in the world.
Doesn't make any sense at all. That's not how networks work.
If you're in Jakarta trying to access a certain website, whether that website is available at the local CDN might make a bigger difference than which ISP or even which plan you're using.
You could have a 1Gbps fiber connection from your ISP in Jakarta. If you're trying to access a website in Singapore and your ISP's international link is congested, that 1Gbps will be useless.
OTOH, if you have a 10Mbps connection, and you're trying to access a website replicated at an Akamai CDN in your ISP's network, it will be way faster.
Correction: Google can't predict the speed of a random webpage to a random user around the world.
But you can reasonably predict that pages hosted at widely available CDNs around the world will load faster on average than pages that aren't.
All the hard work of eliminating abuse is something they've done already for those accounts. And their userbase is so big, and presence so strong, that it would generate pressure everyone would benefit from.
My idea doesn't require anything new except them measuring load times, which I'm guessing they're doing already for improving Chrome, and then using this data to influence search ranking. Everything else is already set up and working, and you opt into it if you're using Chrome and are logged in browser-wide.
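A sketch of the idea, heavily simplified (the aggregation and the signal shape are my own invention, not anything Google has described):

    # Fold opted-in, per-URL real-user load times into a ranking signal.
    from statistics import median

    def speed_signal(reported_load_ms):
        """Map a list of real-user load times (ms) to a 0-1 ranking signal."""
        m = median(reported_load_ms)     # median is robust to outliers
        return 1.0 / (1.0 + m / 1000.0)  # slower field data -> weaker signal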
Then you improve your benchmark.
Edit: To clarify, "improve your benchmark" is not a viable strategy by itself any more than "put your opponent in checkmate" is a viable strategy. As advice it is unhelpful. You generally have to mitigate flaws in the way you rank pages, keep large parts of the algorithm secret, and attack the economic viability of your adversary's strategies. Rather than going for checkmate, you want your opponent to resign so you can do something else instead.
There have been incentives for bad behavior on the web for so long that I don't think we have any hope of coming up with any kind of automated benchmark to tell us which pages have good user experience.
If this were true, then there would exist no search engines consistently capable of finding content highly relevant to the user's query -- as opposed to just returning promoted content, which according to your premise is always winning the adversarial battle.
Sure, there is an adversarial element, which means there is a constant 'battle' for 'fairness' among results (or whatever your goals are as a search engine). But in reality, search engines as a whole seem to work rather well at finding what we want. Of course results are biased from adversarial pressures (and that's what we're discussing on this thread), but it does seem possible for "benchmarks" to work well in an adversarial environment.
I can't tell if you agree that Google doesn't return highly relevant content or if you just get better results than I do. Almost all my search results are dominated by low-relevance, high-click-through brands, regardless of quality. For example: Quora often replaces high-quality content with low-quality paywalled answers researched from Wikipedia or other public resources. 10 years ago you would have gotten MetaFilter, which is (IMHO) objectively a higher-quality source less hostile to web users.
The degree to which search engines are successfully gamed by adversaries is not something I wish to present a strong opinion on here, which may explain your confusion about my position.
Instead, I just wanted to illustrate the logical connection between an adversarial-defensive ability to rank based on [website loading speed], and the ability to rank sites based on [any other factors]. If you can pull off adversarial defense of the latter, you probably can achieve the former with even less effort.
That said, I do present to you tentatively the claim that most popular search engines do a decently good job at fulfilling their goals, even if those goals are not in alignment with users' goals in many cases.
For example, there are certainly many examples of paid ads given preferential treatment in search results; however I do not believe this constitutes a case of an adversary beating the search engine's metrics. Rather, it is a consensual relationship/contract between the search engine and an advertiser being fulfilled. Whether this contract/relationship is in the best interest of consumers, is another matter altogether, of course.
AMP serves a purpose for the end user and it does so well, it loads instantly and doesn't consume much data in the process.
As for their "demands":
1. Google already states that AMP pages are ranked higher because they're faster to load.
2. I'm not sure if it's related, but they addressed that only yesterday: https://amphtml.wordpress.com/2018/01/09/improving-urls-for-...
E.g., your site is just as performant as AMP, but you're not using AMP.
I'm of the persuasion that they can rank and display the results however they please; it's their site, after all, so it's a non-issue either way.
> I'm of the persuasion that they can rank and display the results however they please; it's their site, after all, so it's a non-issue either way.
I don't get why anybody says this. Of course they are in control. Nobody can force them to do it differently (maybe the government but whatever).
Most people aren't saying that Google has no right to do what they're doing. People are saying that Google _should_ be doing it differently.
Of course they can legally do it, and we're not judges debating that.
There's a difference between what they are allowed to do legally, and what they can do that keep me coming back as a user. This is legal, but it makes me use DuckDuckGo instead.
This new open standard they are pushing will eliminate the need to even have an AMP cache for a site to consume AMP - win-win.
Hmm, I guess it depends where you look. I saw quite a LOT of people upset by it. Many participated because they felt like they had to in order to get views.
Cool, so how do I get my plain text page which loads faster than the motherfuckingwebsite.com into the AMP carousel?
<!-- yes, I know...wanna fight about it? -->
If anything, supporting AMP has accelerated my non-AMP pages a lot.
You can read it here: https://andrewrabon.com/particle-a-proposal-for-tinier-html-... [Draft]
Would anyone here be interested in seeing the blog post completed and/or helping me build it out? If so drop me a line at andrewrabon at gmail. :)
I also want feedback on if my ideas are actually solving a problem, and are doing so in the correct way. I only recently joined a news publisher so I'm not 100% cognizant of the issues at play.
Hasn't Google gone on record saying that AMP doesn't affect search results given the same page load speed for non-AMP sites?
So yeah, Google's not honest about this.
Disclaimer: I work for Google, but not on the AMP product.
I don't think the letter authors were calling for an Act of Congress. My read is that they were calling attention to the damage that AMP does to the content providers. And perhaps they could have put more emphasis on the fact that it's in Google's long-term interest not to bleed content providers dry.
They certainly see it as a big enough grievance to include in a manifesto-type document which is repugnantly entitled.
You seem to suggest that OP's control over their own site content (i.e. the manifesto) is somehow an affront to your belief in Google's free speech.
You certainly see it as a big enough grievance to write a comment on HN which is repugnantly entitled.
They get to pass comment on Google's behaviour (a comment which, incidentally, a lot of people seem to agree with). You get to pass comment on their behaviour. I get to pass comment on your behaviour.
Holding people and companies to account in a soft manner, without resorting to force (legislation counts as force) is an important aspect of a functioning free society. The freedom of speech is not freedom from criticism but freedom to criticise.
Anti-competitive business practices are not.
It's funny how everyone's a legal scholar these days.
They'll probably do away with the carousel eventually, but for right now an AMP page does have an advantage in search.
If that's true, and AMP doesn't have any preference in the carousel, then Google's comments are completely accurate.
Does anyone actually believe a fancy AMP-powered site isn't getting a nice traffic boost from Google?
If they were to get a nice traffic boost, would it be because the site actually loads faster or because it was ranked higher? I think both would contribute.
It requires changing browsers, to let people distribute Google Hosted Apps on their own domains (as long as Google approves). Sites distributed in this way will get preferential treatment by Google.
This is essentially an App store model, similar to Google Play, and the really big danger is that we end up in a place where you either distribute (and host) your website through Google's AMP registry, or you're not actually on the web (in the same way that distributing your Android app as an APK outside of the Play store is not a realistic distribution channel).
I hope Firefox, Safari and Edge will resist this.
As long as anyone can set up their own AMP-like cache and build a search engine similar to Google that also supports preloading... then I'm not sure it's so bad.
The proposal GP refers to has a lot of use cases; it basically makes caching of HTTPS content by third parties possible. At least to some extent.
Yes, anyone could build a cache. But can anyone build a cache that can compete with Google's absolutely massive infrastructure and pile of money, not to mention the reach they already have with search?
I'd say this is not a discussion on the level of 'can you build a better product and win against a giant company?', more like 'can you build a planet and get everyone to move there?'
Yes, the web is dominated by big companies more than ever before but there's at least some competition left.
Cloudflare is already running an AMP cache. If the spec works out, I'm sure other big players will join.
But yes, taking on Google search is hard.
Does it? The web packaging spec linked to doesn't seem to require any of that. And it's intended as an open standard for all browsers to implement, so you wouldn't need to switch.
While the letter wants third-party content not to be surfaced in a wrapping viewport that downplays the fact that it's actually Google's AMP newsreader, the recent AMP announcement details a planned change where emerging tech will be used to make the URL appear as if the content was loaded directly from the distant origin. This works because the content being served has been digitally signed by that origin, and its serving has been delegated to Google, akin to how a run-of-the-mill CDN is delegated the authority to serve content for a domain that's 'spiritually' owned by someone else.
However, the recent AMP announcement does address a very frequent complaint about AMP. Just one that's at odds with the one the letter is requesting.
There could be privacy considerations I suppose, but the standard addresses that already: https://wicg.github.io/webpackage/draft-yasskin-http-origin-...
2. Relies on a "web standard" that Google is pushing and is implementing in Chrome (and so far, I think, only confirmed to be in Chrome)
Update: So after Google announces that they're fixing AMP, some people decided to write a letter demanding that Google fix AMP? I guess the next step will be to retroactively take credit for Google's changes.
Google's results ordering is already sufficiently "optimized" away from my needs that as often as not I end up in 'verbatim' mode or on page 3 of the results before I get near what I'm looking for, so an extra layer of SEO-gameable "user convenience" isn't likely to make my google experience that much worse.
Since the presence of AMP results is based on device detection, it seems a search setting removing them would be a simple matter. Google's unwillingness to provide it speaks volumes about their actual intent, IMO.
But aren't they the ones creating the content you are enjoying? Seems strange to care so little about them, as without them, there would be no content to enjoy.
And the net will be a better place.
How many publications do you subscribe to?
If my only two options are awful intrusive ads or no content, then I'd choose no content.
Google, please allow me to opt-out of AMP.
Funny because I did the same and stayed with DDG even though Google has slightly better results.
AMP pages have broken controls and I frequently encountered completely broken AMP pages (no content visible). Was this the "fast loading page" experience that AMP promises?
1) Performance - Yes, it LOADS fast, but MANY pages "feel" janky when scrolling, especially pages that let you load additional articles by scrolling left/right. Only the simplest pages don't "jank" on my device.
2) Scrolling - It isn't the native scroll. It seems to be a div with "overflow-scrolling" applied. Other than that feeling "off", it prevents the bottom toolbar from hiding, reducing screen real estate.
3) Sharing URLs - you end up sharing a google.com/amp URL rather than the publisher's own link.
4) Biggest: No way to opt-out. I'm a fan of the open web. It's why I prefer a website to a sandboxed experience you get with native apps. At least for the first click on Google SERP pages, I'm "stuck" with google.com.
One note: I get the AMP use case, and I'm sure there are users who prefer AMP pages for the performance they see. However, I'm fortunate enough to live in an area with decent mobile speed/bandwidth, and loading the original pages is rarely "slow" on my device. And the originals all address the issues I noted above. I'll take the .5-1 second delay for the original(s) over the AMP version.
I’ve been in mobile development for about a decade, what Google did was re-invent WAP on top of HTML5, turning back the clock at least 10 years.
By also sending all non-Google traffic through the tunnel ("for improved speed!" remember when AOL included SpeedBooster Technology?) they'd control a majority of all non-search internet traffic, like AOL without the disks. And they wouldn't even have to bill you because they'd get an unlimited supply of ad data.
Aside: from a privacy perspective it creeped me the hell out once I realized what was happening.
2) Tapping the top of the screen to return to the top of a webpage is broken.
3) Searching for a word within a webpage is broken.
This is all basic functionality that I use often. And I'm sure there are other things broken about it.
Dear website owners who wrote this letter: yes, AMP has its downsides, but the reason we have AMP is that the internet as a whole failed to make fast sites. You had your chance for the previous ~30 years, and it's clear you couldn't be bothered to make speed a priority. I'm glad you're angry that AMP is eating your lunch, because maybe you'll actually start trying to compete now. But you aren't getting my sympathy.
I'd probably have a different opinion if I didn't use Adaway, which blocks ad servers by domain in /etc/hosts. When I've tried going without an adblocker, the web was not such a nice place.
Half of the reason we're in this mess is on-line advertising. The other half is that lots of companies think it's a great idea for everything to require constant data exchange with the cloud.
And now they’re trying to get a new standard feature to let them lie about the URL to make it even easier!
You know what worked well? 20+ years of the URL bar showing me the site I was trying to read without anyone messing with it.
Also, that was added later. Originally it wasn’t there at all.
And the benefit of everybody with a mobile device that just wants their article to load quickly.
> And now they’re trying to get a new standard feature to let them lie about the URL to make it even easier
These are called "engineering tradeoffs". It's not perfect, but it's better than what we had before.
Google could allow opt out of AMP for all those who don't want it in their search results but they don't because they are trying to wall off and control a section of the web.
If AMP and others exist, there's a reason, and it might be more useful to respond to that than to respond to AMP.
That’s not a reason that matters to me at all.
Fast websites stayed fast, and slow websites stayed slow. They also gave out tons of free "speed test" tools to make it easy for developers to test their speed, and improve it. They even made apache and nginx plugins that would auto-optimize assets as it served them! And still basically nobody used them.
You could try a "size limit", but then you penalize asset heavy sites (like high-resolution image sharing sites, or video sites).
So their solution was to make a very restrictive system where you are forced to do things "the right way" (for at least one definition of "right"), and to pump that up in the search results.
Plugins were made for common blogging and news platforms, and now a portion of the web loads significantly faster for many, and I think that's a win.
It's not perfect, but it's at least the first thing that I've seen Google try in this area that is actually working.
Obviously there is benefit to Google as a company: results served through their AMP system are now faster than those of some of their competitors, and AMP gives them a nice, easy way to pull structured information from an article for things like the carousel and other non-search offerings. But I genuinely don't believe that was the main motivator, seeing as Google has years of failed attempts at "fixing" this problem (though maybe I'm just not cynical enough!).
That being said, there are technical reasons why this hasn't happened yet. The good news is that some web standards (ex: https://wicg.github.io/feature-policy/) are being worked on that would (hopefully) allow Google to verify performance of sites without relying on AMP.
This would give Google no excuse to give AMP content special treatment, and would hopefully relegate AMP to what it should've been since day one: a framework for performance, but one that wasn't required or bolstered by any Google-colored carrots.
> The developer may want to use the policy to assert a promise to a client or an embedder about the use—or lack of thereof—of certain features and APIs. For example, to enable certain types of "fast path" optimizations in the browser, or to assert a promise about conformance with some requirements set by other embedders - e.g. various social networks, search engines, and so on.
You're right that there's more to it though.
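For a flavor of what that could look like, a response header under that draft might read something like this (feature names taken from the draft; whether a search engine would trust such a self-assertion is exactly the open question):

    Feature-Policy: sync-xhr 'none'; autoplay 'none'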
Websites did ridiculous things for SEO when Google was the main traffic driver on the web. Ridiculous. If Google sufficiently penalized bloat in ranking, most websites would remove the bloat. But it never was a significant enough factor.
IIRC they treat AMP pages the same as any other in terms of ranking, but since AMP is super fast, it gets a natural boost.
(obviously AMP gets other benefits like the lightning bolt badge and inclusion in the carousel)
I believe that some of the data in the carousel is pulled from the way that AMP is structured, however I'm not 100% sure.
I think it's kinda hilarious how there is this fixation with load speeds amongst web devs. Most people I know are far more concerned with everything other than the load time.
The caveat comes in when no alternative exists, and when access to some information or service on very constrained connections (such as you'd see in remote third-world regions with poor Internet access) meets some reasonable definition of vital; but I've not seen where AMP is playing a significant part in that.
For example, Reddit loads extremely fast, yet its AMP pages are absolute shit; I don't know why they decided to use it.
And I'd rather view the website as intended (minus the ads, sorry), than in a shitty cut down version that takes 3 seconds less to load.
Loading a traditional site can have side effects (e.g. reporting a view to an advertiser's analytics). By using the AMP validator, Google Search knows it's safe to preload and prerender an article, without triggering side-effects like analytics.
So it's more than just loading faster than some threshold; it needs to be safe to cache and prerender the content. The AMP validator is what asserts these are both safe.
(Note, I don't work on AMP and don't represent Google - this is my personal understanding.)
See Ilya Grigorik's time-to-glass talk: https://www.youtube.com/watch?v=Il4swGfTOSM
The narrative of wanting to monopolize user traffic doesn't make much sense in light of the recent announcement around URLs.
I think a lot of this is in reaction to FB's Instant Articles and wanting to gain further leverage over publishers. If you can commoditize publishers' content and control the format, and you supply the users and the monetization via ads, you can make the publishers do whatever you want because the alternative is for them to lose money, which they can't afford to do.
In many ways, this is a very defensive play by Google. It also happens to provide a better user experience in some ways, which is awesome. And it helps further Google's goals of controlling ad formats and, more importantly, the data that is collected on publisher sites - data that publishers may not be able to monetize in the same ways with AMP pages.
AMP only works today because it's re-hosted inside the origin (google.com), so much so that, in order to fix this in the future, they'll have to rush out a new web standard and implement it in Chrome (and hope other browsers do the same): https://amphtml.wordpress.com/2018/01/09/improving-urls-for-...
Move ad blocking to the indexing layer. Create a corner of the web that is naturally fast and user-friendly again without resorting to corporate-defined subsets of well-documented open web technologies.
DDG doesn't have webmaster tools, nor does it crawl domains natively.
My experience has been that direct-sold ads produce more revenue for publishers than programmatic ads. I would love to see research on this one day.
Yes, but do they provide an equally good ROI for advertisers? (And the same level of granularity, tracking and reach as something like Google Adwords?) That's the real issue at play.
It's the smaller websites (Bob's Blog) who can't afford the eng resources to build for AMP and to build fast ads who are "suffering" a loss of search-result prioritisation. (These smaller sites may turn to shady ad networks, and suffer further with a shitty ad payload, further lowering search preferences if those are based on page load.)
Does that mean smaller websites are stifled? Not sure...
Uh, usually this results in "native advertising", which is advertising pretending to be real articles. This is much more dangerous and insidious in my opinion than an explicit 'sponsored' box with an ad inside.
If this continues, successful publishers will be the ones that push the most profitable corporate propaganda, rather than the ones that effectively inform the public.
I've actually done native advertising and even something as simple as adding a company's logo to an online catalog can pull in $100,000s in ad revenue. You don't need DFP for that, and it's undetectable by ad blockers because it's just an image.
We need to get out of this concept of thinking of advertising as just publishers selling blank squares of "real estate" on their site, and back to the sort of partnership model that existed in print media. A publisher specializes in reaching a certain audience with certain interests; the right advertiser will pay more to reach that audience. You don't need to track individuals to do that or build profiles on them.
Digital advertising made a new pricing model possible: CPM. But just because you have a new way of doing something doesn't mean it's the best way.
Sorry, I'm trying to be as succinct as possible, but I've come to see the advent of ad blockers, intrusive ads, pervasive tracking, client-heavy web practices, and clickbait as a single phenomenon that has its roots in the rise of CPM as the primary pricing mechanism for advertising.
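For readers who haven't priced it out: CPM is cost per thousand impressions, which is exactly why raw impression volume becomes the thing everyone optimizes (numbers invented):

    # CPM revenue scales linearly with impressions, and with nothing else.
    impressions = 2_000_000                     # monthly ad impressions
    cpm_usd = 1.50                              # price per thousand impressions
    revenue_usd = impressions / 1000 * cpm_usd  # -> $3,000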
1) Accept that most websites have to make money to continue to provide you content and that advertising is how most of them do it.
2) Do not visit websites that choose to support their business through advertising
3) Use uBlock and continue to use websites.
But given the prevalence of websites that monetize via advertising a search engine that excludes websites with ads is as useful as a social network without any people.
Like I said in other comments - my gripe is not with advertising. My gripe is with CPM. The trend of optimizing for CPM has created a perverse incentive structure that harms the web ecosystem in many ways, and IMO, only benefits Google.
Then you should join the distributed free p2p search engine: http://yacy.net