Now, if you just dumped your other ad networks and ran everything through us, I bet it would load much faster and that badge might magically disappear..."
We mention that ad scripts are a common type of problematic 3rd-party resource but we use the phrase “3rd-party resource” because there are lots of other common problematic scripts, like A/B tests, social media widgets, and analytics scripts.
Disclosure: I work on web.dev
I sincerely doubt 100% of the HN audience always considers those things.
You might think some of those things are basic, but that doesn't make them useless.
This kind of lack of empathy for new learners can make it incredibly difficult to teach concepts to those who don't have as much knowledge as you may have.
See https://www.youtube.com/watch?v=OkmNXy7er84 for a great video by 3blue1brown on this topic.
My experience with everything Google has touched lately suggests that this wouldn't improve speed any. Gmail and YouTube make continental drift look speedy, and even the search page weighs 1.4MB and takes over a second to load for me (maybe a corporate network issue). That's approaching the size of Doom, just to display a dozen links.
Google doesn't have any moral authority when it comes to bloat.
Edit - for reference HN takes about the same time to load, but that has to cross the pacific ocean whereas google supposedly has local data centers.
(Note that the table gets cut off on narrow mobile screens... I’ll file a bug)
FCP is 20% of the score. The other metrics capture different milestones in the loading experience. For example Total Blocking Time is intended to bring awareness to sites that may look complete but can’t respond to user input (because the main thread is busy running JS).
The metrics overview that we just launched provides more detail about how our metrics were designed to capture the end-to-end loading experience (or at least that’s what we’re working towards): https://web.dev/user-centric-performance-metrics/
So in the case of your site-that-just-loads-a-spinner example, yes it might have a good FCP time, but it’s LCP time probably wouldn’t be that good, and therefore the overall Performance score would be mediocre.
Long story short, I don’t think it’s as easy to game a good Performance score as you might think
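To make the weighting idea concrete, here's a minimal sketch of how a Lighthouse-style Performance score is assembled: each metric gets a 0–1 score, and the overall score is a weighted average. Only the 20% FCP weight comes from the comment above; the other weights and metric names here are illustrative placeholders, not Lighthouse's actual values.

```javascript
// Illustrative weights: FCP at 20% (as stated above), the rest assumed.
const weights = {
  firstContentfulPaint: 0.20,
  largestContentfulPaint: 0.30,
  totalBlockingTime: 0.30,
  speedIndex: 0.20,
};

// Combine per-metric scores (each 0..1) into a 0..100 overall score.
function performanceScore(metricScores) {
  let total = 0;
  for (const [metric, weight] of Object.entries(weights)) {
    total += weight * metricScores[metric];
  }
  return Math.round(total * 100);
}

// A spinner-first page: great FCP, but poor LCP/TBT drag the overall
// score down to mediocre despite the fast first paint.
console.log(performanceScore({
  firstContentfulPaint: 0.95,
  largestContentfulPaint: 0.40,
  totalBlockingTime: 0.30,
  speedIndex: 0.60,
}));
```

This is why gaming FCP alone doesn't buy much: the later milestones still dominate the average.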
That said, if your search page takes over a second, you're right, it might be an issue with your network.
HN is behind Cloudflare, and so also has local DCs
So if I'm on a site that is loading slowly, Chrome will tell me "this site is loading slow". Which is almost like putting up a sign in the middle of the ocean: "you are in the ocean". A person should notice that on their own, right?
Ok, I understand they mean to tell me that it's loading slowly due to the fault of the site, and not due to (let's say) my bad internet or device. But I think most people don't understand those kinds of technicalities, and don't really care to know them. I see it, as someone else put it here, as a "badge of shame".
Moreover, it's kind of regulating the internet. No one gave Google the mandate to regulate the internet.
There are two other problems with this "speed above everything else" approach. First, what is fast depends on how the browser parses websites, so one website can be faster in Firefox but slower in Chrome, and still get a "badge of shame". There's no standard here afaik (perhaps I'm wrong).
Second, the internet is supposed to be a place of equality, where kids, experimental artists and businesses all get the same respect and treatment. But business websites are obviously going to be faster; they have the professional technical resources to ensure that, making everyone else second-rate internet citizens.
They're determined to make AMP the de-facto way of publishing content and this feels like just another way of making AMP more appealing.
Google gains by promoting AMP, and double-gains by incentivizing websites to optimize themselves for Chrome instead of for its competitors.
> Second, the internet is supposed to be a place of equality, where kids, experimental artists and businesses all get the same respect and treatment
I think it's okay not to respect poorly written websites, as long as it's people who get to judge. But when big corporations put regulations in place on what it means to have a good website – I agree with your point, and someday it might be impossible to run your own website, like it already is with mail servers.
You kind of did as soon as you started using chrome.
That ship sailed long ago, through Google Search. When Google Search changes its ranking algorithm, the web moves. They've done this a lot: your site ranks better if it has a better mobile experience, if it's faster, if it's machine readable, if it's in a particular language, if it's served using HTTPS, etc.
Generally, the argument is that Google has been given that mandate by the user: they're prioritising websites that are better for the user, and hence people choose to use Google. Chrome is just another tool in Google's toolbox.
Whether you agree with it is up to you, but it's been happening for a while nonetheless.
Sure, but do they need to understand them for this to be effective? Its effectiveness will come from people being able to say "this website is slow" instead of having to argue about whether it's their computer, their internet, etc... and in the end just being ignored. Not all of them will care to know, but enough will, and they'll be able to use that information to go much further with it.
> I see it, as someone else put here, a "badge of shame".
You can begin by saying that a sign in the middle of the ocean is absurd because it's obvious, yet call it a badge of shame. Everyone knows it's slow, but it only becomes a badge of shame once it's shown?... That's a bit incoherent.
> First, what is fast depends on how the browser parses websites, so one website can be faster in Firefox but slower in Chrome, and still get a "badge of shame".
Does it matter? If it's slow on Chrome, it's slow on Chrome... If someone told you it's currently slow on Chrome, would you just answer: "Well, it's fast on Firefox, so change your browser"? I hope not...
This badge is just a formal proof that it's slow on Chrome.
I'd like to see things move more in this direction. Websites that ask for notification permissions or interrupt me to tell me to sign up for their newsletter are frustrating and awful and I'd love for the systems I use to nudge me away from them.
I leave and never come back to cumbersome websites. Even my grandpa does that, though he just doesn't like the bad experience, he knows nothing about scripts.
With this step, Google treats the internet like it's one of its products, and Google must ensure a good user experience for its products. But no, it's not one of its products; it belongs to everyone.
In their defense, this is what product owners want. If it didn't work into the numbers, they would stop doing it.
This "badge of shame" is a way to make it work into the numbers... Your voice is sadly nearly worthless to your product owner (it's sad that I have to write this...), but a bunch of complaints from users, that may just do it.
Did Google run out of actual features to implement? How about reacting to real user concerns such as controlling the privacy of their personal data on the web? Rhetorical question, I know...
That question is kind of coming in at the wrong angle. Google has identified the speed at which web sites load as a problem, and has determined that it is a problem worth their while to try to solve.
The problem is that all the tracking and advertising is the cause of the slow loading, and Google, given that tracking and advertising pays their bills, can't address that as the fundamental problem. So they're doing a very odd-looking Twister-Limbo set of actions to reframe what the problem is: speed of page loading.
Google are attempting to treat the symptoms, because they profit from the cause.
Google needs to solve the speed symptom before someone else solves the ad and tracking cause (ad blockers, PiHole, etc.) in a more fundamentally non-technical-user accessible way.
Making a Rails site pleasantly fast was more work than it should've been. Rails didn't make it easy to cache page components, didn't make it easy to cache the entire page either, and made it easy to inadvertently wait on the database. And here's the key: this doesn't seem to be regarded as a big problem. Rails developers and users don't regard slowness as a big problem. Is that unusual? Do Django, Magnolia, Magento have a different culture? I haven't noticed (but I might not have). Assuming not, the root cause is an inattention to making sites pleasantly fast, and that inattention has simply allowed advertising to have the same problem as the rest of the site's software.
This won't win me back from Firefox, but I don't consider it a bad move.
But what loads fast is web pages that are made to load fast. If they are able to load fast despite user tracking (which is entirely possible), then it is simply a different issue, and this should not be flagging them as slow. If they are flagged, it should be by something different.
If Google is going to implement this (which I support in general because so many sites are painfully slow because there aren't enough incentives to make them fast), what they should be measuring is speed of loading, independent of whether it is slow because of user tracking, slow because the developers do other obnoxious stuff (fill it with click bait images, have videos autoplay, etc), use frameworks sloppily, put too much contents on the page, have a high res background image or video, have a crappy server or use a slow backend language, or whatever it is.
While Google does a lot of user tracking, I think the normal way they do it is actually pretty fast. Fault them for user tracking, but if they are able to allow site owners show Google ads without slowing down the page load significantly, don't lump them in with all these other ones that both track users, as well as make the site a painful user experience.
Google lost a ton of good will by making Gmail a laughably bad experience on load.
As far as I could see all the complicated front-end stuff added zero to the user experience. Ok, so yeah, you needed a page load to view a message or whatever, but it was quick and hardly noticeable. All they needed to do was freshen up the stylesheet and call it a day. But, I guess, all those developers and product managers needed something to do in between building a new messaging app and killing the previous one.
The issue is not with the browsers, it's with the damn stupid sites out there. 3G is pretty much useless out there, let alone 2G.
Gmail scores a 50% on Google's own PageSpeed. And that's just the login screen. If Google can't even meet their own metrics and standards then they have no place telling other people what they should be doing with their websites.
The fact that they are failing their own tests, if anything, shows that the tests don't discriminate. It's defeatist to say that we shouldn't strive for faster websites just because some webpage fails. I really like that the dumb f*s making 40MiB pages are now finally punished.
That is to say, I don't think e.g. Trello would really care if their site was "slow to load" either.
You might want to take a look at Brave, which uses Chromium underneath, has Tor built-in, is compatible with Chrome’s plugins and even has a business model based on paying people to view ads only if they opt-in: https://brave.com/features/
Yeah, it's been years and years since I did much browsing with an ad blocker off.
The only trouble is you'd have to find some way to keep developers from using the DOM as extra memory. But the capabilities of Web scripting languages, outside perhaps some very strict and explicitly enabled on a case-by-case basis sandbox, really ought to be limited more anyway.
Or is this just a way for Google to kill off the progress bar, too?
If this gets site owners to work more on loading speed so they can get a badge that's great. I used to work on mod_pagespeed and one of the big problems was that publishers just didn't care that much about loading speed.
(Disclosure: I work for Google)
Can I expect Google to do the same about data that may be sent by Chrome to Google without users realizing, just so users can make informed decisions?
> Both Android and Chrome send data to Google even in the absence of any user interaction. Our experiments show that a dormant, stationary Android phone (with Chrome active in the background) communicated location information to Google 340 times during a 24-hour period, or at an average of 14 data communications per hour. In fact, location information constituted 35% of all the data samples sent to Google.
The US Federal Reserve Bank does not measure inflation or unemployment. Rather, the US Department of Labor, a separate organisation (and a branch of government rather than a freestanding entity), computes both measures.
This is the principle of division of responsibility and measurement or judgement. You cut, I choose.
Google is both measuring and rewarding website performance, as well as designing and distributing the principal tools that benefit from both choices. That's an extreme locus of power. And, history shows, generally a Bad Idea which Ends Poorly.
Also, given their market share as a search engine, they get to decide which websites you will be able to "discover", additionally incorporating their performance metrics into their website ranking. So not only will a website get a "slow badge", it will also be downranked into oblivion.
I'm not saying it's a walled garden (that's a more extreme situation on the continuum between open and closed), I was just using the phrase to illustrate my point that the end-user experience isn't the only factor to consider.
I think the principle is a fine one (though it is a bit odd considering the web's main source of slowness is ads/tracking tech, and... that's Google's main business). But in practice I'd bet that - like most of Google's broad-strokes efforts - there will be many edge cases/outliers where people get penalized unjustly, and there won't be an appeals process.
"Better" as prescribed by who? At this point, I'm under the impression that if you're not implementing AMP for content, Google is going to rank you lower and now publicly shame you to users. They _say_ this is not the case, but it's impossible to ignore now that they're introducing naming and shaming. It just seems like features way outside the scope of what a web browser should be providing to users.
Editing to add my other thought:
I've built websites that routinely score horribly on Google's proprietary PageSpeed Insights despite fixing everything within my power (usually Google's own analytics scripts score badly), but also score very high on every other reasonable industry test of similar nature. It's hard to convince me that they alone should be an arbiter for this type of feedback given to users.
The previous version of PageSpeed Insights was open source (https://github.com/pagespeed/page-speed) and the current version is a wrapper around Lighthouse which is also open source (https://github.com/GoogleChrome/lighthouse).
1. No one trusts Google to be an impartial judge of speed.
2. Increasingly, Google is inserting itself as a non-neutral third party between the end user and creator/developer. Power lies asymmetrically with Google, and threatens both creators & users.
If anyone from Google is reading: this incremental but definite appearance of a power grab by Google will only draw more regulation.
We should really just stop using Google so much. You go first, I can't be bothered to switch!
That's a recently very popular idea, but in this general form it is simply untrue. It depends on the specifics of any regulation.
We can regulate oligopolists in a way that doesn't affect anyone else.
Google wants there to be lots of web sites, lots of web pages. So using someone's web site should be more convenient than using that someone's facebook page. How to achieve that? My guess is that Google has done some research into that, and a focus group or whatever named slowness (or an expectation of slowness) as a disadvantage of the web compared to facebook.
It's becoming increasingly common to have all sorts of pop-ups blocking a big part of the screen at best or adding an overlay across the entire body.
If only I knew that my click would result in that kind of monstrosity, I wouldn't have clicked in the first place. So maybe it would be more useful to show an indication of how many things we need to close before we get to the content. I think that something like this would also help get us to "a faster web".
But so far, nothing's been done yet. This is kind of weird to me, because one of the reasons why Firefox came to power (and took a big chunk of IE's market) was that it had a pop-up blocker. I don't understand why they aren't doing more to block inline pop-ups now.
Presumably a site that is using the dark patterns you've described would have a low SES. Although if you're interacting with the site in order to get past the ads I wonder if that would distort your SES? Then again, if the site is doing annoying things, I'd probably bail quickly, so maybe SES can work after all...
Disclosure: I work on Google Web DevRel. I don't know if there are any plans to incorporate SES into this badging initiative. I was just reminded of SES last night because the Periodic Background Sync API uses it, and it seems relevant to this problem.
That seems to cover such things.
Yeah they track you and such, not denying that or supporting that, but Google generally pushes back against the sort of bad experience you find at so many sites.
If performance was the criteria by which they made decisions then they would probably bake ad-blocking directly into chrome, since ads/tracking is one of the leading reasons for poor performance.
Instead we are left with PR pieces and false features. It feels like they are creating a lot of noise to drown out the signal.
That's especially obvious when using nested VPN chains and Tor. Because latency can exceed 500 msec. Sites like HN typically load in a second or two. But sites with ads and tracking take 10 seconds or more.
Sites usually load slowly because of ads and trackers, and Chrome's only telling me what I'm already seeing with my own eyes. Plus, I already decided to visit the site, and it's probably quicker to finish letting it load than to find an alternative source, if one exists at all.
Whereas Search would seem much more useful, since it's where I'm already presented with alternative links, and can actually affect which one I click.
I know Search already says they de-rank slower sites... but obviously they're still appearing, especially for the "long tail" of searches. So badges still seem like a win there.
(But the whole concept is sound, I think, because developers often want to build fast sites but management won't dedicate the resources. But if management sees Google is giving them a big red mark, that's suddenly something very easy for them to understand and allow to be fixed.)
We're hearing about this first from Chrome, but that doesn't mean it will happen in Chrome first. I suspect that badges of shame will come to the search result pages long before they actually show up in Chrome. Say what you will about Google's policies, but the Chrome team is really pretty good about announcing this sort of thing well in advance. Look at their multi-year rollout schedule for the "not secure" badging in the URL bar.
Search results, on the other hand, will just update one day with no warning.
Exactly what does "high-quality experiences" mean? Penalize sites with bad colors? Sites that are artsy and an algorithm can't understand? Sites that have been repositories of information for longer than Google has been around? Sites that don't meet Google's worldview? That disagree with Google's politics?
Another part of me worries that this will lead to a cobra effect, where people optimize a page's first load so Chrome says the page is fast, while withholding the main content of the page for a delayed load, leading to even more site bloat. Identifying when a page has actually loaded will be tricky.
Also: it's not a "police state" approach. Google loves a good police state.
However, do you think more could be done to make the default MySQL/PHP type configurations faster out of the box? If it is possible to speed up the default config of most websites on the internet, perhaps there is an easy win to be had.
I'm pretty sure that most slow websites aren't slow because their server-side rendering takes a long time. This seems to be mostly a front-end oriented thing (especially directed at pages that load MBs of JS); basically if the page doesn't adhere to all the front-end optimisation guidelines.
The real deciding factor is the amount of dependencies that have to be loaded. This includes JS, but usually the JS for ads/tracking is far larger than the JS for the actual UI, even on fully client-side-rendered sites.
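Measuring that dependency weight starts with splitting a page's requests into first- vs third-party. A minimal sketch (my own, not from any particular tool): compare each request's hostname against the page's host. Real audits match on registrable domain (eTLD+1) rather than exact hostname, so treat this as a simplification; the URLs below are made up.

```javascript
// Partition request URLs into first-party (same host as the page)
// and third-party (everything else). Exact-host comparison is a
// simplification of the eTLD+1 matching real tools use.
function partitionRequests(pageUrl, requestUrls) {
  const pageHost = new URL(pageUrl).hostname;
  const firstParty = [];
  const thirdParty = [];
  for (const u of requestUrls) {
    (new URL(u).hostname === pageHost ? firstParty : thirdParty).push(u);
  }
  return { firstParty, thirdParty };
}

const { thirdParty } = partitionRequests('https://example.com/post', [
  'https://example.com/app.js',
  'https://cdn.tracker.example/analytics.js',
  'https://ads.example.net/slot.js',
]);
console.log(thirdParty.length); // 2
```

In a browser you'd feed this the entries from `performance.getEntriesByType('resource')`; summing their transfer sizes per bucket makes the ads/tracking share visible.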
This looks to me like a positive effort. I'm not thrilled with overlays like this, and I'm not thrilled with baking this kind of stuff into the browser. It feels over-engineered and weird.
But, I think it's a better direction than AMP.
There are a lot of ways this could go bad, but very cautious thumbs up from me.
Usually loads slow.
This also seems to still be in the experimentation phase; it's not released as far as I can tell, and may never actually make it into Chromium, based on:
> In the future, Chrome may identify sites that typically load fast or slow for users with clear badging. This may take a number of forms and we plan to experiment with different options, to determine which provides the most value to our users.
Regardless, I agree. This probably should've been caught early.
I've never seen it load so fast I couldn't click the link. Including on Google Fiber.
Then you can set that to your default view. Ta-da, gmail that's as fast as HN or Craigslist or whatever.
Sure, it won't alert you when new emails come in, but normal Gmail eats so much memory I hate to leave it open, anyway.
Am I misunderstanding what Google is trying to do? I'm not seeing the use case.
You don’t want your customers to see that badge while looking for your Product X 3000 Max.
However, it might be nice if it makes some sort of objective measure.
They're trying to normalize Google's approval process on users so that they can leverage this feature for a profit.
But all the stuff that makes the Web slow and pointless to use is how Alphabet makes money.
So as long as you drink the Google-Aid, you're doomed to a slow web of garbage.
Pre-ajax online maps? Almost unusable. I certainly wouldn't want to return to those days.
Use an email and newsreader for email. Use a full mapping suite to download specialized map data. Don't force a web browser to shim into those niche jobs.
Old adage I learned with computers: Do One Thing, Well. That's what each program should be. One thing.
AMP is like ten times as slow for me. On Firefox on Android, a bug in AMP makes the pages not scrollable. I have to press the tiny little icon in the top right, then press the URL again, and then wait for the page to load an additional time.
Edit to add some stats:
94% of sites include at least one third-party resource.
76% issue a request to an analytics domain.
Median page requests content from 9 unique third-party domains.
Google owns 7 of the top 10 most popular third-party calls.
100kB of JS, 100kB of fonts. The images get a pass.
There's jQuery in there. It's a blog post. wat?
Just send me the text.
Yeah, pretty much. If web sites would stop stuffing giant JS frameworks + 3 trackers + 3 ad networks into every page, we could still be using something like an iPhone 4 now.
As people pointed out, this will make AMP even more attractive, driving more traffic away to Google...
Also, it may impact sites hosted in cheap, faraway datacenters. Like my user being in Brazil, and my server being in Virginia: suddenly, my site is considered slow by Chrome, while my rich competitor who hosts in São Paulo gets the "fast" badge...
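That geography penalty is physics, not sloppiness. A back-of-envelope sketch: light in fiber travels at roughly 200,000 km/s (about 2/3 of c), and a first page load needs several round trips (DNS, TCP, TLS, HTTP). The distances and round-trip count below are rough illustrative figures, not measurements.

```javascript
// Lower bound on network latency from distance alone:
// (2 * distance / fiber speed) per round trip, times the trip count.
function minLatencyMs(distanceKm, roundTrips) {
  const fiberKmPerMs = 200; // ~200,000 km/s in fiber
  return (2 * distanceKm / fiberKmPerMs) * roundTrips;
}

// São Paulo -> Virginia, ~7,600 km great-circle, 4 round trips:
console.log(minLatencyMs(7600, 4)); // 304 (ms floor, before any server work)
// São Paulo -> a local datacenter, ~50 km, 4 round trips:
console.log(minLatencyMs(50, 4));   // 2
```

So an identical site served from another continent starts ~300 ms behind a locally hosted one before a single byte of HTML is rendered.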
I think this kind of classification creates extra confusion and impact, at no tangible benefit to anyone that isn't Google.
As a user, I already know if the page is loading slowly.
Pity about that for Chrome, hey.
  % curl https://web.dev/fast/ 2> /dev/null | grep -I amp
  Terms & Privacy
  </a>, and code samples are licensed under the
- What speed metrics?
- How would they be gathered?
If Web Performance was so single dimensional to classify it in this way, it would have been solved a lot better by now.
Come up with a "privacy" or "non-tracking" badge.
Identify free-standing / independent / non-AMP sites.
Note simple and JS-free designs, or graceful degradation.
Well-formed page structures (microformats, HTML5), lack of obfuscated elements (JS, CSS), accessibility.
An open and distributed Web index standard.
Pretty much all of these are generally useful, and disadvantageous to Google.
There are a LOT of people who have websites that depend on software they don't directly control; the best example might be Wordpress users with plugins. I myself have built a number of hobby websites that I just don't have time to update based on Google's recommendations. For example, according to PageSpeed I need to "Serve images in next-gen formats", and I just don't have the ability to do that right now (the underlying image processing libraries don't support it) without a serious time commitment.
So I'm worried that a lot of smaller sites that don't have the resources to keep up with Google's requirements are going to get a "badge of shame".
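For what it's worth, if you can ever generate WebP versions offline, serving them doesn't require touching the rest of the pipeline: the `<picture>` element lets each browser pick a format it supports and falls back to the original file everywhere else. A sketch with hypothetical filenames:

```html
<!-- Browsers that support WebP use the first <source>;
     everything else falls back to the plain <img>. -->
<picture>
  <source srcset="photo.webp" type="image/webp">
  <img src="photo.jpg" alt="Photo">
</picture>
```

The JPEG stays the canonical asset, so nothing breaks for tools or browsers that don't know about the newer format.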
Also, Chrome doesn't collect or aggregate performance data for sites without a decent amount of traffic. They are not going to label your hobby site as slow, as there is not enough data to do so. https://developers.google.com/speed/docs/insights/v5/about#f...
I think you are also underestimating the number of hobby sites that are the primary source of information for their niche and have decent traffic, many of them running Wordpress with tons of horrible plugins. Yes, I know, not ideal. But these people don't have many alternatives.
It doesn't take much in the way of "resources" to make a webpage that is "fast" by Google's definition.
The business can't easily and explicitly monetize "fast" to the same degree it can bloated features and ads/tracking.
Before you downvote my comment explain why the above scenario is not plausible.
This situation differs from, e.g., YouTube, because they do not themselves host the content, and are not subject to copyright laws related to it.
Also: Google is made of human beings who have already protested against, for example, censoring Google within China.