Moving towards a faster web (chromium.org)



Google Capone: "Hey, nytimes.com, your site loads awful slow. We're gonna have to put this badge of shame on it for everyone to see.

Now, if you just dumped your other ad networks and ran everything through us, I bet it would load much faster and that badge might magically disappear..."


Actually, some of the things that Lighthouse (Chrome's speed-test tool) complains about the most are a direct result of using Google's AdSense.


We’ve also got a section on web.dev focusing on the specific problems that 3rd-party resources can create and how to fix them: https://web.dev/fast/#optimize-your-third-party-resources

We mention that ad scripts are a common type of problematic 3rd-party resource but we use the phrase “3rd-party resource” because there are lots of other common problematic scripts, like A/B tests, social media widgets, and analytics scripts.

Disclosure: I work on web.dev


How does Google work internally between teams on these kinds of issues? For instance, do you sometimes work with the people developing Google apps? Because it seems they've gotten slower and slower as time goes on: if YouTube were a porn tube, it would be dead already, because it's much, much slower than the competition.


Will web.dev finally do what they preach and fix Google's own websites before penalising everyone else?


Ok, you got a click out of me....

This sort of basic advice (use browser features to defer loading, and plead with the masters not to ask for so much JavaScript) is obviously already considered by the HN audience and so this link is basically useless.


I strongly disagree with this comment.

I sincerely doubt 100% of the HN audience always considers those things.

You might think some of those things are basic, but that doesn't make them useless.

This kind of lack of empathy for new learners can make it incredibly difficult to teach concepts to those who don't have as much knowledge as you may have.

See https://www.youtube.com/watch?v=OkmNXy7er84 for a great video by 3blue1brown on this topic.
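
For the new learners in question, here's a minimal sketch of the "browser features to defer loading" being dismissed above, using standard DOM APIs (TypeScript; the selectors and script path are made-up examples, and in real pages you'd normally put these attributes directly in the markup):

    // Defer offscreen images with the native lazy-loading attribute
    // (shipped in Chrome 76+): the browser delays fetching them until
    // the user scrolls near them.
    document.querySelectorAll<HTMLImageElement>('img[data-below-fold]')
      .forEach((img) => { img.loading = 'lazy'; });

    // Dynamically injected scripts never block HTML parsing; setting
    // async = false additionally preserves execution order relative
    // to other injected scripts.
    const s = document.createElement('script');
    s.src = '/js/non-critical-widget.js'; // hypothetical path
    s.async = false;
    document.head.appendChild(s);

In plain HTML, the equivalents are just loading="lazy" on the img tag and defer on the script tag.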


It also throws penalties for using Google Tag Manager and Google Analytics.


Kind of a good sign that Google's internal teams actually work independently and don't give special exemptions to other units of the company?


Way back, the Chrome website got penalized by Google (the search engine) for disallowed SEO tricks ;)


The way that the Google Search unit gives preference to Google AMP unit in its mobile results? Yes, I suppose it's a good sign that they don't also give that sort of exemption here, but I would consider that expected, not praiseworthy.


That's how it works now, but they can fix it internally whenever they feel like it...


I actually seriously have my doubts about that.


> Now, if you just dumped your other ad networks and ran everything through us, I bet it would load much faster and that badge might magically disappear...

My experience with everything Google has touched lately suggests that this wouldn't improve speed any. Gmail and YouTube make continental drift look speedy, and even the search page weighs 1.4MB and takes over a second to load for me (maybe a corporate network issue). That's approaching the size of Doom, just to display a dozen links.

Google doesn't have any moral authority when it comes to bloat.

Edit: for reference, HN takes about the same time to load, but that has to cross the Pacific Ocean, whereas Google supposedly has local data centers.


I think the point being made is that it's no longer actually about the speed, it's about the 'badge of shame'. The analogy being used is accurate, in that 'protection money' isn't, and was never, about protection.


Part of the problem is that they place high importance on the first contentful paint (aka FCP) above all else. They're not very much looking at how it performs overall, mostly just how long it takes for a user to see the first bit of content. If that were a bit of text that said "loading" with a spinner that goes for a minute or two, they may consider that fast.


This section shows how Lighthouse calculates your overall Performance score: https://web.dev/performance-scoring/#lighthouse-6

(Note that the table gets cut off on narrow mobile screens... I’ll file a bug)

FCP is 20% of the score. The other metrics capture different milestones in the loading experience. For example Total Blocking Time is intended to bring awareness to sites that may look complete but can’t respond to user input (because the main thread is busy running JS).

The metrics overview that we just launched provides more detail about how our metrics were designed to capture the end-to-end loading experience (or at least that’s what we’re working towards): https://web.dev/user-centric-performance-metrics/

So in the case of your site-that-just-loads-a-spinner example, yes, it might have a good FCP time, but its LCP time probably wouldn't be that good, and therefore the overall Performance score would be mediocre.

Long story short, I don't think it's as easy to game a good Performance score as you might think.
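
If you want to see two of those milestones on your own pages, here's a rough sketch using the standard PerformanceObserver API (TypeScript; this is field measurement in the browser, not the lab numbers Lighthouse computes):

    // First Contentful Paint arrives as a 'paint' entry.
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        if (entry.name === 'first-contentful-paint') {
          console.log('FCP:', Math.round(entry.startTime), 'ms');
        }
      }
    }).observe({ type: 'paint', buffered: true });

    // Largest Contentful Paint emits a new candidate entry each time
    // a bigger element renders; the last one before user input wins.
    new PerformanceObserver((list) => {
      const entries = list.getEntries();
      console.log('LCP candidate:',
        Math.round(entries[entries.length - 1].startTime), 'ms');
    }).observe({ type: 'largest-contentful-paint', buffered: true });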

Disclosure: I work on web.dev


The only PC where I saw Gmail being fast was a Ryzen 9 3900x build, I suspect that if Google devs were given shittier PCs they'd build faster products that would appeal more to the average user.


I run a Core i5-2500K from 2011 and Gmail runs just fine. Far better than Thunderbird or Outlook ever did, but it does take a lot of memory.


I have a Google-issued corporate workstation with 64 GB of RAM and 12 cores. Gmail is really, really slow. Like so slow that if I have to refresh it I'll go get coffee. But I'm on Firefox, and so "I Am Not The User" and all that.


IMO the best way to use Gmail is to open it, pin it, and load it only once a day. It seems optimised for this use case.


There's a big difference between "done loading" and "appears to be done loading". Google is reportedly very cautious about the latter, and pulls some hijinks to appear faster than it is. Frankly, I'd be happier if more sites paid a similar level of attention.

That said, if your search page takes over a second, you're right, it might be an issue with your network.


> that has to cross the pacific ocean whereas google supposedly has local data centers

HN is behind Cloudflare, and so also has local DCs


I think you misspelled AMP.


Their site is pure hidden-text spam (aka paywall) and they still give them views. I don't know why you'd say that.


They would be.... an extremely unwise website to target with that kind of extortion.


Oh, definitely don't start with NYT. Come back to them once there's little other choice.


I'm trying to fully understand it.

So if I'm on a site that is loading slowly, Chrome will tell me "this site is loading slowly", which is almost like putting a sign in the middle of the ocean saying "you are in the ocean". A person should notice it on their own, right?

Ok, I understand they mean to tell me that the slow loading is the fault of the site, and not of (let's say) my bad internet or device. But I think most people don't understand those kinds of technicalities, and don't really care to know them. I see it, as someone else put it here, as a "badge of shame".

Moreover, it's kind of regulating the internet. No one gave Google the mandate to regulate the internet.

There are two other problems with this "speed above everything else" approach. First, what is fast depends on how the browser parses websites, so one website can be faster in Firefox but slower in Chrome, and still get a "badge of shame". There's no standard here afaik (perhaps I'm wrong).

Second, the internet is supposed to be a place of equality, where kids, experimental artists, and businesses all get the same respect and treatment. But business websites are obviously going to be faster; they have the professional technical resources to ensure that, making everyone else second-rate internet citizens.


If Google didn't own AMP then I might be less critical of this change. But they do, and I bet they don't rate any AMP pages served from the Google cache as slow.

They're determined to make AMP the de-facto way of publishing content and this feels like just another way of making AMP more appealing.


You're right. When you judge a move you estimate the intention of whoever made it, and how much they'll personally gain from it.

Here, Google gains by promoting AMP, and double-gains by incentivizing websites to optimize for Chrome instead of for its competitors.


I haven’t seen an AMP page for quite some time, it seems that strategy is pretty much a failure.


I still see them quite a lot on my iPhone 8... and it's very annoying


I think it's big companies' websites which are bloated and slow compared to "kids and experimental artists".

> Second, the internet is supposed to be a place of equality, where kids, experimental artists and businesses all get the same respect and treatment

I think it's okay not to respect poorly written websites, as long as it's people doing the judging. But when big corpos put regulations in place on what it means to have a good website, I agree with your point: someday it might be impossible to run your own website, like it is now with mail servers.


I think this might be an attempt to push AMP. Put the "badge of shame" on slow websites, until ~everyone not on AMP has the badge.


> Moreover, it's kind of regulating the internet. No one gave Google the mandate to regulate the internet.

You kind of did as soon as you started using chrome.


Yeah, if we don't want Google to regulate the internet, maybe we should stop using Chrome and Google Search. Maybe also Android, or at least use one of the de-googlified androids.


> No one gave Google the mandate to regulate the internet.

That ship sailed long ago, through Google Search. When Google Search changes its ranking algorithm, the web moves. They've done this a lot: your site ranks better if it has a better mobile experience, if it's faster, if it's machine readable, if it's in a particular language, if it's served using HTTPS, etc.

Generally, the argument is that Google has been given that mandate by the user: they're prioritising websites that are better for the user, and hence people choose to use Google. Chrome is just another tool in Google's toolbox.

Whether you agree with it is up to you, but it's been happening for a while nonetheless.


I'd be okay with it if it flagged up what proportion of the page is content and how much is unused code, advertising, and tracking...


> But I think that most people don't understand those kind of technicalities, and don't really care to know them.

Sure, but do they need to understand them for this to be effective? Its effectiveness will come from people being able to say "this website is slow" instead of having to argue about whether it's their computer, their internet, etc., and in the end just being ignored. Not all of them will care to know, but enough will, and they'll be able to use that information to go much further with it.

> I see it, as someone else put here, a "badge of shame".

You begin by saying that a sign in the middle of the ocean is absurd because it's obvious, yet you also call it a badge of shame. Everyone knows the site is slow, but it only becomes shameful once it's shown? That's a bit incoherent.

> First, what is fast depends on how the browser parses websites, so one website can be faster in Firefox but slower in Chrome, and still get a "badge of shame".

Does it matter? If it's slow on Chrome, it's slow on Chrome... If someone told you it's slow on Chrome, would you just answer, "Well, it's fast on Firefox, so change your browser"? I hope not...

This badge is just a formal proof that it's slow on Chrome.


Why should everything get the same respect and treatment? A half broken personal blog that has no accessibility features is a worse website for a considerable portion of the population.

I'd like to see things move more in this direction. Websites that ask for notification permissions or interrupt me to tell me to sign up for their newsletter are frustrating and awful and I'd love for the systems I use to nudge me away from them.


To play devil's advocate here, this could mean that sites will stop loading 50 different scripts just to show an animated newsletter sign-up box or some other fancy shit. I mean, for the love of God, a lot of modern web sites look like an amusement park. You install uMatrix and everything is considerably faster. Web devs have really dropped the ball. I don't disagree with the main sentiment, though, that this isn't Google's problem to fix or even their right to intervene, but if it's fixed even as a side effect, I'm fine with it.


I see your point, but the internet is a free market where bad websites will be deserted due to market forces, not due to regulation by one company.

I leave and never come back to cumbersome websites. Even my grandpa does that, though he just doesn't like the bad experience; he knows nothing about scripts.

With this step, Google treats the internet like it's one of its products, and Google must ensure a good user experience for its products. But no, the internet is not a Google product; it belongs to everyone.


Adding more information doesn't make a market less free.


>Web devs have really dropped the ball

In their defense, this is what product owners want. If it didn't work into the numbers, they would stop doing it.


> If it didn't work into the numbers, they would stop doing it.

This "badge of shame" is a way to makes it works into the numbers... Your voice is sadly nearly worthless to your product owner (that's so sad that I write this...) but a bunch of complains from people, that may just do it.


Net neutrality is only for ISPs, after all.


This reads to me as an attempt from Google to further balkanize the web. This looks a lot like the "blue bubbles" effect from Apple iMessage.

Did Google run out of actual features to implement? How about reacting to real user concerns such as controlling the privacy of their personal data on the web? Rhetorical question, I know...


"Did Google run out of actual features to implement?"

That question is kind of coming in at the wrong angle. Google has identified the speed at which web sites load as a problem, and have determined that is a problem worth their while to try and solve.

The problem is that all the tracking and advertising is the cause of the slow loading, and Google, given that tracking and advertising pays their bills, can't tackle that as the fundamental problem. So they're doing a very odd-looking Twister-Limbo set of actions to reframe what the problem is: speed of page loading.

Google are attempting to treat the symptoms, because they profit from the cause.

Google needs to solve the speed symptom before someone else solves the ad and tracking cause (ad blockers, PiHole, etc.) in a more fundamentally non-technical-user accessible way.


I disagree strongly with your analysis.

Advertising is a contributor, but it's far from being the cause. AFAICT there are five causes, namely slow DNS lookups due to inappropriately low TTLs, a lack of caching both serverside and clientside, page bloat, an overdose of javascript and an overdose of tracking/advertising (which contributes to bloat and javascript).

Making a Rails site pleasantly fast was more work than it should've been. Rails didn't make it easy to cache page components, didn't make it easy to cache the entire page either, and made it easy to inadvertently wait on the database. And here's the key: this doesn't seem to be regarded as a big problem. The Rails developers and users don't regard slow speed as a big problem. Is that unusual? Do Django, Magnolia, Magento have a different culture? I haven't noticed (but I might not). Assuming not, the root cause is an inattention to making sites pleasantly fast, and that inattention has simply allowed advertising to have the same problem as the rest of the site's software.
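
For what it's worth, the serverside half of the caching story is often just a response header; here's a minimal sketch in Node/TypeScript (purely illustrative, since the stack under discussion was Rails):

    import * as http from 'http';

    http.createServer((req, res) => {
      // Let browsers and intermediate caches reuse this response for
      // an hour instead of hitting the app (and the database) on
      // every single request.
      res.setHeader('Cache-Control', 'public, max-age=3600');
      res.setHeader('Content-Type', 'text/html; charset=utf-8');
      res.end('<h1>Expensive-to-render page, now cacheable</h1>');
    }).listen(8080);

The hard part, per the above, isn't the mechanism; it's a culture that actually does this for every component and page.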


At least based on the picture, the 'slow' badge should only show during the TTFB, i.e. before any paint occurs. Since Google Ads loads after the paint, websites likely won't have mislabeling issues.


The vast majority of people that I know who don't use adblock do so because they are A) afraid that it will slow down their web browsing, or B) believe that it magically generates money for content creators.


Speaking as a real user, let me assure you that website speed definitely falls under real user concerns. I don't know how much I'd use this indicator, but loading speed matters to me. Loading speed matters a lot.

This won't win me back from Firefox, but I don't consider it a bad move.


You know what loads fast? Web pages. What doesn't? User tracking applications that happen to host some content. :(


If you are against user tracking, great, you've got many on your side.

But what loads fast is web pages that are made to load fast. If they are able to load fast despite user tracking (which is entirely possible), then it is simply a different issue, and this should not be flagging them as slow. If they are flagged, it should be by something different.

If Google is going to implement this (which I support in general, because so many sites are painfully slow and there aren't enough incentives to make them fast), what they should be measuring is speed of loading, independent of whether the site is slow because of user tracking, slow because the developers do other obnoxious stuff (fill it with clickbait images, have videos autoplay, etc.), use frameworks sloppily, put too much content on the page, have a high-res background image or video, have a crappy server, use a slow backend language, or whatever it is.

While Google does a lot of user tracking, I think the normal way they do it is actually pretty fast. Fault them for user tracking, but if they are able to allow site owners show Google ads without slowing down the page load significantly, don't lump them in with all these other ones that both track users, as well as make the site a painful user experience.


I don't want to stonewall progress. But the evidence is that any savings won here will be wasted by the same parties that are bringing the savings.

Google lost a ton of good will by making Gmail a laughably bad experience on load.


When I had to use Gmail for work I switched back to the plain HTML interface. It was startlingly fast.

As far as I could see all the complicated front-end stuff added zero to the user experience. Ok, so yeah, you needed a page load to view a message or whatever, but it was quick and hardly noticeable. All they needed to do was freshen up the stylesheet and call it a day. But, I guess, all those developers and product managers needed something to do in between building a new messaging app and killing the previous one.


Just switched over to the HTML view, and wow. I'm never going back.


Then demand smaller sites; nothing will make for a faster web if web pages are larger than operating systems from just a few years ago. Do you know how ridiculous it is to have web pages that are tens or hundreds of megs?

The issue is not with the browsers, it's with the damn stupid sites out there. 3G is pretty much useless out there, let alone 2G.


How do you demand smaller sites? To whom do you plead? Why will they listen to you?


You demonstrate your intent by paying for content subscriptions rather than visiting sites entirely supported by online advertisements.


There is no paid site that matches the quality of HN. Same goes for most social networks: what gives them value is contributions, and contributions only seem to happen if access is free. Otherwise you don't reach a high enough density of people.


It should’ve been obvious I was talking about news sites. Even if I wasn’t, you assumed I was talking about social media without any supporting basis.


This is an attempt by Google to influence the bloated websites out there, rather than a browser feature intended to attract users. It's probably mildly helpful as a user to know that a site is slow in general (as opposed to just slow for you), but it's a big incentive for a website owner to speed up their site if this badge of shame appears for a majority of users.


Except of course unless it's one of Google's sites...

https://developers.google.com/speed/pagespeed/insights/?url=...

Gmail scores a 50% on Google's own PageSpeed. And that's just the login screen. If Google can't even meet their own metrics and standards then they have no place telling other people what they should be doing with their websites.


> If Google can't even meet their own metrics and standards then they have no place telling other people what they should be doing with their websites.

The fact that they're failing their own tests shows, if anything, that the tests don't discriminate. It's defeatist to say that we shouldn't strive for faster websites just because some webpage fails. I really like that the dumb f*s making 40MiB pages are finally being punished.


Yes, but Gmail doesn't really have to compete in Google Search, does it? So, in their eyes, it's OK for Gmail to score low because it has its own link in basically every navigation toolbar/header Google owns. This won't improve the speed at which Gmail loads, whereas all non-Google websites will have to improve their speed.


It's not that Gmail has a dominant position and so doesn't need to care about its rating - Gmail is an app not a page and it's less important for apps to load quickly than pages. Apps tend to be long lived and so a slow startup time is tolerable, whereas one is constantly opening new pages.

That is to say, I don't think e.g. Trello would really care if their site was "slow to load" either.


This is an attempt by Google to force everyone into AMP and nothing else.


You might be surprised how fast web pages load when the browser you’re using blocks the tracking scripts, especially on a mobile device. Brave runs on iOS and Android; the iOS version with the Brave Rewards wallet is expected to be released imminently.

You might want to take a look at Brave, which uses Chromium underneath, has Tor built-in, is compatible with Chrome’s plugins and even has a business model based on paying people to view ads only if they opt-in: https://brave.com/features/


I have never seen a site that loaded hundreds of megs of data. That seems a little exaggerated.


You've been lucky and didn't encounter sites loading 1080p videos as "hero images". Or perhaps you haven't browsed a news site with ad blocker off?


> Or perhaps you haven't browsed a news site with ad blocker off?

Yeah, it's been years and years since I did much browsing with an ad blocker off.


Memory size is a problem, too. Even a small amount of Javascript can allocate a ton of memory. I'd love to see a more memory-efficient language replace JS, then a reasonable but lowish memory limit put in place. Say, 8MB, which still seems generous to me.

The only trouble is you'd have to find some way to keep developers from using the DOM as extra memory. But the capabilities of web scripting languages really ought to be limited more anyway, outside perhaps some very strict sandbox explicitly enabled on a case-by-case basis.


JS is not necessarily heavy. For example, the Espruino is a platform running JS on microcontrollers with less than 100KB of RAM: https://www.espruino.com/


Do you really need your browser to tell you a site is loading slowly? You're staring at the screen; don't you know if it's slow just by existing and moving forward through time?

Or is this just a way for Google to kill off the progress bar, too?


Based on my understanding, Google would be communicating _why_ the site is loading slowly (the network, or the site itself), which isn't something that would be immediately obvious.


If I'm staring at the screen waiting for a page to load, a loading indicator can help me figure out if I should keep waiting longer or give up. That's very useful!

If this gets site owners to work more on loading speed so they can get a badge that's great. I used to work on mod_pagespeed and one of the big problems was that publishers just didn't care that much about loading speed.

(Disclosure: I work for Google)


Based on what this feature is and the scenario you described, I think it may disincentivize site owners from speeding up slow sites. At the moment, a user doesn't know if a site will ever load. If Chrome reassures them that "this site usually loads slowly", they may stick around, since they're then reassured that it probably will eventually load; it's just taking a while.


Maybe users will respond that way, but that's still a good outcome. Giving users a better idea about what to expect so they can make more informed decisions? Great!


> Giving users a better idea about what to expect

Can I expect Google to do the same about data that may be sent by Chrome to Google without users realizing, just so users can make informed decisions?


Is https://myactivity.google.com/myactivity what you're looking for?


No. I mean a badge that pops up for Google products warning you that they may send your personal data to Google even without any user interaction, when the user does not reasonably expect this. What happened to "Giving users a better idea about what to expect so they can make more informed decisions"? I guess profit has a stronger scent and we can all assume that Google is tricking users with those "slow loading page" warnings just to get more for themselves. I feel that you are being willfully ignorant about this.

> Both Android and Chrome send data to Google even in the absence of any user interaction. Our experiments show that a dormant, stationary Android phone (with Chrome active in the background) communicated location information to Google 340 times during a 24-hour period, or at an average of 14 data communications per hour. In fact, location information constituted 35% of all the data samples sent to Google.[0]

[0] https://digitalcontentnext.org/wp-content/uploads/2018/08/DC...


If their speed assessment works as "great" as their "mobile usability" tool, I doubt their indicator will actually be meaningful. Yeah, speed matters a lot, but it's self-correcting: if a website cannot load after a minute on a spotty cellular connection, I'll just close the browser and they lose the impression.


I agree speed is a concern. It takes me at least a second to tap the little AMP banner and then tap on the real website address.


Walled gardens and proprietary software can allow for fantastic user experiences in a way that open platforms often cannot match. However, I feel focusing on the user experience misses the main point - the web is an open platform, and this is an example of Google exerting an authority that many may not feel comfortable with.


In what way is "indicate that this website is slow" a walled garden or contrary to openness?


Because the criteria will inevitably be biased towards technology and techniques that favor Google, Chrome, AMP, etc. If you trust Google to be an unbiased arbiter of objective measurements of website efficiency, you are giving them a lot more credit than I would at this point.


Visit an AMP page in Lighthouse and you won't see an especially good score. Lighthouse doesn't special-case AMP or pick metrics that would make AMP look good.


Just because a monopoly is not acting maliciously today, that doesn't mean that it won't tomorrow.


In the United States, the role of a central bank is to manage two measures: inflation and unemployment.

The US Federal Reserve Bank does not measure inflation or unemployment. Rather, the US Department of Labor, an independent organisation (and a branch of government rather than a freestanding entity), computes both measures.

This is the principle of division of responsibility and measurement or judgement. You cut, I choose.

Google are both measuring, and rewarding, website performance, as well as designing and distributing the principal tools that benefit from both choices. That's an extreme locus of power. And, history shows, generally a Bad Idea which Ends Poorly.


>Google are both measuring, and rewarding, website performance, as well as designing and distributing the principal tools

Also, given their market share as a search engine, they get to decide which websites you can "discover" in the first place, additionally incorporating their performance metrics into their ranking. So not only will a website get a "slow badge", it will also be downranked into oblivion.


Because Google is the decider of what constitutes "slow"? You have a single organization deciding what the threshold is between slow and fast. And, they also have a Google-approved solution (amp) that I'm sure they will tie into the recommended actions for "slow" pages.

I'm not saying it's a walled garden (that's a more extreme situation on the continuum between open and closed), I was just using the phrase to illustrate my point that the end-user experience isn't the only factor to consider.


If Google didn't have a long history of this sort of thing (interpret that as you may) the open web would have lost the battle for mainstream users to the proprietary platforms long ago.


I don't love using slippery slope arguments, but I feel that it's appropriate here...


It reads to me as an attempt from Google to further shame people into following better practices, like when they started factoring HTTPS into search rankings.

I think the principle is a fine one (though it is a bit odd considering the web's main source of slowness is ads/tracking tech, and... that's Google's main business). But in practice I'd bet that - like most of Google's broad-strokes efforts - there will be many edge cases/outliers where people get penalized unjustly, and there won't be an appeals process.


> further shame people into following better practices

"Better" as prescribed by who? At this point, I'm under the impression that if you're not implementing AMP for content, Google is going to rank you lower and now publicly shame you to users. They _say_ this is not the case, but it's impossible to ignore now that they're introducing naming and shaming. It just seems like features way outside the scope of what a web browser should be providing to users.

Editing to add my other thought:

I've built websites that routinely score horribly on Google's proprietary PageSpeed Insights despite fixing everything within my power (usually Google's own analytics scripts score badly), but also score very high on every other reasonable industry test of similar nature. It's hard to convince me that they alone should be an arbiter for this type of feedback given to users.


> Google's proprietary PageSpeed Insights

The previous version of PageSpeed Insights was open source (https://github.com/pagespeed/page-speed) and the current version is a wrapper around Lighthouse which is also open source (https://github.com/GoogleChrome/lighthouse).

(Disclosure: I work for Google)


I believe you, obviously. I have historically received drastically different results between, for example, web.dev / Lighthouse and PageSpeed Insights with the latter outputting a very low arbitrary 'rating' or 'score' and the former providing results I would expect. Like a lot of efforts at Google, things seem duplicated. As an outsider, it is a frustrating experience catering to Google's performance ideals because the tools and 'best practice' positions do not seem to be unified in one place.


I wonder if you might have had a bad experience with the old rules-based PageSpeed Insights? It wasn't as good as the current one, being more of a checklist of "have you done all these things that usually help" and not "we loaded the page, computed metrics, and this is how well it did".


I just checked one of my sites with their tools. Performance: 34/100. Biggest issue: Google's own ad tech, with third-party code blocking the main thread.

https://imgur.com/a/RM8JkbI


Yep. Looking at it cynically, this could be a way of playing hard-ball with web admins: "if you want to keep using Google stuff, and you don't want this badge of shame, you better use AMP"


This is Google's blueprint to eradicate Apple. They want web apps to dominate because they already have a strong standing there, and Apple has a large market share in native apps.


I will agree with you that Google has killed the web, news groups, federated chat, and so many other things, but I find the term "balkanization" both very disrespectful and at the same time a bad fit for describing Google subverting nearly all communication channels.


The multitude of comments here scream the same thing:

1. No one trusts Google to be an impartial judge of speed.

2. Increasingly, Google is inserting itself as a non-neutral third party between the end user and creator/developer. Power lies asymmetrically with Google, and threatens both creators & users.

If anyone from Google is reading: this incremental but definite appearance of a power grab by Google will only draw more regulation.


I mostly agree, but more regulation means higher barriers to entry which would suit them. It's win-win.

We should really just stop using Google so much. You go first, I can't be bothered to switch!


>but more regulation means higher barriers to entry which would suit them

That's a recently very popular idea, but in this general form it is simply untrue. It depends on the specifics of any regulation.

We can regulate oligopolists in a way that doesn't affect anyone else.


Switched about a year ago to duckduckgo. It doesn't have all the bells and whistles, but the search results are pretty darn good. Give it a go :)


I quit, divested, closed my accounts and never looked back (except for YouTube but I block the ads).


I'm not from Google... but I think you omit one threatened party, and it's the most seriously affected: Facebook.

Google wants there to be lots of web sites, lots of web pages. So using someone's web site should be more convenient than using that someone's facebook page. How to achieve that? My guess is that Google has done some research into that, and a focus group or whatever named slowness (or an expectation of slowness) as a disadvantage of the web compared to facebook.


More regulation is good for big established players (who have money to spend on lobbying) and it is bad for small/new businesses.


While I appreciate fast loading sites as much as anyone, there's something I appreciate even more: getting to the content as fast as possible.

It's becoming increasingly common to have all sorts of pop-ups blocking a big part of the screen at best or adding an overlay across the entire body.

If only I knew that my click would result in that kind of monstrosity, I wouldn't have clicked in the first place. So maybe it would be more useful to show an indication of how many things we need to close before we get to the content. I think something like this would also help get us to "a faster web".


So many sites load 8MB of garbage, often from dozens of domains, just to display 8KB of text. Then as soon as you try to start reading, an obnoxious pop-up window appears, demanding that you sign up for their spam list. These are anti-patterns that need to go away.


If Google actually gave a damn about making the Web a better place, it'd start downranking sites with mailing list popups.


I thought they did? Some years ago they (and/or Mozilla) offered an addon which you could use to report popups like that - https://addons.mozilla.org/en-US/firefox/addon/in-page-pop-u... comes up.

But so far, nothing's been done yet. This is kind of weird to me, because one of the reasons Firefox came to power (and took over a big chunk of IE's market) was that it had a pop-up blocker. I don't understand why they aren't doing more to block inline pop-ups now.


Disable JS. Fixes 99% of it. Using a JS enabled browser is downright annoying now.


The Site Engagement Score might be relevant here: https://www.chromium.org/developers/design-documents/site-en...

Presumably a site that is using the dark patterns you've described would have a low SES. Although if you're interacting with the site in order to get past the ads I wonder if that would distort your SES? Then again, if the site is doing annoying things, I'd probably bail quickly, so maybe SES can work after all...

Disclosure: I work on Google Web DevRel. I don't know if there's any plans to incorporate SES into this badging initiative. I was just reminded of SES last night because Periodic Background Sync API uses it [1] and it seems relevant to this problem.

[1] https://web.dev/periodic-background-sync/#getting-this-right


They say "Our long-term goal is to define badging for high-quality experiences, which may include signals beyond just speed. We are building out speed badging in close collaboration with other teams exploring labelling the quality of experiences at Google"

That seems to cover such things.


I'm worried that their definition of "high-quality experiences" will be the same one typically used in UX on the web, i.e. how much money the users ultimately made the site, and not whether they actually had a good time.


I understand a lot of the distrust of Google here on HN, but I also don't think that would be their style. To me, the sites running Google ads aren't the ones that are the problem; they tend to load fast and not put a bunch of crap in your face.

Yeah they track you and such, not denying that or supporting that, but Google generally pushes back against the sort of bad experience you find at so many sites.


Install NoScript. The initial whitelisting period is slightly bothersome, but once your common sites are whitelisted, it makes browsing the web _way_ more pleasant. No more fucking "give us your email!" and "this site uses cookies!" popups. It even works decently on Firefox for Android.


This is why I use Safari with reader view enabled automatically.


It’s crazy how misaligned the incentives are for chrome development.

If performance were the criterion by which they made decisions, then they would probably bake ad blocking directly into Chrome, since ads/tracking are one of the leading causes of poor performance.

Instead we are left with PR pieces and false features. It feels like they are creating a lot of noise to drown out the signal.


> ... ads/tracking is one of the leading reasons for poor performance.

That's especially obvious when using nested VPN chains and Tor. Because latency can exceed 500 msec. Sites like HN typically load in a second or two. But sites with ads and tracking take 10 seconds or more.


I appreciate the intent, but why are they putting badges/indicators like this into Chrome instead of Search?

Sites usually load slowly because of ads and trackers, and Chrome's only telling me what I'm already seeing with my own eyes. Plus, I already decided to visit the site, and it's probably quicker to finish letting it load than to find an alternative source, if one exists at all.

Whereas Search would seem much more useful, since it's where I'm already presented with alternative links, and can actually affect which one I click.

I know Search already says they de-rank slower sites... but obviously slow sites are still appearing, especially for the "long tail" of searches. So badges still seem like a win there.

(But the whole concept is sound, I think, because developers often want to build fast sites but management won't dedicate the resources. But if management sees Google is giving them a big red mark, that's suddenly something very easy for them to understand and allow to be fixed.)


from the article: "We are building out speed badging in close collaboration with other teams exploring labelling the quality of experiences at Google."

We're hearing about this first from Chrome, but that doesn't mean it will happen in Chrome first. I suspect that badges of shame will come to the search result pages long before they actually show up in Chrome. Say what you will about Google's policies, but the Chrome team is really pretty good about announcing this sort of thing well in advance. Look at their multi-year rollout schedule for the "not secure" badging in the URL bar.

Search results, on the other hand, will just update one day with no warning.


Right, I think they're using Chrome as a testing ground for this.


This kind of effort sounds great, but the issue is that Google has already spent their goodwill on improving the web in this area. Who can trust the company to implement this fairly? Will Gmail get a “badge of shame” (it surely deserves one)? Will websites that are fast but don’t use AMP or the new Google hotness be ranked as they should? There are a lot of questions that I’m sure many have already answered in their head based on Google’s past efforts.


> Our long-term goal is to define badging for high-quality experiences, which may include signals beyond just speed. We are building out speed badging in close collaboration with other teams exploring labelling the quality of experiences at Google.

Exactly what does "high-quality experiences" mean? Penalize sites with bad colors? Sites that are artsy and an algorithm can't understand? Sites that have been repositories of information for longer than Google has been around? Sites that don't meet Google's worldview? That disagree with Google's politics?


"Quality" should be reserved for the content and its usefulness. Speed is not a quality factor. It's a convenience.


To me this is better than amp - give me clear messaging around a site's performance rather than mandating a platform that gives the site less control. I certainly worry that there's the potential to abuse this, and I also wonder if Google's own sites (i.e. amp pages) will be biased. To their credit they're showing one of Google's own pages as being slow in the example, but I'd be interested in seeing a 3rd party analysis of what pages are considered slow and which aren't.

Another part of me worries that this will lead to a cobra effect[1], where people optimize a page's first load so Chrome says the page is fast, while withholding the main content of the page for a delayed load, leading to even more site bloat. Identifying when a page has actually loaded will be tricky.

[1] https://en.wikipedia.org/wiki/Cobra_effect


See my other comment about how our metrics are designed to capture various milestones in the loading experience. I think it’s already harder to game a good Performance score than you might imagine, and it’s only going to get harder over time as the web platform collectively gets better at metrics: https://news.ycombinator.com/item?id=21511477


I prefer Firefox's way of "moving towards a faster web" - blocking trackers.


Indeed, it's so much more effective and it has immediate real results.

Also: it's not a "police state" approach. Google loves a good police state.


This will give developers leverage to ask for time to optimise, upgrade, and improve things they are not normally allowed to work on, as it will raise the discussion to the level of managers and the general public.

However, do you think more could be done to make the default MySQL/PHP type configurations faster out of the box? If it is possible to speed up the default config of most websites on the internet, perhaps there is an easy win to be had.


> However, do you think more could be done to make the default MySQL/PHP type configurations faster out of the box?

I'm pretty sure that most slow websites aren't slow because their server-side rendering takes a long time. This seems to be mostly a front-end oriented thing (especially directed at pages that load MBs of JS); basically if the page doesn't adhere to all the front-end optimisation guidelines.


The slowness described in the article isn't jank, but time-to-initial-load. Conceivably the initial client-side render could take a nontrivial amount of time, but in the vast majority of cases - just like for server-side rendering - it isn't a deciding factor.

The real deciding factor is the amount of dependencies that have to be loaded. This includes JS, but usually the JS for ads/tracking is far larger than the JS for the actual UI, even on fully client-side-rendered sites.
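
One standard mitigation for that dependency weight is to pull heavy modules out of the initial load entirely and fetch them on demand with a dynamic import. A sketch (TypeScript; './heavy-chart.js' and renderChart are hypothetical stand-ins for any large dependency):

    // Nothing here is downloaded at page load; the library is fetched
    // the first time the user actually asks for the chart.
    const button = document.querySelector<HTMLButtonElement>('#show-chart');
    button?.addEventListener('click', async () => {
      const { renderChart } = await import('./heavy-chart.js');
      renderChart(document.querySelector('#chart-root')!);
    });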


PHP and MySQL are pretty fast out of the box, and have been for a couple of decades (a single server was well into hundreds of requests per second by 2000).

What’s slow are the frameworks on top of them, and a lot of that is going to require big changes since things like the WordPress plugin world or most of the JavaScript world have not had a great culture about monitoring and optimizing things. Simply moving this into the category of problems you can’t ignore would be positive - along with tracking data usage for mobile users on metered plans.


One of the regular complaints I've pushed against AMP is that it's a hamfisted way to address page speed. Search placement should be determined by generic speed tests, not by forcing developers to use Google's technology.

This looks to me like a positive effort. I'm not thrilled with overlays like this, and I'm not thrilled with baking this kind of stuff into the browser. It feels over-engineered and weird.

But, I think it's a better direction than AMP.

There are a lot of ways this could go bad, but very cautious thumbs up from me.


I hope they won't whitelist their own pages from this. Maybe then they'll notice Google's own products have a performance problem.


Google Analytics dashboard currently scores zero (0) in the web perf section. It's probably their biggest asset.


Gmail

Loading...

Usually loads slow.


I think the "Usually loads slow" warning shown in the article is grammatically incorrect. Shouldn't it say "Usually loads slowly"?


Came to say this. It seems as though a browser of Chrome's scale ought to spell-check before cutting a release?


I'm going to be a little nitpicky here: This would be grammar checking, not spell checking. "Slow" is spelled correctly, it's just not the right word (adjective vs adverb).

This also seems to still be in the experimentation phase and is not released as far as I can tell and may never actually make it into Chromium based on:

> In the future, Chrome may identify sites that typically load fast or slow for users with clear badging. This may take a number of forms and we plan to experiment with different options, to determine which provides the most value to our users.

Regardless, I agree. This probably should've been caught early.


Maybe they used adjectives instead of adverbs because it’s both ad-words and they can no longer feel the difference.


Your comment about "ad-words" made me think: could Google sell ad space on these "slow site" pages? Site X might jump at the chance to pay Google to advertise its service on site Y's slow site page. I'm only half-joking.


There's a link for the Basic HTML version while it's loading.

I've never seen it load so fast I couldn't click the link. Including on Google Fiber.

Then you can set that to your default view. Ta-da, gmail that's as fast as HN or Craigslist or whatever.

Sure, it won't alert you when new emails come in, but normal Gmail eats so much memory I hate to leave it open, anyway.


Generally, I can tell when a website is loading slowly because it, well, takes a long time to load.

Am I misunderstanding what Google is trying to do? I'm not seeing the use case.


I think this is more for project managers and site owners. These days we’ve become numb to the load speed, but if your browser tells you “dude your site is slow” there’s a chance the developer will get a call and money to speed it up.

You don’t want your customers to see that badge while looking for your Product X 3000 Max.


If the progress bar is green and a site is loading slowly, you're having an anomalous experience, be patient? If blue, expected, maybe give up and don't visit again? That's my impression.


It might make more sense to work it into their search engine.

However, it might be nice if it used some sort of objective measure.


> Am I misunderstanding what Google is trying to do? I'm not seeing the use case.

They're trying to normalize Google's approval process on users so that they can leverage this feature for a profit.


Erm. How do you tell that before you click?


The screenshot in TFA shows a Google slow site warning after a link to the site has been clicked.


Ah. I hadn't actually looked at TFA.


The best way to have a faster web is to get rid of Google Analytics, DoubleClick, AMP, "web assembly", WebDRM, ECMAScript (and all derivatives), and go back to only loading text and images from a blind server to a user-agent entirely defined by the wishes of the specific user.

But all the stuff that makes the Web slow and pointless to use is how Alphabet makes money.

So as long as you drink the Google-Aid, you're doomed to a slow web of garbage.


Bingo. We patted ourselves on the back for killing the worst annoyances of the web (pop-ups, terrible Flash menus that were always some different, cute thing for every site, big, slow Applets—god, those would feel responsive and light compared to even a "lean" modern "Web App") but kept a scripting language with enough power to re-create all of that, and worse. The fix is getting rid of it. We won't. Web UX is gonna generally be trash until we get rid of or rein in (esp. remove the ability to make requests without user interaction, and remove the ability to listen for most events) Javascript. Which, again, won't happen. So Web UX—in practice, as seen in the wild, not what a developer may make to demo light web development or what certain niche sites may do—is doomed to be terrible.


A fine way to make sure that users need Windows, iOS or Android to run most applications, and that desktop Linux is completely useless.


Webmail was terrible back when every single click required a fetching a new page load. Even on a reasonably fast DSL connection, JavaScript Gmail is noticeably more snappy than HTML Gmail.

Pre-ajax online maps? Almost unusable. I certainly wouldn't want to return to those days.


Yes, because you're asking a client for text and images downloaded from a server to do things it was never intended to. When you start employing torturous misuse, you're going to find the tool unsuited to the job. You can use a pipe wrench as a makeshift hammer, but a sledgehammer can't tighten a nut (more than once).

Use an email and newsreader for email. Use a full mapping suite to download specialized map data. Don't force a web browser to shim into those niche jobs.

Old adage I learned with computers: Do One Thing, Well. That's what each program should be. One thing.


Pages are made up of 30 sites or more. This is what makes the experience shit, and Google is squarely responsible. The architecture of the web did not, and does not, include this; the business model that is sustained by it exploits the web rather than supporting it.


> Your website seems slow, try speeding it up with AMP today!


I used to hate AMP with a passion. But to be honest, usually when I tried to avoid it and loaded the site instead of AMP, I got presented with a worse experience on a bloated website, whereas when I clicked on the AMP version, it loaded immediately. So now I don't care anymore. But Google is not my main search engine, so I see very little of it.


I'm thinking the same; this will be used somehow to push more users into Google's walled garden.

AMP is like ten times as slow for me: on Firefox for Android, a bug in AMP makes the pages not scrollable. I have to press the tiny little icon in the top right, then press the URL again, and then wait for the page to load an additional time.


To save others a search, this text is not in the original post.


Third-party calls are one of the main reasons for slow loading sites. These calls include loading Google Fonts, loading Google Analytics and loading Google's DoubleClick advertising scripts. Remove these calls and scripts, and websites get so much faster.

Edit to add some stats:

94% of sites include at least one third-party resource.

76% issue a request to an analytics domain.

Median page requests content from 9 unique third-party domains.

Google owns 7 of the top 10 most popular third-party calls.

https://almanac.httparchive.org/en/2019/third-parties


I feel like this will get a lot of flak from HN users, but I think this will overall be a benefit to the user experience on all browsers, not just chrome.


I'm on a slow connection. This page took a good while to load.

100kB of JS, 100kB of fonts. The images get a pass.

There's jQuery in there. It's a blog post. wat?

Just send me the text.


This page doesn't contain any body text unless I run their Javascript. Is that supposed to be faster?


Maybe Mozilla can add something like this to Firefox, but for websites that break when using adblockers.


The web is already pretty fast if you don't put too much stuff into your webpages.


:)

Yeah, pretty much. If web sites would stop stuffing giant JS frameworks + 3 trackers + 3 ad networks into every page, we could all still be using something like an iPhone 4 now.


I hope they roll out good monitoring tools for website owners alongside these features, i.e. not a tool to test your site but a tool to tell you what Google reports to its users about the site. Otherwise you can be fooled by running Lighthouse against your local server and never see the "this site loads slowly" message...


This is just another attack on the open and free internet. Why so? Since Google defines, per their own metrics, what is fast vs. slow, they will imprint that impression upon the user, giving an otherwise well-disposed user a bad first impression of your site.

As people pointed out, this will make AMP even more attractive, driving more traffic away to Google...

Also, it may impact sites hosted in cheap faraway datacenters: my user being in Brazil, say, and my server in Virginia. Suddenly, my site is considered slow by Chrome, while my rich competitor who hosts in São Paulo gets the "fast" badge...

I think this kind of classification creates extra confusion and impact, with no tangible benefit for anyone that isn't Google.


Can someone explain the point of this for the user?

As a user, I already know if the page is loading slowly.


You might not know why. Is it just a slow site, or is your connection failing you?


You know, I've noticed that most things load just about instantly with ad blockers.


Blue checkmarks but for websites. A merit badge for playing by the biggest boys’ rules.


If it's just a simple display to the user saying "this website generally loads slow, don't worry", I'm OK with that; I often wonder whether some sites are just bloated or it's my internet connection.


Who watches the watchmen?


Mozilla?


Isn't this just going to reward the Facebooks, Amazons, Etsys, Pinterests, and all the other websites that stole users' attention away from homegrown sites... even more?


Ad blockers speed up the web quite a bit!

Pity about that for Chrome, hey.


There's a concern over pushing AMP & their ads. I totally get that, but so far the recommendations that they link to aren't directly related to either:

   % curl https://web.dev/fast/ 2> /dev/null | grep -I amp
               Terms & Privacy
             </a>, and code samples are licensed under the


How does Google know which sites “typically load slow for users?” Is this collected via Chrome telemetry or is there another mechanism?



So how does this work? Where does Chromium get the data for the websites which load slow? Does this mean that websites which one visits get sent to Google now? Or do they use the existing SafeBrowsing queries to also send page load times? What are the implications of this for tracking?
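
As far as I can tell, the usual answer is the Chrome User Experience Report (CrUX), which aggregates loading metrics from Chrome users who have opted in to usage statistics. The same per-origin field data is exposed through the PageSpeed Insights v5 API; a rough sketch of reading it (TypeScript; the origin is a placeholder):

    async function fieldData(origin: string): Promise<void> {
      const res = await fetch(
        'https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=' +
          encodeURIComponent(origin),
      );
      const body = await res.json();
      // overall_category is FAST, AVERAGE, or SLOW, from real-user data
      console.log(body.loadingExperience?.overall_category);
    }

    fieldData('https://example.com');

Whether the badge itself will use exactly this data, or what it implies for tracking of individual page loads, the post doesn't say.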


I'm a web performance guy, so moves such as these benefit my business. Despite that, I dislike this heavy-handedness by Google.

- What speed metrics?

- How would they be gathered?

If web performance were so single-dimensional that it could be classified this way, it would have been solved a lot better by now.


I understand why using an anchor tag is not always technically correct, but I don't see how wrapping something like an image in a button is the right thing to do, for an onclick event.


I’d appreciate if DuckDuckGo would implement this kind of speed badge to their search results. Never going to assume Google would even consider it as they sell the things that make sites slow in the first place.


Rather than play offence, play defence.

Come up with a "privacy" or "non-tracking" badge.

Identify free-standing / independent / non-AMP sites.

Note simple and JS-free designs, or graceful degradation.

Well-formed page structures (microformats, HTML5), lack of obfuscated elements (JS, CSS), accessibility.

An open and distributed Web index standard.

Pretty much all of these are generally useful, and disadvantageous to Google.


> We... hope to land on something that is practically achievable by all developers.

There are a LOT of people who have websites that depend upon software that they don't directly control, best example might be Wordpress users with plugins. I myself have built a number of hobby websites that I just don't have time to figure out how to update based on Google's recommendations. For example, according to PageSpeed I need to "Serve images in next-gen formats" and I just don't have the ability to do that now (underlying image processing libraries don't support it) without a serious time commitment.

So I'm worried that a lot of smaller sites that don't have the resources to keep up with Google's requirements are going to get a "badge of shame".
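
For the specific "next-gen formats" audit, one workaround that doesn't touch the site's own image-processing libraries is a one-off conversion at build time. A sketch with the sharp npm package (assuming a Node environment, which may not match the actual stack; the paths are hypothetical):

    import sharp from 'sharp';

    // Write a WebP sibling next to the original, then serve it via a
    // <picture> element with the JPEG as the fallback source.
    sharp('images/hero.jpg')
      .webp({ quality: 80 })
      .toFile('images/hero.webp')
      .then(() => console.log('wrote images/hero.webp'));

Whether that's worth the time for a hobby site is, of course, exactly the problem.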


> I just don't have time to figure out

https://developers.cloudflare.com/images/about/

Also Chrome doesn't collect or aggregate performance data for sites without a decent amount of traffic. They are not going to label your hobby sites as slow as there is not enough data to do so. https://developers.google.com/speed/docs/insights/v5/about#f...


Yes, exactly, I don't have time to move tens of thousands of images to Cloudflare, update the image upload code, change all of the links, etc.

I think you are also underestimating the number of hobby sites that are the primary source of information for their niche and have decent traffic, many of them running Wordpress with tons of horrible plugins. Yes, I know, not ideal. But these people don't have many alternatives.


Well, if your website is exceptionally slow because you chose a substandard way of building it, why should Google lie to users to... preserve your vanity?

It doesn't take much in the way of "resources" to make a webpage that is "fast" by Google's definition.


It's not just about vanity. Warning users that a site "usually loads slow" is a statement of quality. The website might be the best resource on the internet for the type of service or knowledge being provided, but getting that badge immediately sets a bad impression. The person running the site might not have the ability or resources to fix the problem. As an example, Wordpress sites running a slow theme or plugin. There are a lot of small businesses, publications, and hobby sites that simply won't be able to fix these issues.


We need some proper anti-trust action here. Split chrome from google.


We have to prove some sort of monopoly, but looking back at Microsoft Windows and Internet Explorer should give some legal framework.


Disable JavaScript in your browser. There, the web is much faster :)


I just had to reboot my work computer today because chrome practically wedged it by pushing everything into swap (or whatever Microsoft calls it.)

The web gets a lot faster when you dump JavaScript.


Or just dump Chrome...


It’s like we need some sort of indicator or progress visualization. Something that will show the user where the page is in terms of loading and rendering.


How will this work in corporate environments with intentionally slow web proxies (decrypting TLS traffic, etc)?


What's the point of colorizing the load bar? If I see the bar, don't I already know it's slow?


What I'm seeing with these "safety badges", "lightning icons", and "green thumbs up" are attempts by Google and Apple to control the web the same way they control their app gardens. I'm hopeful that they can't, though.


Ironically, the linked web.dev/fast page took way too long to load for me.


web.dev also seems pretty broken with Firefox's built-in tracking protection enabled. Ho hum.


Google does this while at the same time making a heavy framework like this: https://bundlephobia.com/result?p=@angular/core@8.2.13. This is not fair. This is abuse of power.


Now would be a good time for alternative news aggregators.


This is great news for the anti-trust case.


Just use Safari or Firefox


Reddit is going to earn this so fast lol


[flagged]


"FE" == "front end", as in front-end Web development, for the confused.


I have only my anecdotal experience, but when I did full stack work at two large employers HN readers would definitely recognize, we had some really talented developers, and nearly every poorly performing part of a page was dictated by a product owner.

The business can't easily and explicitly monetize "fast" to the same degree it can bloated features and ads/tracking.


Sorry, but there’s no way anyone can completely blame the POs on this mess.


"Our long-term goal is to define badging for high-quality experiences, which may include signals beyond just speed." My speculation is that they are gearing towards web moderation: tag sites that don't agree with their worldview as "potentially harmful". This could make an average Chrome user immediately balk at opening the site and reading the content.

Before you downvote my comment, explain why the above scenario is not plausible.


Google Search has defined itself by being the way people access the web, and already has heavy antitrust pressure put on it by the EU for related reasons. Extending that would go against their stated worldview and, more importantly, raise even more regulatory concerns.

This situation differs from eg. YouTube because they do not themselves host the content, and are not subject to copyright laws related to it.

Also: Google is made of human beings who have already protested against, for example, censoring Google within China.



