The author may be unaware of how ridiculously huge web pages have gotten. I just loaded Wired.com, scrolled down and let it sit for a few seconds. It downloaded 96.2 MB, requiring over 33 seconds on one of those average connections. On a pay-as-you-go data plan, it would have cost about a dollar just to load that one page. The front page has about 500 words of content. It also covered 100% of the content with ads, twice.
This is unsustainable. Web developers have utterly squandered all efficiency gains of the last 30 years, and then covered their grossly inefficient work with impossibly annoying, privacy-invading advertising. Google should be applauded if they make these wasteful developers suffer monetarily until they shape up. They've already stolen untold amounts of time and energy from us all.
Some reasonable noscript whitelisting for Wired.com and a few others (out of 12 total that noscript shows me) gives a page size that's still under 5MB.
Looking at the full page with everything loaded and unblocked, the biggest offender here seems to be not web design, but an aggressively downloading autoplay video on the front page. Without that the page itself is - while not necessarily great - at least reasonably bearable.
Truth be told, I'd started this post intending to blame advertisers, and there is still some merit to that since even before the video kicks in the various third party scripts balloon the page size several times over from the minimal functional one that loads with everything blocked. But in this case, it does simply seem to be a wasteful media stream playing without regard to whether anyone wants it to or not.
Once you scroll, however, things get messy no matter what, because of the "Scott Adkins Answers Martial Arts Training Questions From Twitter" autoplay video they have right now. That quickly ate another 30MB, and the video wasn't even visible (I had scrolled past it).
1.37 MB / 722.93 KB transferred, Finish: 6.57 s
versus uBlock default
8.62 MB / 4.49 MB transferred, Finish: 28.38 s
A clean setup mostly increases load time:
11.58 MB / 5.79 MB transferred
Finish: 1.13 min
^ checked with "Disable Cache".
Not much content delivered for such a big HTML file:
660.46 / 167.60 KB transferred
because it's mostly inline script:
$$('script').map((s) => s.textContent).join('').length
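(In devtools consoles, $$ is an alias for document.querySelectorAll that returns an array, so this sums the character count of every inline script body on the page.)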
127.70 / 100.15 KB transferred
* * 1p-script block
* * 3p block
* * 3p-frame block
* * 3p-script block
* * inline-script block
* * * block
* * frame block
* 1st-party css allow
* 1st-party frame allow
* 1st-party image allow
Of course, some space should be left for enhancing the text with images, but the Wired front page doesn't have that many of those.
Wired is the digital equivalent of that now. There's some great journalism in there, but no journalism is good enough to put up with a hostile attack on one's sensibilities just to read it.
Isn't this a direct insult to intelligence? And where and when do you draw the line?
In both cases, the top of the search results is a map with a few shops listed, and under this are normal search results.
Repeating this on mobile, the ads are moved inline above the map. They take up about 50% of the screen. That is a bit intrusive. However, I have muscle memory that automatically scrolls down a bit after searching and to be honest, I would not have even noticed them if I wasn't looking.
I don't know where the 100%-of-content-being-ads claim is coming from. Perhaps it's different in different regions?
I'm not trying to be an apologist for Google here, by the way. However, when it comes to intrusive ads there are far worse offenders (although perhaps not when it comes to behind the scenes data collection, which is a bigger issue).
As far as I can tell pretty much everything on that screen (including map results) is an ad. That's 100% commercial results on a fairly large screen. Plus there is more below the fold.
What constantly worries me, however, is what happens when most visitors use ad blockers. My guess is that a war would be declared against current web standards, under which users can still modify the DOM and remove unwanted content.
Edit: reword for clarity.
Pages and pages of ads before the contents, and then each article would usually be one or two beautiful full-colour pages before a 'continued on page 94', which took you to the remainder, formatted as cramped, commieblock-looking text at the back - which you got to by skipping over a few dozen more pages of ads.
We need to fix the web by blocking these things. No compromises. If a website can't exist without advertising, it should disappear. Eventually, only good websites will remain. Websites made by people who actually have something to say rather than websites made purely to attract audiences for advertisers.
The tracking is being used to:
- tell advertisers how well their advertising is working
- tell the site how well articles are working
- give unscrupulous sites the possibility of selling that data to others, who are probably advertisers but maybe other companies.
As for the ratio of advertising to content, this piece https://www.editorandpublisher.com/news/higher-ad-to-edit-ra... about that ratio in newspapers assumes 60/40, where I believe the 40 is supposed to be the content (although I find the wording not 100% clear).
Authors that rely on advertising have an inherent conflict of interest: they simply won't write anything that offends the advertisers because they're afraid of losing their revenue. Sites like Reddit will nuke entire communities if they prove controversial enough not because they're offended by it but because it causes advertisers to refuse to associate with them. Activist groups can attack and censor anyone these days by putting pressure on advertisers and sponsorships and causing them to pull out.
Turns out, people willing to spout random ideas on a topic are not in especially short supply and 99% of them are willing to do it for free. The best part is, these free users usually get right to the point.
Long-form and investigative journalism need to be funded, but the kind of information in the junk articles on the homepage of CNN or Fox is usually better hashed out (and much less biased) in the comments section than in the article itself.
In a sentence, most media doesn't have much value-add. Even less so if I have to click through 6 ads and be exposed to malware to see it.
Why would anyone want to read stuff like sponsored articles which are nothing but thinly veiled advertising? Articles that were pretty much written by PR firms? Why would anyone trust journalists with conflicts of interest? Social media "influencers"?
I want real information. Real thoughts from real people. Not some censored, manipulated corporate viewpoint created to maximize profits. People who actually have something to say go out of their way to tell as many people as possible about their ideas. They don't need to get paid for it. I'm not getting paid to post here.
> maybe they should be allowed to be subject to advertising so that the author doesn't have to pay for it.
Allowed by whom? The user is in control, not the author. It is the user who owns the computer that runs the browser. If any ad gets shown on the screen, it is because the user generously allowed it to happen. Most people do this out of pure good will only to end up being mercilessly abused for the sake of someone's business model. Nevertheless, it is a privilege which can be unilaterally revoked and there is next to nothing that websites can do to stop it. After content has left the server, the author is no longer in control.
As for professional journalism, the lack of conflicting interests caused by ads is essential for it to be considered "good", so no good journalism website should be clouded by advertising. Yes, that probably means subscriptions.
It can literally cost you $30 a year to host a website.
Many people will spend more on coffee. Per month.
But even if it were very complex -- which it isn't -- I still fail to see how that supports a model of ad-supported web hosting.
But I'm also a realist. We live in a capitalist society and free content has come out of advertising since before the web. It's an annoying part of the present world but not the most annoying part imo. I don't know how people denouncing just advertising expect the publication of free information to work.
This society has created a vast plenty. I don't see why advertisers, the public and publishers couldn't reach a truce where a moderate amount of semi-relevant text ads get shown to the reader in exchange. But everyone wants total control, wants to club all competitors to death, and that seems to be the way this world of plenty is ending.
I've made a website in pure HTML with just a little CSS and no JS, with really great content. Google doesn't take it into account. So I don't know if they are really pushing for a lighter network, or maybe, I don't know, it's because it is easier to convert these websites into AMP?
Google cares more about responsiveness. A site is responsive if you can start reading it quickly, regardless of how much network traffic is being used by ads, as long as the ads are loaded asynchronously after the primary content.
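In markup terms, "loaded asynchronously" usually just means the async attribute, which lets the ad script download without blocking the HTML parser (the URL here is hypothetical):

    <!-- async: fetch in parallel, execute when ready, never block parsing -->
    <script async src="https://ads.example.com/ad-loader.js"></script>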
Penalizing for high total network traffic is short-sighted and would prevent most video hosts from ranking well in Google Search.
the main reason is that ads are coded by people that have no interest in performance. I've seen huge JS libraries loaded to display a static image. I've seen multiple versions of the same library loaded in the same ad. I've seen things that wouldn't fly anywhere else.
Why? Because the ad agencies are under a lot of time pressure to make things quickly, and there is NO penalty if the quality is terrible. So they take what worked last campaign, add new tags to satisfy the new customer, and ship it out. It displays? Perfect. It's huge and slow as hell? Who cares, it's not their website that gets slow as molasses.
Google is people, after all. Corrupt as fuck. That's why plenty call it the gulag.
As the browser author with the dominant market share, Google could start by not enabling these "wasteful developers". These ridiculously huge web pages can lock up a user's computer when the user's browser is a "modern" and "supported" one but not when the user agent is something simpler.
I can read wired.com really fast using various tcp clients to make the http requests and links to view the html. If user A reads the articles with Chrome and user B reads them using something simpler, how is user A advantaged over user B? All things being equal, if we quiz them on the readings, would user A score higher than user B?
Software developers have long been squandering users' resources, beginning with Microsoft Windows. Hardware manufacturers were Microsoft's first customers, and there was an incentive to get users to keep "upgrading" their hardware - buying new computers.
Web developers are simply following the tradition.
A user can get those 500 words of content in an instant with zero ads, using the right user agent, even on an "obsolete" device. However, there are zero incentives for the online-ad-industry-supported companies/organisations maintaining "modern" browsers, for the web developers writing code to run on them, or for the hardware manufacturers, to help the user do that.
The easiest way to change the "UX" for the web user is to change the user agent. Trying to get web developers to change what they design to only use a small fraction of what Chrome and the user's computer can do is far more difficult, if not outright impossible.
Not only is it improbable that web developers will change their malicious behaviour; it's also the user agent's fault for allowing it.
Why is there no connection speed detection in the browser? Why does the browser allow media playback by default? Why is there no mechanism that reflects the expectations of the user? Is the user expecting videos on the news website or just to read text and images?
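To be fair, Chromium-based browsers do expose a partial answer to the first question through the Network Information API, though it's offered to site authors rather than surfaced as a user choice. A minimal sketch of how a page (or a user agent) could gate autoplay on it:

    // navigator.connection is Chromium-only; treat it as a hint, not a guarantee
    const conn = navigator.connection;
    if (conn && (conn.saveData || ['slow-2g', '2g'].includes(conn.effectiveType))) {
      // on slow or data-saving connections, strip autoplay and prefetching from video
      document.querySelectorAll('video[autoplay]').forEach((video) => {
        video.removeAttribute('autoplay');
        video.preload = 'none';
      });
    }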
I personally think that user agents are not really user agents anymore, as there's not even the idea of offering the user a choice like this.
And personally, I do not agree with the concept of trusting random websites on the internet - by default. Any website on the web should be distrusted by default, with the user having the choice on what to load and what to expect when navigating to that specific resource.
If I visit i.imgur.com/somename.jpg, why am I redirected to an HTML page with megabytes of files just because the user agent accepts HTML? Should this not be outright impossible?
But please take my comment with a grain of salt: I am building a web browser concept that allows filtering and upgrading the web (which is super hard to build), and it's still a ton of work to get anything working in a stable manner.
Perhaps one of the impediments to the development of new user agents is a feeling that they must be complex and meet some "standard" of features. A standard that is nigh impossible to meet for the average programmer. On top of that, web developers demand the ability to create complex web pages full of executable code.
However, we have no proof that users would reject a cornucopia of different agents that did not all have the same set of features. User agents do not need to be designed to satisfy web developers. User agents can be designed to satisfy users.
They can be designed to do one thing well instead of do everything.
No user agent need be intended to "replace" any other, and certainly not to replace "modern" browsers. The intent is to create choice and user experimentation.
It is still possible to access the web with simple programs. It is not gopher or gemini, but it can still work in a relatively simple way. Web developers probably do not like that, but it remains true. The complexity of today's web sites is optional. It is a design choice. Made possible by... "modern" browsers.
1. Over time almost all user choice in "upgrading" has been removed. "Forced upgrades" is a thing.
42 MB RAM without graphical system
64 MB RAM with graphical system
You may run Windows applications on Wine. Or Windows 3.11 in a virtual machine.
Netbook I bought in 2008 was underpowered for Windows XP but was perfect for Linux. I still have it around. With up to date Firefox and Chrome it feels slow but in console mode it's snappy.
No need to install - use a LiveUSB. Everything is here; countless people made it possible. Would you use it?
I prefer NetBSD. I do not need graphics. I make custom images that boot from USB or the network.
As for Windows, there was a time, in the 32-bit era, and before the widespread availability of virtual machines and Windows 3.11 images, when users were compelled to upgrade hardware and Windows versions. It was not made easy for a non-technical user to buy new hardware and use 3.11 if the hardware came with a more recent Windows version pre-installed. Microsoft will not facilitate installing older Windows versions on newer hardware ("metal", not VM) and may actively discourage it. In contrast, I can easily install any version of NetBSD I want on new hardware. I am not compelled to install the most recent version. There is user choice.
How easy is it today to run Office in a VM on Linux?
I am running a rolling-release distribution on my desktop and Ubuntu LTS on my server. My choice of secure installations is limited. It looks similar in NetBSD. Microsoft had no interest in support - better if the customer buys a new version, since support has a significant cost.
VirtualBox solved running Windows in a VM at least ten years ago. Office support in Wine ranges from platinum to garbage; I have not tried it. I can imagine running outdated versions behind a firewall. Running newer versions requires newer hardware. And internet-facing applications should be up to date, so modern browser support is limited - Firefox ESR at best. I run w3m from the console only in an emergency.
Speaking of hardware - Moore's law is dead. I do not think a 2020 notebook differs much from a 2014 notebook, except for a better display and battery.
It usually works though nowadays, unless you go nuts and try to boot Windows XP or something. Are there any processors that flat-out can't run Windows 7 atm?
(Older versions of macOS, on the other hand, absolutely will not run on newer processors.)
Have you ever successfully imaged Windows 7 from an older laptop and installed it on a new computer?
I only need Office. I do not necessarily need the latest version, so long as documents are XML-based.
That said, I was able to pretty quickly install Windows 7 on a then-just-released Ryzen 3950X last October. I do remember there being one hitch, I think I had to slipstream in USB 3.0 drivers.
A smaller download would be: https://www.gutenberg.org/ebooks/100
Also FIFY: s/Web developers/Suits/
Looking at the requests it seems to be downloading a video from Cloudfront. And yes, in the middle of the page there is a video playing. I'm sure people with metered connections will love that.
That said, with adblocking at least the design looks clean enough. I'm willing to bet that this is what their designers see, and then another team adds all ad overlays on top of the existing design.
- JS off by default
- Web fonts off by default
- Media elements larger than 100KiB not loaded by default
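For what it's worth, uBlock Origin's per-site switches can approximate those defaults today. In its "My rules" pane that looks roughly like this (the large-media size threshold is set separately in the dashboard and defaults to 50 KB rather than 100 KiB):

    no-scripting: * true
    no-remote-fonts: * true
    no-large-media: * true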
uBlock Origin is the god of the internet.
So you want Google to use their dominant position to force webmasters into a new paradigm that probably(?) benefits Google more than today's status quo? And when people start yelling for antitrust provisions, will you still back Google?
It's true, whenever a company has pleaded with me to bring a site in with better performance, I have adamantly refused to do so.
When companies say, "guess what, Bryan, we are going to focus a sprint on just making sure everything downloads as fast as possible and we get rid of anything getting in the way of the best possible user experience," I have spent that sprint watching quirky animation, and sometimes turned that animation into a base64-encoded gif and put it in a div with a z-index low enough that it would not be seen by people but would still have to be downloaded by the browser!
I do all these things of course, because it was decided by the Secret League of Obnoxious Web Developers (SLOWD) that we should do the utmost in our power to make the web slower for everyone.
OR - it could be that I have in fact asked project management at sites repeatedly to focus on performance (and accessibility, another thing that always gets ignored) and been told that nobody cares about that stuff.
I guess it's up to you to determine where the fault lies.
The worst example of a refused performance improvement I can think of was in relation to improving a help site; it had generally bad reviews from users (it is very difficult to get good reviews for a help site, because if a user is coming to your help site they are already in a bad mood),
and obviously if you are on a site that you are already mad about being on, and it then takes a long time to load all the data so you can try to figure out your problem, you are going to be steaming.
Project manager wouldn't prioritize the three performance improvement tickets I made with lots of cogent description of why it needed to be done. Somehow though, this is my fault.
In some companies this might be true.
But in many, when a CTO or a senior dev demonstrates business value of a speedy website, it would get attention.
I’m sure project management doesn’t care about keeping packages up to date either. But most devs can successfully communicate that this is necessary and that the alternative is way worse.
When I was consulting I would often do performance analysis of the sites I consulted at, show how performance improvements could be made, put links to relevant studies on performance improvements and the effects on the bottom line with nice quotes, to have the task of cleaning everything up be put in the backlog and forgotten forever.
>I’m sure project management doesn’t care about keeping packages up to date either.
How many days or weeks does it take you to update a package? It generally takes me a minute; sometimes problems happen and I need to spend some hours, but those are infrequent. If a package update is going to take too long, it becomes something for project management to be aware of, and sometimes updating the package is not allowed.
But generally issues with site performance need handling over a longer period of time than updating a package, I would think that was evident to anyone that has ever updated a package or done a performance analysis of a site. The comparison between something that generally takes a minute and something that takes weeks seems insincere.
But I can make an example where an update was needed that would take a significant amount of time. The reason the update was accepted was that it would fix certain bugs in the old package, and it needed to be done: either the package was updated, or the bugs were allowed to remain, or we could fix the bugs ourselves in more time than the update would take. I think it is obvious how this is a different argument from the site performance thing: the package update argues "this way we will fix the problems that you, the business, have pointed out", while the site performance thing argues "this is a problem we want you, the business, to acknowledge".
I think discussions like this are capable of serving as valuable reference material when we are engaged with project managers on this topic.
Now the thread is quite big so I have not read everything but I have skimmed through it all, and have not seen anything that would be a really useful argument for getting someone to consider maybe we should try to improve site performance. A bunch of people complaining about Google, Wired, and a few other things that they complain about is not as impressive as just the hits you get if you search "effects of site performance on user retention" (replace user retention with other useful metrics to get sources for the argument being written up)
It depends on how you measure efficiency.
Edit: I should add what I would have thought was obvious: the bloat is related to the advertising which drives their bottom line, and I'm doubtful of any existing material alternatives to wired.com's already problematic ability to exist. Ergo, the bandwidth is "inefficient" only in the eyes of developers, not of other members of the team.
Unfortunately, this isn't the '60s at Sterling Cooper, where you can brainstorm an idea like "go to work on an egg" or "a Mars a day helps you work, rest and play".
When you present your V1, "they" come back and say something along the lines of: "Can't you make it look nicer/cooler? You know, add some wow factor!"
What they mean is: "I just took a pile of money from the client for this and you didn't make it look expensive enough"
I think some commenters are attributing to Google an ulterior motive, whether ill- or good-intentioned, separate from its core business. But in this case no such motivation is necessary.
Basically, Google wants its users to be satisfied - otherwise it will lose to, say, Bing. So it measures user satisfaction - e.g., if a user clicks on a Google result and immediately hits the back button within three seconds, it's a strong signal that the user was not satisfied. And Google tries very hard to increase this "user satisfaction" (and other similar metrics), because not only does it help Google's business, but it also improves the service itself.
And, guess what? When a page takes fifteen seconds to load, lots of people hit the back (or close) button. Showing such a page is giving the user a bad experience. Unless there's literally no alternative, it makes sense for Google to "penalize" such a page.
Of course no metric is perfect, so it will occasionally backfire and penalize a great page that takes thirty seconds to load. But that's life.
In the webdev community it's well known that good performance is very important for user satisfaction, and that's backed up by research. There are no ideal metrics, and unfortunately every single one of them has some dumb edge cases. You could endlessly bikeshed which metrics could be better, but this particular set is not unreasonable.
It makes sense for a search engine to pick websites that not only have the relevant content, but are also able to actually get that content on screen without frustrating users with slow-loading crap that makes the browser freeze and stutter.
Keep in mind that your high-end desktop web experience is in the minority. The web is now mostly mobile, and mobile is mostly low-end Android. That's a shit sandwich, and it desperately needs an intervention.
Even putting that aside, I'm not sure how them changing from one rank scoring system to another on their own search engine is them playing "God" for the web. The whole goal of Search is to rank items per some metric of "quality", and speed seems as good and fair as any to me.
No matter what I search for, I'm not going to be pleased to have the results listed in order of page load speed. The goal of search is to rank items according to a metric that matches what the searcher is looking for, not a metric that reflects the search engine's internal preferences.
For any competitive search engine, its internal preferences are expected to be a reflection of its users' preferences and not a whim as you seem to be implying
i.e. constant standards churn to keep your competitors busy.
If there's a metric that will improve ranking, the SEO people are all over it, and have more incentive and resources to optimize for it than normal publishers.
That way the SEO-ignorant sites that actually have the info you want, but get pushed out of the way due to SEO spam, will have some chances at traffic.
I have never written a search engine, so this comment is worth about 1 kb.
In general, prioritizing speed helps small independent sites over large bloated sites: they typically have less JS, fewer round trips on the critical path, etc. Make your site simple enough (ex: https://danluu.com/) and it will automatically be fast.
(Disclosure: I work for Google, speaking only for myself)
I would make it more generic. Have the user provide his system specs and give them the option to filter out what is or isn't reasonable for them.
I use: 1) a decent desktop, 2) a phone with reasonable specs, 3) a laptop with shit poor specs
a) 400 mbit cable, b) free wifi (crazy slow), c) my ISP provides wifi hotspots that are reasonably fast, d) a prepaid wireless plan where 10 euro equals 1 GB
The shortcomings/experience are pretty obvious with each combination. The laptop (to pick just one) can't reasonably open a Google search result; the duck works just fine; fb messenger works too; it can download and play HD videos. Most significant but not all that obvious: it has a qwerty keyboard with which I can write substantial amounts of text. If search results were tailored for this, I could see myself using forums and blogs (with comment sections) over prepaid tethering. Its webcam is unsuitably poor.
Edit: paying 5 cents to view a clean page instead of 1 euro of bandwidth to freeze my client all of a sudden seems a fantastic deal.
DDG has a number of sources, including its own bot, although it specifically says Google is not one of them, contrary to the GP.
If you also added Google results, your input costs would essentially double. Also, if so much of your marketing is based on bashing Google, it would be hard to justify such a move from a branding perspective.
Speed has always been a good practice for as long as the web has existed. The specific metric changes, but the goal is the same. It's just that "simple" metrics are very easy to game.
An example of this is how they went from "First Paint" to "First Contentful Paint" to "Largest Contentful Paint". They're all trying to get at the same concept, which is when the page loads, but each iteration gets more precise and accurate. Realistically, as a webdev, if your site loads fast, it shouldn't matter which metric is used.
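If you're curious what the browser actually counts, Chromium exposes LCP candidates through PerformanceObserver; a quick console sketch:

    // log each Largest Contentful Paint candidate as the page renders
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        console.log('LCP candidate at', Math.round(entry.startTime), 'ms:', entry.element);
      }
    }).observe({ type: 'largest-contentful-paint', buffered: true });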
I like this! Very true!
Compare that to Altavista or Yahoo, whose pages were larded with all sorts of irrelevant links and ads around the search results. Slow to load and hard to visually navigate.
I still think the sparse pages are the best.
The second was being able to serve a large index with parallelized querying, which was relatively easy for a newcomer company with no user base to engineer, and much harder for existing search engines trying to protect a revenue stream. People often don't really remember how late Google was to this business and how much of a difference that page speed indicator was.
Hadn't really considered this - because minimalist page size is often such a given - but, for instance, many amateurs often don't know yet how to crush their pngs and such.
Cool - thanks for this!
(As an aside, it's great to see a continuation of topics like this - which is commenting on last week's article from Parimal Satyal. It makes this place seem more like a forum.)
So much of the web would be better and more universally usable without "modern" cruft.
Is data journalism cruft? Are web applications? Is Google Office cruft? What about the web application my parents have been using to order groceries during the pandemic - that has loads of JS, and loads of CSS to make it readable to anyone over 40. Does that qualify as cruft?
Is what I use to socially distance myself cruft?
It's all cruft until you ask the people who use it.
The criticism about cruft is one level up. Not about how to accomplish something, but how to do it in a way that isn't extremely wasteful of both computer resources and end-user's time.
Usability features like dyslexia friendly fonts, large fonts, etc., belong in the browser, not on a web page. If anything, this would be easier on the alternative web.
The key idea would be that when you go to a URL on this alternative web, you know you're not going to get slammed with some cycle-sucking, RAM-sucking, virus-carrying, UI horror. More gopher-ish, but relatable to those who have used a web browser.
I can dream.
edit: Ha. Right now, this comment has 6 points, and my original above has -2 points. My illusions of HN rationality are thus reduced. sniff
Browser vendors do. Popups were cruft. Flash was cruft. Not all the time. Just almost all the time.
Ever used Reader View? There's the great de-cruftifier! It doesn't work with web apps, but it sure works great on content. Perhaps some day browsers will default to Reader View, and the web will become more pleasant.
It's sort of like the Gopher protocol, but with links allowed in arbitrary locations.
Definitely. A lot of the "ingenuity and risk-taking" are efforts to make the web something it was never intended to be (e.g. a binary(ish) application runtime/delivery platform), which has lots of downsides.
Didn't think of it before, but your "server-side" suggestion could minimize some of the pain of that, I guess. Low-bandwidth VNC on the client to a browser that's actually running in a DC somewhere. Maybe a VNC add-on to block/freeze rapidly updating squares (videos, gifs, etc).
Not great, exactly, but would ameliorate some of this.
The fact Web browsers won rather disproves your whole notion.
Web browsers have become satanic mega-behemoths of inscrutable code. (I'm too lazy to look--are there more lines in Chrome, deps included, or the Linux kernel?) They are the utter antipode of the Unix philosophy, and arguably an engineering abomination.
People with vision loss browse the web. No need for a separate web.
uBlock Origin (advanced user)
* * 1p-script block
* * 3p block
* * 3p-frame block
* * 3p-script block
* * image block
* * inline-script block
* * * block
* * frame block
A search engine tries to find all sorts of relevant information related to your query. The more the merrier (it's searching, after all), sorted in a way that puts the relevant results first. An answering engine, on the other hand, tries to minimize the number of results. In an ideal world, it would only return one thing, which tells you exactly what you want.
One example of this change is the fact that it's no longer useful to go beyond the first page or so of the results, because anything down that low is irrelevant as an answer and is probably discarded by Google anyway - which wasn't the case when it was a search engine.
I’m not saying this is a bad thing. In fact, I suspect the majority of time the majority of people want an answer, and not a multitude of results. But I think this is what leads to google search changing in a way that does not meet many people’s expectations here.
It means google emphasizes stuff that gets people answers quickly. They parse text and reveal the answers on their page itself. And they are not very useful for exploring anymore.
> In an ideal world, it would only return one thing, which tells you exactly what you want.
In Google's ideal world, it would also only return one thing, which tells you exactly what they want to tell you.
Because let's face it, one of the biggest challenges with that idea isn't that there are tons of different valid answers to a question (though that could be the case for many political queries or ones related to art/media quality), but that hundreds or thousands of pages and sites contain the requested info, and search engines have to rank them in some sort of order.
True! To give a concrete example: long-form recipe pages should be on SERP 100+. It's gotten a little bit better, but I still sometimes see a 2000-word article when I google something like "apple pie recipe" or "apple pie oven temperature".
Your comment is the same as "NYTimes is an advertising company hurr durr because they make money from advertising".
Now, Google seems to be trying to learn the details of all the people, and they're selling access to that. Their mission seems to have flipped.
We have a large 300kb animated gif that takes up maybe 20% of the viewport above the fold. The gif demonstrates visually what our service does.
A couple months ago Webmaster Tools reported that page as being “slow” pointing to the large image download. So we decided to show the first frame of the gif as a (30kb) png file, and then swap in the gif 2 seconds after the page is fully loaded.
Except now the new “largest contentful paint” metric is failing on those pages because it includes the 2 second delay when the animated gif is swapped in. I guess technically they’re not wrong in how they’re calculating it.
In fewer words, Google doesn’t like anything being lazy loaded if it’s above the fold.
The metrics and how they’re calculated are questionable. We ended up optimizing for Google and removed the lazy load (ignoring that we think it’s a better UX to lazy load that specific gif).
You might be able to turn a 300kB GIF into a much smaller encoded video; as long as it doesn't have audio, you can autoplay it.
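Something like the following, with the file converted from the GIF (names hypothetical, e.g. via ffmpeg): muted is what makes autoplay permissible in current browsers, and playsinline keeps iOS from forcing fullscreen.

    <!-- behaves like a GIF, usually at a small fraction of the size -->
    <video autoplay muted loop playsinline width="600" height="400">
      <source src="/img/demo.webm" type="video/webm">
    </video>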
2) Your case, where you actually add value-content with the gif.
Now the speed metrics are just that, speed metrics they "report" in isolation from "content".
So now my question is: Is Google's OTHER-content-signals good enough to overcome any penalty that might have been applied because of the speed ?
As an aside, the other replies to your comment is very telling of the HN audience. Many of which aren't knowledgeable in the webdev field, but "they did it in the 90's", "use bleeding-edge tech" and "everything you're doing is wrong". Also IFRAMEs, hahaha.
So you're stuck with 1 shitty CPU core, and you're stuck sharing it with the browser & JS runtime (yes there's of course multithreaded aspects to the browser itself, but you're still sharing an awful lot of CPU time on your single thread with the browser). 1/6th to 1/8th the performance of the phone is the most you can achieve if you're lucky. That's a fucking terrible starting point, and nobody should be surprised the mobile web sucks ass as a result.
(where user = advertiser)
Advertiser == 0.6 * customer
Their revenues are increasingly diversified; Google receives far more dollars from me and my company via non-ad revenue - more than $1000/month now and growing (re: gcloud).
Largely, people prefer the perception of a free internet and wouldn't pay the prices it would cost if direct payments were made.
I believe it's in the $x00s per year.
Do you have any reasoning behind this, or are you taking the people who killed the product at face value? If I were to guess, that product (I have no idea which one you're referring to) was designed to fail, like YouTube Premium.
To which service do you refer?
I think Google would love to block ads more aggressively, but the conflict of interest is so obvious it is going to be raining lawsuits the instant they do that.
Likewise Google Contributor did not reduce tracking, which is why I never signed up.
People pay for content all. The. Time.
> 1. Smaller assets are ideal.
> 2. Minimalistic design is necessary.
This doesn't sound right to me. Aren't the three new page metrics mostly targeting what happens when the page initially loads?
> Largest Contentful Paint (LCP): measures loading performance. To provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts loading.
> First Input Delay (FID): measures interactivity. To provide a good user experience, pages should have a FID of less than 100 milliseconds.
> Cumulative Layout Shift (CLS): measures visual stability. To provide a good user experience, pages should maintain a CLS of less than 0.1.
The first two are about initial loading. For the last one, you can avoid layout shift by e.g. reserving space for images that are yet to load fully.
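Reserving space can be as simple as giving images their intrinsic dimensions, which lets the browser compute the box before the file arrives (dimensions here are made up):

    <!-- width/height fix the aspect ratio up front, so nothing shifts on load -->
    <img src="/img/hero.jpg" width="1200" height="600" alt="Hero image"
         style="width: 100%; height: auto;">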
For example, it sounds like your page could load 100MB of images with the most complex design ever, and it would get a good score as long as the initial viewport of the page displays quickly, is interactive quickly, and doesn't jump around as it's loading.
They sound reasonable metrics to me in terms of fighting bloat but with flexibility for web developers (as opposed to AMP). Who gets to decide the metrics is another issue though.
Having said that, what exactly do customers want?
They want the best experience on whatever device they're on. This is 2020, there's so much that has happened since the 1990s. We can't simply keep using standards from the 1990s.
> 22.82 Mbps will reliably download very complex web pages nearly instantaneously.
The author needs to come down from his high horse and use the internet in developing countries. I was in India the other day for a client meeting, and I was on one of the largest networks there. I had a subscription to the NYT, and I tried to load an article, and whoa: it took three full minutes for the browser to load the article to the point where it was even barely readable. I'm not saying the network in India is slow; I'm saying that even on the best networks, when you're travelling, your speeds will be in kbps. If we don't have strict standards, the sizes of these pages will only grow and grow.
Later that day, I loaded the same article from my desktop again. The site made a gazillion requests in the network tab to so many advertising vendors, and each of them had consistently sizeable assets. More than being offended at being sold out despite a paid subscription, I was offended by how ridiculously unoptimized sites like the NYT are, despite being a popular, large-scale online publisher.
I'm happy that sites like the NYT will be penalized if they don't provide their users a good experience.
Sometimes you want to make a slow website that doesn't fit well on a phone screen?
Leaving aside the fact that you can of course do that, and that if I'm using a search engine on my phone I probably (usually?) don't want to look at your slow site that I have to horizontally scroll...
> Modern web design principles are very rarely directed at regular people looking to make a website on something they are interested in. Instead, the focus is on creating websites that perform well:
> Don't use too many colours. Write short, catchy headlines. Don't let content be too long. Optimise for SEO. Produce video content, attention span is decreasing. Have an obvious call to action. Push your newsletter. Keep important information above the fold. Don't make users think. Follow conventions.
All that's true to some extent if you're making a product on the web and you have a few seconds to hook a customer before they move on. If you're making a website for enthusiasts in some niche, though, content is your draw and you can worry less about some of these things.
Sometimes you should have the choice of making a slow web site that doesn't fit on the phone screen. A large corporation shouldn't dictate how you present your information.
Sometimes you want to put a document on the web without having to pay someone to run on Google's treadmill of changing standards and policies for the rest of your life.
I'll take a crappy-looking web site with good content over an SEO-pumped disaster that provided no information.
Craigslist, eBay, and dozens of others didn't get to be huge because of their good looks.
Not every destination started as a web search.
Case in point: the list of links on this very site that we're all commenting on. This discussion didn't start with a Google search. Google was never involved at all. Not everything on the web goes through Google search. Nearly all searches go through Google, but there's a hell of a lot more to the web than searches.
As also evidenced by some of the largest & most visited sites driving a ton of non-search traffic - like reddit. And facebook. And twitter. And etc...
Oh, it's anything but arbitrary. It's totally strategic, which is why people pushing back against them being able to dictate web policies is exactly the right thing to do.
I now see people link these "amp" web links, where the domain you are linking to isn't the domain you are trying to reach.
That's one step away from somebody seeing
Are you content with the users that type https:// before your site name and never make a typo?
- Small size is not always better, especially when the wait may be worth it for the audience
- Minimalistic design is not always necessary, and in fact it's a good idea to take some risks in the other direction, if it could result in a desired outcome
- Google doesn't have an intrinsic right to dictate best practices. In fact, the idea of a single set of best practices runs counter to the spirit of the web, in which a website can be a creative or cultural experience which transcends convention.
- In addition, on that last item: The web _can be_ more than what it is now. So monolithic standards could easily get in the way.
- Google is going to hold an event and tell visitors how to conduct "modern web development." The author is not comfortable with Google's singular focus on Google's preferred standard and mode of web development being The Only Way, so they are going to attend and push back.
It's a good post. Even the headline by itself resonates in a lot of different ways, especially if you've spent a lot of time on the receiving end of Google's dev messaging.
Speaking more broadly, quite often the blanket design advice given to web enthusiasts and subjective-values-driven creatives comes from the economic side of the web. It starts from "don't annoy your users" and draws a (suspiciously) straight line from there directly to "Largest Contentful Paint". This represents a potentially huge, complicating energy drain for a project that may have a lot of other important design parameters to start with.
In another way, it changes the question. The web wasn't designed as a way to get your customers really specific information as quickly as possible. There's so much more to it than that.
A lot of HN users are researchers from the economics-first side of things though, not artists, so this may be a bit hard to understand. It's like hearing yet another friend tell you they've signed on for a liberal arts degree.
Such researcher-consumers (who incidentally write some amazing online reviews, but that's another story) are also well known for getting frustrated at the slightest delay or sales pitch. Art on the web can end up, therefore, being perceived as a broken web experience, a bad sales pitch for an ephemeral and vapid product, as opposed to something which inspires or changes viewpoints. "I can't even find the site map!"
The web doesn't even have to be users-first--quite often what's best is a series of standards and compromises that alternately put creators and users first at different points.
The priorities shouldn’t be dictated by a single company.
I don’t fault Google for what they are doing (besides the AMP stuff). The problem the author is talking about is largely because google is the de facto way pretty much the entire world finds new stuff.
If website discovery was driven by blogs and the like, and people then added the stuff they liked to their RSS readers, then the author's issue would be resolved.
Unfortunately that world doesn’t exist anymore and Google’s preferences end up dictating what all websites prioritize.
1 and 2 are totally wrong, and on 3, Google is moving away from AMP and letting normal pages rank.
I have wasted too many hours of my time on conference calls with people like this.
I really hope they have plans to improve this or find an approach that works as a middle ground for generic SEO content.
When I run that search on Google in Canada, there is a Panasonic link on the first page (third link), but it's for the US store listing, which has less info on it than the normal product page.
But as you say, search is in a horrendous state. So is video discovery on Youtube. It seems like my front page recommendations on Youtube have turned to garbage in the last 12 months.
Maybe ten or fifteen years in the future, paying to Google (or Facebook, or...) will be the only way to have your web page delivered to anyone.
I mean, why should Google display your page to anyone for free, if someone else is willing to pay to have their page displayed instead. It's not like people are going to switch en masse to something else, just because Google results become slightly less relevant. And when paying becomes the new normal (e.g. it will become normal to pay to have Hacker News on the first page for "Hacker News"), the results will return to be more relevant again.
I use Youtube in a very specific way: I subscribe to a bunch of channels that I like and only watch what those people post. My recommendations tend to be very focused around things related to my interests. Occasionally I will look up a random video for whatever reason and I'm always careful to delete it from my history unless I actually want more recommendations related to it.
But you're right, they outright suck. They are definitely dynamic, because I only have to show my son a single Roblox video to change them, but at the same time there is stuff that repeatedly shows up that I have no interest in. Even stuff that I've told YouTube I'm in no way interested in...
What I REALLY miss is a random category - for stuff that I would never otherwise see.
Youtube can make me do work to improve my experience, but I do have another option: stop using Youtube's front page and just discover stuff via Reddit or search engine results.
I had a similar experience with Google News trickling up garbage news sources, and I spent way too much time blacklisting sources, and have basically given up on it.
The onus is on the search engine to produce good results, because they're the one providing the service. Hell, that's why I started using Google around 2000, they had the best first page search results.
I ran into this over the weekend.
I had a question about something I took a photo of in a small village, and I knew a priest I met there would have the answer.
I went to Google to find the parish web site, and it's simply not in Google. Not at all. Lots of sorta-kinda matches in places thousands miles away, even when I specified the very unique village name, state, and ZIP Code.
So I dug through the pile of dross that comes home with me after I travel and found an old parish bulletin that had the web address printed on it. It turns out the church has a web site — a pretty nice one — but Google doesn't know about it. Or it's so heavily down-rated by Google that it didn't come up in the first seven or eight pages of results, no matter how I formed the search query.
So I went to the parish's web site to see if he has an e-mail address. The web site indicated that he is no longer the priest there. So I put his name into Google to see where he'd been moved. (I've been told by other priests that unless you're tied to a school or other institution, priests in America are shuffled around every four to six years.)
His name is fairly uncommon, so I expected Google to show me links to church bulletins or announcements about arrival or going away parties, or even church employee rosters, of which there are thousands on the web. But instead, it was page after page after page of SEO spam for "Complete phone number for $priest_name!" "Is $priest_name cheating on you? Find out!" "$priest_name in your city want to meet you!" The few I clicked through just in case had none of the promised information.
I gave up on Google, and phoned the parish office when it opened this morning. He retired six months ago. Here's his phone number.
Time wasted with the company that promised to take all of the world's information and make it available to everyone: 20-40 minutes.
Time spent getting the information the old fashioned way: 2 minutes.
Thanks for nothing, Google.
Now there is just spam.
I bet they simply optimized for the average person on the web, and we're outliers.
I wonder how much search abuse / manipulation they are dealing with, given the current situation and the upcoming elections?
[badgers in montana], [while loop python], [how to fix noisy refrigerator], [nude stallman] all do just fine. Are there veins of queries that are particularly polluted? Even [homeopathy covid] is pretty good.
SEO hackers are taking advantage of Google's biggest weakness: they are an advertising company that makes money on search ads.
If search were run by a nonprofit like Wikipedia or the Internet Archive, it could be made to filter out sites with ads on them. SEO hackers would have no way to get their foot in the door unless they drop all of their ads, removing their own revenue source. This would create a space on the internet for non-commercial activity to flourish without ads and all of the tracking garbage that comes with it. A form of cultural ad blocking that, in the long run, could be a lot more effective than the technical arms race of client-side ad blockers.
I would prefer not to see W3Schools, Tutorialpoint or GeeksForGeeks to be so highly ranked. This problem is even worse when I search for ML stuff, with pages upon pages of Medium blogspam.
While W3Schools is fine, I guess... I have to ask if they truly represent the best result for my search query.
The web is an open network. Anyone can share content as well as indices. I know it's mostly impossible to beat Google, but niche indices have their place to shine.
For example, here, HN is an index of hand-picked (mostly great) content, and there are multiple "unofficial" HN variants. See? The web is very diverse and free.
> 1. design a flawed API (it's fine! APIs are hard)
> 2. ship it in the most-used browser, despite objections
> 3. get cross-browser working group to fix the API
> 4. oops, too late, that would break the web
Is this a bad thing?
In a way, yeah, they made the Web safer. And cleared it out almost completely in the process.
Should Google not adjust their algorithms to demote bad or harmful content? There are two sides of the coin.
Is it really much different from the security cat and mouse game?
Would you prefer no search or what we had before Google?
What would search look like if they did nothing? What would the world have looked like over the last 20 years?
Then Google started adding more portal-like features and using NLP to perform searches on what they wished you had entered in the search field. They also started tailoring search results to your past search history and whatever profile they constructed about you.
Some people might want a natural language question answerer, others might want a search of only current events, but some definitely just want to search for content.
Google has catered to the former at the expense of the latter. They try to stuff all their different modes into a single text box and display them on a single results page. There's no segmentation into clearly distinct functional modes.
For me, I'd love to see Google go back to actual web searches. Support Boolean operators and term quoting. Return only search results, without offering me shit they think I want to see.
I abandoned Google as my main search engine in favor of DuckDuckGo years ago. While not perfect they at least show me actual search results. If I want other modes of search like video or news I can click those tabs but they don't force that content on me.
We might not see any obvious candidates for that, as Google was dominant and they had no market access.
Imagine a timeline where Palm ruled the phone market, and we’d be here asking rhetoric questions about what it would have been if Palm didn’t do it.
With drugs, we weight that trade-off heavily towards higher quality over new entrants. The question is, is that an acceptable trade off on the web?
Reading this, I now think I should go put AMP on my personal blog. Before that, I needed an HTTPS cert. Before that, I needed to confirm I owned the site to Google.
All things which may make the web better, but they certainly increase the operational burden and may discourage others from bothering in the first place.
The burden is necessary given the proliferation of bad actors
I would not recommend AMP, but there are dozens of standards that are not part of what the public sees, low-level enough that they don't care to see them. These are the important things Google has caused the world to adopt as best practices. You don't have to have HTTPS, but then you are telling users you don't care about their security. This was largely pushed by green/red indicators in the URL bar.
This heavily pushes the web toward centralization, which I see as a really bad thing, in many ways the cure is worse than the disease. It's happening in so many ways (have you tried standing up your own email server?) and this sort of thing drives it the hardest.
I don't think Google is all bad, in fact I appreciate a lot of what they have done for the world (you mention some of these things). But as with nearly everything in life, it's complicated. I worry a lot that we're headed towards an internet where the individual is at the mercy of organizations to allow them to have a voice.
But the bad actors don't really abuse technical standards. They abuse links - the only part of Google's list of criteria that has remained largely untouched over the last two decades and is still the ranking factor. So much so that I'm confident that, if ranking-factor relevance were a search result, the first page would just be the same result over and over: links.
That's the reason for the centralization of power: those with more links will be ranked at the top, will have more visibility, and, you guessed it, will get more links. Guess what happens: they then start to sell links, or they rent out subdomains and folders on their websites.
Google, as a company, does not care about bad actors. Some teams in Google do, but they are not the ones setting the policies.
I'll visit YouTube for free entertainment sometimes, but I hardly depend on it for anything, and I really wish more creators posted their content elsewhere, even at a price.
What has Google actually created that's successful, in house, other than search & gmail?
Here you have Monty Python asking that https://www.youtube.com/watch?v=Y7tvauOJMHo
Most of the folks who claim never to use Google, or that Google never makes anything worthwhile - almost always hot air.
I do remember Hotmail and MSN and AOL, which preceded Gmail; Gmail was really pretty great when it came out.
I think Microsoft may oddly be coming back a bit after the Ballmer years. I'm not a huge fan of Facebook taking over (yuch).
Google took on and basically CRUSHED some major market players and powers (Microsoft basically lost the entire web and handset markets from a monopoly position) as a result of google service offerings.
Check out how much money they pumped into firefox as well - that used to be a major part of their push vis a vis IE.
Regardless of how much google "sucks" - they seem to have at least some things that folks like.
They have changed tune and I now call them the biggest COSS company.
I'd thought they were doomed (ie, no linux cloud offering expected, no multi-platform or open source offerings). Windows Mobile did crash and burn, but Azure actually has linux containers (most common type I think) and their other stuff is sometimes open source / linux friendly now - (SQL Server??) it's whiplash for those of us who were around for the EEE period.
Because of the market dominance
I don't recall them advertising until recently
By purchasing potential competitors before they got big.
Google launched AdWords in 2000
But I agree, not your fault for not looking.