
Google Is Not God of the Web - davidblue
https://bilge.world/google-page-experience
======
gok
> 22.82 Mbps will reliably download very complex web pages nearly
> instantaneously.

The author may be unaware of how ridiculously huge web pages have gotten. I
just loaded Wired.com, scrolled down and let it sit for a few seconds. It
downloaded 96.2 MB, requiring over 33 seconds on one of those average
connections. On a pay-as-you-go data plan, it would have cost about a dollar
just to load that one page. The front page has about 500 words of content. It
also covered 100% of the content with ads, twice.
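
For reference, the rough arithmetic behind those figures (the $0.01/MB rate is an assumption for a typical pay-as-you-go plan, not a quoted price):

    // back-of-the-envelope math for the figures above
    const pageMB = 96.2;
    const mbps = 22.82;                   // the article's "average" connection speed
    const seconds = (pageMB * 8) / mbps;  // ~33.7 seconds to download
    const dollars = pageMB * 0.01;        // ~$0.96, assuming $0.01 per MB
    console.log(seconds.toFixed(1) + ' s, $' + dollars.toFixed(2));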

This is unsustainable. Web developers have utterly squandered all efficiency
gains of the last 30 years, and then covered their grossly inefficient work
with impossibly annoying, privacy-invading advertising. Google should be
applauded if they make these wasteful developers suffer monetarily until they
shape up. They've already stolen untold amounts of time and energy from us
all.

~~~
ricree
>Web developers have utterly squandered all efficiency gains of the last 30
years, and

Loading Wired.com with uBlock and no JavaScript, the page comes in below 1.5 MB
for me, with most functionality seemingly intact (the front page looks mostly
normal and I can load articles whose text appears to be complete). The bulk of
that seems to be fonts and images, which are probably unavoidable for a media
site.

Some reasonable NoScript whitelisting for Wired.com and a few others (out of
the 12 total that NoScript shows me) gives a page size that's still under 5 MB.

Looking at the full page with everything loaded and unblocked, the biggest
offender here seems to be not web design but an aggressively downloading
autoplay video on the front page. Without that, the page itself is - while not
necessarily great - at least reasonably bearable.

Truth be told, I'd started this post intending to blame advertisers, and there
is still some merit to that, since even before the video kicks in the various
third-party scripts balloon the page size several times over compared to the
minimal functional one that loads with everything blocked. But in this case it
really does seem to be a wasteful media stream playing without regard to
whether anyone wants it or not.

~~~
rndgermandude
With uBlock Origin it was 1.5 MB for me; without, it was 3.8 MB, on Firefox,
coming from Germany. Both numbers are still pretty ridiculous for what's
actually visible on the page.

Once you scroll, however, things get messy no matter what, because of the
"Scott Adkins Answers Martial Arts Training Questions From Twitter" autoplay
video they have right now. That quickly ate away another 30 MB, and the video
wasn't even visible (I had scrolled past it).

~~~
sergeykish
I browse without 3rd-party and 1st-party scripts [0]. I wanted to praise my
setup, but it does not work well with Wired:

1.37 MB / 722.93 KB transferred, Finish: 6.57 s

versus uBlock default

8.62 MB / 4.49 MB transferred, Finish: 28.38 s

A clean setup (nothing blocked) mostly increases load time:

11.58 MB / 5.79 MB transferred Finish: 1.13 min

^ checked with "Disable Cache".

Not much content is delivered for such a big HTML file:

660.46 / 167.60 KB transferred

because it's mostly inline script:

    $$('script').map((s) => s.textContent).join('').length
    // => 568681

And fonts.css is mostly inline fonts:

127.70 / 100.15 KB transferred

[0] uBlock Origin rules:

    * * 1p-script block
    * * 3p block
    * * 3p-frame block
    * * 3p-script block
    * * inline-script block

or uMatrix rules:

    * * * block
    * * frame block
    * 1st-party css allow
    * 1st-party frame allow
    * 1st-party image allow

------
yongjik
(Disclaimer: previously worked at Google search)

I think some commenters are attributing to Google an ulterior motive, whether
ill- or well-intentioned, separate from its core business. But in this case no
such motivation is necessary.

Basically, Google wants its users to be satisfied - otherwise it will lose to,
say, Bing. So it measures user satisfaction - e.g., if a user clicks on a
Google result and hits the back button within three seconds, that's a strong
signal that the user was not satisfied. And Google tries very hard to increase
this "user satisfaction" (and other similar metrics), because not only does it
help Google's business, it also improves the service itself.

And, guess what? When a page takes fifteen seconds to load, lots of people hit
the back (or close) button. Showing such a page _is_ giving the user a bad
experience. Unless there are literally no alternatives, it makes sense for
Google to "penalize" such a page.

Of course no metric is perfect, so it will occasionally backfire and penalize
a great page that takes thirty seconds to load. But that's life.
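
To illustrate the signal (a toy sketch; the three-second threshold is from
above, and the log format is entirely hypothetical, not Google's actual
pipeline):

    // toy sketch: flag result clicks where the user bounced back within 3 s
    // `clicks` is a hypothetical log of {clickedAt, returnedAt} ms timestamps
    const isQuickBounce = (c) => c.returnedAt - c.clickedAt < 3000;
    const bounceRate = clicks.filter(isQuickBounce).length / clicks.length;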

~~~
freediver
How does this help Google not lose to Bing, when the change would equally
improve the Bing experience (a rising tide lifts all boats)?

~~~
Flimm
Not all websites are going to improve their performance, even with Google's
incentive. If Google ranks fast websites better than slow ones, that gives
Google an advantage over any competitor that doesn't, as long as slow websites
still exist.

~~~
Andrew_Quentin
And it creates monopolies, perfect for gulag control of the people.

------
pornel
There are lots of questionable ways in which Google owns the web (AMP,
reCaptcha harassing users without Google cookies, Chrome's "Fire And Motion"
web standards strategy), but this one isn't one of them.

In the webdev community it's well known that good performance is very
important for user satisfaction, and that's backed up by research. There are
no ideal metrics, and unfortunately every single one of them has some dumb
edge cases. You could endlessly bikeshed which metrics could be better, but
this particular set is not unreasonable.

It makes sense for a search engine to pick websites that not only have the
relevant content, but can also actually get that content on screen without
frustrating users with slow-loading crap that makes the browser freeze and
stutter.

Keep in mind that your high-end desktop web experience is in the minority. The
web is now mostly mobile, and mobile is mostly low-end Android. That's a shit
sandwich, and it desperately needs an intervention.

~~~
no_wizard
I don’t follow the “Fire and Motion” bit on web standards. Mind an
elaboration?

~~~
antod
I presume it's a reference to this:
[https://www.joelonsoftware.com/2002/01/06/fire-and-motion/](https://www.joelonsoftware.com/2002/01/06/fire-and-motion/)

i.e. constant standards churn to keep your competitors busy.

------
bosswipe
What's infuriating to me about these types of "signals" in the search rankings
is that they have little to do with the content I'm searching for. Google will
hide results that I might find useful because the webmaster hasn't kept up
with whatever Google decided is today's best practice. How about ranking based
on the best source for what I'm looking for?

~~~
zozbot234
Google's ability to surface useful results has been thoroughly defeated by SEO
spammers. To a lesser extent, the same is true of other search engines (Bing,
etc.) though Google is the foremost SEO spam target for obvious reasons. Given
that state of things, there is some sense in promoting more user-friendly
pages that are thus a bit less likely to be SEO spam.

~~~
Multicomp
I wonder if Google could combat this by, every X searches, swapping page 1
with, say, page 5? Or by giving users the option to jump straight to a given
search-result page by default?

That way, the SEO-ignorant sites that actually have the info you want, but get
pushed out of the way by SEO spam, would have some chance at traffic.

I have never written a search engine, so this comment is worth about 1 kb.

~~~
userbinator
For some queries I instinctively jump a few pages ahead because I know the
first few pages are going to be absolutely filled with SEO spam. The remainder
is still not free of spam, but has a higher chance of containing what I want
to find.

------
gumby
Back in 1998 or so, people weren't enthusiastic about the Google search engine
just because its results were good, but also because the search page was
simple and fast.

Compare that to AltaVista or Yahoo, whose pages were larded with all sorts of
irrelevant links and ads _around_ the search results. Slow to load and hard to
visually navigate.

I still think the sparse pages are the best.

~~~
partdavid
I worked at one of these companies during this era, and yours is an
often-forgotten fact. It was a very long time before Google's search result
quality was any good. Their initial popularity was due almost entirely to page
load speed, which they included on every page to highlight it.

The second factor was being able to serve a large index with parallelized
querying, which was relatively easy for a newcomer with no user base to
engineer, and much harder for existing search engines trying to protect a
revenue stream. People often don't remember how late Google was to this
business and how much of a difference that page-speed indicator made.

~~~
enonevets
The exact same argument could be made for Chrome. Google was extremely late to
the browser game, and most of the initial switching was driven by how quick it
was compared to other browsers at the time.

~~~
gumby
Can you expand? Chrome seems to be popular but I don’t know the motivation for
it.

~~~
enonevets
I was replying to the parent comment above mine, stating that Google came late
to the browser game just as they did with search. If part of what initially
made Google appealing was that it was simpler and faster than other search
engines, the same was true when Chrome was introduced. Google explicitly
marketed Chrome's speed as a selling point. For example:
[https://www.youtube.com/watch?v=nCgQDjiotG0](https://www.youtube.com/watch?v=nCgQDjiotG0)

------
kickscondor
> There is a very reasonable argument for essential services like search
> engines and news websites to conform to/adopt standards like AMP, but for
> the rest of The Open Web, ingenuity and risktaking should be encouraged, not
> discouraged, for the true good of all Peoplekind.

Hadn't really considered this - because minimal page size is often such a
given - but, for instance, many amateurs don't yet know how to crush their
PNGs and such.

> [https://bilge.world/open](https://bilge.world/open)

Cool - thanks for this!

(As an aside, it's great to see a continuation of topics like this - which is
commenting on last week's article from Parimal Satyal. It makes this place
seem more like a forum.)

~~~
downerending
Actually, I'd be interested in an alternative web where ingenuity and
risk-taking would be utterly forbidden. Just HTML, and a very basic subset at
that. No JavaScript at all, no CSS.

So much of the web would be better and more universally usable without
"modern" cruft.

~~~
jbreckmckye
Who decides what is and isn't cruft? You?

Is data journalism cruft? Are web applications? Is Google Office cruft? What
about the web application my parents have been using to order groceries during
the pandemic - that has loads of JS, and loads of CSS to make it readable to
anyone over 40. Does that qualify as cruft?

Are dyslexia-friendly stylesheets cruft? Is Google Maps cruft? It's full of
JavaScript. Are browser games cruft? I played QuakeJS last night and had a lot
of fun with it. I was also using a WebXR 3D app the other day to preview a
rental property remotely - is being able to socially distance myself cruft?

It's all cruft until you ask the people who use it.

~~~
TeMPOraL
That it _works_ is table stakes. A web application your parents wouldn't be
able to use to order groceries because it was so broken wouldn't even be
discussed; it would be pulled off the Internet and replaced with something
that works.

The criticism about cruft is one level up: not about how to accomplish
something, but about how to do it in a way that isn't extremely wasteful of
both computer resources and the end user's time.

------
NickHirras
If you direct their web vitals tool to test itself
([https://web.dev/vitals/](https://web.dev/vitals/)), the report isn't great:

[https://lighthouse-dot-webdotdevsite.appspot.com//lh/html?ur...](https://lighthouse-dot-webdotdevsite.appspot.com//lh/html?url=https%3A%2F%2Fweb.dev%2Fvitals%2F)

~~~
bitpush
[https://en.wikipedia.org/wiki/Shooting_the_messenger](https://en.wikipedia.org/wiki/Shooting_the_messenger)

------
addicted
I think most people haven’t internalized that Google is no longer a search
engine but an answering engine.

A search engine tries to find all sorts of relevant information related to
your query - the more the merrier (it's searching, after all) - and then sorts
it in a way that puts the relevant results first. An answering engine, on the
other hand, tries to minimize the number of results. In an ideal world, it
would return only one thing, which tells you exactly what you want.

One example of this change is the fact that it's no longer useful to go beyond
the first page or so of your results, because anything down that low is
irrelevant as an answer and is probably discarded by Google anyway. That
wasn't the case when it was a search engine.

I'm not saying this is a bad thing. In fact, I suspect that the majority of
the time, the majority of people want an answer, not a multitude of results.
But I think this is what leads to Google Search changing in a way that does
not meet many people's expectations here.

It means Google emphasizes stuff that gets people answers quickly. They parse
text and reveal the answers on their page itself. And they are no longer very
useful for exploring.

~~~
dhimes
s/an answering engine/an advertising engine/

FTFY

~~~
bitpush
This meme is tired. We get it, Google makes money from advertising.

Your comment is the same as "NYTimes is an advertising company hurr durr
because they make money from advertising"

~~~
scollet
Well, that's a very rude comment. You are also using a false equivalence to
halt any discourse on the matter. Doesn't seem very safe.

~~~
dhimes
Yeah I think the Google trolls came through last night.

------
cj
One anecdote where their “Largest Contentful Paint” metric fails, and fixing
it degrades performance:

We have a large 300 KB animated GIF that takes up maybe 20% of the viewport
above the fold. The GIF demonstrates visually what our service does.

A couple of months ago, Webmaster Tools reported that page as “slow”, pointing
to the large image download. So we decided to show the first frame of the GIF
as a (30 KB) PNG file, and then swap in the GIF 2 seconds after the page is
fully loaded.

Except now the new “largest contentful paint” metric is failing on those
pages, because it includes the 2-second delay before the animated GIF is
swapped in. I guess technically they're not wrong in how they're calculating
it.

In fewer words: Google doesn't like anything being lazy-loaded if it's above
the fold.

The metrics and how they're calculated are questionable. We ended up
optimizing for Google and removed the lazy load (even though we think
lazy-loading that specific GIF is the better UX).
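
Roughly, the swap looked like this (a sketch; the selector and file paths are
made up):

    // sketch of the PNG-to-GIF swap; selector and file paths are made up
    window.addEventListener('load', () => {
      setTimeout(() => {
        const img = document.querySelector('#demo');  // the 30 KB first-frame PNG
        img.src = '/demo.gif';                        // swap in the 300 KB animated GIF
      }, 2000);  // this 2-second delay is what LCP now counts against the page
    });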

~~~
JoshTriplett
> We have a large 300kb animated gif that takes up maybe 20% of the viewport
> above the fold. The gif demonstrates visually what our service does.

You might be able to turn a 300kB GIF into a much smaller encoded video; as
long as it doesn't have audio, you can autoplay it.
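
Something along these lines (a sketch; the selector and path are
hypothetical); browsers generally permit autoplay for muted video:

    // sketch: swap the GIF for a muted, autoplaying video; paths are hypothetical
    const video = document.createElement('video');
    video.src = '/demo.mp4';    // the GIF re-encoded as video, usually far smaller
    video.muted = true;         // muted video is generally allowed to autoplay
    video.autoplay = true;
    video.loop = true;
    video.playsInline = true;   // play inline on iOS instead of going fullscreen
    document.querySelector('#demo').replaceWith(video);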

~~~
rawoke083600
Interesting case scenarios:

1) A site has a stupidly big GIF that shows the logo and staff parties,
basically adding nothing but sucking up bandwidth (not your case).

2) Your case, where the GIF actually adds value as content.

Now, the speed metrics are just that - speed metrics, "reported" in isolation
from content.

So my question is: are Google's other content signals good enough to overcome
any penalty that might have been applied because of speed?

------
entropyneur
As a web developer who has recently spent an ungodly amount of time trying to
make my pages meet Google's impossible standards for qualifying as "fast" on
mobile, I sympathize with the author's point. But I think he's missing the
even bigger picture. Personal computing is mobile now. And even though phones
have as many megabytes, kilotonnes, little clowns, or whatever device
greatness is measured in these days, browsing the web on them is still slow as
hell. And I would seriously entertain the suggestion that it's all Apple's
evil plot if every Android phone I ever used didn't suck donkey balls for
browsing the web. Whatever the reasons, what's at stake now is not the web's
diversity but its relevance altogether. I'd rather live in a world where the
web is needlessly optimized for performance than in a world of apps.

~~~
kllrnohj
It doesn't help that ~every mobile device is 6+ cores while the web still
largely pretends that there's only a single CPU core & that that core is
getting faster. The web should have started adapting back in 2006 when this
trend really become common reality, but it didn't.

So you're stuck with 1 shitty CPU core, and you're stuck sharing it with the
browser & JS runtime (yes there's of course multithreaded aspects to the
browser itself, but you're still sharing an awful lot of CPU time on your
single thread with the browser). 1/6th to 1/8th the performance of the phone
is the _most_ you can achieve if you're lucky. That's a fucking terrible
starting point, and nobody should be surprised the mobile web sucks ass as a
result.
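
The one escape hatch the web does offer is Web Workers; a minimal sketch of
pushing heavy work off that single contended thread (the file name and
functions are placeholders):

    // main thread: hand the heavy loop to a worker instead of blocking the UI core
    const worker = new Worker('heavy-work.js');   // placeholder file name
    worker.onmessage = (e) => console.log('done:', e.data);
    worker.postMessage(input);                    // `input` is a placeholder

    // heavy-work.js, running on its own thread:
    onmessage = (e) => {
      postMessage(expensiveTransform(e.data));    // placeholder for the real work
    };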

------
tschellenbach
It's funny that Google is so large that one way to grow its business is to
improve the user experience of the internet as a whole.

~~~
monadic2
If this were true, Chrome would ship with an ad blocker and they would accept
cold hard cash for their services.

~~~
verdverm
They tried this, I used it, no big adoption, product got killed.

Largely, people prefer the perception of a free internet and wouldn't pay the
prices it would cost if direct payments were made.

I believe it's in the $x00s per year.

~~~
reaperducer
_They tried this, I used it, no big adoption, product got killed._

To which service do you refer?

~~~
gundmc
I believe it was Google Contributor:
[https://en.m.wikipedia.org/wiki/Google_Contributor](https://en.m.wikipedia.org/wiki/Google_Contributor)

~~~
verdverm
Yes, that is the one, thank you

------
seanwilson
> web.dev is operating on some irritating assumptions:

> 1\. Smaller assets are ideal.

> 2\. Minimalistic design is necessary.

This doesn't sound right to me. Aren't the three new page metrics mostly
targeting what happens when the page initially loads?

[https://web.dev/vitals/](https://web.dev/vitals/)

> Largest Contentful Paint (LCP): measures loading performance. To provide a
> good user experience, LCP should occur within 2.5 seconds of when the page
> first starts loading.

> First Input Delay (FID): measures interactivity. To provide a good user
> experience, pages should have a FID of less than 100 milliseconds.

> Cumulative Layout Shift (CLS): measures visual stability. To provide a good
> user experience, pages should maintain a CLS of less than 0.1.

The first two are about initial loading. For the last one, you can avoid
layout shift by, e.g., reserving space for images that have yet to fully load.

For example, it sounds like your page could load 100 MB of images with the
most complex design ever and still get a good score, as long as the initial
viewport of the page displays quickly, is interactive quickly, and doesn't
jump around as it's loading.

They sound like reasonable metrics to me in terms of fighting bloat while
leaving flexibility for web developers (as opposed to AMP). Who gets to decide
the metrics is another issue, though.
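
If you're curious what feeds the CLS number on a given page, a minimal console
sketch using the Layout Instability API (supported in Chromium-based
browsers):

    // log the layout shifts that contribute to CLS (Chromium-based browsers)
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        if (!entry.hadRecentInput) {  // shifts right after user input are excluded
          console.log('layout shift:', entry.value, entry.sources);
        }
      }
    }).observe({ type: 'layout-shift', buffered: true });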

------
neya
If not Google, then someone else must step in and set some standards. Either
way, I don't see anything wrong: their platform, their rules. Don't like their
rules? Don't use them. When fewer people use their products, they will listen
to what customers want.

Having said that, what exactly do customers want? They want the best
experience on whatever device they're on. This is 2020; a lot has happened
since the 1990s. We can't simply keep using standards from the 1990s.

> 22.82 Mbps will reliably download very complex web pages nearly
> instantaneously.

The author needs to come down from his high horse and use the internet in a
developing country. I was in India the other day for a client meeting, on one
of the largest networks there. I had a subscription to the NYT, and when I
tried to load an article - whoa - it took three full minutes for the browser
to load it to the point where it was even barely readable. I'm not saying
India's networks are slow; I'm saying that even on the best networks, when
you're travelling your speeds will be in Kbps. If we don't have strict
standards, the sizes of these pages will only grow and grow.

Later that day, I loaded the same article from my desktop. In the network tab,
the site made a gazillion requests to advertising vendors, each with
consistently sizeable assets. More than being offended at being sold out
despite a paid subscription, I was offended by how ridiculously unoptimized
sites like the NYT are, despite being a popular, large-scale online publisher.

I'm happy that sites like the NYT will be penalized if they don't provide
their users a good experience.

------
magicalist
Not sure what this article is arguing.

Sometimes you want to make a slow website that doesn't fit well on a phone
screen?

Leaving aside the fact that you _can_ of course do that, and that if I'm using
a search engine on my phone I probably (usually?) don't want to look at your
slow site that I have to horizontally scroll...

> _Modern web design principles are very rarely directed at regular people
> looking to make a website on something they are interested in. Instead, the
> focus is on creating websites that perform well:_

> _Don't use too many colours. Write short, catchy headlines. Don't let
> content be too long. Optimise for SEO. Produce video content, attention span
> is decreasing. Have an obvious call to action. Push your newsletter. Keep
> important information above the fold. Don't make users think. Follow
> conventions._

All that's true to some extent if you're making a _product_ on the web and you
have a few seconds to hook a customer before they move on. If you're making a
website for enthusiasts in some niche, though, content is your draw and you
can worry less about some of these things.

~~~
reaperducer
_Sometimes you want to make a slow website that doesn't fit well on a phone
screen?_

Sometimes you should have the choice of making a slow web site that doesn't
fit on the phone screen. A large corporation shouldn't dictate how you present
your information.

Sometimes you want to put a document on the web without having to pay someone
to run on Google's treadmill of changing standards and policies for the rest
of your life.

I'll take a crappy-looking web site with good content over an SEO-pumped
disaster that provides no information.

Craigslist, eBay, and dozens of others didn't get to be huge because of their
good looks.

~~~
modeless
Sometimes you can choose to make a website whose sole purpose isn't to attract
Google Search traffic, and that's OK! Not every website has to rank in Google
Search to be useful.

~~~
user_0x
but that isn't really true. google gets ~80% of the searches on the web. that
means that if someone who isn't conforming to google's frankly arbitrary
ranking system is being censored, to use strong language.

~~~
kllrnohj
> but that isn't really true. google gets ~80% of the searches on the web.

Not every destination started as a web search.

Case in point: the list of links on this very site that we're all commenting
on. This discussion didn't start with a Google search; Google was never
involved at all. Not _everything_ on the web goes through Google search.
Nearly all _searches_ go through Google, but there's a hell of a lot more to
the web than searches.

As also evidenced by some of the largest and most visited sites driving a ton
of non-search traffic - like Reddit. And Facebook. And Twitter. And so on...

------
skynetv2
Anyone who can force web developers to make more responsive, small, junk-free
sites that focus on the user rather than on ads will have my support. I don't
see anyone else attempting to force a change. The author is mistaken in their
views.

------
walshemj
I thought this was going to be a reasonable article, but it's just another
whine from a designer who wants to add another 6 MB of pretty animation to a
web page.

Points 1 and 2 are totally wrong, and on point 3, Google is moving away from
AMP and letting normal pages rank.

I have wasted too many hours of my time on conference calls with people like
this.

------
gdsdfe
I keep telling people that search shouldn't be a monopoly, but they just keep
looking at me like I'm a crazy person.

------
skilled
Google Search is in a horrendous state right now. Search results have been
getting worse each year, with interesting information being buried 20 pages
down.

I really hope they have plans to improve this or find an approach that works
as a middle ground for generic SEO content.

~~~
blueboo
This seems like a common claim here on HN. I would expect the creativity of
the world's SEO hackers to outstrip any single team, so I expect you're right.
But I struggle to find queries where interesting information isn't in the top
handful of links.

[badgers in montana], [while loop python], [how to fix noisy refrigerator],
[nude stallman] all do just fine. Are there veins of queries that are
particularly polluted? Even [homeopathy covid] is pretty good.

~~~
slantyyz
I tend to think search results could benefit from a little curation.

For example, when I search for anything HTML-, CSS-, or JavaScript-related,
W3Schools manages to be the top link in many cases.

While W3Schools is fine, I guess... I have to ask whether it truly represents
the best result for my search query.
~~~
pricecomstock
It feels a little dirty, but this Firefox extension will fix that for you:
[https://addons.mozilla.org/en-US/firefox/addon/hide-w3school...](https://addons.mozilla.org/en-US/firefox/addon/hide-w3schools/)

------
aabbcc1241
Google Search is an index; it can make whatever decisions it likes. If we
don't like it, we can use other indices - even better, we can build more
alternatives.

The web is an open network. Anyone can share content as well as indices. I
know it's mostly impossible to beat Google, but niche indices have their place
to shine.

For example, HN is an index of hand-picked (mostly great) content, and there
are multiple "unofficial" HN variants. See? The web is very diverse and free.

------
throwaway810
I find this tweet about how Google approaches web standards illuminating. To
quote:

> 1\. design a flawed API (it's fine! APIs are hard)

> 2\. ship it in the most-used browser, despite objections

> 3\. get cross-browser working group to fix the API

> 4\. oops, too late, that would break the web

[https://twitter.com/Rich_Harris/status/1220412711768666114](https://twitter.com/Rich_Harris/status/1220412711768666114)

~~~
ksec
>5\. Rally a group of developers and PR to bash WebKit as the only one not
implementing those flawed APIs, and brand it the new IE.

~~~
throwaway810
Too true. Not to mention that many of the APIs Google introduces are prone to
tracking and spam.

------
verdverm
Google has largely made the internet a safer and more efficient system by
pushing standards through their market dominance.

Is this a bad thing?

~~~
mhh__
What did Google ever do for us?

~~~
verdverm
How many Google services and systems do you rely on each day?

~~~
dooglius
I'm pretty sure it's a reference to
[https://www.youtube.com/watch?v=Qc7HmhrgTuQ](https://www.youtube.com/watch?v=Qc7HmhrgTuQ)

~~~
mrkramer
My exact thought.

------
skywhopper
I agree with the general thrust of the article, but not a lot of the details.
And then there's this:

> Our phones have as much RAM as my “studio” work desktop

This is unlikely to be true. From what I can find, the latest iPhone has 4 GB
of RAM, and Samsung is up to 8 GB (and they are growing this stat fast, to be
sure), but no "studio" desktop made in the last five years is going to have
that little.

> 22.82 Mbps will reliably download very complex web pages nearly
> instantaneously.

This is definitely not true. It is true that the download time itself is not
large, but between DNS and TLS latency and the fact that most "complex" web
pages are built from assets on dozens of different servers, your actual wait
time can be quite long. And even discounting that, the render time is probably
longer than the download time. If your page is that complex, I hope it's very
beautiful to look at.

------
urda
And we are allowing it. From the dangerous tactic of letting them MITM all
users via the AMP platform, to pushing their features only in their own
browser - a browser which, I may remind you, was pushed to success by abusing
their market position.

Had Google pulled this in the '90s, they would have been attacked like
Microsoft.

------
rawoke083600
I see a lot of hate for SEO. Sure, it can be used for crappy things, but so
can a lot of business functions. I see a lot of complaints from people saying
"oh, but this stupid page outranks company ABC, or the page they would
expect". I am just thinking out loud: if we are willing to employ specialist
lawyers, accountants, programmers, etc. in our businesses, why do we balk at
employing an SEO specialist as a specialist business function?

There are many good SEO companies (true, you have to do your homework, but
that is true of other service providers as well).

The dream, of course, is to not have to use them and just rely on Google and
its good nature and/or algorithms. But isn't that like saying we will just
rely on never getting sued and therefore never need a lawyer, because crime is
bad?

------
Animats
Now go look at the page source for the home page of Google.

------
NetBeck
The web is optimized for Google Chrome, not the inverse. It's not surprising.
Google is an advertising company with >90% search market share. There is
little competition in the economy, only product differentiation.

------
entha_saava
I didn't expect something as silly as this, especially the assertion that page
sizes don't matter and that, since internet speeds are going up, web
developers are free to use them.

I don't see internet costs per MiB going down. Not everyone is on an unmetered
connection. And users expect that large amount of bandwidth to be utilized for
Netflix or something like that.

And how can you say Google is being evil for doing this? Even if there were no
search monopoly and there were two competing search engines, I think both of
them would factor page load times into search results for a better user
experience.

------
arexxbifs
> The entire point of the web was to democratize and simplify publishing [...]
> But the iPhone's [...] shitbox performance means we're all sort of ready to
> throw that progress away.

A hilariously ignorant statement. Simplified and democratized publishing
doesn't require one smidge of javascript, not one pointless tracker cookie,
not a single facebook pixel, no hideous autoplaying videos, not even a jot of
CSS. It needs nothing but HTML and text, rendered plenty fast by any computer
made this side of 1995.

------
robbrown451
It would be nice if you could specify preferences. For instance, if you hate
tracking and ads, you could tell Google that and it would down-weight those
results.

------
impalallama
I thought I'd be in favor of whatever this article was arguing, but then it
went in a weird direction, arguing against things that are just considered
good practice as if they were bad now because Google is saying them. Like,
yes, I would consider it a good thing to have lighter websites. Unless your
site caters to some niche that _needs_ to offer 1:1 perfect image quality,
there is no reason not to compress your assets.

------
mrkramer
Casual users don't care about details of the user experience; when you
innovate on the casual experience, they will consider using a new search
engine.

------
romaaeterna
I recently began having a lot of problems with Chinese "searchbot" traffic on
a website. Filtering it out, my load went down by two orders of magnitude. It
made me wonder how much of the purpose of this sort of thing is SEO, and how
much slower Google is making the web for everyone by ranking on it and
therefore encouraging the shenanigans.
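
The filtering can be as simple as a user-agent check; a sketch as Express
middleware (the UA pattern targets the Huawei crawler named downthread, but
verify the string against your own logs):

    // sketch: reject a noisy crawler by User-Agent (Express middleware)
    const express = require('express');
    const app = express();
    app.use((req, res, next) => {
      const ua = req.get('User-Agent') || '';
      if (/AspiegelBot|PetalBot/i.test(ua)) {  // verify against your own logs
        return res.sendStatus(403);            // or serve a cheap cached response
      }
      next();
    });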

~~~
nicbou
Every website is different, but if a single bot is bringing your website to a
crawl, you should probably start caching a few things and working on
performance.

~~~
romaaeterna
Other people are reporting the same thing:

[https://www.hypernode.com/blog/performance/huawei-aspiegelbo...](https://www.hypernode.com/blog/performance/huawei-aspiegelbot-is-increasingly-impacting-european-online-stores)

------
DevKoala
I wish there were a way to index the web that was not so susceptible to
marketers gaming the ranking system; it is clear Google has lost that battle.
Moreover, I don't believe it is in Google's best interest to surface the best
results for me, rather than the ones that will generate the most revenue for
them.

------
z3t4
Many inlinks are good for search ranking - people create fake sites and spam.
domain is important for search ranking - people pick domain after search
keyword. Faster pages rank higher - people make fast websites. Quality content
rank higher - like if that ever going to happen.

------
pretendscholar
Is there a formal study available that examines the tradeoffs of Google vs. a
search engine like DDG?

------
youeseh
God, how? Like, in the monotheistic sense or the polytheistic sense? If we're
going poly, then Google is definitely one of the Gods of the web. Maybe even
the top God, but there may be some competition there.

~~~
throwaway90179
Google is the God of Control of the web. It thinks it knows what the people
want, but it's just using its influence to make it look that way.

The only difference is that Google isn't even aware that's what it's doing.

------
replyifuagree
My favorite Google war story is implementing a web application using Google's
Polymer, only to watch the Googlebot choke on crawling the site. It took about
a year for them to get their bot working.

------
yingbo
Of course Google is not God: Money is.

------
karakanb
The article is an unexpected take on what I would have assumed were "the
basics of web development", things that would not be argued against. I would
like to touch on some of the points here, as the arguments of the article were
not very clear to me:

> The simple assumption that it is always better to have the smallest page
> possible – that images should be resized and compressed to hell and
> typography/other elements should be few in number.

I strongly agree with this statement overall, and the article doesn't seem to
provide any counter-arguments against it. We need to serve the smallest page
possible because the larger the page, the more resources it consumes; it is
that simple. Every _unnecessary_ byte added to a page literally translates to
more storage for the page, more processing power to prepare it, more data sent
on the wire, more data for the client to interpret, and all of this adds up to
more energy used to consume that page and more resources wasted. If I can
remove one byte from a page, that is for sure a win for everyone; one byte is
one byte. Whether that saving is worthwhile given the scale and the effort is
a whole other discussion, and the claim was never "send the smallest page at
all costs". If, considering the time and effort, there is a way to send a
smaller page, then do it; it is no different from turning off the lamp in an
empty room, just on a different scale.

> Instantaneous page loads should be priority over any other standards of
> measure for a web page – like interesting design, for instance.

I have never seen such a claim anywhere before; it needs a citation. As a
developer, I think the look and feel of a page is as important as performance
or efficiency, and the web on its own can be used as an art platform, which
would make this whole point irrelevant. Again, the overall point is this: if
you can offer the same thing with smaller pages with a reasonable amount of
effort, do it.

> Minimalistic design is necessary.

I have never seen such a claim; it needs a citation. As a user, I prefer
cleaner design over fancier things, but this is neither a "rule" nor the
industry standard. There is various research on this topic and I am no expert
on it, but a joint study [1] by the University of Basel and Google/YouTube
User Experience Research shows that users perceive cleaner designs as more
beautiful, which suggests that if user perception is a goal for a given
webpage, keeping things simple might actually make a difference. Again, it
depends on the use case.

> 22.82 Mbps will reliably download very complex web pages nearly
> instantaneously

This is a pain I live with every day. I have a 4-year-old mobile phone with 6
GB of RAM, and it takes at least 8-10 seconds for Medium to become usable over
a ~100 Mbps connection, counting both fetching the page and
rendering/interpreting it. This is exactly the point I was making above: if
the page were smaller, it would have made a difference of seconds. The same
device, over the same connection, at the same time, opens
bettermotherfuckingwebsite.com in under a second, so there is something to be
seen there.

In addition, even if I had a 1 Gbps connection, a 1-byte waste is still a
waste, irrespective of my connection speed. I am not talking about the effort
of saving that byte, but it is important to acknowledge the waste.

> Google has the right to dictate “Best Practices.”

This is the point I agree with more than the others, but it seems a separate
topic to me. The previous arguments were against the practices themselves;
this one is against the entity supplying those practices. Even though I agree
with the majority of the points there, it would have been a more informative
read if the claims and frustrations were stated with point-by-point
explanations and data to back them up. Google having huge power and a monopoly
to push people toward certain standards is a big problem, but it is not clear
whether the author is arguing against the practices or against Google itself.

Overall, I believe it would have been a more resourceful article if the points
and claims against the given practices were backed by better alternatives and
data. We all accept that more data means more processing power and more
energy, so trying to minimize it is an important goal; if the author thinks it
should not be, I would be more interested in the answer to "why?" than in a
rant against long-standing practices.

------
verdverm
This is the best place I know of to have thoughtful discussions and talk about
what most consider taboo.

[http://www.paulgraham.com/say.html](http://www.paulgraham.com/say.html)

[http://www.paulgraham.com/resay.html](http://www.paulgraham.com/resay.html)

There are times where it gets heated, but those are quickly shut down by the
very excellent community moderation.

~~~
dang
We detached this comment from
[https://news.ycombinator.com/item?id=23383662](https://news.ycombinator.com/item?id=23383662).

