
Is the web getting slower? - oedmarap
https://www.debugbear.com/blog/is-the-web-getting-slower
======
thdrdt
I have seen a lot of front-end developers who have no clue about the technical
aspects of their work. They just use multi-megapixel images because 'they will
look sharper'. When they need an icon from Font Awesome they just include the
whole library. And because Bootstrap told them to do so, they nest as much DOM
as possible.

I have seen websites where the homepage's HTML alone was over 1MB in size!
The only thing that got excited was my CPU.

Let me tell you: if you want a header with a background image where the text
is aligned at the bottom you can just write:

    <h1>My header</h1>

There is no need to write it as:

    <div id="mainHeader"><div class="align-text-bottom"><h1>My header</h1></div></div>

And if you want to stick the header to the top you can use `position: sticky`
in CSS instead of including a huge Javascript file that can do all kinds of
fancy stuff you don't need.
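
Something like this covers both tricks; a minimal sketch, with the image URL
and height as placeholders:

    h1 {
      background: url("header.jpg") center / cover; /* placeholder image */
      height: 300px;
      display: flex;
      align-items: flex-end; /* text sits at the bottom */
      position: sticky;      /* keeps the header pinned to the top */
      top: 0;
      margin: 0;
    }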

But I am not sure I can blame those front-end developers. Deadlines are tight
and it takes effort to learn about the technical aspects of front-end
development.

My personal standard is that a page should be ready in 1 second. For huge
sites 3 seconds max. I've been creating small and huge websites for over 20
years now and never had a problem with these goals. This includes webapps
built with Javascript.

~~~
j-krieger
The thing is that most people here belittle web developers, but they do not
have a clue what we do. At all.

In your pretty short answer you've already made a mistake. Let's see:

> And if you want to stick the header to the top you can use `position:
> sticky` in CSS instead of including a huge Javascript file that can do all
> kinds of fancy stuff you don't need.

Your "simple trick" to avoid "including a huge JavaScript file" doesn't work
in Opera, Chrome for Android, IE and a couple more[1]. It _does_ work in
Safari, but not as you would expect. Also, `position: sticky` is still a
Working Draft: it is subject to change at any time. Want the same
functionality across all browsers? Better use JS.
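
That usually means a scroll handler along these lines (a rough sketch only,
assuming a `<header>` element; real polyfills handle many more edge cases):

    // naive position:sticky fallback: fix the header once it scrolls away
    var header = document.querySelector('header');
    var offset = header.offsetTop;
    window.addEventListener('scroll', function () {
      if (window.pageYOffset > offset) {
        header.style.position = 'fixed';
        header.style.top = '0';
      } else {
        header.style.position = '';
      }
    });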

Another thing is that most people here think "huge JS bundles" are what make
websites work. Those are meant for web _apps_ though. If you want anything
more than a static site, you'll need JavaScript. No way around it.

[1]: [https://caniuse.com/?search=Sticky](https://caniuse.com/?search=Sticky)

~~~
the_gastropod
> If you want anything more than a static site, you'll need JavaScript. No way
> around it.

I’ve been noticing this kind of rebranding of what static means a lot. It
seems to create lots of confusion (you can see it in this thread already).

Traditionally, “static” websites were websites without a server-side
programming language backing them—sites whose content was unchanged by any
users. Javascript, CSS, and HTML are the tools used to build static websites.

Today, some developers seem to use static to mean Javascriptless.

The term fits both situations pretty well, so I get why it happened. I don't
know if there's a better, less ambiguous term for either of these things
now?! Non-database-backed?! Non-JavaScript'd?

~~~
mrami
The term for JS enabled pages used to be DHTML back in the day. You'd be
talking about a static-server DHTML site. Or something like that.
[https://en.m.wikipedia.org/wiki/Dynamic_HTML](https://en.m.wikipedia.org/wiki/Dynamic_HTML)

------
rcheu
What is going on here? Why have none of the commenters read the article?
Perhaps because it's phrased as a question and people didn't realize it's a
link?

Anyways, this matches my expectations: people tend to be overly negative and
only remember the good parts. The mobile web as a whole has gotten faster due
to network speed and CPU improvements.

It is worth noting that pages are doing more after loading now than they used
to, though. This won't show up in onload or first meaningful paint, etc. So
the first paint is fast, but if you try to scroll immediately afterwards
you'll probably hit some jankiness while the rest of the page loads
asynchronously (but only kind of asynchronously, since there's a single main
thread).

Some other things that could cause the regression are that more people own a
budget Android phone now than before. People may not realize how slow these
phones are. The single-core performance of the top budget phone, the Samsung
A50, is comparable to an iPhone 6, which came out in 2014.

~~~
FridgeSeal
> The mobile web as a whole has gotten faster due to network speed and CPU
> improvements

Making something faster by throwing more hardware at it doesn't meaningfully
count as making it faster, IMO; you can make the most inefficient piece of
software "fast" by throwing the biggest CPU and network you can find at it.

The real issue is why an otherwise capable CPU from 5 years ago should
struggle to render the average website today, when it really shouldn't be
that hard. Scrolling through someone's marketing website _should_ be a
painless experience on even a low-end budget phone.

~~~
ooobit2
I don't know the direct answer, but I have a deduction in my head to work
with, so I'll just put it out there: I think speed is deeply impacted by
high-level frameworks that parse or compile at runtime. React Native is a
framework on Android, which is a framework on Java, which compiles at
runtime. I don't know if anything in that chain compiles down to assembly or
machine code _before_ you open an app made in React Native. That, tied with
the bloat of just-in-case background services sitting idle, eats bandwidth.
Garbage collection checks operate in a loop, checking again and again whether
all these unused but loaded processes still exist at their addresses. And
when you pile those on top of each other, it seems relatively easy to see how
modern CPUs don't seem much faster than chipsets from 5-6 years ago.

I stopped programming around 8 years ago because I hate the current MVC model
most software is created and maintained with. What got me interested recently
in dipping back in was a video on branchless programming. I love the idea of
unit testing at the machine code level for efficiency, and then figuring out
how to trick the compiler or runtime and the chipset into making quick,
predictive outputs to reduce idling on branches or making 15 steps for
something doable in as little as 4.

That feels like a completely opposing direction to take, given the current
priorities of engineers across almost all industries, even old-time ones like
gaming.

~~~
throwaway8941
Android applications have been compiled to native code on install since
Android 5 (2014).

------
rhizome
Yes, demonstrably and provably. Count the number of pageload indicators
(spinners, throbbers...whatever you like to call them) in your daily browsing.
They are everywhere... _now_.

AJAX (and, I suspect, the shadow DOM model) have proliferated in recent
years, and there is no site design simple enough that someone hasn't thought
to put every page element behind a JS call. Don't forget to put CSS in there,
too!

Frontend developers are at fault for all this. There, I said it.

~~~
dijit
I hadn't touched "web dev" since the PHP/Dreamweaver/MySpace era.

I started working with some web developers on a simple project recently; my
mock product was built with vue.js using the standard way of "including" it:
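
(presumably the classic drop-in script tag; the exact URL here is
illustrative)

    <script src="https://cdn.jsdelivr.net/npm/vue/dist/vue.js"></script>
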
But what really threw me for a loop was that the web developers told me
that's wrong: they are now _compiling_ (or, semantically, "packing")
JavaScript blobs to make even very simple websites.

I don't know why we as a community have started doing that, but it feels like
an anti-pattern: it makes adding new dependencies opaque, and we can very
quickly end up including dozens and dozens of lines of code which must all be
parsed and executed by every device that comes into contact with my site.

Often this happens because we want only a small bit of the functionality too.

~~~
RandoHolmes
100% with you there.

I personally think it's a bit insane to add node.js as a dependency to a
simple PHP project.

~~~
WrtCdEvrydy
To be honest, we've seen this in Rails and PHP in the form of Grunt.

------
Swizec
It’s important to make a distinction between webapps and websites here. We use
the web now like we used desktop in the 90’s and 00’s.

For most users, your browser is your OS. Hell, even when it isn't, most
desktop apps use HTML+CSS for their UI. Hell, even many user-facing
"embedded" apps (like TVs) are running Linux with Chrome and showing a webapp
as their UI.

The layout engine is just that good and convenient. And downloading fresh app
source on every visit solves a lot of problems.

This part of the web is getting bloated and slow.

On the other hand are websites. These are fat as heck, but thanks to CDNs,
broadband, and advances in server compute power, they load faster than ever.

Remember when downloading memes required eMule? I do.

Now I go to imgur and watch 30MB high-def gifs like it's nothing. My
13-year-old self would shit his pants in awe.

Are ads and trackers bloating websites? Yes. Are most websites built like
webapps even when there’s no need? Yes.

Blame tooling. Go help. What can we as a profession do to make it easier for
random developers with no particular skill to build faster, better websites?

Right now we're actively telling everyone they need to build as if they're
FAANG. Then we complain when a part-time dev working for a mom & pop shop
can't wrangle all this tooling, built for teams of 1000s, into a solid
experience.

~~~
TiredGuy
>Blame tooling. Go help.

This might be an opportune time to mention my attempt at helping. I made
barleytea.js [1] as a light framework alternative to React that needs no
webpack. I also recommend much more polished, production-ready things like µce
[2] and heresy [3].

Anyway I just want people to know that there are lightweight alternatives out
there, and they're worth at least checking out.

[1]
[https://gitlab.com/andrewfulrich/barleytea](https://gitlab.com/andrewfulrich/barleytea)
[2]
[https://github.com/WebReflection/uce](https://github.com/WebReflection/uce)
[3]
[https://github.com/WebReflection/heresy](https://github.com/WebReflection/heresy)

~~~
Sebb767
Thanks for the tip!

I've stayed with Angular v1 for a long time[0] due to its simpler nature (no
compiler, simply JS); but since that is going EOL I'm definitely going to
look into Barleytea.

[0] For private projects only, of course.

------
paxys
Also see the Downs–Thomson paradox
([https://en.wikipedia.org/wiki/Downs%E2%80%93Thomson_paradox](https://en.wikipedia.org/wiki/Downs%E2%80%93Thomson_paradox)):
improvements in a road network will not reduce traffic congestion; rather,
they can make it worse.

~~~
wayne_skylar
I've always had a question about this. If more people are deciding to use the
road afterwards surely it means that trips are being taken that weren't
before? Wouldn't that be a good thing, economically speaking?

~~~
jdminhbg
> Wouldn't that be a good thing, economically speaking?

Yes, because it means people can live in places they couldn't before, or could
get to jobs they couldn't before. Of course, people who already used the road
won't perceive these benefits; they are in the same situation as before.

------
arminiusreturns
This article isn't very good, so I'm just gonna skip commenting on it.

Yes, the web is slower, but not in the pure "is this JS bigger, and does it
take longer to execute" sort of way. It's more that you now have 39 scripts
trying to run on a page that's just some text and a few pictures, where that
used to be 0-3. For those of you who use uBlock Origin but not uMatrix,
uMatrix will really open your eyes to this proliferation.

It's mostly management decisions that get shoved down the throats of devs,
but I would say it's also devs who love to throw a thousand frameworks in
between content and delivery.

Simplification of the stack is a market advantage that I think more and more
companies will start to realize... at some point in the future.

~~~
dbeley
> For those of you who use uBlock Origin but not uMatrix, uMatrix will really
> open your eyes to this proliferation.

You can enable advanced mode in uBlock Origin to get a UI similar to
uMatrix's.

~~~
lucasverra
Could you share a screenshot of a default config? I've tried uMatrix twice
and don't understand how to handle all the knobs.

------
wldlyinaccurate
I've been working in the web performance monitoring space
([https://speedcurve.com/](https://speedcurve.com/)) for 3 years now, and the
3 years prior to that I was doing a lot of performance work at the BBC. I
can't share the data for obvious reasons but I can confidently say that the
web is getting measurably slower every year, despite connection speeds
increasing drastically over the last few years.

We like to over-simplify and try to attribute it to things like JS frameworks,
advertising, media-heavy pages, etc. The truth is that it is all of these
things, and so much more. Yes, devices are more powerful, but we are also
asking our devices to do more. Yes, connections are faster, but bandwidth
doesn't help with things like TCP slow start or browser concurrency limits. On
top of all of this, our perception of speed is changing, so things can _feel_
slower than they really are.

------
ErikAugust
This is one reason I created Trim [0]. I didn’t want to have to load 4-7 MB of
stuff to read a stupid article. It often reduces an article page weight by 99%
and uses no JavaScript.

[0]: [https://beta.trimread.com](https://beta.trimread.com)

~~~
pmoriarty
This is one of the main reasons why I do 90% of my web browsing in emacs-w3m,
which does not support Javascript.

Avoiding Javascript not only lets me avoid all that bloat and slowdown, but
also avoid Javascript-based tracking, malware, and exposing myself to
Javascript vulnerabilities.

I also have the power of the entire Emacs ecosystem at my fingertips when I
surf the web this way, which can be very helpful in many ways.

Unfortunately, some sites I find essential will not work without Javascript,
and for them I go back to Firefox.

In Firefox, I use uMatrix and uBlock Origin to only allow through the minimal
amount of stuff that'll let the web page work, and filter out ads.

I have a really old, slow laptop, so web browsing is slow for me, but not
nearly as slow as it would be if I consented to swallow all the crap that the
modern web tries to force down my throat.

I yearn for the good old days without Javascript or Webassembly. There was
Flash back in those days, but fortunately not a single serious site I ever
visited required it, so I could avoid every Flash-using site like the plague.
But today the Javascript plague is unavoidable.

~~~
larkeith
You can also disable JS via uBlock Origin by default, and enable it on a site-
by-site basis.
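
If memory serves, the equivalent entries in uBO's "My rules" pane look
something like this (example.com stands in for a site you trust):

    no-scripting: * true
    no-scripting: example.com false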

~~~
pmoriarty
uMatrix not only allows you to block JS on the site you're visiting, but
gives you fine-grained control over blocking JS from the sites that site
calls out to, and from their subdomains as well.

I find uMatrix much more flexible than uBO in this regard. I don't even know
if doing all this is possible in uBO (I suspect it's not, or at least it's
hidden away pretty well), while it's front and center in uMatrix's interface.

~~~
larkeith
You can do some of that with uBO (see "hard" blocking mode), but my
understanding is that uMatrix gives you more granular control. I need to learn
more about using uMatrix to confirm, though.

------
godot
I know most of the comments right now are an anecdotal resounding "Yes" from
most folks (and I agree, anecdotally), but I'd like to respond to the
conclusion of the article, which is that as internet speed increased over the
years, page loading times have stayed roughly constant. This makes sense in
retrospect, because product development tends to consider what the baseline
acceptable speed/time to load is, and then utilize that allowance fully and
load whatever is possible (or optimize down to that point during the
development process).

I only wish this weren't the case, and that the internet speed gains over the
years had actually meant the browsing experience for consumers improved.

~~~
gerdesj
Page load times, indexed for available bandwidth, appear to follow a sort of
inverse Moore's law. We could call it Gerdes's random stab in the dark.

My first home internet connection ran at 9600 bps except when it clocked
itself down to rather slower. At work at the time I had a synchronous pair of
modems running at less than that for an IBM System/36 site to site link. Later
on oooh V.FAST - 56Kbps except it wasn't really. More like 48Kbps plus a bit
on a good day and a decent marketing department.

I am now using a 1Gbps connection.

My PC on the end of the 9600 bps system was an 80486-based beast running at
25MHz. It cost me £1600 (thanks, Granddad, for the unexpected bequest that
was a major factor in getting me where I am today).

I now use an 8-core + H/T Core i7 beastie of a laptop running at 1.8GHz; it's
getting on a bit now, but it can still churn out packets at quite a rate and
crunch my CAD efforts.

Page sizes back in the day were rather small, say 10KB. Nowadays 10MB is
pretty common (dodgy assertion with no proof).
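
Taking those dodgy numbers at face value, the raw transfer arithmetic looks
like this:

    10 KB over 9600 bps ≈ 8.5 s
    10 MB over 1 Gbps   ≈ 0.08 s

So bandwidth grew roughly 100,000x while page weight grew roughly 1,000x; raw
transfer alone should be about 100x faster, and whatever load time remains
presumably goes to latency, rendering and script execution.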

Web pages do different things than they used to as well.

I don't think it is quite as simple as you suggest, wrt page load speeds. What
page, on what, with what and why!

------
mrjin
It's not only the web that's getting slower; it's actually everything. I've
read a similar article about input latency (from keystroke to display on
screen), which is also steadily growing.

The reason might be the same: we have far more computing power than ever, so
we've started abusing it by spending lots of it on visual elements that are
nice to have but not essential.

~~~
canofbars
Part of it is that modern screens have much higher latency than CRTs did.
Largely I don't think it really matters, though. My desktop and phone feel
very, very responsive; it's only websites that are visibly slow to me. And
most of that is waiting on the server: I click a link or submit a form and I
have to wait seconds for the page to load, whereas with a local program it's
instant.

~~~
t-writescode
I keep hearing this come up. Do you have a modern citation for this with
modern monitors, say, ones made within the last 5 years?

It seems to be a constantly repeated myth, or something of no consequence.
(One article I read said LCDs have 2 ms higher latency than CRTs, which I
would count as of no consequence.)

~~~
archi42
The 3-year-old 4K Samsung TV in the living room can't be pushed below 50 to
60ms of latency, even with processing minimized, measured against an audio
signal NOT routed through it (the TV's line out obviously compensates for the
delay, but is only stereo).

Back in 2011/2012, when I got Rocksmith, I used a DSLR camera to determine
the latency of my LCD compared to my old CRT: I just took a snap of a
millisecond counter racing up on both screens (mirrored). I can't recall the
exact difference, but it was definitely much more than 2ms. I measured
because I noticed it during gameplay; I'm pretty sensitive to delayed
video/audio, but not 2ms sensitive: that's one beat off at 30,000 bpm. You
can use the same technique to check for yourself with a more modern display.
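
A minimal counter page for that test might look like this (entirely
illustrative; note it only updates once per frame, which is fine at this
scale):

    <!-- ms-counter.html: photograph this on two mirrored displays -->
    <div id="t" style="font: bold 64px monospace"></div>
    <script>
      var el = document.getElementById('t');
      function tick(now) {           // 'now' is a high-resolution time in ms
        el.textContent = now.toFixed(1) + ' ms';
        requestAnimationFrame(tick); // redraw every frame
      }
      requestAnimationFrame(tick);
    </script>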

~~~
t-writescode
I specifically asked about monitors. TVs have a huge amount of latency and
post-processing that's only recently gotten better. Computer monitors, on the
other hand, have long since been very fast.

I ask for evidence because I'm not the one presenting the claim, and I
suspect it's no longer true.

~~~
archi42
See my edit. I don't have a CRT anymore, so I'm afraid if you're interested in
refuting or confirming that claim, you have to test it yourself ;)

------
victornomad
Yes!

I have the feeling that most modern websites use too many CPU and GPU
resources. My computer is just 5 years old, and each time I visit a modern
website it really suffers.

Please, designers and engineers: I don't own the latest MacBook Pro with
maxed-out specs, and my internet connection is quite normal.

Start creating for the rest of us, who don't have the resources or interest
to upgrade our computers every couple of years!

~~~
dayjobpork
It's not just web pages; modern software engineers just make terribly
unoptimised programs. Why does a recent game on a powerful PC take longer to
open than old DOS games? Not just loading a level; even getting to the start
screen takes ages. WTF takes 20+ seconds to show a menu of New - Load -
Options...? (DRM, probably.)

~~~
FridgeSeal
I think it is partly due to the software development industry's
prioritisation of "developer speed"/"developer experience" over everything
else.

------
ecmascript
Just like most things, the answer is "it depends" since it's really not that
simple. This is a great article but the answer seems obvious to me.

The web today is starting to get divided into apps and sites. HN is a site,
for example: it is small, loads fast, and works well for its intended use
case.

When I listen to music or podcasts on a website, I want it to act like an
app, and for that to happen more stuff has to happen in the client, leading
to longer load and execution times. This is something I can live with though,
since I can do stuff on the web that was impossible just a few years ago. I
am also developing apps that were impossible to do on the web a few years
ago.

I want to use both: sites, for things like looking up a store and ordering
some stuff, and apps, for things like doing design or consuming music and
video.

I prefer web apps that are done well to native apps. I don't have to download
anything, and they are free from the shackles of Apple, Google and Microsoft.
Also, you don't have to make them bloated and big. You don't have to use a
framework: with web components and maybe some small router library you have
the most important stuff a front-end framework gives you.

Just check Fastmail: their client is super fast and a very well done SPA.
Then look at Reddit, which is a horrible mess. As with any app in any
language, you can make the experience shitty or you can make it awesome.

------
zwaps
For me, the author of this article is wrong.

On my system, Firefox on Windows, the constant delayed loading of JavaScript
garbage makes every website laggy. Yes, the web has become slower.

The author himself then notes that page weights have increased by over 300
percent.

Clearly, the usability of today's web is generally worse.

Web development has changed from a field where competent people cared about
user experience given simple and limited tech, to a field where people care
about ads and using the latest fad of a terrible JavaScript framework or
tool.

------
nikanj
Yes. Fortunately hardware is getting faster, which mostly compensates. But the
web has definitely turned into a bloated whale

~~~
li4ick
Which should tell you something. The vast majority of tech advancements are
because of hardware, not software. I think that software people should feel a
bit of shame at this point in time.

~~~
canofbars
No one feels any shame because the reason is obvious: normal users are
constantly demanding more features, not more speed. The average person just
buys a new phone when things are slow. Companies don't have infinite time, so
when faced with the choice between adding a new feature and speeding up an
old one, they always pick the new feature, because that's worth more.
~~~
li4ick
Now imagine if the people working on the lower-level stuff had the same
attitude. Imagine if the codec implementations were slow because we needed
more features. Imagine if the graphics people never gave a damn about perf,
just like the "programmers" you're talking about. Imagine if your database
was the main perf hog in your system, etc. I'm sorry, but here's an unpopular
opinion: those people are just not good engineers. The increments of software
development should be abstractions, not features.

------
Funes-
Yes. It's the Jevons paradox, enabled _and_ amplified by the corporate web
abusing JavaScript (bloat, ads, trackers, and other attention-grabbing
shenanigans).

------
Aperocky
Going to certain webpages is like cold-starting a JVM.

~~~
iso8859-1
You'd be able to boot an x86 emulator in JavaScript to a Groovy prompt in just
a few seconds, so probably faster than you can load any major news site except
text.npr.org ;)

~~~
jandrese
news.ycombinator.com loads pretty fast too.

------
firefoxd
Sometimes it's not because of the marketing team and all their tracking
pixels. Here, the dev team, in the name of innovation, slowed their own
process to keep up with the Jetsons [1]. Also, if you know any Angular
developers...

[1]: [https://idiallo.com/blog/hiring-angular-experts-
not](https://idiallo.com/blog/hiring-angular-experts-not)

------
731d2fe149f6957
It's not only getting slower, but shit. Animations everywhere. Unnecessary
whitespace everywhere. Infinite-scroll bullshit everywhere. I still can't
find anything on webpages without ctrl+f.

Why do current webdevs think this is a good idea? Well yes, 'mobile first',
but even on mobile it usually looks like absolute bullshit, and animations
are even more annoying on mobile. Why can't we have plain webpages that
actually work?

If you want an example: I recently tried to find an apartment using Airbnb.
Since the last time I used it they have changed their design, and it's super
slow; the animations don't even make sense, but there are rounded corners
everywhere. Fuck. This. Shit.

------
idoubtit
The article explains that there is no way to really measure the page load of
modern sites from a user's point of view. Then it analyses the biased data,
using the various flawed metrics that do not match the modern web experience.
Consequently, its overall conclusion, that network speed gains have made the
average web faster, is not convincing at all.

Here is an example where these metrics make no sense. I've been using the
website of the national weather forecast (meteofrance) for years. It used to
load the DOM in less than a second, and since the content was in the HTML,
user perception matched DOM-loaded. Now, with the same DSL connection, it
loads in 2.2s (DOM). On a mobile, since the network is many times faster than
a few years ago, the DOM is probably loaded faster. _But the real content is
not in the initial HTML anymore_. It is loaded through XHR, among the 80+
HTTP requests sent. The forecast is now displayed 5 to 6 seconds after the
initial request for the web page.

I've seen quite a few sites that got slower over time because their content
was no longer static but wrapped in some JS framework. That is not only
slower, but less robust, and harder to monitor; I've seen a few blank or
broken pages, and I'm pretty sure these were not logged on the server.

This says nothing about the average web, and I'm not sure any comparison
would make sense, but it does show that some areas of the web have regressed
over the last decade.

------
austincheney
The article never mentions DNS as a factor.

I can remember when my home internet went from 20Mbps to 1Gbps. There was no
perceived difference in web speed: 20Mbps was fast enough, and the
conventions in place forced a low ceiling on performance.

As far as JavaScript goes, any mention of performance improvements often
results in hostility from JavaScript developers. Try taking away their 300MB
framework, or eliminating DOM navigation via clunky query selectors. The
result is hostility even though you can prove performance increases of 500x
in Chrome or 20000x in Firefox.

------
yalogin
I don't have numbers for it, but I am sure the size of web pages has
increased by many multiples over the years, and it's only getting worse.
There aren't any simple pages anymore: every page includes a bunch of
embedded JavaScript libraries, tons of ad-network code, tons of performance
and tracking code, and lots of images replacing text. And I'm not even
talking about WebAssembly, WebSockets, client-side rendering, etc.

------
shaabanban
Try to use reddit in a browser...

~~~
jcims
new.reddit.com - 327 requests, 8MB, 7 seconds -
[https://tools.pingdom.com/#5d1bf3f952400000](https://tools.pingdom.com/#5d1bf3f952400000)

news.ycombinator.com - 7 requests, 22KB, 0.125 seconds -
[https://tools.pingdom.com/#5d1bf4a4f8400000](https://tools.pingdom.com/#5d1bf4a4f8400000)

The hilarious thing is that if you put
[https://www.reddit.com](https://www.reddit.com) into pingdom's measurement
tool, reddit returns the old site layout.
[https://tools.pingdom.com/#5d1bf3c4e3800000](https://tools.pingdom.com/#5d1bf3c4e3800000)

edit: Just poked around a bit to find the best worst examples. Unsurprisingly,
CNN is the worst I could find after a few minutes -
[https://tools.pingdom.com/#5d1bf596ffc00000](https://tools.pingdom.com/#5d1bf596ffc00000)
- 543(!) requests, 9MB, 6 seconds. Just stroll through the list of bullshit
it sucks onto the front page... what a mess.

~~~
pmoriarty
Try old.reddit.com

Also, it's possible to browse Reddit through non-web-browser applications
that use the Reddit API.

~~~
canofbars
The reddit API is a legacy feature that is on the chopping block any day now.
Its only real purpose now is 3rd-party clients, which don't show adverts or
insert tracking scripts. It has a small amount of use for bots, but those are
mostly a negative user experience and likely to be killed soon as well.

Reddit is pushing to become Facebook without your real name.

~~~
Sebb767
> without your real name.

With extensive user profiles and now very visible profile pictures, I'm not
sure that this isn't going to come as well.

~~~
pmoriarty
I wouldn't be surprised if they started asking for your phone number as well,
as so many privacy-invading sites like to do these days.

~~~
canofbars
This seems very likely. Reddit admins have mentioned that they are trying to
crack down on alt accounts and banned users signing up 2 seconds after being
banned. A phone number requirement would solve that easily.

------
lokimedes
The article is a nice attempt at deconvolving the various factors that may
have changed. From my personal experience, the subjective sluggishness comes
from the unbelievable number of third-party connections on every webpage
visit. Setting up a Pi-hole has done wonders for the responsiveness of most
websites.

------
achairapart
From the point of view of someone who has been making websites since the
'90s, two major catastrophes have hit front-end in its history:

1\. Art directors from the print world, in the '00s. Luckily, this one is
over.

2\. Engineers over-engineering things, since the '10s. And they won't stop...

------
nindalf
There’s a disconnect between the author and the comments section here.

The author claims that yes, websites are getting heavier, but improvements in
bandwidth, CPUs, protocols (like HTTP/2) and browsers offset this, making the
web as fast or faster for the median web user.

Comments here express frustration with those increased bundle sizes,
especially in cases where we aren’t getting any features in return.

Both sets of folks are talking past each other because there is disagreement
over the metric. The author wants to use load times as seen by users in
wealthy countries. The commenters want to use KB, or features per KB
(subjective).

There’s no right or wrong here, just different metrics.

------
elorant
It's definitely slower, and the easiest way to see it is to use plugins like
uBlock Origin combined with uMatrix and compare load times with them enabled
and then disabled. For me it's at least 50% faster when they're on.

------
throwaway_pdp09
No, it's faster. Disable JS, and what continues to work (which is quite a
bit) is snappy. It's also snappier than it was a decade ago, even 3 years
ago, very clearly so.

But as long as people accept JS and general online abuse, it will get worse.

------
tomxor
The network improvements are not only slowing but also increasingly
non-uniform as the limits of long copper runs are reached.

It's the problem with averages again: they are significantly inflated by a
minority with ever-increasing broadband speeds. It's easy to improve a select
few areas to the extreme, but hard to improve things for the majority. I
suspect pointing at figures of increasing averages to justify larger assets
is part of the problem.

------
didip
Software is like running water. It will fill up whatever container (hardware)
it's on.

~~~
xdavidliu
who would be the Bruce Lee of software?

------
Yetanfou
Well, in a way, yes, especially when using older hardware like the Thinkpad
T42p I'm using right now. Go to the Wayback Machine at the Internet Archive,
find a page from when the machine was made (2004), and compare its loading
time to the current version. Now take a recent piece of hardware, load the
current page, and marvel at the fact that the old page on the old machine
loads about as fast as the new page on the new machine.

The old page looks dated, of course, but that is more a matter of layout than
of a lack of 'modern' technology. It would be possible to make the page look
close to the modern version without incurring all that extra load time. Yes,
this would probably entail server-side rendering and some judicious use of
older but still useful tricks like server-side includes, but it is certainly
doable. It isn't being done, because modern pages load fast enough on modern
hardware, and developers are incentivised to track the latest technologies to
keep their market value up.

------
intelliot
It's not only about stats/data, but also the user experience. Many web pages
have many more ads, videos, gifs, and clutter than they did before. There's
also more spam and more low-quality content. That makes finding the info you
want harder than before. I've noticed this especially when searching for
cooking recipes online.

------
anonyfox
I recently did my part to speed up my site: [https://anonyfox.com/blog/need-
for-speed/](https://anonyfox.com/blog/need-for-speed/)

So I'd assume that it's possible to make "the Web" really fast, especially
when factoring in the improvements in network/CPU over the years.

------
anw
From the article

> Mobile page weight increased by 337% between 2013 and 2020

That is a huge span of time, with the birth and death of stars in between in
the web development cosmos. So much has changed in web dev, especially for
mobile.

The issue is not the web getting slower. It's the developers and their
companies focusing on time to iterate: we have traded consumer convenience
for development speed and shiny new tech/methodologies.

We have traded doing things server-side for having most things in an SPA;
we've gone from using tracking pixels and Google Analytics to also
incorporating New Relic, Optimizely, a host of CDNs, bot-protection scripts,
social media scripts for "better personalisation", and CAPTCHAs.

I feel we are moving backwards. Back to when developers used ActiveX plugins
because it was more convenient for them. Only ActiveX is now a combination of
"JavaScript in the browser, and a powerful enough laptop to handle it without
choking".

~~~
RandoHolmes
7 years isn't even close to a "huge span of time".

~~~
anw
In relation to Web development, a lot can (and has) happened within 7 years.

If you were a web developer in 2013, you could find plenty of jobs if you
knew Django (Python), Rails (Ruby), Zend, CakePHP or Symfony (PHP), or Spring
(Java). Those were the frameworks either used by companies (Zend, Spring,
etc.) or loved by devs (especially Rails and Django). Node was starting to
come into its own for web development, but was nowhere close to the others.

Fast-forward to 2020. Rails and Django are in decline from their peak fame.
Most of those frameworks are still being used, but for legacy reasons. The
web dev work I see is primarily related to JavaScript (React, Vue,
TypeScript) or the JVM (Scala, Kotlin, Java). The other interpreted languages
are nowhere close to their 2013 status.

Between 2013 and 2020 we also got a huge ecosystem around Golang and Rust,
not to mention Docker. All of these helped influence, or played a crucial
role in, developing websites and applications.

While a lot of these technologies were around in 2013, they definitely
weren't as pervasive or mature as they are now, and developing in 2013 was a
whole different animal from doing so in 2020.

~~~
RandoHolmes
Java : 1995 : 25 y/o.

RoR : 2004 : 16 y/o.

Scala : 2004 : 16 y/o.

Django : 2005 : 15 y/o.

golang : 2009 : 11 y/o.

Rust : 2010 : 10 y/o.

Kotlin : 2011 : 9 y/o.

Typescript : 2012 : 8 y/o.

React : 2013 : 7 y/o.

Docker : 2013 : 7 y/o.

Vue : 2014 : 6 y/o.

Not even the majority of the tech you mentioned is 7 y/o or less. And you
can't name a single piece of tech on that list that isn't actively developed
today, because they're all actively developed, including Java, the tech
that's been around for 25 years.

This is specifically why people talk about fad-driven development and make
fun of younger people who have no sense of history.

7 years just isn't a lot of time; anyone who thinks it is is a junior who
thinks they're a senior.

~~~
anw
\-- I have deleted a long comment I was going to reply with as I'm sure you
are not open to hearing actual counter arguments. I don't want to waste my
time. --

My reply instead is to read more carefully, think more carefully, and do not
throw puerile thoughtless sentences out like the last one in your reply above.

~~~
RandoHolmes
Watch as my eyeballs roll out of my head, onto the floor, and out the door.

Oh also...

\-- I have deleted a long comment I was going to reply with as I'm sure you
are not open to hearing actual counter arguments. I don't want to waste my
time. --

My reply instead is to read more carefully, think more carefully, and do not
throw puerile thoughtless sentences out like the last one in your reply above.

Weird... it seems I too can go with the better than thou act.

------
graiz
Conclusion:

> I don't think the mobile web – as experienced by users – has become slower
> overall.

Even as a subjective result this seems really bad. Most web pages are text
and images; outside of apps like maps/email, most content is static, and
while CPUs and bandwidth have skyrocketed, the UX has stayed about the
same... we think.

------
hevelvarik
Since we're assigning blame here, I blame the dominance of the web by
multi-billion-dollar tech companies. This requires any site/app that wishes
to be taken seriously to have a look and feel comparable to the big guys',
and the only way you do that on a budget is with a hefty heap of framework
abstraction.

------
unixsheikh
From a hardware perspective, everything is many times faster than it was just
ten years ago.

From the perspective of a user of the web, I cannot even find the right words
to describe the madness that front-end developers are causing with their
utterly clueless usage of absolutely ridiculous JavaScript on top of
frameworks on top of whatever other useless crap they throw into the mix,
just to display basic static HTML!

The web is not getting slower, it's getting faster; but even amazing
technology cannot compensate for front-end developer stupidity!

------
chungalunga
I've been using w3m as my primary browser for the past few weeks. Pages load
faster, but the really awesome part has been recapturing attention from
image-based ads I didn't even realize I was losing.

------
luu
This article makes the reasonable point that the web is likely getting faster
for people with cutting edge devices. For example, at one point they say

> Someone who used a Galaxy S4 in 2013 and now uses a Galaxy S10 will have
> seen their CPU processing power go up by a factor of 5. Let's assume that
> browsers have become 4x more efficient since then. If we naively multiply
> these numbers we get an overall 20x improvement.

> Since 2013, JavaScript page weight has increased 3.7x from 107KB to 392KB.
> Maybe minification and compression have improved a bit as well, so more
> JavaScript code now fits into fewer bytes. Let's round the multiple up to
> 6x. Let's pretend that JavaScript page weight is proportional to JavaScript
> execution time.

> We'd still end up with a 3.3x performance improvement.

But then the author concludes

> The web is slowly getting faster

This ignores a pretty large fraction of users. Part of the article
acknowledges that this all depends on the device, etc., but this is ignored
in the conclusion!

Let's say that, as a first approximation, the first set of quotes is correct.
I think most developers who look at user experience with respect to latency or
performance today (or even ten years ago) would agree that we should not only
consider the average and that we should also look at the tail. If we do so, we
see that device age is increasing at the median and the tail, quite
drastically in the tail even if we "only" look at p75 device age:
[https://danluu.com/android-updates/](https://danluu.com/android-updates/).

If we consider a user who's still using a 2013 Galaxy S4 and ask "does your
phone feel 5 times faster than it did in 2013?", based on some js benchmarks
improving by 5x, I think they'll laugh in our face. I've used a couple of
Android devices that I tried to keep up to date (to the extent that's possible
on Android) and each one became unbearably slow after taking some big Android
update. Those updates probably included improvements in the Android Runtime as
well as V8, and yet, the net effect was not positive. I don't think I'm alone
in this -- if you read any forum where people discuss taking updates for the
phones, one of the most common complaints is that their previously usable
phone became unusable due to performance degradations caused by the update.

Sure, the user experience on my daily-driver phone is OK, because I have a
very fast phone and I'm often using it on fast wifi. But the experience is
terrible anywhere in the U.S. with an old phone, as I find whenever I take a
road trip across the country. I don't think we should just write off the
experiences of people with old phones, or of people who can't get high-speed
internet where they live, even if life is good for people like me when I'm at
home on my 1Gb connection. When I looked at this with respect to bandwidth
and latency (inspired by a road trip where I found every website from a major
tech company, excluding a few Google properties, to be unusable), I found
that, on a slow connection like you get in many places in the U.S., websites
can easily take more than 1 minute to load in a controlled benchmark:
[https://danluu.com/web-bloat/](https://danluu.com/web-bloat/). My experience
in real life (where I probably had higher variance in latency, packet loss,
and effective bandwidth) was that many websites simply wouldn't load.

One thing this post looks at is the 75%-ile onLoad time. When I travel
through the U.S. on the interstate (major thoroughfares which will, in
general, have better connectivity than analogous places off of them), most
pages are so slow that they don't load at all, so those attempts aren't
counted in the statistics! I don't dispute that things are getting faster for
the median user, or even the 75%-ile slow user measured in a specific way,
but there are plenty of users whose experiences are getting worse and who
won't even show up in the stat used in the post, because their experience is
too slow to be counted at all.

~~~
jaclaz
>Sure, the user experience on my daily-driver phone is OK, because I have a
very fast phone and I'm often using it on fast wifi. But the experience is
terrible anywhere in the U.S. with an old phone, as I find whenever I take a
road trip across the country.

This is a pet peeve of mine: in my (perverted) mind, developers/programmers
(not only web-related ones) who usually work with high-performance hardware
and connections (which is fine: it is their work, and they deserve the best
of the best) should have a low-performance setup (simulated in a VM, or
simply some oldish hardware with a limited amount of RAM and a slowish
processor) on which to test what they release to the public for
interaction/responsiveness/etc.

In my experience, bar the professional developers or programmers and a few
designers, architects, engineers, etc., the only people with high-end
hardware are gamers; all the rest (both at home and in the office) tend, for
different reasons, to have relatively underpowered machines.

------
TeeMassive
> Mobile page weight increased by 337% between 2013 and 2020. This is
> primarily driven by an increase in images and JavaScript code.

Not surprising. Most webpages are over-engineered blogs. If you're not
organizing data other than text and a few multimedia elements, then you
probably don't need millions of lines of JS library code. Not only does it
slow the transfer; it takes time to execute, making most web pages do quirky
things for 1 to 10 seconds before the page finishes loading.

------
mjevans
Hellish AutoPlay Videos... a plague.

------
dognotdog
I've recently tried to make a simple webpage snappier. Replacing hi-res
images with appropriately sized ones, and including heights (not only widths)
to prevent re-layouts, was easy enough (a sketch of the fix is below), but
then…

The single biggest offender in both download and render time, which I can do
nothing about, is YouTube loading over 400KB of JavaScript for an embedded
video, taking ages to first render and making everything feel glacially slow.
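
For reference, the image fix looks roughly like this (file names and
dimensions are placeholders):

    <!-- width/height let the browser reserve space before the image
         arrives, avoiding re-layouts; srcset offers right-sized files -->
    <img src="photo-800.jpg"
         srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
         sizes="(max-width: 600px) 100vw, 800px"
         width="800" height="533" alt="Example photo">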

------
ChuckMcM
The last time I did a test like this was on the Blekko web crawler, and page
load time was proportional to the number of advertisements on the page.
------
jeroenhd
> However, the 1.6Mbps 3G connection emulated by HTTP Archive is very slow, so
> we should expect significant performance improvements as bandwidth improves.
> The average website downloads 1.7MB of data in 2020, which will take at
> least 9s to download on the HTTP Archive connection.

I have family in an African country where the only reasonably priced home
internet connection is about 56kbps over DSL (yes, dialup speeds over DSL,
very confusing). Web pages there have gone from very slow to unusable very
quickly.

I believe and appreciate that the users most people target have faster
devices now, especially in the core hubs of web development like Silicon
Valley. However, this trend of "we can fit more data in because devices are
faster" is horrible for anyone already behind in available technology, or on
a limited data plan.

My father uses my old smartphone, a OnePlus One with the latest release of
LineageOS. This device runs a Qualcomm Snapdragon 801, a chip considered
flagship-fast at the time of its release. WiFi speeds are over 100Mbps, but
websites and applications are still getting slower somehow.

Even a mid-range or cheap smartphone has a better GPU than older devices, and
there are many of those out there. The advice to test on budget smartphones
is solid, but people often go out and buy a new cheap phone instead of using
something that was popular a few years ago. People who can't afford, or don't
see the value in, getting a new S20 don't necessarily have cheap smartphones;
they often have second-hand phones or hand-me-downs. Frameworks like Flutter,
and browser engines that render to canvas, are very noticeably slow on those
older devices because of how far GPU tech has advanced since, while CPU
improvements in smartphones have slowed over the years.

Is the web slower? No, not for the people you want to sell your product to.
The question is, why should we accept this bloat? Web applications can be as
slow as you want to make them, with features and pretty designs, for all I
care, but the web in general doesn't need two megabytes of JavaScript to
render a manual or a forum post. The metrics discussed here are wrong in the
context they were originally used in, so the article does have a point.
However, I don't think we should say the web hasn't gotten slower just
because computers have increased speeds to compensate for the load. Every
byte you save, every script you avoid, every image you compress can have a
significant impact on hundreds of millions if not billions of people. If you
can't justify that to your team lead, use the argument that the fewer
resources you use, the better the users' battery lives are and the more
responsive the site feels. The impact is about the same.

------
bobloblaw45
Give almost any local news website a shot. Those things are so bad I think
long and hard before clicking.

------
fomine3
TCP slow start (or its QUIC equivalent) should be considered. Speed tests
usually download large files, so the TCP window grows large, but small
websites are sometimes not big enough for the window to ramp up.
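
A rough worked example, assuming the common initial congestion window of 10
packets (RFC 6928) and ~1460-byte segments:

    RTT 1: ~10 packets ≈ 15 KB  (cumulative ~15 KB)
    RTT 2: ~20 packets ≈ 29 KB  (cumulative ~44 KB)
    RTT 3: ~40 packets ≈ 58 KB  (cumulative ~102 KB)

So a 22 KB page like HN's front page is done in about two round trips, while
a 1.7 MB page spends around seven round trips just growing the window before
bandwidth even starts to matter.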

------
dirtyid
I'd be interested in a breakdown of whether the web is slowing down
differently by region. I imagine uptake of new web technologies and hardware
differs according to GDP and baseline mobile or network capabilities.

------
dave_sid
It’s that time again when we need to swing from fat clients to thin clients,
and think about what went wrong. It swings back every 20 years or so, so don’t
worry.

------
kahlonel
Why can't we have a direct statement like "The web isn't getting slower",
instead of click-baity question titles?

------
Datsundere
I've been thinking of creating a new browser that doesn't have to rely on the
dumpster that Node is, and a rendering API that's better than CSS.
~~~
pkphilip
About time somebody did that. I am interested in doing something similar as
well.

------
ShradhaSingh
Average internet speeds all over the world have slowed. Some broadband
providers are feeling crushed by the heavy traffic.

------
nottorp
No need to RTFA. The answer is yes. And don't make excuses for them.

------
coronadisaster
gmail is really slow for a company that deranks slow websites...

------
known
I think torrents should be widely created/used;

------
api
For a web site similar to one from 10 years ago? No, it's getting a lot
faster.

The problem is that web sites are getting more and more bloated and doing so
faster than CPU or bandwidth is increasing.

------
bfrog
Yes. The answer to the question is yes.

------
juststeve
The web is like the third Matrix movie.

------
ImAlreadyTracer
I've been noticing this lately.

------
layoutIfNeeded
A rare counterexample for Betteridge's law of headlines.

------
chews
Does a bear poop in the woods? Is the earth round?

~~~
hinkley
So... "mostly"?

