
The Website Obesity Crisis - jmduke
http://idlewords.com/talks/website_obesity.htm
======
rdtsc
I think it is because people (designers, coders, etc) get bonuses and
paychecks for creating stuff more than tearing down stuff.

Put this on your resume -- "Implemented feature X, designed Y, added Z" vs
"Cut out 10k lines of crap only 10% of customers used, stripped away a stupid
1 MB of JS that displays animated snowflakes, etc." You'd make a better
impression by claiming you added / created / built, rather than deleted.

So it is not surprising that more stuff gets built, more code added to the
pile, more features implemented. Heck, even Gmail keeps changing every 6
months for apparently no reason. But in reality there is a reason -- Google
has full-time designers on the Gmail team. There is probably no way they'd end
the year with "Yep, the site worked great, we did a nice job 2 years ago, so I
didn't touch it this year."

~~~
otakucode
That is most likely part of it. Likewise, the current structure of most
companies simply cannot function in the face of something being 'done'.
Someone has to keep the developers busy for the 40 hours a week they must be
in their chairs to get paid. Even if that weren't an issue, you will always
have someone who wants a promotion. And they will need to have something to
show off to get it. So they will come up with a 'brave new direction' and use
sheer force of will to make it happen.

I remember when I first realized that software companies had a fundamental
problem with how they handled finished products. BulletProof FTP. It was a
magnificent FTP client decades ago. I used it on my dialup connection to
practice data hoarding. I paid for it because it was a trustworthy companion
in my adventures through the net. But that turned out to be its downfall. It
was feature-complete and it was excellent (although they never did fix the
problem with quick reconnects resulting in a pre-existing binary transfer
socket getting picked up for use as the command connection, spamming the thing
with random binary data and confusing the hell out of itself and the user...).
But clearly, that was unacceptable. They continued to shoehorn in irrelevant
crap that nobody wanted. Eventually it got so bad that it wasn't even very
good at being an FTP client any more. I've seen that repeated innumerable
times since: a product is completed, but people still have to be kept busy
with fake work that destroys everything they built. The fact that we still
use the structures and ideas designed for factories and assembly lines in
modern companies is ludicrous.

~~~
panic
As written by the Conway of "Conway's law" in 1968:

 _As long as the manager's prestige and power are tied to the size of his
budget, he will be motivated to expand his organization. This is an
inappropriate motive in the management of a system design activity. Once the
organization exists, of course, it will be used. Probably the greatest single
common factor behind many poorly designed systems now in existence has been
the availability of a design organization in need of work._

~~~
lifeisstillgood
Great quote - any online references? Or even a new book I need? :)

~~~
panic
Here's where I got it:
[http://www.melconway.com/Home/Committees_Paper.html](http://www.melconway.com/Home/Committees_Paper.html)

------
Too
> _It's like we woke up one morning in 2008 to find that our Lego had all
> turned to Duplo. Sites that used to show useful data now look like
> cartoons._

That is the best description I've heard of the recent trend of making every
item cover 30% of the page so that you can only fit 2 data points. What is the
deal with all this? Keeping the number of options down is one thing, but
making repeated tables of data gigantic serves no purpose at all. It might
look good in a thumbnail of a screenshot but actually using it is next to
impossible.

~~~
hzhou321
That is what we call _fashion_. With today's bandwidth and CPU/monitor power,
the utility part is no longer the focus, so the fashion part dominates.

 _Fashion_ does not need to make sense. Sensible folks (the minority of the
family) may choose to ignore fashion; but if you are criticizing fashion (with
seriousness), then you are playing the wrong game.

~~~
nhf
If we're using fashion as an analogy, it's kind of a mixed bag.

There's both this type of fashion: [https://s-media-cache-
ak0.pinimg.com/236x/c0/66/12/c06612029...](https://s-media-cache-
ak0.pinimg.com/236x/c0/66/12/c06612029107b28aeac841b1586cce8c.jpg)

...and this type of fashion:
[http://img.izismile.com/img/img7/20140521/640/fashion_runway...](http://img.izismile.com/img/img7/20140521/640/fashion_runway_clothing_that_is_weird_and_wacky_640_34.jpg)

out there.

Things can be fashionable, yet crisp and usable. Things can also be crazy,
stupid (to most), and groundbreaking (to a few). Put another way, fashion
currently dominates, but that _would_ be okay since we don't need to sacrifice
good visual design in the name of performance any more.

Unfortunately:

* People get lazy.

* People don't user test.

* People like to follow trends.

* Designers and developers get micromanaged.

...along with all of the other monkey wrenches that most developers and
designers have gotten used to. People end up building cool-looking websites
that aren't as usable as they should be.

------
userbinator
_" Chickenshit Minimalism: the illusion of simplicity backed by megabytes of
cruft."_

This is not restricted to websites - a lot of software has suffered from the
same trend, where newer versions look simpler - and often have reduced
functionality - while for some reason still requiring more resources than the
previous version.

~~~
danso
What kind of examples do you have in mind? I don't disagree entirely...but
I've found that sometimes what seems like gratuitous weight is a result of the
amount of code that has to be used to not only fix past bugs, but reconcile
all the new standards and external mishmashes (such as interoperability with
operating systems and other libraries that themselves have been upgraded) that
have come into existence.

The magnitude of increase in lines of code for the installers we download
definitely outpaces the increase in functionality, compared to 1 or 2 decades
ago...but we have a lot more systems to interoperate with. That said, I'm
always gobsmacked when I download a relatively new game from Steam and it
weighs in at under 100MB...which would've been 70+ floppy disks back in the
day :)

(though in the case of games, the increased weight is most often due to
multimedia assets)

~~~
tim333
Not the poster and not quite what you were talking about, but I was recently
impressed that the new Node version of the WordPress control panel is 126 MB,
whereas the old PHP version, including that and the rest of WordPress, is 24 MB
(both unzipped). I guess the Node thing includes Node and a browser, but it
still seems to have quite a high bloat-to-functionality ratio.

~~~
cben
I wonder how much of that is simply npm duplicating libraries (at slightly
different versions) due to its no-shared-dependencies approach?

[https://github.com/Automattic/wp-
calypso/blob/master/package...](https://github.com/Automattic/wp-
calypso/blob/master/package.json) uses ~130 libs. After npm install (incl. dev
deps), node_modules/ weighs 170M in file size (and a whopping 550M(!) in disk
size). Let's check duplication, assuming equal size & name indicate equal
content. Dirs:

    
    
        $ sed 's@\S*node_modules/@@' DU | sort -k2,2 -k1,1 | wc -l
        6259
        ~/wp-calypso (master) $ sed 's@\S*node_modules/@@' DU | sort -k2,2 -k1,1 | uniq | wc -l
        3286
        ~/wp-calypso (master) $ sed 's@\S*node_modules/@@' DU | sort -k2,2 -k1,1 | uniq --skip-fields=1 | wc -l
        2799
        $ sed 's@\S*node_modules/@@' DU | sort -k2,2 -k1,1 | uniq --skip-fields=1 --count | numaverage 
        2.23615576991783
    

Bytes:

    
    
        $ sed 's@\S*node_modules/@@' DU | sort -k2,2 -k1,1 | tr -d , | numsum | numfmt --grouping
        170,022,500
        ~/wp-calypso (master) $ sed 's@\S*node_modules/@@' DU | sort -k2,2 -k1,1 | uniq | tr -d , | numsum
        117,107,993
        ~/wp-calypso (master) $ sed 's@\S*node_modules/@@' DU | sort -k2,2 -k1,1 | uniq --skip-fields=1 | tr -d , | numsum | numfmt --grouping
        91,895,673
    

=> dirs appear in ~2.2 places on average; ~31% of the total size is wasted on
exact dups, and ~14% more is spent on different versions of the same lib.

    
    
        $ npm dedupe
        ...
        $ du --summarize --apparent-size node_modules/
        166,097,508	node_modules/
    

Underwhelming! Only 2%?! dedupe is constrained by npm's lookup algorithm (it
can only lift equal versions to a parent dir), but 2% is useless. They should
have used sym/hard/reflinks.

Anyway, I now know npm's exact-duplication overhead is not huge (though it
could be linked); inexact duplication is small enough to be easily worth the
ability to mix versions; and the new control panel is indeed bloated [however,
I assume it has more functionality than the old one?].

------
BinaryIdiot
For the most part this is a pretty good article. I find that the more removed
someone is from real, vanilla HTML, the more bloat they will inadvertently
bring in. Need to focus a form? Download an Angular plugin because you're
using Angular -- why would you want to do it natively!

The native DOM is pretty inelegant, but at the same time it can't be ignored;
it must be understood. You don't need a new plugin, font, or CSS reset file to
accomplish everything you want, and you can even do it cleanly!

I was a little concerned about this part though:

> [...] ad startups will grow desperate [...] This [is] why I've proposed we
> regulate the hell out of them now [1].

I'm all for being able to download your information, but some of the other
proposals are just a bit off the mark. Deleting your data, for example, can be
problematic in any type of collaborative / productivity app. The right to go
offline is nice in theory, but many devices may genuinely need the internet to
function. The examples given are good illustrations of things that can be
"smart" and "dumb", but what about similar things, like sensors and other
types of trackers? It seems market pressure would be better than regulation at
changing those.

[1]
[http://idlewords.com/talks/what_happens_next_will_amaze_you....](http://idlewords.com/talks/what_happens_next_will_amaze_you.htm#six_fixes)

~~~
jonahx
> Need to focus a form? Download an angular plugin because you're using
> angular, why would you want to do it natively!

Otherwise known as CBDD -- "Code Bootcamp Driven Design"

------
timthorn
Byte magazine had a cover feature 23 years ago entitled "Fighting Fatware".

We're doomed to repeat history...

[https://archive.org/details/BYTE-1993-04](https://archive.org/details/BYTE-1993-04)

The article starts: Dave Brown, a Keene, New Hampshire-based entrepreneur, got
his Christmas wish last year - a copy of Microsoft's Access relational
database manager for Windows. Excitement turned to disappointment, however,
once Brown tried to run the program. Despite the fact that his system had the
4 MB of RAM that Microsoft recommends, Access was "hideously slow." A call to
Microsoft technical support revealed the truth: He needed at least 8 MB of RAM
to achieve acceptable performance. Now Brown has two options: He can spend
$200 for more RAM or wait for version 1.1, which Microsoft claims will run
better with 4MB.

~~~
timthorn
A second notable quote: One controversial aspect of the software-bloat problem
is the increased use of high-level languages, particularly C and C++. While
assembly programming can produce very tight code, the common belief is that
with C or any other high-level language the code will be larger. However, a
good programmer, says WordPerfect's LeFevre, can minimize the growth of code
while making the most of a high-level language's advantages. Lotus
Development (Cambridge, MA) ported 1-2-3 from assembly to C between versions
2.01 and 3.0. Consequently, the code size nearly tripled, from 1.4 MB to 4 MB.
Not all of that growth is attributable to the difference between C and
assembly (version 3.0, for example, included the printing utility Allways and
had significantly more features), but it was a major contributor. The
resulting product would not run acceptably under DOS until the company delayed
its release to compress and optimize the code.

------
Animats
On Google's AMP, which endlessly reloads a picture carousel: _" These
comically huge homepages for projects designed to make the web faster are the
equivalent of watching a fitness video where the presenter is just standing
there, eating pizza and cookies."_

On trying to fix the problem: _" These comically huge homepages for projects
designed to make the web faster are the equivalent of watching a fitness video
where the presenter is just standing there, eating pizza and cookies."_

Someone recently commented on one of my web pages for being unusual in that
the pictures were all directly related to the copy.

~~~
JadeNB
> On Google's AMP, which endlessly reloads a picture carousel: "These
> comically huge homepages for projects designed to make the web faster are
> the equivalent of watching a fitness video where the presenter is just
> standing there, eating pizza and cookies."

> On trying to fix the problem: "These comically huge homepages for projects
> designed to make the web faster are the equivalent of watching a fitness
> video where the presenter is just standing there, eating pizza and cookies."

Did you mean to have the same quote in both paragraphs?

~~~
Animats
Oops, sorry.

------
lewisl9029
A large part of the code bloat problem could be resolved if the JS ecosystem's
most popular build tools had support for some of the advanced compilation
features in Google's Closure Compiler, like dead-code elimination and cross-
module code motion [1].

ClojureScript makes heavy use of the Closure Compiler's advanced compilation
features, and as a result generates code that is often orders of magnitude
smaller than it would be without those features. Think of what a bloated
mess a ClojureScript app would be if it had to include the _entire_
ClojureScript standard library with every build. This is exactly what's
happening in the JS world, where developers include by default the entirety of
any utility library they're using, even when they're only calling a handful of
functions from it.
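The library-inclusion problem described here can be sketched in plain JavaScript (the `some-utils` library and the `pick` helper below are hypothetical, just for illustration):

```javascript
// Why ES6 modules enable dead-code elimination, in miniature.
// A CommonJS require is an opaque runtime value, so a bundler can't
// safely drop anything:
//
//   const _ = require('some-utils');   // the whole library ships
//   _.pick(obj, keys);
//
// An ES6 named import is statically analyzable, so a tree-shaking
// bundler (e.g. Rollup, or webpack in production mode) keeps only
// what is actually referenced:
//
//   import { pick } from 'some-utils'; // unused exports are dropped
//
// Simulating the one function that would survive tree-shaking:
function pick(obj, keys) {
  const out = {};
  for (const k of keys) {
    if (k in obj) out[k] = obj[k];
  }
  return out;
}

console.log(pick({ a: 1, b: 2, c: 3 }, ['a', 'c'])); // { a: 1, c: 3 }
```

The point is that a CommonJS `require` forces the bundler to assume any export might be reached at runtime, while static ES6 imports let it prove otherwise, which is what makes tree-shaking safe.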

Before anyone starts suggesting "just use Closure Compiler for JS projects",
it's really not that simple (at least when I last looked into it). There's a
huge amount of friction involved in using the Closure Compiler for a regular
JS project (most of which wouldn't apply to a ClojureScript project because
its JS output is machine-generated, and its build chain is designed to work
_exclusively_ with the Closure Compiler and all its quirks), namely writing
all your code as Closure Modules, defining externs for any third party
libraries you use, and setting up the JVM-based compiler itself and
integrating it with the rest of your build tools.

I hope to see some improvement in this area with the dawn of ES6 modules,
since they were designed from the ground up with static analysis in mind.
Robust and accessible dead-code elimination and cross module code motion for
ES6 modules could easily bring about a _revolution_ in JS code sizes on the
web.

[1]
[https://developers.google.com/closure/compiler/](https://developers.google.com/closure/compiler/)

~~~
dchest
_" A large part of the code bloat problem can be resolved..._"

...with deflating the front left tire a little bit, putting a magnet on the
gas cap, folding in the side mirrors.

I think you missed the point of the talk (or didn't read or watch it), and the
kind of solution you proposed is mentioned there.

(PS I agree with what you said, though, regarding JS dead code elimination.)

~~~
lewisl9029
I was only commenting on the first-party code bloat side of the problem
mentioned in the article, and I realize this only addresses a part of it.
Asset bloat and third party scripts are entirely different beasts that can
only really be tackled on a case-by-case basis.

------
meesterdude
Overall, this was a good article and it had me eyerolling at some of the
dumb...dumb things people do. Like the internet.org background logo
continually downloading a movie. Whoever did that should not be making
websites.

Likewise, I took a look at my project and was able to chop the JS size in half
by yanking out some libraries I no longer use, so now the JS and CSS are each
under 500K. Still a 1.2MB load overall, but it's also cached, and it's an app
people will visit more than once.

I hate that my CSS is close to 500K though. The design itself isn't that
complicated; but I'm basing it off of a bootstrap theme and until I know what
I'm using I can't prune much.

And I think that's a source of some bloat: frameworks and libraries. But it's
a tradeoff; it's code I don't have to write, which lets me get a better
product to market faster. Sure, I could really spend the time to prune all my
assets, and I think one day that will be a good move to make. But for me, and
for many other projects, it's a tradeoff.

Usually media is the big one to blame, and things like streaming a background
movie and eating up hundreds of megabytes in bandwidth to display it is simply
irresponsible.

In Apple's case, they probably want their images to be high resolution, which
is understandable. But even then they could (may even already) run it through
some compression filters to reduce the size without hurting the quality.

It's something we should all be mindful of. You can, but don't have to, go to
extreme lengths to reduce the size of a site. There is often some low-hanging
fruit you can reach for that gets you 80% of the way there. And obviously, if
it's a site people visit a lot versus a site people will visit once, your
priorities for optimization are going to be different.

~~~
garrettgrimsley
For pruning your CSS:
[http://stackoverflow.com/a/3113120](http://stackoverflow.com/a/3113120)

~~~
mschuster91
The problem with CSS pruning is that you have to run it over ALL the pages you
have and all kinds of dynamic pages you generate - or you lose styles you
actually need.
------
mschuster91
> The article somehow contrives to be 18 megabytes long, including (in the
> page view I measured) a 3 megabyte video for K-Y jelly, an "intimate
> lubricant".

(Warning. Swear words incoming, because the situation has grown far out of
control)

Fuck websites with autoplay (or autoplay-on-hover) videos. Fuck them. Whoever
invented or implemented this crap, please resign from your post immediately.

Even in 1st-world countries, people work over 3G/4G with data caps or are in
otherwise bandwidth-constrained environments (public hotspots, train/bus
wifi). You are screwing over your users and don't realize it.

Also, something that especially comes to mind with Spiegel Online: a 30-second
video clip with a 25-second advertising clip. Fuck you.

> Why not just serve regular HTML without stuffing it full of useless crap?
> The question is left unanswered.

Easy, actually: because a well-defined, restricted subset of HTML can be
machine-audited and there is no way to abuse it. Also, Google can save
resources when indexing.

~~~
Amezarak
> Easy actually: because a well-defined restricted subset of HTML can be
> machine-audited and there is no way to abuse it

This is why I personally use NoScript.

People often ask me "how can you stand to use NoScript on the modern web?
Isn't it a huge pain to whitelist scripts? Isn't everything broken?"

Nope. Most of the web works _perfectly fine_ without loading Javascript from
35 different domains (as my local news site does). You whitelist a few webapps
and you're pretty much good to go. The difference is incredible. Your browser
uses a fraction of the memory. Pages load faster than you can blink. The
browser never hangs or lags. Pages scroll as you expect. Audio and video
never autoplay and distract you. When I briefly used NoScript on mobile, it
was a miraculous life-saver that made my battery live forever.

In the past couple of years, however, I have noticed a new phenomenon.
Remarkably - madly, in my view - there are webpages, webpages that should be
_simple_, webpages by all appearances that consist of nothing more than a
photo (maybe more than one), a byline, and a few hundred words of text, that
_require Javascript to load_. As in, you will get a blank page or an error
message if you don't have their scripts enabled.

I don't understand it. I don't want to understand it. I just want it to stop.

I understand that you need Javascript and so forth to run a webapp. I'm not
even asking for your webapp to be less than 5MB. Hell, make it 50MB (I just
won't ever use it on mobile.) Making applications can be a lot of work, maybe
yours is super complicated and requires tons of libraries or some crazy media
loading in the background and autoplaying videos and god knows what else.

But please, please, _don't_ require Javascript to simply _load_ an article
and don't make a simple article 5MB. Why on Earth would you do that? How many
things have to go wrong for that to happen? Who is writing these frameworks
and the pages that use them?

~~~
bootload
_" In the past couple years, however, I have noticed a new phenomenon.
Remarkably - madly, in my view - there are webpages, webpages that should be
simple, webpages by all appearances that consist of nothing more than a photo
(maybe more than one), a byline, and a few hundred words of text, that require
Javascript to load. As in, you will get a blank page or an error message if
you don't have their scripts enabled."_

I use NoScript by default and I'm noticing the same thing. I post them to
twitter. Here's a sample:

- 'Here are the instructions how to enable #JavaScript in your #web
#browser.'

- 'For full functionality of this site it is necessary to enable
#JavaScript.'

- 'You must enable #javascript in order to use #Slack. You can do this in your
#browser settings.'

- 'You appear to have #JavaScript disabled, or are running a non-JavaScript
capable #web #browser.'

- 'Please note: Many features of this site require #JavaScript.'

- 'Tinkercad requires #HTML5/#WebGL to work properly. It looks like your
#browser does not support WebGL.'

- 'Warning: The NCBI web site requires #JavaScript to function. more...'

- 'Whoops! Our site requires #JavaScript to operate. Please allow JavaScript
in order to use our site.'

- 'The #media could not be played.'

- 'Notice: While #Javascript is not essential for this website, your
interaction with the content will be limited.'

- 'Powered by #Discourse, best viewed with #JavaScript enabled'

~~~
toomanybeersies
It shouldn't be surprising that Tinkercad and Slack require JS.

~~~
mistercow
Seriously, did they think Tinkercad was just going to be a pile of CSS
transforms?

~~~
artursapek
He's probably talking about their home page, which is just text and images
selling their product.

------
liampronan
I've found Google's PageSpeed Insights [0] to be a great resource for keeping
obese sites in check. The site will give you stats on load time as well as
instructions on how to optimize it, which is especially important on a user's
first visit -- e.g., I will bounce from a new site in X seconds, but may give
an established site (think Amazon) X + Y seconds to load. It's easy to miss
how important page load is when you've been working on a site a bunch, but
every extra second of load time has been shown to impact sites' bottom lines
[1].

0 -
[https://developers.google.com/speed/pagespeed/insights/](https://developers.google.com/speed/pagespeed/insights/)
1 - [http://www.fastcompany.com/1825005/how-one-second-could-
cost...](http://www.fastcompany.com/1825005/how-one-second-could-cost-
amazon-16-billion-sales)

------
SwellJoe
I have been using slow internet more often lately, because I'm traveling full-
time again, and only using 3G/4G broadband. It is remarkable how large some
sites have gotten since I was last in this situation. Despite mobile broadband
being somewhat faster now than a few years ago, the time to load (to
usability) for many sites is much higher. HN is notable for loading instantly
in these circumstances, not because the server (I'm guessing it still runs on
one server, plus CloudFlare, but I might be guessing wrong) is blazing fast,
but because it is so small.

~~~
ahstilde
This is why I use opera mini on mobile. It really speeds up browsing.

~~~
SwellJoe
I haven't tried Opera in many years. Will have to give it another look. But,
I'm mostly not browsing on a mobile device...I'm using my laptop through a
mobile broadband hotspot.

------
vancan1ty
I think the primary reason for the rise of increasingly heavy sites is that
animations and visuals can be used to attract and "hook" your reader.

Just as fish like shiny spoons and minnow lookalikes and monkeys like shiny
objects, humans like pretty pictures and flashing visualizations.

Distraction is the same principle that drives the success of TV. It is so damn
_easy_ to just sit in front of the screen and grok out, never mind the fact
that the signal to noise ratio is often astonishingly low.

Quality thought and challenging content consumption is much harder than simply
letting yourself admire shiny visuals. Therefore, simple websites, while they
may contain excellent and meaningful content, will often not stimulate the
user's interest as much as animated websites with large pictures.

~~~
meesterdude
CSS animations can be done in only a couple of lines, so I wouldn't consider
them the primary driver, or any significant factor, behind bloat.

~~~
whorleater
But CSS animations are...rough currently, not standardized, and don't have a
decade's worth of knowledge behind them. I can sure make an element fade in
on a webpage with CSS, but there are already a billion ways to do it in
jQuery on StackOverflow, so it's a lot more appealing to use Javascript at
first.

------
riboflavin
My only concern about this is that a _lot_ of the technologies identified as
"adtech" on that diagram just... aren't:

* Vimeo

* Hootsuite

* LinkedIn

* UserTesting.com

Marketing != advertising. The overall point is really valid, but this is a
dumb way to back it up.

It's bad enough to just take the ad-serving parts of the diagram he uses,
which add up to hundreds of technologies (or use Ghostery on any news site).

~~~
jacquesm
I'm not with you on linkedin, that's not marketing, that's malware.

------
vinceguidry
It's way easier to acquire stuff than it is to get rid of it, whether it's
website bloat or personal possessions. It takes discipline to adhere to
procedures that trim unneeded code / dependencies as they lose relevance. If
you don't do it while the reason the code was put in in the first place is
fresh in mind, there will be a natural tendency to kick the can down the road.

~~~
hrabago
Not to mention, when they load up on all the new stuff, they try it out on
fairly modern machines and it loads well enough, with a load time that's just
good enough, so there's even less motivation to trim down.

I also would love to have leaner websites and less bloat, but I recognize that
good enough still passes for good enough. It's only when things go past a
tipping point that people pay attention, such as when iMore got called out by
John Gruber.

------
collinmanderson
Is it hypocritical that this website is over 1 MB and has over 100 requests?
[http://www.webpagetest.org/result/160101_KN_FH3/](http://www.webpagetest.org/result/160101_KN_FH3/)

~~~
dchest
No, it's not hypocritical: these 1 MB and 100 requests _are_ the content of
this article, i.e. useful payload, as opposed to the useless, irrelevant junk
on most websites these days.

~~~
jarek
Arguably, many of the "slide" images aren't directly relevant to the content,
they're just there because you're expected to have images in a tech conference
talk.

------
bhauer
I was having a good time, reading the article with a grin on my face. Until I
got to

> _On top of it all, a whole mess of surveillance scripts_

And I just lost my cool and laughed out loud. Well written, sir.

~~~
mirimir
Yeah, me too ;)

But he also says this:

> I bet if you went to a client and presented a 200 kilobyte site template,
> you’d be fired. Even if it looked great and somehow included all the
> tracking and ads and social media crap they insisted on putting in. It’s
> just so far out of the realm of the imaginable at this point.

If that's possible, then my takeaway is that bloat stems from sloppy
implementation of all sorts. Fonts. Ads. Tracking. All of it. I suspect that
the copy-and-paste approach accounts for a lot of it. And using third-party
resources, such as Disqus for comments.

~~~
mschuster91
> And using third-party resources, such as Disqus for comments.

I actually kinda like Disqus. Centralized notifications for replies and no
more signing up in order to post a comment (for the users), and as a site op I
don't have to deal with spam or people trying to XSS my comments, and
especially: I can statically cache ALL the content and even run without any
database at all!

~~~
mirimir
OK, so I shouldn't have included Disqus under sloppy ;) You do get secure
managed comments. But isn't there a bloat cost, too? There are also security
and privacy risks in using third parties.

~~~
harry-wood
Yep. Disqus is one of the more understandable shifts-to-bloat which developers
have made in recent years. Understandable because spammers have made it so
difficult to run our own comments. The selfish anti-social behaviour of this
small minority of "SEO experts" ends up forcing everyone into tech choices we
should otherwise be avoiding.

~~~
justaaron
spot on. however, network trust issues are behind so many reasons why we can't
have nice stuff... naive initial approaches to network-sharing of resources +
lots of papering over = 20+ years of broken web

------
ahoge
Most of the weight comes from images. The average website (Alexa top 1M)
contains over 1.4 MB of image data:

[http://httparchive.org/interesting.php](http://httparchive.org/interesting.php)

Be sure to pick the most suitable format and to optimize your images. You can
also try to serve WebP to browsers which support it. When it replaces JPEG,
you save about 30%. With PNG8, it's somewhere between 5 and 50%. And with
PNG32, if you substitute it with a lossy WebP, easily 80%.
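For the "serve WebP to browsers which support it" part, one common approach is content negotiation on the request's Accept header; a minimal sketch (the helper name is mine, not from the comment):

```javascript
// Browsers that can decode WebP advertise it in the Accept header, e.g.
// "Accept: image/webp,image/apng,image/*,*/*;q=0.8".
// Given that header, pick the WebP variant; otherwise fall back to JPEG.
function pickImageVariant(acceptHeader, basename) {
  const supportsWebP =
    typeof acceptHeader === 'string' && acceptHeader.includes('image/webp');
  return supportsWebP ? basename + '.webp' : basename + '.jpg';
}

console.log(pickImageVariant('image/webp,image/*,*/*;q=0.8', 'hero')); // hero.webp
console.log(pickImageVariant('image/jpeg,image/png', 'hero'));         // hero.jpg
```

If a server does this, it should also send `Vary: Accept` so shared caches store the WebP and JPEG variants separately.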

Scripts come 2nd with ~363 KB. ES6's modules will thankfully help with that.
Creating the structure of your application declaratively enables better
tooling. Not only does this make your editor a lot smarter, it also paves the
way for tree-shaking.

If you tree-shake your application, only the actually used library features
will be included in the shipped bundle.

------
CaptSpify
Disclaimer: my own blog. If this is considered spamming here, feel free to
remove

[https://blog.thekyel.com/?anchor=Why_I_Block_Scripts_and_Ads](https://blog.thekyel.com/?anchor=Why_I_Block_Scripts_and_Ads)

------
TazeTSchnitzel
It won't be long before my site which literally downloads an entire Windows 95
disk image on every page load will be considered average-sized. It's a mere
order of magnitude away.

~~~
TheRealDunkirk
Interesting analogy. I grew up using Commodore computers, but I'm not trying
to race to the bottom of computing minima. I remember when (we) engineers got
REAL WORK DONE on x86 PC's with 10 MB HDD's, running AutoCAD, Lotus 123, and
Word Perfect. This was inarguably the beginning of the era of the useful,
general-purpose computer. You could load a computer DOS 6.22, Windows 3.11,
several of these programs, and still have room leftover for Doom. The author
references Apple's iPad page at 51 MB. The last time I rebuilt my gaming PC,
the MOUSE DRIVER was 150 MB! It makes me weep for where we are. I don't see
that things have really changed all that much in 25 years, as far as getting
"real work" done. It's just... more for the sake of "more." (The games are
cool, though.) In my opinion, all the bells and whistles really haven't added
anything to the web browsing experience for about 15 years.

~~~
TazeTSchnitzel
It's not just an analogy, it's a real-world example!

[http://win95.ajf.me/](http://win95.ajf.me/)

The disk image is 47MB (when gzipped). This means that the page is actually
_smaller_ than Apple's iPad page!

I likewise weep for modern computing.

------
bikamonki
So I keep telling clients: using WP for your low-traffic site is like renting
an 18-wheeler to move a small box. Yet I feel like doctors must feel when
patients argue by quoting something they read on the Internet: suddenly they
are the experts now :(

~~~
x1024
It depends on your point of view. The server load caused by WP is much larger
than most alternatives, yes. But the development (and ops) time required to
ship a WP site is much lower than for any alternative I know of.

~~~
manigandham
Yet WP does come by default with a lot of the cruft problems mentioned here.
Lots of inefficient js and css added over the years.

------
dchest
Don't miss the video of this talk linked at the top:

[https://vimeo.com/147806338](https://vimeo.com/147806338)

------
jokoon
Aren't there page optimizers that can factor HTML and CSS and make them
smaller?

Another simple way to solve this would be to just "compile" a webpage:
pre-parse the DOM tree and write that tree out as a binary file. That would
remove the parsing stage, which takes a lot of CPU cycles and is the reason
why most web services have their own smartphone app instead of a simple
combination of HTML+JS.

Of course, if mozilla does it and creates such format, no developer will use
it and it will die, so again it's up to something bureaucratic like the IETF.

It boils down to the fact that markup languages were never really meant for
web applications.

We are in 2016, and there still isn't a well designed, versatile document
format for the web. I have completely zero clue how a browser displays a
webpage, while there seems to be a lot of opportunity to optimize things there
by moving away from a text-centered solution. I don't understand why there is
nothing on this; all that is required is saving the intermediate data a
browser has just after parsing an HTML file. Computers are not designed to eat
raw text every time.
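
For what it's worth, the "save the parsed tree" idea can be prototyped in a few lines of standard-library Python. This is only a concept sketch: the tuple layout is an arbitrary choice, and real browser internals do far more than build a tag tree.

```python
import pickle
from html.parser import HTMLParser

# Parse HTML once into a plain tree of (tag, attrs, children) tuples,
# then serialize that tree to bytes so later loads skip the
# text-parsing stage entirely.

class TreeBuilder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.root = ('root', [], [])
        self.stack = [self.root]

    def handle_starttag(self, tag, attrs):
        node = (tag, attrs, [])
        self.stack[-1][2].append(node)
        self.stack.append(node)

    def handle_endtag(self, tag):
        if len(self.stack) > 1:
            self.stack.pop()

    def handle_data(self, data):
        if data.strip():
            self.stack[-1][2].append(('#text', [], [data.strip()]))

builder = TreeBuilder()
builder.feed('<html><body><p class="x">hello</p></body></html>')

blob = pickle.dumps(builder.root)   # the "compiled" page
tree = pickle.loads(blob)           # reload without reparsing
print(tree[2][0][0])                # -> html
```

In practice this mostly trades parse time for deserialization time; proposals like the JavaScript Binary AST explored similar territory for scripts.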

~~~
jordanlev
I think you've missed the point of the presentation. Optimization tricks such
as these are fun for us nerds to work on but at the end of the day it's like
adding another lane to a highway or deflating a tire on your jeep (to use 2
metaphors from the talk). Compilation of the dom won't reduce the size of
images or JavaScript files. And lastly, the author makes a point towards the
end that in addition to page size, there is also the notion that plain old
HTML and CSS is something people new to the platform can easily learn from and
work on themselves (Minecraft vs. Call of Duty was the author's analogy about
this).

~~~
jokoon
> Compilation of the dom won't reduce the size of images or JavaScript files.

The problem is not so much about images or js, it's more about webapps
generating fat html and so much css.

As for javascript, to me it's not a good language choice for many reasons.
It's being used extensively like a core language to build web applications,
while it's just a scripting language. HTML was never meant to be used like
this.

To be honest I don't really understand the analogy.

------
Houshalter
I think most of the problem is from images. Images, and especially videos,
take up _way_ more space than text. The Russian novel comparison is somewhat
misleading in that regard.

But it shouldn't really be an issue because of progressive image loading. At
the very least, the text should always load first. Back when I had dialup, it
could take ages for a page to load completely because of the images. You could
watch them slowly fill in, line by line. And if you didn't care about them,
you could ignore them.

There's also now FLIF, which progressively loads images at higher and higher
resolutions as more of the file downloads:
[http://flif.info/](http://flif.info/) The images look very good even at like
10%. Ideally once the image gets to the desired resolution, it wouldn't
download any more of it. So it covers resizing issues too.

~~~
Cyberdog
> But it shouldn't really be an issue because of progressive image loading. At
> the very least, the text should always load first. Back when I had dialup,
> it could take ages for a page to load completely because of the images. You
> could watch them slowly fill in, line by line. And if you didn't care about
> them, you could ignore them.

I remember the dial-up days too. You could only read the text around the
images while waiting for the images to load.

Now we sort of have the opposite problem when pages use huge unnecessary web
fonts; you can only look at the images while waiting for the font to load so
the browser can render the text.

(I found sending fonts.googleapis.com to 127.0.0.1 in my hosts file helped a
lot in that regard. Wish my browser had an option to block _all_ fonts,
though.)

FLIF looks like cool tech, but I won't hold my breath until it has wide
browser support.

~~~
Houshalter
I will agree with you that web fonts are the stupidest thing ever. There seem
to be some extensions to block them, though I haven't tried them yet. I don't
know why the text isn't rendered normally until the font downloads, or why
common fonts aren't cached locally.

~~~
izacus
Because web developers deliberately work around that (coloring the text white
or hiding it) so the text doesn't display until the font is loaded, to avoid
the text pop. Of course, that makes the site unusable if the font CDN is down.

------
agumonkey
And then you have [http://okmij.org/ftp/](http://okmij.org/ftp/) , S/N ratio :
NaN

~~~
dchest
There's an ad on some pages:

"Converted from HSXML by HSXML->HTML"

Bloat!

~~~
agumonkey
Ads, uh, find a way.

------
ufmace
I'm most fascinated by the slides on the bookmarking sites. I'd really, really
like to know exactly how a bookmarking site with 3k daily users manages to
rack up a $23,000 monthly AWS bill, that they only managed to reduce to
$9,000. Did they try as hard as possible to split everything out among as many
AWS services as possible? Yikes.

I just threw up a little blog and a few tinkering sites on AWS. I looked
around at a few blogging systems. Guess I coulda used Wordpress or Ghost or
some other DB-backed thingy with one, or two, or three of their database
services for the backend, which would probably be much more expensive and
harder to maintain. Decided to go with Jekyll instead. Don't need anything but
nginx now.

------
chippy
Developers in web companies get the latest Mac Book Pros, and it works okay on
their machine.

They develop and make things without thinking about normal people.

------
AstroJetson
I'm wondering what the collision between fat web pages and bandwidth caps will
look like. Lots of sites have untold amounts of garbage on them, and I'm
starting to notice that many of them have automatic page reloads running. So
unless you remember to close the window, a page can sit back there sucking up
bandwidth.

I also wonder what effect encryption on the pages will have. Lots of info
should be cached, but most of us are behind proxy devices, how do they do with
the encrypted ones?

His website is a good example, it's clean, total page is 1 Mbyte of text with
102 calls from the page (for the thumbnails). It's sad to see a single graphic
as a banner taking up that much space.

~~~
gotchange
> total page is 1 Mbyte of text with 102 calls

I don't think that this is commendable in any capacity. He should have put
his money where his mouth is and packed those thumbnails into maybe 5-10
sprite files, then served them with CSS.

That way he would have cut network latency considerably and fetched those
resources faster, without relying on the visitor staying near the
above-the-fold region as the page loads, staring at empty cells waiting for
the corresponding images to fill them.

His proposals, like everything in life, are easier said than done.

------
cwyers
I feel like this is a really good diagnosis of the causes of page bloat and
the harms, but the remedy is more nostalgia than anything. The web of the 90s,
with amateur blogs and Geocities sites and what have you, the peer to peer
content model, it's gone and it's not coming back. The thing about the Eternal
September is it's eternal. The Web has transformed into a place where 99%
consume and 1% produce. And that's game over. There may well be an adtech
bubble bursting, but that's not going to bring the 90s back.

------
manigandham
I agree with most of this article, but there are some basic problems.

1) Page size alone is not the same as user experience. This is easily
explained by YouTube or Netflix: videos are tens or hundreds of MBs but start
instantly, and you get the experience you need. Well-made websites follow the
same approach: important content and navigation first, streaming in the rest
as necessary.

2) The comparison of page size and how you can fit entire Russian novels in
the same size is just weird. The text of the site doesn't add up to
megabytes; if it did, it would be an equivalent amount of text to those
novels. The
fact is that the modern web has lots of visual design and media. Even if you
use CSS, people still want images, not just text. That's not a bad thing, it's
just an evolution. Even this article has 1MB of images (even as just
thumbnails and many not necessary). Whining about images is not helpful.

3) The bit about the ad network model is just random and seems like the
author has no experience in advertising itself. Either way, talk of bubbles
and tracking without actually looking at all the angles and nuances is also
not helpful and just derails the topic.

EDIT: author definitely loses some respect for this, I'd expect a legitimate
reply:
[https://twitter.com/baconmeteor/status/683040882757505024](https://twitter.com/baconmeteor/status/683040882757505024)

~~~
chrisdone
Page size is part of user experience, however. If you've ever visited another
part of the world where Internet is awful or tried to use 3/4G on a mobile
with bad signal, you immediately start to hate websites that take forever to
load what amounts to a page of text which, if it were merely an HTML file
containing the text, would still load immediately even on a dial-up-quality
connection or worse. Most people don't know this. Programmers do, and people
who remember 90's Internet. Taking this into account, comparing against books
is not too weird.

~~~
manigandham
The point was that text is obviously much leaner and highly compressible - if
you just discount the images like that then the web is already very fast. You
have to take into account the entire experience and realize that images are a
big part of the modern web.

I definitely agree there's a major cruft problem but I believe most browsers
already have the ability to disable loading images which gives the client the
power to adjust for their network connection.
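
The claim that text is highly compressible is easy to sanity-check with the standard library; a toy demonstration (the sample text is made up):

```python
import gzip

# A page of repetitive English-like text shrinks dramatically under
# gzip, while JPEG/WebP data is already compressed and barely budges.
text = ("The quick brown fox jumps over the lazy dog. " * 500).encode()
packed = gzip.compress(text)
print(len(text), len(packed))  # ~22 KB of text compresses to well under 1 KB
```

Real prose compresses less dramatically than a repeated sentence, but 3-5x is still typical, which is why serving text with gzip (or brotli) enabled is close to free bandwidth.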

------
Tomte
Things That Turbo Pascal is Smaller Than:
[http://prog21.dadgum.com/116.html](http://prog21.dadgum.com/116.html)

------
yason
One factor might be that web pages are big enough that they begin to allow
contributions from different, orthogonal origins.

The initial designer might do the first main layout and bolt in some menus and
the main text area.

But other people then gradually want to add these extra panes, those hot
videos, that extra image viewing layer, this advertisement, all kinds of
scripting, new functionality that all comes together orthogonally but will
together take megabytes of space. These can be added without going back to
reworking the original design too much so people can go in, think "I don't
know how the whole page is actually built but I can just add this little thing
here and not mess with anything else", and copypaste one more of the latest
features onto the page.

Or this mechanism could even be automated: new people just add extra page
modules to a database, the original workhorse rebuilds the HTML with the new
additions in it, and nobody is left to oversee everything they're actually
serving out on each page load.

------
unimpressive
Typo:

"The graphics card on my late-model Apple laptop could not literally not cope
with the load."

Pedantic nitpick aside: a hilarious and brilliant essay. I'm definitely going
to be stealing The Taft Test. Perhaps somebody should make a web app that
performs this operation automatically?

------
carsongross
_Browsers are really, really good at rendering vanilla HTML._

Amen, brother. HTML, for all its layout flaws, is a gift to us developers. The
fact we are moving away from leveraging the raw parsing and updating
functionality written in C/C++ is a travesty.

------
ivanhoe
Yeah, but just a few years ago when building a site you never needed an image
wider than 1600px, and now you have lots of people with retina and 4K screens.
Add just one full-width HQ image and it's 500+KB on top of everything else....

~~~
eric_bullington
Except for IE, modern browsers support srcset, which allows you to define
different image sizes for different screen sizes, and only the "right" size
image will be downloaded.

If you need to support IE, you can use media queries with CSS background
images. Typically, it's only background images that are bandwidth hogs anyway
-- logos and such tend to be much smaller.

Media queries are supported by all modern browsers, including IE9+, and with a
JS polyfill (respond.js), you can even support IE 6-8. So there's really no
excuse for anyone who's supporting retina not to use media queries to minimize
bandwidth for everyone else. It's only a few extra lines of CSS, and you can
target any imaginable screen size (e.g., see this gist:
[https://gist.github.com/antonioreyna/5809553](https://gist.github.com/antonioreyna/5809553))
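
For what it's worth, the srcset string is mechanical enough to generate server-side; a sketch with hypothetical filenames:

```python
# Build a srcset attribute listing the same image at several widths;
# the browser then downloads only the variant that best fits its
# viewport and pixel density.
def srcset(basename, widths, ext="jpg"):
    return ", ".join(f"{basename}-{w}w.{ext} {w}w" for w in widths)

print(srcset("hero", [480, 960, 1920]))
# hero-480w.jpg 480w, hero-960w.jpg 960w, hero-1920w.jpg 1920w
```

The generated string goes into `<img srcset="..." sizes="...">`; the `sizes` attribute still has to be written by hand, since it describes the layout rather than the files.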

~~~
gotchange
Based on Caniuse[0] data, it looks like srcset is not supported by the
pre-Lollipop stock browser on Android devices. So it is still not a
universally feasible solution.

[0]: [http://caniuse.com/#feat=srcset](http://caniuse.com/#feat=srcset)

------
makecheck
A fantastic article.

When there is no penalty for over-consumption and it is _far easier_ to do the
stupid, inefficient thing, then the stupid, inefficient thing will be done.

To really combat this problem, we need:

\- Good tools that ensure the _easy_ thing is also the _optimal_ thing.

\- Penalties for pipe abuse (e.g. web browsers that make it really easy for
users to see the Wall of Shame with the web sites most responsible for
gobbling up their data plans and batteries). Sadly, the only thing I have
right now is the OS X app-shaming model that points out high-energy apps, and
then Chrome or Firefox or Safari gets all the blame for what is clearly the
fault of the websites themselves.

------
Tomte
Nice timing... two days ago I decided to lose the web fonts on my web pages
(not implemented yet).

I still think Equity Text is gorgeous, but the fonts are actually much bigger
than any of my pages. Even the long ones with a few graphics.

~~~
jordanlev
For even more justification of your decision, check out this fantastic
presentation by Jake Archibald:
[https://vimeo.com/145138876](https://vimeo.com/145138876)

Covers a lot of ground, but at one point he's exploring how to reduce page
load times and web fonts turned out to take 4 seconds [out of 8 total] on a 3G
connection. This was due primarily to the font hosting service (not
necessarily the file size), so may or may not be pertinent to your
situation... but really a wonderful talk if you're interested in these kinds
of things.

~~~
Tomte
Thanks!

My situation is indeed a bit different, since I'm hosting the fonts on my own
webspace.

A big plus of Matthew Butterick's fonts: sane licensing. No need to estimate
and document page views. No need to obfuscate fonts, in some vain exercise of
thwarting font piracy. Just pay, host and serve.

------
hyperpallium
" _The way to keep giant companies from sterilizing the Internet is to make
their sites irrelevant. If all the cool stuff happens elsewhere, people will
follow. We did this with AOL and Prodigy, and we can do it again._ "

I love this grassroots empowerment, but the internet is mainstream now, and
like pop music/TV/news and consumer-product manufacturers such as Nestlé,
Procter & Gamble, and Colgate-Palmolive, big corps are fantastic at targeting
it. Most people don't want cool stuff.

BTW turning off js and images solves a lot of this problem - unless you need
to _use_ the site.

------
jordanpg
All of this is true, from a technical standpoint, but does any of it really
matter in the modern world?

This sounds a bit like a structural engineer designing an elegant, perfectly
constructed shopping mall, and then complaining about the massive
corporate/commercial, orgiastic takeover that occurred after the mall opened.

With the exception of bandwidth concerns on data-capped mobile plans and
diehard *nix fans that do all their web browsing with Lynx, why does bloat
matter? Every one of those linked sites loaded on my home 10Mbps connection in
<1 second to my eyes.

~~~
gengkev
Why should I, as a user, have to download all of this crap just to read a text
article? I shouldn't. Maybe it takes 1 second to load an article for you, but
if it also took 1 second to download a text file decades ago, then what am I
as a user gaining?

Aside from that, data-capped mobile plans are extremely common, and in the
developing world connections certainly aren't fast. And not only do network
connections have limits; so do the CPU and memory of mobile devices,
especially cheap ones.

------
buro9
I wish there was a tool that could follow a browser session and tell me what,
specifically, I hadn't used in my bootstrap CSS.

Or a tool that could similarly tell me which parts of JQuery I could just
delete.

I feel the sites I have produced are fairly trim and optimised, with the
exception of the CSS and JS. Sites like
[https://www.lfgss.com/](https://www.lfgss.com/)

Yes they're responsive, and they load fairly fast, and they're encumbered with
web fonts... but it's the CSS and JS that is the real bloat.
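
A crude version of the wished-for tool fits in a few lines. This regex sketch only catches literal `class="..."` attributes and simple class selectors; real-world tools also handle JS-generated class names and complex selectors, which this does not.

```python
import re

# Collect class names used in HTML, collect class selectors defined
# in a stylesheet, and report the selectors nothing references.
def unused_classes(css_text, html_text):
    defined = set(re.findall(r'\.([A-Za-z_-][\w-]*)', css_text))
    used = set()
    for attr in re.findall(r'class="([^"]*)"', html_text):
        used.update(attr.split())
    return defined - used

css = ".btn { color: red } .carousel { width: 100% } .hidden { display: none }"
html = '<a class="btn hidden">Go</a>'
print(sorted(unused_classes(css, html)))  # ['carousel']
```

Run over every template and the full bootstrap.css, the surviving set is a starting point for deletion, not a safe-to-delete list.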

~~~
dropit_sphere
For the JS, Google's Closure Compiler (
[https://developers.google.com/closure/compiler/](https://developers.google.com/closure/compiler/)
) may prove useful. It can be set to strip out all unused JS.

------
hal9zillion
I'd be very interested to see the difference in average page size between
online publishing/new media and other types of sites. I think much of this
crisis is driven by the economics of that business model: there's so much
incentive to track and analyse users with various scripts, a desire to have a
pretty-looking site with lots of images (the biggest factor in page bloat),
and comparatively few in-house technical resources or spare cash to devote to
something as relatively exotic as web performance.

------
makecheck
For all the effort that browsers go to in order to make sites "scary" when
there are bad certificates, etc. I would _love_ to see a big, red, scary page
that says "you are loading a web site that is consuming an unreasonable number
of resources, are you sure you want to continue?". Make that the default, make
the warning limit 400 K, and watch web sites (complain first, and then)
change.
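
The 400 K warning could at least be approximated today in a build step rather than the browser; a sketch, with made-up resource names and sizes:

```python
# Fail loudly when a page's resources exceed a weight budget.
BUDGET = 400 * 1024  # the 400 K threshold proposed above

def over_budget(resources):
    # resources: list of (name, size_in_bytes) pairs
    total = sum(size for _, size in resources)
    return total, total > BUDGET

resources = [("index.html", 40_000), ("app.js", 363_000), ("hero.webp", 250_000)]
total, warn = over_budget(resources)
if warn:
    print(f"WARNING: page weighs {total} bytes, over the {BUDGET}-byte budget")
```

Wired into CI, this makes bloat a build failure instead of something users discover on their data plans.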

------
ksec
Purely in terms of web page size (and not loading speed), images are the
biggest problem. We have been stuck with JPEG for far too long. I really
wanted BPG to take off. It should save at least 30-50% off image size, which
is now the biggest weight in web pages.

Then there are web fonts, along with JS libraries. Most of those should
already be cached.

So really, we need to cut the images size down. That's it!

~~~
jetskindo
The problem is, if you focus solely on cutting down image size, it's an
invitation for the designer to add even more images.

I certainly remember bragging about my well established single sprite for most
resources. Soon enough it became 20 unmaintainable sprites with duplicate
content.

------
daxfohl
Bigger than this, though, is the app gluttony crisis. Websites, sure: drown
me in _your_ fatty acids. The alternative is a "lean" app.

Yes it runs slimmer and faster. But why does every app I install need to know
every facet of _my_ existence, and every facet of what every other app knows
about every facet of my existence, and so on? This is a much bigger problem.

------
csomar
I've had the same problem trying to find a WordPress theme for my weblog^.
Well, I ended up hacking it myself. The reason? Despite hundreds of premium
WordPress themes, they were all bloated. Even the ones claiming "Minimalist
theme".

I wonder if there is a real market for such a theme.

^ [https://omarabid.com](https://omarabid.com)

~~~
mkjaer
I'd like to ask: why even go with Wordpress in the first place? Aren't you
suffering from the heavy cloud problem that he speaks about?

------
anujdeshpande
On similar lines is [http://deathtobullshit.com](http://deathtobullshit.com)

~~~
exodust
Nice.

A worthwhile eco-hacking campaign might be to "de-bullshit" the worst
offending websites. Hack in, remove bloated bullshit, leave main content
untouched.

It seems spreading the word about bloat isn't working. Might need to turn the
heat up.

Bloat is a health risk in the way bullshit is a health risk: our senses
assaulted, our minds numbed and taxed with the task of defending against that
assault.

Anonymous should be on the case. They like the media-attention projects, so
how about de-bullshitting the web for us, Anonymous, before it's too late?

------
andreapaiola
I think you should have the ability to choose, and the opportunity to
declutter your web. Filter and declutter your web!
[http://andreapaiola.name/magpie](http://andreapaiola.name/magpie)

------
jarboot
Why isn't there a medium alternative to all this bloat? Do blogs or articles
really need anything more than a pastebin-esque website for rendering
markdown?

Sure it might not make much money without ads, but it would be cheap to host.

------
rtpg
I think this hits a lot of points, but I kind of like having images on
websites. It's kinda hard to do something under a meg and have many
captivating images.

There's obviously a lot of other stuff to fix, though.

------
uptownfunk
Another thing I recall hearing during my days working in SEO is that more is
better: the more words and the more pictures, the more the content factors
into the algorithms that put you at the top of the list.

~~~
paulddraper
SEO: the witch doctors of the web

But seriously: those guys at Google know how to show relevant results. Make
your results relevant and don't do anything stupid like block googlebot, and
it'll work.

~~~
TeMPOraL
Well, that's the whole point of SEO - you do it when you _don't_ have
anything relevant or valuable to offer, but still want the money. SEO as an
industry is mostly about fucking up Google's ability to show relevant results.

------
WalterBright
I often get flak for my old-fashioned, boring web pages, even during my brief
appearance on Romanian TV. They consist of text with very little HTML
annotation. But they're small and load fast :-)

------
ars
For the curious, this page that complains about large pages is 989K.

~~~
mschuster91
Yeah, but justifiably so: it's 31 screen pages (for me) of content. No ads,
no clutter, no junk. The only thing I have to complain about is that the
images don't have target=_blank set on their links. That sucks.

edit: The optimal solution would be to really compress (or sprite!) the small
images and, on click, display a modal with the full-size image, limited to
100% height or width plus a bit of padding. You don't even need jQuery for
this, and you'd only load the images you really want at full size.

~~~
idlewords
I tend to open new links with command-click, so I don't set target=_blank for
that reason. Are you on an interface where you can't do something like that?
It's an easy enough change to make.

~~~
js8
I love your pages/presentations but I wish there was a way to save them with
all the big images without having to use wget.

I have a habit of saving everything I read online. Incidentally, this is also
made worse by bloat and especially sites that load some content via JS.

~~~
tptacek
I find a tool called Pinboard works, for me, better than wget. Added bonus:
it's searchable!

~~~
TeMPOraL
I would be surprised if Pinboard weren't the perfect tool for bookmarking
idlewords' presentations... ;)

------
PaulHoule
The right punishment is to force people who make bloated pages have Frontier
as their internet provider.

------
andreapaiola
Good developers are rare and cost a lot of money... Good managers are rare and
cost a lot of money...

So we are here.

~~~
brianwawok
I know a lot of good developers that have shipped bloat. No one thinks about
performance until something is too slow, with too slow being a moving target.

Can go down the same line of backend ruby vs C.

~~~
andreapaiola
But good managers allow good developers to do their magic... without good
managers, the endless battle between cost and quality goes really, really
badly.

------
soheil
> The article somehow contrives to be 18 megabytes long, including (in the
> page view I measured) a 3 megabyte video for K-Y jelly, an "intimate
> lubricant".

I wonder if he realized this might have been a case of interest based ads
following him around and whether he would have still mentioned it.

~~~
brianwawok
Whoa he admitted he has Sex?

------
quaunaut
Had to get one last one of these in for the year. The last dozen just weren't
enough.

Nothing like tilting at windmills.

------
aruggirello
This. Made me wish I could upvote it twice! Or more...

------
TomorrowRich
The forum software "Discourse" is a good example of hugely over-engineered
bloat and Javascript being required to display some text.

They even require that you have the latest smartphone hardware! To display a
forum post!

As with the examples of Facebook and Google, these are intelligent people
working at these companies. Yet they get it completely and catastrophically
wrong...

~~~
Cyberdog
Agreed. Jeff Atwood's not a bad guy, but the way he acts like he's going to
lead web forums to the promised land with Discourse is really grating,
particularly when all he's really doing is making them more obnoxious to use.

I guess I could make a Slack/IRC comparison, but maybe I don't need to go
there.

~~~
unimpressive
Slack is at least in many ways an actual improvement over IRC, from what I've
seen of Discourse it's pure loss.

------
xdinomode
If you can't build a site without bootstrap, jquery, 500 png's/jpegs.. please
stop coding.

------
intrasight
uBlock Origin will largely solve this, except for inline SVGs. But I
understand they have to allow those due to technical limitations in current
browsers.

I've grown accustomed to reading unstyled HTML, and I gotta say - I like it.

Sites which host their own CSS and JS look as the designer intended. For
example, the Washington Post. I just checked the size: the home page is about
200K, but when you add everything else it comes to 3MB.

Why should I care about this "bloat"? I don't. Computers are faster (by
several orders of magnitude). The internet is faster (by at least two orders
of magnitude). I have Fios - if you can get it you should too.

------
rgbrenner
As long as there has been an internet, people have been complaining about
bloat. I remember articles like these in the 90s. They do absolutely nothing
to improve the situation or stop the growth.

We need to accept that web pages will be even larger in the future, and start
pushing the technology required to deliver those pages in a reasonable amount
of time.

That means HTTP2, QUIC, zero-RTT TLS handshakes (in quic and tls 1.3), and
other new technologies. CDNs certainly play a role here too with dynamic
acceleration, better caching, and networking. (Disclaimer: I'm working on
that last bit at NuevoCloud CDN.)

That and improvements in connectivity are the only real ways to solve this
issue.

~~~
markism
Are you familiar with induced demand?[0] That would only encourage the problem
to get worse.

[0]
[https://en.m.wikipedia.org/wiki/Induced_demand](https://en.m.wikipedia.org/wiki/Induced_demand)

~~~
rgbrenner
Absolutely... but web pages are going to grow anyway. Take a look at this
chart covering 1995-2014: [http://www.websiteoptimization.com/speed/tweak/average-web-p...](http://www.websiteoptimization.com/speed/tweak/average-web-page/)

In just the past 3 years, the average size has more than doubled.

~~~
markism
Which may well be a consequence of lazy developers taking advantage of faster
internet. I'd bet if internet speed were to stop increasing today it would be
more likely to solve the bloat problem, since developers would eventually
realize they were ruining their UX with load times. It's the same with
processor improvements in that they allow developers to write more inefficient
code since their time is prioritized over CPU time.

~~~
justaaron
this

