
The Website Obesity Crisis (2015) - kercker
http://idlewords.com/talks/website_obesity.htm#crisis
======
jakozaur
Previous discussion:
[https://news.ycombinator.com/item?id=10820445](https://news.ycombinator.com/item?id=10820445)
129 days ago with 1122 upvotes

~~~
makecheck
Actually, it’s a story in itself that a problem noted by an older article is
still relevant enough to be discussed. In the 6 months since, has the industry
even _looked_ at this or is it even worse?

A fast-moving industry needs equally-fast solutions. If 6 months can pass and
people are still talking about a problem as if it was yesterday, then not
enough has happened. Some ideas of more that could be done: talk to your web-
developer friends; have them talk to their managers; start publicly shaming
bloated web sites; submit issues and/or pull requests to web browser projects
with ideas for how to fix HTML; or something.

~~~
jakobegger
The web isn't nearly as fast-moving as you imply.

Websites have always been too big. We've been using huge JavaScript libraries
for the last ten years to add minor functionality.

Even using Flash for websites was a common standard!

The problem is that speed and usability are almost always considered less
important than flashy animations and other bullshit that no end user actually
wants.

I think this post will be relevant for years to come. I don't think web
developers will ever learn.

------
manigandham
While this is a nice article that I do agree with, it's not exactly accurate.

90% of website size is media (images, audio, video, etc). Discounting that and
saying that a 1MB page is so much bigger than just the text it delivers is
silly and doesn't really make a point. Maybe it's unnecessary, maybe not, but
that's subjective and most people would rather have images and decent UI.

It's true that ads are another factor, and it's slowly being solved, but
ultimately comes down to the wrong incentives across the industry. The more
annoying/heavy ads work better and pay more so everyone from advertisers to
agencies to publishers will continue to optimize toward them until something
changes (which might be adblocking). This isn't a technical problem so much as
business and politics.

However, the publishers these days are pretty short on tech talent and mostly
use off-the-shelf CMS systems like WordPress (which are already bad software)
and then just layer on their plugins, themes, and ads to create the mess we
have today. Again, there's no easy way to solve that without talent either on
the pub side or in the platforms themselves. Some progress is being made here
with things like Facebook Instant Articles and medium.com hosting.

Overall, the web definitely has a cruft problem, but it's not really that bad
considering all the various channels of information access, and it's all
slowly getting better.

~~~
jswrenn
> Maybe it's unnecessary, maybe not, but that's subjective and most people
> would rather have images and decent UI.

The rise and success of cruft-stripping-as-a-service offerings like
Readability, Pocket, Instapaper, etc., seems to indicate that a not-
insignificant segment of consumers think that a "decent UI" is _less_ UI.

~~~
emodendroket
As far as I can tell the main appeal of most of those is "I'll read this
later."

~~~
cholantesh
It is, but many times, I end up using Pocket or Instagram because the text is
presented in a way that is simply unreadable for me, and neither Firefox's
Reading Mode nor Clearly (when I use Chrome) help.

~~~
dyladan
I think you mean Instapaper

------
Mister_Snuggles
I usually have a fast connection and reasonable data limits available, so I
don't worry too much about how much stuff my browser needs to pull down to
render a website. However, there's another type of bloat that absolutely kills
my browser and that is all of the JavaScript.

One of my computers is a netbook-ish Lenovo x131e from a few years ago. It
runs everything I need it to very well, except for web browsing in Firefox.
Many pages cause the browser to completely choke for a few seconds while the
JavaScript does its thing.

I finally broke down and installed NoScript and it feels like I'm browsing the
web on a brand new computer! Pages load fast and render quickly. If JavaScript
is required, I can enable it on a site-by-site basis. There's also the privacy
and security benefits, but my main issue was performance.

I used to think that people who browse with JavaScript disabled were being
silly, but now I understand why someone would want to do that.

I know that part of the problem is Firefox and part of it is the under-powered
computer. Browsing the same sites in Safari on a 6-year-old MacBook Pro isn't
nearly as painful and, generally speaking, I leave the JavaScript alone on
that computer.

~~~
tracker1
In fairness, your 6-year-old MBP probably has a much faster CPU and more RAM
than the newer netbook... It's a hard sell, especially now that even newer
cell phones are starting to have beasts of CPUs...

But I agree, JS is a huge issue... I notice it most on click-bait type sites
on my cell phone. It's really pretty sad, all things considered.

~~~
Mister_Snuggles
You're absolutely right. It's certainly not a fair comparison at all, they're
very different machines and I was satisfying different needs when I purchased
each one.

For the sake of fairness, the rough specs are:

The ThinkPad:

* Lenovo ThinkPad x131e, 11.6", purchased late 2013
* Core i3-3277U processor
* 8GB RAM (aftermarket upgrade)
* 500GB spinny drive
* OpenSUSE Leap 42.1
* Embedded graphics
* Firefox browser

The MacBook Pro:

* Mid-2010 MacBook Pro 15"
* Core i7 - dual core, not sure which model
* 8GB RAM (aftermarket upgrade)
* 512GB SSD (aftermarket upgrade)
* MacOS 10.11.4
* Embedded and discrete graphics
* Safari browser

So yeah, it's hardly an apples-to-apples comparison.

------
exodust
Great stuff, I watched the video presentation too. The audience clapped at the
part where he shows the pyramid illustration of the web - HTML followed by
huge chunk of crap, then surveillance at the top. Obviously he is not alone in
recognizing the problem.

I'm glad to hear Google are planning to label slow sites. We need this.
Bloated websites need to be held accountable.

On mobile devices too, we need a browser option to stop loading any further
data for a given site after a defined point. So if the browser has received 3
megabytes of a page so far, it stops and asks the user whether to continue
with downloading. It might say how many cents this one website has cost them
so far (if the user has set up this feature).
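No browser exposes the feature described above, but the bookkeeping behind it is simple. A minimal sketch in JavaScript, where the function name and the 3 MB threshold are just the comment's own example, not any real API:

```javascript
// Sketch of a per-page download budget: count bytes as responses arrive and
// signal when the limit is crossed, so a browser could pause and ask the user.
// All names here are hypothetical; no browser implements this today.
function makeByteBudget(limitBytes) {
  let used = 0;
  return {
    // Returns true while the page is still under budget.
    add(bytes) {
      used += bytes;
      return used <= limitBytes;
    },
    used() { return used; },
  };
}

// Example: a 3 MB budget, as in the comment above.
const budget = makeByteBudget(3 * 1024 * 1024);
budget.add(2 * 1024 * 1024);            // 2 MB so far: still under budget
const ok = budget.add(2 * 1024 * 1024); // 4 MB total: over budget, ok is false
```

The cents-spent display would just be `used() / bytesPerCent` for whatever rate the user configured.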

Fair enough "modern web, modern features" but most of the modern features we
enjoy are improvements to browsers, servers and javascript. There's no reason
why this can't mean keeping page size steady while enjoying new features.

~~~
tracker1
A lot of it is library cruft... Angular + jQuery + jQuery library, and all
its components (used or not) and all the related components.. and then the
iframes for ads, loading in their own different version of jQuery, and
related... etc, etc...

It's pretty bad in a lot of ways.

------
oolongCat
Calling this a "Crisis" is definitely blowing things out of proportion. True,
websites have grown by a lot over the past decade, but so has functionality;
take a look at your favourite websites on the Wayback Machine and see how
god-awful most of them looked just 6-7 years ago, and compare the
functionality they provided vs what we have now.

Yes, I agree some websites really do need to go on a diet; loading unused CSS,
JS and multimedia has to be removed. But 1-2MB is not a reason anyone should
be throwing their hands in the air screaming CRISIS!! I would definitely
applaud anyone who would spend their time optimising their website to be as
small as it can be, but Jesus Christ, 2MB and we have a crisis... no way!

I for one think this is a sign we are making progress, a very good thing; we
are improving as a community and we are getting more out of our web.

~~~
WA
I just got a text this morning from my phone company: "You've exceeded your
monthly 500MB limit". It's probably not a _crisis_, but it has some quite
real economic consequences if I have to download 2MB worth of nothing whenever
I click a link.

Yes, Adblockers to the rescue. But you know, Apple doesn't support Adblockers
on an iPhone 5 for whatever stupid reason.

~~~
oolongCat
See, the main problem then is not whether websites are obese or paper thin;
the question is whether websites and our browsers are doing a good job of
using already existing features to help users.

I think we should be complaining about developers and admins who don't enable
gzipping; we should be angry that some websites don't even bother setting
headers to cache static resources. We should not be throwing our hands up in
the air screaming crisis because our web is becoming more useful.
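For reference, both fixes named here are a few lines in most server configs. A sketch for nginx (the directives are real; the file types and the 30-day lifetime are illustrative choices, not recommendations):

```nginx
# Compress text responses on the fly.
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;

# Let browsers cache static assets; the lifetime here is illustrative.
location ~* \.(css|js|png|jpg|svg|woff2)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```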

~~~
exodust
"Useful" doesn't need to mean bloated. You seem to miss the point that many
websites these days are bloated, even basic articles. We don't need a V8 engine
and oversize tyres to go buy milk and bread.

You're forgetting that mobile devices quickly run out of cache space. This
includes iPads. You might visit a news site once every few days, but quite
often the same resources need downloading again and again because your iPad
has already evicted them from its cache. Desktop browsers have a lot more
capacity to keep web resources cached.

Not only that, but RAM is also limited on mobile devices. Forget about
expecting multiple websites to remain in memory across numerous tabs. My iPad3
can pretty much handle only one tab open. If I switch tabs to another website,
then switch back, the whole page loads again. A less bloated web would help
this issue.

And then there's accessibility for low-speed areas. Did you even read the
article?

------
staticelf
I love this article/presentation or whatever it is.

I often react very positively when websites are super-fast today. Especially
on a mobile device, it is worth a lot to me.

------
thomasjudge
I love this from the article:

These Apple sites exemplify what I call Chickenshit Minimalism. It's the
prevailing design aesthetic of today's web.

"Chickenshit Minimalism: the illusion of simplicity backed by megabytes of
cruft."

------
FussyZeus
Just took over a website for a paying client, local small business. Found out
their previous developer was some rockstar guy, thing was packed with
frameworks and nuttiness.

Built a scraper and converted it to something using Markdown in a weekend, I
barely used any of previous dev's "code."

Sometimes all you need to dig a hole is a shovel.

~~~
exodust
Nice. There's certainly money to be made for making crappy slow websites
better and faster. The trick is convincing clients their site looks and
performs like someone's homework assignment.

------
some1else
We need a decent @media equivalent in HTML.

We need a bandwidth @media selector (that is kept up to date with recent
conditions by the browser).

We need optional (viewport conditioned) lazy loading built into the spec.

It's time for these features to become part of the web stack, but maybe we
need a codified mediaQuery.js first.
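For what it's worth, a rough version of the bandwidth condition can be approximated today with the experimental Network Information API (`navigator.connection`), which is not supported everywhere. A sketch of the decision logic only, with hypothetical asset names:

```javascript
// Pick an asset variant from the connection's reported effectiveType.
// navigator.connection is experimental and often absent, so default to the
// full-quality variant when nothing is reported.
function pickVariant(connection) {
  const type = connection && connection.effectiveType; // "slow-2g" | "2g" | "3g" | "4g"
  if (type === 'slow-2g' || type === '2g') return 'low';
  if (type === '3g') return 'medium';
  return 'high';
}

// Browser usage (hypothetical file names):
//   img.src = '/hero-' + pickVariant(navigator.connection) + '.jpg';
```

This is exactly the sort of thing a codified mediaQuery.js could standardize before it lands in the platform proper.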

~~~
WA
All of this is unnecessary, because it's actually possible to build
lightweight, fast, and yet beautiful websites. The available tools aren't the
problem. The problem is the people who design their garbage collectors called
websites.

Someone in another thread suggested to colorize the address bar based on bytes
transferred and/or requests made. Maybe it'll wake people up if they visit a
website and it has some nice dark red color to show how much bandwidth is
wasted.

~~~
some1else
Your observation that it's possible to build xyz type of websites does not
adequately discount the necessity of such features.

I'm talking about improving those lightweight and beautiful websites even
further. I implement some of those features on every project, to deliver the
maximum content & experience quality in every network and device condition
possible.

You can't seriously try to shoe-horn the entire platform into whatever your
vision of beauty is. Please keep an open mind about a complex future medium
that can effectively accommodate Netflix, YouTube, Facebook, Wikipedia,
Photoshop, OpenOffice, Google Maps, Skype, AutoCAD, Grand Theft Auto VR, ...

~~~
WA
Point taken, and I'm sure you (and I, for that matter) would benefit from your
suggestions. But it wouldn't change anything for the existing bloated
websites, because they _could_ be written in a lightweight way but aren't.

------
hawkice
Fun fact: Despite the hyper-cautious (2015) in the headline here, the AMP page
described, the one that Google said they were going to fix, still redownloads
the same video file every 30 seconds, thus making it, still, theoretically
unbounded in page size.

The NPR page about ad-blockers which was 12 megabytes without an adblocker and
1 megabyte when using an adblocker is now 1.4MB with an adblocker and "only"
1.9MB without -- to display about 1,100 words.

The medium article that was over a megabyte seems to have removed the
pointless 0.9MB invisible image.

------
hallatore
If 1MB is obesity then what is this page?
[https://www.javapoly.com](https://www.javapoly.com)

~~~
bbuffone
The website loads in 9 seconds (Chrome 50); the processing of the
(peg$parseKeyword) functions takes 4.5 seconds during loading, which causes
the gap in the waterfall.

One look at the function would cause severe ingestion in most people.

~~~
chrismorgan
(You wanted the word “indigestion”. Ingestion means something quite different.
Though I suppose if someone was prone to eating when depressed, it could
work.)

------
tracker1
I think it may be worth specifically excluding images from some of these
checks. Or adjusting for what can be compressed in the images themselves.
Though it's enough to say that they can be a huge portion of a site.

That said, I was able to boilerplate Preact + Redux for creation of a control
that will be used stand-alone and the payload is about 16k of JS (min+gz), and
under 1k of CSS [1]. The methodology I used could very well be carried to more
"full" applications. There's very little reason most modern web applications
can't be way under 500kb payload (excluding images). In this case I wanted
more modern tooling, workflow, but a fairly light datepicker modal... I feel
most datepickers suck. Could it be lighter? Yes [2], but I wanted a little bit
of a different approach. In the end it works...

All of that said, the biggest points of code bloat are usually in bringing in
an entire library instead of only the pieces needed, especially bad with UI
controls... I really wish more people would use/extend bootstrap from source
here. It's really easy to do... usually I copy the variables file, copy the
bootstrap base file, create a mixins file, and then update all the references
in the copied base. From there, I can comment out/in as needed, and be fairly
light.

Of course, fonts are another source of bloat. I'd suggest people start leaning
towards using SVG resources embedded in JS strings, and only those icons
needed... all modern browsers support svg well enough in this case. Other web
fonts should be limited to 2-3 includes of 1-2 families... that's it. Any more
and your design is flawed anyway.
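The SVG-in-JS-strings approach can be as simple as a module of string constants, shipping only the icons a page actually uses. A sketch (the path data below is a placeholder, not any real icon set):

```javascript
// Only the icons the page actually uses, as inline SVG strings.
const icons = {
  close: '<svg viewBox="0 0 16 16" width="16" height="16" aria-hidden="true">' +
    '<path d="M2 2 L14 14 M14 2 L2 14" stroke="currentColor" fill="none"/></svg>',
};

// Look up an icon's markup, failing loudly on unknown names.
function iconHTML(name) {
  if (!(name in icons)) throw new Error('unknown icon: ' + name);
  return icons[name];
}

// Browser usage: button.innerHTML = iconHTML('close');
```

A bundler then tree-shakes or simply never includes the icons that were never imported, which is the whole point versus a multi-hundred-glyph icon font.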

With webpack + babel, it isn't so hard to keep your applications structured,
and much more minimal.

[1] [https://github.com/tracker1/md-datepicker](https://github.com/tracker1/md-datepicker)

[2] [https://dbushell.github.io/Pikaday/](https://dbushell.github.io/Pikaday/)

------
amelius
Perhaps we need smarter tools and better cooperation. For instance, I bet a
large part of code in large websites is shared with some other website. Why
can't our tools figure out if this is the case, and then create some kind of
"shared library" to be used by both websites.

~~~
criddell
Maybe. That seems like it should be relatively easy to measure with a caching
proxy, right?

In my experience, most site slowness is related to slow loading ads. Since
installing an ad blocker a few months ago, I really haven't had any complaints
about web speed.

------
Loic
Just take the time making _your_ website slimmer. Do not care about the others
and rape the benefits having more visitors than the others. It is egoist, but
this is the only way companies are going to move, if the competition is
getting more visitors because of slimmer websites. If this practically does
not change anything, then, this is a false problem.

For my website (chemical databank) I was able to measure the benefits of
reducing the page size with more visitors from countries with poor Internet
connectivity.

So, just do it and enjoy the competitive advantage as long as you have it!
This is the best way to get things moving.

~~~
sp332
ahem *reap

------
gremy0
Actual figures on the 'crisis'

[http://www.httparchive.org/interesting.php?a=All&l=Apr%201%2...](http://www.httparchive.org/interesting.php?a=All&l=Apr%201%202016)

Average of 2.3MB with 60% of that being images. 50% of that content can be
cached for at least a day.

------
blowski
For the last 20 years people have been saying "web pages today are too big."
And for the last 20 years they've continued to get bigger.

The right size for a web page depends less on the opinion of random
developers, and more on the website's owners and users.

------
andreapaiola
Yeah, I did this

[http://andreapaiola.name/magpie/](http://andreapaiola.name/magpie/)

------
lmm
I was reading an astronaut's biography a few days ago, talking about the
transition from the moon missions (where every half-hour was planned and
accounted for) to Skylab (where people were working in the same place for long
enough that it became necessary for them to have free time). And on one level
it's a huge waste to have someone on the ISS where it's costing $x000/minute
to keep them up there playing candy crush or whatever. But it's also a sign of
maturity, that we are no longer desperately squeezing out every minute we can
possibly get up there.

I think we're somewhere similar with the web. Internet is plentiful enough
that we don't need to scrape and squeeze every last kilobyte. Maybe Medium
takes 3MB to render a simple plain page. That's ok.

(I do wish a lot of sites would up their information density though. Above all
I wish I could get a full-width page, not a phone-width segment down the
middle of my widescreen display. At home I've started using a stylesheet that
disables max-width on all elements, and I've yet to see a site that looks
worse that way)
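The user stylesheet being described can be as short as one rule (this is the general idea, not the commenter's exact file):

```css
/* Undo phone-width column layouts: let content use the full viewport. */
* {
  max-width: none !important;
}
```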

~~~
wanda
> Internet is plentiful enough that we don't need to scrape and squeeze every
> last kilobyte.

For you and me, perhaps. For the people on 2G/3G or worse, trying to read a
simple article, modern web pages are hilariously heavy. The whole point of the
web was openness. This is why Facebook, Apple and Google have all developed
their own solutions to this problem: FB Instant Articles; iOS News app; Google
Accelerated Mobile Pages.

If these companies are taking it seriously enough to develop solutions, there
must be a real problem or serious sections of the market being
missed/obstructed by the failure of publishers to build their websites
_responsibly_.

And in general, no matter that heavy pages can still load quickly for me on my
privileged broadband connection, it still pisses me off when it takes 3-5
seconds for a page to load when I know it should fundamentally take a second
(or less) to show me text and some associated images whenever they're in the
viewport.

Google AMP is the best solution for the web so far, but it's a step backwards
to XHTML/WAP and mobile subdomain sites. The correct thing would have been to
hit websites with genuine SEO penalties for shit mobile performance. This is
especially true because AMP doesn't even work on older mobile browsers.

The biggest reason why the web needs to care about every single byte of
overhead? Because native apps barely have this problem. Neither side, native
or web, is going to simply be killed by this, but the value of websites and
web apps will decrease if they do not prove to be viable vehicles for content.

------
onion2k
As much as I want to agree with the article I really can't. All of the things
that people claim as "bloat" are basically necessary for a website to work in
a modern fashion...

Layout frameworks (eg Bootstrap) tackle the problem of developing a site
that's readable at resolutions from 320px wide on a two year old phone up to a
5K iMac. That is _not_ a trivial task, especially if you need UI components.

Application frameworks (eg Angular) make it _much_ faster, and consequently a
nicer experience, to maintain user state, load content, navigate around
template driven pages, etc.

Media content has grown with resolutions and pixel densities. 10 years ago we
were looking at websites on 1280x1024 displays, with no rich media. Today a
consumer facing website has retina quality video. That's going to impact the
page weight.

Being wasteful is a minor problem; everyone has fast broadband. Everything is
cached at various layers from the browser to the service worker to the CDN to
the origin server. Browsers are _really_ fast. With some clever "cutting-edge-
even-though-its-been-around-for-years" stuff like http2 you can fetch things
in parallel.

 _Obviously_ websites should be optimised. No one should be downloading media
or libraries that aren't used. Animations should be hardware accelerated.
Sites shouldn't be running in debug mode ( _ahem_ AngularJS _ahem_ ). But all
in all, what we get in a browser these days is _far_ better, _far_ faster, and
_far_ more functional than websites were a decade ago. We could go back to the
"works without JS, stateless on the clientside, roundtrip to the server and a
whole new page for every click" world I learnt web development in, but I
really don't want to. It was rubbish.

~~~
exodust
I'm face-palming at your comment, and I'll take the bait.

Firstly, there is no such thing as "retina quality video". If you mean 4K,
then say 4K. "Retina" is Apple marketing spin for their high density displays
that you seem to be using to describe a standard of video. More to the point,
websites are not pre-loading 4K video that is counted towards page weight.

As for your claims about needing Bootstrap to present a page on small to large
screens: there are numerous easy methods for making a site readable on small
and large screens that don't require shoving layout frameworks in and relying
on them to make things look good. Learn HTML. Learn CSS.

> _" Being wasteful is a minor problem"_

That statement would be great as an ironic t-shirt, but otherwise is yet
another facepalm.

> _" everyone has fast broadband"_

Great news. With that assumption out of the way, we can get on with pre-
loading retina video and throwing page-weight caution to the wind.

> _" Everything is cached"_

The utopian world you live in sounds great, but mobile devices are known to be
very bad at both long term resource caching and short term memory caching - as
in multiple open browser tabs.

The point of the article is that even basic articles are bloated whales that
waste precious bandwidth on mobile. On desktop, common sense design is
replaced with "touch friendly mobile-first impressionism" for actually _less_
accessibility. When spinning globe videos and 2000 pixel auto-playing
slideshow presentations in a vast landscape with hamburger menu in corner is
considered the "modern web", we need to take a hard look at what modern should
be.

The improved functionality you're talking about does not need to mean bloat or
excessive amounts of data to drive simple things on websites.

~~~
onion2k
_Firstly, there is no such thing as "retina quality video". If you mean 4K,
then say 4K. "Retina" is Apple marketing spin for their high density displays
that you seem to be using to describe a standard of video._

I don't mean 4K. I mean transcoding video assets to specific resolutions that
match the website's responsive breakpoints, with 2X resolution versions for
high pixel density displays (eg retina). Pushing a 4K video down to someone
whose browser viewport is 800px wide is the sort of wastefulness people
complain about.
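The selection logic being described, pre-transcoded renditions matched to viewport width and pixel density with a 2x cap, might look like this sketch (the rendition widths, file names, and list structure are assumptions for illustration):

```javascript
// Pick the smallest pre-transcoded rendition that covers the viewport at the
// device's pixel density, capping density at 2x as the comment suggests.
// `renditions` must be sorted ascending by width.
function pickRendition(viewportWidth, devicePixelRatio, renditions) {
  const target = viewportWidth * Math.min(devicePixelRatio, 2);
  for (const r of renditions) {
    if (r.width >= target) return r;
  }
  return renditions[renditions.length - 1]; // nothing big enough: send the largest
}

// Hypothetical rendition list:
const renditions = [
  { width: 640, src: 'clip-640.mp4' },
  { width: 1280, src: 'clip-1280.mp4' },
  { width: 1920, src: 'clip-1920.mp4' },
];
// An 800px viewport at 1x gets the 1280 rendition, not a 4K file.
```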

~~~
exodust
I think you mean images. Nobody is doing that with video.

Streaming video quality might be dynamically adjusted according to client
bandwidth, but the topic here is unreasonable page weight and bloat. Streaming
media has nothing to do with page weight.

I think I see the point you were trying to make though. Our devices have
larger screens and more pixels, so fill every last one of those pixels up to
the max with content, and then some. Many marketing and advertising folks
would agree with you, while many developers are scratching their heads
wondering why they just built a "chicken shit minimalist" website. Beer and a
paycheck dissolve the care factor.

