
The Website Obesity Crisis (2015) - sysoleg
https://idlewords.com/talks/website_obesity.htm
======
mc3
We had some storms and network outages yesterday. I was down to 3G, and it
felt like dialup speeds. Hacker News faithfully loaded after 30 seconds (I
think for most of that time there was zero connection, so when a few bytes
could fly through, it loaded up).

I couldn't get any other site to load. Not even google.com.

It made me want to create a website called "lowbandwidthsites.com" or similar
that is itself minimal and just lists sites for news etc. that you can still
load when bandwidth is low.

A low-bandwidth, read-only repeater for sites like reddit.com would be handy.
Or maybe a general-purpose low-bandwidth repeater that can take any site - a
bit like outline.com/mysite.com but even lower fidelity - just return the
markdown!

Edit: looks like it has been done:
[https://www.textise.net/](https://www.textise.net/)
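
For illustration, a minimal sketch of what such a repeater could look like,
assuming Node 18+ for the global fetch. The regex-based stripping is
deliberately naive and the port is arbitrary; a real service would want a
proper HTML parser and readability extraction:

```typescript
import * as http from "node:http";

// Fetch a page and reduce it to plain text (naive tag stripping).
async function stripToText(url: string): Promise<string> {
  const html = await (await fetch(url)).text();
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, " ") // drop scripts
    .replace(/<style[\s\S]*?<\/style>/gi, " ")   // drop stylesheets
    .replace(/<[^>]+>/g, " ")                    // strip remaining tags
    .replace(/\s+/g, " ")                        // collapse whitespace
    .trim();
}

// GET /?url=https://example.com returns that page as plain text.
http.createServer(async (req, res) => {
  const target = new URL(req.url ?? "/", "http://localhost")
    .searchParams.get("url");
  if (!target) {
    res.writeHead(400).end("missing ?url= parameter");
    return;
  }
  try {
    res.writeHead(200, { "Content-Type": "text/plain; charset=utf-8" });
    res.end(await stripToText(target));
  } catch {
    res.writeHead(502).end("upstream fetch failed");
  }
}).listen(8080);
```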

~~~
redis_mlc
Not criticizing, just mentioning ...

> I was down to 3G, it seemed like dialup speeds.

Sounds like first-world problems! :)

HN works great on my EDGE (2G) BlackBerry 8700g (2006) with Opera in SF. Handy
on the Muni. Basically no other site is usable at this point unless it's
HTML-only.

However, I understand that telcos are moving to drop 2G support, so don't plan
on that working much longer. T-Mobile still has fairly good urban EDGE
coverage, except in SF's FiDi.

> I couldn't get any other site to load. Not even google.com.

The Google UI appears minimal, but they're loading a ton of JavaScript.

~~~
mc3
Yeah, it said "3G" but the experience felt more like the 33.6 kbps dial-up
speeds of yore. I've gone through extended periods of using both 3G and
33.6 kbps, so I know the feel!

But still, first-world problems. A CB or ham radio would have been nice, though!

------
medymed
As long as general websites load in under a certain time, sites seem to
maximize bloat up to that point. It reminds me of commute times: if more
highways are built to reduce traffic and speed up commutes, more people will
drive, and eventually more will relocate, so the average commute ends up
being ~40 minutes again. Whatever tolerance holds for the vast majority of
users often becomes the standard, and then small, non-vocal opposing groups
are forced to adapt... or just complain.

~~~
weare138
I understand the point you're making, but the difference is that increased
use of these websites is not what's increasing their size.

------
newnewpdro
High-DPI screens have made this so much worse.

Try browsing the web on a 128 kbps ISDN-like connection: many sites are too
slow to be usable, or outright broken due to their own timeouts.

What I've noticed is that many new sites use huge high-res images as a kind
of maximum common denominator to ensure things look great on high-DPI
screens, while everyone else has to transfer all this junk only to downscale
it anyway at render time.

I'm often on a slow link, and it's made me stop using the web almost
entirely, outside of HN and reddit. Slack, for instance, is completely
broken; it can't even manage to finish logging in without giving up.

It used to be that slow internet users just had to wait longer, but things
largely worked correctly. Nowadays, with lots more JavaScript and async
machinery, web developers program in timeouts and probably never once test
on a slow link, so a lot of things are flat-out broken.
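
To illustrate the failure mode, here's a sketch of the common pattern: a
hard-coded deadline that's generous on broadband but hopeless on a slow link.
The 5-second figure and the URL are made up, not from any particular site:

```typescript
// Common pattern: abort a fetch after a fixed deadline. On a 128 kbps
// link a 1 MB bundle needs over a minute to transfer, so the request
// is aborted long before it can finish.
async function fetchWithTimeout(url: string, ms: number): Promise<Response> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    return await fetch(url, { signal: controller.signal });
  } finally {
    clearTimeout(timer);
  }
}

// Hypothetical 5-second deadline: slow links never get this far.
fetchWithTimeout("https://example.com/app.bundle.js", 5_000)
  .then((res) => console.log("loaded", res.status))
  .catch(() => console.error("gave up before the transfer could finish"));
```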

~~~
seanwilson
> What I've noticed is that many new sites use huge high-res images as a kind
> of maximum common denominator to ensure things look great on high-DPI
> screens

Obviously this is far from good, but what's the practical solution?

As far as I know, handling this with HTML + CSS alone is tedious and
error-prone: you have to manually figure out the maximum display size of each
image, generate all the size variants yourself, and then use HTML/CSS to load
the right ones.
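
For reference, the HTML-only mechanism is srcset/sizes. A sketch of
generating that markup for pre-resized variants, where the file naming scheme
and widths are hypothetical:

```typescript
// Given pre-generated width variants of one image, emit srcset/sizes
// markup so the browser can pick the smallest adequate file for the
// current viewport and device pixel ratio.
function responsiveImg(base: string, widths: number[], maxDisplayPx: number): string {
  // e.g. "/img/hero-480w.jpg 480w, /img/hero-960w.jpg 960w, ..."
  const srcset = widths.map((w) => `${base}-${w}w.jpg ${w}w`).join(", ");
  // "sizes" declares the rendered width so the browser can choose a
  // candidate before layout happens.
  const sizes = `(max-width: ${maxDisplayPx}px) 100vw, ${maxDisplayPx}px`;
  const fallback = `${base}-${Math.min(...widths)}w.jpg`;
  return `<img src="${fallback}" srcset="${srcset}" sizes="${sizes}" alt="">`;
}

console.log(responsiveImg("/img/hero", [480, 960, 1920], 960));
```

The error-prone part is keeping the sizes attribute in sync with the actual
CSS layout, which is exactly why people reach for server-side resizing.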

There are CDN-, web-server-, and CMS-specific solutions that dynamically
resize and cache images on demand, but they're far from commonplace, so I'm
not surprised most website owners don't do this well.

~~~
adventured
The solution was arrived at a long time ago.

It looks like this:

[https://text.npr.org](https://text.npr.org)

Or:

[https://lite.cnn.io](https://lite.cnn.io)

Ideally with better formatting. You make a text-only or otherwise very
reduced version of your site available to visitors, and ideally in that
version you drop the images entirely. You can optionally attempt to measure a
visitor's capabilities and how they're experiencing your site, and actively
present them with the reduced option (the far easier thing to do is simply to
make a reduced version available and let visitors know it exists, somewhere
prominent enough). Visitors want the content, not the ridiculous SEO image.
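
One way to attempt that measurement, as a browser-side sketch: the
non-standard, Chromium-only Network Information API exposes an effective
connection type. Feature-detect and fall back, since most browsers don't ship
it; the /lite path here is hypothetical:

```typescript
// If the Network Information API reports a slow effective connection,
// offer a link to a reduced version of the site rather than forcing it.
const conn = (navigator as any).connection;
if (conn && ["slow-2g", "2g", "3g"].includes(conn.effectiveType)) {
  const note = document.createElement("p");
  note.innerHTML =
    'On a slow connection? Try the <a href="/lite">text-only version</a>.';
  document.body.prepend(note);
}
```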

If you build for this from the beginning of a site, it's extremely easy to
deliver and maintain.

~~~
kemps4
Try this link for info about reducing bandwidth and energy usage:
[https://solar.lowtechmagazine.com/2018/09/how-to-build-a-lowtech-website.html](https://solar.lowtechmagazine.com/2018/09/how-to-build-a-lowtech-website.html)

------
dang
Discussed at the time:
[https://news.ycombinator.com/item?id=10820445](https://news.ycombinator.com/item?id=10820445)

and
[https://news.ycombinator.com/item?id=11659026](https://news.ycombinator.com/item?id=11659026)

2017:
[https://news.ycombinator.com/item?id=14088092](https://news.ycombinator.com/item?id=14088092)

------
ehonda
You should try out [http://wiby.me](http://wiby.me); only lightweight
websites are allowed in that search engine, although the index is
comparatively small.

------
ErikAugust
You can trim the fat off articles with
[https://beta.trimread.com](https://beta.trimread.com). Going to open source
it too.

