

Is there a 'Moore's Law' for web pages? - jgrahamc
http://blog.jgc.org/2012/06/guardians-home-page-appears-to-be.html

======
dave1010uk
The HTTP Archive has been tracking page weight (and many other stats) for
thousands of websites for a few years now:

<http://httparchive.org/trends.php>

As any website performance optimizer will tell you, increasing broadband
bandwidth won't really help at all with this "inflation". The biggest
component of page load time is usually the number of round trips, which only
reducing latency can help with.
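
A rough back-of-the-envelope model makes the point: load time is roughly
(serial round trips x RTT) + (bytes / bandwidth). A toy calculation in
Python (all numbers are illustrative, not measurements):

    # Toy model of page load time. All numbers are illustrative.
    def load_time(bandwidth_bps, rtt_s, serial_trips, page_bytes):
        # Waiting on round trips, plus transferring the bytes themselves.
        return serial_trips * rtt_s + page_bytes * 8 / bandwidth_bps

    PAGE = 1_000_000  # a 1 MB page

    base   = load_time(10e6, 0.100, 30, PAGE)  # 10 Mbps, 100 ms RTT
    fast   = load_time(20e6, 0.100, 30, PAGE)  # double the bandwidth
    snappy = load_time(10e6, 0.050, 30, PAGE)  # halve the latency

    print(f"base {base:.2f}s, 2x bandwidth {fast:.2f}s, 1/2 latency {snappy:.2f}s")
    # base 3.80s, 2x bandwidth 3.40s, 1/2 latency 2.30s

Doubling bandwidth saves 0.4s here; halving latency saves 1.5s.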

~~~
VikingCoder
> The biggest component of page load time is usually the number of round
> trips, which only reducing latency can help with.

You can also reduce the number of round trips by using something like SPDY.

~~~
dave1010uk
True. Only lower latency can shorten a single round trip, but reducing the
number of serial requests also helps. In addition to SPDY, concatenating or
inlining scripts, using image sprites, and upping the number of parallel
connections all help.
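
For instance, a build step can merge every script into one file so the
browser issues a single request instead of several. A minimal sketch (the
file names are hypothetical; a real build would also minify):

    # Concatenate scripts so the browser makes one request, not three.
    from pathlib import Path

    scripts = ["jquery.js", "plugins.js", "app.js"]  # hypothetical names
    bundle = ";\n".join(Path(name).read_text() for name in scripts)
    Path("bundle.js").write_text(bundle)
    print(f"merged {len(scripts)} scripts into bundle.js")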

------
givan
I think it's related to bandwidth and processing power: internet connection
speed increases, and as long as the page loads in a short time, a few seconds
say, it's considered okay and more content gets added.

Some years ago video on the web was a crazy idea; now it's normal.

~~~
mtrimpe
My guess is that eventually (a lot of) web sites will become more like games,
where the highest-quality experience is achieved by maximally utilizing the
resources available.

So in that sense we could expect many websites' bandwidth usage to follow
Nielsen's law [<http://www.useit.com/alertbox/980405.html>].
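
Nielsen's law posits roughly 50% annual growth in a high-end user's
connection speed, which compounds quickly. A quick illustration (the 10 Mbps
starting point is made up):

    # Nielsen's law: ~50% bandwidth growth per year for a high-end user.
    speed = 10.0  # Mbps, illustrative starting point
    for year in range(1, 6):
        speed *= 1.5
        print(f"year {year}: {speed:.1f} Mbps")
    # year 5: 75.9 Mbps, i.e. roughly 7.6x in five years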

------
EzGraphs
Interesting premise, but it might be a bit more convincing with a sample size
larger than one (The Guardian).

------
Alex3917
Broadband is only getting roughly 10% faster per year, so it doesn't really
make sense that this would be a universal law. At the very least it's unlikely
to continue over the next 5-10 years since broadband saturation is already
pretty good in the US.
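
For scale: at 10% a year it takes about 7.3 years for bandwidth to double,
since log(2) / log(1.1) ≈ 7.27. A one-liner to check:

    import math
    print(math.log(2) / math.log(1.1))  # ~7.27 years to double at 10%/year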

------
VikingCoder
I'd love to see the same calculation for all of the bytes needed to render a
collection of web pages; a rough sketch follows the list.

<http://yahoo.com>

<http://cnn.com>

<http://slashdot.org>

<https://www.google.com/search?q=pancakes>
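
A rough starting point (note this counts only each page's HTML document, not
the images, scripts, and stylesheets a full render pulls in; a headless
browser or a HAR capture would be needed for the true figure):

    # Fetch each page and count the bytes of the HTML document alone.
    import urllib.request

    pages = [
        "http://yahoo.com",
        "http://cnn.com",
        "http://slashdot.org",
        "https://www.google.com/search?q=pancakes",
    ]

    for url in pages:
        # Some sites (e.g. Google) reject requests without a User-Agent.
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                print(f"{url}: {len(resp.read())} bytes (HTML only)")
        except OSError as err:
            print(f"{url}: fetch failed ({err})")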

~~~
EzGraphs
Does anybody happen to have collected periodic snapshots of these pages on
<http://www.freezepage.com>?

I looked around on the Wayback Machine a bit but it seems like images are not
always stored.

~~~
zio99
A bit of a tangent, but if I don't ask now, I never will: has anyone applied
the _Internet Wayback Machine to Google Maps_ - so a kind of _"time machine"
meets Street View_, where you can see what your neighbourhood looked like 6
years ago, before that condo blocked your scenic view? It would be a neat
idea. If anyone's come across a solution, please share.

~~~
VikingCoder
I believe Google Earth somewhat does it!

<http://earth.google.com/outreach/tutorial_time.html>

------
zio99
JGC, I was trying a similar computation in Wolfram Alpha for the Startup
Framework that I'm building, but it seems that this is available only for
premium users? Am I correct, or did you just use their regular computational
engine for the curve fitting?

~~~
jgrahamc
I was just using regular Wolfram Alpha.
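
If the free version won't do the fit, it's easy to reproduce yourself with
SciPy. A minimal sketch; the (year, kilobyte) data points are placeholders,
not the Guardian's actual numbers:

    # Fit an exponential growth curve to page-weight measurements.
    import numpy as np
    from scipy.optimize import curve_fit

    years = np.array([0.0, 1.0, 2.0, 3.0, 4.0])       # years since first sample
    weights_kb = np.array([100.0, 160.0, 250.0, 410.0, 640.0])  # placeholders

    def growth(t, a, r):
        return a * np.exp(r * t)

    (a, r), _ = curve_fit(growth, years, weights_kb, p0=(100.0, 0.5))
    print(f"fit: {a:.0f} KB * exp({r:.2f} t), doubling every "
          f"{np.log(2) / r:.1f} years")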

------
bornhuetter
I thought they might be talking about the density of articles on the page
growing exponentially.

The Guardian has one of the worst layouts of any webpage I regularly visit.

