
FasterWeb Wants To Make The Entire Web Up To Ten Times Faster In 2010 - vaksel
http://www.techcrunch.com/2009/07/19/fasterweb-aims-to-make-the-web-up-to-ten-times-faster-and-gets-money-to-do-so/
======
jknupp
"Here's a company we heard about, trying to do something that sounds
ridiculous. They won't tell us how they're going to do it, nor will they name
any of their clients. They have funding, though, and since no funded startup
ever failed, we (TC) will breathlessly report on their existence."

Here's a litmus test for TC to use: if a) a startup makes a wild claim and
the only corroboration you can find is from their funders, or b) your
article requires the phrase "so we'll just have to take their word for it,"
rest assured you can skip the story.

~~~
anigbrowl
It is indeed a terrible story, but the VC has a decent-looking track record:
<http://videolectures.net/yoav_andrew_leitersdorf/>

------
mahmud
What an absurd claim. I went to read the story to find either the meat or
to spot a grammatical error or missing specifics:

 _One VC firm, YL Ventures, believes that it can. And they’ve seen it in
action, so we’ll just have to take their word for it, for now._

~~~
omouse
Yet another post that proves that TC is as respectable a news source as CNN
and other American news organizations that do not ask the hard questions and
do not criticize bullshit.

TC is an entertainment tabloid, and it's messed up because you can also see
that they want to be respectable and report useful information. They have a
pretty good company database, they've collected some info into a quarterly
analysis package, etc., but they still perpetuate the Web 2.0 celebrity
gossip.

------
SwellJoe
My previous company built technology in this space, based on Squid and some
custom tools. I vividly recall competitors making bold, outlandish claims
like these on a monthly basis back then. None of them ever came to pass.
They required too much infrastructure build-out, too much cooperation from
ISPs and website owners, and the companies behind them demanded too much
involvement and too much money from all parties in the chain. There was a
CDN rush back then, too, and only a few managed to raise enough money and
build out fast enough to be successful at it. This kind of technology
requires a build-out similar to a CDN's, but with far more involvement from
parties that are very unlikely to have an interest in being involved.

I've had long conversations with folks at Akamai, Red Swoosh, and many others,
in this particular area...and it's astonishing how much money it takes to
build this stuff out, and finding profitable ways to build out small (making
it useful enough to make money on _without_ first spending millions and
signing on dozens of ISPs) is very difficult. I just don't think there is that
much VC money floating around right now, even if (big giant if) they've
actually figured out technology that works on the scale they're promising.

------
ggruschow
Someone here probably knows this: How much latency would you cut if your
average page request to msn.com, yahoo.com, etc. resulted in a single,
instantly-full-speed download of an archive of all the content the browser
requires to show the page?

Actually, hitting msn.com and yahoo.com now, it looks like each takes <2
seconds on this computer on a normal-ish broadband connection without any
cache. NYTimes and Bing took about 5, but they had all the useful stuff up in
<2.

I suppose it'd be desirable for all of those to be <0.1 seconds, but that'd
be darned hard considering normal ping latencies. Between 2 seconds and 0.1
seconds, I'm not sure how much I care... I still see the delay, but I don't
think it makes much difference to me in normal surfing.
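
(For a sense of the arithmetic, here's a crude back-of-envelope model in
Python. Every number in it is an assumption, and it ignores DNS, TLS, TCP
slow start, and server think time; it mainly shows that bundling everything
into one archive saves round trips, not transfer time.)

    # Crude model: many small requests vs. one bundled archive.
    # All numbers are illustrative assumptions, not measurements.
    RTT = 0.08            # seconds per round trip (assumed)
    BANDWIDTH = 5e6 / 8   # bytes/sec on an assumed ~5 Mbit/s link

    def page_load(num_resources, total_bytes, parallel_conns=6):
        """Each 'wave' of parallel requests costs one round trip,
        then the payload streams at line rate."""
        waves = -(-num_resources // parallel_conns)  # ceiling division
        return waves * RTT + total_bytes / BANDWIDTH

    # A portal-style page: ~60 small resources totalling ~800 KB...
    print(f"many requests: {page_load(60, 800_000):.2f}s")  # ~2.08s
    # ...versus the same bytes delivered as one archive.
    print(f"one archive:   {page_load(1, 800_000):.2f}s")   # ~1.36s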

~~~
pj
I saw something the other day where Firefox will let you browse a website
that is zipped up, using jar: URLs like this (the ! separates the archive
from the path inside it):

<jar:http://somewebsite.com/somezip.zip!/index.html>
<jar:http://somewebsite.com/somezip.zip!/images/logo.png>

Then the browser only downloads the zip file once and everything else is
cached. I can't seem to find the link anymore, though, because everything
in the search engines is so frickin' SEO'd that all I can find with "zip"
in the search terms is WinZip or WinRAR or something about putting a link
to a zip on your website...

This is the closest I can find: <http://www.aburad.com/blog/2008/05/view-contents-of-zipjar-files-using-firefox.html>
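
(To make the idea concrete, here's a toy Python sketch of the same pattern:
one fetch for the archive, then every asset comes from a local read instead
of another HTTP round trip. The URL and file names are the made-up ones
from the example above.)

    # One network fetch for the whole archive...
    import io
    import zipfile
    from urllib.request import urlopen

    with urlopen("http://somewebsite.com/somezip.zip") as resp:
        archive = zipfile.ZipFile(io.BytesIO(resp.read()))

    # ...then every later "request" is a local read with no extra
    # network round trip.
    index_html = archive.read("index.html")
    logo_png = archive.read("images/logo.png")
    print(len(index_html), len(logo_png))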

~~~
sounddust
You're probably thinking of a combination of this:

<http://kaioa.com/node/99>

Which explains how to use JAR archives in Firefox instead of CSS sprites to
optimize page loads, and this:

<http://limi.net/articles/resource-packages>

Which is a proposal for a universal standard for packaging resources like
this.

~~~
pj
Yep, that's it. I think that'd be a really cool way to speed up the web.

------
prakash
Making a website 2 to 10 times faster is not that hard: put it on a CDN.
The usual suspects in increasing performance (sketched in code after the
list) are:

\- compression

\- caching (origin, edge, browser)

\- persistent connections

\- tcp-optimizations
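
(In case it helps to see those knobs in one place, here's a minimal sketch
of the first three using only the Python standard library; the content,
port, and cache lifetime are illustrative assumptions. TCP optimizations
live in the kernel or load balancer, so they're omitted.)

    import gzip
    from http.server import HTTPServer, BaseHTTPRequestHandler

    class OptimizedHandler(BaseHTTPRequestHandler):
        # Persistent connections: HTTP/1.1 keeps the TCP socket open
        # between requests (needs the Content-Length header below).
        protocol_version = "HTTP/1.1"

        def do_GET(self):
            body = b"<html><body>hello</body></html>"  # stand-in page
            # Compression: gzip the payload if the client accepts it.
            gzip_ok = "gzip" in self.headers.get("Accept-Encoding", "")
            if gzip_ok:
                body = gzip.compress(body)
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            if gzip_ok:
                self.send_header("Content-Encoding", "gzip")
            # Caching: let browsers and proxies keep this for a day.
            self.send_header("Cache-Control", "public, max-age=86400")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("", 8080), OptimizedHandler).serve_forever()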

~~~
lsb
Caching is easy. The hard part is knowing when to expire an object in cache.

~~~
SwellJoe
That's only the hard part if the site doesn't tell you when to expire it.
Squid (and many others) has excellent cache expiry and replacement
heuristics, but Squid can only cache 20-35% of the web, because it's simply
not safe to cache the rest of it: either it's session-based and could be
different for every user, or it's SSL-encrypted and can't be seen by the
proxy, or it explicitly disallows caching with Cache-Control or Expires
headers, etc. And even if the site doesn't tell you how long you can cache
something, it's reasonably safe to guess based on the age of the object: a
five-year-old object is probably not going to change in the three days it
takes to run through a full replacement in your cache, while one that is 30
minutes old could plausibly change dozens of times, so Squid will send an
If-Modified-Since a few minutes later, with gradually lengthening periods
between checks as the age of the object increases.
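
(Roughly what that guess looks like in code, along the lines of Squid's
refresh_pattern percent-of-age rule; the 20% factor and the bounds are
illustrative assumptions, though the three-day cap echoes the replacement
window mentioned above.)

    from datetime import datetime, timedelta

    def heuristic_fresh_for(last_modified, now, lm_factor=0.20,
                            min_ttl=timedelta(minutes=5),
                            max_ttl=timedelta(days=3)):
        """With no Cache-Control/Expires, guess a TTL proportional to
        the object's age: old objects are assumed stable, young ones
        volatile. When the TTL lapses, the cache revalidates with
        If-Modified-Since; each 304 makes the object effectively
        older, so the next guessed TTL grows."""
        age = now - last_modified
        return max(min_ttl, min(max_ttl, age * lm_factor))

    now = datetime(2009, 7, 19)
    print(heuristic_fresh_for(now - timedelta(days=5 * 365), now))
    # -> 3 days, 0:00:00 (capped at the cache's replacement window)
    print(heuristic_fresh_for(now - timedelta(minutes=30), now))
    # -> 0:06:00 (recheck with If-Modified-Since in a few minutes)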

------
lsc
woo! caching and compression! I feel like it's 1998 again!

