
Google Makes Web Pages Load Instantly - olalonde
http://www.technologyreview.com/computing/37818/?p1=A4&a=f
======
piotrSikora
This is really bad news for website owners... We'll see an increased (sometimes
by an order of magnitude) number of page views and increased bandwidth usage,
without any increase in the number of real visitors.

Same thing happened with DNS when Google enabled DNS pre-fetching in Chrome.

~~~
jamesaguilar
Seems like no major disaster came out of the DNS thing. Anyway, this could be
a benefit for many webmasters. Google has demonstrated in the past that faster
load times mean more users stick with a result. Plus, I'd be really surprised
if there were no way to disable it if it really caused a problem.

~~~
piotrSikora
Disaster? No. Unnecessary cost increase for people operating DNS servers? Yes.

Good example / explanation of the problem:

[http://www.pinkbike.com/news/DNS-Prefetching-Implications.ht...](http://www.pinkbike.com/news/DNS-Prefetching-Implications.html) (cached: [http://webcache.googleusercontent.com/search?q=cache:-3Lip30...](http://webcache.googleusercontent.com/search?q=cache:-3Lip30xln4J:www.pinkbike.com/news/DNS-Prefetching-Implications.html&cd=7&hl=en&ct=clnk&source=www.google.com))

------
tectonic
Seems like this would completely destroy any chance of using web server visit
history for legal prosecution, since even visiting a site that links to
another site could cause you to appear in the second site's logs. Perhaps this
is a good thing.

~~~
drivebyacct2
My thoughts as well, but who uses web server visit history for legal
prosecution these days? What is it proof of, and what is the evidence used for?

~~~
jarin
"On November 26, 2008, officials released 700 pages of documents related to
the Anthony investigation, which included evidence of Google searches of the
terms "neck breaking," "how to make chloroform," and "death" on Casey
Anthony's home computer."

<http://en.wikipedia.org/wiki/Death_of_Caylee_Anthony>

It doesn't seem too out of left field that investigators would use your
browser history as well.

~~~
tga
The searches themselves, already provided by Google, combined with actually
continuing to some of the results, also logged by Google, are incriminating
enough on their own. The only thing you could claim is that ISP traffic logs
couldn't and wouldn't distinguish between you and Chrome loading pages, but
that didn't even come into play in the case you quote.

~~~
pasbesoin
As per recent news (amongst other things), links/linking are in the process of
being criminalized in some contexts. Do you want to hit a page containing a
link and become "guilty" of following that link, without ever clicking on it?

It may seem far-fetched at the moment, but I'm reminded of that East Coast
teacher who was fired because the classroom computer was infected (apparently,
not by her actions) and started popping porn up on the screen.

Common sense quickly demonstrated that she was not, in any personal sense, at
fault. She still lost her job; was, IIRC, at least considered for criminal
prosecution; and had to spend months and probably lots of lawyer fees
attempting to clear her name.

I am mindful to be very cautious about assuming a limit to the stupidity or
maliciousness of people who will be making assumptions -- or, regardless of
their own understanding of a situation, simply have an agenda to push -- in
the future.

------
tga
I currently disable DNS prefetching and "using web services to improve my
browsing experience". This is yet another feature that will go on my black
list.

First and foremost, I think they are solving a problem nobody has, potentially
for "my Chrome is faster than your Firefox" points. I do not mind waiting an
extra _300ms_ to have a modicum of control over what my browser does. I don't
know about others, but page loading is not usually something I even notice
anymore.

Secondly, fully rendering sites I never visit, including script execution,
seems like a seriously bad idea. This turns a mildly annoying but slightly
beneficial feature (loading resources in the background) into a monster,
especially when combined with abominations like ro.me on autoplay.

Thirdly, I could imagine scenarios in which my Google searches would be
leaking this way to sites that have no business seeing my traffic.

I realize that what they are currently talking about is opt-in, but I can only
hope that, in time, this feature won't be picked up by the other browsers and
become the silent default for everyone.

~~~
DavidSJ
_I do not mind waiting an extra 300ms for having a modicum of control over
what my browser does._

Your point is still legitimate, but the 1,100 ms (not 300 ms) in the photo you
refer to is the time to serve the search query results, _not_ the time to load
the page you then click on, which usually takes several seconds more.

~~~
tga
Point taken, I was going for dramatic effect there.

Rendering the page, after selecting the result, does take a lot longer (6s for
something like the BBC home page here) but I still maintain that it's largely
invisible and not something most users would complain about or even notice.

~~~
DavidSJ
Speed is invisible before you have it, and once you have it.

It's when it gets taken away again that you notice.

------
noibl
If, as is hinted, they restrict this to SERPs that carry one canonical result,
such as 'facebook login' or -cough- 'html5 spec', it could be a great thing.
If after some interval you do the same search and again click the top result,
Chrome might even start redirecting you there straight from the omnibox. But
if they apply this technique too broadly it's going to be a big drain on
server resources and sites would probably start using visibilityState* to lazy
load their stuff.

* Thanks for that link, abarth

------
wozname
I'd be scared of it automatically downloading inappropriate or, even worse,
illegal stuff from the search results. Try explaining to the boss that you did
not visit that NSFW site, it merely turned up in your search results.

~~~
mentat
This is a good point. It's not just the "used in a court of law" or "national
security" instances where this is dangerous. Corporations don't have to follow
any formal legal process. If you're using a proxy or they're otherwise
monitoring outbound traffic and you have this feature turned on, it will look
like you are visiting inappropriate sites.

------
Tichy
What about tracking cookies? Sounds like this will silently announce my
presence to more web sites than I care for.

------
dedward
I would think the security concerns would be first and foremost. When I hit up
Google for a list of sites, I want a list of sites. Whether my browser visits
them or not is up to me - I don't want it happening automatically.

Now, personally, I know I can turn this off, and I also understand the speed
advantage for end-user retention. People might say 300ms doesn't matter to
them - maybe not consciously - but it certainly DOES matter when it comes to
retention. People like faster sites automatically - it's a subconscious thing,
very well proven by good studies (Amazon, etc). A 300ms savings on page load
is a HUGE win from the site owner's point of view.

Give site owners some kind of passive control over this feature (à la
robots.txt for spiders) so those who are really concerned about bandwidth can
opt out (though if you are top-ranking for a Google search term you probably
WANT that traffic - that's money down the drain otherwise), plus appropriate
controls, and it's a good idea.

In fact, it's an inevitable idea, with its ups and downs. It's going to
happen - so let's roll with it.

------
kasmura
I first thought that this was something implemented only in Chrome for Google,
but then I saw that it is an HTML feature that they will also add to Google
Search.

I don't like it if Google uses its own browser to give its sites features that
could not be implemented without it - a website's features should only be made
from source code on the server, in my opinion.

This is core innovation from Google, but I see the problem with inaccurate
website visitor counts. I think Chrome, when prefetching pages, should send
some information with the request indicating that it is a prefetch.
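If the browser did label prefetch requests with a header, server-side stats could simply skip them. A sketch of that idea; the header names are assumptions (Firefox's link prefetcher has sent `X-Moz: prefetch`, but exact Chrome behavior may differ):

```javascript
// Check whether a request looks like a prefetch, based on assumed headers.
// Header names here are assumptions, not a documented Chrome contract.
function isPrefetchRequest(headers) {
  const h = {};
  for (const k of Object.keys(headers)) h[k.toLowerCase()] = headers[k];
  return h["x-moz"] === "prefetch" || h["purpose"] === "prefetch";
}

// Only count a pageview for requests that are not flagged as prefetches.
function countPageview(stats, headers) {
  if (!isPrefetchRequest(headers)) stats.pageviews += 1;
  return stats;
}

const stats = { pageviews: 0 };
countPageview(stats, { "Purpose": "prefetch" });       // ignored
countPageview(stats, { "User-Agent": "Mozilla/5.0" }); // counted
console.log(stats.pageviews); // 1
```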

------
dylan62
Hasn't Firefox supported this for years?
<https://developer.mozilla.org/En/Link_prefetching_FAQ>

~~~
Joakal
That is more friendly for web owners. Control over bandwidth.

Google's prefetching means the most likely result in the instant search list
is automatically loaded in the background before the user clicks on it.

Web owner nightmare: having a common irrelevant keyword that means much of the
world downloads your content without viewing the page.

------
jmcqk6
Remember, this makes it vitally important that the "delete" links in your web
apps aren't actually links, but forms. Load up a data grid and boom! All your
resources are gone.
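The underlying issue: a prerenderer (like any crawler) issues plain GET requests, so any destructive action reachable via a bare link can fire without a click. A minimal illustration (URLs hypothetical):

```html
<!-- Dangerous: a prefetcher following this GET would delete the item. -->
<a href="/items/42/delete">Delete</a>

<!-- Safer: destructive actions go through a POST form, which prefetchers
     and prerenderers do not submit. -->
<form method="POST" action="/items/42/delete">
  <button type="submit">Delete</button>
</form>
```

This is the same rule HTTP has always had - GET must be safe and side-effect-free - just with a new way to get burned for violating it.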

~~~
benologist
If I'm reading the design doc correctly it's only going to be triggered by
<link rel="prerender"/> so it's not by default going to affect anyone.

~~~
dangrossman
The <link rel="prerender"/> will be on Google's search result pages
instructing Chrome to prerender the most relevant search result. The control
is in Google's hands, not the site owner, by default.

~~~
benologist
Right, but beyond that it's not going to go crazy prefetching every URL in
your web apps as the user browses around, triggering deletes or other actions,
unless you _also_ put that <link> in?

------
nsomaru
What about places where internet access is extremely expensive and comes with
low usage limits?

There will probably be a switch, but non-techs usually don't flip switches.

------
pasbesoin
I am annoyed at the "feature". I am fearful of the day I will not be able to
disable it.

When the Chrome development team decides something should be a certain way,
they are sometimes not at all accommodating of differing viewpoints.

------
nazgulnarsil
So now, if I'm even on the same page as a link to a malicious script, I'm
fucked. If this is implemented I will never start up Chrome again.

~~~
Nogwater
I doubt Chrome is going to start running the scripts for the pages that it's
pre-fetching.

~~~
dangrossman
You haven't read the rest of the comments here, including the links to the
specs for this feature. The entire page is loaded in a hidden tab and all
scripts are executed.

------
ronnoch
In order to be instant, it would have to download all images, scripts and
stylesheets too - wouldn't that screw up everyone's web stats?

~~~
abarth
Here's the design document: [http://dev.chromium.org/developers/design-documents/prerende...](http://dev.chromium.org/developers/design-documents/prerender)

The browser does download subresources, including images, scripts, and
stylesheets. The new page visibility API is intended to help folks keep
accurate web stats:

<http://w3c-test.org/webperf/specs/PageVisibility/>
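A sketch of how an analytics script could use that API to avoid counting prerendered loads. The property and event names follow the draft spec (`document.visibilityState`, `visibilitychange`); the wiring is an assumption, not code from any actual analytics product:

```javascript
// Core decision logic, kept as a pure function so it is easy to test:
// a pageview counts only once the document is actually visible to the user.
function shouldRecordNow(visibilityState) {
  return visibilityState !== "prerender" && visibilityState !== "hidden";
}

// Browser wiring (assumes the draft Page Visibility API). In a prerendered
// hidden tab, the pageview is deferred until the user switches to the tab,
// so sites that were prerendered but never viewed record nothing.
function trackPageview(doc, record) {
  if (shouldRecordNow(doc.visibilityState)) {
    record();
  } else {
    doc.addEventListener("visibilitychange", function onVisible() {
      if (shouldRecordNow(doc.visibilityState)) {
        doc.removeEventListener("visibilitychange", onVisible);
        record();
      }
    });
  }
}
```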

~~~
alexmuller
Does this mean analytics providers are scrambling to implement this API before
the Chrome build goes stable?

------
dillon
Umm, pretty much awesome I'd say.

~~~
bonch
I don't like the idea of the browser visiting pages before I've told it to.

~~~
drivebyacct2
It's not rendering it (I would imagine, at least), so I don't see this as a
security concern. Do you?

~~~
dangrossman
Actually, it _is_ rendering it, in a hidden tab. As far as the site will know,
you visited. Last login times get updated, web stats record hits, stuff gets
marked as read, etc.

~~~
drivebyacct2
So scripts execute too. Hm, that is a bit surprising. Well, I trust Chrome so
much that I don't block scripts, and I go to any site that pops up in a Google
search without any hesitation, so I suppose I don't mind.

------
hardy263
Wouldn't this be a crazy bandwidth hog? If I typed in multiple search terms
and didn't load any of my results, it would have automatically downloaded
everything for no reason.

~~~
abraham
In Google's own words they only preload the top result if they are confident
it is what you are looking for. If they pull it off there will be negligible
amount of unused preloaded pages.

~~~
NaOH
Shouldn't the top result inherently represent what Google thinks is what I
want? Or is there some secondary "confidence" component to the presumed
efficacy of a top result?

~~~
stellar678
I'd presume they could look at the difference between their confidence
(relevance?) in the first and second results. If the difference is more than
some threshold, _bam_ preload the first.
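That heuristic could look something like this. The scores and the threshold are made up for illustration; Google's actual ranking signals are not public:

```javascript
// Hypothetical heuristic: prerender the top result only when it is far more
// relevant than the runner-up, so few prerenders are wasted on ambiguous
// queries. `scores` is a list of relevance scores in ranked order.
function shouldPrerenderTop(scores, threshold) {
  if (scores.length === 0) return false;
  if (scores.length === 1) return true; // only one candidate
  return scores[0] - scores[1] >= threshold;
}

console.log(shouldPrerenderTop([0.95, 0.31], 0.4)); // true: clear winner
console.log(shouldPrerenderTop([0.55, 0.52], 0.4)); // false: ambiguous query
```

Navigational queries like "facebook login" would show a huge gap between the first and second result, which matches Google's claim of only preloading when confident.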

------
thadeus_venture
Ad services will love this (not).

------
jamesgagan
another really cool product of google that serves to make us more dependent on
them...

~~~
younata
They're releasing the source. From the third paragraph of the second page:

"Other Web browsers could also employ Instant Pages, because Google has
released the necessary code for all to use. "We are opening up the code
because we want other browsers to implement it—it is good for the users and
for the Web," said Singhal."

Even if this weren't the case, as drivebyacct2 said, this doesn't do anything
to make us dependent on them, especially when compared to gmail or google
docs.

------
nvictor
Isn't it funny that someone always has to comment "X browser had it for a
while..."? Well, if X browser had it, it didn't care enough to let the
populace know about it.

