
Show HN: Make your site’s pages instant in one minute - dieulot
https://instant.page/
======
WestCoastJustin
Live demo on my website @
[https://sysadmincasts.com/](https://sysadmincasts.com/)

Temporarily added it inline for testing. I was already in the sub-100ms range
but this just puts it over the top! Also updated all admin
add/edit/delete/toggle/logout links with "data-no-instant". Pretty easy. Open
the developer tools and watch the network tab. Pretty neat to watch it prefetch!
Thanks for creating this!

ps. Working on adding the license comment. I strip comments at the template
parse level (working on that now).

pps. I was using
[https://developers.google.com/speed/pagespeed/insights/](https://developers.google.com/speed/pagespeed/insights/)
to debug page speed before, then working down the list of its suggestions.
Scoring 98/100 on mobile and 100/100 on desktop. I ended up inlining some CSS,
converting most images to base64 and inlining them (no extra HTTP calls),
heavily caching DB results on the backend, writing the CMS in Go, and using a
CDN (with content caching), all to get to sub-100ms page loads. Pretty
hilarious when you think about it, but it works pretty well.

~~~
fwip
If I mouse-over the same link 10 times, it looks in my network tab like it
downloads the link 10 times.

I'd expect this preload script to remember the pages it's already fetched and
not duplicate work unnecessarily. :/
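A small cache would be enough to make repeat hovers a no-op. A minimal sketch (not instant.page's actual code; the function names and the injected-`<link>` approach are illustrative):

```javascript
// Remember already-prefetched URLs so repeated hovers don't re-download.
const prefetched = new Set();

// inject defaults to adding a <link rel="prefetch"> tag; it is
// parameterized here so the logic can be exercised outside a browser.
function prefetchOnce(url, inject = injectPrefetchLink) {
  if (prefetched.has(url)) return false; // already requested, skip
  prefetched.add(url);
  inject(url);
  return true;
}

function injectPrefetchLink(url) {
  const link = document.createElement('link');
  link.rel = 'prefetch';
  link.href = url;
  document.head.appendChild(link);
}
```

One subtlety: a cache like this assumes the target page doesn't change between hovers, which is why deferring to HTTP caching (see below in the thread) is arguably cleaner.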

~~~
aruggirello
Perhaps the author could add a script parameter, or support an optional
'preload-cache-for' attribute, so you'd write <a preload-cache-for="300s" ...>

If you really care about speed anyway, you should already have set up your
site to max out caching opportunities (ETag, Last-Modified, and replying "304
Not Modified" to If-Modified-Since requests). I'd suggest the author ensure
the script supports caching to the broadest extent possible, hitting your
site only when appropriate.

~~~
dharmab
Cache-Control headers already do a better job of solving that problem
[https://developer.mozilla.org/en-
US/docs/Web/HTTP/Headers/Ca...](https://developer.mozilla.org/en-
US/docs/Web/HTTP/Headers/Cache-Control)
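For instance, a response header like the following (an illustrative value, not a recommendation) lets the browser serve a repeatedly hovered page from its HTTP cache instead of re-fetching it each time:

```
Cache-Control: public, max-age=300
```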

------
xpose2000
I've been testing it for the past 30 minutes or so and found that it doesn't
cause the same problems that InstantClick did (JavaScript errors that would
occur at random). I'll limit it to a small subset of users to see if any
errors are reported, but there is a good chance this could go live for all
logged-in users. Maybe even all website visitors if all goes well.

Seems to have no impact on any javascript, including ads. Pages do load
faster, and I can see the prefetch working.

Just make sure you apply the data-no-instant attribute to your logout link,
otherwise it'll log out on mouseover.
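For example (illustrative markup; data-no-instant is the opt-out attribute mentioned throughout this thread):

```html
<a href="/logout" data-no-instant>Log out</a>
```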

~~~
hn_throwaway_99
> Just make sure you apply the data-no-instant attribute to your logout link,
> otherwise it'll log out on mouseover.

Logout links should never be GETs in the first place - they change state and
should be POSTs.

~~~
slim
POSTs are not links. And a logout service is idempotent, even if you consider
that it changes the state of the system.

~~~
glacials
Idempotency is not the issue, the issue is that a user might hover over the
logout link, not click it, then move on to the rest of the site and find they
are logged out for no reason.

~~~
leesalminen
Right, which is why the library includes an HTML attribute to disable
prefetching on a given link.

~~~
pbreit
OP’s point was that logout should not be implemented with a link/GET but
instead with a button/POST for exactly this reason.

~~~
leesalminen
A logout action is idempotent, though. You can't get logged out twice. In my
opinion, that's the use case for a GET request.

I just checked NewRelic, Twilio, Stripe and GitHub. The first 3 logged out
with a GET request and GitHub used a POST.

~~~
zepolen
Idempotency has nothing to do with it. Deleting a resource is idempotent as
well. You wouldn't do that via GET /delete

A GET request should never, _ever_ change state. No buts.

Just because a bunch of well known sites use GET /logout to logout does not
make it correct.

Doing anything else, as demonstrated in this and other cases, breaks web
protocols. The right thing to do is:

    GET /logout   returns a page with a form button to log out
    POST /logout  logs you out
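A minimal sketch of that pattern (illustrative markup, not taken from any of the sites mentioned):

```html
<!-- Served by GET /logout: a harmless confirmation page -->
<form method="POST" action="/logout">
  <button type="submit">Log out</button>
</form>
```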

~~~
derefr
Depends on your definition of “state.” A GET to a dynamic resource can build
that resource (by e.g. scraping some website or something—you can think of
this as effectively what a reverse-proxy like Varnish is doing), and then
cache that built resource. That cache is “state” that you’re mutating. You
might also mutate, say, request metrics tables, or server logs. So it’s fine
for a GET to cause things to happen—to change _internal_ state.

The requirement on GETs is that it must result in no changes to the observed
_representational state_ transferred to any user: for any pair of GET requests
a user might make, there must be no change to the representation transferred
by one GET as a side-effect of submitting the other GET first.

If you are building dynamic pages, for example, then you must maintain the
illusion that the resource representation “always was” what the GET that built
the resource retrieved. A GET to a resource shouldn’t leak, in the transferred
representation, any of the internal state mutated by the GET (e.g. access
metrics.)

So, by this measure, the old-school “hit counter” images that incremented on
every GET were incorrect: the GET causes a side-effect observable upon another
GET (of the same resource), such that the ordering of your GETs matters.

But it _wouldn’t_ be wrong to have a hit-counter-image resource at
/hits?asof=[timestamp] (where [timestamp] is e.g. provided by client-side JS)
that builds a dynamic representation based upon the historical value of a hit
counter at quantized time N, and _also_ increments the “current” bucket’s
value upon access.

The difference between the two is that the resource /hits?asof=N would never
be retrieved until N, so its transferred representation can be defined to
have “always been” the current value of the hit counter at time N, and then
cached. Ordering of such requests doesn’t matter a bit; each one has a
“natural value” for its transferred representation, such that out-of-order
GETs are fine (as long as you’re building the response from historical
metrics).

~~~
zepolen
Don't be a wise ass; by that definition, state changes all the time in memory
registers even when no requests are made.

> So, by this measure, the old-school “hit counter” images that incremented on
> every GET were incorrect

Yes, they are incorrect. No buts.

Two requests hitting that resource at the same exact timestamp would increase
the counter once if a cache was in front of it.

------
js4ever
Each time you hover over a link it does a GET request that bypasses the cache
(cache-control: max-age), even if you hover over the same link multiple
times. This will also skew all your analytics... That said, it can indeed
greatly improve the user's perception of speed.

~~~
mariopt
Analytics should only be triggered if the page is rendered, assuming it's
done client-side. I believe Google does this for the top 3 results, if I'm
not mistaken.

~~~
hombre_fatal
Good point.

What would be a good way to avoid prefetch requests in your analytics if you
only derive analytics from server access logs?

~~~
8bitben
Most browsers pass a "purpose:prefetch" or similar HTTP header in the prefetch
request that you can use to differentiate
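Header names vary across browsers and versions, so a log filter has to check a few candidates. A sketch (`Purpose`, `Sec-Purpose`, and `X-Moz` are the commonly cited ones, but verify against your own access logs):

```javascript
// Returns true when a request looks like a browser-initiated prefetch,
// based on the advisory headers some browsers attach to such requests.
function isPrefetch(headers) {
  // Normalize header names and values to lowercase for comparison.
  const h = {};
  for (const [name, value] of Object.entries(headers)) {
    h[name.toLowerCase()] = String(value).toLowerCase();
  }
  return (
    (h['purpose'] || '').includes('prefetch') ||     // Safari, older Chrome
    (h['sec-purpose'] || '').includes('prefetch') || // newer Chromium
    h['x-moz'] === 'prefetch'                        // older Firefox
  );
}
```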

~~~
hueving
Then how do you know when they actually go to the page? Do you need client
side analytics at that point since the browser already has the page in memory?

------
cjblomqvist
It seems everybody is missing this, but it could actually slow down your
experience, and I'd guess it will in some scenarios (i.e. not only in
theory).

Consider a user hovering over a bunch of links and then clicking the last
one, all within a second. Let's assume your site takes 3 seconds to load
(full round-trip) and your server handles only one request at a time (I'm not
sure how often this is the case, but I wouldn't be surprised if it holds
within sessions for a significant number of sites). Then the link the user
clicked would actually be loaded last, after all the others - this could
drastically increase loading time.

The weak spot in this reasoning is the assumption that your server won't
handle these requests in parallel. Unfortunately I'm not experienced enough
to know whether that happens or not, but if so, you should be careful and not
assume that the additional server load is the only downside (which is likely
a negligible one).

~~~
bherms
It actually cancels the previous request when you hover over another link

~~~
collinmanderson
A client-side cancel doesn’t necessarily translate to a server-side cancel.

I used to use a preload-on-hover trick like this but decided to remove it once
we started getting a lot of traffic. I was afraid I’d overload the server.

~~~
x15
I'd also hesitate wasting resources in such a way.

About your first statement though, which server software do you use that
still sends data after the client has closed the connection? Doesn't it use
heartbeats based on ACKs?

~~~
hombre_fatal
The server is still doing all of the work in its request handlers regardless
of whether the client closed the connection.

~~~
jsjohnst
Not if the server is set up correctly.

~~~
hombre_fatal
That doesn't make sense. You can't just "config a server" to do this. Even if
a web framework tried to do this for you, it would add overhead to short
queries, so it wouldn't be some universal drop-in "correct" answer.

Closing a connection to Postgres from the client doesn't even stop execution.

~~~
jsjohnst
> You can't just "config a server" to do this.

Unless you are focusing on the word _server_ and assuming that has nothing to
do with the framework/code/etc, then I can assure you it can be done. I’ve
done it multiple times for reasons similar to this situation. I profiled
extensively, so I definitely know what work was done after client disconnect.

Many frameworks provide hooks for “client disconnect”. If you set up your
_environment_ (a more appropriate term than _server_, admittedly) fully and
properly, which isn’t something most do, you can definitely cancel a majority
(if not all, depending on timing) of the “work” being done on a request.

> Closing a connection to Postgres from the client doesn't even stop
> execution.

There are multiple ways to do this. If your DB library exposes no methods to
do it, there is always:

pg_cancel_backend() [0]

If you are using Java and JDBC, there is:

java.sql.PreparedStatement.cancel()

Which _does_ cancel the running query.

If you are using Psycopg2 in Python, you’d call _cancel()_ on the connection
object (assuming you were in an async or threaded setting).

So yes, with a bunch of extra overhead in handler code, you could most
definitely cancel DB queries in progress when a client disconnects.

[0] [http://www.postgresql.org/docs/8.2/static/functions-
admin.ht...](http://www.postgresql.org/docs/8.2/static/functions-admin.html)
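As a sketch of what that wiring can look like in application code (illustrative only — `expensiveWork` is a stand-in for a handler's query loop, and real database cancellation would use the mechanisms listed above):

```javascript
// The handler's expensive work polls an AbortSignal between steps,
// so aborting on client disconnect stops further wasted effort.
function expensiveWork(signal, steps) {
  const results = [];
  for (let i = 0; i < steps; i++) {
    if (signal.aborted) throw new Error('cancelled'); // client went away
    results.push(i * i); // stand-in for one unit of query work
  }
  return results;
}

// In a Node HTTP server you would tie the signal to the request, e.g.:
//   const controller = new AbortController();
//   req.on('close', () => controller.abort());
//   expensiveWork(controller.signal, 1000);
```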

------
rurcliped
Many people browse the web from an employer who has rules about what types of
pages may be accessed. For example, a person applying for a job with my team
may include a link to a web page about their job-related background --
portfolio.html or whatever. HR tells us to be sure we don't follow links to
any other page that may be more personal in nature, such as a page that
reveals the applicant's marital status (which can't legally be considered in
hiring decisions here). HR doesn't want to deal with complications from cases
where we reject an applicant but there's a log entry showing a visit to, say,
family.html from our company's IP address block. We'd prefer that prefetching
isn't a default.

There's also log analysis to identify the set of web pages visited by each
employee during work hours, and an attempt to programmatically estimate the
amount of non-work-related web browsing. This feeds into decisions about
promotions/termination/etc. Prefetching won't get anyone automatically fired,
but we'd still prefer it isn't a default.

~~~
hombre_fatal
Jesus. I hope they pay you well for that.

I've heard a lot of stories of ridiculous rule-by-HR culture, but that's so
extreme it sounds made up.

~~~
nextos
I don't think it's made up, because I experienced the same thing in a pretty
well-known European research center...

Of course, had I known about these practices in advance, I would have declined
the job offer. But I didn't. I ended up quitting a few weeks later anyway.

IT would monitor all connections from all employees and send a report to upper
management with summary statistics, on a monthly basis.

I was told this was the case by a fellow worker on my second day there, so I
tunneled my traffic through my home server via SSH. When IT asked me why I
had zero HTTP requests, I reminded them that monitoring employees' traffic
was illegal under our legislation. Doing this in a university-like non-profit
research center is hard to justify.

~~~
lucasverra
So they asked you why you were surfing the web over an insecure protocol that
could compromise internal data?

Couldn't you just say "I just don't use HTTP anymore because this company's
data is very valuable to me"?

------
rprime
Is this a re-rebranding? I remember using something similar 4-5 years ago
(instant.js/instantclick).

But quite an interesting little thing, especially useful for older websites to
bring some life into them. The effect is very noticeable.

~~~
dieulot
Kind of. It’s different from InstantClick in that it uses <link
rel="prefetch"> and can thus be embedded in one minute, while InstantClick
transforms a site into an SPA and requires additional work.

It’s a different product. The initially planned name for it was “InstantClick
Lite”.

~~~
rprime
Oh interesting, I'll give this a go.

------
jypepin
This is a feature available in most web frameworks today (for example, Link's
prefetch in Next.js), but it could still be very useful for smaller websites
and other static pages not using such frameworks.

I'd be a little wary of using a script from an unknown person without being
able to look at the code - I'd rather see this open source before using it.
Especially being free and MIT-licensed, I don't see why it wouldn't be open.

~~~
devinl
In the technical details, he has a link to the source on GitHub. Here's the
JS that's actually doing the preloading:
[https://github.com/instantpage/instant.page/blob/master/inst...](https://github.com/instantpage/instant.page/blob/master/instantpage.js)

~~~
jypepin
I stand corrected then, thanks for sharing :) I missed that part!

------
saagarjha
Perhaps this is something that a browser should be doing, instead of websites
themselves?

~~~
hunter2_
The heuristics needed to exclude logout links and the like would be very
error-prone. Those decisions need to be in the website author's hands.

However, I think if browsers had this, but off by default until seeing tags to
enable it along with any exclusions, that would be great.

~~~
chrisseaton
I think it would only prefetch GET links, which never have side effects.

~~~
hombre_fatal
There's nothing stopping GET requests from having side effects.

It's like pointing to a list of best practices and saying "everyone surely
follows these."

For example, someone changed their signature to `[img]/logout.php[/img]` on a
forum I posted on as a kid and caused chaos. The mods couldn't remove it
because, on the page that lets you modify a user's signature, it shows a
signature preview. Good times.

~~~
arendtio
I think it was a joke, as GET requests are not supposed to change anything,
but often they do (probably because many devs don't know about, understand,
or respect the RESTful concept).

EDIT: For completeness, I have to add that I am also part of the group of
people who have violated that concept. Maybe neither frequently nor recently,
but I did it too :-/

~~~
chrisseaton
> understand or respect the RESTful concept

It's nothing to do with REST. It's part of the HTTP spec and has always been,
that "GET and HEAD methods should never have the significance of taking an
action other than retrieval".

~~~
arendtio
Well, if I am not mistaken, REST is just the articulated concept on which
HTTP was built. So yes, the HTTP spec (probably) existed before REST became a
term itself, but in the end there is no reason to argue over whether REST
defines it or HTTP does.

------
benologist
Embedding via // without explicit SSL should probably be considered harmful
or malicious, as there is no reason to make such scripts available without
SSL. Even if the end website is not using SSL, users can still fetch your
script securely.

~~~
dieulot
There’s no security gain from going to HTTPS if the site is served over HTTP,
but there’s a small speed hit.

~~~
benologist
The communication between the user and example.com when downloading the page
referring to your script is secured by their SSL, if they have it.

Separately from that, the communication between the user and your server when
downloading your script is secured by your SSL. This can be secure even if
example.com is not, so it should only ever be served securely.

~~~
esrauch
If the first HTML load isn't over SSL and someone is able to intercept your
traffic, they can change the embedded HTTPS URL to a non-HTTPS URL anyway, so
I can't even imagine the attack that is prevented by using HTTPS for
something loaded over HTTP.

~~~
benologist
Absolutely correct. But this is the website owner's problem and their
consequences for not using SSL. You can't help or prevent this because it's
not your server, it's not your fault they enabled insecure communication that
can be exploited.

When you forgo SSL on your own server someone can also intercept your script
in exactly the same way, they don't need to hack the website embedding your
script. Now they are your consequences, your fault there's no SSL, and your
problem may be affecting everyone who embedded your script insecurely.

------
ams6110
Not sure how I feel about this. I often hover over a link to see where it is
linking to, see if it has a title, etc. But that's probably not typical of
most users. And I don't do it on sites I use often and am familiar with.

~~~
jolmg
I feel this fails a user expectation that simply hovering over a link doesn't
inform the server of anything.

It's curious you mention checking where links lead, because I think that's
another failed user expectation. The URL that appears in the status bar below
(or what was once a status bar) is not necessarily the link's true
destination. Go to any Google search results page, hover over the links in
the results, and compare with the href attributes in the <a> tags. They're
different. It looks like you'd be going directly to the page at that URL, but
you're actually first going to Google, which then redirects you to the URL
you saw.

It used to be that checking the URL in the status bar let you make sure the
link really would take you where the text implied, but that's no longer the
case. It seems one can easily make a link that looks like it goes to your
bank and instead takes you to a phishing page.

~~~
asdfasgasdgasdg
> I feel this fails a user expectation that simply hovering over a link
> doesn't inform the server of anything.

I would bet 99%+ of web users do not have a sufficiently detailed mental model
of web pages that this is something they've decided one way or the other.

~~~
breck
Agreed, my expectations are the opposite.

Google Analytics et al. allow custom events, which are used to record
mouseovers, clicks, et cetera on a majority of websites. I always just assume
everything I do, down to page scrolls and mouse movements, is recorded.

~~~
thecatspaw
Yes, and people block those for privacy reasons.

------
muppetman
I'm curious how this is better than Google quicklink
([https://github.com/GoogleChromeLabs/quicklink](https://github.com/GoogleChromeLabs/quicklink))
which is something I have active on my site currently. Can someone with more
technical knowledge point out which of these two "instant pages" solutions is
better?

~~~
dieulot
Same preloading technique, but quicklink preloads more aggressively.

------
ahnick
Why use this script as an include from the instant.page domain? I think if I'm
going to use this I'm just going to serve this script up myself from my own
servers.

~~~
nicolashahn
Good call, I just switched my site to hosting the script itself.

------
6c696e7578
Nice idea for HTTP/1.x; however, isn't this what HTTP/2 server push [1] is
meant to achieve by pushing components at the user?

1:
[https://en.wikipedia.org/wiki/HTTP/2_Server_Push](https://en.wikipedia.org/wiki/HTTP/2_Server_Push)

~~~
wldlyinaccurate
The main difference being that instant.page respects users' data allowances by
prefetching only resources that it thinks the user intends to load. You could
combine it with H2 push and/or prefetch response headers to improve the load
times even more :)

~~~
odensc
It probably respects their data allowances even less, considering it
completely re-fetches the page every time you hover over the link.

------
alpb
I also staged it on my blog and it works great:
[http://staging.ahmet.im/blog/index.html](http://staging.ahmet.im/blog/index.html)
. I wonder if CMS tools or static site generators (Hugo, Jekyll, Pelican,
etc.) should have an option for rel="preload". But I guess it still requires
some JS for preload-on-hover, so is this library going to take off now?

------
hokus
Great stuff, I started doing this in 2006 but manually. I made an unofficial
google toolbar for Opera[1] that (in the unreleased final version lol) also
loaded the images from the search pages when one hovered over the toolbar
icons.

It took a lot of tweaking to give it the right feel. IMHO it shouldn't start
fetching too fast, in case the mouse is only moving over the link. Loading
too many assets at the same time is also bad. Some should preload with a
delay, and hovering over a different link should discontinue preloading the
previous assets.

Perhaps there is room for a paid version that crawls the linked pages a few
times and preloads static assets. Who knows, perhaps you could load css, js
and json as well as images and icons.

Or (to make it truly magical) make the amount of preloading depend on how
slowly a round trip resolves. If loading favicon.ico (from the user's
location) takes 2+ seconds, the HTML probably won't arrive any time soon.

Fun stuff, keep up the good work.

[1]-
[http://web.archive.org/web/20130329183223/http://widgets.ope...](http://web.archive.org/web/20130329183223/http://widgets.opera.com/widget/4282/)

------
chatmasta
Will this cause problems with "links" that are entirely rendered on the client
side? i.e. using something like react-router... In that case, could it result
in the react app setting invalid state because it thinks the user is on a page
when it's not?

My guess is that "prefetching" a react-router link would just fetch the JS
bundle all over again for no gain.

------
tgb
I'm just surprised at how slow my hover-to-click time was (never got it below
100ms). Thought it would be <50ms for sure when trying hard.

~~~
hombre_fatal
I had the same reaction. On a trackpad, I take a casual 300ms to click the
damn link.

------
h1d
[https://reactjs.org/](https://reactjs.org/) does it pretty well too.

~~~
kylemathews
Yeah, it's built with Gatsby which has this sort of behavior baked in
[https://www.gatsbyjs.org/](https://www.gatsbyjs.org/)

------
dwheeler
This is cool, and the license shown at
[https://instant.page/license](https://instant.page/license) is the well-known
MIT license, already known to be an open source software license
[https://opensource.org/licenses/MIT](https://opensource.org/licenses/MIT)

A problem with the loading instructions is that they reveal, to an unrelated
site, every single time any user loads a page on the site that is doing the
preloading. That is terrible for privacy. Yes, that's also true for Google
Analytics and the way many people load fonts, but it's also unnecessary. I'd
copy this onto my own server, to better provide privacy for my users.
Thankfully, since this is open source software, that is easily done. Bravo!

------
elliotec
If you like to hyper-optimize your site like me, and since it doesn't do any
good on mobile (edit: apparently it works on mobile, ignore this), you can
have it selectively load the script on desktop and save a few bytes like
this:

    
    
        <script type="text/javascript">
          if (screen.width > 768) {
            let script = document.createElement('script');
            script.src = '//instant.page/1.0.0';
            script.type = 'module';
            script.integrity = 'sha384-6w2SekMzCkuMQ9sEbq0cLviD/yR2HfA/+ekmKiBnFlsoSvb/VmQFSi/umVShadQI';
            document.write(script.outerHTML);
          }
        </script>

~~~
eridius
The site claims it works on mobile

> _On mobile, a user starts touching their display before releasing it,
> leaving on average 90 ms to preload the page._

------
rhacker
It does mean we're trusting your service - we're executing your JS on our
sites. But I do like that you're putting in the SHA hash so that we know
you're not fudging it.

Just found that you have the source available too :) So overall, this is
pretty cool.

------
chiefalchemist
I looked (quickly) through most of the comments below and couldn't answer
these questions:

1) What, if anything, is the downside here?

2) Are (Google) analytics affected by the prefetch? That is, does a prefetch
get counted as a page visit if the link that triggered it is never actually
clicked?

Tia

~~~
dieulot
The downside is that your pages’ HTML is loaded about twice as much, which
makes for additional load on your server.

Client-side analytics like GA aren’t affected.

~~~
Semaphor
> The downside is that your pages’ HTML is loaded about twice as much

How? It would load the HTML just as often and not download it a second time as
that would invalidate the usage

> this makes for additional load on your server.

True for users who hover over links they decide not to visit

Or am I misunderstanding something here?

~~~
Zarel
> How? It would load the HTML just as often and not download it a second time
> as that would invalidate the usage

I'm guessing you didn't read the linked article? It preloads after 65ms on
hover, at which point it estimates a 50% chance that the user will click.
Hence "loaded twice as much".

> True for users who hover over links they decide not to visit

Yes, that's the point.

------
ThomPete
Is this kind of like turbolinks which Basecamp uses?

~~~
glacials
Similar in effect, but not in method. Turbolinks fetches pages after a click
like normal, but swaps the body tag from the new page into the current page,
cutting local render times.

~~~
Savageman
Would it make sense to combine them? Instant turbo links.

~~~
sfusato
Yes, it would make lots of sense. If Turbolinks adds a 'prefetch' mechanism,
it will get even faster.

------
mars4rp
This is a very nice idea, but isn't the problem the initial load time most of
the time? How could we solve that?

Does this work for all the outgoing links as well? I don't want to improve
other sites' rendering time at the expense of my own.

Very cool regardless.

~~~
s3krit
>but isn't the problem the initial load time most of the time? How could we
solve that?

Off the top of my head, a good way seems to be to write better sites that
don't include 10mb of javascript libraries.

------
randlet
Works very well for me.

dieulot, is there a small bug with the allowQueryString check?

    
    
        const allowQueryString = 'instantAllowQueryString' in document.body.dataset
    

I think it should be:

    
    
        const allowQueryString = 'instantallowquerystring' in document.body.dataset
    
    

If I have:

    
    
        <body  data-instantAllowQueryString="foo">
    

then

    
    
        'instantAllowQueryString' in document.body.dataset === false
    

and

    
    
        'instantallowquerystring' in document.body.dataset === true
    

because HTML data attributes get converted to lowercase by the browser (I
think).

~~~
dieulot
Uppercase is converted to added dashes.
`document.body.dataset.instantAllowQueryString` corresponds to `data-instant-
allow-query-string`.
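The browser's mapping can be sketched as a small function (an illustration of the rule, not code from instant.page):

```javascript
// data-* attribute names map to dataset keys by dropping the "data-"
// prefix and converting each "-x" to an uppercase "X".
function attrToDatasetKey(attr) {
  return attr
    .replace(/^data-/, '')
    .replace(/-([a-z])/g, (_, c) => c.toUpperCase());
}
```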

~~~
randlet
Cool thanks! Didn't know that.

------
sfusato
What would be the best way to integrate this in an app that already uses
Turbolinks?

~~~
ksec
Exactly my thought. Maybe even better would be to have it included in the
next version of Turbolinks.

------
lucb1e
It doesn't seem to work for me. There are no JavaScript errors in the console*
but the <button> that says "test your clicking speed" doesn't even have an
event attached to it. Hovering over anything for multiple seconds doesn't fire
any requests in the network panel.

Anyone else having this issue?

* Well, there's this, but I assume it's not dependent on Google... `Loading failed for the <script> with source "https://www.googletagmanager.com/gtag/js?id=UA-134140884-1"`

~~~
dieulot
Are you using an older browser? It sounds like your browser doesn’t support JS
modules.

~~~
lucb1e
I... never heard of JavaScript modules. I feel very out of date now. Anyway,
yeah, I'm using the latest Firefox that still supports real add-ons, some of
which aren't even possible to reimplement using the newest APIs in Nightly.

~~~
itmeyou
I don't know exactly what version that was, but you may be able to use
about:config to enable JS modules.

------
hunter2_
I don't know that the presence of a query string is a great indicator of
whether the link causes state change (cue the typical GET immutability
arguments, etc.).

For example, in Drupal every path (whether or not it causes state change) has
2 forms: "/?q=path/to/page" (when you don't have access to .htaccess or .conf)
and "/path/to/page" (when you do, and you enable clean URLs).

~~~
hopler
Back when standards mattered, state changes were only in PUT, POST, and
DELETE.
------
codingdave
The downside to this would be hovering over a document list, where you might
pass by 15 records, GETting them all, until you click the one you really
wanted.

But it is a clever idea. Applied carefully, it could give the impression of a
speedier site. Of course, I see no reason to need a 3rd party for this...
updating event handlers to operate this way shouldn't be outside the abilities
of most web developers.

------
kieranhunt
I wrote a very minimal GreaseMonkey script that adds this to every page:
[https://github.com/KieranHunt/instant.page/raw/master/instan...](https://github.com/KieranHunt/instant.page/raw/master/instant.page.user.js)

------
blattimwind
“49 ms from hover to click” -- I guess I'm not exactly the target audience...

However, I find it very good that the posted snippets include SRI, which
sadly almost every script CDN omits to this day. The code is also small
enough to just include in projects, which avoids the external request
entirely.

~~~
thecatspaw
I thought that as well, but then I realized that I was literally moving my
cursor over the button and clicking as fast as I could. That's not how I
usually browse the web. I usually move my cursor to where I am reading, which
includes links, so I am hovering over a link before I have decided to click
it.

~~~
blattimwind
I usually move the cursor out of the way, because I dislike it covering text
or images. So when I do click on something I actually move over there and
click immediately. How fast that goes is pretty much the textbook case of
Fitts' law.

------
ayoisaiah
Wow this looks really great. Just tried it on my website[0], and page loads
are pretty much instantaneous. However, it doesn't seem to work in Firefox 67
(Nightly). Does that mean it only works in Chrome?

[0]: [https://freshman.tech](https://freshman.tech)

~~~
dieulot
Firefox has a bug that makes it redownload the page if that page is not
cached.

~~~
mkl
If the page is not cached, where is the browser going to get it except by
redownloading?

------
lol768
I guess nobody has mentioned this yet, but it presumably doesn't work on
mobile?

I wonder if you could do something similar simply by looking at the viewport
and loading all the links within the currently visible part of the page. Might
be overkill though and end up wasting the user's data.

~~~
CGamesPlay
From the article:

> On mobile, a user starts touching their display before releasing it, leaving
> on average 90 ms to preload the page.
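
The idea can be sketched as picking the earliest "intent" signal each device class offers; the helper name below is illustrative, not instant.page's actual code.

```javascript
// Choose the earliest user-intent signal available. On touch devices a tap
// begins with touchstart, which (per the article) fires on average 90 ms
// before the click that the touch release triggers. On pointer devices,
// mouseover precedes the click instead.
function intentEvent(hasTouch) {
  return hasTouch ? 'touchstart' : 'mouseover';
}

// In a browser, the wiring would look something like:
// document.addEventListener(intentEvent('ontouchstart' in window), preloadHandler);
```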

~~~
lol768
Not sure how I missed that, thanks :)

------
PaulHoule
A "1% improvement in conversion rate" is a big claim to make for a claim that
seems so small.

That is, if you wanted to prove that "X is better than Y by 1%" you would need
a sample approaching 10,000 attempted conversions to have a hope of having a
good enough sample.

~~~
tonmoy
And I assume that’s why the author has put 4 references to that claim

------
demarq
I wish there were more examples on that page though. Where can I test this
being used extensively?

------
akras14
Sounds too good to be true, but also brilliant. Curious what others think.
What are the downsides?

~~~
ddebernardy
Assuming you use the script as suggested, letting a 3rd party site know your
stats (and users) sounds like a non-trivial downside.

I'd surmise the author is benevolent, but if this were to be turned into a
business, some kind of data play seems like the trivial next step.

~~~
shaklee3
As other comments have pointed out, you can run the js locally from your site
instead.

------
NPMaxwell
Where does their code get inserted? I could imagine it might help if it were
added to the page you were linking from, but the site seems to indicate the
code gets added to page you want to have load quickly -- the page you're
linking to.

~~~
dieulot
On the pages you’re linking from.

------
ian0
It would be great to deploy this on hacker news itself, at least on the "see
comments" link of the main page. Page-loading is already blazingly fast, but
my latency is still a bit high as I use a mobile internet connection.

~~~
neillyons
You could try injecting the script yourself using a Chrome extension.
[https://chrome.google.com/webstore/detail/custom-javascript-...](https://chrome.google.com/webstore/detail/custom-javascript-for-web/poakhlngfciodnhlhhgnaaelnpjljija?hl=en)

------
technotarek
Make it opt-in (vs opt-out)? As best I can tell, there is only the blacklist
functionality (via a markup attribute). I can imagine many use cases where it
would be far easier to only activate where a link has .instant-page.
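
An opt-in variant along these lines would be a small tweak: instead of prefetching every link unless it carries `data-no-instant`, only act on links that explicitly opt in. The `.instant-page` class here is the commenter's suggestion, not something instant.page actually supports.

```javascript
// Hypothetical whitelist mode: only links carrying the .instant-page class
// are eligible for prefetching. `link` needs classList/hasAttribute behavior.
function isOptedIn(link) {
  return link.classList.contains('instant-page') &&
         !link.hasAttribute('data-no-instant');
}

// Browser-only wiring; guarded so the predicate stays testable elsewhere.
if (typeof document !== 'undefined') {
  document.addEventListener('mouseover', (e) => {
    const a = e.target.closest && e.target.closest('a.instant-page[href]');
    if (!a || !isOptedIn(a)) return;
    const link = document.createElement('link');
    link.rel = 'prefetch';
    link.href = a.href;
    document.head.appendChild(link);
  });
}
```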

------
desertrider12
I thought chrome already did this. Somebody else here pointed out that's
impossible in general because even regular GET requests can have side effects.
I wonder what it's doing with 1.6 GB of memory then...

------
broth
I’m wondering if a possible use case for this is warming up app servers when
doing deployments.

Enumerate through a list of pages on your site and use something like
Puppeteer to simulate hovering over links on each page.

~~~
thijsvandien
Why would you simulate hovering over links if you can just visit those URLs
directly? If you want all links anyway, you lose the real optimization here,
which is not the preloading itself, but limiting that to the (small) subset
that a particular user seems interested in.

------
the_arun
This is a good optimization. However, it is like sweeping the problem under
the carpet: it hides the real issue. I would rather fix the root cause of
slow-rendering pages first, and then use this trick to make things even better.

~~~
Zelphyr
I think it _could_ be used that way. If used properly, however, you’d make
sure all your other ducks are in a row before using something like this as
icing on the cake, so to speak.

But you make an important point.

------
SkyLinx
I have just added it to a Shopify store to try it and it does indeed speed
things up! However I am a little concerned about adding a script from another
website... it requires a lot of trust, doesn't it?

~~~
nicolashahn
As another comment mentioned, you can just host the script on your own site.

~~~
SkyLinx
Of course... I was confused by the Cloudflare Workers thing. I just added
the code directly to the website and it works fine. Thanks.

------
amelius
This should be implemented at the browser level, not in a webpage.

------
joepour
This is awesome! Added it to all the public pages of
[https://tinytracker.co](https://tinytracker.co) - thank you, @dieulot :)

------
neillyons
This is really impressive! I noticed the script uses `const` and `let` which
might cause javascript errors in older browsers
([https://caniuse.com/#search=let](https://caniuse.com/#search=let)) so I ran
it through the Google Closure Compiler to compile it down to ES3, and it works
great. [https://closure-compiler.appspot.com/home](https://closure-compiler.appspot.com/home)

I've added it to my blog and a Django side project. Really speeds up page
loads. Just need to add `data-no-instant` attribute to the logout link.

~~~
dieulot
It’s loaded as a module so older browsers won’t execute it (and thus won’t
choke on the modern syntax).
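
For reference, this relies on a standard behavior of module scripts: browsers that predate ES modules (and therefore `const`/`let`) skip `type="module"` scripts entirely, while module-aware browsers ignore `nomodule` scripts. A generic sketch of the pattern, with illustrative filenames (instant.page's real snippet also includes `integrity` for SRI):

```html
<!-- Old browsers skip type="module" entirely, so modern syntax is safe here. -->
<script src="instantpage.js" type="module"></script>
<!-- Optional: a transpiled ES5 build could be served to legacy browsers,
     which module-aware browsers will ignore thanks to nomodule. -->
<script src="instantpage-legacy.js" nomodule defer></script>
```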

~~~
neillyons
Oh. Interesting. I didn't know about modules.

------
leesalminen
Looks very good! Love the landing page.

Testing on my dev box now. I’ll be rolling this out to a subset of users next
week. Cool stuff!

------
Wowfunhappy
Can this be easily tweaked to preload images as well, if desired? Or is their
exclusion inherent in the method used?

~~~
calibas
As far as I can tell it just downloads the page. Nothing is parsed until you
click the link, so there really isn't a reliable way of telling which images
to load until then.

You could write something to parse the page and download images, but I don't
recommend it. You risk a significant performance hit for the client, and a lot
of wasted bandwidth for both client and server.

------
huhtenberg
Does anyone know how using <link href=...> (this method) compares to using a
hidden <iframe src=...> ?

------
bigbadgoose
Just set this up on guidevinetech.com, injected via tag manager. Works a
treat, pages are definitely _fast_

------
hartator
Is there a demo somewhere? Besides the actual site, which seems really well
optimized already.

------
fouric
I wonder if you could use Greasemonkey to inject this into pages client-
side...

------
alexpetralia
This sounds like a computer processor's branch prediction algorithm.

~~~
thijsvandien
Let's wait and see what creative ways will be found to exploit this
optimization as well. :)

------
faitswulff
Reminds me of this optimization by Netflix: [https://medium.com/dev-channel/a-netflix-web-performance-cas...](https://medium.com/dev-channel/a-netflix-web-performance-case-study-c0bcde26a9d9#1b0c)

The talk
([https://www.youtube.com/watch?v=V8oTJ8OZ5S0](https://www.youtube.com/watch?v=V8oTJ8OZ5S0))
was one of the most watchable performance optimization talks I've seen.

TLDR - they used a combination of the link prefetch technique, which works for
HTML but is not fully supported by all browsers, and XHR prefetching, which
works for prefetching Javascript and CSS.
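
A rough sketch of that combination, with the fetch function injectable so the fallback logic can be exercised outside a browser. The names and feature test are illustrative, not Netflix's actual code, and the fetch fallback assumes the responses are cacheable.

```javascript
// Feature-test whether the browser honors <link rel="prefetch">.
function supportsPrefetch(doc) {
  try {
    const link = doc.createElement('link');
    return !!(link.relList && link.relList.supports && link.relList.supports('prefetch'));
  } catch {
    return false;
  }
}

// Prefetch each URL via a <link rel="prefetch"> tag where supported,
// otherwise fall back to a fire-and-forget fetch that warms the HTTP cache.
// Returns which mechanism was used per URL.
function prefetchAll(urls, { doc, fetchFn }) {
  const used = [];
  for (const url of urls) {
    if (doc && supportsPrefetch(doc)) {
      const link = doc.createElement('link');
      link.rel = 'prefetch';
      link.href = url;
      doc.head.appendChild(link);
      used.push('link');
    } else {
      fetchFn(url, { credentials: 'same-origin' });
      used.push('fetch');
    }
  }
  return used;
}
```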

------
geophertz
Could this be made as a browser extension?

------
gsich
Why not make the page itself faster?

[https://forum.dlang.org/](https://forum.dlang.org/)

click on subforums and threads.

------
homero
Can we see the Cloudflare Worker?

~~~
dieulot
I plan to release it later on.

------
amelius
Probably doesn't work with an agent that doesn't support hover events. Such as
on tablets and smartphones.

~~~
hn_throwaway_99
Did you read the page? It uses touchstart events on mobile.

~~~
DenisM
This comment is against the guidelines:
[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

> "Did you even read the article? It mentions that" can be shortened to "The
> article mentions that."

~~~
hn_throwaway_99
Thanks for the callout, my apologies.

------
mraza007
Can I use this for static pages?

~~~
dieulot
Yes.

------
hartator
Also, how can this be free?

I thought Cloudflare charged for Workers based on the number of requests.

~~~
dieulot
Cloudflare offers a free Pro plan to open source projects; I asked for Workers
instead.

~~~
hartator
Awesome.

------
ryukoposting
...I just keep my website lightweight, so I don't need to preload stuff...

------
xivzgrev
Fascinating!

------
papaman
Live demo on my site as well @
[https://pokatheme.com/](https://pokatheme.com/)

------
papaman
what kind of sorcery is this?

------
harshulpandav
How does this work on devices with touch screen?

(My apologies if this question has already been raised)

~~~
avip
(Your apology was rejected as it's clearly stated in the ultra short OP)

It prefetches on touch event while a "click" is normally triggered by a touch
release.

~~~
msla
> It prefetches on touch event while a "click" is normally triggered by a
> touch release.

So now I can't even touch my screen without being taken somewhere else,
because I don't know where all the active areas are.

------
superkuh
I wouldn't want to expose my various sites' users to third party code
execution like this.

