
Instant.page - vinnyglennon
https://instant.page/
======
CGamesPlay
> instant.page’s script is hosted serverless with Cloudflare Workers so that
> there’s no server to hack into.

There are a lot of reasons that this software is "safe", but this absolutely
isn't one of them, and it highlights to me as a potential user that you
haven't thought through your threat model. The subresource integrity line
should absolutely be the headline here.
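
For reference, the subresource-integrity approach looks like this (sketch; the hash value is a placeholder, not instant.page's real one — compute it from the exact file you serve):

```html
<!-- The integrity value below is a placeholder; generate the real one from
     the exact script contents, e.g. with
     `openssl dgst -sha384 -binary | openssl base64 -A` -->
<script src="//instant.page/5.1.0" type="module"
        integrity="sha384-PLACEHOLDER"></script>
```

With a valid hash, the browser refuses to run the script if the served bytes differ from what the hash was computed over, so a compromised host can't silently change the code.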

~~~
forty
Haha, this is really good :) I was never fond of this "serverless" naming,
precisely because it could lead people to believe that there were actually no
servers, but I had never seen such a good example of it :)

Oh, and guess what: not only are there servers, but other people can run code
on the same server your code is running on. Doesn't feel as safe suddenly ;)

Alternative phrasing: "This is hosted on Cloudflare Workers, and Cloudflare
engineers are probably doing a better job running my code securely than I
would"

~~~
waterhouse
> I was never fond of this "serverless" naming, precisely because it could
> lead people to believe that there were actually no servers

Agreed. I think "server-agnostic" or "server-oblivious" would have been a
better name.

------
chrismorgan
I think this library has gotten steadily worse over time. After 1.2.2, each
new release just introduces more configurability that >99.99% of sites won’t
want, more complications, more bugs and more misfeatures. Most notably version
5 introduced triggering load on mouse down, which I call _catastrophically_
wrong, 100% dealbreaker with prejudice. 5.1.0 made it opt-in, but it’s still
functionality that just shouldn’t exist.

I golfed 1.2.2 down a long way, removing other configurability and
functionality that I didn’t want; see
[https://news.ycombinator.com/item?id=23204741](https://news.ycombinator.com/item?id=23204741),
part of the discussion a few months ago at
[https://news.ycombinator.com/item?id=23203658](https://news.ycombinator.com/item?id=23203658)
(where there was much wringing of hands about the whole load-on-mousedown
mess). 5.1.0 is 2,842 bytes, 1,169 gzipped; my version got to 981 bytes, 532
gzipped. At that point it’s almost certainly cheaper to inline it than to have
it as an external script (though if your CSP has a script-src excluding
'unsafe-inline', you'll need to give it a nonce or a SHA hash if you want to
inline it).
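
Generating the CSP hash for an inlined script can be done like this (a sketch; the script body here is a stand-in for the actual inlined source):

```shell
# Hash the inline script's exact contents (no surrounding <script> tags).
# 'console.log("hi")' is a stand-in for the real inlined script body.
printf '%s' 'console.log("hi")' \
  | openssl dgst -sha256 -binary \
  | openssl base64 -A
# Allow it with a header like:
#   Content-Security-Policy: script-src 'sha256-<output above>'
```

Note that the hash must be recomputed whenever the inlined script changes by even one byte, which is one argument for a nonce instead.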

~~~
pimterry
> version 5 introduced triggering load on mouse down, which I call
> catastrophically wrong, 100% dealbreaker with prejudice.

Could you provide more context? Why is that catastrophically wrong?

~~~
chrismorgan
See some of the discussion on
[https://news.ycombinator.com/item?id=23203658](https://news.ycombinator.com/item?id=23203658).
It breaks how links work normally.

~~~
celsoazevedo
[https://instant.page/changelog](https://instant.page/changelog)

I agree it was a bad decision to do that by default, but it's opt-in since
5.1: "Make triggering clicks on mousedown opt-in".

~~~
chrismorgan
Please reread my top-level comment.

------
jwr
I think rather than waste bandwidth and fight for those last 80ms, we should
fix sites that take multiple seconds to load. Especially after the first load
— subsequent pages should load instantly.

In general, it's 2020, and if every click on your site takes 5 seconds,
something is very wrong.

~~~
onion2k
_subsequent pages should load instantly_

Every page should load fast, obviously, but _instantly_ means the data has to
be prefetched and that's potentially a lot of data to download that the user
might never actually need. That's a waste of everyone's bandwidth, especially
on mobile. Pages that a user _will_ view (eg first page behind a login), or
that they're very likely to view (eg top result in a search), should be
prefetched/prerendered but whether you should prefetch more than that requires
thought and measurement.

~~~
esperent
As far as I know this only preloads HTML. For the majority of sites, that's
not going to cause a huge increase in data use. Most pages' HTML is only a
couple of KB.

~~~
onion2k
Taking this story on HN as an example, the HTML is about 20 KB right now. If
you prefetched every story on the front page, that would be 600 KB.

And if you read HN as much as I do, that could be gigabytes every day.

~~~
esperent
From my understanding that's not how this tool works (admittedly it's a while
since I examined it deeply). It waits for a mouse hover over a link and then
preloads that link only. Unless it's been terribly abused, there's no way it
should preload all the links on a page.

~~~
porker
You are correct about how it works.

------
compumike
We use this, with

    
    
        <body data-instant-intensity="10">
    

on
[https://ultimateelectronicsbook.com/](https://ultimateelectronicsbook.com/)

It's already a static Jekyll-compiled site served from a CDN, but combined
with other techniques, instant.page makes the site quite snappy. (We had the
EasyList blocking problem described below, so we started bundling it into our
own post-load JS bundle.)

Honestly, unrelated, but the biggest performance improvement for us has been
to do server-side rendering of the LaTeX equations, rather than using MathJax
client-side.

------
stephenhuey
Two decades ago there was a product whose name escapes me. It was a tool to
speed up web browsing, but instead of predicting which link you were about to
click next, it simply started preloading all the hyperlinks on the page.

~~~
laumars
Didn’t that get people into trouble on corporate networks, because some of
the links it followed were links the user would never have clicked? I vaguely
recall that problem arising in the late ’90s / early ’00s.

~~~
JimDabell
It was a recurring problem in the Rails world because the reaction from DHH
was to tell people the preloaders were "evil" instead of telling people to
follow the HTTP specification.

[http://blog.moertel.com/posts/2005-10-25-google-web-accelerator-vs-unsafe-linking-round-two.html](http://blog.moertel.com/posts/2005-10-25-google-web-accelerator-vs-unsafe-linking-round-two.html)

~~~
rimliu
It was a problem everywhere and in no way specific to RoR. If anything, RoR
was one of the first more-or-less mainstream frameworks to solve a lot of
then-common problems (CSRF, the HTML spec only allowing "GET" and "POST" for
the form method attribute, etc.).

~~~
JimDabell
It wasn’t a problem everywhere. It was only a problem where people used GET in
an unsafe way, which was a minority of sites.

What makes Rails prominent in this was the creator of the framework blamed
Google for their own bugs and told people to try to detect GWA to hide the
links instead of telling people to follow the HTTP specification. People who
followed his advice suffered the bug a second time, and people who ignored him
and followed the HTTP specification avoided the problem.

~~~
laumars
It’s not just a problem with unsafe GET requests. It’s a problem for anyone
unfortunate enough to work at a company that logs web traffic (which was
common practice back in the ’90s / early ’00s) and who happened to stumble on
a site with hyperlinks to warez or porn (which could basically be any site
with user-generated content).

I remember hearing several stories about employees getting reprimanded for
accessing sites they never intended to visit.

I also know of a similar tale from when a national football team had its DNS
hijacked and visitors were served porn instead of sports news (that also
happened around 2000 or so).

------
dang
A few months ago:
[https://news.ycombinator.com/item?id=23203658](https://news.ycombinator.com/item?id=23203658).
(Once a submission has had significant attention, we mark reposts as dupes for
a year or so. See
[https://news.ycombinator.com/newsfaq.html](https://news.ycombinator.com/newsfaq.html).)

Also a big discussion from last year:
[https://news.ycombinator.com/item?id=19122727](https://news.ycombinator.com/item?id=19122727)

------
tiffanyh
This is the successor to [http://instantclick.io](http://instantclick.io) for
those who recall that library.

Some things to note:

\- at its core, this is using PREFETCH

\- Safari at the moment doesn't support link prefetching, so this library
doesn't work for Safari

\- yes, it works on mobile, and arguably people see even better results there
due to the inconsistent latency of mobile networks.

An old HN discussion on the predecessor library:

[https://news.ycombinator.com/item?id=7365204](https://news.ycombinator.com/item?id=7365204)

------
animationwill
> Before a user clicks on a link, they hover their mouse over that link. When
> a user has hovered for 65 ms there is one chance out of two that they will
> click on that link, so instant.page starts preloading at this moment,
> leaving on average over 300 ms for the page to preload.

This sounds great! I immediately wonder how many additional HTTPS requests
will hit your web server fleet. Has anyone using this calculated a percentage
of additional requests vs additional clicks from their http logs?
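
The quoted heuristic boils down to a simple timer check, which can be sketched DOM-free (the names here are illustrative, not instant.page's own; the 65 ms figure is from the quoted passage):

```javascript
// Sketch of the hover-intent rule described above: prefetch once the
// pointer has rested on a link for at least the delay threshold.
const HOVER_DELAY_MS = 65;

function shouldPrefetch(hoverStartMs, nowMs) {
  // True when the pointer has hovered long enough to signal intent.
  return nowMs - hoverStartMs >= HOVER_DELAY_MS;
}

// In a browser you would arm a setTimeout on mouseover and cancel it on
// mouseout; this pure function only captures the threshold decision.
```

Every hover past the threshold that does not end in a click is the "wasted" request the parent comment is asking about, which is why measuring the prefetch-to-click ratio in server logs is a sensible check.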

~~~
HenryBemis
So my browser starts loading a page from a link that I haven't clicked. Does
anyone else see the potential security nightmare in this scenario?

Drive-by downloads, malware, etc. I totally get the benefits of this
mechanism, but only until it starts being abused for tracking and delivering
malware, at which point add-ons will (hopefully) appear to block it.

~~~
RobLach
It's no different from any other JavaScript-enabled browser + web page.

The modern web is pulling in the background all the time.

You'd have to go back decades for 1 click to equal 1 GET.

~~~
theon144
Funnily enough, HN therefore seems to be a time traveler from decades ago :)

~~~
renewiltord
7 requests per page load. No, I don't think so.

~~~
theon144
None of which are javascript-triggered.

------
makeworld
Hmm, my Pi-Hole blocks this. I wonder why.

    
    
        Exact matches for instant.page found in:
         - https://block.energized.pro/blu/formats/hosts.txt 
         - https://v.firebog.net/hosts/AdguardDNS.txt

~~~
dieulot
It’s due to bullshit from EasyList:
[https://github.com/easylist/easylist/issues/4023](https://github.com/easylist/easylist/issues/4023)

~~~
slenk
You said it yourself:

> it’s a theoretical minor violation of privacy

Their goal is to protect privacy. It makes sense to me.

~~~
perryizgr8
Then why doesn't easylist add google.com and facebook.com to their list? I'm
sure G/FB track every single click you do on their pages. Huge privacy
violation. Not even theoretical. Block youtube also, they are tracking your
viewing habits, how much you watched, where you skipped, etc. Just block all
this.

I wish there was a way to easily override individual entries in easylist/ubo.

~~~
monadgonad
Because those are services that people use, consenting (implicitly at least)
to the privacy violation. There's a huge difference between YouTube tracking
my habits to improve my YouTube experience (as well as for marketing reasons)
and a third party tracking them solely for marketing reasons.

------
polytely
Fastest hover-to-click I got was 93 ms; it's really hard to get under 100 ms.
Very curious to see who the fastest clicker on HN is.

~~~
FractalParadigm
The mobile site features a similar button, which measures the time between
screen touch and release. Tapping it casually like I would any other link
netted ~120 ms; the fastest I could physically tap was 20 ms!

That demonstration alone was enough to pique my interest and give this a try.

~~~
perryizgr8
I'm regularly getting 7-15ms. Not tapping particularly fast. Could it have
something to do with the touch refresh rate? I'm on Chrome/GalaxyS10.

~~~
FractalParadigm
Interesting theory. To throw out some 'anecdata' (maybe we can turn this into
a sort of study?): my three-device test (iPhone 6S and Pixel 2 XL at 60 Hz,
12.9" iPad Pro at 120 Hz) would seem to indicate that display refresh rate
does _not_ affect touch latency. _However_, this could be the result of the
large performance differences between the devices used, as the average and
maximum times trend down with the newer/faster devices in testing.

Min/avg/max of 10 taps, as fast as possible:

    
    
        17/26.6/36 -- iPhone 6S
        16/23.7/34 -- Pixel 2 XL
        17/21.7/33 -- 12.9" iPad Pro
    

I would probably conclude overall that my hands are just slower than yours!

------
masswerk
I'm somewhat concerned about this kind of preloading. Given that network
traffic is a huge part, if not the majority, of IT's total energy consumption
(I've seen estimates of about 40% of the total), this adds significant extra
traffic for a gain of a few milliseconds: according to the web page, only 50%
of triggered preloads are actually followed by a page view, which amounts to
a 100% increase in initial traffic.

Mind that reducing your (initial) page weight by a fraction will do the same
for you, and even more.

------
cblconfederate
Even if this is a bandaid solution attacking the wrong problem, it's great to
see people caring for speed for a change.

But isn't it dangerous? How does it know you're not preloading the "please
delete everything" link?

It would be nice to have a standard HTML attribute like preloading="yes" so
that one day browsers could have this as a standard feature. I'd much rather
have this than AMP.

~~~
lewiscollard
> But isn't it dangerous? How does it know you're not preloading the "please
> delete everything" link?

As a sibling comment points out, if it's possible for someone to delete
everything via a link, your users have already lost. Someone could, e.g.,
trick your users into loading an IMG tag or an IFRAME pointing to that page,
or give them a shortened URL that redirects to your "delete everything" page.

GET requests should never have side effects.
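
A minimal illustration of that rule (the paths here are made up):

```html
<!-- Unsafe: any prefetcher, crawler, or <img>/<iframe> trick can trigger
     this, because the server does destructive work on a plain GET -->
<a href="/items/42/delete">Delete</a>

<!-- Safe: the state change happens only on an explicit POST -->
<form method="post" action="/items/42/delete">
  <button type="submit">Delete</button>
</form>
```

Under the second form, a preloader that fetches every link on the page never touches the delete endpoint.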

~~~
cblconfederate
What about clicking an activation link in an email?

------
rado
Beware, this broke my lightbox component, because the images had href.

------
Tepix
_Self-hosting_

Hosting the script yourself is also possible. Download the latest version at
[https://instant.page/5.1.0](https://instant.page/5.1.0), then add a module
script tag just before _</body>_:

    
    
        <script src="instantpage-5.1.0.js" type="module" defer></script>
    

You can also install it via npm: _npm i instant.page_

------
aphroz
I managed to have a 603 ms click on mobile. Who can beat me?

~~~
ffpip
0 :) [https://i.imgur.com/OhmLu5I.jpg](https://i.imgur.com/OhmLu5I.jpg)

~~~
aphroz
The longer the better, the more you will cheat latency ;)

------
helsinkiandrew
How does this relate to the preload spec (<link rel="preload">)? Is it only
preloading when the link is visible/hovered over?

[https://w3c.github.io/preload/](https://w3c.github.io/preload/)
[https://caniuse.com/link-rel-preload](https://caniuse.com/link-rel-preload)

~~~
divbzero
Looks like instant.page at its core uses <link rel=prefetch>: [1]

    
    
      const prefetcher = document.createElement('link')
      prefetcher.rel = 'prefetch'
      prefetcher.href = url
      document.head.appendChild(prefetcher)
    

[1]:
[https://github.com/instantpage/instant.page/blob/v5.1.0/instantpage.js#L230-L233](https://github.com/instantpage/instant.page/blob/v5.1.0/instantpage.js#L230-L233)

------
kabacha
Does shaving off 1% really justify such a hack? The web is already so
extremely implicit that even seasoned web developers are lost in it.

------
beaker52
Pre-loading is a great idea when you're dealing with optimised, cached
content.

Pre-loading with some third-party script as a magic fix for free milliseconds
is something people should think twice about.

------
The_rationalist
How is this different than
[https://github.com/GoogleChromeLabs/quicklink](https://github.com/GoogleChromeLabs/quicklink)?

~~~
lewiscollard
Quicklink prefetches links as they arrive in the viewport. Instant.page
prefetches them when you hover over them.

------
ritonlajoie
A little bit off-topic: the Axios website loads almost instantly on my phone
over mobile 4G. Even better than most FANG websites. I'm not sure it's only
the CDN doing this. Any clue?

~~~
adventured
It's modestly fast for how bloated it is, I'll grant them that.

From Pingdom's Washington DC test location, on this Axios page:

[https://www.axios.com/tropical-storm-sally-forms-off-florida-coast-ab1e97df-99bf-4e21-93a8-b3f0bccda493.html](https://www.axios.com/tropical-storm-sally-forms-off-florida-coast-ab1e97df-99bf-4e21-93a8-b3f0bccda493.html)

It rings in at 414 ms, 1.9 MB uncompressed (~1 MB compressed), with a rather
obnoxious 90 requests.

They're loading 961 KB of script and 197 KB of font content. A whole 41 KB of
actual HTML content in that obese vat of bytes.

On a small Quora page with no major images, they come in at 2 MB of junk and
1,500 ms to load, with 79 requests.

A typical small Wikipedia page with one image will come in at 400-500 KB and
load in 400-500 ms, with 26 requests.

GTMetrix lists the average page size for their performance tests, compressed
(!), at 3 MB (with 89 requests). Framework bloat is like living on a sugar
diet.

~~~
minxomat
Thanks for using Pingdom, btw. Always happy to see our products used in the
wild.

------
noname120
With Vimium[1] I consistently get 0 ms from hover to click. :)

[1] [https://github.com/philc/vimium](https://github.com/philc/vimium)

------
Gormisdomai
Presumably this won't work on mobile because there's no hover to trigger the
pre-loading. I wonder if there's something else that could be used there.

~~~
INdek
They mention mobile on the page: they use the mousedown event to preload,
which (they claim) gives on average a 90 ms improvement over waiting for the
click event.

------
jooize
Would it save time to load pages without making a normal new page request? I
mean using AJAX/jQuery/… and updating the URL with window.history.pushState.

------
donbrae
I use instant.page and it doesn't appear to work in Safari (at least in
desktop version 13.1.2): I can't see any transfers in the Network tab on
hover.

------
moralestapia
Great idea, thanks for sharing!

~~~
laumars
It’s an old idea that keeps resurfacing and frankly I think it misses the real
problem of modern websites: they’re too bloated with code that the user is
expected to run and yet isn’t there to serve the user.

I’d love to see a world where cloud computing / time sharing was tipped on
its head, and users and businesses could charge the compute time spent
running JavaScript trackers, analytics, etc. back to the sites they visit.
Of course this fantasy will never be technically possible, but one can dream.

~~~
lewiscollard
> It’s an old idea that keeps resurfacing and frankly I think it misses the
> real problem of modern websites: they’re too bloated with code that the user
> is expected to run and yet isn’t there to serve the user.

I agree that the latter is a problem with many modern websites. That,
however, does not mean that this approach does not have a place; many people
(especially among the HN and HN-adjacent crowd, who tend towards simple
layouts and static site generators) optimise their sites as much as is
reasonably possible, and specifically _don't_ pessimise their sites with
megabytes of tracker & ad scripts. For those people, instant.page might well
be a worthwhile optimisation.

------
kissgyorgy
Try Gatsby. It can be even faster than this.

~~~
oatmealsnap
OK, I'll just rewrite my clients' entire website using a trendy, brand-new JS
framework that is built on top of an ever-changing trendy JS library,
requiring the entire site to be compiled from scratch whenever my client
wants to update a title.

Different tools for different projects. Gatsby and Gridsome, or any other
static site generator is not comparable to a simple script like this.

~~~
beaker52
Throwing a random script onto a website I was responsible for, one which
claims to magically preload things and make them faster, is not my idea of
simple. Easy, perhaps. Risky, yes. Simple, no.

