Live demo on my website @ https://sysadmincasts.com/

Temporarily added it inline for testing. I was already at the sub-100ms level, but this just puts it over the top! Also updated all admin add/edit/delete/toggle/logout links with "data-no-instant". Pretty easy. Open your browser's developer tools and watch the network tab. Pretty neat to watch it prefetch! Thanks for creating this!

ps. Working on adding the license comment. I strip comments at the template parse level (working on that now).

pps. I was using https://developers.google.com/speed/pagespeed/insights/ to debug page speed before, then working down the list of its suggestions. Scoring 98/100 on mobile and 100/100 on desktop. I ended up inlining some CSS, converting most images to base64 and inlining them (no extra HTTP calls), heavily caching DB results on the backend, writing the CMS in Go, and using a CDN (with content caching), all to get to sub-100ms page loads. Pretty hilarious when you think about it, but it works pretty well.
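For the curious, base64 inlining in a Go template looks roughly like this. This is just a minimal sketch of the idea, not the actual CMS code; the helper name, the "static" dir, and "logo.png" are made up:

    // Sketch only: a template func that base64-inlines a small image so it
    // ships with the HTML. Assumes the file lives under a local "static" dir.
    package main

    import (
        "encoding/base64"
        "fmt"
        "html/template"
        "mime"
        "os"
        "path/filepath"
    )

    func inlineImage(name string) (template.URL, error) {
        data, err := os.ReadFile(filepath.Join("static", name))
        if err != nil {
            return "", err
        }
        mt := mime.TypeByExtension(filepath.Ext(name))
        uri := fmt.Sprintf("data:%s;base64,%s", mt, base64.StdEncoding.EncodeToString(data))
        return template.URL(uri), nil
    }

    var page = template.Must(template.New("page").
        Funcs(template.FuncMap{"inline": inlineImage}).
        Parse(`<img src="{{inline "logo.png"}}">`))

    func main() {
        // Render to stdout; in a real CMS this would feed the response writer.
        if err := page.Execute(os.Stdout, nil); err != nil {
            panic(err)
        }
    }

The template.URL return type matters: html/template filters data: URIs out of src attributes unless the value is explicitly typed as a trusted URL.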




If I mouse-over the same link 10 times, it looks in my network tab like it downloads the link 10 times.

I'd expect this preload script to remember the pages it's already fetched and not duplicate work unnecessarily. :/


Perhaps the author could add a script parameter, or support an optional 'preload-cache-for' attribute, so you'd write <a preload-cache-for="300s" ...>

If you really care about speed anyway, you should already have set up your site to max out caching opportunities (ETag, Last-Modified, and replying 304 Not Modified to If-Modified-Since queries). I'd suggest the author ensure the script supports caching to the broadest extent possible, hitting your site only when appropriate.
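For what it's worth, since the CMS is in Go: net/http will answer those conditional requests for you once the validators are set. A minimal sketch, with made-up content, ETag value, and port:

    // Sketch only: ServeContent replies 304 Not Modified when the client's
    // If-None-Match / If-Modified-Since validators match.
    package main

    import (
        "bytes"
        "net/http"
        "time"
    )

    var (
        page     = []byte("<html>...</html>")
        modTime  = time.Now()     // when the page was last rendered
        pageETag = `"v1-example"` // hypothetical version tag
    )

    func handler(w http.ResponseWriter, r *http.Request) {
        // ServeContent picks up this ETag and also sets Last-Modified from modTime.
        w.Header().Set("ETag", pageETag)
        http.ServeContent(w, r, "index.html", modTime, bytes.NewReader(page))
    }

    func main() {
        http.HandleFunc("/", handler)
        http.ListenAndServe(":8080", nil)
    }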


Cache-Control headers already do a better job of solving that problem https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Ca...


Set your http cache headers correctly to instruct the client to save and re-use pages it downloads.


Yeah, this is likely something I need to look into. Since the site changes quite a bit between logged-in and logged-out users, I'm not really doing much HTML caching right now. I'll check that out though. Even something like a 30-second cache TTL might work. I'd have to see if that's even possible and test it, though. Maybe force an update on login? I'm not an HTML cache pro, so I'd need to see what the options are and then do some testing. For now, it was fast enough / not worth the time to do anything other than a fresh page for each request. But this is a good suggestion. Thanks.
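If I try the 30-second TTL idea, I'm guessing it would look something like this. Just a sketch of the idea, not what's live on the site, and it assumes the session lives in a cookie so logged-in and logged-out HTML don't share a cache entry:

    // Sketch only: short private TTL plus Vary: Cookie, so the browser cache
    // keys the HTML by session cookie and a CDN won't cache it at all.
    package main

    import "net/http"

    func pageHandler(w http.ResponseWriter, r *http.Request) {
        w.Header().Set("Cache-Control", "private, max-age=30")
        w.Header().Set("Vary", "Cookie")
        w.Write([]byte("<html>...</html>"))
    }

    func main() {
        http.HandleFunc("/", pageHandler)
        http.ListenAndServe(":8080", nil)
    }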

ps. I do tons of caching for any images and videos though. I know those things never change, so I have weeks of caching enabled.


But how do you know it's still fresh?


HTTP has multiple cache control headers. It's a fairly complex topic, but TL;DR: do the config and it works in any browser since ~2000 (yes, even IE6).


Very impressive! Wonder if it'd be worth rerouting some of the paginated URLs like the episodes `?page=4` to `/page-4` or something like that. :shrug: - either way, looks like you had some fun optimizing it!


/page/4 would probably be more prudent


I don't like this as much since a savvy user might then try to navigate to /pages/ - which has what on it?


Good idea. I'll check that out! Thanks.


I just released version 1.1.0 which allows whitelisting specific links with query strings like those. https://instant.page/blacklist


Awesome, thank you!


This is what the OP should have posted. Very impressive. Will use in my work.


I like that there's an extra optimization where it cancels the preload if your cursor leaves before the prefetch has finished. That would help on really slow networks and pages that time out.


Embedding base64 images isn't really more efficient on HTTP/2 servers. Base64 adds overhead, and multiplexing mitigates the cost of additional network requests.


But unless you use server push, it still is an additional roundtrip.


Yes, but since it's multiplexed onto the same TCP connection it doesn't suffer from slow start; the TCP window is already large, so it's not as bad as it would be on HTTP/1.


nothing on my network tab in ff 65.0


ditto. It works on Chrome though (sigh)


Why the sigh? It says right on the site that this gracefully degrades on browsers that don’t support it. Why is it a problem making a site faster in a browser designed for speed, if it does not degrade the experience at all in all other browsers?


That is OK if you are talking about older browsers or browsers on limited/obscure platforms. Firefox doesn't fit the bill, and optimizing the site just for ~~IE~~ Chrome hurts FF users and makes Chrome win even more, leading to the self-fulfilling claim that Chrome is faster. It's not, but it will be if people optimize for it. This is one of the reasons I make a point to always develop in Firefox and only later check in Chrome (note that I still do check, because the majority of people use it) - apart from the simply better experience, of course. :)


Not working for me either, Chrome works great though.


UPDATE - Feb 11. I removed the include for now. I was seeing a weird caching issue when people log in/out, where the navbar would not update correctly (for the first few requests after). I'm still digging into this. I likely need to invalidate the browser cache somehow. Doing some research to see what the options are.
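One option that might work is the Clear-Site-Data response header on the login/logout responses, which tells supporting browsers to drop their cached copies for the site. A sketch only (handler name and port made up); it's generally only honored over HTTPS, and browsers that don't support it just ignore the header:

    // Sketch only: clear the browser's cache for this origin when the
    // session changes, so stale navbar HTML doesn't get reused.
    package main

    import "net/http"

    func loginHandler(w http.ResponseWriter, r *http.Request) {
        // ... authenticate and set the session cookie ...
        w.Header().Set("Clear-Site-Data", `"cache"`)
        http.Redirect(w, r, "/", http.StatusSeeOther)
    }

    func main() {
        http.HandleFunc("/login", loginHandler)
        http.ListenAndServe(":8080", nil)
    }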


Have you signed up for a page speed monitoring service, like https://speedmonitor.io/ ?

I'd be very curious to know what your performance looks like over time, especially as it relates to various improvements that you try out.


Or maybe Pingdom, NewRelic, or gtmetrix?


I wonder about breaking the page into two parts, above and below the fold. Above the fold would have the base64 images and everything inlined; below the fold would get the analytics script loaded.



