Hacker News

This is brilliant, not only for privacy but for speed. Seeing this makes me wonder why I didn't build it myself. I've often thought that JavaScript loading tags could include a hash of the desired resource, so your browser could fetch it once and reuse it across a thousand page loads on a thousand websites. This is not that, but it is extra local caching, and on top of that it stops most tracking by CDNs. Guess I always thought of it as something my browser should have instead of an addon.
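The hash-in-the-tag idea largely exists today as Subresource Integrity (SRI): the `integrity` attribute on a script tag carries a hash of the expected resource, though browsers currently use it only to verify the fetched bytes, not as a cross-site cache key. A minimal sketch of computing an SRI-style value (the example payload is made up):

```python
import base64
import hashlib

def sri_hash(content: bytes, algo: str = "sha384") -> str:
    """Compute a Subresource Integrity value of the form
    "<algo>-<base64 digest>", as used in integrity attributes."""
    digest = hashlib.new(algo, content).digest()
    return f"{algo}-{base64.b64encode(digest).decode('ascii')}"

# Example: hash a (hypothetical) library payload
print(sri_hash(b"console.log('hello');"))
```

The resulting string is what goes into `integrity="sha384-…"` on a script or link tag; a hash-keyed shared cache would only need browsers to treat that same value as a lookup key.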




IPFS does this by design. Everything is content-addressed so you immediately know if you've seen the resource before. This also enables chunk-level deduplication.
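As a toy illustration of content addressing and chunk-level deduplication (not IPFS's actual chunker or CID format; real IPFS uses its own chunking strategies and multihashes), assuming an unrealistically small fixed chunk size:

```python
import hashlib

def chunk_ids(data: bytes, chunk_size: int = 4) -> list[str]:
    """Split data into fixed-size chunks, each addressed by its hash."""
    return [hashlib.sha256(data[i:i + chunk_size]).hexdigest()
            for i in range(0, len(data), chunk_size)]

store: dict[str, bytes] = {}  # content-addressed chunk store

def put(data: bytes, chunk_size: int = 4) -> list[str]:
    """Store data chunk by chunk; identical chunks map to the same
    key, so shared content is stored only once."""
    ids = chunk_ids(data, chunk_size)
    for cid, i in zip(ids, range(0, len(data), chunk_size)):
        store[cid] = data[i:i + chunk_size]
    return ids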

Of course the P2P nature of the project means other people can find out exactly which of those resources you're looking at...


Only your direct peers can, and they can't tell if you got the content to increase your fitness score or because you wanted the asset for yourself. Peers are incentivized to pull as many assets as they can (which prevents torrent death) in order to build reputation.


I think this is overly optimistic; as soon as you pull down a rare asset you have leaked information, since a peer that's farming would presumably work down the list of assets ranked by some measure of popularity, and would be unlikely to bother collecting obscure content.

This sort of system helps against some sorts of snooping, but certainly not nation-state adversaries.


> This is brilliant, not only for privacy but for speed.

But these resources are probably already cached by the browser anyway (using the appropriate http headers). So how can this solution add any improvements to that, once the resources have been loaded for the first time?


If your browser usage patterns include frequent "private browsing" or frequent cache clearing, this can be a noticeable speed boost.

I often use "private browsing" as a way to get another login session (e.g., logging in a test user while keeping an admin user logged in).


I actually worked on a prototype for this 4 years ago: https://github.com/cdnjs/browser-extension (speed, not privacy)

The obvious problem was that storing scripts locally got out of control once you consider having to store every version of every library.


The libraries can't be that big; surely it would work fairly well if you just dedicated a preset portion of disk space and deleted the least-used entries when the cache exceeds that size.


Now that I'm actually using this and seeing the results, I think Chrome already caches assets much like this extension does. It gives a huge perceived speed boost to Firefox. Mozilla devs should look at including this in Firefox by default; it's huge for speed and for Firefox's competitiveness with Chrome.


I believe it still sends a network request to revalidate the resource. But an extension can bypass this and assume the asset has not changed.


Indeed. Or check it every x hours in the background. Or the extension developer keeps up with jQuery releases (and a few other big libraries) and pushes updates, or site developers push them themselves. There are many simple solutions that would already give a speed boost on many sites.



