Hacker News

I read over a bit of the source. It uses a hard-coded list of CDNs and files, so it does nothing unless the CDN and file are on this list: https://github.com/Synzvato/decentraleyes/blob/master/lib/ma...

Edit: someone asked how this works:

1) It looks up the resource in the mapping linked above, matching the CDN host and file path.

2) If found, it serves the bundled local copy it includes instead: https://github.com/Synzvato/decentraleyes/tree/master/data/r...

So for those files, requests are never made to the CDN.

If the website uses a different CDN, a library that isn't recognized, or a version that isn't recognized, then the request is still made as usual.
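The lookup step could be sketched roughly like this. This is a simplified illustration, not the extension's actual code: `MAPPING`, `resolveLocal`, and the jQuery entry are all made-up names standing in for the real (much larger) hard-coded table.

```javascript
// Illustrative mapping: CDN hostname -> { request path -> bundled local file }.
// The real extension's table covers many CDNs and library versions.
const MAPPING = {
  "ajax.googleapis.com": {
    "/ajax/libs/jquery/2.1.4/jquery.min.js": "resources/jquery/2.1.4/jquery.min.js",
  },
};

// Returns the path of a bundled local copy, or null if the request
// should be allowed through to the network untouched.
function resolveLocal(url) {
  const { hostname, pathname } = new URL(url);
  const host = MAPPING[hostname];
  return host ? host[pathname] ?? null : null;
}
```

Anything that falls through to `null` (unknown CDN, unknown library, unknown version) hits the network exactly as before, which matches the behaviour described above.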

I thought the CDN was supposed to send proper caching headers, so the browser would save a cached version after the first request and never hit the CDN again.

They do, but the browser has a limited cache size. And because of the gigantic size of even the smallest websites these days, the cache maxes out every day, and your files are purged again and again. This is basically a super-cache of files you know you never want to invalidate. It also avoids revalidation round-trips (and CORS preflight OPTIONS requests) entirely.

Are browsers really this stupid? Seems like an obvious strategy to have several cache buckets, with one dedicated to smaller assets with long expiry times.

It's not stupidity. There is really no way to know which files you want to keep longer than others without risking breaking things. CDNs are actually a good, manually curated source of files that have this quality. But basing yourself on a proprietary CDN is not a move any browser in its right mind would make.
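The "cache buckets" idea from the question above could be sketched as a tiny two-bucket cache: a small LRU for general assets plus a pinned bucket for assets from a whitelist of known long-lived CDN hosts. Everything here is hypothetical (`PINNED_HOSTS`, `BucketCache`); no browser ships anything like this, which is precisely the objection raised.

```javascript
// Hypothetical whitelist of CDN hosts whose assets are pinned forever.
const PINNED_HOSTS = new Set(["ajax.googleapis.com", "cdnjs.cloudflare.com"]);

class BucketCache {
  constructor(limit) {
    this.limit = limit;        // capacity of the general bucket
    this.general = new Map();  // Map preserves insertion order -> cheap LRU
    this.pinned = new Map();   // whitelisted assets, never evicted
  }
  put(url, body) {
    if (PINNED_HOSTS.has(new URL(url).hostname)) {
      this.pinned.set(url, body);
      return;
    }
    this.general.delete(url);  // refresh recency on re-insert
    this.general.set(url, body);
    if (this.general.size > this.limit) {
      // evict the least recently inserted general entry
      this.general.delete(this.general.keys().next().value);
    }
  }
  get(url) {
    return this.pinned.get(url) ?? this.general.get(url);
  }
}
```

The catch, as the reply notes, is the whitelist itself: someone has to maintain it, and tying browser behaviour to specific proprietary CDNs is exactly what browser vendors avoid.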

That it has been looked at at any level (e.g. this plugin) is great, and it happened without waiting for a decade of W3C and browser-standards back-and-forth.

The browser still needs to check whether the file has changed, hence the 304 HTTP status code.

If the server sends an "Expires" header in the response, then the client doesn't even need to do that check. With an Expires header, the server has effectively told the client that the data won't change until at least a particular date, and so the client honours that information.

Last-Modified/If-Modified-Since is an optimisation trick which exists for the situation where the person running the website hasn't bothered to explicitly define expiry periods for content.

That depends on what kind of caching headers the CDN uses. If it uses max-age and no ETag/Last-Modified, the browser won't send an If-Modified-Since request and will just use the cached resource without asking the server.
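The freshness logic discussed in the last few comments can be sketched as a small function over response headers. This is a deliberately simplified model (`isFresh` is a made-up name); real HTTP caching per RFC 9111 has many more rules, but the precedence shown here (max-age, then Expires, then revalidate) matches the thread.

```javascript
// Decide whether a cached response can be reused with NO request at all.
// headers: response headers with lowercase keys; ageSeconds: age of the cached copy.
function isFresh(headers, ageSeconds, now = Date.now()) {
  const cc = headers["cache-control"] || "";
  const maxAge = cc.match(/max-age=(\d+)/);
  if (maxAge) return ageSeconds < Number(maxAge[1]); // max-age takes precedence
  if (headers["expires"]) return now < Date.parse(headers["expires"]);
  // No freshness info: fall back to a conditional request
  // (If-Modified-Since / If-None-Match), hoping for a 304.
  return false;
}
```

A CDN that sends a long max-age (e.g. a year for versioned library URLs) lands in the first branch, so the browser never talks to the server again until the entry is evicted, which is the eviction problem raised earlier in the thread.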

It's hard to tell if that's good or bad. It works the same way Adblock/uBlock and HTTPS Everywhere do: they only act on what they know about.

Wait, does that mean that I have to blindly trust the versions the author put on github?

Does that also mean I don't get the up to date javascript library when they change?

Yes it does! And, IMO, it's dangerous! Note that the scripts are hosted on GitHub, which belongs to... Google!
