Hacker News

Ya I generally think CORS is a waste of time. It would have been better to provide a hash of the file we're linking to and trust that, rather than where it came from, which is precisely what Subresource Integrity (SRI) does:

https://en.wikipedia.org/wiki/Subresource_Integrity

Sadly, even though this is an obvious concept and trivial to implement, it took over 20 years after the web came out for it to land in most browsers. The cost to society of having thousands of copies of the same commonly used files (like jQuery) hosted locally on countless servers, rather than a centrally hosted version already cached from previously visited sites, is staggering to contemplate. I'd really like to know who was behind the holdup on deploying SRI.
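For what it's worth, the integrity value SRI uses is just a base64-encoded digest prefixed with the algorithm name. A minimal sketch of generating one in Python (the script contents here are made up for illustration):

```python
import base64
import hashlib

def sri_hash(data: bytes, algo: str = "sha384") -> str:
    """Build an SRI integrity value of the form '<algo>-<base64 digest>'."""
    digest = hashlib.new(algo, data).digest()
    return f"{algo}-{base64.b64encode(digest).decode('ascii')}"

# Hypothetical file contents; a real site would hash the actual script file.
print(sri_hash(b"console.log('hello');"))
```

The resulting string is what goes in the `integrity` attribute of a `<script>` or `<link>` tag; the browser refuses to run the resource if the bytes it fetches don't hash to that value.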




CORS is about a lot more than just static assets. SRI does not replace CORS.


CORS is for data retrieval, not subresource inclusion. In fact, you don't need CORS at all to include a script in your page; it has never been required for that.


After sleeping on it, I realized that my comment was a bit too critical and also missing some context. I didn't mean to be as negative about CORS as I came across; I was more disappointed that something like SRI hasn't been part of the web from the start. Some background:

https://en.wikipedia.org/wiki/Content-addressable_memory

https://en.wikipedia.org/wiki/Distributed_hash_table

https://en.wikipedia.org/wiki/Merkle_tree

If we had something like SRI from the start, we could have linked to resources by their hash instead of their URL (more like how IPFS works). There's a name for this concept that eludes me, and also a great video explaining its potential but also difficulties when it comes to security and HTTPS. The short of it is that if we had routers that accepted hashes as well as URLs, then we could ask for a list of all data matching a given hash and download that file (or its pieces) from the closest cache(s). So instead of linking to jQuery at https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.mi... we could just ask the router for the file with SHA2 hash 160a426ff2894252cd7cebbdd6d6b7da8fcd319c65b70468f10b6690c45d02ef and it would return its contents regardless of where it came from, including the browser's own cache if it already had that file (I just used https://hash.online-convert.com/sha256-generator but there would be a better standard for this).
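A toy sketch of that lookup-by-hash idea, with an in-memory dict standing in for the network of router/peer caches (the function names and contents are made up for illustration):

```python
import hashlib

# Stand-in for the network of caches: key = SHA-256 hex digest of the content.
store: dict[str, bytes] = {}

def put(data: bytes) -> str:
    """Store data under its own hash and return the hash (its 'address')."""
    key = hashlib.sha256(data).hexdigest()
    store[key] = data
    return key

def get(key: str) -> bytes:
    """Fetch by hash and verify the bytes actually match it.

    Because the hash *is* the identity, any cache anywhere can serve the
    content and the client can still trust it.
    """
    data = store[key]
    if hashlib.sha256(data).hexdigest() != key:
        raise ValueError("corrupt or tampered content")
    return data
```

The verify-on-read step in `get` is what makes the origin irrelevant, which is the same trust model SRI applies to ordinary URLs.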

Anyway, hope this helps and sorry for any confusion.


The major holdup on deploying SRI is that a lot of third parties aren't supplying immutable content.

Google Analytics and reCAPTCHA, for example, aren't versioned. Deploying SRI against them is just going to break your site when they update the script.
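To illustrate the breakage: an SRI check pins one exact byte sequence, so an in-place update by the provider fails verification. A minimal sketch (the "v1"/"v2" contents are placeholders):

```python
import base64
import hashlib

def verify_sri(data: bytes, integrity: str) -> bool:
    """Check data against an SRI value of the form '<algo>-<base64 digest>'."""
    algo, _, expected = integrity.partition("-")
    digest = hashlib.new(algo, data).digest()
    return base64.b64encode(digest).decode("ascii") == expected

# Pin the hash of the third-party script as it exists today.
pinned = "sha384-" + base64.b64encode(
    hashlib.sha384(b"v1 of script").digest()).decode("ascii")

print(verify_sri(b"v1 of script", pinned))  # True
print(verify_sri(b"v2 of script", pinned))  # False: provider updated it in place
```

With an unversioned URL like Google Analytics, that `False` case means the browser silently drops the script the day the provider pushes a new version.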


If they hashed and cached the resource files, then they could be found locally most of the time.




