Hacker News

That's not possible with current web tech, is it?

Different webapps can't share common dependencies stored in localStorage, afaik.




Not with default web tech. It can be done with IPFS via the IPFS Companion browser extension - https://chromewebstore.google.com/detail/ipfs-companion/nibj... - or in browsers with native IPFS client support like Brave or Opera.

Fetching these models over IPFS would cache them locally and dedupe requests for them by IPFS content ID (CID) - https://docs.ipfs.tech/concepts/content-addressing/#:~:text=... - which functions similarly to a file hash (not an exact parallel, since CIDs represent files that are broken up into chunks).

This would help with download deduplication if everyone is using the same models, and would also help decrease centralized data egress costs: with sufficient usage, you would be downloading these models from other peers that run IPFS nodes and hold onto the models.


Last time I tried IPFS it was really slow; if you have to run a node to serve files, then direct downloads are much better.

A simple extension would do: one that manages models and exposes an interface on window so webapps can call it.

Something like window.llm.generate({model, prompt})
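To make the shape of that concrete, here is a hypothetical sketch (every name here - window.llm, generate, the model id, the fallback - is assumed; no such standard or extension API exists). The extension's content script would inject the object, and webapps would feature-detect it:

```javascript
// What a hypothetical extension's content script might inject into pages.
// The real thing would run a locally cached model; this stub just echoes.
globalThis.llm = {
  models: ['tinyllama-1.1b'], // models the extension already holds locally
  generate: async ({ model, prompt }) => ({ model, text: `echo: ${prompt}` }),
};

// Webapp side: use the shared models if present, otherwise fall back
// to downloading weights itself.
async function complete(prompt) {
  if (globalThis.llm) {
    const res = await globalThis.llm.generate({ model: 'tinyllama-1.1b', prompt });
    return res.text;
  }
  return 'fallback: fetch model weights ourselves';
}
```

The feature-detect-plus-fallback pattern matters here: sites can't require the extension, so the shared-model path has to be a progressive enhancement.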


It can probably be done with a browser extension. It can definitely be done by the browsers themselves. Eventually it will probably be done by the operating system, which the browsers will then expose.


This need wasn't very pressing in the pre-LLM days. It's rare to have a multi-GB blob that should be commonly used across sites.


It was a real need, given how almost all sites use large JavaScript dependencies. However, any hope of sharing those was destroyed by adtech timing resource downloads to track people.


Lots and lots of websites still use Google and other CDNs for JS deps, fonts, etc.


They are cached independently these days (cache partitioning, sometimes called double-keyed caching) to avoid privacy issues. So if websites A and B both use the same JavaScript dependency from a public CDN and you visit them both, you will download the dependency twice, even if you have it cached from your visit to the first website.
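The behavior described above can be modeled in a few lines. This is an illustrative sketch, not a browser API: the cache key is the pair (top-level site, resource URL), so the same CDN file is stored once per visiting site rather than once globally.

```javascript
// Toy model of a partitioned (double-keyed) HTTP cache.
const cache = new Map();
let downloads = 0;

function fetchResource(topLevelSite, url) {
  const key = `${topLevelSite}|${url}`; // key includes the visiting site
  if (!cache.has(key)) {
    downloads += 1; // simulated network fetch
    cache.set(key, `bytes of ${url}`);
  }
  return cache.get(key);
}

const dep = 'https://cdn.example/jquery.min.js';
fetchResource('https://a.example', dep); // first download
fetchResource('https://b.example', dep); // downloaded AGAIN: different partition
fetchResource('https://a.example', dep); // cache hit within the same site
// downloads === 2
```

With a single-keyed cache the key would be just the URL and the second site would get a cache hit, which is exactly the timing signal trackers exploited.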


Who knows. Maybe the browser would be a more prevalent gaming platform if it could be assumed that loading a multi-gigabyte game engine is no big deal, because everyone already had it cached.

A lot of Unity games could easily be web games, but aren't because of many roadblocks. I believe this is one of them.


Well, it should be possible to just drag and drop a file/folder


It is, but only within the same origin, which already spares users from re-downloading jquery.js or Google Fonts if they previously visited another website that loaded the same file from the same (usually cross-) origin.



