Fetching these models over IPFS would cache them locally and dedupe requests for them by IPFS content ID (CID) - https://docs.ipfs.tech/concepts/content-addressing/#:~:text=... - which functions similarly to a file hash (not an exact parallel, since CIDs represent files that are broken up into chunks).
This would help deduplicate downloads if everyone is using the same models, and it would also reduce centralized data egress costs: with sufficient usage you would be downloading these models from other peers that are running IPFS nodes and holding onto the models.
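For illustration, here's a minimal sketch of what fetching a model by CID could look like from the browser, assuming a local IPFS node with its gateway on the default port and a public gateway as a fallback; the CID here is only a placeholder, not real published weights.

```typescript
// Placeholder CID for illustration only (not an actual model).
const MODEL_CID = "bafybeigdyrzt5sfp7udm7hu76uh7y26nf3efuylqabf3oclgtqy55fbzdi";

async function fetchModel(cid: string): Promise<ArrayBuffer> {
  const gateways = [
    `http://127.0.0.1:8080/ipfs/${cid}`, // local node: served from its cache or from peers
    `https://ipfs.io/ipfs/${cid}`,       // public gateway fallback
  ];
  for (const url of gateways) {
    try {
      const res = await fetch(url);
      if (res.ok) return await res.arrayBuffer();
    } catch {
      // gateway unreachable, try the next one
    }
  }
  throw new Error(`Could not fetch CID ${cid} from any gateway`);
}

fetchModel(MODEL_CID).then((buf) => console.log(`got ${buf.byteLength} bytes`));
```

Because the CID is derived from the content itself, any peer or gateway can serve the same bytes and the result can be verified against the CID, which is what makes this kind of cross-site deduplication plausible.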
It can probably be done with a browser extension. It can definitely be done by the browsers themselves. Eventually it will probably be done by the operating system, which the browsers will then expose.
It was a real need, given that almost all sites use large JavaScript dependencies. However, any hope of sharing those was destroyed by adtech people timing resource downloads to track people.
They are cached independently these days to avoid privacy issues. So if websites A and B both use the same JavaScript dependency from a public CDN and you visit both, you will download that dependency twice, even if it is already cached from your visit to the first site.
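A simplified mental model of the partitioned cache key (not any browser's actual implementation, and real keys include more fields), sketched with made-up domains:

```typescript
// Simplified model of HTTP cache partitioning: the cache key includes the
// top-level site, so the same CDN URL is cached separately per embedding site.
type CacheKey = string;

function cacheKey(topLevelSite: string, resourceUrl: string): CacheKey {
  return `${topLevelSite} ${resourceUrl}`;
}

const cdnUrl = "https://cdn.example.com/jquery-3.7.1.min.js";

const keyA = cacheKey("https://site-a.example", cdnUrl);
const keyB = cacheKey("https://site-b.example", cdnUrl);

console.log(keyA === keyB); // false -> the file is fetched and stored twice
```

Since the key differs per top-level site, one site can no longer time a cache hit to learn that you visited another site, which is the tracking vector the partitioning was meant to close.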
Who knows. Maybe the browser would be a more prevalent gaming platform if it could be assumed that loading a multi-gigabyte game engine is no big deal, because everyone already had it cached.
A lot of Unity games could easily be web games, but aren't because of many roadblocks. I believe this is one of them.
It is, but only within the same origin, which already lets users avoid re-downloading jquery.js or Google Fonts if they previously visited another website that pulled the same file from the same (usually cross-) origin.
Different webapps can't share common dependencies stored in localStorage, AFAIK.
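What a single webapp can do today is pre-cache its own heavy dependencies with the origin-scoped Cache API in a service worker. A rough sketch below, with a made-up cache name and asset paths; like localStorage, none of this is visible to an app on a different origin.

```typescript
// Service worker sketch: pre-cache heavy dependencies with the Cache API.
// Like localStorage, this storage is scoped to the origin, so another webapp
// cannot read or reuse it even if it ships the exact same files.
const sw = self as unknown as ServiceWorkerGlobalScope;

// Hypothetical asset paths for this app's own dependencies.
const DEPS = ["/vendor/engine.wasm", "/vendor/three.min.js"];

sw.addEventListener("install", (event) => {
  event.waitUntil(
    caches.open("app-deps-v1").then((cache) => cache.addAll(DEPS))
  );
});

sw.addEventListener("fetch", (event) => {
  // Serve cached dependencies first, falling back to the network.
  event.respondWith(
    caches.match(event.request).then((hit) => hit ?? fetch(event.request))
  );
});
```

That solves repeat visits to the same app, but not the cross-app deduplication the IPFS idea is after.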