Aren't torrents terrible at handling updates in general? If you want to make a change to the data, or even just add or remove files, you have to create a new torrent and somehow get people to switch to it and update their data as well.
In practice, that's mostly how they're being used.
But the protocol does support mutation. The BEP describing the behavior (BEP 46, "Updating Torrents Via DHT Mutable Items") even uses archive.org as an example...
> The intention is to allow publishers to serve content that might change over time in a more decentralized fashion. Consumers interested in the publisher's content only need to know their public key + optional salt. For instance, entities like Archive.org could publish their database dumps, and benefit from not having to maintain a central HTTP feed server to notify consumers about updates.
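Mechanically, BEP 46 builds on BEP 44 mutable DHT items: the "address" consumers look up is derived only from the publisher's public key and optional salt, so the value stored there (the current infohash) can change while the lookup target stays fixed, and a sequence number tells consumers whether what they fetched is newer than what they have. A minimal sketch of those two pieces (the key bytes below are dummies for illustration):

```python
import hashlib

def mutable_target(public_key: bytes, salt: bytes = b"") -> bytes:
    """BEP 44: the DHT target for a mutable item is the SHA-1 of the
    publisher's ed25519 public key concatenated with the optional salt.
    The publisher can change the stored value without this target --
    the address consumers look up -- ever changing."""
    return hashlib.sha1(public_key + salt).digest()

def is_newer(current_seq: int, fetched_seq: int) -> bool:
    """BEP 44 items carry a monotonically increasing sequence number;
    a consumer only accepts an update whose seq is strictly higher."""
    return fetched_seq > current_seq

# Dummy 32-byte "public key" just to show the target is stable:
pk = bytes(range(32))
target = mutable_target(pk, b"db-dumps")
print(target.hex())  # depends only on pk + salt, not on the content
```

So a consumer who knows the key + salt re-queries the same target periodically and switches torrents whenever the fetched item's sequence number is higher than the one they last acted on.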
How would preservationists go about automatically updating the torrent and data they seed? Or would they need to regularly check by hand whether they are still seeding the up-to-date content?