It would probably be easier to add IPFS archiving to Wallabag than to roll your own Pocket.
- inter-instance link sharing, perhaps via ActivityPub.
- some ACL layer on top of IPFS to ensure you share links and content only with people you trust (and to avoid eventual copyright litigation)
- a sanitized way to read links from content farms that get shared on social networks (this was one of the initial motivations for me to start this, and this is why I called it nofollow)
- a perhaps more robust way to have an internet archive.
It is unlikely I will get to do it all, but it is more likely I will get to do some of this in Python (or Go) than to ever pick PHP
I'm personally excited for ipfs, but I've stopped following releases / am feeling burnt out by the slow pace of development. Still, good to see creative uses pop up in my internet bubble!
What kind of hash are we talking about here? And generated from what?
I could imagine the opposite, generating 2 hashes from the 'same' article, due to different markup being stripped. Not what you suggest though.
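A toy sketch of that failure mode: two sanitizers strip the "same" article differently, and because content addressing hashes the exact bytes, the digests (and hence the resulting IPFS identifiers) diverge. The article text and sanitizer outputs here are made up for illustration.

```python
import hashlib

# Hypothetical article, serialized by two different sanitizers:
# one keeps inline tags, the other strips all markup.
kept_inline = "Hello, <b>world</b>!"
stripped_all = "Hello, world!"

# Content addressing hashes the exact bytes, so any difference in
# the stripped markup yields a completely different digest.
h1 = hashlib.sha256(kept_inline.encode()).hexdigest()
h2 = hashlib.sha256(stripped_all.encode()).hexdigest()

print(h1 == h2)  # False: different stripping, different hash
```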
To back things up, you'll want to pin to an external server like Eternum.io.
Coincidentally, I was (slowly) researching doing something similar with Scrapbook and some kind of synchronization à la Dropbox. I guess this implementation sidesteps the whole issue of organizing files on the fs, though I'm not sure if that's entirely unproblematic.
I was an enthusiast until 3 weeks ago when I lost all the references to my objects (twice, in two different manners) because of "a minor bug in mfs" and wasn't able to recover them.
I'm not the OP but a use case I might find useful is if you have a blog, want to comment on a news article, and want to have a cached copy of that news article. Since web pages often go away after some time, making sure the news article stays up might be useful. This is particularly true if the entity writing the news article decides it is embarrassing to them and wants to flush it down a memory hole.
 - http://thebitcoinpodcast.com/hashing-it-out-34/
IPFS is perfectly resilient as long as the network participants have the interest in keeping the content available.
Once everyone with a copy deletes it, it disappears. Both for IPFS and the internet in general.
With IPFS however, anyone with the file makes it available to anyone looking for it, regardless of who has it, or how many copies there are.
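That works because the requester verifies the content against the hash embedded in the link itself, so it doesn't matter which peer served the bytes. A simplified Python sketch of the idea (real IPFS verifies multihashes over DAG blocks, not a bare SHA-256 of the whole file):

```python
import hashlib

# The link itself carries the expected digest of the content.
EXPECTED = hashlib.sha256(b"the archived article").hexdigest()

def verify(blob: bytes, expected_hex: str) -> bool:
    # The requester recomputes the hash locally, so any peer can
    # serve the bytes -- tampered content simply fails the check.
    return hashlib.sha256(blob).hexdigest() == expected_hex

print(verify(b"the archived article", EXPECTED))  # True
print(verify(b"a tampered article", EXPECTED))    # False
```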
There are several arguments for this idea, and massive challenges to overcome if you actually try to act on this idea.
One of these challenges is how authors and similar content creators could possibly make a living in a world where everything they produce can be copied by anyone.
There are several thought experiments which try to address this problem, none of which have been successfully implemented large-scale in real life (at least to my knowledge).
but the fact that challenges arise from such an idea does not discredit the validity of said idea.
it just means that we haven't found a sustainable way to live by that ideal.
to be honest, I've never read of a major example successfully utilizing that.
If you're just mentioning things which ever attempted to solve that issue: there have been a lot
UBI (universal basic income) comes to mind, as do services such as Patreon, through which people essentially donate to creators in exchange for negligible rewards.
Some music bands also give away their music for free and try to make their living with concerts... there have been countless attempts, honestly.
But they're ultimately of negligible impact if you compare their revenue to top-selling products. That's what I meant by successful and large-scale.
"Please note that content shared with this addon is just cached on IPFS servers. If you want to store the content permanently, you need to have IPFS node running on your computer."
It's a common misconception that IPFS is something you can "upload to." IPFS makes links permanent in that they will never change. There is no guarantee that the content behind the link exists unless someone has explicitly stored it on a node.
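To make the "permanent link" point concrete, here is a simplified sketch of how a CIDv0-style identifier is derived from content: same bytes, same identifier, forever. Note this is an illustration only; real IPFS hashes a chunked DAG node rather than the raw file bytes, so actual CIDs for the same data will differ from this.

```python
import hashlib

# Base58btc alphabet used by CIDv0 (no 0, O, I, or l).
B58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def b58encode(raw: bytes) -> str:
    n = int.from_bytes(raw, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = B58[r] + out
    # preserve leading zero bytes as '1'
    pad = len(raw) - len(raw.lstrip(b"\x00"))
    return "1" * pad + out

def cidv0_style(data: bytes) -> str:
    # multihash prefix: 0x12 = sha2-256, 0x20 = 32-byte digest
    mh = bytes([0x12, 0x20]) + hashlib.sha256(data).digest()
    return b58encode(mh)

# Deterministic: the identifier is a pure function of the content,
# but nothing about it guarantees anyone is still hosting the content.
print(cidv0_style(b"hello"))
```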
How does 2read detect the local node? My IPFS node runs on different ports. Will it be detected?
Maybe have a failure condition where it then does:
If you can get it working with the protocol handler (ipfs://) that'd be awesome, but I went to the effort of changing my IPFS settings, so I can go to the effort of changing a setting for your addon.
I think there are a couple of other addons and things that will enable ipfs:// but Companion is the most common.
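Regarding detecting the local node: the addon could just probe a list of candidate API ports. A rough Python sketch, assuming the standard Kubo HTTP API endpoint /api/v0/version (which expects a POST); the fallback port 45005 here is a made-up example of a user-configured port.

```python
import urllib.request
import urllib.error

def probe_ipfs_api(port: int, host: str = "127.0.0.1",
                   timeout: float = 1.0):
    """Return the node's version response, or None if nothing answered."""
    url = f"http://{host}:{port}/api/v0/version"
    try:
        # Kubo's RPC API requires POST for its endpoints.
        req = urllib.request.Request(url, data=b"", method="POST")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.read().decode()
    except (urllib.error.URLError, OSError):
        return None

# Try the default API port first, then any configured fallbacks.
for port in (5001, 45005):
    if probe_ipfs_api(port) is not None:
        print(f"found IPFS API on port {port}")
        break
else:
    print("no local node found; falling back to a public gateway")
```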
Reminds me of GitHub's forks.
All that can be assured is that the content the person mirrored and linked you to is as they intended.
It does something similar but stores it locally. We have cloud support too so you can sync between your devices.
Thinking about IPFS at some point but it doesn't solve all the use cases/APIs I need supported yet.