Using Freenet (bluishcoder.co.nz)
128 points by doublec on Dec 18, 2014 | hide | past | favorite | 22 comments


Haven't looked at Freenet in many years. Maybe it's time to take it for a spin again. It's an extremely interesting project that tries to implement secure and anonymous communications with completely different objectives than Tor and other P2P networks like I2P.

The fact that the network itself also contains all its content in an anonymous, distributed way is a very interesting approach, since no one knows what encrypted fragments they hold and there are no central servers.

I remember that since each resource in a website could be scattered across many nodes, any webpage that loaded external resources took a very long time to load, and many of those resources would disappear after a while, especially since a lot of people would not stay connected to the network continuously or allocate much storage to their node.

One solution would be to build the whole site as a single page containing the HTML, CSS and base64-encoded images. It would take a long time to get all the blocks for a large page/site, but nothing would be lost.
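The inlining idea above can be sketched in a few lines of Python. This is a hypothetical helper (the file name and function are illustrative, not part of any Freenet tooling): it base64-encodes an image into a `data:` URI so the page is a single self-contained fetch.

```python
import base64
from pathlib import Path

def build_single_page(image_path: str) -> str:
    """Inline an image as a data: URI so the page needs no external fetches."""
    data = base64.b64encode(Path(image_path).read_bytes()).decode("ascii")
    return (
        "<html><body>"
        f'<img src="data:image/png;base64,{data}">'
        "</body></html>"
    )

# Stand-in image bytes for demonstration purposes only.
Path("logo.png").write_bytes(b"\x89PNG fake bytes")
page = build_single_page("logo.png")
print(page)
```

The trade-off is exactly the one described: the single page is larger (base64 adds roughly 33% overhead), but there are no separate resources to go missing.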

Enabling JavaScript is probably a bad idea when browsing Freenet, but using it would open a lot more possibilities (like using the newfangled BPG image[] format to significantly reduce the payload).

[]:http://bellard.org/bpg/


Freenet has the concept of containers which contain multiple files (essentially just tar files with extended index and optional redirection to external files). These still get spread through the network in chunks, but with 100% redundancy, and when the site loads, all the content is there.

There’s a development version of freesitemgr which allows you to harness that: https://github.com/freenet/lib-pyFreenet/tree/files-in-manif...


I didn't know that. Looks like this could take care of the issue of avoiding broken resources.


Sone works around the lack of JavaScript by being a plugin rather than a site. This requires you to trust the plugin code of course.

An interesting project would be to come up with a sandboxed safe scripting system that allowed dynamic sites. This could remove some of the need for plugins. I'm not sure how possible it would be to do this without also enabling tracking or information leakage.


> since no one knows what encrypted fragment they hold

But they do, since they have the key to it. The encryption is just feel-good obfuscation.


No, they only have the key if they have the address of the content, which may not even be publicly known.

https://wiki.freenetproject.org/Plausible_deniability


I guess it was changed. When I ran a node years ago the only option was to encrypt the entire store with one key.


I think you're referring to something different.

The "request key" is what you request from Freenet, but it is transformed into a "store key" (not official Freenet terminology) before the request is sent out over the network. If you have the encrypted content you only see the "store key", but the "request key" is required if you actually want to decrypt it.

This means that, to see what content you're actually storing, you'd need to go to a lot of trouble (basically a dictionary attack on the store key). Not impossible, but enough for plausible deniability.
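A highly simplified sketch of the idea (this is NOT Freenet's actual key-derivation scheme, just an illustration of the one-way relationship): the node stores data under a hash of the request key, so holding a block tells you the locator but not the key needed to decrypt it.

```python
import hashlib

def store_key(request_key: bytes) -> str:
    """Toy model: the store locator is a one-way hash of the request key."""
    return hashlib.sha256(request_key).hexdigest()

request = b"CHK@example-decryption-key"   # known only to requesters
locator = store_key(request)              # all the storing node ever sees
print(locator)
```

Going from `locator` back to `request` requires guessing candidate keys and hashing each one, which is the dictionary attack mentioned above.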


The data within the store is encrypted separately (depending on the scheme used). The URIs consist of an ID and the key, but the store only knows the locator.

So you might know you have CHK@???,ID in your store, but to understand what it is you're hosting, you need to be aware of its full URI CHK@key,ID.

(some details left out on purpose)


>>The fact that the network itself also contains all its content

That is simply not true. A website on i2p is just the same as a normal www site, except that you get a different route every dozen requests, so you cannot reliably determine the site's location. It's not because the "network itself contains all its content".


With "the network", Renaud was referring to Freenet, not i2p.


Yep, sorry, I somehow mixed those up.. my bad.


I've followed the Freenet Project for some years at its early stages, and I have to say that it has come indeed a long way. It's also been positively affected from more CPUs, RAM, and the general computer speed advances in recent years. The ecosystem has also stabilized (jSite, SONE, etc. as mentioned in the article). I personally like what has happened in the Freenet user interface and the ability to choose between levels of connection security/trust over speed (read about that on the site).


I used it for a while a few years ago, but ultimately the slowness and the lack of content discouraged me.

I may try it again, to see what has changed. It's a very interesting project but, unlike Tor, not very popular.


Freenet has really sped up since the last time I tried it. I'll definitely dive into it again in the upcoming weeks.

Freenet has a wide array of content and much of it is not particularly interesting to the common man. And the web interface is not so user-friendly. But the mechanism itself is great: you could use Freenet as an anonymous shared data store for your application.

For example, PirateBay could move to live on Freenet and have its own client (either a local application or a locally run webapp) just fetch the torrent listings from Freenet data blocks. For now, Freenet can't do big files so that must still be done on the opennet but much of the metadata could be irreversibly stored in the decentralized system that is Freenet.

As computers and the internet get faster and faster, we might be able to sacrifice some speed for safety, anonymity and plausible deniability. At the simplest, big files could be distributed in the open by simply xoring them together with other big files. You'd have to download two big files to construct the one you want. On the other hand, a single file would host information from several files so downloading it couldn't possibly be considered infringing anything. It could just be genuinely random data that, when xored with another file, turns out to be an mp4 of a film.
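The XOR scheme described above is easy to demonstrate. This toy sketch (file contents are made up for illustration) shows that each distributed piece alone is indistinguishable from random noise, while XORing the two together reconstructs the target:

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

target = b"the actual film bytes"     # the file you really want
pad = os.urandom(len(target))         # a genuinely random companion file
masked = xor_bytes(target, pad)       # distribute pad and masked separately

restored = xor_bytes(masked, pad)
print(restored)
```

Neither `pad` nor `masked` on its own carries any recoverable information about `target`; this is the same property that makes a one-time pad secure. The catch, as the comment notes, is that you must download twice the data.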


Out of interest, why do you say that it can't do big files?

It definitely has the capability to distribute large files (it has a built in download manager so it can do it in the background).

Or are you arguing that it just isn't good at it in practice?


I have stored 100+ megabyte files as tests and it works fine. It takes a long time to insert though. Thankfully you can queue them up using the download manager as you mention.

A pirate bay like site on freenet could provide magnet links for those that want to torrent externally and also provide files for those that want to retrieve from freenet itself.


What are the legal implications of participating in Freenet in the US?


Having encrypted information on your computer is a liability. It could be misconstrued as information you have access to that is relevant to whatever criminal charges might be brought against you. Failure to decrypt it on demand, even though you genuinely can't, can land you in jail for life.

But that's pretty obscure and unlikely.


I don't know about the US, but in the UK you are correct. In the UK, unless you can decrypt on demand of a court, it's jail time - end of. I do not think this is obscure or unlikely. It is highly likely that paedophiles will use this network, for instance, and that somewhere on your computer you'll be storing fragments of child rape videos - a serious crime in itself. You will also be an accessory to the original crime.


That's baseless armchair lawyering. Having encrypted information on your computer is not a liability. Encryption is commonplace these days.


There shouldn't be any legal implications, unless it is revealed that you're doing something illegal with it.



