

BitTorrent Everything? - engxover

Q: Could/should the internet be converted to a P2P network model? By this I mean individual web pages would be seeded/"leeched" as needed, instead of downloaded from a single source by every individual browser. How would this be accomplished? Would it have any advantages or disadvantages?

A: I can't answer the question, but I'd like your help with it.

Firstly, forgive my lack of knowledge on this question. I'm a mechanical engineer by trade, but I've recently seen the tides in my career opportunities shifting more and more toward the tech end of the spectrum. While I'm not too worried about my job right now, I'd be appalled if I became obsolete, which is why I'm currently putting myself through the pain of acquiring computing skills. Just in case. So treat this as a teaching exercise: I'm just looking for knowledge, and hopefully someone here can enlighten me and perhaps point me in the right direction for more information. This seems like one of the more intelligent/mature places I've come across on the web, so I hope this question is as "deeply interesting" for you as it is for me.

In my line of work, if we have an excellent solution to a problem (here the problem is data transfer, and the solution, at least for large clumps of data, is P2P file sharing), we go to that solution first and see whether it applies to the task at hand. Torrents seem to be the fastest/best way to handle large files, so why couldn't they handle websites themselves?

Again, I'm quite ignorant of the inner workings of the protocols that actually run the web and organize its traffic, so you'll have to forgive me if my questions betray that ignorance. Links to articles, research, textbooks, etc. related to my question would be greatly appreciated. And if you believe this is a question worthy of discussion, please add to that as well. Thank you.
======
knurdle
I think BitTorrent works well for big chunks of static data. Web pages are
made of lots of tiny pieces of data that change from user to user.

Most sites these days use some sort of CDN, which probably gets the data to
you more efficiently than a P2P network would.

------
malandrew
About two years ago, before I started programming, I posted this on the
WHATWG mailing list about a real use case that is financially useful in
reducing costs:

[http://lists.whatwg.org/htdig.cgi/whatwg-whatwg.org/2010-January/024809.html](http://lists.whatwg.org/htdig.cgi/whatwg-whatwg.org/2010-January/024809.html)

Back then it wasn't really technically possible, but technologies like WebRTC
may change things, since they make browser-to-browser communication possible.

Check out btapp.js for an implementation of BitTorrent in the browser:

<http://pwmckenna.github.com/btapp/docs/btapp.html>

Besides that, you may want to look at this HTML5 Filesystem API polyfill:

[http://ericbidelman.tumblr.com/post/21649963613/idb-filesystem-js-bringing-the-html5-filesystem-api](http://ericbidelman.tumblr.com/post/21649963613/idb-filesystem-js-bringing-the-html5-filesystem-api)

At the end of the day, something like this is only really going to be useful
for file types that can't easily be modified to be malicious, e.g. CSS and
images.
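Browsers have since grown a standard version of this idea for static assets,
subresource integrity (SRI): the page from the trusted origin carries a hash,
and the asset itself can come from anywhere. A minimal sketch of the check in
Python, with a made-up stylesheet for illustration:

```python
import base64
import hashlib

def check_integrity(asset: bytes, integrity: str) -> bool:
    """Verify an asset against an SRI-style 'sha384-<base64 digest>' string."""
    algo, _, b64_digest = integrity.partition("-")
    digest = hashlib.new(algo, asset).digest()  # e.g. sha384
    return base64.b64encode(digest).decode() == b64_digest

# The integrity string is published by the trusted origin alongside the page.
css = b"body { color: #333; }"
tag = "sha384-" + base64.b64encode(hashlib.sha384(css).digest()).decode()

check_integrity(css, tag)          # True: safe to use, wherever it came from
check_integrity(b"evil {}", tag)   # False: discard it
```

With a scheme like this, it doesn't matter which peer served the bytes; only
the hash published by the origin has to be trusted.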

I know of a few people looking into stuff like this. Personally, I think it's
only a matter of time before we see it applied to more things. At this point
I don't really think it's a matter of the technology being available; it's
more a matter of someone building it. After all, it took about six years from
the time the Mosaic web browser was introduced to the market for blogging to
make an appearance in 1999.

------
TomAnthony
I imagine there are many technical reasons why this is a bad idea. A few off
the top of my head:

• Very few sites could allow their database to be distributed.

• Pages could be altered by nodes in the network for nefarious reasons.

• Anything with a session would probably be very hard work.

Shame really, as it'd make it harder for ISPs to shape traffic.

~~~
brianshaler
• Pages could be altered by nodes in the network for nefarious reasons.

Not if you use a BitTorrent model. You'd get checksums from a known source and
discard any chunks you receive from nodes that don't match the expected
checksum. Not _impossible_ to spoof/alter, but exceedingly unlikely.
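The check described above is a few lines of Python; the piece contents and
hash here are made up, standing in for the per-piece SHA-1 hashes a .torrent
metainfo file would carry:

```python
import hashlib

def verify_piece(piece: bytes, expected_sha1: bytes) -> bool:
    """Keep a piece received from a peer only if it hashes to the trusted checksum."""
    return hashlib.sha1(piece).digest() == expected_sha1

# In BitTorrent these hashes come from the metainfo (.torrent) file, fetched
# from a known source; only that small file has to be trusted, not the peers.
trusted = hashlib.sha1(b"piece 0 of index.html").digest()

verify_piece(b"piece 0 of index.html", trusted)   # True: accept the piece
verify_piece(b"defaced piece", trusted)           # False: discard and re-request
```

A malicious node would have to find a collision against the published hash to
sneak an altered page past this check.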

~~~
Hrundi
Similar to our trust in the 13 DNS root servers, then.

I guess this all comes down to whom you trust.

------
zhovner
<https://freenetproject.org/>

~~~
femto
There's also

<https://gnunet.org/>

I gather one difference is that GNUnet uses an economic model (like
BitTorrent) to allocate resources, whereas Freenet does not.

------
ivank
It could, it should, and it's a really hard problem.

<http://www.youtube.com/watch?v=8Z685OF-PS8>

<http://www.ccnx.org/>

