
Future Internet aims to sever links with servers - Libertatea
http://www.cam.ac.uk/research/news/future-internet-aims-to-sever-links-with-servers
======
evunveot
The linked article is obviously really vague and not technical enough for the
HN audience. It looks like the real details of the project are here:

[http://www.fp7-pursuit.eu/PursuitWeb/?page_id=158](http://www.fp7-pursuit.eu/PursuitWeb/?page_id=158)

There's obviously a lot of work going into this, and it's a lot more
interesting than just reinventing BitTorrent and magnet links. However, I
reflexively don't like the sound of this part (from the introduction to D5.5):

 _All main documents are being made publicly available and the partners are
committed to open source releases of prototype code. This open approach does
not adversely affect the exploitation possibilities of the project partners
themselves: as originators of the technology, they have a first mover
advantage in the initial adoption of the technology, also retaining the
possibility of patenting the key results created by them in the project._

~~~
throwaway993mlw
Infection detected!

[http://www.fp7-pursuit.eu/PursuitWeb/?page_id=158](http://www.fp7-pursuit.eu/PursuitWeb/?page_id=158)

The requested URL contains malicious code that can damage your computer. If
you want to access the URL anyway, turn off the avast! web shield and try it
again.

Infection type: JS:HideMe-J [Trj]

------
loup-vaillant
This could be huge. I note however that peer-to-peer is not mainstream. It
either never was, or it ceased to be. BitTorrent is popular, but it's not
_mainstream_. If it were, YouTube would distribute torrents of its videos (it
could be done right now by writing a torrent client in JavaScript). The sad
reality is, there is a long-lasting, major roadblock:

Asymmetric bandwidth.

~~~
vectorjohn
I don't think that's necessarily a major roadblock. Depending on
implementation, with peer to peer you may connect to many peers, and their
combined bandwidth will be good.

Another way of looking at it: if I used 100% of my upload traffic
continuously, it would be a lot more total data sent than I generally download
with normal use. So as long as a peer to peer system ensures QoS, the
bandwidth is there. I think.
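A quick back-of-the-envelope check of that intuition (the link speed is an assumed round number, not a measurement):

```python
# A modest 1 Mbit/s uplink, saturated around the clock, moves a lot of data.
# 1 Mbit/s is an assumed asymmetric-DSL figure for illustration.

upload_mbps = 1.0
seconds_per_day = 24 * 60 * 60

# Mbit/s -> MB/s (divide by 8) -> MB/day -> GB/day (divide by 1000)
gb_per_day = upload_mbps / 8 * seconds_per_day / 1000
print(f"{gb_per_day:.1f} GB/day")   # 10.8 GB/day
```

That's well above most people's daily download volume, which is the commenter's point: the raw capacity exists if it could be kept busy.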

~~~
loup-vaillant
Nope.

On a peer to peer network, download equals upload. If you arrive late, sure,
everyone will upload to you. But if you arrive in the middle of the viral
explosion, you won't be the only one who wants to download. Peers will upload
to each other, not just you. So, unless one of the peers happens to be a big
fat server with monstrous upload, you won't download the video much faster
than the median upload rate.

The idea of using your uplink to the fullest sounds good, but to do that, you
need to acquire some content in the first place. And _that_ requires the
upload bandwidth of other peers…
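The steady-state argument above can be sketched numerically (all rates are made-up round numbers): in a swarm, every byte downloaded is a byte someone else uploaded, so the average download rate cannot exceed the average upload rate.

```python
# Toy steady-state swarm: total download rate == total upload rate,
# so the per-peer average download is bounded by the peers' uplinks,
# no matter how many peers join.

peer_upload_mbps = [1.0] * 100   # 100 peers, 1 Mbit/s up each (assumed)
seed_upload_mbps = 0.0           # no big fat server in the swarm

total_upload = sum(peer_upload_mbps) + seed_upload_mbps
avg_download = total_upload / len(peer_upload_mbps)
print(avg_download)              # 1.0 -- stuck at the peers' upload rate
```

Adding peers doesn't help the average: each newcomer brings as much demand as supply. Only a high-upload seed raises it.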

------
JonSkeptic
An interesting quote from the article:

>That would potentially make the internet faster, more efficient, and more
capable of withstanding rapidly escalating levels of global user demand. It
would also make information delivery almost immune to server crashes, and
significantly enhance the ability of users to control access to their private
information online.

As I understand it, the plan is essentially to turn the internet into a giant
peer to peer network. I'm not sure how this fits into the claim that it will
"significantly enhance the ability of users to control access to their private
information online". If anything, it seems that this model will lead to an
even greater and equally permanent duplication of online content. This means
that user data will be reproduced and recopied even more than it already
is...how does that give users a greater control over privacy at all?

I don't think I'll touch on the problems of the implementation as it's
described, but the following quote:

>the researchers behind the project also argue that by focusing on information
rather than the web addresses (URLs) where it is stored, digital content would
become more secure. They envisage that by making individual bits of data
recognisable, that data could be “fingerprinted” to show that it comes from an
authorised source.

really makes it sound like they are working toward making the internet
completely search engine based. How else will you find content if the servers
are gone? You search for it. It sounds like kind of a terrible idea, but maybe
someone else can champion their cause.
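One plausible reading of the "fingerprinting" quote is content-addressing à la magnet links: name data by a hash of its bytes, so any copy can be verified regardless of which peer served it. This is a generic sketch, not PURSUIT's actual scheme:

```python
import hashlib

# Name content by the SHA-256 of its bytes. The name doubles as an
# integrity check: re-hash whatever you received and compare.

content = b"example payload"
name = hashlib.sha256(content).hexdigest()

received = content  # imagine this arrived from an arbitrary peer
assert hashlib.sha256(received).hexdigest() == name  # copy is genuine
```

Finding a name still requires some lookup service, which is exactly the commenter's "everything becomes a search engine" worry.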

~~~
betterunix
"This means that user data will be reproduced and recopied even more than it
already is...how does that give users a greater control over privacy at all?"

[https://en.wikipedia.org/wiki/Encryption](https://en.wikipedia.org/wiki/Encryption)

In particular, if we ignore the issue of patents (which we should, since when
are patents on math valid?), this is a promising approach:

[https://en.wikipedia.org/wiki/Attribute_based_encryption](https://en.wikipedia.org/wiki/Attribute_based_encryption)
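For readers unfamiliar with ABE, here is a toy model of just the access-control logic; there is no real cryptography here, and `encrypt`/`decrypt` are illustrative names, not any library's API:

```python
# Toy attribute-based access model: a "ciphertext" carries a policy
# (a set of required attributes), and a "key" decrypts only if its
# attributes satisfy the policy. Real ABE enforces this mathematically.

def encrypt(plaintext, policy):
    """policy: set of attributes a key must hold (AND of attributes)."""
    return {"policy": frozenset(policy), "data": plaintext}

def decrypt(ciphertext, key_attributes):
    if ciphertext["policy"] <= set(key_attributes):
        return ciphertext["data"]
    raise PermissionError("attributes do not satisfy policy")

ct = encrypt("my heart-rate history", {"doctor", "cardiology"})
print(decrypt(ct, {"doctor", "cardiology", "hospital-A"}))  # allowed
# decrypt(ct, {"doctor"}) would raise PermissionError
```

The privacy point: copies of the ciphertext can be replicated everywhere, but only keys whose attributes match the policy can read it.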

~~~
swombat
I guess the way it solves it is that there is no central entity that gets ALL
the data anymore. Instead, the data is spread across many, many servers.

------
npsimons
On the one hand, this just sounds like P2P. On the other hand, I am glad to
see people recognizing the value in P2P (wasn't that what was envisioned for
the Internet originally?) and trying to head back in that direction, away from
centralized, implicitly trusted services run by gatekeepers/middle men.

~~~
evunveot
After digging around on their website, it actually sounds like they see the
current internet as P2P, and they want to replace it (or at least offer an
alternative): a network concerned with routing between sources of information
rather than between peers/devices.

Like in the video in the article, the guy's heart rate monitor, rather than
having a constantly changing IP address as he moves through different
networks, would be identified as "source of information about X's heartrate"
(using an anonymous/opaque identifier I'm sure), and this "future internet"
would be keeping track of how to route that information to the people who have
access to and are subscribed to it.

So the "future internet" routers would be kind of managing pubsub at the IP
layer. That's what I gather from these pages, anyway:

[http://www.fp7-pursuit.eu/PursuitWeb/?page_id=338](http://www.fp7-pursuit.eu/PursuitWeb/?page_id=338)
(prototype implementation; "Runs over Ethernet or raw IP")

[http://www.fp7-pursuit.eu/PursuitWeb/?page_id=179](http://www.fp7-pursuit.eu/PursuitWeb/?page_id=179)
"Our Vision"

 _While a model such as in IP networking enables a stream of data between two
explicitly addressed endpoints (with total transparency as to the information
represented in this sent data and the communication surrounding this exchange
of data), the model envisioned in our aspiration elevates information onto the
level of a first class citizen in the sense that data pieces are explicitly
addressed and therefore directly embedded into the network, unlike in today’s
IP-like networks._
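That pubsub-on-information-IDs reading can be sketched as follows; `InfoRouter` is a hypothetical name for illustration, not PURSUIT's actual protocol:

```python
from collections import defaultdict

# Minimal sketch of routing by information identifier rather than by
# host address: subscribers register interest in an opaque ID, publishers
# push to the ID, and the "router" never cares which device or IP address
# the publisher currently has.

class InfoRouter:
    def __init__(self):
        self.subscribers = defaultdict(list)   # info-ID -> callbacks

    def subscribe(self, info_id, callback):
        self.subscribers[info_id].append(callback)

    def publish(self, info_id, data):
        for cb in self.subscribers[info_id]:
            cb(data)

router = InfoRouter()
router.subscribe("heartrate/X", lambda bpm: print("got:", bpm))
router.publish("heartrate/X", 72)   # prints: got: 72
```

The heart-rate monitor from the video just publishes to its identifier; subscribers keep receiving updates no matter how the device's network attachment changes.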

------
oscargrouch
So they have invented P2P? (just kidding :))

I'm saying this because I'm working hard to create a P2P equivalent platform,
and boy, the devil is in the details..

It's very sophisticated right now, and I also focus on information; I'm doing
it using the Chrome engine, so developers will create isolated applications
that will just inherit the mechanics of the system (pretty much like a browser
does).

I didn't publish anything, no paper or idea, because I'm pragmatic (and not
backed by any university) and want to publish the real software so devs can
create the applications and users can just use them..

I wonder how many of my own conclusions will overlap with the conclusions of
this work of theirs..

If anyone knows more details about this and how it differs from P2P and MQTT
(which is pub/sub-based), it would be cool to know... what the innovative
parts are.. etc..

PS: if anyone wants to know more details about what I'm doing, just let me know

------
digitalsushi
None of this involves any new technology, but certainly a new way of using it.

Like my Italian grandmother learning how to cook Thai food in her late 80s: so
uncomfortable that the same flour and oil were making this weird foreign
deliciousness. But she had to be coaxed into trying it before she realized the
new flavors were a good thing!

We'll feel the same way - data is data is data, but getting it from some cache
from a few hops upstream? ISPs and content providers are not going to like
losing this control. Very much its own topic and nowhere near as much fun to
talk about!

------
peterwwillis
Some of the methods detailed here would definitely be useful paradigm shifts.
Digital signing for data, mobile addressing, and broadcast GPRS would all be
useful features of a future internet. But who's going to pay for it?

In one of their documents they offer the idea that there's an incentive for
ISPs to cache this information to prevent charges from lots of queries. But
even with caching, they still aren't getting any money from anyone to pay for
the service. You can't even tell where the traffic is going or what it contains.
Somebody's going to have to pony up for the bill, and it's not going to be the
ISPs out of the goodness of their hearts. And if it involves different
networks (like cellular) you're going to multiply the number of people who
need to get paid to make global data access useful.

I predict that this "future internet" is going to require a subscription-based
fee that rivals your cell phone and cable bill combined.

------
websitescenes
This sounds awesome but it is also scary. It seems like the end of pirated
content to me. Ban it once and it's gone globally? Not sure if I understand
correctly but it seems like a brilliant way to control property rights and
precisely funnel information.

------
j2labs
This isn't revolutionary, this is BitTorrent.

------
Aardwolf
What if I want to host an interactive server instead of a static
document/file?

~~~
betterunix
Why wouldn't the interaction be peer-to-peer as well? The point is to remove
servers from the system, so there should not even be a concept of an
"interactive server."

~~~
hrjet
Downsides of peer-to-peer (speaking in general):

* No golden reference for content.

* Updates are slower / content may not be fresh.

* Peers may not be online at the same time that you are online.

~~~
betterunix
"No golden reference for content"

Can you elaborate on this? I am not sure I understand what you mean.

"Peers may not be online at the same time that you are online."

That is already the case now with social networking, email, IM, etc. I do not
really see this being a serious problem -- P2P networks are pretty resilient
to people going offline as long as some threshold number of people remain
connected.

"Updates are slower / content may not be fresh"

This is certainly a problem, but perhaps not insurmountable.

~~~
hrjet
About the lack of a golden reference: an illustrative but extreme example is
Bitcoin. There is no golden transaction history. What emerges through
consensus from your peers is accepted as the transaction history. If there are
network outages and silos get formed, then content can diverge.

About peers being offline: I am not talking about people being offline, but
their systems going offline. Say I want to publish a photo to my friends.
Their systems may not be available at the moment my system is online.

~~~
betterunix
I think that in both cases, Usenet serves as an example (and an old one) of
how the problem might be solved. At worst, what happens is that messages
arrive in a different order than they were posted, hence the importance of
"References" headers and quoting posts (the rest becomes a UI problem).
Digital signing defeats forgery (if the network fractures, then your signed
messages will only appear in one of the pieces of the network until the
fracture is resolved), and encryption and mix-nets protect privacy.

So let's say you wanted to build a social networking system on top of Usenet.
Here is one approach:

* Use attribute-based encryption to distribute keys to your friends; this allows you to control who can see the posts you make

* To post, you encrypt your message, sign it, and encrypt it again; then you send it through a mix-net to alt.anonymous.messages

* Your friends all download from alt.anonymous.messages, decrypting those messages they can decrypt. Your signature tells them who made the post, and ABE ensures that only those who have permission to see the post can decrypt it.

You can imagine extensions for commenting on posts, or creating groups, etc.
You can also see that Usenet is not really relevant -- any P2P broadcast
system could be used.
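The three steps above can be sketched end-to-end. The crypto here is deliberately fake: an HMAC with a shared key stands in for a real public-key signature, and `toy_encrypt` stands in for ABE plus the mix-net; only the flow is real.

```python
import hashlib
import hmac

signing_key = b"alice-signing-key"   # stand-in for Alice's private key

def toy_encrypt(msg, recipients):    # placeholder for attribute-based encryption
    return {"for": frozenset(recipients), "body": msg}

def toy_decrypt(ct, who):
    return ct["body"] if who in ct["for"] else None

def sign(msg):                       # placeholder for a public-key signature
    return hmac.new(signing_key, msg.encode(), hashlib.sha256).hexdigest()

board = []                           # stands in for alt.anonymous.messages

# Alice posts: encrypt for her friends, attach a signature.
post = toy_encrypt("lunch at noon?", {"bob", "carol"})
board.append((post, sign("lunch at noon?")))

# Bob downloads everything, decrypts what he can, verifies the signature.
for ct, sig in board:
    msg = toy_decrypt(ct, "bob")
    if msg and hmac.compare_digest(sig, sign(msg)):
        print("from alice:", msg)    # prints: from alice: lunch at noon?
```

Anyone not in the recipient set downloads the same ciphertext from the board but gets nothing out of it, which is the whole trick.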

~~~
hrjet
Your example is practical, and something I might even like to use. In theory,
each user could run their own Usenet server and that could be called P2P. But
if the default configuration is to use a shared server, it doesn't sound P2P
to me.

I will read up more on Usenet though. I have used it but don't know its
internals.

------
freework
This sounds kind of like my project, LibraryDSS:
[https://github.com/priestc/LibraryDSS](https://github.com/priestc/LibraryDSS)
although, I'll probably never finish it

I was disappointed when the PURSUIT website turned out to be nothing but
design documents and "mission statements". There doesn't seem to be any actual
software anywhere...

------
eloff
So like a P2P CDN. Big deal. A CDN like Amazon's is cheap and efficient. For
web applications, e.g. dynamic content, a CDN isn't helpful, and neither will
this system be. I don't see this as revolutionary.

~~~
CoffeeDregs

        For web applications, e.g. dynamic content,
        a CDN isn't helpful, and neither will this system be.
    

CDNs are helpful for "web applications". See:
[http://www.allthingsdistributed.com/2012/05/cloudfront-dynam...](http://www.allthingsdistributed.com/2012/05/cloudfront-dynamic-content-support.html)
Lots of "dynamic" content can be cached for _some_ amount of time. At the very
least, the truly static parts of a page can be cached and the dynamic stuff
can be AJAXed or SSIed in.
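For example, a response that is personalized in the browser but safe to cache briefly at the CDN edge might carry headers like these (the values are illustrative, not a recommendation):

```python
# s-maxage applies to shared caches (the CDN edge); max-age to the browser.
# Here the edge may serve the cached copy for 30 seconds, while each
# browser always revalidates.

headers = {
    "Cache-Control": "public, max-age=0, s-maxage=30",
    "Vary": "Accept-Encoding",   # cache compressed and plain variants separately
}
print(headers["Cache-Control"])
```

Even a 30-second shared-cache window collapses a traffic spike into roughly one origin hit per edge location per half minute.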

------
tbrownaw
Would having to query a DHT or whatever for each piece of data really be
faster than just doing a single DNS lookup at the beginning and then knowing
where to get everything?
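As a rough comparison, under the usual idealization a Kademlia-style DHT lookup takes about log2(n) hops, and that cost is paid per object rather than once per site:

```python
import math

# Idealized hop counts for a Kademlia-style DHT lookup at various
# network sizes. DNS, by contrast, is roughly one (often cached)
# lookup up front, after which you know where everything lives.

for n in (1_000, 1_000_000, 1_000_000_000):
    print(n, "nodes ->", math.ceil(math.log2(n)), "hops")
# 1000 nodes -> 10 hops
# 1000000 nodes -> 20 hops
# 1000000000 nodes -> 30 hops
```

Caching popular mappings near the edge is the standard answer, but the per-object overhead the commenter points at is real.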

------
fowkswe
Sounds like BitTorrent

------
GunlogAlm
Sort of like... BitTorrent?

~~~
sixothree
[https://news.ycombinator.com/item?id=6640504](https://news.ycombinator.com/item?id=6640504)

