

Let's remake the Internet - ShawnJG

One of the best things about writing software is that you constantly get to make things that didn't exist before. The Internet was originally created so scientists could share information almost instantaneously, and it has since evolved into something most people could not have conceived back then. Now that we know its capabilities, if we could remake the Internet from the ground up, how would you do it, and what would you include? Even with Web 2.0 there's a lot of legacy involved; if we were completely unshackled from the past and could remake it in HN's image, how would it look?

I know this is a radical idea, but just imagine making something completely new to compete with the entire Internet! Forget targeting individual sites or services - what could be awesome enough to change the entire face of online computing?
======
sandroyong
I would replace the server - I mean, do we really need servers? In short, the
internet, and networks in general, were not designed with security in mind. (I
would like to apologize in advance for the lengthy response, but I want to make
myself clear.)

The client-server model is the most widely adopted model of networking. As the
basis of the internet, it is also the most difficult to depose. Even in its most
basic form, it precludes security and defeats attempts to secure interacting
systems, for the following reasons:

1) The server is just that - a slave that serves its masters, the clients.
Despite security measures to limit and control client access, the server must
at least:

   a) Listen for client requests - clients must be able to locate servers, and
   thus can target them.

   b) Attempt to interpret each request, then decide whether to grant or deny
   it - performing redundant (permissions are decided a priori) and risky work
   in its own environment.

2) Clients need servers throughout their entire presence on the network, so
servers remain open to attack throughout a client's session.

3) Servers have access to all resources, and to other clients' data, during
any session with an individual client.

4) Exploiting a server confers the ability to exploit all of its clients and
all of its resources.
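
Those "handles" can be sketched in a few lines of Python. Everything here is invented for illustration (the data store, the request format, the function names are not from any real system); the point is just to show, concretely, how a server that parses untrusted input while holding every client's data turns one missing check into total exposure:

```python
# Point 3: one server process holds every client's resources.
DATA = {
    "alice": {"notes.txt": "alice's secret"},
    "bob":   {"notes.txt": "bob's secret"},
}

def handle_request(session_user: str, raw: str) -> str:
    """Point 1b: the server must interpret untrusted input and decide,
    in its own environment, whether to grant or deny the request."""
    parts = raw.split()                      # the interpretation step
    if len(parts) != 3 or parts[0] != "GET":
        return "400 bad request"
    owner, name = parts[1], parts[2]
    # Points 2 and 4: this single check is all that separates one
    # client's data from another's. Remove it, and exploiting the
    # server exposes every client, because the server can reach it all.
    if owner != session_user:
        return "403 forbidden"
    return DATA.get(owner, {}).get(name, "404 not found")

print(handle_request("alice", "GET alice notes.txt"))  # alice's secret
print(handle_request("alice", "GET bob notes.txt"))    # 403 forbidden
```

A real server adds authentication, TLS, and so on, but the structure is the same: a listener, a parser, and an authorization decision made in the same process that holds everything.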

The mode of implementation of networks introduces even more insecurities:

1) The system relies on explicit security-related information - such
information can be falsified, thus the system cannot support non-repudiation.

2) The system transports clients to any destination they specify - it is up to
the destination to defend itself against unwanted guests.

3) Any client or server on the network can be discovered by any other client
or server - the network can be searched systematically to find and exploit
vulnerable targets.

In short, the server is, and will continue to be, a target for hackers.
However, if the target (the server) were removed, would we still have security
breaches? Probably not - but, more importantly for all users, we would not
have an internet as we know it either. Therefore, we need a medium that
accomplishes both: a hardware element that allows for network communications
(but without the insecure handles that make up a server) while still letting
us build the network we now call the internet. Is that pure fantasy, or are we
just too ingrained in what we have to 'think outside the box'?

Let’s look at why. There have been admirable strides toward making today’s
systems more secure. There are also significant, proactive efforts focused on
finding vulnerabilities and developing patches before they can be exploited.
But although security measures exist, none are truly pre-emptive. In my view,
everyone is just making variations of the same thing and, more importantly,
‘barking up the wrong tree’! Current defense strategies follow three common
underlying themes:

1) most tend to focus on particular attackers, attacks, or attack
methodologies;

2) many aim to defend particular targets or groups of targets;

3) all are confined to, and subverted by, the existing framework for computing
and networks.

Taken together, current defensive strategies cannot 1) prevent most unknown
attacks, 2) make targets unavailable for exploitation, or 3) compensate for
design flaws in the system being defended. Would it not be more instructive,
then, to examine what enables attacks to be initiated in the first place -
i.e., what are the handles that allow the system to be breached?

As I alluded to above, the interaction between client and server systems - the
model on which (often flawed) software is built - presents too many handles
for misuse and abuse. This argument therefore points to one and only one
common denominator: the network is inherently insecure by design. That is the
problem security people should be addressing, not yet another variant of a
patch. So, if I could travel back to the 1990s to design the networks/internet
(as you have suggested), knowing what I know now, what would I design, and
how? More importantly, can it be redesigned today? Or should security product
developers stay content, conform to the present hardware and software
platforms, and just develop ‘patches’? Even if software could be made flawless
in logic, it may not be feasible to prevent its misuse. Don’t forget the human
element - that in itself is the weakest ‘link’ in network security.

We are therefore left with redesigning the computer and network environments
so that flawed software (and people) can operate securely, and so that such
software (and people) is rendered harmless should its flaws (or their actions)
be targeted for exploitation. This implies that the network, and the elements
of the network, be fault-tolerant - or, more to the point, secure by design.
That way we combine the benefits of a network with protection from network
insecurity in one all-encompassing computer-network infrastructure.

My suggestion: it should be conceivable, and possible, to map and mete out
resources to clients/users as they are accepted onto the network. This is the
basis of client completion: for each user/client, a host environment is
created in which all allowable services and resources are locally available
and locally supported. We already pre-determine the "stuff" the user needs
access to on the server - so why not bypass the server? Since access to, and
management of, these resources is local, there is no need for - or means of -
interacting with the server or the network; the client is thus complete: a
discrete entity on the network. The containment of this discrete component
must be as complete as possible, to ensure there is no "leakage" into other
client or server environments if the user exploits any vulnerability in the
client.
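
As a toy sketch of what "client completion" might mean (my reading of the idea, with invented names - the master store, the `ClientEnvironment` class, and the resource layout are all hypothetical): at admission time, only the user's pre-determined slice of resources is materialized into an isolated local copy, after which every access is local and nothing the client does can reach back into the shared store or into another client's environment:

```python
import copy

# Hypothetical master record of what each user is entitled to.
# Permissions are decided a priori, before any session begins.
MASTER = {
    "alice": {"files": {"a.txt": "A"}},
    "bob":   {"files": {"b.txt": "B"}},
}

class ClientEnvironment:
    """A 'complete' client: its allowable resources are copied in at
    admission, so all later access is local - no server to query,
    and no path from this environment to any other."""

    def __init__(self, user: str):
        self.user = user
        # Deep copy = containment: mutations here cannot "leak"
        # back into MASTER or into other clients' environments.
        self.resources = copy.deepcopy(MASTER[user])

    def read(self, path: str):
        # Purely local lookup; no request parsing, no round-trip.
        return self.resources["files"].get(path)

alice = ClientEnvironment("alice")
alice.resources["files"]["a.txt"] = "tampered"   # contained to alice
assert MASTER["alice"]["files"]["a.txt"] == "A"  # master untouched
print(alice.read("a.txt"))  # tampered
```

The interesting (and hard) part, which a sketch like this glosses over, is enforcing that containment in hardware rather than in a Python object - that is where the "secure by design" claim would have to be earned.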

I can see a distinct internet, separate from the current one, and more
businesses and services coming out of this model.

OK, I've said enough...hopefully that will spur more discussions on this
subject =)

------
nextparadigms
Make it even more decentralized and more P2P based. Eliminate as many points
of failure as possible. Use something like the Phantom Protocol for increased
freedom and security as well:

<http://code.google.com/p/phantom/>
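
To illustrate the "eliminate points of failure" part (a toy sketch, not how Phantom itself works - the `Peer` class and `fetch` helper are invented): if content is addressed by its hash and replicated across peers, any single node can disappear and the data is still retrievable and verifiable:

```python
import hashlib

def content_id(data: bytes) -> str:
    """Content addressing: the name of a blob is its SHA-256 hash."""
    return hashlib.sha256(data).hexdigest()

class Peer:
    """A node that both stores and serves blobs - no client/server split."""
    def __init__(self):
        self.store = {}

    def put(self, data: bytes) -> str:
        cid = content_id(data)
        self.store[cid] = data
        return cid

    def get(self, cid):
        return self.store.get(cid)

def fetch(peers, cid):
    """Try peers in turn; the hash lets us verify whatever we get back,
    so we need not trust any individual peer."""
    for peer in peers:
        data = peer.get(cid)
        if data is not None and content_id(data) == cid:
            return data
    return None

peers = [Peer(), Peer(), Peer()]
cid = None
for p in peers:                      # replicate to every peer
    cid = p.put(b"remake the internet")
peers[0].store.clear()               # one peer fails entirely...
print(fetch(peers, cid))             # ...the content is still retrievable
```

Real P2P systems add routing (DHTs), incentives, and anonymity layers on top, but replication plus content addressing is the core of why there is no single target to take down.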

