
Decentralisation: the next big step for the world wide web - carsonfarmer
https://www.theguardian.com/technology/2018/sep/08/decentralisation-next-big-step-for-the-world-wide-web-dweb-data-internet-censorship-brewster-kahle
======
sixdimensional
It strikes me that, if we are ever to expand the Internet into space on a
universal scale (a la Vint Cerf and "delay tolerant networking", as an
example), the inherent physics problems involved with distances and
connectivity in space would probably make decentralization an absolute
requirement. I mean, it seems it would not be uncommon for there to be a
"local net" and a "universal peer-to-peer or mesh network net".

We're obviously a long way off from colonizing space and needing the Internet
to spread, but we still have the physics problems here on Earth.

I'm not convinced that centralization in its current iteration (cloud
operators controlling huge infrastructures) is the best long-run solution. As
we saw with the recent Azure outage in South Central US, even huge
infrastructure can have problems.

Secure decentralization has seemed like a panacea for a long time - for all
things that resemble a public utility. Even things like the power grid.

~~~
amelius
On a related note, I'm wondering how distributed systems that rely on atomic
clocks (e.g. Google Spanner) would work in the space era, given that
relativity says that there's no such thing as a global clock.

~~~
gunnihinn
I suppose those continue working fine in very local systems (contained in a
ball of a few light-seconds radius) whose components move at speeds where
relativistic effects can be discarded. Drop those constraints and you also
need to drop even system-local globality because of relativity.

~~~
gunnihinn
You _could_ introduce the One True Lamport Clock, and as long as you're in its
light cone you can get global synchronization, but that comes at the cost of
having to learn a lot about patience.
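For anyone unfamiliar, a Lamport clock is only a few lines of code; a minimal sketch in Python (class and method names are my own):

```python
class LamportClock:
    """Logical clock: orders events without any shared physical time."""

    def __init__(self):
        self.time = 0

    def tick(self):
        # Local event: advance the counter.
        self.time += 1
        return self.time

    def send(self):
        # Stamp an outgoing message with the current logical time.
        return self.tick()

    def receive(self, msg_time):
        # Merge rule: jump past anything the sender has already seen.
        self.time = max(self.time, msg_time) + 1
        return self.time

a, b = LamportClock(), LamportClock()
a.tick()              # a's clock: 1
stamp = a.send()      # a's clock: 2, message stamped 2
b.receive(stamp)      # b's clock: max(0, 2) + 1 = 3
```

The "patience" part is exactly the merge rule: nothing forces the stamped message to arrive quickly, the ordering just stays consistent whenever it does.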

~~~
AllegedAlec
That would only work up until the point at which we start to travel at
appreciable fractions of c.

------
kickscondor
Beaker Browser (mentioned as one of many possibilities in this article) is the
real deal. If it doesn't get you fired up about the possibilities, then - well
let me explain I guess:

* The interface is dead simple - share this folder, done.

* It is a read-write browser. Netscape (and other browsers) used to be this way - they had some limited HTML creation tools. Beaker brings this back in the form of making an "editable copy" of a website. It's a choice in the address bar.

* Making an "editable copy" doesn't have to mean you're now editing raw HTML. An editable copy can direct how it is edited through JS. (See the recently released "dead-lite" for an example of this.)

All these attempts are exciting but I'm actually starting to use Beaker
because it's so useful even without adoption.

~~~
zmw
I checked out Beaker Browser, and apparently it's based on the Dat project
[1], which seems to be very similar to IPFS. Then apparently it follows that,
just like IPFS, you can't throw random things onto the network and expect it
to stick; you need to pay someone for hosting and bandwidth (that someone
could be yourself) to have it pinned, and in order to have it available
worldwide at all times you still need to pay for a CDN of sorts — the Linux box
in your closet, or worse, your laptop that sometimes goes offline just won't
cut it. Eventually it's just another protocol to copy stuff around, where
stuff originates from various servers (your browser basically embeds a server,
capable of serving stuff), with the possible benefit that popular stuff may be
p2p'ed (but if you're a business you probably can't rely on that anyway). I
fail to see how it's radically different.

(Also, I'm not even sure how you could p2p private user data, unless you
expect everyone to carry around one or more Yubikeys, or implant chips into
fingers or something; plus all devices need to buy into that. But I haven't
given that much thought.)

[1] [https://datproject.org/](https://datproject.org/)

~~~
pfraze
Some things in p2p hypermedia (dat) that aren't possible with http/s:

* You can generate domains freely using pubkeys and without coordinating with other devices, therefore enabling the browser to generate new sites at-will and to fork existing sites

* Integrity checks & signatures within the protocol, which enable multiple untrusted peers to 'host'. This also means the protocol scales horizontally to meet demand.

* Versioned URLs

* Protocol methods to read site listings and the revision history

* Offline writes which sync to the network asynchronously

* Standard Web APIs for reading, writing, and watching the files on Websites from the browser. This means the dat network can be used as the primary data store for apps. It's a networked data store, so you can build multi-user applications with dat and client-side JS alone.

I'm probably forgetting some. You do still need devices which provide uptime,
but they can be abstracted into the background and effectively act as
dumb/thin CDNs. And, if you don't want to do that, it is still possible to use
your home device as the primary host, which isn't very easy with HTTP.
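The "multiple untrusted peers can host" point can be sketched in a few lines of Python. This is a toy stand-in: real dat addresses are signing pubkeys over an append-only log, not plain content hashes, but the verify-whatever-any-peer-hands-you idea is the same:

```python
import hashlib

def addr(content: bytes) -> str:
    # Toy address: a hash of the content. (dat actually uses a pubkey
    # that signs an append-only log, so the content can change over time.)
    return hashlib.sha256(content).hexdigest()

def fetch(address: str, peer_store: dict) -> bytes:
    # Any untrusted peer may answer the request; the client verifies
    # integrity itself, so it doesn't matter who served the bytes.
    blob = peer_store[address]
    if hashlib.sha256(blob).hexdigest() != address:
        raise ValueError("peer served tampered content")
    return blob

peers = {}                       # stands in for the swarm
site = b"<h1>my dat site</h1>"
peers[addr(site)] = site
assert fetch(addr(site), peers) == site
```

Because the check happens on the client, a "host" can be any dumb box that holds the bytes, which is what makes the thin-CDN model above work.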

~~~
zmw
Thanks for the list.

> You can generate domains freely using pubkeys and without coordinating with
> other devices, therefore enabling the browser to generate new sites at-will
> and to fork existing sites.

Not entirely sure what you mean,

- We can generate HTTP sites at will (all you need is an IP address);

- We have existing protocols for mirroring sites (not implemented
universally, but neither is dat://);

- When you talk about pubkeys with coordination, there are obvious problems
like the last paragraph of my original comment, right? Again, I'm probably
misinterpreting what you're saying.

> Integrity checks & signatures within the protocol, which enable multiple
> untrusted peers to 'host'.

Basically subresource integrity? Granted, with this protocol you can in theory
retrieve objects from any peers (provided that they actually want to cache/pin
your objects), not just the ones behind a revproxy/load balancer, so that's a
potential win from decentralization.

> Versioned URLs

We can have that over HTTP, but usually it's not economical to host old stuff.
In this case, someone still needs to pin the old stuff, no? I can see that
client side snapshots could be more standardized, but we do have WARC with
HTTP.

(EDIT: on second thought, it's much easier to implement on the "server"-side
too.)

> Protocol methods to read site listings and the revision history

> Standard Web APIs for reading, writing, and watching the files on Websites
> from the browser.

You can build that on top of HTTP too.

My takeaway is it's simply a higher-level protocol than HTTP, so it's unfair
to compare it to HTTP. Are there potential benefits from being decentralized?
Yes. But most of what you listed comes from being designed as a higher-level
protocol.

~~~
pfraze
> We can generate HTTP sites at will (all you need is an IP address);

That's not really so easy from a consumer device with a dynamic IP.

> - When you talk about pubkeys with coordination, there are obvious problems
> like the last paragraph of my original comment, right? Again, I'm probably
> misinterpreting what you're saying.

You do need to manage keys and pair devices, yeah.

> My takeaway is it's simply a higher-level protocol than HTTP, so it's unfair
> to compare it to HTTP. Are there potential benefits from being
> decentralized? Yes. But most of what you listed comes from being designed as
> a higher-level protocol.

The broader concept of Beaker is to improve on the Web, and we do that by
making it possible to author sites without having to set up or manage a server.

Decentralization is a second-order effect. Any apps that use dat for the user
profile & data will be storing & publishing that data via the user's device.
Those apps will also be able to move some/all of their business logic
client-side, because they're just using Web APIs to read & write. Add to that
the forkability of sites, and you can see why this can be decentralizing: it
moves more of the Web app stack into the client-side where hopefully it'll be
easier for users to control.

~~~
zmw
> Decentralization is a second-order effect.

I see, I was looking at it backwards.

------
antpls
When we think about decentralization, we sometimes think about uncontrolled,
extreme decentralization. That's the ideal: no single point of failure, with
everyone hosting their data where they want, and deciding who accesses what
about them.

Practically, as pointed out in the article, there are laws to comply with, and
those are IMHO the biggest obstacle to decentralization.

The fair middle ground between extreme centralization à la Facebook/Twitter
and total network anarchy is something based on federation, like email and
Mastodon. With federation, there are several providers for the same end-user
application, with native data exchange and interoperability. The idea is to
give the power to anyone with hosting capabilities to compete with the Giants,
even if only a few domains will actually survive (like Gmail, Hotmail, etc.,
probably because of network effects and funds).

What we need is a framework, or a backbone, that allows people to easily
create new federated-native apps ("dapps") without thinking about consensus
issues or protocol versioning, and with native legal compliance.

~~~
PinkMilkshake
> What we need is a framework, or a backbone, that allows people to easily
> create new federated-native apps ("dapps") without thinking about consensus
> issues or protocol versioning, and with native legal compliance.

I agree that this is probably the way forward. The only downside is how your
identity is tied to the service provider you choose. It was a PITA when
Lavabit went down and I lost that email address.

~~~
synctext
> The only downside is how your identity is tied to the service provider you
> choose.

Fully agree with this. The link to identity is not often brought up. I run a
university lab focussed on re-decentralisation of the Internet as a day job.
We focus on identity & p2p + trust.

Beaker browser is impressive early work focussed on the raw bit transport. It
re-uses DNS for global discovery; it's hard to do everything decentralised at
once. How do you do global search or spam control on a decentralised Twitter?

The hard issue we need to solve in the coming decade is the governance of such
systems. Ideally it would rule itself. Define a self-governance system as: a
distributed system in which autonomous individuals can collectively exercise
all of the necessary functions of power without intervention from any
authority which they cannot themselves alter.

~~~
naveen99
Would be nice if all devices that wanted to be routable could get an IPv6
address, including mobile, and we didn't have to worry about TURN and STUN...

------
bobm_kite9
This seems to me to miss the problem completely. The internet essentially
solved the problem of _distribution_. That meant disintermediating away from
publishers, and allowed anyone to publish.

The problem then moved on to being one of _curation_. Companies such as
Google, Facebook, and Amazon are in the business of providing curation: i.e.
taking away the leg-work of deciding what we should attend to.

A de-centralized web doesn't appear to decentralize the problem of curation at
all, which means we are going to still end up with centralized curation and
the same or similar monopolies on attention that we have now.

------
drasticmeasures
This kind of decentralisation -- taking existing platforms controlled by
megacorps and making them P2P...

...feels, to me, like a huge mistake.

How would one eliminate hate speech and toxic content from it? Or illegal
content? Or anything you put there and need removed to keep living your life
freely? The technologists developing this tech hand-wave these concerns away
citing "freedom of speech" -- but one's freedom ends where another's begins,
and hate speech, toxic content, illegal content, and not being able to have
what you said or did forgotten online all curtail someone's freedom.

And by making it decentralised, they're just making it harder for people who
are the victims of these problems to hold the people responsible accountable
and to stop them. These technologists want freedom of speech at the expense of
everyone else's freedom.

~~~
zzzcpan
> How would one eliminate

You simply make your own choices and don't follow/subscribe/view all that
illegal, toxic, hateful content. You know, the same way you do today by not
visiting all those illegal, toxic, hateful websites. They still exist, though,
for those who don't share your views on policing content for other people.

~~~
drasticmeasures
Ignoring the problem doesn't make it go away for the victims.

The women whose boyfriends posted private sex pictures as revenge, or the
minorities who will be the victims of hate groups organizing on social media,
the children who were filmed while being raped and have their video
circulating online, the victims of bullying whose bullies are empowered by
other people seeing it and not doing anything to stop them...

You can choose to ignore this when you see it, but the victims can't, and it's
for their freedom that I'm concerned for.

~~~
zzzcpan
But you are advocating exactly that - ignoring the problem by blocking content
instead of dealing with the people engaging in those behaviors.

~~~
Faark
Yeah, it's kind of ridiculous having e.g. the German government expect
Facebook to judge which posts are hate speech and co. and then delete them. If
that stuff is illegal, then the people posting it should be held accountable
by our legal system. But politicians love using those companies as an easy
scapegoat. Anyway, p2p solutions usually try to make finding a person to hold
accountable harder than current centralized services do.

------
tannhaeuser
I'm all for decentralized apps, but before venturing into that kind of
complexity, how about getting back control over the Web? It was created as a
decentralized net (if not in the blockchain sense) and is the work of an
entire generation.

~~~
dcposch
That's what this is about.

Current web tech is inherently centralizing. Say you want to create an
experience like Instagram or Twitter, delivered via HTTP. You have to pay for
bandwidth, CDNs, storage, app servers, DB servers, etc etc. At scale, it's
millions a month. So only corporations can do it, and with a few exceptions
(eg Craigslist, Stack Exchange) they end up monetizing and "growth hacking" in
user hostile ways.

The big open question is: can we create an experience as compelling as
Instagram or Twitter over the P2P web?

It's a hard technical challenge, and today the answer is no. But if we get
there, then internet mass media can be delivered via open source projects over
open protocols, with a bunch of competing clients to choose from. No central
organization controls and monetizes the thing.

Like BitTorrent, but for applications more complex and interactive than just
file sharing.

--

If you're interested, here are imo the most compelling projects in this space:

- Dat

- Beaker

- Augur

- OpenBazaar

- Patchwork / Secure Scuttlebutt

They are working on overlapping subsets of the same fundamental challenges,
eg:

- How does a node choose what to download? The BitTorrent answer is "only
things the user explicitly asked for". The blockchain answer is "the entire
global dataset since the start of time". For something like a decentralized
Twitter, both of those are unsatisfactory; you need something in between.

- How do you log in? Current systems either have no persistent identity at
all (eg BitTorrent) or they just generate a local keypair, and it's your job
to back it up and never lose it (eg SSB, Dat, all blockchain protocols). Both
are unacceptable for wide-audience social media. People lose their devices,
get new devices, forget their passwords, etc all the time. They expect and
rely on password reset, etc.

So there are a lot of hard tech and UX problems left unsolved, but also a lot
of recent projects making solid progress.

~~~
magila
The hosting costs for things like Facebook and Twitter are a pittance compared
to the cost of employing all of the engineers/designers/etc who enhance and
maintain those services. _That_ IMO is the biggest economic challenge facing
decentralized applications.

You can make some nice proof-of-concepts with a group of volunteers, but the
effort required to provide a UX comparable to centralized services is going to
take more than a handful of people working evenings and weekends.

Decentralized services generally do not afford the same monetization
opportunities as central services. Decentralized proponents consider this a
feature rather than a bug, but it leaves open the question: Who is going to
pay for all of this?

~~~
adventured
> The hosting cost for things like Facebook and Twitter are a pittance
> compared to the cost of employing all of the engineers/designers/etc who
> enhance and maintain those services.

Facebook had $20.4 billion in operating expenses in 2017. Less than 1/3 of
that was the cost of its 25,000 employees (as of the end of 2017). Facebook is
spending more on its infrastructure than it is on all of its employees
combined (and much more so when you narrow it to just engineers). Engineers
are maybe 1/5 of its operating costs, including their all-in costs.

Both Facebook and Alphabet had roughly $15 billion in total capex for 2017.
Data centers, networks, electricity, et al. cost a lot at that scale. It's not
a pittance. Facebook spent ~$7 billion in 2017 on capital expenditures related
to their network, data centers, etc.

Facebook's first Asia data center is a billion dollars to just start up.[1]
When they put up new data centers in places like Henrico County VA, New Albany
OH, or Newton County GA, it's similarly nearly a billion dollars a shot to
start those up. Once you have dozens of those operating, it's billions of
dollars per year to operate them all.

[1] [https://money.cnn.com/2018/09/06/technology/facebook-singapo...](https://money.cnn.com/2018/09/06/technology/facebook-singapore/index.html)

~~~
jonnydubowsky
I wonder how much of that cost is toward user-facing improvements and how much
is toward extracting additional profit out of the surveillance economics
model? As Mastodon and Patchwork and other federated social media platforms
continue to grow, it would be an interesting and useful effort to analyze the
cost structure of these alternatives.

------
brianzelip
A good recent podcast about the decentralized web, with a technical focus, is
JS Party #42,
[https://changelog.com/jsparty/42](https://changelog.com/jsparty/42).

Featuring Mathias Buus and Paul Frazee from the Beaker project.

~~~
kickscondor
Thank you for mentioning this! I hadn't seen it passed around, I guess.

------
jotm
How long until it reverts back to some nodes having way more
influence/power/data than the others?

This is not only a technology problem, it's (mostly, I'd say) a social one.
Humans will always want more power and control, whether it's in real life or
online.

Every single type of governance has fallen victim to human greed and ambition,
as will any kind of Internet, I believe.

Fix the users - save the Internet! :)

~~~
elvinyung
I think a lot about this.

In _A Thousand Plateaus_, Deleuze and Guattari talk about the opposition
between the state apparatus and the "war machine" (their term for a
nomadic/decentralized structure). They talk about how it seems like nomadic
societies are primitive, but actually a lot of nomadic societies have
"collective mechanisms of inhibition" to ward off the formation of a state
apparatus, by preventing power from accumulating within any one party and
"evening it out" among everyone.

The applicability of D&G's ideas on the war machine to our current problem of
platform power is immediately apparent. A centralized platform is exactly like
a state apparatus. In our situation the collective mechanisms of inhibition
might be something like stronger/more proactive antitrust laws to break
up/nationalize entities that become infrastructural components of the society.

But as you've mentioned, I think this problem of "uneven development" is a
feature of any marketplace-like structure. In sufficiently large numbers, a
power law tends to assert itself with no other checks on power. This is why
blockchains by themselves won't solve the problem. The debate, then, shifts to
be about whether this is a feature or a bug, which is something that I'm never
sure about.

To close, another quote from ATP comes to mind ("smooth space" is another term
they use for nomadic spaces):

> Smooth spaces are not in themselves liberatory. But the struggle is changed
> or displaced in them, and life reconstitutes its stakes, confronts new
> obstacles, invents new paces, switches adversaries. Never believe that a
> smooth space will suffice to save us.

~~~
hestefisk
Awesome to see others on HN loving D&G. But perhaps also power is cyclical.
When the web was first popularised, it had the same potential as what DWeb has
now. TCP/IP was written to be inherently distributed and provide resilient
routing. Then, as soon as it starts to threaten existing power structures,
forces kick in to try and stabilise it through control, surveillance, and
‘governance’. It becomes part of the rhizome, the rhizomatic system of power,
that the new system (in this case TCP/IP / www) set out to challenge, creating
an even more complex, ever-evolving rhizome of power (surveillance, paywalls,
censorship). The same thing happened with other revolutions throughout history
— they transformed into power structures similar to the power base they set
out to challenge, as an unintended consequence.

~~~
elvinyung
Well, I think this cyclical pattern shows exactly why the thinking
around the war machine is so important. Thinking about this _very_ naively, to
get closer to the kind of smooth space that D&G conceptualize, it is necessary
to have some kind of homeostatic system that _recognizes_ abstractly when
power (and I'm using this term in a very naive, non-Foucauldian way) is being
disproportionately concentrated in any one body, and corrects accordingly.

That said, as from my previous comment, I'm not totally confident that this
kind of decentralization is even optimal, but that's a story for another time.

------
orasis
Large systems want to centralize for the sake of efficiency. I was among the
first wave of P2P hackers in the 2000s and we learned the hard way that
decentralization leaves a LOT to be desired.

~~~
antt
Not to mention that back then we had a performance ratio of edge to node that
was an order of magnitude better than the one we have today. A laptop today
and one from 2010 don't have that much difference in performance; a data
center from 2010 and one from today are night and day.

And who will pay for it? With that experience, Xanadu seems like the only
solution. The reason it's in development hell is that the problem it's trying
to solve is so hard.

------
wslh
There is an easy contrarian view about decentralization: even if a
decentralized protocol wins, in the end it is all about the
UI/UX/aggregation, which clearly cannot be decentralized. For example,
OpenBazaar can be great, but whoever develops the best UI and search engine
will win.

~~~
amelius
There is a semi-centralized solution: make search run on a handful of central
machines, and check if the results of (e.g.) two of them match.
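A minimal sketch of that cross-check in Python (the function name and the node interface are hypothetical stand-ins for the handful of central search machines):

```python
from collections import Counter

def quorum_search(query, nodes, agreement=2):
    """Send the query to several independent search nodes and accept a
    ranking only if at least `agreement` nodes returned the same one."""
    results = [tuple(node(query)) for node in nodes]
    ranking, votes = Counter(results).most_common(1)[0]
    if votes < agreement:
        raise RuntimeError("search nodes disagree; no quorum")
    return list(ranking)

# Two honest nodes agree; one tampered node is outvoted.
honest = lambda q: ["result-a", "result-b"]
tampered = lambda q: ["spam", "result-a"]
assert quorum_search("dweb", [honest, honest, tampered]) == ["result-a", "result-b"]
```

The trust assumption shrinks from "this one search operator is honest" to "at least `agreement` of these independent operators are honest".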

~~~
fghtr
This problem is essentially solved by YaCy: [http://yacy.net](http://yacy.net)

~~~
wslh
I don't think either of you understood what I said. It does not matter if you
can find a decentralized solution to a problem, because in the end the user
will access it through a UI/UX/app that, in your example, can choose how to
rank the search results beyond what the protocol dictates.

In the past people used the simple "mail" command to read email, but now they
choose Gmail or others because the UI/UX (or any other reason) is better or
they like it more. The SMTP (federated) protocol is hidden.

------
briatx
Repeat after me: The Internet is already decentralized. The web is already
decentralized.

Blockchains do not have a monopoly on decentralization. People who assert this
are trying to redefine the term to mean some kind of extreme P2P model that
fits their narrative.

~~~
hazz99
Repeat after me: It's effectively centralised. It's effectively centralised.

Almost all our traffic goes through Google, Amazon and Facebook. It's
extremely centralised.

If Amazon servers go down, so does a significant portion of the internet.
That's centralisation at work!

Blockchain tech doesn't claim to have a monopoly on the term
'decentralisation', it's just re-popularised the technology.

~~~
briatx
> Repeat after me: It's effectively centralised. It's effectively centralised.

So is bitcoin. Only 3 or 4 companies own the majority of mining.

Companies on the internet are centralized, but not the internet itself.

~~~
hazz99
Yeah, Bitcoin-based consensus probably isn't the best choice for a project
like this. I'd love to see a project that uses non-blockchain (but similarly
publicly immutable) tech, like Nano or Iota.

Still some issues with centralisation (since consensus is achieved through
vote delegates), but that's _much_ easier to fix than redistributing
hashpower.

------
empath75
> And one of those is speed. Because of the way the DWeb works differently
> from the current web, it should intrinsically be faster

This seems wrong to me.

~~~
lbriner
I think the logic is that instead of everything going via a handful of servers
at the big companies, it goes through many more thousands of individual PCs
but of course, that depends on the technology used. Torrents _should_ be
faster but only if lots of people seed and don't simply download and
disconnect.

~~~
na85
Torrents themselves are often faster, but they destroy the speed of anything
else on the same pipe.

~~~
voxadam
I found that configuring Smart Queue Management (SQM) on my OpenWrt router
drastically improved this issue.

[https://openwrt.org/docs/guide-user/network/traffic-shaping/...](https://openwrt.org/docs/guide-user/network/traffic-shaping/sqm)

[https://openwrt.org/docs/guide-user/network/traffic-shaping/...](https://openwrt.org/docs/guide-user/network/traffic-shaping/sqm-details)

[https://www.bufferbloat.net/projects/codel/wiki/](https://www.bufferbloat.net/projects/codel/wiki/)

[https://www.bufferbloat.net/projects/codel/wiki/CakeTechnica...](https://www.bufferbloat.net/projects/codel/wiki/CakeTechnical/)
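For reference, those guides boil down to a small UCI file; a sketch of what `/etc/config/sqm` might look like with the cake qdisc (the interface name and the download/upload rates in kbit/s are placeholders for your own line):

```
config queue 'wan'
        option enabled '1'
        option interface 'eth0'
        option download '85000'
        option upload '10000'
        option qdisc 'cake'
        option script 'piece_of_cake.qos'
```

Setting the rates slightly below your actual line speed is what keeps the queue on the router, where SQM can manage it.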

------
avhwl
There are inherent benefits to large, centralized platforms. Users, especially
content creators, desire a large network of users and excellent UX, both of
which have typically been properties of centralized systems or at least far
easier to achieve with centralized systems. Every decentralized social network
thus far has failed, and the modestly successful ones (Mastodon, for example)
are in fact federated.

Many of the problems alluded to in this article, in particular the privacy
risk of centralized data, are more effectively solved by policy changes and
iterated technology (differential privacy as well as bread-and-butter
cryptography) rather than furious hand-waving about blockchain protocols.

------
troquerre
Handshake.org is tackling this problem by decentralizing DNS and removing the
need for (and vulnerabilities associated with) certificate authorities.

Disclosure: I founded Namebase, which is a registrar for Handshake.
~~~
legionof7
I'm using the Namebase beta right now, it's great!

~~~
troquerre
Awesome, I'm glad you like it! We're shipping a mobile-first redesign soon
that's much better. Lmk if you'd like more testnet HNS when we do :)

------
mikebess
Complete decentralization as a philosophical and absolute goal misses the
point. There are great benefits to decentralization, for sure, just as there
are benefits to centralization. Projects that aim to decentralize everything
“for the sake of it” are doomed to failure. I want control over my privacy, my
spending, my choice of content. But at the same time, I want a great user
interface; I want curated content; I want performance. I want a service, and
I’m willing to pay for it. I don’t have a fundamental problem that there are
big companies out there who provide that service to me, and who make money
doing so. Even, in some cases, a whole lot of money. Good for them.

It boils down to: what’s the best way to provide services that I want?

I’m working on a project to provide a decentralized marketplace for software
and infrastructure services, competing with AWS and Azure. The marketplace
itself is blockchain-based: partially decentralized, but with a permissioned
blockchain that still allows governance, legal compliance, removal of bad
actors, KYC compliance, etc. The kind of things that customers (corporations)
need for them to use the marketplace.

I think we need to be pragmatic about it and figure out where technologies
like blockchains can help build better services, instead of trying to cram
decentralized systems into everything whether it makes sense or not.

------
pankajdoharey
Hasn't half of the decentralisation problem already been solved by the Tor
network, which is encrypted and already decentralised? And they still
regularly identify and stop Tor servers. What makes them so sure that the
decentralised nodes on a bitcoin network cannot be physically identified and
stopped? A truly decentralised network cannot be built on a borrowed network.
A truly decentralised, interruption-free, government-free Internet would
probably be built on a satellite-based network, not this.

~~~
hazz99
Important note: Tor does not encrypt your traffic.

Tor simply hides where your web requests originate from - it's up to you to
visit HTTPS sites and encrypt your communications.

Also, Tor is quite decentralised, but the existence of directory authorities
undermines this, since it presents a centralised component.

~~~
LinuxBender
Sorry you are getting downvoted. This is very much correct, and folks simply
put a lot of faith in the proxy transport as the means to an end. One
vulnerability / bug (Tor has had many) can weaken that link. Tor is rarely
installed correctly or in a secure manner (forcing all packets through it and
dropping anything that leaks from the browser, for starters).

~~~
hazz99
Do you have any links on how to install it properly, and to test that? (Maybe
through Wireshark or something similar.) I admit I haven't used it in-depth
(although I've studied the protocol quite a bit).

~~~
LinuxBender
I don't have one handy, though you might find one in the documentation for
the Tails linux OS.

At a high level, the client workstation must not be allowed to send any
packets to anything other than the SOCKS port running on the Tor host. The
workstation must have a static ARP entry for its gateway. The workstation
should use a ram-disk linux distro and not persist anything to unencrypted
disk. The Tor host must not allow anything inbound other than the Tor SOCKS
port. The Tor node must only speak outbound on 80 and 443 (formerly known as
the fascist firewall setup). Ideally, the Tor node should be running on a
cheap VPS host, paid for with a burner card and accessed via a VPN so that
Tor traffic from the home ISP is not evident. The VPS host should be cycled
from time to time.

This is of course a lot of setup work, but most of it can be automated.

[Edit] Speak of the devil. Here is a zero-day published on the Tor browser [1]

[1] - [https://www.zdnet.com/article/exploit-vendor-drops-tor-brows...](https://www.zdnet.com/article/exploit-vendor-drops-tor-browser-zero-day-on-twitter/)

------
sheeshkebab
With phones doubling in performance every 1-2 years, and desktops/laptops
largely stagnating, it looks like in a few years a server rack's worth of
compute will fit into your smartphone.

And most enterprise software barely needs more than a rack of current-gen
servers (and almost no individual user needs anywhere near that).

So yeah, decentralization will be upon us soon enough.
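A back-of-envelope check of that timeline (the ~1000x compute gap between a phone and a rack is my own rough assumption, not a measured figure):

```python
import math

# Assume (roughly) a rack has ~1000x the compute of a phone today,
# and phone performance doubles every 1.5 years.
gap = 1000
doubling_period_years = 1.5

doublings_needed = math.log2(gap)               # ~10 doublings
years_needed = doublings_needed * doubling_period_years
print(round(years_needed))                      # ~15 years
```

So even granting the doubling continues, "a few years" is closer to fifteen.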

~~~
skybrian
This seems implausible. Moore's law is over and phones are limited by size and
power. There's no reason to believe progress will continue until a phone is
the same speed as a typical desktop machine, let alone a server rack.

~~~
adventured
Interestingly, it's the centralized network that will grow radically more
powerful, while home devices continue to stagnate.

The total capacity of infrastructure entities like AWS will increase by 10x at
a minimum over the next decade. By comparison, your phone or laptop will
modestly nudge forward. Consumers are not going to buy 10x the number of
laptops, desktops and smartphones that they do today, ten years out. Most
likely, those figures will barely move (the smartphone industry is already
stagnating). Most of the incremental spending and investment will go into the
centralized infrastructure by the giants.

Network speeds will continue to increase relatively rapidly. We can easily go
from routine 50-100 Mbps home lines to 1 Gbps over the next decade. We're not
going to see a 10x increase in the power of the average laptop (lucky if it
doubles in ten years). That bandwidth is primarily going to be useful for
streaming/consuming very large amounts of data from epic-scale central systems
for gaming, 4K+, VR, etc. Decentralized systems owned by consumers will be far
too weak to fill that role.
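Annualizing those two trajectories (using the parent comment's own endpoint figures) makes the gap concrete:

```python
# Implied annual growth rates over a decade
years = 10
bandwidth_cagr = (1000 / 100) ** (1 / years) - 1   # 100 Mbps -> 1 Gbps
laptop_cagr = 2 ** (1 / years) - 1                 # laptop merely doubles

print(round(bandwidth_cagr * 100))  # ~26% per year
print(round(laptop_cagr * 100))     # ~7% per year
```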

The AI future isn't going to be decentralized. The very expensive
infrastructure that will demand, and its need to run 24/7, will be centralized
and owned by extraordinarily large corporations.

It's precisely the typical consumer's home hardware that will act as the
ultimate bottleneck guaranteeing that decentralization can never take off.
This has always been obvious; it won't prevent the fantasy from maintaining
its allure, of course. That will perpetually draw headlines and hype in tech,
for decades to come, with no mass-adoption breakthrough.

~~~
skybrian
That sounds very plausible to me, but I still think decentralized server-side
infrastructure has some potential. Sandstorm didn't take off and maybe
Mastodon isn't it, either, but it seems like someone's going to have consumer-
friendly, <$10/year, general-purpose server accounts for running apps. Maybe
some game will make it popular?

------
icc97
I hadn't heard of the Dweb, the term appears to have been coined by Mozilla
[0].

[0]: [https://hacks.mozilla.org/2018/07/introducing-the-d-web/](https://hacks.mozilla.org/2018/07/introducing-the-d-web/)

~~~
lgierth
Nah, long before that, by the dweb community and the Internet Archive, leading
up to the first Decentralized Web Summit / DWeb Summit in 2016 :)

[https://2016.decentralizedweb.net/](https://2016.decentralizedweb.net/)

------
int_19h
Can there be a truly decentralized web without a decentralized physical
layer? Mesh networks, etc.

Which, of course, will not necessarily be connected... but that's a part of
decentralization and freedom. "Diamond Age" and its virtual polities come to
mind.

------
tomhallett
I have started a project as a way to explore how web applications can become
more decentralized. In the blockchain world the starting point is dapps, so
I’m calling mine ddapps. Would love any constructive criticism (not quite
ready for ShowHN yet):

ddapp.org

------
dfee
Is there anywhere we can track the growth of these platforms, though? For
entrepreneurs, we need to understand the opportunity, not to mention OS
developers adding support to their browsers to make this mainstream.

~~~
lgierth
Just the other day, Chrome declared their Intent-to-Implement-and-Ship for
more URL schemes in registerProtocolHandler(), which is a small first step:
[https://groups.google.com/a/chromium.org/forum/#!topic/blink...](https://groups.google.com/a/chromium.org/forum/#!topic/blink-dev/29sFh4tTdcs)

Firefox has been supportive of the effort for some time already, working on
libdweb:
[https://github.com/mozilla/libdweb](https://github.com/mozilla/libdweb)

------
mrhappyunhappy
"Lose your password and you lose access to everything" - I'm sorry, but this
UX blunder just won't fly. If we are to have a decentralized web, we'll need
services for key recovery.

~~~
CodeWriter23
I think you mean "services for account exploitation and warrantless search by
law enforcement"

~~~
UpshotKnothole
I, and I suspect many other people, often have online profiles and existences
and we simply don’t care about the government seeing it. Don’t get me wrong,
I’d like a high security option for _some_ things, but most of what I do
online is frankly trivial nonsense. I’d be much more upset if I lost access to
it forever than I would if some jackbooted thug decided to snoop around it.
Why does everything need to be the digital equivalent of a Supermax prison? I
want the full range, from a guy on furlough to Hannibal Lecter fully
restrained, mask and all.

If you’re only willing to offer me the “Lecter” package, I’m going to pass.

~~~
CodeWriter23
I'm more concerned about criminals using such proposed credential recovery
procedures to rob me. Thanks however for sharing your views about how we
should all trust the government without question.

------
cityzen
I think the web is still so new that people will really struggle to understand
a decentralized web. Even 20 years on, .com still reigns supreme as the
best TLD around.

I'm in tech and I'm interested in a decentralized web but I also feel that
throwing the baby out with the bathwater isn't a great idea. The article says,
"The decentralised web, or DWeb, could be a chance to take control of our data
back from the big tech firms." To me it sounds like we're basically saying,
"ok Facebook/Google/Twitter/Instagram... you're all too big to regulate so
we're going to build A WHOLE NEW INTERNET". If they're smart enough to pollute
the current system, they're smart enough to pollute a new system. In fact,
these corporations are so big that you'll find out eventually that they've
funded quite a bit of this decentralized web.

As a parent, I would feel at least a little better seeing some bankers, Pharma
bros, tech execs, etc. actually go to jail and have their lives ruined for
their blatant disregard of pretty much everything. I don't want to tell my
kids, "well, we're too dumb to regulate the internet so we made another one..
and that one got messed up too... herp derp"

~~~
troquerre
.com still reigns supreme, but ICANN only recently started letting people
register new TLDs, and even then only 500 a year are registered. Handshake is
helping to solve this by decentralizing DNS and letting anyone register new
TLDs, so in the future .com may be way less popular than it is today.

Disclosure: I founded Namebase.io which is a registrar for Handshake

~~~
wolco
When you use a .com you expect the domain to just work.

When you pick a random TLD like .io, for example, you are not getting the
reliability of a .com. .io had a few big issues last year (1/5 of DNS queries
were failing; an ex-Google employee bought ns-a1.io and was able to take over
all .ios).

As more TLDs come from good- and bad-faith actors, people will flock to .com
as a known, respected entity. Limiting things to .com, .org, .net and country
codes and slowly introducing new TLDs made more sense and gave time to
establish trust / create brand awareness. 500 a year creates noise and forces
distrust of any unusual or new TLD.

------
vezycash
IMO, there are two key barriers to a decentralized internet (other factors are
minor in comparison.)

(1) IPv4 (2) Bandwidth limits

IPv6 makes NAT unnecessary. With IP scarcity gone, IP addresses might become
permanent like phone numbers.

ISPs are currently making money off fixed IP addresses. Market forces would
change that eventually.
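The scale difference that makes NAT unnecessary is easy to quantify:

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_space = 2 ** 32     # ~4.3 billion addresses (fewer than people)
ipv6_space = 2 ** 128    # ~3.4e38 addresses

# Even handing every person on Earth a vast block of permanent
# addresses barely dents the IPv6 pool.
per_person = ipv6_space // 8_000_000_000
```

That per-person figure is on the order of 10^28 addresses, which is why an IP could plausibly become as permanent as a phone number.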

~~~
lolive
The need for a search engine seems, imho, to be the major argument against
(full) decentralization. Another formulation: there is a (valid) need for
(some kind of) centralization. (And from there, you all go back to full
centralization: first because of search, then because of convenience, and at
last because of laziness.)

~~~
vezycash
P2P file-sharing services like LimeWire, Kazaa, and torrents had an acceptable
search experience before being sued into oblivion.

Even if lawsuits hadn't killed p2p networks, virus / safety concerns would
have. Imo, trust is a bigger issue than discovery, hence the need for
curation - i.e. centralization.

The reputation system of The Pirate Bay makes it my primary torrent site.

Laziness and convenience are more of a trust issue than a search issue.

Many uploads are viruses/adware/ransomware masquerading as movies, books,
games...

This necessitates multiple downloads - it's frustrating. I remember
downloading several gigs of RAR and encrypted .avi movie files, only to be
greeted with a message asking me to fill out a survey to get the password.

YIFY - a reputable source - eliminated this concern for movies.

If decentralization works out, I believe specialized search engines will
emerge.

But note, trust is the bigger issue than search or content discovery for
decentralization. If not, iTunes store, app stores and other walled gardens
would have long failed.

------
d--b
The web has some parts that are decentralized, and I think it’s worth noting
the bad aspects of what exists today.

Perhaps the most decentralized part of the internet today is BitTorrent. It’s
a very efficient way of sharing files and has had a lot of success. One can
see how BitTorrent could become the backbone of a decentralized web. However:

1 BitTorrent “naturally” favors popular files over anything else. Niche
items which are hosted by fewer people will be slower to download =>
BitTorrent creates a cultural echo chamber

2 BitTorrent needs some kind of centralized search engine: it’s not possible
for everyone on the network to host a copy of the entire index of files on the
network. The only way is to have a search engine, much like The Pirate Bay. In
fact, one could say that Google was this for the web in the first place.

3 decentralized social media would be much more polluted with fake accounts
since no “authority” would be able to fix it.
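On point 2: what a DHT actually does is shard lookups across nodes rather than give anyone the full index. A toy sketch of that partitioning (the node names and the modulo scheme are illustrative, not real Kademlia):

```python
import hashlib

NODES = ["node-a", "node-b", "node-c", "node-d"]  # hypothetical peers

def responsible_node(key: str) -> str:
    """Map a key (an infohash or keyword) to the node that stores it,
    by hashing the key onto the node list. No single node ever holds
    the whole index."""
    digest = hashlib.sha1(key.encode()).digest()
    return NODES[int.from_bytes(digest, "big") % len(NODES)]
```

Exact-key lookups scale fine this way, but a full-text search over every key would still mean asking everyone, and that gap is what a centralized search engine fills.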

People have been excited by the decentralized web for at least 5 years. The
technology has existed for at least 10 years. If it were going to happen, I
think it would have happened already...

------
bushin
Let's be realistic here, AMP is the next big step for the web :(

~~~
superkuh
You're not wrong. But it's a step backwards and not forwards.

AMP is not any faster to load than the actual site served outside Google's
CDN. The only reason it appears faster in most situations is that Google is
abusing its monopoly search position to pre-load and prioritize AMP results.

AMP gives Google more control, and that's why they push it so hard. This, plus
their quest to hide and/or get rid of URLs so they can use AOL-style keywords
within their AMP walled garden, is multiple steps back - all the way to the
late 90s.

------
mrtnmcc
Just use WebRTC? It's built into most browsers (Chrome 28+, Firefox 22+, Edge
12+). The main hurdle is UDP hole punching: users behind NAT need a third
party to help initiate the peer-to-peer connection.

[https://en.wikipedia.org/wiki/WebRTC](https://en.wikipedia.org/wiki/WebRTC)

[https://en.wikipedia.org/wiki/UDP_hole_punching](https://en.wikipedia.org/wiki/UDP_hole_punching)
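A toy localhost simulation of the hole-punching handshake (there's no real NAT here, so the "signaling server" is just local variables; in WebRTC that role is played by STUN/ICE plus an application-level signaling channel):

```python
import socket

def make_peer() -> socket.socket:
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind(("127.0.0.1", 0))   # let the OS pick a free port
    s.settimeout(2)
    return s

a, b = make_peer(), make_peer()

# Signaling step: a third party tells each peer the other's address.
addr_a, addr_b = a.getsockname(), b.getsockname()

# Both peers send first. Behind a real NAT, these outbound packets
# create the mappings that let the other side's packets back in.
a.sendto(b"hello from a", addr_b)
b.sendto(b"hello from b", addr_a)

msg_at_b, _ = b.recvfrom(1024)
msg_at_a, _ = a.recvfrom(1024)
```

The point of the simultaneous send is exactly the NAT trick: each peer's first outbound datagram punches the hole that admits the other's.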

------
godelmachine
Isn’t decentralization the founding concept of the Internet and, by extension,
the WWW?

------
imranq
Isn’t this the premise of HBOs Silicon Valley?

------
PaveSteel
Where can I buy coins for that?

------
gui67gk
The next big step?

The fundamental technologies were designed with decentralization in mind

Mastodon is just peered IRC all over again

The ISPs have pooh-poohed running shared services from home connections.

DNS and the core protocols can run in decentralized ways no problem

It’s the social order that doesn’t enable it

------
moltar
I thought pied piper solved it already

------
cryptozeus
Maybe a bit philosophical, but I feel real decentralization can only be
achieved if it’s created by something other than humans. Like air: no one
controls it and it’s free to breathe. Blockchain started this way, aiming to
be totally decentralized, but as we know, if someone gets control of 51% of
the mining power, it’s game over.
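The 51% point has a crisp formulation: in the gambler's-ruin model from the Bitcoin whitepaper, an attacker with fraction q of the hash power eventually overtakes an honest chain that is z blocks ahead with probability (q/p)^z, which hits 1 as soon as q >= 0.5:

```python
def catchup_probability(q: float, z: int) -> float:
    """Probability that an attacker controlling fraction q of total
    hash power ever catches up from z blocks behind (gambler's-ruin
    approximation from the Bitcoin whitepaper)."""
    p = 1.0 - q          # honest fraction
    if q >= p:           # a majority attacker always wins eventually
        return 1.0
    return (q / p) ** z
```

With q = 0.51 the probability is 1 for any deficit z, which is exactly the "game over" above; with q = 0.3 and six confirmations it's already under 1%.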

