
We're heading for AOL 2.0 - ericdykstra
http://jacquesmattheij.com/aol-20
======
Tiksi
I absolutely agree with this, and I'm not looking forward to the near future
of the internet, but it's inevitable. We will hit AOL 2.0 -- well, a few of
them. We'll have the Apple internet, the Facebook internet, the Google
internet, etc., but it won't last.

The reign of AOL was killed by stagnation and outside innovation: people saw
that there was more, and better, outside of AOL, and the second round will
probably die a similar death. People will start to see cool new things
happening outside their silo, or get fed up with it, and the silos will
eventually fall. These trends seem cyclical: we go from mainframes and silos
to personal computers and an open network, then back to the same mainframes,
this time called the cloud, and the same silos. It's not going to be a fun
time for those of us who don't like the confines of the new internet, but the
handful of us who care won't stop the inevitable.

Me, I still run an irc server because I can, because there's a million clients
to choose from, and everyone can have their own interface, and the protocol is
so simple a decent programmer can hash out their own in a day. Nothing has
come close to it yet (xmpp was great in theory but too complex for its own
good). If you want a pretty ui and emojis and images, well there's a client
that can do it, if you just want text, plenty of those too.
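
To give a sense of how simple the wire format is, here is a rough sketch in
Python of building and parsing IRC lines -- purely illustrative, ignoring
details like message tags and length limits that the full grammar (RFC
1459/2812) specifies:

```python
# IRC's wire format is just CRLF-terminated text lines, e.g.:
#   :alice!u@host PRIVMSG #chan :hello world

def format_message(command, params=(), trailing=None):
    """Build a raw IRC line from a command and its parameters."""
    parts = [command, *params]
    if trailing is not None:
        parts.append(":" + trailing)
    return " ".join(parts) + "\r\n"

def parse_message(line):
    """Split a raw IRC line into (prefix, command, params)."""
    line = line.rstrip("\r\n")
    prefix = None
    if line.startswith(":"):
        # Optional source prefix ends at the first space.
        prefix, line = line.split(" ", 1)
        prefix = prefix[1:]
    if " :" in line:
        # Everything after " :" is one trailing parameter (may contain spaces).
        line, trailing = line.split(" :", 1)
        params = line.split()
        command = params.pop(0)
        params.append(trailing)
    else:
        params = line.split()
        command = params.pop(0)
    return prefix, command, params
```

With those two helpers, a working client is little more than a TCP socket
plus NICK, USER, JOIN and a read loop.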

I'll be over here riding out the inevitable with my own file sync, git server,
chat server, and web server, making my internet how I want it to be, hoping
that in time people will notice that you don't have to just accept your
preferred silo's interpretation of it.

~~~
aembleton
The problem with that is the network effect.

None of my friends use IRC, 95% have never even heard of it. Most of my
friends are on Facebook Messenger.

For more public conversations, a lot of that is now taking place on twitter.
Twitter gets promoted by the media with hashtags at the start of TV
programmes. I never see any reference to IRC or other protocols.

I'm sure people will get bored with these current silos and move on but I
doubt they'll move to something more open. The financial incentive just isn't
there to develop the protocols and clients for an open system, whereas there
is clearly big money to be made by creating the next big silo.

~~~
Tiksi
Oh, I don't disagree with you. I'm not delusional enough to think that people
are going to move to irc, or that it'll become mainstream; it's just a bit
saddening. But the thing is, it doesn't really matter, because irc is a
protocol. I don't ever have to worry about support getting dropped because
someone wants to monetize it differently. The protocol will never get closed
down all of a sudden because it's not popular enough. I can keep running my
server as long as I want, independently of any company's decisions. I got a
few friends set up on it years ago, and new people join from time to time.
It's versatile enough that I just run bitlbee and have access to a bunch of
other chat services over irc, all in one client.

I think people will get tired of not being able to communicate between their
silos and, if they get locked in enough, won't want to leave their own service
for another one. It happened with AIM/MSN/Yahoo Messenger: for a little while
everything was moving to xmpp. Your AIM account could talk to people on gtalk,
etc. It was nice while it lasted.

~~~
noir_lord
I think the next thing will be something like Slack, but aimed at groups of
friends rather than development teams.

I play chess and I ride a bike; I'm in clubs for both of those things, and
they currently use Facebook to co-ordinate. It works well enough, but I think
slack would be _considerably_ better.

IRC is awesome, but it only really handles the talking-to-people side of
things well. Sharing files (well), setting up events and calendars: the stuff
slack does on top of chat is a huge multiplier.

~~~
gcr
Slack has exactly the same problem.

How could one connect to Slack, without using Slack? What open protocol does
Slack use for its internal communication?

Convincing everyone to switch to Slack is just as bad as convincing everyone
to switch to Facebook Messenger.

~~~
e12e
It's not just connecting that's the problem: How do you implement your own
server for slack, without using slack?

Because if that's off the table, the rest of the discussion doesn't matter.
Now, of course not _everyone_ will write their own irc-server, or port one to
a new architecture -- but without free and open source implementations, any
alternative is just another dead end.

Considering slack is doing so much right, I wish they'd realize this, and just
either open up slack, or publish a new free/open slack server implementation,
along with protocols for federation etc.

As I understand it, many people are getting weary of having to choose between
IRC -- which has a number of issues, not the least of which are its (lack of)
security architecture (even with TLS bolted on) and "extensions" being limited
largely to bots -- and XMPP, which is over-engineered.

I suppose the natural next step is something that is to IRC/XMPP as JMAP[1] is
to IMAP -- a lightweight protocol, probably json (or: capt'n'crunch)-based --
that tries to combine strengths from IRC/SILC and XMPP without being complex.
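
To make that concrete, a wire format for such a lightweight protocol could be
as small as one JSON object per line. This is a hypothetical sketch -- the
field names are invented for illustration, not taken from any real spec:

```python
import json

def encode_event(kind, sender, target, body):
    """Serialize one chat event as a single newline-delimited JSON line."""
    return json.dumps({"type": kind, "from": sender,
                       "to": target, "body": body}) + "\n"

def decode_event(line):
    """Parse one wire line back into a dict."""
    return json.loads(line)
```

Newline-delimited JSON keeps the framing as trivial as IRC's while leaving
room for XMPP-style extensibility: a client that doesn't understand an extra
key can simply ignore it.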

If I were running slack, I'd worry about whether it would be easier to "own"
the new protocol, or to port slack to support it after someone else releases a
Free alternative to slack.

[1] [http://jmap.io/](http://jmap.io/)

------
alexro
The Internet has been raised on the shoulders of giants - people who basically
devoted their lives to it.

The current wave is brought up by those trying to grab a piece of the Web and
keep others at bay; the game is ruled by investors. This bunch only
understands walled gardens; there is no hope of talking them into open
thinking. At all.

We've already discussed more than once that people themselves do not care
about any particular approach; people just want something that works. As long
as commercial services do the job, people will use them.

Like with any freedom movement, there needs to be an underlying philosophy
that will live long enough for others to finally catch on.

But going at it head-on, as the author suggests, will not help. If you start a
new service, the last thing you want to spend your resources on is an RFC. And
you won't ever get more resources than the investors of AOL 2.0 have.
Unfortunately.

~~~
jackgavigan
_> The current wave is brought up by those trying to grab a piece of the Web
and keep others at bay; the game is ruled by investors. This bunch only
understands walled gardens; there is no hope of talking them into open
thinking. At all._

Sadly, my experience has been that this is true. Walled gardens allow the
companies that control them to capture all of the value created within them,
which is far more attractive to a VC firm than a company that's looking to
create an open standard/platform that others will be able to use.

~~~
kuschku
It’s the reason why, long-term, VC firms (like Y Combinator) and the US
startup culture have to fail. All of them focus only on short-term,
exponential growth. That is not sustainable.

The only question is if humanity will die due to the environmental effects of
exponential growth first, or if these companies default first.

~~~
hluska
The unfortunate (or maybe fortunate) part is that the data shows that,
occasionally, short-term exponential growth creates massive companies (e.g.
Google and Facebook) with relatively minimal marketing costs. I remember when
Facebook first became available at my university. Seemingly overnight, all of
the LiveJournals died and Facebook absolutely took control.

~~~
kuschku
It’s unsustainable growth, though.

Facebook is even today far overvalued. Their price has to crash at some point.

And even Google is stagnating.

------
captainmuon
Technically, it would not be too difficult to use Facebook's API plus some
scraping to connect it to gnusocial, for example. But doing so is explicitly
against the TOS (of both the API and Facebook itself), so you can't.

I would be really, really happy if lawmakers would realize that Facebook (just
one example of many) has now become infrastructure, and has to be regulated as
such. What to do? Make every service with more than, say, a million users
required to offer federation. There must be an API, and everybody must be able
to get their data in and out free of charge, and there should be no
limitations (except for fighting spam).

This way, Facebook would lose the network effect that makes it impossible for
a competitor to be successful. They would still have their users, rich
features, mature codebase, and infrastructure as advantages, so this would not
drive them out of business. But they would be forced to compete with others on
user-friendliness, features, non-invasiveness and so on.

Here's an idea. The next time you invent a nifty protocol for social, don't
make it proprietary. Don't release the source under copyleft either. Make a
license such as: "You may use this protocol / implementation free of charge in
any way you like (source closed or open) under one condition: you have to
offer federation."

(* Federation means that users with accounts on different services can
communicate freely, like it is the case with email. Imagine you could use your
Twitter account and write to a friend's Facebook wall. Or you could even host
your own gnusocial instance and use it to take part in other social networks.)
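
The mechanics are the same as email routing: everything after the separator
identifies the user's home server, so any server can deliver to any other. A
toy sketch, where the `directory` dict is a stand-in for a real DNS/MX-style
lookup:

```python
def route(address, directory):
    """Split a federated address and look up the responsible server.

    `directory` maps domains to server names; in a real federated system
    this lookup would go through DNS, like email's MX records.
    """
    user, _, domain = address.partition("@")
    return directory[domain], user
```

This is why email federates: no server needs to know anything about another
service beyond how to resolve its domain.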

~~~
ams6110
The problem is that Facebook is not _essential_ infrastructure. Government has
no business regulating it.

~~~
yoshuaw
Anything above the first tier of Maslow's pyramid is not _essential_.

Ask yourself: do we as a society benefit if Facebook uses open protocols? Law
should facilitate society's needs, not the other way around.

~~~
andrewflnr
This is an incredibly dangerous position to take. It can be used to justify
almost any law.

A law is a threat of force or violence. Law is the code executed by people
with lethal weapons and the authority to use them. Choose them accordingly.

------
angrybits
> when the last user of it finally gives up and moves to gmail so they can
> continue to communicate with their contacts or maybe they give up entirely

Between the title and the quoted passage, I think you have used up your
hyperbole quota for one day.

I guess I don't see the issue. The internet is a large piece of infrastructure
on which citizens and companies can publish (almost) anything they want.
Lamenting that for-profit ventures have tried to wall off their parts is
curious, as I am not sure how that impacts me or my decisions. I don't
facebook, I don't tweet, and I could not care less that these things exist.
(Not entirely true, but my objections to them would be off-topic.) The world
is a very large place, and there will always be people who hang out in the
more distant corners of the net; you can go be with like-minded people and
talk about the good ole days of (insert bygone era here).

Now if you are lamenting this because you want a piece of the action and the
big kids are being bullies, then I suppose my answer isn't going to comfort
you any. But for just simple usage, I think this is a tempest in a teapot.

~~~
wishinghand
The issue is that anyone with a web browser (even one provided by the
walled-garden companies) can use the HTTP protocol to see hypertext documents.
But Facebook users aren't using a public protocol in Facebook Messenger, so
they can't easily communicate with Twitter users, who also don't use a public
protocol.

It's not the end of the world, but allowing more people to connect more easily
is a good thing. The larger issue is that people are less likely to connect
once they find their comfortable niche. You don't use Facebook or Twitter, but
someone who does, and who would be interested in your thoughts, is less likely
to stumble across you.

~~~
fossuser
I think there are practical concerns too that push away from openness and
towards silos - it isn't just about money.

Open protocols can be slow to improve because of the very nature of their
openness - they have dependencies and need to remain interoperable.

Closed protocols can adapt faster because one entity controls the entire
thing. In the case of chat, XMPP was in the lead for a while, but the mobile
situation was terrible: connections were constantly dropped, and even though
there were clients that supported multiple accounts (Meebo), it was a pretty
bad software experience. The current messaging products are not open, but they
are _better_.

This is a shame I think - because it'd be a lot better if the open products
were actually better, not just better because they are open.

Maybe the trick is to hack on a better open protocol, but even then it could
similarly be outpaced. Maybe the path to this is defining an open 'social'
protocol and fixing the internet identity issue at the same time. Maybe
keybase.io can do this?

------
decasteve
The key (IMO) to a distributed and decentralised Internet is a net-equality
for upload:download speeds. There used to be an argument that we download more
than we upload. To phrase it another way: we used to consume more content than
we published. The fact is, we individually produce more content now than we
consume, much more than the traditional media companies of the previous
generation, as evidenced by the content on Facebook, Google/YouTube, Twitter,
WordPress, et cetera. But the Internet in 2015 is an AOL-2.0-like one and we
need to nudge it in the direction of a distributed one again.

The solutions are within our grasp because they exist already. The software
and the communication protocols exist, and are open and tested. A small change
at the root, like Net-Equality, would result in an avalanche effect change in
how we communicate and store information. The Internet of 5 years from now
could be vastly different (and better in my opinion) than the one we have
today. It could fulfil the 1990s predictions of people like Bill Gates and
Marc Andreessen, who now sing a different tune but used to preach the power of
a distributed Internet.

~~~
tbrownaw
But what happens when my home internet connection goes out, or I don't have an
always-on desktop machine?

That's also an inequality in upload/download, and it's far more important than
just a bit of imbalance in total bandwidth. Especially when most of that
download bandwidth is watching popular (and probably commercial) videos.

I'm not sure which side I'd put the typical "no servers" TOS on, since if
enough people ignored it, it would probably go away.

~~~
icebraining
_But what happens when my home internet connection goes out, or I don't have
an always-on desktop machine?_

The content can get distributed and hosted right after you publish it,
ensuring access even if your machine goes offline. See Freenet, Bittorrent,
etc.

------
Canada
> That document would then be sent out to various parties that might have an
> interest in using this protocol who then would supply their feedback and as
> a result of that a more complete version of the original document would be
> released. And so on until the resulting protocol was considered mature
> enough for implementation.

Is that so? The early IETF process was, "Rough consensus and running code."

Protocols were implemented first, then revised and described in RFCs later,
weren't they?

------
dvh
I blame firewall admins for this. If an app was using port 1234, it was
simpler just to block that port. After a while this made apps unusable, so app
creators tunneled everything through port 80.

~~~
stuff4ben
And I blame lazy software developers who had security holes in their apps that
endangered enterprise networks. I'm an app developer and we shouldn't be
placing blame on someone trying to do their job and protect their internal
network. We need to own up and start making security a priority over yet
another feature someone wants in their app. Yes we were quite clever to tunnel
things through port 80, but we're just delaying the inevitable.

~~~
egeozcan
I partially agree, but isn't the port blocking mostly about reducing the
attack surface? If I were managing infrastructure (I never have), just
trusting a browser and its security model sounds better than tracking a bunch
of whitelisted apps for zero-days.

------
bsaunder
In some ways, it may be worse than the OP suggests, as we apparently race
towards mobile apps. Seems about the right time for PointCast+. It's the
proverbial technology pendulum. It seems obvious that we'll continue our
momentum, maybe even abandon the generic browser, and revert to the (now
mobile-based) fat-client past.

Much of the article seems devoted to concerns about proprietary protocols and
closes with a plea to keep protocols open with a published specification. I
think this is a reasonable concern and a noble plea, but yet also an
(unfortunately) unreasonable request. Companies are developing their protocols
to support their needs, not yours.

It's reasonable to want access to the data companies are collecting. It's
valuable data. But these companies are not in the data-services business. And
if they were, it would be reasonable for them to charge a handsome price for
the access you are looking for. Imagine the black eye a company would get for
trying that approach.

Perhaps there's a view that says users freely give the data, so they should be
able to freely get it back. But there's a counter-argument that users are
"compensated" for contributing their data in the form of free access to the
services provided. Given the amount of time people spend on social apps, it's
reasonable to think they value the "free" service highly.

IIRC there was a start-up that attempted to build an alternative social
network based on open standards... It didn't go very far. Perhaps people need
to learn to vote with their actions. Maybe we need to try that again and
convince more of the influential technical folks to jump ship. Maybe through
competition the non-open established players can be shown the value of
openness.

~~~
hohenheim
I agree that user data is driving the "free" service of social media. But I
don't see your point when you say an open social media implementation failed
because it was open. You imply that Facebook succeeded since it was a closed
protocol implementation.

Why did Facebook succeed? Would it have succeeded if it had been open? Perhaps
market needs and the competitive landscape have more to do with success than
the nature of the protocol. What percentage of users even care about
protocols?

~~~
bsaunder
For the open facebook alternative, I believe I was thinking about Diaspora
(apparently it's still alive and kicking).

[https://diasporafoundation.org/](https://diasporafoundation.org/)

Maybe it's worth investigating.

As for the "failed because it was open" comment. Sorry, I didn't mean to
suggest that it failed _because_ it was open. Merely that it was open _and_
failed (apparently maybe not so much failed).

I had thought it failed for the obvious reason: not attaining the critical
mass that a social network needs. One of its primary propositions was an open
data policy. Apparently this might not have been sufficient to launch it into
success. Again, there's still something there, so maybe it's still rumbling
on.

------
abrgr
We're in a world of APIs and de facto protocols now. The bulk of the
interesting communication at this point is about data. REST over HTTP provides
a decent mechanism for interacting with arbitrary nouns. Anything
significantly more structured would, I imagine, essentially be an exercise in
modeling a particular domain of data. With rapid innovation, it becomes
difficult to codify a particular data model ahead of time, so we end up in a
world of RESTful APIs, some well-documented and some not-so-well-documented,
and, when one emerges as the winner, it becomes the de facto protocol.
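
As a sketch of that "verbs on arbitrary nouns" idea, here is a toy in-memory
request handler -- illustrative only, not any particular framework's API:

```python
# The REST convention: a small fixed set of verbs applied uniformly to
# any resource path, with the data model living behind the path.
store = {}

def handle(method, path, body=None):
    """Dispatch an HTTP-style request against the in-memory store."""
    if method == "PUT":
        store[path] = body          # create or replace the resource
        return 200, body
    if method == "GET":
        return (200, store[path]) if path in store else (404, None)
    if method == "DELETE":
        return 200, store.pop(path, None)
    return 405, None                # verb not supported
```

The point is that the same four verbs work for `/recipes/42`, `/users/7`, or
any other noun; only the data model behind the path changes.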

That said, the walled garden approach of all platforms today (especially
Apple) certainly endangers future innovation. Though, I see that as a content
delivery problem, not a protocol problem.

~~~
anon1385
>we end up in a world of RESTful APIs, some well-documented and some not-so-
well-documented, and, when one emerges as the winner, it becomes the de facto
protocol.

Can you give an example of a proprietary RESTful API that then became a de
facto standard protocol that was supported by a wide variety of software (both
clients and servers)?

~~~
jalfresi
Microsoft's "flavour" of WebDAV?

------
jessaustin
_And please never do what twitter did (start open, then close as soon as you
gain traction).

Posted by Jacques Mattheij August 5, 2015

If you read this far you should probably follow me on twitter:_

Is there a contradiction here? Why do you use twitter? (Rather than following
you on twitter, I have your RSS feed in my newsreader.)

~~~
captainmuon
If I want to use twitter-the-thing, I have to use twitter.com-the-service. No
other service offers decent twitter. There is gnusocial.de and quitter.se, but
their twitter is not as feature rich as twitter.com's twitter, and also
twitter.com refuses to federate with their accounts. That means you have no
access to twitter.com's biggest asset, their user base, which is pretty rude
if you ask me.

(I hope someone gets what I'm trying to say :-))

------
otis_inf
It might be me, but the article seems to make the same point as "The Web we
have to save" [https://medium.com/matter/the-web-we-have-to-
save-2eb1fe15a4...](https://medium.com/matter/the-web-we-have-to-
save-2eb1fe15a426)

~~~
e12e
HN discussion:
[https://news.ycombinator.com/item?id=9994653](https://news.ycombinator.com/item?id=9994653)

------
moe
Related: [http://qz.com/333313/milliions-of-facebook-users-have-no-
ide...](http://qz.com/333313/milliions-of-facebook-users-have-no-idea-theyre-
using-the-internet/)

------
smoyer
"Since then there has been a lot of movement on the web application front but
none of those has resulted in an open protocol for more than one vendor (or
open source projects) to implement."

I know of several RFCs that are in the process of being created/approved that
specifically detail protocols over REST/JSON. Probably the most impressive is
the System for Cross-domain Identity Management (SCIM) [1].

In general I agree with the sentiment of the OP: we should have more
standardized protocols if HTTP is going to be the new TCP. I should also
mention that HTML is a data specification for pages transported via HTTP, so
there's definitely precedent.

[1] [http://www.simplecloud.info/](http://www.simplecloud.info/)

------
jegutman
AOL seems to think they're on version 9.7:
[https://help.aol.com/articles/upgrade-to-the-latest-
version-...](https://help.aol.com/articles/upgrade-to-the-latest-version-of-
the-aol-desktop-software)

------
tsunamifury
The Death and Life of Great American Cities by Jane Jacobs strongly outlines
that this pattern is a problem of segregation in almost all communities. The
best superficial designs are static and proprietary; the best long-living
designs are open and engage with inevitable chaos at scale. High-end
segregated suburbs eventually die, as people move to open, less segregated
environments that provide more serendipitous opportunity.

It seems to be natural in complex networks that proprietary systems that don't
serve the entire population function better, due to like-mindedness,
conservation of resources by the wealthy, and bias.

Apple is like a private community for those who can afford it. Systems and
resources can be allocated centrally and designed for a limited number of
static requirements. It's great, until the requirements change.

I would argue that Google should continue to support the open environment it
is designed on. Its software is inherently linkable (you can link to
documents, G+, youtube videos), it is mostly open to non-logged in
experiences, and it is free for anyone.

It comes at great cost to support those ideas, sometimes in the form of money
and sometimes of resources, but I think it's absolutely worthwhile.

------
bane
Abstracting problems is not always the right solution, but for some reason (I
think it's mostly a social effect) we seem to have an overwhelming desire to
abstract away problems.

"XMPP support is slowly but surely being removed (just imagine a phone system
where every number you call to may require a different telephone)"

So what happens in these kinds of cases is that somebody invents an
abstraction (a metaprotocol) that cross-connects all of these. And then
somebody comes up with yet another protocol that doesn't fit into the
metaprotocol (or some slow moving standards body can't fit the new one in), so
somebody else comes up with a metametaprotocol that bundles the metaprotocol
and the new one in... and it's abstractions all the way up.

Until somebody realizes that the tower of abstraction is introducing 300ms of
lag into things and we all pine away for the good old days of just XMPP or
whatever.

There's nothing that prevents technically oriented people from setting up ftp,
nntp, smtp, mtp, etc. servers and so on _except_ that these things really tend
to rely on some kind of software running on the user's OS and these days
people pretty much just run a browser and not much else.

The answer then is to move the functions those protocols supported into the
browser (the way most browsers support ftp), but it's just not worth trying to
shove telnet or whatever into the browser, so people just replicate whatever
telnet is supposed to do in a web app that's accessed over http.

I think more troubling, though, is that non-browser server-to-server
communication is now just http. Sure, it's pretty simple, but for
server-to-server communication there's almost always a better protocol to use;
people just can't be bothered to come up with one.

~~~
jacquesm
The whole problem goes back to the early days of the web when everybody was
focusing on making it easy to _consume_ the content rather than making it easy
to _produce_ the content. To this day there is no application equivalent to
Mosaic 1.0 that does not require either some 3rd party service _or_ the
capabilities of a systems administrator.

How hard could that be? And is it still possible, with the TOS of many
providers now stating flat out that you can't run any servers on your line?

~~~
bane
I was just thinking this morning: _way_ back in the dial-up internet days, I
used to go around helping people set up web pages on the free 5MB of space the
company I worked for offered. A surprising number of our customers, even
elderly ones, went out of their way to learn basic html and put something up
there.

Over time we got geocities and various CMSs and such to fill that, but the
_function_ of what most of those people were putting up is just their facebook
wall these days, only it's a bit more transitory.

I remember one old gentleman, a WW2 vet, who spent hours every day building a
page about his dog so that he could share it with his friends and family. It
was a bit odd, but my FB wall is full of similar kinds of things.

So I guess it also comes down to "what's the end goal the person is trying to
achieve?" If it's just to keep people updated on their dog, FB more or less
fills that function. Most people actually _weren't_ interested in building
random webpages; they were interested in communicating something to people.

~~~
jacquesm
A friend of mine said: people have an unlimited capacity for wanting to
communicate. And I guess facebook is the end-result of following through on
that desire. Which answers what will replace facebook: something that makes it
even easier to continuously communicate. Definitely not something federated
but something even more siloed.

~~~
bane
Something that's frustrating with FB is that it _is_ so temporary and
continuous. This is fine for most day-to-day updates (news, gossip, etc.). But
it's not great for long-term reference material. You might notice that people
don't really write long FB posts; part of it is likely that it just isn't
worth it, as it'll be off everybody's walls in the next 24 hours.

If FB added some kind of "archive this post" so people could save particularly
great posts or conversations, it would fill this gap. But I have a feeling
somebody else will get to it first.

~~~
jacquesm
I think you (finally!) nailed why twitter is successful. Updates are short by
nature; real content requires more space and has a much longer 'best before
date', and so fits neither facebook nor twitter.

That should have been the thing that google+ aimed at rather than doing the
same thing all over again. The content would eventually achieve critical mass
all by itself.

~~~
e12e
Maybe the world is ready for Google Wave now.

------
jasode
I agree that it would be helpful if we had more open protocols but I disagree
with the conclusion that _not having them_ inevitably leads to AOL 2.0.

First, I think it's helpful to clarify what jacquesm is saying. Here is my
interpretation, in the concrete terms of URI syntax[1]:

    
    
      URI:  <scheme>://<location>:<port>
    

As of 2015, the scheme is almost always "http" or "https" and the port is
almost always "80" or "443".

Because of many factors (I'd say mostly social dynamics[2], not technical),
companies end up layering their proprietary/opaque protocols on top of "http".

I believe the essay asserts that the web would be more "free" and "open" if we
pursued more "schemes"[3].

For example, if you're starting a new company that lets people crowdshare
cooking recipes, the 1980s mindset may have been to submit a new RFC so
everyone could then do:

    
    
      cooking://johndoe.com
    

_(and default port for "cooking://" would be 867 or whatever)_
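
The generic URI syntax already supports this: Python's standard library, for
instance, parses the hypothetical cooking:// scheme above exactly as it parses
http://.

```python
from urllib.parse import urlsplit

# urlsplit applies the generic <scheme>://<location>:<port> grammar to
# any scheme, registered or not.
u = urlsplit("cooking://johndoe.com:867/recipes/42")
```

The scheme, host, port, and path all come back as structured fields, with no
special-casing for "http".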

Instead, we now have a situation where we have a cooking REST API or cooking
iPhone/Android app that sends proprietary undocumented bytes over "http". Yes,
the "http" is open in an academic sense, but for practical purposes, the
cooking data is "closed" because the bytes over it are proprietary. Related to
this is that the bytes go to a central entity that has self-interested
economic motives instead of a peer-to-peer situation like "ftp://" or SMTP
email.

I don't have time at the moment to fully explain my disagreement, but I don't
believe this leads to AOL 2.0.

Instead, what happens is that the proprietary protocols simply become more
_inefficient_ by "tunneling" or "layering" in or on top of "http". We
collectively waste lots of HTML/MIME overhead bytes to send opaque data. We
also expend a lot of security effort with cat & mouse "deep packet inspection"
of "http" because of this. Lots of cpu cycles are burned up to pay for these
inefficiencies.

As for pushing for the ideal of more schemes with public and transparent RFCs:
this goal is couched in technical terms, but it hides what we're _really_
asking of each other: we are asking economic actors to forgo their economic
interests. That is a very tough sell.

For example, in the USA, there's Intuit Quicken and they have a proprietary
transport for downloading financial transactions. I loathe Intuit and their
business practices but I do understand why their protocol is opaque instead of
being an RFC. They are the ones that did all the legwork of convincing the
banks to open up their mainframes to facilitate data downloads. The government
didn't grease the wheels; Intuit did. Now they want to be rewarded
economically for it. Can't blame them. That's why it's not an open RFC.

Contrast Intuit's situation with the public RFCs for "ftp://" and SMTP. The
people creating those protocols were economically secure. They had stable jobs
at universities or government institutions.

Today, the opaque protocols are made by actors in a different economic
landscape. If a YC 2015 company takes $130k from Y Combinator and several
million dollars from VC rounds, defining a transparent RFC that harms their
economic interests doesn't make sense. This goes against human nature. The
essays pushing for a more open web are not addressing the hard-to-resolve
economic interests.

[1][https://en.wikipedia.org/wiki/URI_scheme#Generic_syntax](https://en.wikipedia.org/wiki/URI_scheme#Generic_syntax)

[2] Corporate firewalls and network appliances with policies of restricting
ports, users' home firewalls' default configuration of blocking ports, etc.
New software can't expect to reverse the conservatism of Cisco admins. Also
look at why residential SMTP mail servers are blocked by others even though
SMTP is theoretically "open". That's a social problem caused by spam.

[3][https://www.iana.org/assignments/uri-schemes/uri-schemes.xhtml](https://www.iana.org/assignments/uri-schemes/uri-schemes.xhtml)

~~~
williamcotton
It's shocking that so few people in a forum that is basically dedicated to the
business of information technology have such a blind spot to the economics of
our industry. Yours is the only mention of the word "economics" out of many
dozens of comments.

You'd think by now that we'd have a way to treat digital media as property
that can be bought and sold on a marketplace. That price discovery would then
make a great entry into a book of accounts.

The current info-economic system is mainly selling advertisements to pay for
nothing more than servers and other operation costs. The content itself is
worthless, not because it doesn't have costs in making it or is something that
consumers don't value, rather because we've purposefully kept it out of a
marketplace.

These VC funded information technology companies only have one business model:
create and exploit a private monopoly. Control identity. Authorize the use of
any and all interactions with data.

I'm not going to discuss my thoughts on how to fix this because my opinions on
the matter are incredibly unpopular on HackerNews and frankly, I'm sick of
feeling like crap after getting a bunch of silent downvotes when I've spent
time writing something thoughtful.

~~~
eevilspock
I can't remember your other opinions but I agree that the economics is a
serious problem.

[https://news.ycombinator.com/item?id=9961761](https://news.ycombinator.com/item?id=9961761)

------
sosuke
I've been saying the same thing, but I feel this goes too far into an alarmist
view.

When Facebook, Google, and Apple hold all the cards, the creators (the
developers) will move back out and create the 'old' Internet again. Then the
people will follow them to the new creations. Eventually we'll hit AOL 3.0 and
the cycle will continue.

At least that is my hope.

------
api
There are many reasons for this, but a big one only hinted at in this article
is the firewall. Firewalls and NAT effectively break the Internet. They
_require_ centralization -- two devices cannot communicate without it. We've
adopted a topology that forces centralization, and now we're surprised at the
result.

------
cwyers
I think the author makes a lot of good points, but misses a key point. We're
not shuffling off into walled gardens because Google and Facebook want to
"own" their users -- they do, of course, but users aren't just being herded
like cattle by Facebook. Users are getting into walled gardens because walled
gardens offer users a lot of advantages right now.

1) Internet users are becoming less technical as the number of Internet users
increases, and they want things to just work. They don't want to know whether
their mail server is POP or IMAP, they just want to send and receive e-mail.
They don't want to have to try and figure out which IRC server to get on to
get a decent ping or to avoid a netsplit, they just want to chat. Zero setup,
zero installation, just go to a webapp and all those details are handled for
you.

2) Internet users are becoming increasingly mobile, and most of our pre-HTTP
Internet protocols scale poorly over mobile. Mobile devices are power and
bandwidth constrained in ways that protocols didn't envision. Chat
applications not written with mobile in mind are giant battery hogs.

3) The open web is a dumpster fire. SEO makes a lot of Google search results
difficult if not impossible to wade through (try searching for information on
a particular printer or lawnmower and see how many results you can find about
anything other than someone who wants to sell you one). An article with
pictures that measures under half a megabyte comes with four megabytes of ads
and trackers. If you peek behind the spam filters, something like 90% of the
e-mail anyone gets is best described as junk. Some of it's malicious junk at
that. Most open forums quickly degenerate into a showcase of the worst
humanity has to offer -- go check the comments on a newspaper or TV news
website if you don't believe me.

4) Nobody can make any money except through advertising, and even that's
becoming problematic. The problem with things like subscriptions is that they
reduce the value of the hyperlink towards zero. And you can say, "for just the
price of a cup of coffee you can keep this website open," but having to buy 50
to 100 cups of coffee every month to support all the websites you like to
visit at some point during a month quickly becomes untenable. And so
advertising comes to dominate the web in increasingly perverse ways (which
explains about half of point three). Walled gardens are one way to offer a
respite from the worst abuses of advertising -- Facebook doesn't need to send
you 5mb of JavaScript to track you, they can do a much better job of tracking
you with a lot less overhead than third-party trackers can.

If you want to fight walled gardens, you need to find ways to solve these
problems in ways that don't involve walled gardens or offer ordinary users
(users who don't share this community's convictions that open is inherently
better) benefits that a walled garden doesn't or can't.

~~~
JoeAltmaier
In fact the walled internet gardens remind me more than a little of the walled
communities in Snow Crash. In that story the USA has been 'Balkanized' along
commercial lines, marginalizing the federal government in favor of security
and stability.

Folks use the part of the internet that works for them. It's the largest
crowdsourced community-building experiment of all time.

------
orionblastar
I remember AOL. It was a dial-up network with its own custom GUI that set up a
TCP/IP stack, at a time when getting on the Internet with WINSOCK programs was
difficult because you had to configure a SLIP or PPP profile with other ISPs.

AOL made it easy for anyone to get on the Internet; they mailed out floppy
disks offering a free month, for example. The hard part was canceling the
service and getting them to stop billing you once you stopped using it.

The AOL software grew and moved onto CD-ROMs. At least with floppies you could
reformat them and use them for other stuff; the CD-ROMs became drink coasters,
or people made wind mobiles out of them.

The AOL software grew again when they merged with Time Warner; it became a web
browser, media surfer, and email client all in one.

When broadband began to take over, there was a $20/month option to add your
AOL account to it and use it over broadband.

If you want AOL 2.0 today you have to have a phone company or cable company
bundle their broadband service with the software that is part web browser,
chat client, email reader, news reader, media surfer, stock ticker, and other
stuff rolled into one app.

If Google or Apple wanted to do this, they would have to lay down some fiber
optics to provide broadband Internet access and then bundle their AOL 2.0
software with it. That is not a cheap thing to do. The dial-up modem was
better in that respect, because you could plug a modem into any phone line and
get Internet, but it was slow. Today you need a high-speed phone line and a
DSL modem, a cable modem, or some sort of high-speed modem that uses fiber
optics. The company has to send someone to install the high-speed cable and
modem to hook you up, which is not so easy these days: you have to wait for
someone to be available and hope there are no technical delays.

AOL was killed by broadband Internet and Web 2.0 sites replacing what AOL
software did.

Oddly enough you still find the elderly using AOL dial-up for $35/month on
really old Windows PCs. They don't want to lose their aol.com address.

------
bsaul
I think one solution would be for internet providers to upgrade their
offerings. They started by offering you an email box, then stopped. Some offer
storage space, and a few of them offer a blog platform, but they've all
surrendered to Facebook.

Why wouldn't internet providers offer you your own personal profile page and
news feed? Instead of it being stored and owned by a company whose respect for
private data and business model seem contradictory, let someone you already
pay handle it.

Then we would need protocols again, because internet providers are numerous,
and don't need to dominate the world to be profitable.

Note: I can think of many other services that would be good candidates for an
ISP to offer: YouTube, LinkedIn, photo sharing, etc.

~~~
ohitsdom
I would not be excited about my ISP offering these extra services. I just want
them to be dumb pipes- deliver my data without messing with it in any way.
That should be their only business concern, and all innovation efforts should
be spent on improving their network.

~~~
yoz-y
Being relegated to a dumb pipe status is the nightmare of all ISPs though.

~~~
jimktrains2
Why? It's what they are (well, should be).

~~~
yoz-y
Because being a simple provider of infrastructure removes any added value from
your product and opens you to a price race to the bottom. Of course there will
not be a slew of new ISPs given the cost of the infrastructure but it is never
a good feeling when you know that people can simply ditch your service and go
elsewhere without any cost.

This is what happened to the mobile market, to the PC market, is currently
(somewhat) happening in small scale public transportation and (at least in
Europe) has happened to food producers.

~~~
jimktrains2
> price race to the bottom

You can only race if you have competitors.

~~~
yoz-y
But there are several ISPs out there. Granted, it is a very hard market to get
into, but nevertheless there is (most of the time) more than one. Personally
I'd like the Internet infrastructure to be handled more like roads (i.e. the
infrastructure is technically owned by the government but built by and rented
to private companies).

------
cvwright
I see what he's saying, and he makes some good points.

But I think he's too pessimistic. There are still some bright spots of open
protocols on the Internet, even if they're now predominantly used through web
browsers and/or HTTP.

I'll name two that are more recent than the last RFC referenced in the
article, from 2009:

JMAP - the JSON Mail Access Protocol, here to save us from the dark ages of
IMAP. [http://jmap.io/](http://jmap.io/)

WebRTC - cross-browser, cross-platform voice and video chat without being
locked in to a single provider like Skype.
[http://www.webrtc.org/](http://www.webrtc.org/)
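For a sense of what JMAP looks like on the wire, here's a rough Python sketch of a request body following the shape described at jmap.io; the account id, mailbox id, and call tag are made-up placeholders:

```python
import json

# Rough sketch of the shape of a JMAP request (per the spec at jmap.io):
# plain JSON over HTTP, so any client can speak it. The accountId,
# mailbox id, and call tag below are made-up placeholders.
request = {
    "using": ["urn:ietf:params:jmap:core", "urn:ietf:params:jmap:mail"],
    "methodCalls": [
        ["Email/query",
         {"accountId": "u1", "filter": {"inMailbox": "inbox-id"}},
         "call-0"],
    ],
}

body = json.dumps(request)
print(body)
```

The point being: an open, documented request shape like this is exactly what the proprietary silo APIs don't give you.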

------
wyclif
Heads up, this post is great but it really needs a once over editorially:

 _NNTP has been ‘mostly dead’ for years (though it still has some use the real
replacement usenet for discussion purposes appears to be Reddit and
mailinglists) and so on._

~~~
jacquesm
Thanks, I'll fix that and read it carefully once more. Really annoying: no
matter how often I proofread, I _still_ mess up.

------
hendry
Good post! Protocol handlers (aka chrome://settings/handlers) are really,
really badly done and under-utilised. If they were done better, maybe we would
see some more interoperability between the silos.

I created an example showcasing mailto:
[https://www.youtube.com/watch?v=HeE9XPmYcq8](https://www.youtube.com/watch?v=HeE9XPmYcq8)

Though for example ICS:
[http://sg.hackandtell.org/ics/](http://sg.hackandtell.org/ics/) I've not
managed to make it work with Google or Fastmail Calendar.

BIG SIGH.
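Part of what makes mailto: handlers workable at all is that the scheme itself is open and trivially parseable; a quick sketch with Python's standard library (the address and subject are made-up examples):

```python
from urllib.parse import urlsplit, parse_qs

# Sketch: a mailto: URI is an open scheme that any registered handler
# can parse. The address, subject, and body are made-up examples.
uri = "mailto:someone@example.com?subject=Hello&body=Open%20protocols"

parts = urlsplit(uri)
recipient = parts.path            # the part after "mailto:"
params = parse_qs(parts.query)    # decodes percent-encoding too

print(parts.scheme)               # mailto
print(recipient)                  # someone@example.com
print(params["subject"][0])       # Hello
print(params["body"][0])          # Open protocols
```

Contrast that with a silo's deep link, where the URI format is undocumented and only the silo's own app is allowed to handle it.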

------
siculars
Sandstorm[0] and projects like it are the future of the internet and the
digitally connected world. Concerns over data silos, privacy, inability to
interoperate, and data ownership will all combine to bring down the wall. At
the same time, virtual compute providers and software like Sandstorm will
become ubiquitous and intuitively easy to use. I, for one, am optimistic in
the long term.

[0] [https://sandstorm.io/](https://sandstorm.io/)

------
jessaustin
ISTM this is just the end-to-end principle taken one step farther. Instead of
intelligence residing in the endpoint node, now it resides in an app running
on the endpoint node.

That isn't to say that the "aol2" phenomenon doesn't pose the risks to users
described in TFA. I only observe that the same factors that encourage end-to-
end (flexibility, reliability, loose coupling, etc.) probably encourage "aol2"
as well.

~~~
pjc50
No, it's a moving of the "ends" and insertion of a middleman. If I write some
text and publish it to people, rather than sending it from my computer to
theirs I send it to an intermediary, who reformats it and presents it to other
users of that intermediary.

~~~
jessaustin
TFA is really talking about two things. I responded to the "layered on top of
HTTP" point, while you seem to be talking about the "silos of end user data"
point. _Perhaps_ the first point contributes to the second, but I'm not
convinced. Users could choose http apps that don't silo.

~~~
pjc50
It strongly tends to, because having your own computer (especially a phone)
set up to answer inbound HTTP is both administratively difficult (fiddling
with NAT and dyndns, etc) and a security risk.

This is Schneier's argument about "security feudalism". And to some extent
touched on by the article - people build on top of well-understood transport
layers in order to reduce risks, and novel protocols are firewalled in case of
novel risks.

------
snarfy
I had a similar feeling the moment I started seeing browsers hide the protocol
in the URL bar as much as possible. The protocol is important and I didn't
like them munging the protocol and address together.

------
syats
AOL never sold hardware, but Google and Apple do. It's going to be worse.

------
k__
But open protocols still happen.

JSON-API[0] for example, is layered on top of HTTP and an open protocol.

[0] [http://jsonapi.org](http://jsonapi.org)
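For reference, a minimal sketch of the document shape jsonapi.org defines; the resource type, id, and attributes here are made-up examples:

```python
import json

# Minimal sketch of a JSON-API (jsonapi.org) document. The resource
# type "articles", the id, and the attributes are made-up examples.
doc = {
    "data": {
        "type": "articles",
        "id": "1",
        "attributes": {"title": "We're heading for AOL 2.0"},
    },
    "links": {"self": "/articles/1"},
}

body = json.dumps(doc)
print(body)
```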

~~~
jacquesm
That protocol does not describe an application or aspects of an application
but is rather of the 'transport' variety.

~~~
jackgavigan
Oh, so you're advocating more open standards at the _application_ layer?

~~~
twoodfin
Calendaring is a great example. At least it's _possible_ to get a unified view
of my work Exchange calendar and my Google personal calendar, as well as
manage both through the web, Siri, or any number of apps on every platform.

That's not possible without an open application protocol. I can't get a
similar unified view of my Twitter and Instagram follows, for example.
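The open format underpinning that calendar interoperability is iCalendar (RFC 5545); a minimal sketch of a single event, with made-up UID, times, and summary:

```python
# Sketch: a minimal iCalendar (RFC 5545) event, the open format that
# lets Exchange, Google Calendar, and third-party apps interoperate.
# The UID, times, and summary are made-up examples. RFC 5545 requires
# CRLF line endings.
event = "\r\n".join([
    "BEGIN:VCALENDAR",
    "VERSION:2.0",
    "PRODID:-//example//sketch//EN",
    "BEGIN:VEVENT",
    "UID:demo-1@example.com",
    "DTSTART:20150801T090000Z",
    "DTEND:20150801T100000Z",
    "SUMMARY:Team sync",
    "END:VEVENT",
    "END:VCALENDAR",
])
print(event)
```

There's no equivalent open format for "follows", which is why the Twitter/Instagram unified view doesn't exist.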

------
vdm
Slack.

------
mahouse
And HTTP is not a plumbing protocol nowadays?

Or maybe you expect Twitter to write an RFC about its API, which for obvious
economic and strategic reasons they don't want to open? WhatsApp maybe? How
would it benefit them to have third-party clients they wouldn't be able to
control? Or to have to justify or change their protocol decisions in an RFC?

I don't understand, I'm genuinely confused; this article sounds very naïve,
very delusional, so probably I am just misunderstanding it.

~~~
detaro
Remember twitter and the discussions when they nuked third-party clients?

And yes, messengers not working with other clients is one of the most user-
unfriendly things they do and something many users consider a bug. And really
only has taken off with the start of mobile messengers, before that most could
be interfaced from different clients.

~~~
icebraining
_And yes, messengers not working with other clients is one of the most user-
unfriendly things they do and something many users consider a bug. And really
only has taken off with the start of mobile messengers, before that most could
be interfaced from different clients._

I think that's somewhat misleading. MSN Messenger, Yahoo!, etc didn't use
standard protocols, it's just that people would reverse engineer them. Third-
party clients did just stop working from time to time whenever the providers
changed something in the protocol.

~~~
amyjess
I really have to wonder what happened to all the people who used to reverse-
engineer chat protocols.

Did they all just give up?

~~~
icebraining
They're still around - WhatsApp has had multiple unofficial clients pop up,
for example, though the company is much more heavy-handed in taking them down
(DMCA takedowns against libraries that use the API, banning users of
unofficial apps, etc).

