
What should follow the web? - telotortium
https://blog.plan99.net/what-should-follow-the-web-8dcbbeaccd93
======
evmar
Though I disagree with many of the points made here, I love that this person
is taking a position and putting in the real work to lay out his argument
for it.

I think the main point he's missing is that most platforms die due to lack of
adoption. The ones that succeed must attract developers and the current state
of software regrettably shows that developers (and users) aren't generally
motivated by security concerns -- they're motivated by more fundamental
questions like "can I get the job done quickly" (users) and "can I get paid"
(developers) and (importantly) "can it run on an iPhone" (which kills most new
app platforms immediately).

This post focuses a lot on tech, but platforms rarely succeed for having good
tech. If history's any guide, they succeed by having adequate tech and a huge
pile of marketing money.

~~~
root_axis
> _they succeed by having adequate tech and a huge pile of marketing money._

The key to the web's success is its completely unparalleled levels of
accessibility. All you have to do is remember a domain name and you have a
portal into anyone's personal network application on any device; that feature
can never be topped regardless of the tech's quality or the marketing budget.

~~~
goatlover
> that feature can never be topped regardless of the tech's quality or the
> marketing budget.

Never? That's a silly thing to say in the computer industry. I'm going to
guess that we'll have an alternative to the web sometime in the next 20 years.

~~~
root_axis
"Never" should really be "not in the foreseeable future", but certainly not
within two decades. The web is the apex of incumbent systems, and there is
just no clear path, desire, incentive, or business need that would facilitate
a transition to an alternative, especially as the web approaches full feature
parity with "native" code. The only people who have irreconcilable problems
with the web are a certain subset of programmers, but for most people an
alternative is unnecessary and perhaps even the opposite of progress.

~~~
goatlover
> The web is the apex of incumbent systems and there is just no clear path,
> desire, incentive or business need that would facilitate a transition to an
> alternative, especially as the web approaches full feature parity with
> "native" code.

I don't know about that. Considering the possibilities with AR/VR, quantum
computing, machine learning, smart fabrics and environments, and the like, it
seems to me there could be an incentive for an alternative inside a 20 year
timeframe. Will all that run over the web, or have a web interface? I tend to
think at some point big technological changes will make an alternative
appealing.

But maybe not and it will all be web assembly apps writing to a canvas
container ;)

~~~
root_axis
> _possibilities with AR/VR,_

The possibilities for multimedia and gaming are quite interesting, but there
is no reason to suggest that this technology will have any impact on any
possible alternative to the web.

> _quantum computing_

Quantum computing is in the "unforeseeable" category; we're not even entirely
sure that quantum computing will ever be practical (as far as general-purpose
computing goes), and it certainly has no implications with regard to an
alternative to the web.

> _machine learning_

Same situation here. The machine learning discipline is expanding quickly and
yielding very interesting developments; however, there's just no reason to
suggest that machine learning will provide the impetus for a transition to a
web alternative.

> _smart fabrics and environments_

Same here too.

All of these developments are orthogonal to the state of the web and none of
them really provide any mechanism or incentive to create or move to an
alternative.

------
inglor
> Components, modules and a great UI layout system — just like regular desktop
> or mobile development.

I don't understand this - I develop for all platforms (web, iOS, Android) and
the web is always much faster to develop for. So much so that a lot of mobile
developers are starting to use web stacks (like React Native).

~~~
kbuchanan
Early in my career, having mastered Visual Basic, I decided it was beneath me
and I'd move on to more sophisticated tools like J2EE. When Ruby was born,
Java's IDEs and static tooling became stupid, because Real Developers(TM) use
raw editors like vim and emacs. Years have passed, I develop less and manage
more, but I still use vim and I've moved on to yet other languages.

And now, 15 years into my career, I face the painful realization that VB
remains the most productive environment I ever worked in. Yes, it was narrow.
If Microsoft hadn't implemented an API for a feature, you waited for the next
version.

Therefore, this question _does_ give me pause: Why has the web failed to
produce an experience as productive as _several_ (Delphi, anyone?) options
over a decade ago? Competition is the web's strength. But it's also its
undoing. Personally I think this article is quixotic, and I prefer a
decentralized, inefficient (JS), sort-of-secure (HTTP), competitive web to
whatever top-down integration visionaries might push. But I think he, and
many older developers, are asking the same question: What happened?

~~~
inglor
People often make this complaint about Flash rather than VB (although VB is
also a nice example - thank you).

The web platform aims to solve a _very broad_ problem. I don't know if you've
used VB for big projects - but as someone with lots of nostalgia for VB6 (and
winforms) it got very painful around the 20K LoC mark and didn't really
support complex UIs in a very efficient way.

There certainly _are_ visual builders for web design (literally hundreds -
google "React UI builder" or "Angular UI builder" for some examples). They
just haven't received adoption. I think it's because people always get weird
UI requirements and the tools break. How many UI designers worked with you
when you built UIs with VB? For me that number was 0.

In general, UI is designed today in sketch/photoshop and then a tool like
Zeplin is used to export the design's properties to code automatically.

~~~
kbuchanan
You're absolutely right: the web and its breadth of tooling solve A LOT more
problems than VB or Flash could ever dream of.

~~~
narag
I disagree. I don't know about Flash and I just assume that VB was more or
less like an underpowered Delphi. The problem was not tooling at all... at
least with Delphi, please see my other comment.

------
mixedbit
In 1994, Sun engineers who worked on RPC published "A Note on Distributed
Computing" [1], where they argue that hiding remote interactions behind
interfaces designed to look exactly like local interfaces is not a good idea.
RPC in itself is not bad, but only if the local<->remote boundary is exposed.
The web with Ajax makes message passing explicit and obvious, so even
inexperienced developers know which calls are remote (expensive, and can
fail). If these interactions are hidden behind an RPC API, the performance
and reliability of apps may suffer.

The NewWeb seems to be inspired by past successful frameworks for building
desktop applications, such as Delphi or Visual Basic. But those were local
frameworks; NewWeb, being a local client talking to a remote server, can't
emulate them by masking remote interactions, because that ends in the trap
described by the Sun engineers.

[1] [https://github.com/papers-we-love/papers-we-love/blob/master...](https://github.com/papers-we-love/papers-we-love/blob/master/distributed_systems/a-note-on-distributed-computing.pdf)

~~~
kelnos
I think that advice is a little outdated. Not because it's incorrect, but
because we've already taken care of that distinction with common patterns. If
something returns a future/promise, it's a signal that the operation is at the
very least expensive, and likely non-local. That's a pattern used pretty much
everywhere now, and I think developers understand what the implications are.
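
The distinction the pattern encodes can be sketched in a few lines (the
function names and the simulated transport here are hypothetical):

```javascript
// A plain return value reads as local: cheap, and it can't fail remotely.
function loadLocal(cache, key) {
  return cache.get(key);
}

// A Promise-returning signature is the cue that the call is expensive and
// can fail; callers are forced to handle both explicitly.
function loadRemote(key) {
  return new Promise((resolve) => {
    // stand-in for a network round trip
    setTimeout(() => resolve(`value-for-${key}`), 10);
  });
}
```

Callers can't cross the remote boundary by accident: `await loadRemote(key)`
plus a try/catch is the visible price of the network hop.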

Beyond that, 1994 was a time when RPC was rare in apps that people would run
on their desktops. Developers needed a big fat notice that a function call
could involve network access because it would be an exception to the norm. Now
there are few apps that don't do some sort of RPC, and people are pretty used
to dealing with the latency and error-prone nature of making network calls.
(Not to say people are always _good_ at dealing with this, but they're at
least aware that they're doing RPC without much fanfare needed.)

------
mwcampbell
I know the OP isn't discussing this one until next time:

> 5. IDE-oriented development

But it should also be practical to manually edit text files as a fallback,
assuming one is OK with not having niceties such as auto-completion and on-
the-fly error checking. Counter-examples are Apple's .pbxproj (Xcode project)
and .xib/.storyboard (Interface Builder) files.

The reason I want this is in case part or all of the IDE is unusable or just
unproductive for some developers. For example, as far as I know, blind
programmers can't practically create a UI in Apple's Interface Builder. Hand-
editing markup like XAML is OK though.

~~~
thomastjeffery
Build tools can always be replaced. Look at autotools, cmake, scons, etc.

------
creatrixcordis
Maybe the reason a lot of “web apps” are so complex and huge has something to
do with over-engineering, and with developers treating laziness as a virtue
of their craft. That breeds a lack of discipline, and then you add pressure
from management to produce faster rather than focus on simpler solutions. I
don’t see how these factors would not creep into any “newweb” platform.
Things are complex because of people, not machines; after all, we made them
so. Unless you have strict standards that are adhered to by lots of parties,
I don’t see a way out of complexity, clutter, and large .js downloads. Then
you add competition, which doesn’t care about rules and loves breaking them
to get a higher margin. That doesn’t add simplicity to the mix.

I happen to like the way the web is going, and this “newweb” business sounds
like the ever-growing .js framework churn. I agree there are problems, but
they're definitely not structural enough to warrant a restructure. Then
again, that is the power of the web: anyone can pursue this.

------
jv22222
Something that should be baked in to a "new web" should be anonymity and
decentralization.

We need to find a way to get out of this big brother scenario we've got
ourselves into.

A "new web" would be the perfect time to fix that.

~~~
mirimir
Yes! We need a web on an overlay network, so nobody truly does know that
"you're a dog".[0] Or where you are. For both users and websites.

0) [http://knowyourmeme.com/memes/on-the-internet-nobody-knows-y...](http://knowyourmeme.com/memes/on-the-internet-nobody-knows-youre-a-dog)

------
trgv
Is the web not salvageable? Look at web assembly. The browser vendors added a
VM and a bytecode format that runs on it. And they did it pretty quickly. In
my mind, this is the way things are going to improve in web-land. Not "throw
it all out and start over."

If someone/a company wants to build an "app browser" and associated protocols
and publish it then the market will decide how that goes. I think it's a cool
idea, but I doubt it would gain traction because almost everything enumerated
in this post is a developer concern, not a user concern.

In software, there's always going to be that temptation to start all over
again due to fundamental mistakes that were made years ago. But I feel there
are far too many entangled interests in the web for that strategy to make
sense. Hey, if someone wants to work on it, god bless them.

~~~
mike_hearn
WebAssembly is symptomatic of the wider problems.

There have been many initiatives over the years to add more languages to the
web platform than just JavaScript. I remember one from Google that tried to
add a form of Java that had better integrated DOM access and didn't use a
separate/traditional JVM (more like another mode on top of V8, iirc). There
was NaCl, in two attempts (NaCl and PNaCl). Of course there were plugins as
well. They were all
killed off with justifications that ranged from the reasonable-sounding to the
quite obviously ideological.

WebAssembly is the latest attempt to scale this wall, and what does it give
us?

A new bytecode format that nothing else uses and that no hardware accepts. It
looks suspiciously similar to JVM bytecode, featuring a small set of
bytecodes and a bytecode verifier, except that there's no support for
expressing patterns that all high-level languages have, like exceptions or
garbage-collected memory allocation.
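
That minimalism is easy to demonstrate: a hand-assembled module exporting a
single `add` function fits in about forty bytes and can be instantiated
directly from JavaScript (a sketch following the MVP binary encoding;
comments mark the section boundaries):

```javascript
// A complete WebAssembly module:
// (func (export "add") (param i32 i32) (result i32))
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d,                               // magic "\0asm"
  0x01, 0x00, 0x00, 0x00,                               // version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function 0 has type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section, one body
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1, i32.add, end
]);

const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
// instance.exports.add(2, 3) === 5
```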

The primary new capability, after years of work, seems to be that you can now
embed C into a web page (as long as you bring your own C library, I guess)
and run it slowly until it warms up. They handwavingly claim that in future
they'll add GC support and that this will unlock other languages, but it will
come at a massive performance penalty that makes it effectively useless: such
languages require complex JIT-compiling runtimes to achieve good performance,
and you won't be able to do that sort of thing inside a wasm module.

The web's designers had many other options that would have been far more
useful but they went with this one. What is their solution to avoid web apps
now importing the world of manual memory management vulnerabilities? They
don't have one: the best you get is preventing the C code from affecting the
browser.

The only thing I find sadder than WebAssembly is the undeserved adulation it
seems to be garnering from the web dev community. I get it - at least there's
progress of sorts. But, sigh.

> _I doubt it would gain traction because almost everything enumerated in
> this post is a developer concern, not a user concern._

That's true of any developer platform. Users ultimately don't care how
software is built, unless problems with the platform hurt the quality of the
apps.

~~~
niutech
AFAIK, WebAssembly is based on asm.js, which can be compiled from any
LLVM-supported language, not only C, using Emscripten. There are full
implementations of Lua
([https://daurnimator.github.io/lua.vm.js/lua.vm.js.html](https://daurnimator.github.io/lua.vm.js/lua.vm.js.html)),
Python ([http://pypyjs.org](http://pypyjs.org)) and even Java
([http://teavm.org/](http://teavm.org/)) in asm.js. You don't need to write
in the C language at all.

~~~
mike_hearn
Yes, I'm using C as shorthand for "C and C-like languages", i.e. ahead of time
compiled with manual memory management. It can do C++ too if you ship your own
STL.

TeaVM has some sort of experimental wasm backend. Presumably it has to ship
its own GC and can't do any kind of runtime profiling, which turns out to be a
20% win for Java and far, far more for other languages (like scripting
languages).

My view is: if you were looking to bring to the web better performance and
higher productivity, let alone more security, _why_ would you combine low
performance with low level languages? That's the worst of all worlds! And
WebAssembly _will_ be low performance for a while - they have to redo all the
work done so far on optimising JIT compilers.

~~~
krapp
It's better than pretending Javascript is a "bytecode for the web," and
transpiling to that. It's better than Flash or Java applets. If we want to
consider the long-term archiving of software and keeping it runnable through
the web (and I think we _really really_ should consider that), we need
something better than what we have, and WebAssembly seems like a step in the
right direction.

------
datashovel
What if anyone in the world who had an idea for a "new web", could build it
tonight, and have it used tomorrow?

My solution to the "new web" problem is radically different (though to be
fair, trying to reinvent the web is in itself a radical idea).

I've posted my idea to many similar threads, so I apologize for repeating
myself, but I feel pretty passionately about it and I feel it's a good idea to
try to spread. Until I start receiving convincing evidence / arguments that
the idea isn't worth spreading I'll probably continue.

Imagine if we didn't have to decide what the "new web" was going to be, but
we did allow that experimentation to take place. I say we shouldn't make it a
requirement to "convince people it's the right thing to do before it gets
built and people start using it".

What if users didn't use "browsers" but "meta browsers" instead: an
application that hosts browser engines. Not only could apps / documents /
etc. be downloaded by this "meta browser", but switching between browsers
would also be seamless to the end user. If they didn't already have a given
browser engine, they would be prompted to download it if a particular app /
document developer decided they were supporting it.

In this "new web", the "document / app" developer decides which browser engine
the "meta browser" should render their app with.

What if a browser engine developer decided he didn't want to support RFC
3986, the URI syntax spec
([https://tools.ietf.org/html/rfc3986](https://tools.ietf.org/html/rfc3986))?

It probably feels like they're on a suicide mission (for their browser), but
why should it be a "requirement"? If the ideas are bigger and better and
eventually catch on (ie. things we haven't even thought of yet) why should
"the web" somehow dictate what core set of ideas are the right ones?

~~~
macroexchange
The WWW uses DNS. Think about it: DNS evolved as an upgraded, distributed
hosts.txt table. Fundamentally, an entry in that name-ledger is a pointer to
a machine you would log into. Who gets to change that ledger? It's static.

Now we have blockchain tech where names can point dynamically to content or
to a cluster of machines. Ethereum's ENS is an early version of this.

Imagine you wanted a global map of places on the Internet itself. Like
OpenStreetMap, but so secure you can rely on it, so that self-driving cars
can use it. Technically it's not impossible anymore. The web can't do that,
because of single authorship of data (far less secure than multi-party
authorship).

Ironically, Mike Hearn has suggested such a system with TradeNet [1], but
apparently has missed what is happening with the evolution of blockchain tech.

The next web will be a transaction system, not a communication system - the
former is a generalization of the latter. If you're interested in building
these kinds of systems - we are a startup building the foundations and are
hiring.

[1]
[https://www.youtube.com/watch?v=MVyv4t0OKe4](https://www.youtube.com/watch?v=MVyv4t0OKe4)

~~~
mike_hearn
My day job is working on Corda, which is a distributed ledger platform. So I
haven't missed it.

When an app identity is based on a public key, you can start to do things like
load them from a BitTorrent style network (if you want to). Or define a
traditional CDN as the primary entry point but have slower/more decentralised
systems as backups. App identity doesn't change so locally stored data and
sessions are not lost.

------
aaron-lebo
You've got browsers today that are finally capable of everything from playing
interactive media and running real applications to displaying unstyled text
documents. They're really good at all of those, and they're based on widely
distributed open-source tech (look how many browsers there are, HTTP servers,
etc.).

You can reach 2 billion people for pennies. What's more, this stuff isn't
hard. Children can learn HTML and CSS, modern web frameworks make it so that
you can build great stuff on the back or frontend in a matter of days or
weeks. Seriously, write a small Sinatra or Flask app. It's easy.

Why is that not good enough? Are we really so ungrateful as to believe that
reinventing everything is the right step now? Nothing is stopping anyone from
taking existing tech and building great, usable, lightweight sites. Seems like
focusing on that would be about a billion times more productive, yeah? You
can't really expect to throw away 30 years of ongoing progress and build a
better experience in months or even years without losing something.

Just seems like a bunch of fluff. He'll never build anything like it. This is
like sitting there in 1600 and saying that the printing press sucks because it
doesn't encrypt documents. You've got so much potential that remains
unharnessed (low-hanging fruit even), and we'd rather build a new system.

~~~
stuxnet79
As someone still familiarizing himself with all the esoteric arcana that makes
up modern web development, I'm always open to reading what-ifs. The modern web
to me is WTF in a lot of ways. I remember reading an article on HN about the
history of CSS that was quite fascinating. I have nothing against the author's
article. It is pretty interesting and I even find his other blog posts quite
illuminating to read.

IMO what's rubbing people the wrong way is not his specific points but how he
is framing them. It really does not matter how brilliant his alternative is
... it's clear that HTTP is firmly entrenched and the network effects run so
deep, it would require a miracle to really "start fresh". I'd really like to
be proven wrong though.

~~~
aaron-lebo
I'm open to what-ifs, but I've seen critiques of enough pretty basic stuff
that he's got wrong to know that he doesn't understand the web as well as he
thinks he does, and therefore his criticism is biased, misleading, and
unlikely to produce any real solutions.

~~~
stuxnet79
Yea? Do you mind sharing the basic stuff that he's gotten wrong? Because I
learned a few things and found the article to be well researched (but what do
I know ...)

~~~
aaron-lebo
[https://www.reddit.com/r/programming/comments/71y6dy/its_tim...](https://www.reddit.com/r/programming/comments/71y6dy/its_time_to_kill_the_web_mike_hearn/dnepq1l/)

------
z3t4
You are thinking about how to evolve the web, while in the meantime the rest
of the world has not yet discovered it. I mean, people are still sending Word
documents via e-mail instead of making web documents with HTML and CSS and a
URL. When Dropbox came out you guys thought it was stupid, while normal
people found it amazing that you no longer had to e-mail or print documents
in order to share them. People are using the web to search for information.
Now we have to make it possible for them to also _share_ information. Sure,
we have Twitter and Facebook, but that's not "the web" even though many
people believe it is.

~~~
sjy
>people are still sending word documents via e-mail instead of making web
documents in HTML and CSS and with an URL

What's so bad about this? I know how to make web documents but I still do
this. Email attachments are available for as long as the recipient chooses to
keep them, and can't be changed once they've been sent. Sometimes that's a
feature.

~~~
z3t4
Maybe a revision system could be baked in. For example, the browser could
show you what has changed since your last visit, and you could see what a web
site looked like on a given date.

------
Entangled
Let's start with the protocol app:// and build from there.

http:// is for hypertext, app:// is for apps

app://microsoft.com/word should be explicit enough
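
For what it's worth, custom schemes already parse as ordinary URLs under the
WHATWG URL model; the missing piece is a handler behind them (`app://` here
is, of course, hypothetical):

```javascript
// Node and browsers share the WHATWG URL parser, which accepts
// non-special schemes like app:// without complaint.
const u = new URL('app://microsoft.com/word');
// u.protocol is 'app:', u.hostname is 'microsoft.com', u.pathname is '/word'
```

In practice, pages can only register handlers for a small safelist of schemes
plus `web+`-prefixed ones via `navigator.registerProtocolHandler`, so a
first-class `app://` would need buy-in from the browser vendors themselves.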

~~~
MentallyRetired
That's not what protocols are for, though. It would still be served over http,
so what you're looking to replace is actually the world wide web, or www. In
other words, [http://app.microsoft.com](http://app.microsoft.com) is what you
meant.

~~~
EGreg
I am not so sure that all protocol schemes require that the transfer be over
http. How about if I type ftp://example.com in the browser? Or https? Or
gopher? :)

~~~
MentallyRetired
They don't because those are different protocols. I'm assuming he means
SPAs/PWAs/hell even XUL, and not some kind of byte code application.

------
inglor
The reason the web is successful is because it is a relatively _small_
platform that you can build on in userland.

I think ideas like including a single binary serialization format are
misguided at best - in 3 years we'll want something better, and then what?
Today we can just use protocol buffers on both sides if we please and be done
with it - the web is about capabilities, not about APIs.
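
The userland point is easy to make concrete: a binary wire format needs
nothing from the platform beyond byte arrays. A sketch of the varint scheme
protocol buffers use for integers (an illustration of the encoding, not the
protobuf library API; limited to small unsigned values so JS bitwise ops
behave):

```javascript
// Varint: 7 payload bits per byte, high bit set on every byte but the last.
function encodeVarint(n) {
  const out = [];
  while (n > 0x7f) {
    out.push((n & 0x7f) | 0x80);
    n >>>= 7;
  }
  out.push(n);
  return Uint8Array.from(out);
}

function decodeVarint(bytes) {
  let n = 0;
  let shift = 0;
  for (const b of bytes) {
    n |= (b & 0x7f) << shift;
    if ((b & 0x80) === 0) break;
    shift += 7;
  }
  return n >>> 0;
}

// encodeVarint(300) yields the two bytes [0xac, 0x02]
```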

~~~
mike_hearn
The web is not small by any stretch of the imagination. It already privileges
_three_ different generic data structure formats: XML, JSON and data-foo
tagged HTML5. It also contains support for many different image formats,
including an entire animated vector graphics language, several quirks modes,
an entire OpenGL stack, a low level(ish) audio API, USB support, rich text
editing that is hardly used because it's too buggy ... and peer to peer video
streaming. That's, like, 1/10th of what the browser alone provides. But nobody
uses just the browser these days, they all add frameworks on top!

But you know, that's also fine. There's nothing wrong with large platforms,
especially if the reference implementation is open source. Big platforms do
more work for the developer. The problem with the web platform is that it's
such a bizarre mishmash of things, often designed in isolation without any
specific app driving the platform forward. So you get design-by-committee
standards that may or may not be implemented on any given browser.

~~~
kowdermeister
> The problem with the web platform is that it's such a bizarre mishmash of
> things, often designed in isolation without any specific app driving the
> platform forward. So you get design-by-committee standards that may or may
> not be implemented on any given browser.

That's called organic growth: evolution with natural selection. I find it a
much better way than having a group of coders sit down in their ivory tower
and lay down the "proper" way I should do things because they know better
than me. Good luck with that.

~~~
mike_hearn
What lives or dies isn't actually chosen by natural selection. It's chosen by
a variety of arcane closed-door meetings between the major browser vendors,
where they agree or disagree to implement each other's specs.

~~~
TeMPOraL
That's still natural selection, with browser vendors executing the selection
part.

The problem with the web is ultimately the same as with everything else in
this industry - it's that _the fitness function sucks_. And that,
unfortunately, is caused by economic incentives being broken, which is not a
trivially fixable problem.

------
tannhaeuser
For apps, I personally like React more than I'm willing to admit. Basically,
React is MVW done right and has kind of won the JavaScript framework wars for
me. My only wish would be that JSX were based on a standard syntax extension
of JavaScript for markup literals in the language, rather than on transpiling
with Babel. And that React could be more readily used with a notion of web
components (eg. layered on top of an independent component API).
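
The gap is narrower than it looks, because JSX is only sugar for nested
function calls; without a syntax extension, the untranspiled equivalent is
hyperscript-style code. React's `createElement` has this shape; `h` below is
a toy stand-in, not React's implementation:

```javascript
// What JSX desugars to: one function call per element.
function h(tag, props, ...children) {
  return { tag, props: props || {}, children };
}

// <ul id="nav"><li>Home</li><li>Docs</li></ul> compiles to roughly:
const tree = h('ul', { id: 'nav' },
  h('li', null, 'Home'),
  h('li', null, 'Docs'));
```

Tagged template literals (as in the `htm` library) are another way to get
markup into standard JavaScript without a build step.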

For docs, OTOH, I think that anything going after this space has to be able to
support all of HTML markup, given the massive installed base of browsers and
web content. HTML as a markup language, however, has seen no love from the
WHATWG, who focus on APIs instead, and W3C's XML and RDF efforts for a more
declarative web are epic fails. I've spent significant time bringing SGML
back to the web; it remains the only standardized technology able to
completely parse HTML ([1]), with the ability to define new markup
vocabularies. One goal here is to come up with a way to publish content on
the web and on p2p at the same time, based on straightforward file
replication, including for limited forms of distributed authoring.

[1]:
[http://sgmljs.net/blog/blog1701.html](http://sgmljs.net/blog/blog1701.html)

~~~
201709User
It's just client-side CQRS/ES with re-invented terms (due to ignorance or web
bubble) slapped on top of a legacy scripting language and a document markup.

------
autarch
The part about SQL makes little sense to me. The author seems to be equating
relational databases with SQL, but I don't know why. Yes, that's how everyone
learns about relational databases (or the approximation thereof that SQL
provides), but SQL is just one (not so awesome) language for interacting with
a relational database. You could provide a binary protocol instead of or in
addition to SQL for any relational database. The protocol is not the
implementation!

~~~
mike_hearn
You're right, I used SQL as a lazy shorthand for "relational database
management system that definitely supports SQL but could also support other
protocols in theory".

------
0xcde4c3db
> To stress a point from my last article, text based protocols (not just JSON)
> have the fundamental flaw that buffer signalling is all ‘in band’, that is,
> to figure out where a piece of data ends you must read all of it looking for
> a particular character sequence

While this is indeed a drawback of many text-based protocols, there's no
fundamental connection between it and being text-based. Many binary protocols
and formats have the same drawback. Conversely, netstrings [1] don't require
scanning for delimiters, escaping any special byte values, or special behavior
for netstrings-within-netstrings. They could easily be used as the basis for a
text protocol (and probably have been, somewhere; BitTorrent uses something
similar but not identical, mixed with delimited fields, and not text-only).

[1]
[https://cr.yp.to/proto/netstrings.txt](https://cr.yp.to/proto/netstrings.txt)
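
A sketch of the encoding (`<length>:<payload>,` with the length in ASCII
decimal): because the reader learns the payload size up front, it never scans
the payload for delimiters, escapes nothing, and nested netstrings need no
special handling.

```javascript
// Encode: length prefix in ASCII decimal, then the payload, then ','.
function encodeNetstring(payload) {
  return Buffer.concat([
    Buffer.from(`${payload.length}:`),
    payload,
    Buffer.from(','),
  ]);
}

// Decode: read the length, then take exactly that many bytes.
function decodeNetstring(buf) {
  const colon = buf.indexOf(0x3a); // ':'
  const len = parseInt(buf.slice(0, colon).toString('ascii'), 10);
  const payload = buf.slice(colon + 1, colon + 1 + len);
  if (buf[colon + 1 + len] !== 0x2c) throw new Error('missing trailing ","');
  return payload;
}

// encodeNetstring(Buffer.from('hello world!')) yields "12:hello world!,"
```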

~~~
hossbeast
tl;dr: a netstring is a string prefixed by its length in ASCII-encoded
decimal.

------
jmull
This has a "let's start all over" feel to it. That's usually not the optimal
way to make big changes.

All of the design principles could be implemented and adopted incrementally
within the current system (and to a greater or lesser degree _have_ been).
This gives the current system a massive competitive advantage. E.g., to the
extent you could convince many people to use a certain unified data
representation (we have many dozens to choose between!), they can adopt it
without swapping out _everything else_ at the same time. They don't need to
switch to the newweb to get the part they want, so why would they? The
entanglements of the current web make a wholesale switch to something else an
extremely difficult sell.

------
arkis22
> 4. Platform-level user authentication

Sounds like an awful idea to me; I consider decentralized identities a huge
feature of the web.

~~~
mike_hearn
Can you explain why email address client certs are less decentralised than
what we have now?

In practice all web apps use the email address as the underlying identity.
Email addresses _are_ a decentralised identity, to a large extent. But the
fact that it's so awkward for end users is pushing people towards solutions
like Facebook and Google OAuth, which aren't bad, but it seems like a better
design to let the browser present identity credentials instead of always
having to go back to a big centralised network to get it. The scheme I propose
is somewhat similar to Mozilla Persona, if that helps.

~~~
christophilus
I may have missed it, but what about shared computers like in a library? I
wouldn't want the library browser to have my certs.

~~~
mike_hearn
You wouldn't want the library computer to have your session cookies either.
There are already mechanisms to do that sort of thing, like Incognito mode or
"log off wipes cookies".

------
dangoor
How would this be meaningfully different from Java Web Start? Or Adobe AIR?
Why would this succeed where those failed?

Why should we expect this platform to be better than Android?

So many of the choices we make when designing systems are tradeoffs: "six of
one, half a dozen of the other." One side is not necessarily "better" than
the other, because different tradeoffs were made. Add up thousands of these
tradeoffs and it appears that a system is full of cruft and should be thrown
away.

~~~
MarkMc
Don't know about Adobe AIR, but Java Web Start was bloated, slow, and ugly,
with a complicated API and poor security. Perhaps this new attempt would
simply have a better implementation.

Currently the most common development approach is to develop a web app, then
later an iOS and Android app. So there must be a deficiency with web apps if
so many companies develop a native app even when they have already developed a
web app. If Android apps also ran on Windows, Mac and iOS then they would most
likely dominate web apps.

------
nicklaf
No mention of Ted Nelson or Xanadu?

------
stablemap
Myriad discussion of part 1:

https://news.ycombinator.com/item?id=15321015

------
chubot
There should be a name for the fallacy in this type of post -- perhaps the
"boil the ocean fallacy".

Something as big as the web is necessarily evolved rather than designed.
Obviously every programmer and their mother thinks they can design a better
web. But that's not how large systems are built.

I take more of a pessimistic view of systems, as in:

https://en.wikipedia.org/wiki/Systemantics

This post seems to have a little too much optimism :)

In particular one slogan from that book is "large systems operate in a
permanently degraded mode". They perform poorly and some part of them is
always broken. There is massive redundancy and inefficiency.

That couldn't be more true of the web. (It's also true of things like
governments and health care systems, but those things also serve their purpose
to some extent.)

But this post does nothing to convince me why that won't be true of a
replacement. It seems oblivious to the facts and forces of technological
adoption. There is only one mention of the word "compatibility" in the post,
and it's in regard to authentication. I'm not sure where all the binary
protocol stuff is coming from.

Without compatibility, it's not very interesting to contemplate, because it
will never happen.

There will be something beyond the web, but it won't be "like the web but with
certain of my pet peeves fixed". It will have to have a fundamentally new
capability, like iOS/Android were to Microsoft Windows, which is another
platform with an enormous network effect. Whatever new platform that comes
along will coexist alongside the web for decades, just like iOS/Android and
Windows do.

EDIT: The style and tone of the book can be annoying to some, but there is
wisdom there. Here are some nuggets of wisdom relevant to the web:

- _One of the problems that a system creates is that it becomes an entity
unto itself that not only persists but expands and encroaches on areas beyond
the original system's purview._ -- This is everybody's complaint about the
web evolving from a hypertext platform to an app platform.

- _Complex systems tend to produce complex responses (not solutions) to
problems._ -- Every new technology introduced on top of the web (e.g. PHP,
CGI, Java applets, Flash, CoffeeScript/TypeScript, PaaS, etc.) is better
viewed as a temporary response than a solution.

- _Great advances are not produced by systems designed to produce great
advances._ -- I believe this applies to the solution proposed in the original
article.

~~~
AlphaWeaver
It seems as if the author makes a nod to that point:

>We also need to focus on cheapness. The web has vast teams of full time
developers building it. Much of their work is duplicated or thrown away, some
of what they’ve created can be reused, and small amounts of new development
might be possible … but by and large any NewWeb will have to be assembled out
of bits of software that already exist. Beggars can’t be choosers.

~~~
chubot
Well, I'm talking more about design than implementation.

It's good he acknowledges the point about reusing existing code. But the
design still seems to be ignorant of the network effects of the web, or any
existing platform. It seems completely incompatible, so I don't see why anyone
would use this new platform.

A good analogy: to break the Windows monopoly, Apple's superior hardware and
OS X barely made a dent, if you count by percentages.

What you really need is iOS, i.e. brand new functionality. That's how Apple
"beat" Microsoft. Platforms are generally not "replaced" with something
similar but slightly better.

(And what he's describing sounds better on some fronts but worse on others.)

~~~
mike_hearn
When Android and iOS were new, they were completely incompatible with
everything else including the web. Yet somehow I find myself using Android
apps every day and many mobile websites I visit try to push me towards their
app version.

Console makers reset their platforms from scratch every ten years or so. New
operating system, new hardware platform, often entirely new CPU architecture.
Xbox is still around.

I think some people may be over-estimating the difficulty of dislodging the
web. Obviously it'd be a long term endeavour. But web apps don't have a lot of
network effects. If anything it's the opposite; web apps hardly integrate with
each other at all, so using one rarely makes another better.

~~~
chubot
Yes, that's exactly my point... There will be new platforms, but they need
significantly new functionality, like being able to carry around the Internet
in your pocket.

There's a reason Steve Jobs worked hard to get Google Search, Maps and YouTube
onto the iPhone, and to ship a completely new web browser optimized for it.
The iPhone was bootstrapped off the web to some extent, and off open protocols
like e-mail. (It didn't have apps in its first iteration.)

Or they need a proven customer base which is willing to shell out money for
games.

The web is the highly unusual case, because it was not developed by a big
company but by a grassroots effort. It was a distributed RFC/Usenet-type
process as far as I can tell. But the web offered significantly new
functionality -- at the time nobody in the world was using networked
hypertext.

The ideas laid out in the post seem heavy on developer pet peeves rather than
user functionality / monetary incentives.

As much as it pains me to say this, security is almost a developer pet peeve,
in the sense that users generally don't adopt products for security reasons. I
think there's evidence they use products in spite of bad security.

Of course, I would very much like a more secure web. I think it would be a
worthy goal to attack only that portion, while leaving out all the rest of
your proposal. As far as length prefixes, it probably makes sense for the
control channel like HTTP headers, but not for the data channel like HTML.
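To make the length-prefix point concrete, here is a minimal sketch (hypothetical functions, not any real protocol's wire format) of a binary header codec in the HTTP/2 spirit:

```python
import struct

def encode_headers(headers: dict[str, str]) -> bytes:
    """Length-prefix each name and value (2-byte big-endian lengths).
    Because lengths delimit the fields, an embedded CR/LF in a value
    cannot smuggle in extra headers the way it can in a text protocol
    (the classic header-splitting attack)."""
    out = [struct.pack(">H", len(headers))]
    for name, value in headers.items():
        nb, vb = name.encode(), value.encode()
        out.append(struct.pack(">H", len(nb)) + nb)
        out.append(struct.pack(">H", len(vb)) + vb)
    return b"".join(out)

def decode_headers(data: bytes) -> dict[str, str]:
    """Inverse of encode_headers: walk the buffer by declared lengths."""
    (count,) = struct.unpack_from(">H", data, 0)
    pos, headers = 2, {}
    for _ in range(count):
        (nlen,) = struct.unpack_from(">H", data, pos)
        pos += 2
        name = data[pos:pos + nlen].decode()
        pos += nlen
        (vlen,) = struct.unpack_from(">H", data, pos)
        pos += 2
        value = data[pos:pos + vlen].decode()
        pos += vlen
        headers[name] = value
    return headers
```

The same trick is much less natural for a document body like HTML, which is authored by hand and streamed incrementally; that asymmetry is the control-channel/data-channel distinction above.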

~~~
mike_hearn
Isn't iOS a good example of my point? Steve Jobs tried the "just write web
apps, we'll keep native to ourselves" line. It was rejected by developers.

I think users will happily assign some value to security if it's offered in a
meaningful way. The virus-resistance of iPhones (and to a lesser extent
Android) is definitely seen as a selling point by many people, mostly for the
iPad.

A lot of developers think users don't care about security. I think what is
happening is that users aren't being offered real security. Rather, developers
say "can I spend X hours this week on security" but the value delivered as
perceived by the boss or customer is often very unclear and is often seen as
near-zero, because people don't really believe programmers can write secure
software. Additionally, security is often seen as a bottomless pit into which
you can empty money forever with no perceptible difference in the product. The
cost is far, far too high, and the likelihood of being "secure" in the end is
far, far too low.

If a platform gains a reputation for security though, as iOS has done, then
people will start to trust that programmers actually mean it when they say "we
can make this secure" and users will start to care more.

 _I think it would be a worthy goal to attack only that portion, while leaving
out all the rest of your proposal._

It can't be done. Otherwise I'd have made such a proposal. The web is so
deeply flawed, security wise, nearly nothing can be preserved. It all has to
be reset.

For instance, switching to binary for the HTTP headers is already done in
HTTP/2, but that hasn't had any noticeable impact on the rate of security
vulnerabilities in web apps. It helps a bit: perhaps HTTP/2 is invulnerable to
header splitting attacks (I'd hope so!). But the rest of the stack is still
seriously flawed and that's where the bulk of the attacks are.

------
shalabhc
In my opinion we should eliminate _shared_ data formats as much as possible.
Instead, try to define the core of this distributed system in a minimal way,
perhaps it only needs an inter module messaging system and a module naming
system. All other services are not standard but just modules.

So something like the browser is replaced by a micro-kernel style program and
all other functionality is in outside processes. Each 'program' has a unique
name in a _global_ namespace - they are downloaded as needed and
instantiated. The programming paradigms and stacks of each program are not
prescribed. Programs instantiate other programs by the global name as
necessary, etc.
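The shape of such a system can be sketched in a few lines (toy names and a toy registry, standing in for real download-by-name resolution):

```python
class Kernel:
    """A toy micro-kernel: all it knows is how to resolve globally named
    modules and route messages between them. Everything else is a module."""

    def __init__(self, resolver):
        self._resolver = resolver   # global name -> module factory
        self._instances = {}

    def send(self, module_name: str, message):
        # Instantiate on demand, like downloading a program by name.
        if module_name not in self._instances:
            self._instances[module_name] = self._resolver(module_name)
        return self._instances[module_name].receive(message, self)

class EchoModule:
    def receive(self, message, kernel):
        return message

class ShoutModule:
    def receive(self, message, kernel):
        # Modules compose by messaging other modules through the kernel;
        # no shared data format is prescribed beyond the message itself.
        return kernel.send("org.example/echo", message).upper()

REGISTRY = {"org.example/echo": EchoModule, "org.example/shout": ShoutModule}
kernel = Kernel(lambda name: REGISTRY[name]())
```

In the real proposal the resolver would fetch sandboxed programs over the network by their global names, but the kernel's interface would stay this small.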

------
pmontra
There are some interesting points. I might be wrong, but it seems that the
author would like to see a narrow set of technologies mandated from the front
end to the back end, much more than HTTP/HTML/CSS/JavaScript (which are
pretty much all that matters now). This didn't happen with the web for a
reason. You can mandate end-to-end technologies when you completely control a
platform (iOS, Android), but control and an open web are contradictory. At
best a controlled stack can succeed in a controlled subset of the web. The
"do as you want" approach is what made the web successful (plus ease of
deployment). So, for example, keep sending data to the browser in the format
you prefer. There are many, some good, some bad, appropriate to different
needs. There is no need to mandate one, which would anyway be obsoleted
quickly. That would stifle innovation.

And about removing the back button from the browser

> the back button can be provided by the app itself when it makes sense to do
> so, a la iOS.

No thanks, IMHO that's the worst UI design decision ever made by Apple. I
always have to figure out how to go back to the previous screen anytime
somebody hands me an iPhone. On Android and in the browser it's just the back
button, always there in the same position.

About sessions and authentication:

> NewWeb would not use cookies. It’s better to use a public/private keypair to
> identify a session.

> [...]

> A client side TLS certificate is sufficient to implement a basic single
> sign-on system

I'm open to those. I would generate a different key pair for every site (no
SSO over the whole Internet, please), add a passphrase, and that's it.
They'll probably ask me for my email for the user profile, so we're back to
email and password, but at least I'll only have to paste a password from the
password manager. Note that I don't use the same email for every site
(personal email, business email, customer's email, @mailinator, etc.), so
using different keys is really important to keep my identities separate. I
guess it's not going to be a simple transition. Cookies provide sessions over
a domain; I wonder how that's going to work with keys. Maybe "use this key
for all of the .x.y domain"?
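The per-site part is easy to sketch with the stdlib (hypothetical names; a real design would feed the derived bytes into an Ed25519 keypair and present only the public half, which this stdlib-only sketch stops short of):

```python
import hashlib
import hmac

# Illustrative master secret, e.g. unlocked from the password manager.
MASTER_SECRET = b"example 32-byte master secret..."

def site_key(scope: str) -> bytes:
    """Derive independent key material per scope, so the identity shown
    to site A cannot be linked to the one shown to site B."""
    return hmac.new(MASTER_SECRET, scope.encode(), hashlib.sha256).digest()

def key_for(host: str) -> bytes:
    # Naively scope the key to the last two labels, approximating
    # "use this key for all of the .x.y domain". (A real browser would
    # consult the Public Suffix List rather than count labels.)
    scope = ".".join(host.split(".")[-2:])
    return site_key(scope)
```

Deriving rather than storing the keys also means nothing per-site needs to be synced between devices except the master secret.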

------
Animats
WeChat.

------
rhabarba
Why not just improve Gopher?

------
Udik
>> Here’s my personal list of the best 5 things about the web: deployment &
>> sandboxing; shallow learning curve for users and developers; eliminates
>> the document/app dichotomy; advanced styling and branding; open source
>> and free to use

I would add:

write once, run anywhere; dynamically adapt to any viewport; incremental
transfer of resources; no deployment; allows linking of views and resources

------
jbergens
I think MS got closer to this with Windows RT and its apps. They had/have a
central store, which caused some problems, but that could be expanded.

The apps ran in a sandbox that allowed more than you might want, but new,
more restrictive sandboxes could be developed.

------
type0
This? https://news.ycombinator.com/item?id=15351141

------
pascalxus
The Web is an incredibly productive environment. HTML/CSS, JavaScript, and
AJAX are the core tech that make the web so productive. Are there any
platforms that can be developed for in less time than the web? Loose typing
is not bad for productivity; it's good for productivity.

The problem with the web is when people try to add libraries and frameworks
that don't make sense for what you're building - and a lot of these

------
dasil003
I like this far better than the first post, at least the author is proposing
some concrete ideas that can be debated, which I think is worthwhile.

However, actually going and attempting to implement this idea from whole
cloth is a fool's errand. I'm not really cynical by nature, so it pains me to
be that guy, but in this case I just think it belies a complete naivete about
how software ecosystems evolve and thrive, and why the web in particular was
successful.

The web you see today bears almost no resemblance to the original proposal.
What got the ball rolling was a very simple document format and protocol that
made it much easier to share documents. That was a unique value-add over
Email/FTP/IRC/Usenet/Gopher that drove early adoption. The reason the web is
so crufty is because all the application stuff was bolted on after the fact,
but here's the rub: if they had designed it for applications from the
beginning, it would have been too complicated and wouldn't have taken off.
There were
tons of proper app development platforms that could have formed the basis for
"proper" cross-platform GUI app development, but none of them could cross the
chasm to true ubiquity the way the web has.

By the late 90s, the amount of effort being invested in the web was far
greater than any single company or organization could ever bring to bear on a
problem. 20 years later it is a deep sedimentary stack of more technologies
than anyone can really keep track of, and yes it's been quite tortured and
abused, and often feels extremely janky.

But tempting as it might be to think you can design something better, you
really can't. You can't solve all the problems the web has solved. It will
seem good when you start, but as you go you run up against the edge cases, and
the scope grows, and then you have to bring in more people to solve the new
problems, and you start to lose control of the design again, and you end up
with an entirely new mess that people write hand-wringing blog posts about.
Either that or you play benevolent dictator and try to control everything, but
then adoption slows and it becomes more of a proprietary platform that never
gains ubiquity.

The only tractable way to improve the web is to pick an area and work on
improving it. It's not as satisfying, because you have so much cruft to work
around, but it _is_ possible to slowly improve things. It takes a long time
this way, but you are leveraging the millions (billions?) of man-hours that
have gone into web technologies to date.

That's not to say the web is invincible, yes the web might be replaced, but it
won't be replaced by someone building a better app platform (that will never
gain traction, mark my words). Instead, it will be something unexpected, it
will be a simple use case that, in unforeseeable ways, over time, creates an
entirely new ecosystem that simply makes the web irrelevant.

~~~
mike_hearn
_> There were tons of proper app development platforms that could have formed
the basis for "proper" cross-platform GUI app development, but none of them
could cross the chasm to true ubiquity the way the web has._

I think this is the heart of our disagreement.

App platforms that were actually _designed_ , like iOS, Android, Java etc have
all been very popular. Smartphones are ubiquitous, but the smartphone that bet
most heavily on the web platform (Palm Pre) was wiped out by the smartphones
that bet on relatively cleanroom designs.

Heck, the Android API isn't going to win many awards for simplicity or
but the BeOS and Danger people had OS design experience and pretty much knew
what they were doing. Android gets far more right (in my view) than it gets
wrong. The entire "app revolution" that followed the iPhone launch was more or
less a huge slap in the face to the web platform. If it was really so hard to
beat the web, why is all the innovation on smartphones in the native app
space, with web devs getting the crumbs years later after various
Apple/Google/Microsoft talking shops have finished finalising and shipping
WebTiltSensor or whatever feature we have in mind?

The key difference between mobile and desktop, in my view, is deployment.
Android and iOS handle deployment and upgrade for you. Desktop platforms only
started trying to tackle that recently, and mostly screwed it up. The web has
a great deployment story.

~~~
dasil003
> _If it was really so hard to beat the web, why is all the innovation on
> smartphones in the native app space, with web devs getting the crumbs years
> later after various Apple /Google/Microsoft talking shops have finished
> finalising and shipping WebTiltSensor or whatever feature we have in mind?_

Because platforms have different strengths. It's obviously not hard to design
a better platform than the web for _apps_. What I'm arguing is that you can't
take one of those cleanroom implementations and turn it into a truly cross-
platform standard.

As popular as iOS, Android, Java, Flash, Qt and whatever else are or have
been, they are still hamstrung by being single-vendor efforts. The web, on
the other hand, is _table stakes_ for any new computing device, _on the
manufacturer's dime_; it's not a cost center or support burden for the
platform "owner". This is a world of difference that's hard to overstate.

> _The entire "app revolution" that followed the iPhone launch was more or
> less a huge slap in the face to the web platform._

Why? Again, they have different strengths. If you need high performance and
access to hardware, then you _have_ to go native. Of course all innovation
will happen in closed environments where the vendor can control the full
stack and move quickly. There was no way a loose set of open standards like
the web could compete with that; the results should surprise no one.

But does this mean apps are going to eclipse the web? No! Because there is
still a high threshold of trust to install an app. As much as the web security
model is a mess, it also has been reasonably successful at isolating the
hardware environment from the on-demand functionality it delivers. Can you
imagine a world where you only ever install apps for everything and never use
the web? Even if you use Google and Facebook's apps, there will always be a
long tail that you aren't willing to install, and the web is there for that
use case. Because of this, companies _have_ to continue developing websites to
serve the top of the funnel, and as they do so, they will demand more and more
standards to tie into hardware etc, so slowly web standards will creep in and
commoditize functionality which today is only available via native APIs.

More fundamentally, I think you underestimate the sort of worse-is-better
strength of the web: document-centric, but supporting app-like functionality.
There are far more documents than apps in the world, and many of the apps deal
with things resembling documents. So the web has this incredible low barrier
to entry where you can throw some documents online, and then slowly build
functionality around it. If you're working on apps all day, and living in SV,
it's easy to see the warts and lose sight of what a powerful dynamic this is
for web adoption.

I hope you can prove me wrong and come up with an idea that can revolutionize
the web, it's just that it runs counter to my observation about the forces
that shape standards and technology ecosystems at a higher level than
individual minds and platforms.

------
mattacular
My prediction: Nothing is going to succeed the web.

------
aaronhoffman
Probably something like Johnny Mnemonic

------
SwellJoe
So...Android? I mean, I get there's a lot of differences, and Android, the
platform, would have to be much more open to begin to compete with the web.
But, it ticks off an awful lot of the boxes he's named. If one could run any
app in the world on Android without installing it and without significant
delay, it'd be a reasonable substitute for the web, I guess (which it can,
because it has a web browser, but we're back to the previously listed failings
of the web).

Of course, maybe one can argue that openness is the killer feature of the web
and unless Google really embraces open, it'll never surpass the web.

But, I still don't buy it. I was critical of the first article, and I'm maybe
even more convinced now that the web won't be beaten by a "better 90s", which
seems to be what's being proposed, in many regards. The author seems to want
to reset the clock just before the web exploded, and build something new from
where we were in about 1994, but with a bunch of lessons learned from the web.

Some specific areas of contention I have:

1. IDE-oriented is a terrible idea, IMHO. IDEs are fine, I guess, but they
always strike me as indicative of insufficient or incorrect abstraction,
rather than being a great productivity booster.

2. I think designing UIs for the web today is better than for desktop apps.
Maybe this is indicative of my lack of experience with desktop apps, but even
when I've tinkered with Android development, I found UI builders to be
tedious and frustrating, _especially_ when it comes to making them
responsive. With Flexbox and Grid, and a component library, I can whip up a
nice, responsive UI in minutes for the web... and I'm not even good at it!

3. On data representation and binary formats, I tend to think everybody
settling on JSON is Good Enough. It's slightly sub-optimal, but it's
standardized on the front and back ends, it is universal, and it can be
compressed using standard, universally available tools. To me, it looks like
a classic "Worse is Better" scenario. Anything else will take years to work
its way into common usage.
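The compression point is easy to check with the stdlib (illustrative data, not a benchmark):

```python
import gzip
import json

# Repetitive JSON, as API responses typically are.
record = {"user": "alice", "scores": [1, 2, 3], "active": True}
text = json.dumps({"rows": [record] * 500})

compressed = gzip.compress(text.encode())
ratio = len(compressed) / len(text)
# The textual overhead JSON is criticized for largely disappears under
# gzip, which is part of why "slightly sub-optimal" is good enough.
```

The round trip (`json.loads(gzip.decompress(compressed))`) recovers the original data exactly, using nothing but universally available tools.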

Which brings me to the biggest issue: By the time a new platform reaches even
a tiny fraction of the reach of the web, the web will have likely caught up.
It is very difficult to overstate how _fast_ the web as a platform is moving
today. No single developer can fathom how advanced a lot of the tools have
become just in the past couple of years. I've been sort of giving myself a
crash course in Node/React/etc. lately, and it's, frankly, astonishing. You
can go down any of dozens of rabbit holes and find incredibly powerful tools
in all sorts of domains. Building apps for the web is stupidly easy once you
get over the (admittedly large) learning curve for putting all the pieces
together and overcome analysis paralysis and stop reading about every new
library and framework.

I guess I remain unconvinced. I come away feeling the same optimism I had for
the web before reading (maybe even more, because for nearly every point he
makes, I can see a clear path where someone is _already_ working on solving
the problem for the web, or it already exists in a rudimentary form), and the
same strong doubt that anything can beat the web short of being the result of
the continuing evolution of the web.

------
Tenobrus
Seems like Urbit ticks at least some of these boxes

------
enriquto
gopher

~~~
MentallyRetired
RIP you glorious protocol

------
X86BSD
I'm honestly waiting for the return of Gopher. I'm not kidding.

