
HTTP 2.0 First Draft Published - Garbage
http://www.infoq.com/news/2012/11/http20-first-draft
======
jgrahamc
Personally (i.e. not speaking on behalf of my employer) I'm not terribly keen
on SPDY being HTTP 2.0. It fixes HTTP/1.1 pipelining by adding a layer of
binary complexity to a protocol that has succeeded by being simple and
textual.

Also, it's only a sort-of layer, because it really depends on details of HTTP
(such as the existence of headers) to operate, so it is not an independent
layer. In networking I'm much happier when layers are independently specified.

Also, I think the implementation of header compression is poor because it uses
a fixed dictionary that is based on current HTTP headers. There's no provision
for negotiation of this going forward. It would be better if the compression
dictionary were negotiated on session creation. Notice how the dictionary has
been updated between standard drafts. Once it's in the standard updates will
not be possible.
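The fixed-dictionary complaint can be illustrated with zlib's preset-dictionary support, which is the mechanism SPDY's header compression is built on. A minimal sketch; the toy dictionary below is illustrative, not SPDY's actual dictionary:

```python
import zlib

# Toy preset dictionary of common header text, standing in for SPDY's
# fixed zlib dictionary (this is NOT the real SPDY dictionary).
PRESET_DICT = (b"GET POST HTTP/1.1 200 OK Host: Accept: "
               b"Accept-Encoding: gzip, deflate User-Agent: Cookie: "
               b"Content-Type: text/html; charset=utf-8 ")

headers = (b"GET / HTTP/1.1\r\n"
           b"Host: example.com\r\n"
           b"Accept-Encoding: gzip, deflate\r\n\r\n")

# Compress with the preset dictionary, as SPDY does for header blocks.
c = zlib.compressobj(zdict=PRESET_DICT)
with_dict = c.compress(headers) + c.flush()

# The same data compressed without any dictionary, for comparison.
plain = zlib.compress(headers)

# Decompression only works with the identical dictionary -- which is the
# complaint: both ends bake it in, so it can never be updated or negotiated.
d = zlib.decompressobj(zdict=PRESET_DICT)
recovered = d.decompress(with_dict)

print(len(with_dict), "vs", len(plain))  # dictionary-compressed is smaller
```

Negotiating the dictionary at session creation, as suggested above, would only add one round-trip of setup while letting the dictionary evolve with real-world header usage.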

And SPDY requires TLS which means that if it were HTTP 2.0 it would require
the deployment of certificates everywhere. That's an added expense for a web
site operator. Unless SSL certificates are suddenly free and trivial to deploy
then that would make HTTP 2.0 complex.

I would much rather see a proposal to change HTTP so that pipelining works,
using textual headers and not require TLS.

~~~
tbrownaw
_requires TLS which means that if it were HTTP 2.0 it would require the
deployment of certificates everywhere. That's an added expense for a web site
operator. Unless SSL certificates are suddenly free and trivial to deploy_

Get the browser vendors to stop being idiots, and treat self-signed SSL
connections as identical to plaintext connections.

Or, even better, get them to implement a notary model [
<http://perspectives-project.org> ] (as a default rather than an extension)
that doesn't allow a single insecure CA to break the security of the entire
web.

~~~
tptacek
The first step towards an alternative to the multiple-single-points-of-failure
CA model is already underway: it's Trevor Perrin and Moxie Marlinspike's TACK
project (currently an Internet Draft). TACK allows sites to
cache certificate pins, typically after first contact, so that any site on the
Internet can have the same protection that Google's properties have today by
virtue of the hardcoded pins in Chrome.

As Moxie Marlinspike has said, on HN even, TACK is a sensible and achievable
first step towards a substrate that we can build notaries or other trust
models on top of. It doesn't require years of study and consideration; it
merely extends something that browser vendors already do for their preferred
sites.
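The pin-caching idea can be sketched as a trust-on-first-use cache keyed by hostname. This is a simplification: real TACK pins a site-controlled signing key with activation windows, not the raw leaf certificate, and the helper below is hypothetical:

```python
import hashlib

def check_pin(cache, host, cert_der):
    """Trust-on-first-use certificate pinning (hypothetical helper).

    `cache` maps hostname -> SHA-256 hex digest of the pinned certificate,
    in DER form as returned by the TLS handshake.
    """
    fp = hashlib.sha256(cert_der).hexdigest()
    pinned = cache.get(host)
    if pinned is None:
        cache[host] = fp        # first contact: record the pin
        return "pinned"
    if pinned == fp:
        return "ok"             # later contact matches the cached pin
    return "mismatch"           # pin broken: treat as a possible MITM

cache = {}
print(check_pin(cache, "example.com", b"cert-bytes-a"))  # pinned
print(check_pin(cache, "example.com", b"cert-bytes-a"))  # ok
print(check_pin(cache, "example.com", b"cert-bytes-b"))  # mismatch
```

The hard policy question is entirely in the "mismatch" branch: whether to hard-fail, warn, or consult a notary, which is why TACK is positioned as a substrate rather than a complete trust model.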

Meanwhile, browsers can't simply treat sites under self-signed certificates as
normal plaintext HTTP sites. The user reached the site through an HTTPS URL,
which promised them security. When the browser detects and warns about a self-
signed certificate, it is telling the user "this site is lying about its
security". The simple way to understand this: start by asking what a browser
should do when the Citibank Online Banking Login presents a broken cert, and
then ask how the browser should know when it's OK for a site to present as
merely "not encrypted" (ie, HN login) and when it's not OK (ie, online
banking). It can't. The browser has to assume that HTTPS sites with broken
certificates are sensitive.

Remember also, the "broken certificate" case is exactly what happens when an
attacker intercepts a TLS connection for a MITM attack.

~~~
tbrownaw
_Meanwhile, browsers can't simply treat sites under self-signed certificates
as normal plaintext HTTP sites. The user reached the site through an HTTPS
URL, which promised them security._

The user reached the site by clicking on a link or bookmark, and doesn't know
or care about http vs https.

 _start by asking what a browser should do when the Citibank Online Banking
Login presents a broken cert_

It should not show the green "Citigroup Inc (US)" at the left of the address
bar.

If I go type "citicards.com" into the address bar, I end up _redirected to_ a
SSL site with an EV cert. If my DNS got hijacked, I would probably end up _not
redirected_ to the SSL site, rather than redirected to a site with a broken
cert. So non-SSL sites are just as dangerous as sites with bad certs, and
should be presented the same way.

 _how the browser should know when it's OK for a site to present as merely
"not encrypted" (ie, HN login) and when it's not OK (ie, online banking). It
can't. The browser has to assume that HTTPS sites with broken certificates are
sensitive._

The browser should visually distinguish sites that are safe for sensitive info
from those that are not. Plaintext and self-signed SSL are both not safe.
Sites with "EV" certs are supposedly safe. Sites with other CA-signed certs
are also supposedly safe, but slightly less so.

So, show EV sites with the green name by the address bar, like recent browsers
do now. Show sites with other CA-signed certs with the little lock icon, and
maybe color it light green. Show plaintext and self-signed sites with nothing
at all, and maybe color the address bar slightly red. But, _do this
identically_ for non-signed and self-signed sites.

~~~
tptacek
Your HTTPS "session" with your bank isn't just one connection that can be
checked a single time when you first connect; it's hundreds of individual
HTTPS connections, each of which needs to be verified, or an attacker will
just corrupt the least obvious connection and use that to break the security
of the whole app.

~~~
tbrownaw
Browsers already complain if a site mixes http and https, why can't they
complain if security levels are mixed at all (plaintext, self-signed, normal-
CA-signed, EV, TACK-pinned)?

------
HerraBRE
I think it will be quite tragic if HTTP 2.0 ends up being a binary protocol.
The discoverability and readability of the basic text based Internet was, in
my opinion, one of its most fantastic qualities. It meant that as a young
hacker I could just look at things and _see_ how they work and then progress
to learning and experimenting by writing very naive partial implementations of
said protocols. I am not sure the bits saved justify the obfuscation of
everything.
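The point about discoverability is concrete: an HTTP/1.1 message head is plain text that ordinary string operations can take apart, which is exactly the "just look at it" experience a binary framing layer removes. A minimal sketch:

```python
# A raw HTTP/1.1 response, exactly as it appears on the wire -- readable
# as-is, and parseable with nothing but string operations.
raw = (b"HTTP/1.1 200 OK\r\n"
       b"Content-Type: text/html\r\n"
       b"Content-Length: 5\r\n"
       b"\r\n"
       b"hello")

# The blank line separates the message head from the body.
head, _, body = raw.partition(b"\r\n\r\n")
lines = head.decode("ascii").split("\r\n")
status_line, header_lines = lines[0], lines[1:]

# Status line: version, status code, reason phrase.
version, code, reason = status_line.split(" ", 2)

# Each header is just "Name: value".
headers = dict(h.split(": ", 1) for h in header_lines)

print(version, code, headers["Content-Type"], body)
```

A binary protocol trades this for length-prefixed frames that need a spec and a hex dump to decode, which is the trade-off being questioned here.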

------
mtgx
My favorite part about it is that everything will be encrypted by default in
the future.

~~~
deelowe
And require SSL certs, which cost how much again? It's quite an expense for
hobby sites, non-profits, and such.

~~~
selectnull
Like most (all?) digital products, the price of SSL certificates will converge
to zero with higher usage. So mass adoption might be beneficial after all.

~~~
zobzu
I believe the Internet should stay free, and making your own website should
certainly stay free.

Right now, you don't have to have a domain. You don't have to pay for an SSL
certificate. You don't have to pay for hosting. The only thing you actually
need to pay for is the link to your ISP.

And it's not like there aren't better, free alternatives either. Ultimately
it's not about the price tag. It's about having a third party controlling your
stuff. It's more about freedom than free as in beer.

~~~
Firehed
Then continue using HTTP/1.1.

SSL needs to verify the site's identity to be effective, period. For a certain
level of trust (EV certs, for example) that requires humans doing work, at
least for now. Humans cost money. StartSSL's free certs work perfectly well
for non-EV requirements, which basically amount to a verification level of
"someone who can read email on this domain has requested a cert for it" -
which can be, and is, completely automated and therefore available for free.

~~~
zobzu
That reply is wrong on so many levels.

First of all, everyone wants the benefits of HTTP/2.0, obviously. Else I'd be
using gopher, thank you very much.

Then, StartSSL is a company that happens to give free certs. For one single
sub-domain. Got two subdomains? Gotta pay. They can also decide to make those
certs non-free at any given moment, if they feel like it.

The only part I agree with is paying for EV certificates. But you should NOT
need to pay and you should NOT need a third party to be responsible for YOUR
certificates if you do not want to.

And again, there's quite a few distributed trust models around that work well
and do exactly that, but get great push back from vendors, since, by nature,
they don't bring as much money back.

------
NathanKP
Does anyone else have any experience setting up SPDY on a web server? Since
SPDY is already supported by Chrome and Firefox, I'd be interested in at least
experimenting with it if there is an Apache or Nginx based solution which is
stable enough to work in production. Of course I would also need to serve old
style HTTP for those that are using a browser which doesn't support SPDY but
if there is a way to serve SPDY to those who can support it I'm all for trying
it out.

~~~
jolan
There's mod_spdy for Apache 2.2 developed by Google employees:

<http://code.google.com/p/mod-spdy/>

And a spdy patch for nginx from one of the main nginx developers:

<http://nginx.org/patches/spdy/>

Wordpress and CloudFlare both use the nginx patch so it should be production
ready.

I use mod_spdy and haven't had any problems.
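On the fallback question: with the nginx patch, SPDY is negotiated over TLS via NPN, so browsers that don't speak it transparently fall back to ordinary HTTPS on the same port. A sketch of what the config looks like, assuming nginx built with the SPDY patch; paths and names are placeholders:

```nginx
# Assumes nginx built with the SPDY patch from nginx.org/patches/spdy/.
# Non-SPDY clients negotiate plain HTTPS on the same listener via NPN,
# so no separate vhost is needed for them.
server {
    listen 443 ssl spdy;
    server_name example.com;

    ssl_certificate     /etc/nginx/ssl/example.com.crt;  # placeholder paths
    ssl_certificate_key /etc/nginx/ssl/example.com.key;

    root /var/www/example.com;
}

# Plain HTTP for clients that never speak TLS at all.
server {
    listen 80;
    server_name example.com;
    root /var/www/example.com;
}
```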

------
wildranter
And here we go again, another design by committee to haunt us. If this piece
of crap sticks, we'll all have to keep using HTTP 1.1, which by the way isn't
that bad. First WebSockets, and now this. Just great.

