
This is why sites choose to stay vulnerable to Firesheep - ericflo
https://www.google.com/adsense/support/bin/answer.py?hl=en&answer=10528
======
kneath
This is a problem we (GitHub) are facing in a big way right now. Google Charts
doesn't offer an https alternative, so almost all our users get a big "this
site is going to steal all your private information" mixed content warning. We
chose to roll out SSL first, then deal with the hard problem of mixed content
warnings (building ridiculous image proxies) later.
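
The proxy idea, roughly (a sketch with a hypothetical endpoint, not our
actual implementation): rewrite every chart URL so the browser fetches it
from our own HTTPS origin, and have that endpoint fetch the plain-HTTP
original server-side.

    from urllib.parse import quote

    def proxied(chart_url):
        # hypothetical proxy endpoint on our own HTTPS origin
        return "https://img-proxy.example.com/?src=" + quote(chart_url, safe="")

    print(proxied("http://chart.apis.google.com/chart?cht=p3&chs=250x100"))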

I think a lot of developers underestimate how big an impact this warning has
on users, especially in browsers like IE that throw up a dialog on every page
that has it. Developers understand that it's not that big of a deal, but to a
user it looks like the site is full of viruses and malware and is going to
steal all your bank account information.

~~~
Udo
Exactly. Browser makers (including Mozilla/Firefox to a large degree) are
responsible for the fact that HTTPS hasn't become the standard protocol, as it
should have years ago. It's not only the unproductive mixed content warning
but also the insistence of all browsers on accepting only expensively bought
certificates, throwing a very scary and hard-to-overcome error dialog if a
site uses any other kind of cert. While that isn't a problem for big(gish)
commercial sites like GitHub, it presents an insurmountable hurdle for private
sites and small-time projects, for no good reason. For most sites I don't need
"secure" origin verification as badly as encryption. The lack of a verifiable
server address shouldn't mean I get bullied into not using an encrypted
connection at all. But even if the verdict is that you absolutely can't have
one without the other, browser makers should AT LEAST include the trusted root
certs of authorities who offer free SSL certificates, too.

~~~
kijinbear
While your frustration is understandable, I think you're speaking from the
perspective of a tech-savvy person rather than the average user. If browsers
began accepting all free / self-signed certificates, it would be only a matter
of time before something like "Firesheep FX" came along and let random
strangers MITM anybody's SSL session. Some of us might notice when that
happens, but most people wouldn't have a clue unless the browser presented
them with a big scary red warning.

However, I agree with you that we need some good free CAs. The difference
between free and $10/year is bigger than most of us think. Fortunately, there
are registrars such as Gandi that will give you a free certificate with every
domain.

~~~
Udo
> _If browsers began accepting all free / self-signed certificates [...]_

Right now, browsers accept any old unencrypted HTTP connection without any
warning, while unverified but securely encrypted connections are actively
blocked. Tech people can circumvent the block, but normal users cannot. Nor do
they have any reason to, because the warning they're shown sounds like the end
of the world, while any unsecured connection looks perfectly fine to them.
This is something that could be fixed right now to make everybody more secure,
at no cost, but it threatens the business model of companies like Verisign.

Nobody is suggesting that browser makers should display the much-sought-after
"lock of absolute protection" icon on any random SSL connection; I'd be fine
if they reserved that for paid-for certs. I'm merely suggesting they show free
(or even self-signed) certs the same courtesy as basic HTTP, the most
permissive protocol of all time, instead of actively preventing users from
using encryption.

I agree with you about the threat of "Firesheep FX" and believe Wifi
connections should probably all use WPA2, even at coffee shops where internet
access is free. The threat of MITM is real, but the attack can be made more
difficult using a number of schemes, including free certs, which offer far
more protection than any unencrypted link ever could. Yet we are currently
encouraging unencrypted connections while actively blocking encrypted ones.

If HTTPS could have the same UI mechanisms as, say, an SSH connection, I'm
convinced the online world would be a much safer place.
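
SSH's model is trust-on-first-use: remember the key you saw the first time
and only complain if it later changes. A toy sketch of that, purely
illustrative:

    import hashlib

    known_hosts = {}  # host -> certificate fingerprint seen on first visit

    def check(host, cert_der):
        fp = hashlib.sha256(cert_der).hexdigest()
        if host not in known_hosts:
            known_hosts[host] = fp       # first use: remember it silently
            return "ok (first use)"
        if known_hosts[host] == fp:
            return "ok"
        return "WARNING: key changed!"   # the only scary dialog you need

    print(check("example.com", b"fake-cert-bytes"))  # ok (first use)
    print(check("example.com", b"fake-cert-bytes"))  # ok
    print(check("example.com", b"different-bytes"))  # WARNING: key changed!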

~~~
nikcub
_Wifi connections should probably all use WPA2, even at coffee shops_

If you just write the password on the wall, it defeats the purpose: everyone
who logs in is on the same network again, just like a public network.

~~~
eru
Everyone being on the same network isn't too much of a problem. They still
can't read each other's traffic. See
<http://en.wikipedia.org/wiki/IEEE_802.11i-2004> or
<http://en.wikipedia.org/wiki/Wi-Fi_Protected_Access#WPA2>

Every device negotiates its own keys with the access point.

~~~
hannibalhorn
Very common misconception, but it's still a problem. Any client with the
network password can capture the initial key negotiation and then decrypt that
client's subsequent traffic. You can even enter the network password in
Wireshark: <http://wiki.wireshark.org/HowToDecrypt802.11>.
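
The reason the passphrase is enough: in WPA2-PSK, every client derives its
keys from the same pairwise master key, which is computable from the
passphrase and SSID alone. A minimal sketch (hypothetical values):

    import hashlib

    def wpa2_pmk(passphrase, ssid):
        # WPA2-PSK pairwise master key:
        # PBKDF2-HMAC-SHA1(passphrase, SSID, 4096 iterations, 32 bytes)
        return hashlib.pbkdf2_hmac("sha1", passphrase.encode(),
                                   ssid.encode(), 4096, 32)

    # Anyone who knows these two strings gets the same PMK; with a captured
    # 4-way handshake they can then derive a victim's per-session keys.
    print(wpa2_pmk("coffeeshop-password", "CafeWifi").hex())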

~~~
pmjordan
If the shared key is known, it's also trivial to install a rogue access point
with the same SSID and a transparent, tampering proxy without a realistic
chance of anyone noticing.

------
jasonkester
To be accurate, this is not the reason many sites choose not to go with SSL
for everything. The real reason is that most sites don't _need_ SSL for
everything.

I run a travel blogging site, where 99% of all pageviews are from random
people off the internet reading people's trip reports and looking at photos.
Encrypting all that traffic would do nothing except bog the site down for
everybody.

Every once in a great while (in terms of total traffic), somebody will log in
and post something. That tiny moment could benefit from SSL, since chances are
it's happening from a public internet cafe or wifi hotspot. That's the only
time a user is actually vulnerable to this sort of attack, so that's when they
need to be protected.

But when you look at the internet as a whole, the fraction of traffic that
needs protecting looks pretty much the same. When you're showing me pictures
of cats with funny captions, please don't encrypt them before sending them to
me just because you read something about security on Hacker News.

~~~
tghw
What Firesheep brought to people's attention is that the login is not the only
thing that needs SSL protection. The cookies you get after signing in are
often sent in the clear, and such a cookie is just as good as your login for
gaining access.
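
A hedged sketch of what that means in practice (hypothetical URL and cookie
name): the attacker just attaches the sniffed cookie to their own request.

    import urllib.request

    # hypothetical URL and cookie name; the value was sniffed off open wifi
    req = urllib.request.Request("http://www.example.com/account")
    req.add_header("Cookie", "session_id=SNIFFED_VALUE")
    # urllib.request.urlopen(req) would now be served the victim's
    # logged-in page, no password required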

~~~
dazmax
It's not the same, because with someone's password you can completely lock
them out of their account instead of just acting as them.

~~~
tghw
In a lot of systems, you can change the password without knowing the old one
as long as you're logged in. In others, you can change the email address,
confirm only on the new address, and then request a password reset.

So even if you really cover all of your bases and require confirmation at
every step, the least an attacker can do is access your data and generally
impersonate you until you log out of that session (which no one does) or the
session times out (which it won't, because they're still logged in as you).

It's pretty much the same.

------
drivingmenuts
Honestly, how many sites are aware that they are vulnerable?

It seems like you assume that because the security-oriented 0.5% of the web
knows about it, the rest of the web should, too.

For most people, just making sure that their site runs at all is quite enough
for them to handle, and keeping current on the latest vulnerabilities is way
down on the list.

Additionally, fixing a site takes time. How long has Firesheep been out? A
week? Two? You should realize that for many sites, even those staffed by very
competent tech people, a month is the minimum turnaround for even "immediate"
action.

~~~
drivebyacct2
How many sites (that any of us are legitimately worried about) employ
webmasters, developers, sysadmins or others who DON'T know why SSL/HTTPS is
important? You can't honestly be giving Facebook, Twitter, etc. a pass on
understanding very basic concepts... (sniffing, HTTP, cookies)?

Firesheep has been around for 2+ weeks now, but come on, we've all known this
has been possible forever. I'm 20, and I knew how to do this (and did) /years/
ago. I think Firesheep is just what everyone needed.

There are really good reasons why this is taking a long time, and lack of
awareness that the problem exists is NOT one of them.

That having been said, my laptop is now running a LiveCD of x2go's LTSP client
and my desktop computer is running the x2go server. Very near-native
performance and total security. (I trust my desktop's endpoint.)

~~~
dekz
I can understand a possible flaw in a website's attempt to provide secure
transfer of data, but simply disregarding encrypted communication makes me
question their title of web developer.

------
CaptainMcCrank
I read the overclocking SSL post
(<http://www.imperialviolet.org/2010/06/25/overclocking-ssl.html>) and I've
been seeing plenty of follow-up about how SSL is cheap and easy to scale, but
I have yet to see a single tutorial that describes actually implementing those
optimizations or deploying SSL cheaply.

So, to the HN community: is this whole "SSL is cheap" thing a false meme, or
does someone have actual instructions on how to deploy and implement scalable
SSL?

------
al_james
There has got to be a sensible way around this. It seems overkill to require
every pageview to be over HTTPS, even for otherwise public sites. For example,
should these public discussion pages on Hacker News be served over HTTPS?

On my site I am planning the following: operate the login page over HTTPS and
issue two cookies. One is HTTPS-only and the other is sent on all pages. The
public (non-HTTPS) cookie is used only for identification (e.g. welcome
messages and personalisation). However, all requests that change the database
in any way are handled over HTTPS, and we check that the user has the secret
HTTPS cookie as well. Forms often submit to an HTTPS backend and then redirect
back to the public page over HTTP. All account information pages (sensitive
pages) will also be over HTTPS.
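
The check on state-changing requests would look something like this sketch
(hypothetical names; values would come from a server-side session store):

    def may_change_state(is_https, cookies, session):
        # Personalisation only needs the public cookie; anything that
        # writes to the database must arrive over HTTPS and carry the
        # secret cookie too.
        return (is_https
                and cookies.get("public_id") == session.get("public_id")
                and cookies.get("secure_id") == session.get("secure_id"))

    # A sniffer on open wifi only ever sees public_id; secure_id is
    # marked Secure so the browser never sends it over plain HTTP.
    print(may_change_state(True,
                           {"public_id": "p1", "secure_id": "s1"},
                           {"public_id": "p1", "secure_id": "s1"}))  # True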

This way, the worst that can happen via cookie sniffing is that someone can
see pages as though they were someone else. In our case, that is not much of
a risk.

~~~
mike-cardwell
This is just dangerous. For example, if news.ycombinator.com implemented this
dual-cookie method, a man in the middle could intercept the page I'm looking
at now, where I'm entering this comment in a textarea. They could modify the
underlying form to post to the same page as the update form on the profile
page, and set a hidden email field. Then when I hit the "reply" button, even
though I'm posting to an HTTPS page, I'm not posting to the one I think I am,
because the page containing the form itself wasn't protected by HTTPS.

I hope I explained that well enough. Mixed content is _hard_ to do right.
Forcing every page over SSL prevents anyone making any modifications to any
page, and is just inherently safer.

~~~
al_james
Well, I would say it's not 'just as dangerous', as a man-in-the-middle attack
is harder to set up.

Good point though. Maybe this could be solved by including a unique access
code with the form: a hash of the user's ID and the URL being submitted to
(with salting to make it unguessable). Simply check this value upon submission
to make sure it matches the URL seen by the controller. That would prevent
anyone rewriting a form to submit to a new endpoint.

~~~
mike-cardwell
CSRF protection should be implemented even if your entire site is protected by
SSL.

Also, I didn't say "just as dangerous", I said "just dangerous".

~~~
al_james
Sorry for misquoting you.

However, with proper CSRF protection, your man-in-the-middle argument no
longer applies, does it?

------
iwr
Browsers should use two kinds of notifications: "encryption on" (green or red)
and "certificate is present" (green or red). Websites that do banking or
handle sensitive information should be green/green (SSL-on with verified
cert), while ordinary websites could be green/red (SSL-on with self-signed
cert).
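
As a tiny illustrative sketch of the two-light idea:

    def indicator(encrypted, verified_cert):
        enc = "green" if encrypted else "red"
        cert = "green" if verified_cert else "red"
        return "encryption:" + enc + " certificate:" + cert

    print(indicator(True, True))    # banking site: green/green
    print(indicator(True, False))   # self-signed cert: green/red
    print(indicator(False, False))  # plain HTTP: red/red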

------
jules
Is the solution to Firesheep to serve every logged-in page over HTTPS? Or is
that not necessary?

~~~
DrStalker
Any HTTP request that includes the session cookie needs to be secured;
otherwise a Firesheep user will be able to grab the session cookie and use it
in their own requests.
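
A minimal sketch of the fix (hypothetical cookie name): mark the session
cookie Secure, and the browser will never attach it to plain-HTTP requests,
leaving nothing for a passive sniffer to grab.

    from http.cookies import SimpleCookie

    c = SimpleCookie()
    c["session_id"] = "abc123"
    c["session_id"]["secure"] = True    # only ever sent over HTTPS
    c["session_id"]["httponly"] = True  # also hidden from page JavaScript
    print(c.output())  # e.g. Set-Cookie: session_id=abc123; HttpOnly; Secure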

~~~
mike-cardwell
That is the solution for protecting websites from the current iteration of
Firesheep. It doesn't fix the underlying problem, though. If a version of
Firesheep comes out that can do MITM, we might have bigger problems.

The solution to the problem is SSL on every page.

~~~
iuguy
I'm not sure I'm parsing your post correctly, but as I understand it you're
talking about third-party websites accepting responsibility for protecting you
over an insecure network connection. If that's the case, then I think you're
mistaken.

Certainly SSL is not required on every page, and MITM tools have been around
for some time (including fairly friendly ones like Cain -
<http://www.oxid.it/>). At the end of the day, companies such as Facebook and
Twitter have a moral (and in some cases legal) obligation to protect the
information assets you upload to their systems from compromise. Likewise, it
is not unreasonable that you take certain steps to protect yourself.

The current version of Firesheep is a real, known threat. We don't know what
might be in future versions. For protecting against session ID theft, SSL and
the secure flag on cookies are the way to go. For data that doesn't need to be
secure (such as static, publicly available graphics), there's no need to use
SSL in the majority of use cases.

The use of SSL for delivering dynamic client-side code (such as HTML or
Javascript) is an interesting issue, but ultimately the user has to bear some
responsibility for their own actions somewhere along the line. Not every
network is insecure, and not every browser has to support a zillion and one
insecure means of using Javascript.

Rather than using SSL on every page and expecting the websites to do the heavy
lifting, consider not using insecure bearer networks, or using some means of
securing insecure internet links, such as a VPN or SSH tunnel.

~~~
mike-cardwell
I do think websites should try to protect their users over an insecure network
connection, yes.

~~~
iuguy
That's an interesting thought. Where do you draw the line? Should websites
protect users who don't have AV or firewalls, or who share user accounts?

~~~
mike-cardwell
I don't think websites should protect users who don't have AV or firewalls, or
who share user accounts, no. I also don't think websites should protect users
who cross the road without looking both ways. None of those things is relevant
to protecting the communications between the website and the user.

If the user's machine is compromised, that's the user's problem. If the user's
machine isn't compromised, yet the website can't be safely accessed over an
inherently untrustworthy network like the internet, then the website has some
flaws it needs to deal with. SSL is a start. DNSSEC is becoming important too,
and I will be using it on grepular.com when Verisign signs "com" at the
beginning of next year.

------
EricButler
While sites wait for services such as AdSense to support SSL, adding a second
Secure cookie and requiring it on sensitive pages and for destructive actions
can help reduce risk to users. Depending on the site, it may be OK to skip
showing ads on a few authenticated pages. WordPress implemented this in 2008:
<http://ryan.boren.me/2008/07/14/ssl-and-cookies-in-wordpress-26>

This won't protect against active attackers, but it's definitely a step
forward and will make a full transition easier in the future, when possible.

~~~
technoweenie
We spent about a week trying this on GitHub. It works pretty well as long as
you have no ajax requests. We were basically left with these options:

1) Lose the ajax (and spend significant time redoing bits of the site)
2) Scary iframe hacks
3) SSL everywhere

I feel like we made the best choice (I certainly don't mind removing any
chance we'll have AdSense any time soon :). It cleaned up a lot of logic
around determining which pages were important enough to require SSL (admin
functions, private repos, etc).

It's brought on some other issues, though. Safari and Chrome don't seem to
properly cache HTTPS assets to disk, for one. This is an old problem:
<http://37signals.com/svn/posts/1431-mixed-content-warning-how-i-loathe-thee>.
I'm not too worried about increased bandwidth bills on our end; I'm worried
about a slower site experience. We're also seeing users complain about having
to log in every day. Are browsers not keeping secure cookies around either?

------
Groxx
An alternative? : <http://www.tcpcrypt.org/>

~~~
erikano
> Tcpcrypt is opportunistic encryption. If the other end speaks Tcpcrypt, then
> your traffic will be encrypted; otherwise it will be in clear text.

I like that, as opposed to requiring users to install some plugin before they
can even talk to the server.

~~~
SpikeGronim
You may like it, but it really reduces the security. A man in the middle can
make it look like you don't speak Tcpcrypt by manipulating the first few
packets of a connection. It's the same issue as mixing HTTP/HTTPS: the HTTP
parts leave you vulnerable. If encryption is not mandatory, then it might as
well not be there.
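
A hedged sketch of the downgrade (hypothetical wire format): the "do you
speak Tcpcrypt?" signal is itself unauthenticated, so an active attacker can
strip it and both ends silently fall back to cleartext.

    def negotiate(peer_hello):
        if b"CRYPT" in peer_hello:   # marker a MITM can simply remove
            return "encrypted"
        return "cleartext"           # no error, no warning, no lock icon

    print(negotiate(b"HELLO CRYPT"))  # encrypted
    print(negotiate(b"HELLO"))        # cleartext, and nobody notices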

~~~
Groxx
Which requires active attacks, i.e. MITM. And _stopping_ MITM is essentially
impossible; the best you can do is use our current CA setup, which assumes
the original download of the certificate wasn't hijacked.

In preventing _passive_ listeners, like Firesheep, this would work 100%
effectively anywhere it can work at all. The only way it reduces security is
by making people who don't understand MITM attacks feel safer than they are;
at absolute worst, it's as if you didn't have it installed.

------
tapz
Is there an alternative to HTTPS?

------
alexyoung
It's kind of obvious that a major reason is IPv4. Getting another IP so you
can run both SSL and plain HTTP is only going to get harder.

------
hapless
Is there any reason the web server can't set up an SSL reverse proxy to fetch
the AdSense ads? (Obviously cookies would have to be passed through...)

It would cost you more bandwidth, but then there's no annoying warning message
about mixed content.
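
Something like this sketch is what I have in mind (hypothetical setup; in
practice you'd more likely use nginx or similar, and the upstream host is
just illustrative):

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import Request, urlopen

    UPSTREAM = "http://pagead2.googlesyndication.com"  # illustrative

    class AdProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            upstream = Request(UPSTREAM + self.path)
            if "Cookie" in self.headers:  # pass cookies through
                upstream.add_header("Cookie", self.headers["Cookie"])
            with urlopen(upstream) as resp:
                body = resp.read()
                ctype = resp.headers.get("Content-Type",
                                         "application/octet-stream")
            self.send_response(200)
            self.send_header("Content-Type", ctype)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    # Run behind the site's own HTTPS termination so the browser only
    # ever sees https:// URLs and no mixed-content warning:
    # HTTPServer(("127.0.0.1", 8080), AdProxy).serve_forever()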

------
trumbo
What's the deal with that page not having any stylesheets? It looks completely
broken. Am I the only one seeing this?

~~~
ohwaitnvm
Turn off adblock and reload the page.

------
joshfraser
It's one of the reasons. Probably not the only one.

------
yanw
Facebook and Twitter don't run AdSense; it's mostly run on content sites that
don't require you to log into them.

~~~
ericflo
I don't understand your argument here--are you saying we shouldn't mind if the
vulnerable sites aren't Facebook or Twitter?

~~~
ryanto
He's saying that most sites that use AdSense do not require a login, and thus
do not need HTTPS. He is somewhat correct, but not enough for Google to just
ignore this issue.

~~~
ericflo
Most content sites have login systems that people use to customize their
experience, post comments, upload content, and so on. Millions and millions of
people are logged into content sites and are vulnerable to this attack.

Also, I disagree with the premise that AdSense is mostly used on content
sites. It's used on all kinds of websites.

~~~
elliottcarlson
Most notable are the oodles of forums out there that are ad-supported and
require logging in.

------
benblack
Irrelevant contrarian opinion that adds nothing to the debate, but indicates
with certainty I am more interested in being pedantic and scoring points than
having a useful discussion.

~~~
tptacek
Successfully executed.

