> If you’re submitting your login form over HTTPS, that’s good, but it’s not enough. You have to *deliver* the form over HTTPS too.
I'm glad they mentioned it. Too many people think their sites are secure if logged in sessions use https and everything else is http.
Their example is that an attacker could insert JavaScript to steal the password; however, they could just as easily change the form target from https to http, which is even less noticeable.
There's also a more detailed post [0] with reasons and explanations. You can enable this setting in Firefox 44+ by setting `security.insecure_password.ui.enabled` to true.
As I understand it, if not everything is HTTPS, the attacker could just inject JavaScript that changes the page to the login page when the user clicks "Log in". That way, they can still record the password.
Yes, it's disturbing how many sites still do this. Most of the big shopping sites like amazon.com, ebay, Target, WalMart are http until you log in. Until recently it was also very common on credit card sites, though all of mine are https now.
Uhh, what caching advantage are you talking about? Everything that can be served cached over HTTP can also be served cached over HTTPS. Unless you are talking about a caching proxy outside of the website control. In that case, they aren't needed, CDNs solved that problem in a much better way.
> Unless you are talking about a caching proxy outside of the website control. In that case, they aren't needed, CDNs solved that problem in a much better way.
There are still parts of the world where caching proxies are used to conserve limited transit bandwidth; CDNs don't solve that.
"Everything that can be served cached over HTTP can also be served cached over HTTPS. "
Only if you're talking about a browser cache. Any other cache couldn't intercept the data and cache it without alerting the browser (unless you mess around with installing SSL certs).
I wonder whether the overhead of setting up and encrypting secure connections has, by itself, a measurable effect on people's buying behavior.
You don't have to wait for the user to click anything. Just read each key press and send a message out.
Hard to believe this is still a common vulnerability. Of course, even in 2007 a significant number of bank and credit card web sites operated like this.
That's one of the reasons why HSTS [1] was invented. Basically, your website itself informs the browser to never contact it through http. Then a man-in-the-middle, who can inject whatever they like into any http page you visit, simply cannot inject (nor replace) your website, because they (supposedly) cannot fake a valid certificate covering your domain.
You can even submit your domain [2] so that current browsers automatically apply the https-only preference to your domain, even when it hasn't ever been contacted before. Cool, isn't it?
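To make the mechanism concrete, here's a minimal Python sketch of the header itself. The directives follow RFC 6797 and the preload list's requirements; `hsts_header` and its parameters are illustrative names, not a real API.

```python
# Sketch: assembling a Strict-Transport-Security header (RFC 6797).
# The max-age value and the includeSubDomains/preload directives are
# site policy choices.

def hsts_header(max_age_seconds: int = 31536000,
                include_subdomains: bool = True,
                preload: bool = False) -> tuple[str, str]:
    """Return a (name, value) pair for the HSTS response header."""
    parts = [f"max-age={max_age_seconds}"]
    if include_subdomains:
        parts.append("includeSubDomains")
    if preload:
        # Only meaningful if the domain is also submitted to the preload list.
        parts.append("preload")
    return ("Strict-Transport-Security", "; ".join(parts))

name, value = hsts_header(preload=True)
print(f"{name}: {value}")
# Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```

Note the header only takes effect when served over https; sending it on an http response does nothing, which is why the preload list exists for the very first visit.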
For too long, a lot of popular gambling sites such as betfair and skybet have done this. I think betfair is now all https, but I'm not sure about skybet (and I can't check while at work).
Betfair is finally using all https. Their security was terrible because once you logged in, the site went back to http again. This meant that anyone on the same network could grab your cookies and take over your account...
Is there a caching story to tell, here? In other words, is it possible that most web infrastructure is set up in such a way that caching is used to mitigate much of the load on the servers, and moving too many things to HTTPS too quickly would result in a MUCH greater content generation load?
That kind of caching is actually extremely rare in my experience. It's difficult to set up for all but the most simple static sites, and since it isn't available by default the vast majority of sites don't bother.
The times I've implemented full-page caching (generally using Varnish) I've set it up so nginx runs in front of Varnish handling SSL termination, which means that I still have SSL support even though I'm serving through a cache.
I wish they outlined a plan to push this icon out to Stable. Even better, the plan should call for the browser to eventually refuse to submit forms with password fields unless HTTPS was used for both loading the form and submitting it.
I think that developers who are still using HTTP with passwords either don't understand the implications (and a tiny icon won't help), don't care, or don't have "management buy-in" to spend the time to fix it. Having the browser force best security practices will benefit them and everyone using their websites.
The browser should display a scary warning popup when submitting a form over http (either always, or at least when the form contains an input type=password). This would be annoying enough to get management buy-in to implement https, if someone still maintains the app - better than a tiny icon.
Breaking stuff is a last resort, nuclear option. There are many forgotten, old web apps that would totally stop working and people would switch to another, less secure browser as a result.
But it was removed in later versions of Netscape and Internet Explorer, because everyone turned it off as soon as they made their first search engine query.
I remember that, though honestly the internet was a bit different 15 years ago - those were (almost-)pre-HTTPS, pre-public-WiFi, pre-Snowden times. It's time to progress now that the realities and technical capabilities have changed.
Today there should not be a "do not display this anymore" checkbox.
1Password will also refuse to autofill passwords when it can't verify the application's signature (for example, if Chrome hasn't been updated in a while).
That is not particularly scary though. That's the kind of thing a user will automatically press "next" on. This is the kind of warning that looks scary: http://i.stack.imgur.com/2kaXO.png
I have recently deployed Content Security Policy (CSP) on a website. When I first looked at the violation reports, my jaw dropped. The amount of malware (rogue extensions, toolbars, viruses, ...) that is blocked is staggering.
If you really want to help your (clueless) users, never ever serve a login, registration or credit card form without CSP. It really helps - at least until the malware catches on (I already see "Kaspersky Labs" is injecting its domain into the CSP itself).
Note that per the CSP spec, browser extensions should NOT be affected by CSP. The fact that they are in browsers is technically a bug, caused by the fact that once you've injected stuff into a page browsers don't so much track where it came from...
This does mean that currently CSP can stop various malware-ish extensions, but also that it stops legitimate ones (e.g. say an extension wants to apply a certain font that the user finds more readable to the entire page). It's a tough tradeoff.
Yes, I was about to say this - if you have a secure enough content security policy (and the browser in question supports it properly) it will be impossible for an attacker to execute their inserted Javascript (though being able to insert it at all is itself a security vulnerability).
But yes, the best plan is to have HTTPS everywhere, something that looks a lot closer than it once did! Thanks NSA!
> if you have a secure enough content security policy (and the browser in question supports it properly) it will be impossible for an attacker to execute their inserted Javascript
I don't follow your reasoning. Why wouldn't an MITM attacker modifying an HTTP response body to insert rogue Javascript also be able to modify the response headers to strip or alter the Content Security Policy?
I don't understand the distinction between the login form and every other page of the site. If someone logs in and then goes back to normal http, anyone can just grab the cookie and impersonate that already-logged-in person.
I suppose if someone uses the same password for every account they have, then knowing their password is more harmful than just having access to one site... but other than that it seems like a distinction without a difference.
If a website only uses HTTPS for login, then it has to set a cookie for HTTP as well - otherwise how will the user navigate the site after login? Off the top of my head, you could implement this by associating the randomly generated session ID that you assign to all visitors with the login ID.
Regardless, what jordanlev said still applies. The session can be hijacked.
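To illustrate with Python's stdlib `http.cookies` (the cookie name and value here are made up): a session cookie without the `Secure` attribute is sent on plain http requests too, which is exactly what makes it snoopable. Marking it `Secure` keeps it off http pages - at the cost that the user no longer looks logged in there.

```python
from http.cookies import SimpleCookie

# Sketch: how the Secure/HttpOnly attributes change the Set-Cookie header.
# "sessionid" and its value are made-up examples, not any real site's scheme.
cookie = SimpleCookie()
cookie["sessionid"] = "d41d8cd98f00b204"
cookie["sessionid"]["secure"] = True    # browser will send it over https only
cookie["sessionid"]["httponly"] = True  # not readable from page JavaScript

header = cookie.output(header="Set-Cookie:")
print(header)  # the Secure/HttpOnly attributes appear alongside the pair
```

Without that `secure` flag, every plain-http page load broadcasts the session cookie to anyone on the same network.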
Exactly! What is the purpose of being "logged in" if when you then go to browse the rest of the site you are no longer actually "logged in" (because the secure HTTPS cookie isn't being sent on those insecure http: pages).
You answered your own question. If a site works this way it is risking the privacy of user data on itself, but at least it isn't endangering the credentials which may be useable to attack other sites.
Previously, when crypto was expensive, there were (and still are - PG&E is one, and Marriott another) people who thought just doing login over SSL was sufficient.
The best solution is just to make everything use HTTPS. Using HTTPS for all traffic also helps to obfuscate which connections are the important ones.
Certain unnamed three letter organizations and nation states have the computing power to crack HTTPS encryption if they really want to, but using that power is expensive. Making sure all your traffic is encrypted makes it a lot harder for potential snoops to decide which traffic is worth spending the time and money decrypting and which isn't.
By using strong encryption for everything, even just reading wikipedia pages, you're doing everyone a favor by helping to make user privacy that much harder to violate.
> Certain unnamed three letter organizations and nation states have the computing power to crack HTTPS encryption if they really want to
Some ciphers and key lengths are vulnerable, but I do not believe it to be true to say that the NSA can "crack HTTPS", outside of a suborned-CA MITM attack, which isn't at all deniable or subtle.
I don't think many people pay much attention to the address bar, let alone a little lock icon with a red line through it.
If they want to get serious about it, they should just disable http support and require https for all actions. Or throw up a big in-your-face warning on http pages, not just make a small change to an obscure icon that nobody really understands.
> If they want to get serious about it, they should just disable http support and require https for all actions. Or throw up a big in-your-face warning on http pages, not just make a small change to an obscure icon that nobody really understands.
Even though I agree with you, I think that it's still a little bit too early for that. My prediction is that the browsers are going to start experimenting with this near the end of 2016 and that it'll become a part of stable versions of browsers somewhere in 2017.
There are too many security holes and corner cases to serve any authenticated page over http. It's better to have your login page served over https, but if you use authentication cookies, any MITM can easily pick them up from the following http pages and do what they want (for as long as the cookie is valid).
And that's beside a whole raft of issues from having http and https on the same domain. It's just easier to have https everywhere.
For what it's worth, this was not true ten (or maybe even five) years ago.
I'd say it's true today, for most websites, but it's worth keeping in mind that it's a relatively recent phenomenon that HTTPS is now pretty easy to set up, use, and maintain. That wasn't always the case!
Five years ago was 2010. Other than Let's Encrypt, AFAIK, there haven't exactly been leaps and bounds in terms of making https easier, at least for self-managed servers. It's still buy a certificate and set it up in nginx/apache/your favourite load balancer/etc.
I have found Caddy (https://caddyserver.com) really helpful in this regard. By default it grabs a Let's Encrypt cert and serves your site over HTTPS (oh and it handles renewals too.)
Caddy + LetsEncrypt = easy https everywhere :D
> It's still buy a certificate and set it up in nginx/apache/your favourite load balancer/etc.
To a first order approximation, yes, and for small and medium sites, that's pretty accurate. But for very large sites, there can be additional complications and expenses, and that situation has improved over the last several years. Don't underestimate the effect that the downward trend in bandwidth prices over the years has had - at very large scale, those pennies can really add up.
There's a reason why sites like Google, Facebook, Reddit, Tumblr, etc. took so long to add SSL to everything. Or why some sites like Comcast still don't provide it. It's not (always) that they simply don't care or don't have knowledgeable engineers; it's that the logistics of managing[0] SSL at that scale are non-trivial. Arguably worth it, yes, but it's not so straightforward.
[0] heck, even the logistics of paying for it - even with a CDN, the marginal cost of adding SSL is not cheap.
Prominent? A crossed over lock icon in the address bar? Try again.
A prominent warning would be something ridiculous, like a full page cover saying "THIS PLACE IS NOT SECURE – HERE BE DRAGONS!" or something. Browser vendors should do more of this for egregious errors on the publisher's side. Unless users complain loudly that stuff is uncomfortable and broken and scary and what not, you can write articles like this every day of the week and publishers still won't do anything about it, and the only users to care will be the geeks who understand what the damn icon means in the first place.
Use too many prominent warnings and people learn that they happen all the time and ignore them. It's a fine balance, and browser vendors invest a lot of effort in it (I know the Chrome team has given papers/talks at quite a few conferences on the topic).
What I don't understand about authentication over HTTPS, though, is: why not make login a part of the protocol? Wouldn't it be much better to authenticate a user with the user's public key, as in SSH, instead of password authentication under the server's public key? It'd be more resistant to attacks such as MITM or theft of the server's private key. If a user can register a password on a website, why does it have to be a password rather than a public key? The only hindrance is the fact that the protocol doesn't support it.
I have no idea why this easy change hasn't been made in the protocol.
There are sites that can use keys to authenticate. Their usability is miserable.
Key-based authentication is difficult for a layman to manage and understand. My mother can memorize a password and use it across computers. Asking her to do the same with a key will be difficult.
There was, fairly recently, a half-hearted attempt to do that in the way of Persona. Unfortunately, neither Mozilla nor any of the other browser vendors implemented it, and the fallback mechanism was very poor UX.
The most secure approach is to serve everything over HTTPS AND redirect HTTP to HTTPS WITH "Strict-Transport-Security" header [1]. Even this is not perfect, but it's the best approach available AFAIK.
However, one warning: if you like privacy, you should turn off HSTS, or at least clear it on browser close. It's effectively a giant supercookie, and it provides little security benefit over checking URLs for SSL yourself.
I'm puzzled. As a developer, the sites I work on are (mostly) going to be hosted on my local machine. I usually don't bother with all the effort of setting up SSL certificates for my development web server unless I've got an SSL-specific issue to investigate. Is this feature disabled for sites that are local? If not, I'd expect I would just come to ignore it quite quickly. Then when I look at the production version of a site, I'm more likely to continue to ignore it, as I've been conditioned into assuming it's a false indicator.
At the same time, for normal web users I can see how such a warning could be helpful. But it seems normal editions of Firefox won't have this enabled by default.
Or are my development practices unusual in some way?
The comments indicate that the icon will never be displayed on `localhost`.
I'm not sure what the situation is like for websites hosted on a LAN or using something like zero-conf, but I wouldn't be surprised if they do display the icon in those scenarios.
On sites that allow users to embed things like images (forums, comments, etc), in order to avoid mixed content warnings or interstitial confirmation dialogs, you either have to pipe everything through an SSL proxy, or severely limit the types of things users can embed.
I'm not writing login code personally but this sounds like something fundamentally broken in the web and asking devs to fix it case by case is a complete copout. If a browser can detect this shit, why can't it fucking proxy it or do something clever to mitigate it automatically instead of putting up stupid hieroglyphs and expecting to shame developers into duplicating a bunch of work case by case and thereby probably getting it wrong in many cases? .. like.. ff detects insecure login form.. Browser DON'T FUCKING SEND CLEARTEXT PASSWORD (instead sending some encrypted bs to firefox grand central station, where it is securely forwarded through a secure backchannel? Maybe it has to talk to the isp directly who has to inject the password back into your process, and they charge u extra tax for securing your shit.. and then.. gasp.. problem solved and not every aspiring developer needs to become a cryptographer.
Passwords are stupid. The internet is broken. My Spyware infested machine and browser knows all my passwords anyway.
Note to any techies reading this.. fix this problem. Here is your killer app. Disrupt this shit.
Facebook had this problem for years, even after they started hiring industry veterans who would know better (https://www.sslshopper.com/article-how-to-make-a-secure-logi...). Amateur hour lasted for so long that just thinking of their codebase makes me ill.
I don't know why sites don't just use HTTPS for everydamnedthing. It's 2016. SSL is not that computationally expensive and it's just easier to develop an entire site that way anyway (rather than making some pages secure and other non-secure). Just redirect everything to https and forget about it.
I work on a site with a long history. I recently created a benchmark and found that I could make around 500 synchronous HTTP requests to our infrastructure per second (essentially localhost calls). If I switch my benchmark to HTTPS I can only make about 5.
That's a big difference. I'm a developer, not a hardware guy, so I don't know what causes the slowdown for us. I assume it's the actual setup and teardown of the HTTPS connection. That is a fairly significant difference when I want to use APIs via HTTP/HTTPS and I need to make a lot of calls quickly.
My point is that HTTPS still seems to be 100x more computationally expensive than HTTP.
I agree with you, however the bureaucracy at some companies is insane.
In order for me to get a TLS certificate where I work I have to create a CSR, fill in a word document, attach those to a Jira ticket, the operations team will then hand those over to a contractor who talks to our chosen CA. Up to 10 working days(!) later I get an OV certificate back.
Injecting JS into the page with the form isn't my main concern, even -- it's changing the form POST action to their own server rather than the one I think I'm logging into. That's much harder to detect and block without encryption.
The form, all of its JS assets, and the form's API endpoint all have to be secured - and https for everything that contains code. Deploying SRI for web pages over https is another layer of defense against JS tampering.
Also, sending passwords across the wire in any reversible manner is really more dangerous than is necessary. Passwords/passphrases could be salted hashed by the browser in JavaScript using a PBKDF similar to scrypt or bcrypt, before being sent to the backend for constant-time comparison... it just takes a little more prudence and effort, but it's absolutely doable.
> Passwords/passphrases could be salted hashed by the browser in JavaScript using a PBKDF similar to scrypt or bcrypt, before being sent to the backend for constant-time comparison... it just takes a little more prudence and effort, but it's absolutely doable.
This is not safe! Now an attacker just needs to intercept the hashed password and replay that, and he gets to login without knowing what the password is.
Use https. And don't do client-side hashing, it's no improvement.
This doesn't make any sense to me. If an attacker can intercept connections, I'd rather he get the hash than the plaintext password. At least then the actual password is still not compromised, which is good, given how many people share passwords across services. If he can tamper with connections, there's nothing that can be done anyway.
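A small Python sketch of both sides of this argument (all names and parameters are illustrative, not any real protocol): if the server stores and compares the client-side hash directly, the hash itself becomes the credential - capturing it allows replay - but the plaintext password at least never crosses the wire.

```python
import hashlib
import hmac

# Sketch with made-up names: the client sends H(salt, password) instead of
# the password, and the server stores/compares that hash directly.

def client_hash(password: str, salt: bytes) -> bytes:
    # PBKDF2 stands in here for the scrypt/bcrypt-style KDFs mentioned upthread.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = b"per-user-salt"                # would be stored server-side, per user
stored = client_hash("hunter2", salt)  # what the server keeps

# Legitimate login: the client re-derives the hash from the typed password.
assert hmac.compare_digest(client_hash("hunter2", salt), stored)

# Replay: an eavesdropper who captured the hash off the wire never learns
# "hunter2" (good), but can authenticate with it anyway (bad) -- without TLS,
# the hash has simply become the password.
captured = stored
assert hmac.compare_digest(captured, stored)
```

So both commenters are partly right: client-side hashing limits cross-site password reuse damage, but it does not remove the need for TLS.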
It would be very nice if there were a feature in web browsers that allowed users to opt in to something like this:
When there's a password input on a website:
1] automatically take the website's domain and concatenate it with the plaintext password
2] generate a secure hash from 1]
3] Derive some reasonably portable text password (20-30 characters) from 2]
4] Make it so that the original web page never has access to the user's plaintext password
5] Submit result of 3]
Downsides:
1] Stupid websites enforcing password rules other than length (can be worked around mostly transparently).
2] Extra stupid websites limiting password length. (requires user interaction and configurable exceptions for such websites)
3] Can't login with browsers that don't implement this.
This would be very nice for people who share passwords between services. It would also make web service data leaks less valuable to attackers. Users would not depend on the service's password-handling quality and/or the operator's morality.
I use different passwords for each and every service. But many people can't be bothered or don't get the risks of password sharing, so it would be at least some help to them.
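Steps 1]-3] of the scheme above can be sketched in a few lines of Python (the salt, iteration count, and function name are illustrative choices, not a vetted design; steps 4] and 5] would need actual browser support):

```python
import base64
import hashlib

# Sketch of steps 1]-3]: derive a per-site text password from a master
# password and the site's domain. Parameters here are illustrative only.

def derive_site_password(master: str, domain: str, length: int = 24) -> str:
    material = (domain + master).encode()             # step 1]: domain || password
    digest = hashlib.pbkdf2_hmac("sha256", material,  # step 2]: secure hash (PBKDF2)
                                 b"fixed-app-salt", 200_000)
    # step 3]: portable 20-30 character text password
    return base64.urlsafe_b64encode(digest).decode()[:length]

# Same master password, different domains -> unrelated site passwords,
# so one leaked site database doesn't expose the others.
a = derive_site_password("correct horse", "example.com")
b = derive_site_password("correct horse", "example.org")
assert a != b and len(a) == 24
```

Downside 1] from the list shows up immediately: a site demanding "at least one punctuation character" would reject this base64-ish output, so a real implementation would need per-site output transforms.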
Exactly. That's the risk to mitigate. Here's scrypt for js using emscripten on actual scrypt C source... the site has to provide and adjust/migrate the three factors over time to ensure a sane amount of cpu/memory is used: https://github.com/tonyg/js-scrypt
For real-world practicality's sake, the client side of an app would contain the PBKDF or some authentication/authorization code, as opposed to the traditional, slightly riskier "let the server handle all of it" approach... the server still has final say on authentication and authorization; just move some of it into client-side JS. There's really not much impact on FE development, considering most FEs and BEs are co-developed these days anyhow.
Btw, for web devs: two-factor auth is cheap and easy to do. Google Authenticator OTP has open-source libs and requires no calls to Google to work.
You're wrong; it's a strawman. If an attacker can intercept the hash, they could just as easily intercept the plaintext in your traditional server-side architecture. The attacker cannot replay a TLS session unless there is a problem with it; there have been many issues in TLS stacks, but it's still the most widely deployed option.
Furthermore, letting the plaintext travel any further from its owner, or stay exposed longer than necessary, is inherently less secure, because a breach of https would then also compromise users' passwords.
> Passwords/passphrases could be salted hashed by the browser in JavaScript
Not the worst idea from the developer's point of view, but it doesn't protect the user: the site could still serve bad JavaScript which didn't properly hash the password (this, for example, is why Mozilla accounts are completely, totally and utterly insecure, and why Firefox password storing should never be used by anyone who doesn't wish to share his passwords with Mozilla, any Mozilla employee and any state which can compel any Mozilla employee). The browser needs to natively support doing this.
There's also the issue of replays. This topic has been well-studied; I believe that Secure Remote Password (SRP) is currently the gold standard for this.
The worst offender I interact with frequently is Hulu.com. This is a high-profile site with millions of users. It's one of those tell-tale signs as a consumer where I really can't say I trust them with my data. I really wonder, once they have a high-profile "hack" (I assume this will happen), if the state can charge them with criminal negligence?
I love this, but would prefer they didn't re-use the "invalid certificate" icon.
As devs, we use self-signed certificates for building our products, and we have trained QA to ignore the HTTPS with a slash through it as an error that is acceptable in dev.
(And ourselves for that matter)
That makes this one easy to miss.
My preference would be a browser-level warning bar to roll out over the page. Like the one used in 'this plugin is not installed'
This is a huge error, and it's much harder to miss that way.
Hmm. When technology basically took away musicians' ability to sell digital music files, the world pretty much said, "figure out a new way to sell music."
So, "figure out a new way to advertise." Welcome to disruption.
Now this is the use of HTTPS that matters. This should get a major browser warning. It's far more important than "HTTPS Everywhere", which is mostly security theater.
0: https://blog.mozilla.org/tanvi/2016/01/28/no-more-passwords-...