Hacker News
Chrome 56 will mark HTTP pages with password fields as non-secure (googleblog.com)
947 points by vladootz on Jan 22, 2017 | 398 comments

Firefox has started to do this recently and it's been fantastically informative and helpful.

It's the one new browser feature I never really considered wanting/needing before, that's really stood out to me as being incredibly valuable since I've started to see the warnings pop up.

Firefox has started to do this recently

"Note that this warning in the url bar is only in Firefox Nightly and Firefox Developer Edition. This has not been released to Firefox Beta and Firefox Release."[0]

So this is not a security feature that most end users can rely on, yet.

[0] - https://developer.mozilla.org/en-US/docs/Web/Security/Insecu...

Firefox Release channel has been logging console errors for this for at least two years.

"Password fields present on an insecure (http://) page. This is a security risk that allows user login credentials to be stolen."

Looks like that url lost its final char [1]

I use DE, so I have been seeing these for a little while, but I think even in the stable channel you can toggle the security.insecure_password.ui.enabled param to true in about:config.

1: https://developer.mozilla.org/en-US/docs/Web/Security/Insecu...

Well, it will soon land in stable as well, I am sure. But it's amazing to see major browsers moving in this direction and educating users about security.

It's pretty interesting to see a few sites loaded over HTTPS that submit forms over HTTP.

I can see it is a great way to warn users.

Also I love FF.

Kinda like how your antivirus tells you about how the formidable threats it saved your ass from today?

Or like "did you know your house COULD have been ransacked today, but it didn't happen!!"

Now all my users are going to hear that my site is insecure, when nothing at all changed.

How long ago did they announce that? I think just a couple months? They should have announced this much sooner.

It's going to hit me hard as my site is pretty niche and driving even more people away is the last thing I hoped for :( My shared hosting doesn't offer Let's Encrypt, and makes me pay to "install" a free certificate anyway. So I have to move everything to a different web host.

> Now all my users are going to hear that my site is insecure, when nothing at all changed.

You're being pretty irresponsible if you aren't using SSL for passwords. Your users should be told that your site is insecure, because it is. You should care more about the security of your users.

If your hosting does not allow SSL, you have an obligation to change hosts for the safety of your users. If you aren't willing to do that, you're negligent and you should stop doing business with the public.

This is a huge red flag. If you really don't think SSL is important, it raises disturbing questions about your approach to security in general. Which other standard security practices have you ignored? Are you using strong hashing for passwords? Are you properly handling input to prevent SQL injection?

"insecure", "huge red flag." For a banking site, an e-commerce site, a webmail, sure. Must an aquarium enthusiast forum be resistant to Man-in-the-Middle attacks? What about weak ciphers? Should every site be resistant to offline decryption by a state actor?

Until users stop using the same password everywhere, your aquarium website is effectively the security for all your users accounts, including their bank.

+1. So much +1.

That is 100% correct and I never questioned that in my comments here.

I have to realize I'm not the average user, I guess, as it's obvious to me to use a different password between my main emails and other services.

It's certainly better for users when sites implement SSL. But to outright tell them a site is "insecure" is bully-ish from Google, and a half-baked approach. How about they disrupt this ridiculous SSL certificate market instead? But they don't have the balls to do that, so it's the website owners who are paying the cost.

Not to mention Let's Encrypt is something that needs to be renewed, and how long will it work or be reliable?

But anyway, not like we have a choice right!

> How about they disrupt this ridiculous SSL certificate market instead?

They have. It's called Let's Encrypt, which is sponsored by (among many other companies) Google.

> But anyway, not like we have a choice right!

No. You do not. Browsers are already beginning to shut off certain features (like location access) for non-HTTPS sites, and HTTP/2 will only be implemented for encrypted connections. This has been coming for years, and the industry has made herculean efforts to make the process easy for service providers.

Deal with it. And if you're frustrated? This, of all fora, is not the place for fact-agnostic venting.

How is that the websites fault?

It isn't the website's fault that users do things wrong. It's the users' fault. However, it is the website developer's job to mitigate obvious problems. We know that users do things that are stupid, so we have to work a little harder. We have to build products that recognise what the basic minimum standard is, and then try to exceed it. If you're transferring passwords across the internet in plaintext then you haven't managed to do that and you need to try harder.

Because it's just another down-side of an already bad idea that has an easy solution.

Security follows a weakest link approach. With your aquarium enthusiasts forum that weakest link would be shared passwords between that forum and other, more important sites.

It's also important to remember that grabbing a password for a higher-value property is only one kind of attack. An attacker could use a compromised account to log in to the aquarium forum and subtly post malware infected links (or change existing links after other members have validated the original one). They can use the account for social engineering ("Hey guys, I'm sorry it's off topic, but my son has cancer and we're having trouble paying the bills, can you please donate here?") - etc. This will hit a large audience of often less technically sophisticated targets, in what for these small hobbyist sites is often a high-trust environment. All without crossing the boundary of the compromised site.

Shower Thought:

Why doesn't the browser hash the inputs for all password fields, then compare them when attempting to submit a form, and alert the user that they are doing something insecure?

Besides issues like requiring javascript or something, it's usually not a useful step. The hash of the password can be stolen just as easily as the password itself. You've just made a new password.

If you salt the password with the url, all you've done is made a unique password per website which is what you were supposed to be doing anyway.
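To illustrate the parent's point, here is a rough sketch of that derivation with standard tools (the origin and password are made-up examples; this shows the idea, not any real browser behaviour):

```shell
# Derive a "per-site password" by hashing origin + password together.
# Whoever intercepts the derived value has simply captured a new
# password for that one site -- which is exactly the objection above.
origin="https://aquarium-forum.example"
password="hunter2"
printf '%s|%s' "$origin" "$password" | sha256sum | awk '{print $1}'
```

The output is unique per origin, which is precisely what a unique password per site already gives you.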

The point was for the browser to warn the end user about password re-use.

The browser doesn't need javascript to see the contents of a password field, or to show an indicator in the browser's chrome. It's the browser.

If you salt the password with the url, all you've done is made a unique password per website which is what you were supposed to be doing anyway.

Note that browsers can already store password lists (e.g. Chrome settings, search "manage passwords"). There would just be an extra step to compare those passwords together.

Because my passwords for google.com, gmail.com, youtube.com, and google.co.uk are exactly the same, and the browser has no way of knowing that that's okay.

(Google specifically has probably rerouted everything through google.com these days, but the general problem exists.)

It's a real problem; the new anti-phishing protocols (U2F/UAF) have some ideas.

The Web Origin Concept - https://tools.ietf.org/html/rfc6454

They also require the server to provide a list of Origins that are valid for the protocol; if the domain you're logging into is not in the list, the server's challenge will not be signed. It's called AppID in the protocol.

See: https://fidoalliance.org/download/

I have this exact problem with LastPass. One of my few pain points with it.

You can define equivalent domains in LastPass to solve that.

Yes!!! Thank you!!!

Wouldn't that essentially give an attacker a nice list of websites where the user uses the same credentials?

Sounds kind of rube-goldberg-y compared to automatically generating and managing a unique password for every site.

Probably not a bad idea. But at that point you're fighting human nature.

Yes. Every website. Because if only the sensitive sites are strongly encrypted, then bad actors and authorities know exactly where to focus their efforts in trying to steal information. When everyone is strongly encrypted, attackers are stretched much more thinly.

Besides, it's easy and free these days. Unless, apparently, you use some crappy shared hosting provider. Get a VPS, man! They're cheap!

Hell, even shared hosting like Dreamhost, which I use for a number of old projects I host for other people, has built-in LetsEncrypt functionality. I was setting up a WP blog for a family member the other day, saw the option, and it was super easy. I've added to my backlog to turn it on for all my other sites as soon as I've confirmed it won't break anything (i.e. assets loaded via absolute URLs that use http). But even that is easy with a WP plugin I found that rewrites all URLs you list in pages/posts; while it's probably meant for changing the URL of your blog from abc.com to def.com, it works flawlessly for going from http://abc.com to https://abc.com.

> Hell, even shared hosting like Dreamhost which I use for a number of old projects I host for other people has built in LetsEncrypt functionality.

How do they do this? How do you set up an SSL cert on a shared host (aren't SSL certs tied to an IP)?

I have a site on a virtual host (via Hurricane Electric), and I don't want to move to another hosting provider (cloud hosting) if I can avoid it. But unless something has changed (which I admit it may have — I am not up on all the latest web hosting tech), my understanding was that you can't have an SSL cert on a virtual host.

This is no longer the case: http://webmasters.stackexchange.com/a/13990 You can have multiple sites on the same IP, each with its own TLS cert.

After I posted I did some research and saw that, but thanks for posting the clarifying link.

I also found that my hosting provider (HE) does have something where on vhosts they have an SSL side - but as far as I can see they don't support (that is, by default or whatnot) Let's Encrypt certs.

However, digging deeper - there may be a way for me to set it up; I'd need to use a different LE client that doesn't need root access (there are a few), then I'd also have to set up a cron task to renew the cert every (< 90) days or so. Then update all of my pages, templates, code, etc to use https instead of http where applicable...a not insignificant amount of work, but doable.

I might still fire an email over to HE — maybe it's finally time for me to move away from them and over to Digital Ocean or something that supports LE certs out of the box (I'd still have to fix the links on my site — but then again I've thought about just revamping my site again — it's due for it)...
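The cron-based renewal mentioned above might look something like this (a sketch; `$HOME/bin/le-renew.sh` is a hypothetical wrapper around whichever non-root LE client you pick):

```
# crontab entry: attempt renewal on the 1st and 15th at 03:00,
# well inside Let's Encrypt's 90-day certificate lifetime
0 3 1,15 * * $HOME/bin/le-renew.sh --quiet
```

Running it twice a month means a single failed attempt still leaves plenty of time before expiry.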

Wouldn't putting someone (whose server experience level warrants shared hosting) in charge of a VPS create more security issues?

Linux is fairly secure by default, so most of what that admin has to do is just not screw it up. Turn off SSH passwords and disable root login, then read a hardening guide for their particular tech stack. Once it's set up, you just need to log in once in a while to update software.

Also backups, if it needed to be said.

Well I can configure a Linux vm for development I guess I could handle that.

Wouldn't these VPS come with a sensible default?

It's the only way to learn...

How cheap? Would you have a suggestion? Thanks.

DigitalOcean.com starts at $5/mo for a VPS that is going to be faster and better than most shared hosts would be. Of course you can scale it up and down as needed. Their $10 a month server is actually really powerful for any sites that get less than say 10,000 visits a day. They even have one-click wordpress installs now among many other open source programs.

I use DigitalOcean now for almost all of my sites. I have abandoned wordpress lately and so I am using custom built sites for all my projects. DigitalOcean is awesome.

Low-end dedicated VPS runs around $5/month. See DigitalCloud or Amazon Lightsail.

If you want to save a bit more money and are willing to commit to a 1-year contract, you could even get a t2.nano instance from Amazon EC2 at around $3/month.

If you check out deal aggregation sites like LowEndBox you can usually get one even cheaper than that (I pay $20 annually for mine)

I think the use case is people browsing your site in a coffee shop. They wouldn't shout their password for everyone in the room to hear, but that's exactly what their computer's doing. Yeah, probably nobody overhearing the password is going to use their account to start posting embarrassing nonsense. But it's good for their user agent to inform them that that could happen.

State-sponsored actors aren't going to try to brute-force the encryption so they can post to your aquarium forum, that's true.

Do you ask for personal information as part of your signup process? First name, last name, email address etc? If you don't encrypt the data or take basic precautions against unauthorised access you're likely breaking Data Protection law.

If you've already implemented a web server with SSL then "weak ciphers" might bother you. As you don't already have it configured, you're no worse off configuring a fully TLS 1.2-compliant web server with SHA256-signed certs and the ChaCha20-Poly1305 cipher suite. It's just a configuration option if you're doing it for the first time.
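To the parent's point, the configuration really is only a few lines; here is a sketch of an nginx server block with TLS 1.2 only and the ChaCha20-Poly1305 suites preferred (domain and certificate paths are placeholders):

```nginx
server {
    listen 443 ssl http2;
    server_name example.com;

    # Placeholder paths -- a Let's Encrypt client writes these for you
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    # Modern protocol and cipher suites only
    ssl_protocols TLSv1.2;
    ssl_ciphers ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-RSA-AES128-GCM-SHA256;
    ssl_prefer_server_ciphers on;
}
```

Since there is no legacy config to migrate, starting from a modern baseline like this is no harder than starting from a weak one.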

I think of it like herd immunity. Users typically reuse passwords between sites, so if someone leeches the passwords out of the aquarium forum they may get their Gmail password as well, etc.

Yes, every site should be. The average user shares their niche aquarium site login username/password with their gmail login and bank and everything else.

In the physical universe we occupy a given user's most security-sensitive site is exactly as vulnerable as their least security-conscious site. It behooves us as professionals to take that fact seriously.

>Yes, every site should be. The average user shares their niche aquarium site login username/password with their gmail login and bank and everything else.

So what? Not every site has a login, or stores details about users. What about sites that are purely informational? If a site doesn't have passwords, why are you worried about users re-using passwords?

Not to mention, underlying my sentiment in my comments here, is that basically EVERY website out there can be broken. We hear about this every day now, Yahoo, Dropbox, and on and on.

If someone skilled wants to break your site, and they have a good reason to, they will. This is especially true for small sites / forums / blogs which the owners can not reasonably protect the way a corporation like Facebook can.

So on those smaller "hobby" / community sites it should be a given that using good passwords and precautions is necessary, as it always has been and a lot of people in my audience use dummy emails and tend to shy away from real names, etc.

So that's my main beef. Google is bullyish here, and is hitting the small guys, the pet projects, the "garage band" developers, and doesn't give these people a simple upgrade path. Hence, more and more I feel like pet projects and websites are going to disappear in favor of using third parties and I think this is a downside to all this fear mongering.

It's necessary but it's a painful change. The web isn't the playground that it used to be and I guess that's just the way it is.

> basically EVERY website out there can be broken

False equivalence. There is a huge difference between the significant effort required to break these big sites, and then a script-kiddie running a wifi sniffer at a Starbucks.

> Not to mention, underlying my sentiment in my comments here, is that basically EVERY website out there can be broken. We hear about this every day now, Yahoo, Dropbox, and on and on.

Yes. They can. By putting in a substantial effort, in order to break big sites, which probably isn't worth it for the small fry. But if you're not using SSL, they don't need to put in the effort on site-specific exploits - they just need to be listening on the public wi-fi.

> So on those smaller "hobby" / community sites it should be a given that using good passwords and precautions is necessary, as it always has been and a lot of people in my audience use dummy emails and tend to shy away from real names, etc.

Ummmm.... what exact hobby/community sites are you talking here? Judging by most studies on the matter, I think you have an inflated opinion of your users' security practices.

I've never said SSL isn't important nor that I don't care about it.

My beef is how Google forces this change on everyone, but at the same time haven't the balls to shake things up and make SSL easily available for everyone.

Of course, as there is a massive business out there selling empty-air "certificates" which are just numbers in a database requiring next to no maintenance, for princely sums.

THAT is lame on Google.

But I guess this is the transition now that is going to be painful for lots of small sites / apps like mine. I genuinely never heard of this a year ago, so they should also have made a big announcement of this much sooner to give time to small guys like me to prepare.

As for the general security question: I understand the "insecure" aspect is mainly related to public networks, and indeed nowadays it's increasingly common to use public wifi networks on the go.

But some of my reaction is also, implicitly, that using a good password was always a good measure, before or after this change. Hence saying something is "insecure" outright is somewhat bullyish on Google's part. Of course it's insecure; so is using a car.

You're making lots of assumptions. My passwords are encrypted and "salted". My site properly handles input; I'm not that dumb, thank you very much. I like to question decisions like these. Just have to vent a bit I guess. Yes, it's a good thing, but I can't help thinking it's still bullyish and a half-hearted solution from Google.

In any case it looks like the first notice of this won't be as bad as I thought, it's a small "Not Secure" text... won't scare users too much while I move host and add SSL.

> Of course, as there is a massive business out there selling empty-air "certificates" which are just numbers in a database requiring next to no maintenance, for princely sums.

Which is not Google's business. Google does not have the obligation to make your job easier. As a browser vendor, however, it does have the obligation to protect its users.

> But some of reaction is also implicitly that using a good password was always a good measure before or after this change. Hence saying something is "insecure" outright is somewhat bullyish on Google's part. Of course it's insecure, so is using a car.

There is no such thing as absolute security. That does not mean that security is a meaningless adjective. Sending passwords over unencrypted HTTP is demonstrably less secure than sending over HTTPS - it opens up the user account to compromise from any network host anywhere on the path from them to your server.

> My beef is how Google forces this change on everyone, but at the same time haven't the balls to shake things up and make SSL easily available for everyone.

Really? https://letsencrypt.org/ Chrome is listed amongst the major sponsors.

I don't even use, or have much love for, LetsEncrypt as it happens because it was a PITA to set up with node when I tried it. But even without that getting and using a certificate issued by Cloudflare was easy.

Creating a self-signed certificate for dev and test is also pretty easy. It just takes a handful of commands in bash: http://www.akadia.com/services/ssh_test_certificate.html. It's the work of literally 5 minutes.
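For reference, the self-signed route really is quick — a sketch of a single non-interactive command (filenames and CN are placeholders; browsers will still warn on self-signed certs, so this is for dev/test only):

```shell
# Generate a 2048-bit RSA key and a self-signed cert valid for one year.
# -nodes leaves the key unencrypted so the command runs without prompts.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout key.pem -out cert.pem -days 365 \
  -subj "/CN=localhost"
```

For a public site you'd use a CA-issued cert instead, but this is enough to develop and test HTTPS handling locally.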

What legitimate reason do you have not to use SSL? It's free.

Are you really saying that your site would be no more secure with SSL? Because that is objectively, provably false. If your clients are paying you to make sites like this that is borderline professional malpractice.

If people entered their good passwords on your site without noticing that it was unencrypted, then great, they should now consider that password compromised... As a user, as soon as I notice it's unencrypted, I'm _going_ to assume the stored password is probably not encrypted nor salted either, and other users will probably assume the same.

(Of course, even better would be a different password per site, but...)

> But I guess this is the transition now that is going to be painful for lots of small sites / apps like mine. I genuinely never heard of this a year ago, so they should also have made a big announcement of this much sooner to give time to small guys like me to prepare.

Pretty sure they did. Sorry, but if you're going to be the one responsible for keeping a website online, you have at least some responsibility to keep an eye on tech news just to see if there are any major security breaches or changes in how the web will work coming up. If you don't have time to do this, you really ought to take the site down and move the functionality to some other type of hosting where somebody else takes care of this for you. Otherwise, you may find your site hacked and running a spam server or serving kiddie porn or something one day.

> You're being pretty irresponsible if you aren't using SSL for passwords.

So you're gonna tell me the owner of this site is irresponsible because it has a page with a password field that is not using SSL? http://www.w3schools.com/html/tryit.asp?filename=tryhtml_inp...

How can you make any claim without having any idea what (if anything) the password is protecting?

I don't see why you (or w3schools) would object to a warning being shown on that page. It's put forward as an example, it shouldn't be treated as real and something requiring security, so showing a generic warning confirming it's not secure is perfectly harmless and even expected

This is such a weird argument to have when incentivizing w3schools to use SSL would be a good thing. They can clearly afford the cost of acquiring, installing, and maintaining an SSL certificate. And as a website that teaches people about the web, they should be setting an example by using SSL. Alternatively, they should be setting an example by demonstrating that password prompts that aren't over SSL will cause warnings.

But even if it were legitimate for w3schools to not use SSL, security is about tradeoffs, and the tradeoff is that it's absolutely worth displaying a warning on one page that arguably doesn't need it instead of not displaying it on millions of pages that definitely do.

It doesn't matter what the password is protecting on any particular site. Password reuse is common enough -- who knows what else the user is protecting with that password.

This is actually a good example. The password clearly isn't protecting anything. It's a tutorial on how to make a password field! Nobody is going to put a valuable password in there.

And now they'll have an extra reminder not to enter a real password in an example field. Seems good to me.

> who knows what else the user is protecting with that password

How does it even matter when the password field is never even read? There are better alternatives. Chrome could just give a security error when the password is actually accessed. Or alternatively it could prevent the page from storing any data locally or sending any data to any server if the password field is non-empty. Just because a password field exists that doesn't mean the page is insecure.

If you have a password in a form that gets submitted to the server and you're not using TLS, then anybody between your user and the server (which is a lot of people these days) can read that password and the associated username (and all data ever sent) in plain text. There's zero confidentiality.

If you're putting a password field on a page where nothing is ever sent over the wire, I'm not sure what value that password field is really adding, anyway. Might as well swap it for an input and, voila, your users won't have any warnings.

> If you're putting a password field on a page where nothing is ever sent over the wire, I'm not sure what value that password field is really adding, anyway.

You don't think it was adding anything in the example page I just linked you to?

Only a false sense of security.

Since the page is loaded as plain text, it can also be altered by anyone with network access between the server and the user. Javascript can be very trivially injected that simply sends each keydown event to the server, giving away the user's "hidden" password.

So even if the code you wrote doesn't ever send the password input to the server, that doesn't mean code hasn't been injected by some third party by the time it gets to your customer/user.

Can't do that when JS can access the field contents asynchronously via the DOM. Blocking JS execution for a user prompt wouldn't fly, either.

The password form on that page _IS_ insecure and it's good that the user is given information about that. They can then make a decision for themselves about the lack of security and how it affects them and the page they are on.

Are you seriously suggesting that a password field in an online code editor on an HTML tutorial site is comparable to the situation we are discussing here?

The point is the result is the same — Chrome will flag that the page is insecure.

Because it is insecure. Sure, that form doesn't actually do anything, but why not help and train visitors to expect security and know how to spot it?

w3schools is for web developers who are learning. Maybe they'll see that the page is marked Insecure and the lesson they'll get from it is that they need to secure their logins.

And the user will know they shouldn't type a password they want to keep secure into that field. Makes sense to me.

I am not a security expert, but as I understand it, passwords sent in the clear are vulnerable to being intercepted. Even if users of your site don't have much to worry about from those accounts being compromised (this may or may not be true), lots of people use the same password for more than one login, so their accounts on other sites could be compromised too.

That's definitely a significant security risk, even if it wasn't being explicitly labeled before now. It sucks for independent website operators like you, but I'd probably blame your hosting for making it difficult to secure your site rather than browser vendors for protecting their users.

I know I guess I just have to vent some frustration.

Time to move on I guess.

Does anyone have good hosting suggestions for a web app that has a 1GB database and a few thousand active users?

I can only afford ~10-20 EUR a month on shared hosting atm.

Try OVH. $3.5/mo for a 2GB RAM VPS https://www.ovh.com/us/vps/

I'm currently on OVH. If you've a tight budget, I'd highly recommend. They're a LetsEncrypt sponsor so they offer free SSL out of the box https://www.ovh.ie/news/articles/a2224.ovh-your-free-ssl-cer...

Have you tried Amazon Lightsail or Digital Ocean? Both of them give you more than a paltry 1 GB for their $5/mo plans.

Seconding DigitalOcean. You can get a 20GB SSD + 1000GB xfer for $5 a month.

Recommending against DigitalOcean: https://gist.github.com/justjanne/205cc548148829078d4bf2fd39...

TL;DR: Too expensive.

(And, additionally, they tend to fuck over customers who paid them money: "$100 free credit!" "You only need to pay $5 to activate your free credit!" "Sorry, but because you didn't use it, we removed your free credit!")

Dreamhost is in that price range and has LetsEncrypt built in (it's just a checkbox when configuring the domain). I don't use them for my real projects but it's great for blogs and low-traffic sites (a couple thousand users should fare fine, I'd think).

> I can only afford ~10-20 EUR a month on shared hosting atm.

Sounds to me like you can move to a better host that does provide LetsEncrypt and save money on your hosting costs to boot!

Thanks so much for all the suggestions!

LDHosting's shared hosting costs me 35€/year, gives 5GB of disk space and lets me upload my own cert for free. I've been with them for a few years and the uptime has been excellent.


9 EUR, 2 cores + 6 GB RAM, 40 GB SSD.

It still baffles me that someone is using shared hosting services and uploading their PHP files via FTP...

Scaleway has 50 GB disk and 2 GB memory for €3 (+VAT)/month

Webfaction still rock

I think you could probably just point your DNS at Cloudflare to proxy your site through them; their service includes SSL plus some extras like caching and such for free. I've used them for a handful of projects and it's worked great.

That said, it will still be insecure because of the unencrypted path from Cloudflare to your server, but it will hide the error.

Cloudflare will provide you with certificates they generate and verify, but that won't be accepted by anyone else (no cost because of that). This keeps the data secure between you and them. Obviously, you are still trusting Cloudflare in the middle, but it's still less trust required.

If you can install a certificate, you can already get a real one from Lets Encrypt (you don't actually need to run their client on the server). The problem is that many shared hosting services are still stuck in the past, and don't let you use SSL/TLS at all.

Without running the client, that means manually replacing the cert at each expiry, and the lifetime of LetsEncrypt certs is very short (90 days). That introduces the possibility of forgetting or messing it up.

I agree that the best option is for shared hosts just to build in support for LetsEncrypt.

Hmm, so let's say I'm hosting my static files on S3. I've currently got CloudFlare setup in front of it but that apparently doesn't help.

Anything I can do other than not using S3?

Use CloudFront? Took me about an hour to set up for my S3 based blog, free TLS, http/2 and IPv6 without any setup apart from a checkbox.

Right, so I've currently got CloudFront in front of it, but doesn't that move the problem? Now the connection between CloudFront and S3 is unencrypted.

(I'm probably understanding this wrong, but I'd like to understand why.)

For some definitions of "insecure".

Hmmmm. -4. I fleshed out my thoughts in slightly more detail in another comment: https://news.ycombinator.com/item?id=13458224

If you have control of your nameservers you can use cloudflare's free TLS offering and keep your current webhost.

What about actually solving the problem instead of tricking the user into believing their data is encrypted while being transferred to you?

I'm personally in favour of Cloudflare as the simplest solution — even simpler than letsencrypt. However, there are a few caveats. They tend to hit some countries with a Captcha unless you disable it. Might not be an issue. Their "Flexible SSL" is controversial as it only encrypts from the client to them — not from them to the server. Personally I think this covers the most obvious threat models and is probably "good enough" for a lot of use cases.

You have a few thousand active monthly users (since it's a web app that requires an account, I'm assuming that corresponds to 50-100k page views per month) and you can't recoup 10-20 EUR a month to cover server costs?

I think it's time for a bit of light monetization.

Perhaps johndoe4589 doesn't want to monetize?

(Asking users for donations might work where advertisement perhaps doesn't.)

> Now all my users are going to hear that my site is insecure, when nothing at all changed.

Correct, nothing has changed, it has always been insecure.

So what? It's the harsh reality of insecure HTTP.

Maybe your website has some important information and the user is just unaware of the dangers when entering his credentials on your website. Good for him, now Firefox is warning him. He can choose to continue or not. Fortunately, if your website doesn't really hold any sensitive information then the user will go forward.

The market demand will naturally require that all shared hosts start offering some sort of free HTTPS as webmasters such as yourself will simply be required to migrate somewhere where $hosting + $HTTPS is cheaper. This means shared hosts may start integrating with services like Let's Encrypt to save costs.

In fact you could be proactive and announce to your shared host that for this reason you will be relocating. Let them know there will be a trend of other webmasters relocating for the same reason.

As more and more website features (passwords, geolocation) start requiring HTTPS by browsers we will naturally approach the point where HTTPS is free and ubiquitous, at which point everybody wins.

Also, you've had a one year notice that this was going to happen: https://blog.mozilla.org/tanvi/2016/01/28/no-more-passwords-...

I agree. I don't think I'm going to tell them until after I've moved, though :p In any case, they are hostgat0r, and while the service is good overall, I hear they've been bought and it's not quite as good as it used to be. They gave me SSH, and even moved servers when my site was being a bit sluggish (I've optimized the queries since)... so hmm.

I genuinely don't have bad things to say about the hosting performance itself. But the documentation on their site is so bad, it alone makes me want to move on. I'm tired of spending hours trying to find the procedure to do this or that. And their live chat takes forever to reply.

Matter of fact, they require a fee for an external certificate, and then apparently you have to buy a static IP too. So another option is to upgrade the shared hosting plan to the one that has an SSL cert bundled in. But then it works only on one domain AFAIK, so if my app also has a forum, I still need a second certificate! What if I want an API on another subdomain like api.foobar.com? Yet another certificate.

So I think I'll just have to move to a Let's Encrypt aware hosting.

> Or like "did you know your house COULD have been ransacked today, but it didn't happen!!"

> Now all my users are going to hear that my site is insecure, when nothing at all changed.

I'm not sure you should be allowed to drive a webserver.

I'm not sure what the "driving" equivalent is for a battleship, but I'm pretty sure you do that to a server, not drive it.


Commandeering web servers will probably get you arrested ಠ_ಠ

I don't know what you do with a battleship. Helm it, maybe?

No no, you only get arrested for commandeering next to another server and then boarding it.

Commandeer: Verb: take possession of (something) without authority.

Sounds like you'd get arrested for that....

Yes. One of the positive aspects of this change is that it punishes the people who are either unable or unwilling to migrate to SSL.

>Now all my users are going to hear that my site is insecure, when nothing at all changed.

Nothing's changed: Your site really is insecure, it's just that now users know.

> How long ago did they announce that? I think just a couple months? They should have announced this much sooner.

A year ago.[1]

[1]: https://blog.mozilla.org/tanvi/2016/01/28/no-more-passwords-...

Move your site....

Your negligence poses a security risk to your users; they should be warned.

Your site's server is just one node in the chain between it and the browser. Any one of these nodes could be malicious and tamper with the data in either direction. In other words, this isn't about just your node, it's about all nodes.

It may not be an option for you, but you could consider using a free proxy service such as Cloudflare.

With the caveat that while the password will be encrypted from the browser to Cloudflare, it will still be transmitted as plain text from Cloudflare to your own server if your server doesn't support HTTPS.

So it's an improvement but not entirely a fix.

Actually, Cloudflare offer you certs they sign (which wouldn't be trusted by others, but they verify), that you can use to encrypt from the server to them. You still have to trust Cloudflare, but it's not plain text from cloudflare to your server.

If you mean the case where you literally can't serve under HTTPS, it's not just getting the cert that is the problem, in most cases running a local proxy of something that will would fix it, although I accept there are cases (cheap shared hosting, I guess) where that's not an option.

Wouldn't he still have to install the Cloudflare certs on his server then? In that case why not get LetsEncrypt?

In some cases you have the ability to add certs, but not to run the LetsEncrypt software (e.g: shared hosting) - with the short expiry date on LetsEncrypt certs, doing it manually is error-prone.

Which is not really true HTTPS, depending on your view: the Flexible plan is not encrypted to the origin server as one might expect. But if it's possible to set things up that way, it is HTTPS, I guess.

Seconding this, because this is what I do for one of my projects.

Shameless plug (since I work on Firebase) but if your site works on static hosting + BaaS, Firebase Hosting will give you free SSL + CDN support.

http://www.Netlify.com gives away a free tier with static HTML hosting + SSL + CDN as well.

Your site IS INSECURE. Nothing at all changed because it was, and now the user can just see it.

In my country and probably a lot more, you'd be held legally responsible if by any means customer data would leak out. Be it stuff sent over a wire unencrypted, or an account with administrator access being compromised due to unencrypted password transfer.

I dare you to go to a hacker or security conference once, just for kicks. Connect to any wifi there, log in. See what happens.

I don't have "customers" though. I thought this was implicit but maybe I needed to clear that up. Sure, it makes no difference to security. But that is the consideration when Google pushes everyone to buy SSL, even those who just have a hobby. I've just been venting a little frustration at seeing the web change from the playground it used to be to a much more regulated thing, but so it goes. Times change :)

Why do you have a password field on a http site?

Like every other person who started a forum some years ago or a wordpress blog you mean?

Do you think web hosts offer SSL by default?

NO they don't. Duh. That's what is annoying in these comments. Everyone seems to shrug like SSL is a standard feature nowadays, except it isn't.

That's the point. SSL/TLS is a standard feature nowadays. If you're a web host and you don't support certs or charge some unreasonable fee for adding them (and I'm well aware there are far too many of these out there), then you need to be losing business fast. Your customers should be moving to better competitors. This measure just accelerates this process.

True, it's just the transition atm is a bit painful.

My host is asking ~80 USD for multi domain SSL. It's not that bad, but they don't support Let's Encrypt yet afaik. Are they aware of this and trying to cash in on people who don't want the hassle of moving their sites?

On the other hand, if I buy there is none of that "auto renew" business...

You're leaving your users vulnerable because you can't be bothered to do things correctly. You are not the wronged party here, they are, and you're just finally being pushed into doing it right.

Any web host that doesn't offer SSL by default, for free, is offering an inferior product. I worked on my university's web host from 2006-11 and we put a lot of work into making sure people could use HTTPS whenever they wanted, despite technical limitations (mod_vhost_ldap and SNI don't play well together), and that was a volunteer project, well before the current era of free certs.

If this change convinces people not to use commercial web hosts that don't offer SSL, it will have done a good thing for the web.

Err, I work for a web host and we offer SSL by default as part of the onboarding. Speak for yourself :).

Pm - "why is this page insecure"

Developer - "chrome labels password fields as insecure over http"

Pm - "what if it wasn't a password field"

Don't you need to use type = "password" to get the *-for-every-character treatment?

I suppose you could implement your own (e.g. type = "text" with an onKeyDown listener that cached each keystroke and inserted a * into the field), but that sounds like a terrible solution in so many ways.
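A minimal sketch of that hack, for the curious (hypothetical names, and genuinely not recommended): keep the real value in a JS variable and only ever show asterisks in the visible field. Even this toy version ignores paste, text selection, cursor position, and IME input, which is most of why it's terrible:

```javascript
// Toy model of a fake password field: the real value lives in a closure
// variable, and the visible text input only ever shows asterisks.
// Deliberately ignores paste, selection, cursor movement, and IME input.
function createFakePasswordField() {
  let real = '';
  return {
    // Called from an onKeyDown-style handler with the pressed key.
    type(key) {
      if (key === 'Backspace') real = real.slice(0, -1);
      else real += key;
      return '*'.repeat(real.length); // what the visible field would display
    },
    value() { return real; }, // the real password, submitted separately
  };
}
```

Every edge case the native `type="password"` input handles for free (paste, drag-and-drop, overtyping a selection) is a bug here.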

I would think the laziest possible way to workaround this would be to use a CDN like Cloudflare to proxy all traffic to your site. Looks like they have a service called Flexible SSL that terminates HTTPS at the CDN, and sends unencrypted traffic to your backend:


A few solutions:

- Create a font such that every character shows up as a * and use it for a text input.

- make the input field use white text on a white background with a fixed-width font, monitor its length, and display the correct number of *'s above it using a div.

- implement the text box from scratch using divs and JS, like Google Docs does.

- implement a HTTPS password field in an iframe and communicate with it over post messages.

Honestly, why would you do any of those things, now that installing a certificate takes 5 minutes and is free, with Let's Encrypt?

I know that you're just exploring solutions because it's interesting, but all those things take longer than Let's Encrypt.

It may take way more than 5 minutes.

My experience with let's encrypt so far:

- the name of their tool was changed from "letsencrypt" to "certbot", breaking my cronjob

- for daemons that try to access the cert/key as non-privileged users, additional fiddling with permissions is necessary, which may even be overwritten on cert update if done incorrectly

- when the certs are renewed, daemons need to reload them. This means that ideally, you need to detect when a renewal actually happens (as opposed to an attempt), keep an up-to-date list of all daemons that use the certs and possibly completely restart them, dropping all existing connections (some daemons just don't support a live reload)
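For what it's worth, newer certbot versions address the last point with deploy hooks, which run only when a certificate was actually renewed rather than on every renewal attempt. A sketch (daemon names are examples, not a real config):

```shell
# Invoke as: certbot renew --deploy-hook /usr/local/bin/reload-cert-consumers.sh
# certbot runs the hook only after an actual renewal, so daemons aren't
# needlessly poked on every (no-op) renewal attempt.
reload_cert_consumers() {
  systemctl reload nginx      # nginx can pick up new certs on reload
  systemctl reload postfix    # so can postfix
  # Daemons without live reload must be restarted, dropping connections:
  # systemctl restart some-legacy-daemon
}
```

This still leaves the permissions fiddling to you, but it removes the "detect when a renewal actually happened" bookkeeping.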

I'm not saying these problems are unsolvable, but may take way more than 5 minutes and I, for one, opted to renew my startcom certificate for another 3 years instead.

I just followed the instructions on their website, and it took me less than 5 minutes, including setting up the cronjob.

The cronjob correctly renewed the certificate at least once on multiple servers, with no issues whatsoever. I was also warned that the certificates were expiring via email, which I thought was awesome.

Your mileage might vary.

> I, for one, opted to renew my startcom certificate for another 3 years instead.

Wait really? When did you do this? I thought all certs signed by their root after sometime in October or November are all considered invalid due to their shenanigans with WoSign. Not so?



> Distrust certificates with a notBefore date after October 21, 2016

Just in time!

If the only thing you host is a small blog without ads on a server that you have complete control over and spun up last month or something, sure it's 5 minutes.

In the real world, for many commercial operations, and especially for legacy code, there can be significant hurdles. For example, it took the NY Times 2 years to move to HTTPS, significantly more than 5 minutes, and they haven't even migrated 100% yet. https://www.thesslstore.com/blog/new-york-times-moves-websit...

Troy Hunt, a security expert, wrote about this topic a year and a half ago, and explained why, at the time he wrote it, his site wasn't HTTPS. http://www.troyhunt.com/were-struggling-to-get-traction-with... (this was before Let's Encrypt)

>unless you’re reading this in the future, you’re reading it on my blog over an insecure connection. That wasn’t such a big deal in 2009 when I started it, but it is now and there’s nothing I can do about it without investing some serious effort and dollars. I run on Google’s Blogger service which ironically given their earlier mentioned push to SSL, does not support it. Whilst Google doesn’t give me a means of directly serving content over SSL, I could always follow my own advice and wrap CloudFlare’s free service around it, but that won’t work unless I update the source of every single image I’ve ever added to almost 400 existing blog posts… and there’s no find and replace feature. This is the joy of SaaS – it’s super easy to manage and good at what it’s specifically designed to do, but try and step outside of that and, well, you’re screwed.

Another real world example, that I've encountered - your software is an intraweb site running on your customer's server and you have to play by their rules and policies on what your customers can put on their servers and when. And it's on exactly nobody's priority list.

It can indeed take some time to switch over, but why would you intentionally suppress a correct warning in the meantime? There is no good reason to mislead users here.

Because, here in the real world, my paycheck depends on keeping my customers happy.

How happy will your customers be when they find out that you're just pulling the wool over their eyes instead of doing your job?

Did you read what I wrote?

I do my job but there's only so much I can do if the owners of the servers who are serving my software tell me "before we can change this server's configuration we need approval from another office, it will be a few months until they can get back to us."

(This is a hypothetical situation, for me, because I eventually got all my customers on HTTPS back around 2012 or so. There was some push back and it did take a long time and we had to be very persistent with some customers... But it was well worth it!)

We are talking about the world of corporate IT and pointy haired bosses. I don't implement a social network, or email, I don't accept payments - I make intraweb sites that non-technical corporate paperpushers use to do their job. They are only interested in getting their work done; they are happy when they can get their work done without scary warnings they don't understand. If my software is giving them errors they are going to believe that it's my software that's the problem, and that doesn't look good for us. And we have to field support calls and explain ourselves.

...Or we could put a quick and dirty stopgap in to avoid something that makes us look bad and that we can't do anything about.

Man, that fourth paragraph is a work of art.

Aaand...the very same security expert wrote this last week:


TL;DR: HTTPS is now workable and affordable.

If you have 3rd party ads on your login page, you're doing something very, very wrong and you deserve to have all kinds of warnings flashing up.

Have you ever been on a team at a large company? Tons of bureaucracy. You're correct that Let's Encrypt is the path of least resistance in a shop where you control everything. In many large companies the path of least resistance is tweaking whatever is in your immediate control (e.g. the text inputs / javascript you're writing).

> takes 5 minutes

This is entirely dependent on how your site is hosted. For some shared hosting platforms, it may not be possible at all.

Also, you have to do the request over port 443, and the alternative DNS validation is not supported in the official client. So it's 5 minutes only in the ideal case.

Because that is just not true in many cases. And let's encrypt is a big hassle if you can't automate the cert replacement.

> now that installing a certificate takes 5 minutes and is free, with Let's Encrypt?

It's only easy if you're using a Linux webserver. In IIS land it's a pain.

Bullshit. Not if you have shared hosting. Or 1000 over different situations that you clearly have not thought about.

So shared hosts will have to move to making HTTPS simple to implement for their users. Sorry, but progress must be made in this area.

And if not, it's no big deal, users will just be informed that the page is not secure, which is true.

Don't like the message? Work to secure your damn website.

Let's Encrypt is great, but completely useless for... Actually every single website I host. No wildcard certs, the rapid rotation that requires software to renew it regularly, etc. The cost of implementing HTTPS for dozens of sites with no sensitive data is simply not worth it.

When companies like Google and Mozilla decided how to handle HTTPS, they decided based on their needs and their perception of everyone else's needs, like banks and major corporations. This led, IMHO, to a complete failure to recognize a lot of other uses for the Internet, and so their solutions fail to adequately account for them.

HTTPS is for the user's benefit, not the site owner's (barring legislation, of course). Also, HTTPS prevents hijacking, not just sniffing, of content by a MITM. That includes malware injection.

This has been coming for quite a long time. The time for excuses is over. If you think the safety of your users is "simply not worth it", well, I'd like to know what your websites are so I can block them at my firewall. I'm not saying this to be a dick, I'm saying this because this is an attitude of callous disregard on display, and it's downright odious given the modern security climate.

LE is not that hard to use, and I seriously question whether you can't make an API call once every 90 days per subdomain. The requirements have never been lower.

HTTPS is very much for the site owner's benefit as well. If your site is not HTTPS then you can't be sure that your users are seeing what you intend them to see. Ads, malicious script, whatever, can be injected or replace your content.

HTTPS is to ensure privacy and integrity for the end user not for the benefit of Google and banks.


This. This this this. Automates everything so easily. I have helped someone personally deploy HTTPS for over a dozen sites and they all auto-renew without a hitch every 90 days.

No Excuses. EFF did us a solid

Except the part where if you're using shared hosting and don't have the ability to run this software on your server, it's useless as I said.

One might also conclude that the hosting provider is useless.

Or the expectation that everyone take on a huge cost burden to appease El Goog is a bigger burden than the startup industry realizes. There's really no HTTPS solution that less than doubles my hosting costs: either I have to buy expensive certs or move to another hosting provider that supports Let's Encrypt. Either way it's a couple hundred dollars a year to maintain hobby sites, which don't pay for themselves to begin with.

Of course, it works in Google's favor to make it unfeasible to maintain a website outside a cloud platform. It's amazing that people here are so opposed to the democratization of the Internet, and so supportive of its death, over security provisions that will, in retrospect, be considered largely ineffective.

What are you talking about? There are plenty of low-cost VPS providers that give you full root access on which you can easily run certbot. That's what I'm doing now, and my hosting provider costs a whopping $20/year.

Say what you will, but pushing for passwords to be transmitted securely isn't Google fighting against the democratization of the Internet. They're doing that in other ways, sure, but promoting encryption isn't one of them.

Encryption could be offered without certification authorities that charge huge sums for certs. And there's a link on /new right now about Symantec which continues to reinforce how relying on CAs is a broken concept.

So, right now, I have 24/7 American-based phone support (this is a must-have), 99.9% uptime guarantee, WHM/cPanel software licensing included, 60 GB disk space, 600 GB bandwidth included. By all means, if you have a VPS service that can offer all of this at less than $30 a month, I'd love to consider it. I haven't changed hosting providers in a while, but I haven't found a company capable of meeting the requirements.

Have you thought about using that 24/7 phone support to ask them to upgrade cPanel? Since August it comes with LE support in the form of the AutoSSL plugin.

Then use one of the offline challenges.

If there's not sensitive data, then there's no password field (passwords are by definition sensitive), and Chrome won't show a warning. So what's the problem?

Passwords are NOT by definition sensitive, and this is the sort of absolutist nonsense that I'm complaining about. Passwords are only as sensitive as the data they access.

This is false. Passwords are only as sensitive as all the data they access. Given that it's impossible for you to know what other data the user is protecting with the same password, you must assume all passwords are as sensitive as the most sensitive data a user might reasonably secure with that password.

Do I wish things were different, and that everyone on earth used unique passwords for every site? Of course. But I think you know that's never going to describe reality.

As someone who runs a roleplaying site for like ten people (or several such sites), I cannot be responsible for other people's bank passwords, nor should I be punished for daring to host websites without the huge added cost burden of HTTPS.

The notion that every homebrew website is supposed to support HTTPS is also never going to describe reality.

Then really, Google's done you a solid. Now everyone using your site will know it's not as secure as their bank, and therefore, when their creds for your sites are stolen, and they get their identity stolen as a result, you can just say "Hey, everything told you it wasn't secure, not my problem"...

You should not be responsible for running any of the sites with this attitude.

Again, this is not a productive or useful security attitude to take. We've made some grave privacy missteps with poor security advice time and time again, so simply saying "HTTPS is better and everyone should use it" is not inherently accurate. Especially when it's completely impractical with the tools available.

> - implement a HTTPS password field in an iframe and communicate with it over post messages.

The top-level page also has to be served over HTTPS for the warning to not appear. (source https://developers.google.com/web/updates/2016/10/avoid-not-...)

The last one won't work. The parent frame has to be HTTPS as well.

You can implement a custom element which shows the characters as dots (essentially your third suggestion).

But what do you do when you have a page with form data that isn't sensitive and has no password on it, and the users couldn't care less about the content of the form, but Chrome still warns about an insecure page?

Why would such a form need a password field?

Well, that is the whole point I'm trying to make. Why does Chrome think I'm using a password on the page when there is no password? Anyway, Chrome will mark all HTTP as insecure sooner or later, so I'll just have to force HTTPS on all connections...

There seem to be many people with similar problems of false positives for nonexistent passwords, so I guess it's a bug.

I haven't heard of this bug, but regarding the decision to mark all HTTP as insecure:

Remember, HTTPS isn't just for security, but also privacy. And even if your site is such that there is no privacy advantage in hiding the exact URL you visited (as opposed to the hostname, which unfortunately must leak for now), even if there are no cookies sent to your site, or to any iframes it uses, which can be used for identification or profiling…

Even then, there are the benefits that only accrue if a user's entire browsing session is HTTP-free, including hiding the user agent from a network attacker and preventing injection of everything from tracking cookies to DDOS scripts (China's Great Cannon) to zero-day attacks.

> Remember, HTTPS isn't just for security, but also privacy.

And the third thing: authenticity.

No one has modified the page, for example to insert or change advertisements.

I don't really know what your point is.

This will mark pages as insecure that have an '<input type="password">' field on them. If you don't have that, you are fine.

I don't know of any reasons to have a password field if it's not actually sensitive information that's being entered.

...for now. Marking all non-secured HTTP as insecure (duh) is in the pipeline - it seems.

This is actually a Good Thing - with HTTPS-friendly CDN and/or Letsencrypt, rolling out sites that are secure-by-default is now easier and cheaper than ever before.

Well... HTTP is insecure. That's what the S in HTTPS stands for.

Then that's a bug you should report, and let the Chrome team fix it.

> onKeyDown

Right-click paste from my password manager, and it doesn't work. Thanks.


This idea is terrible in general, but if you do, against all that is holy, implement it, please, please use onInput.

Also, type='password' fields don't get their submitted values suggested by the browser's autocomplete.

You can avoid this by using `autocomplete='off'`. Most browsers will still allow you to autocomplete the field because many were abusing that attribute, but they won't save what you put in it.

It's still a horrible idea, but it'd work.

That's literally what my router's login page does. It is a text box but has JavaScript which converts each non-* character into a * character and stores the actual value in a JS variable.

Why? They added a "Show Password" radio and I guess they figured this hack made more sense than simply using JS to update the DOM to turn it from a type password to a type text.

My previous router did this to obscure password length, by inserting three stars into the field for every character typed. Which completely broke browser password managers, and the ability to paste the password.

IIRC, changing the 'type' of an input doesn't (or, at least, didn't) work in some browsers.

> an onKeyDown listener that cached each keystroke and inserted a * into the field... sounds like a terrible solution in so many ways.

For anyone who is wondering what these ways are, here are a couple:

1) Backspace is a crufty special case

2) What happens when someone highlights text in the input box and types over it?

More broadly: a text input box is in fact a small but surprisingly comprehensive text editor. It supports a cursor with insert, delete, and overstrike; highlighting, undo/redo, cut/copy/paste; and shortcut keys for all that. It supports every keyboard layout and every input method. It obeys standardized focus rules. It's even got word wrap and spell checkers these days.

So, you want to go your own way? How much of that do you need to reimplement? And how confident are you that the subset you choose doesn't completely ignore some vital use case you forgot about?

You certainly can't get away with just wiring keystrokes to a field.

You can: you'll just get a buggy, half-assed implementation. In a corporate environment with bad politics, that might still be the only way to go. (Apart from quitting.)

CodeMirror has figured this out. When you type it's actually into a hidden input, and it updates a separate display.

Reimplementing CodeMirror (including highlighting, etc) is hard enough that most web developers probably can't do it. But it only takes one person to create a library.

+1 for using CloudFlare. I just deployed the front-end website for my new startup (https://elasticbyte.net) using Google Cloud Storage (like S3) and CloudFlare for custom SSL. CloudFlare also allows me to utilize CNAME flattening, so the root record for my domain simply points to c.storage.googleapis.com.

You can have a hidden text input and use *'s in the visible one, then just use JavaScript to push the real text into the hidden field. But the PM would probably be fine showing the password in plain text, since the threat of a visible password is low<sarcasm/>...

I've noticed a lot of sites default to "show password" with a toggle so it wouldn't be insane to think they'd opt for plaintext the whole way.

> I would think the laziest possible way to workaround this would be to use a CDN like Cloudflare to proxy all traffic to your site.

Interesting. Where are all these warnings when a CDN man in the middle attacks your connection? Or when google gets to access all the email communication of gmail users? Or when ad networks track you all around the web?

IIRC, type=password also prevents copying saved input to clipboard

So accurate. We had this exact discussion. Going to go with insecure warnings until we get https up shortly.

For those wondering, you can mask a normal text field in CSS: input { -webkit-text-security: disc; }.
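Expanded as a stylesheet rule, noting that this is a non-standard WebKit/Blink property (Firefox ignores it), so it's a visual mask only, not a security measure:

```css
/* Non-standard: supported in WebKit/Blink browsers only. */
input.masked {
  -webkit-text-security: disc; /* render each character as a filled circle */
}
```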

My colleague used a custom web font where every glyph was replaced with a filled circle. Better browser compatibility, you know.

Although our reason was actually to do with password managers. At $DAYJOB we have a CRM/ERP system with lots of password fields for other entities (not the current cookie user). It's increasingly difficult to opt out of browser autofill, and LastPass in particular was corrupting password data in the system whenever forms were submitted.

> It's increasingly difficult to opt out of browser autofill

Good. I hope browsers autodetect these web font tricks and pop up similar warnings. I can't stand it when some random website thinks it can do a better job of credential security than major browser makers.

I think the point is more like an HR administrator who opens a web page, containing an employee's details. They need to update the employee's home phone number, but their password manager dumps the HR administrator's password into the "Set new password" field, which is therefore overwritten.

So don't put the "set new password" field right in the employee's details page, use an extra page or popup for that.

Our application is still maintained, so we can find workarounds or restructure the form to use an extra popup (to the detriment of usability). But i'm sure many applications won't be updated, and as a result of this change, data will be silently corrupted when they are used. The browser has knowingly broken compatibility with the web application.

This is a Torvalds "don't break userspace" moment.

Fair enough, that's a valid concern. But it wouldn't be solved by allowing pages to opt-out of autofill, since they'd have to be updated to use those as well.

I used this same hack for a XenForo add-on [1] and even got labeled as black hat.

[1] https://xenforo.com/community/threads/let-tls-wait-paid-dele...

Well I'm fairly certain it will go like this for me.

Pm - "why is this page insecure"

Developer - "chrome labels password fields as insecure over http"

Pm - "we'll need to setup encryption. It will need to be FIPS-140 certified or it's not secure"

Developer - "But you didn't care when there was no encryption"

Pm - "We don't need to certify plaintext, that should be obvious. You need to learn more about security".

Developer - "FIPS-140 has been compromised by the NSA. We don't want government spies in our servers."

where do you work

They seem to be a government contractor.

I laughed out loud. Then started crying.

Sigh. Our industry in a nutshell.

I've seen that:


Click "Acceso a clientes" and write numbers.

I don't even get to that badness: the browser needs to accept third-party cookies first. (I wonder what badness is behind all that.)

Pm - "why is this page insecure"

Developer - "chrome labels password fields as insecure over http"

Pm - "what if it wasn't a password field"

Pm - "If it's important enough to hide, it's important enough to stop from being intercepted. Think social security numbers, PINs, tokens, drivers license numbers, etc. Why aren't we encrypting things that matter?"

Pm - "why is this page insecure"

Developer - "chrome labels password fields as insecure over http"

Pm - "what if it wasn't http?"

The Law of Unintended Consequences at its best

Pm - "why can't you use some JavaScript to hide this?"

Developer - "..."

The correct answer is always "IE doesn't support it"

Good point. We use the Windows stack :D

It would be better if any form served up via HTTP got marked as insecure.

Developer - "anyone could see the password..."

Pm - "put one of them modal over it"

Developer - "but then how will anyone..."

Pm - "we're switching to <completely different stack they heard about from someone in their uber last week>"

Yay, plaintext input fields for passwords.

/me opens a sake bottle.

I'm glad I'm not the only developer who can't stand the kind of PM you describe.

I've got to say though, that this is a wee bit frustrating as a developer. SSL libraries are terrible, bug-ridden, hard to work with, and there are huge sacrifices in using a pass-through proxy to offer SSL.

The brittleness of SSL libraries manifests not just in the form of security exploits, but also in the form of delaying the next generation of HTTP technology. Node doesn't natively support HTTP/2 due to HTTP/2 fitting issues [https://github.com/nodejs/NG/issues/8]. Jetty was delayed by Java SSL changes. Same with Go.

If Google wants to make the whole web secure? That's great. But we also need to work on making it simple to secure. So much research goes into novel ciphers and optimal ways to defeat timing attacks, etc., but the spike in complexity means that we're reaching a point where almost no individual or group can approach a correct implementation.

It worries me that we're approaching a point where we're utterly dependent on a security standard no one can understand.

As with most things, progress isn't clean or easy. Shifts in policy or practice cause disruptions, and then people adjust. The world is a dynamic place.

Software is no exception. SSL libraries will get better if they get used more. The developers will make them better. Or if they can't, we'll find a solution that works.

The question is whether the benefit of the disruption outweighs the cost. Browser-makers decided that their users' needs were best served by this change. Mozilla and Google have been telegraphing their actions in this direction for years. They have attempted to make a responsible and gradual transition, and to a large extent have succeeded.

Every once in awhile though, a break needs to be made and some folks will get left behind until they adapt, or don't.

> SSL libraries will get better if they get used more. The developers will make them better. Or if they can't, we'll find a solution that works.

I keep hearing this, but failing to see it. Since OpenSSL's inception.

BoringSSL and LibreSSL are two non-trivial projects to improve SSL libraries that started within the last 2 years. They may not be at an ideal state yet, but a lot of work is being done to move the baseline to a better state.

But that's exactly my point. Perhaps we needed this to push us?

> Same with Go.

Out of curiosity, what are you referring to? Go has great HTTP/2 support, and is enabled by default since 1.6. It doesn't depend on OpenSSL either, which is a big bonus in my book

Go has it now, but their delay was their own internal SSL rework as well.

I hope they do this for CC numbers too, because I know of a website I had to use that passed your Name, address, CC number, CC exp, amount; the whole shebang over plain ol' http to do a payment shudder.

They do it for CC numbers too, as outlined in their page for developers [1]:

> To ensure that the Not Secure warning is not displayed for your pages, you must ensure that all forms containing <input type=password> elements and any inputs detected as credit card fields are present only on secure origins.

[1]: https://developers.google.com/web/updates/2016/10/avoid-not-...

Do they also do it for IBAN?

The people who decided that the new SEPA payments should include a way for creditors to take people's money using just public information and their signature should be fired. It's like they learned nothing from the billions of dollars wasted from fraud in the credit card system. Payments should always start after an explicit order by the payer to their bank, not just having the payee say "trust me, they totally want me to have this money".

Well, luckily, there is!

SEPA is a bi-directional protocol – if you try to take money from a bank account, the bank can say "nope", and the transaction can fail (with the person trying to pull the money taking the loss).

As banks allow you to configure this – mine allows me to disallow all direct debit, or disallow foreign direct debit, or only allow it from specific companies – this is not an issue.

I'm fairly sure that violates PCI-DSS.

I suspect PCI is okay with it so long as it is an unsecure page that posts to a secure one. Not that it's a great idea, but it would be encrypted in transit.

Edit: It appears PCI DSS V3.2 does ask that the form itself be on a secure page (section 4.1.g):

"for browser-based implementations: 'HTTPS' appears as the browser Universal Record Locator (URL) protocol, and Cardholder data is only requested if “HTTPS” appears as part of the URL."

Yeah, because MITMing the origin page to submit to evil.example.org is trivial.

In such a case one would expect the evil page to present something that looked like a credit card input to the user, but not to the browser. Sites would still want to use HSTS to combat the MITMing itself.

Nope, too risky. Just redirect to an evil HTTPS page, and do all your phishing there - look, it's got the green lock and everything >;-)

PCI-DSS is okay if you put it in an HTTPS iframe. Many sites I've seen use that workaround.

Nope, that would violate PCI as well, since you are then subject to clickjacking attacks unless you configure the site to only allow framing from a specific URL.

> The Hosted PCI Web Checkout module allows merchants to take credit card information on any page of their website. This includes checkout and my account pages. Hosted PCI uses an “Iframe” that can be easily installed on any website. Our Iframe is secure and is 100% Level 1 PCI Compliant. Our merchant’s websites never see the customer credit card information. That means, our merchants websites are not in scope for PCI Compliance requirements so you don’t have to spend time or tens of thousands on PCI audits yourself!


Honest question, who is in a position to tap your connection such that this becomes a serious security concern? IT staff at your company? The admins at your ISP? The NSA? I'm assuming that public wifi has session-specific encryption keys. I don't see these as the kinds of concerns that would warrant the kind of panic that some people seem to show over HTTP.

> I'm assuming that public wifi has session-specific encryption keys.

That's false for open wifi networks. Remember Firesheep? Just fire it up at your local coffeeshop and off you go.

Even for more secure public wifi like WPA2, the vast majority of coffeeshops still don't change the default router admin passwords, so you can take over the router easily and listen in on all the traffic.

Further, it's not hard for some rando to set up a safe-looking access point and get people to connect to it. Camp out near an office with a router, I'm sure you'd get plenty of hits.

There's no shortage of attack vectors with no warrants required.

> who is in a position to tap your connection such that this becomes a serious security concern?

When you use HTTP everything is sent in plain text. This means...

- Anyone on the same network as you can see all of your traffic. This includes company networks, coffee shop wifi, your house, the library; any place that has a WiFi network. Caveat: it's possible to use network isolation to hide your traffic but this is crazy rare to see and typically is done to isolate networks, not individual traffic.

- Your ISP can see and log everything sent over HTTP.

- Anyone at the router level that your traffic passes through. Your traffic makes a lot of hops over various routers on the internet before making it to your final destination.

Overall it's a terrible idea for anything that needs to be sent securely.
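
To make the plain-text point concrete, here's a small illustration (the credentials are invented): HTTP Basic auth, for instance, only base64-encodes the credentials; there is no cryptography involved, so anyone on the path reverses it in one line.

```javascript
// What an eavesdropper sees on the wire for HTTP Basic auth over plain
// HTTP: the Authorization header is base64-encoded, not encrypted.
// (Example values are made up.)
const creds = "alice:hunter2";
const header = "Authorization: Basic " + Buffer.from(creds).toString("base64");

// Anyone sniffing the request recovers the credentials trivially:
const b64 = header.split(" ").pop();
const recovered = Buffer.from(b64, "base64").toString();
// recovered === "alice:hunter2"
```

Form-posted passwords are no better: they travel as readable bytes in the request body.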

If I have two devices connected to a switch, how can they see each other's traffic?

ARP poisoning[1]. Ettercap lets you do it with a couple of clicks, without any advanced knowledge.

[1] https://en.wikipedia.org/wiki/ARP_spoofing

Just download Wireshark and you'll have an easy to use tool that'll show you the traffic.

Not by default, only if you do ARP poisoning, which most consumer switches won't guard against.

All you'll see without it is broadcast crap.

Your assumption about public wifi is wrong. If you connect without a password, your traffic is sent in the clear and MITM attacks are trivial. If you don't want your password exposed to any hacker with an old Pringles can within a mile of your location, you need end to end crypto.

Good to know, thanks. It seems rather negligent that per-session encryption wasn't built into the protocol.

Agreed. I can see why you would assume there would be. Seems like the obvious thing to do.

HTTPS isn't just about encryption either. It's about authentication and integrity as well.

Making sure the server you are talking to is the correct one, and making sure that nobody along the way injects ads, trackers, malware, or anything else.


> Beginning in January 2017 (Chrome 56), we’ll mark HTTP pages that collect passwords or credit cards as non-secure, as part of a long-term plan to mark all HTTP sites as non-secure.

Would that really prevent you from purchasing?

Like, let's say something rare was available for purchase, or music festival tickets that will sell out in 2 more minutes.

would your first thought be "woah thanks Chrome you really saved me this time!"

I've been paying rent with www.rentpayment.com which unfortunately serves up their home page with multiple logins over http. Naturally, emails and tweets to their support go ignored. Maybe they'll finally respond after more people ask them why they're "non-secure".

I wrote a short article on this topic with approaches for less tech savvy folks to set up HTTPS:


Mods, can we please get the "?m=1" part of the url removed? I think the current link is for mobile.

But that would make it worse for people on mobile.

Man, someone should really invent a way to make webpages respond to the dimensions and capabilities of the user's device.

Plain, un-styled HTML?

Stop making up technology

If the page is loaded on a mobile device you'll get the mobile version anyway.

Alright, for what it's worth everyone, if you haven't seen this already, here it is! Certbot, from the EFF!

Get that HTTPS motor running. This really does make it easy.


Advancing HTTPS is one of the few good things Google has done in recent years. Thanks Google.

It's one of the few things they do for which I can't find a financial motivation, other than improving developers' opinion of the company as a whole, which is a good thing for all.

google competes with the likes of verizon, comcast and at&t, and the data google gathers on you is very valuable. why would they want to share that data with the line operators for free?

sorry to say, but https is not an altruistic move by google.

This is ascribing an absurd level of Machiavellian intent to Google. Why can't people just accept that occasionally, Google engineers do stuff that isn't profitable to the parent company?

Just speculating, but I think Google is at such a huge scale whenever the internet wins Google wins. That's why they're trying to improve internet access with Project Loon and WiFi in Indian railway stations.

If privacy improves that will benefit the internet (privacy has real costs that don't involve state actors like hackers, etc). Many people, especially in developing countries, are afraid of transactions over the internet and use cash on delivery.

And since a large majority of websites import either google analytics or google adwords, google gets all the information for https websites anyway.

I'm willing to bet that the terms and conditions for both services allow google to reuse the analytics information.

Exactly. My understanding of the push towards HTTPS by Google is that they control the websites more than they control the connectivity, so they want to make sure that ISPs can only aspire to be substitutable intermediaries between users and websites. Putting everything in HTTPS makes traffic opaque to ISPs.

I can understand why they would want to convince everyone to use HTTPS to restrict access of your data from their competitors, but it limits the data they can collect as well. HTTPS has stricter cross domain policies so their ads would gather less information and their Fiber ISP would no longer be able to collect data by tapping connections. But on average, I think you're right that HTTPS would restrict more of your data to Verizon etc than to Google.

underrated comment

This is a long term, strategic decision.

Better security = Greater trust of the web = Higher adoption = More ads.

No different to them working on Google Fiber.

In addition to all the genuine "let's make the web better" reasons, comprehensive TLS prevents middleboxes from inspecting or tampering with traffic, which in turn allows the introduction of better protocols like SPDY and HTTP/2, which brings content to users faster and allows pages that contain more disparate/modular content, which makes web applications more capable (in addition to the added security), which makes it easier and more appealing to migrate away from locked-in platforms. (And that's just one of many lines of reasoning.)

I can think of a few: adware substituting its own ads for their AdSense ones, ISPs messing with ads, etc.

Thanks let's encrypt.

Google should've started way way earlier

I work for google, and I did a fair amount of work on moving Ads to HTTPS (I moved mobile app ads to https).

The work actually started quite a while back, but the overall ads industry and the internet as a whole move really, really slowly. Add the mobile ecosystem to the equation, and there are a bunch of issues.

The whole work is a combination of a bunch of things (in no chronological order):

1. Google pushed search ranking changes.
2. Google moved all of ads to HTTPS, and this took some time to make happen.
3. Apple created ATS to make people think about it.
4. Apple wanted to enforce ATS for non-web content, had to back out.
5. Let's Encrypt made access to certs free.
6. Big vendors joined.

Unfortunately, the world is slow when it comes to changes like these, but I am quite happy with the outcome so far.

edit: added context.

How early? Put a year on it. What should they have done, and when?

If Google hadn't threatened people that rankings would sink if they didn't migrate to HTTPS, Let's Encrypt wouldn't have been so popular, despite the fact it's free, due to the pseudointellectual SEO "muh performance hit, muh redirect rules" meme.

Firefox has a similar feature enabled in dev edition: https://blog.mozilla.org/security/2017/01/20/communicating-t...

Actually, it's in Beta now, and will be shipping to Firefox release channel users on Monday or Tuesday.

Countdown until a JS extension that takes a normal <input> field and uses &bull; characters to make it look like a password field without tripping Chrome's detector...
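
For the curious, a rough sketch of how such a trick could work (purely hypothetical, and a bad idea for many reasons): keep the real value in JS and echo U+2022 bullets into a plain text input, so Chrome never sees an `<input type=password>`.

```javascript
// Hypothetical sketch of the trick described above (don't do this):
// store the real value in a closure and display only bullets in a plain
// <input type="text">, so the password-field detector never fires.
const BULLET = "\u2022"; // •

function makeFakePasswordState() {
  let real = "";
  return {
    // Called on every keystroke with the visible field's new contents.
    // Naive version: assumes edits only append or delete at the end.
    update(visible) {
      if (visible.length < real.length) {
        real = real.slice(0, visible.length);   // characters deleted
      } else {
        real += visible.slice(real.length);     // characters appended
      }
      return BULLET.repeat(real.length);        // what to write back
    },
    value: () => real,
  };
}
```

In a browser you'd wire `update` to the field's `input` event and write its return value back; the sketch mishandles mid-string edits, paste, and IME input, which is part of why real password fields exist.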

Can we just ban inputs being manipulated on input? It's really annoying for custom-implemented types like phone numbers that do the '(___) ___-____' mask. Half the time they seem to break if you mistype.

I'd be in favor of banning input manipulation but adding custom fields like phone which would accept regex formatters.
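
One way to get the '(___) ___-____' behavior without fighting the user on every keystroke (function name and US-centric format are my own choices) is to normalize once, e.g. on blur or submit:

```javascript
// Format a US phone number once (e.g. on the blur event), rather than
// rewriting the field on every keystroke and fighting the caret.
// Leaves the input untouched if it doesn't contain exactly 10 digits.
function formatUSPhone(raw) {
  const digits = raw.replace(/\D/g, "");   // strip everything non-digit
  if (digits.length !== 10) return raw;    // don't guess; leave as-is
  return `(${digits.slice(0, 3)}) ${digits.slice(3, 6)}-${digits.slice(6)}`;
}
```

So `formatUSPhone("415.555.1234")` yields `"(415) 555-1234"`, while partial input passes through unchanged instead of breaking.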

Browsers already support that. Good luck convincing managers/clients/designers that they're sufficient on their own, though.

The real problem is UI. Most of these plugins are changing the UI of forms to something that looks fancier (and more consistent) than the defaults.

There really does need to be a better way to style inputs. It's all over the place now. Hell, just making `type=search` look the same across browsers isn't as straight forward as it should be.

Yeah, but if you want something slightly different that isn't solved by one of the existing input field types, you would be completely out of luck. And even if what you need is in the HTML spec you might be out of luck. Firefox is only adding support for date inputs sometime this year (my estimate) [1].

1: https://wiki.mozilla.org/TPE_DOM/Date_time_input_types#Roadm...

Then another countdown until normal input fields over HTTP are marked as insecure by Chrome.

Stupid question: Is the warning going to show up for localhost i.e. using chrome to see the local dev version of your website?

It has for me with the Firefox version of this, and based on my experience of it so far, this is fine. For one, it's an obvious differentiator between my local copy and live, but secondly I also think local certs are something that we really need to find a way to make easier for devs to set up and test with.

Where do you see the issue? Creating a certificate that no one else has to trust seems pretty easy already.

I guess a part of this is inconsistency between development environments as well, but generally speaking, while not completely impossible, the dev user experience is just that bit more fiddly.

For a prod server, the process can be as simple as (Ubuntu/Apache being a common setup):

apt-get install letsencrypt && certbot --apache

Or more generally:

$PKGMGR install certbot && certbot certonly

For dev, you need to either select and install an SSL library, generate snakeoil certs and install them into each vhost you use, and thereafter click through varying and occasionally insurmountable browser warnings about unverified certs, OR - much, much more complex - install and maintain a boulder setup.

Given the above, most devs will continue to take the chrome://flags easy way out, which doesn't really allow proper testing of a HTTPS setup locally.
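
For reference, the snakeoil-cert step itself can be fairly short (assuming OpenSSL 1.1.1 or newer is installed); the fiddly part is getting each browser to trust the result:

```shell
# Generate a self-signed cert for local dev, valid for localhost only.
# -addext needs OpenSSL 1.1.1+; older versions need a config file instead.
# The browser will still warn until dev.crt is imported into its trust store.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout dev.key -out dev.crt -days 365 \
  -subj "/CN=localhost" \
  -addext "subjectAltName=DNS:localhost,IP:127.0.0.1"
```

That per-browser trust-store import is exactly the fiddliness that pushes people toward the chrome://flags shortcut.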

You could use a tool like easypki that allows you to manage a testing CA, whose root cert you can import into your browser. Maybe use a different browser profile, so a compromise of the CA keys would not allow anyone to compromise your banking sessions too.


But sure, if you're using Let's Encrypt in production, this won't approximate your production setup in testing.

see chrome://flags to disable security warnings on localhost -- great for development

it's weird that the warnings aren't disabled by default on chrome for localhost: it's officially classed as a "secure origin"


That's…actually quite smart, I wasn't aware of that.

Thanks -- this has solved an ongoing problem I was having with a couple of clients.

What should be done for routers and printers that are accessed by their IP address?

Well, they are insecure.

For example, I've given the WPA2 password to many people, and that can be used to snoop on the network passively (there's no forward secrecy).

Modems are in any case way more insecure due to the default password being admin or 123456, etc. And many are accessible from the public internet.

I think my modem got hacked due to that (I saw some login attempts a little before the DNS got changed causing Youtube to stop working).

The best solution is for them to be accessed through a publicly registered hostname e.g. https://router0123.netgear.com (that would only resolve locally). They could provision certificates for themselves using the Let's Encrypt DNS challenge.

So, now we have a single certificate on all routers? What happens if I take apart a router?

Or would every router get its own certificate? But then netgear would have to become its own CA.

And in either case DNS hijacking is a massive issue.

Or would every router get its own certificate? But then netgear would have to become its own CA.

Why? They could just partner with an existing CA, like Cloudflare does with Comodo.

And how would DNS hijacking be an issue? The attacker wouldn't be able to produce a valid cert anyway (the router would come with a custom burned-in key that it would use to authenticate itself to the CA and get the cert).

> the router would come with a custom burned-in key that it would use to authenticate itself to the CA and get the cert

I take apart the router, and get a valid certificate. Now I hijack DNS, and get you to connect to me.

HTTPS within LAN for this purpose is useless.

I take apart the router, and get a valid certificate

You only get a valid certificate for your router's address. But if you can take apart the router, you don't need to hijack the DNS, you can simply control its traffic.

But if you're a guest in my home and I see you take apart my router, you'll have to answer a few questions. Same in an office or coffeeshop. Having LAN access doesn't mean you have complete physical control of the router. So the HTTPS is not useless.

You only get a valid certificate for your router's address.

Considering basically every router has the same address, I now have a valid certificate for basically every router.

Considering basically every router has the same address

They have the same IP address, not necessarily the same DNS hostname, which is what the certificates are tied to. The user would just be told to connect to the hostname (possibly printed in the sticker) rather than to the IP.

That's certainly one option, but what happens now if I change the IP of the router in its config, because I use multiple in my LAN, one as router, the others as AP?

There is no option for any of this that isn't completely messy and hacky

On first boot and every time you change its IP, the router sends an authenticated message to the server to update its DNS records.

As a bonus, the user doesn't have to change anything to keep accessing the router admin page after the switch.

Now you're vulnerable to DNS hijacking.

HTTPS prevents that.

No, because the implication here is that the private key for these router0123.netgear.com type hostnames will be known to the consumer devices that are serving the pages, so they will be essentially public.

I think the idea is each device has its own FQDN, and gets its own certificates. Thus, breaking open your router only gets you "your" private key, they'd all be different. Buying one on eBay might be risky, but if you buy sketchy network hardware on eBay you're at risk in so many ways already...

You can't do this with Let's Encrypt out of the box (unless you make small numbers of bespoke devices) because of their Rate Limits. But several commercial public CAs like Comodo would probably be interested in cutting a deal with a big electronics manufacturer or a trade group.

Correct. The rate limits for the parent domain wouldn't apply if it was added to the public suffix list. The IP restriction wouldn't apply because every router would be requesting from a different IP.

This doesn't make a whole lot of sense as a viable deployment strategy. The routers would necessarily need to ask for the domain name in question to be pointed at their internet-facing, public IP (if indeed they even have one!), because that's all that Lets Encrypt could possibly verify, but the administration interface is usually on a private RFC1918 address. And what secure protocol are you going to use for the router to request that domain name update?

And how is first time setup supposed to work anyway? You need to connect to the administration interface to give it your ISP credentials before it can connect to the internet and obtain its Lets Encrypt certificate.

If you forget about Lets Encrypt and instead point hundreds of thousands of router-<serial>.vendor.com addresses at the routers' private addresses, with a pre-made certificate, you then only have the problems of baking an individual private key into each router at the factory and boxing customised documentation (like maybe a sticker on the router itself) telling the user the unique domain name they need to set up their device. Oh, and the problem of what to do when the user wants to change the local address used by their router.

Nope. HTTPS only prevents you from falling for it - but you still can't get there.

I suppose it really depends on your threat model. I think most offices will be fine with the warning. However, you could configure each device to only respond to the trusted IP address of a HTTPS proxy which at least lowers the attack surface for snooping.

wait, wait, wait, are you saying a "trusted ip address" out on the web? or requiring every office to set up their own https proxy? Cause a "trusted IP address" out on the web would just be insane... "Oh, let's just send my login to and give full access to my network to this third party/manufacturer."

Continue to use them normally. What do you think needs to change?

There should be an HTTP Header (or a CSP directive) to allow servers to set sites as "Not Secure" manually. That would help a lot of people dealing with phishing attacks on web hosts.

It would function in the same way - if Chrome detects CC/password forms, it labels the site as Not Secure.

I know the article is older, but it's January 2017 now, just a reminder. The message will appear in the address bar.

Not a Chrome user, but this is a great feature, and is at least moving things in the right direction. Really they should go farther though. The UI treatment is almost unnoticeable, even if they went with the "red triangle" version. How about a red-background interstitial page or a modal with a clear "Get Me Out Of Here" and "I Know What I'm Doing" choice for the user?

And for all those "small businesses" that are going to get affected by this? It's hard to muster up much sympathy at this point. It's 2017, and you're still horsing around with vanilla http?

I'm going to go ahead and make another shameless plug, since a lot of folks who are hesitant about this new HTTPS stack are worried about deployment, and that's for the fantastic folks over at Caddy. They make an Apache/Nginx alternative that has built-in Let's Encrypt renewal support, automatically encrypts your site by default, and serves over HTTP/2.


I am not an affiliated developer, but I am a user, and have recommended this to others as well; it's a solid product.

How does it handle password inputs that are added to the page with JS?

Maybe we can make it blink by adding and removing the password field

The console warning appears as soon as the password field is rendered, but the location bar doesn't update.

I'd go a leap further and change the background color of the address bar to red if it's a non-HTTPS page. No excuse for any site to be HTTP in 2017, especially with LetsEncrypt. Your host doesn't allow LetsEncrypt? They need to get with the times, or you need to switch hosts. (Why would you want to use a host that doesn't see the value of HTTPS?)

As stated in the article, that is in fact the long term goal for treatment of the Google Chrome address bar.

I'm in an A/B test group where all pages are marked either green 'Secure' or red 'Not Secure', password or not.

I like it.

I hope me trying to push this on G+ and Twitter for years helped.

This was always my first install on a new Chrome.


Will this also apply to data URIs? Thinking of the recent data URI phishing exploits [1]

[1]: https://www.wordfence.com/blog/2017/01/gmail-phishing-data-u...

Working on a new community site to help people move to HTTPS: https://blog.movingtohttps.com/dedicated-to-simplifying-the-...

It's great that Google wants to move more sites to https, and I'm in support of this, but it also creates challenges for security vendors such as myself.

Currently DNSFilter and others man-in-the-middle traffic destined for sites our customers have decided to block. This works great for http, but not https, as certificate warnings are presented.

The standard workaround is arguably less secure: adding a third-party CA to all end-points. This can still present problems with HSTS and certificate pinning.

I'd like to work with Google to create a standard where vendors can either be on a whitelist or have new recognized SSL cert fields, not to MITM traffic, but just to present users with a friendlier message explaining what's happening, and providing a separate https:// url to visit for information from the vendor about the block.

Implementing such a standard in browsers would further increase user security, and provide a viable method for filtering on guest networks where there is no end-point access.

Removing insecure HTTP altogether is the road Google is taking. That should make this a non-issue.

How does removing HTTP solve the issue presented:

When actively interrupting an HTTPS connection as a network element, there is no way to provide information to the user about the reason for the interruption or steps the user could take to prevent the interruption.

This can be done with HTTP, where a filtering proxy could show a page 'our software thinks this page violates company policies, but click here to override or contact IT to fix', or see also captive portals.

Maybe the right answer is simply there's no reasonable way to handle this use case in a secure manner, but taking away an established use is a real issue.
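
As a toy illustration of the HTTP-era flow being described (all names here are hypothetical), an intercepting proxy can simply answer with its own explanatory page; over HTTPS, the equivalent interception surfaces only as a certificate error with no room for explanation:

```javascript
// Hypothetical helper: the response a filtering proxy could substitute
// for a blocked plain-HTTP page. With HTTPS there is no way to inject
// this without installing a private CA on every client.
function policyBlockResponse(host, reason) {
  return {
    status: 451, // many filters use 403; 451 is "Unavailable For Legal Reasons"
    headers: { "Content-Type": "text/html; charset=utf-8" },
    body:
      "<h1>Blocked by policy</h1>" +
      `<p>${host} was blocked (${reason}). ` +
      "Click override or contact IT to change this.</p>",
  };
}
```

Captive portals rely on the same trick, which is why they too are increasingly awkward in an all-HTTPS world.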

What happens if the page is insecure, but the attacker places an iframe in the page with HTTPS url, which then tricks the user into sending their credentials (unsuspecting users will think they are logging into the site).

I'm not sure that really fits what is changing here... If the form is submitted, it's going over https even if the iframe is on an http page. If an attacker has the ability to add code (iframe or other) to your site, you've already lost.

that's exactly my point. I am hoping/assuming chrome would notify the user about this as well.

Why? IIRC cross-origin will prevent the http page from reaching into the https iframe and furthermore the password is being sent over https so google doesn't really care.

But since top lev is insecure, an attacker could inject a legit looking form whose destination is set to steal passwords.

I guess it depends on the attack this is supposed to stop. This change does prevent sniffing of passwords and protects them while in transit but no, it doesn't prevent MitM attacks. That said google plans on marking all HTTP pages as "Non-Secure" in the not-too-distant future which will help warn against the potential for MitM.

What about services like Sellfy that let you add a button to your site's pages, that loads a shopping iframe? Will it change the indicator when the iframe appears after a user clicks the Buy button?

Ultimately, how is this plan by Google going to affect sites that are hosted on a virtual server hosting plan?

For instance, I have a website hosted at Hurricane Electric on a virtual server plan. I've had hosting there for well over a decade. I like their service, the virtual host works well for most of my needs. There are two areas where it doesn't work, though (AFAIK):

1. I can't run a pure NodeJS website.

2. I can't set up HTTPS.

Number one isn't relevant to this discussion; but as far as I know, the second one is a big deal. There isn't any way (AFAIK) to host multiple virtual servers each with their own certificate.

So right now (well, with the release of v56 of Chrome) - if you have a Wordpress site or something on a virtual host that has a login - it's going to show something that says "unsecure" for the login/password form. Honestly, I am fine with that. My own site isn't a Wordpress site, but I do have a login/password box on the site, and having it show that it is insecure is not a big deal to me. While there isn't much or anything I can do about it, I do understand and support the reasoning.


...in the future, they want to mark -all- non-HTTPS sites as "insecure" - regardless of what the site does, presumably. It could just be a collection of static html pages (no javascript, no forms, nothing special), and it will still be marked as "insecure"? Does this sound reasonable? Suddenly, all of these pages will be deemed pariahs and non-trusted because they choose to use non-encrypted means of presentation?

Is there any solution to this, as it stands? Or are all of us with virtual hosting solutions going to have to migrate to some cloud-based server solution with its own IP, then obtain our own certificate (easier today, I know, and cheap to free, too), just to get around this? Is this the end of virtual private server hosting (or is it going to be relegated to third tier)?

I don't currently know what if anything Hurricane Electric plans to do regarding these changes. I don't want to move to another hosting provider if I can avoid it (while HE isn't the cheapest for what you get, they are nice in that they assume you know wtf you are doing - your hosting is basically access to the server via ssh and sftp - so you better know how to admin and set things up via a shell, because they aren't going to hold your hand).

I'm thinking I should send an email to them to ask them what they're planning to do - if anything.

If you are talking about this service: http://he.net/web_hosting.html, then it has supported SSL since 2013, at least, with a simple admin panel to set it up... If it's an actual VPS, then SSL is fully on you, and trivial to set up with common stacks and LE.

It should just label HTTP pages as "not secure", full stop. Because they aren't secure. Or at least, any page with a form. Never mind if it's a password field or not.

"Studies show [...] that users become blind to warnings that occur too frequently."

So right now it would be counterproductive to mark all http pages as "not secure". But it's the long-term goal.

The majority of the big sites that people use (Google, GMail, Youtube, Facebook, Reddit, NYT, WaPo, etc.) are already being served using HTTPS. I think if a couple of HTTP sites an average user still browses start showing these warnings, they will notice them. And what matters more, the owners of those websites will notice them and will ask their "IT guy": "Hey, why is our website marked as insecure? I want a green lock like Gmail has."

Their IT guy?? There are gazillions of people like me who have a blog, or some small project with a small audience of tens to a few thousand users. All these people now have to fork out for SSL, or move everything to a different shared host that supports Let's Encrypt.

Your shared hosting service doesn't need to support Let's Encrypt, it just needs to allow you to upload a certificate. You can use https://gethttpsforfree.com/ to generate it.

They want an "installation" fee.

And then how does the renew process work?

The renewal is the same: follow the steps and you get a new cert. Keep the account key and the CSR, so it's just a matter of copy-pasting. If they charge an installation fee again, you're probably better off paying for a cert that lasts longer (not from Let's Encrypt); you can get one that lasts three years for $15. Or just switch hosting providers :)

Right now, you can just put Cloudfront in between. It's free, and takes maybe 5 minutes to sign up and adjust your DNS entries.

Of course relying on a provider that might cancel the free plan at any time is not ideal, but worst case you just have to revert your DNS and it's done.

I assume you mean Cloudflare and not Cloudfront. While you could use Cloudfront, AFAIK there's no free option. (Aside from the usage you get as part of the AWS free tier, but that is time-limited.)

AWS Cloudfront isn't free but costs pennies a month if you're not big.

I'm running a web app, isn't Cloudflare best for static / semi static content?

It is, but it might work for your app, too – js/css/image caching at CF nodes helps a lot. Moreover, you can disable their cache/cdn features and use it only for SSL.

They even have multiple modes, the "Flexible" one works even with no changes at your server at all. It obviously makes the CF<->server transfer insecure, but your users would still get a "green lock", if that's what you're after.

This is from their in-settings help: https://www.cloudflare.com/a/static/images/ssl/ssl.png

And that IT guy will go, "Well, there isn't anything I can do about it."

The user will then forever ignore it (because they like that web site) and the whole exercise is wasted.

I really don't like this idea.

It would freak out a significant proportion of the community who fail to distinguish between a web page and their computer. They would honestly think that their computer is not secure and that their local files/photos are at risk. After the initial panic and speaking to their "friend who is good with computers", they will just ignore it forever.

The far better approach is to aggressively warn users at the point where they are trying to do something secure over an insecure connection. Personally I would add UI elements next to HTML forms saying "Do not enter your password here".

That will happen eventually; both Firefox and Chrome have already taken some steps in that direction. But it has to happen as a gradual transition, with plenty of time for people to switch to HTTPS in response.

This would be really nice. Tools like google analytics (if they aren't already) should probably show "Secure visits" and "insecure visits", and they should get even more attention.

They aren't secure, but neither is HTTPS, it is only more secure. HTTPS leaks the actual name of the site you are accessing as well as your IP; HTTP leaks the content you are seeing too. Tor would be more secure, because it leaks neither.

^ 100% agree. I think just marking HTTP pages with password fields as Not Secure would make HTTP pages without password fields appear to be secure. This is obviously not the case (they are just as insecure, because you are sending your session cookie, which is equivalent to your password), so all pages should be marked Not Secure.

Plenty of websites think they only have to secure app.example.com, while www.example.com can be left uncovered even though it serves www.example.com/login, because /login POSTs to the app. subdomain, and the session cookie is only set by and for the app. subdomain.

From a naive perspective, all requests passing private information are protected in such a scenario. But, of course, since the www. subdomain is uncovered, it can be MITMed and replaced with a phishing site. (A.K.A. a "spear-phishing" attack.)

This change somewhat fixes that scenario: the developer can no longer keep the /login route on the insecure www. subdomain; they'll have to serve that page securely (either by making www. secure, or, more likely—because it's lazier—just moving /login to the app. subdomain.)

Even though www. can still be MITMed to replace it with a phishing subdomain, that subdomain can no longer serve a login form itself. It would have to link to a "real" phishing FQDN (one the attacker controls enough to get a TLS cert for), at which point that domain can just be found and blacklisted by the browser vendors.

In a sense, it forces the attacker into the open, where the attacker themselves can be caught/blocked, rather than simply their attack being caught/blocked. (In other words, it forces MITM attackers into a situation more akin to current botnet malware-writers, where they must put up C&C infrastructure which can be traced back to them.)

Of course, this isn't nearly as good as just securing the www. subdomain; and it will force much more work (likely, doing said securing) on those who want to embed a login form directly on their www. subdomain's landing page. But, in the interim while we work on universalizing TLS, it will drastically decrease the value of spear-phishing attacks, just as spam filters drastically decrease the value of unsolicited bulk email ad campaigns.

> because you are sending your session cookie which is equivalent to your password

This is extremely different for most users that don't use password managers and/or unique passwords per site. If your password is leaked, every other account where you reuse that password may now be compromised as well. The same can't be said of cookies.

What if the <form> submits to an https page but the page is served up on an http page? The form submission will be secure, correct? Will Chrome still mark as insecure?

The form submission is only half the issue. If the http page gets compromised the malicious party could simply read the contents of the password input.


Technically it would be possible to use JavaScript to intercept the submit event of such a form and alter the submission location, or to send the data insecurely wherever you want with AJAX, completely ignoring the destination action that came with the initial HTML. This is one of the reasons people have needed to use forms within secure iframes to satisfy PCI compliance requirements when sending credit card numbers.
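To make that concrete, here is a minimal sketch of the interception, using a plain JavaScript object as a stand-in for a real DOM form (the attacker URL and all names are invented for illustration; in a browser this would be `document.querySelector('form')` with a real submit event):

```javascript
// Stand-in for a DOM form element; mimics addEventListener/submit behavior.
const loginForm = {
  action: "https://app.example.com/login", // the "secure" destination in the markup
  handlers: [],
  addEventListener(type, fn) {
    if (type === "submit") this.handlers.push(fn);
  },
  submit() {
    let cancelled = false;
    const event = { preventDefault: () => { cancelled = true; } };
    this.handlers.forEach((fn) => fn(event));
    // If no handler cancelled the event, the browser would POST to this.action.
    return cancelled ? null : this.action;
  },
};

// Code injected into the insecure HTTP page: cancel the real HTTPS
// submission and record where the credentials would be exfiltrated instead.
let stolenDestination = null;
loginForm.addEventListener("submit", (event) => {
  event.preventDefault(); // the HTTPS action in the markup never fires
  stolenDestination = "http://evil.example.net/collect"; // e.g. sent via XHR/fetch
});

const result = loginForm.submit();
console.log(result);            // null: the intended HTTPS POST was suppressed
console.log(stolenDestination); // http://evil.example.net/collect
```

The point of the sketch: the HTTPS destination written in the HTML proves nothing if the page carrying the form was delivered over HTTP, since any script injected into that page can redirect the data before it is ever sent.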

I was also thinking of the opposite: submitting from an https-loaded page to an http page. I can't imagine why any application would do this (other than by mistake), but it would ideally be flagged as insecure as well.

It's already the case most of the time. If you submit via a plain form (without js), you get the (old) "This page is encrypted, but the information you submitted will be sent unencrypted" message. If you submit via an XMLHttpRequest, it should be blocked as Mixed Content.

Needs to start blocking form fields that have no corresponding visible text input, because these unused fields still get autofilled with cached but personalized info.

Do they send a Let's Encrypt notice to these domains? Notifying users is awesome; helping "late" hosts onto HTTPS would be perfection.

Insecure websites could get around this by not marking the fields as password fields, but using JavaScript to make them appear so to the user.
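As a rough illustration of that workaround, here is a sketch using a plain object in place of a real DOM input (purely hypothetical; a real page would attach this to keystroke events on an `<input type="text">`): the field is masked entirely in script, so the markup never contains the `type="password"` attribute that triggers the warning.

```javascript
// Stand-in for a DOM input element with a displayed value and a hidden real value.
const fakePasswordField = { value: "", realValue: "" };

// Mask each typed character so the field *looks* like a password box,
// even though it is declared as a plain text input.
function typeChar(field, char) {
  field.realValue += char;                               // the actual secret
  field.value = "\u2022".repeat(field.realValue.length); // bullets shown to the user
}

for (const c of "hunter2") typeChar(fakePasswordField, c);

console.log(fakePasswordField.value);     // ••••••• (7 bullets)
console.log(fakePasswordField.realValue); // hunter2
```

A heuristic keyed only on `type="password"` would miss this, which is presumably part of why the stated long-term plan is to mark all HTTP pages as non-secure regardless of what fields they contain.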

I think the title must be "Chrome 56 will mark non-HTTPS pages with password fields as non-secure"

So if we have an http page with a password field that posts via https, it will be marked non-secure?

Yes, because it is insecure.


I'd like to see this with Adobe Flash and third party scripts. (Pandora!)

About time, this is an excellent feature.

Thank God

> A substantial portion of web traffic has transitioned to HTTPS so far, and HTTPS usage is consistently increasing. We recently hit a milestone with more than half of Chrome desktop page loads now served over HTTPS

Well, OBVIOUSLY, when the traffic is increasingly going to the same top-ten sites like Facebook, Twitter and co.

This is such a dumb idea on google's part (and mozilla's) because people are now going to program dumb workarounds for this.

Google seriously has to stop trying to police the god damn web.

Note that they have a "long-term plan to mark all HTTP sites as non-secure", so dumb workarounds will not work forever.

With free ways to encrypt the web coming out, you really have no excuse not to use HTTPS for login forms. People should be informed.

This is not the right way to educate people. It's sort of like shaming someone into doing something. Many small companies are going to be impacted by this.

There are many companies I have personally witnessed that use a direct IP to access web-based, in-house software. How are these people supposed to get an SSL cert?

We need to educate people, not shame them into doing the things big Google wants from them.

>Many small companies are going to be impacted by this.

If those small companies aren't offering secure logins to their users, they should be impacted; that's the whole point. You shouldn't get to risk your users' security just by being small. Implementing SSL is not difficult or expensive, and the prevalence of password re-use means that a small company with an unsecured login is creating risk all around the net.

We've been educating people about SSL for years. If they haven't figured it out yet, it's time to start shaming them.

A web developer or website manager has a responsibility to be informed. SSL/TLS is not a new development, it's been around for 20 years, and recommended for login forms for at least a decade. At this point, what exactly should they do? Send people to knock on the doors of every business with a site?

In-house software isn't an issue: internal staff are not going to go away and use a competitor if they see a security warning. They're just going to learn to live with it.

As for saying "small companies", I really don't see how this has an impact on smaller companies more than others. Certs are free, and trivial to install for any public domain (others in the comments above have mentioned valid problems with non-public domains, which remain to be solved but they are somewhat less affected by this feature anyway).

This may be my own rule of thumb, but I believe that companies that serve sensitive forms over HTTP also save passwords in plain text.

They should be ashamed.

"Thanks, Captain Obvious!"

Is there an option to turn this off, for those of us who feel we need it like a hole in the head?
