Server Side TLS (wiki.mozilla.org)
233 points by benologist on Aug 1, 2016 | hide | past | favorite | 37 comments


Worth mentioning https://cipherli.st/ too. But I think more warning about HSTS is needed, since misconfiguring HSTS will cause the domain to be inaccessible for long periods.


I'm behind https://cipherli.st, together with some friends. IMHO there is no reason not to have HTTPS everywhere, especially now that Let's Encrypt exists. I did think about, and discuss with a lot of people, how 'strong' the page should be and whether we might want to change that. The page is targeted at sysadmins, who I expect to do at least some research before bluntly copy-pasting config files from somewhere; there are enough warnings on the page.


I'm not normally someone who complains about other people's designs, but is there a chance you could fade the watermark a lot more? It's still quite bold and immensely distracting, which makes it harder to read the content (or at least it does for me; being dyslexic, I have to concentrate harder when reading blocks of text anyway).

That aside, your site looks like a valuable resource. Thank you for publishing it.


> IMHO there is no reason not to have HTTPS everywhere, especially now Let's Encrypt exists

I don't want to disagree with you, but I do. I most certainly agree that HTTPS must be everywhere and that it's easier than ever before. Where I disagree is over less experienced developers. I can write a quick PHP / Rails / Node / whatever web server to show some website real fast, deploy it by uploading to a shared hosting package or something fancier like Elastic Beanstalk, and it's done and up there. Yes, it's on HTTP, but it's so easy. Now you want to add HTTPS to it? It's not easy. Let's Encrypt makes some aspects of it easier, but until the amount of friction is similar to the process of deploying over HTTP, you'll never see HTTPS ubiquity, in my opinion.


> Now you want to add HTTPS to it? It's not easy.

Try Caddy with automatic HTTPS [0] in reverse proxy mode [1].

[0] https://caddyserver.com/docs/automatic-https

[1] https://caddyserver.com/docs/proxy


Pardon my ignorance, but as a complete beginner how do you hook that up with a python (flask / gunicorn) app?


Run the python process on a different port and let Caddy act as a proxy, forwarding requests from the original port to it. As described in the second (proxy) link.
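For the Flask/gunicorn case above, the setup might look like this (a sketch; hostname, email, and port 8000 are placeholder assumptions, using the tls and proxy directives from the linked Caddy docs):

```
# Caddyfile — Caddy terminates TLS and proxies to the local gunicorn process
example.com {
    tls you@example.com
    proxy / localhost:8000
}
```

Then run the app on the internal port only, e.g. `gunicorn -b 127.0.0.1:8000 app:app`, so it is never exposed directly and all public traffic goes through Caddy over HTTPS.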


I use Let's Encrypt on Google App Engine... it took less than 5 minutes. Google could very easily automate it for everyone, but that removes the direct verification between domain owners and certificate authorities.

Granted, you're already giving up this control when you host with any 3rd party, but the CAs are being reckless if they encourage it.


For HPKP that's true, for HSTS not really. It may just force you to deploy HTTPS and not go back - which some might say is a good thing :-)


IMO, too many of these websites recommend includeSubdomains as part of the HSTS stanza right out the gate.

Personally, I'd deploy HSTS incrementally and wait until a majority of subdomains are TLS-capable before deploying includeSubDomains. Otherwise, you're likely in for some nasty surprises.


You can't get on the HSTS preload list included with browsers without includeSubDomains.
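For reference, a preload-eligible header looks something like this (an nginx sketch; the exact max-age minimum is set by the preload list's requirements, commonly a year or more, so check before submitting):

```
# Only send this once EVERY subdomain serves HTTPS — preload is hard to undo
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" always;
```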


To clarify, most of the customers I deal with are enterprise customers and as such have many internet-facing hostnames/domains/subdomains.

Arbitrarily enabling includeSubDomains is going to lead to nasty surprises if there is no prior coordination.


> majority

I'd argue that all, not most, subdomains must be HTTPS capable (I won't say TLS generally, either; this is only dealing with HTTP). Any that aren't will not be accessible by a user agent that recently (within the max-age) visited the parent domain if it had an HSTS header with that flag.


Agree. I started with an extremely low max-age like 120 seconds, and once I was comfortable I changed it to a larger value like 6 months.
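The incremental approach described above might look like this in an nginx config (a sketch; the values are illustrative, not recommendations):

```
# Phase 1: short max-age, no includeSubDomains — easy to back out of
add_header Strict-Transport-Security "max-age=120" always;

# Phase 2, once everything works over HTTPS: widen to ~6 months
# add_header Strict-Transport-Security "max-age=15768000" always;
```

Lowering max-age later only takes effect for clients that revisit the site, so widening the window slowly is the safer direction.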


You can check your configuration afterwards using this tool: https://www.ssllabs.com/ssltest/

For the command line, this is nice: https://github.com/iSECPartners/sslyze


Another cool util to test/debug HTTPS is https://testssl.sh

It's especially useful in locked-down environments where server-to-server communication must use TLS, yet neither server is directly accessible from the 'public' internet.


If you want to contribute to this page, or the config generator [1] that goes with it, check out its repository on github: https://github.com/mozilla/server-side-tls

[1] https://mozilla.github.io/server-side-tls/ssl-config-generat...


I've already used it a few times buddy :)


Lately I've just been throwing everything behind Caddy (caddyserver.com) in reverse proxy mode. This is all you need in your Caddyfile to get automatic TLS. It's genius.

    <hostname> {
        tls <your email>
        proxy / localhost:<port>
    }


Wow, this might actually be good enough / full-featured enough to let me stop copying around my huge HAproxy configuration boilerplate and Ansible roles for every project I spin up. Very cool!


Thanks for that recommendation - Caddy looks awesome.


For automation with Nginx and Letsencrypt: https://github.com/Z3TA/letsencrypt-nodejs-nginx


> return 301 https://$host$request_uri;

Don't do this. After I did it, my 3 websites were completely wiped out from search results: from 3k UU/day down to 20-30 UU/day.


Sorry to hear that, but academically very interesting.

Did you try to recover the traffic? If yes, what did you try and did it work?

Also, after setting the permanent redirect, I believe it would be a good idea to update the internal navigation of the site so that all links are formed with https. That way, both the crawlers and the web server will have less work to do and eventually, the entire site will be indexed/updated in the search engines with the https protocol.
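For completeness, the redirect being discussed usually lives in a catch-all HTTP server block like this (nginx sketch; example.com is a placeholder):

```
server {
    listen 80;
    server_name example.com;
    # Permanent redirect of all plain-HTTP requests to HTTPS
    return 301 https://$host$request_uri;
}
```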


I updated all navigation, RSS, links to images, menu, search, sitemap, everything. Dump of my site didn't contain any 'http' link to the same domain.

nginx was redirecting everything with 301.

I now let the website work on both http and https and have made most links protocol-independent; only the sitemap and search results are forced to https. There was a thread about it on reddit where a few more people reported the same.

I slightly recovered from it, got 101 UU yesterday.


How long since you changed to https? It may take Google a few weeks to fully recognize the changed/redirected URLs. During this period, the rankings frequently drop but once everything settles down, the rankings and traffic should gradually come back to normal.

If it has been a while (at least a month or two) since the changes were made and the traffic has not returned, that would be a cause for concern.


I switched to full HTTPS in January/February (depending on the domain); by the end of April I had less than 100 UU/day from search results.


And before the switch, the traffic used to be around 3k UV/day? If so, this is definitely a (big) problem. If you share your URL, I'll take a look and give you some input.


This recently changed. Google is no longer penalizing redirects, and is in fact rewarding http->https redirects. Hopefully other search engines will follow suit.

https://moz.com/blog/301-redirection-rules-for-seo


The things that can make your website drop seem random. I had my visitors plummet after a redesign, even though I kept most of the URLs and redirected the old ones properly. Things never recovered, I don't know why, since I was serving the exact same content as before.


Did you also set your website up with Google Webmaster Tools to register the 301?


No, I don't have a Google account and was never registered there. I don't want any US corporation or government to spy on my visitors.


That's a hard argument to stand on, considering that your search traffic is coming from Google.


Also from: Bing, Baidu, DuckDuckGo, Yahoo. My posts have been on reddit many times, and in 3 years I've been on HN 4 times with +500 upvotes. Currently 1-3 visitors per day come from search results, and 0 or 1 of those are from Google. I use GoAccess and Piwik.


Any chance this is a tracking issue? You might be losing referer information because of the 301, see [1].

[1]: http://piwik.org/faq/troubleshooting/#faq_51


I disagree. A site can get (a decent amount of) traffic from Google even without giving Google open access to ALL the activity taking place on the site by way of Webmaster Tools.


That's... not how Google Webmaster Tools works.



