I was involved in this launch and I want to address a very common misconception I'm seeing here and elsewhere.
Some webmasters say they have "just a content site", like a blog, and that it doesn't need to be secured. That misses two immediate benefits you get as a site owner:
1. Data integrity: only by serving securely can you guarantee that someone is not altering how your content is received by your users. How many times have you accessed a site on an open network or from a hotel and got unexpected ads? This is a very visible manifestation of the issue, but it can be much more subtle.
2. Authentication: How can users trust that the site is really the one it says it is? Imagine you're a content site that gives financial or medical advice. If I operated such a site, I'd really want to tell my readers that the advice they're reading is genuinely mine and not someone else pretending to be me.
On top of these, your users get obvious (and not-so-obvious) benefits. Fellow Googler and HNer Ilya Grigorik and I gave a talk at Google I/O a few weeks ago that covers these and a lot more in great detail:
So Google's position is that SSL is such a high priority for content sites that they will officially incite a mad scramble for every content site on the planet, from big media companies to hobby blogs, to put their pages behind HTTPS just to keep their rankings. Yet they see nothing wrong with the fact that every Blogger blog, and even the Google Online Security Blog on which this was announced, is insecure. Nice.
In my country, the cost of an SSL certificate is around 60% of my hosting costs, per year. I run a low-traffic blog with comments disabled, so users do not "interact" with the site in any way except to consume the content. I don't see any benefit from this.
StartSSL is pretty harmful, as evidenced by the events after Heartbleed. The certificates are free, but they charge you to revoke them. After Heartbleed came out and we realized a lot of those free certs were compromised, a lot of people refused to pay up for their free keys and continued using the compromised ones. What's more, StartSSL refused to do the right thing and revoke them, leading a lot of folks to go as far as petitioning to remove StartSSL from Firefox's list of Certificate Authorities, because any given site using their free certs could be compromised.
Actually, what I think is: they're as near 'free' as it gets, probably. At least there's no up-front cost in using them. Then it's a lottery as to when you need to pay them to revoke... it could still end up cheaper than paying yearly fees for other certs, I imagine. Total cost of ownership, or something.
You're right, so I fixed my post. What I meant was that my particular cert wasn't compromised. Either way, the StartSSL/Heartbleed fiasco is a real thing and I've added a link to the original discussion I was citing.
Free certificates tend to result in ugly warning messages in browsers …
Cheap certificates are available; however, they are still not free. And hosting more than one domain with SSL is a problem too with most hosting providers, unless you want to book additional hosting plans.
> Free certificates tend to result in ugly warning messages in browsers
StartSSL is free, and as long as you correctly bundle the intermediate cert (something you have to do with many, many other CA's anyway) your SSL will look no different than a $100+/year one from an A-list provider.
SSL does not come cheap. Certificates have become cheap but you need your own IP, i.e., shared hosting is a problem and hosting becomes more expensive. Certificate sellers, hosters etc. on the other hand are certainly happy about these new business opportunities – although we all know that SSL is inherently broken.
OK, good to know – although there are apparently still some restrictions according to comments by other HN users.
SSL is still more expensive, though. For most small content websites (< 500-1000 visitors a day), shared hosting is sufficient, with costs of maybe around 100 USD/year. For SSL, you usually need a more expensive hosting plan, you have to buy a certificate (OK, available for less than 10 USD if you don't care about its quality and mainly need browser support without an ugly warning window), and most hosters allow SSL for only one domain per hosting plan.
Shared hosting with 4 WordPress blogs, SSL is active but only to access the control panel since the hoster allows SSL only for one domain. Costs incl. a cheap SSL certificate: 110 USD/year.
All 4 WordPress blogs with SSL, i.e., 4 shared hostings plus 4 cheap SSL certificates: 440 USD/year.
(And caching with a Wordpress plugin is probably no longer possible …)
StartSSL is 0 USD/year. There should be more providers like them, and if the barriers to entry ($$$$) weren't so insurmountable, I'd happily start one myself. But they do a good job, and I've used several free certs from them with no issues.
I am more than happy to migrate my site to HTTPS, and I took two days to watch your YouTube video to make sure I don't miss anything.
But I have one very valid concern. Most websites run some kind of affiliate links and banners, and most of those are not served over HTTPS. This will cause mixed-content error messages in the browser. First, does using protocol-relative URLs solve this mixed-content issue? Second, can non-HTTPS affiliate links and banners work correctly (tracking etc.) on an HTTPS website?
I am sure this is one big hurdle that needs to be addressed, or else more than 50% of the websites in existence will have difficulty migrating.
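On the first question: protocol-relative URLs (`//ads.example.net/t.js`) only sidestep the warning if the third party actually answers on HTTPS, since the browser will then request the HTTPS version. A quick way to audit a page is to scan for hard-coded `http://` subresources; here is a minimal sketch (the tag/attribute table is illustrative, not a complete list of what browsers flag):

```python
from html.parser import HTMLParser

# Subresource tags that cause mixed-content warnings (or blocking) when
# loaded over plain HTTP from an HTTPS page. Illustrative, not exhaustive.
SUBRESOURCE_ATTRS = {"script": "src", "iframe": "src", "img": "src", "link": "href"}

class MixedContentScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.insecure = []  # hard-coded http:// subresource URLs found

    def handle_starttag(self, tag, attrs):
        wanted = SUBRESOURCE_ATTRS.get(tag)
        for name, value in attrs:
            if name == wanted and value and value.startswith("http://"):
                self.insecure.append(value)

def find_insecure_subresources(html):
    scanner = MixedContentScanner()
    scanner.feed(html)
    return scanner.insecure

page = """
<img src="http://ads.example.net/banner.png">
<script src="//tracker.example.net/t.js"></script>
<script src="https://cdn.example.net/lib.js"></script>
"""
# Only the hard-coded http:// resource is reported; the protocol-relative
# and https:// ones match the page's scheme.
print(find_insecure_subresources(page))
```

The protocol-relative script above would be requested over HTTPS on an HTTPS page, so it trades the warning for a broken resource unless the ad network supports HTTPS at that hostname.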
I assume self-signed will be treated as having no certificate at all, if the reason for the difference in ranking is that a certificate implies the user will reliably receive what the server sends; a site protected by a self-signed certificate is just as easy to MitM as one without a certificate at all.
Please stop spreading this lie. It's been debunked many, many times. Just because something doesn't provide 100% security doesn't mean you should give up and use nothing.
Once again, self-signed SSL raises the cost of an attack from "basically free" passive monitoring to a much more expensive MitM attack. It's a travesty that Apache doesn't simply auto-create a self-signed certificate if it doesn't have one, so plain HTTP could be retired forever.
Note: this is about transport security, and the UI presented should not suggest any kind of authentication has been achieved. In Firefox, this means not showing the "locked padlock" and other changes usually associated with SSL.
So please, stop undermining the security of the web. We could have been all-HTTPS a long time ago if this nonsense wasn't brought up each time.
> … and hard to use against everybody simultaneously
Considering the importance of HTTPS to, in Google's words, "[making the] Internet safer more broadly", this seems like a good time to again suggest that Google enable HTTPS for Google Analytics by default.
Google Analytics is on 50.8% of the top million domains on the Internet, and on 26.96% of a randomly selected 48.5 million domains. Of the 42 billion links analyzed in my research, over 48% of them had Google Analytics on either the start or the end. That's a lot of information leakage.
Anyone who is eavesdropping on HTTP connections to the Google Analytics endpoints can observe a web user's traffic history trivially. This enables simple mass surveillance by specifically looking for these connections and recording them. HTTPS would prevent that.
I should note, whilst there is an option to specifically force SSL in the new Google Analytics, it must be enabled by default in order to have a positive impact. We can't rely on the owners of millions of domains to upgrade to ensure an end user's privacy.
Sorry to jump in with a tangential reply, but BEWARE of the following!
Google treat the http and https versions of a domain as SEPARATE PROPERTIES. This means that even if you 301 every http page to https when you transition, all of your current rankings and pagerank will be irrelevant.
You can verify this behaviour for yourself in webmaster tools.
I suppose this is because it's possible to serve up different content on http/s, but really, who does that?!
In short, don't do this until google rethink their stance on what counts as a property. I'm currently nursing a client with a 30% revenue hole as a result of this.
> Google treat the http and https versions of a domain as SEPARATE PROPERTIES.
That's not quite accurate. It's on a per-URL basis, not properties. Webmaster Tools asks you to verify the different _sites_ (HTTP/HTTPS, www/non-www) separately because they can be very different. And yes I've personally seen a few cases - one somewhat strange example bluntly chides their users when they visit the HTTP site and tells them to visit the site again as HTTPS.
> This means that even if you 301 every http page to https when you transition, all of your current rankings and pagerank will be irrelevant.
That's not true. If you correctly redirect and do other details correctly (no mixed content, no inconsistent rel=canonical links, and everything else mentioned in the I/O video I referenced), then our algos will consolidate the indexing properties onto the HTTPS URLs. This is just another example of correctly setting up canonicalization.
By the way, if you're moving to HTTPS, following our site moves guidelines:
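For what it's worth, the redirect half of a move is easy to sanity-check mechanically. This is a sketch of my own (the function name and the exact rules are assumptions, not something from Google's guidelines): a hop only counts as a clean HTTP-to-HTTPS move if it is a permanent 301 to the same host and path with just the scheme swapped.

```python
from urllib.parse import urlsplit

def is_clean_https_move(url, status, location):
    """True if (status, location) is a permanent redirect from the given
    http:// URL to its exact https:// counterpart."""
    if status != 301:  # a temporary redirect is not a permanent-move signal
        return False
    src, dst = urlsplit(url), urlsplit(location)
    return ((src.scheme, dst.scheme) == ("http", "https")
            and src.netloc == dst.netloc
            and src.path == dst.path)

# Same host and path, scheme swapped, 301: a clean move.
print(is_clean_https_move("http://example.com/post/1", 301,
                          "https://example.com/post/1"))
# A 302 on the same pair does not signal a permanent move.
print(is_clean_https_move("http://example.com/post/1", 302,
                          "https://example.com/post/1"))
```

In practice you would feed this with the status and `Location` header from an HTTP client that does not auto-follow redirects, for every URL in your sitemap.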
Nope, we followed the instructions to a tee. Straight 301 redirects from HTTP to HTTPS, appropriate canonicals on all pages referencing HTTPS, and their SEO has seemingly started from scratch: they used to be in position 1 for a variety of important keywords and searches, and now they're beyond page 10.
That does not follow logically. Not-X is typically zero, just like not having an inbound link from a high-PageRank page is not a negative. Besides, there are three situations: https-only, both http/https, and http-only, which makes your claim that the middle one is negative seem less likely.
Say there are five sites that would normally be returned for a query and they have scores A:20 B:18 C:10 D:8 E:4. The results will look like "A, B, C, D, E". Say none of them support https, and then the search engine adds https as a positive ranking factor worth +3. Site C turns on https, the order still is "A, B, C, D, E". Now site B turns on https, the order is now "B, A, C, D, E".
Imagine instead they had added "lack of https" as a ranking factor worth -3. The rankings on the page would have changed exactly the same way.
"not having an inbound link" can be thought of as a negative without changing rankings. In the example above, if getting an inbound link from apple.com would move you up 4 points, then if B got a link from apple that would put them at 22 to A's 20. If instead "not having a link from apple" was worth -4 points, then A would be at 16 and B at 18.
There is no doubt that https adds a positive value, and not having it would put you at a disadvantage. But that is not what is being discussed here, the question is whether having BOTH https and http is a negative.
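The equivalence in the worked example above is easy to check mechanically. A small sketch (the site scores and the ±3 weight are the hypothetical numbers from the comment, not real ranking data):

```python
def ranking(scores, https, bonus=0, penalty=0):
    """Order sites by score, adding `bonus` for HTTPS sites or subtracting
    `penalty` from non-HTTPS sites (ties broken by name for stability)."""
    adjusted = {site: score + (bonus if site in https else -penalty)
                for site, score in scores.items()}
    return sorted(adjusted, key=lambda site: (-adjusted[site], site))

scores = {"A": 20, "B": 18, "C": 10, "D": 8, "E": 4}

# "+3 for HTTPS" and "-3 for lack of HTTPS" produce identical orderings
# no matter which subset of sites has enabled HTTPS.
for https in [set(), {"C"}, {"B", "C"}]:
    plus = ranking(scores, https, bonus=3)
    minus = ranking(scores, https, penalty=3)
    print(sorted(https), plus, plus == minus)
```

Absolute scores shift, but the page a user sees is the same either way, which is the point being argued.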
The default noted there seems fine? If the page is HTTPS, then GA uses HTTPS; if HTTP, GA uses HTTP.
With Firefox adding mixed-content complaining not too long ago, along with IE having it for a while, and apparently Chrome having it too, it's best to match protocols to minimize issues for the user.
Browsers only complain if you go from HTTPS=>HTTP, not the other way around, so there is no mixed content warning. The article itself, hosted on Blogger, demonstrates this if you check the source code -- whilst the website is HTTP, it uses JS hosted on HTTPS, with no mixed content issue.
To reiterate the issue with the HTTP default: Google Analytics being HTTP on all HTTP sites makes it a far easier man-in-the-middle target. An attacker only needs to eavesdrop on messages being sent to the Google Analytics endpoints, a far smaller and simpler task than observing and parsing all HTTP traffic.
As such, a default of HTTP even if the website itself uses HTTP is something I'd term a major issue. An ISP or government agency could track the web traffic of an enormous number of users without having to perform any real processing of their own. Admittedly, they'd only see a subset of what Google sees, but that's still a lot.
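To make the leakage concrete: a classic `ga.js`-style `__utm.gif` beacon carries the host, the exact page, and the referrer as plain query parameters, so a passive listener needs nothing more than a URL parser. A sketch with a made-up (but representative) beacon URL:

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical ga.js-style beacon as it would appear on the wire over HTTP.
# utmhn = hostname, utmp = page path, utmr = referrer.
beacon = ("http://www.google-analytics.com/__utm.gif"
          "?utmhn=patient-forum.example.org"
          "&utmp=/conditions/depression/thread-42"
          "&utmr=http://www.google.com/search?q=depression+help")

params = parse_qs(urlsplit(beacon).query)
# An eavesdropper recovers which site and which page the user is reading,
# plus what they searched for to get there.
print(params["utmhn"][0], params["utmp"][0])
```

Multiply that by half the web carrying these beacons and "just metadata" becomes a complete browsing history.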
I'm sorry, but this simply isn't something a search engine should be dictating. Turning enabling SSL into some arms race that panics small businesses into buying millions of new, pointless certificates just isn't very fair.
This kind of policy needs to be discussed openly in a suitable forum, e.g. the IETF, not handed down to us by a single company who think they have a right to dictate how the Internet works - and have provably done a horrible job of it in the past (websocket over SPDY, anyone? Yeah, I'm not even sure which version combination of SPDY and websocket I'm talking about either - pick one of the hundred)
There are strong arguments for not enabling privacy by default - not least since it prevents any kind of decentralization or caching of content. At a time when OpenSSL just suffered one of its worst bugs in history, forcing small sites to assume the risk of running code like this, which they inevitably will get wrong, materially worsens security for all, it doesn't improve it.
> This kind of policy needs to be discussed openly in a suitable forum, e.g. the IETF, not handed down to us by a single company who think they have a right to dictate how the Internet works
I don't see how this is any different from any other signal that Google uses to prioritize sites. Forcing small businesses to buy certificates doesn't seem any different from forcing them to have faster websites, for example.
There's an argument for more diversity in search engines, but I don't see how that is specific to this signal.
> There are strong arguments for not enabling privacy by default - not least since it prevents any kind of decentralization or caching of content.
How does it prevent decentralization?
> At a time when OpenSSL just suffered one of its worst bugs in history, forcing small sites to assume the risk of running code like this, which they inevitably will get wrong, materially worsens security for all, it doesn't improve it.
How many people could exploit Heartbleed before it was publicly announced, compared to how many could sniff traffic on open networks, as countless tutorials explain how to do?
Heartbleed was bad, and OpenSSL is a mess, but let's not pretend that unencrypted logins are somehow less bad.
Apologies, I hadn't made myself clear with that idiotic snarky remark :) What I meant is that their actions in the past shouldn't be an excuse for their actions today.
The principles behind PageRank are based on unbiased reputation, and provide for a good ranking system (spammers aside). Whatever's thrown on top needs to be carefully considered not to enforce biases towards any group in particular.
> I'm sorry, but this simply isn't something a search engine should be dictating.
Damned if you do, damned if you don't. If the announcement from Google had been that they wouldn't use their considerable clout to promote SSL, they'd have been criticized for putting their profits above improving the general long-term health of the internet.
> have provably done a horrible job of it in the past
That's a single example of a new technology that didn't work out well. How about pioneering certificate pinning, for a counterexample? Nobody's perfect; if you never break anything, it means you never try anything new. Also, websockets and SPDY aren't even close to "dictating" anything, they're new technologies that you can use or not use as you want.
"Having HTTPS support will only get you a very minor boost in rankings."
If your livelihood depends on getting traffic from Google - and a lot of sites do - then even a minor boost may equal a lot of money. Plus the fact that you can never know quite how much, so to be safe you must assume it's worthwhile.
The problem I have with this move is that to me it appears as Google are furthering their own political agenda. They want the web to be https, so they penalise sites that are not. It would be different if the argument was that sites on https tend to hold more quality content than non-https sites, but that doesn't seem to be the reasoning.
Why is "quality content" an objective measure and "user security and privacy" a political agenda?
I agree that it's dangerous for one entity to have so much power over the web, but I don't see how this particular signal is any different from any other they already use, including those which define the quality of the content.
> At a time when OpenSSL just suffered one of its worst bugs in history, forcing small sites to assume the risk of running code like this, which they inevitably will get wrong, materially worsens security for all, it doesn't improve it.
OpenSSL is not the only SSL stack, you know. I run one of my websites on Tomcat so I can benefit from the pure-Java TLS stack it uses (the default one, actually). Something like Heartbleed is impossible for such a stack.
Would be more awesome if they offered free certificates and an API to renew them.
Right now enabling https is not a one-time investment, since a new certificate has to be requested and installed each time the old one expires.
Computers are supposed to bring down cost and automate tedious tasks, for https the opposite is the case.
It’s worth mentioning that https://www.startssl.com/ does offer free certificates. But without a paid account they last only a year and cannot be issued for wildcard domains, so you quickly end up with a lot of certificates that have to be manually renewed each year.
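Until CAs automate renewal, the best you can do is automate the reminder. Python's stdlib can at least parse the `notAfter` date out of the certificate dict that `ssl.SSLSocket.getpeercert()` returns; a minimal sketch with a hard-coded example date (the dates here are illustrative, not from a real certificate):

```python
import ssl
import time

def days_until_expiry(not_after, now=None):
    """Days left before a cert expires, given its notAfter string in the
    OpenSSL text format that getpeercert() returns."""
    expires = ssl.cert_time_to_seconds(not_after)  # -> seconds since epoch
    now = time.time() if now is None else now
    return (expires - now) / 86400

# Example notAfter value; in practice read it from
# getpeercert()["notAfter"] on a live connection.
not_after = "Aug  7 12:00:00 2015 GMT"
reference = ssl.cert_time_to_seconds("Aug  7 12:00:00 2014 GMT")
print(round(days_until_expiry(not_after, now=reference)))  # 365
```

Run something like this from cron against each of your certs and you at least turn a surprise outage into a scheduled chore.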
My issue with SSL everywhere is that I have to effectively buy my domain twice: once for the domain, and once again for the certificate. My registrar should give me a wildcard certificate good for the time I've paid for my domain.
The price wars for domain registration pushed the cost way down over the past few years. Certs are starting to move in the same direction. As volume picks up, they can cut margins. And some guys will start to treat it as baseline feature and not a buy-up.
This is my concern too. I manage 100+ clients registered and hosted in a variety of places (often because I'm inheriting their choice of host, registrar, etc). It's painful enough without adding SSL to the mix for even the littlest of sites.
I'm interested in statistics (especially from websites with non-technical and international audiences) about what percent of visitors are using browsers/devices that don't support SNI.
I don't know how representative this is, but it looks like StatCounter Global Stats  says that slightly over 10% of recorded visitors are still using Windows XP, and many of these users won't have SNI support.
Small websites without strict security requirements often use shared hosting, where SNI is the only practical way to implement HTTPS. Alienating something like 10% of visitors with a security warning is probably not desirable. I imagine this could be a not insignificant roadblock to widespread SSL adoption on small websites, but would like to see more detailed stats.
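A related caveat with SNI: the hostname goes out in cleartext in the very first handshake message, which you can observe with nothing but the stdlib by capturing a ClientHello in memory (the hostname below is an example, no network needed):

```python
import ssl

# Build a client-side TLS object over in-memory BIOs so we can inspect
# the first handshake flight without touching the network.
ctx = ssl.create_default_context()
incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
tls = ctx.wrap_bio(incoming, outgoing,
                   server_hostname="secret-blog.example.org")

try:
    tls.do_handshake()       # stalls: it now needs the server's reply
except ssl.SSLWantReadError:
    pass                     # expected; we only wanted the ClientHello

client_hello = outgoing.read()
# The SNI extension carries the hostname unencrypted in the ClientHello.
print(b"secret-blog.example.org" in client_hello)  # True
```

So even on an all-HTTPS shared host, a passive observer still learns which site (though not which page) a visitor is requesting.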
http://www.utilitydive.com/ is a US-based news site for the electric utility industry. About 4.5% of visitors are on Win XP. It's trending down pretty sharply; it was nearly twice that at the start of the year.
I don't get it. I have a website that is purely content and available to everyone. It has no user accounts, no sign-ups, no nothing but static pages. Why should I use HTTPS for that? To prevent man-in-the-middle attacks?
Using HTTPS will let you know if the website you're visiting is being messed with by a company, country, or regionally controlled firewall or content filter. If someone operating one of these filters took issue with your site, they could block certain content and users wouldn't necessarily know that it's happening.
Usefulness of SSL aside, is anyone else terrified that Google can essentially dictate what it wants developers to do, with low search rankings as the penalty for not following them? In my opinion, this sets a scary precedent.
Makes sense. The reason seo spam is effective is because it's so cheap to get a new site (or ten thousand new sites) up and running. If you make that cost $50 per domain for the ssl cert, that will help ensure all those sites sift nicely down to the bottom of the rankings.
Bonus points if they allow a single bad site to tarnish the reputation of all sites under a multi-domain cert.
We could have had this from the start if domain names weren't essentially free via domain tasting. But hey, better late than never.
I do agree, however remember that you can get SSL certs from $9 (e.g. from NameCheap). You might be able to pay lower if you shop around too.
Also even if it was used as a fairly strong ranking signal, if Google still approach their rankings like they do now, spammers might still have sufficient ranking 'weight' to overcome a lack of SSL certificate.
I'm a freelance web developer for dozens of restaurants. They pay for the site, then a yearly hosting fee every year after launch. They get a basic CMS so they can update their hours/menus/etc.
I host all their sites on a few VPS servers. Some of my contracts require support for IE 7 or IE 8 on Windows XP, and those browsers don't support SNI. So in addition to what you've mentioned - maintaining certificates and losing more of what little money I make on hosting (I basically charge a small % of the VPS cost plus a few hours' worth of work), I now will need to figure out another solution. It seems like a waste to spin up a new VPS for each site that requires XP support.
Clients look at the <10% of visitors coming from Safari and IE7+8 on XP and say "those are potential customers." It's difficult to argue with that.
For now though, I'm going to do nothing new. All indications are that HTTPS is going to be maybe 1% of the ranking, and I know my market well enough that the sites rank highly for local searches - which is the important part. They're responsive and they've all got social media presences, so until SSL is more important for PageRank, I'll wait it out.
It probably bugs me the way it does, because this "signal" has nothing to do with the contents or the usability of the web site (unlike speed, validity of HTML or, well, content itself), but is purely a "we just think you should do X" situation.
I would definitely prefer to use a site that supports HTTPS over HTTP. For personal safety reasons in addition to privacy and general welfare of the web.
If you're searching for something and roughly the same content is available at safedomain.com vs. notoriouslysketchy.ru, I'd think you'd prefer to be shown the former above the latter. I don't see how this is much different.
I'm surprised by the amount of negative comments. Independently of what you think about HTTPS and CAs in general, given that the alternative currently is plain text, I'm actually surprised it wasn't a signal before.
Google is not "enforcing" anything; people react as if you were not going to show up in the results at all, or as if Chrome won't work over HTTP. HTTPS is a signal, just like having a link from a well-ranked website like HN is a signal, and probably dozens of others.
The points you mention are in fact indicators that someone has put care and resources to make their site work more securely, which says a good thing about the site, which google rewards with some points in their algorithm. Makes perfect sense to me that this will somewhat improve the quality of their results. Would you also complain about google using fast response times as a signal because that "forces" people to pay for better servers?
About your security point: Google cannot do that without losing 50% of its customers, and I really don't understand what that has to do with rewarding HTTPS being good or bad. Looks like a red herring.
Right, you will not disappear from the results. The reaction (granted, maybe an overreaction) is about Google pushing HTTPS hard for security (which could be good, but not automatically so) while not caring in areas where it is as important, if not more so.
You are just proving my point. Google rewards the richest, those who have the resources, as you say. As for care, I would be glad if people were not going to do it for the wrong incentives. Will Google just check if HTTPS is available and reward it, or will it also check for broken ciphers and penalize them?
I am not against HTTPS. I'm just saying that rewarding HTTPS is not enough. It's worse, actually: some will set it up quickly and badly just for the extra ranking points, and not for the actual security it should be providing.
To me, the red herring here is pretending to do it for security. What is the point of HTTPS if I receive my password by mail? To me, email is more important to secure first. Google could perfectly well incentivize security practices in Gmail without losing a single customer. I would even settle for just signing instead of encrypting mails.
As for enforcing, HTTP2 (that is SPDY) IS enforcing HTTPS.
IMO, good HTTPS where it matters is more important than crappy HTTPS everywhere; the latter is ridiculous and could even be dangerous thanks to a false sense of security.
> Google rewards the richest, those who have the resources as you say.
Google doesn't care who it is rewarding; Google cares about the users that search, and they've said that multiple times. And yes, people with better resources build, on average, better things than people without them.
> I am not against HTTPS. Just saying that rewarding HTTPS is not enough. It's worst actually, some will set it up quickly and badly just for the extra ranking points and not the actual security it should be providing.
Even then, it's still 10 times better than plain-text HTTP, where my whole office can see what I'm browsing with a simple console command.
> What is the point of HTTPS if I receive my password by mail ?
Your email inbox should be accessed via TLS; that part is up to you. And while you don't control the origin (nobody can without breaking compatibility), intercepting a message in transit like that is not exactly something most people I know can do, while getting that password over HTTP is almost trivial for anyone sitting around me.
> As for enforcing, HTTP2 (that is SPDY) IS enforcing HTTPS.
The day you can only see a website via SPDY, then I would call that enforcing it. Yes, if you want the carrot (performance) you have to jump through the hoop (security), but nobody forces you to eat the carrot.
> IMO, Good HTTPS where it matters is more important then Crappy HTTPS everywhere just is ridiculous and could even be dangerous thanks to a false sense of security.
I really can't see which scenario you are picturing here. Setting it up is not rocket science.
> Google doesn't care who is it rewarding, google cares about the users that search, they've said that multiple times.
Hum, well, I've grown wary of what Google says. Like putting commercial mail in a separate inbox being there to help the user.
It also happens to indirectly help Adsense.
> And yes, people with better resources build on average better things than people without them.
Does that mean content created by an association without a dime, for instance, is on average inferior?
I happen to like cooking. I often find websites with great content by word of mouth. They are generally badly ranked because they look like they were made in FrontPage in the GeoCities age. Yet the content is very good and sometimes even quite unique. They rank badly because they are not speedy and not in beautiful HTML5. That's elitism. Maybe they should buy AdWords.
> Even then, still 10 times better than plain text HTTP so my whole office can see what I'm browsing with a simple console command.
That is one of the few good arguments for HTTPS everywhere : privacy.
> And while you don't control the origin (nobody can without breaking compatibility)
You can encrypt, or even just sign, emails without breaking compatibility. Putting commercial email in a separate inbox is OK, but putting unencrypted and/or unsigned email in a separate inbox is not?
> While getting that password over HTTP is almost trivial for anyone sitting around me.
> I really can not get which scenario you are picturing here. Setting it up is not rocket science.
Is it better to have open WiFi or WiFi with WEP? It's the same, because WEP is nowadays easily broken by script kiddies with simple tools.
That's the scenario I'm picturing here: a web full of weak/broken certs set up just to comply for ranking, people feeling safe (it's encrypted, right?), and script kiddies with trivial tools to break the WEP equivalent of weak/broken HTTPS certs.
Granted, maybe I'm over-pessimistic here, but the trend annoys me. I don't take Google at face value anymore. You know they excel at the long play.
On the bright side, maybe people will use their certs for more than HTTPS ... say mail server for instance :)
I don't have a ton of experience with SSL and only recently started messing with TLS on my Apache server, but a question: Google makes mention of a 2048-bit certificate, but most of the certificates I see are 128/256-bit. Is this number referring to something other than the strength of the encryption?
Messages like "your connection is encrypted with 256-bit encryption" don't tell you anything about the size of the RSA keys in use.
During the TLS handshake, your browser and the server do public-key crypto to authenticate each other and share private information without a previously-known shared secret. Because public-key crypto is really, really slow, they then share a small secret (say, 128 or 256 bits), and use that secret as the key for a traditional symmetric encryption algorithm like AES. That's the number you're seeing.
Thank you both, that really makes sense. I thought it was a little peculiar that Google decided to mention the whole 2048-bit thing; I couldn't have been the only one that was thrown off by that a bit.
2048 is the RSA key size. 128 or 256 is the AES key size (or more generally the key size for the symmetric cipher). Nowadays the latter doesn't depend on the cert, but in the 90s it artificially did thanks to U.S. export policy. CAs still advertise as if we were stuck in the 90s.
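The division of labor is easy to see in a toy sketch: an RSA-style operation (here with absurdly small numbers and no padding, purely illustrative, not real crypto) protects the delivery of a short symmetric key, and that symmetric key then does the bulk encryption:

```python
# Toy RSA with tiny primes, for illustration only; real moduli are 2048 bits.
p, q = 61, 53
n = p * q        # public modulus (the "2048-bit" number in a real cert)
e = 17           # public exponent
d = 413          # private exponent: (e * d) % lcm(p - 1, q - 1) == 1

session_key = 42  # stand-in for a 128/256-bit AES session key

encrypted = pow(session_key, e, n)  # what travels over the wire
decrypted = pow(encrypted, d, n)    # only the private-key holder recovers it

print(encrypted != session_key, decrypted == session_key)
```

The 2048-bit figure is the size of `n`; the 128/256-bit figure is the size of `session_key`. Neither number tells you anything about the other.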
Perhaps more people would be more inclined to get nudged by Google in this direction if Google got in the CA game and sold certificates themselves, 2048 bit, right on Google Play, Chrome Optimized certs, NFC and QR code-enabled. How about that?
There is something odd about this announcement. Since it is no harder for a black-hat publisher to switch to HTTPS than a white-hat publisher, this signal will likely get noisy very quickly. I think Google knows this, and it's why they rarely specify ranking signals.
So, really, they are just using their dominance in search to effect change. Now, you may agree with the change (they did something similar by announcing page-load time as a signal), but it makes me a little uneasy for Google to leverage their dominance in this way, even if I happen to agree with the goal.
Do you have to throw out all your social rankings?
I've been wanting to switch to HTTPS but have been avoiding it due to significant accumulation of Facebook likes and some +1's. Last I checked, you had to do some big hacks to maintain your Facebook like count. Has anybody found a good way to handle this or do you just have to start over?
I'm going to come out and say it. HTTPs is borked, in a functional way. On a social/technical level, it has become a false sense of security. The PRISM revelations let us know that the three letters and any corporate wannabe was doing MITM not just on http but on HTTPS whenever possible. I would say the ISP's and the CA's should all be considered compromised.
We need something new and better, not to push HTTPs on everything as an imagined stop gap...
That being said, I do understand that wider adoption would raise the cost of performing such attacks for ISPs and the three-letter agencies.
No. A plain cert is what you normally buy; EV stands for Extended Validation, which only specific CAs can issue under extra guidelines, and EV certs are generally much more expensive. Browsers generally show the organization's identity in the URL bar for EV certs, which they don't do otherwise.
I'm all about this but what about third party static sites like everything on Github Pages? We're using Github to host http://kili.io which has all of our marketing material but there's no way to upload a certificate there. I'd rather not move off of Github Pages for the main site because it's easy to just push changes, it's fast, it makes it easy to tie into the rest of our open code (https://github.com/kili), and it's free.
Nice organization and cause, looks really interesting. Not being snarky at all but asking - you provide hosting but aren't able / won't host your marketing site on your own servers? Even if you don't have your own infrastructure, even grabbing a VPS from digital ocean or linode and throwing up a cert there could solve your problem.
Just doing a little more research. It appears as though many of these EV certs require you to verify your company information, domain registration, phone number and even address. With Google using however many hundred or thousand ranking signals, this makes sense. It is essentially another layer of trust and really great for UX as well. Online shoppers trust a site with the "green location bar" much more than ones without it, and I could definitely see how Google might reward this type of website.
I don't buy the 'this is for your own safety' nonsense. That said, when are Google going to improve their search algorithm? These days so many shitty content-farm results clog up the first page. How about fixing that first?
Unfortunately, Google is so dominant in online advertising and search that few companies will have a choice in this matter. Google unilaterally forcing them to buy stuff doesn't sit well with me.
After you buy an SSL certificate, Heroku charges $20/month to use it. You can circumvent this with Cloudflare, but they also charge $20/month for SSL. Is there any easy way to use SSL on Heroku for $0-5 / month?
Are you able to elaborate? If not that's fine, but I just don't think paying for a cert makes it somehow better, so I liked the idea of CAcert. If I should be concerned about using them to secure personal sites (in this sense really only used by myself and a few friends who have the CA root cert) I'd be interested in knowing.
Let's kill small ISVs and raise the barrier to entry for the internet. Oh, certificate management is too complex and expensive for too little ROI? Well see, here we have a nice "cheap to the eye" cloud solution just for you.
Security isn't the priority here. Selling cloud is.
Switching over to HTTPS in and of itself shouldn't stop much data leakage, given that the hostname (still sent in the clear, at least at present) isn't difficult to obtain and really gives the game away for the content you're visiting, as far as my site is concerned. But I suppose it's a step in the right direction and will stop primitive tracking attempts.
Protecting against code injection is actually a fair point though.
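On the hostname point above: you can actually see the leak without touching the network, because Python's `ssl` module can drive a client handshake through in-memory BIOs and let you inspect the raw ClientHello bytes. A minimal sketch (`example.com` is just a stand-in hostname):

```python
import ssl

# Even with HTTPS, the hostname travels in cleartext: it is carried in
# the SNI extension of the ClientHello, which is sent before any
# encryption is established. Drive a client handshake into an in-memory
# buffer and look at the first bytes it would put on the wire.
hostname = "example.com"  # stand-in for whatever site you're visiting
ctx = ssl.create_default_context()
incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
client = ctx.wrap_bio(incoming, outgoing, server_hostname=hostname)

try:
    client.do_handshake()  # cannot finish: there is no server attached
except ssl.SSLWantReadError:
    pass                   # expected -- the ClientHello is now buffered

client_hello = outgoing.read()
print(hostname.encode("ascii") in client_hello)  # True: the name is right there
```

Any on-path observer (hotel wifi, ISP, proxy) sees the same bytes, which is why HTTPS alone doesn't hide *which* site you're visiting, only the pages and content within it.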
Sure they would benefit. They could check whatever is on your site without anyone in between noticing (open WLAN at Starbucks, corporate LANs and their proxies, etc.). There are companies out there who buy surfing habits (read: logs of URLs visited) and mine them for valuable data.
20% of android devices, but definitely not 20% of any website's android traffic. the amount of web traffic that comes from those android phones is approximately 0 unless maybe you're in africa or china - the people who still have android 2.x phones aren't browsing the internet with them.
it's definitely not zero. as for whether or not i'm willing to serve those people a big scary error message, i'm not sure yet. it's something we're actually going to have to decide on in the next couple of weeks though; this isn't hypothetical for me.
a lot of my traffic is repeat, so we'll probably do a good campaign to push users off IE8 this fall and officially declare it unsupported in Nov/Dec.
I hope that at least one site runs without https so that when I am using airport/airline/FlyingJ/Starbucks/etc wifi, I can access it and be presented with the button I need to press to access the network.