TL;DR: Bing doesn't support SSL on www.bing.com and has never publicized it as a supported feature. The submitter had to manually type https://www.bing.com into the address bar to generate this 'error'.
>TL;DR: Bing doesn't support SSL on www.bing.com and has never publicized it as a supported feature. The submitter had to manually type https://www.bing.com into the address bar to generate this 'error'.
Or use HTTPS Everywhere. Personally, I'd also like it if in future, web browsers would try HTTPS first and HTTP second.
Working in information security, I see this far, far too often in support tickets from employees who are unable to get to a site because our proxy is blocking misconfigured certificates. Usually we like to reach out to the owner of the site and have them update their configuration, and it gets quite frustrating when we find an unresponsive organization. Having to bypass cert checking for a site on our end is a huge security risk, and defeats the purpose of even having an SSL cert.
Companies! Make sure your certs are all in order! There's no reason to send a page to your users over HTTPS if they can't trust the certificate. Canonical has been a long-time offender here, with many of their pages sporting a certificate issued for canonical.com but served from ubuntu.com.
> There's no reason to send a page to your users over HTTPS if they can't trust the certificate.
There can be. HTTPS still gives you encryption over the wire. It still protects against a passive eavesdropper, like a casual packet sniffer on a public wi-fi network. The whole certificate deal protects against a Mallory with power to intercept and spoof messages. Of course nobody on the public internet can be sure there isn't a Mallory in or at the edge of the user's ISP, but in plenty of intranet or otherwise controlled networking scenarios, HTTPS with certificates lacking a trust chain can be reasonable.
Valid points, but in effect what you're doing is training users to believe that HTTPS means trusted. What happens if your site is compromised? The users will see the same untrusted SSL warning that they're seeing if your certs aren't in order. You're giving them security for your site, but removing their security awareness. This hurts them, this hurts the Internet, and this could come back to hurt you.
Training users to click through messages that are completely valid warnings is just shitty behavior.
Right. MitM attacks and variants are a proper subset of the mischief that can be achieved with an unsecured connection. Among other things, it's routine that people (hi!) use self-signed certs for personal or temporary TLS sites. I'd be very annoyed to pass a basic-auth-protected https git url to someone for quick-and-mostly-secure read-only access and have their access be denied by a local proxy...
>I'd be very annoyed to... have their access be denied by a local proxy
Untrusted certificate chains are a valid security risk. If it annoys you that users are blocked from viewing your site because you haven't gone through the established and accepted practice of actually completing the SSL trust chain, that's not the local proxy's fault.
My number one concern is the integrity and security of my company's data. Your personal or temporary TLS site is much lower down that list, especially if you're serving up untrusted certificates and expecting that users will completely ignore the warnings that try to keep them from falling victim to the kinds of attacks SSL is meant to avoid.
Arrgh. Sorry, but this is an absolutely classic example of why "security professionals" get laughed at by engineers. This requirements analysis is just completely backwards.
In the example in question (which frankly isn't very interesting, I can come up with hundreds of scenarios like this) I have a git archive I need to share with someone on an ad-hoc basis. For whatever reason, I'd like to do it securely, so it needs authentication and encryption and shouldn't go on anyone else's site. So the obvious solution is to throw it up on a static webserver somewhere and use TLS (or use ssh, of course, but that's subject to exactly the same root-of-trust problem as a self-signed cert -- surely you disallow outbound ssh access too, right? right?).
Your fantasy world wants to pretend requirements like this don't exist, and that you can simply refuse to support this kind of transfer via fiat. But it's not the real world. In the real world, this is what people have to do. Unless you break their systems, in which case they'll work around your silly rules in even less secure ways.
To answer your question, yes outbound and inbound SSH is blocked except through our secure gateways. Every security organization at every company will have procedures in place for bypassing the official policy on an ad-hoc basis. There's always going to be a one-off situation that requires a different set of rules on a limited time basis, and there are many ways of dealing with these exceptions.
On a broad basis, though, untrusted certificates are blocked outright. I'm sure you can understand that for every one-to-one or one-to-few valid communications like your case, there are dozens or hundreds of one-to-many not-so-valid communications. SSL has certificate trust built in for a reason. Engineers may laugh at security professionals, but we're here for a reason. Even though we're directly at odds with each other in approach, our goal is the same. You want to help further the business by pushing new software. We want to further the business by making sure that software is secure. No one benefits from the new e-commerce solution if the business bank account is completely drained the day after the product launches.
I can come up with a hundred good reasons why I should be able to enter a store after business hours, pick up a few things, and leave (I left the money on the counter!), but for some reason businesses still lock their doors. Do you laugh at them for having completely backwards requirements?
It's true that there's always a procedure in place for bypassing the official policy as needed. However, it often comes down to a question of whether it's faster to follow this procedure or to simply undermine the security. I've repeatedly seen people solve the problem of handling unsigned certificates by simply moving back to pure HTTP. This is a greater security hole than the unsigned certs ever were, but it's perfectly valid under their security policy.
I can actually take your key analogy into true story territory. I worked at an office where a manager had to be gone for a month. He had the only key and the policy to get a new key wouldn't get one to us for weeks. It was also against policy for him to give the key to someone else while he was gone. The solution was to simply leave all the doors unlocked while he was gone.
I understand the need for security. I understand that security is your job and that you don't want to compromise for mediocre security. However, when you won't compromise, we don't get great security. We get no security.
Completely understood. I can't speak for other companies, but mine has a very strict policy and also many ways to accommodate exceptions to this policy provided the employee has a valid reason for needing an exception.
I agree that strict policies with no recourse for exceptions are a bad thing. I don't know of anyone who would stand up under that kind of draconian stance.
We were experiencing a similar issue with a third-party analytics solution where their SSL cert all of a sudden started to be delivered by Akamai as opposed to the FQDN of the company, as it was before. I am curious whether Akamai is at fault here.
Was this in the past week? Because just a few days ago I was experiencing a similar problem. I attempted to access https://rememberthemilk.com and chrome complained about the SSL cert, as I was receiving the cert for their CDN and not for rememberthemilk itself. I couldn't find anyone that could reproduce the issue, but it occurred consistently on my computer regardless of browser. It resolved itself 20 min after I first noticed it, but I'm still really curious as to why it happened in the first place.
Usually the CDN will have a single certificate which then has multiple SAN entries. What can happen depending on the size of the CDN is it can take a long time for the SSL to be added to the SAN entries on all the edge servers. So if a site just starts moving to a CDN they sometimes jump the gun and instead of waiting for all the edge servers to have the SSL they just change DNS records.
You then have a scenario where the DNS is waiting to propagate and the SSL is still propagating around the CDN. Hence you get some users without a problem, some with, and it eventually all clears itself up.
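For anyone who wants to check this themselves, here's a minimal Python sketch that pulls the SAN list off whatever cert an edge server actually hands you (the www.bing.com usage line is just an example; verification of the hostname is relaxed on purpose so the mismatched cert can still be inspected):

```python
import socket
import ssl

def dns_sans(cert):
    """Extract the DNS names from the subjectAltName field of a parsed cert,
    as returned by ssl.SSLSocket.getpeercert()."""
    return [value for kind, value in cert.get("subjectAltName", ()) if kind == "DNS"]

def fetch_cert(hostname, port=443):
    """Fetch the certificate a server presents for `hostname`. The chain is
    still verified against the system trust store, but hostname mismatches
    are tolerated so we can see the wrong cert instead of an error."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()

# Example (requires network):
#   print(dns_sans(fetch_cert("www.bing.com")))
```

If the edge server hasn't picked up the site's cert yet, the SAN list will show the CDN's own names (a248.e.akamai.net and friends) rather than the site's.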
This is Akamai kinda trying to serve HTTPS on domains that don't have HTTPS setup. I'm not sure why they attempt it at all, but it's not really their fault. Same thing on my domain: https://www.theblaze.com/
I made a site that checked for SSL cert expirations and misconfigurations, but I couldn't acquire any customers. I still think there's a business there somewhere, although maybe it really only sells as part of another product.
Edit: If anyone wants a script to check their certificates, here you go: https://gist.github.com/bretwalker/5420652. You'll just need to add in some sort of notification logic, especially for expirations, since notifications need to go out before a problem arises.
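For reference, the expiration math is only a few lines of stdlib Python. A minimal sketch (the 30-day warning threshold is an arbitrary choice; the date format matches what `ssl.SSLSocket.getpeercert()` returns in its `notAfter` field):

```python
import datetime

def days_left(not_after, now=None):
    """Days until a cert's notAfter date, e.g. 'Jan  1 00:00:00 2030 GMT'
    (the format returned by ssl.SSLSocket.getpeercert())."""
    expires = datetime.datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    now = now or datetime.datetime.utcnow()
    return (expires - now).days

def needs_renewal(not_after, warn_days=30, now=None):
    """True if the cert expires within the warning window."""
    return days_left(not_after, now=now) <= warn_days
```

Hook the `needs_renewal` result up to whatever notification channel you use (email, pager, ticket) and run it on a cron schedule.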
Very useful. It's not a super hard problem to solve, but I did think it provided enough value that companies would be willing to pay for it. SSL is used on the most important pieces of a business' infrastructure, I thought it'd be a simple sell.
I also can't tell if certalert.me checks for hostname mismatches. That was another part of my service, because a misconfiguration is just as fatal as an expired certificate.
Bing is important. It's a good search engine, comparable to Google's quality and size. And it's the only competition Google has in the US and most of Europe. (Sites like DuckDuckGo and Yahoo pass queries on to Bing.) Dumb errors like this SSL problem are embarrassing, but the larger frustration is how, despite years of having a good product, they have so little market traction.
I have noticed that Bing is actually getting better. However, I still prefer Google for most things. Google "assumes" what you are looking for; this can be good or bad depending on what you are searching. Bing, on the other hand, is like the early version of Google: literal search with minimal assumption. I am glad it is there as a fallback from Google. If the Google servers go down, or if they pull some crap like banning Google search from Windows Phone, Bing is a decent alternative.
One thing I really appreciate from Bing is catching up to Google Maps. Throughout the whole Google Maps and Windows Phone fiasco, I didn't miss GMaps a bit; between Nokia Maps and Bing Maps, there isn't much I'm missing out on by not using Google Maps. In fact, I would love for Google to catch up to Nokia by offering offline maps.
Another mention is their Bing Flights search, similar to Kayak minus the cookie/OS price manipulations. It also comes with an awesome price prediction.
Overall, Bing has some great products but if they really want to compete with Google Search, they really need to improve their algo.
Do you really think the pronunciation difficulty and lack of meaning are so significantly different from "Google" when it launched? It's more vocalized, yes, but only a single syllable to Google's two, and relatively no one actually thinks of "googolplex" when talking about Google. The sound "bing", I think, conveys speediness, which they took advantage of with the slogan "Bing and decide". It's a quick, sharp sound -- as should be your searches.
You have to move your mouth into a strange position to say the word's single syllable. And it's an opinion, as common as it may be, that Bing is an awful name for a search engine.
When I first heard of the search engine google, I was immediately reminded of those childhood conversations about a number with 100 zeros following it. So yes, instant evocation. And maybe this would only happen with geeks, but that's exactly who they needed to convince to use their search engine.
Microsoft seems to have decided to ignore geeks and just market directly to 'average users'. This seems to me like a very poor strategy. No smart people telling you to use Bing or Windows 8, and guess what...
I understand Hulu and Netflix but I have a hard time understanding why Amazon wouldn't support SSL. It does put a heavier processing load on web servers but you would think that if someone wants to encrypt their shopping traffic Amazon would be open to accommodating that.
It is not, by any means, trivial information: finding out what products someone is looking to buy could be the basis of a social engineering attack.
Looks like someone pushed the wrong SSL cert to production:
    www.bing.com uses an invalid security certificate.
    The certificate is only valid for the following names:
    a248.e.akamai.net , *.akamaihd.net , *.akamaihd-staging.net
    (Error code: ssl_error_bad_cert_domain)
The post does not explain why Akamai's cert is being used. If Bing over SSL was working correctly in the past -- which I believe to be true -- and Bing was already using Akamai then, why the problem now?
I couldn't find anything on Google, so I decided to give it a try. Can't hurt. I wanted to find out if anyone has done a study on language confusion (the effect where Russians have a hard time getting good at Polish because they confuse Polish words with Russian ones). Still can't find it ;)
Well, I know neither Polish nor Russian, but here in Prague I know a lot of Russians who never learn to say the word dobrý. They say dobry (short y) instead, which exists in Russian. They never learn to say it long, even when you point it out to them.
What IE debacle? Trident is now widely considered to have the most correct HTML5 implementation in a major browser. It might not have all of the draft stuff that Webkit has, but it also doesn't require all of the webkit css hacks.
Wow. This has been going on for an hour now. It's such a simple fix, especially since one would assume that they already have a valid cert somewhere (or that Akamai does). Yet they've had an hour of SSL downtime.
Does anyone know if Bing has any SSL-only clients? Like do any of their toolbars or built-in search widgets in Windows use SSL by default?
Assuming that to be the case, is having a wildcard certificate for *.example.com and a second certificate for example.com the solution? It'd be nice to have the entire domain covered by a wildcard certificate and not just all subdomains.
Bizarrely, however, this only works for second-level domains, and it isn't disclosed in advance.
Really, CAs shouldn't be throwing in "free bonus" SANs without customer authorization ever. It would be much better to have a place to enter the SANs, or a checkbox asking if I want "www." as well, or to apply to the parent also, or whatever. That would also make the process more apparent to the user in addition to being more secure.
Not all that off-topic. For your example, you should get a certificate made out to ("Subject CN") example.com or *.example.com, then list both the wildcard and the subdomainless entry as Subject Alternative Names.
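To see why both entries are needed: under RFC 6125-style matching, the wildcard covers exactly one leftmost label, so `*.example.com` does not cover the bare apex. A rough Python sketch of the matching rule (simplified; real validators handle more edge cases like partial-label wildcards):

```python
def wildcard_matches(pattern, hostname):
    """RFC 6125-style matching sketch: '*' covers exactly one leftmost label,
    so '*.example.com' matches 'www.example.com' but not 'example.com'
    or 'a.b.example.com'."""
    p = pattern.lower().split(".")
    h = hostname.lower().split(".")
    return len(p) == len(h) and all(pl == "*" or pl == hl for pl, hl in zip(p, h))

def cert_covers(sans, hostname):
    """True if any SAN entry on the cert covers the hostname."""
    return any(wildcard_matches(san, hostname) for san in sans)
```

With only `["*.example.com"]` on the cert, a visit to plain `example.com` still trips the mismatch warning; listing both the wildcard and the apex as SANs closes that gap.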
Basically Akamai is using the same SSL certificate on (most?) of its edge servers. The reason is that traditionally it was difficult for a server serving multiple domains to decide which SSL certificate to show a client -- the HTTP Host header, which contains the hostname, is sent well after the certificate information has been exchanged.
A certificate can contain several hostnames (in the SubjectAlternativeName extension) - but that does not scale if you have a big number of sites for a number of reasons (re-signing the certificate all the time is a nuisance, browser behavior with certificates containing several thousand hostnames is kind of fun, etc.).
Nowadays there are solutions to that problem (using the Server Name Indication TLS extension, which basically sends the desired hostname in the TLS exchange before the certificate is exchanged). However, the number of sites actively using SNI is very low -- Google is the only site known to me that is doing it (try accessing google.com with/without SNI and you will get completely different certificates).
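If you want to try that google.com experiment from Python, here's a rough sketch. Verification is deliberately disabled because the whole point is to inspect whichever cert comes back, mismatched or not; comparing the raw DER bytes shows whether SNI changed what was served:

```python
import socket
import ssl

def insecure_context():
    # Verification off on purpose: we only want to see WHICH cert is served,
    # not authenticate the server. Never do this for real traffic.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    return ctx

def peer_cert_der(hostname, send_sni=True):
    """Fetch the server's certificate (raw DER bytes), with or without
    sending the hostname in the TLS SNI extension."""
    ctx = insecure_context()
    sni = hostname if send_sni else None
    with socket.create_connection((hostname, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=sni) as tls:
            return tls.getpeercert(binary_form=True)

# Example (requires network): False would mean SNI changed which cert is served.
#   print(peer_cert_der("google.com") == peer_cert_der("google.com", send_sni=False))
```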
The reason why SNI is not yet that much used is that client support is still a bit flaky. Afaik it is supported by all recent desktop browsers. However, I think the XP TLS stack does not support it (and there are still enough users on that), android only supports it starting with version 3.0, etc.
So - at the moment you basically still need a separate IP for each site (or at least one IP for sites that can share one certificate).
I don't know if akamai also supports custom SSL certificates. Facebook seems to use kind of an interesting mix between akamai and self-hosting - facebook.com itself seems to be hosted by facebook. However, if you use facebook over ssl and check the url of served profile pictures, you will see that they go to https://fbcdn-profile-a.akamaihd.net (or similar) -- hence to one of the hostnames that is mentioned in the akamai edge certificates.
Akamai isn't using the same certificate on its edge servers, unless you mean the same customer certificate being replicated (in which case you are correct). Basically, Akamai maps each ssl certificate to a slot on the cache server, which is assigned to a map similar to the standard edge CDN. Each edge machine thinks itself the site. Each time a new certificate is issued by an Akamai partner CA or the customer's CA of choice, it is pushed out by Akamai to the ssl edge network.
SAN certs are an entirely different ball of wax. Akamai does support them, but there are some challenges getting them deployed.
Unfortunately, the deployment for any type of certificate with Akamai is a very manual process.
Actually, it's been like this for a really long time. I just noticed that HN stories with nondescript titles fare better, so I decided to conduct a little experiment. The 1st spot on the front page seems to confirm my hypothesis.
With very few exceptions, everything is supposed to be served over HTTPS (or other secure protocols). Certainly something as sensitive as search traffic. Tell me your search queries and I'll tell you your interests, health problems, relationships, if you're happy in your job and what kind of porn you like.