https://www.bing.com: subject=/C=US/O=Akamai Technologies, Inc./CN=a248.e.akamai.net
Bing does support SSL on ssl.bing.com and publishes various links on that sub-domain, such as https://ssl.bing.com/webmaster/home/mysites
The fact that https://www.bing.com redirects to the HTTP version should be enough to show that this is a known, unsupported case on the primary domain. The behavior has been like that for years.
Or use HTTPS Everywhere. Personally, I'd also like it if in future, web browsers would try HTTPS first and HTTP second.
I can construct a web server that sends version A of a site over normal HTTP, and version B of that same site over HTTPS.
In fact, sometimes I do that by accident. :)
It's a bad assumption.
(OK, 10 hits a day is probably an exaggeration but shockingly little of one.)
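For what it's worth, the setup described above takes only a few lines of web server config; a hypothetical nginx sketch (paths, names, and cert locations are all made up):

```nginx
server {
    listen 80;
    server_name example.com;
    root /srv/site-version-a;            # plain-HTTP visitors get version A
}

server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/ssl/example.com.pem;
    ssl_certificate_key /etc/ssl/example.com.key;
    root /srv/site-version-b;            # HTTPS visitors get version B
}
```

Point two `server` blocks at different roots and the "accident" is fully configured, which is exactly why assuming HTTP and HTTPS serve identical content is unsafe.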
My browser only shows the warning, but I only have an old Chromium, not an auto-updating Chrome.
Too Long; Didn't Read
It sounds like internet speak; it's cooler and more 1337.
Interesting article on a related topic: http://www.bbc.co.uk/news/magazine-21956748
Companies! Make sure your certs are all in order! There's no reason to send a page to your users over HTTPS if they can't trust the certificate. Canonical has been a long-time offender of this, with many of their pages sporting a certificate signed to canonical.com but being served by ubuntu.com.
There can be. HTTPS still gives you encryption over the wire. It still protects against a passive eavesdropper, like a casual packet sniffer on a public wi-fi network. The whole certificate deal protects against a Mallory with power to intercept and spoof messages. Of course nobody on the public internet can be sure there isn't a Mallory in or at the edge of the user's ISP, but in plenty of intranet or otherwise controlled networking scenarios, HTTPS with certificates lacking a trust chain can be reasonable.
Training users to click through messages that are completely valid warnings is just shitty behavior.
Untrusted certificate chains are a valid security risk. If it annoys you that users are blocked from viewing your site because you haven't gone through the established and accepted practice of actually completing the SSL trust chain, that's not the local proxy's fault.
My number one concern is the integrity and security of my company's data. Your personal or temporary TLS site is much lower down that list, especially if you're serving up untrusted certificates and expecting that users will completely ignore the warnings that try to keep them from falling victim to the kinds of attacks SSL is meant to avoid.
In the example in question (which frankly isn't very interesting, I can come up with hundreds of scenarios like this) I have a git archive I need to share with someone on an ad-hoc basis. For whatever reason, I'd like to do it securely, so it needs authentication and encryption and shouldn't go on anyone else's site. So the obvious solution is to throw it up on a static webserver somewhere and use TLS (or use ssh, of course, but that's subject to exactly the same root-of-trust problem as a self-signed cert -- surely you disallow outbound ssh access too, right? right?).
Your fantasy world wants to pretend requirements like this don't exist, and that you can simply refuse to support this kind of transfer via fiat. But it's not the real world. In the real world, this is what people have to do. Unless you break their systems, in which case they'll work around your silly rules in even less secure ways.
On a broad basis, though, untrusted certificates are blocked outright. I'm sure you can understand that for every one-to-one or one-to-few valid communications like your case, there are dozens or hundreds of one-to-many not-so-valid communications. SSL has certificate trust built in for a reason. Engineers may laugh at security professionals, but we're here for a reason. Even though we're directly at odds with each other in approach, our goal is the same. You want to help further the business by pushing new software. We want to further the business by making sure that software is secure. No one benefits from the new e-commerce solution if the business bank account is completely drained the day after the product launches.
I can come up with a hundred good reasons why I should be able to enter a store after business hours, pick up a few things, and leave (I left the money on the counter!), but for some reason businesses still lock their doors. Do you laugh at them for having completely backwards requirements?
I can actually take your key analogy into true story territory. I worked at an office where a manager had to be gone for a month. He had the only key and the policy to get a new key wouldn't get one to us for weeks. It was also against policy for him to give the key to someone else while he was gone. The solution was to simply leave all the doors unlocked while he was gone.
I understand the need for security. I understand that security is your job and that you don't want to compromise for mediocre security. However, when you won't compromise, we don't get great security. We get no security.
I agree that a strict policy with no recourse for exceptions is a bad thing. I don't know of anyone who would stand up under that kind of draconian stance.
Subject: CN=*.bing.com
X509v3 Subject Alternative Name: DNS:ieonline.microsoft.com, DNS:*.bing.com, DNS:*.windowssearch.com
Subject: C=US, O=Akamai Technologies, Inc., CN=a248.e.akamai.net
X509v3 Subject Alternative Name: DNS:a248.e.akamai.net, DNS:*.akamaihd.net, DNS:*.akamaihd-staging.net
You then have a scenario where the DNS is waiting to propagate and the SSL certificate is still propagating around the CDN. Hence you get some users without a problem, some with, and it eventually all clears itself up.
Edit: If anyone wants a script to check their certificates, here you go: https://gist.github.com/bretwalker/5420652. You'll just need to add some sort of notification logic, especially for expirations, since notifications need to go out before a problem arises.
I've since moved on to a new project. If anyone wants to talk about taking over the site, shoot me a message at bretwalker /at/ Google's email. It's Django and has some branding and a domain: http://static.nyquistrate.com.s3.amazonaws.com/media/certcia...
It's not a huge business, but it was almost trivial to setup with all the advanced monitoring & diagnostics that stackify typically does anyway.
I also can't tell if certalert.me checks for hostname mismatches. That was another part of my service, because a misconfiguration is just as fatal as an expired certificate.
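A minimal stdlib-Python sketch of such a check (the pure date arithmetic is split out from the network fetch so it's easy to test; the function names are mine, not from the gist above):

```python
import socket
import ssl
import time


def days_left(not_after: str, now_ts: float) -> float:
    """Days until expiry, given the notAfter string from getpeercert(),
    e.g. "May  9 00:00:00 2025 GMT"."""
    return (ssl.cert_time_to_seconds(not_after) - now_ts) / 86400.0


def check_cert(host: str, port: int = 443) -> float:
    """Handshake with the server and return days until its cert expires.
    create_default_context() also verifies the chain and the hostname,
    so an untrusted chain or a mismatch raises ssl.SSLError right here."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    return days_left(cert["notAfter"], time.time())
```

Wire the return value of `check_cert` into whatever notification channel you like (say, email at 30/14/7 days out), and the hostname-mismatch case is caught for free by the failed handshake.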
One thing I really appreciate from Bing is catching up to Google Maps. Throughout the whole Google Maps and Windows Phone fiasco, I didn't miss GMaps a bit; between Nokia Maps and Bing Maps, there isn't much I'm missing out on by not using Google Maps. In fact, I would love Google to catch up to Nokia by offering offline maps.
Another mention is their Bing Flight search, similar to Kayak minus the cookie/OS price manipulations. It also comes with an awesome price prediction.
Overall, Bing has some great products but if they really want to compete with Google Search, they really need to improve their algo.
Sure, they might be good enough that if they are the default, lots of people won't bother changing, but not being as good as Google means there isn't much impetus for Google users to switch.
When I first heard of the search engine google, I was immediately reminded of those childhood conversations about a number with 100 zeros following it. So yes, instant evocation. And maybe this would only happen with geeks, but that's exactly who they needed to convince to use their search engine.
Microsoft seems to have decided to ignore geeks and just market directly to 'average users'. This to me seems like a very poor strategy. No smart people telling you to use Bing or Windows 8, and guess what...
It is not, by any means, trivial to find out what products someone is looking to buy and could be the basis of a social engineering hack.
this is my favorite http://translate.google.com/#ru/pl/%D0%BA%D0%BE%D0%B2%D0%B5%... (in polish it sounds like 'caviar in a can')
We also share a lot of funny misunderstandings with Czech. Čerstvý chléb (Czech for 'fresh bread') reads like zatuchlý chléb ('stale bread') to a Pole; imagine being there for the first time and seeing grocery stores advertising old bread :D
IE 10 doesn't have the most features available; I never said it did. What I did say is that the features that are present are more correctly implemented than in other browsers.
Microsoft is definitely cleaning up its act.
www.bing.com uses an invalid security certificate.
The certificate is only valid for the following names:
a248.e.akamai.net, *.akamaihd.net, *.akamaihd-staging.net
(Error code: ssl_error_bad_cert_domain)
I expect that Microsoft will fix Bing much more quickly.
Does anyone know if Bing has any SSL-only clients? Like do any of their toolbars or built-in search widgets in Windows use SSL by default?
I don't like or use Bing services, but when you get as big and complex as Bing, nothing is easy; it's not like you can open your text editor, modify one line, commit, and push.
There are so many other variables to take into consideration; by hurrying to fix it, you could face all sorts of vulnerabilities or other issues.
I'm not saying this error is acceptable. I'm saying it's ridiculous to call it an easy fix without actually knowing anything. It might be, or it might not.
I'll be getting a wildcard certificate for a project and, never having used one before, I had assumed the certificate would be valid for an entire domain.
I understand from this situation that a wildcard certificate is relevant only to https://*.example.com and not the subdomainless https://example.com.
Assuming that to be the case, is having a wildcard certificate for *.example.com and a second certificate for example.com the solution? It'd be nice to have the entire domain covered by a wildcard certificate and not just all subdomains.
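That's right: the wildcard covers exactly one label, so *.example.com matches www.example.com but neither example.com nor a.b.example.com. A toy sketch of the matching rule (my own simplification of the RFC 6125 behavior, not a drop-in validator):

```python
def wildcard_matches(pattern: str, hostname: str) -> bool:
    """True if a certificate name like '*.example.com' covers hostname.
    The '*' stands for exactly one left-most DNS label."""
    p = pattern.lower().split(".")
    h = hostname.lower().split(".")
    if len(p) != len(h):
        return False          # '*' never spans zero or multiple labels
    if p[0] == "*":
        return p[1:] == h[1:]
    return p == h


print(wildcard_matches("*.example.com", "www.example.com"))  # True
print(wildcard_matches("*.example.com", "example.com"))      # False: one label too few
print(wildcard_matches("*.example.com", "a.b.example.com"))  # False: one label too many
```

So yes: to cover both the bare domain and all subdomains, you need either two certificates or one certificate listing both names (via SAN, as the reply below notes).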
This can be achieved via SAN (alternative names): http://en.wikipedia.org/wiki/Wildcard_certificate#Limitation
Incidentally, beware of RapidSSL and their "free www." SAN; they only grant it in certain specific and undocumented circumstances.
So basically, make sure your CSR is for www.domain.com if you want to also secure the root domain.
Really, CAs shouldn't be throwing in "free bonus" SANs without customer authorization ever. It would be much better to have a place to enter the SANs, or a checkbox asking if I want "www." as well, or to apply to the parent also, or whatever. That would also make the process more apparent to the user in addition to being more secure.
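Agreed. For the record, you can ask for the SANs explicitly when generating the CSR, so you don't depend on any CA "bonus"; a typical openssl.cnf fragment looks something like this (the domain names are placeholders):

```
[ req ]
distinguished_name = req_distinguished_name
req_extensions     = v3_req

[ req_distinguished_name ]

[ v3_req ]
subjectAltName = @alt_names

[ alt_names ]
DNS.1 = example.com
DNS.2 = www.example.com
```

Running `openssl req -new -config openssl.cnf ...` then embeds both names in the request; whether the CA honors them is, of course, up to the CA.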
Bing is doing something similar (see https://news.ycombinator.com/item?id=5576271), but they appear to have forgotten to list the plain domain.
(Actions speak louder than words.)
Basically, Akamai is using the same SSL certificate on (most?) of its edge servers. The reason is that traditionally it is difficult for a server that serves multiple domains to decide which SSL certificate to show a client -- the Host header, which contains the hostname, is sent well after the certificate information has been exchanged.
A certificate can contain several hostnames (in the SubjectAlternativeName extension) - but that does not scale if you have a big number of sites for a number of reasons (re-signing the certificate all the time is a nuisance, browser behavior with certificates containing several thousand hostnames is kind of fun, etc.).
Nowadays there are solutions to that problem (using the Server Name Indication TLS extension -- which basically sends the desired hostname in the TLS exchange before the certificate is exchanged). However, the number of sites actively using SNI is very low -- Google is the only site I know of that is doing it (try accessing google.com with/without SNI and you will get completely different certificates).
The reason SNI is not yet used much is that client support is still a bit flaky. AFAIK it is supported by all recent desktop browsers. However, I think the XP TLS stack does not support it (and there are still enough users on that), Android only supports it starting with version 3.0, etc.
So - at the moment you basically still need a separate IP for each site (or at least one IP for sites that can share one certificate).
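Server-side, SNI support boils down to swapping in the right certificate context once the client names a host. A sketch of how that selection might look with Python's ssl module (sni_callback is the real hook; the wildcard lookup is my own simplification):

```python
import ssl


def pick_context(contexts, default, server_name):
    """Choose the SSLContext for the hostname the client sent via SNI.
    Falls back to `default` when no SNI was sent or nothing matches."""
    if not server_name:
        return default            # old clients (XP, Android < 3.0) land here
    if server_name in contexts:
        return contexts[server_name]
    # try a one-label wildcard, e.g. 'foo.akamaihd.net' -> '*.akamaihd.net'
    _, _, parent = server_name.partition(".")
    if parent and "*." + parent in contexts:
        return contexts["*." + parent]
    return default


def make_sni_callback(contexts, default):
    """Build a callback suitable for ssl.SSLContext.sni_callback."""
    def callback(conn, server_name, ctx):
        conn.context = pick_context(contexts, default, server_name)
    return callback

# a server would register it like:
#   server_ctx.sni_callback = make_sni_callback(per_host_ctxs, default_ctx)
```

Clients that don't send SNI (the flaky ones mentioned above) fall through to the default context, which is exactly why a shared catch-all certificate like Akamai's is still needed.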
I don't know if akamai also supports custom SSL certificates. Facebook seems to use kind of an interesting mix between akamai and self-hosting - facebook.com itself seems to be hosted by facebook. However, if you use facebook over ssl and check the url of served profile pictures, you will see that they go to https://fbcdn-profile-a.akamaihd.net (or similar) -- hence to one of the hostnames that is mentioned in the akamai edge certificates.
SAN certs are an entirely different ball of wax. Akamai does support them, but there are some challenges getting them deployed.
Unfortunately, the deployment for any type of certificate with Akamai is a very manual process.
To me, this looks like they pushed the wrong cert to prod.
Most likely an oversight, but it does suggest SSL isn't supported.
25000 more employees can't fix this kind of incompetence.