Hacker News

Bing now supports https (bing.com)

47 points by foolproof on Jan 11, 2014 | 45 comments



For some context:

https://news.ycombinator.com/item?id=5576041 (8 months ago)

https://news.ycombinator.com/item?id=6937686 (1 month ago)

http://www.zdnet.com/bing-is-fine-insecure-as-ever-but-fine-... (April 2013)

> Bing has never supported secure connections...


Not only did it not support secure connections (which on its own is kind of shitty, but not laughing-stock material IMO), but it also failed to fail elegantly when you attempted to make a secure connection.


Issuer: "Microsoft Internet Authority"

Just like with Google: Google Internet Authority. Interesting to see big companies no longer trusting third-party intermediate CAs, to the point that they go to the length of becoming a CA themselves. It could also simply be a cost-effective strategy.


With the amount of subdomains they use, and seeing as they each have a browser with significant market share, it's a no-brainer. I'm frankly surprised it took so long for them to bring such an essential security task in-house.


Also, not just for HTTP either since they can sign their own software binaries with it too.

Probably cheaper for them to be their own CA long term.


Wildcard certificates.


Wildcards are not as useful to massive organisations as you would think. They wouldn't want a single public/private key pair (that is essentially what a certificate is: the public key matching your private one, signed by a third party's key that the relying party has in its trust list) for *.microsoft.com (nor would Google for *.google.com), as a single all-powerful key could be far more hassle should it get into the hands of an incompetent or malicious individual or team. If they had to revoke the certificate for *.microsoft.com, there would be a hell of a lot of administration work to reconfigure each part of their infrastructure and renegotiate the relevant internal trust relationships between parts of that infrastructure using the new keys. While having different keys for everything is a burden when all is going well, the burden is worth taking on the off chance that something somewhere does go badly wrong: the damage caused by any given problem can be limited.

We have wildcard certs for each of our properties, but the resources using those certificates are many orders of magnitude less numerous than the resources covered by the name of a multinational monster. And even though we use a wildcard for internal resources, we get specific keys generated and signed for client-specific stuff (if we host any service on <client>.<ourdomain>.<tld>, for instance), just as we have different SSH keys and such for accessing information sources they provide for integration purposes: not having one all-powerful key limits the potential damage (and work involved) should any particular key/sub-key become compromised. If our internal key were to be stolen by a malicious entity or accidentally made public by a mistake on our part, then no client-specific resources would be affected (of course, to ensure this separation you need to distribute access to the private keys carefully so that they can't all be compromised in a single event).
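Part of the trade-off above is that a wildcard's scope is quite narrow anyway: the wildcard stands for exactly one DNS label. A minimal sketch of that matching rule, in the spirit of RFC 6125 (`wildcard_matches` is a hypothetical helper, not any particular library's API):

```python
def wildcard_matches(pattern: str, hostname: str) -> bool:
    """Return True if a certificate name (possibly a leftmost
    wildcard) covers the given hostname."""
    pattern, hostname = pattern.lower(), hostname.lower()
    if pattern.startswith("*."):
        # The wildcard covers exactly one label: "*.microsoft.com"
        # matches "www.microsoft.com" but neither "microsoft.com"
        # itself nor "a.b.microsoft.com".
        head, _, tail = hostname.partition(".")
        return bool(head) and tail == pattern[2:]
    return pattern == hostname

print(wildcard_matches("*.microsoft.com", "www.microsoft.com"))  # True
print(wildcard_matches("*.microsoft.com", "a.b.microsoft.com"))  # False
print(wildcard_matches("*.microsoft.com", "microsoft.com"))      # False
```

So even a compromised *.microsoft.com key would not cover deeper subdomains, and per-service certificates narrow the blast radius further still.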


I see, hmm, yeah, indeed it'd be much more desirable to have your own CA if you have such a huge infrastructure. They could even issue a new public/private key pair per server for compartmentalisation. Interesting idea.


Well, that certificate chains up to the Baltimore CyberTrust Root certificate. Doesn't that mean MS is just an intermediate CA?


They get an A from Qualys (yay?): https://www.ssllabs.com/ssltest/analyze.html?d=bing.com

… but no PFS :/
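Whether a deployment has forward secrecy comes down to the key-exchange half of the negotiated cipher suite: ephemeral Diffie-Hellman (DHE/ECDHE) suites qualify, static RSA key transport does not. A rough sketch over the OpenSSL-style suite names SSL Labs reports (`is_forward_secret` is a hypothetical helper):

```python
def is_forward_secret(cipher_suite: str) -> bool:
    """Ephemeral (EC)DHE key exchange gives forward secrecy;
    plain-RSA key transport does not."""
    return cipher_suite.startswith(("ECDHE-", "DHE-"))

print(is_forward_secret("ECDHE-RSA-AES128-GCM-SHA256"))  # True
print(is_forward_secret("AES128-SHA"))                   # False (RSA key transport)

# To inspect what a live server actually negotiates (network required):
# import socket, ssl
# ctx = ssl.create_default_context()
# with ctx.wrap_socket(socket.create_connection(("bing.com", 443)),
#                      server_hostname="bing.com") as s:
#     print(s.cipher()[0])  # negotiated suite name
```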


LOL, even my blog with a free SSL cert from StartSSL gets an A. I suppose that's nothing to brag about, then. https://www.ssllabs.com/ssltest/analyze.html?d=ahmetalpbalka...


Setting up SSL/TLS on a single server in a virtual environment that you don't even have to manage is a different story from getting major changes tested and deployed on a huge multi-billion-dollar, multi-tenant distributed system spanning not just the globe but multiple teams, languages, people and requirements.


Just so I'm clear, your argument is essentially: we should be more impressed because they designed it in a way that made this difficult to do?


There's no way to design a multi-million-user search engine in a way that makes it easy to design, test, and deploy major changes.


Where are the DuckDuckGo guys with a counterexample when you need them...


I know empirically this is false.


In my opinion, anyone should be able to get an A with a free certificate, or at least one costing nothing beyond administration. Certificates cost nothing to make. Anyone paying a dime is just helping keep up the illusion that they are expensive.


So, why stop short of implementing PFS? An oversight?


Don't forget to add PFS, Microsoft. And no NIST-corrupted curves, please.


In other words, don't use elliptic curves? And, therefore, don't use forward secrecy? Does current browser support for curves even allow you to set up a "NIST-free" ECDHE TLS server?


There's also DHE, which is not "NIST-corrupted", I guess. As far as I know, in theory it should be possible to use the Brainpool curves in TLS, but I haven't seen such a thing in actual use.


Right, but it's also not performant enough at scale.

NSS doesn't support the Brainpool curves. OpenSSL does, but no mainstream browser uses it.


Does support in unreleased versions (OpenSSL 1.0.2) count as support?


I guess.


You're using the same argument RSA used when they decided to just keep Dual EC DRBG, because it was "too late to change", even though they knew it was a backdoor. Granted, I think they are lying and did it on purpose, but even their lie is pretty bad logic.

They need to talk to Google, Mozilla and others, and decide on using a new set of safe curves in their browsers. Using a broken one is not a solution.


IE and IIS never used Dual EC as their CSPRNG.

The "NIST corrupted curves" you refer to are, for all intents and purposes, the Internet standard curves. Microsoft could provide a configuration that used only the Brainpool curves, but no browser would be able to talk to them.


Next step: enable perfect forward secrecy :)


No document.referrer, so this is untrackable. I think that's a shame: search engines should pass a referrer and hide the searched terms, for privacy.


I thought this was inherent in following a link from a https resource to a http resource with a different authority, that the browser would never expose the secure URL (via header or script state) to the insecure content.


This is correct. Browsers will not send the Referer header when moving from HTTPS to HTTP URIs. This can easily be solved by converting your website to HTTPS only, at which point search engines will index your HTTPS URLs and you will begin to receive referrer headers once again.
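The classic browser rule being described (a secure page's URL is never exposed to an insecure destination) can be sketched like this; `referrer_for` is a hypothetical helper, not a real browser API:

```python
from urllib.parse import urlsplit

def referrer_for(from_url: str, to_url: str):
    """Suppress the Referer header entirely when navigating
    from an HTTPS page to an HTTP one; otherwise send it."""
    if urlsplit(from_url).scheme == "https" and urlsplit(to_url).scheme == "http":
        return None
    return from_url

print(referrer_for("https://www.bing.com/search?q=x", "http://example.com/"))   # None
print(referrer_for("https://www.bing.com/search?q=x", "https://example.com/"))  # keeps the URL
```

This is why an HTTPS-only site starts seeing referrers again: both ends of the navigation are then secure.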


Though Google strips the search terms even on https to https (don't know about others).


The goal is to avoid leaking information on the url that might be sensitive. The fact that the user is coming from bing is not sensitive, and this might be important for webmasters. The query might be sensitive and thus should be stripped.
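Stripping the sensitive part while keeping the useful part amounts to sending only the origin as the referrer, which is roughly what Google's HTTPS results do. A sketch (`origin_only` is a hypothetical helper):

```python
from urllib.parse import urlsplit, urlunsplit

def origin_only(referrer: str) -> str:
    """Keep the fact that the visit came from the search engine,
    but drop the path and query, which may contain search terms."""
    parts = urlsplit(referrer)
    return urlunsplit((parts.scheme, parts.netloc, "/", "", ""))

print(origin_only("https://www.bing.com/search?q=embarrassing+query"))
# → https://www.bing.com/
```

The webmaster still learns the traffic source; the user's query stays private.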


I saw this on twitter, and was confused. (Twitter removed the HTTPS in the url it shows... so now it makes sense.)


The SafeSearch setting isn't saved on https. Did they even test this thing before pushing it out?


They didn't yet?!


Finally my neighbour cannot see what I Bing! Huh?


Good to see Microsoft making progress on this!


Wow… I wasn't waiting for anything else.


what?


Before, the https version of bing only showed a blank page. Seems like MS finally fixed it.


They enabled TLS (HTTPS).


AFAIK, it worked for https://bing.com. It was enabled for https://www.bing.com.


Bing used to be hosted by Akamai; looks like they switched over to their own edge network.

Also, https://bing.com just redirected to plain http, now you can actually search over https like with Google and DDG.


I think he means they finally allow SSL?




