
Bing now supports https - foolproof
https://www.bing.com
======
jared314
For some context:

[https://news.ycombinator.com/item?id=5576041](https://news.ycombinator.com/item?id=5576041)
(8 months ago)

[https://news.ycombinator.com/item?id=6937686](https://news.ycombinator.com/item?id=6937686)
(1 month ago)

[http://www.zdnet.com/bing-is-fine-insecure-as-ever-but-fine-...](http://www.zdnet.com/bing-is-fine-insecure-as-ever-but-fine-7000014285/) (April 2013)

> Bing has never supported secure connections...

~~~
bjackman
Not only did it not support secure connections (which on its own is kind of
shitty, but not laughing-stock material IMO), but it also failed to fail
elegantly when you attempted to make a secure connection.

------
nivla
Issuer: "Microsoft Internet Authority"

Just like with Google: Google Internet Authority. Interesting to see big
companies trust third-party intermediate CAs so little that they go to the
length of becoming a CA themselves. Then again, it could also just be a
cost-effective strategy.

~~~
dublinben
With the amount of subdomains they use, and seeing as they each have a browser
with significant market share, it's a no-brainer. I'm frankly surprised it
took so long for them to bring such an essential security task in-house.

~~~
lucb1e
Wildcard certificates.

~~~
dspillett
Wildcards are not as useful to massive organisations as you would think. They
wouldn't _want_ to have a single public/private pair (that is essentially what
a certificate is: the public key to your private one, signed by a 3rd party
key that the 2nd party has in its trust list) for *.microsoft.com (nor would
Google for *.google.com), as a single all-powerful key could be far more hassle
should it get into the hands of an incompetent or malicious individual/team.
If they had to revoke the certificate for *.microsoft.com then there would be
a hell of a lot of administration work to be done to reconfigure each part of
their infrastructure and renegotiate relevant internal trust relationships
between parts of that infrastructure using the new keys. While having
different keys for everything is a burden when all is going well, the burden
is worth taking on the off-chance that something somewhere does go badly wrong:
the damage caused by any given problem can be limited.

We have wildcard certs for each of our properties, but the resources using
those certificates are many orders of magnitude less numerous than the
resources covered by the name of a multinational monster. And even though we
use a wildcard for internal resources, we get specific keys generated and
signed for client-specific stuff (if we host any service on
<client>.<ourdomain>.<tld>, for instance), just as we have different SSH keys
and such for accessing information sources they provide for integration
purposes: not having one all-powerful key limits the potential damage (and
work involved) should any particular key/sub-key become compromised. If our
internal key were to be stolen by a malicious entity or accidentally made
public by a mistake on our part, then no client-specific resources would be
affected (of course, to ensure this separation you need to distribute access
to the private keys carefully so that they can't all get compromised in a
single event).
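To illustrate the scope a wildcard name grants, here's a minimal sketch
(function name and simplifications mine) of RFC 6125-style matching, where
`*` covers exactly one DNS label:

```python
def wildcard_covers(cert_name, hostname):
    """Simplified RFC 6125 name matching: a '*' label in the certificate
    name matches exactly one DNS label, so '*.microsoft.com' covers
    'www.microsoft.com' but not 'a.b.microsoft.com' or bare 'microsoft.com'.
    """
    cert_labels = cert_name.lower().split(".")
    host_labels = hostname.lower().split(".")
    if len(cert_labels) != len(host_labels):
        return False  # a wildcard never spans multiple labels
    return all(c == "*" or c == h for c, h in zip(cert_labels, host_labels))
```

So a single `*.microsoft.com` private key stands behind every first-level
subdomain at once, which is exactly the blast radius described above.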

~~~
lucb1e
I see, hmm, yeah, indeed it'd be much more desirable to have your own CA if
you have such huge infrastructure. They could even issue a new public/private
keypair per server for compartmentalization. Interesting idea.

------
semenko
They get an A from Qualys (yay?):
[https://www.ssllabs.com/ssltest/analyze.html?d=bing.com](https://www.ssllabs.com/ssltest/analyze.html?d=bing.com)

… but no PFS :/
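Whether a cipher suite gives forward secrecy comes down to its key exchange.
A rough sketch (a heuristic over OpenSSL-style suite names, not a complete
classifier) of the property SSL Labs is flagging here:

```python
def provides_pfs(cipher_name):
    """Heuristic: forward secrecy requires an ephemeral key exchange.

    OpenSSL-style names put the key exchange first (ECDHE-/DHE-/EDH-),
    and every TLS 1.3 suite (TLS_*) uses an ephemeral (EC)DHE exchange.
    A bare name like 'AES128-SHA' means static-RSA key exchange: no PFS,
    so recorded traffic can be decrypted later if the server key leaks.
    """
    name = cipher_name.upper()
    if name.startswith("TLS_"):
        return True
    return name.startswith(("ECDHE-", "DHE-", "EDH-"))
```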

~~~
aabalkan
LOL, even my blog with a free SSL cert from StartSSL gets an A. I suppose
that's nothing to brag about.
[https://www.ssllabs.com/ssltest/analyze.html?d=ahmetalpbalka...](https://www.ssllabs.com/ssltest/analyze.html?d=ahmetalpbalkan.com)

~~~
booi
Setting up SSL/TLS on a single server in a virtual environment that you don't
even have to manage is a different story than getting major changes tested and
deployed on a huge multi-billion dollar, multi-tenant distributed system
spanning not just the globe but multiple teams, languages, people and
requirements.

~~~
DannyBee
Just so I'm clear, your argument is essentially: we should be more impressed
because they designed it in a way that made it difficult for them to do this?

~~~
snowwrestler
There's no way to design a multi-million-user search engine in a way that
makes it easy to design, test, and deploy major changes.

~~~
christiangenco
Where are the DuckDuckGo guys with a counterexample when you need them...

------
d0ugie
So, why stop short of implementing PFS? An oversight?

------
higherpurpose
Don't forget to add PFS, Microsoft. And no NIST-corrupted curves, please.

~~~
tptacek
In other words, don't use elliptic curves? And, therefore, don't use forward
secrecy? Does current browser support for curves even allow you to set up a
"NIST-free" ECDHE TLS server?

~~~
pbsd
There's also DHE, which is not "NIST-corrupted", I guess. As far as I know, in
theory it should be possible to use the Brainpool curves in TLS, but I haven't
seen such a thing in actual use.

~~~
tptacek
Right, but it's also not performant enough at scale.

NSS doesn't support the Brainpool curves. OpenSSL does, but no mainstream
browser uses it.

~~~
harshreality
Support in unreleased versions (openssl 1.0.2) counts as support?

~~~
tptacek
I guess.

------
zokier
Next step: enable perfect forward secrecy :)

------
dudus
No document.referrer, so this is untrackable. I think it's a shame; search
engines should pass a referrer and hide the searched term for privacy.

~~~
prodigal_erik
I thought this was inherent in following a link from an https resource to an
http resource with a different authority: that the browser would never expose
the secure URL (via header or script state) to the insecure content.

~~~
RKearney
This is correct. Browsers will not send the Referer header when moving from
HTTPS to HTTP URIs. This can easily be solved by converting your website to
HTTPS only, at which point search engines will index your HTTPS URLs and you
will begin to receive Referer headers once again.
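Roughly, the browser rule in play here (the long-standing default behavior,
later standardized as the `no-referrer-when-downgrade` referrer policy) can
be modeled as:

```python
from urllib.parse import urlsplit

def referer_for(from_url, to_url):
    """Simplified model of the classic default referrer behavior: send
    the full referring URL, except when navigating from an https page
    to an http one (never leak a secure URL to an insecure destination)."""
    if urlsplit(from_url).scheme == "https" and urlsplit(to_url).scheme == "http":
        return None  # no Referer header is sent at all
    return from_url
```

So an HTTPS-only destination still receives the full search-results URL,
query string included, while a plain-http destination receives nothing.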

~~~
tonfa
Though Google strips the search terms even on https to https (don't know about
others).

------
alphex
I saw this on Twitter and was confused. (Twitter removed the HTTPS from the
URL it shows... so now it makes sense.)

------
gtklocker
The SafeSearch setting isn't saved on https. Did they even test this thing
before pushing it out?

------
lucb1e
They didn't yet?!

------
ateevchopra
finally my neighbour cannot see what I bing! huh?

------
diziet
Good to see Microsoft making progress on this!

------
bedspax
Wow… I wasn't waiting for anything else.

------
northisup
what?

~~~
sp332
They enabled TLS (HTTPS).

~~~
adamnemecek
AFAIK, it worked for [https://bing.com](https://bing.com). It was enabled for
[https://www.bing.com](https://www.bing.com).

~~~
timrivera
Bing used to be hosted by Akamai; it looks like they switched over to their
own edge network.

Also, [https://bing.com](https://bing.com) used to just redirect to plain
http; now you can actually search over https like with Google and DDG.

