
Mozilla fears DarkMatter ‘misuse’ of browser for hacking - Santosh83
https://in.reuters.com/article/us-usa-spying-darkmatter/firefox-maker-fears-darkmatter-misuse-of-browser-for-hacking-idINKCN1QL28T
======
steventhedev
Can the admins change this to point to the actual mailing list thread[0]?

I'm still reading through the thread, but it seems that DarkMatter have some
small technical issues around issuing certificates. Specifically, the claims
seem to be:

1\. using 63 bits of entropy for the serial number instead of the required 64.

2\. name constraints were incorrectly filled in for their intermediate
certificates (see the sketch below for what that extension looks like).

3\. DarkMatter's other business units may compromise the CA internally for
malicious purposes.

[0]:
[https://groups.google.com/forum/#!topic/mozilla.dev.security...](https://groups.google.com/forum/#!topic/mozilla.dev.security.policy/nnLVNfqgz7g)
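
For anyone unfamiliar with claim 2, a name-constraints extension limits which
domains an intermediate CA may issue for. A minimal sketch of what such an
extension looks like, using Python's `cryptography` package (the domain is a
placeholder, not DarkMatter's actual constraint):

    from cryptography import x509

    # Hypothetical constraint: the intermediate may only issue for
    # example.ae and its subdomains. RFC 5280 requires a CA that includes
    # this extension to mark it critical.
    name_constraints = x509.NameConstraints(
        permitted_subtrees=[x509.DNSName("example.ae")],
        excluded_subtrees=None,
    )
    # attached to the intermediate with:
    # builder.add_extension(name_constraints, critical=True)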

EDIT: more background - DarkMatter have an open request to be included in the
Mozilla CA Store[1], and are currently trusted through intermediate
certificates rooted with QuoVadis, who were recently acquired by DigiCert.

[1]:
[https://bugzilla.mozilla.org/show_bug.cgi?id=1427262](https://bugzilla.mozilla.org/show_bug.cgi?id=1427262)

~~~
bluesign
I think DarkMatter is right in the 64-bit entropy discussion; unfortunately,
the requirements are clearly wrong.

\- it says you should use 64 bits of entropy as output from a CSPRNG

\- it says the serial number is a signed number

If you use a 64-bit serial, you clearly lose 1 bit. That means any certificate
with a 64-bit serial is not conforming to the standard.

~~~
jstanley
> If you use 64bit serial, you clearly lose 1 bit.

That's not clear to me at all. A 64-bit signed integer can still contain 64
bits of entropy, it just has to be negative 50% of the time.

~~~
tialaramex
Some things don't like negative serial numbers, and so the Baseline
Requirements (which is where this 64-bit entropy requirement comes from) say
to only use positive serial numbers. Thus, "you clearly lose 1 bit".
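
To make the trade-off concrete, a rough sketch (my own illustration, not
anything from the BRs themselves): clearing the top bit of exactly 64 random
bits leaves 63 bits of entropy, while drawing a wider random value first keeps
the serial positive without dropping below 64 bits.

    import secrets

    # Exactly 64 bits from a CSPRNG with the top bit cleared so the DER
    # INTEGER stays positive: only 63 bits of CSPRNG output survive.
    serial_63 = secrets.randbits(64) & ~(1 << 63)

    # Draw more than 64 bits (here 72) and clear the top bit instead: the
    # value is still positive but retains at least 64 bits of entropy.
    serial_ok = secrets.randbits(72) & ~(1 << 71)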

------
michaelt
I've got to say, DarkMatter seems suspicious to me just because they're trying
to enter the CA business when Let's Encrypt already exists.

I can understand legacy CAs staying around as long as their existing customers
keep paying. And I'm sure there will be an ongoing market for one or two other
CAs specialising in the needs of big corporations.

But starting a new CA today, trying to sell a product that's available for
free? You've gotta wonder where they think they're going to add value.

~~~
unethical_ban
Well, we shouldn't want a single CA, no matter the organization. We should
always have at least several that are independent in their leadership,
ownership and auditing.

And yes, there are things LE doesn't yet offer. They still don't do EV, SANs,
or wildcards, do they?

~~~
LinuxBender
They do wildcard certificates now; I have several of them. I _think_ they do
SANs as well.
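
For what it's worth, Let's Encrypt certificates have always carried their
names as SANs, and wildcards are issued via the DNS-01 challenge. A rough
sketch of the kind of CSR involved, using Python's `cryptography` package with
placeholder names:

    from cryptography import x509
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa
    from cryptography.x509.oid import NameOID

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # One CSR covering both the bare domain and a wildcard, both as SANs.
    csr = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "example.com")]))
        .add_extension(
            x509.SubjectAlternativeName(
                [x509.DNSName("example.com"), x509.DNSName("*.example.com")]
            ),
            critical=False,
        )
        .sign(key, hashes.SHA256())
    )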

------
kam
If you thought the title was nonsense, it gets worse when the author tries to
explain what a certificate authority does:

> Websites that want to be designated as secure have to be certified by an
> outside organization, which will confirm their identity and vouch for their
> security. The certifying organization also helps secure the connection
> between an approved website and its users, promising the traffic will not be
> intercepted.

~~~
nightfly
That is a perfectly reasonable description for a non-technical reader.

~~~
kam
The biggest issue with it to me is the "vouch for their security" part,
perpetuating the myth that HTTPS means the site is "safe" or that a
certificate is a security audit of some kind.

------
vik016
Sometimes I find Reuters stories boring compared to all the superficially
exciting news produced by everyone else. But these are the kind of stories
that make me thankful they exist. That contrast, in distracting times, reminds
me of what choices I make when I consume news.

Kudos to Mozilla for keeping things open. Holding on to your values in
cutthroat competitive times is fucking hard. Mad respect.

~~~
hannob
I'm not sure what you're praising Reuters for here.

This story has been reported by multiple news sites before, and I don't see
that Reuters adds anything that hasn't been reported already.

(Unless you mean the original Reuters / Project Raven story, which was indeed
exceptional.)

------
oger
I had to read it twice to get that they were writing about a CA. Definitely a
writer without a clue.

~~~
billyhoffman
Compounding the problem was that the author kept using the word "safe".

FTA: "While Mozilla had been considering whether to grant DarkMatter the
authority to certify websites as safe, two Mozilla executives said in an
interview last week that Reuters’ report raised concerns about whether
DarkMatter would abuse that authority."

Conflating "safe" or "secure" with "this website is who they say they are" may
make it easier for a lay person to understand, but is wrong and creates a
dangerous impression.

------
gpm
The main thing this thread (going with steventhedev's link) demonstrates to me
is that the CA program is completely broken. Here we have a potential CA that
is

\- Demonstrably involved in internet surveillance [0]

\- Doesn't understand the word "entropy" [1] or intentionally interprets the
guidelines in an obtuse manner in which the sequence 0, 1, 2, ... can be
considered randomly generated with 64 bits of entropy [2].

\- Likely in the business of lying [3] (either they are lying or Reuters is
thoroughly mistaken or lying)

How could whatever public benefit is gained by including these people as a CA
possibly outweigh the risk? I'd think that any one of the above should
disqualify them.

It also seems completely backwards that the root CAs publish certificates that
give subCAs equivalent power while not having the subCAs go through the same
level of scrutiny as root CAs.

[0] [https://www.reuters.com/investigates/special-report/usa-spyi...](https://www.reuters.com/investigates/special-report/usa-spying-raven/) \- also whatever you think "defensive cyber security" means.

[1] This is after repeated prompting and consulting with engineers (hint for
those not familiar with entropy: it's not possible to have more than x bits of
entropy in an x-bit random number, and they are using a 63-bit number)
[https://groups.google.com/forum/#!msg/mozilla.dev.security.p...](https://groups.google.com/forum/#!msg/mozilla.dev.security.policy/nnLVNfqgz7g/_OPSSs2uAAAJ)
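
A trivial way to see the hint (my own arithmetic, not from the thread):
entropy is capped at log2 of the number of possible values, so a serial drawn
from a 63-bit space can never satisfy a 64-bit floor, however good the CSPRNG.

    import math

    # Upper bound on entropy is log2(number of possible values).
    print(math.log2(2 ** 63))  # 63.0 -- the most a 63-bit serial can carry
    print(math.log2(2 ** 64))  # 64.0 -- the Baseline Requirements' minimum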

[2] Kindly pointed out by Matt Palmer:
[https://groups.google.com/d/msg/mozilla.dev.security.policy/...](https://groups.google.com/d/msg/mozilla.dev.security.policy/nnLVNfqgz7g/c6HoK97RBQAJ)

[3] Compare the mentions of DarkMatter in the first link to their statement in
the second link (one assumes that any reasonable definition of defensive does
not involve spying on people in other nations)

\- [https://www.reuters.com/investigates/special-report/usa-spyi...](https://www.reuters.com/investigates/special-report/usa-spying-raven/)

\- [https://groups.google.com/d/msg/mozilla.dev.security.policy/...](https://groups.google.com/d/msg/mozilla.dev.security.policy/nnLVNfqgz7g/C2A-t0xMBQAJ)

~~~
gruez
>It also seems completely backwards that the root CAs publish certificates
that give subCAs equivalent power while not having the subCAs go through the
same level of scrutiny as root CAs.

AFAIK the logic is that since the root CA is fronting the risk (in case of any
misissuance), there doesn't need to be any extra scrutiny on the part of
browser vendors. Besides, if this were somehow outlawed, the root CA would
simply provide an API that rubber-stamps whatever certificate is requested.

~~~
gpm
> AFAIK the logic is that the since root CA is fronting the risk (in case of
> any misissuance), there doesn't need to be any extra scrutiny on the part of
> browser vendors.

This logic doesn't make sense (though I think it is the logic)

\- We audit CAs precisely because the cost of being caught isn't high enough
to provide a reasonable deterrent.

\- No one is suggesting removing DigiCert's root certs over this. The cost to
the public is too high.

> the root CA will simply provide a API that rubber-stamps whatever
> certificate is requested.

\- We audit CAs precisely to prevent them from doing things like this... Any
CA caught doing this after it was forbidden isn't worth anyone's trust and
could be immediately removed.

\- Even that would be an improvement because they would presumably keep
logs...

------
throw2016
This is a pattern. Mozilla and others are furthering the interests of
gatekeepers under the pretext of 'end user security'. The browser community
have been scaremongering against self-signed certs for years now, and yet are
happily handing 'authority' to shady projects in the Middle East. What does
one make of that?

Because of this obsession with 'central authority', even local AJAX calls to
127.0.0.1 and localhost are blocked by Mozilla. What is even worse is that
they are doing this intentionally, in spite of the standards explicitly not
requiring this for localhost or 127.0.0.1, and are misleading users on bug
reports and wasting their time. [1][2][3]

Some have wasted days mucking about with CORS headers and SSL only to realize
this is a Firefox-only issue. Even Chrome, which is not shy about furthering
gatekeepers and their own interests, doesn't do this. Needless to say, this is
not a problem on any other browser.

[1]
[https://bugzilla.mozilla.org/show_bug.cgi?id=903966](https://bugzilla.mozilla.org/show_bug.cgi?id=903966)

[2]
[https://bugzilla.mozilla.org/show_bug.cgi?id=1440370](https://bugzilla.mozilla.org/show_bug.cgi?id=1440370)

[3] [https://stackoverflow.com/questions/51269439/why-are-my-requ...](https://stackoverflow.com/questions/51269439/why-are-my-requests-from-https-domain-to-http-localhost-failing-cors)

~~~
rhn_mk1
Reading the bug reports, it seems that Firefox allows access to 127.0.0.1
ports if the originating website is _not_ https. This is weird - I would never
want some random site to control my local services (including on the LAN)
without explicit permission. This leads to vulnerabilities:
[https://lwn.net/Articles/703485/](https://lwn.net/Articles/703485/)

~~~
throw2016
A standard website cannot access your local services. 127.0.0.1 and localhost
are local addresses to the server, not the end user system. Private addresses
cannot be accessed over the public internet.

This is about a server accessing backend local services on the server or the
local private network via an SSL-terminating reverse proxy, etc.

~~~
rhn_mk1
I don't understand a couple of things from this answer.

> 127.0.0.1 and localhost are local addresses to the server, not the end user
> system.

If localhost is relative to the server, then a browser running on an end-user
device cannot access it in the first place (the IP is unroutable), which would
obviate the problem. But that is not the case, seeing that the bug is open
against Firefox.

> Private addresses cannot be accessed over the public internet.

That makes sense, the 127.0.0.0/8 space is routed only to localhost.

> a SSL terminating reverse proxy

If the proxy endpoint is on localhost, it looks the same to the browser as
any other local service before a connection attempt. If the proxy cannot
certify that it matches the website (using an SSL cert?), then it's
indistinguishable from any actual local service, and therefore should not be
accessible by default.

EDIT: I genuinely do not understand what those sentences mean in the context
of that discussion - they seem to be using terms in a way I haven't seen them
used before.

~~~
throw2016
The standards explicitly name and allow 127.0.0.1 and localhost. Is there any
value in rehashing this?

This works perfectly on Chrome and all other browsers. It's only Firefox that
is a problem. So it's for Mozilla to explain why they are going against the
standards and other browsers.

~~~
rhn_mk1
If Firefox is doing something that's closer to the right thing to do in terms
of security, then it's not Firefox that is the problem, but the standard.

It would not be the only Web standard that acts against users (WebRTC
enumeration of IP addresses, canvas fingerprinting, 3rd-party cookies, and
many others come to mind).

~~~
throw2016
Then what's the point of having standards?

One could make the case that Firefox is trying to 'protect users' if they had
resisted or protested any of the user-hostile standards you mention, but
Firefox has implemented all of them.

It's just this one case, and this is not 'user hostile'. If they are going
their own way, there must be transparency, disclosure and reasoning. But in
the bug reports they are denying it.

~~~
rhn_mk1
Okay, I'm not saying Firefox is trying to protect the users in this case.
Based on [0], they care about following the standard, and they implemented a
patch once they saw they were not compliant. About 6 months ago, the bug came
back. With a charitable outlook, it looks like a bug they don't give much
priority to.

When it comes to the bigger topic of standards, they are meant to increase
interoperability. However, there are many ways to interoperability, and they
can serve different masters. Seeing how standards are just codified
behaviours, they are similar to law, and it's quite clear that law is not
always good. Civil disobedience is widely recognized as a valid way to
influence laws.

In this light, I would say breaking standards can be a form of civil
disobedience, if the standard doesn't serve the regular, disenfranchised
person.

[0]
[https://bugzilla.mozilla.org/show_bug.cgi?id=903966](https://bugzilla.mozilla.org/show_bug.cgi?id=903966)

