I'm still reading through the thread, but it seems that DarkMatter have some small technical issues around issuing certificates. Specifically, the claims seem to be:
1. using 63 bits of entropy for the serial number instead of the required 64.
2. name constraints were incorrectly filled in for their intermediate certificates.
3. DarkMatter's other business units may compromise the CA internally for malicious purposes.
EDIT: more background - DarkMatter have an open request to be included in the Mozilla CA Store, and are currently trusted through intermediate certificates rooted with QuoVadis, who were recently acquired by DigiCert.
- it says you should use 64 bits of entropy as output from a CSPRNG
- it says the serial number is a signed number
If you use a 64-bit serial, you clearly lose 1 bit, since the sign bit must be 0 for the serial to be positive. That means any certificate with a 64-bit serial is not conforming to the standard.
This has been generally accepted (via discussions on MDSP and in the CABForum itself) to mean that the serial must contain at least 64 bits of entropy, not that the generation process consumes 64 bits. If it were the latter, then `head -c 8 /dev/urandom | head -c 1` (and clamping the top bit to 0) would be a permissible method of generating a serial, since it consumes 64 bits of CSPRNG output to generate the serial number.
CAs are given enormous power in the ecosystem and we judge them based on their incident response and process improvement. The barrier to entry should be very high for new entrants as we strive to improve the state of the industry. This incident alone is not a disqualifier for DM's inclusion but should be considered in the context of the other issues (both technical and otherwise) raised during review of the application process.
Regarding your claim that all certificates with 64-bit serial numbers are guaranteed invalid: A certificate with a 64-bit serial MAY be valid under the standard because a CA might choose to implement the spec such that they generate a serial and prepend a null byte if and only if the high bit is 1. To determine whether or not a CA is doing that experimentally you could use services like crt.sh or censys to find all CAs with valid 64-bit serial number certificates and see if they also issue 72-bit serial numbers. If you don't see any >64-bit they're likely violating the BRs and you should submit a report to them (Mozilla maintains a contact list for CAs on their wiki) and also send an email to MDSP. Independent researchers are the lifeblood of the current wave of enforcement across the industry!
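To make the prepend-a-null-byte method concrete, here's a minimal sketch in Python (illustrative only, not any CA's actual code): draw a full 64 bits from the CSPRNG and, when the high bit is set, prepend a zero byte so the DER INTEGER encodes as positive without discarding any entropy.

```python
import os

def gen_serial_bytes() -> bytes:
    """Sketch of a BR-compliant serial: 64 bits of CSPRNG output are
    all kept; a 0x00 byte is prepended when the high bit is set so the
    DER INTEGER stays positive without losing a bit of entropy."""
    raw = os.urandom(8)        # 64 bits of CSPRNG output, all kept
    if raw[0] & 0x80:          # high bit set: would encode as negative
        raw = b"\x00" + raw    # pad to a 9-octet (72-bit) encoding
    return raw
```

A CA doing this will issue some certificates whose encoded serials are 9 octets long, which is exactly the observable signal described above: if a CA issues only 64-bit serials and never anything longer, that's a hint the high bit is being clamped instead.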
In this specific case, they have the option to use up to 160 bits for the serial number. They chose the minimum required 64, apparently out of performance concerns for embedded devices with limited memory/compute.
The CA platform (EJBCA) they use defaulted to this behavior, and from what I was able to understand, the vendor is in the process of pushing out an update to all their customers to resolve the issue. It is very likely that someone will start going through the CT logs, and we will see many reports of insufficient entropy in cert serial numbers.
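I don't know EJBCA's actual implementation, but the flawed default described above amounts to something like this sketch: clamp the top bit to zero so the integer stays positive, silently discarding one bit of entropy.

```python
import os

def clamped_serial_bytes() -> bytes:
    """Sketch of the non-compliant behavior: draw 64 random bits,
    then force the top bit to 0 to keep the DER INTEGER positive.
    Only 63 bits of entropy remain in the result."""
    raw = bytearray(os.urandom(8))
    raw[0] &= 0x7F    # clamp the sign bit: 63 bits of entropy left
    return bytes(raw)
```

The output is always exactly 8 octets with the high bit clear, which is why every serial from such a CA fits in 64 bits and none ever needs a 9-octet encoding.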
Mozilla doesn't have a big team of investigators wandering about looking for possible problems. The CAs could dob themselves in (filing a "problem report" that says they broke this rule and what they're doing about it) or an independent researcher could write to m.d.s.policy about a problem they found. Otherwise it may go undetected forever.
Most interest is focused on problems that either directly indicate a grave problem (e.g. some researcher got a CA to issue for example.com when they shouldn't because that researcher doesn't control example.com) or suggest a management problem that could hide further grave problems from us (e.g. Symantec had paperwork that directly contradicted the operational behaviour of CrossCert, so why did we notice the problem before they did?)
The 64-bit entropy rule isn't such a problem. CAs should use random values here because it protects us against a problem with the secure hash function (usually SHA-256 these days). If the hash function is slightly broken, the random numbers buy us time to get a new one. But if you have 63 bits instead of 64, it's not like that's the difference that makes an attack viable. It's just a technical violation; it's a Brown M&M in that sense. The rules say to do it, and they didn't, so we are more worried than otherwise that they are not obeying the rules.
You have some software that gets a value for the serial from an RNG; when you get 64 bits and put them into the serial field, you feel it is OK.
This is probably what happened in this case. But if the requirements had said that, for example, the serial must be 64 random bits plus some extra (a counter, an identifier, etc.), forcing the serial field to be one more octet, this could never happen.
That's not clear to me at all. A 64-bit signed integer can still contain 64 bits of entropy, it just has to be negative 50% of the time.
I can understand legacy CAs staying around as long as their existing customers keep paying. And I'm sure there will be an ongoing market for one or two other CAs specialising in the needs of big corporations.
But starting a new CA today, trying to sell a product that's available for free? You've gotta wonder where they think they're going to add value.
From the EFF:
> The concern in this case is that DarkMatter has made its business spying on internet communications, hacking dissidents’ iPhones, and other cyber-mercenary work. DarkMatter’s business objectives directly depend on intercepting end-user traffic on behalf of snooping governments. Giving DarkMatter a trusted root certificate would be like letting the proverbial fox guard the henhouse.
It's also worth mentioning that the FF trust store is regularly used as the trust store in various flavors of Linux.
And yes, there are things LE doesn't yet offer. They still don't do EV, SANs, or wildcards, do they?
> Websites that want to be designated as secure have to be certified by an outside organization, which will confirm their identity and vouch for their security. The certifying organization also helps secure the connection between an approved website and its users, promising the traffic will not be intercepted.
Certificate authorities do "help" secure the connection by providing trust: they "promise the traffic will not be intercepted" through an exploit in their certificate, and oftentimes they insure the connection to back up that trust (that's a lot of the difference in price between cheap and expensive certs; the expensive ones insure for more). You could say this helps secure the connection by providing enough assurance that people are willing to use the connection at all.
Welcome to the world of marketing and PR, where loose terminology, ambiguous phrasing, and a little imagination mean never having to say you were wrong...
Sometimes you will see a journalist say something that's just _technically_ wrong, in an "Um... actually" kind of way. Like saying there are 4 billion "Internet addresses". Yeah it's actually 2^32 and those are IPv4 addresses, and anyway some are in reserved classes we can never use and... We shouldn't care about these cases, it's fun to make a big deal as a _nerd_ but it's not a real problem.
But here they are misleading people as to the general purpose of a CA.
The CA doesn't in fact promise "traffic will not be intercepted" and how could they. The CA promises that their subscriber proved to them that they know a private key to which the corresponding public key is in the certificate, and that they control the things (usually FQDNs) named in the certificate.
Insurance, as another poster points out, is basically worthless. In fact it's worthless in multiple independent ways, it's deliberately engineered to _be_ worthless. For consumer insurance that would be illegal in many countries (taking people's money to "insure" against a risk that won't happen is prohibited in those countries), but they don't sell it to consumers, they sell it to the Certificate Authority and non-individuals are more free to make bad decisions since nobody important will get hurt.
Right, but surely you see how 90% of that statement is utterly incomprehensible to a general audience?
Part of a journalist's job is to find a way to summarise an inevitably complicated story _without_ in the process making it into a completely different story. "Bilbo the wizard stole a magical ring" is not a good summary of The Hobbit, because it is wrong. The wizard isn't named Bilbo and that isn't what that story is about even though maybe it's important for other reasons.
Certificate Authorities are organisations that validate a web site's "domain name" and issue certificates for validated names. The site uses a certificate to prove its name to your web browser when you visit an encrypted site.
This seems relatively easy to understand; most importantly, it's clear that the certificate is about naming, not some nebulous "security" or identity more broadly. Also valuable: it's clear that the CA plays no direct role in actually securing an HTTPS connection (very common for people not to understand that, even fairly technical people), and yet it's vague enough that I don't have to explain how public key technology works, which is a whole can of worms.
Sounds more like you're describing a domain registrar or NIC. And why is "domain name" in quotes?
Also, the insurances/warranties are a joke; the way they're written, there is pretty much no way anyone will ever be able to land a claim against them.
Kudos to Mozilla for keeping things open. Holding on to your values in cut-throat competitive times is fucking hard. Mad respect.
This story has been reported by multiple news sites before, and I don't see that Reuters adds anything that hasn't been reported before.
(Except if you mean the original Reuters / Project Raven story, which was indeed exceptional.)
FTA: "While Mozilla had been considering whether to grant DarkMatter the authority to certify websites as safe, two Mozilla executives said in an interview last week that Reuters’ report raised concerns about whether DarkMatter would abuse that authority."
Conflating "safe" or "secure" with "this website is who they say they are" may make it easier for a lay person to understand, but is wrong and creates a dangerous impression.
- Demonstrably involved in internet surveillance
- Doesn't understand the word "entropy", or intentionally interprets the guidelines in an obtuse manner in which the sequence 0, 1, 2, ... can be considered randomly generated with 64 bits of entropy.
- Likely in the business of lying (either they are lying or Reuters is thoroughly mistaken or lying)
How could whatever public benefit is gained by including these people as a CA possibly outweigh the risk? I'd think that any one of the above should disqualify them.
It also seems completely backwards that the root CAs publish certificates that give subCAs equivalent power while not having the subCAs go through the same level of scrutiny as root CAs.
 https://www.reuters.com/investigates/special-report/usa-spyi... - also whatever you think "defensive cyber security" means.
 This is after repeated prompting and consulting with engineers (hint for those not familiar with entropy: it's not possible to have more than x bits of entropy in an x-bit random number, and they are using a 63-bit number) https://groups.google.com/forum/#!msg/mozilla.dev.security.p...
 Kindly pointed out by Matt Palmer: https://groups.google.com/d/msg/mozilla.dev.security.policy/...
 Compare mentions of DarkMatter in the first link to their statement in the second link (one assumes that any reasonable definition of "defensive" does not involve spying on people in other nations)
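To spell out the entropy point with a toy calculation: the entropy of a uniform random choice is log2 of the number of equally likely outcomes, so an n-bit value can never carry more than n bits.

```python
import math

# A uniformly random n-bit value has exactly n bits of entropy.
full = math.log2(2 ** 64)     # all 2^64 bit patterns possible -> 64.0 bits
clamped = math.log2(2 ** 63)  # top bit forced to 0 halves the space -> 63.0 bits
print(full, clamped)
```

Clamping one bit halves the outcome space, which costs exactly one bit of entropy, no matter how the remaining bits were generated.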
AFAIK the logic is that since the root CA is fronting the risk (in case of any misissuance), there doesn't need to be any extra scrutiny on the part of browser vendors. Besides, if this were somehow outlawed, the root CA would simply provide an API that rubber-stamps whatever certificate is requested.
This logic doesn't make sense (though I think it is the logic)
- We audit CAs precisely because the cost of being caught isn't high enough to provide a reasonable deterrent.
- No one is suggesting removing DigiCert's root certs over this. The cost to the public is too high.
> the root CA will simply provide a API that rubber-stamps whatever certificate is requested.
- We audit CAs precisely to prevent them from doing things like this... Any CA caught doing this after it was forbidden isn't worth anyone's trust and could be immediately removed.
- Even that would be an improvement because they would presumably keep logs...
Because of this obsession with 'central authority', even local AJAX calls to 127.0.0.1 and localhost are blocked by Mozilla. What is even worse is that they are doing this intentionally, despite the standards explicitly not requiring it for localhost or 127.0.0.1, and are misleading users on bug reports and wasting their time.
Some have wasted days mucking about with CORS headers and SSL only to realize this is a Firefox-only issue. Even Chrome, which is not shy about furthering gatekeepers and their own interests, doesn't do this. Needless to say, this is not a problem on any other browser.
This is about a server accessing backend local services on the server or local private network via a SSL terminating reverse proxy etc.
> 127.0.0.1 and localhost are local addresses to the server, not the end user system.
If localhost is relative to the server, then a browser running on an end user device cannot access it due to unroutable IP in the first place, obviating the problem, which is not the case, seeing that the bug is open in Firefox.
> Private addresses cannot be accessed over the public internet.
That makes sense, the 127.0.0.0/8 space is routed only to localhost.
> a SSL terminating reverse proxy
If the proxy endpoint is on the localhost, it looks the same to the browser as any other local service before a connection attempt. If the proxy cannot certify that it matches the website (using a SSL cert?), then it's indistinguishable from any actual local service, and therefore should not be accessible by default.
EDIT: I genuinely do not understand what those sentences mean in the context of that discussion - they seem to be using terms not in a way I've always seen them used.
This works perfectly in Chrome and all other browsers. It's only Firefox that is a problem. So it's for Mozilla to explain why they are going against the standards and other browsers.
It would not be the only Web standard that acts against the users (WebRTC enumeration of IP addresses, canvas fingerprinting, 3rd-party cookies, many others come to mind).
One could make the case Firefox is trying to 'protect users' if they resisted or protested any of the user hostile standards you mention, but Firefox has implemented all of them.
It's just this case, and this is not 'user hostile'. If they are going their own way, there must be transparency, disclosure, and reasoning. But in the bug reports they are denying it.
When it comes to the bigger topic of standards, they are meant to increase interoperability. However, there are many ways to interoperability, and they can serve different masters. Seeing how standards are just codified behaviours, they are similar to law, and it's quite clear that law is not always good. Civil disobedience is widely recognized as a valid way to influence laws.
In this light, I would say breaking standards can be a form of civil disobedience, if the standard doesn't serve the regular, disenfranchised person.
I’ve used https://ziahamza.github.io/webui-aria2/ in the past to manage aria2c, with aria2c running on localhost (see https://github.com/ziahamza/webui-aria2/blob/master/README.m... for instructions) and the webui loaded directly from said address. That was a while ago and I don't know if any browsers still allow access to localhost by default, but at least some of them did last time I used that webui.