
Launching in 2015: A Certificate Authority to Encrypt the Entire Web - mariusz79
https://www.eff.org/deeplinks/2014/11/certificate-authority-encrypt-entire-web
======
digitalsushi
This certificate industry has been such a racket. It's never made explicit that
certificates and encryption solve two completely separate problems. They get
conflated, and non-technical users rightly get confused about which one is
trying to solve a problem they aren't sure why they have.

The certificate authorities are quite in love with the fact that self-signed
certificate errors are turning redder, bolder, and bigger. A self-signed
certificate warning means "Warning! The admin on the site you're connecting to
wants this conversation to be private but it hasn't been proven that he has
200 bucks for us to say he's cool".

But so what if he's cool? Yeah I like my banking website to be "cool" but for
200 bucks I can be just as "cool". A few years back the browsers started
putting extra bling on the URL bar if the coolness factor was high enough - if
a bank pays 10,000 bucks for a really cool verification, they get a giant
green pulsating URL badge. And they should, that means someone had to fax over
vials of blood with the governor's seal that it's a legitimate institute in
that state or province. But my little 200 dollar, not pulsating but still
green certificate means "yeah digitalsushi definitely had 200 bucks and a fax
machine, or at least was hostmaster@digitalsushi.com for damned sure".

And that is good enough for users. No errors? It's legit.

What's the difference between me coughing up 200 bucks to make that URL bar
green, and then bright red with klaxons cause I didn't cough up the 200 bucks
to be sure I am the owner of a personal domain? Like I said, a racket. The
certificate authorities love causing a panic. But don't tell me users are any
safer just 'cause I had 200 bucks. They're not.

The cert is just for warm and fuzzies. The encryption is to keep snoops out.
If I made a browser, I would have 200 dollar "hostmaster" verification be some
orange, cautious URL bar - "this person has a site that we have verified to
the laziest extent possible without getting sued for not even doing anything
at all". But then I probably wouldn't be getting any tips in my jar from the
CAs at the end of the day.

~~~
pilif
_> A self signed certificate warning means "Warning! The admin on the site
you're connecting to wants this conversation to be private but it hasn't been
proven that he has 200 bucks for us to say he's cool"_

no. It means "even though this connection is encrypted, there is no way to
tell you whether you are currently talking to that site or to NSA which is
forwarding all of your traffic to the site you're on".

Treating this as a grave error IMHO is right because by accepting the
connection over SSL, you state that the conversation between the user agent
and the server is meant to be private.

Unfortunately, there is no way to guarantee that to be true if the identity of
the server certificate can't somehow be tied to the identity of the server.

So when you accept the connection unencrypted, you tell the user agent "hey -
everything is ok here - I don't care about this conversation being private",
so no error message is shown.

But the moment you accept the connection over ssl, the user agent assumes the
connection to be intended to be private and failure to assert identity becomes
a terminal issue.

This doesn't mean that the CA way of doing things is the right way - far from
it. It's just the best that we currently have.

The solution is absolutely not to have browsers accept self-signed
certificates though. The solution is something nobody has quite come up with
yet.

~~~
Joeri
Self-signed certificates are still better than plain-text HTTP. I understand
not showing the padlock icon for self-signed certificates, but I don't
understand why you would warn people away from them when the worst case is
that they are just as unsafe as plain HTTP. IMHO this browser behavior is
completely nonsensical.

~~~
pilif
How would a browser know whether the self-signed certificate that was just
presented for www.mybank.com is intended to be self-signed (show no error, but
also show no padlock) or whether it's the result of a MITM attack because
www.mybank.com is supposed to present a properly signed certificate (show
error)?

How would you inform people going to www.mybank.com which is presenting a
self-signed cert in a way that a) they clearly notice but that b) doesn't
annoy you when you connect to www.myblog.com which also is presenting a self-
signed cert?

~~~
seryoiupfurds
If the user typed www.mybank.com, let the server redirect to https but don't
show the lock icon if it's self-signed. This is no worse than an impostor that
just doesn't redirect to https.

If the user typed [https://www.mybank.com](https://www.mybank.com), show the
usual warning for self-signed certificates.

~~~
blendo
This is EXACTLY what I want for my intranet sites. It lets me protect my users
from the wireshark in the next cubicle.

~~~
tacticus
The solution for this is to run your own CA internally and push out the cert
to all the machines. (If you have BYOD stuff it makes it a little harder, but
you could still have an internal CA signing only a certain subdomain and get
people to install it.)
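A rough sketch of that setup with the openssl CLI (all names and lifetimes here are invented for illustration):

```shell
# Create the internal root CA key and self-signed cert (10-year lifetime)
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout ca.key -out ca.crt -days 3650 \
    -subj "/CN=Example Corp Internal CA"

# Create a key and CSR for an internal host
openssl req -newkey rsa:2048 -nodes \
    -keyout intranet.key -out intranet.csr \
    -subj "/CN=intranet.corp.example"

# Sign the host's CSR with the internal CA
openssl x509 -req -in intranet.csr -CA ca.crt -CAkey ca.key \
    -CAcreateserial -out intranet.crt -days 825

# Verify the chain -- this is the check a browser that trusts ca.crt performs
openssl verify -CAfile ca.crt intranet.crt
```

Pushing ca.crt into every machine's trust store (e.g. via group policy or config management) is the part that makes this work without warnings.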

------
Karunamon
This is awesome! It looks like what CACert.org set out to be, except this
time, instead of developing the CA first and then seeking certification (which
has been a problem due to the insanely expensive audit process), the EFF got
the vendors on board first and then started doing the nuts and bolts.

This is _huge_ if it takes off. _The CA PKI will no longer be a scam!!_

I'd trust the EFF/Mozilla over a random for profit "security corporation" like
VeriSign any day of the week and twice on Sunday to be good stewards of the
infrastructure.

~~~
tatterdemalion
I don't see how this actually keeps the CA PKI from being a scam. While I
personally trust the EFF & Mozilla right now, as long as I can't meaningfully
revoke that trust, it's not really trust and the system is still broken.

~~~
mike_hearn
You can revoke your trust in any CA at any time, you don't even need to see
any errors! Just click the little padlock each time you visit a secure website
and see if the CA is in your good books. If it's not, pretend the padlock
isn't there!

OK, that's a little awkward. A browser extension could automate this. But in
practice, nobody wants to do this, because hardly anyone has opinions on
particular CAs. It's a sort of meta-opinion - some people feel strongly they
should be able to feel strongly about CAs, but hardly anyone actually does. So
nobody uses such browser extensions.

~~~
desdiv
Can't you just delete the CA from the browser?

On Firefox it's preferences -> advanced -> certificates -> view certificates.

~~~
technomancy
I'm curious as to whether Firefox's sync functionality propagates CA overrides
across machines. If not, then this is something you'd have to repeat for every
machine you use, making it effectively too tedious to be practical.

~~~
desdiv
It doesn't yet, unfortunately. There's a related feature request for syncing
user added certificates:

[https://bugzilla.mozilla.org/show_bug.cgi?id=583935](https://bugzilla.mozilla.org/show_bug.cgi?id=583935)

But syncing which certificates to delete is probably a much harder sell.

At least there's a way to do it programmatically:

        apt-get install libnss3-tools
        certutil -d /home/$USER/.mozilla/firefox/$FIREFOX_PROFILE -D -n "$TARGET_CA_NAME"

------
Animats
The EFF has a bad track record in this area. The last time they tried
something to identify web sites, it was TRUSTe, a nonprofit set up by the EFF
and headed by EFF's director. Then TRUSTe was spun off as a for-profit private
company, reduced their standards, stopped publishing enforcement actions, and
became a scam operation. The Federal Trade Commission just fined them: "TRUSTe
Settles FTC Charges it Deceived Consumers Through Its Privacy Seal Program
Company Failed to Conduct Annual Recertifications, Facilitated
Misrepresentation as Non-Profit" ([http://www.ftc.gov/news-events/press-
releases/2014/11/truste...](http://www.ftc.gov/news-events/press-
releases/2014/11/truste-settles-ftc-charges-it-deceived-consumers-through-
its)) So an EFF-based scheme for a new trusted nonprofit has to be viewed
sceptically.

This new SSL scheme is mostly security theater. There's no particular reason
to encrypt traffic to most web pages. Anyone with access to the connection can
tell what site you're talking to. If it's public static content, what is SSL
protecting? Unless there's a login mechanism and non-public pages, SSL isn't
protecting much.

The downside of SSL everywhere is _weak_ SSL everywhere. Cloudflare sells
security theater encryption now. All their offerings involve Cloudflare acting
as a man-in-the-middle, with everything decrypted at Cloudflare. (Cloudflare's
CEO is fighting interception demands in court and in the press, which
indicates they get such requests. Cloudflare is honest about what they're
doing; the certificates they use say "Cloudflare, Inc.", so they identify
themselves as a man-in-the-middle. They're not bad guys.)

If you try to encrypt everything, the high-volume cacheable stuff that doesn't
need security but does need a big content delivery network (think Flickr) has
to be encrypted. So the content-delivery network needs to impersonate the end
site and becomes a point of attack. There are known attacks on CDNs; anybody
using multi-domain SSL certs with unrelated domains (36,000 Cloudflare sites
alone) is vulnerable if any site on the cert can be broken into. If the site's
logins go through the same mechanism, security is weaker than if only the
important pages were encrypted.

You're better off having a small secure site like "secure.example.com" for
checkout and payment, preferably with an Extended Validation SSL certificate,
a unique IP address, and a dedicated server. There's no reason to encrypt your
public product catalog pages. Leave them on "example.com" unencrypted.

~~~
AlyssaRowan
Regarding your first paragraph, I agree: all CAs need continuing scrutiny.
Certificate Transparency, for example.

Regarding the rest of your post, however, I'm calling bullshit. You give very
bad advice. Deploy TLS _on every website_. Deploy HTTP Strict-Transport-
Security wherever you can.

The sites people visit are confidential, and yes, are not protected enough at
the moment. (That will eventually improve, piece by piece.) That's absolutely
no excuse at all for you not protecting data about the _pages_ they're on or
the specific things they're looking at, even if your site is static, or not
protecting the _integrity_ of your site. You have no excuse for that. Go do
it.

Your other big problem is thinking that _anything_ on your domain "doesn't
need security"! Yes it does - unless you actually _desire_ your website to be
co-opted for use in malware planting by Nation-State Adversaries with access
to Hacking Team(s) (~cough~) - or the insecure parts of your website being
injected by a middleman with malicious JavaScript or someone else's "secure"
login page that's http: with a lock favicon. (I have seen this in the wild,
yes.) If you've deployed a site with that bad advice, it could be exploited
like that today: go back and encrypt it _properly_ before someone hacks your
customers. This is why HSTS exists. _Use it_.
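For anyone wondering what that deployment looks like concretely, here is a minimal nginx-style sketch (the domain, cert paths, and max-age are placeholders, not from the thread):

```nginx
# Redirect all plain HTTP to HTTPS
server {
    listen 80;
    server_name example.com;
    return 301 https://$host$request_uri;
}

# Serve HTTPS and tell browsers never to come back over plain HTTP
server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/ssl/example.com.crt;
    ssl_certificate_key /etc/ssl/example.com.key;

    # HSTS: remember for one year, cover subdomains too
    add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
}
```

Once a browser has seen the HSTS header, it refuses to talk plain HTTP to that host until max-age expires, which is what defeats the injected-login-page attack described above.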

Regarding your CDN point, kindly cite - or demonstrate - your working "known
attack" against Cloudflare's deployment?

~~~
Animats
_kindly cite_

Black Hat 2009, "Why TLS Keeps Failing to Protect", Moxy Marlinspike, slide
42:
[https://www.blackhat.com/docs/us-14/materials/us-14-Delignat...](https://www.blackhat.com/docs/us-14/materials/us-14-Delignat-
The-BEAST-Wins-Again-Why-TLS-Keeps-Failing-To-Protect-HTTP.pdf)

Basic concept: 1) find target site A with shared SSL cert. Cloudflare gets
shared SSL certs with 50+ unrelated domains. 2) find vulnerable server B in a
domain on same cert. (Probably Wordpress.) 3) attack server B, inserting fake
copy of important pages on site A with attack on client or password/credit
card interception. 4) use DNS poisoning attack to redirect A to B.

All it takes is one vulnerable site out of the 50+ on the same cert.

The whole shared-cert thing is a workaround for Windows XP. Cloudflare does it
because they're still trying to support IE6 on Windows XP, which doesn't speak
Server Name Indication, and they don't have enough IPv4 addresses to have one
per customer.

~~~
AlyssaRowan
Wrong.

5) Cloudflare's sni??????.cloudflaressl.com presents an error to you because
the Host: header is either missing, doesn't match the SNI, or otherwise,
serves the _correct_ site to you instead of your phishing page.

You obviously haven't tested this. And it's Mox _ie_.

Vhost-confusion is a relevant attack on TLS with non-HTTP protocols, HTTP/0.9
and sites which serve a default domain to clients with no Host: headers.
Cloudflare quite specifically does none of these, and is not vulnerable in its
deployment - it needs the Host: header to know which site you want it to
select with its reverse-proxy, and you can't poison that because it's
protected by TLS to Cloudflare.

Also you, the attacker, don't have the cert.

If you can DNS poison away from Cloudflare, please report it to their security
team, but you'll find they're looking at deploying DNSSEC soon.

------
lambada
Looking at the spec [0] I'm concerned about the section on 'Recovery Tokens'.

"A recovery token is a fallback authentication mechanism. In the event that a
client loses all other state, including authorized key pairs and key pairs
bound to certificates, the client can use the recovery token to prove that it
was previously authorized for the identifier in question.

This mechanism is necessary because once an ACME server has issued an
Authorization Key for a given identifier, that identifier enters a higher-
security state, at least with respect the ACME server. That state exists to
protect against attacks such as DNS hijacking and router compromise which tend
to inherently defeat all forms of Domain Validation. So once a domain has
begun using ACME, new DV-only authorization will not be performed without
proof of continuity via possession of an Authorized Private Key or potentially
a Subject Private Key for that domain."

Does that mean, if for instance, someone used an ACME server to issue a
certificate for that domain in the past, but then the domain registration
expired, and someone else legitimately bought the domain later, they would be
unable to use that ACME server for issuing an SSL certificate?

[0] [https://github.com/letsencrypt/acme-
spec/blob/master/draft-b...](https://github.com/letsencrypt/acme-
spec/blob/master/draft-barnes-acme.md)

~~~
pde3
This is a question about the policy layer of the CA using the ACME protocol.

The previous issuing CA should have revoked the cert they issued when the
domain was transferred. But a CA speaking the ACME protocol might choose to
look at whois and DNS for additional information to decide whether it issues
different challenges in response to a certification request.

------
lowglow
Free CA? This is cool. Why this wasn't done a long time ago is beyond me.
(Also please support wildcard certs)

An interesting thing happened at a meet-up at Square last year. Someone from
Google's security team came out and demonstrated what Google does to notify a
user that a page has been compromised or is a known malicious attack site.

During the presentation she was chatting about how people don't really pay
attention to the certificate problems a site has, and how they were trying to
change that through alerts/notifications.

Afterwards, someone asked: if Google cared so much about security, why didn't
they just become a CA and sign certs for everyone? She didn't answer the
question, so I'm not sure if that means they don't want to, or they are
planning to.

What privacy concerns should we have if someone like goog were to sign the
certs? What happens if a CA is compromised?

~~~
mike_hearn
It wasn't done a long time ago because running a CA costs money (which is why
they charge for certificates), so whoever signs up to run one is signing up
for a money sink with no prospect of direct ROI, potentially for a loooooong
time. This new CA is to be run by a non-profit that uses corporate sponsorship
rather than being supported by the market; whether that's actually a better
model in the long run is I suppose an open question. But lots of other bits of
internet infrastructure are funded this way, so perhaps it's no big deal.

There aren't a whole lot of privacy concerns with CAs as long as you use OCSP
stapling, so users' browsers aren't hitting up the CA each time they visit a
website (Chrome never does this, but other browsers can do).
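As a concrete illustration, OCSP stapling is a server-side setting; a hypothetical nginx fragment (paths assumed) might look like:

```nginx
# OCSP stapling: the server fetches its own revocation status from the
# CA's OCSP responder and staples the signed response into the TLS
# handshake, so clients never contact the CA (and leak browsing history).
ssl_stapling on;
ssl_stapling_verify on;
ssl_trusted_certificate /etc/ssl/chain.pem;  # chain used to verify the OCSP response
resolver 1.1.1.1;  # so nginx can resolve the responder's hostname
```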

Re: CA compromise. One reason running a CA costs money is that the root store
policies imposed by the CA/Browser Forum require (I think!) the usage of a
hardware security module which holds the signing keys. This means a
compromised CA could issue a bunch of certs for as long as the compromise is
active, but in theory it should be hard or impossible to steal the key. Once
the hackers are booted out of the CA's network, it goes back to being secure.
Of course quite some damage can be done during this time, and that's what
things like Certificate Transparency are meant to mitigate - they let everyone
see what CAs are doing.

~~~
hdra
I'm curious. What's the biggest cost in running a CA? As in, what makes those
certs so expensive?

~~~
JakeSc
Ensuring physical security of CA private keys is expensive. This requires
things like sturdy padlocks, closed-circuit security cameras, and up-to-date
hardware and software.

These are the things you pay for when you buy a certificate from a CA. In
fact, I would be 100% opposed to obtaining my website's cert from a CA if it
were free-of-charge, because I know good physical security is expensive.
However, I already trust the EFF and the Umich researchers (and their
assurances of physical security), so I'm absolutely happy with obtaining a
free certificate from them.

------
teamhappy
I couldn't be happier about the news; the EFF and Mozilla have always had a
special place in my heart. However, the fact that we have to wait for our free
certificates until the accompanying command line tool is ready for prime time
seems unnecessary. Another thing I'm interested in is whether they provide
_advanced_ features like wildcard certificates. This is usually the kind of
thing CAs charge somewhat significant amounts of money for.

~~~
schoen
The thing that's causing the delay is _not_ the client software development,
it's the need to create the CA infrastructure and then perform a WebTrust
audit. If we were ready on the CA side to begin issuing certificates today, we
would be issuing them today.

~~~
teamhappy
I think I may have misunderstood all of you. Is the audit process itself
really _that_ time consuming? I can imagine the amounts of bureaucracy
involved, but I can't imagine this takes much longer than, say, a month or so.
Most of the time is probably spent waiting for someone or something, right? I
mean, we're talking about very capable people here who have done this kind of
thing before.

~~~
lucaspiller
You are lucky to not have had to deal with corporate bureaucracy - these
things take time :-) At work I'm integrating an API for a mobile operator;
it's apparently working and ready to be used, however I've been waiting a
couple of months to get all the documentation and everything set up.

Even once they have the CA it needs to be added to browsers which will take
time. Taking into account release cycles of embedded devices (read phones
where the manufacturer hasn't released an update), summer 2015 seems rather
optimistic.

~~~
schoen
The CA will be cross-signed, so it does not need to be added to browsers right
away in order to be accepted. It will be treated as an intermediate cert, not
a root cert, by all mainstream browsers at the outset.

But there is a lot of paperwork to be done, and a lot of engineering to be
done, and a lot of things to buy and people to hire, in order to get a CA
operating.

------
mangeletti
So, one CA to rule them all?

There's a scenario (simplified for illustration, but entirely possible) that's
normally not a huge risk because there are many CAs, and they are private,
for-profit companies that have an economic incentive to protect you and your
certificate's ability to assure end users that a conversation's privacy won't
be compromised.

1) browser requests site via SSL

2) MITM says, "let's chat - here's my cert"

3) browser asks, "is this cert legit for this domain?"

4) MITM says, "yes, CA gave us this, because of FISA, to give to you as proof"

5) browser says, "ok, let's chat"

I'm not trying to spread FUD, but if you're NSA and you've been asking CAs for
their master keys for years, doesn't a single CA sound great (free and easy ==
market consolidation), and doesn't EFF seem like the perfect vector for a
Trojan horse like this, given its popularity and trust among hacker types
gained in recent years?

~~~
schoen
We will look for ways to mitigate the risk of misissuing for any reason,
including because someone tries to coerce us to misissue. One approach to this
that's interesting is Certificate Transparency.

[http://www.certificate-transparency.org/](http://www.certificate-
transparency.org/)

There's also HPKP, TACK, and DANE, plus the prospect of having more
distributed cert scans producing databases of all the publicly visible certs
that people are encountering on the web.

~~~
Tepix
DANE is the way to go forward. Have your TLD CA sign your domain key and sign
your web certificates with your own key.

Only one "root CA" to trust per TLD, and it's free if you own a TLD that
supports DNSSEC (most do these days).

Now we just need the DANE check built into the browser without any plugins
that require installation.
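For reference, a DANE association is published as a TLSA record in a DNSSEC-signed zone; a hypothetical record (the digest is a placeholder) might look like:

```
; Fields: 3 = DANE-EE (trust this end-entity key directly),
;         1 = match the SubjectPublicKeyInfo, 1 = SHA-256 digest
_443._tcp.www.example.com. IN TLSA 3 1 1 ( <hex-encoded SHA-256 of the server's public key> )
```

A DANE-validating client looks this record up over DNSSEC and compares the digest against the key in the certificate the server actually presents, with no third-party CA involved.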

------
overshard
The "How It Works" page,
[https://letsencrypt.org/howitworks/](https://letsencrypt.org/howitworks/),
has me a bit worried. Anytime I see a __magic__ solution that has you running
a single command to solve all your problems I immediately become suspicious at
how much thought went into the actual issue.

If I'm running a single web app on a single Ubuntu server using Apache then
I'm set! If I'm running multiple web apps across multiple servers using a load
balancer, nginx on FreeBSD then...

All the same I'm really looking forward to this coming out, it can be nothing
but good that all of these companies are backing this new solution and I'm
sure it'll expand and handle these issues as long as a good team is behind it.

~~~
general_failure
I don't get why they are releasing a command line tool, instead of just giving
us a cert that we can install by ourselves.

~~~
erickt
That wouldn't be safe, because then they would have access to your private key
and could impersonate you. Having you (indirectly via their script) generate
the key and submit the public key for signing means your private key never
leaves the premises.
~~~
jackalope
There is no reason for the CA to ever see the private key. All they need is a
CSR. This approach is fundamentally broken.
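The CSR flow jackalope describes can be sketched with the openssl CLI (filenames and the CN are placeholders):

```shell
# Generate the private key and a CSR locally. Only the CSR -- which
# contains the public key plus identifying info, self-signed with the
# private key -- is ever sent to the CA. The private key stays put.
openssl req -newkey rsa:2048 -nodes \
    -keyout example.key -out example.csr \
    -subj "/CN=example.com"

# Sanity-check the CSR's self-signature before submitting it
openssl req -in example.csr -noout -verify
```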

------
xxdesmus
Who will handle abuse complaints and revocations of known bad actors? I'd be
curious to see whose abuse department will be handling those issues.

~~~
belorn
What kind of abuse were you thinking about? If the domain is hijacked, you
simply repossess the domain and request a new certificate, and the old one is
revoked.

~~~
xxdesmus
As in, revoking a cert for a known C&C box, or a confirmed spammer, confirmed
box serving an exploitkit, confirmed phishing domain (such as my-apple-ikloud-
verify.foo)

Basically, my assumption is they won't want to be providing certs to known bad
actors. So I'm curious who is going to own the abuse handling for the CA.

~~~
belorn
Those issues are in theory handled by taking down the people who commit them,
and in practice by taking down the domain names, since those normally have
been registered using false credentials. One can hope/assume that this system
will automatically revoke certificates for domains that expire or get removed.

------
tatterdemalion
This seems like a really great step toward an HTTPS web. It will be an
immediately deployable solution that can hopefully make TLS encryption normal
and expected.

However, it doesn't do anything about the very serious problems with the CA
system, which is fundamentally unsound because it requires trust and end users
do not meaningfully have the authority to revoke that trust. And there's a
bigger problem: if EFF's CA becomes the standard CA, there is now another
single point of failure for a huge portion of the web. While I personally have
a strong faith in the EFF, in the long term I shouldn't have to.

~~~
orthecreedence
Agreed. For all the hoopla, this is basically just like any other CA (but
free). Until we have a truly distributed (namecoin-esque) and accepted CA
structure, signed certificates may as well be pipes directly to the NSA.

That said, not having to pay some jerk for sending me an email and having me
enter a code is _really nice_. The current CA system is a pitiful excuse for
identity verification, and not having to pay for it will be nice.

------
byuu
Here's my current issue with moving to TLS: library support.

I do a lot of custom stuff and want to run my own server. I can set up and run
the server in maybe 50-100 lines of code, and it works great.

I know, I should conform and use Apache/nginx/OpenSSL like everyone else.
Because they're so much more secure, right? By using professional code like
the aforementioned, you won't get exposed to exploits like Heartbleed,
Shellshock, etc.

But me, being the stubborn one I am, I want to just code up a site. I can open
up a socket, parse a few text lines, and voila. Web server. Now I want to add
TLS and what are my options?

OpenSSL, crazy API, issues like Heartbleed.

libtls from LibreSSL, amazing API, not packaged for anything but OpenBSD yet.
Little to no real world testing.

Mozilla NSS or GnuTLS, awful APIs, everyone seems to recommend against them.

Obscure software I've never heard of: PolarSSL, MatrixSSL. They may be good,
but I'm uneasy since I don't know anything about them. And I have to hope they
play nicely with all my environments (Clang on OS X, Visual C++ on Windows,
GCC on Linux and BSD) and package managers.

Write my own. Hahah. Hahahahahahahahah. Yeah. All I have to do is implement
AES, Camellia, DES, RC4, RC5, Triple DES, XTEA, Blowfish, MD5, MD2, MD4,
SHA-1, SHA-2, RSA, Diffie-Hellman key exchange, Elliptic curve cryptography
(ECC), Elliptic curve Diffie–Hellman (ECDH), Elliptic Curve DSA (ECDSA); and
all with absolutely no errors (and this is critical!), and I'm good to go!

I'm not saying encryption should be a breeze, but come on. I want this in
<socket.h> and available anywhere. I want to be able to ask for
socket(AF_INET, SOCK_STREAMTLS, 0), call setsockcert(certdata, certsize) and
be ready to go.

Everything we do in computer science is always about raising the bar in terms
of complexity. Writing software requires larger and larger teams, and
increasingly there's the attitude that "you can't _possibly_ do that yourself,
so don't even try." It's in writing operating systems, writing device drivers,
writing web browsers, writing crypto software, etc.

I didn't get into programming to glue other people's code together. I want to
learn how things work and write them myself. For once in this world, I'd love
it if we could work on _reducing_ complexity instead of adding to it.

~~~
IgorPartola
Wow, of all the arguments I could think of against the current CA/TLS/HTTPS
situation, a hobbyist deciding to write their own web server would not be one
of them... Yes, you should just conform and stop doing this. Or at the very
least you could let another process do TLS termination and just handle HTTP,
if you really want to create your own off-by-one remote code execution errors
instead of using the ones supplied by Apache et al.

~~~
byuu
> a hobbyist deciding to write their own web server would not be one of them

nginx started out as a hobby project by Igor Sysoev. Maybe he should have just
used Apache too?

> Or at the very least you could let another process to TLS termination and
> just handle HTTP

A well-designed HTTPS->HTTP proxy package could work. Install proxy, and
requests to it on 443 fetch localhost:80 (which you could firewall off
externally if you wanted) and feed it back as HTTPS. Definitely not optimal,
especially if it ends up eating a lot of RAM or limiting active connections,
but it would be a quick-and-dirty method that would work for smaller sites.
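A minimal sketch of such a terminating proxy as an nginx config (cert paths, ports, and the server name are assumed, not from the thread):

```nginx
server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/ssl/site.crt;
    ssl_certificate_key /etc/ssl/site.key;

    location / {
        # Hand the decrypted request to the hand-rolled HTTP server on
        # port 80, which can be firewalled off from the outside world.
        proxy_pass http://127.0.0.1:80;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```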

But it won't handle other uses of TLS, such as if you wanted to use
smtp.gmail.com, which requires STARTTLS. Or maybe you want to write an
application that uses a new custom protocol, and want to encrypt that.

If you put this stuff into libc, and get it ISO standardized and simplified,
and have it present out of the box with your compilers on each OS, then you'll
open the door for developers to more easily take advantage of TLS encryption
everywhere.

Look at the _core_ API for GnuTLS:
[http://www.gnutls.org/manual/html_node/Core-TLS-
API.html](http://www.gnutls.org/manual/html_node/Core-TLS-API.html)

This is just _insane_. It would take an average developer _months_ to fully
understand that API.

~~~
IgorPartola
FWIW, Igor did code up nginx to support HTTPS, despite terrible SSL libraries
:)

I don't really understand the problem you are having. If your sites are small
personal/side projects, why worry about things like your web server? That
stuff is so trivial, it's boring. If your sites are so large that the overhead
that HTTPS has over HTTP makes that much of a difference (pretty sure that'd
be Google, Facebook, Twitter, and nobody else), then why use your own server
implementation which you must know contains more bugs than something like
nginx which is already blazing fast. All of these things are a solved problem,
there is no reason to solve them again unless you are explicitly developing a
web server, an email relay, etc. If so, that's awesome, but in 2014 if you
develop a web server that doesn't work with HTTPS, it's pretty much dead on
arrival.

Having said that, check out [http://www.openbsd.org/cgi-bin/man.cgi/OpenBSD-
current/man3/...](http://www.openbsd.org/cgi-bin/man.cgi/OpenBSD-
current/man3/tls_client.3?query=tls_init&sec=3). This sounds like exactly what
you need.

~~~
byuu
> why worry about things like your web server?

I want to do different things that haven't really been done before.

I want to write my site in C++ instead of PHP. In my own case, it's not as
much about speed (that's a free benefit), it's more about the language: I
favor the strong typing, compile-time checking and stronger inheritance model.
What I envision is that I have C++ source files. I upload one, the server sees
there is no binary for that page, and invokes Clang to build it when the page
is accessed, and caches the binary for future use. Once I upload the source
again, and the modified timestamp is newer than the binary, it rebuilds the
cached binary.

I also have a lot of parsing built-in to the server itself to do things like
Markdown/Stylus, but optimized for what I need (my syntax generates automatic
anchors and TOCs for sections, great for documentation.)

I also plan to code up a forum so that I don't have to run phpBB, and thus
don't have to run PHP and Apache.

I can also test my server on my local desktop without having to install
PHP+Apache. I just open my server binary, and go to localhost in my browser.

> in 2014 if you develop a web server that doesn't work with HTTPS, it's
> pretty much dead on arrival.

Exactly! That's why I want to support TLS. Yet they've made that as difficult
as is humanly possible, it seems.

> This sounds like exactly what you need.

Yes, libtls is exactly what I need, as I mentioned before. Its limitations are
that it's not available in my package manager, and hasn't had a lot of real-
world testing yet (though I trust the OpenBSD team.)

In 6-12 months once this is easily usable on Windows, OS X, Linux and FreeBSD,
I'll definitely be giving it a shot.

But isn't it sad that it's 2014 and a library with this level of simplicity
only just came out 2-3 weeks ago?

~~~
IgorPartola
Got it. Well, while I can't see why you'd want C++ of all languages for this
(Haskell seems like what you are really looking for), Apache might actually be
your friend here. At an old $WORK we had a number of web applications written
in C++ and C (yup!), as Apache modules. This way Apache does most of the
boring stuff such as header parsing, routing requests, config files, etc., and
you do just the functional parts of your site/app. Then again, this was one of
the slowest and most error-prone ways to code this up, and the guys working on
this were full time C/C++ devs.

------
higherpurpose
> Let's Encrypt will be overseen by the Internet Security Research Group
> (ISRG), a California public benefit corporation. ISRG will work with
> Mozilla, Cisco Systems Inc., Akamai, EFF, and others to build the much-
> needed infrastructure for the project and the 2015 launch

What's Cisco's role in this? I'm quite worried about that. It has been
reported multiple times that Cisco's routers have NSA backdoors in them, from
multiple angles (from TAO intercepting the routers to law enforcement having
access to "legal intercept" in them).

So I hope they are not securing their certificates with Cisco's routers...

~~~
jauer
Lawful Intercept isn't a blanket government back door per se. It's a
featureset that allows the operator to configure what is effectively a remote
packet capture endpoint. That endpoint is disabled by default and requires
operator configuration to be enabled.

It just happens that every ISP/telco in the US needs this capability to comply
with CALEA, so it's manufacturers responding to market forces. Juniper supports
it, a Latvian router manufacturer supports it
([http://wiki.mikrotik.com/wiki/CALEA](http://wiki.mikrotik.com/wiki/CALEA)),
there's even open source code to do it
([https://code.google.com/p/opencalea/](https://code.google.com/p/opencalea/))
if you're building your own routers.

There's a place to focus your ire over wiretapping. The manufacturers aren't
it.

------
vbezhenar
We have the DNS system in place, which should be enough to establish trust
between a browser and an SSL public key. E.g., a site could store its
self-signed certificate's fingerprint in a DNS record, and the browser should
be fine with that. If the DNS system is spoofed, the user is in a bad place
anyway, so DNS must be secured in any case.

~~~
spindritf
No. A proper certificate protects against a malicious DNS resolver.

What you're talking about is being introduced alongside DNSSEC, and it's
called DANE.

[https://en.wikipedia.org/wiki/DNS-
based_Authentication_of_Na...](https://en.wikipedia.org/wiki/DNS-
based_Authentication_of_Named_Entities)
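
Mechanically, DANE comes down to this: the domain owner publishes a digest of
the certificate (or public key) in a DNSSEC-signed TLSA record, and the client
hashes what the server presented and compares. A toy sketch of the comparison
step, assuming the DNS lookup itself happens elsewhere:

    
    
      import hashlib
      
      def fingerprint_matches(cert_der, published_digest_hex):
          """Compare the SHA-256 digest of the server's DER-encoded certificate
          against the hex digest published in DNS (TLSA matching type 1)."""
          actual = hashlib.sha256(cert_der).hexdigest()
          return actual == published_digest_hex.lower()
    

The scheme's security reduces entirely to the DNS answer being authentic,
which is why it only makes sense alongside DNSSEC.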

------
peterwwillis
Two things:

1\. I really hope this is hosted in a non-FVEY territory.

2\. Why can't we set a date (say, 5 years?) when all browsers default to
https, or some other encrypted protocol, and force you to type
"[http://"](http://") to access old, unencrypted servers?

------
mike-cardwell
Glad I don't work for a CA right now.

------
fsiefken
Will these certificates work with Internet Explorer and Chrome?

~~~
danielweber
You can add your own CA to browsers.

~~~
Someone1234
You can and I can but 99.999% of normal users cannot and will not.

------
justcommenting
from the ACME spec, it looks like proof of ownership is provided via[0]:

>Put a CA-provided challenge at a specific place on the web server

or

> Put a CA-provided challenge at a DNS location corresponding to the target
> domain.

Since the server will presumably be plaintext at that point and DNS is UDP,
couldn't an attacker like the NSA just MITM the proof-of-site-ownership
functionality of Let's Encrypt to capture ownership at TOFU and then silently
re-use it, e.g. via Akamai's infrastructure?

[0] [https://github.com/letsencrypt/acme-
spec/blob/master/draft-b...](https://github.com/letsencrypt/acme-
spec/blob/master/draft-barnes-acme.html)
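
For reference, the proof-of-possession round trip in the draft is roughly:
the CA hands the applicant a random token, the applicant publishes it at an
agreed location, and the CA fetches it back. A minimal sketch of that check
(function names are illustrative, not the draft's wire format):

    
    
      import secrets
      
      def make_challenge():
          """CA side: generate an unguessable token for the applicant to publish."""
          return secrets.token_urlsafe(32)
      
      def verify_challenge(expected_token, fetched_body):
          """CA side: compare what was fetched over plain HTTP (or DNS) with the
          issued token. This plaintext fetch is the step a network-position
          attacker could try to spoof."""
          return fetched_body.strip() == expected_token
    

The token is unguessable, but the fetch is not authenticated, which is exactly
the TOFU window the parent comment is worried about.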

~~~
schoen
Four things:

(1) You can do the attack you describe _today_ with existing CAs that are
issuing DV certs because posting a file on the web server is an existing DV
validation method that's in routine use.

(2) There is another validation method we've developed called dvsni which is
stronger in some respects (but yes, it still trusts DNS).

(3) We're expecting to do multipath testing of the proof of site ownership to
make MITM attacks harder. (But as with much existing DV in general, someone
who can completely compromise DNS can cause misissuance.)

(4) If the community finds solutions that make any step of this process
stronger, Let's Encrypt will presumably adopt them.

~~~
justcommenting
I agree completely, and it's worth noting that I don't have a solution to the
issues I mentioned, either.

Leveraging other (potentially insecure) paths to establish trust might help
further enhance confidence in authenticity; e.g., verification using something
like the broad-based strategy of Moxie's Perspectives (except via plaintext),
or additional verification of the plaintext site as fetched via Tor, or
retrieving a cached copy of the site securely from the Internet Archive or
search engines.

dvsni and multipath testing sound quite interesting, and I think defense in
depth is the right approach.

Having been at Akamai's recent Edge conference, I didn't hear much from them
on this. Does anyone have any additional details of their interest in the
project?

~~~
brians
It was quiet, and indeed uncertain, at that point. For myself, I'm extremely
excited about the "Let's Encrypt" project's opportunities for experimentation:
bringing the marginal cost of certificates to zero should have great effects
on Web and Mail services, but should also have something to say about S/MIME
and other client-cert uses.

------
ademarre
This is great news! I'd also like to see a push for technologies like DANE
(and necessarily DNSSEC) which address the flawed CA trust model.

While we're at it, let's get a non-profit domain registrar going.

~~~
kayone
> non-profit domain registrar

Domain squatters are already an issue. Imagine if you could register domains
for free. I think having to pay $10 a year is pretty fair. That's one reason I
don't mind paying ~$70 for a .io domain: it keeps most squatters away.

~~~
ademarre
You misunderstand. Domains must not be free, and domain cost isn't the
problem. Nonprofit registrar != free domains.

The problem is the horrible user experience of registrars like Godaddy. I'd
rather give my money to a nonprofit that isn't confusing non-technical website
owners into buying products they don't need.

The registrar landscape is better now with Gandi, but still I'd rather pay a
fully transparent nonprofit registrar if one existed.

~~~
kayone
Sorry, I did misunderstand. Completely agree.

------
tmmm
Won't people need to have the LetsEncrypt CA certificate installed on their
computers to avoid that red "incorrect certificate" SSL warning? Other than
that, this is awesome.

~~~
joshmoz
IdenTrust will be cross-signing our roots while we apply to root programs.

~~~
diafygi
Thanks for the clarification! You might want to add that point to your
technical how-it-works section[1]. I was wondering how older browsers would
accept a new CA's signature.

Also, I really wish AOL would have donated their root certs to y'all[2] so you
didn't have to set up a whole new CA.

[1]:
[https://letsencrypt.org/howitworks/technology/](https://letsencrypt.org/howitworks/technology/)

[2]: [https://moderncrypto.org/mail-
archive/messaging/2014/000618....](https://moderncrypto.org/mail-
archive/messaging/2014/000618.html)

~~~
iancarroll
I don't know why AOL keeps being brought up, but it's highly unlikely they
would do this. For one, it's probably used internally for smart cards/SMIME.
Secondly, it'd be very hard to get AOL to spend _money_ on doing something for
free. Moving a CA to a different company is no small feat, operationally...

~~~
anonbanker
AOL bought Netscape, and incubated the Mozilla project while Netscape was
still alive. They've spent a lot of money on doing things for free.

~~~
wnevets
how long ago was that?

~~~
anonbanker
feel free to let me know why that matters.

------
balabaster
How does a CA that's formed by a conglomerate of U.S. companies (under the
jurisdiction of the NSA) make us any safer than we are currently? It doesn't.
The chain of trust chains up all the way to a U.S. company, which can be
coerced into giving up the certificate and compromising the security of the
entire chain. I'm on the side of the EFF trying to encrypt the web, but this
is not the solution.

~~~
anonbanker
Truth be told, it doesn't make anyone safer. It's a big fat placebo,
especially once the NSA realizes that this project is entirely under their
jurisdiction.

Now, if there was a project in Iceland or Seychelles that was doing something
similar, I would be much more apt to participate.

~~~
balabaster
Security theatre for the win(?) Do these people [EFF] not realize that the
people they're trying to win over are network nerds? These are people that
actually understand this shit and the repercussions of it.

I can't profess to understanding all the details of encryption infrastructure,
but I learned very quickly in kindergarten, you can't trust anyone you don't
know. It doesn't matter who they are, who they know or what they know. Half
the time, you can't even trust "cold hard facts", the facts are frequently
misinterpreted, fabricated or eventually proven to be wrong - once it was a
fact that the earth was flat, then we were the centre of the universe, now the
universe as we know it is held together by a God particle. Science claims
facts that invalidate there being a God... all facts are a matter of our
fallible understanding of this scientific instrument we are building. Even
people you do trust can be coerced into doing things that compromise your
ability to trust them or their motives.

If you want to automate trust, then you're eventually going to have to realize
that you can't. All you can do is mitigate the cost of being wrong.

Absolute power corrupts absolutely - the CA (or whoever controls that CA) has
absolute power in this scenario. If you have the director's family hostage,
everyone else's security just went down the pan.

Chain of trust is like putting all your eggs in one basket. You just don't do
it. Web of trust is a marginal step up, but it's more of a pain in the ass and
can also be overcome by a group with malicious intent.

------
ilaksh
So this means that GoDaddy, Namecheap, Verisign and other sellers/resellers of
SSL certificates will need to lower their prices soon, right? Because in a
short time many websites won't need to purchase one since they can get it
free.

Also, have they built this system with a completely scalable distributed
architecture? For it to be practical it needs to be performant.

Also, does the NSA have access to the core of this system?

~~~
thuejk
You can already get free certificates from startssl today:
[https://www.startssl.com/?app=1](https://www.startssl.com/?app=1)

~~~
elijahpaul
Aren't those only for personal (i.e. non commercial) websites?

------
Aardwolf
My website only contains publicly available stuff for people to read.

Is there any reason why I would want to use https for this use case?

Or what does "entire web" mean?

~~~
grey-area
_Is there any reason why I would want to use https for this use case?_

Yes it can help you stop:

ISPs inserting adverts into your content (this has happened)

Governments censoring your content or rewriting it

Governments putting people in jail for reading your publicly available (in
your country) content, which is illegal in theirs

People impersonating your website

But if you don't want to use it, that's cool too. I suspect all websites will
be encrypted at some point soon though, the disadvantages are getting less and
less important.

~~~
pbhjpbhj
> _Governments putting people in jail for reading your publicly available (in
> your country) content, which is illegal in theirs_ //

How does that work? Surely the government can still see people accessing the
information by monitoring network traffic, and the info itself is still
public. HTTPS doesn't encrypt the actual request traffic, does it? And in any
case the government would still see which server the traffic is going to,
unless you're using something like Tor [and possibly even then].

~~~
tlb
HTTPS does encrypt both request and response.

However, you can figure out what pages on a large public site like Wikipedia
people are reading over HTTPS, based on statistical traffic analysis, because
you can see the size of the request, page, and each of the images. Combined
with link following analysis, you can make a fairly accurate guess as to what
people are reading.
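
A toy illustration of the size-matching step: given a crawl of a public site's
page sizes, an eavesdropper who only sees ciphertext lengths can narrow down
what was fetched (padding and multiplexing complicate this in practice):

    
    
      def candidate_pages(observed_size, page_sizes, tolerance=32):
          """Return public pages whose known transfer size is within `tolerance`
          bytes of the observed encrypted transfer. TLS adds roughly constant
          per-record overhead, so page sizes leak through."""
          return sorted(url for url, size in page_sizes.items()
                        if abs(size - observed_size) <= tolerance)
    

Combine several such observations (page plus its images, then the next click)
and the candidate set shrinks fast, which is tlb's point.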

~~~
desdiv
I believe that the combination of HTTP/2 and TLS length-hiding makes that
attack impractical. Though admittedly we're still years away from widespread
deployment of those two technologies.

------
xs
While this is nice and I'm happy to see such a product coming, I still don't
see a free TLS solution for my smaller projects. Heroku will still charge me
$20/mo for TLS even if I have my own certificate. Cloudflare will also want to
charge me to inspect TLS. I could drop both and get a Linode, but that costs
money too, and it's a pain to set up a server myself.

------
fixermark
"With a launch scheduled for summer 2015, the Let’s Encrypt CA will
automatically issue and manage free certificates for any website that needs
them."

'Automatically?'

So we're replacing owning people by snooping on their HTTP traffic with owning
people by directing them to fake websites digitally signed by "m1crosoft.com"?

... actually, yes, that is kind of an improvement.

------
sschueller
A little vague on details.

Apache only or also Nginx?

Who is the CA?

No way I am running something like this on a production machine.

I like the idea, but I would rather have the client just output the
certificate and key to a directory, so I can put the files where I need them
and make the changes to my web server configuration myself.

Also this does not solve the issue of a CA issuing certificates for your
domain and doing MITM.

~~~
pde3
This is just a pre-announcement to let folks (OSes, hosting providers, other
platforms) plan and do integration work. Per our own warnings, we _definitely_
don't want this running on production machines until it launches in 2015.

Our Apache code is a developer preview, we'll be working on Nginx next.

ISRG will be operating a new root CA for this project. Although if you think
that your choice of CA makes you more or less secure, you may not have
understood how PKIX works -- you can buy a cert from whichever CA you like,
but your adversary can always pick the weakest one to try to impersonate you.

~~~
mike_hearn
> ISRG will be operating a new root CA for this project.

Are you going to be cross-signed by IdenTrust or something? If you're really
going to try and create a new root CA from scratch, surely you will be impaled
on the spike of low coverage for many years?

~~~
schoen
That's quite a painful spike indeed, but fortunately we'll be cross-signed by
IdenTrust.

------
mkhpalm
I don't want to be a full-fledged sponsor but I'd love to see a donate
function to their site. Once this is released if the CA is trusted by all the
major browsers I am more than willing to shift all the money we spend in certs
from all these other "authorities" to something constructive like this.

------
chunkiestbacon
The real problem here is that http is unencrypted by default. It really should
be encrypted so that passive listeners can't see the traffic. I know that this
is no protection against man-in-the-middle attacks, but at least WiFi sniffers
and similar would be stopped. State actors would have to actively do
something, which might be noticed. It would be a great improvement, because
in the current system, most websites are going to stay unencrypted because it
takes money and effort to set up a certificate. The millions of shared hosters
won't do it by default.

What we can do:

\- Change the HTTP protocol to be encrypted?

\- Create an Apache module that automatically does this and needs no setup
time (generate private keys automatically?)

Of course there shouldn't be any indicator of this encryption in the address
bar of the browser.

Maybe it's too late.

------
steven2012
This is an awesome idea. But I thought the whole idea of a certificate
authority is so that we can trust that the CA has vetted the person/site that
they have given the certificate to. If all they do is issue certs for free,
all we get is encryption, but no identity verification.

~~~
jrochkind1
With basic certs, the CA just verifies that the entity controls the website
the cert is being issued for. The OP explains how Let's Encrypt will do that.
(And if they appeared not to be doing that, no software vendors would include
the CA in the trust list).

With an "Extended Validation" cert, the CA additionally verifies that they are
who they say they are on the cert (not just that they control the (web)sites
the cert was issued for). I'm not sure if Let's Encrypt plans on issuing EV
certs, but if they are, they will have to comply with the applicable
verification standards in order for vendors not to revoke them from trusted
stores. Same as anyone else.

~~~
schoen
Further upthread, Josh (one of the other people working on the project)
explained that Let's Encrypt currently only has plans to issue DV certs, not
EV certs.

[https://news.ycombinator.com/item?id=8624634](https://news.ycombinator.com/item?id=8624634)

This is because of the automation aspect. EV cert issuance involves a human
being looking at offline identity; DV issuance involves proofs of control that
can be checked online by a computer, just as existing DV issuance by existing
CAs is based on such checks.

------
chmike
Wouldn't this result in putting all the eggs in a single basket?

Besides, as a European, I'm not so excited that such an initiative is under
the control of American law. I suspect that American interests will prevail.

~~~
schoen
Would you like to spell out more explicitly which effects of U.S. jurisdiction
you're most concerned with?

I agree that there are several possible effects of jurisdiction on CAs that
people could reasonably be concerned with (whether as would-be certificate
requestors or would-be relying parties), but I'm wondering which ones are
concerning you most.

~~~
chmike
The effect is that the NSA, the FBI or others could obtain the private key of
the EFF root CA through legal arm twisting and gagging.

Certificates are public, so there is no problem with certificate request.

If the project is US-only, then it won't make much difference from the current
situation. It wasn't explicit in the announcement.

------
niutech
Even today you can have all your HTTP traffic encrypted and compressed, using
Mozilla Janus[1] or Data Compression Proxy[2].

[1] [https://addons.mozilla.org/en-US/firefox/addon/janus-
proxy-c...](https://addons.mozilla.org/en-US/firefox/addon/janus-proxy-
configurator/)

[2] [https://chrome.google.com/webstore/detail/data-
compression-p...](https://chrome.google.com/webstore/detail/data-compression-
proxy/ajfiodhbiellfpcjjedhmmmpeeaebmep/)

------
jfindley
It would be nice to have support for ECDSA certificates. I've not found a CA
yet who'll provide one of these, despite the fact that many clients already
support them. Unfortunately, after a brief look through client.py I can't see
any support for this. Is there any good way of filing an RFE or contributing a
patch?

ECDSA operations are much cheaper computationally, and there are still some
places (especially mobile) where TLS is a noticeable overhead; it'd be great
to have a CA that provides these certs.

~~~
AlyssaRowan
Indeed, I think that might be viable: it certainly was for CloudFlare! And
good ECC certainly is "modern security techniques and best practices". I would
however be OK with RSA-2048 using SHA-256, I guess; it's what many other CAs
currently use (and this is partially a lowest-common-denominator problem).

Comodo definitely has an ECC root available now, a cross-signed ECDSA root
with secp384r1, signing a secp256r1 intermediate. (I had heard there were 3
others deployed out there, although off the top of my head I'm not clear about
which they are, perhaps they're also cross-signed?)

Why is ECC so poorly deployed in TLS? I've heard indications Certicom had
formerly aggressively asserted patents, hence the lack of ECC-supporting CAs;
but I don't know which ones. I highly doubt they're still extant, however:
many have since expired.

Do be aware however that ECDSA can present a huge hazard if the _k_ value
needed for every signature is even partially predictable and varies.
Officially, it should be random, and _very strongly_ random (even the first
two bits being consistently predictable is cumulatively disastrous; using the
exact same _k_ to sign two different things is absolutely catastrophic and is
how the Sony PS3 root keys were calculated!). If you have a strong PRF for
your RNG, you should be fine (e.g. LibreSSL uses ChaCha20); if you want some
insurance just in case, you can use a more-auditable and less fragile
deterministic process with a strong PRF so the same signature always gets the
same, unpredictable _k_ (see RFC 6979), or a combination of the two approaches
(e.g. BoringSSL). DSA also had this issue. If you haven't audited your RNG and
know it's strong, maybe you should check it before you deploy ECDSA: if your
system's headless, its entropy is running on empty and your idea of a mixing
function is RC4, it might not be such a hot idea.
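
To make the nonce-reuse hazard concrete, here is a toy demonstration, with
small made-up numbers, of the algebra behind the PS3 break: two (EC)DSA-style
signatures made with the same _k_ let anyone solve for _k_ and then the
private key. (Toy group; real ECDSA does the same arithmetic modulo the
curve's group order.)

    
    
      n = 101   # toy group order (prime); real curves use a ~256-bit prime
      d = 42    # private key
      k = 7     # signing nonce -- wrongly reused for two messages
      r = 55    # r value; with a fixed k, r repeats too (in ECDSA, r comes from k*G)
      
      def sign(h):
          """(EC)DSA-style s = k^-1 * (h + r*d) mod n."""
          return pow(k, -1, n) * (h + r * d) % n
      
      h1, h2 = 10, 33                  # hashes of two different messages
      s1, s2 = sign(h1), sign(h2)
      
      # Attacker, seeing two signatures that share r:
      k_rec = (h1 - h2) * pow(s1 - s2, -1, n) % n
      d_rec = (s1 * k_rec - h1) * pow(r, -1, n) % n
      assert (k_rec, d_rec) == (k, d)  # nonce and private key fully recovered
    

Since s1 - s2 = k^-1 * (h1 - h2), the shared k cancels everything secret out
of the difference, and one modular inversion recovers it.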

It'd be fantastic to have the option available to have an ECC root around, and
let us have RSA or ECC certs. (Yes, you can cross-sign across algorithms.)
Perhaps have ECDSA off by default for a while in light of the above, but it
can provide _very_ good performance for people who use modern software and
turn it on!

I'd suggest using secp256r1 (aka NIST P-256). It's already deployed and
256-bit curves lie at about RSA-3072 strength (stronger than most deployed CAs
now, which typically use RSA-2048). A few others have deployed secp384r1, but
that was following the NSA's Suite B lead; I'm not sold on that being relevant
at all. secp256r1 is also fairly fast, with some very well-optimised constant-
time routines available in OpenSSL if you enable a flag or even faster ones if
you're using 1.2 beta (expect something like a 200%-250% performance boost
over the generic elliptic curve routines); it's not quite Curve25519 speed,
but it isn't bad.

I do however acknowledge the extreme murkiness surrounding the generation of
the NIST/SECG/X9.62 curves. That does present _some_ concern to me. I tried to
get to the bottom of that (see my posts on CFRG) and I can summarise what I
found out as basically (now-expired/irrelevant) patent-related shenanigans.
I'm not super-comfortable with that degree of opacity in my curves - however,
I also don't know of any actual security problems with secp256r1 (or
secp384r1) as they stand, _providing they are properly implemented_ (very big
proviso!). I don't think they're backdoored, but make sure you check the
hinges on the front door, and I'd prefer a house with better foundations!

More transparently-produced curves (such as the Brainpool curves) do exist,
but Brainpool is sadly very slow in software (less than half the speed than a
good P256 routine, and no scope for optimisations).

So looking forward, CFRG (at IRTF) were asked by the TLS Working Group to
recommend even better new curves: most probably Curve25519 in my opinion as
that seems to admirably satisfy all the criteria for a fast strong mainstream
curve, and probably one larger "extra paranoid" curve will be recommended
which I really don't know what it'll be at this stage. These hopefully will
be/are even faster, strong, _and_ rigidly-explained, without murky origins.
And hopefully there will be better algorithms than ECDSA (perhaps a Schnorr-
based algorithm such as Ed25519, now that the patent has expired?). I very
much doubt that all the supporting infrastructure for that, like HSMs and
widely-deployed software support, will be "ready" for this project in the
timeframe we'd like, however, so in the meantime P256 or RSA-2048 is OK, I
guess.

In general: This is absolutely wonderful news. It, and the efficiency of TLS
1.2 (and later, TLS 1.3) will enable people to run TLS everywhere. I am very
probably going to use it myself.

While the Israel-based CA StartCom do already offer free TLS certificates and
I have previously lauded them for that, they pulled an absolutely unforgivable
move detrimental to internet security as a whole in refusing to revoke and
rekey certificates for free _even exceptionally in the immediate wake of
Heartbleed_ (and I do think they should have their CA status reviewed very
harshly as a result or revoked, because I do _not_ think that is compliant
with CA/B guidelines: they definitely still have live signatures on
compromised keys that have not been revoked, which is totally unacceptable).
If this initiative means we can replace and dump bad CAs, it's even better
news.

------
mike-cardwell
I'm hoping that one day soon, I'll be able to remove this line from my nginx
config:

    ssl_certificate /path/to/file.crt;

My web server will notice that I want SSL, but haven't specified a path to a
cert. It will then go off and generate one and get it signed automatically
using an API like the one being discussed. It will also handle renewing
automatically when the time comes.

~~~
TillE
Automatic unconfigured behavior is bad, but something like an
ssl_certificate_auto directive that's in the default config would make a lot
of sense.

------
drderidder
This is a great initiative. On the other hand, I'm beginning to think that
security models based on any central authority will always be at risk of
getting compromised from within. Techniques that allow trusted security to be
established between two parties without the need for a third-party authority
to validate them would be nice to see.

------
cm2187
That's a great idea and I'm a big fan of the EFF. But what browser support
will this have? Even if all browser on all platforms add this to their root
certificates, how many years will it take before even half of the devices in
use support it (remember the number of people still using windows XP!)

~~~
schoen
It's initially cross-signed by IdenTrust, which has wide browser support
today.

------
silvenga
Whatever happened to [http://www.cacert.org/](http://www.cacert.org/)?

~~~
lingben
is there a reason they don't use it on their own site
([https://cacert.org/](https://cacert.org/))?

~~~
Buge
They do. You just don't have their root installed so it gives an error. You
can install their root here
[http://www.cacert.org/index.php?id=3](http://www.cacert.org/index.php?id=3)

------
spindritf
ACME sounds great. Copying codes from emails is suboptimal at best. Free
certificate from command line and free revocation from the same client sound
even better.

I just don't know about the automatic configuration tool. Like webpanels for
managing a server, it has never worked for me.

------
darka
How does this compare to StartSSL?

~~~
blibble
maybe they won't try to extort you if you require a revocation?

(StartSSL do even for paying customers... stay away)

------
iancarroll
Very interesting, it looks like they're working with IdenTrust on this. I
wonder if it supports wildcard certs.

Like StartCom selling Class 2/3, running a CA is very expensive and I wonder
how they plan on recouping the fees for this.

~~~
shawabawa3
> running a CA is very expensive and I wonder how they plan on recouping the
> fees for this

Is it? Seems like it should be dirt cheap to me. It's basically just an API
for generating and revoking certs.

~~~
iancarroll
WebTrust audits are the bulk of the cost. We got quoted $150k for our first
audit. This is a _yearly_ thing too.

You also have to pay for your own cage in a datacenter, the HSM, validation
staff, etc...

------
JoshTheGeek
I wonder how many of the cheap web hosts will implement this. I think the
increased hosting cost on top of the certificate itself also discourages
people from using TLS. Wishful thinking, perhaps...

------
king_magic
It's an interesting idea, I'm just not clear on how it works (even when
looking at the "How it works" section) - e.g., how do I integrate this with...
say, nginx?

~~~
kevinchen
TLS/SSL certificate setup is a pretty mechanical task. I would imagine their
program detects common web servers (nginx, apache, etc), puts the private key
somewhere, and points the configuration files at it.
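
The first step kevinchen describes, guessing which server is installed, might
look something like this (a hypothetical helper, not the actual Let's Encrypt
client; the lookup is injectable so the probe is testable):

    
    
      import shutil
      
      def detect_server(which=shutil.which):
          """Return the first known web-server binary found on PATH, or None.
          `which` defaults to shutil.which but can be swapped for testing."""
          for name in ("nginx", "apache2", "httpd", "lighttpd"):
              if which(name):
                  return name
          return None
    

The harder parts are the ones after detection: parsing the server's existing
config, placing the key with sane permissions, and reloading without downtime.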

------
jpetersonmn
I just set up an SSL certificate on my website for the first time, and it only
took about 10 minutes altogether. I don't get any warnings from any browsers,
and it was only $10.

------
nodata
And all because people like YOU donated :)

Thanks :)

btw if you want to donate too, here is the link:
[https://supporters.eff.org/donate](https://supporters.eff.org/donate)

------
mattste
I'm so excited for this. I know both the people working on the team from the
University of Michigan, and both are extremely smart people passionate about
web encryption.

------
tacojuan
Wow, I had a goofy idea a few months ago that one day we could have some sort
of non-profit/charity that just runs a free, as in beer and freedom, "common
good" CA.

Looks neat.

------
eyeareque
This is great news, but I am wondering how they will handle revoking
certificates. For example: Do we really want malware sites popping up with
valid SSL certificates?

~~~
BogdanCalin
You can revoke the certificates from the command line. It's shown at the end
of the video.

~~~
eyeareque
Ah, so there aren't plans to add this CA into web browsers. That makes more
sense.

------
neals
Finally! Man, is getting and managing certificates a pain in the *ss for our
small shop that does a great number of small websites.

------
andrewbarba
This news put a huge grin on my face. Let's hope Heroku drops that ridiculous
$20 charge for SSL endpoint as well.

------
general_failure
I hope they do wildcard certs as well.

------
bmahsh
Hi. This is such an amazing project to work on. Who started this? Who came up
with this idea?

------
dutchbrit
Wondering how they're going to cover the costs of being a CA.

------
zobzu
Who's auditing the CA?

~~~
psykovsky
Who's auditing the CA's currently trusted by your browser?

~~~
zobzu
Various third parties. This is required by the CA/Browser Forum, which my
browser requires as well.

Inform yourself if you want to write stuff like that. Even more, it's sad that
people think CAs have zero checking and just give, what, money to browsers to
be included? Thankfully it's not like that yet.

~~~
psykovsky
So, if you knew the answer to your own question, why did you ask?

~~~
zobzu
I did not, actually. Same subject, but the question is different since this is
a CA signed by a CA in this case.

------
itistoday2
Kudos to the EFF for making an easy-to-use tool to generate TLS certs!

Kudos also for creating the second CA to issue free certificates (the first
being StartSSL).

The next step needs to be making these certs man-in-the-middle (MITM) proof.
We still have to address that problem. We'll be talking about how the
blockchain can be used to solve this tonight at the SF Bitcoin Meetup; if that
interests you, you're welcome to come:

[http://www.meetup.com/San-Francisco-Bitcoin-Social/events/180197882/](http://www.meetup.com/San-Francisco-Bitcoin-Social/events/180197882/)

A primer can be found here:
[https://vimeo.com/100433057](https://vimeo.com/100433057)

~~~
richardwhiuk
The blockchain can't fix this problem: it's too large for most embedded
devices, which do matter. A 'cool new crypto idea' isn't automatically a
solution to every problem on the planet. Just because something uses crypto
doesn't mean adding the blockchain to it makes it any better.

~~~
anonymousDan
Hmm, I'm not sure exactly how the parent intends to use it, but isn't
something like the blockchain (i.e. a publicly auditable log of all changes
to certificates) already being proposed for improving the current PKI
infrastructure? Also, is your problem with embedded devices that they can't
afford to download, store, and verify the full Bitcoin blockchain? Surely
there are compromise solutions that could be made? Not denigrating your
comment, btw; it's an interesting observation!
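
(Editor's note: the "publicly auditable log" idea is essentially what Certificate Transparency proposes. The toy sketch below shows the core mechanism, a Merkle tree whose root hash commits to every logged certificate; all names and structure here are illustrative, not any real CT implementation:)

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

class MerkleLog:
    """Toy append-only certificate log: the root hash commits to every
    entry, so tampering with any past entry changes the root."""
    def __init__(self):
        self.leaves = []

    def append(self, cert_der: bytes) -> None:
        # Prefix distinguishes leaf hashes from interior-node hashes.
        self.leaves.append(h(b"\x00" + cert_der))

    def root(self) -> bytes:
        level = list(self.leaves)
        if not level:
            return h(b"")
        while len(level) > 1:
            if len(level) % 2:              # duplicate last node on odd levels
                level.append(level[-1])
            level = [h(b"\x01" + level[i] + level[i + 1])
                     for i in range(0, len(level), 2)]
        return level[0]

log = MerkleLog()
log.append(b"cert-for-example.com")
r1 = log.root()
log.append(b"cert-for-example.org")
r2 = log.root()
assert r1 != r2  # appending changes the root: history is tamper-evident
```

Auditors who watch the published root can detect a CA mis-issuing certificates after the fact, which is a weaker but far cheaper guarantee than storing a full blockchain on an embedded device.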

------
sbierwagen
Uh oh, this looks like it kills sslmate.com

Sorry agwa.

------
mangeletti
One possible solution is a Bitcoin-like blockchain of certificate proof, so
that a website's certificate can be verified against the domain without a
central authority.

~~~
fragsworth
That doesn't even remotely work. Who has the private keys to authorize the
certificates?

~~~
mangeletti
What authorization is required in this scenario? I'm talking about a novel
idea here, one that doesn't fit into the existing CA model. There would be no
CA in this scenario; verification would be decentralized, based on shared
information, not on knowledge of a secret.

~~~
akerl_
I'm not sure web-of-trust can be considered a novel idea in 2014.

We can all look at the variety of web-of-trust methods to see how well that's
taken off amongst internet users.

~~~
mangeletti
It is novel in the sense that no such service has ever existed.

------
jgrahamc
Can't wait until summer 2015 for a free cert? CloudFlare offers Universal SSL:
[https://www.cloudflare.com/ssl](https://www.cloudflare.com/ssl)

~~~
lawl
That allows CloudFlare to read all your traffic, because the SSL connection
terminates at CloudFlare and they hold the private key...

