
Just how bad is OpenSSL? (2012) - francium_
https://lists.randombit.net/pipermail/cryptography/2012-October/003388.html
======
aerovistae
Frankly I've never liked man pages. To me they always screamed "This is how
documentation was done in the 90s." The examples are often very unclear or
incomplete, and the explanations often assume prior knowledge without
providing links in case such knowledge is absent.

Modern documentation has gotten way better, as seen in the Stripe docs and
many others, and I wish the man pages could be updated accordingly.

~~~
dingaling
Yes, man pages are usually upside-down; the examples should be right at the
start and then lead into a drill-down of the options. Nine times out of ten I
end up having to search the web for a basic introductory example.

But even in big corps with ISO9000 accreditation there is seldom self-
questioning as to whether documentation is _useful_ rather than just ticking
the box for process-completeness.

~~~
ori_b
No. 90% of the time, I know what I want to do and how my tools work. I just
don't remember what the options are called.

Having a summary of the options right there at the top is the most valuable
thing in a reference.

~~~
derefr
You're presuming man pages are primarily meant to serve as a reference. But I
rarely need man-pages as a reference†.

Most of the time, if I'm looking up a man-page for something, it's because
I've just installed a new package that sounded like it would solve a problem
and then did a dpkg-query(1) to find out what binaries came with it—or used
apropos(1) to find a relevant binary already installed—and now I want to know
what the uses of a given binary are and whether those uses include solving my
particular problem.

† Well, except for the utilities with absolutely horrible command-line UX
design, like tar(1) or ps(1) or rsync(1), where I just memorize the options I
need for my usual case, and then have to look in the man page to do anything
novel.
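The footnote's "memorize the usual case" pattern, sketched for tar(1) (the
file names here are made up for illustration):

```shell
set -e
cd "$(mktemp -d)"

# The memorized incantations: create and extract a gzipped archive.
echo "hello" > notes.txt
tar -czf backup.tar.gz notes.txt   # c = create, z = gzip, f = archive file
rm notes.txt
tar -xzf backup.tar.gz             # x = extract, z = gzip, f = archive file
cat notes.txt                      # prints "hello"
```

Anything past this — say, --exclude or --strip-components — is exactly where
the man page gets opened again.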

~~~
qwertyuiop924
>You're presuming man pages are primarily meant to serve as a reference.

They are.

~~~
derefr
Let me rephrase: you assume that _it makes sense_ for manpages _to continue_
serving primarily as a reference—that this is the primary use-case people have
for the standardized program documentation that ships with their distro
packages.

Shipping a reference to a binary with that binary may have made sense before
the internet. But nowadays, it's the opposite.

• Complex programs with many options have (sometimes dozens of) websites
documenting them thoroughly. (Try searching with any search-engine for "wget
mirroring", for example; the number and complexity of the results is
overwhelming.)

• Meanwhile, for the simple "corner-case" programs, you really _hope_ that
they shipped with docs—because seemingly nobody else out there on the web
cares to bother documenting them. With a lot of these little programs, the
only web docs you can find are, in fact, online mirrors of their man-page.

For the popular-and-complex programs, man-pages are just redundant, because
everyone will document what _they_ did to achieve whatever. But for the
simple-but-weird programs—the ones for which man-pages _aren't_ just
redundancies—if the man-page doesn't give usage, then _nothing_ is going to
give usage.

Now, I can understand why man-pages for these little utilities are the way
they are. These programs are usually created by a single author, so time spent
writing docs is time not spent fixing bugs or scratching their itch or
whatever else. And an options reference certainly is the "minimal normalized
form" of documentation: it lets others brute-force combinatoric-search the
space of invocations until they find some combination that Works For Them™.
Basically, you can (through a lot of trial and error) _generate_ a cookbook
from an options reference. So the author probably doesn't feel a strong need
to add anything beyond an options reference, because the people who _really_
need to solve the problem their binary solves are willing to go to that
effort.

But if you're a distro downstream packager, and it's your job to make your
distro easy for people to use, you should have every incentive to submit
upstream patches to said author, with manpage additions of cookbook example
usages resulting from _your_ trial-and-error experimentation with their
program.

Annoyingly, you, as a distro packager, probably don't have time to _do_ that
trial-and-error experimentation, especially if the utility serves a niche use-
case that you don't even understand. That—and not the fact that "manpages
should be a reference and nothing more"—is most of the reason manpages
continue to be the way they are.

~~~
ktRolster
Man pages are written (ideally) as the authoritative reference on your system,
where you can go to find information. That sort of thing needs to exist
_somewhere,_ it has a clear use case.

An ideal man page is concise, informative, and complete. Learning to read them
is like learning to read scientific literature: a pain, but once you figure it
out, you're at a higher level.

------
__b__
"For instance it doesn't have everything you need to validate certificates..."

Yet it has all the CA crap thrown in, via the overloaded openssl binary. As
"examples". And according to the documentation, not even "correct"
illustrations of how libssl should be used.

Encryption and authentication are two separate problems.

Just because you figured out a way to encrypt a message does not mean you have
also figured out how to send it to only the correct recipient... over
an insecure network. (Insecure not only in the sense of "plaintext" but in the
sense you are not in control of much of anything - routing, PKI
infrastructure, etc.)
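To make that split concrete, here is a minimal sketch with the openssl CLI
(assumes OpenSSL 1.1.1+ for -pbkdf2; the passphrase and message are made up).
Encrypting is the easy half; nothing below establishes who actually holds the
passphrase, which is the authentication problem:

```shell
set -e
cd "$(mktemp -d)"

# The "solved" half: symmetric encryption of a message.
echo "meet at noon" > msg.txt
openssl enc -aes-256-cbc -pbkdf2 -pass pass:hunter2 -in msg.txt -out msg.enc

# Decryption works for ANYONE who has the passphrase; nothing here says
# whether the right recipient -- and only them -- ever got it.
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:hunter2 -in msg.enc -out msg.out
cat msg.out   # prints "meet at noon"
```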

It seems to me that one would want to solve the authentication problem _first_
, and then move on to encryption.

This comment shows that for proponents of using SSL on the _public_ web, it's
been the other way around. Authentication was never sorted out.

When it comes to _authentication_ , all due respect to the OpenSSL authors,
SSH has provided a better attempt at a solution than any implementation of PKI
using SSL/TLS.

And one more thing, how many ciphers does a user really need? As we've heard
time and again, many of them are not even "safe" to use. Some of the
alternative SSL libraries have wisely removed them. But I guess OpenSSL is
append-only?
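You can count the offered suites yourself (a sketch; the exact numbers depend
entirely on your OpenSSL version and build options):

```shell
# Every cipher suite this build is willing to offer; ALL:COMPLEMENTOFALL
# also pulls in suites disabled by default (e.g. the eNULL "ciphers").
openssl ciphers 'ALL:COMPLEMENTOFALL' | tr ':' '\n' | wc -l

# Compare with a deliberately restricted selection:
openssl ciphers 'HIGH:!aNULL:!eNULL' | tr ':' '\n' | wc -l
```

On a typical build the first number is noticeably larger than the second,
which is the commenter's point.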

------
qwertyuiop924
OpenSSL is pretty bad. After reading about some of the stuff that led to the
LibreSSL fork, I wouldn't trust it with my lunch money. Sure, the algorithms
are good, but as far as the code's concerned, Heartbleed was the tip of the
iceberg.

~~~
ktRolster
There's a saying, "don't roll your own crypto," and it's good advice.

In the case of OpenSSL, you might be better off rolling your own. At least the
vulnerabilities you end up with will be different from the ones that the rest
of the world has.

~~~
mSparks
the deeper and deeper ive gotten into breaking crypto. the more and more ive
come to the conclusion that saying is positively poisen.

"dont try and do it all on your own but trust no one else to do it for you"
probably better.

An open, modular project with a wide choice of options would be a godsend and
wipe out most digital crime almost overnight.

Of course, then, they couldn't use the likes of Yahoo and Google to read drug
dealers' emails.

------
nickpsecurity
The "experts writing it for themselves" part seemed inaccurate given what I
read in the LibreSSL commits. It was one atrocity after another. I still love
Ted Unangst's observation about them making sure the endianness of the CPU
doesn't change while the protocol is running. I just can't remember how often
that check was performed.

"Experts"... lol...

~~~
stefs
I interpreted this as: this was written by security experts (cryptographers),
not expert programmers. This means the algorithms are generally OK, but the
implementation is wacky (and issue-prone).

~~~
red_admiral
It was also written by cryptographers who for years asked for support, and got
barely enough to keep the server running let alone live off it. Meanwhile the
world and his wife joined in with feature requests and complaints about things
they didn't like, but mostly without offering to help.

So it doesn't surprise me that unit testing, documentation, code review etc.
weren't a top priority for spending more unpaid hours on - people literally
got what they paid for.

~~~
nickpsecurity
I'll partly agree with that. Mostly, even. I draw the line at expecting a
security-critical library intended for widespread adoption to at least follow
secure coding guidelines, if nothing else. It really doesn't take much effort
compared to what was already done: a tiny fraction of it.

That, plus the larger trend of developers ignoring basic good practices, is
why I critique the project a bit. Plus, the LibreSSL team illustrated my point
nicely by doing 10x what I expected in a very short time with no pay.

------
ori_b
After the string of vulnerabilities, I know that OpenSSL got a wave of
investment.

I'm curious how much of this still stands today.

~~~
anonbanker
Look at the list of CVEs since then[0], as well as tedunangst's commentary on
his blog[1]. They pair up nicely.

0\. [http://www.cvedetails.com/product/383/Openssl-
Openssl.html?v...](http://www.cvedetails.com/product/383/Openssl-
Openssl.html?vendor_id=217)

1\. [http://www.tedunangst.com/flak/post/analysis-of-openssl-
free...](http://www.tedunangst.com/flak/post/analysis-of-openssl-freelist-
reuse)

~~~
ktRolster
20 vulnerabilities found in OpenSSL so far in 2016; that basically says the
codebase is still not secure.

------
red_admiral
If you think OpenSSL is bad, try MIRACL (the only documentation I could find
is a Word file that's basically a list of function signatures). And OpenSSL at
least
least generally builds fine on a vanilla Ubuntu machine.

In contrast, libsodium deserves praise for writing documentation like they
want people to actually use their library.

------
mSparks
OpenSSL dates from a time when security was mostly of low importance. (Not
that things have really changed that much. IoT, I'm looking at you.)

Shock horror: it shows.

I find it really quite painful that no one seems to be taking this as
seriously as it deserves.

The cost of electronic crime must literally be hundreds of billions a year by
now, simply because we have been denied secure communications from day one.

