
Heartbleed and the misconceptions about Open Source - pytrin
http://www.binpress.com/blog/2014/04/12/heartbleed-misconceptions-open-source/
======
jleader
Generalizations about whether open-source developers or closed-source
developers have more resources, or are more professional, or whatever, are
silly. The two groups of developers are very large, with high variance in many
dimensions and a lot of overlap. There are open-source projects with one or two
developers, and open-source projects that are the primary focus of
$100-million, thousand-employee companies. There are also closed-source
commercial projects developed by teams of hundreds, and closed-source
commercial projects developed by a solo programmer when he's not busy
answering customers' phone calls. Lots of developers work on both open and
closed-source projects at one time or another.

It's important to discuss what changes we can (and should) make to make
problems like heartbleed less likely in the future, but wildly waving
competing generalizations in the air doesn't help anything.

~~~
Spearchucker
What you say is true. The distinction is still useful, though: there are those
who follow one approach at the expense of the other, which is no more useful
than insisting the only way to develop software is agile.

------
mehrdada
"you cannot expect a person working in academia to be held to the same
standards as professionals working in the industry for many years"

This is absolutely BS, especially in security and cryptography. Most security-
related code written by so-called "professional" software developers is
astonishingly terrible (e.g. ECB-mode encryption, storing encryption keys in
code, reusing encryption keys, relying on (unauthenticated) encryption for
authenticity, reusing IVs, linear-time MAC verification, ...). Most
cryptographers are academics. Also, anecdotally, the poisonous "demo an
exploit or it doesn't happen" attitude in response to hints at a flawed system
design is much more prevalent among "professional software developers" than in
academia.

If anything, we should encourage more security experts in academia to engage
in implementation, verification, and improvement of security code, not the
other way around.

(Not that most academics write good code either, but this is not an
academia/industry issue. It is a security expert/non-expert issue.)

~~~
fidotron
Isn't the problem that cryptography, solid protocol design and secure software
development are actually almost unrelated disciplines that just happen to
overlap in some applications?

I don't see why an OpenSSL developer would need either of the first two
skillsets at all. Ideally, yes, but not necessarily.

~~~
mehrdada
I disagree with the premise of your question. I think the overlap is larger
than it looks on the surface. I have come to believe that you cannot do
solid secure protocol design without a firm grasp of the basics of encryption
schemes. You don't need to be a full-fledged cryptographer, but to quote Phil
Rogaway, "Reading [Schneier] does not qualify one to do cryptographic design."
[1]. In fact, the context in which he wrote "Problems with Proposed IP
Cryptography"[1] is perfect evidence of this: the IETF committee designing
IPsec back in 1995, which I think it is fair to describe as mostly
"professional software developers", was clearly warned about the potential
consequences of some of its design choices[2] (like allowing encryption-only
configurations rather than Encrypt-then-MAC), but it reacted quite harshly to
his comments and did not take them as seriously as it should have, and actual
attacks were demonstrated against those very issues years later[3]. For me,
reading the mailing list is evidence enough that it is very dangerous to draw
a line between secure protocol design and everything else. Similarly, I
believe a lack of understanding of the reasons behind the design choices in a
protocol specification will surface as implementation bugs introduced by
purely "secure software development" experts (e.g. "optimizing" the generation
of the random data used in place of an invalid input by reordering operations,
leading to a timing attack in TLS). There is enough historical evidence to
make anything short of the "ideal" case dangerous.

[1]: [http://www.cs.ucdavis.edu/~rogaway/papers/draft-rogaway-
ipse...](http://www.cs.ucdavis.edu/~rogaway/papers/draft-rogaway-ipsec-
comments-00.txt)

[2]:
[http://www.sandelman.ottawa.on.ca/ipsec/1995/04/msg00148.htm...](http://www.sandelman.ottawa.on.ca/ipsec/1995/04/msg00148.html)

[3]:
[http://www.isg.rhul.ac.uk/~kp/CCSIPsecfinal.pdf](http://www.isg.rhul.ac.uk/~kp/CCSIPsecfinal.pdf)

~~~
fidotron
I'm not sure that you're actually disputing my statement at all.

Your main case there is software developers failing to do protocol design,
which is exactly what I'm saying is a bad idea if they don't happen to be
independently expert in it.

The reality is these situations would be a whole lot easier to resolve if
people actually respected the expertise of each other. This works in all
directions, but recent bugs show a definite weakness in terms of respect given
towards relatively basic software engineering practices.

~~~
mehrdada
I may have been unclear: I am claiming that you cannot be a good protocol
designer without sufficient expertise in cryptography AND you cannot be a good
implementer without sufficient expertise in protocol design (I have updated my
OP to add an instance of an implementation bug caused by insufficient
knowledge about protocol design as well: TLS timing attack).

~~~
fidotron
You could make the argument that if the implementer has to know anything about
why the protocol or crypto works then the spec is poorly specified.

Certainly it helps to have a minimum of appreciation of the other parts of the
domain, but I think you greatly overestimate how important that is for the
kinds of problem we've been seeing lately.

~~~
cbhl
In any other field, I would agree with that assessment. In the world of
crypto, with side-channel attacks on _the sound of your keyboard as you type_
, I think that having domain knowledge is essential for any implementer.

------
linuxhansl
It seems to me that the author of this piece has a lot of "misconceptions
about Open Source" himself.

An example: "anyone can contribute, regardless of background or proficiency".
I'd encourage the author to research how open source projects are run before
making claims like this.

Also.. How was this bug found again? Oh yeah. By analyzing the _open_ source
code.

Professionalism is orthogonal to open source vs. closed source. There's a
place for both, and there is good and bad open source and closed source
software.

Moving right along, nothing to see here.

~~~
daeken
> How was this bug found again? Oh yeah. By analyzing the _open_ source code.

It was found by fuzzing. OpenSSL being open has absolutely nothing to do with
its security, in a positive or negative way. It's just a poor project.

------
whatts
Apple has all the resources, and they still had the "goto fail" bug. You
should not underrate open source. Bugs are shallow, but that can never mean
_every_ single bug. Some bugs will always be overlooked, whether the source is
open or closed.

------
owenversteeg
This is BS; the bug was found by people analyzing the _open-source_ code
because anyone can do so. Also, criticizing the developer because he's a PhD
student makes zero sense; the two best developers I've known were a student
and a 13-year-old.

I also love how the author puts a thinly-veiled plug of his slimy "open-
source" code-selling website in the middle. As benatkin said in his excellent
comment [0], all four of their featured products are closed-source. The OSI
should sue them for violation of their trademark of the term "open source".

[0]
[https://news.ycombinator.com/item?id=7579700](https://news.ycombinator.com/item?id=7579700)

------
jokoon
Open source doesn't necessarily mean "anyone can edit it and improve it".

Patches and added features need to be reviewed by project owners.

Open source mostly means "you can read the source and modify your own version,
but that doesn't mean you can make a change that will go into the official
release."

There are some very sensitive pieces of software that should be thoroughly
examined by experts and criticized if they're not good enough. If there are no
resources available to maintain a particular open-source project, don't bother
using it, ESPECIALLY if it's as sensitive as OpenSSL.

Open source allows software companies and other programmers to easily work
together to solve a problem. Developer time is precious, so it's often time-
saving to use somebody else's work, but that doesn't mean you should use it
blindly.

~~~
pytrin
You are correct, of course. But what Heartbleed showed us is that even at the
scale of OpenSSL (millions of users), almost everyone was using it blindly.
People often care more about the free-as-in-price aspect than about the
freedom to inspect, modify and contribute, because, as you say, dev time is
precious.

> open source doesn't necessarily mean "anyone can edit it and improve it".

I think you missed the point of the article: it was about how anyone can
create or contribute to open source, not about submitting patches to existing
projects and having them pulled upstream without any review process.

~~~
z3phyr
Many of them do use Microsoft Windows, but they are deliberately blinded by a
few people who hold the power.

You can compare the frequency of patches between proprietary and open-source
software, which reveals a lot.

------
zobzu
"every software has bug and opensource has less resources to look at it"

i'd rather say "and you just don't know about the closed source ones because
they're harder to find" ;-)

------
upofadown
There has been a lot of really insightful hindsight about the Heartbleed
issue. This one seems to fall into the category of "we should have expended
more resources on such a critical piece of infrastructure", where resources
could have been time, attention or money. That is true, but not really very
helpful.

This particular observation comes up any time something goes wrong in any
context. The stuff about the shallowness of bugs really has nothing to do with
the argument. This bug was in fact quite shallow; some random entity just
found it by looking. If more people had been looking, it would likely have
been found sooner. You can only find a bug once.

------
benatkin
BinPress is misusing the term Open Source in their slogan. All four of their
featured "Popular Products" are closed source. IMO
[http://opensource.org/](http://opensource.org/) should be suing them to
protect their trademarks, because their use of the term is trying to piggyback
on the popularity of the Open Source community that OSI represents.

So I don't think they are in a good position to be talking about the meaning
of Open Source, as they're doing in this article.

~~~
quadrangle
Well, yes, __Binpress totally abuses the term "Open Source"__, but __the Open
Source Initiative was _denied_ its application for a trademark on the term__,
so there's nothing that can legally be done. All we can do is post a comment
on Binpress links telling everyone that despite their claim, Binpress sells
mainly just _proprietary_ software.

~~~
benatkin
Good to know. I tried to find it out by googling, and I had a hunch this might
be the case but wasn't sure. Still, it might be worth a shot, because they
could argue that it's a reference to the Open Source Initiative and its
approved licenses, just like Chick-Fil-A is trying to argue that "Eat More
Kale" is a reference to "Eat Mor Chikin".

------
njharman
"OpenSSL is used and run by millions of companies around the world, many of
which have dedicated software engineers working for them full-time, while
reaching hundreds of millions of users. And yet, this issue was undiscovered
for almost 2 years"

This is almost a non sequitur. Almost none of those software engineers looked
at the source (and the few that did got eye bleed).

I quit reading after that.

------
stuhood
Does the security team not count as one of the sets of "eyes"? Would they have
discovered the bug without inspecting the code?

------
markbnj
The main point, that more eyeballs doesn't necessarily lead to more bugs found
and fixed, is a good one. Reading code, or text, specifically with the intent
of finding errors is very hard, and is itself an error-prone activity. Anyone
who has had to do close proof-reading knows this. It's hard work, so our
brains are constantly fighting us and trying to "relax" back to a higher level
of abstraction. That's one of the reasons I read the Coverity post with some
interest. We humans are hopelessly ill-suited for these tasks, and we need all
the help we can get.

------
Ologn
> “Given enough eyeballs, all bugs are shallow” – Eric Raymond

> only obvious problems are easily caught. An issue that manifests itself only
> under very specific conditions, or not in a way that is obvious to the end-
> user, can go undetected for a long time.

There are fundamental differences between bugs and security holes. Bugs are
something everyone has an interest in fixing. If a bug rarely manifests
itself, then it is not that much of a problem.

Security holes are things which some people scrupulously search for, and then
sometimes keep secret, for their own ends. Sometimes people even try to create
security holes where there are none (
[http://lwn.net/Articles/57135](http://lwn.net/Articles/57135) ).

------
awalton
That's funny. When I make this _exact point_ here on HackerNews I get
downvoted to oblivion.

Open Source gives you the potential to build a rocket to the moon. But it
requires money, time, people willing to mind the code, and people with humble
attitudes willing to accept when they've made mistakes and patch the code.

Quality Assurance requires effort, and that's where the fallacy of "Free"
software really comes from. If you're not paying for it, you're going to pay
for it, either by being the QA team and fixing bugs yourself or by living with
buggy software.

~~~
__david__
> _That 's funny. When I make this exact point here on HackerNews I get
> downvoted to oblivion._

The article wasn't particularly good, which may explain that.

> _…that 's where the fallacy of "Free" software really comes from. If you're
> not paying for it, you're going to pay for it._

The "Free" in "Free Software" has never meant "No Cost". It has always meant
"Freedom". When you start talking about cost you are missing the point.

~~~
pete3087
> The article wasn't particularly good, which may explain that.

So it's not that _you think_ the article isn't very good but instead it just
isn't? Bold statements like this need an explanation...

Stating that the article isn't very good and using it as an explanation as to
why someone's comment was downvoted without giving any reasons does not really
contribute to the discussion and also does not prove your point. Just because
awalton has a different opinion does not mean that he/she is wrong.

~~~
__david__
> _So it 's not that you think the article isn't very good but instead it just
> isn't? Bold statements like this need an explanation..._

First off, _everything_ I say is "according to my opinion". That is implied
and I don't have to explicitly state it on every sentence I write (especially
on sentences that already sound like opinions).

Second, it's really not a bold statement. Read the other comments here—quite a
lot of them are critical and the top rated ones all have excellent points.
Given that, claiming the article isn't very good isn't much of a stretch. You
want reasons? Read the rest of the %$#@! comments.

Now, my comment may have been on the pithy side, but I found it particularly
funny that the guy was commenting about how he agreed with the article,
complaining that when he expresses the same sentiment on HN he gets downvoted,
while nearly every other comment on the story was attacking its shallow
understanding of Free Software/Open Source, cryptographic library programmers,
and virtually every other point it tried to make. I.e., he appears to have the
same misguided opinions as the article's author, but not the self-awareness to
enlighten himself.

------
quadrangle
Binpress doesn't promote Open Source software as anyone else knows it.
Binpress promotes proprietary software where licensees can see the source code
and modify it privately. Binpress calls this "Open Source" although it lacks
all the qualities that everyone else assumes with that term.

Thus, Binpress always looks to combine their one very _good_ point (that
better funding for Open Source is important) with a bunch of junk trying to
say that buying their proprietary software is the answer.

------
fidotron
OpenSSL, and the other recent security problems, are just the top of a rabbit
hole that can only ultimately be resolved with isolated, special-purpose
hardware. Frankly, we shouldn't trust our systems, open or proprietary, for
the very simple reason that they are too complex to verify.

Only by moving crypto functions to a separate user maintainable black box will
this tide ever be stemmed. Of course, verifying that black box then becomes
problematic, but it would be easier than the current situation.

~~~
mehrdada
> too complex to verify

There is a verified optimizing C compiler, CompCert. Admittedly, it is not
gcc, and it is not easy to do, but still. Writing a verified SSL
implementation is probably not more difficult than that.

~~~
pjreddie
seL4 is 9000 lines of C and took 11 person-years to verify. How big is
OpenSSL?

Also, verification isn't a magic bullet, you need a good spec.

~~~
mehrdada
> seL4 is 9000 lines of C and took 11 person-years to verify. How big is
> OpenSSL?

Total market cap of the top three tech companies is more than a trillion
dollars. Even a hundred times those resources would be affordable for them,
given the criticality of the project. The replacement does not have to be
written in C; it can be written in an ML-like language and expose an external
C interface.

> Also, verification isn't a magic bullet, you need a good spec.

True, but drawing from the CompCert anecdote, I suspect bugs in a verified
implementation would be orders of magnitude less likely.

------
cabinpark
I've always interpreted Linus's law in the following way: given a bug, there
will exist someone to whom the bug is obvious and who will immediately spot
it. However, the law doesn't state how many people you would need: it might be
2, or it might be 100,000.

------
arikrak
If OpenSSL were closed-source and a vulnerability were found in it, couldn't
it have been patched without revealing what the issue was? This seems to be a
big security issue with open source.

~~~
skybrian
No, unless it's the sort of software that doesn't need to be distributed at
all. Security patches to widely-used software are attractive targets for
reverse engineering.

