
Cryptography is a science, not engineering - cperciva
http://www.daemonology.net/blog/2013-06-17-crypto-science-not-engineering.html
======
sdevlin
I don't think this is good advice for the developer population at large.

> Sure, it's complex and you have to get all the details right

What's really dangerous about cryptography (someone on HN pointed this out a
few weeks ago) is that you get very little feedback on this. It's very easy to
take reliable primitives and build a broken system. When I say "broken" I
don't mean "theoretically vulnerable" -- I mean "game over".

I get paid to look for these bugs, and they are legion.

> so for developers, I recommend a more modern approach to cryptography —
> which means studying the theory and designing systems which you can prove
> are secure.

The average application developer does not have the time or the need to learn
the theory at a deep level. And as before, it's very difficult to say with
confidence "this design has no bugs".

The best advice I can give developers is to play very conservatively. Our
default recommendation is PGP for data at rest and TLS/SSL for data in
transit. Or some high-level library like KeyCzar, NaCl, etc.

~~~
majelix
> is that you get very little feedback on this.

Are there any easy-to-implement tests for this sort of thing? For example

* For mathematical functions, I can trivially test for an expected value, a too-low value, a too-high value, and a not-a-number input.

* For input/output sanitization, I can pump in crazy Unicode characters or system code.

* For encryption, I can ... open up tcpdump and see that it's ostensibly garbage-looking?
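For example, here's a toy sketch of why "ostensibly garbage-looking" proves nothing (hypothetical Python; keyed SHA-256 stands in for a real block cipher, which the stdlib lacks). In ECB mode, equal plaintext blocks encrypt to equal ciphertext blocks, so the data's structure leaks even though every individual byte passes the eyeball test:

```python
import hashlib

def toy_ecb_encrypt(key, plaintext):
    """ECB-style: each 16-byte block is encrypted independently.

    The "cipher" here is keyed SHA-256 truncated to 16 bytes -- a
    stand-in, not invertible and not secure. It only reproduces
    ECB's fatal determinism.
    """
    assert len(plaintext) % 16 == 0
    return b"".join(
        hashlib.sha256(key + plaintext[i:i + 16]).digest()[:16]
        for i in range(0, len(plaintext), 16)
    )

key = b"0123456789abcdef"
# Structured data: the same 16-byte record four times, then one variant.
pt = b"ADMIN=0;BAL=0500" * 4 + b"ADMIN=1;BAL=0500"
ct = toy_ecb_encrypt(key, pt)
blocks = [ct[i:i + 16] for i in range(0, len(ct), 16)]

# Every byte of ct looks like garbage in tcpdump, and yet:
print(blocks[0] == blocks[1])  # True -- repeated records are visible
print(blocks[0] == blocks[4])  # False -- the odd record stands out
```

No key is needed to learn which records are identical, which is exactly the kind of break no amount of eyeballing will catch.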

~~~
sdevlin
Unfortunately, I don't think so.

It's pretty easy to eyeball your ciphertext and think it's sufficiently
garbage-looking, but it's difficult to predict what an attacker can do with
access to your running system. Seemingly innocuous flaws can lead to complete
plaintext recovery or the ability to forge arbitrary ciphertext. (Here's a fun
way to get a taste: [http://www.matasano.com/articles/crypto-challenges/](http://www.matasano.com/articles/crypto-challenges/))

@cperciva is right that cryptographic security is something that needs to be
proven. Since I'm not smart enough to do that, I avoid designing my own crypto
systems, and I recommend clients do the same. Lots of risk, little or no
reward.

------
lawnchair_larry
This article is quite confusing. cperciva seemingly doesn't understand how
"modern crypto", as far as the general developer population understands it, is
very much _not_ different from 90s crypto.

Here is a great real-world example from last month. Synergy added "encryption
support":

[http://synergy-foss.org/spit/issues/details/12/](http://synergy-foss.org/spit/issues/details/12/)

(scroll to the bottom)

 _"Stryker, we actually use the Crypto++ library and do not "code the
encryption" as you put it. If you are not happy using Crypto++, then please
disable encryption and use SSH tunnelling instead. The trend seems to be that
most users do not know how or do not want to use SSH tunnelling, and would
prefer for this to be built into Synergy itself.

Discussing this further is a waste of time. Patches welcome."_

~~~
_phred
Wow, that comment thread... does not lend itself to confidence in their
project's security.

It also illustrates a really key point about crypto: because it _looks_ simple
(oh, just run the bytes through that function/hash/send them over SSL), people
assume that it _is_ simple and that they know enough to hack together a
decently secure system.

At the very least, a healthy respect of crypto theory is called for. In my
experience most developers do not have this healthy respect and see crypto as
a magic black box that makes data unreadable.

I find attacks on cryptosystems illustrative for the "oh CRAP" moment. Oh CRAP
salted hashes are a terrible idea. Oh CRAP you can pad a hash to make a remote
system accept "signed" data. The more I learn and the older I get, the more
cautious I am.
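A minimal sketch of that first "oh CRAP" (hypothetical Python, stdlib only): a salt defeats precomputed rainbow tables, but against a fast hash a dictionary attack barely notices it.

```python
import hashlib, os

def store(password):
    """Salted SHA-256: blocks rainbow tables, NOT per-user brute force."""
    salt = os.urandom(16)
    return salt, hashlib.sha256(salt + password.encode()).digest()

def crack(salt, digest, wordlist):
    # The salt is stored right next to the hash, so the attacker
    # simply replays the same fast hash for every candidate password.
    for word in wordlist:
        if hashlib.sha256(salt + word.encode()).digest() == digest:
            return word
    return None

salt, digest = store("hunter2")
print(crack(salt, digest, ["letmein", "password", "hunter2"]))  # hunter2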

------
jacquesm
This argument is essentially 'breakers versus builders'. Breakers have the
upper hand here: breaking is a lot easier than building, and criticizing is
easier than creating. So people who don't create, and who in general fill a
role that consists of merely trying to destroy what others have created, have
an inherent advantage: they need to succeed only once to prove their perceived
superiority over the creators.

Crypto is complex enough that more people will land in the destructive camp,
it is by far the simpler approach to fame and riches.

But that does not mean that a 'breaker' can automatically move to 'builder'
status; testimony to that is the small number of people we think are capable
of constructing solid cryptographic systems. Colin is one of the people who
has definitely achieved 'builder' status: he's not scared to publish his work,
and he has made meaningful advances in the field.

If you feel like it, you too can be famous: analyse Tarsnap and see if you can
find a bug, and Colin will be happy to advance your stature. See:

[http://www.tarsnap.com/bounty-winners.html](http://www.tarsnap.com/bounty-winners.html)

Meanwhile, take Colin's advice if you ever want to advance from 'breaker' to
'builder'. Any idiot can break a window; it takes a lot of expertise to make a
flat piece of glass and to set it properly -- far more than breaking a window
does.

~~~
tptacek
It drives me fucking bananas when people compare "breaking" to "criticizing"
and "building" to "creating". Tell that to Don Coppersmith or Daniel
Bleichenbacher.

Anybody can create a block cipher. Anybody can create a new encrypted
transport. It's _far_ easier to cough up a crappy new design than it is to
break even a mediocre one.

But, more importantly, "breaking" a cryptosystem isn't "destructive". It's the
most productive thing you can do to crypto. You can't prove a negative. All
you can do is spot the designs we can know we shouldn't be using.

"Any idiot can break a window". Sheesh. Have you ever coded a crypto attack?
Would you like me to send you a model problem, Jacques?

~~~
jacquesm
How about you create something for a change instead?

~~~
apu
Wow this is so out of line.

I can't believe you'd write something this awful as a direct personal attack.

~~~
jacquesm
Agreed.

------
glurgh
Can't cryptography be both a science (or branch of mathematics) and an
engineering discipline?

The science part of it will continue to provide methods of greater robustness
and security, laden with increasingly better and broader security proofs and
properties.

The engineering side of it will continue to seek and cling onto any slightest
toehold afforded by mis-steps in design, implementation, protocol, all the way
up to UI. Much of this is not, at least pre-emptively, subject to mathematical
analysis.

This seems to have been a recurring theme for as long as cryptography has been
around rather than some specific aspect '90s vs '10s crypto.

~~~
mikegioia
Honestly. There's computer science and computer engineering. There can be
crypto science and crypto engineering just the same.

------
wglb
This is a very interesting discussion between two practitioners that I have
tremendous respect for.

To me the discussion highlights one interesting point: the provability of
programs. In particular, Colin says _I recommend a more modern approach to
cryptography — which means studying the theory and designing systems which you
can prove are secure._

Now, I remember back in the days of structured programming, when we were all
reading Dijkstra and thinking about proving programs. We concluded that it was
pretty expensive to do so. In all my decades of programming, I haven't made
a serious attempt to do so. The best I have done is to (sometimes) write
programs that are hopefully provable.

The possibility of writing provable programs to me is confounded by the work
done by John Regehr, who seems to be the current champion of fuzzing
compilers. His techniques find legions of bugs in compilers, including 17
errors in one research compiler that is _proved correct_.

Nonetheless, my question for Colin is how many programmers do you feel are
capable of proving programs correct, be they compilers or cryptography
libraries? For those of you who haven't given it a shot, take a look at some
of the links on the side of Colin's blog post, the "Software Development Final
Exam". Not very many of us did very well with that. (My feeble excuse is that
there weren't any computer science departments when I went to Engineering
School.)

I guess that Colin would agree that anyone who does not do well on those exams
might not do well in writing and proving programs correct.

And on the other hand, Thomas explodes stuff that is written by a range of
programmers covering the whole spectrum of talent. He sees on a day-by-day
basis the result of confident, talented programmers when subjected to
excruciating punishment.

I do enjoy the discussion between these two.

~~~
wglb
Uh, the programs are subject to excruciating punishment, not the programmers.
Just to clarify.

~~~
nknighthb
I'm reminded of a talk by a guy who ran a penetration testing team (I think in
Israel?) which was hired by a bank on a highly permissive contract. It led to
them trying to physically rob the bank just because they could. (Spoiler: It
didn't go well.)

Unfortunately, I don't know many programmers who would take kindly to rubber
hose pentesting.

~~~
andrewaylett
This?

[https://www.youtube.com/watch?v=RJVHTQSvUIo](https://www.youtube.com/watch?v=RJVHTQSvUIo)

------
clarkmoody

      ...it is like planning a gravity-assisted interplanetary
      trajectory. Sure, it's complex and you have to get all 
      the details right — but once you start moving, the only 
      way you will fail to reach your destination is if the 
      laws of physics (or mathematics) change. 
    

I will make a _slight_ complaint about this statement, from the perspective of
an aerospace engineer: long trajectories may be subject to a significant
amount of uncertainty. Recall the fears of a huge asteroid[1] hitting Earth in
2036. It wasn't until refined measurements came in that we could be certain
that it wouldn't go Deep Impact on us.

Other long-range space probes come under the influence of the many small
objects in the solar system on their cold voyages, leading to course
corrections during the mission. Estimating the position of these spacecraft or
asteroids leads to an _uncertainty_ ellipsoid in space, representing N%
probability that the object is inside it.

So, yes, astrodynamics does work really well over short distances, but just
like other engineering arenas, there is no Exact Answer.

As for the rest of the article, I'm not qualified to agree/disagree. The
engineer in me just wants to use secure tools to make secure services and
protect private data.

[1]
[http://en.wikipedia.org/wiki/99942_Apophis](http://en.wikipedia.org/wiki/99942_Apophis)

~~~
andrewflnr
Probably the "refined measurements" you mention fall under "get[ting] all the
details right". But then I guess you can explain almost any requirement that
way.

~~~
clarkmoody
Right. We don't have a perfect model of the solar system, or Earth for that
matter. Of course, with a perfect model, we could predict orbits (or any other
physical phenomenon) perfectly.

I think crypto, being a digital phenomenon, is way more on the science side
than orbital determination.

Now, _crypto on satellites_ poses a unique challenge, due to the interaction
of solar radiation with electronics.

------
tptacek
Excuse the prolix comment; I'm not feeling well today.

I think people who follow both me and Colin on HN know that I have a lot of
respect both for him and for Tarsnap, the service he runs, which is the only
encrypted backup service I have ever recommended to anyone and which is to
this day my go-to recommendation for people looking to safely store data in
the cloud. Colin has built one of the very few modern cryptosystems I actually
trust.

First, let me dodge Colin's whole post. My Twitter post was:

 _If you’re not learning crypto by coding attacks, you might not actually be
learning crypto._

(I was cheerleading people doing our crypto challenges
[[http://www.matasano.com/articles/crypto-challenges/](http://www.matasano.com/articles/crypto-challenges/)] and didn't
think much of my twerp; I didn't exactly try to nail it to the door of the All
Saints Church).

Note the word "might". I specifically chose the word "might" thinking "COLIN
PERCIVAL MIGHT READ THIS". Colin, "might" means "unless you're Colin".

Anyways: I think the point Colin is making is valid, but is much more subtle
than he thinks it is.

Here's what's challenging about understanding Colin's point: in the real
world, there are two different kinds of practical cryptography: cryptographic
design and software design. Colin happens to work on both levels. But most
people work on one or the other.

In the world of cryptographic design, Colin's point about attacks being
irrelevant to understanding modern crypto is clearly valid. Modern
cryptosystems were designed not just to account for prior attacks but, as much
as possible, to moot them entirely. A modern 2010's-era cryptosystem might for
instance be designed to minimize dependencies on randomness, to assume the
whole system is encrypt-then-MAC integrity checked, to provide forward
secrecy, to avoid leaking innocuous-seeming details like lengths, &c.
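As one hedged sketch of what the encrypt-then-MAC discipline looks like in code (hypothetical Python; a SHA-256 counter-mode keystream stands in for a real cipher, and every name here is invented for illustration -- real systems should use a vetted AEAD construction instead):

```python
import hashlib, hmac, secrets

def keystream(key, nonce, length):
    # Toy counter-mode keystream; stands in for AES-CTR.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(enc_key, mac_key, nonce, plaintext):
    # Encrypt first, then MAC the nonce and ciphertext under an
    # independent key: receivers can reject forgeries before any
    # decryption happens.
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(enc_key, nonce, len(plaintext))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def unseal(enc_key, mac_key, sealed):
    nonce, ct, tag = sealed[:16], sealed[16:-32], sealed[-32:]
    expect = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("bad MAC")  # verify BEFORE decrypting
    return bytes(c ^ k for c, k in zip(ct, keystream(enc_key, nonce, len(ct))))

enc_key, mac_key = secrets.token_bytes(32), secrets.token_bytes(32)
box = seal(enc_key, mac_key, secrets.token_bytes(16), b"attack at dawn")
print(unseal(enc_key, mac_key, box))  # b'attack at dawn'
```

The point of the shape, not the toy cipher: a single flipped ciphertext bit fails the MAC check and never reaches the decryption step.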

While I think it's helpful to understand the attacks on 1990's-era crypto so
you can grok what motivates the features of a 2010's-era ("Crypto 3.0")
system, Colin is right to point out that no well-designed modern system is
going to be vulnerable to a (say) padding oracle, or an RSA padding attack
(modern cryptosystems avoid RSA anyways), or a hash length extension.

In this sense, learning how to implement a padding oracle attack (which
depends both on a side channel leak of error information _and_ on the failure
to appropriately authenticate ciphertext, which would never happen in a
competent modern design) is a little like learning how to fix a stuck
carburetor with a pencil shaft.

The deceptive subtlety of Colin's point comes when you see how cryptography is
implemented in the real world. In reality, very few people have Colin's
qualifications. I don't simply mean that they're unlike Colin in not being
able to design their own crypto constructions (although they can't, and Colin
can). I mean that they don't have access to the modern algorithms and
constructions Colin is working with; in fact, they don't even have
intellectual access to those things.

Instead, modern cryptographic _software_ developers work from a grab-bag of
'80s-'90s-era primitives. A new cryptosystem implemented in 2013 is, sorry to
say, more likely to use ECB mode AES than it is to use an authenticated
encryption construction. Most new crypto software doesn't even attempt to
authenticate ciphertext; cryptographic _software_ developers share a pervasive
misapprehension that encryption provides a form of authentication (because
tampering with the ciphertext irretrievably garbles the output).
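A toy sketch of that misapprehension (hypothetical Python; an XOR keystream stands in for AES-CTR, which is equally malleable without a MAC): tampering doesn't garble anything, it flips exactly the bits the attacker chooses.

```python
import hashlib

def toy_stream_xor(key, nonce, data):
    # Toy CTR-style keystream from SHA-256; the same function
    # encrypts and decrypts. A stand-in for a real stream mode.
    out = bytearray()
    block = b""
    for i, b in enumerate(data):
        if i % 32 == 0:
            block = hashlib.sha256(key + nonce + (i // 32).to_bytes(8, "big")).digest()
        out.append(b ^ block[i % 32])
    return bytes(out)

key, nonce = b"k" * 32, b"n" * 12
ct = toy_stream_xor(key, nonce, b"pay alice $0100")

# The attacker knows the message format, not the key: XORing the
# ciphertext with old_byte ^ new_byte rewrites the plaintext in place.
forged = bytearray(ct)
for i, (old, new) in enumerate(zip(b"$0100", b"$9999")):
    forged[10 + i] ^= old ^ new

print(toy_stream_xor(key, nonce, bytes(forged)))  # b'pay alice $9999'
```

No garbling, no key, no detection -- unless the ciphertext is authenticated.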

I think it's telling that Colin breaks this out into '90s-crypto and
2010's-crypto. For instance:

[http://www.daemonology.net/blog/2011-01-18-tarsnap-critical-security-bug.html](http://www.daemonology.net/blog/2011-01-18-tarsnap-critical-security-bug.html)

This is an AES CTR nonce reuse bug in Colin's software from 2011. Colin knew
about this class of bug long before he wrote Tarsnap, but, like all bugs, it
took time for him to become aware of it. Perhaps he'd have learned about it
sooner had more people learned how cryptography actually works, by coding
attacks, rather than reading books and coding crypto tools; after all, Colin
circulates the code to Tarsnap so people can find exactly these kinds of bugs.
Unfortunately, the population of people who can spot bugs like this in
2010's-era crypto code is very limited, because, again, people don't learn how
to implement attacks.

But I'll push my argument further, on two fronts.

First: Colin should account for the fact that there's a significant set of
practical attacks that his approach to cryptography doesn't address: side
channels. All the proofs in the world don't help you if the branch target
buffer on the CPU you share with 10 other anonymous EC2 users is effectively
recording traces of your key information.

Second: Colin should account for the new frontiers in implementation attacks.
It's easy for Colin to rely on the resilience of "modern" 2010's-era crypto
when all he has to consider is AES-CTR, a random number generator, and SHA3.
But what about signature systems and public key? Is Colin so sure that the
proofs he has available to him account for all the mistakes he could make with
elliptic curve? Because 10 years from now, that's what everyone's going to be
using to key AES.

So, I disagree with Colin. I think it's easy for him to suggest that attacks
aren't worth knowing because (a) he happens to know them all already and (b)
he happens to be close enough to the literature to know which constructions
have the best theoretical safety margin and (c) he has the luxury of building
his own systems from scratch that deliberately minimize his exposure to new
crypto attacks, which isn't true of (for instance) anyone using ECC.

But more importantly, I think most people who "learn crypto" aren't Colin. To
them, "learning crypto" means understanding what the acronyms mean well enough
to get a Java application working that produces ciphertext that looks random
and decrypts to plaintext that they can read. Those people, the people
designing systems based on what they read in _Applied Cryptography_, badly
need to understand crypto attacks before they put code based on their own
crypto decisions into production.

~~~
cperciva
Excuse the prolix reply; I have a flight to catch.

As I wrote in my blog post, I have a lot of respect for Thomas. He's who I
usually point people at when they want their code audited. I really hate
reading other people's code and I trust Thomas (well, Matasano) will do a good
job.

 _two different kinds of practical cryptography: cryptographic design and
software design_

Agreed.

 _Colin happens to work on both levels. But most people work on one or the
other._

I'm generally writing for an audience of people who already know how to write
software, but want to know something about crypto. So I take one as given and
focus on the other.

 _modern cryptographic software developers work from a grab-bag of
'80s-'90s-era primitives_

Right, and that's exactly what I'm trying to change through blog posts and
conference talks. We know how to do crypto properly now!

 _This is an AES CTR nonce reuse bug in Colin's software from 2011. Colin
knew about this class of bug long before he wrote Tarsnap, but, like all bugs,
it took time for him to become aware of it._

To be fair, that was not a crypto bug in the sense of "got the crypto wrong"
-- you can see that in earlier versions of the code I had it right. It was a
dumb software bug introduced by refactoring, with catastrophic consequences --
but not inherently different from accidentally zeroing a password buffer
before being finished with it, or failing to check for errors when reading
entropy from /dev/random. Any software developer could have compared the two
relevant versions of the Tarsnap code and said "hey, this refactoring changed
behaviour", and any software developer could have looked at the vulnerable
version and said "hey, this variable doesn't vary", without needing to know
anything about cryptography -- and certainly without knowing how to implement
attacks.

 _Unfortunately, the population of people who can spot bugs like this in
2010's-era crypto code is very limited, because, again, people don't learn how
to
implement attacks._

Taking my personal bug out of the picture and talking about nonce-reuse bugs
generally: You still don't need to learn how to implement attacks to catch
them. What you need is to know the theory -- CTR mode provides privacy
assuming a strong block cipher is used and nonces are unique -- and then
verify that the preconditions are satisfied.
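A sketch of what happens when that precondition fails (hypothetical Python; a SHA-256 keystream stands in for AES-CTR, and none of this is Tarsnap's actual code): two messages under one nonce hand the attacker the XOR of the plaintexts, and one known plaintext then reveals the other.

```python
import hashlib

def toy_ctr(key, nonce, data):
    # Toy keystream standing in for AES-CTR.
    ks = b""
    counter = 0
    while len(ks) < len(data):
        ks += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"K" * 32
nonce = b"this nonce never varies"   # the precondition violation
c1 = toy_ctr(key, nonce, b"attack at dawn!!")
c2 = toy_ctr(key, nonce, b"retreat at noon!")

# The keystream cancels out: c1 XOR c2 == p1 XOR p2. The key never
# enters this computation.
xor_of_plaintexts = bytes(a ^ b for a, b in zip(c1, c2))
recovered = bytes(a ^ b for a, b in zip(xor_of_plaintexts, b"attack at dawn!!"))
print(recovered)  # b'retreat at noon!'
```

Spotting "this variable doesn't vary" in the `nonce` line is exactly the precondition check, and it requires no attack-coding at all.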

~~~
fixxer
Get a room you two.

EDIT: On a more serious note, isn't crypto both science and engineering? We
have the theoretical aspects, etc... Then we have the practical aspects of
implementing these systems in production within an ecosystem that is
constantly fighting entropy. I declare a draw.

~~~
mcherm
No, please DON'T get a room. I am learning something by seeing the (polite)
debate played out in a public forum.

------
pbsd
I agree with both Ptacek and Percival, with one caveat: the Ptacek statement
should be "If you're not learning how to use crypto by coding attacks, you
might not actually be learning how to use crypto".

Even within the academic community there has been recent upheaval over the
value of proofs. Ostensibly, proofs are good in the sense that they restrict
the focus of cryptanalysis: without a security proof of a mode (say, CTR or
HAIFA), cryptanalysis couldn't focus on distinguishing the core function from
random in some sense; it would have to consider everything. Note that coding
attacks doesn't help much here. It also seems that the role of proofs is
often misunderstood, and assumed to mean much more than it does.

Then there are the practical aspects of the trade: cryptographic engineering.
This involves avoiding information leaks (timing, power, etc), knowing what to
do with IVs and keys and nonces, and the list goes on. This is a much more
hands-on task, and often much less documented ("the implementation is left as
an exercise to the reader"). Experience on building and attacking these is
often the best way to learn how to do it, and not by reading a book about it.

------
gridmaths
Where can one find a concise introduction to 'Modern/2010 Crypto : theory and
practice' for programmers ?

~~~
JoachimSchipper
You could do very much worse than Katz & Lindell's "Introduction to Modern
Cryptography". It's fairly mathematical, which fits well with Colin's point.

------
anologwintermut
Though neither side said this, I feel it's important to point out: if you're
learning crypto SOLELY by coding attacks, you ARE NOT actually learning crypto.

A good chunk of bad cryptography was done by people who thought "eh, I can't
see any attack against this" -- or read Applied Cryptography and thought that
was enough. Hopefully the Matasano crypto challenges don't have the same
effect.

------
tel
Assuming I have every interest in using NaCl for all practical intents and
purposes for a long time, assuming I have no interest in doing Matasano's
crypto challenges, assuming I have a very healthy dosage of algebra, number
theory, proof methods, and a sufficient general mathematical reading level,
how do I go about learning crypto from the theory? If I raced with someone
doing Matasano's challenges will I beat them? If I do, will I be able to
practically implement better security systems by arguing for the value of
2010-era cryptography and motivating comprehensive reviews of the 80s era
systems we already use?

------
yk
I do not really have something to comment on the posting, but I think the
headline is based on a false dichotomy. What we call cryptography spans both,
since the study of crypto primitives is clearly mathematics and the use of
OpenSSL is very much engineering. (And designing secure systems is somewhere
in the middle.)

------
sobering
For those interested, Dan Boneh's course on Cryptography starts another
session as of today.

[https://www.coursera.org/course/crypto](https://www.coursera.org/course/crypto)

~~~
jessaustin
Anyone have any idea when they're going to quit putting off Cryptography II?

~~~
dinkumthinkum
I think it starts in July.

------
e3pi
F. L. Bauer says in his Springer-Verlag book that a holocryptic running key
taken from any offset of the digit expansions of sqrt(2), sqrt(5), and
sqrt(17) is vulnerable -- does anyone know why? The only possibly related
thing I can imagine is 19-year-old Gauss's famous proof that the 17-gon is
constructible with Euclidean tools (and the pentagon may also be constructed).
Could something in early Galois theory or abstract algebra be applied to
recognizing the digit expansions of certain algebraic irrationals?

------
sbi
I'm surprised nobody has mentioned Koblitz and Menezes in this thread.
Unfortunately, the mathematical literature is littered with insecure, but
"provably secure," cryptosystems.

[1] [http://anotherlook.ca/](http://anotherlook.ca/)
[2] [http://www.ams.org/notices/201003/rtx100300357p.pdf](http://www.ams.org/notices/201003/rtx100300357p.pdf)

------
quinndupont
FWIW, the history of cryptography backs up this basic assertion. One of my
still-not-yet-published assertions is that cryptography is often better
conceptualized as a hermeneutic exploration of the natural world -- but not
the physical world (as "typical" science studies); rather, the "ideal" world,
the laboratory of the mind. Bacon, Wilkins, Leibniz and many other "fathers" of
modern Western science were all excellent cryptographers.

------
oggy
This article has a bit of an "ivory tower" reek to it, I don't wholly buy it.
Some points are clearly valid: use modern crypto primitives if you can, and
know their properties. But things are more complicated than that.

First, even if you say that proofs are enough, you've got to know what you're
proving. The problem is that, AFAIK, most security properties are actually
defined as absence of a particular attack (or a class thereof). Thus knowing
the properties you want to achieve is equivalent to knowing the attacks you
want to avoid. In other words, even if I build my system to have a property A,
I might not know that I also need to attain property B (thus securing it
against the complement of B).

Second, if you do want actual proofs, well, good luck. You start off with the
indistinguishability stuff, which is not easy in itself. Toss in the
distributed aspects of Internet applications and you've got yourself a proper
mess. Case distinctions abound, and this stuff slowly crosses into the domain
of the intractable. If you look at game-based security proofs, well, for
anything non-trivial, who can really be confident that the proofs are correct?
Machine verification would help, but our tools are still too weak, and still
the domain of a select few specialists.

Third, even if you do get your proofs, well, more likely than not, these are
going to be based on a simplified model which will sweep a lot of stuff under
the rug. E.g. I don't know of works which address things like timing attacks
in anything of even remotely practical value. And there's a bunch of other
stuff: key distribution, key management, etc.

Lastly, as a lot of other commenters have pointed out, you are also more likely
than not to deal with existing codebases at some point, where you might end up
plugging the holes rather than constructing.

------
b0rsuk
And I say it's shamanism, not science. It's very hard to prove something can't
be done. Rather, "as of now, we know no way to do X". Time after time flaws
are found in RSA, SSL, PGP. New number properties are constantly discovered.
That's why it is - at least partially - security through obscurity.

The Pythagorean theorem doesn't get outdated! Cryptography does.

~~~
TeMPOraL
Well, the science part of crypto is Just Maths. It is on par with the
Pythagorean theorem. The problems occur mostly on the engineering side of the
equation.

~~~
b0rsuk
Fair enough. But it's incompatible with the claim made in the article - that
cryptography is science and not engineering. It can be both. In fact, it's
likely that it will split into two disciplines in the future, much like probability
and statistics.

Cryptography reminds me of cellular automata in that both are made-up concepts
that you can have lots of fun with, if you enjoy that kind of thing. I prefer
CA because of its visual nature.

------
Sami_Lehtinen
Unfortunately RC4 is still very widely used. About 50% of SSL/TLS aka HTTPS
traffic is RC4 encrypted "because it's the most secure cipher". Uh.

~~~
tptacek
Don't ever use RC4 for anything.

~~~
voltagex_
So, say hypothetically I wanted to check a hypothetical game console's use of
SSL to make sure it didn't use RC4. How would I go about it?

------
Robin_Message
Good science requires being willing to falsify even your own theories.

That willingness to break what you have made, to take on an attacker's mindset
and say "What can go wrong with this?"

That is _exactly_ what coding attacks teaches.

~~~
dinkumthinkum
Science was a poor choice of word. Mathematical discipline is a better one.
Science implies empiricism, cryptography is mathematical. The falsification
you're talking about applies empirical, observable phenomena, rather than
mathematical theorems.

------
jamwt
Let's just all use ( [http://nacl.cr.yp.to/](http://nacl.cr.yp.to/) ) and stop
arguing.

~~~
dchest
NaCl is awesome and simplifies many things, but it's not a protocol, so we
can't stop arguing :( Example questions: how to do user authentication, should
I use random nonce or a counter for this particular project, what happens when
the nonce is reused (does it break my MAC, encryption, or both), how do I
distribute public keys, etc.

~~~
jamwt
Most of these questions can be solved by a little "cookbook" that maps these
use cases onto pieces of NaCl (and maybe scrypt). I understand we could then
argue over the cookbook (though I don't know in practice how much we would),
but it's nice that we're arriving at a place where the libraries are high-
level enough that it's somewhat hard to screw it up with just a little bit of
guidance.

------
etler
Can we sum up this semantic debate as: cryptography is the science; security
is its application and engineering?

------
coherentpony
Engineering is a science. Shitty title is shitty.

~~~
CPAhem
I agree.

Most attacks on secure systems involve attacking the engineering - the
implementation of the system, rather than even attempting to break the crypto,
even if it is only DES.

~~~
dchest
WEP. Do you include it in "crypto" or "implementation" category?

What about not using authenticated encryption? Crypto or implementation?

Storing SHA512 of passwords? Using non-cryptographic random number generators?

~~~
coherentpony
> non-cryptographic random number generators

Care to explain the distinction between a 'cryptographic' random number
generator and a 'noncryptographic' random number generator?

A random number generator is a random number generator. Some are worse than
others under various metrics. Arguably, random number generators that are ill-
equipped to generate high-quality random numbers shouldn't be used at all. For
_anything_ , cryptography included.

~~~
tptacek
All cryptographers make a deliberate distinction between RNGs and CSPRNGs.

There are high-quality RNGs that aren't good CSPRNGs. CSPRNGs need to be fast,
ready to deliver results after a cold boot, and unbiased. What you're probably
thinking of as a "high-quality" source of random numbers is just part of a
CSPRNG (the entropy source).

Conversely, CSPRNGs aren't actually suitable for all random number needs in
programming; for instance, in software testing, you want an RNG you can seed
and retrieve deterministic results from. If you can do that with your RNG,
it's very likely that it's not suitable for cryptography.
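A minimal stdlib illustration of the split (hypothetical example code; the seed values are arbitrary):

```python
import random, secrets

# Seedable RNG: perfect for reproducible tests, fatal for key
# material -- anyone who learns or guesses the seed can replay
# every output.
r1, r2 = random.Random(1337), random.Random(1337)
print([r1.randrange(256) for _ in range(4)])
print([r2.randrange(256) for _ in range(4)])  # identical sequence

# CSPRNG: draws from the OS entropy pool. Seeding SystemRandom is
# silently ignored, and outputs can't be replayed from app state.
sr = random.SystemRandom()
sr.seed(1337)  # documented as a no-op for SystemRandom
print(secrets.token_bytes(16) == secrets.token_bytes(16))  # False
```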

------
depr
The essential characteristics of science are: (1) It is guided by natural law;
(2) It has to be explanatory by reference to natural law; (3) It is testable
against the empirical world; (4) Its conclusions are tentative, i.e. are not
necessarily the final word; and (5) It is falsifiable. (Ruse and other
science witnesses.)

[http://www.talkorigins.org/faqs/mclean-v-arkansas.html](http://www.talkorigins.org/faqs/mclean-v-arkansas.html)

~~~
SeanLuke
This definition is from a non-scientist. And I think this is an absurdly over-
specific and over-naturalist definition. #5 in particular is annoying given
that Popperism is an ideal that essentially no scientific fields meet
(including physics).

Rather, I think the most reasonable definition of a science is very simple: a
discipline whose primary technique for gathering new results is the scientific
method.

Anyway, given either definition, I don't see how Cryptography is a science.

------
hislaziness
My takeaway from this conversation - No amount of mathematics can address
human stupidity.

