
Why I don't touch crypto - fs111
http://pilif.github.io/2013/07/why-I-dont-touch-crypto/
======
dobbsbob
Good thing Satoshi wasn't scared of cryptography or we wouldn't have bitcoin.
Also good thing the Truecrypt devs weren't scared of it, or the Debian devs,
or the OpenBSD devs and others who gave us OpenSSH.

Moxie Marlinspike doesn't seem to be scared of it either. According to him
it's not impossible to learn simple, tried and true methods for crypto
engineering that are all spelled out in Schneier's books and various white
papers. These standards were completely ignored by the cryptocat guys (and
ignored repeated warnings from auditors). So if you can read, comprehend and
do bare minimum research of NIST standards you're pretty safe rolling your own
crypto software if you know about attention to detail, the Doom Principle,
PRNGs and their flaws, and know not to attempt to make your own primitives. If
you guys can push out incredibly complex and relatively secure financial
trading software there's no reason you can't roll your own SHA-256 on your
Android phone replacing SHA-1 FDE or even design Redphone like Moxie did.
Would also help if you hung around hashcat forums, read Schneier's cryptogram
newsletter, and watched some lectures about software crypto engineering.

~~~
tptacek
No, _some_ "tried and true" methods of using crypto are in _one_ of Schneier's
books, _Practical Cryptography_. _Practical_ is great, but it's not complete,
a fact mitigated by the effort Schneier and Ferguson go to in that book to
convince people not to write casual crypto.

The methods described in _Applied Cryptography_ are unfortunately well-tried,
but few of them are true. _Applied_ is an almanac of cryptographic concepts.
Where _Practical_ tries hard to present best practices at every point in the
book, _Applied_ instead strives for the broadest coverage; it's a survey, not
an instruction guide. Unfortunately for all of us, ~20 years of _Applied_
readers have tried to put directly into practice the material in that book,
much of which is (relative to modern standards) half-baked.

It's also important to know that _Practical_ is showing its age. There
are very common, very serious vulnerabilities that _Practical_ does very
little to prevent. For instance, "Design Rule 4" in _Practical_, "The Horton
Principle", implies that message authentication should precede encryption.
This construction is now disfavored; protocols that use mac-then-encrypt have
been broken with side channels abetted by attacker chosen ciphertext.
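To make the ordering concrete, here's an encrypt-then-MAC sketch in Python. The "cipher" is a deliberately toy SHA-256 keystream stand-in (never use it for real encryption), and the names are made up; the only point is that the HMAC covers the ciphertext and is verified before anything is decrypted:

```python
import hashlib
import hmac
import secrets

def toy_keystream(key: bytes, n: int) -> bytes:
    # Toy SHA-256 counter keystream, standing in for a real cipher so the
    # example stays stdlib-only. Do not use this to encrypt anything.
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt_then_mac(enc_key: bytes, mac_key: bytes, pt: bytes) -> bytes:
    ct = bytes(a ^ b for a, b in zip(pt, toy_keystream(enc_key, len(pt))))
    # The tag covers the *ciphertext*, so tampering is caught up front.
    tag = hmac.new(mac_key, ct, hashlib.sha256).digest()
    return ct + tag

def decrypt(enc_key: bytes, mac_key: bytes, msg: bytes) -> bytes:
    ct, tag = msg[:-32], msg[-32:]
    expected = hmac.new(mac_key, ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        # Reject before decrypting: attacker-chosen ciphertext never reaches
        # the cipher, closing off the error-oracle class of side channels.
        raise ValueError("bad MAC")
    return bytes(a ^ b for a, b in zip(ct, toy_keystream(enc_key, len(ct))))

ek, mk = secrets.token_bytes(32), secrets.token_bytes(32)
sealed = encrypt_then_mac(ek, mk, b"attack at dawn")
print(decrypt(ek, mk, sealed))  # b'attack at dawn'
```

Contrast with mac-then-encrypt, where the receiver must decrypt before it can check anything, handing the attacker a decryption path to probe.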

Other concepts missing from _Practical_: elliptic curve cryptography ---
particularly given the endangered status of RSA, which would be a nit if not for the extremely
detailed coverage Schneier gives to more theoretical threats to AES, (EC)DSA
parameter tampering, the notion of minimizing randomness from cryptosystems,
the modern AE modes, an in-depth treatment of side channels (where "in-depth"
might mean "at least as much coverage as is given to the question of what
block cipher mode to use") --- particularly error oracles, applications of
hash collisions, RSA message formatting (it doesn't even cover OAEP), and
secure key derivation.

I write this from a place of love; _Practical Cryptography_ is one of my all-
time favorite software security books. It's one of those books you can "read
backwards" to learn how to break systems in addition to learning how to build
them. With age, though, _Practical_ is becoming more useful as a breaker's
guide and less as a builder's guide.

 _There is no book anywhere that a generalist developer can read to learn how
to build a secure cryptosystem from scratch, and generalist developers should
be relying instead on high-level libraries like Keyczar and Nacl._

I don't know a thing about Satoshi's competence (I couldn't, because nobody
knows who he is), but Moxie Marlinspike has spent over 15 years building his
competence. If you're Moxie, build whatever you want.

~~~
Osmium
If I remember rightly, they actually say in Practical Cryptography that some
of the worst crypto they've seen comes from people who've read Applied
Cryptography and then decided to roll their own. Which is presumably one of
the dangers of writing an "accessible" cryptography textbook: people who read
it then think they understand it... (I say this as a fan of Applied
Cryptography, but more as an academic text -- I'd never dream of writing
crypto code myself)

~~~
tptacek
Applied is a good _book_; it's fun and interesting to read. But people should
read it as pop science.

~~~
dfc
What is the deal with Practical Cryptography versus Cryptography Engineering?
I have frequently seen people say they are the "same book." How similar are
they? The publication date is seven years apart. Did they just add a new
foreword and change the title?

~~~
pbsd
They _are_ the same book; Cryptography Engineering is simply the 2nd edition,
although they decided to change the title for some reason.

~~~
tptacek
Everyone in our practice has a copy of one or the other, and the differences
aren't at all significant. I'd buy whichever one you can get cheaper.

------
taway2012
This is not really the most insightful comment, but anyway.

I found the article to be needlessly defeatist.

I am not a crypto expert. I've read Practical Cryptography and have lots of
experience with software engineering in general.

This article (yet again) loosely says "crypto" without specifying whether he's
talking about "crypto protocols" or "crypto primitives" (I use "protocol" in a
theoretical sense: saving a file piped through gpg with passphrase
remembered in memory constitutes a protocol).

It's well understood that mere mortals shouldn't create "crypto primitives".
But I would argue that we're soon going to reach the point where many software
engineers will have to understand crypto protocol creation.

Just like many software engineers in the past 10 years or so have had to
become aware of multicore (NoSQL, horizontal scaling, Go/Rust concurrency,
probably even async callbacks in JavaScript, etc. are all different aspects of
multicore, imho).

I don't think we as a profession should abdicate our responsibility to
store/transport data securely by just saying "crypto is hard; so don't do it".

I also take issue with the claim that crypto code is either 100% working, or
0% working. From a crypto theory point-of-view, yes, that is how
cryptographers think.

But in practice, there is a _VAST_ difference between an attack that requires
2^32 _INTERACTIONS_ with a remote server and one that requires 2^32
_COMPUTATIONS_ on the attackers machine. A cryptographer would say both
attacks are equally easy (kinda like O(n) notation).

Just my two cents.

And finally, re: the RNG vulnerability in Cryptocat, that is very bad and just
sloppy coding. But even that vulnerability required that the attacker
compromise the private SSL key of cryptocat's server. Defense in depth FTW.

~~~
dlitz
No, it's not just crypto primitives, it's crypto protocols, too. It's even
designing other protocols that run _on top_ of crypto protocols (see CRIME).
Even implementing existing crypto primitives is hazardous: D.J. Bernstein's
work on cache timing attacks against AES are proof of that.

The problem with crypto is that it's a specialist profession that generalists
think they can do. In reality, crypto is more like law than software
engineering: You can't just reason it from first principles, and you can't
write any tests that will tell you that it's working. You have to know the
specific attacks that people are capable of, and come up with a strategy that
will avoid not only the current attacks that you know about, but future
attacks that haven't been discovered yet. You're tasked with building a system
of obstacles that nobody will be able to find any clever workarounds for, and
your adversaries are smarter than you, more numerous than you, more well-
funded than you, they're experts in breaking crypto, and they're all _from the
future_.

That's not something a handful of engineers working in isolation can do. It
takes the whole community years to even come close.

Saying that software engineers can design crypto protocols is like saying that
software engineers can be their own lawyers. A few can, but almost all
engineers who think they know the law are wrong. Even lawyers routinely lose
cases. The best we can do is to try to give them better tools to handle the
common cases (e.g. Creative Commons, SSH, better APIs), and remind them to
talk to the specialists before getting too creative.

------
noonespecial
The thing about crypto is that _nobody_ can get it right the first time. It
requires two things that most programmers have in relatively short supply.
Radical transparency and radical _humility_.

~~~
tptacek
I mentioned on another crypto thread the frustrating fact that more than one
professional crypto friend of mine is working on designs that could replace
PGP, not to mention a lot of terrible ad hoc app crypto --- but because
they're pro's, they're uncomfortable sharing designs until they're confident
that they've fully validated them. That's what humility looks like: knowing
you're an industry leading expert at cryptographic design and still waiting
months or even years to publish so you can make sure you got things _right_.

Compare and contrast.

------
graham_king_3
That attitude to crypto is pervasive, annoying, and wrong. We don't tolerate
the "you're too stupid to use that" attitude in any other part of software
development, and we shouldn't tolerate it in cryptography.

Every developer needs to touch crypto. Encrypted communications needs to be
our default. And yes, of course, we should prefer verified, standard
algorithms (NSA Suite B, for example).

It's OK to get it wrong, it's OK to fail forward, even with cryptography.
ROT13 will protect you very well, if your attack vector is someone glancing
over your shoulder for 1 second. As long as the code is open, and you're
honest about what it does, you've made people a little bit safer.

There's a fair amount of gloating around Cryptocat, but it protected people's
communications from me, because I didn't know how to break it. So that's
better than nothing.

~~~
tptacek
Everybody should be able to fly a plane, too. Think of how awesome that would
be! A rebirth of the American general aviation industry; new airplane designs;
solutions to congested airports.

It does not follow from that sentiment that anyone should be able to jump into
the cockpit of a Cessna and just figure things out for themselves.

~~~
ryan-c
The part where this analogy breaks down is that some random person will
probably die in a fire pretty quickly if they try to fly a Cessna. Crypto is
like a Cessna that's easy to fly but if you don't fly it exactly right, a few
months after your flight, the ground under your flight path spontaneously
combusts.

------
16s
I wrote some "home made" crypto about a year ago. It's a one-time pad
implementation.

[http://16s.us/FreeOTP/nsa/](http://16s.us/FreeOTP/nsa/)

The math behind OTP is pretty simple, but I may have made a mistake. I've
posted the source code to crypto forums, HN, wilders security and emailed it
to several prominent crypto developers/experts. No one seems to care nor want
to look at it in-depth.

I've worked as a dev where we encrypted research data using standard,
industry-accepted crypto (RSA and symmetric AES, etc) as well.

Having said that, I'm not an expert. And the real experts (Bruce Schneier)
won't verify anything. They just say, "It's not been broken yet, which is a
good indication."

Dive in and write some crypto code. Do it and make mistakes. That's how you
learn. Not every program is life or death/mission critical. And if you never
do it, you won't learn how. We learn by making mistakes.

Tarsnap is nice crypto code if you like C. It's easy to read too. I have no
relation to Colin Percival or tarsnap. He's an expert, and even he makes
mistakes:

[http://www.daemonology.net/blog/2011-01-18-tarsnap-
critical-...](http://www.daemonology.net/blog/2011-01-18-tarsnap-critical-
security-bug.html)

~~~
jessaustin
_No one seems to care nor want to look at it in-depth._

I'm not an expert, but I wouldn't look at an OTP implementation either. Key
management is already hard enough. It's hard to imagine a useful system you
could build on a one-time pad.

~~~
16s
Anyone wanting to learn more about information theory, decrypting to multiple
plaintext messages or unbreakable encryption would find it interesting. OTP is
an edge case and it does not scale (I understand that), but is useful for
small messages between two parties when privacy is paramount. Governments
have used it.

Also, I can't find any other OTP source code that compiles and works.

~~~
6d0debc071
You take a file and xor it with the pad. It's like five minutes' work, tops.
The user interface on top of that might be very interesting, but I honestly
find it hard to imagine messing up the underlying algorithm in a way that
still lets the keyfile turn the encoded output back into readable plaintext.
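The whole algorithm, sketched in Python (the function name is made up, and the genuinely hard part, generating, exchanging, and never reusing the pad, is deliberately out of frame):

```python
import secrets

def otp_xor(data: bytes, pad: bytes) -> bytes:
    # XOR each byte with the pad; the same call encrypts and decrypts.
    # The pad must be truly random, at least as long as the message, and
    # used exactly once. That key management is where OTPs get hard.
    if len(pad) < len(data):
        raise ValueError("pad must be at least as long as the message")
    return bytes(d ^ p for d, p in zip(data, pad))

pad = secrets.token_bytes(64)
ct = otp_xor(b"meet me at noon", pad)
print(otp_xor(ct, pad))  # b'meet me at noon'
```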

Not wishing to be a drag here, but it's been done over and over again in every
intro to cryptography class I've seen.

Sorry ^^;

------
davidw
All these "crypto.... scary!" things are right of course. But somehow
tiresome. They're ripe for satire... something involving Chuck Norris being
the only one perfect enough to encrypt stuff, just as he's able to compress
random data. 100%. With his biceps.

~~~
tptacek
Serious question: why does this bother you so much?

If we were talking instead about people operating their own pharmacies, or
doing their own home electrical work, nobody would bat an eyelash.

~~~
23david
Home electrical work is a great example.

Just pick up a $10 book at your local Home Depot and if you can read and
follow the simple instructions and guidelines inside, you'll be able to do
most simple electrical work.

Some electricians may say that it's foolish to do your own electrical work
unless you're a trained/certified electrician.

Lots of people do most of their own electrical work without problems, and only
hire expert electricians for the really risky or technical parts such as
connecting to the utility companies or dealing with really high voltages.

~~~
tptacek
"Simple electrical work" might equate to "wiring Nacl into an application".
Designing a cryptosystem to replace Nacl in your application is more akin to
climbing up the pole in your alley and working from there.

But I'm particularly interested in David's response, because he raises this
concern pretty regularly.

~~~
im3w1l
Is there some kind of cryptography for idiots guide around?

Something that explains how to use a high level cryptography library safely. A
guide that recognizes that the reader is probably stupid enough to create a
buffer overflow if he even goes near C++. A guide that assumes the reader will
use rand to generate keys if able to and not reminded _multiple times_ not to.
A guide for people who have never heard about side channels. A guide that
seriously cautions not to hand over the plaintext to Eve even if she asks
nicely.
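The rand-for-keys warning alone would fit on the guide's first page. A few lines of Python, with comments standing in for what I imagine the guide saying:

```python
import random
import secrets

# What the guide would warn against: Python's `random` module is a Mersenne
# Twister, whose future output is predictable after observing enough of it,
# so it must never be used for keys.
weak_key = random.getrandbits(128).to_bytes(16, "big")

# What it would recommend: the OS CSPRNG, exposed by the stdlib `secrets`
# module (or os.urandom).
strong_key = secrets.token_bytes(16)

print(len(weak_key), len(strong_key))  # 16 16
```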

~~~
tptacek
If you use the right library --- Nacl or Keyczar, or call out to PGP --- you
don't need to be an expert to build cryptography into your application.

In all the stories you read about some generalist developer blowing their
users' feet off with bad crypto, what you're reading about is someone
_reinventing_ these libraries with vanity crypto.

~~~
im3w1l
Edited, original from
[https://code.google.com/p/keyczar/wiki/CppTutorial](https://code.google.com/p/keyczar/wiki/CppTutorial)

    
    
      keyczar::Keyczar* crypter = keyczar::Crypter::Read(location);
      if (!crypter)
        return;
    
      crypter->set_compression(keyczar::Keyczar::ZLIB);
    
      std::string input = "Hey Alice, here is Eves message: [quote]"+superescape(evemessage)+"[/quote] Your Bob";
      std::string ciphertext;
      bool result = crypter->Encrypt(input, &ciphertext);
    
    
    

Ooops because, [http://arstechnica.com/security/2012/09/many-ways-to-
break-s...](http://arstechnica.com/security/2012/09/many-ways-to-break-ssl-
with-crime-attacks-experts-warn/)

~~~
tptacek
Sure. I think that speaks to my point; the compression side channel was novel
enough that trained cryptographers missed it in Keyczar.

(Just don't use compression with your encryption and avoid the issue
entirely).

~~~
im3w1l
Well, even if _they_ removed it, our hypothetical library user could add
compression back themselves if not told not to.

------
area51org
This point bears repeating every so often: cryptography is difficult, and
those of us who haven't spent years in its mathematical bowels are not
qualified to create usable algorithms. Meaning: the encryption code you wrote
yourself is probably hackable. Easily hackable.

I first encountered this in a classic essay by PGP's Phil Zimmerman:
[http://www.philzimmermann.com/EN/essays/SnakeOil.html](http://www.philzimmermann.com/EN/essays/SnakeOil.html)

edit: fix incorrect link

~~~
stcredzero
It's not just algorithms. Implementing the algorithms securely is also hard.
Even using libraries implementing algorithms is hard.

Actually, most problems are at the security protocol level. Implementing those
is the hardest.

~~~
area51org
Excellent point; I wish I'd thought to mention that. I spent a few years as a
CISO, and honestly, I know that I really don't want anything to do with the
security industry again, outside of making sure that systems and apps are as secure as
I can make them.

Why? Because security is a lot harder than most people recognize. Blowfish is
an excellent symmetric encryption algorithm, but, as you say, it would be easy
enough to implement it badly and end up with insecure, trusted code.

~~~
tptacek
Blowfish is not an excellent symmetric encryption algorithm. For instance, it
has a 64 bit block size, which makes it riskier to use in CTR mode and makes
some ciphertext malleability problems easier to exploit.

~~~
area51org
Shows you what I know. :-) I assumed because Schneier created it ...

------
mistercow
I feel kind of divided about using the Cryptocat as an example. On the one
hand, yes, off-by-one errors (and similar simple mistakes) are really easy to
make if you're writing something with low feedback.

But the Cryptocat example is frustrating because the code is _so bad_. Not in
a "poorly written" sense, but in a "going about things in a completely insane
way" sense. The code generates a random number between 0 and 1 not by dividing
a 53 bit random number by 2^53, but by _generating 16 decimal digits_ , with
the rationale that 2^53 ≈ 10^16. If the code hadn't been trying to do the
wrong thing in the first place, there wouldn't have been an opportunity for
that off-by-one error.
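For reference, the sane construction is a one-liner (a Python sketch using the stdlib `secrets` CSPRNG; the function name is made up):

```python
import secrets

# The standard construction: 53 uniformly random bits (the full precision
# of an IEEE-754 double) divided by 2**53 gives a uniform float in [0, 1).
def secure_random_unit() -> float:
    return secrets.randbits(53) / (1 << 53)

samples = [secure_random_unit() for _ in range(10_000)]
print(all(0.0 <= x < 1.0 for x in samples))  # True
```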

My point isn't that you'll be protected from simple classes of errors as long
as you're coding sanely, but rather that that is the lesson you teach if you
point to Cryptocat as your example. Crypto _is_ hard, but you can't make that
point very well by pointing to code written by someone who doesn't know the
correct (and incredibly standard) solutions to common _non-crypto_ problems.

------
schrodingersCat
_" Crypto can't be a «bit broken». It can't be «mostly working». Either it's
100% correct, or you shouldn't have bothered doing it at all. The weakest link
breaks the whole chain."_

I completely agree. Ever seen a password hash mega-post on paste-bin? Brought
to you by bad crypto. Saying "I don't know" is one of the hardest things to
do, and also one of the most powerful. I applaud you for knowing your limits
in this regard. I like to say crypto is like surgery, best left for the
experts. You never want to "I kinda messed up that triple bypass," because
there are major consequences for it. The same is true for virtually all
aspects of cryptography. Thank you for this post!

Edit: Here's a link to a blog post that was discussed recently on HN that is
pertinent: [http://www.daemonology.net/blog/2013-06-17-crypto-science-
no...](http://www.daemonology.net/blog/2013-06-17-crypto-science-not-
engineering.html)

~~~
bhitov
I disagree. You could, for example, unintentionally lose entropy without
entirely breaking your crypto.

------
pnathan
I don't like this attitude. It's really anti-hacker.

Of course, the world is littered with cryptosystems that some foo thought
would be secure and wasn't. XOR isn't encryption. ;)

I would suggest that hackers interested in crypto take a sober and unafraid
look at the history of crypto and then spend time breaking ciphers, "hacking"
in all senses of the word. Read the crypto papers and work on the advanced
math until you really and truly grasp "all the things". Pour those 10K hours
in....

Crypto is hard. Doing it mostly right requires humility and hard study, and
then you have a good chance of being wrong. Nothing to be afraid of if you
understand the situation and understand the years and years it takes to really
become good at it...

------
betterunix
I have said this before, but what we really need is the equivalent of SQL for
cryptography. We need a language that can be used to describe what kind of
security is needed from a cryptosystem and a compiler to turn that into code.
It is not just that cryptography is hard to implement; it is also easy to use
it incorrectly, compose it poorly with other systems (see the Skype attack),
etc.

~~~
dfc
I recently skimmed a giant manuscript about the creation of a language for
defining cryptographic requirements and primitives (120ish pages). I seem to
recall that the author was either from Stanford and/or had a name that begins
with a "P". I wish I had saved the citation. Hopefully someone else will be
able to recognize what I am referring to with my shoddy reference.

~~~
pbsd
That does sound like CAO [1,3,4], from the CACE [2] project.

[1] [https://eprint.iacr.org/2005/160](https://eprint.iacr.org/2005/160)

[2] [http://www.cace-project.eu/](http://www.cace-project.eu/)

[3] [http://www.cace-
project.eu/downloads/deliverables-y2/CACE_D5...](http://www.cace-
project.eu/downloads/deliverables-y2/CACE_D5.2_M18.pdf)

[4]
[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.113...](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.113.8971&rep=rep1&type=pdf)

------
tstactplsignore
Why on Earth did the Cryptocat guy write his own cryptography functions in
the first place? Couldn't he have used the Stanford crypto JS library?

~~~
oinksoft
The inexperience and hubris of youth. We all have this to varying degrees, but
those entirely new to the field for whom it is their first profession have it
in spades (that's both a gift and a curse; great things are built by "naïve"
trailblazers).

------
stcredzero
It's not just a matter of perfection in the field of programming. It's an
entirely different field that happens to involve some of the same things and
ideas. Just because you're a driver, it doesn't mean you're a qualified
mechanic, and if you're a mechanic, it doesn't mean you're a forensic crash
investigator. They are related in terms of subject matter, but they are not
equivalent skill sets.

As programmers, we are dealing with interfaces and movement/copying of data.
Security deals with epiphenomenal leakage of access and data. It is not
programming, though it involves programming.

------
santosha
These aren't good reasons to not touch crypto, these are good reasons to not
cook up your own. Use a standard implementation, and move along.

~~~
dclusin
These also seem like good reasons not to use proprietary/closed source
solutions.

At least with cryptocat everyone knows about the security issues now and can
take precautions as if their entire encrypted communications have been
compromised. With proprietary solutions you have to rely on an honor system
where you hope the vendor tells you there is an issue.

~~~
npsimons
Relevant: [https://www.schneier.com/crypto-
gram-9909.html#OpenSourceand...](https://www.schneier.com/crypto-
gram-9909.html#OpenSourceandSecurity)

------
billforsternz
The article implies that the off-by-one error in the Cryptocat random number
generator is catastrophic. Correct me if I am wrong, but surely that is hardly
the case. The random numbers generated are a tiny bit less uniformly
distributed than they would be without the error. That's an imperfection but
does it have any practical consequences at all ? I know that flaws like this
can introduce weaknesses that are vulnerable to attacks, but my intuition
tells me this flaw is very close to the "doesn't matter" end of the
catastrophic->doesn't matter spectrum. I am more than willing to be educated
by someone with specialised domain knowledge.

------
kGrange
There ought to be a good set of FOSS unit tests for those who dare implement
their own crypto. For instance, you let it hook in to your PRNG, and it'll
tell you if the output is random-looking enough.

It wouldn't be a panacea for bad crypto, and it does create a risk of people
thinking "oh, it passed all of the tests, it must be secure," while still
implementing it overall incorrectly. But I still think it would mitigate these
"foolish/easy" errors and allows devs to focus on proper overall
implementation.

Or does something like this already exist?

~~~
pbsd
If you're only worried that your PRNG doesn't _look_ random, there's TestU01:
[http://www.iro.umontreal.ca/~simardr/testu01/tu01.html](http://www.iro.umontreal.ca/~simardr/testu01/tu01.html)

However these are just statistical tests; they tell you nothing about how much
actual entropy you're actually getting (cf. Debian). That said, it probably
would have caught the Cryptocat bias.
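For a flavor of what these suites check, and why passing one says nothing about entropy: here's the frequency (monobit) test from NIST SP 800-22 sketched in Python. A fixed, zero-entropy alternating pattern passes it perfectly:

```python
import math

def monobit_pvalue(stream: bytes) -> float:
    # NIST SP 800-22 frequency (monobit) test: in a random stream the count
    # of one-bits should be close to half the total. Returns a p-value;
    # below ~0.01 the stream fails.
    n = len(stream) * 8
    ones = sum(bin(b).count("1") for b in stream)
    s_obs = abs(2 * ones - n) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

balanced = bytes([0b01010101] * 4096)   # zero entropy, but exactly half ones
print(monobit_pvalue(balanced))         # 1.0 -- passes perfectly
biased = bytes([0b00000001] * 4096)     # one bit in eight set
print(monobit_pvalue(biased) < 0.01)    # True -- fails decisively
```

The Cryptocat digit bias is more like the second case: a systematic skew in the output distribution, which is exactly what this class of test detects.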

~~~
mistercow
>they tell you nothing about how much actual entropy you're actually getting

I'm not actually sure what "actual entropy" would mean in this context if not
"the number of bits in the seed".

~~~
pbsd
Not only the number of bits in the seed, but also their unpredictability.

There are many things that can go wrong. Some examples:

- the seed is timer-based and has far less entropy (information) than its bit
size

- the seed is OK, but the generator is somehow weak (e.g., only made with
linear components, or throws away parts of the seed, or outputs too much of
its internal state)

- Bugs, bugs everywhere (cute example: an RC4-based generator that uses the
XOR trick to swap elements)

No standard statistical test would catch those (it would catch the latter,
actually), even though the security of the generator would be gravely
diminished.
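The XOR-swap bug is fun to see concretely: the trick zeroes an element whenever the two indices coincide, and RC4's swap step hits i == j all the time. A Python sketch:

```python
def xor_swap(state, i, j):
    # The classic in-place swap trick. The catch: when i == j, the first
    # line zeroes the element, and the remaining lines keep it zero.
    state[i] ^= state[j]
    state[j] ^= state[i]
    state[i] ^= state[j]

s = [1, 2, 3]
xor_swap(s, 0, 1)
print(s)        # [2, 1, 3] -- fine for distinct indices
xor_swap(s, 2, 2)
print(s)        # [2, 1, 0] -- i == j destroys state[2]
```

Run inside RC4's inner loop, this steadily zeroes the state array, which is why this particular bug degrades the output badly enough for statistical tests to notice.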

------
loup-vaillant
When you can't have bugs, you can have proofs. While unknown angles of attack
can exist, it is still possible to prove that the known ones are closed.

This is going to be costly…

------
motters
But if few people bother themselves with encryption code then there are not
many eyeballs on it and mistakes or backdoors become more likely. Probably
encryption code should not be left entirely to the "experts" at NSA,
Microsoft, etc.

------
thewarrior
So how does one go about writing crypto ?

~~~
NegativeK
Assuming you're talking about implementing, and not using preexisting, vetted
libraries:

Spend a long time in the community. Listen closely to the people who have a
good reputation. Have your code publicly vetted before it's ever used. Expect
your code to be torn apart.

------
pasquinelli
"Writing a testcase for this would have required complicated thinking and
coding which would be as likely to contain an error as it was likely for the
code to be tested to contain an error."

i don't know the details, but why couldn't you just collect a million bits of
supposedly random data from the generator and run it through the nist test
suite? it's true that testing for randomness does require complicated
thinking, but fortunately it's been done. standing on the shoulders of giants
and whatnot.

------
gravitronic
Good, because you can't even spell it in your title.

------
lectrick
He clearly doesn't touch unit tests, either, or he'd know that they would
address ALL (as in, 100%) of his concerns here. Crypto is even easier to test
than 99% of things... all you have is a couple of inputs, and an output. Boom,
test cases.

~~~
ethereal
I'd just like to point out that `cryptography' is _not_ just about testing
that `decrypt(encrypt(msg)) == msg`. memfrob'ing a region of memory does this,
but would we consider that a 'secure' encryption scheme? I don't think so . .
.

Even if it was -- how do you unit-test a PRNG to ensure the output is
uniformly random? How do you ensure you don't have collisions in a hash
function? How can you ensure that there's no replay attack on your protocol?
How can you use unit-testing to demonstrate that there is no hole in your
protocol like, say, when the Germans repeated the key twice at the beginning
of the message while using Enigma?

These are all very real and valid cryptography concerns. Sorry if this sounds
snarky, but in all seriousness -- I'm very curious as to how you would unit-
test these.

~~~
contingencies
_how do you unit-test a PRNG to ensure the output is uniformly random?_

NIST standard test suite @
[http://csrc.nist.gov/groups/ST/toolkit/rng/documentation_sof...](http://csrc.nist.gov/groups/ST/toolkit/rng/documentation_software.html)

 _How do you ensure you don't have collisions in a hash function?_

All hash functions have collisions, by definition, since they map a large
amount of data into a smaller keyspace. What you are
really asking is probably "how do you prove that you have difficult to predict
collisions"? I don't know. (I assume that the mechanisms for doing that are
based upon statistical analysis of the propagation breadth and speed of
various single bit changes in the input stream over varying numbers of
rounds?)

 _How can you ensure that there's no replay attack on your protocol?_

A combination of session keys and message sequencing? (eg. Each message must
include a hash of the last.)
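Something like this, maybe (a Python sketch; the names are made up, and a real protocol would also MAC each message, which this deliberately omits):

```python
import hashlib

class ChainedChannel:
    # Sketch of the hash-chaining idea: every message starts with the hash
    # of the previous message, so a replayed or reordered message no longer
    # chains onto the conversation.
    def __init__(self, session_key: bytes):
        self.prev = hashlib.sha256(session_key).digest()

    def send(self, payload: bytes) -> bytes:
        msg = self.prev + payload
        self.prev = hashlib.sha256(msg).digest()
        return msg

    def accept(self, msg: bytes) -> bytes:
        link, payload = msg[:32], msg[32:]
        if link != self.prev:
            raise ValueError("replayed or out-of-order message")
        self.prev = hashlib.sha256(msg).digest()
        return payload

alice, bob = ChainedChannel(b"session key"), ChainedChannel(b"session key")
m1 = alice.send(b"hello")
print(bob.accept(m1))   # b'hello'
m2 = alice.send(b"world")
print(bob.accept(m2))   # b'world'
try:
    bob.accept(m1)      # replaying m1...
except ValueError as e:
    print(e)            # ...is rejected
```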

 _How can you use unit-testing to demonstrate that there is no hole in your
protocol like, say, when the Germans repeated the key twice at the beginning
of the message while using Enigma?_

There's an entire field of computer science around formal proofs; I don't know
enough about it to really comment, however a partial answer may lie in there.
Essentially that means adopting a design-time proof strategy in your testing
rather than a post-facto unit-testing proof strategy, though.

