> Sure, it's complex and you have to get all the details right
What's really dangerous about cryptography (someone on HN pointed this out a few weeks ago) is that you get very little feedback on this. It's very easy to take reliable primitives and build a broken system. When I say "broken" I don't mean "theoretically vulnerable" - I mean "game over".
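To make "reliable primitives, broken system" concrete, here's a toy illustration (deliberately not real crypto): encrypt each block independently, ECB-style, and the ciphertext looks like random garbage while identical plaintext blocks still map to identical ciphertext blocks, leaking structure. The "cipher" below is a stand-in built from a hash just to show the property.

```python
import hashlib

def toy_ecb_encrypt(key: bytes, plaintext: bytes, block_size: int = 16) -> bytes:
    # Toy "block cipher": hash the key with each block independently.
    # This is NOT real encryption -- it only mimics ECB's fatal property:
    # identical plaintext blocks produce identical ciphertext blocks.
    out = b""
    for i in range(0, len(plaintext), block_size):
        block = plaintext[i:i + block_size].ljust(block_size, b"\x00")
        out += hashlib.sha256(key + block).digest()[:block_size]
    return out

key = b"sixteen byte key"
# Plaintext with repeated 16-byte blocks (think: pixels of the ECB penguin).
pt = b"ATTACK AT DAWN!!" * 3 + b"RETREAT AT DUSK!"
ct = toy_ecb_encrypt(key, pt)

blocks = [ct[i:i + 16] for i in range(0, len(ct), 16)]
print(blocks[0] == blocks[1] == blocks[2])  # True: the repetition leaks through
print(blocks[0] == blocks[3])               # False
```

Eyeball `ct` and it looks like noise; compare the blocks and the plaintext's structure is right there.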
I get paid to look for these bugs, and they are legion.
> so for developers, I recommend a more modern approach to cryptography — which means studying the theory and designing systems which you can prove are secure.
The average application developer does not have the time or the need to learn the theory at a deep level. And as before, it's very difficult to say with confidence "this design has no bugs".
The best advice I can give developers is to play very conservatively. Our default recommendation is PGP for data at rest and TLS/SSL for data in transit. Or some high-level library like KeyCzar, NaCl, etc.
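For data in transit, the conservative path in most languages is a few lines over the platform TLS stack with its defaults left alone. A minimal Python sketch (standard library only; the host and request are placeholders):

```python
import socket
import ssl

# create_default_context() picks conservative settings for you:
# certificate verification on, hostname checking on, legacy protocols disabled.
ctx = ssl.create_default_context()

def fetch_securely(host: str, port: int = 443) -> bytes:
    # Wrap an ordinary TCP socket in TLS; the context verifies the server's
    # certificate chain and hostname before any application data moves.
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(b"HEAD / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
            return tls.recv(4096)

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```

The point is what you *don't* write: no cipher-suite lists, no manual certificate parsing, no homegrown handshake.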
Are there any easy-to-implement tests for this sort of thing? For example
* For mathematical functions, I can trivially test for an expected value, a too-low value, a too-high value, and a non-numeric input.
* For input/output sanitization, I can pump in crazy unicode characters or system control codes.
* For encryption, I can ... open up tcpdump and see that it's ostensibly garbage-looking?
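One cheap automated check in that spirit (a sketch, not a security proof): scan the ciphertext for repeated blocks. Passing it guarantees nothing, but it catches gross failures like ECB mode or a reused keystream, which "looks like garbage" eyeballing will miss.

```python
import os
from collections import Counter

def has_repeated_blocks(ciphertext: bytes, block_size: int = 16) -> bool:
    # Well-constructed ciphertext should be indistinguishable from uniform
    # random bytes, so at realistic sizes no 16-byte block should repeat.
    blocks = [ciphertext[i:i + block_size]
              for i in range(0, len(ciphertext) - block_size + 1, block_size)]
    return any(count > 1 for count in Counter(blocks).values())

print(has_repeated_blocks(b"\x00" * 64))          # True: degenerate "ciphertext"
print(has_repeated_blocks(bytes(range(256))))     # False: all 16 blocks distinct
```

A similar one-liner test: encrypt the same plaintext twice and assert the ciphertexts differ, which catches a missing or reused IV/nonce.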
It's pretty easy to eyeball your ciphertext and think it's sufficiently garbage-looking, but it's difficult to predict what an attacker can do with access to your running system. Seemingly innocuous flaws can lead to complete plaintext recovery or the ability to forge arbitrary ciphertext. (Here's a fun way to get a taste: http://www.matasano.com/articles/crypto-challenges/)
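A classic "seemingly innocuous flaw": verifying a MAC with ordinary `==`, which short-circuits at the first mismatched byte, so response time leaks how much of a guessed tag is correct, and an attacker can forge tags byte by byte. A standard-library Python sketch of the bug and the conservative fix (the key and message are illustrative):

```python
import hashlib
import hmac

KEY = b"server-side secret key"  # placeholder, not a real secret

def sign(message: bytes) -> bytes:
    return hmac.new(KEY, message, hashlib.sha256).digest()

def verify_broken(message: bytes, tag: bytes) -> bool:
    # BUG: '==' compares byte by byte and stops at the first mismatch,
    # so timing correlates with how many leading bytes are right.
    return sign(message) == tag

def verify_safe(message: bytes, tag: bytes) -> bool:
    # compare_digest runs in time independent of where the bytes differ.
    return hmac.compare_digest(sign(message), tag)

msg = b"amount=100&to=alice"
tag = sign(msg)
print(verify_safe(msg, tag))                    # True
print(verify_safe(b"amount=9999&to=eve", tag))  # False
```

Both versions return the right booleans, which is exactly why the flaw survives testing: the bug is in the timing, not the output.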
@cperciva is right that cryptographic security is something that needs to be proven. Since I'm not smart enough to do that, I avoid designing my own crypto systems, and I recommend clients do the same. Lots of risk, little or no reward.
Last year I was lucky enough to engage Colin on some work I was doing. His interest was piqued by my (erroneous) claim about the security properties of a protocol I had designed.