AES - Advanced Encryption Standard
CBC - Cipher Block Chaining
PKCS - Public Key Cryptography Standards
SHA - Secure Hash Algorithm
MAC - Message Authentication Code
PBKDF - Password-Based Key Derivation Function
NIST - National Institute of Standards and Technology
FIPS - Federal Information Processing Standard
KDF - Key derivation function
CTR - Counter Mode
RSA - Rivest Shamir Adleman (last names of each creator of the RSA algorithm)
OAEP - Optimal Asymmetric Encryption Padding
PSS - Probabilistic Signature Scheme
ECDSA - Elliptic Curve Digital Signature Algorithm
PS3 - PlayStation 3?
DH - Diffie-Hellman key exchange
ECDH - Elliptic curve Diffie-Hellman key exchange
TLS - Transport Layer Security
Edit: I just double-checked the help page and saw the note about code formatting. My apologies for overlooking that!
Take AES for example, "Advanced Encryption Standard" doesn't really mean anything. AES is a block cipher, also known as Rijndael. CTR and CBC are block cipher modes. RSA is a public key cryptosystem, etc. The same applies for most of these things in the list.
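That distinction (cipher versus mode) is easy to show concretely. Below is a deliberately toy sketch of CTR mode, using SHA-256 as a stand-in for the block cipher — something no one should do in real code — just to illustrate that a mode is a recipe for applying a fixed-size primitive to arbitrary-length data:

```python
import hashlib

def toy_ctr_keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Generate a keystream by 'encrypting' successive counter values.
    SHA-256 stands in for the block cipher here -- a toy, not real AES."""
    stream = b""
    counter = 0
    while len(stream) < length:
        block = hashlib.sha256(key + nonce + counter.to_bytes(16, "big")).digest()
        stream += block
        counter += 1
    return stream[:length]

def toy_ctr_crypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the same call both encrypts and decrypts.
    ks = toy_ctr_keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"attack at dawn"
ct = toy_ctr_crypt(b"k" * 16, b"n" * 8, msg)
assert toy_ctr_crypt(b"k" * 16, b"n" * 8, ct) == msg  # round-trips
```

The point is that CTR, CBC, and friends are constructions *around* a block cipher; "AES" names the primitive, not the whole scheme.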
So if someone doesn't know what these things stand for, they're going to have to go to Wikipedia to check it out anyway; the words behind the acronym are almost as confusing as the acronyms themselves.
I get a cosmetics site, apple.com and store.apple.com, a Wikipedia article about the MAC address, and a clothing site. Message Authentication Code is nowhere to be found.
Yes, I took one of the harder ones; AES is in the first position. But still, it is nice to know the words behind an acronym.
Even for technical documents, if you expect your target audience to be familiar with the domain, nobody remembers every abbreviation every time. It also helps new readers to quickly familiarize themselves with the material (and might even help a few understand more than they would have done without expanded abbreviations).
- for the readers who already know the abbreviation, it's a waste of space and time
- for readers who don't, the words won't tell them much either - Transport Layer Security tells you next to nothing about what TLS really is - besides, those who are really interested can always look it up on Wikipedia (that way, they will actually understand it).
Doesn't it? Even a complete tech-illiterate can glean some meaning from the word "security".
I would add to the people commenting here on HN: tptacek's review is tough; you do not need to lay into the author of this book any more.
Here's a readable version: https://gist.github.com/mikemaccana/10847077
I like that, because my browser window is resizable.
Sorry, but I have a right to an emotional reaction to your content and a right to describe it, especially if the reaction is grounded in objective technical reality. I suspect that younger people have this idea that online descriptions of emotional reactions are fictional and purposely crafted for effect -- mostly having to do with emotional aggression. It's true that sometimes "passion" over a subject is used as a pretext for such aggression. That doesn't mean that it's always true, however. In some cases, it's honesty.
That said, "arrrgh" reactions in a technical discipline often indicate a frustrating failure of outreach, education, or communication. I learned things from reading tptacek's review. Maybe he could supervise the ghost-writing of his own undoubtedly excellent book?
(1) - I was riding the bus and this young man had his sneakers tied to the back of his backpack, the soles of which he was pressing into my chest. I tried discreetly hinting to him by pressing back, but he was oblivious, so I brought this to his attention.
I was amazed that his first priority wasn't to apologize or help me out, rather it was that I recognize that he didn't mean any harm. Be correct first, then worry about your own ego second.
> I have a right to an emotional reaction to your content and a right to describe it, especially if the reaction is grounded in objective technical reality
... because while you have rights, being a person who participates in a civilized society means you also have responsibilities, and one of those responsibilities is to interact with people in ways that are appropriate to the situation.
And "appropriate to the situation" changes depending on the nature of the situation. The less serious the situation, the less appropriate a volcanic reaction becomes. Nobody's going to disapprove of you if you start screaming at an airline pilot who you see snorting coke on his way to the plane. But lots of people will disapprove of you if you lay down the same reaction on some poor kid behind the counter at McDonald's because he forgot your French fries.
In the case of this review, I would say tptacek's tone is appropriate, because security is Serious Business (as we should all know, especially after last week); getting it wrong can result in people getting robbed or even killed. So if you're going to put yourself forward as a teacher of crypto, and you're teaching people things that aren't true, you're doing real damage and should be glad a good yelling-at is the worst punishment you have to suffer. But that doesn't mean that the same tone would be appropriate if taken with the kid on the bus, because "annoyingly oblivious" is a far cry from "could get people killed."
> I take exception to your post
I demand satisfaction! Pistols at dawn, my good fellow! Pistols at dawn!
Ugh. My point in the previous comment -- for the second time -- is that I'm not comparing the situations; I'm comparing the reactions. Not all cluelessness is equal -- and for the second time, I never said it was! However, oversensitivity to criticism due to a prioritization of feelings/ego generalizes nicely across both situations.
I only take challenges from people with basic literacy and reading comprehension. Your comments only demonstrate the former, my good fellow. (Or, if this is the second iteration of a deliberate troll through the subtle placing of words in another's mouth, I'll merely comment that I'd be a bit surprised if someone actually thinks this is clever, and note that this would disqualify a challenger through insufficient intellectual integrity.)
To me, complaints about tone are for critiques that contain phrases like "fucking idiot" and "worthless waste of space" and other such direct insults or attacks.
If something legitimately makes you stop and stare with your mouth hanging open, it is OK to say "this statement made me stop and stare with my mouth hanging open." Phrases like "I am not making this up" are reasonable shortcuts to expressing that sentiment.
Could Mr. Ptacek's review have been worded more kindly? Of course. Do I care? Not at all. It was nice enough. It concentrated on technical flaws rather than personal attacks. It was informative and useful. The tone was just fine.
It's okay if you are writing a story about your personal reactions.
It's irrelevant if you are writing a serious critique, which should be about the content, not about your emotional response to it (assuming it is a critique of an informative work -- obviously, if you are critiquing something as a work of art intended to inspire emotional responses, writing about your response has some relevance.)
It's possible to blend the first kind of story with the second kind of critique, but you have to recognize the different roles of each, do it deliberately, and be exceptionally skilled (the set of people who can do this and produce something worth reading is a proper subset of the intersection of the sets of those who can write entertaining personal stories and those who can write valuable straight critiques.)
That being said, tptacek's review seems pretty focussed on substantive critique with very minimal emotional distractions, so while I disagree with the categorical defense of the individual statements at issue as being appropriate to a straight critique of an informative work, I also think that the charge that the tone was inappropriate and a barrier to reading is overblown considering the fairly minimal level at which distracting emotional descriptions are present in the review.
It's ironic that these critiques of this review are much dumber than the review's critiques of the book, and implicitly hold a fairly off-the-cuff internet comment to far higher standards than a published book that purports to give important and useful advice about cryptography.
Could this review be better? Sure. But who cares?
That said, I want to point out that I think your review was excellent and it's the kind of thing I love coming across. It many ways, it reminds me of the heyday of Usenet. It's great content and it doesn't need to be better. To the extent that it can be better, it's because nearly any work can be made better with additional effort.
A purely "just the facts" version might need a "how bad on a scale of 1-10" or something to get the same information across, and would be less readable.
At 49, I see the exact opposite. Members of my generation tended to exhibit more tact and decorum. The urge to dress like a hobo, swear all the time, and flame everyone in sight is a classic overcompensation for years of helicopter parenting which forbade all of these things.
"In some cases, it's honesty."
In others, it's honesty used as a pretext for acting out.
Note I also make this observation.
Also note that I am specifically pointing out reactions to criticism. The other changes in decorum have been noted by previous generations since at least the 1800s. Waltzing was once considered a lascivious corrosive to society's morals.
Also: we are likely less than 4 years apart in age. I could do with more decorum, as I've been learning over the past 4 years.
This is presuming that such fear is always the input to a conscious decision. That doesn't fit my observations of human nature. Power relationships always have some bearing on the nature of an interaction, so what you're saying is comparable to telling an aquatic species that they're wet.
Also, going by what you say, you should have more respect for those who tell truth to power, or tell their more famous/more highly regarded colleagues the plain truth. Perhaps tptacek should be more humble because he's more famous, but if it comes to the choice of him being frustrated by widespread crypto cluelessness or by a desire to dominate others, I think the former makes far more sense.
Regarding tptacek, I suspect that if he had sent this to the author, or posted it as a formal review, he would have toned down the description of his reactions. I'm not sure where this review came from, but my impression is that he did not think of it as a published review that the author would see. Certainly doing so would be advisable, as people are more receptive to criticism that way.
Perhaps I am wrong regarding how tptacek would have responded had he known the review would be, essentially, published. But I know I phrase things differently in such situations.
A better way of apologizing is actually apologizing. The young man's reaction was more like exasperation that I should have been put out.
My impression is that the comment was/is a comment on social media. My comment was written in that context.
Why hurt someone when you can avoid it?
Having said that: had I written the "review" as an actual "review", and not as an oversized HN comment that I had to make a Gist out of to get it onto the site, I would have written it more carefully.
As long as criticism doesn't cross the "bright lines" of ad hominem or gratuitous ridicule, as a third party reader, I much prefer the targets toughen up, rather than the critics soften their language. And there are way more third-party readers than critics or their targets.
My hypothesis is that people expect text that has no obvious signs of being an Internet comment to use the more 'serious' language and this case (an Internet comment that is a bit longer than usual) is being classified wrongly as a result.
It's ridiculous to simultaneously say that a piece of writing devoid of context is being classified in a certain way and to say that it contains language that's inappropriate for that classification.
The mere presence of phrases like "I am not making this up" tells you that this piece is not intended to be too serious. To say that it's intended seriously but contains non-serious language is a flat-out contradiction.
It could make sense if it was published in some context, like a serious blog or a news site or something, which implied seriousness. But it's a naked text file on the internet. It doesn't have to take a serious tone.
That said, it's funny that most of these rants aren't ad hominem, unlike some of the "formatted" vitriol which attacks you without really seeming to do so.
Someone should watch EEVblog on YouTube; Dave Jones does reviews and teardowns of electronics.
There was a rant over the PICkit 3 where he voiced his frustrations with the new device (it wasn't better than the old one, took out nice features, and replaced things that worked perfectly with things that were sort of dumb), which triggered Microchip to answer with a funny video of their own.
Your tale about the boy... Arrrghh! I don't want to go off topic, but man, I think nothing gets diluted more than values each year... Well, except shares of a startup once VCs get in.
The "I am not making this up" thing came in the context of recommending ASN.1 for instance. If that were a chess match commentary, this is where the scorekeeper would have put a "??" after the move to note the shock.
And note what tptacek's comment was not: It wasn't a bunch of personal attacks, or swearing. Some of the commentary was "more than professional", to be sure, but that's exactly the kind of commentary you should hope to get in highly-demanding, highly-selective fields.
You want to know what a perfect book review would look like in the Navy's nuclear propulsion program? It would be this: "No deficiencies noted."
I agree that it wasn't especially bad, it could just be worded a bit better to spare feelings.
Tptacek could have chosen to say that differently, but it does add value as written. I have no idea what ASN.1 is; simply telling me that the book contains that string doesn't mean anything to me. Telling me that it was a stupid thing to say doesn't teach me about ASN.1 or crypto, but it does teach me about the book.
When working through basic knowledge to mastery of a topic, attempting to teach someone else is an extremely effective way to organize your thoughts and learn yourself. This is why graduate students teach undergraduate students.
The author of the original book shouldn't feel shame for making the mistake of working toward mastery of the topic. But racing to publish is dangerous when the topic is as serious as (heart surgery or) cryptography. A stern warning is worth repeating.
Most of us already read the review anyway.
while the factual content of tptacek's review may be spot on, his overall tone is very negative and smacks of "only experts allowed" logic. while he could have easily helped improve kyle's book and shared these comments privately, he instead chose to lambast kyle publicly, which doesn't really help anybody: tptacek looks like a total jerk and kyle now has a lot of negative attention on (this version of) his book.
this pervasive "experts only" attitude is a big part of why "secure" open source projects have hard times getting and keeping contributors. it is par for the course for people to be super rude and negative to new participants instead of trying to encourage them to improve and learn. this lack of contributors then has a whole array of negative secondary effects, like fewer people reading the code for the project.
If the author instead put together a book on how a layperson could perform open-heart surgery, you're damn right that actual surgeons would jump all over it.
There is some strange pervasive attitude/arrogance in tech that all it takes to be good at something is to be smart and give it a try. Why learn the theory/fundamentals when you can just start coding?
For building a web app, sure. But security is not one of those things. You actually need to learn the fundamentals and theory, and even then, need lots of experience.
1: Don't implement features you don't need. Nobody needs TLS heartbeat. Nobody. Don't implement it until you have a use case and the calling code in hand.
2: Test the features you do implement. What happens if this field is the minimum? The maximum? A power of 2? A power of 2, less 1? Negative when treated as signed?
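A minimal sketch of point 2, assuming a hypothetical record format with a 2-byte big-endian length prefix (the same general shape as the TLS heartbeat message). The parser rejects length fields that disagree with the actual payload — the check whose absence made Heartbleed possible — and the checks below walk the boundary values listed above:

```python
import struct

def parse_record(buf: bytes) -> bytes:
    """Parse a record with a 2-byte big-endian length prefix.
    Rejects a length field that doesn't match the actual payload."""
    if len(buf) < 2:
        raise ValueError("truncated header")
    (length,) = struct.unpack(">H", buf[:2])
    payload = buf[2:]
    if length != len(payload):
        raise ValueError("length field does not match payload")
    return payload

# Exercise the boundaries from the comment above.
assert parse_record(b"\x00\x00") == b""                 # minimum length
assert parse_record(b"\x00\x01A") == b"A"
assert parse_record(b"\xff\xff" + b"B" * 0xFFFF) == b"B" * 0xFFFF  # maximum
# 0x8000 is 2**15; 0xFFFF is -1 if mis-read as signed.
for bad in (b"", b"\x00", b"\x00\x05abc", b"\x80\x00"):  # short or lying lengths
    try:
        parse_record(bad)
        raise AssertionError("should have been rejected")
    except ValueError:
        pass
```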
Maybe the tone could have been a little softer, but this should not have been done privately. The criticism of the work needs to be just as public as the work itself, so that people who might have been misled have a chance to see why.
And we, of the Internet age, should be shocked to learn this is no longer true! Eric Drexler once proposed that hypertext would save the world by allowing such peer review. Just what are we collectively missing when it comes to crypto?
That doesn't apply for a book. Keeping the critique private for a week doesn't help the readers at all. In fact it harms them by keeping incorrect information in play and uncorrected for longer. Perhaps it softens the blow to the author's ego, but that is not at all what "responsible disclosure" is about. Helping out misinformed readers takes precedence over the author.
That all said, I still think we can treat each other better. Honest question: was it necessary to destroy it in such detail? Was the effort spent attacking it on the "crypto box" front necessary? It seemed personal.
Contacting the author first doesn't necessarily preclude getting timely notice that "this book is flawed" out to readers.
If tptacek hadn't destroyed it in such detail, his review would have consisted of saying "Hey, this book is pretty bad; it's got some very serious issues, and makes some pretty terrible or misleading recommendations. My suggestion: do not read it".
Would that be better? Or would you be complaining that "Well geeze, it's not helpful to say that the book isn't good; you have to go into some detail about what the problems are so that everybody can learn!"
The idea of asking for LESS DETAIL in a criticism of a topic is bizarre. How much detail would you prefer?
I really truly cannot understand the critique of an "experts only" attitude when it comes to technical books that make important recommendations for building critical systems. By all means, non-experts should experiment and build and learn. But non-experts definitely should not be giving out large quantities of advice in an authoritative tone.
It helps people who might have read the book and learned to do things the wrong way.
We can model this as "Kyle has disseminated harmful material, and tptacek is trying to contain the damage". Kyle's feelings, intentions, and hard work aren't irrelevant; but they're not what we should be focusing on.
Publishing a book like this sends a strong public signal of deep expertise.
I have not found tptacek to be overly rude or negative when offering advice to journeyman cryptologists. But a journeyperson should not necessarily be publicizing their how-to guides yet.
Here, your attitude causes two problems.
First, you know and apparently like Kyle Isom, and so I presume you're also ready to tell me that he's an adult and a professional. Professionals do one of three things with criticism: ignore it, rebut it, or learn from it. My assumption has been that Kyle is choosing options (1) and (3) from that list. But here you are, inventing option (4): "get indignant about it". I wonder if you've thought about the extent to which people will attribute that response not to you, but to Isom.
Second, whatever you might think about the tone of my feedback, it's clear that Isom needs additional technical review for his book. Whipping up a totally unproductive us-versus-them narrative about "jerks" versus "open source" does the opposite: it generates drama. Even if you think my review was itself dramatic, piling more drama on doesn't make Isom's work more attractive to experts.
I'm not sure how big of a deal either of these issues are, but they're a bad habit for message board denizens. The exact same thing happened to Willem when he wrote his critique of the Akamai allocator, and Hacker News had a totally unproductive drama storm for a couple hours before Akamai (a) thanked Willem and (b) acknowledged that he was absolutely correct. Read the Akamai comments on the HN thread, and apply them here, substituting "Kyle Isom" for "Akamai", and I think you'll see that they apply.
Finally, I'll admit to being personally irritated by the claim that I operate from "experts only" logic with regards to cryptography. There are at last count something like twelve thousand people who have reached out to us for our free crypto challenges, and thousands of those people have gone on to solve multiple sets of challenges (something like 60 people have finished the first 6). Every damn one of those people is an email exchange that Sean, Marcin, or I had to have directly, on our own time, with no compensation --- the opposite of compensation, in fact, because we donate to charity when people finish them.
There are a lot of people on the Internet to whom you could direct the "experts only elitism" criticism regarding crypto. I am not one of them.
What's more annoying about that bogus critique is how it muddles a real issue. I'd like many more people to understand crypto and, particularly, what goes wrong when it's implemented naively. But I'd like far fewer people to plow ahead and implement their own broken stuff. The track record on amateur cryptography is bad, and what developers don't like to acknowledge is that the badness that work generates is an externality to them. People have in the real world been hurt, physically, because of broken amateur crypto. It is hard for me to take the hurt feelings of developers all that seriously by comparison.
The accusation of elitism on your part is not a new one, I don't think, to you - I found myself leveling the same accusation when you decided to single out the CryptoCat project as a distinctly "bad" project, due to the number of issues that came up during the most recent security review, despite the fact that it's one of a very select group of open source projects even undergoing such reviews.
You say things like, "amateur cryptography" when it makes little to no sense. This book wasn't written for free, it was actually professional crypto, even if it had fundamental problems; it's bad crypto, not amateur crypto. When you do things like that, it comes off as elitism, whether or not you're intending it to.
Your criticisms of the book are indeed valid, but the obvious derision you apply when calling professional efforts such as this book and Cryptocat "amateur" is precisely the kind of behavior and attitude that keeps the state of crypto so backwards and slow, and is exactly the kind of drama you (correctly) lambasted earlier in this comment chain.
Sometimes expertise is actually required.
Not to mention the need to filter through all the BS criticism. I've read people arguing that there was no issue in having the e in RSA (the public exponent) equal to 1. Really.
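For the record, e = 1 is trivially broken: RSA encryption is c = m^e mod n, so with e = 1 the "ciphertext" is the plaintext itself. A two-line demonstration with a toy modulus:

```python
# With e = 1, "encryption" is the identity: c = m^1 mod n = m for any m < n.
n = 3233  # toy modulus (61 * 53); real moduli are 2048+ bits
for m in (0, 1, 42, 3232):
    c = pow(m, 1, n)
    assert c == m  # the ciphertext IS the plaintext
```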
It eludes me how you turn someone's terrible custom crypto into a parable about how we should be nicer to custom crypto.
Briefly, I was doing a single RSA encryption on the client and corresponding RSA decryption on the server as part of a login procedure, and using e=3 (which, at the time, was considered acceptable by most experts). Due to licensing issues the client code had to be all ours, so I was using an old arbitrary precision integer library I had written years before. It was not super fast. The multiplication wasn't too bad (Karatsuba), but division was the classical division algorithm. On the server there were no licensing issues, and I was using gmp.
So I had this "brilliant" realization. Why not do the division ON THE SERVER? The client could simply compute M^3 and send that to the server. The message would be 3 times longer but bandwidth was cheap. The server could then do the modular reduction.
I quickly made the change to the client and then started to revise the server code, when it occurred to me that since the client had made no use whatsoever of the modulus there must be a way to decrypt the message without using the modulus--like by just taking the cube root. Doh!
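The failure in that story can be sketched numerically. With e = 3 and no padding, a small enough message means m^3 never wraps the modulus, so an eavesdropper recovers m with an ordinary integer cube root (toy numbers below; the modulus is a stand-in, not a real RSA modulus):

```python
def icbrt(n: int) -> int:
    """Integer cube root by binary search."""
    lo, hi = 0, 1 << ((n.bit_length() + 2) // 3 + 1)
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if mid ** 3 <= n:
            lo = mid
        else:
            hi = mid - 1
    return lo

n = (2 ** 1024) + 643   # stand-in modulus, 1024 bits
m = int.from_bytes(b"hunter2", "big")  # small unpadded message (~55 bits)
c = pow(m, 3, n)        # since m**3 < n, the modular reduction never kicks in...
assert c == m ** 3
recovered = icbrt(c)    # ...so a plain cube root recovers the plaintext
assert recovered.to_bytes(7, "big") == b"hunter2"
```

This is exactly why unpadded RSA is broken regardless of where the modular reduction happens: padding (OAEP) makes the effective message as large as the modulus.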
There's an interesting real-world RSA bug related to yours: in the absence of proper padding, it's possible that e=3 RSA of a small plaintext might not wrap the modulus. A similar cube root operation produces a signature that naive implementations (the ones that check the digest embedded in a signature block, but not the padding) will validate, despite the attacker lacking the signing key. That bug bit Firefox's NSS library; for a little while, it was possible to use a short Python script to forge any certificate.
(That bug is due to Bleichenbacher, who called it a "pencil-and-paper" attack at the rump session where he presented it.)
e=3 RSA isn't insecure per se, but it does magnify the impact of other vulnerabilities, and so it's best avoided.
As my literal not-making-this-up favorite HN commenter and someone who has previously expressed an interest in crypto, I'd love it if sometime you could take some time to demolish our crypto challenges. I'd be happy to send them all at once to you.
By "theory" I mean vigorous and convincing hand-waving and whiteboard diagramming...
I've been throwing $20 bills at my monitor so that your book will start downloading, but it doesn't seem to be working.
But really, you should write one.
I think the community in general is very harsh towards anything related to cryptography. It's as if you shouldn't bother writing code unless you have mastery of the underlying mathematics while at the same time not bother with the maths unless you're an expert very-low-level-language programmer.
There is certainly a need to put forward blatant errors and potential flaws. But the general harshness is misguided I think. Tptacek, you simply said out loud what many thought, I'm sure. You'd make a lot of people happy if you wrote a book. Because you're still learning doesn't mean others can't learn from you.
Very subtly broken cryptography software is better than no cryptography software. And together we will learn to make it better.
I am an odd duck, even for my odd little field: I'm a software security person who has spent a couple years getting decent at breaking crypto, and (weirdly) few people in my field do that, so I sound like more of an expert than I actually am.
Interesting. When I worked at Entrust, our cryptology team consisted of both cryptographers and cryptanalysts. The former were math PhDs who spent their entire graduate careers designing cryptosystems, so by the time they came to us, they knew what they were about. The latter - well, there was only one when we got started - was a B.Eng. who got interested in crypto at BNR, taught himself the basics, and became one of the top cryptanalysts in the world.
You and he probably have much in common - including not being qualified to design cryptosystems! Like you, he would have said to leave that to the experts.
Then he would have quite happily spent weeks and months figuring out what those experts missed, thereby advancing the field.
It puzzles me to this day that so few in the security field appreciate the difference between the two types of cryptologists.
My point is that if people like you, who are definitely more knowledgeable than most in this area (most is very important here), communicate their experience, then everyone benefits. If nobody wants to write about crypto because nobody feels qualified, we're at a dead end.
When a person does write content, someone somewhere will tear it apart, for pretty good reasons: getting it right is very difficult, as you say. But that's precisely the point: to learn from our mistakes. We're not dealing with raw science, but real life implementations of theory, and this is where things usually break, as shown by your critique. The value of the book is pedagogical, not necessarily scientific.
If you have anything to say about crypto (and you clearly do), then say it. We're all the better for it! And contributions like the ones you gave here are needed. I just find the general attitude a little tiring, I'm not trying to force you into writing :)
Lastly, the most important thing, to me, is that I, as a chemist, can get on the internet and learn about these concepts from someone who understands them better than I do. Having a discussion about such topics is essential. Your contribution might not be in the deep theorems of academic cryptography, but they sure are appreciated by others like me. So if you ever want to write a book/pamphlet, go ahead, I'll buy it.
If you have the option to use NaCl, use NaCl. A Java-specific alternative to NaCl is Keyczar.
TextSecure's crypto is implemented in Java, which is of course garbage collected. Some cursory Googling suggests that Java's GC will suspend the main execution thread during each GC op: http://javabook.compuware.com/content/memory/reduce-garbage-...
I'm guessing the concern is that an attacker might somehow discover a way to use the GC to recover sensitive info. It seems like having a GC might cause some other trouble, like making it hard to wipe sensitive data from memory once it's no longer needed: http://books.google.com/books?id=43pcI3in1DcC&pg=PA122&lpg=P...
Since TextSecure is currently believed to be secure and high-quality, it must be true that GCs aren't fundamentally dangerous to crypto code, right? (Or at least that Java's GC on Android isn't dangerous to crypto.) If a GC were dangerous to crypto code, then TextSecure would be vulnerable to those dangers.
Do you happen to know whether cryptographers as of 2014 generally believe GCs are/aren't/might be dangerous to crypto code? Are there any known attack vectors or proofs of concept? (It would be awesome if anyone could point out any whitepapers on the subject.)
Regarding GC performance, maybe there's an avenue for attack there. You could potentially infer the amount of garbage being generated by an implementation, which seems like it could be variable in a PK implementation. I can't really think of a way to generate variable amounts of garbage without doing variable amounts of computation, so I think you're already leaking timing information. [Actually I can, but not in a way that seems natural].
The (non-default) CMS and G1 GCs will do as much as they can in background threads before they have to stop the world. For more information see http://www.oracle.com/webfolder/technetwork/tutorials/obe/ja... and http://www.cubrid.org/blog/dev-platform/understanding-java-g.... I only know about this because I kept encountering long, GC-related pauses on my Minecraft server that went away when I switched to the concurrent mark-sweep collector. :)
If a constant-time CPU algorithm did not also have a constant memory-access pattern, then the timing of cache misses (or GC pauses, likewise) would certainly be subject to side-channel attacks.
If a constant memory-access pattern is attained, I cannot easily think of a way that a garbage collector could cause an interference that leaks information. (Though by all means, one ought to think harder than I have.)
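The best-known instance of this class of leak is data-dependent comparison time. A small sketch of the difference, using Python's stdlib hmac.compare_digest as the constant-time primitive:

```python
import hmac

def naive_equal(a: bytes, b: bytes) -> bool:
    # Returns at the FIRST mismatching byte -- running time leaks how
    # long a matching prefix the attacker has guessed so far.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def ct_equal(a: bytes, b: bytes) -> bool:
    # Examines every byte regardless of where mismatches occur.
    return hmac.compare_digest(a, b)

tag = b"\x13" * 32
assert naive_equal(tag, tag) and ct_equal(tag, tag)
assert not ct_equal(tag, b"\x13" * 31 + b"\x00")
```

Comparing MACs with the naive version is the textbook way timing attacks recover valid authenticators byte by byte; the constant-time version removes that channel, whatever the runtime's memory management does.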
We should turn to hardware solutions for access to raw keys. Let the software stacks do what they do well, which is accelerate users and engineers.
> I reviewed several SSL implementations for coding style: OpenSSL, NSS, GnuTLS, JSSE, Botan, MatrixSSL and PolarSSL. I looked at how buffers are handled in parsers and writers. Of all of them, I think only JSSE, i.e. pure Java, can be trusted to be free of buffer overflows. It suggests that a good webserver for security-critical applications would be Tomcat, without native extensions.
That said, your first point seemed silly. Simplified, partial and building-block examples are used in almost all fields to facilitate teaching (including among medical students and surgeons). They are useful because they keep people moving along the process, teaching them terms, skills and concepts they will need to get to the next step.
What is your alternative method for teaching someone unfamiliar with these concepts in a way that won't just put them out to sea without a paddle?
I would also divide the book into two parts, the "easy" part and the "hard" part. The "easy" part would get readers to the point where they can safely use TLS, reliably PGP-encrypt something, hash a password, and invoke NaCl (which is part of the go.crypto package). I would probably spend a whole chapter on how to use Golang's TLS library, for instance. Most readers that are picking the book up so they can solve some business problem would probably never need to get past the "easy" part, and I would encourage them not to.
I would remove from the "hard" half of the book protocols that were insecure. An unauthenticated DH exchange is a poor basis for a cryptographic transport. Slash, cut, gone. A naive password challenge-response protocol doesn't solve anyone's business problems. Slice, snip, gone. In their place, I'd probably add more discussion of key exchange algorithms, with particular attention paid to how easy they are to get wrong.
IMO, it's exactly this kind of strong advice early on that would be of great benefit to would-be crypto developers. They need it drilled in as early as possible that the canyon-like pitfalls lie in the compositional problems of building a working cryptosystem. This is really a domain where just-ship-it cowboy coding becomes a massive liability.
1. Top researchers come up with algorithms and techniques
2. The research corpus reviews them
3. The programmer communities review them
4. Everybody else relies on them in their tools
Your review was harsh, because you know more. What, I think, was missing from it is a bit of "This is a great first step, let me help you make it better so we can move everybody else forward. Here are my comments."
In keeping with the hands-on philosophy: would the book be improved if, after introducing the code in the first section, it had the reader implement an exploit for that same code?
I do not feel remotely comfortable with the idea of writing a book containing prescriptions on how to design a cryptosystem. We're wary of doing that even for our clients, where we know all of the context and the threat model that the proposed system would face, and who will actually build it, and that we'll get paid to review the resulting implementation. I don't know how to solve that problem for strangers.
Other people do. They are much better than I am. When Trevor and Moxie write the book on why they chose the TextSecure primitives that they chose, I'll be first in line to buy.
You know, for old times' sake?
Someone who picked up the basics from a few Wikipedia articles here, a few papers there, a couple open source projects here and there... they're smart, so they're not completely clueless about the field, but they just don't have the experience to see where they fall short, the industry know-how, and so on.
I feel like instances of this in the tech community are not too rare, and it's a consequence of the internet: anyone can publish a book and distribute it all over the world now. While harm is being done through the spread of false information, what's most important is to educate such authors, treating this as a teachable moment, so they can become productive experts and correct their message. Of course, it requires them to be open-minded about their shortcomings, but it can be done.
PS: I have no clue who the author of Practical Cryptography With Go is.
(I don't know the author either)
Not if you extend being wrong to being ignorant, no. When you're right about something, in the sense of not being ignorant, you understand all the discussions and news easily; you know exactly what everyone is talking about, including when reading academic articles on the subject. When you're wrong about something, your wrongness butts up against the correct model again and again, and you're often left confused or unable to understand the actions, discussions, arguments, and conclusions of others (as opposed to seeing specific places where they are wrong). You can feel this lack of understanding. It just doesn't feel the same as properly understanding a subject.
I would argue that this review is saying that the author's understanding falls a little short of par for the course. The author would probably have had a chance to see this for themselves by getting a little more into the literature.
I would not write a book on structural engineering to learn the subject or become an expert. The stakes for the misinformation being spread are high.
Unfortunately, not even widely used, highly trusted implementations work right all the time. An out-of-bounds memory bug introduced by an insufficiently vetted commit opened up a serious flaw in OpenSSL. On a much, much smaller scale, I once had the misfortune of working with an old version of Microchip's PIC18 AES library, which had some serious issues that made it nonfunctional for anything more complex than the toy sample app it shipped with. But with enough scrutiny these problems are eventually exposed and fixed. Would a world where everyone rolled their own bespoke, ad-hoc SSL implementations be more secure? I doubt it.
In the end, I think there needs to be a cultural shift. People shouldn't be discouraged from building their own crypto for fun and learning, but they should be discouraged from deploying it for any application where real security is required - at least not before undergoing rigorous analysis. One of the first things Dan Boneh teaches in his Crypto I class is that you should think very long and hard before implementing your own cryptosystems (i.e. don't do it), because getting it right is hard, and getting it even the slightest bit wrong tends to make it useless. And when you consider that people's livelihoods (their personal information, their money) and even lives might be jeopardized, taking responsibility as an engineer becomes of paramount importance. Crypto just doesn't lend itself to a "build an MVP, get it working, move fast and break things" mindset.
Just one example I've found:
"The second relates to the design of structures. It is time for engineers and architects to get together to devise new structural forms that offer a higher degree of protection not only against terrorist attack, but also against other hazards. There is much to be learned from what happened in Nairobi and Dar es Salaam, in Oklahoma City, and at the World Trade Center. Similarly, retrofitting of existing structures needs to be studied systematically, as it can reduce, at modest or virtually no cost, the potential for damage."
For example, I'd call OpenSSL's documentation "criminally bad" to the point where it actively coaxes the user into making mistakes: there is no central collection of good practices for common use cases (e.g. sending an AES-256-CBC encrypted block of data securely with a shared key, using public/private keys, etc.), and the docs regularly fail to mention the proper way to initialize the data structures and modules (like RNGs) that they reference. Even finding out how to properly initialize everything for a standard PBKDF2 password derivation is a huge chore that requires reading through tons of badly formatted documents (or copying a random piece of code from StackOverflow, which may or may not be secure). That makes even highly secure libraries a minefield where you can easily introduce huge security flaws without even knowing you did so.
And as long as the excuse for that is "well, you just need a few years of studying all the crypto background", we'll keep seeing security breaches everywhere: most developers just aren't prepared to spend that much time learning the crypto field, or a lot of money on security experts. In the big picture that's a huge issue; too many devs just opt to copy random pieces of code from StackOverflow, which can have glaring security flaws or simply not be secure for the dev's use case.
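For contrast, here is roughly what a "central good-practices recipe" for one of those common use cases could look like in Go's standard library: encrypt-then-MAC with AES-256-CTR and HMAC-SHA-256 under two separate keys. This is my own illustrative sketch, not reviewed code and not from the book; function names are mine, and a real design should prefer a vetted AEAD construction.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/hmac"
	"crypto/rand"
	"crypto/sha256"
	"errors"
	"fmt"
	"io"
)

// seal encrypts plaintext with AES-256-CTR under encKey and then
// authenticates IV||ciphertext with HMAC-SHA-256 under a *separate*
// macKey (encrypt-then-MAC). Output layout: IV || ciphertext || tag.
func seal(encKey, macKey, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(encKey) // 32-byte key selects AES-256
	if err != nil {
		return nil, err
	}
	out := make([]byte, aes.BlockSize+len(plaintext))
	iv := out[:aes.BlockSize]
	if _, err := io.ReadFull(rand.Reader, iv); err != nil { // fresh random IV every message
		return nil, err
	}
	cipher.NewCTR(block, iv).XORKeyStream(out[aes.BlockSize:], plaintext)
	mac := hmac.New(sha256.New, macKey)
	mac.Write(out)
	return mac.Sum(out), nil // append the 32-byte tag
}

// open verifies the tag (in constant time, via hmac.Equal) before
// touching the ciphertext, and only then decrypts.
func open(encKey, macKey, msg []byte) ([]byte, error) {
	if len(msg) < aes.BlockSize+sha256.Size {
		return nil, errors.New("message too short")
	}
	body, tag := msg[:len(msg)-sha256.Size], msg[len(msg)-sha256.Size:]
	mac := hmac.New(sha256.New, macKey)
	mac.Write(body)
	if !hmac.Equal(mac.Sum(nil), tag) {
		return nil, errors.New("authentication failed")
	}
	block, err := aes.NewCipher(encKey)
	if err != nil {
		return nil, err
	}
	pt := make([]byte, len(body)-aes.BlockSize)
	cipher.NewCTR(block, body[:aes.BlockSize]).XORKeyStream(pt, body[aes.BlockSize:])
	return pt, nil
}

func main() {
	encKey := make([]byte, 32) // in practice, derive both keys from a KDF
	macKey := make([]byte, 32)
	io.ReadFull(rand.Reader, encKey)
	io.ReadFull(rand.Reader, macKey)
	sealed, _ := seal(encKey, macKey, []byte("attack at dawn"))
	pt, err := open(encKey, macKey, sealed)
	fmt.Println(string(pt), err) // attack at dawn <nil>
}
```

Even in a sketch this short there are several decisions (separate keys, random IV, MAC-before-decrypt, constant-time tag check) that library docs rarely spell out in one place, which is exactly the complaint above.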
Couldn't agree more. The problem is that for any bridge that gets used, every structural engineer signing off is going to have been educated and experienced to the extent that they are Chartered (or equivalent), the plans for the bridge have to be approved by planning authorities, and a thorough documentation and review process has been gone through before the first drop of concrete has been poured. And once it's up, it's tested and inspected regularly.
As every engineer on the project has necessarily been educated they know how hard it is and what the pitfalls are.
Bad crypto results from the fact that there is no need for any of the above requirements to be met. Any old engineer might think they can produce a good implementation, design it, put it into production and have systems secured by it, without being aware of the potential problems they're causing.
The free world of the internet is a wonderful thing, but regulation isn't entirely a bad thing either.
I disagree with you completely and absolutely. Your bridge in Boston isn't going to collapse the moment a researcher sitting in his bathtub in Tel Aviv has a eureka moment.
But if that eureka moment results in a preimage attack on a secure hashing algorithm, that hashing algorithm is broken for everyone, all over the world, forever (practically, as soon as the attack is public knowledge). Cryptographers have to actively seek out this information.
That's why MD5 is deprecated, even though its known weakness is weaker than the one I've just described. (It's just a chosen-prefix collision that can be computed today; a preimage [chosen-hash] attack still takes nearly the full search space.)
Security is applied mathematics, the way engineering is applied physics. But the laws of physics don't change on an annual basis, or the way in which they change is too low-level to apply to engineering, whereas the laws of applied mathematics do.
Systems administrators have to keep up to date on an even more active basis, in some cases needing to patch any system within 48 hours of a public disclosure.
So I simply disagree that as an engineering endeavor implementation of cryptosystems is in any way similar to any other form of engineering.
In fact, for the particular example I used in the above case (a hash), the very existence of the operation is an open problem. ("The existence of such one-way functions is still an open conjecture", Wikipedia.)
What other branch of engineering relies on laws that may well be false?
Maybe some mathematician will prove AES is broken tomorrow. In terms of the analogy, I don't care. The most qualified, fastidious engineer building the most correct implementation of AES is going to have an insecure system on their hands at the end of the day if that's true, and there is nothing we can do about that on the implementation side.
This is distinct from some programmer reading "Learn Crypto in 24 Hours", building a bad implementation of an otherwise secure cryptosystem due to inexperience or carelessness, and then screwing people over because that bad cryptosystem goes into production code.
The entities that would most likely do the regulating already exist, but I'm unconvinced any of them would actually improve the situation. For instance, see http://blog.cr.yp.to/20140411-nist.html . What real group of people could really regulate cryptography? What real group of people's regulations could actually bring benefits to the field, rather than rubber stamping, government meddling on behalf of the NSA, and an emphasis on quantity over quality?
If you can't answer that, I'd suggest staying away from a reflexive "regulation" standpoint.
There is actually some regulation in this space. FIPS compliance, PCI DSS etc.. It's just not as wide reaching as something like the FAA for aeroplanes.
The programming field just keeps making the same old mistakes again and again. We're even economically motivated to keep things this way, because we get paid to fix things when they go wrong. Roads in Germany come with a warranty, so the contractors make sure to build them correctly. Roads in the US and Italy keep getting fixed, because that's how those companies get paid.
Very few people seriously argue that. (Non-zero, but very few.) I'm a libertarian and I wouldn't seriously argue that. It's mostly a strawman. (I advise anyone who makes routine use of that strawman to stop, and read more carefully whenever they feel tempted to use it again, but that's another post.)
My point is that we aren't talking about "regulation in general", we're talking about "regulation in cryptography", and it's a logical and/or cognitive error to fall back to a general case when one is trying to consider a specific case. If we're going to regulate cryptography, how are we going to regulate it? "In the general case regulators" don't exist. The closest entities we have now that would almost certainly become the regulators show few to no signs of being worthy of the task. This is a serious problem to be addressed without falling back to "general cases".
But let's not kid ourselves, reimplementing a library the size of OpenSSL in a new language, or even the same one, is not a trivial matter. We're talking about a $10m+ investment, who's going to pay?
> there is concern that the NIST curves are backdoored and should be disfavored and replaced with Curve25519 and curves of similar construction.
Of course, "there is concern" is pretty vague, but it should be made clear that such concerns are in the realm of pure speculation at this point. There is simply no known way of constructing a "backdoored" elliptic curve of prime order over a prime field (in particular, the closest thing resembling such a backdoor, namely Teske's key escrow technique based on isogenies from GHS-weak curves, cannot work over a prime field). Scientifically speaking, I don't see more reasons to believe the assertion that "NIST parameters are backdoored because they aren't rigid" than the (equally unfounded) speculation that "Curve25519 may be weak because it has small parameters/a special base field/composite order/etc.".
Moreover, to say that the NSA has backdoored the NIST curve parameters is to assume that they have known, for quite a long time now, a serious weakness affecting a significant fraction of all elliptic curves of prime order over a given base field that has so far escaped the scrutiny of all mathematicians and cryptographers not working for a TLA. Being leaps and bounds ahead of the academic community in an advanced, pure mathematical subject doesn't quite align with what we know about NSA capabilities.
Don't take this the wrong way: there are good reasons to favor Curve25519 and other implementation-friendly elliptic curves (namely, they are faster, and there are fewer ways of shooting yourself in the foot when you implement them), but "NIST curves are backdoored" is not a very serious one.
The issue with the NIST P- curves is that there's no good reason to trust them. And, for what it's worth, being ahead of academia on pure math isn't science fiction; NSA employs a lot of mathematicians. But the notion of a backdoor in the NIST curves is totally speculative.
Here's what I was trying to capture:
Despite its very weird submission as a story to HN, what you'd been reading was just a very long HN comment; I wrote it in a single draft and in the style I would use when writing a comment.
And, if the criticisms can be addressed, in both specifics and perspective, for a future edition, they'll have a hardened book... almost sure to earn another updated expert review ("is it fixed?") at that time.
> In considering RSA, the book recommends /dev/random, despite having previously advised readers to avoid /dev/random in favor of /dev/urandom. The book was right the first time.
From "man 4 urandom":
> A read from the /dev/urandom device will not block waiting for more entropy. As a result, if there is not sufficient entropy in the entropy pool, the returned values are theoretically vulnerable to a cryptographic attack on the algorithms used by the driver.
In fact, using /dev/urandom was one of the causes of the weak SSH keys found in this research: https://factorable.net/
So: Why is /dev/urandom the correct choice over /dev/random ?
Anyhow, its conclusions seem to be mistaken to me:
> It’s also a bug in the Linux kernel. But it’s also easily fixed in userland: at boot, seed urandom explicitly. Most Linux distributions have done this for a long time.
If you're an application developer (of something that runs very early in the boot process) but you're not making your own distro, and you can't trust your distro (given how many factorable keys existed, "most Linux distributions have done this" might not actually hold true, or not for a high enough percentage), then you don't really have anything else you can rely on to seed /dev/urandom explicitly.
I'd think the correct approach is to use urandom everywhere but on Linux (after all, as long as your application isn't a blocker for the boot of the system, it doesn't seem terrible to wait for /dev/random).
Also, reading and blocking from /dev/random seems akin to failing early and explicitly (in the case where blocking on read is actually a problem), while reading urandom when not initialized seem to be a silent failure.
But I'm not going to write software that has to read from either device anytime soon, so don't panic if I'm mistaken :)
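For what it's worth, Go application code largely sidesteps the device-file choice: `crypto/rand.Reader` delegates to the platform CSPRNG (historically /dev/urandom on Linux, getrandom(2) in newer releases, which blocks only until the pool has been seeded once). A minimal hedged sketch, with a function name of my own invention:

```go
package main

import (
	"crypto/rand"
	"fmt"
	"io"
)

// newKey draws 32 bytes from the platform CSPRNG via crypto/rand.
// If the read fails, treat it as fatal; never fall back to a weaker
// entropy source like time or PIDs.
func newKey() ([]byte, error) {
	key := make([]byte, 32)
	if _, err := io.ReadFull(rand.Reader, key); err != nil {
		return nil, err
	}
	return key, nil
}

func main() {
	key, err := newKey()
	fmt.Println(len(key), err) // 32 <nil>
}
```

The "fail early and explicitly" instinct above is the right one: checking the error from the random read is the program-level equivalent of blocking rather than silently using an unseeded pool.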
When did this community become more concerned with tone than correctness? The top of this thread is filled with people saying that the tone is bad, it's unproductive, it's unnecessary, etc. Yet nobody seems concerned about the published book filled with bad information that a lot of people are going to "learn" from. What gives?
There's been a pretty strong concern about tone on HN from the early days, mainly (afaict) driven by Paul Graham having an interest in and repeatedly commenting about it. It's not the only concern, but avoidance of flaming and mean-spirited comments, in addition to avoidance of vapid or dumb comments, is one of the openly and repeatedly stated design goals of the community. I.e. it should be intelligent discussion, conducted in a collegial tone.
(This is a general comment on whether tone is and/or should be important on HN, not an evaluation of tptacek's review or implication that this comment/gist in particular would fall afoul of the intended HN standards.)
The problem I have is that people are freaking out over extremely minor matter of tone while ignoring important technical problems. Worrying about tone is fine. Worrying about it to such an extreme degree in preference to other things is not.
Being right used to be the ultimate trump over social dynamics, which is what made tech a breath of fresh air to so many. Now that the field has become socially popular, it's been mired in the same vapid talking heads as everywhere else. And the people who actually know things are much quieter, as they generally have better things to do than compete for airtime.
I read Schneier's & Ferguson's Practical Cryptography years ago, the only thing I remember about it is the "don't try this at home" message.
I cannot take anyone who advocated MAC-then-Encrypt, in 2010, seriously (in the book Cryptography Engineering: Design Principles and Practical Applications by Niels Ferguson, Bruce Schneier, Tadayoshi Kohno).
The school of cryptography they subscribe to seems to be "crypto is black magic; this is tried and it works and it is pretty much secure because I feel it is secure; experience is everything; proofs can have bugs too", as opposed to a more principled, analytical, methodical approach grounded in provable security.
This is especially problematic in pedagogical contexts, because the learners, by definition, do not have much experience or calibrated feelings, so they'll be lost or have to copy the design decisions of the authors without taking into account the contexts or that they might be flat wrong. That approach indeed implies the natural advice to someone who wants to learn will be "don't try it at home".
Was rather impressed...
Can I ask why? What is so dangerous with asymmetric crypto compared to symmetric crypto?
Another big reason is that public-key algorithms are parameterized to an extent symmetric systems aren't. Two random numbers is all you need to safely encrypt something with AES. Diffie-Hellman, the simplest of the public-key algorithms, needs a prime, a generator, and random private keys with particular relationships to those parameters.
Even though it's theoretical, the side effects of this fact surface from time to time as engineering issues in asymmetric crypto: all information that the attacker might need to break asymmetric crypto is more or less in the ciphertext, intuitively suggesting it's easier for asymmetric crypto to catastrophically go wrong.
It's great that one-time pad exists, but it's not really relevant in actual crypto code, right?
The only actual reason I can think of is that symmetric crypto is easier to write and understand - you just mangle and xor some text back and forth, while in asymmetric crypto, you need to understand fairly complex algebra. But again, that's not that important if you use existing primitives, right?
Not as big an issue with ECC, but RSA also has much larger block sizes, increasing the size of small payloads.
It's been my experience that Asymmetric is used for kex (key exchange) or key agreement or signing, but encryption is done using a symmetric algorithm.
Or you can browse his blog, Practical Cryptography or Cryptography Engineering may be a good start.
* This book, I am not making this up, contains the string: "We can use ASN.1 to make the format easier to parse".
Or is the critique to an attempt to write a custom ASN.1 serializer/parser?
Didn't you know? If a web developer out of high school cannot read a format at first glance, it's obviously over-engineered and useless and anyhow everybody should always use JSON anyways.
a) it provides readers with a laundry list of things to go study independently
b) the book author can, given time and inclination, do the same study and improve the book
I want to go through every single chapter and rewrite it to stave off the imaginary critics in my head who will undoubtedly tear it apart.
Even if your work is on something fairly esoteric, you can almost certainly find someone qualified to review it on technical grounds, even if they're not an acknowledged expert in the precise topic you're writing about.
I have been unable to locate another book on this subject (its implementation in Go, anyway), but there are scores of experts in Go, many of whom are far more qualified than I am. I would respect their opinions on the book itself.
Best of luck in the process! Would you care to share the subject area? Now you've piqued my curiosity...
This was challenging primarily because it's so idiomatic and easy to handle in Go, so a good deal of the book talks about pitfalls, testing and implementation.
While criticism is good, the condescending way it is presented, as well as being overly critical are bad. Example:
"Total undue reverence for NIST and FIPS standards; for instance, the book recommends PBKDF2 over bcrypt and scrypt (amusingly: the book actually recommends against scrypt, which is too new for it) because it's standardized."
I know people love scrypt and bcrypt, and they have proven safe so far, but there are advantages to using standardized methods. An implementation can make something less safe than the standard.
bcrypt is also approximately the same age as PBKDF2.
And, finally, standardization is a very poor substitute for security analysis. PKCS1v1.5 is also a standard. If you want to argue against bcrypt, you'll have to marshal actual arguments.
I'm arguing that there may be advantages in using standardized methods (for interoperability) and especially implementations.
Which one is safer, using PBKDF2 from a known implementation or the "bcrypt library" for Ruby/Node that someone just posted to github? Oh what do you mean that's not how you read secure random numbers?
What kind of dilemma choice is that? It should rather be:
"Which one is safer, using PBKDF2 from a known implementation or bcrypt from a known implementation?"
But even that question is erroneous. After all, why shouldn't we just use PBKDF2 from RSA's known BSAFE implementation? What could possibly go wrong with that?
Or if you don't like RSA because of da Feds, why not use the known-good OpenSSL for securing data in motion instead of something new and untested like stunnel or spiped?
Known implementations are a very good consideration factor. You should use TLS (though maybe not OpenSSL's) in general instead of designing your own secure transport. But being from a known implementation should not be the only factor you consider otherwise you're just cargo culting. You have to understand pros/cons of each solution, even if that means you have to learn a little bit about the problem space.
"But being from a known implementation should not be the only factor you consider otherwise you're just cargo culting"
In the same way some people blindly answer "use bcrypt" to any mention of password hashing.
It's almost like there was a reason I explicitly decried cargo culting (yes, that counts too) and blind answers instead of understanding.
Could you try to read and understand the comments that are left instead of responding to points that were never made?
Is the audience for this review intended to be cryptography experts, who would not read such a book except to praise or trash it? If that's the case, it seems rather mean spirited. More "wow check this loser out" than "I don't recommend this book to beginners or anyone and here's why."