Everything you need to know about cryptography in 1 hour (2010) [pdf]
275 points by epsylon on June 12, 2014 | 97 comments

 The reason this is being (re)posted now is that I gave this talk at a Polyglot Vancouver meetup last night. Freed from the constraint of a conference schedule I actually took about 90 minutes to go through this talk this time (followed by another 30 minutes of questions).
 As someone who is completely new to cryptography and knows very little, where would you recommend I start? I've recently been reading about bitwise operations to become familiar with how to (somewhat) interpret what a cryptographic algorithm is doing in a program, since bitwise operations seem to be popular in almost all crypto algorithms.
 I would highly recommend reading Cryptography Engineering [0] cover to cover. It's amazingly readable, covers the basics and the theory necessary to understand how things work, and includes ample practical advice and observations on the industry. The first thing I did after the Snowden leaks was read through the entire thing, and after doing so I really wished I had done it years earlier. There are very few books that I think should be required reading across the board for software engineers, but this is one that I do think everyone writing code should read every page of.
 I don't. This book recommends, say, MAC-then-Encrypt and tries to justify it in 2010 by perpetuating FUD about provable crypto (proofs are only valid if your primitives are ideal, therefore you should worry about one set of risks that you can't measure, so trust us instead of proofs). There's no excuse for doing that. In general, the authors seem to subscribe to the "crypto is black magic" school of thought, which doesn't make for good pedagogy.
 Dan Boneh's free "Crypto 1" on coursera. A new session will be starting on the 30th of June. I've taken it myself and this is hands down one of the best MOOCs (and class overall) I've ever taken.
 I agree, "Crypto 1" was excellent! On Coursera's website, in the upcoming section, it says that "Crypto 2" starts in 3 months. I hope that's true!
 It hasn't been the last, oh... four times or so?
 It was indeed an excellent course, and for any would-be participants I recommend brushing up on discrete mathematics and number theory beforehand if they're not your strong suit. I found the course pretty hard as a programmer with a strong interest in crypto but no formal CS/maths background. The coding pieces were fairly straightforward, but the maths hurt.
 The Crypto 101 ebook [1] has been discussed a bit on HN [2]. It was submitted by tptacek, who had positive things to say about it. It is in my "summer reading" pile.
 The free cryptography courses at Coursera. I have seen the videos uploaded somewhere, maybe you can find them.
 Hey, I found a small typo in the presentation. On page 80, in the "good" code sample, you can see:

  x |= MAC_computed[i] - MAC_computed[i];

 It probably should be something like:

  x |= MAC_computed[i] - MAC_received[i];
 Or better yet:

  x |= MAC_computed[i] ^ MAC_received[i];

 If they're not careful and the numeric type being subtracted is wider than CPU registers, then depending on the architecture, the compiler-generated carry code implementing the wider-than-register subtraction may introduce timing attacks. A wider-than-register XOR is much less likely to have such issues.
 I don't suppose there's a video of this available? Longer + decent Q&A sounds appealing.
 There was a video camera. I assume it will be posted somewhere; I didn't ask.
 Do you have a copy of the presentation in Beamer's "handout" mode? I think that's what it's called. It's the mode that condenses the 140+ pages into 20-30 by collapsing all the reveals.
 Is there a website that has information like this listed so that it can be updated over time? And perhaps where items could be discussed? For example, is the suggested key size still the same as in 2010? Thank you.
 Many thanks. I wish Beamer made a bigger point of recommending distribution in handout mode or more accurately dissuading distribution in presentation mode.
 I always do that now. The only reason I didn't this time is that the slides were from 2010, before I updated my Makefile to automatically build both versions.
 > that the slides were from 2010

 Considering this, what do you think has changed in the past 5 years? E.g. the slides mention SHA3 as a future option; now it is finalized and afaik usable.

 There is also one interesting thing about SHA3 which I learned recently, namely that it can apparently easily be used as a MAC too:

 > Unlike SHA-1 and SHA-2, Keccak does not have the length-extension weakness, hence does not need the HMAC nested construction. Instead, MAC computation can be performed by simply prepending the message with the key

 In this light, do you think it would be reasonable to add an exception to the "DON'T: Try to use a hash function as a symmetric signature." rule?

 Another thing is that ECC seems to be on the rise (or is it just my perception?). Do you think a revised slide set would include something more about elliptic curves?
 > what do you think has changed in the past 5 years? Eg the slides mention SHA3 as a future option, now it is finalized and afaik usable.

 Not much has changed. I still think SHA3 is worth considering 5-10 years into the future; five years ago I was concerned about the implications of the MD5 and SHA1 breaks for SHA2, but the lack of recent progress makes me happier staying on SHA2 while SHA3 gets more analysis.

 > do you think it would be reasonable to add an exception for the "DON'T: Try to use a hash function as a symmetric signature." rule?

 I mentioned this in the talk; even with SHA3 you need to be careful, since a simple "append and hash" would result in MAC("key", "data") == MAC("keyd", "ata"), which breaks the MAC assumptions. Yes, you can use SHA3 as a MAC, but make sure you know what you're doing.

 > ECC seems to be on the rise (or is it just my perception?). Do you think a revised slide set would include something more about elliptic curves?

 ECC is getting more popular, though not necessarily for the right reasons. (The big drivers seem to be "bitcoin uses this" and "the most common way of using this provides perfect forward secrecy".) That said, as mathematicians continue to attack ECC systems I am gradually becoming more comfortable; the 2025 version of this talk might recommend them, but for now I don't think it makes sense to change my recommendation (except for situations like bitcoin which specifically need ECC's advantages).
 Here is the video of the talk: http://www.fosslc.org/drupal/content/everything-you-need-kno...
 Someone posted the blip.tv link above: http://blip.tv/fosslc/everything-you-need-to-know-about-cryp...

  $ youtube-dl http://blip.tv/blahblah
  $ ffmpeg -i file.flv file.ogv
 youtube-dl also has a --recode-video option that takes ogg or webm.
 Thank you for introducing me to youtube-dl.
 > DON’T: Put FreeBSD-8.0-RELEASE-amd64-disc1.iso and CHECKSUM.SHA256 onto the same FTP server and think that you’ve done something useful.

 Actually, this is a bit off: it's just that this is not cryptography- or security-related and doesn't have anything to do with authentication. It's the same reason why some fansubbers still include a CRC32 in the filename: basic unauthenticated integrity checking. Totally insecure, but still better than nothing.

 I had a case where a downloaded file was broken: for some weird reason either software, storage, or the network had failed, TCP/IP checksumming didn't help, and the fact was, I had the wrong bits on the hard drive. I didn't have broadband connectivity at home, and that cryptographically-pointless CHECKSUM.SHA256 (well, actually it was MD5) helped me find that out before I left the library.

 And recently, the same concept helped me validate files with a faulty cloud storage service that had managed to lose some chunks and silently replaced them with zeroes when I tried to download the data back.

 Obviously, a signed file stored on another server would always be a better idea. Just nitpicking.
 I used Poly1305, but exactly in the way Daniel J. Bernstein used it in his NaCl library. I also used his ECC signature scheme, also in exactly the same way. :)

 As far as I can tell, the gotcha with Poly1305 is that you absolutely cannot reuse the same key, ever. His construction involves using your keyed cipher to create 32 random bytes by encrypting 32 zero bytes before encrypting the actual message. These random bytes are then used as the one-time-use Poly1305 key. In NaCl he does this with either Salsa20 or AES-CTR, both of which are stream ciphers. (CTR converts a block cipher into a stream cipher.)
 Here's a provocative gem:> The purpose of cryptography is to force the US government to torture you.
 It's pretty simple -- if the US government really really wants your secrets, they can kidnap you and torture you until you tell them what they want to know. Cryptography can protect data, but it doesn't protect humans; all it can do is make sure that humans are the only remaining point of attack.
 That's not entirely true. Proper cryptography can keep them from learning that it's you they'll need to kidnap to get the secret, or even keep them from learning that there is a secret they might care about in the first place.

 Also, there are plenty of bad guys in the world who can't kidnap and torture you, and it's still quite worthwhile to keep your secrets from them.
 You're over-thinking this. The point is simply that no matter how good the cryptography in a system is, if there are humans involved then you need to worry about human factors as well.
 The easiest way to avoid the human factor is to get a scapegoat. You make it seem like someone else is responsible for, or knows about, the crypto or its data payload. They will then torture that individual indefinitely until they confess to something. It's better if they don't know you or anything about your scheme as that way it'll look like they're holding out a really long time on important information.Then the only thing you need to worry about is that person dying, in which case the investigation continues. So similar to upping the number of rounds on PBKDF2 every year, you need a new scapegoat every year, or however long it takes them to break either the crypto or the scapegoat.
 As in the $5 wrench xkcd:
 or Rubber-hose Cryptanalysis
 Would be interested in hearing more detail about your objections to both Poly1305 and ECC.
 Too many ways to screw things up, not enough decades of cryptographic analysis, and there are simpler tools available which have been around for longer.This is subject to the caveat that ECC offers benefits under certain specific conditions (e.g., you need small signatures or a small ASIC die area); but in those situations you want to talk to a cryptographer anyway. My talk was providing guidelines for software developers who are writing code for general-purpose PC hardware.
 What are your thoughts on using NaCl? On one hand it gives harder-to-misuse primitives; on the other, it uses elliptic curve crypto.
 I'm not sure how anyone can look at OAEP and call RSA "simple". Worse still: very few people will ever implement their own ECC code, but lots of people will use RSA implementations that more or less just expose the RSA primitive and fob "encoding" off on the developer.
 > PROBABLY AVOID: Elliptic Curve signature schemes.Including Ed25519?
 Probably. As I said elsewhere, this is subject to the caveat that sometimes you need the performance characteristics of ECC; I'm providing advice for general-purpose computing environments which do not have any such constraints.
 Not that I really feel comfortable challenging you on anything crypto-related, but it seems to me that Ed25519 is superior to RSA in every conceivable way, even for general-purpose computing environments. It's faster, key generation is easier, the signatures are shorter, and there are fewer ways to shoot yourself in the foot (e.g. no padding issues). Is there a reason you don't actively advocate it, other than the fact that it's not widely used?
 Elliptic curves have had fewer decades of cryptographic analysis. There's a lot of structure still being explored and I'm not so confident that nobody will ever find a way to exploit it as I am with integer factorization.
 Thanks!
 This PDF is a tiny bit old, so I wonder what they'd say today. I guess it depends on how conservative you want to be. RSA has been through a ton of cryptanalysis and peer review and has stood the test of time. Ed25519 is newer. It's highly regarded but hasn't been analyzed as much or used as heavily so you're taking a bit more of a risk.
 You are entitled to your opinions, no matter how wrong they are. ;-)
 This lets me down a little bit. If two guys that I consider to be very knowledgeable on this topic disagree on some key things, how am I supposed to get most of this right?
 I was skipping the details because Thomas and I have had this argument here at least a dozen times, and I figured that people were getting sick of it by now. To summarize the arguments:

 Both CTR+HMAC and combined AEAD modes, correctly implemented and correctly used, will keep you safe against existing published attacks. Thomas takes the view that "correctly implemented and correctly used" is a problem, and I'll accept that he's right to recommend AEAD modes in that case (as long as you have a good cryptographic library [1] available to you).

 I take the view that if you can't take CTR and HMAC and put them together correctly, there's no way the rest of your code is ever going to be secure, so you've already lost; so I focus on the "against existing published attacks" side of things, look at the places where novel cryptographic attacks tend to be found, and opt for combining two very simple and well-understood constructions.

 The same story plays out for RSA vs. ECC: Thomas is worried about the fact that people have made dumb mistakes when using RSA, while I figure that if you're going to make those dumb mistakes (especially after my talk) you're going to write code which is otherwise insecure anyway, so I focus on the places where I think it is more likely that attacks will be found in the future [2].

 If you're a high school student with two years of Python experience and you want to add some cryptography to your cat photo sharing startup, listen to Thomas. If you're a senior developer with 20 years of experience writing C code for internet-facing daemons, and the code you write is going to be used by democracy activists in China, listen to me.

 [1] I'm not convinced that such a thing exists right now.

 [2] Or, alternatively, where attacks may have already been found, but not published.
 I love your summary about cat-sharing startups vs. C code for internet-facing daemons, but the cat-sharing people can't usually afford our rates for crypto work; most of my experience comes from code built by people with lots of experience.Also, if you asked me who was more likely to get crypto right, the Django web guy or the C daemon guy, I'd bet on the Django guy every time. Betting against crypto implemented in C is like betting when you've made a full house on the flop: all in.
 First off, do not interpret my comments as trying to provide any sort of advice to anyone. I am merely expressing frustration.

 I have to agree with Colin's [1] point. The standard libraries for the languages I see in assessments most often just don't include AEAD constructions. And when public libraries exist, they haven't been properly assessed.
 Can you be more specific? Because this assertion didn't ring true to me based on my own experience, and less than 5 minutes of Googling appears to refute it:

 * OpenSSL supports AEAD through CCM and GCM (and OpenSSL's GCM uses PCLMUL and shouldn't have the obvious cache leak)

 * Botan supports AEAD through OCB, GCM, CCM, EAX, and SIV(!)

 * Java JCE with the Bouncycastle provider (extremely popular) does AEAD with GCM, CCM, and OCB

 * .NET stack languages get AEAD through CCM or GCM

 * Crypto++ supports AEAD with GCM, CCM, and EAX

 * Golang supports AEAD through GCM in the standard library (I did implement a crappy OCB myself, but the go.crypto package probably has a better one)

 What bases aren't covered here? Bear in mind that most languages get their crypto through bindings to OpenSSL.
 Sorry for the slow reply. I haven't looked into setting up notifications for replies (if that exists on HN?).

 .NET probably represents the largest block of applications I see, followed by (shudder) CF. I don't really have any hope for CF, but crypto is hardly its biggest problem.

 As far as .NET goes, my first point was that the standard library does not support it; true in this case. I'm aware of CLR and Bouncy Castle as external libraries, but neither of them inspires me with confidence. Supposedly, CLR was released by Microsoft, but why didn't they include it in the standard library or release any associated security assessment reports? Were there any?

 I've heard the name Bouncy Castle thrown around quite a bit, but that's about it. When I dig through their websites, it leaves me with a feeling not unlike trying to find information about TrueCrypt. Granted, I haven't followed their project(s) very closely. But because of that feeling, I honestly trust OpenSSL more, because people are scared about it.

 So, maybe I'm just missing something, but this is where I've arrived. Please correct me if I'm way off base.
 Notifications are possible via this third party site: http://hnnotify.com – works great!
 I am not as knowledgeable on these topics as Colin. But in this weird specific set of cases, I think my take is also closer to the conventional wisdom among cryptographic engineers; if you got a panel of them to stand in for me, I think their comment would sound close to mine.
 I think you're right about me being in a minority here. I also think there's an unfortunate amount of neophilia in the cryptographic community, partly because you don't get publications by not doing anything new.
 Academics do love their new fancy stuff. Here's the thing: elliptic curve cryptography is not particularly new anymore. I'd buy this neophilia argument if it were concerning pairings, ideal lattices, or the stuff people are doing with these nowadays (FHE, obfuscation, etc).

 Elliptic curves were proposed, and have since been studied in the context of cryptography, in 1985. They're 30 years old! For comparison, finite-field discrete-log Diffie-Hellman is 38 years old, and RSA is 37. The latter have been severely beaten down in the decades since, whereas elliptic curves have stayed (modulo special cases, but those also exist for DH/RSA) resistant to every non-generic attack so far. I would say ECC has a better track record, and could be considered the conservative choice.

 It could be argued that the underlying problems, integer factorization and FF discrete log, have been studied for much longer than the ECDLP. Maybe. Some basic algorithms go centuries back, but I would argue that the field has only seen real progress starting in the mid-1970s, with CFRAC and Pollard's algorithms. It can be counter-argued that elliptic curves as a subject have only existed for around 100 years, so they are still underdeveloped. Again, maybe.

 There is indeed a new wave of interest in elliptic curves, and it is still fueling many publications every year. But these are mostly performance engineering at this point: Edwards curves are not fundamentally different from what Miller proposed in 1985, as far as the ECDLP is concerned. I would not recommend the wonkier stuff like GLS/GLV curves, though, nor curves over extension fields or of higher genus: that would be too neophiliac, even for me.

 (I realize that I'm not gonna change your (or probably anyone's) mind, but you make it sound like elliptic curves are much more of a novelty than they really are. I don't disagree too much about the AEAD vs CTR+HMAC issue.)
 I don't think it's fair to compare on the basis of years alone. Elliptic curves weren't getting nearly as much attention back when there were a dozen patents covering everything.
 neophilia: love of or enthusiasm for what is new or novel
 I think that in a case like this, where the two mostly agree to disagree, both solutions are usable. You could bicker endlessly about the superiority of one over the other, but it seems like picking either one is not a massive mistake.
 The slide starting on PDF page 83 seems wrong? If I know (x, E_k(x)) for some x (or even without knowing them), then I can trivially compute (x', E_{k'}(x')) for k' and x' of my choosing.Somewhat disappointing, since this slide is the only one in the presentation containing anything cryptographically "meaty".
 > I can trivially compute (x', E_{k'}(x')) for k' and x' of my choosing.

 True, I oversimplified a bit. I was referring to situations where you don't know k' and x', e.g., x' = x and k' = k ^ \epsilon for some value \epsilon.
 Ok, so what is the revised statement? "Referring to situations" is pretty vague...
 Not if E is ideal. The point is that an ideal block cipher is not vulnerable to related-key attacks. Each key should select a permutation indistinguishable from a random one: drawn uniformly and independently from S_m, where m is the cardinality of the block space.
 From what is on the slide, I can compute (x', E_{k'}(x')) for x' and k' of my choosing by just running the encryption algorithm.(I know that ideal ciphers are defined correctly elsewhere, and agree that their definition makes sense.)
 I would love to know your opinion about ARX ciphers.

 A direct implementation of AES from the specification can be attacked using a cache-timing side channel. ARX ciphers are much easier to implement in software, and they also run in constant time and are therefore immune to timing attacks.

 What is your opinion of the ARX ciphers ChaCha20 (from Daniel J. Bernstein) and Threefish / the Skein hash (Bruce Schneier, Niels Ferguson)?
 I have a probably stupid question for you regarding ECC: in general, is ECC cryptography safe from attacks such as Dual_EC_DRBG, where the curve itself was (potentially) constructed specifically to be breakable? I.e. if I use one of the publicly recommended curves, am I safe, or could the curve also be broken by its author?
 Thanks for sharing the slides (and techpeace for the links to the video).

 A question I have, though: given that we (those of us who collectively don't identify ourselves as security rocket-surgeons) are always advised to stick with the high-level stuff and to stay away from stuff like AES (ref. slide 16), which nobody does anyway, why is no mention ever made of established and known protocols? Examples like Needham-Schroeder (fixed version, http://en.wikipedia.org/wiki/Needham-Schroeder_protocol) have proven to be really useful. Yes, there's a metric ton of work involved in implementing something like this, but going through it once is amazing practice for getting it right in future (that, at least, is my experience).

 Why is it that we talk about symmetric and asymmetric encryption, but never go as far as the protocols that provide real context for their uses?
 There weren't any "Don'ts" about compression.Is that solely an SSL/TLS concern, or is it generally applicable to cryptosystems?
 He recommends HMAC-SHA256 in this paper, but I think that AES-GCM is a better construction, as long as you understand the requirements for IV uniqueness. It offers significant improvements on top of standard HMAC (privacy, plus authentication of additional associated data) without adding much in terms of size.
 "And I still maintain that recommendation. CTR+HMAC is far more robust against side channel attacks than any AE mode." -- https://twitter.com/cperciva/status/475360367191674881
 I think Colin has a hard row to hoe with this argument, given how often HMAC implementations cough up timing vulnerabilities. Even if you're stuck with GCM, GCM has the advantage of having mercifully few implementations. Everyone writes their own HMAC verifiers (badly).
 I think you're confused? AES-GCM is an authenticated encryption mode. HMAC-SHA256 is just a cryptographic MAC. The two only share the same applications when the latter is combined with a non-authenticated encryption mode in an Encrypt-then-MAC construction.The problem with the polynomial AE modes is they are trickier to implement (AES is actually small, as is SHA).
 In many cases where you use an HMAC, you'll also want to encrypt as well (i.e. web cookies, tokens, etc). Because of this, I find it's much better just to go with AES-GCM from the start, rather than discovering later that you don't want to leak your internal DB identifiers and tacking on the encrypt-then-MAC.

 It also means that the implementation of the algorithm does the work of both authentication and privacy, whereas with HMAC-SHA256 (and HMAC-SHA256 + AES-CBC/CTR) you will see developers hand-rolling more of it, leaving more room for them to do it wrong.

 I also prefer that AES-GCM is a mode itself, while encrypt-then-MAC generally requires a developer to do research into the appropriate encryption mode and other details required to "get this right".
 Would HMAC-SHA256 have been a better recommendation in the year that this presentation was given, 2010?
 "Just reviewed my "crypto in 1 hour" talk slides in preparation for giving this talk on Wednesday. 4 years after writing, no updates needed." -- https://twitter.com/cperciva/status/475359526145646593
 Good suggestion to avoid PKCS v1.5; sadly PGP requires it: http://tools.ietf.org/html/rfc4880#section-13.1
 """Conventional wisdom: Don't write cryptographic code!Use SSL for transport""Honest question here: suppose I have a webapp with multiple servers, being load-balanced through Amazon's ELB. This sounds about as standard as it can get. Question is: how does one handle client migration between servers, and client authentication, without writing cryptographic code or knowing anything about cryptography?(apologies if this is trivial)
 SSL usually terminates at the load balancer.A quick googling found this guide to setting up SSL on ELB: http://docs.aws.amazon.com/ElasticLoadBalancing/latest/Devel...
 Yes, but what happens to the client authentication when it migrates between servers? Do I write a cookie to the client saying "authenticated as user X"? Encrypt the cookie? Sign the cookie? It sounds one would need to write cryptographic code to do that...
 You write a cookie to the client containing some random key and associate that key to the user on your backend in the shared db. Since it's just random data, you don't need to sign or encrypt the cookie and SSL takes care of protecting it in transport.
 Does this random value need to be unpredictable?

 And leaving this specific example aside for a moment, I was trying to make a larger point: people write crypto code for a reason. Some people will write it for no reason at all, yes. A lot of other devs would be dying to save themselves the time it takes to write any type of code, especially code dealing with something they aren't experts in, but they just can't. Now, it seems likely you and I can hash out the details of an acceptable solution in this discussion (in fact, I think once you take steps to make sure the random source isn't predictable, you're good, but even this is slightly crypto-related), but how would the average developer know this? How would he go from "don't write crypto code" to this? In my humble opinion, simply saying "don't write crypto code" isn't a solution, and there are good reasons why this hasn't worked so far.
 Here's a principle from which you could derive "use unique tokens to identify the user": effectively, TLS already has a mechanism for fingerprinting users, called "client certificates". Their current browser UI sucks (much like HTTP Basic Auth), so nobody uses them (much like HTTP Basic Auth), and instead people provide poor reimplementations of them (much like HTTP Basic Auth). If you think about what you need to simulate having a client cert (a persistable shared secret generated by the host and securely sent to the client), then "a long random key inside a cookie" is what you'll naturally be led to.
 Yes, it needs to be as unpredictable as possible.

 Usually this sort of code is already present and well tested in whatever web framework you're using, and there's just no good reason to write it again yourself. Best case, you wasted your time; worst case, it's buggy and insecure.
 Disclaimer: I know sweet f-a about cryptography, and was expecting 100% of this PDF to go straight over my head.

 On the other hand, why is the author using "i++" in the preamble of a for loop?
 That's the style in C-like languages.http://en.wikipedia.org/wiki/For_loop#C.2FC.2B.2B... also used in Java, Javascript, Perl, Php, Bash (and many more I'm sure).
 > That's the style in C-like languages.

 That style is in many cases inefficient.

  /* summation of successive integers */
  /* don't worry, this should not cause overflow in a 32-bit signed integer type */
  volatile int i = 0;
  int j = 0;
  for ( ; i < 65536; i++) /* ++i is better suited to this scenario */
      j += i;

 Being declared volatile, the compiler will not attempt to optimise the for loop and change the code that increments i in the preamble from postfix to prefix. I will admit this is an extreme example, but I often see examples on the internet of simple for loops written in this manner when prefix is more suitable and simpler for the CPU to execute. Postfix increments of integer types involve making a local copy (y) of the variable to increment (x), then incrementing x and returning y. Prefix increments merely involve incrementing x. The former method is rather expensive.

 On the other hand, it would make sense to use postfix on something like [0].
 Have you actually tried your example? Even at -O0, it results in the exact same code: http://goo.gl/kDEFnU
 He doesn't know how to write even simple mathematics. E.g., he uses symbols n and k without definition. Without careful definitions right there in the context, such symbols mean nothing. He is not clear enough about what he means by x.His lack of precision in his writing lowers confidence in the quality of what he has written.
 I am reminded of this: https://news.ycombinator.com/item?id=35079
 > He doesn't know how to write even simple mathematics.

 Umm...
 If you know how to write math, then do so. In particular, define your symbols. Nearly none of your symbols are defined.
 You know that this is a presentation, right? And that the symbols will be defined in the talk?There are better ways to ask someone to define their symbols.
 It's simple: instead of just n, write "For a positive integer n". Instead of just k, write "For a key k". It's simple.

 Look, guys, 'n' just does not abbreviate 'integer', and 'k' does not abbreviate 'key'. Yes, yes, yes, I know; I know; and we should all know, that while too often people write, say, O(ln(n)) saying nothing about either 'n' or 'O', it's darned bad mathematical writing. If in some context n is a positive integer, then in that context it is just necessary to say so.

 Read math from calculus to Halmos, Rudin, Dunford and Schwartz, Spivak, Lang, to Bourbaki, and you will find that symbols are always defined; good authors just do not have undefined symbols hanging around. It's just not done.

 Now you have learned something important. Sorry some people didn't know.
 It's simple: it's a slide, so it should never, ever have excess verbiage on it. Otherwise, you don't know how to use simple English properly. Good authors never say more than necessary. I wouldn't trust someone whose presentation skills were so bad they wrote "n (a natural number)" when it was blindingly obvious from the context.

 And remember, dogma is always wrong.
 It's not "blindingly obvious": for one, for some, or for all?

 And there are many more symbols undefined than just n; just read the document. E.g., there is k. Now what is k? "Blindingly obvious"? Nope. The author actually did say a little about his x; not saying what k was was poor mathematical writing in any sense. And there were many more symbols.

 I'm talking about rock-solid, standard good mathematical writing, not "dogma". You are refusing to acknowledge or learn a simple and elementary but important lesson in mathematical writing, and your excuses are not serious responses. You are just angry and fighting. As much as it irritates you, I'm fully correct, and my remarks are fully appropriate.

 Yes, computer science and practical computing have a lot of difficulty with such writing lessons; as fields, in writing, computer science and practical computing have quality way, way below that of, say, math or physics, although the document of the OP is not nearly the worst example. For the worst example, the competition is severe.

 As for your "And remember, dogma is always wrong": that sounds pretty dogmatic.

 Your response deliberately attempts to be insulting, is not really responsive to anything I submitted, and is not serious, constructive, or appropriate, and I will not respond to you further.
