
Cryptography is not magic - loup-vaillant
http://loup-vaillant.fr/articles/crypto-is-not-magic
======
deanCommie
> Cryptography has rules, which are both simpler and more fundamental than we
> might think.

<Proceeds to describe Cryptography in a way that confirms that it is exactly
as complex as I set it out in my head>

e.g. from just the first paragraph:

> " Never ignore timings, they're part of most threat models."

> On most CPUs, the timing side channel can be eliminated by removing all
> secret dependent branches and all secret dependent indices.

> Some CPUs also have variable time arithmetic operations.

> Watch out for multiplications and shifts by variable amounts in particular.

Yeah dude, stuff like this is EXACTLY what most people don't want to think
about, and shouldn't have to think about, and which is why the guidance is
"don't roll your own".

I reject his premise as well that this guidance prevents good people from
pursuing Crypto as a field of study - as far as I can tell it's not
discouraging anyone with actual interest in it.

~~~
loup-vaillant
Oh, I did not mean to say it would be a piece of cake. Yes, side channels can
be a nightmare to track down. But it's not a matter of knowing cryptography.
It's a matter of knowing your _platform_. The rule (don't let data flow from
secrets to the side channel) remains dead simple.

Think Go (the board game). The rules are much simpler than Chess, and the game
itself arguably deeper (for instance, effective Go AIs appeared much later
than effective Chess AIs).

> _Yeah dude, stuff like this is EXACTLY what most people don't want to think
> about, and shouldn't have to think about_

Selecting yourself out is fine. And apparently you're doing it for all the
right reasons: too much investment, not worth your time.

One of my goals was to address the "how hard can it be?" eyes-wide, would-be
cryptographer. Well, _this_ hard. More or less.

> _I reject his premise as well that this guidance prevents good people from
> pursuing Crypto as a field of study - as far as I can tell it's not
> discouraging anyone with actual interest in it._

I confess I'm not quite sure about this one. I'll just note that we've seen
people bullied out of some fields. (I recall stories of women being driven out
of competitive gaming that way.) A constant stream of "don't roll your own
crypto" is not exactly bullying, but I can see it being a tad discouraging. To
give you an example, here's the kind of mockery I have to face, even now.

[https://twitter.com/bascule/status/1287113393439035392](https://twitter.com/bascule/status/1287113393439035392)

~~~
dependenttypes
> Yes, side channels can be a nightmare to track down

I think that this is extremely overrated. As long as you are using C (rather
than some weird language), avoid branches on secrets, avoid indexing arrays
with secrets, and avoid multiplication, division, and mod with secrets, it
should be fine.

> [https://twitter.com/bascule/status/1287113393439035392](https://twitter.com/bascule/status/1287113393439035392)

Low quality posts like that, which encourage dunking on people rather than
discussion, are what made me stop using twitter. Extremely disgusting on his
part; I am sorry that you have to deal with this sort of bullying. (I also did
not find any signed shift, despite the claim of the person responding.)

~~~
DarthGhandi
> As long as you are using C...

These things are what frustrate me with crypto implementations in pure Rust:
the attempts at constant-time operations aren't that solid, and everyone is
going to war with the compiler just to get basic functionality. Replacing
pointer arithmetic where it's needed with array indexing stands out the most,
but there are other issues.

Honestly, I think just using C bindings and calling it a day is the best way
for anything going into production.

~~~
zenhack
> Replacing pointer arithmetic where it's needed with array indexing stands
> out the most but there's other issues.

What situations do you run into where array indexing is not an acceptable
substitute for pointer arithmetic?

~~~
loup-vaillant
For the record, implementing Monocypher was not one of them. I use arrays
everywhere.

The one borderline case I can cite is wiping memory. I go by sizeof() and
access each byte.

------
hn_acc_2
This article completely misses the forest for the trees.

Of course someone can roll their own crypto, if they have a willingness to
study and internalize the concepts, a commitment to doing it right, and the
time to do things like "Make it bug free. Test, test, test. Prove what you
can. Be extra-rigorous".

The whole point of that common advice is that the overwhelming majority of
developers have none of those things and it would behoove them to lean on a
library instead.

~~~
loup-vaillant
One of my hopes is that my article gives an idea of what one would be getting
into. That someone without the dedication or rigour to do this would notice
right away and back off, _with no hard feelings_.

My other hope is that it would help newcomers focus their learning. Had I
read this article 4 years ago, it would have taken me less time to design a
reliable enough test suite.

~~~
anticristi
@loup-vaillant This is an excellent write-up! But I would say that the net
benefit is that it teaches developers why they should not roll their own
crypto, instead of telling them off.

Small tangent, I feel the same way about car servicing: I watch YouTube videos
on how to do it, then I pay a mechanic to do it.

Besides not needing to invest time in learning a non-core competence -- most
devs need to deliver a user experience and features -- not rolling your own
crypto also transfers risk. I doubt you would like to hear of your bank using
a custom IV because they thought it would be cool. :)

In a follow-up, it would be cool if you wrote about SRP authentication.

~~~
loup-vaillant
> _In a follow-up, it would be cool if you wrote about SRP authentication._

First time I've heard of this. Looks interesting, but as PAKEs go I know
B-SPEKE, AuCPACE, and OPAQUE better. I'm trying to determine which one I want
right now, possibly even design my own, but I have found those protocols
significantly harder to get right than authenticated key exchange. I also
don't know them well enough to competently write about them just yet.

~~~
anticristi
Sorry, I should have written PAKE (the general concept) and not SRP (one
particular protocol that implements PAKE). If you are interested in my 2
cents, choose the one that you understand and can explain best. That feels
most aligned with your goal of making crypto less magic.

------
badrabbit
OP, don't know you but big fan of your posts, especially the ChaCha20 writeup.

What this article talks about somewhat translates to the larger infosec
community.

A lot of the presumptions I had about working in infosec were false:

- You need to be good at and understand software exploitation well

- You need to be a good programmer

- You need to know how to code (I do fwiw)

- Your soft-skills should be great (not more than any regular office job)

- You should know offensive techniques well, including breaking crypto

- You need to go to cons and do heavy infosec social networking

- You need to be good at math

- You need to master every single IT discipline

- How can you work in infosec if you never hacked a gibson? (joke)

And many more.

I can tell you, this type of elitist gate-keeping is why infosec always
complains about a "skills shortage". I am nowhere near exhausting my
mental/skill capacity with my day-to-day work, and I do fairly well. I meet
people all the time who have never coded before and never heard of an elliptic
curve, yet do well and impress me in very technical infosec disciplines.

My suggestion to anyone considering infosec: if you have a strong interest in
the subject and you enjoy the very technical aspects of it (even if you don't
understand some things well), go for it regardless of what you lack, so long
as you don't lack motivation and free time to pursue your studies. There are
plenty of jobs that need people with passion in infosec, and you need to be
an elite hacker about as much as a sports team needs every player to be an
egotistical superstar.

------
zipwitch
Mentioning cryptography and magic makes me think of _Steganographia_ written
in 1499 by Johannes Trithemius.

[https://en.wikipedia.org/wiki/Steganographia](https://en.wikipedia.org/wiki/Steganographia)

"Trithemius' most famous work, _Steganographia_ (written c. 1499; published
Frankfurt, 1606), was placed on the _Index Librorum Prohibitorum_ in 1609 and
removed in 1900. This book is in three volumes, and appears to be about
magic—specifically, about using spirits to communicate over long distances.
However, since the publication of a decryption key to the first two volumes in
1606, they have been known to be actually concerned with cryptography and
steganography. Until recently, the third volume was widely still believed to
be solely about magic, but the "magical" formulae have now been shown to be
covertexts for yet more cryptographic content."

------
rsj_hn
This article rubs me the wrong way, because the number 1 problem I see when
people implement crypto is that they don't have a well defined threat model or
understand how cryptography can assist them in addressing threats.

The issue is not so much that there might be a flaw in their implementation --
indeed you should use reviewed libraries instead of rolling your own to avoid
flaws -- this is not specific to crypto, it's just as much true for IO
libraries or memory management libraries as it is for crypto.

But what makes crypto unique is that cryptographic algorithms have very
strict, well-defined, limited behaviors, and there is generally a big gap
between what people want to accomplish ("don't let an attacker see this file")
and what crypto will actually do for them ("encrypt the file"), and very often
the use of crypto doesn't end up creating much value.

Here, people _do_ think cryptography is some kind of dark magic, where they
can "secure" something just by encrypting it, and it's incredibly frustrating
to have to implement cargo cult crypto that doesn't add much in the way of
real security just because a PO views encryption as an end in itself -- e.g.
as a feature.

------
caseymarquis
Ironically, the article has increased my commitment to not writing my own
cryptography code except as a hobby.

------
jeffrallen
I work with cryptographers daily and crypto is magic, and the amount of
variables you need to consider when working on crypto systems is so large and
varied that it takes a team to succeed. You should not roll your own crypto
because your recruiting is not good enough to gather that team around you.

The only thing from that article that I agree with is that gate keeping is
bad. Crypto is so hard that we need more cryptographers to help us, not less.

~~~
tptacek
Do we need more cryptographers? Do you have a sense of how easy it is for
strong cryptographers to get meaningful work in industry doing this stuff? We
all know cryptography engineering rates are very high, but that doesn't mean
there's a surfeit of open reqs for them; some high-value specialties are
performed mostly by consultants because most companies don't need them full-
time.

~~~
quadrifoliate
> some high-value specialties are performed mostly by consultants because most
> companies don't need them full-time

I think there is a conflation here of "cryptographer" in the sense of "person
who studies, and maybe invents cryptographic algorithms" v/s "someone who
needs a broad understanding of cryptographic algorithms in their day-to-day
software work, studies them in some depth, but likely doesn't invent new
ones".

The former is the kind you mean, but likely most people (including the OP, I
think) are referring to the latter when they say "cryptographers". An
unscientific test of this can be done by searching jobs.lever.co for
"cryptographer" v/s "cryptography" – the former yield _one_ position, whereas
the latter yields _ten pages_. [1][2]

And maybe some of this is just due to the term being imprecise. For example,
would you term this position as being a "cryptographer"?
[https://jobs.lever.co/protocol/9afbc1c9-8b3b-4c03-856d-6b0cb...](https://jobs.lever.co/protocol/9afbc1c9-8b3b-4c03-856d-6b0cb5518eaa).
It's certainly full time.

-------

[1]
[https://www.google.com/search?q=%22cryptographer%22+site%3Aj...](https://www.google.com/search?q=%22cryptographer%22+site%3Ajobs.lever.co)

[2]
[https://www.google.com/search?q=%22cryptography%22+site%3Ajo...](https://www.google.com/search?q=%22cryptography%22+site%3Ajobs.lever.co)

~~~
tptacek
No, I can understand why you'd think that, but that's not what I meant. I used
the word "cryptography engineer" carefully. I'm talking about people who can
_implement_ and _verify the implementation of_ known algorithms.

I don't know. Maybe cryptocurrency startups change the equation. I know that
prior to crypto-mania, good jobs doing pure cryptography engineering were not
especially easy to come by.

~~~
loup-vaillant
I'm not sure, but if the supply of cryptographic engineers were adequate, I
believe someone would have written Monocypher before I did.

We'd also have more working Elligator2 reverse map implementations. If DJB et
al. themselves didn't provide a working reference implementation over
Curve25519, that suggests a serious labour shortage, at least locally.

~~~
tptacek
They did. It's called libsodium. And people have been writing vanity versions
of libsodium's functionality forever. You don't hear about them, because
(thankfully) nobody uses them.

~~~
loup-vaillant
Your wilful ignorance of the differences between Monocypher and Libsodium does
not befit someone of your rank. I suggest you read this:
[https://monocypher.org/why](https://monocypher.org/why)

TLDR: Libsodium is over 10 times bigger than Monocypher, it needs an autotools
build system, and most of all, it doesn't support embedded targets. Monocypher
is closer to TweetNaCl, except it's much more complete, and an order of
magnitude faster.

Also recall that LibHydrogen was released not long after Monocypher. It
appears Frank Denis felt the same kind of void I did. If Monocypher is as
unneeded as you pretend, so would be LibHydrogen (which by the way is slower
across the board than Monocypher).

~~~
tptacek
You write as if you believe the difference between small-footprint embedded
cryptography and server farm cryptography was the byte size of the resulting
.o/.a. That can't possibly be what you believe; if I described you that way,
you'd justifiably object that I was attacking a straw man. Perhaps you can
clarify.

~~~
loup-vaillant
Indeed, I do not believe binary footprint is the only difference. It's
actually the least important one. The most important is _availability_.

As I said above, Libsodium's support of embedded targets is limited. To the
point it wasn't even considered in some benchmarks, while Monocypher was.
[https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8725488](https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8725488)
(Note that since that benchmark, I have reduced the stack usage of signature
verification by half, at no noticeable loss of performance.)

Monocypher's portability comes from 3 factors beyond sticking to standard C99:
the size of its binary (least important), the size of the stack (more
important), and the lack of dependencies (most important). Monocypher makes no
system calls, and does not depend on libc. That makes it usable on targets
without a kernel.

The second difference is _simplicity_. When I wrote about Libsodium being 10
times bigger, I didn't refer to the size of the binary. I was referring to the
size of the _source code_. And the number of exposed symbols, to a lesser
extent. Such a massive difference in degree is close to a difference in
_kind_. Audits take 10 times more time and money (where Monocypher cost $7K
and 7 days, Libsodium would cost $70K and 14 weeks). Building, porting, and
deploying takes 10 times more effort, and sometimes is not possible at all,
when the environment is constrained enough.

Simplicity matters, even on server farms. DJB wrote TweetNaCl for this very
purpose, if I recall correctly. He wasn't even targeting embedded devices.

---

One important thing to clarify: small-footprint embedded cryptography and
server farm cryptography can be the same in the IoT world: that connected
object has to connect _somewhere_, generally to the vendor's servers. While
embedded devices are often very constrained, the servers can be a bottleneck
(especially true if there's no subscription to pay for the servers). You might
want to optimise for the server side, to help scaling, even if it puts more
strain on the embedded side.

That, and there's an advantage to using the most popular primitives: less room
for new cryptanalysis, more compatibility. That's how you get Ed25519
signature verification even on tiny 16-bit embedded processors. You'd like to
use an extension field to make a faster, more lightweight curve, but the
literature on prime fields (and therefore Edwards25519) is more stable, making
Ed25519 the safer choice.

I also acknowledge that Monocypher is not ideal for tiny embedded devices:
Blake2b instead of Blake2s, 64-bit multiplication in Poly1305 and Curve25519…
64-bit multiplication is particularly problematic on 8/16-bit processors,
where it ends up inflating the binary size to prohibitive proportions. For
those tiny targets, C25519 is a much better fit. I have redirected a user to
it once, though they ended up using a slightly bigger processor, and kept
Monocypher because of its speed.

On small 32-bit processors however, Monocypher is king. As far as I know, only
custom assembly beats it.

~~~
tptacek
Bernstein and Schwabe wrote Tweetnacl for verifiability. You did not write
Monocypher for verifiability, as the track record shows. Libsodium has been
repeatedly audited; it is one of the most heavily targeted libraries in the
industry. You shipped an EdDSA that accepted all-zeroes input as valid; if you
want to snipe at libsodium, let me ask: what's the comparably catastrophic
vulnerability there?

~~~
loup-vaillant
Okay, now you're conflating "verifiability" and "has been verified". One can
perfectly well build a crypto library with verifiability in mind, then fail to
verify some crucial property.

Breaking news: TweetNaCl has not been fully verified. Two instances of
undefined behaviour (negative left shifts), lines 281 and 685, remain to this
day. They're easily found with UBSan (yay for verification!), but for some
reason DJB has yet to correct them.

The original paper verified 2 specific memory safety properties (no
out-of-bounds accesses, no uninitialised memory accesses). Monocypher's test
suite has done the same (and more) on a systematic basis since before version
1.0. I use Valgrind, all sanitizers, and the TIS interpreter. The test suite
covers all code & data paths, largely thanks to the code being constant time.

So not only has Monocypher been built with verifiability in mind, it has been
pretty thoroughly verified. You would know that if you had spent the time you
took to discredit Monocypher looking at it instead. It's all there in
tests/test.sh, referenced in the README.

---

About that vulnerability 2 years ago. As shocking as it may be, I learned from
it. The looming threat of something similar happening again tends to do that.
I've paid my dues since, learned a ton. The audit gives no cause to fear
another error of that kind. The test suite was deemed adequate, and they found
no bug, however minor.

That old bug is irrelevant now. Give me a break.

~~~
tptacek
You're the one who drew the comparison between the security of your library
and libsodium. I simply completed the comparison for you.

------
RcouF1uZ4gsC
> Chacha20 tends to be naturally immune to timing attacks on most platforms,
> while AES requires special care if you don't have hardware support.

One of the nice things about the crypto designed by djb is the effort to make
it easy to implement safely. For example, as mentioned, ChaCha20 is designed
to avoid timing attacks. Curve25519 is designed so that every 32-byte key is a
valid public key.

Just like programming languages are shifting from the C-like view that it is
solely the programmer's responsibility to avoid screwing up, to languages like
Rust which emphasize safety and make it harder to have an inadvertent memory
safety issue, our crypto algorithms should ideally be designed so that a
competent general software engineer can implement them without screwing up.

~~~
greesil
It certainly helps that we have the last 30 years of mistakes to learn from.

------
codysc
>Perhaps surprisingly, implementing cryptographic primitives & protocols
requires little cryptographic knowledge.

That's a dangerous statement on its own. Making proper use of primitives is
not at all a simple concept. Developers can absolutely undermine their systems
with poor choices/mistakes.

Self promotion: I wrote a blog post on a very high-level screw-up with type
conversions, to show just the very surface of how to screw up while using
solid crypto primitives. Time allowing, I want to do more entries on topics
within the crypto realm itself. IV reuse, etc.

[https://pritact.com/blog/posts/crypto-mistakes-part-1.html](https://pritact.com/blog/posts/crypto-mistakes-part-1.html)

~~~
loup-vaillant
> _That's a dangerous statement on its own._

I'm not sure how best to say it.

Implementing primitives & protocols requires little _cryptographic_ knowledge.
It does however require significant knowledge about program correctness:
testing methods, proofs, and, if side channels are important, the
characteristics of your platform and an accurate enough idea of how your
compiler or interpreter works.

Likewise, to implement ChaCha20 in silicon with constant energy use, you don't
need a cryptographer, you need a _hardware designer_. The only thing you need
a cryptographer for is telling the hardware designer to make it constant
energy — or convincing the higher-ups why the extra cost is justified.

The blog post you link (which I love, by the way) seems to confirm my view:
many problems are problems of correctness. I believe most such bugs would be
caught by corrupting the inputs, as I alluded to. Here, corrupting the
password would fail to abort, and you'd catch the bug.

~~~
codysc
I don't think I exactly understand your points.

Using an example of IV reuse in AES-GCM:

The weaknesses resulting from this wouldn't be discoverable with a test like
corrupting the password from the first example. If the developer wasn't aware
that IV reuse introduced that weakness then they would be using strong
primitives but in a way that dramatically undermines the actual encryption.

Not to put words in your mouth, but I assume your answer would be to say that
this would be a matter of correctness. If yes, then where I'm coming from is
that the majority of devs don't have the skillset to be correct and sometimes
wouldn't dive deep enough to discover these kinds of pitfalls.

~~~
loup-vaillant
> _Using an example of IV reuse in AES-GCM:_

Yes, that one wouldn't be caught by corrupting inputs. You need to make sure
you don't reuse the IV in the first place. And that's indeed a cryptography
related bug.

> _I assume your answer would be to say this would be a matter of correctness_

It would be.

> _where I'm coming from is that the majority of devs don't have the skillset
> to be correct_

Unfortunately, I can believe that. Correctness is hard. Or expensive. Let's
try with this example.

If you're designing an AEAD yourself, you can notice the error by trying (and
failing) to prove IND-CCA2. If you're implementing or using AEAD, code review
should catch the problem… _unless_ it's a bug like you've shown before. Tough
one to spot.

One way to avoid the IV bug with a reasonable degree of certainty would be to
use a counter or a ratchet. Don't send the IV over the network; let it be
implicit. Then write two implementations in two very different languages. It
is very unlikely that _both_ happen to repeat the same hard-coded IV.

If we still want to use random IVs, we probably need to mitigate replay
attacks: have the receiver store the last few IVs of the messages it received
this session, and have it compare any new IVs with this set. Won't stop all
replay attacks (the attacker could wait until old IVs are forgotten), but it
will at least catch the accidental reuse.

------
emilfihlman
>that kind of gate keeping is problematic on a number of levels

You know, this starts to sound like anti-vaxxers

>and started to teach myself cryptography 4 years ago

Yeaaaaah, I definitely see the parallels with anti-vaxxers.

Look, it's not about not-studying stuff. It's that you will fail and you will
have side-channel attacks. You do not have the resources or the skills that
multiple hundreds or thousands of people have in a) developing crypto and b)
breaking it.

I've built my "novel" crypto scheme (any hash algorithm can be used to create
a symmetric cipher), and I like it! But I wouldn't think about using it in an
actual product where there is real stuff at stake.

~~~
loup-vaillant
If you have to resort to name calling, I'm in pretty good shape.

You were talking about failure, but this doesn't look like failure to me:
[https://cure53.de/pentest-report_monocypher.pdf](https://cure53.de/pentest-report_monocypher.pdf)

You were talking about side channels, but I have the feeling you didn't even
read the part of the essay that addresses this exact point.

------
blackrock
> don't roll your own

But if you never attempt to roll your own, how will you learn to make a basic
version? And then how will you learn to make the next great encryption
algorithm or program?

------
rini17
Who are the experts here? For example, does any TLS protocol version or its
implementations fulfil the following?

"The slightest error may throw cryptographic guarantees out the window, so we
cannot tolerate errors. Your code must be bug free, period. It's not easy, but
it is simple: it's all about tests and proofs."

(Please, I don't intend this as flamebait; I'm only asking what this belief is
founded on.)

~~~
loup-vaillant
There are a number of components there.

(1) Protocols are very sensitive to errors, possibly more than primitives. If
you screw up the internals of a primitive, the results will be different (and
visibly so), but it stands a good chance of still being _secure_. Protocols
however tend to be very tightly designed. Modifications that have a working
happy path are more likely to have significant holes: a missing check,
failure to authenticate part of the transcript, loss of forward secrecy… No
real justification there, it has just been my experience dealing with modern
primitives and protocols.

(2) Correctness subsumes security. A program is correct if it fulfils its
requirements. A program is secure if it fulfils its _security_ requirements,
which by definition are a subset of all requirements. That said, while
immunity to relevant side channels is definitely part of the security
requirements, it helps to separate it from the correctness of end results.

(3) Bug free code is possible. It's not easy, but it can be done. Constant
time implementations of modern primitives are among the easiest code to test,
ever: since code paths only depend on the lengths of parameters, testing them
all against a reference is trivial. That's not just "100% code coverage", it's
100% _path_ coverage. As for the proofs, while they may not be easy to
produce, the good ones are fairly easy to follow (though extremely tedious),
and the great ones can be checked by a machine, making them trivial to verify.

~~~
zrm
> Protocols are very sensitive to errors, possibly more than primitives. If
> you screw up the internals of a primitive, the results will be different
> (and visibly so), but it stands a good chance at still being _secure_.

I don't know if I agree with this. You can easily write an implementation of a
primitive that even creates the correct output bytes while leaking secrets via
every side channel, or via the one side channel you didn't realize existed.

I also think that "protocol" is too wide to be a useful category. TLS is a
protocol, right? But what about HTTPS? Your site's API? There is always going
to be cryptography at the bottom, but at some point you have to draw a line or
"don't roll your own crypto" becomes "don't write your own software" because
everything is in scope.

Or maybe you can't draw that line because the upper layer stuff still has
implications. Think about the compression oracle attacks. The upper layer has
secret data, compresses it and then shovels it through a "secure" protocol but
has already leaked the secret contents through the content size difference due
to the compression. But if that means everything _is_ in scope, what then?

~~~
loup-vaillant
> _or via the one side channel you didn't realize existed._

Can't do much about that one. Gotta have someone telling you, or (worst case)
the very state of the art advancing under your feet.

> _at some point you have to draw a line or "don't roll your own crypto"
> becomes "don't write your own software" because everything is in scope._

Yes, there is a point beyond which you don't have a choice. The natural (and
utterly impractical) line to draw is untrusted input. Talking through the
internet is the obvious one, but merely playing a video exposes you to
untrusted data that might take over your program and wreak havoc.

Compression is a tough one. I'd personally try padding. Something like PADME
should work well in many cases.
[https://en.wikipedia.org/wiki/PURB_(cryptography)](https://en.wikipedia.org/wiki/PURB_\(cryptography\))

~~~
zrm
> Can't do much about that one. Gotta have someone telling you, or (worst
> case) the very state of the art advancing under your feet.

But that's kind of the point. Primitives aren't easy either. Even one thought
to be secure yesterday might not be today, which makes it easy to get wrong
merely by relying on literature from a year ago rather than today.

> The natural (and utterly impractical) line to draw is untrusted input.

I think you're right about that, both as to where the line really is and as to
how impractical that is if you don't want to just end up with essentially
everything being in scope.

> Compression is a tough one. I'd personally try padding.

I'm not sure you can fix it strictly in the lower layers. The general attack
works like this: The attacker can supply some data, the victim compresses that
data and some secret data together and then sends the combination encrypted
over a channel where the attacker can observe the size. If the attacker-
supplied data matches the secret data, it can be compressed more so it gets
smaller.

The attacker's job is really easy if supplying one more byte of data that
doesn't match the secrets causes the observed length to increase by one byte,
but fixed padding just requires the attacker to find the padding boundary by
supplying increasing amounts of pseudorandom (i.e. incompressible) data until
the threshold for the next output size is reached, then swap in different
bytes until some of them match the secret and it falls below that due to the
compression. And random padding just requires more samples to account
statistically for the randomness. In theory you might be able to fix it by
making all messages the same length, but then if you want to support large
messages, all messages become large, which could be unreasonably inefficient.
And the whole point of the compression was to do the opposite of this.

The real solution is not compressing attacker-supplied data together with
secret data to begin with, but that means the upper layer has to know not to
do that.

------
leafboi
Crypto/security is cool and amazing, but it's also one of the least flashy
parts of Computer Science. Purely in terms of overall reputation, I don't
think people view crypto as "magic"; a more accurate analogy is "plumbing."

The Magic comes more from things like games and computer graphics and deep
learning.

~~~
anonymousDan
I don't know about that, I think a lot of people view codebreaking/hacking as
pretty magic.

~~~
leafboi
In the past yes, but that's not the case anymore.

------
justanotherc
"Crypto is not magic."

"Don't roll your own crypto because its basically magic and you'll screw it up
unless you're a cryptographer".

Umm... ok...

