
Guidelines for low-level cryptography software - lelf
https://github.com/veorq/cryptocoding
======
irundebian
It's worrying that in some cases you have to put a lot of effort into
"obscuring" code and destroying readability to comply with the rules, just
because C's semantics are not obvious and its type system is weak and dated.

A good language for cryptography engineering should at least somehow allow the
programmer to annotate/type variables containing secret data and functions as
constant-time so that the programmer can concentrate on implementing the
cryptography logic while the compiler warns on correctness issues or security
violations.

~~~
nickpsecurity
Yup:

[https://www.cryptol.net/](https://www.cryptol.net/)

[https://cseweb.ucsd.edu/~dstefan/pubs/cauligi:2017:fact.pdf](https://cseweb.ucsd.edu/~dstefan/pubs/cauligi:2017:fact.pdf)

There's also program analysis work in this area:

[https://michael-emmi.github.io/papers/conf-uss-AlmeidaBBDE16.pdf](https://michael-emmi.github.io/papers/conf-uss-AlmeidaBBDE16.pdf)

~~~
ReidZB
I admit I've only ever taken cursory glances at Cryptol, but it's never seemed
like a replacement for the sort of low-level programming described in the
article. If anything, it goes the opposite way - its Haskell base inherits one
of Haskell's flaws, namely that it can be difficult to understand the
runtime/performance implications of Haskell code without careful analysis.

Cryptol's README describes it as a sort of standardization/prototyping
language.

That said, I don't know these things for sure, and I'd love to be corrected if
I'm wrong.

~~~
nickpsecurity
It's a higher-level language that generates or compiles to lower-level
forms, including C and hardware languages, so you don't have to worry about
that stuff. The expert can specify the algorithm, mathematically verify it,
synthesize it to code, have another tool like Galois' SAW verify that, and
usually finish with a certifying compiler (esp. CompCert). Rockwell-Collins
and Galois both use the tools in high-security projects.

I can't remember if it addresses constant-time programming. So, I went looking
around, found the other language, and maybe someone will merge them.

~~~
ReidZB
Well, it doesn't matter _if_ Cryptol is compiled down to "safe" machine code,
like FaCT, right? But that's not something I've seen explicitly stated in the
Cryptol docs, so I would not make that assumption myself.

Are SAW or CompCert able to do formal verification around things like ensuring
no branching on secrets, ensuring that memory is cleared after use, ...? Or
does Cryptol make guarantees about the machine code it generates?

I see that SAW is able to do things like prove that hand-written, optimized
code is equivalent to the Cryptol specification, which is certainly very neat
- but that does not preclude potential side-channel vulnerabilities in the
hand-written, optimized variant. After all, much of the scariness of side-
channel attacks is that they're explicitly not targeting the (highly analyzed)
mathematics of a construction, but rather its implementation.

~~~
nickpsecurity
"Well, it doesn't matter if Cryptol is compiled down to "safe" machine code,
like FaCT, right?"

It does. It gets you a correct implementation of the crypto. Most failures are
implementation errors. Beyond that, before the buzz about constant-time
coding, high-assurance security mitigated that stuff by using fixed-size,
fixed-timing transmission of messages: attackers couldn't tell anything about
the inside of the black box because all they saw was the same behavior on the
outside. Designers sacrificed some performance and plenty of bandwidth to do
that. Later, tightly-coupled designs with no covert channel analysis or
mitigation showed up that leaked all sorts of stuff, with people changing
coding styles instead of fixing the bad architecture. That's where we need
stuff like the OP article and FaCT.

"Are SAW or CompCert able to do formal verification" (you)

"I can't remember if it addresses constant-time programming. So, I went
looking around, found the other language, and maybe someone will merge them."
(me)

I already answered that. I collected papers like FaCT specifically in case
anyone wants to integrate them with Cryptol to address that gap if it exists.
Also, other analysis and testing for information-flow security to make sure
_it_ is working on top of human review. This is a tiny part of my research
given covert channel detection is mostly a solved problem with public, free
methods such as Kemmerer's Shared Resource Matrix and Wray's for timing
channels that most people just don't apply. Most of my research is on stuff
that blocks code injection.

~~~
ReidZB
Ah, sorry, I didn't mean to imply that Cryptol itself doesn't matter. It does
seem very useful for its intended purpose (though again this is from an
outsider's perspective).

Rather, I meant to say that you do still need to worry about what sort of code
Cryptol generates if you want to address the points raised in the original
article (e.g. clearing secret memory). That was my original point, anyway:
Cryptol doesn't address low-level side-channel problems.

As far as integrating FaCT and Cryptol, couldn't you use Cryptol to build the
reference version of a construction, then implement a fast and side-channel-
resistant variant in FaCT, then use SAW to prove the two were equivalent?
(Maybe not as ideal as combining the two languages, exactly, but workable.)

I really appreciate you taking the time to reply. This stuff isn't my area of
expertise, obviously, but I still find it very interesting.

~~~
nickpsecurity
Ok, yeah, we're on the same page now. As far as Cryptol, I wanted to see if it
could be extended to generate FaCT, which generated constant-time code. It
would also generate C to check with SAW or other tools for extra checking, or
for situations where side channels didn't matter. There could also be a tool
that checked equivalence of the two at the source or object code level, so any
checking of the C code applied to FaCT.

That should cover most of the problem areas. I still think C itself or SPARK
Ada could be modified to do what FaCT does, some subset at least. Then one
could eliminate the need to build FaCT-based tooling. So, those were the ideas
I was brainstorming, at least for what applies directly to those languages.

------
nsajko
> Some compilers will optimize out operations they deem useless.

A compiler translates a certain language to machine code. If it is optimizing,
it will use that language's semantics for optimization. That is necessary;
otherwise we would be writing almost everything in assembly.

> Know what optimizations your compiler can do, and carefully consider the
> effect of each one on security programming patterns. In particular, be
> careful of optimizations that can remove code or branches, and code that
> prevents errors which "should be impossible" if the rest of the program is
> correct.

A priority is to understand the language one is writing in.

> When possible, consider disabling compiler optimizations that can eliminate
> or weaken security checks.

A better solution is to use assembly language and thus exactly specify your
program's semantics within the program itself; a compiler flag could be
removed by whoever builds the program, or undone by something like link-time
optimization.

~~~
karl_schlagenfu
> A priority is to understand the language one is writing in.

That's all well and good, but you can't expect a developer to know all the
optimisations different compilers will do, especially when it relates to
undefined behaviour.

~~~
charleslmunger
I think the point of the OP is that you shouldn't think about the
optimizations a compiler might do, but what the language spec guarantees. If
the language spec guarantees that your code behaves the way you want, no
correct compiler will break it.

~~~
zokier
The problem arises when the language simply does not provide any guarantees
about the sorts of side effects you are interested in, so you cannot simply
rely on specified behavior.

------
zokier
> On most operating systems memory owned by one process can be reused by
> another without being cleared

Isn't this plainly false? I think on all mainstream platforms memory is
cleared when it is mapped into a process.

------
tempguy9999
Slightly tangential to the article but still important, it says

"Some compilers infer that they can eliminate checks based on erroneous code
elsewhere in the program. For example, [obviously dodgy code snipped] some
compilers will decide that ptr == NULL must always be false, since otherwise
it would be incorrect to dereference it in call_fn()"

So why does the compiler not report this error, or better still reject the
code outright? It seems to turn a compile-time-detected fatal error into a
critically dangerous runtime error.
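For readers who haven't seen this bug class, here is a representative sketch of the usual shape (my illustration, not the article's snipped code):

```c
#include <stddef.h>

struct dev { void (*fn)(void); };

static int fired = 0;
static void mark(void) { fired = 1; }

void call_fn(struct dev *ptr) {
    ptr->fn();            /* dereferenced first: UB if ptr is NULL      */
    if (ptr == NULL)      /* compiler may infer this is always false... */
        return;           /* ...and silently delete the whole check     */
    /* ... */
}
```

As for reporting: some compilers can warn on this pattern (e.g. GCC's `-Wnull-dereference`), but such warnings are off by default and inherently incomplete, since the dereference and the check may live in different functions or translation units.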

------
arkadiyt
This is great for educational purposes, but if you're thinking of implementing
your own primitives or cryptosystems, please use a vetted out-of-the-box
solution like libsodium instead.

