If you have a program that runs successfully through both the analyzer and the compiler, you can have quite a bit of confidence in the result.
In an ideal world you'd have a formally verified toolchain for a language without as many deficiencies as C, but here we are.
Other approaches try to bypass the compiler by verifying the generated binary against the semantics of the source code (treating the compiler itself as a black box). The major drawback is that you have to completely disable all optimization.
seL4 verifies the binary against the specification, and lets you use optimization. Once you've extracted the flow graphs from the binary, following the optimizations isn't difficult.
>We compile the modified source with the normal C compiler to produce a bugged binary. We install this binary as the official C compiler. We can now remove the bugs from the source of the compiler and the new binary will reinsert the bugs whenever it is compiled.
Of course, the login command will remain bugged with no trace in source anywhere.
Another way is to bootstrap up to a C compiler from machine language upwards, so that every "level upgrade" in the bootstrap chain can be verified independently.
Here is one bootstrapping effort:
The spec of a compiler is also relatively simple, so there isn't a lot of room for backdoors there.
But aside from these incidental properties, yes the same thing could be done.
(This still leaves a large class of related, less recursively symmetric integrity attacks that could be made without the language fully bootstrapping itself, of course)
They extract from Coq to C. As in my Brute-Force Assurance concept, that would let us confirm and supplement the formal verification by hitting the C output with every V&V tool for C programs we have on hand.
How do you know it hasn't already been compromised =)
Here's a link that worked for me: http://www.cs.cmu.edu/~dga/15-712/F14/papers/p761-thompson.p...
> For two decades now, Airbus France has been using our tools in the development of safety-critical avionics software for several airplane types, including the flight control software of the A380, the world’s largest passenger aircraft.
More customers listed here.
I heard Xavier Leroy say a year or two ago that Airbus was still evaluating CompCert and was thinking of using it in production for the aircraft generation after the next one, or something like that. And that CompCert sales have been very slow in general.
But here is a paper about an actual industrial user: https://hal.inria.fr/hal-01643290
> https://www.absint.com/success.htm More customers listed here.
AbsInt has been selling timing analysis tools for decades, and Airbus and the other customers are probably almost exclusively customers for those tools. That doesn't mean that those customers also buy CompCert. It seems like they don't.
You have to understand that CompCert's main competition in this space was an ancient version of GCC without any optimizations. This is mostly an issue of certification. CompCert got the same certification and is already a great improvement just by virtue of having a register allocator...
A register allocator! :)
Which semantics? The ISO C semantics is rather lacking; "undefined behavior" lurks around every corner.
A C compiler that correctly implements all ISO C requirements is a fine thing, and certainly better than one which gets some of that semantics wrong, but doesn't achieve all that much in the big picture.
If the verified compiler actually provides extended semantics beyond ISO C that eliminates much of the undefined behavior, such that programmers using this dialect can rely on a safer language, then we're talking.
Bizarre way to talk about award-winning breakthrough research in applying verification to large practical programs.
> A unique feature of the CakeML compiler is that it is bootstrapped “in the logic” – essentially, an application of the compiler function with the compiler’s source implementation as argument is evaluated via logical inference. This bootstrapping method produces a machine code implementation of the compiler and also proves a theorem stating functional correctness of that machine code. Bootstrapping removes the need to trust an unverified code generation process. By contrast, CompCert first relies on Coq’s extraction mechanism to produce OCaml, and then relies on the OCaml compiler to produce machine code.
For details, see section 11 "Compiler Bootstrapping" in http://www.cs.cmu.edu/~yongkiat/files/cakeml-jfp.pdf
For CompCert, the more worrying issue is the correctness of "extraction". This is somewhat technical; you can search for the details if you are curious. CakeML solves the extraction problem.
For HOL, Proof Technologies Ltd developed HOL Zero, which is solely dedicated to trustworthiness, without bells and whistles. Bells and whistles are absolutely necessary for proof development, so the idea is to use HOL Zero at the end, after the proof is finished. It's like the coqchk idea on steroids. http://proof-technologies.com/holzero/
Summary: CompCert trusts coqchk and extraction. CakeML trusts HOL; hopefully something like HOL Zero can be used in the future.
1. Implementations can be wrong, so we want a formal proof system
2. The proof system can be wrong, so we want to apply the proof system to itself
3. But the proof system may be wrong (either in implementation or specification) in a way such that its self-proof is actually invalid
At some point, we want to assert some property (completeness, correctness) of the system as a whole, but whenever we augment the system to allow this, that augmented system's own properties now come into question.
Gödel's proofs rely on encoding statements using properties of the integers that these logic systems don't require.
Presburger arithmetic, for example, avoids Gödel incompleteness, yet provides a decent system for a lot of work.
But I might be wrong; there are a lot of nuances and I do not know them all...
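For what it's worth (my formulation, so take it with a grain of salt): Presburger arithmetic is the first-order theory of the natural numbers with addition and order but no multiplication of variables, which is exactly what blocks Gödel's encoding of syntax and makes the theory decidable. A typical statement it can both express and mechanically decide:

```latex
\forall x \, \exists y \; \bigl( x = y + y \;\lor\; x = y + y + 1 \bigr)
```

i.e., every natural number is even or odd. Anything needing variable-by-variable multiplication falls outside the theory.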
Seriously, people, you can find kernel bugs at will with kernel fuzzers like syzkaller from Google. Look at https://github.com/google/syzkaller/blob/master/docs/found_b... and weep.
You could claim that eventually the payoff would pay back the lost 10 years. But "eventually" can be a very long time...
So basically, ChromeOS rewritten in Rust? Why not a Servo based UI?