Not really. Source: I use Rust in automotive where MISRA C is required, and the only thing we needed was to certify it.
No ISO standard required.
We actually also supplied a "fork" of the MISRA C document, crossing out ~80% of it, since those rules cover things that are simply not possible in safe Rust. We covered the remaining 20% with clippy lints.
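A hedged sketch of what that kind of crate-level lint gating might look like; the specific lints here are illustrative stand-ins, not the actual MISRA mapping:

```rust
// Illustrative only: crate-level attributes standing in for retained
// MISRA-style rules. The real rule-to-lint mapping is project-specific.
#![deny(unsafe_code)]              // rustc lint: forbid `unsafe` in this crate
#![deny(clippy::unwrap_used)]      // clippy restriction lint: no bare .unwrap()
#![deny(clippy::indexing_slicing)] // clippy restriction lint: no panicking indexing

// A function written under those lints: fallible access, no panic paths.
fn first_byte(data: &[u8]) -> Option<u8> {
    data.first().copied()
}
```

With `deny` rather than `warn`, a violation fails the build, which is what makes the lint set usable as a process gate.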
Most of the certification involved is just basic stuff like "using version control system", "has unit tests", "tests run on changes", "uses a linter" (really, as vague as just that...), etc. We tried to explain that we use miri to run most of our integration tests on top of an interpreter that detects large classes of undefined behavior, but the current safety processes don't even have a concept for this, so we had to leave this point out to pass certification (we still do it; it's just not something that's accounted for).
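For reference, the miri workflow in question looks roughly like this (commands as documented by the miri project; requires a nightly toolchain):

```shell
# Miri is distributed as a rustup component on nightly toolchains.
rustup +nightly component add miri

# Interpret the test suite; detected undefined behavior aborts the run
# with a diagnostic pointing at the offending operation.
cargo +nightly miri test
```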
The process allows us to use crates.io crates "as is", even those with "unsafe code", as long as these have tests and we run them with our clippy linters enabled.
Basically, you just need to pay an entity to attest that you don't have absurdly horrible software practices. That's where C and C++ have set the bar, and this bar is absurdly easy to meet with Rust: it has built-in support for unit tests, a built-in linter, and an ecosystem where most code already ships with tests.
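For instance, unit tests live right next to the code with no external framework (a minimal sketch; `checksum` is a made-up example function):

```rust
// Minimal sketch of Rust's built-in test support: no external framework,
// `cargo test` discovers and runs the #[test] functions automatically.
fn checksum(data: &[u8]) -> u8 {
    data.iter().fold(0u8, |acc, b| acc.wrapping_add(*b))
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn wraps_on_overflow() {
        assert_eq!(checksum(&[250, 10]), 4); // 260 mod 256
    }
}
```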
> We actually also supplied a "fork" of the MISRA C doc, crossing out ~80% of it
As an example of what you'd cross out in MISRA I would expect things like the handling of switch.
MISRA says thou shalt supply a `default:` case in every switch. This ensures the switch is exhaustive, but at a terrible price: there are often scenarios in which there are only four possibilities A, B, C, D, and MISRA obliges you to add `default:` and then assert it is never reached, because A, B, C, D cover all the options.
Rust's `match` is exhaustive. If you write arms for A, B, C and D, and that compiles, there was no other option. If a library author tells Rust "today there's A, B, C, D, but maybe I'll need more later" with `#[non_exhaustive]`, your A/B/C/D match doesn't compile: what, asks the compiler, if there were more later? If the library author doesn't do that but then adds E next month anyway, that's a backwards-incompatible change and gets flagged. So in Rust you get the benefit of that particular MISRA requirement built in, always, and none of the inflexibility, because the problem they're worried about was tackled by the language as a correctness issue.
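A sketch of the difference (the `Gear` enum is invented for illustration; note that `#[non_exhaustive]` only has its match-breaking effect across crate boundaries):

```rust
// Illustration: `match` must cover every variant, so no `default:`-style
// catch-all is needed to prove exhaustiveness.
#[derive(Clone, Copy)]
enum Gear { Park, Reverse, Neutral, Drive }

fn label(g: Gear) -> &'static str {
    // If a fifth variant were ever added to `Gear`, this match would stop
    // compiling until the new case is handled. MISRA's mandatory
    // `default:` plus "assert unreachable" dance is unnecessary.
    match g {
        Gear::Park => "P",
        Gear::Reverse => "R",
        Gear::Neutral => "N",
        Gear::Drive => "D",
    }
}
```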
I think what's useful would be to have someone certify the Rust toolchain for some of these.
That's what's needed to enable anyone to develop safety critical apps without issues.
The problem is that most certification consultancies don't allow you to publish a certification such that everyone can use it. That puts them out of work.
The ones that have already certified Rust, make an upfront investment in the certification, and then can certify more users for a lot of money and little effort.
Having said this, for pretty much any project, you are going to deal with some of these certification entities anyways (the programming languages used are just a small part of the whole thing). I don't think Rust is cheaper or more expensive than the alternatives for these types of apps.
Customer RFPs can ask for anything. If they want it in C++, then that's how it is. Some RFPs actually explicitly ask for Rust nowadays (no C, no C++). But some RFPs ask for FORTH, or assembly, or don't really care at all about the language used. The customer needs to have people trained to "verify" the product that they are getting. If their people only know MISRA C, then that's what they are going to want, and none of this has anything to do with the actual technical pros/cons of any language; it's just a more complex social and engineering thing.
Part of the safety story for compliance with standards like IEC 61508 and ISO 26262 is that you can demonstrate your toolchain has been validated and verified against something. In the case of C or C++, that usually means (among other things) demonstrating compliance with an accepted published standard like ISO/IEC 9899 or ISO/IEC 14882, and that is usually done through extensive testing with one or more audited and widely acknowledged conformance test suites.
In the case of Rust, that would be "the standard is what the compiler does, so the compiler is by definition 100% conformant." Trivially true, but it will be a challenge to convince your auditors that that is a reasonable safety story. Liability is high when lives are at stake. This stuff is taken very seriously.
> In the case of C or C++, that usually means (among other things) demonstrating compliance with an accepted published standard like ISO/IEC 9899 or ISO/IEC 14882
Notice that the C++ standard has dozens of open defect reports, some of them due to contradictions within the standard itself, e.g., it requires implementations to satisfy two different requirements simultaneously that are incompatible with each other. So in practice, it is impossible for any C++ compiler to fully comply with the spec.
So in the end, the only thing any process looks at is that your compiler has lots of tests, and at how it is developed: do all tests have to pass before each release? And so on.
Once such a test suite is verified, it becomes something others can use to say "my compiler passes it". But in practice these test suites have bugs, and the test suites of gcc and clang are more comprehensive anyway. And even if your compiler passes one such test suite, it won't pass certification if you don't have version control, tests running regularly, etc.
The Rust compiler has a huge test suite. Every single change to the compiler must pass it on a set of the supported architectures, and new releases of the compiler are tested against all the tests of the ~40,000 libraries available on crates.io to make sure results don't change in code people actually write, etc.
> Trivially true, but it will be a challenge to convince your auditors that that is a reasonable safety story.
This has been trivial to do in our experience. Just pay the fees. The standard at which Rust toolchain development operates is much higher than what the highest safety certifications require, because the bar these certifications set is actually very low: their whole purpose is to avoid liability, so most of them are just a way to create a paper trail showing that your process is not complete trash, so that if someone sues you, you can say you were operating to the highest standards. Nothing more, nothing less. Nobody is interested in making these standards "good"; that's not their purpose.
Practically, the only issue is that some certifications require you to certify every toolchain you want to use. So if you bump the stable Rust toolchain version from say 1.49 to 1.50, you need to re-certify. The process for recertifying is quick, since nothing covered during the certification process actually changed, but you have to pay the fees again and again and again.
I think this is braindead, but AFAIK all compilers for all languages have this problem.
What people generally get wrong is that compilers are not certified; they are qualified tools. In this case, the regulators check once whether the vendor of the compiler upholds certain standards and practices. After that, you, the vendor, are allowed to issue updates yourself.
>> Trivially true, but it will be a challenge to convince your auditors that that is a reasonable safety story.
> This has been trivial to do in our experience. Just pay the fees.
Ah, MAX-8 syndrome. Just pay someone off and your airplane will be just fine. The safety auditors I deal with are a little more rigorous.
> The process for recertifying is quick, since nothing covered during the certification process actually changed, but you have to pay the fees again and again and again.
I produce a qualified toolchain for a living. The qualification process for the toolchain itself is fairly simple: re-run all of the qualifying test suites (a process that takes maybe a week of elapsed time), then analyse all of the results and disposition each and every new or unexpected result, which can take several elapsed weeks. The runtimes have to be certified: this requires additional tests to produce more evidence of 100% MC/DC coverage and fault injection. The paperwork then has to be prepared, reviewed, and submitted. All in all, a couple of full-time-engineer-equivalent years. Once the auditors have given their stamp of approval, the toolchain cannot change. Period.
We produce a safety-certified version of our OS and its SDK every two years or so, not because the auditor fees are high, but because it's a massive amount of work: people's lives are at stake, and the price of the loss of one life is just too damn high.
But go ahead and pay the right people off to sneak something by. Chances are good the people responsible will have moved on to damage something else by the time the corpses start piling up.