Also, we are on Ada 202x by now, which is quite different from the Ada 83 that some Rust folks keep comparing their favourite language with.
You get all the niceties of a language more in line with Object Pascal productivity, formal proofs, a RAII approach safer than C++ (controlled types, automatic stack types, unbounded collection types), industry bare-metal profiles like Ravenscar, and recent versions are in the process of gaining affine types as well (aka a borrow checker).
I don't know Ada / SPARK, and I've been trying to figure this out. Based on the hallucinations I got from ChatGPT, it seems Ada itself is nowhere near as powerful as Rust in safety, while Ada with SPARK disallows some things I considered quite basic, such as shared aliasing of data.
For example, it seems it's not possible to get a sub-string slice reference to an original unbounded string. In Rust, a &str -> &str signature is trivial.
So it seems Ada still relies on discipline, while SPARK does not have the zero-cost abstractions that C++ and Rust have.
If that's true (is it?), then I'd definitely choose C++/Rust over Ada any time, since performance is very important to me.
Both Ada and SPARK have zero-cost abstractions; they're designed to run on embedded platforms.
SPARK is a different use case from Rust: it's a full prover, and the goal is formal verification, typically in contexts where human life is at stake (say, you're writing software for an artificial heart, to take an extreme example). This comes at the cost of being less flexible, but they've been slowly evolving SPARK so that it can handle increasingly complex cases.
Less ChatGPT and more language reference manuals; ChatGPT isn't an ancient oracle that knows it all, even though Microsoft's marketing sells it as such.
Ada has as many zero-cost abstractions as C++ and Rust do, by letting the compilers decide the best way to implement them; one of the purposes of Ravenscar is precisely to define what to turn off for bare-metal and real-time OS deployments.
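To make that concrete, the kind of thing a restricted profile turns off looks roughly like this - a sketch only, the exact pragmas, restriction set and file placement (e.g. GNAT's gnat.adc) are project- and compiler-specific:

    --  Configuration pragmas at the top of a compilation (or in a configuration file)
    pragma Profile (Ravenscar);            --  restricted, statically analyzable tasking model
    pragma Restrictions (No_Allocators);   --  no "new" anywhere in the partition
    pragma Restrictions (No_Recursion);    --  helps with static stack-usage bounds

    procedure Bare_Metal_Main is
    begin
       null;   --  under Ravenscar, tasks live at library level; the main often just idles
    end Bare_Metal_Main;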
Also, I find this point of view on Ada funny, given the poor examples WG21 has added to the C++ standard library, which will never be fixed due to the never-ending ABI drama.
But doesn't this copy the entirety of that slice? That's not what I meant, I was referring to a shared reference, akin to &str in Rust or std::string_view in C++.
Doing it normally should create a copy of the value, as far as I can tell, unless you use it to create a renamed variable or a by-reference parameter, much the same way you can create references in C++.
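For instance, something roughly like this (a minimal sketch, names made up, just to show a non-copying view):

    procedure Rename_Demo is
       S   : String := "  hello  ";
       Sub : String renames S (3 .. 7);   --  another view of the same characters, no copy
    begin
       Sub (3) := 'H';                    --  Sub keeps the slice's bounds (3 .. 7)
       pragma Assert (S (3) = 'H');       --  the change is visible through S
    end Rename_Demo;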
I think the closest thing to a &str in Ada would be an "access String" or "access constant String", which you would get either from an allocated "String" or from a declared "aliased String". You'd create a subslice with "string(x..y)'Access". Though I'm not sure whether that actually works without explicitly declaring an array of "aliased Character", the Manual is dense (as with C++, it's nearly meaningless unless you already know what it's supposed to mean) and the tutorials generally avoid talking about access objects.
Thank you! I assume 'Access types are not memory-safe, right? Is there a SPARK equivalent which still does not copy (i.e. it only references) that is memory safe?
To be more specific, how would one implement something like a "Get_First_Word" or "Trim_Whitespace" without copying?
I think it's supposed to be safe in Ada by default? As far as I can tell, basic allocated objects cannot be deallocated without an "unchecked" operation, and access objects created from "aliased" declarations are subject to scoping rules [0]. (If you want to know the full details, go figure out however "accessibility levels" are supposed to work [1].) It should preclude functions from returning access objects without a "prefix'Unchecked_Access" operation. I'm not sure how SPARK's borrowing system is tied into all of this.
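Roughly, the shape of it seems to be this (a sketch based on my reading, so take the details with a grain of salt):

    procedure Access_Demo is
       type Const_String_Access is access constant String;

       S : aliased constant String := "hello world";
       P : constant Const_String_Access := S'Access;   --  fine: S lives at the same level as the access type

       --  function Broken return Const_String_Access is
       --     Local : aliased constant String := "oops";
       --  begin
       --     return Local'Access;              --  rejected: Local is too deeply nested
       --     --  return Local'Unchecked_Access;  --  compiles, and would dangle
       --  end Broken;
    begin
       pragma Assert (P.all = "hello world");
       null;
    end Access_Demo;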
From the documentation you linked, it seems slicing creates a brand new String, which is more like std::string than std::string_view. In other words, it allocates and copies all of the string's characters (although it might allocate on and copy those bytes to the stack).
Also, the Unbounded_String owns its copy of the data, as opposed to referencing other data. The difference with String seems to be just that it can grow. It's still more like std::string than std::string_view.
Note that both std::string and std::string_view are essentially just a pointer and a length (std::string also has a capacity but let's ignore that). The difference is that trying to duplicate a std::string will end up duplicating (deep-copying) the data behind the pointer as well, whereas duplicating a std::string_view will not.
Could you help me understand/interpret that link the same way you do, in case I'm missing something?
Allocating a new String requires... "new String". You issue the "new" command, or the source does.
But what Unbounded does is... "U.Reference (U.First .. U.Last)". It returns a reference. It's not duplicating, because that would defeat the point of its entire existence. It's the buffer, containing one or more string objects, and you're just slicing a reference out of it - because that's the point.
If you want a String, you need to allocate one.
    function Slice
      (Source : in Unbounded_String;
       Low    : in Positive;
       High   : in Natural)
      return String;
For this - there is no `out` marked. What you're grabbing is part of the Unbounded_String, and the compiler won't let you deallocate it, because it's owned, as a reference, by the Unbounded_String.
For example, here's the actual GNAT source of the function [1]:
    function Slice
      (Source : Unbounded_String;
       Low    : Positive;
       High   : Natural) return String
    is
    begin
       --  Note: test of High > Length is in accordance with AI95-00128

       if Low - 1 > Source.Last or else High > Source.Last then
          raise Index_Error;
       else
          return Source.Reference (Low .. High);
       end if;
    end Slice;
A pre-existing reference is returned. There is no allocation that happens whatsoever.
I tried to get some sort of proof based on Godbolt, to see if it generates any memcpy's, but I couldn't manage to do that after quite a few tries. :(
It's really difficult to understand this given how little I know about Ada, so the best I can do is to keep throwing questions at ChatGPT. And I keep getting results that go against what you said.
I've also tried a couple direct review examples from o4-mini-high, one without the documentation link [1] and one with it [2].
It matches what I've managed to learn as well. I know how LLMs work and that they hallucinate a lot, so I can't tell who here is wrong, since you seem to be really experienced, and I barely know anything... what are your thoughts?
Oh, and I really appreciate you walking me through this! Like, a lot a lot! Thank you very much!
Never, ever, throw any non-mainstream language at an LLM. You will get absolutely nothing but bullshit back. They do not comprehend, and so they cannot move away from generalisation to actually speak about the language. There is not a large enough public corpus to train the model on.
If you're throwing something to a con-artist, don't be surprised if everything you get back does not line up with reality.
I see it does what you're saying on the "else" branch, but that just returns the previous string unmodified, which makes sense. The more important one is the "if" branch though.
Looking at your Godbolt example's assembly, even if I add -O3, I see it does a call to "ada__strings__unbounded__unbounded_slice", but until I know the contents of that I can't say whether the pointer it returns is derived from the same allocation as the original one, or from a new allocation that the string was copied to.
You're using the Unbounded_Slice function [1], which calls To_Unbounded_String [2], which calls `new String` [3], which you mentioned in a previous comment that it will allocate, right?
The kind of operation I'm looking for is something like a "Trim_Whitespace" function that re-uses the old allocation without copying all the data, even when there is whitespace to trim.
(The exact definition of a string is implementation-defined. But that's the concept.)
Ada enforces safe ranges, which means you need to carry the length of the slice somehow. It does not use C's 0-terminated strings. So slicing does not work the same way as strtok or other self-modifying systems - the length isn't guessed, it's known.
But if you change one character in the buffer of the slice, it'll be changed in the original Unbounded_String too.
For trimming whitespace, you're right that Unbounded's standard Trim may reallocate. It carries multiple buffers, and when you Trim it will sometimes just hand one back, other times it'll reallocate. [0] Mostly as a performance tradeoff: keeping the original can make iteration slower, as it holds multiple buffers.
So, to implement our own - with one caveat. Slice can't handle 0-length, because range safety is enforced. So in the case of a wholly whitespace string, we'll be doing a whole new allocation.
    --  This line is just for pasting into godbolt
    pragma Source_File_Name (NTrim, Body_File_Name => "example.adb");

    with Ada.Strings.Unbounded;
    with Ada.Strings.Maps;
    use Ada.Strings.Unbounded;

    function NTrim (Source : Unbounded_String) return Unbounded_String is
       Len         : constant Natural := Length (Source);
       First, Last : Natural;
       Whitespace  : constant Ada.Strings.Maps.Character_Set :=
         Ada.Strings.Maps.To_Set (" " & ASCII.HT & ASCII.LF & ASCII.CR);
    begin
       if Len = 0 then
          return Source;
       end if;

       --  Skip leading whitespace
       First := 1;
       while First <= Len
         and then Ada.Strings.Maps.Is_In (Element (Source, First), Whitespace)
       loop
          First := First + 1;
       end loop;

       --  Skip trailing whitespace
       Last := Len;
       while Last >= First
         and then Ada.Strings.Maps.Is_In (Element (Source, Last), Whitespace)
       loop
          Last := Last - 1;
       end loop;

       --  Wholly whitespace: fall back to a fresh (empty) allocation
       if First > Last then
          return To_Unbounded_String ("");
       end if;

       return Unbounded_Slice (Source, First, Last);
    end NTrim;
The resulting compilation [1] has a few things going on. Our whitespace map gets allocated and deallocated most of the time; a map is harder to treat as a constant, and the compiler doesn't always optimise that nicely. Most of the code is bounds checking - no off-by-one allowed here. Where First is greater than Last, you get a new full allocation.
A few days ago, I had ChatGPT compare Rust and Ada. It tended to penalize Ada for its runtime checks and access values (aka pointers). However, ChatGPT didn't account for the fact that many of Ada's runtime checks would need to be manually implemented by developers in other languages. An Ada compiler can often optimize these checks away, knowing where they're genuinely needed and where they can be removed. This often explains why speed comparisons between C and Ada code can be misleading, as they rarely factor in the extra manual effort required to make C code equivalently robust with the necessary safety checks.
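As a small illustration (a sketch; what actually gets elided obviously depends on the compiler and flags):

    with Ada.Text_IO;

    procedure Checks_Demo is
       subtype Index  is Integer range 1 .. 100;
       type    Buffer is array (Index) of Integer;

       B     : constant Buffer := (others => 1);
       Total : Integer := 0;
    begin
       for I in B'Range loop
          --  I provably stays within Index, so the compiler can drop the index check here,
          --  while a C version would need a hand-written bound test to be equally robust.
          Total := Total + B (I);
       end loop;
       Ada.Text_IO.Put_Line (Integer'Image (Total));
    end Checks_Demo;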
Regarding access values, I listed out some of Ada's various restrictions. Its scope rules prevent referencing objects at deeper levels, objects must be explicitly marked aliased to create an access value to them, and there's far less need for access values (for instance, no pointers are needed to pass parameters by reference). Additionally, Ada offers the ability to dynamically size some objects and reclaim their memory without explicit memory allocation.
After I highlighted these details, ChatGPT admitted it had unfairly evaluated Ada, concluding it's a very safe and robust language, albeit using different techniques than Rust.
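And for a concrete (if minimal) example of the point above about dynamically sizing objects without explicit allocation:

    with Ada.Text_IO;

    procedure Stack_Sized (N : Positive) is
       --  Sized by a run-time value, yet no "new" and no manual free:
       --  the array is reclaimed automatically when the procedure returns.
       Line : constant String (1 .. N) := (others => '*');
    begin
       Ada.Text_IO.Put_Line (Line);
    end Stack_Sized;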
Yeah... my intro CS class was in C and Ada95 (I'm not a CS guy btw, just took the class). I actually preferred Ada over C... but continued to program in C for other classes because of compiler availability; I had to do all my Ada programming on Sparc workstations at school.
I personally think that AdaCore, and friends, missed an opportunity in the early 2000's to fully embrace open source... they left a big gap which Rust has filled nicely.
I still think Ada is a great programming language. When a question comes up along the lines of: "what's the best programming language nobody's heard of?", or "what's the best programming language that is under used?" Ada is usually my first answer. I think other good answers include F#, <insert LISP flavor>, Odin, Prolog, Haxe, and Futhark (if you're into that sort of thing).
We are on Ada 202x nowadays, which is what is being discussed, and in a world where FOSS tool makers have problems building a sustainable business and keep changing licenses, there are still 7 Ada vendors selling compilers.
In the embedded world, restrictions on the output are the least of your worries, while fulfilling various certification requirements can be a big and costly headache.
And from the vendor's point of view, releasing a compiler under a libre license allows competitors to undercut you on the R&D that leads to compiler and related tool certification. So from a business point of view it just makes no sense. This is very different from contributing to, say, Clang, where the cost of maintaining your own closed fork outweighs any disadvantages of contributing.
Ferrocene [1] apparently offers an ISO-26262 certified version of Rust. I am thoroughly confused about these certification processes, but it also seems that AdaCore itself [2] has adapted the Rust compiler to be certified as well.
It happened to me. I sure wasn't elderly. The only thing that stopped a crash was shifting into neutral and hitting the brakes.
Then, I was sitting there with the engine revving loudly for no reason. Car behind me started backing up. I decided to just shut it off and turn it back on. It went away.
Scary stuff. Especially since I had no idea why it happened.
Everything I've heard about it was that it was pressure from contractors because they didn't like training or finding Ada talent.
I get that there's more tools for C++ but first class formal verification support and a language that's generally designed to save you from yourself seems like something you would stand your ground on. Ada is supremely good at killing people and/or keeping them un-killed, there's a reason you still see it kicking around in defense and aerospace despite the common perception that it's a bygone relic.
>Everything I've heard about it was that it was pressure from contractors because they didn't like training or finding Ada talent.
Do you think the auto industry will have an easier time finding Ada talent at their pay rates? Or that talent will want to specialize in Ada just to pigeonhole themselves into the automotive job market?
I'm near Detroit which has a huge amount of auto industry, and engineering pay is good across pretty much all disciplines. It'll pay for a happy life and then some as long as relentless title climbing and job hopping isn't your definition of happiness.
Ada is not some exotic thing that requires SF comp. If it's such a major adjustment coming from C/C++ that it's actually causing you trouble, you have other problems.
It's comical bringing up the automotive industry considering that it's responsible for AUTOSAR, which is simultaneously widely hated by engineers and completely useless outside the industry.
>I'm near Detroit which has a huge amount of auto industry, and engineering pay is good across pretty much all disciplines.
I dunno about Detroit since I don't live there, but in Europe the auto industry is in a major downturn with cost cutting, layoffs and hiring freezes. Good luck getting hired anywhere now if you're laid off from the auto industry and your specialty is some niche stuff only used in the auto industry. Also, IIRC, the company I worked for recently laid off 35% of the staff at their Auburn Hills office overnight, so I doubt the situation around Detroit is as rosy as you make it seem.
> If it's such a major adjustment coming from C/C++ that it's actually causing you trouble, you have other problems.
Like I said, there should be no problem for a (skilled) programmer to adjust from C++ to Ada or vice versa, the problem is convincing HR to hire you on that premise. Ask me how I know (see my username).
Gone are the days of the SW generalist programmer, who would be hired with the expectation of learning the new language used at that job on the job; companies now are only looking for people with X on-the-job YoE in that programming language or framework, not self-taught people coming from other programming languages.
This is not something you can control, it's the hiring market that's broken, so you can only adapt to it by not pigeonholing yourself in things that might be a career dead-end.
>It's comical bringing up the automotive industry considering that its responsible for AUTOSAR
AUTOSAR did what it was supposed to do: be a vendor lock-in program for German bureaucratic companies and a job-security program for workers in that industry. That's what Germany and German companies do best: create massive amounts of bureaucratic requirements as a moat for an industry they have a foothold in, then sell you the solution. Why do you think SAP is also German?
Everywhere is in a downturn. ZIRP is over, this isn't limited to automotive. Just because you're not going to be hired within a few days of looking doesn't mean that everything is doom and gloom. Automotive software engineers aren't working for poverty wages.
You don't outright lose years of experience writing C and C++ just because you learned Ada. You're not going to leave a decade of C experience off your resume just because it wasn't the primary language in your last position. Ada isn't for frontend web development where new "technology" gets chewed up and spat out every year or two. It shines in firmware. There's plenty of vendors that haven't moved past C99.
Chalking up a hypothetical appreciable Ada marketshare in automotive as "niche stuff only used in the auto industry" doesn't make sense in that it sees plenty of use in aerospace, defense, and medical. The point of me bringing up AUTOSAR is that Ada is nowhere close to the pigeonhole scenario that AUTOSAR is.
In the world where you only have Ada experience, and you're not a bad programmer, but HR departments are giving you grief, just lie on your resume. Fuck em. We both know that it's not a major adjustment, so brush up on details before you're interviewed and if they asked what you used C/C++ just say you're not at liberty to speak about it. Nobody is going to hit you with anything along the lines of whether or not it's true that monads are just monoids in the category of endofunctors.
>Automotive software engineers aren't working for poverty wages.
How does that help when nobody is hiring right now and you're competing with thousands of laid-off engineers?
>so brush up on details before you're interviewed
How do you get interviewed in the first place if they're screening your resume out due to not having the experience with the programming languages they're looking for?
Like I said, having niche languages sends your resume in the bin.
>if they asked what you used C/C++ just say you're not at liberty to speak about it
Unless you worked for government intelligence, this type of response gets you rejected immediately here in Europe/Germany, since they have 100 other candidates who can speak about what they did. Why would they bother with you when you're already making their life hard from the interview stage?
I feel like your US centered viewpoint is way off from the reality of where I live.
As a Rust + C++ developer (in medical right now, and vehicle before that) I see no reason why I couldn't also pick up Ada as one of many skill sets. I have an Ada95 manual on my bookshelf from years ago that I bought out of curiosity. Frankly, for senior-level talent the language syntax is the easiest thing to pick up; it's the intricacies of an existing code base and business domain that are the hard part of joining a new project, and that is generally language independent.
Arguably picking up Rust -- with its frankly exotic value passing and ownership semantics -- is much harder than learning Ada.
> I see no reason why I couldn't also pick up Ada as one of many skill sets
Low wages and the negative effect on your resume might be reasons.
Believe it or not, pigeonholing yourself in niche/uncool tech for years can and will negatively impact your future hiring prospects at good jobs, since hiring in tech is broken and HR selects resumes on buzzwords and on your "last X years of experience in Y framework".
You mean the Patriot that ended up getting 28 people killed due to a SW bug?[1] That Patriot?
Let me repeat myself: Ada won't save you from human bugs. If you hire bad programmers or have bad dev and test practices, there's no magic programming language that will save you from your calculation and logic mistakes. You can code in raw machine code like you're 1960s NASA and still have fewer bugs than a clueless vibe coder in Ada/Rust/etc. if you know what you're doing and have the right test and verification processes.
The Patriot failures were the result of floating point error. Ada provides facilities specifically to deal with this, while you're left rolling your own in C/C++. Of course Ada won't save you from human bugs, but it's silly to say that you're no better off with a language giving you everything it can to avoid them than one that is a notorious fuckup dispenser.
2. In mission-critical systems we always used fixed-point fractional numbers in C as a representation of floats, to avoid floating-point issues, so any issues of the language are moot.
In respect to both points, Ada provides decimal and binary fixed point representations as a first class feature of the language, and in the event you must use floating point, SPARK provides the capability to prove that you're not running into rough edges. It's actually one of the most immediately noticeable features of the language for a firmware developer coming from C, and I'm a bit surprised you don't know about it.
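For anyone reading along, the declarations look roughly like this (a minimal sketch, type names made up):

    with Ada.Text_IO;

    procedure Fixed_Demo is
       --  Ordinary (binary) fixed point: the compiler picks a power-of-two 'Small <= delta.
       type Voltage is delta 0.001 range -10.0 .. 10.0;

       --  Decimal fixed point: exact decimal scaling (the classic use is money).
       type Euros is delta 0.01 digits 12;

       V     : Voltage := 1.25;
       Price : Euros   := 19.99;
    begin
       V     := V + 0.125;   --  exact fixed-point arithmetic, no floating point involved
       Price := Price * 3;   --  fixed point times Integer is predefined and exact
       Ada.Text_IO.Put_Line (Voltage'Image (V) & Euros'Image (Price));
    end Fixed_Demo;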
Of course I know that you can do all of this stuff in C. I did it for years. I just don't think there's any sort of honor or expression of skill in getting your balls busted by this stuff being an afterthought, it's just annoying. I know your response will be "get better", and I did, countless people have, and we all still appreciate that these nuisances can be taken care of by the language.
C programmers have been using fixed-point integer math instead of floats for decades. It's a solved engineering problem in safety-critical systems. It's only a problem for clueless devs who look for reasons to shit on C and think the magic lies in the right programming language, not in having the right knowledge.
Chuck Moore disagrees. On C, I use OpenBSD and I even adapted some ports (mednafen, C++, but my point remains), one of them (not officially, as a home user) being cpulimit from Free to Open. Still, Forth on small devices is far more predictable and introspectable than C.
you mean the plane plagued with software development issues[1][2]?
even ignoring the obnoxious yellow journalism, i'm not sure it's a shining beacon of best practices. or that replacing one ancient language with an even more ancient language is "moving forwards."
particularly when the language you're replacing was explicitly designed for your domain, and the language you're replacing it with is an entropic event horizon from which no coherent thoughtform can escape.
There are 7 compiler vendors still in business; if anything, Ada's domain is one of the few where paying for the tools one needs to do the job is still a thing, like in most professions.
Regarding the number of options, C++ has quite a few IDEs: Visual Studio, Xcode, VSCode, CLion, and probably more (Oracle probably still sells the one they had for Solaris). For command-line compilers, C++ has: Visual Studio, Xcode, g++, clang++, IBM C++ compilers for their OSs, Oracle compilers for Solaris, etc.
For Ada, is there anything other than AdaCore? Is that the same as GNATStudio?
You don't have to use AdaCore's GNAT Studio. You can quickly get going with Ada/SPARK using Visual Studio Code, as there is an LSP extension for it published by AdaCore themselves.
The kind of tooling you need for avionics is not necessarily the kind of tooling you use for non-safety critical code.
None of the tools mentioned, other than limited IDE-level support (for practical purposes, in safety-critical C++ expect some niche variation of Eclipse, just as if you used Ada), are valid for the F-35 project.
I should add, what I was told is that they were amazing, especially the pre-IBM Rational stuff. Rational was acquired by IBM in 2003. I was working on tech adjacent to the JSF (F35). I was told by a guy working on the JSF that the extra bugs from C++ would mean better job security and he was 100% right. It’s pretty much that conversation that triggered my move into high tech and away from military. I think it’s a shame that IBM happened to Rational, a lot of nascent good ideas pretty much disappeared or were mangled beyond recognition.
Ada compilers: PTC ApexAda, Green Hills Ada. Static analysis tools for Ada: CodePeer, ConQAT, Fluctuat, LDRA Testbed, MALPAS, Polyspace, SofCheck Inspector, Squore, Understand. Similar lists exist for everything else.
When C++ was chosen for the F-35 there were more verification tools for Ada than for C++.
For C++ on similar systems its becoming more and more "I hope you like LLVM with the serial numbers filed off". Lots of the tool vendors are sunsetting their bespoke compilers. Most of the vendor IDEs have always been Eclipse with a bunch of bundled plugins.
Since Lockheed struck a deal with the government allowing them to no longer communicate the number of issues encountered on the F-35 program, news reports have only talked about how it has been the best thing since sliced bread. (Save for the mishaps happening from time to time...)
Which is entertaining, because until that point (2021, if memory serves?) it was encountering an ever-increasing number of critical issues needing resolution, two dozen of which would be lethal to the pilot flying. The backlog stood at 800+ issues at the time.
Some of the software issues were so serious that they were considered beyond salvageable at the time, despite having already gone through a full rewrite-from-scratch cycle.
Are those bugs because of C++, or because of bad programming skills and practices? No programming language can save you from bugs if you hire people who don't know what they're doing. I used to work in automotive when a lot of the safety-critical SW was assembly only, and the end product didn't have any critical bugs.
Maybe Lockheed just has shitty programmers who don't know what they're doing, because the US defense industry is incompetent and the US SW job market is top-heavy, where talent that does know how to use C++ right goes to big tech and not on-site at some defense contractor? To me that's not the fault of C++.
Not all programming languages are equal when it comes to the skill needed to deliver correct software. Since a large project can necessarily expect to bring only ordinary skill to the problem, if that's not enough they're in trouble, even if superlative skill would have succeeded - that's not what they have.
C++ iterators are a big example of this problem. In the most skilled hands these are a very powerful technology, excellent performance yet tremendous flexibility - but they have a lot of footguns. So do you choose to accept the high defect rates when your ordinary programmers shoot themselves in the foot, or do you neuter this powerful technique to reduce those defects but suffer significant performance problems?
>C++ iterators are a big example of this problem. [...] but they have a lot of footguns.
Bigger than coding in assembly?
>when your ordinary programmers shoot themselves in the foot
Then don't hire no-name foot-shooting programmers off the street. Hire versed engineers with a background in programming for safety-critical systems, then have them train the code-monkey engineers on the right way to think and work. Like I said, skill issue, not language issue.
I think I'm tired, after 20+ years of being around the language, of hearing C++ developers tell people "they're holding it wrong" every time something blows up. Blaming the victim.
C++ is a language seemingly designed with footguns built in on purpose. Even worse it has a community full of elitism and obscurantism.
Rust isn't perfect (I have my gripes), but it has the right idea with ownership management at the static type system level. Ada has its own positives around explicitness, bounds checking, etc.
C++ for new projects in safety critical sectors makes zero sense to me.
> Even worse it has a community full of elitism and obscurantism.
I've never experienced that per se, but I have experienced it in other fields or orgs where the bar to entry is super high. And keeping the bar to entry high is often the goal of those that are elitist. Google interviews were that way 15 years ago.
I think the issue I keep coming back to with type safety is how do you make sure a value in meters is not mistakenly used as feet without some kind of translation. If you're going to have type safety, I feel the language should prevent that.
And if it can't prevent that, then it's not really "typesafe".
> I think the issue I keep coming back to with type safety is how do you make sure a value in meters is not mistakenly used as feet without some kind of translation.
Nominal typing - you declare meters (and feet) to be types distinct from plain floats. Ada makes that simple. Both Rust and Ada support that, and it's normal practice in Ada to declare separate types for every unit.
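A minimal sketch of what that looks like (type and function names made up):

    with Ada.Text_IO;

    procedure Units_Demo is
       type Meters is new Float;
       type Feet   is new Float;

       function To_Meters (F : Feet) return Meters is
         (Meters (Float (F) * 0.3048));

       Runway   : constant Feet := 328.0;
       Altitude : Meters := 100.0;
    begin
       --  Altitude := Runway;            --  rejected at compile time: Feet is not Meters
       Altitude := To_Meters (Runway);    --  the only way across is an explicit translation
       Ada.Text_IO.Put_Line (Meters'Image (Altitude));
    end Units_Demo;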
C with undefined behaviours is far worse than a proven embedded Forth. At least you will know how it will behave with the stack/dictionary and so on.
Then why did the auto and aerospace industry standardize on C/C++ instead of Forth?
Just because something is sometimes better doesn't mean it's also more economical in production. You're still limited by budget constraints. With that budget you need to hire and/or train SW developers. It doesn't matter that language X might be better if you can't find and/or train people with experience in it. So you're better off using C with safety measures.
It helps with security and correctness. Either you are a Forth-like guy/gal stating how the firmware will work at every level (low, mid, high, every one), cutting down every flaw by getting it as small and predictable as possible (even by ditching floats for rationals and decimal scaling), or a high-level Lisp/CL/Java/C# guy with a garbage collector and some memory safety. And Java/C# and some Lisps are nowhere near as strict on that as Ada.
I have worked on safety-critical systems, and on applications that interact with safety-critical systems, in a non-safety-critical language. Each time it never had anything to do with the language and everything to do with systems engineering and project management. The projects that were successful had excellent systems engineering and project management. Language choice was never a factor.
People need to ask themselves what benefits Rust would bring to a high-assurance system (e.g. DO-178C Level A). You're not gonna want to malloc mid-flight even if you have a borrow checker.
The entire point of e.g. DO-178C is to show that the software only does exactly what it is supposed to do under all assumptions and have any derived behavior fed back to the safety process for evaluation. Software can never in itself make a system safe.
All that aside, modern tooling may be more ergonomic etc so I'm not saying languages like Rust shouldn't be considered, they just aren't as useful here as I think a lot of people assume.
I personally know of teams using modern tooling and expansive cloud based CI/CD with safety critical systems and we are talking hundreds of developers. This is in C++ with MISRA standards and DO-178 too.
I don't think it's fair to say C++ is safe and reliable as is. The only way it could be made safe is with a restricted version of C++.
I'm reminded of Mozilla's sign, "You must be this tall to write threaded code." [1] How much do you restrict your language and libraries to make it safe? Like custom templates? How do you define ownership of objects and lifetimes -- or just malloc everything all at once?
We use C++ at my shop for Level A and of course we very much restrict it. But the restrictions are more due to reducing the scope of what you need to show for your compiler.
> or just malloc everything all at once?
Yes! But here's the thing: this isn't done due to the footguns associated with memory management, it is done because you want as little dynamic behavior as possible. For Level A software you need to show that your software has both bounded execution time and memory usage, and be robust against all inputs. Achieving that is so much easier without dynamic memory management.
Also, another thing to keep in mind is that DO-178 has you show that your software requirements are traceable to system requirements, your software design to your software requirements, your source code to your software design and your object code to your source code. But testing should be requirements-based. So if your compiler inserts a bounds check, now you have object code not traceable to source code and for which you won't have coverage, because your requirements don't mention it. But what if you mention it in your requirements? Well, then you'd have to implement it manually in source code to uphold traceability anyway...
I will caveat the above by saying that other players may interpret things differently or have found ways to do things more cleverly.
And it never will be. In aviation we certify three things: aircraft, engines and propellers. Everything else is at best certifiable, i.e. can be integrated into a product to be certified assuming all deviations, restrictions, caveats etc are considered when doing so.
Also, tools follow DO-330 and are qualified. It's their output that should comply with DO-178.
"Nvidia Security Team: “What if we just stopped using C?”, 170 comments (2022), https://news.ycombinator.com/item?id=42998383