Hacker News
Airbus Chooses GNAT Pro Ada for Development of Unmanned Aerial System (manufacturing.net)
211 points by jayp1418 4 days ago | 192 comments





Not an expert in ADA, but after playing around with it for a while, I don't know why this language doesn't get more praise. It seems to solve a lot of the memory problems Rust solves, albeit in a different way. Its first-class arrays and strong type system seem to go a long way toward not having to deal with pointers often, for example. These "features" today would likely be considered part of "a better C", except that none of this is even new; ADA dates all the way back to the 80s. And then ADA/Spark allows for formal verification, which C/Rust do not have to my understanding.

Why is ADA not adopted more broadly?


In the 80s, largely because compilers were expensive.

I think it had a (smallish) second chance in 2001, when GNAT was put in the main GCC tree. By then plenty of people were disillusioned with C and C++, and I think it could have got momentum in a way similar to how Rust has (by appealing to programmers' pride in producing less-buggy programs).

But the GNU Ada maintainer was also the person running Ada Core Technologies (which sold the proprietary version of GNAT), and wasn't at all interested in making sure the free version worked out of the box on Linux distributions, or in promoting Ada as a language for writing free software.


Compilers also were relatively large, at least in part because the language is large. For example, it has generics, tasks and inter-task communication. C compilers did not need to have those in the 80s. That mattered at a time when many users wouldn’t have a hard disk.

Yes, I remember a tiny buzz around 2001 for it. I considered it but I couldn't find much information and then Python distracted me for a decade.

Compare the Ada getting started page to the Rust getting started page: https://www.adacore.com/get-started vs https://www.rust-lang.org/learn/get-started

Ada asks you to compare four versions of GNAT and prominently displays "request pricing" buttons. Rust gives you a compiler, a package manager, some IDEs with Rust integrations, and a link to the Rust Book.

In point of fact, Ada has quite a bit of historical baggage/inertia as a proprietary language sold to well heeled defense contractors while Rust has a great deal of effort put toward community engagement and onboarding. Given that Rust now has a "cool" factor that Ada lacks, it's not surprising that people in the "market" for features both languages provide would gravitate to Rust (or other less popular but more engaging systems language communities).


Ada was not a proprietary language; quite the contrary: it was standardized right up front, in an effort by the DoD to get a common language at a time when proprietary languages were the norm.

That’s about the only nice thing I personally have to say about Ada so I’ll stop here.


I decided to take a weekend and poke around with Ada last year, and wrote a stream-of-consciousness blog post about it https://steveklabnik.com/writing/learning-ada

NVidia is also using it, in fact they decided against Rust when evaluating which language to adopt for their "Safe Autonomous Driving" project.

https://blogs.nvidia.com/blog/2019/02/05/adacore-secure-auto...


Where do they mention that they decided against Rust?

Ada is certified by ISO for being used in automotive control systems. Rust isn't.

So I'm not sure how Rust could have even been an option. AFAIK, the options are Ada, MISRA C, and certain Forth flavors. That's it. I'd take Ada over these alternatives every day.


The session that they did together with AdaCore presenting which languages they went through and why they picked Ada in the end.

Rust syntax was one of the factors that played against it.

"Securing the Future of Safety and Security of Embedded Software"

https://www.adacore.com/webinars/securing-future-of-embedded...

Page 35,

https://www.slideshare.net/AdaCore/securing-the-future-of-sa...


Where does it say syntax counted against Rust? It seems like no commercial vendor (meaning no plans for certification), combined with no SPARK equivalent, were the reasons.

Watch the video, slides are just one part.

Part of "Determine the difficulty and learning curve for others".


Rust is also discussed at 35:00 into the presentation, noting the lack of a spec, higher memory utilization, and no commercial vendor.

Yeah, Rust is not certified, and probably can't be certified without a spec.

> and no commercial vendor.

So it's about liability. Makes sense.


I'm surprised they even considered Rust in any capacity, since they couldn't use it: there are no certifications for it. It speaks quite well of Rust that they considered it a "major alternative" as per their slides.

> Why is ADA not adopted more broadly?

I would argue that Ada suffered because IDEs weren't a common thing when it had its heyday.

For example, Ada (and VHDL which took after it) is really verbose and a single change to a declaration can ripple all over the place.

"Refactoring" is no big deal today. IDEs chop through it really easily.

Back in 1990, on the other hand, you wanted to strangle a language that made you ripple a domain change of 0..255 to 0..65535 through all the code by hand. EVERYWHERE.

There is a reason programmers are so huffy about "type inference", you know.

Currently popular languages are quite "path dependent"--they needed to be popular before IDEs existed but still gain benefits when IDEs became commonplace.

Now that IDEs are common, I suspect that some languages like Ada may slowly gain ground over time. There is far less need for a language to be "stupid editor" friendly.

Edit: Change ADA to Ada.


I agree with you, but I just have to say that if you're having to change a bunch of 255s to 65535 everywhere, you've done things poorly. Ada has a bunch of nice features to refer to these sorts of limits by names or methods that won't need to be changed. For those not familiar with Ada, one way might be MyInteger'Last, which will give the largest valid value for the type.

I may or may not be a little crazy about Ada...
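For readers more at home in Rust, the same "name the limit instead of hard-coding it" idea looks roughly like this; a sketch, not Ada, and `Domain` is a made-up alias:

```rust
// Hypothetical sketch: referring to a type's bounds by name rather than
// hard-coding 255, roughly analogous to Ada's MyInteger'Last attribute.
fn main() {
    // If the domain later widens from u8 to u16, code written against
    // MIN/MAX keeps working after this one-line change.
    type Domain = u8;

    let lower = Domain::MIN; // analogous to MyInteger'First
    let upper = Domain::MAX; // analogous to MyInteger'Last
    println!("{lower}..{upper}"); // prints "0..255"
}
```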


> I agree with you, but I just have to say that if you're having to change a bunch of 255s to 65535 everywhere, you've done things poorly.

Sure, but that was just one example that stuck in my head from using Ada 30+ years ago. I also remember one of my project partners threatening me with bodily harm because I wanted to change a name, and it was going to ripple through his modules.

Perhaps there were better ways to do this in contemporary Ada. However, the fact that people using the language didn't find them says something, no?

A whole class of us absolutely LOATHED Ada because of this kind of stuff. An entire engineering class learned to reach for FORTRAN (unsurprising as it was just beginning its decline) and C (a little surprising as it really wasn't a juggernaut yet) instead of Ada. That says something about Ada, and it isn't good.

Sure, we weren't geniuses, but we weren't stupid. If Ada had been useful help to our classes, we would have used it. The HP-28/HP-48 came out in a similar timeframe and EVERYBODY in our class jumped on those in spite of them being $400+ (1988-1990 money) and having to use RPN--being able to chew through matrices on a calculator was huge.

Maybe modern Ada doesn't suffer from these kinds of problems (or, at least has decent IDE support to deal with it), but it certainly pertains to the path dependence about why Ada isn't popular.


I'm not sure why changing a name would be so particularly bad in Ada. If anything, as Lucretia09 pointed out, Ada probably tends to be a bit easier to change names in than other languages.

I'm not surprised an engineering class tended away from Ada. It's designed to ensure robust software is produced more than anything else. It tends to want to force you to write down far more of your mental model than other languages, and it will hold you to it. While I find this very helpful in ensuring my programs actually do what I intended, I think it also incurs some up-front costs. It's harder to just start writing anything and then slowly twist it into a solution. It also takes a little more time before you run it for the first time. The certainty that it actually works at that point is what makes it all worth it.

It's a bunch of trade-offs ill-suited for a large number of simple programs.

Emotionally I think it's also a bit of a harder sell because of it. I spend far more time trying to get it to compile than you do in other languages. Particularly if you are in a rush, it can feel worse. You don't even have an executable yet - and it's the damned language/compiler that won't let you make one! Never mind that the reason it's stopping you is that the theoretical executable wouldn't work properly. I can't decide if it is actually a matter of delayed gratification, or if it's merely very similar. But either way, I think that's one of the adoption issues I haven't seen talked about much.


> Sure, but that was just one example that stuck in my head from using Ada 30+ years ago. I also remember one of my project partners threatening me with bodily harm because I wanted to change a name, and it was going to ripple through his modules

Sounds like a psycho to me. If you change the name of a variable in any language it will ripple through other modules.

With Ada, in his modules, he could've done this:

    Old_Name : Type_Name renames New_Name;

> Now that IDEs are common, I suspect that some languages like Ada may slowly gain ground over time. There is far less need for a language to be "stupid editor" friendly.

I kind of disagree. While it seems to me that languages now tend to want to support tooling, the trend of tolerating dependence on the IDE instead of improving the ergonomics of the language alone peaked sometime around the height of Java's relative industrial popularity (I'm not saying Java is some kind of extreme example of IDE dependence; Java itself has focused a lot on its own ergonomics since that time).

OTOH, I don't think Ada is so unergonomic that, other than in a case of pathologically bad design to start with, you'd have to manually change integer ranges everywhere; I'm fairly certain that it supported type aliases for things like that, and it would have been idiomatic at the height of its popularity (such as it was) to use them, so that you'd only have to make that change in one place, not everywhere.


>Back in 1990, on the other hand, you wanted to strangle a language that made you ripple a domain change of 0..255 to 0..65535 through all the code by hand. EVERYWHERE.

If you're using hard coded constants instead of actual constants, you deserve the pain of changing it everywhere.

    Something_Lower  : constant := 0;
    Something_Higher : constant := 255;

    type Something is range Something_Lower .. Something_Higher;

Is how you do it. Changing the higher value to 65535 or anything else won't ripple.

Now, if you do that in C, you've got hardcoded magic numbers all over the place and the compiler would've just silently compiled without warning. Have fun debugging that mess.


Or

    type Something is range 0 .. 255;
And now Something'First equals 0, and Something'Last equals 255, or whatever upper bound you chose.

There is no reason to hard code a magic constant for the upper end of the range anywhere.


My first IDE was Turbo Basic in 1989.

Changing from

    type Domain is range 0 .. 255;

to

    type Domain is range 0 .. 65535;

is quite easy.


"Turbo<foo> had a great IDE" in no way refutes that Ada suffered from not having an IDE.

Except it did have one; that is how Rational Software was born.

https://datamuseum.dk/wiki/Rational/R1000s400

Eventually that experience was moved into UNIX workstations, like everyone else trying to capitalise on their market.

The "I don't need IDE" culture is more related to languages born in UNIX culture.


> Why is ADA not adopted more broadly?

Productivity, and the general labour intensity.

Formal verification is not critical for the lion's share of all software, and people obviously would like to save man-hours by not dealing with it.

Second is the availability of developers. Not many people even know what formal verification actually is, let alone see a rationale for learning verification-driven development.


You can use ADA without the Formal verification part. I would say at least for embedded systems it's more productive than C because it has a much stronger and expressive type system. E.g. if you do a lot of fixed-point math, ADA's type system can express it. ADA has proper arrays, with bounds checks. ADA has built-in multithreading.

ADA is a very productive language for embedded systems, much more so than C. But people are afraid of the language because they think it's difficult. People don't want to learn it because they think it's old, although ADA 2012 is a much more modern language than C11. It hasn't stagnated as much, but it also evolved more deliberately than C++, keeping its focus on useful features instead of adding all the features. At the same time it makes a lot of things easier in an embedded context, because it was designed exactly for this use-case.

The only issues are the lack of developers and the ugly syntax, but I think any C developer could learn what they need to be productive in a month, and the syntax you can get used to.


> ADA has proper arrays, with bounds checks.

Not only that, but you can also opt out at compile-time if you need, for performance reasons.

You also have (optional) runtime bound-checking for scalar values (aka "integers"), and you can even specify the bounds yourself (like "A number between 3 and 17").
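Rust has no built-in ranged integers, but the guarantee can be approximated with a checked newtype. A minimal sketch with hypothetical names, assuming the check happens once at construction rather than on every assignment as in Ada:

```rust
// Hypothetical sketch: emulating an Ada-style ranged scalar
// ("a number between 3 and 17") with a checked Rust newtype.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Small(i32); // invariant: 3 <= value <= 17

impl Small {
    fn new(v: i32) -> Option<Small> {
        // The range check is explicit here; Ada's compiler/runtime
        // performs the equivalent check for you on assignment.
        if (3..=17).contains(&v) { Some(Small(v)) } else { None }
    }
}

fn main() {
    assert!(Small::new(10).is_some());
    assert!(Small::new(42).is_none()); // out of range, rejected
}
```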


I know, there are many great features in ADA, I just tried to first list those that seemed the most relevant to me in an embedded context.

Proper bounds checked arrays, possible to opt out, integers with bounds: Turbo Pascal had that too.

Key word in that sentence: had. I picked up Ada essentially because it seemed to me a more actively maintained Pascal. (I've since learned that Free Pascal/Lazarus might not be as dead as it first appeared.)

How are compile times in Ada? (Best thing about Pascal/Delphi)

Ada has that and many more features that help in creating great embedded software.

When I first started poking around Ada, I wouldn't say I found the syntax ugly but I very used to C++ and found it at least a bit odd. Nowadays I find myself expressing the exact opposite of your own sentiment. I wish more languages used a syntax more similar to Ada.

The parts of the syntax I'm very familiar with just get treated by my brain the same. It's not "busier" in my head because words are written out. But things I'm not so familiar with I can basically just read. And anything really unfamiliar is easy enough to search.

Syntax stuff falls out of my head pretty quickly if I'm not using it, so I find this particularly useful.


I fully agree with your second point. But, having worked in Ada for 6 years, the productivity is around the same as C++, perhaps slightly better. The productivity is only reduced when writing very short programs.

Ada, not ADA. It’s named after Ada Lovelace, not an abbreviation.

The core of Oracle's PL/SQL is called DIANA: Descriptive Intermediate Attributed Notation for Ada.

And while I'm not too sure of the development history behind PostgreSQL's PL/pgSQL (I suspect Oracle's PL/SQL had influence), the Ada influence necessarily shows there as well.

You are right, but it could be both.

No, it really cannot. It's a person's name.

Americans really love acronyms that spell out a word.

BASIC is a word, and it means Beginner's All-purpose Symbolic Instruction Code.

I don't know why you think it's impossible to have an acronym spell a name?


lol because it's not an acronym in this case, as stated

The previous poster said it could be both, not that it is both. Apparently a lot of people think it is, which is because it's such a commonplace phenomenon.

That it is a person's name is orthogonal to whether it could be an acronym.


Thank you for making my point clearer than I did! :-)

:)

BASIC / Basic isn't a person's name.

Then name your kid Basic and you see that any word can be a name, which makes this a meaningless distinction.

I thought it was both. But I never bothered to ask what it abbreviates.

Ada doesn't have compile-time-safe dynamic memory allocation and deallocation (i.e., tracking a pointer so that you know statically that it's no longer used elsewhere in the function that frees it). Rust does it through the lifetime system, which is based on Cyclone, which long postdates Ada.

The usual alternatives are to statically allocate all your memory, to convince yourself it's safe to use Unchecked_Deallocation, or to use a GC/refcounting library. Static allocation is usually not a problem for special-purpose embedded control applications: e.g., your airplane has two wings and one set of landing gear and its position is described by three coordinates, and you'll never need more of those. You probably want to avoid malloc anyway for both predictable performance and not having to worry about whether your allocation will succeed. But if you're writing, say, a CSS engine or a grep replacement or a terminal emulator, that approach won't work.

GC works for those use cases, but there are a number of excellent memory-safe GC'd languages already in wide use for many things, like Python and Java. You can get very good steady-state performance with Java, sometimes higher than equivalently-complex C, but it's still not a language you'd want to implement loadable libraries or OS kernels in.

Rust solves the problem of making writing high-performance, C-drop-in-replacement code feel much like (definitely not exactly like :) ) working in a higher-level language like Python or Java, including the memory-safety parts of that experience. Ada addresses memory safety, but for most of the use cases people are excited about Rust for, they want the combination of those features. For things like flight control software, picking something extremely mature (usually the companies in these press releases have existing Ada code from previous projects dating back decades) is valuable and having safe dynamic allocation isn't really a requirement.


As described in another post, Ada has features to avoid the problem Rust wants to solve in most cases. You can, e.g., allocate variable-length arrays on the stack.

Ada also lets you create ad hoc local reference types and disallows references escaping beyond the scope of their type. This is an insanely powerful combination for ensuring references do not leak beyond where they are supposed to.
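For comparison, Rust enforces a similar no-escape rule through its borrow checker. A minimal sketch; the commented-out line is the kind of escape that gets rejected at compile time:

```rust
// Minimal sketch: references may not outlive the scope of what they
// borrow, much like the Ada rule described above.
fn main() {
    let escaped;
    {
        let local = String::from("scoped");
        let r = &local;          // fine: the borrow stays inside the scope
        println!("{}", r.len()); // prints 6
        // escaped = &local;     // rejected: `local` does not live long enough
        escaped = local.len();   // copying a plain value out is fine
    }
    assert_eq!(escaped, 6);
}
```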

Yes, thanks. There are a lot more examples. I wish someone with deep knowledge and real experience in both Ada and Rust would write a side-by-side comparison covering all relevant aspects, including more recent developments like https://www.adacore.com/papers/safe-dynamic-memory-managemen....

I've seen that before - is there an implementation, and also, how do the usual users of Ada feel about it?

I feel like one of the attractions to Ada for this sort of use case is its maturity, and "here's a proposal for a complicated feature, inspired by Rust, a language which is young and rapidly evolving and in particular switched to a new borrow checker implementation less than two years ago and shipped it with known unsoundnesses and is working on a new borrow checker implementation as we speak" may not be attractive to Ada's usual users. :) (To be clear, I'm fine with the risk profile of Rust's borrow checker, but in part that's because I'm fine with the risk profile of using the latest stable Rust and picking up any fixes every six weeks. I'm fine with a more powerful feature that lets me develop code quickly and I'm fine with upgrading my compiler every six weeks to pick up any bug fixes; I imagine that people writing airplane control software are probably going to prefer something less powerful that hasn't changed in decades.)

Honestly, I think one of the questions here is community. I don't really know who the Ada community is, beyond press releases from Adacore about planes and trains. I suspect that community simply does not care about safe dynamic allocation, even if it were available as a supported option, so they aren't going to be excited about testing out this feature. Is there a user community who would use this, in production (doesn't have to be a company, the "serious hobbyist" open source dev is fine and probably even preferable here), within the next five years or so? Are there folks writing stuff like high-performance grep replacements in Ada or hardened HTML 5 implementations in Ada?

I'd love to see (or help with!) such a comparison, but I get the sense that Ada's and Rust's use cases are so different that writing a fair apples-to-apples comparison is hard and even finding someone to do it would be hard.


Where is the quoted text from ("here's a proposal... as we speak")?

> I imagine that people writing airplane control software are probably going to prefer something less powerful that hasn't changed in decades

True, depending on the design assurance level a certified compiler is required.

> I suspect that community simply does not care about safe dynamic allocation

See the reference in my former post. Whether and when the certification authorities accept it is yet another question.

> Is there a user community who would use this, in production ...

It's easy to find projects on Github where dynamic allocation is used; e.g. https://github.com/ghdl/ghdl; with the current developments, such applications may also have a chance for formal verification in the future. And no doubt it would help if the Rust compiler was formally verified as well.


> it would help if the Rust compiler was formally verified as well

As far as I know, the only usable formally verified compiler (for a serious programming language) is the CompCert C compiler. Even C compilers and Ada compilers intended for use with critical systems don't tend to be formally verified.

Of course, it would still be great if the compiler for Rust (or Ada, or any other language for that matter) could be formally verified.

It's unfortunate that CompCert isn't Free Software.


The Rust compiler had some severe bugs in the past which undermined the effectiveness of the memory safety. There is the saying in software engineering that the number of undiscovered errors in the system is proportional to the number of discovered errors. On the other hand Ada compilers used to be subjected to strict validation procedures in the past, and the AdaCore toolchain has an ISO 26262 and IEC 61508 qualification. That was the source of my remark.

> There is the saying in software engineering that the number of undiscovered errors in the system is proportional to the number of discovered errors.

I think that's true for software developed under similar processes, but (for instance) if you're comparing software that has unit tests and software that doesn't, the constants are probably wildly different. Your tests (or even the process of thinking about writing testable code) will discover a number of errors but will hopefully drive the number of actual errors down.

To be clear, I'm not making a statement on whether Rust or Ada is better-tested or better-developed, but I will note that Rust goes out of its way

I'm also curious about these bugs - I'm aware of bugs in the standard library (which is developed in the same repo as the compiler), but they only undermine memory safety against hostile source code, and hostile source code has lots of other options (unsafe, /proc/self/mem, ptrace, ...), i.e., the Rust compiler is not a sandbox. I'm not aware of severe bugs in the compiler, and I'm not aware of standard library bugs against hostile user input by well-intentioned source code - e.g., there shouldn't be things like the 2001 sudo "vudo" exploit.


> I think that's true for software developed under similar processes

Of course, this is not a law of nature, but simply a reasonably plausible heuristic, which somewhat justifies my reluctance toward a new technology that was received with much enthusiasm.


Ok, but a formally verified compiler is a very different beast from a high quality compiler. We can realistically hope the Rust compiler team significantly improve the correctness of the compiler, but full formal verification is unlikely.

CompCert is an extremely impressive achievement, and compiles a near-complete subset of standard C. A verified Rust compiler would be an enormous undertaking, and would need a very skilled team. It also wouldn't be worth trying unless the Rust language were very stable, which, if I understand correctly, it isn't.


Sorry, the text in quotes is my own, I'm not quoting anything. Probably I should have found a way to write that without quotation marks.

Rust added a new borrow-checker implementation, "non-lexical lifetimes" (NLL) in December 2018, as part of introducing the 2018 "edition" (a compile-time flag that says what compatibility level you're targeting, roughly analogous to --std=c99). In 2019, they backported it to the original 2015 edition and removed the original implementation, "AST borrowck". As the names imply, NLL has the ability to handle more complex patterns without assuming borrows are live for the entire lexical scope just because it's accessed somewhere in a pair of braces; this turns out to be very useful in practice, since a lot of natural patterns (including those written by beginners not trying to do anything complicated) trip up the older implementation. However, NLL initially accepted some code that was invalid, and it was shipped anyway, and at least a couple of production Rust users ended up writing things incorrectly accepted by NLL. See https://lkml.org/lkml/2020/8/23/214 for my summary of what happened with some links. (I don't think this caused any problems in practice; my sense is that it was undefined behavior, which shouldn't be possible in safe Rust, but the compiler almost certainly chose a behavior that happened to match the intended semantics.) And there's work in progress on a new borrow-checker implementation called "Polonius."

All this is to say that, if you wrote a paper in 2018 saying that Rust's (AST-based) borrow-checker feature is great and Ada should pick it up, you're still behind Rust because there's a new one. If you wrote a paper in 2019 and based your work on NLL, you stood a nonzero chance of picking up that bug. If you write one today, maybe you should base your work on Polonius, but Polonius will itself probably take a fair bit of real-world use to shake out its own bugs.

> It's easy to find projects on Github where dynamic allocation is used; e.g. https://github.com/ghdl/ghdl

Thanks, that's useful!

Though (and maybe this is my unfamiliarity with Ada), from some quick looking around to see where it dynamically allocates, I find https://github.com/ghdl/ghdl/blob/master/src/synth/synth-hea... , which seems to bind the C malloc function and intentionally not bind free and leak all memory, am I reading that right? That's definitely safe but suboptimal.


Whoops, forgot. The borrow checking memory management is available on GNAT Community Edition 2020, for SPARK and (I believe) also Ada itself. The GNAT Ada compiler is used to compile and build SPARK apps. Since SPARK is very strict, the implementation is robust even at this relatively early stage.

There's a telegram group https://t.me/ada_lang you could open that question on. Also, there's work (I think on GitLab) on a package manager called Alire. And the very old standby comp.lang.ada which tends to have some interesting conversations.

And if the allocation fails (stack overflow) one can re-execute the same function with a smaller allocation size.

I'm not super knowledgeable about Rust, but I have tried poking at it a few times, so... you know.

Rust seems to be extremely focused on memory safety. Meanwhile Ada is much more focused on correctness in general. Restricting the discussion to simply memory safety ignores, well, most of anything that might be programmed.

Ada also has more memory safety options than people tend to talk about. As others have mentioned, it's possible to allocate quite a bit on the stack. Other rules also prevent some reference issues.

One thing I rarely see mentioned are memory pools and subpools. It is possible to ensure whole types are allocated in a particular subpool, and deallocation can be left for when the subpool falls out of scope.


I like Rust, but my interest in it dwindled when I realized the safety/correctness focus was almost exclusively on memory safety, with regard to its "value add". Ada's type system offering things like custom, incompatible integer ranges (even if they cover the same range, cross-type assignment requires explicit conversion) was the safety I needed for work. Memory safety is great, and a worthwhile goal, but was not the thing holding back most of the projects I've participated in (that said, memory safety was a notable issue in one adjacent project; I was not on it, but memory safety and concurrency were the primary source of bugs for them).

I actually wouldn't say that Rust's safety/correctness focus is exclusively on memory safety. I think it's an easy and clear thing to point to, but compared to C, you've got the following features:

- Nullable types (Option<T>), maybe-error results (Result<T, E>) etc. that force you to check whether you're looking at the right variant. You can't have bugs where you fail to check the error value and you use invalid/meaningless data, because you syntactically cannot express access to the data. This does often manifest as avoiding NULL pointer dereferences, which is technically a form of memory unsafety, but it's really preventing a logic bug. (For instance, this would have made the "goto fail" bug unnatural to write; that one was returning err = 0 as if it were an error, which you can't really do in the Result model.)

- More generally, sum types / tagged enums and the match and if let constructs, which enforce at compile time that you're accessing data from the right variant.

- Locked data that enforces that you hold the lock when you access the data, using similar means. You can enforce that there's no direct access to the data except through the lock wrapper, and more importantly, you can enforce that the lock stays locked as long as someone holds a direct reference to the data (i.e., that you can't leak a reference to that data and use it after you've released the lock).

- Safe threading/concurrency via labeling which types can be moved between or shared across threads. Again, this is technically a form of memory unsafety (data races) but it's more about logic.

- Typestate. You can avoid bugs where you use some logical resource when it's in the wrong state (read/write a closed file handle, send protocol data while you're still handshaking, etc.) if your API consumes ownership of the object in one state and returns the object in another state, and you don't have methods on objects of the wrong state. Rust pre-1.0 had typestate as a language-level feature, but it's straightforward to implement using the typesystem and the ownership model.
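Two of the bullets above (Result forcing the error path to be handled, and lock-guarded data) can be sketched in a few lines of std-only Rust; `parse_port` is a made-up example function:

```rust
use std::sync::Mutex;

// Result<T, E>: the caller cannot reach the u16 without going
// through the Ok/Err variants.
fn parse_port(s: &str) -> Result<u16, String> {
    s.parse::<u16>().map_err(|e| e.to_string())
}

fn main() {
    match parse_port("8080") {
        Ok(p) => assert_eq!(p, 8080),
        Err(e) => panic!("unexpected: {}", e),
    }
    assert!(parse_port("not a port").is_err());

    // Mutex<T>: the data is only reachable through the guard, and the
    // lock is released when the guard goes out of scope.
    let counter = Mutex::new(0u32);
    {
        let mut guard = counter.lock().unwrap();
        *guard += 1;
    } // lock released here
    assert_eq!(*counter.lock().unwrap(), 1);
}
```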

Really, I would say the core feature of Rust is a richer type system and the system of ownership and shared/mutable references. The most obvious thing to do with it is to prevent buffer overflows and use-after-frees, but it's not the only thing. (And that's also why I mentioned safe dynamic allocations, which rely heavily on the ownership system; without it, it's still pretty easy to prevent buffer overflows for static allocations.)

I don't have a sense of where Ada lines up on these. I know it has a much richer type system than C too, so it's possible it answers many of the goals besides safe dynamic allocations. I just personally see the benefit of Rust as more than just memory safety.


Interesting. The main reason I lost interest in Rust is because I found the type system to be too poor (there should be a better alternative to "rich" in this context, but I can't think of it right now).

Off the cuff, I'd say Ada addresses all of those problems. In case you're interested, I'll leave a few terms that might be useful in delving further.

- Nullable types + sum types. Pointers can be declared "not null" which basically works as it says. Sum types exist as discriminated and variant records. Variants can be immutable or mutable, and obviously work to do the typical Option, Result, etc stuff.

- Locked data. Protected types do this. I imagine details might differ, but I know nothing about how Rust does this and haven't messed with them much or at all in Ada.

- Task types. I'm tempted to say more, but the little details matter so much for this kind of thing so I probably shouldn't. But Ada has tasking and concurrency mechanisms built-in to the language.

- Typestate. Compiler errors for this sort of thing are not immediately obvious to me. I can think of several options to try using even just the features I already mentioned, but I'm not sure what would or would not be detectable at compile-time. Catching incorrect usage and preventing it from causing trouble is more trivial, but just not the same.

The simplest and (in my opinion) most important feature in Ada is the ability to trivially create new types. So many languages require manually wrapping types. Better ones start to provide more magic to help automate it. But fundamentally none of them make user-defined types as a core component of the language. In Ada to create a new type it's just one simple line, and you automatically get full functionality of the base type while retaining incompatibility. The rest of the language and even the standard library is all designed around the idea of allowing the programmer to freely use these types.
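
For contrast, here is roughly what that costs in Rust (a hedged sketch with a made-up Meters type): where Ada gets a distinct, fully functional numeric type from one line like `type Meters is new Float;`, the newtype pattern requires a wrapper struct plus hand-written or derived trait implementations for every operation you want back.

```rust
// A distinct length type over f64. The derives restore equality,
// copying, and printing; arithmetic must be re-implemented manually.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Meters(f64);

impl std::ops::Add for Meters {
    type Output = Meters;
    fn add(self, other: Meters) -> Meters {
        Meters(self.0 + other.0)
    }
}

fn main() {
    let d = Meters(1.5) + Meters(2.5);
    // let oops = Meters(1.0) + 2.0; // compile error: distinct types
    println!("{:?}", d);
}
```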

You end up capturing so much more of the model that the program is using, and the compiler is able to check and ensure that the code is actually sensible according to that model. I find even the process of writing the types down helps expose flaws in the design I had in my head. Then once I start using it, it does a great job of catching when I start deviating from how I said everything should work.

Pretty much every other language I've ever looked at makes doing that sort of thing extremely painful. There is too much friction involved in detailing all of that, so nobody does. Then libraries are designed without it, which just increases the friction. Ada meanwhile makes it so easy to do that you feel bad not doing it, and you don't even gain much more than saving a handful of keypresses if you avoid it anyway.


Not exactly what you're after but I asked HN about memory-management in Ada recently, and got a response from one of the AdaCore guys.

https://news.ycombinator.com/item?id=24361992


The free version of GCC GNAT is garbage: it randomly crashes during compilation, and the timers are based on a subdivision of SysTick on STM32 (good luck getting accuracy or high frequency out of that).

So basically you already need to be wealthy just to dip into it (buy an AdaCore license and a supported, expensive aero-grade MCU test kit).

I stopped messing with it 5 years ago.


Does anyone have more recent experience? Is this still the case?

I did Ada at Uni.

I think it's an ok language but its Pascal roots show. In an alternate universe where Pascal won instead of C (which it nearly did!) it might have ended up where C++ is.

Pascal being the choice for systems programming instead of C nearly happened. One of the first things Ken Thompson and Bill Joy wrote for BSD was a Pascal system, and the Apple Lisa system was mostly written in Pascal.


Honestly, I wonder why Pascal doesn't dominate, too. Safer than C while still just as powerful, "teaching language" (easy to learn) but used for serious software (as you note, Apple used it for system programming). FPC is amazingly feature rich.

When Apple ported their code from the Lisa to the Macintosh, they had to rewrite a lot of Pascal code in C and assembly to make it fit into the ROMs. There was also the issue that 100% standard Pascal was useless for writing microcomputer programs because it couldn't do any I/O outside the console and files. C, on the other hand, through pointers and inline assembly, allowed you to directly monkey with the hardware.

It's also often forgotten how many early PC programs were written in assembly.


The Mac was programmed in Object Pascal and assembly. No C anywhere.

C compilers for the Mac came along later, and even Apple A/UX Unix System V with BSD extensions, and Gcc. A/UX ran pretty nicely in 8M on an SE/30 (512x342 mono CRT), with a MacOS 9 GUI emulator you could run xterms on.


You are correct, I misremembered.

Because Borland lost their way and Turbo Pascal/Delphi was the leading Pascal implementation, so eventually most Pascal lovers moved into Java/.NET.

Kernighan wrote about the limitations of Pascal in "Why Pascal Is Not My Favorite Programming Language": https://www.lysator.liu.se/c/bwk-on-pascal.html

Note, however, that he wrote this in 1981; these days, when someone speaks of Pascal, they usually speak of a dialect based on Turbo or Object Pascal, which appeared in 1983 at the earliest and address many of Kernighan's concerns.

tl;dr the Pascal of the 1970s was a lot rougher than the Pascal you probably know.


Especially the lack of separate compilation units was a weak point of the original Pascal, as also noted by Kernighan. This was fixed later in UCSD Pascal (1978) and of course by Wirth himself with Modula-2 (also 1978).

But that said, it's strange that Kernighan wrote the above article ("Why Pascal Is Not...") in 1981, i.e. three years after the introduction of UCSD Pascal and Modula-2.

Does anybody here know the background of that article? Why did Kernighan feel the need to defend C against a 13-year-old language that was already no longer in use in its original form at the time the article was written? Was C under attack and at risk of losing its popularity?


It should be noted that even when Kernighan wrote that, with the exception of the stylistic bits (e.g. where you put the semicolon), most of the issues he raised didn't exist in the Pascal dialects people actually used. His comments were about standard Pascal, but few really confined themselves to standard Pascal; even the standard itself didn't expect that, and IIRC it explicitly mentions that implementations are going to extend it, as that was the expectation at the time. In general, Wirth's languages weren't meant to be taken as gospel but as a base to expand from (even his own Oberon system slightly extends the language described by his Oberon language report).

> "teaching language" (easy to learn)

I think that worked against it as well.


The thing I saw with Pascal in the late 70's / early 80's was that in order to handle industrial use cases, each computer company invented its own dialect (I worked on one at Burroughs).

That was one of the problems Ada addressed. But its initial problem was the cost of compilation and builds on the hardware of the time. That got sorted out by the end of the 80's, but then other factors like contractors not wanting their customer telling them how to do their work came into play. Sort of a confusion of requirements and implementation methods.


I would prefer Modula-2 where I do like the module system.

Because Ada is not a language for quick and dirty work. It's a language that is only fun and usable if you apply engineering methods to the whole project (i.e., requirements, documentation, etc.). Most projects out there don't. Quite frankly, it's not a hacker's language.

EDIT: Oh and the price tag.


It's in gcc. Or available from AdaCore under GPL-3 (yeah, I know). Unless you're in the sort of position where you need a support contract anyway, price isn't really an issue.

Yes, I know it is in gcc. But compared to the AdaCore version, there were way too many bugs (at least when I played with it in 2018). And for any commercial endeavour (even just a small company, say one or two developers) you really need the support contract (for fast fixes to compiler bugs, etc.). That's from my experience. Remember, the question was about widespread adoption. For the small shop situation described above, there is no pricing those shops can actually pay, and the GPL 3 version does not help here. At least I haven't seen a solo developer go from "I created this amazing thing with the GPL version" to "I can now afford the commercial license at 20000€ per seat".

My experience is vastly different from yours. I've been using Ada as my main language for personal stuff for some years now... I can't quite recall when, but call it 5+ years?

I won't say I've never run into some kind of problem with Ada on gcc (aka FSF GNAT), but it's been pretty minimal and easily avoided. I think it's been a grand total of two or three bugs that I've known of in that time. I've also run into known gcc bugs with C++ before so the situation doesn't seem particularly bad to me.

I find it hard to believe a solo or duo dev shop has a real need for a support contract for their compiler. What language can you get a support contract for that is affordable by anyone?


As I said, I tried the FSF version at about 2018. Might be way better now, I don't know. My experience with gcc (C, not C++) for personal projects is exactly zero bugs. So, you see, experiences can differ ;-).

When saying "you need the support contract" I mean getting the commercial compiler version. I think (theoretically?) AdaCore does not charge you for the compiler, but for support. And you only get access to the commercial compiler and libs if you have a support contract, no?

I admit I might be one of those who does not completely understand the legal circus around GNAT, AdaCore and the FSF version.

P.S.: You can get commercial support from IAR for the C compiler and RTOS, which is next to useless ;-).


I mean, I was using FSF GNAT in 2018. And a few years before that, and some after it. I have a hard time imagining an experience even remotely described as unusable.

I do wish AdaCore didn't leave the compiler situation so confusing. It's just enough to be a consistent distraction.

Basically they all come from the same source. The GPL version gets a yearly release that, afaik is basically the commercial one (presumably not including anything customer specific). FSF then has a bit of a more ambiguous relationship with the GPL version, but AdaCore maintains that too. Probably around 2018 I think there was more effort being put into eliminating any remaining differences between the GPL and FSF versions.

They also have some further libraries that are available. At least some (maybe all? I don't know) are available under the GPL too. I have no idea how the pricing on that stuff goes. That strays from the compiler side of things though.


"I have a hard time imagining an experience even remotely described as unusable."

Ok, so i should give FSF GNAT a try then. I always tried AdaCore's package. It's unfortunate that comp.lang.ada isn't available anymore. I think there were quite some not so fun bugs reported.

Anyhow. I had a look at https://www.adacore.com/gnatpro/comparison. What strikes me as odd is only support for x86 Windows/linux/MAC for the Community Edition. No ARM Linux (say Raspberry Pi)? That sure is not what it looks like, no?

I really don't want to pester you, but did you ever use Ada in a commercial setting and can you share your experience if so? I'm generally interested in Ada (i have a look at it every year or so), but the consensus on comp.lang.ada always was "for commercial usage you really should buy a commercial license (no matter the project, no matter from which company)".

Which more or less always meant AdaCore (some were suggesting Janus Ada, because of the moderate pricing) if you want an Ada compiler which supports the latest standard. Also, because AdaCore seems to be top dog in this area, with other vendors differing wildly in quality of compiler and libs.

I don't know if any of this is true, but those were the vibes from comp.lang.ada i received.


I haven't used Ada in a commercial setting, unfortunately. I was never that involved with comp.lang.ada so I can't really comment there. However I never really got that impression from freenode #ada. Based on what I heard there, I'd describe it more like "if you don't know that you need it, you don't need it." I also got the impression comp.lang.ada tended to be a bit more um... conservative, if you will. I'm not so surprised to hear they were more inclined to have a commercial license.

Personally speaking with my own experiences and what I saw on #ada I'd be comfortable going without a commercial license. I've only ever seen problems with somewhat odd situations, and I'm pretty sure all of them produced a bug box (compiler error message) so at least I was aware of the situation. And Ada often has multiple reasonable ways of doing something, so even if the odd/clever thing causes an error it's unlikely to be a blocker. Mostly it means I would have to put up with usage that's a little more not to my taste.


"I also got the impression comp.lang.ada tended to be a bit more um... conservative, if you will."

Hehe, yes definitely. But considering the projects some of the participants worked on, I can understand why. For those projects you do need a specific mindset.

So, thanks for sharing your experience. I will try the FSF version. Maybe this is the path forward for my personal projects (which might end up commercial ;-) ).


That’s a bit old, though. Anyone can use GNAT now, and there are newer versions of the language. Even the discussion is dated: Ada 2012 is more widespread and there have been developments in SPARK.

It is old. However I think at least a few of the points stand, particularly about competition with C.

I could get Turbo C or Turbo Pascal, there was no Turbo Ada. It has never had a compiler / IDE that made it easy to write Windows or Mac applications.

It has never had a compiler / IDE that made it easy to write Windows or Mac applications

Well, it did, it just cost $50k/seat.


A little exaggerated. From the 90's there were a couple (Janus, Meridian) at about $600 a seat.

>> It has never had a compiler / IDE that made it easy to write Windows or Mac applications.

Are you talking about the Eighties? GNAT has been available for free since the nineties, and the GPS IDE has been available for free on all relevant platforms for nearly twenty years; even before that there were IDE-like alternatives for free.


The free part is confusing as it is GPL'd and I don't understand what the runtime exceptions mean. It's confusing to me where the boundaries are and when I would have to switch to the commercial version (and get a quote) to make paid and closed source software instead of using the community language. Ada is great, but I'll never use it past hobby status under these conditions.

The FSF GNAT version, in contrast to the AdaCore GNAT version, comes with the runtime library exception (RLE, see https://www.gnu.org/licenses/gcc-exception-3.1.de.html). This means that code generated from closed-source Ada applications may be linked with the runtime library included with FSF GNAT without a GPL violation. With the runtime library included with AdaCore GNAT this is not the case. So in practice you have to use FSF GNAT for closed-source applications unless you're willing to pay for AdaCore GNAT. This is mainly an issue if you need to compile for a processor architecture for which there is no FSF runtime library.

Thank you! Is the FSF GNAT version ancient or pretty close to the Adacore version in terms of updates?

I'm not up to date. Last time I checked (about five years ago) the FSF GNAT was about a year behind AdaCore GPL GNAT. Maybe someone else has more recent information.

At least as of a year or two ago, AdaCore seemed to be putting in more effort to shore up the differences between FSF and GPL GNAT.

Do you have evidence/references of this?

After all, FSF GNAT was already compatible with Ada 2012 at that time. And a company that makes large investments in software using Ada undoubtedly needs support and will not shy away from the comparatively low additional costs for licenses and maintenance contracts in its own interest. But it is important that the technology is absolutely free for small companies and people who are simply interested in the technology.


Nothing concrete I can point you to offhand. An AdaCore employee popped up in the community (freenode #ada, probably comp.lang.ada too) and started talking about it. I know several people pointed him to differences that were later addressed, or he had already tackled.

What abour Spark?

SPARK is a formally verifiable subset of Ada with accompanying proof tools. In contrast to the GNAT runtime library, there is no need to deploy anything related to SPARK when you use it. As long as you don't deploy GPL software there is no issue with closed-source applications. Just use the GPL version of GNAT and SPARK to build and check your application.

Besides the licensing confusion 7thaccount mentions, the GPS IDE is only free for "open source GPL software"[1]. Otherwise it's everybody's favorite "Request Pricing - Help us understand your development needs and get your pricing information or an evaluation".

This sucked for Smalltalk, and it sucks for Ada. I stand by my statement that there was no equivalent to the Turbo family for Ada. The Ada space isn't very different from the Smalltalk one, although I think the Ada story might be worse. A damn shame for such a nice programming language.

1) https://www.adacore.com/community >> https://www.adacore.com/gnatpro/comparison


> the GPS IDE is only free for "open source GPL software"[1]

That's very academic. The GPS is available under GPL (see https://github.com/AdaCore/gps). As long as you don't deploy GPS or link your app with it you can use it for closed-source apps as you like; it's definitely no GPL violation.

> This sucked for Smalltalk

Not sure why you compare it to Smalltalk; but for the latter you can use Squeak for free (under MIT or Apache license) since the nineties and today even a couple of other powerful Smalltalk implementations such as Pharo.


That's very academic.

No, if small commercial developers cannot use the language without paying massive fees, you might as well forget any popularity. Heck, you couldn't even use Ada for non-GPL open source programs.

Not sure why you compare it to Smalltalk

Because Smalltalk vendors have a long history of doing the same pricing that made it impractical to impossible for the little guy to use the language. There have been several threads on HN about this.


> No, if small commercial developers cannot use the language without paying massive fees, you might as well forget any popularity

Have you read my answer? There is no reason why you couldn't use FSF GNAT or GPS in closed-source projects.

> Heck, you couldn't even use Ada for non-GPL open source programs.

Why not? The GPL GNAT with RLE allows you to use any licence with your application, whether closed or open-source.

> Because Smalltalk vendors have a long history of doing the same pricing that made it impractical to impossible for the little guy to use the language.

There were always free ST versions as far as I can remember. The "little guy" could e.g. use "Little Smalltalk" by Budd which appeared before 1987.


First, the last point: "There were always free ST versions as far as I can remember. The 'little guy' could e.g. use 'Little Smalltalk' by Budd which appeared before 1987."

No one could write a commercial program with Little Smalltalk. Love the book, but the implementation is not going to work. Heck, Squeak could not do a program that looked anything like a real Windows or Mac program.

Maybe we're talking past each other but the IDE says the community edition is only authorized for GPL software, every other edition is big bucks. https://www.adacore.com/gnatpro/comparison


Well, if you were willing to spend $60 you could buy a Smalltalk package suited for commercial applications which looked somewhat like Windows. But that's not the topic here anyway.

> Maybe we're talking past each other but the IDE says the community edition is only authorized for GPL software, every other edition is big bucks.

Do not let this mislead you. The software is available under GPL. That is all you need to know. And GPL allows you to use the software for any purpose. Maybe I should mention that I studied law for two years and then had one more year of lectures in patent and license contract law (see my profile for more information).


Ok, no. Look at the GPL interaction with App Stores. GPL has been debated on HN forever.

I do not understand what you are trying to say. I also don't understand what the current topic of conversation has to do with App Stores.

> There is no reason why you couldn't use FSF GNAT or GPS in closed-source projects.

But there is Fear Uncertainty & Doubt.


Although we have the Enlightenment behind us, there is still superstition. There is nothing I can do about that.

You inform those who are open to being better informed.

Well, obviously with moderate success ;-)

March 7, 1988 — "Smalltalk/V 286 is available now and costs $199.95, the company said. Registered users of Digitalk's Smalltalk/V can upgrade for $75 until June 1."

https://books.google.com/books?id=CD8EAAAAMBAJ&lpg=PA25&ots=...


which quickly became Visual Smalltalk Enterprise; you can ask various companies how that went commercially.

Factually, October 22 1989 — "… an OS/2 version of the Smalltalk language, Smalltalk V/PM … The $500 product is designed to cut the time it takes to prototype user-interface-intensive applications for Presentation Manager."

https://www.cbronline.com/news/digitalk_has_first_compiled_v...


OS/2 didn't move the needle for anyone. Smalltalk V evolved into an unsupported, expensive Smalltalk. Smalltalk never had its Turbo moment.

15 July 1991 — "The package costs $499.95. Registered users of earlier versions of Smalltalk/V Windows can upgrade for $25. …

Digitalk's Smalltalk/V Macintosh will continue to cost $199.95, but eliminates the previous per copy run-time royalty fee."

https://books.google.com/books?id=jVAEAAAAMBAJ&lpg=PA16&dq=d...

Did Turbo ever have its enterprisey moment?


> ADA/Spark allows for formal verification, which C/Rust do not have to my understanding.

Check out Frama-C: https://frama-c.com/


It requires a license to use GNAT for commercial projects, it's not free like many C/C++ tools.

FSF GNAT (from your distribution) can be freely used. The GNAT that makes output GPL is the GPL GNAT edition downloaded from AdaCore's website.

Yep, the GNAT toolchain in Debian is actively maintained and they even build cross compilers for all of Debian's supported architectures. Combined with a Linux RT kernel, you can get some pretty good performance with only GPL code.

If you need to build for bare metal rather than Linux, things get a bit more complicated, as GNAT requires some runtime code to set up things like SysTick and interrupts. You can use AdaCore's non-GPL runtime for non-commercial purposes, cortex-gnat-rts on top of FreeRTOS, or RTEMS, which natively supports Ada (though it's poorly documented).

I've also had some luck using crosstool-ng to build gcc toolchains with Ada support, but you'll still need to find or write a runtime library for bare metal.


GCC only started to matter after UNIX vendors stopped offering their SDKs for free, actually.

Uneven C++ support was a big issue, too.

Rust is akin to the JS framework du jour with respect to safe languages: overpaid, bored engineers started a project without bothering to check existing work, or dismissing what they found, then marketed the resulting (still unfinished) language as something radically new that solves most programming language issues. Some people followed because of the novelty effect and the company that spawned it, which allowed the hype train to start in places like HN. Rust is a social phenomenon, not a technical one.

(This post is sponsored by the Rust Paganization Strike Force)


That is just incorrect. Ada seems pretty nice, but it does not have a borrow checker, does it? I think deallocation is considered unsafe?

Documentation points to [Unchecked_Deallocation](https://docs.adacore.com/live/wave/arm12/html/arm12/arm12-13...). Rust has automatic (as in, you don't even need to call `free()`), safe deallocation of resources. Of course, the syntax, the ergonomics, and the modern-looking website and documentation also helped Rust become popular, but it is technically new (for a mainstream language).
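
A minimal sketch of what "automatic, safe deallocation" means in practice: the heap buffer is freed when its owner goes out of scope, with no explicit call, and any use after that point is a compile error rather than a use-after-free.

```rust
fn main() {
    {
        let v = vec![1, 2, 3]; // heap allocation, owned by `v`
        println!("sum = {}", v.iter().sum::<i32>());
    } // `v` goes out of scope here: its buffer is freed automatically
    // Any use of `v` past this point is rejected at compile time,
    // so there is no window for a use-after-free.
}
```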


There was a recent thread with a lot of discussion on this topic :) https://news.ycombinator.com/item?id=24360310

Education: Ada was under the radar and only taught in a few colleges, I believe. And even where it was taught, it would be drowned out by Java classes.

Tooling-wise there's only one game in town. That's the primary reason.

There is even an AdaDoom :-)

> Why is ADA not adopted more broadly?

The Ada specification is huge. You can learn 90% of C with K&R 2.


Web search saver: Ada '83 spec is 338 pages (https://quicksearch.dla.mil/qsDocDetails.aspx?ident_number=3...)

For C life is easy as you can leave things undefined with the caveat "dragons may fly out of your nose" :)

An alternative example of a small spec language might be Scheme (especially pre-R6RS): https://www.schemers.org/Documents/Standards/R5RS/r5rs.pdf


Nobody wants to write Ada 83, and the later additions are far more extensive than C99 and C11.

But K&R 2 was the point of comparison; that's C89, right?

K&R 2 represents the bulk of the modern language. You can't be very productive knowing just Ada83.

Right. But the question was historical (so I interpreted it at least).

Shouldn't those versions be compared to C++ though?

The problem is that using C in practice is not covered by K&R.

Any better books you recommend?

Programming Rust by O'Reilly, haha.

I get why this is downvoted but it is solid advice: in C you absolutely have to track ownership and lifetimes of pointers and pointees, but the compiler is free to not help you in any way. Learning rust will make you a better programmer. Thus, the recommendation is very good.

It is not solid advice to suggest a book that's not even about the language that's being asked about.

There's a number of reasons why it's not solid advice, but here's a huge one: you have no context for why the person is asking. Maybe they already know Rust, and want to learn C. It comes off extremely poorly. I downvoted your parent myself, even.


It's OK Steve, Jesus was also misunderstood :-).

I would not draw that comparison. You knew it was bad behavior, hence the "haha." You're understood just fine, it's just not going to have the effect you want.

I would rather have Hepatitis C than C.

Learning any language makes you a better programmer

Using* any language and writing non-trivial programs.

At the beginning of the century I worked at a company that used Ada for radar software. As it became clear that HP-UX and Solaris (PA-RISC and SPARC) were being replaced by Linux on x86, my company had AdaCore come in and do training on the new toolset (Rational not providing Ada compilers for Linux).

AdaCore were quite knowledgeable as a company offering supported GNU Ada. I think they're still heavily involved in the development of the free Ada compiler GNAT (part of GCC).

https://www.adacore.com/get-started

Defense companies liked open-source vendors to take risk off the table, and I'm not sure how it worked, but some Ada compilers were "validated".

I grew to like Ada, (it has some warts, strings for example, but we didn’t use them often)

GNAT Wikipedia page. https://en.m.wikipedia.org/wiki/GNAT


I've never written in Ada, but I did come across m2os the other day: an Ada-based system for Arduino Uno microcontroller (and a couple of others) that features a simple scheduling policy.

This is actually quite intriguing: a language that includes the concept of tasks at the language level, rather than trying to hammer in a bunch of macros or somesuch into C to achieve a similar result.

It makes you wonder if Ada really ought to be the "next" system programming language instead of Rust.


Ada is a really good system programming language, especially if you have to get very low level, where the language allows for very fine control over memory layout and for interfacing with hardware. All that _without_ sacrificing type safety. In addition, if you want to make your software even more bullet proof, you should consider the SPARK subset of Ada and statically prove the Absence of Runtime Errors, which includes memory safety. Furthermore, with SPARK you can prove various higher level properties of your software, thus reducing the number of unit tests needed.

In 1983, although Concurrent Pascal, Modula-2 and plenty of others did it already during the 70's.

There's an online book for Ada for anyone that's interested in what it looks like. https://learn.adacore.com/courses/intro-to-ada/index.html

It has to be said that AdaCore is a French company; there is more to it than just a technical decision.

AdaCore was formed by various individuals at New York University who were involved in the creation of GNAT, which was a DoD funded project. The company is still based in New York. Over the years they have expanded into various parts of Europe.

I don't care at all for the underscore-heavy style but otherwise Ada is actually a great language for developing high-reliability systems like avionics. Formal specification and verification are only going to grow more important in aircraft certification as time goes on.

Nice to see the defense/aerospace sector giving a nod to free software like gnat.


Ada 95 was the language for 1st year Comp Sci at the University of Glasgow at the start of the millennium. There was a small revival in Ada at that time.

I remember the first lecture introduced a little graphical rocket simulation in Ada and us students would write code to land the rocket. I liked that.

Then i think the semester after this they introduced us to Haskell (ghc = Glasgow Haskell Compiler). Another good course but one i wished i’d paid more attention to at the time!


Glasgow alumnus here. The Haskell course was incredibly well taught, though the free learning material for Haskell (e.g. learnyouahaskell) is of significantly higher quality than for most other programming languages.

NYU also. The profs were the creators of GNAT so they taught it instead of Pascal. I loved it but I haven't seen it in action ever in my life.

I'm not sure if this is still the case, but one of the reasons I learned Ada in college is because it was used by the DOD. I'm not sure if it's still a requirement for DOD projects, but I believe it was at the time. Which would perhaps explain why Airbus chose Ada. Either that or simply the engineers who are on the project were exposed to/ used Ada during other DOD projects.

Since college, I've never touched it again.


It hasn't been a requirement for a long time, though it is used in some DOD projects still.

Last I heard, the first major project that had blanket allowance to not use Ada was F-35...

... And I honestly don't think it benefitted from that exception ;)


That's still 20 years, and the mandate wasn't terribly successful. Lots of systems from the late 80s and 90s were developed in non-Ada languages, including new development (not just continuations of older systems). Now, whole platforms, that was probably harder to move away from the mandate until later. But components could be and were developed in other languages.

I actually remember when Ada was in development - the US Air Force decided to use Jovial in the meantime (for the F-16, in this case). Someone who worked at Lockheed told me there were still Jovial codebases in use as of the past couple of years. Ada was much better than Jovial but had a fairly long design time. I got out of the DoD world by the mid-80s, so I never came across either again.

Jovial is, indeed, still used, and almost no one knows it. The old-timers who did have mostly retired, and the young folks don't grok it. It's leading to a lot of rewrites that may or may not pan out.

Does anyone use Ada for general purpose programming, like for where one might otherwise use Python, Go, nodejs, or something like that? Is it used for any low-reliability areas?

I don't know Python, Go, or nodejs, but with regard to general purpose programming, you can look at some of the free tools and libraries on AdaIC (https://www.adaic.org/ada-resources/tools-libraries/). There is a man named Gautier, who actively develops a variety of tools (e.g. 3D gfx, Excel file producer, games, code editors, etc). Check out his blog page: https://gautiersblog.blogspot.com/search/label/Ada

In addition, I know for a fact that Rapita Systems (https://www.rapitasystems.com/) uses Ada a lot in their products.


I did back when I had time for pet projects. It's a really nicely designed language where things generally make sense together yet remain clearly separated, once you get over the initial hump.

Granted, it's still a low-level language so I would use it for things close to the system or where performance and/or resource usage matters, and not quick prototyping. So think more along the lines of "alternative to C, Go, Rust, D, Java, C#" and not "alternative to Python, ECMAScript, Perl, Clojure, F#".


I understand that PL/pgSQL, the SQL procedural language of Postgres, is based on Ada.

I quite like it.


I've seen two huge webpages built with PL/PgSQL and I've never after or before seen anything that would come close in terms of functionality, speed, stability, flexibility and safety.

It's apparently derived from Oracle's PL/SQL, which as mentioned somewhere else in the comments is directly using bits of Ada compiler technology.

Ada Web Application looks nice: https://www.electronicdesign.com/technologies/embedded-revol...

I have not had a chance to use it yet, but I would like to when the opportunity arises.


For example this project: https://github.com/ghdl/ghdl

That's certainly one way to minimize the amount of code in the system, making it easier to review and presumably certify.

I feel that with more common languages, it is so easy to grab some open-source library and use it without any code review whatsoever. Even if you prohibit third-party code, there might still be some copy-paste from open source projects and even Stack Overflow.

Mandating that everything is written in Ada language neatly prevents all this stuff.



I would imagine it would be even more helpful if there were an open-source library with a formal specification a la seL4, but I'm probably hoping for too much.

Airbus seems to use Ada mainly for drone software. Everything else is MISRA C, or MISRA C with a layer of abstraction overlaid that allows assembly to be generated from the same codebase as well.

The language used is less important than the development process as a whole, including static analysis tool used to verify the code.

I believe Airbus uses MISRA C verified with Astrée (abstract-interpretation static analysis). It's a really good tool. https://www.absint.com/astree/index.htm


The first real programming task of my apprenticeship when I was 17 was porting a Boeing Ada avionics project to C. Looking back years later that just feels wrong.

Yes, that is a big step backward. One detail people don't realize is that the Ada code probably had runtime checks inserted by the compiler to help catch issues. In addition, the Ada code most likely restricted the set of input values, or prevented accidental mixing of different numerical values, using its strong type system. When you convert to C, you lose all of that. If there is a code change down the road, you could hit an error that would have been caught by the Ada compiler at compile time, or failing that, by an Ada runtime check. Yes, testing has to be really good to minimize these cases, but we all know it's difficult and very time-consuming to develop tests for every possible combination. And when you do code reviews, you have to worry about a much wider range of possible bad values being passed around, which can screw up the results.
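To make this concrete, here is a minimal Ada sketch (the type names and ranges are made up for illustration) of the two mechanisms described above: distinct numeric types that cannot be mixed implicitly, and range constraints the compiler enforces with runtime checks:

```ada
procedure Flight_Demo is
   --  Two distinct integer types with declared ranges.
   --  Assigning one to the other without an explicit
   --  conversion is a compile-time error.
   type Altitude_Feet  is range 0 .. 60_000;
   type Airspeed_Knots is range 0 .. 1_200;

   Alt : Altitude_Feet  := 35_000;
   Spd : Airspeed_Knots := 450;
begin
   --  Alt := Spd;        --  rejected at compile time: type mismatch
   Alt := Alt + 30_000;   --  compiles, but raises Constraint_Error at
                          --  run time: 65_000 exceeds the declared range
end Flight_Demo;
```

A hand translation to C would typically flatten both types to `int`, so the mismatched assignment compiles silently and the out-of-range value wraps into the rest of the computation unnoticed.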

To my shame I had never tried Ada before, but after a cursory look I find the syntax similar to Oracle's PL/SQL, which evokes many bad memories. I am sure I will be able to enjoy the language once I get past this barrier.

Can we also get a Rational R1000 to work with? ;-)

Looked impressive.


Pascal < Modula < Ada

Only the first ISO Pascal; ISO Extended Pascal is more in line with Modula-2.

Then there are all the offspring of UCSD Pascal and Object Pascal, and the same applies to Modula, with Modula-2+ and Modula-3.


I was a CS student at York University shortly after Wirth had a residency there in the 1970s. The Pascal of that time informed Modula, but Modula was more of an influence on Ada, since it was still a candidate language model before Ichbiah's design won out. The whole Steelman process was the backdrop to my comparative programming languages courses.


