Why is Ada not adopted more broadly?
I think it had a (smallish) second chance in 2001, when GNAT was put in the main GCC tree. By then plenty of people were disillusioned with C and C++, and I think it could have gained momentum in a way similar to how Rust has (by appealing to programmers' pride in producing less-buggy programs).
But the GNU Ada maintainer was also the person running Ada Core Technologies (which sold the proprietary version of GNAT), and wasn't at all interested in making sure the free version worked out of the box on Linux distributions, or in promoting Ada as a language for writing free software.
Ada asks you to compare four versions of GNAT and prominently displays "request pricing" buttons. Rust gives you a compiler, a package manager, some IDEs with Rust integrations, and a link to the Rust Book.
In point of fact, Ada has quite a bit of historical baggage/inertia as a proprietary language sold to well-heeled defense contractors, while Rust has put a great deal of effort toward community engagement and onboarding. Given that Rust now has a "cool" factor that Ada lacks, it's not surprising that people in the "market" for features both languages provide would gravitate to Rust (or other less popular but more engaging systems language communities).
That’s about the only nice thing I personally have to say about Ada so I’ll stop here.
Ada is certified by ISO for use in automotive control systems. Rust isn't.
So I'm not sure how Rust could have even been an option. AFAIK, the options are Ada, MISRA C, and certain Forth flavors. That's it. I'd take Ada over these alternatives every day.
Rust syntax was one of the factors that played against it.
"Securing the Future of Safety and Security of Embedded Software"
Part of "Determine the difficulty and learning curve for others".
So it's about liability. Makes sense.
I would argue that Ada suffered because IDEs weren't a common thing when it had its heyday.
For example, Ada (and VHDL which took after it) is really verbose and a single change to a declaration can ripple all over the place.
"Refactoring" is no big deal today. IDEs chop through it really easily.
Back in 1990, on the other hand, you wanted to strangle a language that made you ripple a domain change of 0..255 to 0..65535 through all the code by hand. EVERYWHERE.
There is a reason programmers are so huffy about "type inference", you know.
Currently popular languages are quite "path dependent"--they needed to be popular before IDEs existed, but still gained benefits when IDEs became commonplace.
Now that IDEs are common, I suspect that some languages like Ada may slowly gain ground over time. There is far less need for a language to be "stupid editor" friendly.
Edit: Change ADA to Ada.
I may or may not be a little crazy about Ada...
Sure, but that was just one example that stuck in my head from using Ada 30+ years ago. I also remember one of my project partners threatening me with bodily harm because I wanted to change a name, and it was going to ripple through his modules.
Perhaps there were better ways to do this in contemporary Ada. However, the fact that people using the language didn't find them says something, no?
A whole class of us absolutely LOATHED Ada because of this kind of stuff. An entire engineering class learned to reach for FORTRAN (unsurprising as it was just beginning its decline) and C (a little surprising as it really wasn't a juggernaut yet) instead of Ada. That says something about Ada, and it isn't good.
Sure, we weren't geniuses, but we weren't stupid. If Ada had been useful help to our classes, we would have used it. The HP-28/HP-48 came out in a similar timeframe and EVERYBODY in our class jumped on those in spite of them being $400+ (1988-1990 money) and having to use RPN--being able to chew through matrices on a calculator was huge.
Maybe modern Ada doesn't suffer from these kinds of problems (or, at least has decent IDE support to deal with it), but it certainly pertains to the path dependence about why Ada isn't popular.
I'm not surprised an engineering class tended away from Ada. It's designed to ensure robust software is produced more than it is anything else. It tends to force you to write down far more of your mental model than other languages do, and it will hold you to it. While I find this very helpful in ensuring my programs actually do what I intended, I think it also incurs some up-front costs. It's harder to just start writing anything and then slowly twist it into a solution. It also takes a little more time before you run it for the first time. The certainty that it actually works at that point is what makes it all worth it.
It's a bunch of trade-offs ill-suited for a large number of simple programs.
Emotionally I think it's also a bit of a harder sell because of it. I spend far more time trying to get it to compile than you do in other languages. Particularly if you are in a rush, it can feel worse. You don't even have an executable yet - and it's the damned language/compiler that won't let you make one! Never mind that the reason it's stopping you is that the theoretical executable wouldn't work properly. I can't decide if it is actually a matter of delayed gratification, or if it's merely very similar. But either way, I think that's one of the adoption issues I haven't seen talked about much.
Sounds like a psycho to me. If you change the name of a variable in any language it will ripple through other modules.
With Ada, in his modules, he could've done this:
Old_Name : Type_Name renames New_Name;
I kind of disagree. While it seems to me that languages now tend to want to support tooling, the trend of tolerating dependence on the IDE instead of improving the ergonomics of the language alone peaked sometime around the height of Java's relative industrial popularity. (I'm not saying Java is some kind of extreme example of IDE dependence; Java itself has focused a lot on its own ergonomics since that time.)
OTOH, I don't think Ada is so unergonomic that, other than in a case of pathologically bad design to start with, you'd have to manually change integer ranges everywhere; I'm fairly certain that it supported type aliases for things like that, and it would have been idiomatic at the height of its popularity (such as it was) to use them, so that you'd only have to make that change in one place, not everywhere.
If you're using hard coded constants instead of actual constants, you deserve the pain of changing it everywhere.
Something_Lower : constant := 0;
Something_Higher : constant := 255;
type Something is range Something_Lower .. Something_Higher;
Now, if you do that in C, you've got hardcoded magic numbers all over the place, and the compiler would have just silently compiled without warning. Have fun debugging that mess.
type Something is range 0 .. 255;
There is no reason to hard code a magic constant for the upper end of the range anywhere.
Changing

type Domain is range 0 .. 255;

to

type Domain is range 0 .. 65535;

is quite easy.
Eventually that experience was moved into UNIX workstations, like everyone else trying to capitalise on their market.
The "I don't need IDE" culture is more related to languages born in UNIX culture.
Productivity, and the general labour intensity.
Formal verification is not critical for the lion's share of all software, and people obviously would like to save man-hours by not dealing with it.
Second is the availability of developers. Not many people even know what formal verification actually is, let alone see the rationale for learning verification-driven development.
Ada is a very productive language for embedded systems, much more so than C. But people are afraid of the language because they think it's difficult. People don't want to learn it because they think it's old, although Ada 2012 is a much more modern language than C11. It hasn't stagnated as much, but it also evolved more deliberately than C++, keeping its focus on useful features instead of adding all the features. At the same time, it makes a lot of things easier in an embedded context, because it was designed exactly for this use case.
The only issues are the lack of developers and the ugly syntax, but I think any C developer could learn what they need to be productive in a month, and the syntax you can get used to.
Not only that, but you can also opt out at compile-time if you need, for performance reasons.
You also have (optional) runtime bound-checking for scalar values (aka "integers"), and you can even specify the bounds yourself (like "A number between 3 and 17").
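For comparison, here is a minimal, hedged sketch of the same idea in Rust (the type and function names are made up for illustration): Ada expresses "a number between 3 and 17" directly as a range type, while Rust approximates it with a checked newtype.

```rust
// Hypothetical sketch: approximating Ada's `type Small is range 3 .. 17;`
// with a checked wrapper type. The check lives in one constructor rather
// than being enforced by the language everywhere.
#[derive(Debug, Clone, Copy, PartialEq)]
pub struct Small(i32);

impl Small {
    pub const MIN: i32 = 3;
    pub const MAX: i32 = 17;

    // Returns None instead of raising Ada's Constraint_Error.
    pub fn new(n: i32) -> Option<Small> {
        if (Self::MIN..=Self::MAX).contains(&n) {
            Some(Small(n))
        } else {
            None
        }
    }

    pub fn get(self) -> i32 {
        self.0
    }
}

fn main() {
    println!("{:?}", Small::new(10)); // in range: Some(Small(10))
    println!("{:?}", Small::new(2)); // out of range: None
}
```

The difference in friction is the point: in Ada the bounds are part of the type declaration itself, and the runtime check comes for free.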
The parts of the syntax I'm very familiar with get treated by my brain just the same. It's not "busier" in my head because words are written out. But things I'm not so familiar with I can basically just read. And anything I'm really unfamiliar with is easy enough to search for.
Syntax stuff falls out of my head pretty quickly if I'm not using it, so I find this particularly useful.
BASIC is a word, and it means Beginner's All-purpose Symbolic Instruction Code.
I don't know why you think it's impossible to have an acronym spell a name?
That it is a person's name is orthogonal to whether it could be an acronym.
The usual alternatives are to statically allocate all your memory, to convince yourself it's safe to use Unchecked_Deallocation, or to use a GC/refcounting library. Static allocation is usually not a problem for special-purpose embedded control applications: e.g., your airplane has two wings and one set of landing gear and its position is described by three coordinates, and you'll never need more of those. You probably want to avoid malloc anyway for both predictable performance and not having to worry about whether your allocation will succeed. But if you're writing, say, a CSS engine or a grep replacement or a terminal emulator, that approach won't work.
GC works for those use cases, but there are a number of excellent memory-safe GC'd languages already in wide use for many things, like Python and Java. You can get very good steady-state performance with Java, sometimes higher than equivalently-complex C, but it's still not a language you'd want to implement loadable libraries or OS kernels in.
Rust solves the problem of making writing high-performance, C-drop-in-replacement code feel much like (definitely not exactly like :) ) working in a higher-level language like Python or Java, including the memory-safety parts of that experience. Ada addresses memory safety, but for most of the use cases people are excited about Rust for, they want the combination of those features. For things like flight control software, picking something extremely mature (usually the companies in these press releases have existing Ada code from previous projects dating back decades) is valuable and having safe dynamic allocation isn't really a requirement.
I feel like one of the attractions to Ada for this sort of use case is its maturity, and "here's a proposal for a complicated feature, inspired by Rust, a language which is young and rapidly evolving and in particular switched to a new borrow checker implementation less than two years ago and shipped it with known unsoundnesses and is working on a new borrow checker implementation as we speak" may not be attractive to Ada's usual users. :) (To be clear, I'm fine with the risk profile of Rust's borrow checker, but in part that's because I'm fine with the risk profile of using the latest stable Rust and picking up any fixes every six weeks. I'm fine with a more powerful feature that lets me develop code quickly and I'm fine with upgrading my compiler every six weeks to pick up any bug fixes; I imagine that people writing airplane control software are probably going to prefer something less powerful that hasn't changed in decades.)
Honestly, I think one of the questions here is community. I don't really know who the Ada community is, beyond press releases from Adacore about planes and trains. I suspect that community simply does not care about safe dynamic allocation, even if it were available as a supported option, so they aren't going to be excited about testing out this feature. Is there a user community who would use this, in production (doesn't have to be a company, the "serious hobbyist" open source dev is fine and probably even preferable here), within the next five years or so? Are there folks writing stuff like high-performance grep replacements in Ada or hardened HTML 5 implementations in Ada?
I'd love to see (or help with!) such a comparison, but I get the sense that Ada's and Rust's use cases are so different that writing a fair apples-to-apples comparison is hard and even finding someone to do it would be hard.
> I imagine that people writing airplane control software are probably going to prefer something less powerful that hasn't changed in decades
True. Depending on the design assurance level, a certified compiler is required.
> I suspect that community simply does not care about safe dynamic allocation
See the reference in my former post. Whether and when the certification authorities accept it is yet another question.
> Is there a user community who would use this, in production ...
It's easy to find projects on Github where dynamic allocation is used; e.g. https://github.com/ghdl/ghdl; with the current developments, such applications may also have a chance for formal verification in the future. And no doubt it would help if the Rust compiler was formally verified as well.
As far as I know, the only usable formally verified compiler (for a serious programming language) is the CompCert C compiler. Even C and Ada compilers intended for use with critical systems don't tend to be formally verified.
Of course, it would still be great if the compiler for Rust (or Ada, or any other language for that matter) could be formally verified.
It's unfortunate that CompCert isn't Free Software.
I think that's true for software developed under similar processes, but (for instance) if you're comparing software that has unit tests and software that doesn't, the constants are probably wildly different. Your tests (or even the process of thinking about writing testable code) will discover a number of errors but will hopefully drive the number of actual errors down.
To be clear, I'm not making a statement on whether Rust or Ada is better-tested or better-developed, but I will note that Rust goes out of its way to test compiler changes against real-world code (e.g., Crater runs that rebuild the public crate ecosystem to check for regressions).
I'm also curious about these bugs - I'm aware of bugs in the standard library (which is developed in the same repo as the compiler), but they only undermine memory safety against hostile source code, and hostile source code has lots of other options (unsafe, /proc/self/mem, ptrace, ...), i.e., the Rust compiler is not a sandbox. I'm not aware of severe bugs in the compiler, and I'm not aware of standard library bugs against hostile user input by well-intentioned source code - e.g., there shouldn't be things like the 2001 sudo "vudo" exploit.
Of course, this is not a law of nature, but simply a reasonably plausible heuristic, which somewhat justifies my reluctance toward a new technology that was received with much enthusiasm.
CompCert is an extremely impressive achievement, and compiles a near-complete subset of standard C. A verified Rust compiler would be an enormous undertaking, and would need a very skilled team. It also wouldn't be worth trying unless the Rust language were very stable, which, if I understand correctly, it isn't.
Rust added a new borrow-checker implementation, "non-lexical lifetimes" (NLL) in December 2018, as part of introducing the 2018 "edition" (a compile-time flag that says what compatibility level you're targeting, roughly analogous to --std=c99). In 2019, they backported it to the original 2015 edition and removed the original implementation, "AST borrowck". As the names imply, NLL has the ability to handle more complex patterns without assuming borrows are live for the entire lexical scope just because it's accessed somewhere in a pair of braces; this turns out to be very useful in practice, since a lot of natural patterns (including those written by beginners not trying to do anything complicated) trip up the older implementation. However, NLL initially accepted some code that was invalid, and it was shipped anyway, and at least a couple of production Rust users ended up writing things incorrectly accepted by NLL. See https://lkml.org/lkml/2020/8/23/214 for my summary of what happened with some links. (I don't think this caused any problems in practice; my sense is that it was undefined behavior, which shouldn't be possible in safe Rust, but the compiler almost certainly chose a behavior that happened to match the intended semantics.) And there's work in progress on a new borrow-checker implementation called "Polonius."
All this is to say that, if you wrote a paper in 2018 saying that Rust's (AST-based) borrow-checker feature is great and Ada should pick it up, you're still behind Rust because there's a new one. If you wrote a paper in 2019 and based your work on NLL, you stood a nonzero chance of picking up that bug. If you write one today, maybe you should base your work on Polonius, but Polonius will itself probably take a fair bit of real-world use to shake out its own bugs.
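To make the NLL change concrete, here's a sketch of the classic kind of code (not taken from any of the papers discussed) that the old AST borrow checker rejected but NLL accepts, because NLL ends a borrow at its last use rather than at the end of the lexical scope:

```rust
fn main() {
    let mut v = vec![1, 2, 3];
    let first = &v[0]; // shared borrow of `v`
    println!("first = {}", first); // last use of `first`; borrow ends here under NLL
    v.push(4); // OK under NLL; the 2015 AST borrowck rejected this,
               // since it considered `first` borrowed to end of scope
    println!("{:?}", v);
}
```

The borrow of `first` is dead by the time `push` needs a mutable borrow, so NLL correctly accepts it; the lexical checker could not see that.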
> It's easy to find projects on Github where dynamic allocation is used; e.g. https://github.com/ghdl/ghdl
Thanks, that's useful!
Though (and maybe this is my unfamiliarity with Ada), from some quick looking around to see where it dynamically allocates, I find https://github.com/ghdl/ghdl/blob/master/src/synth/synth-hea... , which seems to bind the C malloc function and intentionally not bind free and leak all memory, am I reading that right? That's definitely safe but suboptimal.
Rust seems to be extremely focused on memory safety. Meanwhile Ada is much more focused on correctness in general. Restricting the discussion to simply memory safety ignores, well, most of anything that might be programmed.
Ada also has more memory safety options than people tend to talk about. As others have mentioned, it's possible to allocate quite a bit on the stack. Other rules also prevent some reference issues.
One thing I rarely see mentioned is memory pools and subpools. It is possible to ensure whole types are allocated in a particular subpool, and deallocation can be left for when the subpool falls out of scope.
- Nullable types (Option<T>), maybe-error results (Result<T, E>) etc. that force you to check whether you're looking at the right variant. You can't have bugs where you fail to check the error value and then use invalid/meaningless data, because you syntactically cannot express access to the data. This does often manifest as avoiding NULL pointer dereferences, which is technically a form of memory unsafety, but it's really preventing a logic bug. (For instance, this would have made the "goto fail" bug unnatural to write; that one was returning err = 0 as if it were an error, which you can't really do in the Result model.)
- More generally, sum types / tagged enums and the match and if let constructs, which enforce at compile time that you're accessing data from the right variant.
- Locked data that enforces that you hold the lock when you access the data, using similar means. You can enforce that there's no direct access to the data except through the lock wrapper, and more importantly, you can enforce that the lock stays locked as long as someone holds a direct reference to the data (i.e., that you can't leak a reference to that data and use it after you've released the lock).
- Safe threading/concurrency via labeling which types can be moved between or shared across threads. Again, this is technically a form of memory unsafety (data races) but it's more about logic.
- Typestate. You can avoid bugs where you use some logical resource when it's in the wrong state (read/write a closed file handle, send protocol data while you're still handshaking, etc.) if your API consumes ownership of the object in one state and returns the object in another state, and you don't have methods on objects of the wrong state. Rust pre-1.0 had typestate as a language-level feature, but it's straightforward to implement using the typesystem and the ownership model.
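As a minimal sketch of the first two points in the list above (the function is made up for illustration): with `Result` and `match`, the success value is syntactically unreachable until the error case has been dealt with.

```rust
// Hypothetical helper: parsing either succeeds with a port number or
// fails with a message; there is no way to get the u16 without going
// through the Ok arm.
fn parse_port(s: &str) -> Result<u16, String> {
    s.parse::<u16>()
        .map_err(|e| format!("bad port {:?}: {}", s, e))
}

fn main() {
    // `match` is exhaustive: forgetting the Err arm is a compile error.
    match parse_port("8080") {
        Ok(p) => println!("port {}", p),
        Err(e) => println!("error: {}", e),
    }
    match parse_port("nope") {
        Ok(p) => println!("port {}", p),
        Err(e) => println!("error: {}", e),
    }
}
```

Contrast this with an `int`-returning C function, where nothing stops you from using the return value without checking `errno`.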
Really, I would say the core feature of Rust is a richer type system and the system of ownership and shared/mutable references. The most obvious thing to do with it is to prevent buffer overflows and use-after-frees, but it's not the only thing. (And that's also why I mentioned safe dynamic allocations, which rely heavily on the ownership system; without it, it's still pretty easy to prevent buffer overflows for static allocations.)
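A small sketch of the lock-wrapper idea, using the standard library's `Mutex`: the data is only reachable through the guard, and the borrow checker rejects any attempt to keep a reference to the data after the guard (and thus the lock) is gone.

```rust
use std::sync::Mutex;

fn main() {
    // The Vec is owned by the Mutex; there is no other path to it.
    let data = Mutex::new(vec![1, 2, 3]);
    {
        // The lock is held exactly as long as `guard` lives.
        let mut guard = data.lock().unwrap();
        guard.push(4);
        // Handing out a reference into `guard` that outlives this
        // block would be a compile error, not a runtime race.
    } // guard dropped here: lock released
    println!("{}", data.lock().unwrap().len()); // prints 4
}
```

This is the "you can't leak a reference past the unlock" property described above, enforced at compile time.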
I don't have a sense of where Ada lines up on these. I know it has a much richer type system than C too, so it's possible it answers many of the goals besides safe dynamic allocations. I just personally see the benefit of Rust as more than just memory safety.
Off the cuff, I'd say Ada addresses all of those problems. In case you're interested, I'll leave a few terms that might be useful in delving further.
- Nullable types + sum types. Pointers can be declared "not null" which basically works as it says. Sum types exist as discriminated and variant records. Variants can be immutable or mutable, and obviously work to do the typical Option, Result, etc stuff.
- Locked data. Protected types do this. I imagine details might differ, but I know nothing about how Rust does this and haven't messed with them much or at all in Ada.
- Task types. I'm tempted to say more, but the little details matter so much for this kind of thing so I probably shouldn't. But Ada has tasking and concurrency mechanisms built-in to the language.
- Typestate. Compiler errors for this sort of thing are not immediately obvious to me. I can think of several options to try using even just the features I already mentioned, but I'm not sure what would or would not be detectable at compile-time. Catching incorrect usage and preventing it from causing trouble is more trivial, but just not the same.
The simplest and (in my opinion) most important feature in Ada is the ability to trivially create new types. So many languages require manually wrapping types. Better ones start to provide more magic to help automate it. But fundamentally none of them make user-defined types as a core component of the language. In Ada to create a new type it's just one simple line, and you automatically get full functionality of the base type while retaining incompatibility. The rest of the language and even the standard library is all designed around the idea of allowing the programmer to freely use these types.
You end up capturing so much more of the model that the program is using, and the compiler is able to check and ensure that the code is actually sensible according to that model. I find even the process of writing the types down helps expose flaws in the design I had in my head. Then once I start using it, it does a great job of catching when I start deviating from how I said everything should work.
Pretty much every other language I've ever looked at makes doing that sort of thing extremely painful. There is too much friction involved in detailing all of that, so nobody does. Then libraries are designed without it, which just increases the friction. Ada meanwhile makes it so easy to do that you feel bad not doing it, and you don't even gain much more than saving a handful of keypresses if you avoid it anyway.
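To illustrate the friction being described, here is a rough Rust analogue (names hypothetical, and only a sketch): where Ada derives a distinct numeric type in one line, such as `type Meters is new Float;`, a wrapper type elsewhere needs manual plumbing before it behaves like the base type.

```rust
// A distinct unit type: incompatible with plain f64, but it must
// re-implement each operation it wants from the base type by hand.
use std::ops::Add;

#[derive(Debug, Clone, Copy, PartialEq)]
struct Meters(f64);

impl Add for Meters {
    type Output = Meters;
    fn add(self, rhs: Meters) -> Meters {
        Meters(self.0 + rhs.0)
    }
}

fn main() {
    let total = Meters(1.5) + Meters(2.0);
    println!("{:?}", total); // Meters(3.5)
    // `Meters(1.5) + 2.0` would not compile: the types stay distinct,
    // which is the safety property both languages are after.
}
```

The safety payoff is the same in both languages; the difference is how much ceremony it takes to get there, which affects whether people actually do it.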
So basically you need to already be wealthy just to dip into it (buy an AdaCore licence and a supported, expensive aero-grade MCU test kit).
I stopped messing with it 5 years ago.
I think it's an ok language but its Pascal roots show. In an alternate universe where Pascal won instead of C (which it nearly did!) it might have ended up where C++ is.
Pascal being the choice for Systems programming instead of C nearly happened. One of the first things Ken Thompson and Bill Joy wrote for BSD was Pascal, and the Apple Lisa system was mostly written in Pascal.
Also forgotten is how many early PC programs were written in assembly.
C compilers for the Mac came along later, and even Apple A/UX, a Unix System V with BSD extensions, and GCC. A/UX ran pretty nicely in 8M on an SE/30 (512x342 mono CRT), with a classic Mac OS GUI environment you could run xterms on.
Note, however, that he wrote this in 1981; these days, when someone speaks of Pascal, they usually speak of a dialect based on Turbo or Object Pascal, which appeared in 1983 at the earliest and address many of Kernighan's concerns.
tl;dr the Pascal of the 1970s was a lot rougher than the Pascal you probably know.
But that said, it's strange that Kernighan wrote the above article ("Why Pascal Is Not...") in 1981, i.e. three years after the introduction of UCSD Pascal and Modula-2.
Does anybody here know the background of that article? Why was Kernighan feeling the need to defend C against a 13-year old language which was already not in use anymore in its original form at the time the article was written? Was C under attack and risking to lose its popularity?
I think that worked against it as well.
That was one of the problems Ada addressed. But its initial problem was the cost of compilation and builds on the hardware of the time. That got sorted out by the end of the 80's, but then other factors like contractors not wanting their customer telling them how to do their work came into play. Sort of a confusion of requirements and implementation methods.
EDIT: Oh and the price tag.
I won't say I've never run into some kind of problem with Ada on gcc (aka FSF GNAT), but it's been pretty minimal and easily avoided. I think it's been a grand total of two or three bugs that I've known of in that time. I've also run into known gcc bugs with C++ before so the situation doesn't seem particularly bad to me.
I find it hard to believe a solo or duo dev shop has a real need for a support contract for their compiler. What language can you get a support contract for that is affordable by anyone?
When saying "you need the support contract" I mean getting the commercial compiler version.
I think (theoretically?) AdaCore does not charge you for the compiler, but for support. And you only get access to the commercial compiler and libs if you have a support contract, no?
I admit I might be one of those who does not completely understand the legal circus around GNAT, AdaCore and the FSF version.
You can get commercial support from IAR for the C compiler and RTOS, which is next to useless ;-).
I do wish AdaCore didn't leave the compiler situation so confusing. It's just enough to be a consistent distraction.
Basically they all come from the same source. The GPL version gets a yearly release that, afaik, is basically the commercial one (presumably not including anything customer-specific). FSF then has a bit more of an ambiguous relationship with the GPL version, but AdaCore maintains that too. Around 2018, I think, more effort was being put into eliminating any remaining differences between the GPL and FSF versions.
They also have some further libraries that are available. At least some (maybe all? I don't know) are available under the GPL too. I have no idea how the pricing on that stuff goes. That strays from the compiler side of things though.
Ok, so i should give FSF GNAT a try then. I always tried AdaCore's package. It's unfortunate that comp.lang.ada isn't available anymore. I think there were quite some not so fun bugs reported.
Anyhow, I had a look at https://www.adacore.com/gnatpro/comparison. What strikes me as odd is that the Community Edition only supports x86 Windows/Linux/Mac. No ARM Linux (say, a Raspberry Pi)? That sure is not what it looks like at first, no?
I really don't want to pester you, but did you ever use Ada in a commercial setting and can you share your experience if so? I'm generally interested in Ada (i have a look at it every year or so), but the consensus on comp.lang.ada always was "for commercial usage you really should buy a commercial license (no matter the project, no matter from which company)".
Which more or less always meant AdaCore (some were suggesting Janus Ada, because of the moderate pricing) if you want an Ada compiler which supports the latest standard. Also, because AdaCore seems to be top dog in this area, with other vendors differing wildly in quality of compiler and libs.
I don't know if any of this is true, but those were the vibes from comp.lang.ada i received.
Personally speaking with my own experiences and what I saw on #ada I'd be comfortable going without a commercial license. I've only ever seen problems with somewhat odd situations, and I'm pretty sure all of them produced a bug box (compiler error message) so at least I was aware of the situation. And Ada often has multiple reasonable ways of doing something, so even if the odd/clever thing causes an error it's unlikely to be a blocker. Mostly it means I would have to put up with usage that's a little more not to my taste.
Hehe, yes definitely. But considering the projects some of the participants worked on, i can understand why. For those projects you do need a specific mindset.
Well, it did, it just cost $50k/seat.
Are you talking about the eighties? GNAT has been available for free since the nineties, and the GPS IDE has been available for free on all relevant platforms for nearly twenty years; even before that, there were IDE-like alternatives for free.
After all, FSF GNAT was already compatible with Ada 2012 at that time. And a company that makes large investments in software using Ada undoubtedly needs support and will not shy away from the comparatively low additional costs for licenses and maintenance contracts in its own interest. But it is important that the technology is absolutely free for small companies and people who are simply interested in the technology.
This sucked for Smalltalk, and it sucks for Ada. I stand by my statement that there was no equivalent to the Turbo family for Ada. The Ada space isn't very different from the Smalltalk space, although I think the Ada story might be worse. Damn shame for such a nice programming language.
1) https://www.adacore.com/community >> https://www.adacore.com/gnatpro/comparison
That's very academic. GPS is available under the GPL (see https://github.com/AdaCore/gps). As long as you don't deploy GPS itself or link your app with it, you can use it for closed-source apps as you like; it's definitely not a GPL violation.
> This sucked for Smalltalk
Not sure why you compare it to Smalltalk; but for the latter you can use Squeak for free (under MIT or Apache license) since the nineties and today even a couple of other powerful Smalltalk implementations such as Pharo.
No, if small commercial developers cannot use the language without paying massive fees, you might as well forget any popularity. Heck, you couldn't even use Ada for non-GPL open source programs.
Not sure why you compare it to Smalltalk
Because Smalltalk vendors have a long history of doing the same pricing that made it impractical to impossible for the little guy to use the language. There have been several threads on HN about this.
Have you read my answer? There is no reason why you couldn't use FSF GNAT or GPS in closed-source projects.
> Heck, you couldn't even use Ada for non-GPL open source programs.
Why not? The GPL GNAT with RLE allows you to use any licence with your application, whether closed or open-source.
> Because Smalltalk vendors have a long history of doing the same pricing that made it impractical to impossible for the little guy to use the language.
There were always free ST versions as far as I can remember. The "little guy" could e.g. use "Little Smalltalk" by Budd which appeared before 1987.
No one could write a commercial program with Little Smalltalk. Love the book, but the implementation is not going to work. Heck, Squeak could not do a program that looked anything like a real Windows or Mac program.
Maybe we're talking past each other but the IDE says the community edition is only authorized for GPL software, every other edition is big bucks. https://www.adacore.com/gnatpro/comparison
> Maybe we're talking past each other but the IDE says the community edition is only authorized for GPL software, every other edition is big bucks.
Do not let this mislead you. The software is available under GPL. That is all you need to know. And GPL allows you to use the software for any purpose. Maybe I should mention that I studied law for two years and then had one more year of lectures in patent and license contract law (see my profile for more information).
But there is Fear, Uncertainty & Doubt.
"Digitalk's Smalltalk/V Macintosh will continue to cost $199.95, but eliminates the previous per-copy run-time royalty fee."
Did Turbo ever have its enterprisey moment?
Check out Frama-C: https://frama-c.com/
If you need to build for bare metal rather than Linux, things get a bit more complicated, as GNAT requires some runtime code to set up things like SysTick and interrupts. You can use AdaCore's non-GPL runtime for non-commercial purposes, cortex-gnat-rts on top of FreeRTOS, or RTEMS, which natively supports Ada (though it's poorly documented).
I've also had some luck using crosstool-ng to build gcc toolchains with Ada support, but you'll still need to find or write a runtime library for bare metal.
(This post is sponsored by the Rust Paganization Strike Force)
Documentation points to [Unchecked_Deallocation](https://docs.adacore.com/live/wave/arm12/html/arm12/arm12-13...). Rust has automatic (as in, you don't even need to call `free()`), safe deallocation of resources. Of course, the syntax, ergonomics, and the modern-looking website and documentation also helped Rust become popular, but it is also technically new (for a mainstream language).
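To make the contrast concrete (a minimal sketch, not from the thread above): in Rust, heap memory is reclaimed automatically when its owning value goes out of scope, whereas Ada's `Unchecked_Deallocation` is a generic you must instantiate and call explicitly.

```rust
fn main() {
    // Heap allocation; `b` owns the memory.
    let b = Box::new(42);
    println!("{}", b); // prints 42
} // `b` goes out of scope here and its heap memory is freed
  // automatically (RAII / the Drop trait) -- no free() equivalent,
  // unlike Ada's explicit Unchecked_Deallocation call.
```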
The Ada specification is huge. You can learn 90% of C with K&R 2.
For C life is easy as you can leave things undefined with the caveat "dragons may fly out of your nose" :)
An alternative example of a small spec language might be Scheme (especially pre-R6RS): https://www.schemers.org/Documents/Standards/R5RS/r5rs.pdf
There's a number of reasons why it's not solid advice, but here's a huge one: you have no context for why the person is asking. Maybe they already know Rust, and want to learn C. It comes off extremely poorly. I downvoted your parent myself, even.
AdaCore were quite knowledgeable as a company, offering supported versions of GNU Ada. I think they're still involved heavily in the development of the free Ada compiler GNAT (part of GCC).
Defense companies liked having vendors to take the risk off them, and I'm not sure exactly how it worked, but some Ada compilers were "validated".
I grew to like Ada (it has some warts, strings for example, but we didn't use them often).
GNAT Wikipedia page.
This is actually quite intriguing: a language that includes the concept of tasks at the language level, rather than trying to hammer a bunch of macros or some such into C to achieve a similar result.
It makes you wonder if Ada really ought to be the "next" system programming language instead of Rust.
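For contrast, a minimal Rust sketch: Rust's concurrency comes from its standard library (`std::thread`) rather than from a dedicated language construct like Ada's `task`.

```rust
use std::thread;

fn main() {
    // Rust spawns threads via a library call, not a language keyword;
    // Ada, by contrast, builds tasks into the language itself.
    let handle = thread::spawn(|| 20 + 22);
    println!("{}", handle.join().unwrap()); // prints 42
}
```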
Nice to see the defense/aerospace sector giving a nod to free software like gnat.
I remember the first lecture introduced a little graphical rocket simulation in Ada and us students would write code to land the rocket. I liked that.
Then I think the semester after this they introduced us to Haskell (GHC, the Glasgow Haskell Compiler). Another good course, but one I wished I'd paid more attention to at the time!
Since college, I've never touched it again.
... And I honestly don't think it benefitted from that exception ;)
In addition, I know for a fact that Rapita Systems (https://www.rapitasystems.com/) uses Ada a lot in their products.
Granted, it's still a low-level language so I would use it for things close to the system or where performance and/or resource usage matters, and not quick prototyping. So think more along the lines of "alternative to C, Go, Rust, D, Java, C#" and not "alternative to Python, ECMAScript, Perl, Clojure, F#".
I quite like it.
I have not had a chance to use it yet, but I would like to when the opportunity arises.
I feel that with more common languages, it is so easy to grab some open-source library and use it without any code review whatsoever. Even if you prohibit third-party code, there might still be some copy-paste from open source projects and even Stack Overflow.
Mandating that everything is written in the Ada language neatly prevents all this stuff.
I believe Airbus uses Astrée (abstract-interpretation static analysis) to verify MISRA C. It's a really good tool. https://www.absint.com/astree/index.htm
Then there are all the offspring of UCSD Pascal and Object Pascal, and the same applies to Modula, with Modula-2+ and Modula-3.