There's a lot of FUD spread online, partly derived from historic facts, partly totally bogus.
I like how Luke A. Guest (https://github.com/Lucretia) put it:
"You’ve got to love it when people who [know] nothing about Ada like to tell other people, who know nothing about Ada, things about Ada that they probably heard from someone else, who knew nothing about Ada." -- https://users.rust-lang.org/t/if-ada-is-already-very-safe-wh...
That also applies to lots and lots of comments on this site!
My personal reasons as someone who has learned the language quite intensively but ultimately decided not to use it: (i) Vendor lock-in and too much future dependence on Adacore (other commercial Ada compilers do not count as alternatives to me because they are super-expensive), (ii) I can't always use the MGPL and would prefer MIT licenses of important Ada packages, (iii) the user community is split in a weird way between very professional aerospace and high integrity systems engineers and fairly dilettante hobbyists, but not many users in the space in-between those extremes, and (iv) not enough libraries for mundane tasks/operating system integration/GUIs.
I'm currently using Go. Although I would prefer Ada as a language, (iv) is in the end decisive for my tasks. If I used Ada I'd spend half of my time reinventing the wheel or interfacing with C libraries. I'm hoping to find a use for it in the future, though.
As someone who is going all in with Ada, every time I have to reinvent the wheel I plan on releasing it as an MIT-licensed library on GitHub. Hopefully, if enough Ada programmers do that, we won't have to worry about it as much.
Ada was standard in aerospace and defence projects in the UK when I started in s/w back in the '80s, although I personally never worked in those areas. It may be that its perceived lack of popularity is tied to its association with those rather more secretive lines of work, although that doesn't in and of itself explain why it didn't become more broadly used. Other commenters have mentioned cost, and that consideration was enough to kill another technically excellent language (Smalltalk).
My gut perception of Ada is unfortunately mediated through the murky lens of its bastard offspring PL/SQL, which is by a good distance the least favourite of any language I have ever used, although I'd be willing to entertain the argument that this is in large part due to all the ugly and ill-considered bits nailed onto it by Larry's mob rather than inherent defects in the parent language itself.
On the other hand I really quite like programming in Postgres's version of PL/SQL which I find to be pleasantly consistent and quite easy to understand.
Have only recently started with Postgres, but have been really impressed with the whole product and yes totally agree - the language seems to be designed by people who actually understand the DML tasks that programmers want to perform.
1. Being good is not enough to be popular. In most cases, to be popular a technology needs to be actively promoted by a big brand (or at least a non-profit foundation). E.g., Java is popular thanks to Sun, and Go thanks to Google. It also helps to have free or cheap tools.
2. This problem also has another side: C is much more popular than the language itself warrants. I see the following reasons for that:
1. Unix (which has been popular in academia since the '70s-'80s) and later GCC (it was hard to compete with a free compiler at a time when most others required an expensive license).
2. Microsoft designated C and C++ as "official" languages for Windows: the MS-provided IDE, Visual C++, supported only C/C++ [1], and the official documentation implies that everybody uses C or C++ to create Windows apps.
[1] Visual Studio later added .Net support, but this did not reduce C/C++'s popularity, because .Net competes mostly with Java.
There are a few; if you head over to comp.lang.ada you can find threads on the issue by people who were involved at the time. As I understand it, though, there are four or five points:
1. Ada was designed and specified completely before any implementations were extant; it used then-bleeding-edge theory and integrated several big concepts/features. This led to the very first 'implementations' being either incomplete or pretty bad performance-wise.
2. The backlash among DoD contractors' programmers: "Don't tell us what language to use!"
3. The rise of C's popularity; I believe this had massive consequences, ultimately setting back the field of computer science by decades. [Take a look at Multics, VMS, and the Burroughs... then realize how many of their features have been added to 'popular' OSes in the last 10–15 years.]
4. Misunderstanding the compiler's mindset: a lot of programmers take the view that they need to "shut the compiler up" rather than seeing the compiler as helping them out by finding problems.
5. Misunderstanding "mandatory checks" — a lot of programmers are used to C and its descendants' "simple" nature and really don't understand how the type system can be leveraged. A good example here is the sequence F(F(F(X))): if F takes Natural and returns Natural, then only one range check is needed for this whole sequence, on the innermost call's argument... and if X is itself declared as Natural, even that check can be optimized away.
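To make point 5 concrete, here's a minimal Ada sketch (the names `F`, `X`, and `Y` are illustrative, not from any real codebase): because `F`'s parameter and result are both `Natural`, the result of one call already satisfies the precondition of the next, so the compiler only has to guard the outermost input.

```ada
--  Hypothetical demo: chained calls through a constrained subtype
--  need at most one range check, at the point where an
--  unconstrained value first enters the chain.
procedure Checks_Demo is

   function F (N : Natural) return Natural is
   begin
      return N / 2;  --  result provably stays within Natural
   end F;

   X : Integer := 40;  --  plain Integer: may be negative
   Y : Natural := 40;  --  already constrained to Natural
   R : Natural;

begin
   --  One check suffices here: that X >= 0 on the innermost call.
   --  Every intermediate result is already known to be Natural.
   R := F (F (F (X)));

   --  Here even that check can be elided: Y is Natural by declaration.
   R := F (F (F (Y)));
end Checks_Demo;
```

So the "mandatory" checks are mandatory only where the compiler cannot prove them redundant; in well-typed code most of them vanish.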