Is Ada a failed language? (medium.com/tomekw)
63 points by todd8 on March 24, 2018 | 138 comments



Is Ada the most hyped language? No. Is it a niche language? Yes. Does that mean it's a failure? No.

I'm sorry, but this blog post is a really superficial treatment of the subject. How languages rise, fall, or command their own little space is a very interesting topic worthy of a better discussion.


Agreed, there is so much more interesting material that could have been written. Ada was once a hyped language, back in the late 80s when the DoD mandated it. It would be interesting to hear the history of Ada's rise and fall.

My CSC sequence (started 1990) was taught in Ada. I was already writing a lot of Pascal so it was an easy switch. I remember feeling no great love for the compiler and tooling, however.


I worked at a NASA contractor around 1990. They had these dedicated Rational workstations with an IDE just for Ada, it made it kind of fun. My recollection is it was better than other IDEs at the time. https://en.wikipedia.org/wiki/Rational_R1000 Today it sounds silly because we can get a portable/configurable IDE for every language under the sun on any platform but back then this was amazing. We did have some process that I can't recall right now to port this to DEC/IBM mainframes/minicomputers.


It may have been "hyped" by some, but it never had any true institutional or corporate penetration. It was always the "military language", with GNAT even being funded by the USAF directly. And in that field it ruled the roost, and it still has strong penetration as a legacy platform.


The problem with Ada isn't that it's niche. The problem is that it has a gazillion features whose purpose is (ostensibly) to make programming errors easy to prevent, identify and correct, but which fail miserably at it. There is more to programming than safety, but if safety is going to be one of your programming language's selling points, then the least you can do is get it right. The process for getting it right consists of the following steps:

(0) Identify a concrete class X of programming errors you want to eradicate.

(1) Postulate a mathematical theory of what it means for a program not to have errors of class X.

(2) Develop proof techniques that are powerful enough to establish that “interesting” programs do not have errors of class X.

(3) Design algorithms that use these proof techniques to prove, or at least validate a purported proof, that a program does not have errors of class X. Your algorithm must be able to analyze program fragments (modules, classes, whatever) independently, and combine information obtained from several program fragments without reanalyzing them from scratch.
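
To make this concrete, here is a toy instance of steps (0) through (3), assuming Ada 2012 contract aspects and a SPARK-style prover such as GNATprove (the package and function names are made up for illustration):

    --  (0) Error class X: division by zero.
    --  (1) "Free of X" means every call satisfies the precondition.
    --  (2)/(3) A contract-based prover can check each unit on its own and
    --          reuse the proved contract at call sites instead of reanalyzing.
    package Checked_Math is
       function Safe_Div (A, B : Natural) return Natural is (A / B)
         with Pre  => B /= 0,
              Post => Safe_Div'Result = A / B;
    end Checked_Math;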


There are many fallacies in your ideas. What do you do about programming errors you cannot identify? And what about programming errors that your process and your language cause? Can you identify those too? Of course not. Once you understand that things you introduce into a language can both help with some problems and cause other problems, we can move on to ideas on how to identify and classify those things. And of course we cannot do that with math proofs.

Programming language design is all about interactions with humans and there is no place for so much ignorance of humans as your comment expresses. And don't get me wrong, you can come up with an arbitrary approach to language design that doesn't involve humans, but it can work for humans only by chance.


> What do you do about programming errors you cannot identify?

A programming language designer can only do so much for you. If you cannot identify errors in your programs on your own, consider switching to a different career.

> And what about programming errors that your process and your language cause?

Rigorous reasoning cannot possibly cause any problems. Rigorous reasoning helps identify and fix problems that, more often than not, were caused by prior non-rigorous reasoning.

> Programming language design is all about interactions with humans

Humans who need to reason rigorously in order to do things right, hence need to be pushed in this direction.


And no one without a PhD in applied mathematics will use it, unless it is ergonomic enough.

Ada's failure in the mainstream, besides the compiler prices and hardware requirements of old systems, comes down to its lack of support for cowboy programming, so loved by users of languages like C, JavaScript, PHP and other similar ones.


It's an empirical fact that people use languages they couldn't have designed themselves. Designing a safe language requires mathematical sophistication. Using a safe language - less so.

As for cowboy programming, Java and C# don't support it all that well, and they are very popular.


You don't need maths or proofs to improve the safety of a programming language. Also, surely it's as much a psychology/workflow/management issue as a Rust-like memory management one.


Last time I checked, psychology doesn't affect how a computer program runs.


It affects the way it is developed: hurriedly, carefully, carelessly, etc. This, in a software team in the real world, is a significant factor in the consequent reliability of the software developed.


This suggests trying “scared that my wife and children will be executed if I do it wrong”.


Errrr.. yeah.... if that's how you think ... maybe you should be thinking of another career


"Also, Ada’s strictness and correctness may be perceived as an anti-feature. When talking about general-purpose programming, businesses need fast development cycles and short delivery times. For the companies, to be successful, it’s far more important to be ahead of the competition even with an incomplete product than to let the customers wait and eventually abandon the project."

That is not necessarily true when it comes to aerospace and defense projects. In these cases it is not important to be ahead and have short delivery times, so here Ada is still suitable.

That the language is a general-purpose programming language does not mean that it is suitable for all application domains. You always need to use your common sense and experience to choose the right tool for your task.


That is not necessarily true when it comes to aerospace and defense projects

They chose to use C++ for the F35, which is buggy as hell, because they thought there were no Ada programmers available. But if you are planning a multi-decade project wouldn't you just... train some yourself?


This points to a general problem (in the US at least): a complete failure of any sense of on-the-job training or internal advancement.

Companies complain when they can't find purple squirrels, even when they could find someone with great general skills in the profession and train them, in about the same amount of time it takes to onboard someone with precisely the skills they want.


It's not just a matter of training.

Let's say you get a job offer from a company with the worst case of NIH syndrome you've ever seen, and all their code is written in a programming language that the company created; the only code in the world written in this language is at this company. Even if the company agreed to train you, would you take the job offer?

Chances are, no, because if you were to ever switch jobs in the future, you want to still be employable, and not have invested years and years into a stack that's completely irrelevant to the rest of the world.

So if the company is of such a literally enormous size and permanent standing as to be able to make you a believable offer of lifetime employment / tenure for learning their untransferable skills, like a military or other government body, then maybe. For a defense contractor which could go belly up after some upper-management scandal? No way. For anybody else in the private sector? No way.

Either way, it's still problematic because your hiring mechanics now revolve around unfirable employees, and the easiest way to demotivate employees is by putting them in an environment where mediocrity is accepted because the perpetrators of the mediocrity can't be fired.

Ada's problem is that it was only a defense-sector language, for reasons specified by others in the thread e.g. compiler cost. Ultimately, targeting tooling to only a specific industry is dooming it to failure; the tooling must stay relevant to many companies and industries both so that people will see it as a career-advancing skill and so that the labor market remains fluid.


Even if the company agreed to train you, would you take the job offer? Chances are, no, because if you were to ever switch jobs in the future, you want to still be employable

A counter-example would be Goldman Sachs, with their own language (Slang) and their own database (SecDB); former Goldmanites have no trouble finding work afterwards.


An interesting point which I will concede. But I doubt the exception disproves the rule for the general case.

For what it's worth, GS was founded in 1869, which makes it older than most of the behemoth systems companies like IBM (1911) or AT&T (1880). Big banks like GS may not be as invincible post-2008 as they used to appear to be, but if there are institutions which you could say "this institution is not going to evaporate in the next few decades which will make up my career", GS would not be a bad bet.

Which brings up an interesting question: how many former Goldmanites a) worked primarily in Slang (given that not all Goldmanite engineers work on SecDB), b) were laid off or otherwise left GS without having a job lined up, and c) then found work? If you can prove that you can work on a different stack and get a job with no frictional unemployment, that is a much different picture compared to engineers who stay out of work (the parallel question being: why is Slang so much more de-facto employable than COBOL or other languages which for one reason or another are now unpopular or unused in the wider industry?).


How do you ensure they stick around? Training has a high cost that isn't worth it if the people jump ship 2 or 3 years later as is the norm in this industry.


Well, it's a reason as good as any other to start paying good salaries AND good raises, instead of just the former. And don't skimp on the benefits.

As a general rule, treat your employees fairly (which includes updating their salaries up to market value, as often as needed) and, lo and behold, they will stay. There is a lot of talk about changing jobs because of a lack of challenge, wanting to try new technologies, etc., but 99% of the time, it's about money (and benefits).

Admittedly, this idea might not be very well received by most companies.


I dunno, nobody I know is actually looking to job hop. If you treat people decently, give them raises, train them, why would they have any incentive to leave?

Hiring somebody new costs 15-25% of the raw salary cost for the year. Pay people 10% more instead, no incentive to leave.


Plus however long it takes to ramp them up, could be 6 months on a reasonably complex codebase. And the time of other engineers both in the hiring process and bringing the new engineer up to speed. I’d say the cost of replacing an engineer could be as much as their entire first year’s comp.


Perhaps if you invest in people, they wouldn't do that? I mean, people job hop because that is the only way to advance, but what if there was another way? Job hunting, interviewing, etc. are a royal pain in the backside. If there was a way to have a career without all that, everyone would be happier, employers included.


People jumping ship 2 or 3 years later is a result of the market correcting itself. There was a point in time when people stayed for 20 years in a company. Why? You could switch, but the pay would be the same.


There is a cultural factor too, many companies seem to have an aversion to promoting from within. It's usually far easier to get promoted by interviewing elsewhere for a higher position than it is to deal with the internal politics of moving up within the same org. But it could be a clear expectation: complete this training, get this experience, move up a rung.


Then your employees aren't commoditized, replaceable sources of programming work who you can hire and fire at the whim of the budget cycle.


Defense is all done by contractors or lifers, far away from the competitive tech hubs. I don't think they have a retention problem, unlike companies in the Valley.

Also, the alternative is C++ and C++ developers are not a replaceable commodity either.


There are plenty of defense contractors near tech hubs. Defense was a big business in the SF Bay Area long before there were software companies there, and it's still big today:

https://en.wikipedia.org/wiki/List_of_companies_based_in_the...

In the NY metro area, there are lots of defense contractors on Long Island.


Yes we are, mostly. There are very few programming jobs for which the laborers aren't essentially replaceable commodities, including programmers who are proficient with C++.


My tiny aerospace knowledge also tells me that there's a ton of cpp tooling that is already known to shops, and they rely on this rather than switching culture to Ada even if it may provide greater value in the mid to long term.


My tiny aerospace knowledge also tells me that there's a ton of cpp tooling that is already known

All that tooling isn't helping produce the software tho' is it?

https://www.theregister.co.uk/2017/01/12/f35_alis_software_d...

And

https://www.theregister.co.uk/2018/03/22/f_35b_block_4_upgra...

For just two examples


I never implied it was useful, just that people are more likely to take cpp and coat it with tools to feel like they're getting things done by juggling masses of files.


>which is buggy as hell

Is this really the fault of C++? Ada was appreciated for providing static code analysis out of the box but they found out that coupling C++ with a (modern) static analyzer gave better results.


> > which is buggy as hell
> Is this really the fault of C++?

I'm not saying it necessarily is, but it clearly can be. When you fuck up language semantics, as with pointers in C and C++, making things vague and unspecifiable regarding important properties (aliasing, memory allocation, etc.), then you lose something that is extremely hard to get back via static analysis.

For this reason, a language that has safe properties in a domain (like Rust or Ada) will always be safer in that domain, by construction, than a language where you graft static analysis on top of it.

This can be somewhat alleviated by coding rules/standards and annotations for static analyzers, but then you're not programming in the same language anymore, which has its own set of problems.


Well, the F-22 had control software written in Ada and it doesn't have as many reported software problems as the F-35.


The F-22 is also a decade older.


I think a time comparison of both would still show the F-22 had fewer software problems. The F-22 also has some seriously complicated software when you look at the capability.


It's a fair assessment. There are almost no Ada programmers available. There will be fewer and fewer in the future.

It's a difficult language, on par with C++. It's tough to learn, even if you already have a decade of C++.

I think it's fair to pick Ada for that kind of project but it is also fair to avoid it. Either decision is justified.


Back in the day, Ada was relatively easy to learn for anyone with Object Pascal or Modula-2/3 knowledge.

Which used to be quite common before C and C++ got widespread during the mid-90's.

Ada's biggest problem was the price of the compilers and the lack of adoption by OS SDKs, an issue that also pushed other languages out of the mainstream.

Businesses already had to pay for the OS developer tools; unless there were legal requirements in their domain, they weren't keen on buying extra compilers.


I can confirm teaching basic Ada to newbies to "go and correct logic bug X" is 2-3 weeks on-the-job training. We don't even go for Ada experience when hiring anymore, just programming experience, and low cowboy factor...


Agreed on that. Ada was niche from the start because the tools were expensive and hard to procure.

Then Windows and Linux became the dominant platforms and they are C only. Everything else died.


More like Windows and UNIX, not only Linux, but yeah that's the spirit.


> UNIX

What UNIX? The only one I can think of is macOS. What other UNIX is a dominant platform? Linux is not a UNIX, but rather UNIX-like.

I'm all for UNIX, love the design, love the history, but it is not dominant. To be honest UNIX seems dead, and only lives on in *nix-like OSes.


Within the context of this discussion, what are the distinctions between Unix and Linux that are important to you?


That Unix is the roadster that gave us C. But Linux was the one that drove C into our core operating system infrastructure. Windows did the same favor for C++.

It's one of the reasons why other languages failed while C/C++ flourished in the OSS environment, even with its tooling being sub-par at the time. IMO.


I almost agree, just not quite regarding how it went.

C++ was already being adopted by all desktop systems, Mac, OS/2, Windows, even if at a lower level it was a mix of Pascal (Mac) and C.

On proprietary UNIX systems, C++'s major stronghold was CORBA-based applications, while some companies were pushing C++ bindings for Motif.

Then GNU and Linux happened, with the FSF suggesting that the best approach to writing portable open source was C, and then we arrived at the present situation.


There are more computers on this planet than just plain desktops.

Most of them are servers, mainframes and embedded devices running some form of UNIX based OS.

If you prefer I can rename it to nix, as I wasn't thinking about POSIX certification.

BSD, Minix (running on your Intel chip), AIX, HP-UX, POSIX layers in IBM and Unisys mainframes, RTOS, NuttX, QNX, and many other POSIX-based OSes for embedded deployment.


> If you prefer I can rename it to nix

This makes me sound pedantic, but yes. I understand the world is more than plain desktops. The topic was dominant platforms; UNIX is not one.

I'm aware of most of your examples. But all of them are either unix-like or POSIX compliant.

But as I admitted, my complaint was a nitpick.

"UNIX is dead; long live *nix." ~ oneweekwonder


> It's a difficult language, on par with C++.

Ada is much simpler than C++; the hard part about learning it is the lack of tutorials and general information. That has changed in the last few years, but for the longest time the Ada community would just point to the (freely available!) standards doc to answer all questions.

The spec isn't that useful if one doesn't already know the concepts being explained, making learning the ideas behind Ada not that easy, but fundamentally those ideas are not hard.


Everything that exists in C++ also exists in Ada under a different name and syntax.

The separation of cpp and header files, classes, inheritance, different kinds of inheritance, virtual, pointers, references, pointers to pointers, exceptions, meaningless errors from the compiler, etc...
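
For anyone who hasn't seen Ada, a rough sketch of how the first of those, the header/source split, maps onto package specs and bodies (Counter is a made-up example; file names follow the usual GNAT convention):

    --  counter.ads : the package spec, roughly the "header"
    package Counter is
       procedure Increment;
       function Value return Natural;
    end Counter;

    --  counter.adb : the package body, roughly the ".cpp"
    package body Counter is
       Count : Natural := 0;

       procedure Increment is
       begin
          Count := Count + 1;
       end Increment;

       function Value return Natural is
       begin
          return Count;
       end Value;
    end Counter;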

Personally, I find both languages to be a complete clusterfuck. The difference is that C++ will compile anything and crash for no apparent reason whereas Ada will refuse to compile anything.


> But if you are planning a multi-decade project wouldn't you just... train some yourself?

The reality is most businesses are rather short-sighted and prefer to save on up-front costs. Never mind how badly it can come back to haunt them in the future...


Ada's strong static typing may also produce less buggy code.


I don't think it helps produce fewer bugs, but it makes writing some kinds of bugs much harder.


Here's an apples to apples study proving otherwise:

http://archive.adaic.com/intro/ada-vs-c/cada_art.pdf

The others from that time period by the military and defense contractors showed something similar. In only one case did I see C++ and Ada tie, but why wasn't evident. Note this was before automated proving in SPARK, Design-by-Contract in Ada 2012, and contract/type/property-based testing tech. The numbers would likely be even higher. Of course, one would want a comparison against tech designed for the same things in C, which is plentiful now.


I was talking specifically about the strong typing, but this is an awesome study comparing two languages with strong typing. Thanks for sharing.

There probably is a point where the strictness of the typing starts influencing the bugginess of the code.


C isn't typed strongly.


The parts that allow you to do unsafe casting are easily detected by a static analysis.


You'd be surprised how much code you can move from conditional instructions to data types if you have a good type system, and if this type system is a static one, you get it checked before your program even starts.
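
A minimal sketch of that idea, using Ada's constrained subtypes (Demo, Show and Percentage are made-up names): a compiler such as GNAT will typically warn about the out-of-range literal at compile time, and the check otherwise fails with Constraint_Error at run time.

    with Ada.Text_IO; use Ada.Text_IO;
    procedure Demo is
       --  The valid range lives in the type, not in scattered if-checks.
       subtype Percentage is Integer range 0 .. 100;

       procedure Show (P : Percentage) is
       begin
          Put_Line (Integer'Image (P) & "%");
       end Show;
    begin
       Show (42);    --  fine
       Show (142);   --  out of range: no hand-written check needed,
                     --  the constraint on the parameter catches it
    end Demo;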


F# has this too. Other languages too, of course. Heck, even Pascal had a limited form of it - enumerated types and user-defined array types etc. - years ago.


These two statements seem to contradict each other.


You'll write different bugs in different languages. You can't leak memory in Java (you can, but it's very hard), for instance, as you can in C. In Python, I often write code that barfs when it gets an object of a wrong type because parameter types are simply not enforced. The bugs I write in Lisp are entirely different from the ones I write in C.

We'll always write bugs.


But if we agree that Ada reduces one kind of bug compared to, say, C, then we'd also have to come up with a class of bugs which Ada makes easier to write in order to say that the total number of bugs stays constant while one kind decreases.

I can't come up with such an example.

You're right in that things are rarely black and white, but when it comes specifically to bug prevention, I think Ada is a strict improvement over many alternatives.


That's not the point, though. The point is that Ada lets us write without much extra effort:

1. Way fewer bugs in general.

2. Fewer severe bugs that become full-on crashes or hacks.

3. Fewer bugs we can't fail-safe on and/or recover from.

Not all bugs are equal by far. Just see Rust's panics vs C's donate-PC-cycles-to-hackers technique.


> C's donate-PC-cycles-to-hackers technique.

I'll totally use that.


The same could be said about Haskell and I see it a lot in the startup space.

Sometimes, being first to market matters less than not scrambling your users' data.


An incomplete product is one thing, but failing type checks implies a non-working product, not an incomplete product. Businesses need correct code. The real question is what is the fastest way to get correct code, using a language that isn't as strict and testing the heck out of your implementation, or using a language that is stricter but harder to learn and write?

It seems to me in both cases you'll still need to test your code thoroughly, so not an easy question to answer.


> For the companies, to be successful, it’s far more important to be ahead of the competition even with an incomplete product than to let the customers wait and eventually abandon the project.

Still looking forward to the day refunds and other kinds of legal actions actually start to change this mentality in software companies.

I don't buy an incomplete pair of shoes, half-baked bread, a car without seats, ...


Is COBOL a failure? Is FORTRAN a failure? How can anyone say a language that's used for so long can be a failure?

Is it the latest and coolest language? No. It can't be - it was already solving real problems since well before the people who wrote the latest and greatest languages were born.


This is one of the worst blog posts ever. Click bait title with absolutely shallow discussion. Asking if Ada is a failure for lack of widespread adoption is like asking if JavaScript is a failure for lack of use in satellites.


Ada was my first true language that I learned in college. (Pascal being the first language that I learned in high school, but it is more like a teaching language.) Ada and Pascal are somewhat related (very similar).

I have a special fondness for it as well, but in my second year my college switched to Java, as they saw it gave more job opportunities to students after graduation.

Ada was only used by defense contractors, the DoD and perhaps NASA, and that's why it ultimately failed. Its "strictness" is a good thing to have when writing the code for a Tomahawk missile, but not so much when trying to do rapid development.

Also, traditional Ada is more of a procedural/imperative language, and not OOP, and the world had moved into OOP by the late 90s.


Ada has had basic OOP support since Ada95. It just doesn't mandate OOD as the "one true way" to build software the way Java does.
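
For anyone who only knows the Java/C++ flavour of OOP, a rough sketch of the Ada 95 style (Shapes, Shape and Circle are made-up names):

    package Shapes is
       --  A tagged type is roughly a C++ class with virtual dispatch.
       type Shape is tagged record
          X, Y : Float := 0.0;
       end record;

       function Area (S : Shape) return Float;   --  primitive, dispatching

       --  Type extension is Ada's inheritance.
       type Circle is new Shape with record
          Radius : Float := 1.0;
       end record;

       function Area (C : Circle) return Float;  --  overrides Shape's Area
    end Shapes;

    package body Shapes is
       function Area (S : Shape) return Float is
       begin
          return 0.0;
       end Area;

       function Area (C : Circle) return Float is
       begin
          return 3.14159 * C.Radius * C.Radius;
       end Area;
    end Shapes;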

I wound up unemployed in 2008 and took a job maintaining an old system written in Ada95. My mother (who is still a working software engineer) was kind enough to give me all of her old Ada books. In the 80s, during the big initial push for Ada by the DoD, defense contractors actually sent a lot of their programmers to training courses (my mother was sent to training courses on both C and Ada in the early years of her career). Today, they would just lay off all of their "obsolete" engineers and hire new ones.


Surprisingly, it took Ada classes for me to grasp Java OO a bit better than just soul tears poking. At least encapsulation and interfaces.


Not too surprisingly. Ignoring the fact that Ada presents OO ideas very clearly and neatly, just learning different OO systems has done wonders for my understanding of it. Whereas I would previously say "Just put it in a class" to everything, I can now distinguish cases where that actually helps from where it doesn't.


> soul tears poking

Ok, you got me. I have no idea what that means.


OOP paradigm made my soul cry, I had zero understanding of superclass etc. Most of what I did was poking around until the compiler didn't complain.


>Ada was my first true language that I learned in college.

Me too. It wasn't at the University of Washington by any chance?


Radford University, Virginia. Most colleges in Virginia and Maryland taught it, as the DoD/Pentagon has always been a larger employer for those two states.


Ah, I didn't know that. UW used Ada until the mid-90s, and I hadn't heard of any other colleges that did until now.


Ada didn't take off (despite some decent level of enthusiasm in the industry) because a compiler cost $$$$. We're all using languages today that had a zero or near zero cost to use in the era that mattered (when 32-bit micros were being introduced). Or newer also free to use languages, obviously.


Ada didn't take off (despite some decent level of enthusiasm in the industry) because a compiler cost $$$$.

That's also why Smalltalk failed to gain traction. I suppose both languages hailed from an era in which employers did still provide training - say up to the late 80s, maybe even early 90s. The idea that programmers would need to learn languages on their own time/dime never even occurred to those vendors.


Doesn't the GNU compiler have an Ada front end, GNAT? Is it really unusable in practice? I thought all front ends of the GNU Compiler Collection use the same back end, so they'd create similar machine code.


GNAT is great but didn't exist under the current licensing 30 years ago. It's still primarily developed by a single company and people are afraid they'll at some point release updates only to their proprietary version of it. Or something.


Windows didn't have a free C++ compiler for many years, but in general I think you're right.


Compilers for C, C++, Basic, Pascal, Clipper were always relatively cheap.

Let's pick a BYTE magazine from 1994.

- IBM C Set++ for OS/2, $249

- Powerbuilder, $249

- Watcom C/C++, $450

This on top of a PC that would cost around $1000.

Mac Quadras started at $1199.

Ada required UNIX or VAX stations.

Cheapest SparcStation on the same magazine $3995, plus OS SDK and the Ada compiler as additional expense.

So lots of zeros versus the alternatives.


So lots of zeros versus the alternatives.

Sure, for hobbyists and such. But if you're a "real" company and you are paying software people perhaps $50,000 (at the time) plus benefits, why would the cost of a workstation and a compiler make much of a difference?


Speaking from the reality of the Portuguese industry in the mid-90's, the majority of companies were using PCs, Amigas, Ataris and such.

Only companies at Fortune XXXX level would even think about renting UNIX/VAX systems.

Then the OS SDKs only contained the systems language compilers, C, BLISS, Macro32, CLI tools and such. Every other additional language was an extra purchase.


DOS and Windows saw very affordably priced C++ (and Pascal) tooling from Borland, though. Which is probably part of the reason why Delphi was so successful later on Windows specifically.


In reality it did, at least since 1992 when I began Windows development. Visual C++ was a paid SKU but you got it free as part of the MSDN which was not costly and pretty much required to be an application developer.


Windows had a C++ compiler and IDE for more than two decades. https://en.wikipedia.org/wiki/Microsoft_Visual_C%2B%2B

There was a time when it was not free but it was a very very long time ago.

At this stage in history, the internet didn't exist and it was commonplace to pay for the shipping of a floppy or a CD. I don't know the price tag but it may or may not be considered free by today's standard.


> Most companies avoid GPL like the plague.

Is this actually true any more (if it ever was)? Yes, I do know that Google is picky about it, but is this really still widespread?

> Both the compiler and the majority of libraries are infected by it.

Why the flame-worthy language "infected"? Royalty-bearing libraries are just as "infected", as corporate IP policy needs to track shipments in order to make payments.

(I do agree it is unfortunate that GNU Ada's library is GPLed.)


It's worth noting the linking exception for the GNAT runtime:

https://en.m.wikipedia.org/wiki/GPL_linking_exception


You shall thank Ada every time your plane has landed successfully.

http://www2.seas.gwu.edu/~mfeldman/ada-project-summary.html


Tiny trivia: I found a few blogs about Ada generics used for embedded network abstractions. I find it super sexy to have clean abstraction and low level like that (probably doable in cpp too but ... cpp).
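
The rough shape of the idea is something like this (a hypothetical sketch, not from those blogs; Transport_Layer, Frame and Send_Raw are made-up names): a protocol layer written once as a generic, then instantiated per transport (UART, CAN, a test stub, ...).

    generic
       type Frame is private;
       with procedure Send_Raw (F : Frame);   --  supplied by the transport
    package Transport_Layer is
       procedure Send (F : Frame);
    end Transport_Layer;

    package body Transport_Layer is
       procedure Send (F : Frame) is
       begin
          --  A real layer would add framing, checksums, retries, etc.
          Send_Raw (F);
       end Send;
    end Transport_Layer;

An instantiation would then look like "package UART_Link is new Transport_Layer (Frame => Byte_Array, Send_Raw => UART.Write);" (again, made-up names).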


Care to post links? I'm planning to use Ada in my next embedded project.


I can't find the links right now, maybe later.


Ada was a curious but obviously unsuccessful language back when I first heard about it, and that was at least thirty years ago. Since then, the software industry has exploded, computers have eaten the world, and the internet has made itself more essential than the telephone network ever was; but Ada is nowhere to be found. Whatever its virtues may have been, it has had no significant influence on the culture or practice of computing as we know it today.

Yes, of course, there must be some niche somewhere which still contains people who care about Ada, just as there are little pocket communities who still care about MUMPS, APL, and other bits of exotica, or we wouldn't have articles like this one trickling out every few years - but out here where the action is, nobody knows anything about Ada and nobody really cares.


> The very common complaint on Ada user groups and forums is the licensing model. Most companies avoid GPL like the plague.

What's wrong with the GPL?


If you go way back into the 1980's, every compiler vendor was trying to drink your milkshake[1] via licensing fees for compilers and libraries.

Two companies that didn't do that were Microsoft and Borland. AT&T was forbidden from selling products outside of telecom. Which is why Linux and C/C++ succeeded. C succeeded because a competent grad student could port the language to a new computer in two months.

So you could use Microsoft's Basic and C/C++ compilers, Borland's Pascal and C/C++ compilers, or gcc/etc. C compilers without them sharing ownership of your compiled binaries.

[1] Per-unit licensing fees. Meaning instead of just charging you a seat license, they wanted a cut of your profit as well. You pay them $20 for every license you sell. I'm not kidding about $20 either.


GPL makes it harder to charge people for spyware. It also makes it possible (at least theoretically) for users to fix bugs themselves, which breaks the bug-fix middleman monopoly that aspiring spyware vendors seek.


The language is very anti-ergonomic in terms of the shift key. This plus emacs can destroy your tendons:

    with Text_IO; 
    use Text_IO;
    procedure Hello is
    begin
        Put_Line("Hello, World!");
    end Hello;
I subscribe firmly to the school of thought that as programmers, we are not optimizing for keystrokes... but man oh man having to type capital-Put underscore capital-Line instead of just print or printf, 50 times a day. Yikes.


Ada is case insensitive. put_line would have worked just as well
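
For example, a small sketch (the lowercase spelling and the print rename are just to make the point):

    with Ada.Text_IO;
    procedure Hello is
       --  Renaming lets you shorten names you type 50 times a day.
       procedure print (s : String) renames Ada.Text_IO.Put_Line;
    begin
       print ("Hello, World!");
       ada.text_io.put_line ("same call, case doesn't matter");
    end Hello;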


There is also just 'Put'; it just doesn't tack a \n onto the end, so hello world in a terminal is ugly.


Ada is not for Emacs programming.

Ada always used IDEs, which already supported automatic formatting back in the early 90's.

Rational started their business selling Ada Machines, just like the Lisp ones, but using Ada instead.


My emacs automatically formats and capitalises the right words.


Why should the GPL be a problem for Ada when it's not for GCC?


GCC has a runtime library exception; GNAT GPL does not. Programs compiled with GNAT GPL can only be GPL (GNAT GPL is specifically engineered to that end); that is not the case for GCC.



It's a little awkward. AdaCore drops a full GPL edition every year which turns GMGPL the next year, while simultaneously there's a group that bug-fixes the existing GMGPL libraries and integrates the new changes.


Oh, weird... that sounds like a complete clusterfuck. :/

Why do developers do this?! These are the sorts of policies that scare companies away from open source software.


No.



Why do people post this when it is true but not when it isn't?


i have yet to actually see an instance of when it is false. do you have an example?


Here's a lecture at MIT about Ada from a favorite programmer of mine.

http://www.youtube.com/watch?v=0yXwnk8Cr0c

I do not understand the Medium author's definition of "success". I get similarly confused when web commentators use the term "won" (e.g., language X "won") when discussing programming languages. Won what?

IMO, the presenter (RIP) was quite successful. I'm sure Ada "won" many contracts.


Won = monopsonized the market? I.e. so many employers looking for it that it influences education and training for people who have no interest in that particular language, but just want to learn “a” language.

See: Java winning enterprise programming; Python winning DevOps scripting; FORTRAN winning numerical simulation way back.

Also see: PHP winning SMB web consulting in the 90s (resulting in e.g. the Wordpress plugin ecosystem); RoR winning SMB web consulting in the 00s (resulting in the current “horizontally scale by running more containers with a load-balancer in front” approach to web-app deployment, due mostly to Ruby’s horrible concurrency story at the time); Node.js winning SMB web consulting in the [first half of the] 10s (resulting in the rise of websocket-driven SPAs and a million Node build tools).


According to the lecture, for Ada I guess it would be winning avionics. The presentation has some interesting artifacts, such as one student asking what "portable" means and another asking what is "object-oriented" programming.

Maybe "hyped" success is sometimes confused with "real" success. The later type may not always be as visible.

He addresses this in the lecture. He suggests people often make statements about programming language usage in industry without actual knowledge of what some organizations are really using (beyond what is announced publicly, or hyped).


any language older than 5yrs is a failed language. perl, ruby are recent memories.

i’m starting to realize the only industry more fashion conscious than software is... well fashion!

Ada?! ahahahaah


Heavens, no!

Ever heard of PL/SQL? Same thing.

Oracle ripped the entire Ada language into their database.

IBM DB2 clones most of it, courtesy of EnterpriseDB (Postgres bolt-on).

People forget this enormous Ada audience.


This is completely wrong.

PL/SQL used Ada as a model for the syntax. The semantics are completely different. Just like JavaScript is not C, PL/SQL is not Ada in any meaningful way.


This is completely wrong.

Then why is PL/SQL internally littered with references to DIANA, aka Descriptive Intermediate Attributed Notation for Ada?


My first sentence.


Ada has functions, procedures, and packages. So does PL/SQL.

C has unions. Show me the unions in JavaScript.

Unions are there because C is a systems language. Show me a kernel written in JavaScript.


Ada is a systems language, too. It was used in many OS's. Several are below:

Army Secure OS (A1-class design)

http://www.dtic.mil/dtic/tr/fulltext/u2/a340370.pdf

Fault-Tolerant, Parallel, Mainframe-style OS/Platform

https://en.m.wikipedia.org/wiki/BiiN

MarteOS Real-Time OS

https://marte.unican.es

Muen Separation Kernel

https://muen.codelabs.ch


> Ada has functions, procedures, and packages. So does PL/SQL.

Lots of languages have those. That doesn't mean one copied the other. Are you serious?


> Show me a kernel written in JavaScript.

https://bellard.org/jslinux/


There is no kernel written in JS there. It boots normal kernels compiled for the listed CPU architectures.


I'd say PL/SQL is a huge technical failure. Its continued use is also a huge failure, but that one is a management one.

There is "successful" and there is "widely used because corporate mandated it".

edit: OK, I was being cruel. People use it and it solves problems, but it's NOT a joy to use.


Not much of an Oracle fan, but PL/pgSQL is heavily derived from PL/SQL and I use it on a near-daily basis; can’t really complain about it for the purpose it fills.

Yes, some legacy applications do some really nasty things in database procedures - but that doesn’t make the tools themselves bad.


Sorry, but it is a joy to use.

I will always favour PL/SQL and the related tooling over the SQL Server, DB2, Informix, and PostgreSQL languages.


Wow, we have different definitions of Joy...

I always treated PL/SQL as a necessary evil for performing specialized atomic operations, or merges, or other such things where SQL wasn't enough. I never enjoyed it, but I had to do it.

I'm very tempted to see about taking something like this, https://github.com/thehydroimpulse/postgres-extension.rs, and using a language I actually like. That example needs to be updated, but seems pretty straight-forward.

Edit:

It would be amazing to have a rich library in Rust that could be used to write DB extensions as a replacement for pgsql, etc. There aren't that many things that need to be wrapped in the API from Postgres: https://www.postgresql.org/docs/10/static/xfunc-c.htm


I prefer to do database work on the database, instead of wasting network traffic with useless data.

Never was into a project where the overhead of writing DB independent code was worthwhile. Usually moving to another DB meant rewriting big portions of the application anyway.

Stored procedures are also a good escape hatch for some of the ORM limitations regarding SQL capabilities.


I think you misunderstood my post. I want stored procedures too, because they are necessary in certain cases. But I want them in a language I like.

The repo I linked to is for C based extensions added to Postgres but written in Rust, that could replace pgsql for stored procedures.


I did understand that, but unless the language has 1:1 of SQL's declarative power, it is a no-go for the majority of DBAs.

Hence why Perl, Java, and .NET stored procedures have been a failure as SQL extensions.


That's a cultural problem, not a technical one.


Ah. Now that’s a challenge...


Running code wherever the database thinks is best is a nice thing, but nothing really prevents it from being implemented in a more flexible language than something built on top of SQL. SQL is powerful, but it's a DSL.



