
Why Ada isn't Popular (1998) - Jtsummers
http://www.adapower.com/index.php?Command=Class&ClassID=Advocacy&CID=39
======
kibwen
As a Rust contributor, I'm perpetually trying to find people who are familiar
with Ada to comment on our design. I know nothing of the language except that
it's allegedly designed for both safety and efficiency, which is exactly the
niche that we're targeting. If there's anything that Rust can learn from Ada,
we'd love to know while we still have the leeway to make breaking changes.

While we're at it, I'd love to see some Erlang programmers take a look at Rust
as well and tell us where we're falling down on concurrency. Sadly, both of
these languages seem underrepresented in the circles that I frequent.

~~~
brson
We've talked to Tucker Taft about Ada and Rust before, though it was a while
ago and I mostly recall that we discussed aspects of Ada's standardization
process (I think Graydon was a fan of Ada's spec). Niko would probably
remember more.

~~~
brson
And FWIW here's the research presentation Tucker gave while he was at Moco
about his new language Parasail: [https://air.mozilla.org/region-based-
storage-management-para...](https://air.mozilla.org/region-based-storage-
management-parasailing-without-a-garbage-chute/)

------
jacquesm
I think the biggest factor in why Ada never 'made it' was that the available
compilers were ridiculously expensive.

One of the big drivers in language popularity in the long run is how many
people can become familiar with a language by using it, and by having all the
implementations be terribly expensive you're almost automatically limiting the
number of people exposed.

I think companies that did this (much less so today than in the past) saw the
languages as their profit centers, not the ecosystems. Microsoft, Borland,
Zortech and a whole pile of others were in a never-ending battle to get
programmers on board. Every computer came with at least one programming
language pre-installed (usually some form of BASIC). What you did after
learning BASIC (and some of its limitations) was dictated by need and funds.

For most people that meant that their second language was some variety of
assembler. For others it meant re-implementing Forth (since that was doable).
A C compiler was something you could still buy on a regular salary.

To get exposure to Lisp, Ada, Modula-2 or any one of a whole bunch of other
high-level languages required shelling out significant bucks. So during what
we might refer to as my 'formative years' as a programmer, none of these
really nice languages made it to my computers.

~~~
colechristensen
Something I see missing from computers these days (and lots of days previous)
is a programming language designed for non-programmers.

The likes of Ruby and Python are _approachable_ , but everything is a mess of
libraries, RVM and lookalikes, dependencies, and a whole lot of complication.
This is great for the target audience but less so for your average user.

Excel and its Microsoft ilk are currently filling the role, and doing so
poorly.

The scientific community has things like R and MATLAB which do a good job for
scientists.

Everybody else _can_ learn a language, but it can be daunting and it's
definitely geared towards hackers and professionals.

In an ideal world all computers would ship with a native programming language,
a big fat manual for users, and a set of software whose core design allows
for, and is meant for, programming in everyday usage.

This language would have to be relatively simple, stable over time, and more
monolithic than most. It would have to sacrifice speed and functionality for
approachability, but if it were done well it could be world-changing.

Sadly the market is moving in the opposite direction (but in some sense the
same direction too). Everybody these days is using an app with simple buttons
which doesn't do much at all. It's successful because it's so approachable,
but scary because it's so controlled by the vendor.

~~~
keithpeter
Audio/music: puredata [1] is pretty visual to start with and you can play
around with things; it gets away from the REPL completely. ChucK [2] and
supercollider [3] require typing code into a terminal, but are feasible
_perhaps_ because of the minimal complexity of most live coding sessions.

Graphic artists have Processing [4]. REPL but with simplified grammar.

Perhaps the end-user programming future is in domain-specific programming
environments (R-like, but for different disciplines?)

[1] [http://puredata.info/](http://puredata.info/)

[2] [http://chuck.cs.princeton.edu/](http://chuck.cs.princeton.edu/)

[3]
[http://supercollider.sourceforge.net/](http://supercollider.sourceforge.net/)

[4] [http://www.processing.org/](http://www.processing.org/)

~~~
bglazer
Processing is a wonderful library/programming environment. An intro to
programming class taught in Processing would be cool.

------
WalterBright
It was also initially considered to be unimplementable by a number of compiler
people (including me). We were wrong, but it was a while before a working Ada
compiler was produced.

C had the great advantage that a C compiler would fit in MS-DOS's address
space with some room left over for the symbol table.

~~~
jacquesm
What were your main reasons for thinking Ada could not be implemented?

~~~
wtallis
This is an especially interesting question given how much smaller and faster
Pascal implementations were for DOS than C compilers. What did Ada add that
was so complex (and how does it compare to the complexity that C++ added to
C)?

~~~
RogerL
Ada has packages, generics, exceptions, concurrency, default values, operator
overloading, user-defined types, fixed point, subtypes (so you can, for
example, limit an integer to the range 0..100, and then the compiler has to
check that it is used correctly everywhere), pooled memory, built-in
run-time type checking, pragmas, etc., etc.
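
For example, the 0..100 case above is a two-line declaration. A minimal sketch
(identifiers are illustrative, not from any particular codebase):

        procedure Subtype_Demo is
           subtype Percent is Integer range 0 .. 100;
           P : Percent := 50;
           N : Integer := 150;
        begin
           P := N;  -- compiles, but raises Constraint_Error at run time;
                    -- statically visible violations draw compiler warnings
        end Subtype_Demo;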

~~~
lobster_johnson
Pascal also had subtypes, or at least Borland's did.

------
krazydad
I started programming in the early 80s, and never used Ada. At the time, Ada
had a reputation as "that over-bloated language that people in the defense
department have to use." This perhaps undeserved reputation is addressed in
section 7 of the linked article.

From my point of view at the time, any language that the US Government would
mandate as a required technology MUST be flawed. A language embraced by a
large bureaucracy must be full of large bureaucratic nonsense. That was the
thinking, at any rate. Ada, along with Cobol, was a frequent target of snarky
jokes. In the mid-80s, the cool kids were programming in C (while being paid
to program in Basic or Cobol or Fortran).

I freely admit these notions were likely borne out of ignorance, but I imagine
this reputation stifled its adoption outside of government circles. Would
you want to use a language mandated by the DMV?

------
waveman2
One other factor I would mention is the massive uncoolness of Ada. This seems
to come from three directions:

1\. Invented by and for a bureaucracy known for its extravagance and
wastefulness.

2\. Verbosity of a degree previously seen only in COBOL.

3\. A focus on avoiding errors above all else. For people who have a
positive/benefits focus like most hackers, this negative focus is very
unappealing and provokes references to anal-retentive pedants and so forth.

I would agree with others who pointed out the other problem "won't run on a
computer I can buy - and anyway I can't afford the compiler". This is not to
criticise the compiler developers but nonetheless it was a problem.

[http://en.wikipedia.org/wiki/Worse_is_better](http://en.wikipedia.org/wiki/Worse_is_better)

~~~
matthewjheaney
Oh please, the comparison of Ada to COBOL is invidious. Ada simply requires
you to be explicit about your intent. This is a feature, not a flaw. You mean
to say that you've never been burned by an implicit conversion in C, or a
misplaced semicolon? Ada is no more verbose than Java.

~~~
riffraff
I think grandparent refers to what appears to be gratuitous verbosity, i.e.

    
    
        procedure Foo is
        begin
        ...
        end Foo;
        
    

vs java

    
    
        void foo() {
        ...
        }
    

Even if one wanted to be explicit about indicating what "End" refers to, this
could have been

    
    
       proc Foo
       ...
       end Foo
    

EDIT: when I write "appears" I mean "there may be a perfectly good reason for
it but isn't obvious", not that it is in fact gratuitous.

~~~
jmilkbal
I am an Ada developer, but I think it is objective to say that anyone who
opposes a language because their fingers will have extra work probably doesn't
belong in this field. If you consider the development process as a
whole—research, planning, development, verification, etc.—those extra
keystrokes add an exceptionally marginal amount of time to the development
process, but reduce time so much more by making the code more intuitive to
read. Don't let me lead you to believe that Ada's words make it intuitive;
that would be disingenuous, but the syntax has been formed since its inception
to be readable by developers and non-developers alike. This is an important
distinction from something like Java, never mind that you don't have to
explicitly instantiate generics in Java. One of the key objectives of Ada is
code that is especially intuitive to non-developers. There's a lot going on in
the language. I hope this helps.
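
(For contrast, explicit instantiation in Ada looks like the following; a
minimal sketch using the Ada 2005 standard containers, with an illustrative
package name:)

        with Ada.Containers.Vectors;

        --  The generic is instantiated explicitly, by name; Java infers
        --  the type arguments at the use site instead.
        package Integer_Vectors is new Ada.Containers.Vectors
          (Index_Type   => Positive,
           Element_Type => Integer);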

------
cafard
I would mention here Richard Gabriel's essay "The End of History and the Last
Programming Language", which you can find at
[http://www.dreamsongs.com/Files/PatternsOfSoftware.pdf](http://www.dreamsongs.com/Files/PatternsOfSoftware.pdf)

His argument that "Languages are accepted and evolve by a social process, not
a technical or technological one" seems to apply here, also (in the context of
the time) "Successful languages must have modest or minimal computer resource
requirements."

------
humpt
I am a 25-year-old student at one of the top CS engineering schools in
France. People who enter there usually have a heavy math background, but are
fairly new to programming.

In the first semester, all the students are introduced to basic programming
concepts, and the language used for that is Ada. Ada is then used to teach
the algorithms and compilation classes. In fact, in the second year we had a
full-time project where we wrote a compiler, in Ada, for a language close to
Java (in terms of syntax and features).

I remember being in my Python phase at the time, and bitching about Ada's
verbosity. But I realized how comfortable it is to let the compiler do most of
the debugging for you, especially on very large projects that are structured
as a pipeline (e.g. a compiler, where every part depends heavily on the
others). When it compiles, it works 99% of the time. Static typing/subtyping,
generic packages, all the time spent in the console with GNAT yelling at you
really teaches you how to structure, secure and bullet-proof your code. Ada
really is a great teacher!

~~~
epsylon
(Fellow ENSIMAG-er, I presume?)

~~~
humpt
yeah :) a safe bet!

------
mraison
Even though Ada is now completely forgotten by most people, it's worth
mentioning that the GNAT compiler is being actively maintained by the company
AdaCore (whose core business, as the name suggests, is Ada).
[http://www.adacore.com/](http://www.adacore.com/)

~~~
matthewjheaney
Right. Also note that GNAT is just GCC. The most recent language standard is
Ada 2012. (I was involved in the design of the Ada standard container library,
which originally appeared in Ada 2005.)

------
matthewjheaney
Glad to see that this old post of mine (originally posted on comp.lang.ada)
has been revived on Hacker News!

------
cicero
I first learned Ada in a university programming languages class in 1984. It
was chosen by our professor as the culmination of classic programming language
ideas. We did not have access to a full Ada compiler, but we used "Janus Ada",
which implemented a subset. I really liked it because it was very consistent
and readable and you could create sophisticated data types, and the compiler
strictly enforced their use.

A few years later (1988), I was working on a large military aircraft project
that used Ada. We did a huge up-front design with reams of design
documentation. The first code that was written was a package of data types to
ensure consistency. Unfortunately, there was very little Ada experience among
the team and we sometimes designed ourselves into corners. Such problems were
often solved by using "unchecked conversions" to get around the type
enforcement, which defeated the purpose. The problems we had were similar to
problems huge C++ projects would have a few years later.

The other problem we had was with the compiler. The government had placed high
performance demands on the compiler, so it was highly optimized.
Unfortunately, it would sometimes optimize away code that you needed.
Fortunately, we could generate a code listing that interspersed the assembly
output of the compiler with the corresponding Ada source code and thus find
the problem. Usually adding an intermediate local variable to break up a
complex calculation solved the problem.

In 1990, I got hired by a start up that was doing military contract work
simulating radars for flight simulators. They had done all of their work in C,
but now they had a contract that required Ada, so I became their "Ada expert."
All of the C guys hated Ada because of its strictness. Ada wouldn't let you
mix numeric data types without an explicit conversion, and that really chafed
the C folks. It reminds me of today's dynamic/static typing debate.

The Navy ended up canceling the aircraft so they no longer needed our radar
simulator. The next project I was on decided to use the new C++ language
everyone was excited about. Ada was starting to fall out of favor, so waivers
for the Ada mandate were getting easy to obtain.

At the time, I was thrilled to use C++. It seemed edgy compared to the
bureaucratic nature of Ada. Now, however, I don't know if C++ was an
improvement overall. It has some advantages, but also some disadvantages. I'm
happy to see that the Rust guys are trying to learn from Ada. I was an
inexperienced kid when I used Ada, so I am by no means an expert, but from
what I do know, I think it still has something to teach us.

------
bleair
This is horribly pragmatic, but a reason that can contribute to a language's
success and popularity comes from how easy it is for new users to "pick it
up", "find libraries that they find useful" and build working prototypes for
"itches they want to scratch".

I'm making broad generalizations, but as examples: if you wanted to build a
program to perform some numerical analysis or maybe build an analytical
simulation, you could pick Fortran and you'd likely find libraries and
examples to help you out. If you wanted to build a simple web page backed by a
database, you could grab PHP and some of its libraries and you would be
quickly constructing web pages. With Java you could find examples of tools /
libraries for churning through databases and also plugging into various web
application frameworks. If you wanted to write a PhD about computer languages
you could use Lisp :P. I've found great value in Python personally, because
the examples were decent and, more importantly, it was easy to see how to
build the programmatic bridges into the other languages and libraries I wanted
to use. I'd strongly argue that the popularity of Ruby and Node.js can be
partially traced to seeing the "fun" and/or neat examples shown off in
tutorials. They
show how to leverage the constructs provided by each language's common
libraries and the resulting programs are interesting / neat to some number of
potential adopters.

In the 90s when I looked at Ada, there wasn't much in terms of libraries that
I could leverage to explore problems that interested me at the time.

Again, not every language has to be ideally suited to writing video games or
web pages to be useful. Ada the language might have outstanding academically
interesting aspects. If it doesn't help me solve real-world problems I care
about, though, it's less likely I'll invest the time to learn about it.

~~~
wting
I like to categorize many features that lower initial difficulty as "deferred
technical debt." There are type system features that ensure a higher degree of
correctness, but it's not fun debugging compiler errors when you're trying to
get something working.

For example, being forced to handle errors immediately via return codes or
option types is not "fun". By comparison, exceptions act as a giant GOTO and
no one blinks an eye.
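
For concreteness, here is a minimal sketch of the two styles in Ada (the
thread's language; all names are illustrative, not from any real API):

        with Ada.Text_IO;

        procedure Error_Styles is
           type Status is (Ok, Not_Found);
           File_Missing : exception;
           R : Status;

           procedure Open_By_Code (Result : out Status) is
           begin
              Result := Not_Found;  -- caller must remember to inspect this
           end Open_By_Code;

           procedure Open_By_Exception is
           begin
              raise File_Missing;   -- control jumps to the nearest handler
           end Open_By_Exception;

        begin
           Open_By_Code (R);
           if R /= Ok then          -- the check that is easy to forget
              Ada.Text_IO.Put_Line ("open failed (status checked by hand)");
           end if;

           Open_By_Exception;       -- no check needed at the call site
        exception
           when File_Missing =>
              Ada.Text_IO.Put_Line ("open failed (control jumped here)");
        end Error_Styles;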

Dynamic typing is another form of deferred technical debt. It seems
preferable to handle type errors at runtime or through testing instead of at
compile time. People who claim very few bugs are a result of type errors
typically do not have experience with ADTs or stronger type systems than Java
/ C++ / C.[0]

> Most programmers think that getting run-time errors, and then using a
> debugger to find and fix those errors, is the normal way to program. They
> aren't aware that many of those errors can be detected by the compiler. And
> those that are aware, don't necessarily like that, because repairing bugs is
> challenging, and, well, sorta fun.

I don't think this attitude has changed in the past 16 years. People prefer to
debug a run time stack rather than deal with compile errors, perhaps even more
so with the rise of dynamic languages.

[0] The Clojure community likes to argue that bugs arise from mutability more
than type safety, but I don't have enough experience to comment either way.

~~~
waps
I disagree strongly on exceptions versus error codes. It sounds reasonable
until you look at what real programmers (the kind that aren't perfect) will do
and how it affects production code.

Real-life programmers don't know the stack up and down and haven't read the
documentation for the stuff they use. They will not have thought through every
possible error case. Sucks, but that's real life. So any solution that depends
on either of those will simply fail. There are two ways for parts of your
program to signal errors to other parts. The question that matters is: what
will mediocre programmers do?

If you force 'em to give errors through return codes, they will flat-out
ignore the errors. That means errors, and all meta-information about them
(like which file it is that won't open, or which host doesn't respond, that
sort of thing), disappear into a black hole. It MAY end up in some log file,
or it may not. Crucially, sometimes it disappears into a logfile that is
custom to the library being used (at least in C). Needless to say, you will
have memory leaks when this happens, and you will have invalid state. It may
also lead to crashes directly; if it doesn't, it will lead to "delayed
crashes" (like OOM) and all sorts of fun behavior (which is why some system
engineers prefer direct crashes: it makes the point where the problem gets
created easy to identify).

If you give them exceptions, they will not handle the exceptions, nor will
they get try ... finally correct. This will lead to memory leaks and
inconsistent data. And it may lead to program crashes, BUT crucially it will
preserve the error that was originally detected, and log it in the main
program, making the original error much easier to fix.

Is this "deferred technical debt" ? I can understand your reasoning, but I
think checked exceptions are a much, much better solution than return values.

What I find especially irritating is the moronic attitude that testing somehow
makes up for not having static types. I have never seen a program that has
half the tests any statically typed language automatically executes. Not even
once.

~~~
ryanobjc
Yeah I totally agree here.

There is a lot of hand-wringing about hiring 'the best programmers' etc., but
the reality is everyone is human and humans make mistakes. Let's make the
systems resilient to the mistakes that humans make - strange how this
seemingly simple statement is actually quite controversial around here!

In truly large systems, all of the above happens and more. This is why error
handling in C is so problematic, and why we tossed it with Java. That Go is
bringing it back is a little worrisome. Then again, Go might be a hacker's
language, and might ultimately end up being the repository of write-once
programs that aren't maintained.

------
ellyagg
My dad is a retired DoD engineer who worked extensively with Ada. Sent him
this link, and this was his response:

Yeah, I liked Ada because it was very readable, and decomposition into smaller
units was easy. I always liked the package concept.

Our compilations weren't all that slow. Maybe an hour or so for 500,000 lines
of code. It was always best to compile units individually, not to build
everything and compile all at one time. It could take days to get through
compile-time errors that way. A lot were due to mismatching data types.

Once we tried to upgrade to a new version of the compiler, but couldn't get it
to work with our legacy code. We had the vendor send us an engineer from
Phoenix, and he could never get it to work, so we gave up and kept using the
old compiler.

I hear Ada is still popular in Europe. There were a lot of useful improvements
in Ada95 that eliminated some of the clunky features.

On large systems like ours we had subdivided the code into many separate
functional units. Each unit consisted of a directory of multiple packages that
all worked to perform a certain function. Each FCI directory would usually
contain an interface spec and code for interfacing to any other FCI that
needed to share data with it. There were a couple of ways to share data. One
way was to send an FCI a message that we were putting data into shared memory,
and then the other would grab the data. Another way was to just send a message
containing the actual data. This worked OK for small amounts of data. I think
to communicate with external devices at the lowest level we were calling C
code.

Software engineering in a large project is actually more fun than in a small
group I think. There's more activity and hustle and bustle. Also it does force
a certain amount of discipline that one would rarely do on an individual
basis. Some of the code reviews could be brutal. Or at least so it seemed.

Testing was always a challenge on the real hardware because it always involved
multiple computer systems networked together sometimes with wireless devices,
and it could be a challenge just to get them all stable and talking together,
let alone getting your own code working. When we went to a test facility for a
week or two to test we always kept our fingers crossed. Usually nothing would
work for the first two or three days and then on Thursday and Friday the
problems started to go away and were magically solved, and we could go home
successful. It was always very nerve wracking because of course there was a
whole management schedule that depended on it.

Of course the boring part was writing software requirements documents,
software design documents, test procedures and the like. We weren't supposed
to design while coding, and weren't supposed to write any code until our
design was complete. But a lot of times we didn't know how to design it, so we
would write code in advance and call it prototyping. But if we reused our
prototype code we could get huge code counts during the code phase, because we
had already written that during the design phase and charged it to design.

Since we got assessed partly by the amount of code we could write in a given
time, sometimes you would find people who, instead of writing a subroutine,
would just duplicate the same code multiple times. It increased their code
count. It really didn't pay to write really tight, highly efficient code
because of the development time constraints. However, then during testing you
might have to go back and fix it to perform better.

~~~
acomjean
> I think to communicate with external devices at the lowest level we were
> calling c code.

As someone who spent a few years maintaining the Ada -> C code wrappers our
team used, I would say he's correct.

We used Ada wrappers to make the system calls to C to do networking and Unix
IPC. While Ada can call C directly, we had to support HP-UX and Sun, so
sometimes we had to write C code and call that (#ifdef HPUX). This was a pain.
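
For a flavor of what those thin wrappers looked like, here is a minimal
sketch (the binding to the POSIX close call is illustrative; the real
packages wrapped networking and IPC functions):

        with Interfaces.C;

        package C_Sockets is
           --  Thin binding to a C function; the C side is what got
           --  #ifdef'd per platform.
           function C_Close (Fd : Interfaces.C.int) return Interfaces.C.int;
           pragma Import (C, C_Close, "close");
        end C_Sockets;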

The package system was very good, especially at dividing work among a large
team (30+ programmers).

We had a lot of fake external hardware simulators we used to test our code
against. This worked well as long as our simulators behaved like the actual
external hardware. It did, with some frustrating exceptions discovered, of
course, during the final test.

Dealing with time is a pain in all languages (UTC vs. GPS, with, I think,
sidereal time thrown in for fun).

I remember some of those procedures (design -> code -> integrate, formal
test -> ship) that DoD projects had. We counted SLOC (Source Lines of Code),
but thankfully our reviews were never based on them. Code reviews could be
useful or just frustrating. I suspected some people were just difficult so
they wouldn't be invited to as many of them.

Interesting to hear others' perspectives on it.

------
davidgerard
>Most programmers think that getting run-time errors, and then using a
debugger to find and fix those errors, is the normal way to program. They
aren't aware that many of those errors can be detected by the compiler. And
those that are aware, don't necessarily like that, because repairing bugs is
challenging, and, well, sorta fun. You are not giving a programmer good news
when you tell him that he'll get fewer bugs, and that he'll have to do less
debugging.

What. the.

~~~
kd0amg
Sure, it might sound crazy, but I've met plenty of programmers who were proud
of using tools that make them do lots of extra work. It is a skill, even if
it's not the most cost-effective way to get a working product.

------
Tomte
In university I actually tried to get into Ada, especially since I studied
compilers with Professor Ploedereder (great guy!), who was working on (and at
some point chairing) the ISO standardization. I loved the story about how
the committee got into a serious fight about whether Sunday or Monday should
be the first member of the week enum.

It just wasn't my thing, although I deeply admire the language. There has been
fantastic new stuff in the 2005 and 2012 versions. Like the Ravenscar profile
for embedded and safety-critical systems.

And just like the "Young Lispers" did with Lisp, Ada had some sort of
renaissance, where a lot of Ada stuff was going into Debian and young Open
Source people dreamt of re-writing the Internet. :-)

I think it was partly the tool chain that repulsed me. There is really only
one compiler in existence if you're a hobbyist: GNAT.

Nowadays Adacore are pretty open, having a big Open Source compiler download
page and all, but back then, you could get GNAT, as shipped with GCC, or "big
GNAT".

GNAT, as shipped in GCC, was always at least one version behind, and since
Adacore always hired everyone who got familiar with the GCC frontend, it was
highly dependent on Adacore. So no "second source" where GCC maintainers could
step up if Adacore ever left. The Ada frontend just was a second class
citizen.

"Big GNAT" was something everyone would have liked to use, but it was
_expensive_. So that was out of the question.

Well, I got it, as part of a special university cooperation program, but only
after signing that I would only use it for the assigned coursework, that I
would never distribute it to anyone, etc.

Oh, and how did Adacore manage to keep it closed? I mean wasn't the
predecessor, on which GNAT was based, GPL'ed?

Yes, and "big GNAT" also was. Kind-of.

Rumor has it that Adacore basically told all of their customers unofficially
"sure, you're legally free to distribute it, it's GPL after all. Just don't be
surprised if we don't pick up the phone for a few hours when you need
support".

As far as I know it never really leaked. For some time I was actively looking
for some "big GNAT" archive on the net, but never found one.

------
jv2
Ada's main popularity problem appears to me to be that its advantages are not
immediately obvious when writing small programs. For example, comparing a
small program written in C and in Ada, the Ada version would appear extremely
verbose to most C programmers.

Ada's advantages start to really show when you deal with much larger pieces of
software; what appears verbose or overly-strict at a smaller scale provides
valuable assistance when dealing with large codebases.

Unfortunately, many seem to dismiss Ada after doing little more than looking
at small code examples and complaining about verbosity...

------
nl
We learnt Ada at university in the late 90s. It was OK, but nothing amazing.

Then I went into the real world, and my first job was with Delphi (Object
Pascal).

It's surprising no one discusses Delphi alongside Ada, because it had many of
the same features and the same Pascal ancestry, but better tooling, and was
aimed at commercial as opposed to defense work.

For a while it was very successful. It lost because of commercial reasons, but
for a long time it was a better Visual Basic.

------
acomjean
I used Ada(95) a lot. All the radars built by the company I worked for are
using it today.

Ada has a lot of things going for it that I miss. It was robust. Custom types
(this is an integer from 0 to 100; if it goes out of range, an exception is
raised). Records (structs) were nicely implemented. The package system really
worked well for large software.
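
A minimal sketch of both of those features together (the names are mine, not
from that codebase):

        with Ada.Text_IO;

        procedure Range_Demo is
           type Percent is range 0 .. 100;   -- custom integer type with bounds

           type Track is record              -- a record ("struct")
              Id       : Natural;
              Strength : Percent;
           end record;

           T : Track := (Id => 1, Strength => 100);
        begin
           T.Strength := T.Strength + 1;     -- 101 is out of range
        exception
           when Constraint_Error =>
              Ada.Text_IO.Put_Line ("range check failed");
        end Range_Demo;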

It was interesting. It suffered a lot from running on systems that are written
in C. Although you can make system calls through a wrapper, it was odd, so we
wrote packages to interface with the underlying C libraries we needed to call
(Java did this as part of its core). We had a lot of libraries to make the
networking and Unix IPC work the same across our HP-UX and Sun systems. The
displays were written in C, as it just wasn't really feasible in Ada.

It never reached the critical mass to get great tools, so debugging could be a
pain. It didn't have a lot of useful third party libraries that make a
language powerful and fun.

When your Ada compiled, there was a good chance it would work.

I do miss parts of it. When I see Go code sometimes I get flashbacks.

------
AceJohnny2
I know this is a tangent, but as someone who hasn't programmed Ada, I'm
struck by how similar the surface syntax of VHDL is to Ada's.

See:
[http://en.wikipedia.org/wiki/VHDL#Design_examples](http://en.wikipedia.org/wiki/VHDL#Design_examples)
vs:
[http://en.wikipedia.org/wiki/Ada_programming_language#Langua...](http://en.wikipedia.org/wiki/Ada_programming_language#Language_constructs)

Same "use packagename". Same "entity foo is", same block definition by "begin
[...] end entity"

Coincidence, or is there a historical reason for this?

Edit: The answer was in the VHDL article itself: "Due to the Department of
Defense requiring as much of the syntax as possible to be based on Ada, in
order to avoid re-inventing concepts that had already been thoroughly tested
in the development of Ada,[citation needed] VHDL borrows heavily from the Ada
programming language in both concepts and syntax."

~~~
sitkack
VHDL was modeled after Ada.

And yes, the languages look almost identical.

~~~
matthewjheaney
Right, VHDL was designed by the team at Intermetrics (which also had a long
association with Ada -- Ben Brosgol was the designer of the Red language that
lost out to Green).

~~~
sitkack
Holy Shizzle, you met Jean Sammet?! I am super curious as to what she thought
was wrong with Ada (not that I would disagree).

Reading her paper on FORMAC now,
[http://dl.acm.org/citation.cfm?id=155372](http://dl.acm.org/citation.cfm?id=155372).

I chatted with Ivan Sutherland a couple years ago. Nice, prescient fellow. His
Fleet project is really cool,
[http://arc.cecs.pdx.edu/](http://arc.cecs.pdx.edu/)

------
javert
So who here (at HN) is using Ada, what for, and do you expect it to become
more or less widely used in the future?

~~~
achille
Until leaving Oracle about a year ago, I'd been doing a lot of development in
PL/SQL, which is extremely similar to Ada and compiles to the same
intermediate representation early Ada compilers used (called DIANA).

Most of Oracle's E-Business Suite is written in PL/SQL, although some
components are being re-written in Java (ADF) in the new Fusion Middleware
stack.

If your company is using Oracle for payroll or financials (this includes lots
of companies, including Apple, Google etc.), it's likely some of the
customizations have been developed recently, and they've been developed in
PL/SQL. Many of the components in EBS have not changed in decades.

So to answer your question, developers at Google right now are writing in Ada
(well almost):

>
> [https://www.google.com/about/careers/search/?#!t=jo&jid=1144...](https://www.google.com/about/careers/search/?#!t=jo&jid=1144001&)

It's not easy to rewrite either. Writing ERP software is extremely hard,
because it can't be cleanly modeled in a few abstractions. There are a lot of
edge cases. I would suggest reading Ward Cunningham's
[http://c2.com/cgi/wiki?WhyIsPayrollHard](http://c2.com/cgi/wiki?WhyIsPayrollHard)

~~~
ams6110
Interesting nugget, I never knew that. I have done a lot of PL/SQL development
and thought it was pretty OK. Not my favorite language, but not the worst
either. I do recall the occasional obscure "Cannot generate diana for an
object" errors and never really knew what those meant; the Oracle
documentation was very vague and basically said it was a "contact support"
type of error.

------
Jtsummers
Technically not the linked page's title (the delightfully uninformative: Ada -
AdaPower.com - The Home of Ada), but the title of the post in it.

~~~
dang
That's just fine. More than fine: it looks like the author's title for what he
wrote, which is what we want.

From
[https://groups.google.com/forum/#!topic/comp.lang.ada/JwLB7Y...](https://groups.google.com/forum/#!topic/comp.lang.ada/JwLB7YvmKGM)
I surmise that the content is from 1998?

~~~
Jtsummers
Yes, forgot to look that up to put in the submission title. Thanks.

------
shepardrtc
When I first started college, the intro to computer science courses were
taught in Ada (this was back in '97). I don't have too many memories of it,
but I do remember how verbose the language was, and that drove me pretty
crazy. It's not very good for teaching programming to someone who has no idea
what they're doing, but after much more experience, I can appreciate the
language for what it is and what it can do.

And yes, I do realize that the verbosity and safety are good qualities and
certainly can teach a lot, but for a 17/18 year old who may have never seen a
programming language before, it's pretty crazy. It should definitely be taught
at higher levels, so as to reinforce good programming standards and practices.
Have a few years of wild screw-ups and crazy bugs and gain an understanding of
why you need safety and so on.

~~~
spc476
I, too, encountered Ada in college (around 1990, give or take a year) and
yes, compared to other languages I was familiar with at the time (BASIC,
Pascal, assembly, C, Fortran) it was quite verbose, and the compiler (we had
to log into the VAX system to use it) was _very fussy_ \--- to the point where we
joked that once we got past the compiler, the program would not crash (now,
whether it would produce the correct output remained to be seen).

It's amusing to think these days that C++ is probably larger and more complex
than Ada ever was (or even is) but that it crept slowly to that complexity
level, whereas Ada was "large" at the start but hasn't gotten much bigger. It
seems that if you want a large language to be popular, you should start with a
small language and grow it over a twenty year period.

------
ww520
Accessibility was a problem. As a poor student I couldn't afford to get a
cheap Ada compiler to play with, while Assembly, BASIC, C, Scheme, Lisp, and
Pascal were readily available for free or for little cost.

------
bsilvereagle
I think Ada is a really nice language. It's not my go-to, because you have to
type so much. Even with vim auto-completion, it still feels like I'm typing a
lot of code.

------
renox
Nice list, incomplete though: \- for a long time now, nobody has known (or
cared) what Tony Hoare said in 1980 about Ada, yet Ada isn't taking off. \-
GNAT is free, yet Ada isn't taking off.

So?

It's difficult to say, but IMHO what he forgot is the syntax. Ada was designed
to be verbose, and _I_ think that programmers don't like verbose syntax. Has
anyone tried to make a terser syntax for Ada? Elixir seems to be reviving
interest in Erlang, so..

------
cpeterso
I would love to know some examples of 12-to-1 language design arguments that
Jean Ichbiah vetoed. What is a good source for the Ada design process?

~~~
riffraff
maybe
[http://archive.adaic.com/standards/83rat/html/Welcome.html](http://archive.adaic.com/standards/83rat/html/Welcome.html)

i.e.

    
    
        The choice of identifiers for reserved words and attributes depends primarily on convention. 
         Preference is given to full English words rather than abbreviations since we believe full words to be simpler to read. 
         For instance procedure is used rather than proc (in Algol 68) and constant rather than const (in Pascal). 
         Shorter words are also given preference: for example access is used in preference to reference, and task is used in preference to process.
    

There should be an older version too.

------
reirob
"and nowadays, everyone seems to think exceptions are a Pretty Good Idea." \-
really? I don't think so.

------
jokoon
Side question: is there any C++ or C compiler extension which forbids some
bad programming practices, or some stricter C/C++ dialect?

Ada seems to have safety in mind, but isn't this achievable with static
analysis? Aren't there attempts to attach static analysis onto a compiler?

~~~
matthewjheaney
Oh, yes, that's what the SPARK language does. There's also CodePeer, which
does a static analysis of your Ada code. GNAT itself also does many static
checks.
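
For a taste, here is a minimal sketch of a contract in the Ada 2012 aspect
syntax that SPARK 2014 builds on (the function itself is illustrative); a
tool like GNATprove attempts to discharge the postcondition statically
instead of leaving it as a run-time check:

        --  Illustrative only: a contract a prover can attempt to verify.
        function Clamp (X, Lo, Hi : Integer) return Integer
          with Pre  => Lo <= Hi,
               Post => Clamp'Result in Lo .. Hi
        is
        begin
           if X < Lo then
              return Lo;
           elsif X > Hi then
              return Hi;
           else
              return X;
           end if;
        end Clamp;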

------
ameoba
Why Ada Isn't Popular (2014): Everyone outside the DoD wrote it off as dead
in 1998.

~~~
Jtsummers
Not just DoD (though, admittedly, there are a large number of military
projects in this list):

[http://www.seas.gwu.edu/~mfeldman/ada-project-
summary.html](http://www.seas.gwu.edu/~mfeldman/ada-project-summary.html)

~~~
sitkack
Sadly, the DoD doesn't require Ada anymore and many software failures can be
attributed to bugs that Ada would have caught at compile time. If I were doing
flight control software it would probably be Ada checked with Agda.

~~~
vezzy-fnord
Ada-based languages (such as SPARK) are still used in developing modern
aviation systems.

~~~
sitkack
[http://en.wikipedia.org/wiki/SPARK_(programming_language)](http://en.wikipedia.org/wiki/SPARK_\(programming_language\))

It would be interesting to have a version of SPIN [0] using this. This [1]
looks really fresh. And I love the idea of creating new languages out of a
subset. Almost all teams create an ad hoc language subset; formalizing it can
be really powerful.

[0]
[http://en.wikipedia.org/wiki/SPIN_(operating_system)](http://en.wikipedia.org/wiki/SPIN_\(operating_system\))

[1] [http://www.spark-2014.org/about/](http://www.spark-2014.org/about/)

------
tedks
Love the subtle slide into "Ada isn't popular because people are irrational
and believe things that are totally off-base from reality."

I've never hacked Ada; it seems to have been totally eclipsed by languages
like OCaml that give you a rich type system and a good degree of speed, but
when I was learning about programming and languages it seemed insane that
people didn't go for Ada. Was being able to be sloppy really that alluring?
Did people just love segfaults?

It makes a lot more sense to think of it in terms of irrational actors making
choices for largely social/political reasons rather than calculated cost-
benefit models.

Basically everything makes more sense that way. HN just had a big thread about
the death penalty that illustrated this perfectly. Some people just believe
things because they want to. It's really, really hard to be accurate to
reality.

What could have been done to make Ada popular? How can we hack humans to
prevent huge losses like this from occurring? What's the Ada of 2014?

~~~
sitkack
> What's the Ada of 2014?

Please post more meta questions. This is important.

