Introduction to Ada and SPARK (2009) [pdf] (msu.edu)
47 points by vezzy-fnord on Nov 2, 2015 | 37 comments



"Abstraction & composition ... Affects of a unit specified by its contract"

"Affects" there is clearly a noun, thus meaning "the conscious subjective aspect of feeling or emotion" (thank you, WordNet!). So, Ada is artificially intelligent?!?!

(I consider the authors to be a legitimate mocking target because they're supposed to be talking about a formal method; details matter.)

Anyway, I'd like to point out that most, if not all, of what was left out of Ada can, in fact, be handled formally with more or less difficulty. Some are trivial, such as gotos, while others are pretty damn hairy, like pointers and dynamic memory. One that is particularly galling is concurrency, where "Cannot reason about the effects of a [concurrent] module by examining its code in isolation" is simply false, although nondeterminacy does remain problematic.
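To make the local-reasoning point concrete, here's a hand-rolled C sketch of the monitor discipline that Ada's protected objects enforce by construction (the counter, its limit, and all the names here are invented for illustration):

```c
#include <assert.h>
#include <pthread.h>

/* A monitor-style counter: all state is private to this unit and only
 * touched under the lock, so its invariant (0 <= count <= LIMIT) can be
 * verified by reading this file alone -- no need to inspect callers.
 * Ada's protected objects impose this discipline by construction; in C
 * it is only a convention that nothing stops you from breaking. */
enum { LIMIT = 100 };

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static int count = 0; /* invariant: 0 <= count <= LIMIT */

/* Increment if below LIMIT; returns 1 on success, 0 if saturated. */
int counter_increment(void)
{
    pthread_mutex_lock(&lock);
    int ok = (count < LIMIT);
    if (ok)
        count++;
    assert(count >= 0 && count <= LIMIT); /* invariant re-established */
    pthread_mutex_unlock(&lock);
    return ok;
}

int counter_get(void)
{
    pthread_mutex_lock(&lock);
    int v = count;
    pthread_mutex_unlock(&lock);
    return v;
}
```

Nondeterminacy is still there, of course -- the *order* of increments across threads is unconstrained -- but the invariant holds regardless of interleaving, which is exactly the "reason in isolation" property.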

Overall, I think I like the SPARK approach, although I've never dug into it in detail.


Ada represents everything that is wrong with the U.S. Department of Defense's approach to solving their organizational problems.

Systems integration projects routinely fail in the military community, and the reason for this, of course, must be purely the result of so many different programming languages! Anyone who has worked in that industry can tell you that the problem Ada was trying to solve wasn't the actual problem.

Limited number of available programmers due to absurdly arbitrary security clearance process: Check
Highly bureaucratic culture which has multiple middle managers for every person who actually works: Check
Contract award process which is so complex that it favors proposal writing skills over software writing skills: Check
Bloated fiefdoms whose members actively fight any and all efforts to integrate to protect their own jobs: Check

None of these problems are ever going to be solved by a programming language.


> None of these problems are ever going to be solved by a programming language.

Indeed, but that wasn't the intention. The problem the DoD had was that disparate systems with decade-long service lives, written in a huge variety of languages, were going to cost them more, and many of those languages were relatively unsafe.

Ada was built as a safer systems language that could replace many of those prior languages, a role in which it very much succeeded. Dysfunctional organisations are going to be dysfunctional whatever language you use, but that doesn't stop one language being "safer" (against a given set of metrics) than another.

Lots of modern systems are still written in C or C++, but largely for organisational/real-world reasons: availability of programmers fluent in those languages, the sheer complexity of modern systems (you can argue that writing Ada on top of a Linux kernel might not be the sanest choice).

Ada still has its niche, though, and if nothing else it showed that a language can be designed with safety in mind; it also prototyped many of the features that cropped up in later languages.


> Ada still has its niche, though, and if nothing else it showed that a language can be designed with safety in mind; it also prototyped many of the features that cropped up in later languages.

Algol already showed that in the mid-60s, check Burroughs computers for example.

It was the C designers that decided to ignore safety.


BCPL, actually, given the really minimal computer resources the group that created it started out with. BCPL had one data type, the word, which for any given port was chosen to be the natural machine word size, as C's int ideally is.

Team UNIX(TM) continued that approach, since they also had meager computers to start with: a PDP-7 with, I suspect, not a lot of memory, then a PDP-11(/20) with the extra memory bank option, then a PDP-11/45, which, while a big and fast minicomputer in its day, still only allowed 64 KiB of instructions, 48 KiB of data, and 8 KiB of stack per program (the latter was the usual way to split up data space with its 8 KiB page MMU).


Still, memory requirements had nothing to do with ignoring safety, given what was already possible with Algol in 1961.


When you start out with 256 36-bit words on a good day and never get more than 1,024, it can cramp your style: http://pastebin.com/UAQaWuWG (by our very own nickpsecurity).


He used information given by me for that article. :)

Just for reference, here is the brochure for the B5000 from 1961.

http://www.cs.virginia.edu/brochure/images/manuals/b5000/bro...


Also see https://en.m.wikipedia.org/wiki/JOVIAL

Ada by the way is alive and kicking; nowadays it's known as PL/SQL in Oracle.


It's alive and kicking as its own language as well, hell they just released GNAT and its IDE for the rpi2.

Something on my long list of things to play with when I have time :).


Engineers' repulsion from Ada -- for being an icky DoD invention -- has probably held the art back several decades. We're only now getting measurable adoption of languages like Rust, Scala, Clojure, or Haskell that attempt to assist programmers in writing safe code -- admittedly in ways that mostly supersede Ada's admirable but now-outdated efforts.

Now Spark is introducing further safety features, and you're jumping on the bandwagon again to make sure that no one bothers to learn from it. Please stop.

If every CS101 student in the last 20 years had been taught to depend on, for example, bounded numeric types, verified pre- and post-conditions, and buffer and array overflow protection, the world would be a much better place. Let's try not to make the same mistakes (over and over) going forward.
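For readers who never met these features, here's a rough C analogue of what Ada gives you natively -- subtype range constraints and the Ada 2012 Pre/Post aspects become hand-written asserts (the function, its bounds, and the names are invented for illustration):

```c
#include <assert.h>

/* Ada:  subtype Percent is Integer range 0 .. 100;
 * In Ada the compiler inserts the range check on every assignment;
 * in C we must remember to write and maintain it ourselves. */
typedef int percent_t; /* invariant: 0 <= value <= 100 */

/* Ada 2012 contract aspects would express this declaratively:
 *   function Blend (A, B : Percent) return Percent
 *     with Post => Blend'Result in 0 .. 100;
 * Hand-rolled C equivalent: */
static percent_t blend(percent_t a, percent_t b)
{
    assert(a >= 0 && a <= 100); /* precondition (free in Ada) */
    assert(b >= 0 && b <= 100);
    percent_t result = (a + b) / 2;
    assert(result >= 0 && result <= 100); /* postcondition */
    return result;
}
```

The point is not that you can't write the checks in C -- you can -- but that in Ada they are part of the interface, visible to callers and to tools like SPARK's prover, rather than buried in the body.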


Don't know that it's so much repulsion as the simple inability to play the game without paying $$$$$$ for a very long time (until 1995 for GNAT). Even now, I remember reading that the fully free GNAT runtime is said not to be all that hot.

All that, and the costly tools above the language and runtime, is a major reason I'm ignoring this ecosystem in favor of (tentatively) the seL4 one. Hmmm, and how close are Ada and SPARK to being supplied by a single source?


NYU Ada/ED -- the free teaching compiler that went on to become GNAT -- was released in 1983, 4 years before the first release of GCC, the same year Stallman first published his Free Software Manifesto, and two years before the FSF was created. It's hardly for lack of compilers that people rejected Ada (I agree that the ecosystem there is lacking relative to other options in 2015).

Also, FWIW, seL4 is not an ecosystem. It's a mishmash of a Haskell prototype, Isabelle/HOL proofs, and C99 code put together by an extremely talented team. You might use some of their tricks, but there is no such thing as putting together an "seL4 ecosystem" software project.


NYU Ada/ED was an interpreter, and per http://www.adahome.com/Resources/Compilers/Ada-Ed.html:

Apart from the 100-odd tests of ACVC 1.11 that Ada/Ed failed, the major deficiency of the system is that, being an interpreter, it does not implement most representation clauses, and thus does not support systems programming close to the machine level.

So it sounds suitable for getting your feet wet, but not "industrial strength", and not useful for a lot of the things we want to do.
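(For context: representation clauses are how Ada pins down exact storage layout -- bit positions, record sizes -- for device registers and wire formats. The closest C analogue is bitfields, though unlike Ada, C leaves bit order and padding implementation-defined. A sketch with an invented register layout:)

```c
#include <assert.h>
#include <stdint.h>

/* Ada representation clause (illustrative):
 *   for Status_Reg use record
 *     Ready at 0 range 0 .. 0;
 *     Error at 0 range 1 .. 1;
 *     Mode  at 0 range 2 .. 3;
 *   end record;
 *   for Status_Reg'Size use 8;
 * The compiler then guarantees that layout on every target.
 * C bitfields express the same intent, but the standard does not
 * guarantee bit order or packing -- hence the size check below. */
struct status_reg {
    uint8_t ready : 1;
    uint8_t error : 1;
    uint8_t mode  : 2;
    uint8_t pad   : 4;
};
```

This is exactly the kind of "close to the machine" facility an interpreter like Ada/Ed couldn't honor, since there is no real hardware layout to map onto.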

As for seL4's "ecosystem", well, it's certainly not polished like AdaCore's offerings (or so I assume, which for things like this is very iffy), but it is something I believe I can clone, and since I plan to start with an seL4 port, it qualifies for me.


We used to pay for compilers for any language, not just Ada.

You surely are aware that one of the reasons GCC became anything useful was the switch to paid SDKs in the UNIX world, which attracted developers not willing to pay for them?


That's why I used 6 dollar signs, instead of my usual 3 (which I'll admit is not super clear). Or at least it was my impression that Ada compilers cost a lot more.

And I most certainly remember that detail about GCC, having started playing that game no later than 1992 with Sun.


The studies the military and defence contractors did on it showed the opposite: vast improvements in defect rate, maintenance, and ease of integration. Productivity usually had a small boost due to less debugging. I also used to have an article with a Boeing manager citing a situation like you described and bragging that their choice of Ada (plus good collaboration) made integration painless.

So, the empirical and anecdotal evidence contradicts your claim about Ada. It used specific design decisions to counter specific kinds of real-world issues they encountered. Now, your claims about DOD's big issues and how Ada can't solve those problems are spot on.


There's an undertone to this post that seems to be that Ada is quite a bad programming language. Are you certain it is?

I'm not buying the old argument that it tries to do everything/too much and is therefore no good, which seems to be the main thing people mention about Ada. A lot of languages try to do a lot of things, and most of them let you choose what parts of the language you use.

A few years back I was mainly doing C++, and looked a bit into Ada -- it seems pretty alright. Nothing overly negative compared to the C++ work I was doing, and quite a few nifty things. Tooling and community are lacking, though.


What about the security clearance process is arbitrary?

As far as the proposal writing goes, if a group can't draft a decent proposal, how are they going to produce good documentation? I'm not seeing how grantsmanship presents an unfair bar here.


I don't know about arbitrary, but it is detailed, expensive (Yes, they really do send someone to talk to everyone you list on the forms in person.), and takes a very long time. (I've had several relatives with both engineering and military backgrounds doing inventory-related reports for several months while waiting for their DoE clearances to be processed.) This seems odd given that I've also heard, less reliably, that they grant something like 96% of the clearances.

In practice, having a security clearance becomes a job requirement of the same level as experience and ability to perform the job. Since cleared jobs don't pay several times more than jobs without the requirement and there's no way for an individual to get a clearance without the sponsorship of an employer, job-related skill tends to suffer.

Government contracts are a very specialized skillset[1] that is completely unrelated to any kind of development effort. People with that skillset bring in revenue and are thus assets; people doing development are expenses.

[1] http://www.dau.mil/default.aspx


> In practice, having a security clearance becomes a job requirement of the same level as experience and ability to perform the job. Since cleared jobs don't pay several times more than jobs without the requirement and there's no way for an individual to get a clearance without the sponsorship of an employer, job-related skill tends to suffer.

No, but they do offer career security, i.e. not having to become a consultant or whatever if you're not a superstar when you turn ~40. I also have a couple of acquaintances with high-level security clearances, and they can talk about the tools they use; at least in their cases they're not radically different from what's normally used outside that world. E.g. Sun gear back in the '90s when it was the best, C#, Java, various generally obscure document imaging tools that are standard in the field, etc. And plenty of FOSS; the government loves it, or at least the intelligence agencies do.

If the people doing development are working on a cost-plus basis, they're anything but an "expense"; they're your lifeblood. While I was working in a mostly commercial arm of a mostly government-contracting smallish business (http://radianinc.com/) in the mid-90s, the money people were not happy in the first few months of each year, when there are so many Federal holidays: the lack of the "plus" while paying everyone's salaries pinched cash flow.

I also did a lot of non-classified Federal government fulfillment work in the '90s and early '00s, and it wasn't radically different from the commercial world. More legal work was required, and there were different sorts of sales processes in some cases, but e.g. at my first such job, when our "closer" salesman took me to NASA or a commercial firm to explain how we were going to fulfill our proposed commitment, there was essentially no difference. (State government contracting, and I'm sure the stupider Federal levels, are a different matter, and to be avoided at all costs.)


Career security is exactly it, especially if by "not a superstar" you mean "not particularly competent". And yes, the folks I've run across at Redstone have indeed been using typical commercial tools like SAP (favorite quote: "Why is this database column named 'herst'?"), DOORS, and some really horrible contract management stuff, and the usual assorted (SAP and otherwise) portals. And indeed, Java; I think I heard that the Army guys upgraded past 1.4 a couple of years ago.

I don't want to get started about cost-plus contracting, but here's a tip, if you're ever in that business again: hire friendly, personable, but not overly skilled people.

The government side is divided into two groups: program management, who are really only concerned with the money going out and are the other half of the contracting follies (see my DAU link previously), and project management: people who are on one hand interested in getting their work done and on the other in increasing the size and importance of their empires. Neither would know technical work from a hole in the ground, or if they did, it was so long ago as to be irrelevant.

The project folks are on your side since the more there are of you, the more important they look. If your employees are good at being friendly with them, you're golden. All you have to do is convince the program management people that you're working really, really hard and that the problems are very, very difficult (hence the "not overly skilled" part), and the project managers will do most of that for you.

Just don't ever, ever, convert a cost-plus contract (where you as the prime contractor have cost-plus subcontracts) to fixed-price while not changing the subcontracts to fixed price. (And I'm looking at you, SAIC.)

There is one difference, in my experience, between contracting commercially and for the government. The overhead for me when I was doing work for IBM (in the '90s and '00s) was 13-15%; the overhead for me doing work for NASA was somewhere close to 100%, judging by the GSA schedules for related contracts---I never could find the schedule for UNITeS or EAST.


Disregarding that it's not a solution to the DoD's problem as such, does Ada/SPARK have merit for writing embedded software?

I am coincidentally researching if I could rewrite parts of a personal project in Ada (from C) to improve reliability, and so far have failed to conclude either way.


I would say so, especially considering that is where the majority of Ada/SPARK use is today: space flight, planes, assembly lines, automotive, etc.

There is a blog post about rewriting a drone's control system in SPARK, if you were curious: http://blog.adacore.com/how-to-prevent-drone-crashes-using-s...

It does exactly what you were planning: taking C code and rewriting it in SPARK in an embedded environment.


ISRO seem to be quite happy about using Ada on their Mars Orbiter - discussed somewhere in https://www.youtube.com/watch?v=KFJUlVbTXII


Have you considered MISRA C?

Ada/SPARK have merit if you really want to do hard real time and formal methods. That seems overkill for a personal project?


A problem there is cost for a personal project: are there any free or low-cost MISRA C static checkers out there?

Frama-C might be worth looking into if the goal is improved reliability, and they'd be able to leverage their existing C skillset.
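(For the curious: Frama-C works off ACSL contracts written as special comments, so annotated code still compiles as ordinary C with any compiler. A minimal sketch, assuming the WP plugin; the function itself is invented for illustration:)

```c
#include <stddef.h>

/* ACSL contract: Frama-C/WP can attempt to prove that this function
 * never reads out of bounds and returns an index of a maximum element.
 * To a plain C compiler the annotations below are just comments. */
/*@ requires n > 0;
  @ requires \valid_read(a + (0 .. n-1));
  @ ensures 0 <= \result < n;
  @ ensures \forall integer k; 0 <= k < n ==> a[k] <= a[\result];
  @*/
size_t max_index(const int *a, size_t n)
{
    size_t best = 0;
    /*@ loop invariant 1 <= i <= n && 0 <= best < i;
      @ loop invariant \forall integer k; 0 <= k < i ==> a[k] <= a[best];
      @ loop variant n - i;
      @*/
    for (size_t i = 1; i < n; i++)
        if (a[i] > a[best])
            best = i;
    return best;
}
```

That "annotations are comments" property is a nice migration path: you can start proving properties of existing C without changing what ships, which is a much smaller step than a rewrite in SPARK.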


Lowest I know of is Gimpel's PC-lint/FlexeLint; single-workstation prices are $389 to run on Windows, $998 for obfuscated C source code: http://www.gimpel.com/html/index.htm

The MISRA documentation is not very expensive and is useful by itself: £15 + VAT for an e-copy: http://www.misra.org.uk/?TabId=58


Or if you want to go in the complete opposite direction, formal proofs of your system down to the binary, look at seL4.


So you have a formally verified kernel. And then what?

It's possible to run L4 processes written in SPARK on top of seL4.


My suggestion of a wildly different approach was to more or less replicate what NICTA et al. did with seL4 in your own project. There's also the Coq ecosystem, which includes a verified C compiler that is free for non-commercial use.


That's possible, but incredibly tedious.

It may be easier to just generate binary code from assembly in Coq (as Microsoft Research has demonstrated: http://research.microsoft.com/en-us/um/people/nick/coqasm.pd...) than to shoehorn a C compiler into a verified context and then run proofs over that for non-trivial problems.

seL4 was a huge project, and once you start reasoning about L4 APIs, it will only get worse (e.g., it's not simple to determine who has access to the pages in any L4 AddressSpace).


Somebody who's been there! It is often just as bad (worse, even?) on the hardware side, where I worked for a while. Although I should add there was also a lot of corruption of sorts: the kind where those trying to build their little empires will make fraudulent claims and use classification as a way to hide their empire building.


"one of four proposals selected as DoD’s language"

Does anyone know about the proposals that weren't selected?


This looks like a good place to start. http://iment.com/maida/computer/redref/index.htm


At the risk of sounding all LMGTFY: the process by which Ada was defined/selected/iterated into its current form is exhaustively documented. The proceedings of the DoD High Order Language Working Group are the comprehensive history, but you can find a lot of the key summary documents at http://archive.adaic.com/pol-hist/. The chair of the committee, Col. Wm. Whitaker, wrote a pretty interesting personal history of the process at http://archive.adaic.com/pol-hist/history/holwg-93/holwg-93.....


Thought this was going to be about Adaboost on Apache Spark, but it is not!



