Many programmers talk about Ada Lovelace, who was certainly awesome, but they should be talking about Margaret Hamilton. Before anyone had heard of Dijkstra's work, Hamilton was straight up inventing everything from real software engineering to proper handling of fault tolerance and human interfaces. Here's what she was doing (see Apollo Beginnings rather than USL):
Also note the size of that printout of her program. Think about how many people could make one like that which doesn't fail in the field even when parts of its environment and hardware are failing, and while using low-level code with little to no tool support on 60s-era embedded hardware rather than a JVM, etc. Margaret Hamilton was the first person in the field of high-assurance programming that I've seen. A lot of firsts.
So, I give her credit where possible. The industry should even more so, as she wasn't a theoretical programmer like Ada Lovelace: she engineered actual software, co-invented many of its best practices, deployed them successfully in production, and then made a tool that automated much of the process from requirement specs to correct code.
While programmers and industry ignore her, at least NASA gave her positive mention as the founder of software engineering and ultra-reliable software, along with the biggest check they ever cut to an innovator.
I didn't know about her either. Thanks for the reference. The Wikipedia page on the language looks like it was hit with some nationalist propaganda: it claims credit for all kinds of concepts without backing them up. Her page looks legit, though. I'd like to see English versions of her original work and subsequent papers.
I also like that they tried automatic programming with algebraic methods. Their approach might be interesting. A few correct-by-construction software and hardware approaches discussed here took a similar route; the DDD toolkit for hardware comes to mind.
Hmm, we need more info on her to see just how much credit she's missing and give it to her.
> On page 117 Isaacson credits Hopper as "the technical lead in coordinating the creation of COBOL," a hugely important standard language for business programming that was defined jointly by several computer companies. Standards committee work is a difficult thing to dramatize, which is presumably why Isaacson gives COBOL no more than a passing mention, but as historian of technology Janet Abbate recently noted, his omission slights several women who, unlike Hopper, were on the technical committee in question. Among them is Jean Sammet, who made the largest single contribution to the design of COBOL. Sammet has stated that Hopper was "not the mother, creator, or developer of COBOL," an idea Hopper reportedly endorsed by calling herself the "grandmother of COBOL" rather than its creator.
> [...] Sammet has not been forgotten, as a fellow of the Computer History Museum and a winner of the IEEE Computer Pioneer Award, but sits on the wrong side of a growing chasm separating the handful of women chosen for public veneration from those whose names are known only to specialists.
> This is an example of what sociologist of science Robert K. Merton called the "Matthew Effect," the tendency for the most famous person involved in any collaborative project, however peripherally, to be remembered by the public as the key contributor. He named it after a passage in Matthew's Gospel, noting that "unto every one that hath shall be given, and he shall have abundance: but from him that hath not shall be taken away even that which he hath." Over time the famous become more famous, while the moderately well known are forgotten.
Really sorry to rain on your parade, and it certainly doesn't take away from her other great accomplishments, but she really didn't originate the term[0]. It's a nice story though.
As she herself said, her famous "debugging" story is likely just the first time a bug was caused by an actual arthropod.
(A moth in a relay in a truly large piece of big iron.)
I think it's interesting that this folk etymology has had such a long life; it's easy to debunk, and Admiral Hopper certainly doesn't need another feather in her cap, yet it persists.
Another great example. I've given her plenty of praise as well. I've always credited her with inventing English-language programming and compilers. Are there any before her mid-1950s work? That actually worked? ;)
So, Hopper is the first great programmer that women bring us; she helped lay the foundation for all HLL programming. Hamilton took it to the next step, true engineering practice, and kind of set the bar high. So she's my main mascot. Hopper is great, though. We should probably present them all as a group, with their contributions to the foundations.
Ada Lovelace's story is a jumping-off point to talk about other achievements of women in science, technology, engineering and mathematics. http://findingada.com/
You're quite right that Margaret Hamilton's achievements should be talked about more.
I see your point there. Her story did kind of become a meme about women in science in general. I surely don't mind it, but I think Hamilton's accomplishments are more empowering. And not just to women: to anyone trying to deal with the mess that is software.
> Also note the size of that printout of her program.
That printout is not the source code. I'm pretty sure it's a listing of test results. The Apollo computers couldn't store that much in either (the equivalent of) RAM or ROM.
> "In this picture, I am standing next to listings of the actual Apollo Guidance Computer (AGC) source code," Hamilton says in an email. "To clarify, there are no other kinds of printouts, like debugging printouts, or logs, or what have you, in the picture." It's just her and her code.
If she says so of course I'll concede the point. I don't really understand how it can be true, since the recorded specs of the AGC were 4K RAM and 64K ROM. That's only a few pages of text even in assembly code.
There is a scan of the program here [0], which is 1,746 pages. That seems much less than the number of pages in the photo (by maybe a factor of 10 or so?).
If I had to guess, maybe they printed out each revision of the program? Apparently there were seven different versions [1], so if the stack contains all of them, that would account for the difference pretty well.
As will I. Over 1,000 pages of source is believable and still one hell of an accomplishment at that quality level. I still consider the possibility that the stack included specs of successful or error states, given the importance of those for a working program. I know she liked including tons of error handling in everything. Most might have done that with tests, but she did it in the primary code. So there's room for stuff in there that's not pure source as modern developers would consider it.
Also, before the picture lies this statement: "Here's an amazing picture of her next to the code she and her colleagues wrote for the Apollo 11 guidance computer that made the moon landing possible."
I knew I had read it was the code somewhere but lacked a link beyond Wikipedia. Thanks for that one! The original claim stays and remains at the elite level it was. :)
I agree. As I told another, I essentially meant that as they're both great. Just more emphasis on the one that invented our field and put it into production nearly flawlessly. A lot more inspiring than even the heavy-hitters from academia because she didn't build "toy" systems. ;)
I thought I said "fault tolerance" rather than "hardware fault tolerance" to reduce confusion. I failed: inherently confusing, I guess, haha. To be clear, I was mainly referring to her use of the asynchronous, priority-based model, combined with self-checking/adaptation and human-in-the-loop priority displays. Most of what I read mentions those specifically as her contributions. Do correct me if she didn't invent any of these or was merely the first wise enough to apply them to production systems. I like keeping the facts straight so the right people get credit for building the better parts of our field. :)
Thanks for the Svoboda reference as I didn't know about him. His is our vanilla approach for reliability so he deserves credit. Might look up his papers.
I'll admit I was slow enough to check whether the AGS had a 32,768Hz oscillator: crystals are a common use of that number. Then I realized I was overthinking it, it's HN, and you're probably a programmer. How would you interpret that number? Confirmation with the calculator leads to my final, hopefully-correct answer.
"Should have been $32,768."
No, it should be $37,200 because Ms Hamilton only accepts unsigned integers on her checks. Insists you pay her rather than the other way around. :P
Only tangentially related, but it seems to me that if you want to work on cool software that does something novel and exciting, it's better to graduate with a degree in math or physics.
It's a view into a part of the software industry that is virtually never reported on in the news, at least not on HN. The client list is impressive. It's not really clear what they actually do, though. Seems to me like they maintain a development environment and sell support contracts for it?
It basically solved programming: semi-automated design, auto-coding, auto-testing, real-time stuff, cross-platform... the works. The problem is that the modelling language (USL) sucks, and the performance did too at some point. That was per a review from someone at NASA, I believe. You know your modelling language needs improvement when NASA scientists have trouble with it. ;) It's also expensive.
I think it might work out if they use a different language. Sugar-coat it, make it more flexible, or even build on an existing prover like Coq or HOL with a subset of its typical usage. It would make it easier for developers to pick up. It would make their end a lot harder, though, as the tool attempts to solve and integrate a series of Mega-Hard problems. Like EDA, but not binary.
That she and her crew got this far is impressive. It's the standard I hope to exceed someday with more usable tools that optionally perform better.
>> if you want to work on cool software that does something novel and exciting it seems like it's better to graduate with a degree in math or physics
Speaking more generally, you need domain knowledge outside of computer science, so that you understand which applications to tackle and how to apply the engineering knowledge in a specific vertical.
I've been focusing mostly on software engineering for the past 60% of my life (started at 9, am now 28). Last year I realized that even though I'm a good engineer who knows many tools and so on, one who wouldn't dare see everything as a nail despite having a hammer, when you look at me from the outside, from the perspective of someone who isn't an engineer, I am just a very good hammerer.
I look at everything through the lens of software engineering. Because at the end of the day, even though I have many different hammers, I'm still pretty much just a hammerer. To make an actual thing, I need a domain expert.
And that's kinda frustrating when you think about it.
You need good hammering specialists, though. The trick is to recognize that's who you are, and team up with domain experts you really click with. It just really helps to be both if you want to be a lone leader who comes up with visionary ideas.
001, 001:Digital Gold, Digital Gold, 001 Tool Suite, USL, Universal Systems Language, 001AXES, Function Map, FMap, Type Map, TMap, Object Map, OMap, Execution Map, EMap, Road Map, RMap, Xecutor, OMap Editor, 001 Analyzer, SOO, System Oriented Object, Resource Allocation Tool, RAT, AntiRAT, Object Editor, Primitive Control Structures, Development Before the Fact (DBTF), RT(x), VSphere, escherTMap, agent 001db are all trademarks of Hamilton Technologies, Inc.
A lot of novel and exciting software was, necessarily, written during times when software was a new concept. During those times, computer science programs did not exist, and the people who played with computers tended to come from a range of math-heavy disciplines.
This is why I think it's great to start with C. I worked in it for 3 years and found myself creating and utilizing abstractions. Looking back, I realized I had been writing design patterns myself. It's much easier to understand them bottom-up.
Not to knock the work that they're doing at all, but that website has the first use of the <marquee> tag I've spotted in the wild in a long long time! I had no idea Chrome still supported it!
She and Saydean Zeldin used to have a company called Higher Order Software. I met them decades ago, when they were promoting that. They have an unusual formalism which never caught on.
There was a lot of interest in formal techniques in the late 1970s and early 1980s, but the technology didn't go that way.
See this paper rejecting the technology for the Trident missile program.[1] Also this harsh critique from Dijkstra.[2] And this writeup on HN last year.[3]
I encountered HOS back when I was doing proof of correctness work, and didn't really understand how it got beyond very simple problems. But I thought, at the time, that was my fault. In retrospect, it's an approach for a certain class of control problems dominated by required relationships between certain variables. That's what flight control systems are all about.
Control problems are typically expressed as a set of "laws", equations which define what's supposed to be happening given the inputs and perhaps past inputs. Some of these are equations, and others are constraints. Checking those laws for contradictions and turning those laws into code is partially automatable, and that's what HOS was trying to do.
HOS only seemed to work well with the founders driving it. Somehow, they were never able to express clearly what they were doing. Sometimes that happens. Norm Hardy, who created KeyKOS, a capability-based OS for IBM mainframes, had that problem. The system worked great, but nobody understood what he was doing. He used to have an "explainer", Susan Rajunas. In both cases, the startups went bust.
Trying to get people to understand a complex formal system is very hard. Harder than developing one.
I knew Hamilton's achievements in the Apollo program were remarkable... but I hadn't known that she started when she was as young as 24, while being a mother. That's just incredible. I can't even imagine going through school as a father and being able to balance my time, never mind having to carry a child, never mind being a star programmer at MIT. And her husband was going through law school at the time, which means not only did she not have a stay-at-home husband to take over the parenting duties, she was the breadwinner.
Excellent chapter from the documentary Moon Machines about the Apollo guidance software (worth getting the whole thing on Amazon if you like this sort of thing), told from the point of view of the people who built the machines, rather than the astronauts.
A fascinating read about the incredible capabilities of the Apollo software is _Digital Apollo: Human and Machine in Spaceflight_ by David A. Mindell.
Those craft were quite likely capable of a _lot_ more than they were ever allowed to do by the astronauts - though I guess we'll never know!
>Once the code was solid, it would be shipped off to a nearby Raytheon facility where a group of women, expert seamstresses known to the Apollo program as the “Little Old Ladies,” threaded copper wires through magnetic rings (a wire going through a core was a 1; a wire going around the core was a 0). Forget about RAM or disk drives; on Apollo, memory was literally hardwired and very nearly indestructible.
I was at a talk by Richard Battin, also of Draper on the Apollo program, a few years back. One of the stories he told was of a number of the Apollo astronauts visiting Raytheon (where the core memory was being "sewn") and the general gist of the visit was "be really careful with your work or these nice young boys could die."
In "The Right Stuff," Tom Wolfe mentions quite a few situations where the Mercury astronauts would visit the factories and facilities in charge of building various components, and in all of these cases there was the same gist of "be really careful with your work or these nice young boys could die." So this practice existed from the very beginning of the American space program.
I wonder if any of that still happens these days. Probably much less, now that American astronauts hitch a ride on the Soyuz.
I worked on a program in the early nineties. We all met the test pilots who would fly the bird. It was abundantly clear (they made it clear) that lives were at stake, and that they wouldn't step inside unless they trusted you and your work. They had the unilateral right to declare the entire program 'no go', and rightly so. This is formalized as a "flight readiness review", but of course you meet the pilots before that. This was a Navy helicopter, not space, so I can't speak to what happens at NASA these days, but I can't imagine it's much different.
http://htius.com/Articles/r12ham.pdf
Her Wikipedia page gives her due credit and shows just how much she and her team pulled off:
https://en.wikipedia.org/wiki/Margaret_Hamilton_%28scientist...
While programmers and industry ignore her, at least NASA gave her positive mention as the founder of software engineering and ultra-reliable software along with the biggest check they ever cut to an innovator:
https://www.nasa.gov/home/hqnews/2003/sep/HQ_03281_Hamilton_...
Where would our systems and tools be if more attention had been put into her methods, or even just her principles, than into informal use of C? ;)