What Did Ada Lovelace’s Program Actually Do? (twobithistory.org)
410 points by weebst 6 months ago | 52 comments

Holy shit. I got shivers the moment I realized Sinclair actually finds and fixes a bug in the very first program, written in 1843.

  In her “diagram of development,” Lovelace gives the
  fourth operation as v5 / v4. But the correct ordering
  here is v4 / v5. 

I was recently at the Musée des Arts et Métiers in Paris, and they have several original Pascalines on display. So many marvels packed into a very tight space; it is one of my favorites in Paris and I would highly recommend it. I'm not sure what they have on Lovelace, I'll admit I was pretty overwhelmed with the collection. Their computer collection alone is a museum's worth, and they cover much, much more than just computer history (a lot of mechanical engineering, for starters). If they don't have this algorithm on display, they absolutely should.

A leather-bound first edition Lovelace including this algorithm recently sold for about $120,000. That actually seems like a bargain when you think about the level of the accomplishment. [1]

I agree with OP that most likely the bug is in the transcription and not present in the original manuscript.

[1] - https://www.theguardian.com/books/2018/jul/24/ada-lovelace-f...

It was the first published program, but not the first program.

For it to be the first, you have to believe Babbage designed a general-purpose computer that could accept instructions and then never wrote any instructions himself.


“I confirm that the manuscript evidence clearly shows that Babbage wrote ‘programs’ for his Analytical Engine in 1836-7 i.e. 6-7 years before the publication of Lovelace’s article in 1843. There are about 24 of such ‘programs’ and they have the identical features of the Lovelace’s famous ‘program’,” adds Swade. The historian says that the new tests are “unarguable” and that they “do not support, indeed they contradict the claim that Lovelace was the ‘first programmer’.”

It really depends what you mean by "program". Would you argue that the Euclidean algorithm from two millennia prior was a program?

Ada Lovelace was the first to realise that the Analytical Engine could perform arbitrary tasks beyond computational operations, and she wrote programs for those arbitrary tasks. Of course Babbage, who designed the hardware, had some idea of what programs it could run and presented examples, but he did not have the forethought to go further; as your article quotes, Lovelace had "an original understanding of where the power and potential of computers lay."

There has always been controversy over how much of the work in the Menabrea notes is Lovelace's and how much is Babbage's, and I don't think Babbage writing a few simple programs settles this controversy one way or another. Lovelace had unique and original insights that should not be downplayed.

> It really depends what you mean by "program".

A [computer] program is a sequence of instructions executable by a machine (the computer).

> Would you argue that the Euclidean algorithm from two millenia prior was a program?

It's in the name: EA is an algorithm.

By that definition, Babbage didn't write a program either. He didn't write opcodes. You still need a human "compiler" to translate Babbage's writing into something that could run on the engines.

Btw, this human "compiler" job was for a long time considered to be inferior work and part of the reason why the first professional software developers in mid 20th century were largely women: it was considered clerical work to translate algorithms from paper into computer programs. The Computer Girls is a good article that describes these attitudes.


Looks like you got downvoted into oblivion, but I believe you bring up an important point that needs to be discussed, and that's "intent".

The string "x=1" can be both a computer program and something intended only for humans to read. The Euclidean algorithm was written specifically for humans to understand, with no intent for it ever to be interpreted by a machine. The fact that someone at some point did implement it doesn't retroactively make it the first program. Lovelace's "diagram" was also not something a machine could directly execute. But the key difference was the intention: she specifically intended her instructions to be interpreted and executed by a machine.

This is already discussed quite a bit in the article:

"Babbage also wrote more than twenty programs that he never published.19 So it’s not quite accurate to say that Lovelace wrote or published the first program, though there’s always room to quibble about what exactly constitutes a “program.” Even so, Lovelace’s program was miles ahead of anything else that had been published before. The longest program that Menabrea presented was 11 operations long and contained no loops or branches; Lovelace’s program contains 25 operations and a nested loop (and thus branching)..."

A group working on building an Analytical Engine discovered that Lovelace's Bernoulli program almost certainly would not have been able to run on the Analytical Engine in Babbage's notebooks. The "instruction set" was missing some required features.


From what I understand, she clearly understood the limitations of what Babbage achieved and wrote her program to demonstrate the value of extending the hardware in order to run more universal "programs," like the one she writes!


Also note that Babbage apparently did later consider implementing exactly that missing hardware functionality:


Better yet, that version didn’t have conditionals the user could employ at all, so it isn’t even a proper computer but rather a calculator.

Is this the first documented instance of finding a bug in a computer program? That should be a more well known milestone in my opinion.

The author mentions that another blogger, Jim Randell, found the same error in 2015 when translating the tables to Python: https://enigmaticcode.wordpress.com/tag/bernoulli-numbers/
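For context on what those tables compute: Lovelace's program produces Bernoulli numbers. Here is a minimal Python sketch using the standard textbook recurrence (my own formulation, not Randell's translation and not Lovelace's operation sequence), with exact rationals since the values are fractions:

```python
from fractions import Fraction
from math import comb  # binomial coefficient, available since Python 3.8

def bernoulli(m):
    """Return the Bernoulli number B_m via the standard recurrence
    B_n = -1/(n+1) * sum_{k=0}^{n-1} C(n+1, k) * B_k, with B_0 = 1."""
    b = [Fraction(1)]
    for n in range(1, m + 1):
        s = sum(comb(n + 1, k) * b[k] for k in range(n))
        b.append(-s / (n + 1))  # Fraction / int stays exact
    return b[m]
```

With the modern convention this gives B_1 = -1/2, B_2 = 1/6, B_4 = -1/30; note that the numbering in Lovelace's note differs from the modern one.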

It's also debated whether these were bugs from Ada or transcription errors. But, yes, it does seem like an interesting milestone

>It's also debated whether these were bugs from Ada or transcription errors

Transcription errors would make this the first bug due to bit rot in a special sense! Still quite remarkable :)

git blame is a bit hard in this case.

What else would you recommend checking out in Paris by way of museums?

The National Museum of Natural History is unlike any other I've been to. It is fairly worn out but is great, and it is displayed in a very different format to what I'm used to. https://en.m.wikipedia.org/wiki/National_Museum_of_Natural_H...

Quiet and completely excellent, The Museum of Heritage and Architecture. https://www.unjourdeplusaparis.com/en/paris-culture/musee-ci...

The Army Museum is worth a visit - it is interesting to see a less Anglo centric view of the wars, particularly WW1. The paintings are deeply grim - not ideal if you have young kids. https://en.m.wikipedia.org/wiki/Musée_de_l%27Armée

The Museum of the Middle Ages is currently being renovated but is well regarded. https://en.m.wikipedia.org/wiki/Musée_de_Cluny_–_Musée_natio...

Probably a candidate for best museum in the world, the Louvre is a great way to completely waste a day with massive crowds, sore feet, food and water shortages, closed exhibits, intense heat and even more crowds. Get a map ahead of time and check that the things you want to see aren’t closed before you go.

If you want to see an outstanding collection of art without having to face the Louvre crowds, try the Musée d'Orsay: a vastly more pleasant experience, IMO. https://en.wikipedia.org/wiki/Musée_d%27Orsay

The Pompidou Centre (Centre Georges Pompidou) [1]. It's technically not a single museum, but several, including the largest museum for modern art in Europe (Musée National d'Art Moderne) and IRCAM, which is about music and acoustic research.

If you're into modern art and/or architecture, it's a must-see. The center itself is a beautiful mess of industrial design, and they have everything ranging from modern paintings to furniture to contemporary sculptures to technological inventions.

Two of the current exhibitions are an installation by Ryoji Ikeda [2], which is one of the most superb audiovisual pieces I've seen (you sit in a completely dark room and watch a huge projection) and Coder Le Monde [3], a history of generative art (digital 2D and 3D, as well as physical), computer graphics and visualization. Both are fantastic.

The nearby Palais de Tokyo is also fun, as is the Museum of Modern Art, which is next door to it.

Everyone mentions the Louvre, but it is incredibly crowded and touristy, and 90% of the paintings are portraits of 19th-century nobility. The gardens surrounding the Louvre are much more enjoyable than the inside, in my (possibly unpopular) opinion.

[1] https://www.centrepompidou.fr/en

[2] https://www.centrepompidou.fr/cpv/agenda/event.action?param....

[3] https://www.centrepompidou.fr/cpv/agenda/event.action?param....

The ossuary, for sure. The Louvre is meh. The park adjacent to the Louvre is good to go. Wander between the Louvre and the Eiffel Tower; there is a brasserie frequented by diplomats. There's pickup intramural football (soccer) by the nurses' college just across the river. Exploring on foot with only a rough plan is usually the best way to find treasures.

Looks like it is currently closed, but from an interesting infrastructure standpoint, the Musée des Égouts de Paris (sewers underneath Paris). https://en.wikipedia.org/wiki/Paris_Sewer_Museum

The Guimet Asiatic art museum is pretty spectacular:


My favorite is La Cité des Sciences, although it's not really a museum but more of an exhibition place for all things related to science.

Used to go there all the time as a kid, especially to read 2600 magazine in the library.

The Louvre must be on any list.

Great article. In some ways it reminds me of Jonathan Blow's presentation titled "Truth in Game Design" [1]

The author is saying that Ada deserves the title of first programmer both because she did publish an elaborate algorithm and because she understood the potential of the Analytical Engine better than Babbage and Menabrea.

Jonathan Blow's presentation is more about that second point and how computers (or rather, sufficiently complex dynamical systems) can "give you something back that you didn't put in".

Systems theory is nothing new, so this might not be the exact equivalent of thinking in 1842 about engines that can compose music, but it seems to me that it is not as widespread and well-understood a concept as it should be, and that public opinion is rediscovering it mainly through Deep Learning advancements.

[1] https://www.youtube.com/watch?v=C5FUtrmO7gI

Modesty by the creator is not due to lack of foresight; it is due to the exact nature of their understanding.

Just for anyone curious, there is an attempt to build Babbage's Analytical Engine, started by John Graham-Cumming's Plan 28 project [0]. Currently there is a lot of fragmented knowledge to be collected from Babbage's vast writings. To get a sense of the size of the project, here is an excerpt from their blog [1]:

>in the context of manufacturing methods Babbage calculates that the total number of teeth to be formed for a store with 1,000 registers would be 1,800,000.

[0] https://plan28.org/

[1] http://blog.plan28.org/2018/05/spring-2018-report-to-compute...

The "C translation" of the program by the article author:


It's definitely far from being trivial!

Also note the use of floats in the C version. A lot of simpler CPUs didn't and still don't have floats.

Fascinatingly, the Analytical Engine's number representation was 40 (and later 50!) base-10 digits, fixed point rather than floating point:


Moreover, a lot of simpler CPUs also don't have division in the instruction set; the Engine has it!
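For anyone unfamiliar with the distinction: a fixed-point value is just an integer with an implicit decimal point. A tiny Python sketch; the 40-digit width is from the comment above, but the 20/20 split between integer and fractional digits is purely my assumption for illustration, not the Engine's actual layout:

```python
# Decimal fixed-point arithmetic on plain integers: every value is
# stored scaled by 10^20, i.e. with 20 implicit fractional digits.
SCALE = 10 ** 20  # assumed split of the 40 digits; illustrative only

def to_fixed(num, den=1):
    """Encode the rational num/den as a scaled integer (truncating)."""
    return num * SCALE // den

def fixed_div(a, b):
    """Divide two fixed-point values, rescaling so the point stays fixed."""
    return a * SCALE // b
```

For example, fixed_div(to_fixed(1), to_fixed(3)) is the integer 33333333333333333333, read as 0.33333333333333333333: no floating-point hardware involved.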

Given that an emulator already exists, maybe somebody will now create the cards from Lovelace's program and run it as intended?


In case anyone missed it, these last two articles were written by John Walker, founder of Autodesk, Inc. and co-author of AutoCAD.

This is really a very gripping piece, and I'm not actually a programmer. It's also interesting mathematically and just as general history, for lack of a better term.

I'm a programmer and respected Ada Lovelace simply because she holds the title of first programmer. Now I think she deserves even more recognition, because she also envisioned the modern use of computers in everyday tasks. Amazing.

There is a very lucid graphical explanation of how the hardware of the Difference and Analytical Engines functions in a lengthy appendix to The Thrilling Adventures of Lovelace and Babbage. I really love this book and I can't recommend it enough. I wish it were adapted into animation! It is historical fantasy of the highest calibre.


This is a great article, and I also found myself shuddering to think what would happen if the world ever saw the first program I ever wrote (whatever that was).

For me, that's no big deal:


Even the horrible crap I wrote as a novice is probably less embarrassing to me than the stuff I wrote as a journeyman who shoulda known better.

Thankfully, the most embarrassing things are, like, spending 2 days configuring postfix and not realizing that I could just read the log files and it tells me specifically what isn't working. A valuable lesson, but embarrassment has its cost.

Hahaha, mine:

    mov ah, 2    ; DOS function 02h: write character to stdout
    mov dl, 7    ; character 07h = BEL, which beeps the speaker
    int 21       ; call DOS (debug.com assumes hex, so 21 = 21h)
    int 20       ; terminate program
Discovering that I could write my own programs with debug.com was earth shaking. Picked up some decade old book on it at Half Price Books (which went well with my decade old machine -- an 80s model IBM with integrated monochrome green screen) and never looked back.

Speaking of embarrassing errors, your comment just sent my pre-coffee self on a brief and fruitless visit to http://debug.com

I can smell your program. All that new hardware around me. The smell of dot matrix ink.

Yes! I remember my first experience with a PC in the 80s was with a Robotron 8086 PC. It did emanate a smell and I could hear interesting mechanical sounds from inside. The motherboard was huge and it had a small 14 inch monochrome green monitor on top of it.

That would emit a beep from the speaker, right? So long ago...

None other than beep.com!

The first assembly program I wrote and got paid a handsome sum for:

    jmp FFF0
The client was amazed that I'd figured out a way to automate the ctrl-alt-del monkeys who had been hired to reboot everyone's machines, for some reason, every night...


> she made groups of operations and in the text of her note specified when they should repeat.

Does anyone know what "should repeat" means? Was the machine capable of repeating or does this mean the human operator needed to do the repeating?

Babbage's notes describe what we would call branch-if-zero and branch-if-negative instructions, which you could use to implement repetition. Control flow was the least well developed part of the instruction set though, presumably because it would be trivial to implement compared to the arithmetic operations, so it could wait.
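As a toy illustration of how repetition reduces to conditional branching, here is a sketch in Python. The opcode names are invented for this example, not Babbage's notation, and I allow an unconditional JUMP for brevity (it could itself be a branch-if-zero on a register known to hold zero):

```python
def run(program):
    """Execute (op, arg) pairs on a toy machine with one accumulator,
    one loop counter, and a program counter. Invented ISA, for
    illustration only."""
    acc, counter, pc = 0, 0, 0
    while pc < len(program):
        op, arg = program[pc]
        if op == "SET_COUNTER":
            counter = arg
        elif op == "ADD":
            acc += arg
        elif op == "DEC":
            counter -= 1
        elif op == "BRANCH_IF_ZERO":  # exit the loop once counter hits 0
            if counter == 0:
                pc = arg
                continue
        elif op == "JUMP":            # unconditional jump back to the test
            pc = arg
            continue
        pc += 1
    return acc

# "Repeat the loop body five times": adds 3 five times, i.e. computes 15.
loop = [
    ("SET_COUNTER", 5),
    ("BRANCH_IF_ZERO", 5),  # 1: done when counter == 0
    ("ADD", 3),             # 2: loop body
    ("DEC", 0),             # 3
    ("JUMP", 1),            # 4: back to the test
]
```

The point is that a "loop" is nothing more than a conditional branch plus a backwards jump, which is exactly the machinery Babbage's notes sketch.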

If you're interested in that kind of thing I wrote a description of the entire instruction set a while back: http://h14s.p5r.org/2012/11/punched-cards.html.

Having just spent the last half hour on your blog I'm even more impressed by what the pair of them achieved.

Thanks for the in-depth, and wonderfully clear write-up.

The Analytical Engine was Turing complete so loops and conditionals would have been possible, but Babbage didn't get near building it.

The London Science Museum built a fully working Difference Engine to Victorian tolerances, showing it could actually have been built and worked in Babbage's day.

I heard of a project to build a modern working replica but I don't know if anything ever came of it.

That would be https://plan28.org/ which I'm involved in. Blog for updates: http://blog.plan28.org/

Thanks! Nice to see it's still progressing.

You can see the exact 'program' here [1].

It's really just a series of mathematical steps with a store of values in between. Notions such as branching, looping, or jumping, which would clearly have taken it from a series of mathematical steps to a program, are left completely unspecified. So her 'loop' was specified literally as "here follows a repetition of operations 13-23".

In other words, "should repeat" is left as an exercise to the reader.

[1] - https://i.imgur.com/HSWjbUl.jpg

It's easy to be dismissive from today's perspective, but I think it adequately conveys the algorithm. As I don't think Babbage ever published an instruction set, I'm not sure it could have gone a lot further.

Amazing article! There's a bug in it though. The LHS of the sum of cubes formula ends in n^2 instead of n^3.

There is an error in the squares formula as well. The correct formula is:

    1^2 + 2^2 + ... + n^2 = n(n+1)(2n+1)/6

Thanks a lot for pointing these out! I've corrected them.
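Both corrected closed forms are easy to sanity-check with a few lines of Python:

```python
def sum_squares(n):
    """Closed form for 1^2 + 2^2 + ... + n^2."""
    return n * (n + 1) * (2 * n + 1) // 6

def sum_cubes(n):
    """Closed form for 1^3 + 2^3 + ... + n^3 = (1 + 2 + ... + n)^2."""
    return (n * (n + 1) // 2) ** 2

# Compare against brute-force sums for a range of n.
for n in range(1, 100):
    assert sum_squares(n) == sum(k * k for k in range(1, n + 1))
    assert sum_cubes(n) == sum(k ** 3 for k in range(1, n + 1))
```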
