Apollo 11 Guidance Computer source code (github.com/chrislgarry)
652 points by uptown on July 7, 2016 | hide | past | favorite | 145 comments



Someone opened an issue: "Check continuity on O2 cryogenic tanks before allowing stir"

https://github.com/chrislgarry/Apollo-11/issues/3


This is a classic XY issue, suggesting a fix ("Check continuity...") instead of describing the problem ("O2 tanks exploded during Apollo 13 mission").

Further, I believe that the suggested fix is incorrect, or at least insufficient. The Apollo 13 investigation indicated that a list of factors led to the fan wires in the O2 tanks having damaged insulation. However, unless the wires were already short-circuited before stirring, checking continuity first would not have detected the short. Indeed, the tank was stirred twice earlier in the mission without incident. The investigation suggested that operating the fans itself may have eventually moved the wires into contact with each other, which combined with the damaged insulation, finally allowed an electrical arc and the resulting explosion to occur.

The correct fix is to upgrade the thermostatic switches which protect the tank heaters from overheating to accommodate 65 V DC, so that the fan wiring isn't damaged in the first place. In addition, the tank acceptance procedure should be amended to require switch cycling under load.

Source: http://history.nasa.gov/SP-4029/Apollo_13a_Summary.htm


Or maybe describing the problem in the explicit way you mentioned instead of the oblique way would have ruined the joke?


And it's been marked wontfix.


ELI5?




> you'll here mention

Ugh, "hear."

Another tidbit. Toward the end you hear mission control say "30 seconds." That's how much fuel is left[1]. Those guys had steel spines.

If you can't get enough of this stuff, I highly recommend "A Man on the Moon" by Andrew Chaikin[2], as well as the HBO mini-series produced by Tom Hanks and based largely on that book, "From the Earth to the Moon"[3]. "Failure Is Not an Option" by Gene Kranz[4] (flight director on Apollo 11 and 13, among other things) is also a good read.

[1] http://www.nasa.gov/mission_pages/apollo/apollo11_audio.html

[2] https://www.amazon.com/Man-Moon-Voyages-Apollo-Astronauts/dp...

[3] https://www.amazon.com/Earth-Moon-Four-Disc-Collectors/dp/07...

[4] https://www.amazon.com/Failure-Not-Option-Mission-Control/dp...


This one is better image quality https://www.youtube.com/watch?v=Jg80HZsv_js


Here's one with subtitles and interactivity http://www.firstmenonthemoon.com/


Wow! Just saw the whole landing again. I have seen the video before, but I never understood who was saying what. This is the best way all those conversations could have been visualized!


Wow wow wow indeed! This is a gem of a way to experience one of the most heralded moments of "mankind".


...and here's one with Neil Armstrong talking us through it...

https://www.youtube.com/watch?v=jfj2jqpst_Q&feature=youtu.be...


The high quality originals were taped over by NASA.

https://www.theguardian.com/science/2009/jul/16/moon-landing...


See? These are the types of reasons why the-moon-landing-was-staged conspiracy theories keep cropping up.


I have one word for those: retroreflectors.


And another: telemetry


Not that I believe in any particular moon conspiracy hypothesis, but the reflectors being on the moon do not prove anything other than that someone put them there. Whether humans went to the moon to put them there, or some other delivery system was used, is (strictly) logically up for debate.


I wonder, could this put those moon landing denial conspiracies to rest?


Conspiracy theorists typically suffer from enormous confirmation bias, so it's far more likely that this will lead to new claims about how the code couldn't possibly have worked on a real mission. I wouldn't even be surprised if someone found a transcription error, claimed it was an error in the original code, and then accused the maintainers of a cover-up for fixing it.


when click-bait grammar infiltrates regular English grammar....


This is amazing and contains so many gems.

I think this one is my favorite module: https://github.com/chrislgarry/Apollo-11/blob/master/THE_LUN...

		CAF	CODE500		# ASTRONAUT:	PLEASE CRANK THE
		TC	BANKCALL	#		SILLY THING AROUND
		CADR	GOPERF1
		TCF	GOTOP00H	# TERMINATE
		TCF	P63SPOT3	# PROCEED	SEE IF HE'S LYING

		TC	BANKCALL	# ENTER		INITIALIZE LANDING RADAR
		CADR	SETPOS1

		TC	POSTJUMP	# OFF TO SEE THE WIZARD ...
		CADR	BURNBABY



Take note of the KALMAN_FILTER.s source code file. See https://en.wikipedia.org/wiki/Kalman_filter for details. The filter is named after Rudolf Kalman, who recently passed away. (https://en.wikipedia.org/wiki/Rudolf_E._K%C3%A1lm%C3%A1n)
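
To get a feel for what a Kalman filter does, here's a minimal one-dimensional sketch in C. It's purely illustrative and bears no relation to the AGC's actual implementation; the state model and the variances q and r are made up for the example.

  #include <stdio.h>

  /* 1-D Kalman filter estimating a constant value from noisy measurements.
     x = estimate, p = estimate variance, q = process noise, r = measurement noise. */
  typedef struct { double x, p, q, r; } kf1d;

  static double kf1d_update(kf1d *kf, double z) {
      kf->p += kf->q;                     /* predict: uncertainty grows over time */
      double k = kf->p / (kf->p + kf->r); /* Kalman gain: how much to trust z */
      kf->x += k * (z - kf->x);           /* correct estimate toward measurement */
      kf->p *= 1.0 - k;                   /* uncertainty shrinks after the update */
      return kf->x;
  }

  int main(void) {
      kf1d kf = { 0.0, 1.0, 1e-5, 0.1 };
      double z[] = { 1.1, 0.9, 1.05, 0.98, 1.02 };  /* noisy readings of "1.0" */
      for (int i = 0; i < 5; i++)
          printf("estimate: %f\n", kf1d_update(&kf, z[i]));
      return 0;
  }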


BURN_BABY_BURN--MASTER_IGNITION_ROUTINE.s - https://github.com/chrislgarry/Apollo-11/blob/dc4ea6735c4646...


And the inspiration for the name [1]:

## At the get-together of the AGC developers celebrating the 40th anniversary
## of the first moonwalk, Don Eyles (one of the authors of this routine along
## with Peter Adler) has related to us a little interesting history behind the
## naming of the routine.
##
## It traces back to 1965 and the Los Angeles riots, and was inspired
## by disc jockey extraordinaire and radio station owner Magnificent Montague.
## Magnificent Montague used the phrase "Burn, baby! BURN!" when spinning the
## hottest new records. Magnificent Montague was the charismatic voice of
## soul music in Chicago, New York, and Los Angeles from the mid-1950s to
## the mid-1960s.

[1] https://github.com/chrislgarry/Apollo-11/blob/dc4ea6735c4646...


I really wish we could do an AMA with these two:

> # THE MASTER IGNITION ROUTINE WAS CONCEIVED AND EXECUTED, AND (NOTA BENE) IS MAINTAINED BY ADLER AND EYLES.

I had to look this up. NOTA BENE is Latin for:

  observe carefully or take special notice


Random note: when I was in the 5th grade, our math teacher taught us the meaning of Nota Bene, and used an NB sign to mark important notes when deriving formulas. I still use it to this day when making notes.


I think I'm going to start using NOTA BENE in my source code comments just to be fancy.

[EDIT] On second thought "HONI SOIT QUI MAL Y PENSE" would be much better ("Shame on whosoever would think badly of it")


Postgres' code is full of NB: :)


Suddenly my Latin and Computer Science majors make more sense together.


Hence N.B.


I never made that connection. TIL. Thank you!


Also the "NOLI SE TANGERE" for "DO NOT TOUCH".


Maybe you will find this article interesting:

Don Eyles Walks Us Through the Lunar Module Source Code

http://hackaday.com/2016/07/05/don-eyles-walks-us-through-th...


... and the HN discussion of this article 2 days ago:

https://news.ycombinator.com/item?id=12036864


I found Don Eyles' website[1],

> At MET 102:39:31 the best possible confidence builder occurred — throttle down, right on time. "Ah! Throttle down... better than the simulator" commented Aldrin, "Throttle down on time!" exclaimed Armstrong, their excitement palpable. In the official transcript of communications between spacecraft and ground during the powered descent, these are the only exclamation points[11].

goosebumps.

[1] http://www.doneyles.com/LM/Tales.html


NB. is how you indicate a comment in J.


"Note well"


… lest you fall into it.


I learned a new word: ULLAGE (the amount by which a container falls short of being full)


Also of note, in zero-gravity the ullage space must be forced to the opposite end of the tank before ignition so the fuel is at the intakes. Commonly this task is accomplished by "ullage motors" which fire to settle the fuel before primary ignition[1].

[1] https://en.wikipedia.org/wiki/Ullage_motor


Ullage (the word) has terrestrial applications too:

https://en.wikipedia.org/wiki/Ullage_(wine)


Did you notice the date:

> # Assemble revision 001 of AGC program LMY99 by NASA 2021112-61
> # 16:27 JULY 14, 1969

Likely the printing date, but I envisioned the last-minute all-nighters right away.


Considering that the binary was woven by hand using copper wire into the computer's core rope memory, I imagine the flight revision had to be finished months before the actual flight.


Also interesting are the names of two of the "General Purpose Ignition Routines" (I'm assuming; I don't speak Assembly):

  159: BURNBABY
  169: B*RNB*B*


There's a simulator, if you want to run it.[1] But it's just a simulator for the computer; there's no spacecraft attached.

There's a mod for Kerbal Space Program which gives it real solar system planets and dimensions. (KSP's world is way undersized so things happen faster.)[2]

There's another mod for Kerbal Space Program to give it real dynamics.[3] (KSP doesn't really do dynamics right; the spacecraft is under the gravitational influence of only one body at a time. This is why there's that sudden trajectory change upon Mun capture.)

Someone should hook all that together and do a moon landing in simulation.

[1] http://svtsim.com/moonjs/agc.html

[2] https://www.reddit.com/r/KerbalSpaceProgram/comments/1piaqi/...

[3] http://forum.kerbalspaceprogram.com/index.php?/topic/62205-w...


I'm really surprised there isn't a moon landing simulator based on the real code yet.


There's an Orbiter plugin[1] that simulates the rest of the spacecraft. IIRC it can interface with this simulator to get an accurate simulation of the computer.

[1] http://nassp.sourceforge.net/wiki/Main_Page


Love this line:

  TC	BANKCALL	# TEMPORARY, I HOPE HOPE HOPE
  CADR	STOPRATE	# TEMPORARY, I HOPE HOPE HOPE
  TC	DOWNFLAG	# PERMIT X-AXIS OVERRIDE

https://github.com/chrislgarry/Apollo-11/blob/master/LUNAR_L...



Line 666


Typical temporary code, which 46 years later is still there.


Whenever I put in temporary code like that, I always leave my full name, the date, and a snarky comment about "suuuuuure this is temporary". Seriously.


I wonder if the Space Shuttle has a TODO left in there somewhere.


Now that's some source I'd love to take a look at


Have they considered rewriting it in rust?


Well, considering that the program was "written" into the ROM by winding wire around ferrite cores, I'd say it was literally written in rust.


hehe

(A friend of mine, while searching for a new go-to programming language to replace Python, was weighing Swift against Rust and was put off Rust because of this attitude.)


Remind your friend that these comments are only left by a small vocal minority and are not representative of the project or its maintainers.

This is akin to not liking something because you don't like the people who already like it, despite how much you'd like it otherwise.


> Remind your friend that these comments are only left by a small vocal minority and are not representative of the project or its maintainers.

Thanks for mentioning this; I certainly will next time. The last time this came up, this was not an attitude I had previously noticed, although I did see a bunch of it on Hacker News the following week.

> This is akin to not liking something because you don't like the people who already like it, despite how much you'd like it otherwise.

I would totally agree with this, but at the same time, while not a technical reason, if one works primarily in a single programming language I can imagine the nature of the community would be a legitimate factor to consider. In this case, though, as you point out, it would be an inaccurate opinion of the nature of the Rust community.


Sorry for the late reply.

To your last point, I agree slightly as well, but my rebuttal would be that each person chooses how much and at what level to participate in a community, and which sub-communities they identify more closely with.

I can imagine a person being proficient and working in any language without the need to be involved with the community at all, or, if they do need to interact, doing so in a read-only manner.


Clue me in! What kind of comments are we talking about? Anything inappropriate here must have gone over my head.


The two aspects of my friend's perception of the Rust community are a) rewriting existing projects in Rust for the sake of having a version written in Rust[1][2], and b) the frequent trend on Hacker News of comments like "why didn't you write it in Rust" or "how about porting it to Rust" on posts about projects, and "should have written it in Rust" comments on posts about security bugs. As the parent to your comment points out, this is probably a small group in the community who comments often.

As I could see from the way my score on the post that started this discussion fluctuated, this is clearly a topic some are sensitive about or find humor in (or both!).

EDIT: I didn't mean to derail the discussion of a fascinating code posting, but I assume the comment I responded to that spawned this tangent, "Have they considered rewriting it in rust?"[3], was a joke about the other comments on HN suggesting things be rewritten in Rust.

[1]: From a pedagogical standpoint, this is obviously a potentially good way of learning a language, so if it's for learning and not pure calorie burn, I personally don't see it as wasteful.

[2]: <<possibly exaggeration warning>> I understand there are some motivations within (perhaps a small part of) the Rust community to replace the world's C systems code with more secure Rust code.

[3]: https://news.ycombinator.com/item?id=12049521


  		CA	A		# SHOULD NEVER HIT THIS LOCATION

The 1969 version of "This should never happen".

https://github.com/chrislgarry/Apollo-11/blob/dc4ea6735c4646...


It's amazing that this is all online now, and easy to browse. Lots of source here too http://www.ibiblio.org/apollo/links.html

FWIW, I did performance analysis of the guidance computer and the 1202 and 1201 alarms at the start of my ACM Applicative 2016 keynote: https://youtu.be/eO94l0aGLCA?t=4m38s


Hey there Brendan, that's a pretty awesome presentation (I'm only 20 minutes into the talk as of now). But you mention that the Apollo engineers expected the CPU load to be about 85% during descent, and that the guidance computer's kernel ran "Dummy Jobs" when no real jobs were run.

What are these "Dummy Jobs"? And why did they have to do this instead of just leaving the CPU idle?


> Why did they have to do this instead of just leaving the CPU idle?

This would require a CPU that was designed to idle.


Wow, wow! Looks like I don't understand the first thing about CPU design. Do CPUs have to be designed to IDLE? Can you throw some more light on this?


A basic model of a CPU is running an infinite loop like this:

  1. If interrupts not masked, check for interrupt
  2. Load instruction
  3. Advance instruction pointer
  4. Execute instruction

It doesn't ever stop - as soon as the current instruction is finished executing it moves on to the next one. So, if you don't have anything better for the CPU to do, you need to have it spin in a loop of instructions anyway.

More modern CPU designs typically include an instruction that means "halt until next interrupt" which actually stops the CPU from fetching and executing instructions.
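
To make that concrete, here's a toy fetch-execute loop in C (the opcodes are invented for illustration). Without something like a halt instruction, the loop below never stops, so "idle" software has to be a real program - e.g. a jump-to-self - that the CPU happily executes forever:

  #include <stdbool.h>
  #include <stdint.h>

  enum { OP_NOP, OP_JUMP, OP_HALT };  /* invented instruction set */

  typedef struct { uint8_t pc; bool halted; uint8_t mem[256]; } cpu;

  static void step(cpu *c) {
      uint8_t op = c->mem[c->pc++];               /* fetch, advance pointer */
      switch (op) {                               /* execute */
      case OP_JUMP: c->pc = c->mem[c->pc]; break; /* jump to operand address */
      case OP_HALT: c->halted = true;      break; /* the "modern" escape hatch */
      default:                             break; /* OP_NOP: do nothing */
      }
  }

  int main(void) {
      cpu c = {0};
      c.mem[0] = OP_JUMP;  /* an idle "dummy job": ... */
      c.mem[1] = 0;        /* ...jump back to itself, forever */
      for (int i = 0; i < 1000 && !c.halted; i++)
          step(&c);        /* bounded here only so the demo terminates */
      return 0;
  }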


Why do CPUs and GPUs run hotter when doing more intensive tasks?

In your last statement I could see it making sense where the CPU actually halts, but did prior CPUs always run at about the same temperature? Or do these idle processes throw fewer instructions at one time so it's not as overwhelmed?


Modern CPUs, GPUs and SOCs have power management states that disable entire submodules when they're not in use, by actually gating off the clock to them. If you run without power management enabled, you'll find that they run hot all the time.


> did prior CPUs always run at about the same temperature?

Basically, yes. But then they typically produced so little heat they had passive heatsinks, up to and including the Pentium II (~20 W TDP) and ATI Rage 128 that I used back in '99.


Wow, thanks for the explanation there. When we switched to modern CPUs that could actually halt, were there actual hardware/physical changes to the CPU? Or was it just a software change (i.e. adding a new instruction to the existing instruction set)?

Is that what made it difficult for us to design processors capable of "idling"? (i.e. a completely new hardware design)


It was a hardware change. In those older CPU designs, the external clock signal was directly driving a state machine, so for as long as the clock was applied, the state machine would go.

It's important to realise that there was no good reason to have the ability to stop the CPU in those days - power consumption by the CPU itself was truly trivial compared to the memory and peripherals it was attached to, and those CPUs weren't really damaged or worn out by running continuously. Having the CPU spin in software when there was nothing else to do was perfectly fine.


Normally the CPU clock runs continuously and every cycle the program counter increments (or gets changed by a branch instruction of some kind.) If you want to stop the CPU, you have to gate the clock somehow. Maybe a timer that you could configure and enable via software. But that's extra complexity.. and if you use dynamic logic (which is smaller and faster than static logic), you lose state when you halt. Spinning in a tight loop, on the other hand, doesn't require any hardware support.


# 3. AT PRESENT, ERASABLE LOCATIONS ARE RESERVED ONLY FOR N UP TO 5. AN N IN EXCESS OF 5 WILL PRODUCE CHAOS.

I want to just leave comments like this and have users be responsible for avoiding said chaos.


It's interesting to me that the AGC contains an implementation of a virtual machine that is used to perform the higher-level mathematical functions (called 'The Interpreter'). Some details are available in this PDF starting on page 74: http://www.ibiblio.org/apollo/NARA-SW/E-2052.pdf

It would be fun to do some research into the embedding of higher-level virtual machines in earlier computers. I'm thinking of 'The Interpreter' in the AGC as being an ancestor to 'SWEET16' in the Apple II (https://en.wikipedia.org/wiki/SWEET16), or the 'Graphic Programming Language' (http://www.unige.ch/medecine/nouspikel/ti99/gpl.htm) in the TI-99/4A.
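
The core idea in miniature - a small native dispatch loop interpreting a compact bytecode - looks something like this C sketch. The opcodes here are invented; the AGC's actual Interpreter was far richer, with vector and matrix operations packed into a very dense encoding:

  #include <stdio.h>

  enum { PUSH, ADD, MUL, PRINT, END };  /* invented bytecode for illustration */

  static void run(const int *code) {
      double stack[16];
      int sp = 0;
      for (const int *ip = code; *ip != END; ip++) {
          switch (*ip) {
          case PUSH:  stack[sp++] = *++ip;              break;  /* next cell is the operand */
          case ADD:   sp--; stack[sp - 1] += stack[sp]; break;
          case MUL:   sp--; stack[sp - 1] *= stack[sp]; break;
          case PRINT: printf("%f\n", stack[sp - 1]);    break;
          }
      }
  }

  int main(void) {
      int prog[] = { PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, END };  /* (2 + 3) * 4 */
      run(prog);
      return 0;
  }

The win is code density: one bytecode cell can stand in for several native instructions, which mattered enormously when every word of ROM had to be hand-woven into core rope.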


In case you're wondering what hardware this source code is for: https://en.wikipedia.org/wiki/Apollo_Guidance_Computer

The Apollo Guidance Computer (AGC) was a digital computer produced for the Apollo program that was installed on board each Apollo Command Module (CM) and Lunar Module (LM). The AGC provided computation and electronic interfaces for guidance, navigation, and control of the spacecraft. The AGC had a 16-bit word length, with 15 data bits and one parity bit. Most of the software on the AGC was stored in a special read only memory known as core rope memory, fashioned by weaving wires through magnetic cores, though a small amount of read-write core memory was provided.
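
As an aside on that parity bit: the AGC used odd parity, i.e. the 16th bit is chosen so the whole word contains an odd number of ones, which lets the hardware detect any single-bit error. A sketch in C (the bit layout here is illustrative, not the AGC's exact wiring):

  #include <stdint.h>
  #include <stdio.h>

  /* Append an odd-parity bit to a 15-bit data word. */
  static uint16_t add_odd_parity(uint16_t data15) {
      uint16_t ones = 0;
      for (int i = 0; i < 15; i++)
          ones += (data15 >> i) & 1;
      uint16_t parity = (ones % 2 == 0);  /* set the bit iff the count is even */
      return (uint16_t)((parity << 15) | (data15 & 0x7FFF));
  }

  int main(void) {
      uint16_t w = add_odd_parity(0x1234);
      printf("word with parity: 0x%04X\n", w);  /* flip any bit -> parity check fails */
      return 0;
  }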


Other than gaining satisfaction from the historical importance of this code, is there any conceivable way we can get some use out of it - like trying it out?

Even setting that aside, what is it I'm looking at? Assembly?


I made some art out of it (http://codeposters.io) and have it on my walls now.


These are great, well done.


I'm pretty sure there is a website that simulates this computer 100%

/e http://svtsim.com/moonjs/agc.html


The JS source is at https://github.com/siravan/moonjs.

It's derived from Ron Burkey's unmaintained Virtual AGC at http://www.ibiblio.org/apollo/, which also hosts many of the manuals and related documents.

But if you really want to be cool, you could build a real AGC from scratch: http://klabs.org/history/build_agc/


I've tried to play with that before, it is pretty incomprehensible even after reading the descriptions and Wikipedia article.

I feel like the source is likely clearer than that interface...


> Even setting that aside, what is it I'm looking at? Assembly?

AGC Assembly. Here's the manual: http://www.ibiblio.org/apollo/assembly_language_manual.html


Fascinating. Thanks for the link.


> is there any conceivable way we can get some use of it

I wonder whether there's any way of getting it into one of the autopilots for Kerbal Space Program.


Orbiter has the NASSP add-on, which supports Virtual AGC: http://nassp.sourceforge.net/wiki/Main_Page

Notably, it includes some unmanned auto-running missions, and some fictional missions, including a hypothetical manned flyby of Venus using the AGC.


Jeb won't need that. Jeb will fly without a puny computer. Because Jeb is a steely-eyed missile man.



"It is correct to say that we landed on the moon with 152 Kbytes of onboard computer memory." - Don Eyles

Ref: http://www.doneyles.com/LM/Tales.html

Amazing!


What was the development environment like for this code?


Pencil, eraser, paper, punch cards, punch card machine, punch card reader, computer, paper tape, paper tape punch, AGC computer, Command Module, LEM, Neil Armstrong, Buzz Aldrin, Michael Collins.

These last 3 are software QA.


When I worked on my startup, we built our complete hardware and then the software for it.

I had to write the drivers and display library for the 128x64 LCD display, with a simple scheduler, FSM and all! (Hard to mention all the work.) The bulk of the work I did was using paper, eraser and pens.

A lot of work in uncharted territory requires paper work. I realized the more I worked on paper, the more correct the code was, and overall it was faster to write (given fewer bugs).


People, processes, and environments like this:

http://www.wired.com/2015/10/margaret-hamilton-nasa-apollo/

https://en.wikipedia.org/wiki/Margaret_Hamilton_(scientist)

She was also among the first to invent a scheme that automated the coding and testing from specs for high reliability. Her case is the only one I follow, as I lack data on the rest.


Needle and thread to weave the core memory.


I recommend reading Digital Apollo[0] about the development of the computer, and actually the entire man-machine interface of early spaceflight. The machines were made in a milieu where computer-mediated control was highly controversial. (e.g. "A machine might work when everything is fine, but will never work in an emergency.") Essentially there was a huge argument between pilots and engineers about how much automation should be done. It was so bad that pilots even tried to insist on flying the rocket into orbit. (If I recall correctly, in simulations in a centrifuge, only Armstrong was able to successfully not crash the Saturn V in a manually controlled ascent.)

The other recurring theme in the book is the disturbingly short MTTF of flight computers during the mid-1960s. Statistically, NASA had to plan for a computer failure en route to the moon, and so repair-vs-replace became a serious issue. (Yeah, they seriously considered soldering in zero-g.)

[0] http://web.mit.edu/digitalapollo/


Alright, who wants to make a video game using this source code with me?



I often wonder about the electronics of 1969 and what was done to mitigate radiation problems.

For instance, the type of memory was called core rope memory https://en.wikipedia.org/wiki/Apollo_Guidance_Computer

For anyone interested, XPrize winner Karsten Becker talks to popular YouTube blogger David Jones about radiation and extreme heat and cold in space, and specifically about bit flips and how electronic parts are sourced for such endeavors.

https://www.youtube.com/watch?v=7JwNmdV2QPs

Interesting to me was the "paper work" cited in the interview for space-hardened components. In other words, people are concerned with stuff falling back to Earth (wouldn't it burn up?) or being used for not-so-friendly purposes (war).


> For instance, the type of memory was called core rope memory

Rope and core memory were the standard memory technologies of the day and very likely were not chosen for their radiation hardness. The fact is, solid-state memory became reliable and available in quantity only in the second half of the 1970s.


I will now be adding this comment to all of my code: HONI SOIT QUI MAL Y PENSE

https://github.com/chrislgarry/Apollo-11/blob/master/BURN_BA...


"Shame on him who thinks ill of it." It's almost as if the authors anticipated the need to administer percussive therapy, Buzz Aldrin style, to trolls of the far-distant future.


2.048MHz clock

16-bit wordlength

2048 words of RAM (4k 'bytes'/octets) using magnets?!

36,864 words of ROM

Ok this is a actually a really interesting read: https://en.wikipedia.org/wiki/Apollo_Guidance_Computer


Is this what I think it is?

PINBALL_GAME_BUTTONS_AND_LIGHTS.s


Unfortunately no. https://news.ycombinator.com/item?id=8063357

"A set of interrupt-driven user interface routines called Pinball provided keyboard and display services for the jobs and tasks running on the AGC. A rich set of user-accessible routines were provided to let the operator (astronaut) display the contents of various memory locations in octal or decimal in groups of 1, 2, or 3 registers at a time. Monitor routines were provided so the operator could initiate a task to periodically redisplay the contents of certain memory locations. Jobs could be initiated. The Pinball routines performed the (very rough) equivalent of the UNIX shell." - https://en.wikipedia.org/wiki/Apollo_Guidance_Computer#Softw...


The DSKY and PINBALL (something flashy with buttons) started out as a demo.

And that demo got us to the moon.

"Apparently, nobody had yet arrived at any kind of software requirements for the AGC's user interface when the desire arose within the Instrumentation Laboratory to set up a demo guidance-computer unit with which to impress visitors to the lab. Of course, this demo would have to do something, if it was going to be at all impressive, and to do something it would need some software. In short order, some of the coders threw together a demo program, inventing and using the verb/noun user-interface concept, but without any idea that the verb/noun concept would somehow survive into the flight software. As time passed, and more and more people became familiar with the demo, nobody got around to inventing an improvement for the user interface, so the coders simply built it into the flight software without any specific requirements to do so."

http://www.ibiblio.org/apollo/ForDummies.html
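
The verb/noun concept itself is easy to model: a verb says what to do, a noun says what to apply it to, and a table maps the pair to a handler. A toy sketch in C - the two table entries are from memory (Verb 16 Noun 36 monitored the AGC clock, Verb 37 changed the major mode) and worth double-checking against the real assignments:

  #include <stdio.h>

  typedef struct { int verb, noun; const char *action; } vn_entry;

  static const vn_entry table[] = {
      { 16, 36, "monitor AGC clock (mission time)" },
      { 37,  0, "change major mode (program)" },
  };

  static void key_in(int verb, int noun) {
      for (unsigned i = 0; i < sizeof table / sizeof table[0]; i++)
          if (table[i].verb == verb && table[i].noun == noun) {
              printf("V%02d N%02d: %s\n", verb, noun, table[i].action);
              return;
          }
      printf("OPR ERR\n");  /* the DSKY's operator-error light */
  }

  int main(void) {
      key_in(16, 36);  /* known pair: dispatched */
      key_in(99, 99);  /* unknown pair: error light */
      return 0;
  }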


If this were to be rewritten in a high-level language, I wonder what it would look like?


Has anyone written an AGC->Javascript transpiler?


Probably procedural C or C++ with some inline assembly bits


People could've really used a higher-level language compiling to optimized AGC (Apollo computer) assembly. Is there any reason why they didn't develop one? It seems it would've helped tremendously with productivity and verification (and a lot of the explanations and equations would be readable as code, not as non-executed comments).


It was pretty much taken as gospel everywhere at the time that NO compiler could match the speed and size of a well-crafted assembly language routine. Back then there were some noble attempts at building optimizing compilers, and probably the most notable one was IBM's ambitious FORTRAN H. But that's 50-year-old tech now, kids.

Remember also that memory was at a terrific premium. I don't have any specific knowledge about the AGC, but there's an interesting story I read once about a memory shortage in another project - Intel's 8086.

(If you'll permit me an OT digression...)

As the story goes, program space was so tight in the original microcode for the Intel 8086 microprocessor that there wasn't room to spare for a one-byte constant in the code! The architects decided that the AAM and AAD instructions in the 8086 set should have a required operand - 0x0A, or 10 - so that the instruction could refer to itself and know that you were operating in base 10!

A side effect of this is that Intel processors can actually execute AAM and AAD instructions in number bases besides 10; Intel never formally acknowledged that the instructions do this, and so in the NEC V20 and V30 chips - which were supposed to be Intel compatible - you couldn't change the AAM or AAD operand; it had to be 0x0A.
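
For the curious, AAM's arithmetic effect is tiny when modeled in C (the real instruction also sets flags, which this sketch ignores). The operand byte is the number base; the documented encoding is D4 0A:

  #include <stdint.h>
  #include <stdio.h>

  /* Model of x86 AAM ("ASCII adjust after multiply"):
     split AL into quotient (AH) and remainder (AL) by the operand base. */
  static void aam(uint8_t *al, uint8_t *ah, uint8_t base) {
      *ah = *al / base;  /* high digit */
      *al = *al % base;  /* low digit */
  }

  int main(void) {
      uint8_t al = 63, ah = 0;
      aam(&al, &ah, 10);                /* base 10: unpacked BCD digits 6 and 3 */
      printf("AH=%u AL=%u\n", ah, al);  /* prints AH=6 AL=3 */
      return 0;
  }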


This sounds interesting, any link to a second source on this?


  > Is there any reason why they didn't develop one?
“Now you have two problems.”

Edit: OK, maybe that deserves explanation. First, an optimizing compiler is a much bigger project than the AGC code. Second, current optimization techniques didn't exist in 1969, even assuming NASA had enough budget for machine(s) to run them. Third, you need to verify the binary anyway, and small source changes (in either the guidance code or the compiler) can lead to large binary changes.


I still think it's easier to verify two isolated projects - the compiler (which can be reused) and the Apollo code - than to mash all of the equation code together in assembly. I doubt the people were so used to assembly (especially the scientists) that they didn't need a magnitude more time to mentally parse/unparse logic and equations to machine code. (Also, the compiler doesn't have to be very high level; even just a glorified macro assembler with some syntactic sugar for math doesn't seem hard.)


Long story short, circa 1969 higher-level languages were mostly the purview of academics. No one had a computer powerful enough to run a language compiler/runtime, and programmers were real men(tm) who wrote assembly by hand because that was all they knew how to do. That's not to say that there weren't advantages to using a higher-level language, but in the case of something like the Apollo computer, they couldn't risk compiler bugs or slow code gumming up the system and potentially killing the astronauts.

Even today, certain ridiculously-high-performance or super low latency tasks (i.e. embedded devices, high frequency trading) drop down to the assembly level because that small bit of overhead the compiler adds (for such modern coddling conveniences as function calls and type safety) are just too much. It's not crazy, it's just what's needed for that particular job.


There were programming languages in widespread commercial use (e.g. Fortran and COBOL) and many others for niche applications or associated with particular manufacturers.


I could be wrong. But I think part of why they did it this way was so they could edit things on the fly if an emergency dictated so. Having to ship a compiler (it would likely have been an entire separate computer) would not have been feasible.

With things done this way and documented this way, they could (I'm fairly sure they did) have the pages printed out on paper. And if, say, there was some bug or new routine that needed to be added mid-flight, they could go to a keyboard input (no screen) and, with instructions from the ground, reprogram a section, add a new jump point, or change a value mid-flight.

Remember, computers were not what they are today. They did not have super powerful laptops and such with fancy tools.


> I could be wrong. But I think part of why they did it this way was so they could edit things on the fly

The AGC code was stored in ROM. Notably a specific kind of ROM called "rope memory".


"The bulk of the software was on read-only rope memory and thus couldn't be changed in operation, but some key parts of the software were stored in standard read-write magnetic-core memory and could be overwritten by the astronauts using the DSKY interface, as was done on Apollo 14."

Seems you were mostly right about not being able to change it. I was only somewhat right, because I assumed it would all be editable. I overestimated the technology they had back then. I would gather from that that there was far more fixed code than editable code.

Source: https://en.wikipedia.org/wiki/Apollo_Guidance_Computer


> "The bulk of the software was on read-only rope memory and thus couldn't be changed in operation, but some key parts of the software were stored in standard read-write magnetic-core memory and could be overwritten by the astronauts using the DSKY interface, as was done on Apollo 14."

Ah, yes right, there was something. I've been fascinated by the AGC for some time¹, yet I completely forgot about that. Time to hunt down the mission protocols to understand what this patch did.

------

1: … and the LVDC, which is the computer that controlled the Saturn V. Because it was based on the computers used in ICBMs, a lot of information about it is still classified. A lot of people conjectured that it was a reshaped IBM System/360, but when actual LVDC boards got torn down by electronics nerds over the past couple of years, some significant differences from the 360 were discovered.


Hey, uh, Apollo <n for n > 10 and n < 18>? This is Houston. Could you pull out the ROM banks and flip a couple of bits for us? We made a mistake or two. All you need are some wire cutters and a very steady hand.


The guidance and navigation functions were actually programmed in a slightly higher level language called "Interpretive".


Because writing control routines is not a big deal in assembly.

Especially on these CISC processors.


> # NOLI SE TANGERE

This should be noli me tangere, shouldn't it?


While "noli me tangere" is the Biblical phrase this alludes to, "noli se tangere" would mean in context "don't touch this." It's not that the programmer misremembered "noli me tangere" but that he played on the reference.


Yes, but I think the author meant "don't touch this". 'Se' is third person, but reflexive, so it should be: "noli id tangere"


Is it possible the author intended it to mean `don't touch this/it`, instead of `don't touch me`?


No credit to Margaret Hamilton?



So how did this guy get the source code and why is he the one publishing it?


How about the code apparently marked for deletion? https://github.com/chrislgarry/Apollo-11/blob/dc4ea6735c4646...

I've often wondered many things about the cleanliness, maintainability and style of such code (this particular system, in fact). It's fun to be able to actually poke through it.




If you want to be entertained really, really well, watch this: https://www.youtube.com/watch?v=4Sso4HtvJsw

It's an incredibly well done and at times hilarious narration of the moon mission. (Spoiler: contains a part where Armstrong overrides the automatic control and lands manually)

This is probably my favorite presentation ever.


Schematics are also available for the hardware it runs on:

http://klabs.org/history/ech/agc_schematics/

The CPU is built entirely from 3-input NOR gates.


I was 14 and listened to this live. We all were turning blue when those computer alarms were called during the first landing. Turns out it was a mistake in protocol; the computer was overloaded.


A less important aside: what license is this available under? Or what's the history behind this source release?

Edit: .s files indicate:

# Copyright: Public domain.


Since NASA is a US government agency, it doesn't hold copyright over any software its employees write; such works are in the public domain.


Wow this is surreal. I can't believe I'm seeing this. It's historical, thank you for sharing.


What if they receive a pull request?


this is super cool, awesome post


it's amazing they ever got anything to fly


From the code comments:

"This source code has been transcribed or otherwise adapted from digitized images of a hardcopy from the MIT Museum. The digitization was performed by Paul Fjeld, and arranged for by Deborah Douglas of the Museum. Many thanks to both. The images (with suitable reduction in storage size and consequent reduction in image quality as well) are available online at www.ibiblio.org/apollo. "

I mean, I realise that this is the least of the amazing achievements we're talking about here, but yea.. respect :)



