
The strange thing is that the “instruction” Halt and Catch Fire originated in a joke System/360 “green card” (reference summary) from the 1960s, along with Rewind and Break Tape, Execute (or Punch) Operator, and many others I've long since forgotten. Reference is sometimes made to an undocumented Motorola 6800 instruction, but the term is from the Big Iron era and has nothing to do with personal computing. It's really from the "Blinkenlights" era, when people commonly put up cartoons in the machine room showing a floating-point adder as a snake in a birdbath.

By the way, Big Iron's abilities in the physical world were real. I occasionally operated a System/360 with four 2311 disk drives, 7.5MB monsters with the form factor of washing machines. When the IBM sort program was running, these clunkers were apt to move around the machine room.




Our CS department newsletter once had cartoons featuring snakes illustrating both adders and half-adders, the latter being the victim of an ax.


It only just occurred to me that Halt and Catch Fire didn't have a scene of a printer catching fire, as best I can remember. Or maybe it did and I just forgot? That would have been classic...


Nope, but there was the scene at the end of season one where Joe does something with fire (avoiding spoilers).


Completely off-topic reply:

You had replied to a comment of mine asking what I’d meant by “Google figured out hardware didn’t matter on the server” by 2000 and how that changed the internet to make Bill’s vision come true. I do not check this site often and found I could not reply, and there was no email on your profile, so I am replying here:

To keep it brief, Google was the first company to figure out that server farms of cheap machines were cheaper and - counterintuitively - more reliable than massive, powerful servers. Solaris, DEC, and IBM would ring a bell for a lot of the sysadmins from that time. Once Google figured out you could decouple work from hardware, handle hardware failure in software, and drive utilization from pathetic levels (20% at most internet companies, averaged over a year) to the high 80s for free, the modern internet got a huge shot in the arm. At night, you could run batch jobs. It was a beautiful monstrosity. (A toy sketch of the failure-handling idea is below the footnote.) If not for the work by some of the guys (including 6 or 7 of my friends) who went from DEC to Google - ironically, due to the dot com crash* - AWS and cloud computing would not be a thing.

* - another ironic twist: If the utilization problem had been solved, a lot of these companies’ finances would have looked much, much better, to the point that the bubble burst would not have been nearly as devastating.
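To make "decouple work from hardware, handle hardware failure in software" a bit more concrete, here is a toy Python sketch. Everything in it (machine names, the failure rate, the scheduler) is invented for illustration - it is not Google's actual design, just the shape of the idea: keep tasks in a queue, hand them to whatever cheap node is free, and requeue them when a node flakes out.

    import random

    MACHINES = [f"cheap-node-{i}" for i in range(8)]  # hypothetical fleet of commodity boxes
    FAILURE_RATE = 0.2                                # assumed per-attempt failure probability

    def run_on(machine, task):
        """Pretend to run a task; cheap hardware fails now and then."""
        return random.random() > FAILURE_RATE

    def schedule(tasks):
        """Work is decoupled from any particular machine: a failed
        attempt just puts the task back in the queue for another node."""
        pending = list(tasks)
        while pending:
            task = pending.pop(0)
            machine = random.choice(MACHINES)
            if run_on(machine, task):
                print(f"{task} finished on {machine}")
            else:
                print(f"{machine} flaked; requeueing {task}")
                pending.append(task)

    schedule([f"batch-job-{n}" for n in range(5)])

The point is that reliability lives in the scheduler rather than in any one box, which is why "cheap and many" could beat "expensive and few".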


> when people commonly put up cartoons in the machine room showing a floating-point adder as a snake in a birdbath.

I really want to see this, but can't find it by googling. Maybe you know where to find it?


The last time I saw that one was in the University of British Columbia 7044 machine room, circa 1969. It was part of an “Anatomy of a Computer” cartoon, with a number of other visual puns. The adder is the only one I remember. Have never seen it online, alas.


Interestingly, looking around for this cartoon led me to these two specimens showing adders in chip and circuit design: https://micro.magnet.fsu.edu/creatures/pages/fulladder.html and https://micro.magnet.fsu.edu/creatures/pages/halfadder.html



