I found this post about ARPA Net a few months ago and finally had time to read it. Anyone have other great reads or know of books on computing technology?
I think knowing the history of how we got to where we are helps to understand it more.
Top of my head, here are some books on specific histories I enjoyed. All old, but there are timeless nuggets in them.
The Soul of a New Machine, previously mentioned. Where I first learned about mushroom management.
Just for Fun: The Story of an Accidental Revolutionary [0] was a fun bio of Torvalds from 2001.
The Mythical Man Month [1] offers some insight into the management and thinking that went into OS/360
Masters of Doom [2]: offers an enjoyable history of the shareware years and the rise of id software
The Multicians site [3]: a collaborative history of Multics, one of the most influential operating systems.
The Mother of All Demos [4]: even better than Steve Jobs keynotes
Steve Jobs's iPhone introduction [5]: I'm not a huge fan of Mr. Jobs, but this is one of the best presentations ever. It's not history, per se, but fascinating to rewatch with today's eyes.
It was exciting and inspiring and fun. It covered a ton of different inventions that came out of Bell Labs from its inception through the 1960s. It had some fun factoids, like how they selected the wood for the telephone poles. Just an awesome book.
Could not agree more with The Idea Factory; the author's talks are also great. I also came away feeling inspired and nostalgic, and the book is very fun. There are some real characters in it.
The Hacker Crackdown: Law and Disorder on the Electronic Frontier
Published in 1992, released on the Internet as a free ebook in 1994.
"The book discusses watershed events in the hacker subculture in the early 1990s. The most notable topic covered is Operation Sundevil and the events surrounding the 1987–1990 war on the Legion of Doom network: the raid on Steve Jackson Games, the trial of "Knight Lightning" (one of the original journalists of Phrack), and the subsequent formation of the Electronic Frontier Foundation."
Wow, can't believe nobody has mentioned the classic, _Hackers_, by Steven Levy.
Another book I found interesting, but is focused more on the early history of Silicon Valley as a whole, not just computing, is _Valley of Genius_, by Adam Fisher.
Valley of Genius is a fun way to become acquainted with some of the Valley's history (for those who haven't read it, it's an oral history composed of interview snippets from a bunch of key pioneers and people who were in the right place at the right time).
Not an article, but Kirk McKusick (a very longtime developer of BSD) gave a few talks on BSD and UNIX history (and the beginnings of TCP/IP), which are on YouTube: https://www.youtube.com/watch?v=bVSXXeiFLgk
It's a treasure trove of computer history, gathered from countless sources. I've spent many happy hours in the company of the author, amused and educated by the stories.
This book is really all OP needs, as it covers computing from Turing to MS-DOS. Although ostensibly a biography of Licklider, it uses him as a skeleton to tell a much wider story about the idea of the computer as an extension of the human brain (as opposed to just a fast calculator).
Seconded. One of the more impressive books I've ever read, the way the author tells the story of the computer, which has a very complex history. He does a great job weaving together all the bits and pieces while also keeping it engaging and human.
While much more on the technical side of things, both Ken Shirriff and Curious Marc are great resources for a lot of the historical and technical aspects of things.
I'd highly recommend both of them for looking at specific topics of how the computing hardware worked:
UNIX: A History and a Memoir by Brian Kernighan is excellent. It traces the history of Unix from the early days at Bell Labs through the period where it reached widespread use in the rest of the world. It's a fascinating history of people building the command line tools that we still use today, and the environment that allows those tools to be combined in powerful ways.
I came away with a much clearer picture of how these systems were developed, and I'm a little better on the command line now that I understand the original philosophy.
I am pretty proud of this piece that I wrote on my employer's blog. It started out as training material to help newbies understand why email is so broken. It took tons of research, and goes way back to pre-ARPANET days. It's high level, for sure, but goes deeper than most high-level writings do:
Not exactly an article but Grace Hopper's 1978 HOPL keynote makes for good reading: https://dl.acm.org/doi/abs/10.1145/800025.1198341 .. she rails against the then 'Establishment' and thinks back to days when people thought assembly language was a bit fancy and modern. There's a lot about the earliest days of programming in her keynote.
We interviewed the inventor of PowerPoint for our blog, and he gave a fascinating inside story of how he almost sold to Apple before eventually sealing a deal with Microsoft:
In addition, the inventor of PowerPoint has written an entire book on its development, Sweating Bullets [0], which covers this anecdote in great detail.
> ...is a 1974 book by Ted Nelson, printed as a two-front-cover paperback to indicate its "intertwingled" nature. Originally self-published by Nelson, it was republished with a foreword by Stewart Brand in 1987 by Microsoft Press.
> In Steven Levy's book Hackers, Computer Lib is described as "the epic of the computer revolution, the bible of the hacker dream. [Nelson] was stubborn enough to publish it when no one else seemed to think it was a good idea."
Just finished The Computer Boys Take Over, by Nathan Ensmenger: fascinating read about the deep roots of computing culture. Many of the debates we have today were already raging in the 1950s, 60s, and 70s, including:
* professionalism vs individual creativity
* the value of credentials
* terrible interview practices
* gender roles / biases
* theory/academia vs pragmatism/industry
Recommend the book for getting some historical perspective on these topics.
* Datapoint: The Lost Story of the Texans Who Invented the Personal Computer Revolution, by Lamont Wood.
* The Soul of A New Machine, by Tracy Kidder.
* Books by George Dyson (son of ...): Darwin Among the Machines, Turing's Cathedral. (unrelated, but also give Project Orion: The Atomic Spaceship a go)
* The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley, by Leslie Berlin.
I highly recommend "The Dream Machine" by Mitchell Waldrop (https://www.goodreads.com/en/book/show/722412.The_Dream_Mach...), without a doubt the best non-fiction book I've ever read and a fascinating insight into the history of computing and most of all the visionary people involved in this endeavor.
Here's another entry that's a bit unorganized. It's also a bit more of a recent (and sometimes older) history related things that I learned about mostly through HN. Replies / additions to this are welcome.
__Lisp (programming language)__
Structure and Interpretation of Computer Programs (you need to use a search engine for this; there are many different versions of it).
For Xerox, Hiltzik's Dealers of Lightning is a good source.
(The opportunities that Xerox lost because of short-sightedness, political shenanigans, and just plain incompetence will probably make you mad. It's a good cautionary tale.)
> I think knowing the history of how we got to where we are helps to understand it more.
This is a great attitude, especially in our largely ahistoric industry. I wish I had thought this way when I started programming.
The book "The Dream Machine" [0] does a fantastic job going into the ideas driving the pioneers. It especially focuses on ARPA and PARC, so you'll get a nice overview of the ideas explored there. And it is a fun read too.
It was out of print for quite some time, until Stripe Press bought the rights and brought it back to print [1]. They also give it away at conferences, as they want more people to be exposed to the ideas of the book.
Too many people confuse software innovations with other factors, such as the increasing speed of computer and network hardware. This paper tries to end the confusion by identifying the most important innovations in software, excluding hardware advances and products that didn't embody significant new software innovations. It lays out its criteria for selecting the most important software innovations and its sources, presents the innovations themselves, discusses software patents and what is not an important software innovation, and then closes with conclusions.
That Creatures of Thought blog [0] is great. I recommend starting at the beginning, as the author gives a pretty strong history of the development of electronics and communications, from the discovery of electricity to the modern day. I've been working my way through it for weeks and it's been consistently interesting. One of the key themes is how much technological development is intertwined, each person or team building on, or coincidentally paralleling, others.
The gendered history of programming is particularly interesting for anyone who wants to understand the nature of office culture for software engineering/IT and of computer science academia.
Don't miss his articles about the old Infocom text adventures, like the ones about the less-well-known gem Trinity, in which he also talks about nuclear weapons and the 80s in general: https://www.filfre.net/2015/01/trinity/
Linus Torvalds' biography, Just for Fun, is a great read. I liked that it's both the history of Linux and a showcase of his personality and how he thinks.
Don Eyles' writing about programming the lunar module (which was something he did!) is fantastic [1]. There's also a gallery of photos on that site and he has a whole book!
And, click around on Bret Victor's references page [2] -- it's a real treasure. Despite my constant fear that mentioning it will make it go away, it needs to be shared to be useful. It's a big collection of classic papers and interviews. Someone mentioned "As We May Think", which is on there, as is Bush's follow-up from ~20 years later, and Douglas Engelbart's own partly annotated version of the original!
Also check out folklore[3], which is a great bunch of stories about working at Apple in its early days, by people who worked there (mostly Andy Hertzfeld, I'm pretty sure).
Lastly, look up "Ignition!" by John Drury Clark[4], which is a tangent, but is amazing -- it's about the history of the design of rocket engines, largely about the wild experiments and chemical science involved, and is very well written. I didn't feel right finding a pdf to link straight to, but they aren't that hard to find.
As a bonus, this isn't so much historical, but it's a great inspirational essay, Richard Hamming's "You and Your Research"(transcript[5], a video version[6]). Talks about working at Bell Labs and the different cultural elements there across people as part of analysing what makes certain people truly great.
If you're into listening to podcasts, I highly recommend Sean Haas's Advent of Computing podcast, which bills itself as "the show that talks about the shocking, intriguing, and all too often relevant history of computing."
Also, be sure to read the email thread linked in the video comments. It's a conversation between the restoration team and the compiler's original authors (now in their 90s).
If you're in SF and you want to see the 1401 in action, the Computer History Museum used to show it to the public on Wednesdays and Saturdays. Hopefully those demos will resume once the coronavirus situation's resolved.
I wrote a paper about the earliest history of programming notation that was recently published in ACM CHI and received an award:
“To Write Code: The Cultural Fabrication of Programming Notation and Practice”
http://ianarawjo.therottingcartridge.com/docs/To_Write_Code_...
It speaks to the cultural influences at play in the early history of programming, and how situated the designs were. A visit to the von Neumann and Backus archives at the Library of Congress informed the work (an experience I would highly recommend!).
I hope to make a blog post about the paper soon if there’s interest.
A couple of years ago I attended this talk by Paul Wesling at Stanford. It's the most interesting, exhaustive account I have heard. It goes all the way back to the early days of ham radio enthusiasm in the Bay Area around the 1910s, following the stories of three pioneers -- Jack McCullough, William Eitel, and Charles Litton -- up to the modern day.
Maybe not just "history of computing", but I do enjoy the Kevin Mitnick books, along with Ira Winkler. There's that subculture right alongside "Steal This Book."
I'm most familiar with BBN for their contribution to acoustics, but the firm is best known for its development of ARPANET among other contributions to computing. Here is a PDF of collected stories by employees.
I strongly agree with this recommendation. And though it isn’t meant to be related, I think of “Decoding Reality” by Vlatko Vedral as being a spiritual successor and deeper dive into information theory (from a theoretical physicist’s point of view).
This is a pretty easy question to answer ... here are the links to history articles that I thought enough of to bookmark (note that a few are NOT computer/tech related) - https://pinboard.in/u:smoyer/t:history.
I would like to recommend "Go To: Software Superheroes" by Steve Lohr. It has a great span, from Fortran to (from memory) Word. Really enjoyed it and learnt a few things.
I have uncovered early-to-mid-20th-century imported machines in China in the past; in fact, I recently had an exchange with an HN user who plans on visiting one I found in a mining museum in Gejiu, Yunnan. I have seen many interesting domestic products using cheap processors. There were certainly entire Chinese domestic TV entertainment platforms based upon cheap chips (ca. late 1980s/early 1990s); I'm unsure if they were locally produced, but I suspect so. Personnel at the current foundries would be a good source of oral history; it would be a great project (but be careful not to conduct it in any way that could be perceived as national-security-related research). I believe a lot of the chip producers are currently using last-generation gear purchased from Taiwan (which allegedly has a policy of offloading last-gen equipment to the mainland while keeping the bleeding edge 'at home' on the island).
Also interestingly, Asia is rife with great history with respect to pre-Unicode input systems, romanization systems, glyph and font development, sorting, and so forth. Premodern international script efforts such as Phagspa are awesome, the use of Farsi as a lingua franca, Tamil innovations in keeping giant-keychain style records in lieu of inscribed palm leaves, abugidas, the development of the Korean script, non-Chinese pictographic scripts such as Nushu, Naxi and Yi, etc.
For early calculating devices and automata in China, especially hydraulic and astronomic, look no further than Needham's amazing Science and Civilisation in China, which is ultra expensive to get hold of in print but thanks for great glory of Kazakhstan many volumes of which are now available on libgen for your home learning pleasure.
A fascinating video that is part of a series on the restoration of a 1930 Model 15 Teletype. In this video, they use it as a terminal for Linux.
For the past few years I've been reading https://thehistoryoftheweb.com/timeline/
Really cool to see different groups and technologies and how they moved the web!
For more recent history, Alan Kay gave a couple talks at Stanford that are on Youtube. He goes over much of the work done at PARC. It might spark an interest in one of those topics to dive deeper into.
One other essay that comes to mind is the 1945 piece by Vannevar Bush, "As We May Think", that lays out the vision for a "Memex", or memory extension device, in the direction of what PCs and smartphones eventually became.
I think it's nice to reflect on a time, not too long ago, where these devices were a mere figment of a scientist's imagination, only to materialize decades in the future. What can you dream up today?
What I'm about to link considers the multimedia aspect of computing and a bit of the history of the world wide web.
__Vannevar Bush__
Vannevar Bush is the person who came up with the idea that one could link information directly, as opposed to the methods physical libraries use (catalogs, indexing, etc.). His implementation details are funny to read in hindsight, but his conceptual ideas are nothing short of amazing, and a reality today. It also highlights why we should separate conceptual ideas from implementation. The biggest reason: even if you can't implement a certain system yet, having the conceptual ideas ready means other people can be inspired by them when the technology catches up.
Engelbart just blew my mind. He basically prototyped a simple version of TeamViewer + Skype in 1968! And it even has features that I still haven't seen elsewhere (e.g., dual mouse control, which TeamViewer doesn't offer).
I wonder if Bill Gates read about him, because if he did, then either it was too hard to implement some of Engelbart's ideas in Windows 95, or he simply didn't read about it and now we're lagging 10 to 20 years behind on certain aspects of our multimedia experience.
__Ted Nelson__
The "younger brother" (= same time, related but not the same ideas) of Douglas. Ted Nelson is a bit of a controversial figure. Nevertheless, I do think he deserves a spot on this list. I'll leave it at that.
__I wrote a bit more about this stuff__
If you like this stuff, I invite you to read some of the introductory stuff of my thesis [1].
The thesis itself zooms in on an old concept called hypermedia (not hypermedia APIs; those came way later), which is a bit of an alternate reality of HTML5. While that's not really important to know about, it shares the same history up until the early '90s.
Very high level, but it spans 100 BC to 2018 CE, and it covers highlights from the Antikythera mechanism to the Mother of All Demos to Hypercard to the GDPR.
Relatedly, I'm interested in the history of computing and information sciences in preindustrial times, i.e., how record keeping was scaled out in the Byzantine Empire or the Islamic Golden Age.
"Racing the Beam", on the technical history of early Atari games. It also does a great job of contextualising them in the social history of the time and the company.
It depends on what you're looking for. Do you want popular "fun" accounts or "serious" historical scholarship? Heavy on technical specifics or lighter on details and stronger on the historical big picture? All of those are fine, but the answer will mean you'll want to look in different places.
There is an active subfield of academic history of technology that deals with the history of computing. It's generally populated by professionally trained historians who rigorously employ archival sources and tend to focus on the bigger picture, i.e., linking computing to wider historical trends and events (examples: the Cold War, capitalism, the rise of programming as a profession, etc.). Some notable historians in this area are Mar Hicks and Nathan Ensmenger. The field is still growing and has a lot of ground to cover, but if you seek a deeper understanding (as readers of HN tend to do), then this is the place to go. Writing accurate history that also grapples with the big picture is hard, and it shouldn't be too much of a surprise that it takes specialists to do it.
There's of course an abundance of popular accounts of innovation, along the lines of Walter Isaacson's _The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution_. Academic historians of technology generally dislike these accounts because they focus on a small set of "heroic" individuals at the expense of the bigger picture. To pick on Isaacson a bit, he's generally most interested in personalities (Steve Jobs, say) and much less interested in digging deeper into the economic and social history that lies behind the "Digital Revolution". Nonetheless, these accounts sell very well, there are also tons of articles out there written in a similar vein, and there's nothing wrong with enjoying them provided you are aware that you're only getting one version of the story.
I also like some practitioner accounts, really more memoir than history; they give you a nice sense of what it was like to live through a particular set of changes and to be part of building systems that we now take for granted. A good recent example, if you are interested in the genesis of UNIX, is Brian Kernighan's _Unix: A History and a Memoir_. Some excellent journalistic "on the ground" accounts also fall into this category of giving you a sense of what it was like to be there. _The Soul of A New Machine_, by Tracy Kidder, is a good example.
Finally, there is some very good general history of technology that doesn't focus exclusively on computing. An example is the superb _The Shock of the Old: Technology and Global History Since 1900_ by David Edgerton (polemical, controversial, easy to read, and even if you disagree with his argument it will likely change how you see technology forever).
0: https://www.goodreads.com/book/show/160171.Just_for_Fun
1: https://www.goodreads.com/book/show/13629.The_Mythical_Man_M...
2: https://www.goodreads.com/book/show/222146.Masters_of_Doom
3: https://multicians.org/
4: https://youtu.be/yJDv-zdhzMY
5: https://youtu.be/vN4U5FqrOdQ