Ask HN: Best way to learn about computing history?
213 points by Tmkly on April 22, 2022 | 152 comments
I'm a software engineer, mainly working on mobile apps (iOS primarily) through React Native and some Swift/Java. I have a CS degree and about 7 years in this field.

However, recently I've become very aware that JS/TS, Swift, etc. are just APIs on top of APIs. I've been drawn to learning more about how computers work, the history of programming/computers (Unix, Sinclair, Commodore, etc., even going back to Ada Lovelace, Babbage and mainframes in the 1950s) and things like memory allocation. I've tried learning some BASIC and assembly code but haven't really got very far. I read/devour articles on sites like https://twobithistory.org but they only get you so far.

What can I do to accelerate this and satiate this desire to learn more about how computers work? I live in London, UK and would be happy to spend some money on a uni course or something if there were a good one. I learn best practically, so I like to be "doing" something as well as studying theory.




Ben Eater's Youtube series "Building an 8-bit Breadboard Computer" is a really good introduction to the lowest levels of how a computer works:

https://www.youtube.com/playlist?list=PLowKtXNTBypGqImE405J2...

I recommended it to my daughter when she was taking a class in R and asked "But how does the COMPUTER know what to do?"


>But how does the COMPUTER know what to do?

Ben Eater is amazing, but his series, in my very humble opinion, isn't the best answer to this question. I found the emphasis on the breadboard and the particulars of physical implementation getting in the way of a clean pedagogical introduction to logic circuits as a DAG of abstract computational elements implementing functions from {0,1}^n -> {0,1}^m (which we then implement with real circuits in whatever medium and form we choose); it's very "DIY" and maker-oriented in nature. This doesn't negate its status as a masterpiece of educational vlogging, I just feel it leaves a first-time learner hanging on some very important questions.

The single best answer I have ever seen to this question is the outstanding The Elements of Computing Systems[1], better known as the NandToTetris course[2][3]. You literally start with Nand and build a computer, an assembler, a VM, a compiler, a simple runtime library, and - finally - Tetris running on top of all that. It's one of the best introductions to computers and computer science I have ever seen in my life, at once a textbook and a work of science communication. It comes with its own software suite[4], and the first 4 chapters of the book (from Nand to a fully functional computer capable of running machine code) are gamified in [5].

[1] https://mitpress.mit.edu/books/elements-computing-systems

[2] https://www.youtube.com/playlist?list=PLrDd_kMiAuNmSb-CKWQqq...

[3] https://www.youtube.com/playlist?list=PLrDd_kMiAuNmllp9vuPqC...

[4] https://www.nand2tetris.org/software

[5] https://nandgame.com/


I went from NAND2Tetris to Ben Eater, and I think that’s the right direction to take.

The former gives a great overview, and the latter deep-dives into some of the architectural concerns around building computers (which are mostly ignored in the course), like understanding buses and running assembly on actual hardware. The most important bit for me was learning how to read data sheets to figure out how to use chips.


Shout out to Ben Eater. This dude explains how the internet works and rips open an Ethernet cable and hooks up an oscilloscope to it and decodes the bits that transfer over the physical cable.

Very informative!


That's legendary stuff right there


Ben Eater also sells a kit for those who want to follow along by building their own: https://eater.net/shop


The developer shared this on here a while ago and I fell in love with it. Super great game

https://store.steampowered.com/app/1444480/Turing_Complete/

Also in the same vein https://nandgame.com

I found that actually building things helps me learn properly


Off-topic: Reminds me that Ben Eater has gone 100% silent since last November (hope he is OK)


You’ll be glad to know his shop has a message about delayed deliveries dated this month.

https://eater.net/shop


Hope that’s him, thanks


This looks extremely interesting and may be just what I'm after. I have some basic electronics experience, so I could build on that too. Thanks!


I am wondering if it is a better idea to teach R as an intro to programming as compared to Python. I mean, you don't have this whole indentation business with R...


Code, by Charles Petzold: http://www.charlespetzold.com/code/

Explains how we got from Boolean logic to microchips and software.

Also, the Computer History Museum in Silicon Valley has an excellent exhibit containing both early computing devices and the seminal papers that were the precursors and enablers of modern computers: https://www.computerhistory.org/revolution/; https://computerhistory.org/timelines/


I’ll second Code - I love that book and haven’t found a better explanation elsewhere.

For more general history, Hackers, The Dream Machine, Crypto, and to a lesser extent What The Dormouse Said are all good.

I’d add in In The Plex, Masters of Doom, and Facebook: The Inside Story for company specific books.

I haven't read Showstopper yet, but that's on my list.


IIRC, that museum also sometimes has retired pioneers in computing offering demonstrations.


From a NAND gate to Tetris was excellent. Informative and obvious

https://www.nand2tetris.org/


I cannot endorse this enough.

some of it is hard and will have you wondering if you want to continue. if you do, and I highly recommend that every developer complete this course, you will find yourself thinking in new ways and understanding many problems very differently, which is a very good thing.

and you will see huge performance problems in almost all software from then on, because none of this (I gesture vaguely at everything everywhere) should be as slow as it is. none of it.


Agreed. For me, the course was eerily able to hit my exact sweet spot of difficulty. Almost every project had me in the "I'm about to give up" stage just long enough before I had a breakthrough and kept continuing.

>and you will see huge performance problems in almost all software from then on, because none of this (I gesture vaguely at everything everywhere) should be as slow as it is. none of it.

Hah! Yes! After finishing, the first optimization I made was to add an "inc" command to the high-level (Java-like) language you implement. It annoyed me that any time you have "i = i + 1", it translates into VM code of "push 1 onto stack, push i's value onto stack, add, pop stack to i's location", each of which translates into several machine instructions. Especially given that the CPU has an increment instruction!

So I added an inc keyword that bypasses all of that if you just want to increment a variable, and thus uses significantly fewer cycles. It was really thrilling (well, as much as a technical project can be) to have the level of insight and control needed to make a change like that.
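
For anyone curious what that looks like concretely, here is a rough, hypothetical sketch (in Python) of the two code-generation paths. The four-command sequence follows the VM syntax used in the book (push/pop segment index, add); the single inc command is the commenter's own extension, not something the official course defines:

    # Hypothetical code-generator helpers for a Jack-like compiler.
    # emit_increment_naive mirrors the standard translation of "i = i + 1";
    # emit_increment_fast mirrors the custom "inc" shortcut described above.

    def emit_increment_naive(segment, index):
        # Four VM commands, each of which the VM translator expands
        # into several machine instructions.
        return [
            "push constant 1",                     # push the literal 1
            "push {} {}".format(segment, index),   # push i's current value
            "add",                                 # pop both, push the sum
            "pop {} {}".format(segment, index),    # store the result back into i
        ]

    def emit_increment_fast(segment, index):
        # A single custom command the VM translator could map directly
        # onto the CPU's increment instruction.
        return ["inc {} {}".format(segment, index)]

    print(emit_increment_naive("local", 0))
    print(emit_increment_fast("local", 0))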


After reading that some huge percentage of x86 instructions are basically never used, I've often wondered if there's much more performance that can be gleaned through knowing what to write and how the compiler will use it.


The game inspired by the course is great, too: https://nandgame.com/


Though it doesn't cover all of computing history, this site is a comprehensive timeline of personal computing history from 1947 to now.

https://kpolsson.com/comphist/

Apparently the author has been maintaining that timeline since 1995 and is still doing it!

While it doesn't cover things like computer science, I think it's an excellent jumping off point for learning about notable people and events.

Not exactly what you asked for, but you may also be interested, and it may give you some insight I think more programmers should have.

EDIT: Also, don't stop at Babbage & Lovelace. Although Babbage's Analytical Engine was one of the first, if not the first, programmable computers with a form of memory, there were people working on extremely primitive computers (or rather, advanced calculators) way before Babbage. Schickard, Pascal, and Leibniz conceived of and developed calculating engines that did basic math with support for interim value storage, which one might consider to be the earliest form of computer memory.


Steven Levy's Hackers is a foundational work of the history of computing. Levy spends a lot of time on the MIT hackers of the 1960s and 1970s, the group that hatched Lisp, Richard Stallman and the free software movement, and also a lot of time on the Bay Area hackers that kick-started the microcomputer revolution. Certainly it's not a comprehensive guide to the full range of computing history, but it's an important and engaging look at the beginnings of where we are today.


I second that, Hackers was a great read. I read it back in the late 90s and then again a couple years ago and was surprised how much of it came back to me. He's one of the few tech journalists and writers who actually gets it.


I read it early in life and it changed my life. By the time the O’Reilly anniversary edition came around I was there to witness some of the new interviews of people in the first book.


One cannot recommend Hackers enough. In many ways, the stories on Hackernews’ front page are the ripples of the events chronicled by Levy.


I think the best way to learn this stuff is from the people who did it, speaking in their own words. But watching videos takes forever, so the best way to do this is to read oral histories. The Computer History Museum has really great content-- I've read dozens of these. You can easily find them, ranked in approximate order of popularity, with the following Google search:

https://www.google.com/search?q=oral+history+computer+museum...

To find more (and there are many great ones outside of the Museum), you can try a broader search:

https://www.google.com/search?q=oral+history+arpa+filetype%3...

I have found that I can read around 3-5x faster than listening to people talk, depending on the speed of the speaker (most of the people interviewed in these oral histories are quite old and tend to speak a bit more slowly), and I also retain the information much better. There is something about reading an actual conversation with someone who was there when this stuff was being invented (or literally invented it themselves) that you don't get from a retrospective historical account, and it makes the information stick with you more, since it's all framed in stories and personal accounts.

Some favorites:

https://conservancy.umn.edu/bitstream/handle/11299/107503/oh...

https://archive.computerhistory.org/resources/access/text/20...

https://conservancy.umn.edu/bitstream/handle/11299/107247/oh...

https://archive.computerhistory.org/resources/text/Oral_Hist...

http://archive.computerhistory.org/resources/text/Oral_Histo...

https://conservancy.umn.edu/bitstream/handle/11299/107613/oh...

https://digitalassets.lib.berkeley.edu/roho/ucb/text/valenti...

https://conservancy.umn.edu/bitstream/handle/11299/107642/oh...

There are so many other good ones, but that's a good start!


I'm in the middle of listening to the oral history by Margaret Hamilton:

https://www.youtube.com/watch?v=6bVRytYSTEk

The transcript is here:

https://www.computerhistory.org/collections/catalog/10273824...

Highly entertaining life story of the woman who managed the flight software for the Apollo Guidance Computer.


Don't forget as well the Oral History of Unix! https://www.princeton.edu/~hos/Mahoney/unixhistory


There are also several good museums dedicated to the subject. I used to work for the National Museum of Computing at Bletchley Park in the UK, and they have a lot of good exhibits that teach the basics of how computers and networking work and have evolved over the years.

Another good approach one can take to learn is starting with a simple system with well-defined rules and making a simple computer out of it. Many people do this in Minecraft; for me it was boolean functions in Excel. You can and should look many things up during this process, fail and rework designs several times, etc. Learning how logic gates work, then scaling that knowledge up to bit adders, registers, an ALU, making a CPU instruction set and starting on a basic Turing machine architecture is a very rewarding hobby and is definitely the best way to get low-level knowledge.
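
If you want a taste of that first scaling step without firing up Excel or Minecraft, here is a minimal Python sketch (my own illustration, not tied to any particular course or museum exhibit) that derives the basic gates from a single NAND function and chains full adders into a 4-bit ripple-carry adder:

    # Everything below is built from one primitive: NAND.
    def nand(a, b):
        return 0 if (a and b) else 1

    def inv(a):
        return nand(a, a)

    def and_(a, b):
        return inv(nand(a, b))

    def or_(a, b):
        return nand(inv(a), inv(b))

    def xor(a, b):
        return and_(or_(a, b), nand(a, b))

    # Half adder: sum and carry for two bits.
    def half_adder(a, b):
        return xor(a, b), and_(a, b)

    # Full adder: adds a carry-in, so adders can be chained.
    def full_adder(a, b, carry_in):
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, or_(c1, c2)

    # 4-bit ripple-carry adder, least-significant bit first.
    def add4(a_bits, b_bits):
        carry, out = 0, []
        for a, b in zip(a_bits, b_bits):
            s, carry = full_adder(a, b, carry)
            out.append(s)
        return out, carry

    # 3 + 5 = 8: [1,1,0,0] + [1,0,1,0] -> ([0,0,0,1], carry 0)
    print(add4([1, 1, 0, 0], [1, 0, 1, 0]))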


I second Bletchley. Consider it your obligatory pilgrimage as a computing professional.


Note that there are two museums on the site of Bletchley Park, with different opening hours -- the historic site, including the main house and the huts, and the separate National Museum of Computing, which has the Colossus replica, the WITCH, and a large collection of other British computing machinery and memorabilia. If your interest is in computing history, you'll probably want to see both -- which means timing your visit when they're both open, so be sure to check.


They’re just opening a new exhibit in Block A next week!


Sounds like I need to make another Hajj then.


I've been meaning to visit Bletchley! Thanks, these suggestions are helpful.


If you do go, my favorite exhibit at Bletchley is the Harwell Dekatron, a.k.a. the WITCH. It's a really good machine for learning how computers work, both because it's so early that it has a simple architecture, and for other reasons:

It uses base 10 instead of base 2 to store values in its memory, so doing the calculations in your head is a lot easier

Its memory is made from special tubes that have an orange glow at one of 10 positions, meaning you can see the contents of the computer's memory in its entirety just by looking at the machine.

Building off the previous point, you can see the CPU register, which is a single small piece of memory that stores the current data to be modified, which again, with the lit-up memory, means you can see what the computer is doing at any given time.

Finally, the computer has a debugging switch which can pause and step forward through each instruction the computer is performing, as well as a more granular mode that lets you step through individual parts of an operation (for example, each digit being added in an addition operation, separately).

It's not always running but if you go I hope you get to see it :)

EDIT: Oh, and I suppose I should mention there is more than one museum at Bletchley: there is the big one that focuses on WW2, but the Museum of Computing is tucked away to the side a bit.


If you go to Bletchley Park, don't go on a hot day in summer! Those old machines massively heat up the room.


So your title and comment suggest two slightly different things. For "how computers work?" I recommend Code by Petzold (higher level, good book) and The Elements of Computing Systems by Nisan and Schocken (also available here: https://www.nand2tetris.org/). The latter is project based and has you develop a computer starting at NAND gates and working up. It can be run through at a good clip while still learning a lot if you're a moderately experienced developer.

EDIT: Per Amazon there's a second edition of Code coming out at some point, but no date that I've been able to find.

I've also got a copy of, but have not yet read, Ideas That Created the Future: Classic Papers of Computer Science edited by Harry R. Lewis; the contents are in chronological order, with the most recent from 1979. It has 46 different papers on computing, and being largely historical, it ought to be a decent starting point as well.


I’d also add the book “But How Do It Know?” by J. Clark Scott as a fantastic primer, building from gates to RAM and CPU, to a simple bootloader and assembly programming. It comes with a CPU simulator on the book’s website so you can make sense of what you’re learning - and being a light read, you could reasonably finish it in a week.

http://www.buthowdoitknow.com


yeah I'm not entirely sure what I want. Thanks for these suggestions, will take a look. nand2tetris looks cool!


If you want pacing & support for Nand2Tetris, Coursera has it split into two courses. I've done the first, from NAND gates to a working CPU & assembler, and can testify it's worthy. Coursera loves to have content sales, so if you're not in a rush you can pick it up for cheap and have their (petty yet ego-boosting) certificate of completion to read over one morning with your Cheerios (and then put away in a drawer to be forgotten). Here are the two links:

Part I - https://www.coursera.org/learn/build-a-computer

Part II - https://www.coursera.org/learn/nand2tetris2

Some day I hope to pick up Part II, but Part I was still a lot of fun!


I haven’t read it, but heard good things about “The Soul of a New Machine”, about Data General’s efforts to create a new 32-bit superminicomputer.

https://www.tracykidder.com/the-soul-of-a-new-machine.html

Another comment mentioned “Pirates of Silicon Valley” as a good dramatization of MS/Apple and there’s also the miniseries “Valley of the Boom” about the rise and fall of Netscape and “Halt and Catch Fire” which is a fictional and thematic view of 80s/90s computer history.


Soul of a New Machine is an absolute classic. You'll learn just how much, and how little, computers and programming have changed since the 70s. And there are some interesting takes on microcoding - if you're already dealing with APIs on APIs, you'll get a better understanding.

If you want some Apple history, particularly on the early days of Macintosh, check out folklore.org. DO NOT start reading it if you have anything important to do for the next 24 hours.


Feynman’s Lectures on Computation: https://www.amazon.com/gp/product/B07FJ6RRK7/ref=as_li_tl?ie...

You might be familiar with Feynman's Lectures on Physics, but his lectures on computation (based on a class he taught and his work on the Connection Machine) aren't any less amazing. Through this short book, Feynman guides us through the concept of computation and the von Neumann architecture in his unique style, from logic functions, to Turing machines, coding and even quantum computers. It will give you a unique appreciation of the finer points in which computers are "Dumb as hell but go like mad", so that you can better squeeze every bit of performance out of your code.


For "how things work", I recommend the book Code by Charles Petzold. After that, Jon Stokes's Inside the Machine will give a lot of details on CPU architectures up to Intel's Core 2 Duo. You can also try following along a computer engineering book if you want to go that low in detail with exercises, Digital Fundamentals by Floyd is a common textbook (I have an old 8th edition).

History-wise, enjoy learning slowly because there's so much that even if you dedicated yourself to it you wouldn't be "done" any time soon! Some suggestions in order though:

Watching The Mother of All Demos: https://www.youtube.com/watch?v=yJDv-zdhzMY

A short clip of Sketchpad presented by Alan Kay: https://www.youtube.com/watch?v=495nCzxM9PI

An article from the 40s that also inspired Engelbart: https://www.theatlantic.com/magazine/archive/1945/07/as-we-m...

The Information by James Gleick

What the Dormouse Said by John Markoff

The Psychology of Computer Programming by Gerald Weinberg

Lastly, to mix up in whatever order you please, some paper collections:

Object-Oriented Programming: The CLOS Perspective edited by Andreas Paepcke

History of Programming Languages papers for various langs you're interested in, here's the set from the second conference in 1993 https://dl.acm.org/doi/proceedings/10.1145/154766 but there have been further conferences to check out too if it's interesting

Also all of the Turing Award winners' lectures I've read have been good https://amturing.acm.org/lectures.cfm

All that and some good recommendations others have given should keep you busy for a while!


A long while ago I found the Jargon File. It's a "dictionary" of terms used by hackers as the culture was budding at the universities in the 70s. Reading the entries, you get a glimpse of the technology and culture of those places at that time. Young me found it really cool in a nerdy way, and read all the entries from front to back. Since this was before the always-online times, I was just reading the TXT file from http://jargon-file.org/archive/ rather than navigating the many pages: http://www.catb.org/~esr/jargon/


that's great, will take a look. thanks


In terms of computing history, The Dream Machine by Mitchell Waldrop is incredibly good.

In terms of "how computers work" I agree with others who recommended Elements of Computing Systems (aka nand2tetris).


The Dream Machine is easily one of the best-written books I've ever read. Really gives a good overview of how many people are involved in various ways in the evolution of computing.


Yes I agree, the writing is superb. I found out about it from Alan Kay, and so far every book Kay has recommended has been really good (you can find his reading lists easily online).


I have the same passion for computing history. I can't count the amount of literature I've read to learn about this fascinating history; it's very satisfying to know when, how, where, and by whom original work was done to advance computing. Most of the foundational work in computer architecture and computer science was done in the 50s, 60s, and 70s. From there it has been incremental improvements.

I highly recommend reading "The Dream Machine" by Mitchell Waldrop. It's very well written, and covers a huge swath of computing history, from the ENIAC to the Internet (it was written in 2000).

Instead of recommending specific sources (too many), I can mention key milestones in computing history that you may want to research:

- Theory of computation (Alan Turing, Alonzo Church)

- Early binary systems (John Atanasoff, Konrad Zuse, George Stibitz, Claude Shannon)

- Early computers (ABC, ENIAC, EDSAC, EDVAC, Von Neumann architecture)

- Early programming (Assembly language, David Wheeler, Nathaniel Rochester)

- Early interactive computing (MIT Whirlwind, SAGE, TX-0, TX-2)

- Early mainframes (UNIVAC, IBM 70x series)

- Early programming languages (Speedcoding, Autocode, A-0, A-2, MATH-MATIC, FLOW-MATIC)

- First programming languages (FORTRAN, COBOL, LISP, ALGOL)

- Early operating systems (GM-NAA I/O, BESYS, SOS, IBSYS, FMS)

- Early time-sharing systems (MIT CTSS, Multics, DTSS, Berkeley TSS, IBM CP-67)

- Early Virtual Memory (Atlas, Burroughs MCP)

- Early minicomputers (DEC PDP line)

- Mainframe operating systems (IBM OS/360, UNIVAC EXEC)

- Early online transaction processing (SABRE, IBM ACP/TPF)

- Early work on concurrency (Edsger Dijkstra, C.A.R. Hoare, Per Brinch Hansen)

- Early database systems (GE IDS, IBM IMS, CODASYL)

- Early Object-Oriented Programming (Simula I, Simula 67, Smalltalk)

- More programming languages (CPL, BCPL, B, C, BASIC, PL/I)

- Mini/Supermini operating systems (Tenex, TOPS-20, VMS)

- Structured Programming (Pascal, Modula, Niklaus Wirth)

- Relational data model and SQL (Codd, Chamberlin, Boyce)

I could keep going on, but this is already too long. I hope this at least puts your feet on the first steps.


I’ll second your recommendation for The Dream Machine. Unless your vision is very good, I’d recommend getting it as an ebook. The book from Stripe Press is beautiful, but the text is pretty tiny.


Good: split your time between activities and reading something as satisfying as the things you “devour”. To that end, I would plus one Hackers (Levy) and Code (Petzold). Also, the Cathedral and the Bazaar by esr

http://www.catb.org/~esr/writings/cathedral-bazaar/

and other things from esr at

http://www.catb.org/~esr/

including the aforementioned jargon file. Here’s one I hadn’t stumbled on before, ‘Things Every Hacker Once Knew’

http://www.catb.org/~esr/faqs/things-every-hacker-once-knew/

For an activity, YMMV depending on how much time you can spend; an alternative to building a computer from scratch, or an OS from scratch, is to buy a vintage cheapie running CP/M or DOS, something where the OS isn't abstracting memory management for you. Growing up in the 80s, I think managing my own memory and _everything_ that implies was the greatest teacher.


Having gotten into computers in the early 1990s, I knew a lot of "Things Every Hacker Once Knew", but I did find something exciting that I didn't know in the discussion of ASCII control characters:

>ETB (End of Transmission Block) = Ctrl-W

>Nowadays this is usually "kill window" on a web browser, but it used to mean "delete previous word" in some contexts and sometimes still does.

I tried Ctrl-W in a Linux console and it works! This will save me some trouble in the future.


Though the cathedral/bazaar terminology is influential, I am not sure reading the original text helps you understand open source as it actually is now. It concludes that open source would drive out closed source software, when what we see is open libraries being much more popular than open applications. This is in part due to Raymond's own work at the Open Source Initiative. It'd probably be better for someone now to read a retrospective rather than a treatise.


> It concludes that open source would drive out closed source software

Relative to the pre-FLOSS era, this is exactly what has happened. FLOSS is ubiquitous today in systems software. Many of the current Big Tech companies and business practices simply would not exist without FLOSS computing infrastructure.


I agree. You said better what I was trying to say in that second sentence. It's more that for someone new to the theory of open source, reading a 1998 essay about a Linux email application gives the wrong idea of the strengths and weaknesses of the model.


These resources look great, thanks. Also, yeah, great suggestion about buying a cheap vintage microcomputer!


If you can manage a day trip to Cambridge (about an hour from London), you should visit the excellent Museum of Computing History http://www.computinghistory.org.uk/


National Museum of Computing at Bletchley Park is also excellent.


Seconded (thirded? fourthed?)

I'd also add the Information Age gallery at the Science Museum in London, which covers from the telegraph to Arm and the modern day. It's aimed at a more mainstream audience than Bletchley but is free and well worth a look.


The 1992 WGBH/BBC 5-part miniseries "The Machine That Changed The World"(US)/"The Dream Machine"(UK):

https://en.wikipedia.org/wiki/The_Machine_That_Changed_the_W...

is out of print, but can be found intermittently on youtube.

I love the coverage of 1940's computing, with interviews with several of the surviving people:

https://en.wikipedia.org/wiki/Konrad_Zuse

https://en.wikipedia.org/wiki/ENIAC

https://en.wikipedia.org/wiki/Eckert%E2%80%93Mauchly_Compute...

https://en.wikipedia.org/wiki/EDSAC

Currently working episode links:

1: https://www.youtube.com/watch?v=hayi9AsDXDo

2: https://www.youtube.com/watch?v=GropWVbj9wA

3: https://www.youtube.com/watch?v=rTLgAI3G_rs

4: https://www.youtube.com/watch?v=E1zbCU5JnE0

5: https://www.youtube.com/watch?v=vuxYUJv2Jd4


The Advent of Computing podcast may be of interest. The host really strives to find accurate historical information about a variety of early computing topics.

https://adventofcomputing.com/

It's also fairly entertaining


There is tremendous value in this podcast. Many histories are linear stories of how we got to today, but Sean Haas goes further by exploring many of the dead ends. Those diversions are essential if you want to understand the why, and not simply the how.

It also manages to strike a reasonable balance between being an easy listen and adding some technical depth. For example: you aren't going to learn enough to build your own mercury delay line, but you are going to have a rough idea of how it worked and what the limitations were. It is as much a history of technology as it is of the people behind it.


That's been one of my favorite tech podcasts, for sure one of the gems.


Was just showing the subject to a youngster recently. Other folks mentioned the Code book, I liked that one. The Mythical Man-Month (MMM) by Brooks, of course. We also looked at the following videos on YouTube/Kanopy and other places:

- The Story of Math(s) by Marcus du Sautoy to set the stage... school and taxes in ancient Sumeria, Fibonacci bringing Indian numbers to Europe, and other fascinating subjects.

- We watched short biographies of Babbage and Lovelace, full-length ones of Turing and Von Neumann. The "code breakers" of WWII.

- Top Secret Rosies: The Female "Computers" of WWII, another good one.

- There's more history in PBS's Crash Course Computer Science than you might expect. It is great, although so peppy we had to watch at 0.9x with NewPipe. Shows relays, vacuum tubes, to ICs, to the Raspberry Pi. As well as the logic gates they model.

- "The Professor" at Computerphile is a great story teller about the early days.

- There are great videos about CTSS being developed at MIT I think, where they are designing an operating system via paper terminal and trying to decide on how to partition the memory/storage: https://www.youtube.com/watch?v=Q07PhW5sCEk

- The Introducing Unix videos by AT&T are straight from the source: https://www.youtube.com/watch?v=tc4ROCJYbm0

- The movie/book "Hidden Figures" touches on this time as well. Facing obsolescence by IBM, one of the characters teaches herself Fortran.

- The Pirates of Silicon Valley is a fun dramatization of the late 70s to 80s PC industry. It specifically calls out the meeting between MS and IBM as the deal of the century. We also watched a "Berkeley in '68" doc on Kanopy to set the stage before this one. Interesting, but a tangent.

- The "8-bit Guy" is also great, he dissects and rebuilds old home computer hardware from the same era, and teaches their history as he does it. Even his tangential videos on why there are no more electronics stores (besides Apple) in malls is great.

- There are good docs on the "dead ends" of the industry as well, such as "General Magic" and "Silicon Cowboys."

- "Revolution OS" a doc about the beginnings of FLOSS and Linux.


Computer Chronicles was a PBS series that ran for 20 years and captured a lot of computer history as it happened; it's a great watch on YouTube: https://youtube.com/user/ComputerChroniclesYT


thanks will take a look!


Try this book:

"Understanding Digital Computers : A Self-learning Programmed Text That Will Teach You the Basics for the Microcomputer Revolution" by Forrest M. Mims III.

It's dated, but the core material is still relevant. Even the dated sections might suit you if you're interested in the history.

In a similar vein, a few years ago I wrote a course which starts with the idea of a bit and ends with the student programming a computer that they built themselves in a logic simulator.

https://john.daltons.info/teaching/engineering/

The first few lessons meander, as I was still figuring out a direction, so the meat starts at lesson 3. The last lessons are missing (roundtoit), but if there is interest I can put them on-line. From memory all the examples are on-line. Here is the final computer:

https://john.daltons.info/teaching/engineering/simcirjs/comp...

In this example a program is already in memory, so just push "run/stop" to make it run. The instruction set isn't on-line, as it's in the later lessons, which I haven't gotten around to uploading.


As you are based in London, UK, let me propose a slightly different alternative.

A 1-hour journey leads you to The Center for Computing History in Cambridge [1]. Please go there, see for yourself, and interact with the history of computing. You may also buy one of the maker kits to get started with [2].

I too have a similar keen interest and there are some fantastic volunteering opportunities to deep dive and learn about the history. [3]

And there was this awesome Gaming Generations exhibition that just ended last week [4].

You could combine all this with the other equally fantastic suggestions proposed here (Ben Eater's videos, Nand2Tetris, etc.).

That hopefully makes for a fun, interactive way of satiating your good hunger :-)

[1] http://www.computinghistory.org.uk

[2] http://www.computinghistory.org.uk/det/50229/MyZ80-Maker-Kit...

[3] http://www.computinghistory.org.uk/pages/14522/volunteering/

[4] http://www.computinghistory.org.uk/det/66270/gaming-generati...


I'm not at all sure that learning about computer history and learning about how computers work are the same thing. For example, looking at early microprocessors would give you the idea that instruction set architectures are completely random when in fact their designers were faced with a limited transistor budget and very short development times. Often, microprocessors were offered as a replacement for discrete logic, rather than as generally programmable computing devices.

The history of computing is replete with really dumb ideas, from addition and multiplication tables in memory (IBM 1620) to processors optimized for Ada that ran too slowly to be useful (Intel iAPX 432). There were really smart ideas, too, such as cache (IBM System/360 Model 85) and RISC (too many systems to mention). What you want is just the smart ideas, I'd say.

If you want to get an understanding of how modern computers work, and given your CS degree, I would recommend David Patterson/John Hennessy's Computer Organization and Design, any edition. A lot of universities use this book in a second-year architecture course.

In terms of relating this information to the overall hierarchy of computer systems, I would also recommend Nisan and Schocken's Elements of Computing Systems.


Lookup tables are a good way to do fast arithmetic. Some IEEE 754 implementations still use them - usually with interpolation - for certain functions.
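
As a toy illustration of that lookup-table-plus-interpolation idea (a sketch of the general technique only, not how any particular IEEE 754 / libm implementation actually does it), here is a small Python example for sine:

    import math

    # Precompute a coarse table of sine over one full period.
    N = 256
    STEP = 2 * math.pi / N
    TABLE = [math.sin(i * STEP) for i in range(N + 1)]  # N+1 entries so index i+1 is always valid

    def sin_lut(x):
        # Reduce the argument into [0, 2*pi), find the two nearest table
        # entries, and linearly interpolate between them.
        x = x % (2 * math.pi)
        pos = x / STEP
        i = int(pos)
        frac = pos - i
        return TABLE[i] + frac * (TABLE[i + 1] - TABLE[i])

    # Compare against the library function.
    print(sin_lut(1.0), math.sin(1.0))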

That aside - hardware technologies, storage systems, ISAs, engineering enhancements (like paging, caches, microcode, and others), operating systems, market segmentation (micro, mini, super, etc), languages, and compiler theory all have their own separate histories.

You don't need to know the histories to write good code, but they're all interesting in their own right.

All the book suggestions here are good, but I'd also recommend a rummage through the huge bitsavers computing archive (http://www.bitsavers.org/pdf/) for first hand notes, memos, and documents from a huge selection of manufacturers and facilities.

It's a bit of a disorganised grab bag with a fair amount of noise, but the IBM, DEC, Burroughs, CDC and various university archives have some fascinating material.


One thing I learned from 30 years doing university-level teaching is that if you throw a grab bag at people, most of them won't learn much of anything. That's why university courses (and good textbooks) tend to impose a bit of structure on what's to be learned. My advice: pick one or two books from the plethora mentioned in this thread, and work your way through them. Once you have built a mental model of the particular topic areas you want to understand, then you can start widening your search.

As for lookup tables, I won't say anything bad about them in general. That said, if you pressed the Memory Clear button on a 1620 console, it would wipe the addition and multiplication tables, meaning you couldn't even load a program until you had manually entered replacement values. This was a Dumb Idea :).


To the point of the original question, Patterson & Hennessy includes a bit on history at the end of every chapter.


"The Dream Machine" by M. Mitchell Waldrop.

It tells the history of computing by following J.C.R. Licklider. As one of the directors of ARPA, he was responsible for funding research labs to work on computer research. He had a major impact on which projects got funded, and in turn which systems are now being used 60 years later. I honestly love this book so much. If you love computers and history, it's a must-read.


I think the way to go about it is to gradually read some books on the history of computing and the various designs and rationales that evolved over time. Consider these sources:

1. The Annotated Turing.

2. A History of Modern Computing 3ed

3. The ACM Turing award lectures

4. Theory of computation - Dexter Kozen

5. Coders at Work

6. Hackers: Heroes of the computer revolution

Additionally, you could subscribe to Communications of ACM, which is a computing oriented monthly magazine.


A History of Modern Computing is great

Would add:

- How The Internet Happened (history of the internet)

- To The Digital Age (transistor history)


these look good, thanks!


I'm coming from a similar background and asked myself that exact question :)

These two books helped me much already:

Programming from the Ground Up by Jonathan Bartlett - A very good introduction to assembly

Learning Computer Architecture with Raspberry Pi by Eben Upton - Great read about the inner workings of memory and the CPU, with reference to the past and how things developed


Lots of good links getting posted. Another interesting resource is the ACM's History of Programming Languages (HOPL) proceedings,

https://dl.acm.org/conference/hopl/proceedings


The Journal "IEEE Annals of the History of Computing" might be a good source. It has been published for over 40 years.

https://www.computer.org/csdl/magazine/an


In addition to the great books listed here, a few more that might be of interest:

- Turing’s Cathedral, by George Dyson

- Black Software, by Charlton McIlwain

- Programmed Inequality, by Mar Hicks

The Dyson book is a rigorous and deep historical dive into the philosophical and practical origins of digital computing, and is really great.

The other two are equally great and deep but cover computing history through different lenses. The Hicks book in particular may be of interest for you, as its emphasis is on the history of computing in the UK. They’re less directly about how computers “work”, as such, and more about how computers and society have interacted with one another in interesting and non-obvious ways, and how those interactions have impacted the ways in which technologies have developed.


I read some of the pages of Turing's Cathedral, and it read like an elaborate ad for Princeton. It also felt like a Princeton history.

While it is interesting, and I might finish it later, it is not something I was looking for or have time for now.

I also learned some cool things, like that Einstein forbade the use of the word "god" in a wall engraving because people might think that he believed in god.


I got to the end of Turing's Cathedral, but it wasn't an easy ride. I didn't pick up the Princeton ad vibe, but it was definitely all over the place and not focused on the core story that I was interested in, which was von Neumann's machine.

I did learn a lot from the book, so in that regard I'd recommend it. But as a book it's nowhere near "Soul of a New Machine".


Ted Nelson's YouTube channel: https://www.youtube.com/user/TheTedNelson

It's also worth looking at: https://www.youtube.com/user/yoshikiohshima. There's a goldmine of talks by people like Alan Kay and Seymour Papert. An important question to ask when "probing" the literature: why are computers the way they are in terms of human-computer interaction and human culture? What is a "computer" without making an appeal to mathematical concepts like Turing machines/lambda calculus? What are the major "paradigm shifts" that gave us GUIs, mice, etc.?

It's worth noting that the history of popular computers parallels almost exactly the neoliberal economic period. Atari was founded in 1972. Look into the Mansfield Amendment and ARPA. Try to get past the cultural myth that computer companies started in "normal" people's garages. Try to see past the "present concept." Alan Kay has famously said, "The computer revolution hasn't happened yet." It's up to the current/future generations to "really" define what computers are in terms of human culture. Think "living history." Think "the world before and after the invention of the Gutenberg printing press."

https://www.nsf.gov/nsb/documents/2000/nsb00215/nsb50/1970/m...

https://en.wikipedia.org/wiki/Douglas_Engelbart

https://www.theatlantic.com/magazine/archive/1945/07/as-we-m...


My practical recommendations:

  * understand brainfuck or the so-called RAM machine as the simplest computer (a minimal interpreter sketch follows below)
  * read 50 pages of https://en.m.wikipedia.org/wiki/Code:_The_Hidden_Language_of_Computer_Hardware_and_Software
  * read https://www.bottomupcs.com/ to understand low level stuff
  * Learn some C
To understand computation, I think Scheme or the lambda calculus is best. I don't know of a good intro.
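
Here's the interpreter sketch promised in the first bullet above: a minimal, unoptimized brainfuck interpreter in Python (it assumes well-formed brackets and a fixed-size, zero-initialized tape), mostly to show how little machinery a working "computer" actually needs:

    import sys

    def run_bf(code, tape_size=30000):
        tape = [0] * tape_size   # memory cells
        ptr = 0                  # data pointer
        pc = 0                   # program counter
        # Precompute matching bracket positions for [ and ].
        stack, jumps = [], {}
        for i, c in enumerate(code):
            if c == "[":
                stack.append(i)
            elif c == "]":
                j = stack.pop()
                jumps[i], jumps[j] = j, i
        while pc < len(code):
            c = code[pc]
            if c == ">":
                ptr += 1
            elif c == "<":
                ptr -= 1
            elif c == "+":
                tape[ptr] = (tape[ptr] + 1) % 256
            elif c == "-":
                tape[ptr] = (tape[ptr] - 1) % 256
            elif c == ".":
                sys.stdout.write(chr(tape[ptr]))
            elif c == ",":
                ch = sys.stdin.read(1)
                tape[ptr] = ord(ch) if ch else 0
            elif c == "[" and tape[ptr] == 0:
                pc = jumps[pc]   # jump past the matching ]
            elif c == "]" and tape[ptr] != 0:
                pc = jumps[pc]   # jump back to the matching [
            pc += 1

    # The classic "Hello World!" program.
    run_bf("++++++++[>++++[>++>+++>+++>+<<<<-]>+>+>->>+[<]<-]"
           ">>.>---.+++++++..+++.>>.<-.<.+++.------.--------.>>+.>++.")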

Bear in mind that what we have is just a certain implementation/abstraction of computation that's likely still suboptimal. That's why people come up with new languages/VMs. I wonder if some alternative to the RAM machine exists. I've heard about Lisp machines…


IEEE has a special interest group called Silicon Valley Technology History Committee, which regularly hosts talks/discussions: https://r6.ieee.org/sv-techhistory/?page_id=320

Here is an example link from a recent session on the history of Ethernet networking standard: [ Ethernet’s Emergence from Xerox PARC: 1975-1980 ] https://www.youtube.com/watch?v=SVEcqZnGya0


I have a bachelor's in Computer Engineering from University of Illinois at Urbana-Champaign and several of my courses covered how computers work in detail!

- ECE 190 and ECE 290 covered basic programming, logic gates, and the basics of software processor architecture.

- ECE 391 (one of the hardest courses in the school) covered x86 assembly and operating system design. The capstone project for the course was to build a simple OS with terminal input.

- ECE 411 covered processor architecture in detail, and how a modern x86 processor is built.

There should be courses from other universities that cover the same topics. Here's some similar courses I found on MIT's OpenCourseware platform.

- Computation Structures covers logic gates and other standard electronic constructs. https://ocw.mit.edu/courses/6-004-computation-structures-spr...

- Operating Systems Engineering covers fundamentals of operating system design: https://ocw.mit.edu/courses/6-828-operating-system-engineeri...

Best of luck!


Just to piggyback on this: I'd be interested in the pre-computer history of computing. That is, a survey of how they handled all the computation problems before (electronic) computers. Like storing large amounts of data, having "databases" that need to answer queries over a large geographic area, how they replicated "databases", how they indexed information, how they did backups, and so on.


Michael R. Williams' "A History of Computing Technology" ( ISBN-13: 978-0818677397 ) talks a lot about pre-electronic-computer technologies and techniques for calculation and information management.

It's a solid suggestion for history of computing books in general. I like it better in several ways than the more common Ceruzzi book (especially because it starts much earlier and situates better), but it hasn't been updated since 1997 and some things, especially in the more recent history, are understood differently now than at the time it was written.


Thanks for the suggestion, I’ll take a look!


The textbooks I used in university were "From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry" (Campbell-Kelly) and "A History of Modern Computing (second edition)" (Ceruzzi). There is a brand-new update to the second one, "A New History of Modern Computing" (Haigh/Ceruzzi) that I'm looking forward to reading this summer.


Came here to mention these two books.


In the vein of learning how computers work, osdev has got to be pretty high up there. It’s so much fun - it has become my hobby. I’m surprised no one else seems to have mentioned it.

I just finished implementing a really basic network stack for my x86 kernel, including a (crappy) driver for the RTL 8139 network card. I just learned a ton about how the internet works. I learned it all in college, but there’s something different about grappling with it directly.

And I’ve gotten pretty good at C in the meantime. I’ve also learned a ton about virtual memory, page tables, system calls, various hardware, how data is stored on disk, the ELF file format, how processes are loaded and executed, the x86 calling convention, a little bit of assembly code, just to name a few.

Check out https://wiki.osdev.org for where to start. I’m hoping to start writing some blog posts about all of this soon, to provide a resource to complement the osdev wiki. A lot of info on this is surprisingly hard to dig up.


One resource I stumbled upon was The Dream Machine book. Very broad historic overview of computing history throughout last 70 years or so.


I did a quick search and I didn’t see HOPL mentioned (History of Programming Languages). You can learn a lot about this history of computing in general by going through that workshop series. HOPL IV was just last year and had some great talks. https://hopl4.sigplan.org


Turing's Cathedral by George Dyson is a good source on the history and development of computation in the 1930s through to the 50s. It's very centred on the work that was done at Princeton by Von Neumann et al [1] and lacks coverage of important work that was going on at the same time in Germany, the UK, and other places.

You might want to look into how the idea of computation came out of mathematical work in the early twentieth century. The Annotated Turing by Charles Petzold is good if you're up for some maths.

Aerospace and spaceflight were some of the first activities that required large-scale software development. You could check out Sunburst and Luminary by Don Eyles and Digital Apollo by David Mindell.

[1] The author's father was Freeman Dyson, who was at the Institute for Advanced Study (at Princeton) with Einstein, Gödel and others.


If you like podcasts, there is Advent of Computing. It's not chronological, instead covering a different topic every episode. Most recent episodes are about magnetic core memory, INTERCAL, a hypertext system developed by the US military, and the Analytical Engine, respectively. There's over 80 episodes now so there's a lot to learn about.

Website: https://adventofcomputing.com/

RSS: https://adventofcomputing.libsyn.com/rss

If you want an idea of how computers work, there are toy virtual machines that are a good teaching tool (https://peterhigginson.co.uk/RISC/).


I find reading old issues of Byte magazine from 1975 up to around 1989 to be very educational. There appears to be a complete archive here: https://archive.org/details/Byte-Magazine-Complete


I'm getting this error page at that link:

    The item you have requested had an error:
    Item cannot be found.
    which prevents us from displaying this page.

    Items may be taken down for various reasons, including by decision of the
    uploader or due to a violation of our Terms of Use.


Weird, that must have just happened! They are also available here. This site has a crazy amount of old magazines and has been running for years: https://worldradiohistory.com/Byte_Magazine.htm


On the more "history" side of things -- Podcasts!

My very first introduction to anything "old" tech was through the TechStuff podcast[0] (re: 2011-era episodes, so sort by oldest).

More recently, the On The Metal podcast[1] has been a really cool deep dive into old tech history, especially the episode (season 2) with John Graham-Cumming.

About implementations, my first real playing around with assembly was "Learn TI-83 Plus Assembly in 28 days"[2].

[0]: https://player.fm/series/techstuff-2152808

[1]: https://oxide.computer/podcasts

[2]: https://tutorials.eeems.ca/ASMin28Days/welcome.html


I love this question because I’ve also been fascinated with the history of the field, some suggestions below.

From September 2021:

A new history of modern computing

https://mitpress.mit.edu/books/new-history-modern-computing

Skip around its various chapters; it's full of little details.

Also a fun read is this old article about the silicon in Silicon Valley; it's from a long-dead magazine and is titled They Would Be Gods:

https://www.dropbox.com/s/l9mi2aqnyf5fp3l/They%20Would%20Be%...

Lastly, the part I enjoyed most of Walter Isaacson's bio of Jobs was the adjacent history.


Worth finding and watching is the three-part PBS series called Triumph of the Nerds, hosted by Robert X. Cringely.

It covers the rise of the PC up until the early 90s and has interviews with everybody, including Bill Gates, Steve Jobs, Larry Ellison, etc. It's pretty amazing.


This site has a nice timeline of computer development history dating back to the 1930s:

https://www.computerhistory.org/timeline/computers/


Computer history museum in San Jose is pretty cool.


Are you referring to the one in Mountain View, or is there another one?



Many many suggestions two years ago in this Ask HN: Computer Science/History Books?

https://news.ycombinator.com/item?id=22692281


1. Visit Bletchley Park and the attached computer history museum.

2. Check out a recent computing history book like: Thomas Haigh and Paul E. Ceruzzi (2021) A New History of Modern Computing, Cambridge, MA, USA: MIT Press.


The MIT Press has published a fairly large set of books, mostly about IBM, that I found very useful for understanding the 1950s (and, for IBM, the early-mid 1960s, as in the System/360). I would start with Project Whirlwind: The History of a Pioneer Computer by Kent C. Redmond (that's a Digital Press, as in DEC, book). That computer by and large defined a great many standards in the way we build computers to this day, and that project also developed the first sane type of RAM: core memory.

The MIT Press books describe IBM starting in the punched card data processing era, how the above intersected with IBM in the SAGE project, and much more. IBM defined a great deal of what was used for a long time by many companies, due to a previous punched card anti-trust settlement which required them to FRAND license their patents. So their electromechanical wizardry carried over into computer systems: things like the half-inch reel-to-reel tape became ubiquitous, and were still the thing in the first part of my computing career.

DEC also published some good books about its history, I can look those up if you want.

These are some of the topics I've found to be very interesting; there's of course lots more published by them and others. But I like to focus on what became wildly successful, or what caused a company like IBM to falter very badly, like when it became dogmatic in the System/360 days that virtual memory was a bad thing and would not be supported, in one stroke costing them the high-end academic market and eventually resulting in their becoming a niche player.

For DEC, a study of the PDP-6/10 and PDP-11 is a necessity if you want to follow my interests; the former became a dominant replacement for IBM computers, and the latter was the system UNIX first became big on.


I highly enjoy the nandgame[0]. It's a game that goes from the basics of building simple logic gates all the way up through building memory and an ALU. While you can go into it blind, expect to need to study and look up a ton of stuff if you have never had an intro electronics course. Best of all, it's a free web game.

Classes to look into. An intro course in microcontrollers would be a good place to start. Usually you will find them attached to the Electrical Engineering department. Maybe take a course in Circuits or Computer Architecture.

[0] - www.nandgame.com



It's tangential, but "Where Wizards Stay Up Late: The Origins of the Internet" by Katie Hafner and Matthew Lyon is one of my favorite books on the foundations of the Internet.


I came here to post this - fascinating book, which I loved.


A great summary, from someone who was influential in web technologies, can be found in this series "Crockford on JavaScript": https://www.youtube.com/watch?v=JxAXlJEmNMg&list=PL766437924...

(context: https://en.wikipedia.org/wiki/Douglas_Crockford)


Youtube "The computer chronicles"

Fascinating show, mostly about micros but featuring early industry legend Gary Kildall.

The software reviews are hilarious. The predictions of the future of computing always wrong. The guests demoing stuff always cut off as soon as it gets interesting.

I started programming as a teenager in that era but never saw the show in period. For me it's eye-opening just how amateur the industry really was. The show is unintentionally funny now, but really gives a great idea of the time period.


I found this to be a pleasant primer before delving deeper into the subject. Though, as you can see, as with all things, there are different takes on its merit based on where people are coming from.

https://www.goodreads.com/book/show/191355.Darwin_Among_The_...



Read these books-

- Innovators by Walter Isaacson

- Code by Charles Petzold

- The Annotated Turing by Charles Petzold

- Where Wizards Stay Up Late by Katie Hafner

- The Information by James Gleick


Though it's geared more for the non-CS, general population, I found Crash Course Computer Science with Carrie Ann Philbin to explain concepts clearly and it's entertaining https://www.youtube.com/playlist?list=PL8dPuuaLjXtNlUrzyH5r6...


If you have the money and time to take classes, I'd recommend which ever of the standard CS foundation courses interest you:

- Programming Languages (and then Compilers, my favorite)

- Algorithms

- Operating Systems

At a decent school with some level of difficulty, you'll learn the big picture while doing fun projects for homework, along with history.

Programming is a craft, not a science, but it overlaps with math in a lot of places.


A Computer Called LEO, by Georgina Ferry. This book tells the story of how Lyons teashops created LEO, the first business computer. It also tells the story of early computing, from the Difference Engine of Charles Babbage to the codecracking computers at Bletchley Park and the ENIAC in the US, and the story of postwar British computer business.


https://oxide.computer/podcasts

This podcast is everything


> I've become very aware that JS/TS and Swift etc are just APIs on top of APIs.

I'd call them abstractions. This is how it is: high-level programming languages (C#) are just abstractions so that we don't have to remember machine instructions. Similar to a name in Contacts connecting to a phone number; the former is far easier to remember.


Read papers by the people who actually invented things. I've been doing this a little recently, it's very eye opening.


I had aspirations of writing a book on the 'pre-"software engineering" history of software' that hasn't made much progress.

I used NATO's conferences in 1968 and 1969 on "The Software Problem" as my inspiration.

Now that the ACM digital library is available without subscription, that would be a good resource of their publications.


Maybe buy a Raspberry Pi Pico and code it in assembly?

Or try to find a retro computer, e.g. a BBC micro and start programming it for fun?


I was looking at a vintage computer, so yeah that's a possible route. Been using emulators so far but it's not the real thing. Thanks.


I'd say if you have experience with vintage computers, then it's probably worthwhile to get one. However, just work with an emulator if holding a real machine isn't that exciting to you, because the cost of maintenance might be pretty high.


I'd second the emulator recommendation. While programming, say, an Apple ][+ can be fun, getting one running maybe not so much. Plus, with an emulator, you'll have the ability to do things like write your code in a modern editor and then paste it over to the emulator rather than trying to work with 8-bit tools. I used to hand-assemble my 6502 code back in the day because I couldn't afford to buy a fancy macro assembler, but I wouldn't recommend writing code in long hand and filling in the hex codes on paper before typing it all in to anyone in 2022.


An intermediate step between real hardware and emulation could be buying the hardware for a MisterFPGA.


PBS made a "crash course" computer science series that covers a lot of topics https://www.youtube.com/watch?v=tpIctyqH29Q&list=PLH2l6uzC4U...


I ended up binging the entire series after seeing this.


“How Computers Really Work” by Matthew Justice. [1]

I enjoyed this book because every chapter includes hands-on hardware and software experiments for you to see the concept described in action.

[1] https://www.howcomputersreallywork.com/


Maybe not exactly what you're looking for, but Xibalba BBS (https://xibalba.l33t.codes for a web UI) hosts a ton of articles on computing history in the files section.


My HS teacher made a pretty good high level video on computing history. I recommend starting there.

https://www.youtube.com/watch?v=MZ3tSPF83yo


Obligatory mention: "Secret History of Silicon Valley" by Steve Blank [0].

[0]: https://steveblank.com/secret-history/


I enjoyed this book recently:

Computer: A History of the Information Machine by Martin Campbell-Kelly, William Aspray

But this is first of all a history book. Nonetheless, I learned a lot of things!


A tangent, but you might find interesting starting points from the nand2tetris course (www.nand2tetris.org), and from reading seminal papers by the likes of Turing and Church.


The Acquired podcast is really good. Both speakers have CS degrees but are VCs now. Learned a ton from the TSMC, NVIDIA, Sony, A16Z, Epic, and Sequoia episodes.


If you're ever in Silicon Valley, take a stroll through the Computer History Museum (which used to be SGI for some meta history fun).


_UNIX A History and a Memoir_ by Brian W Kernighan


I enjoyed a book called “The Binary Revolution” by Neil Barrett. It gave me a good sense of computing history from c1930 to c2000.


I would recommend: The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution - Walter Isaacson


hello,

about 2 years ago there was a similar thread here @ HN

* https://news.ycombinator.com/item?id=22907211

awesome-computer-history

* https://github.com/watson/awesome-computer-history

br v



Computer History Museum in Mountain View (CA) is amazing. Can spend half a day there.


A few recommendations from an avid armchair computer historian:

- Dealers of Lightning is a wonderful book covering Xerox PARC's history and contributions. If you don't know what Xerox PARC is, then you should definitely read it.

- Where Wizards Stay Up Late is a highly-readable, engaging book covering the history of the Internet's early development.

- Soul of a New Machine gives a compelling glimpse into the era when physical machines and unique architectures were more dominant than software in shaping the market.

- The Jargon File as maintained by Eric Raymond is not without controversy, but I think it's still fair to say a lot of computing folklore and cultural history is preserved there. http://www.catb.org/jargon/html/index.html

- Folklore.org is a wonderful, wistful collection of stories from the early days of Apple Computer, as told by some of the engineers and programmers who made it what it was in the 80s and early 90s.

- The Thrilling Adventures of Lovelace And Babbage is a wonderful graphic novel that's full of utterly ridiculous fiction that's only loosely inspired by the title characters. However, it is jam-packed with footnotes about the actual history from top to bottom, and in my opinion, there probably isn't a better or more fascinating glimpse of the proto-history of the computer anywhere.

- Douglas Engelbart's Mother Of All Demos is well worth watching (can be found on YouTube), and maybe reading some commentary on. Mind-blowing what his team put together that we still haven't really matched in some ways.

- Vannevar Bush's piece "As We May Think" isn't really about computers, but it's hard not to connect it to them when you read it. And then maybe to sigh and wonder how someone who didn't have any machine like what he describes can have a vision more compelling than what we've actually managed to build, so many decades before it happened.

- If you're interested in hypertext, look into Ted Nelson. None of his work ever really took off, and Project Xanadu was a legendary mishandled undertaking, but his vision for what might have been is fascinating, and influenced many of the software pioneers, as I understand it.

- This glorious video of using a 1930s teletype as a command-line Linux terminal taught me a surprising amount about why the classic Unix tools work as they do. https://www.youtube.com/watch?v=2XLZ4Z8LpEE

Enjoy!


Live a long time and never stop learning.


You may find the history of the first large-scale digital computer interesting.

https://www.amazon.com/Colossus-secrets-Bletchley-code-break...

It was used by the British to break the Lorenz cipher (which the Nazis used to encrypt high-level strategic communications.)


There are YouTube videos of Seymour Cray.

Learn about the CDC 6600 and its architecture compared with the top IBM 360 computers. Learn about the Cray-1.

Cray was the builder of the fastest computers in the world for a long time. Always sold early machines of each model to scientific govt labs and NSA. The first 6600s were delivered to Livermore and Los Alamos (Wikipedia CDC 6600).

With less than 30 people Cray built a computer (the 6600) about 2 times faster than anything IBM with its massive budget could build. There is a famous letter by IBM's Chief Watson Jr. about this fact at The Computer History Museum website.

When IBM used ICs for the 360's, Cray still used transistors for the CDC 6600. His reasoning is great.


If you wish to learn how COMPUTING works, not computers... then Rosen's book on Discrete Mathematics is the MASTER KEY.



