Digital Apollo, by David Mindell (development of Apollo guidance and avionics generally)
The Dream Machine, by Mitchell Waldrop (technical development leading up to the Arpanet, thence to the Internet)
Colossus: The Secrets of Bletchley Park's Codebreaking Computers, by Jack Copeland et al.
Unfortunately, there's a lot more of the technically unsophisticated junk that Knuth is complaining about, and the excuses offered here rather badly miss the point. ("The history of software is not the history of computer science". Yes, and so? How does a history of software from which all technical detail has been drained away contribute anything of value?)
Let me reiterate one recommendation in particular: Mechanizing Proof, by Donald MacKenzie. This is a superb introduction to the technical subject of formal methods as well as a superb history! My only regret is that it was published in 2001, and a lot has happened in the decade since.
Google is your friend and Bob's your uncle.
Worldcat indicates that the book is available from various libraries: https://www.worldcat.org/title/ignition-an-informal-history-...
The science was so good that I got a bad grade on my history class paper, because I focused too much on the science and not enough on the "storytelling" of history. Which apparently puts me in good company alongside Knuth :-)
Now that you mention it, inventors and technology are treated in a similar fashion. I'm not sure why. Maybe because historical biographies adopt their methodology from general history, where kings and generals are simply dots in a larger picture that tries to describe the society of the time?
1. A timeline, where you're expected to remember what happened before and after each event.
2. Events explained by the causal reasoning presented in the book itself. For example: "America got dragged into WWII because of an unprovoked attack by Japan that caused a massive loss of life and military resources."
And the sheer volume of world history covered in a year or so makes it hard to do any topic justice.
The entire planet gets into a war in the early 1900s because some two-bit archduke of something-or-other got shot? WTF? Woah, back up there, Teach; you've got some explaining to do. (And when I read the explanation years later, I still go "WTF?")
Point is, if the cliché is correct then we're doomed to repeat history again and again, because there seems to be little effort to explain why things happened. What the hell was the U.S. doing in some dot on the map in Southeast Asia in the '60s? Domino theory? C'mon, I doubt anyone believed that even at the time, if the ten-year-old version of me was saying, "how dumb do you think I am?"
But, as you mention, it would seem that there's little enough time to teach the timeline, let alone explain the treaty situation of early 1900s Europe such that WWI makes sense.
Btw, I was at the WWI memorial in Kansas City (created just a few years after the war), and they list Darwin as a cause of the war! The leaders of the world had read his "On the Origin of Species". It was variously interpreted politically to mean 'the strongest culture will win' and was used by the German leadership to justify war. War was natural; it would winnow the nations down to just those worthy of survival, and Germany was the most worthy.
The cause of a war is not usually taught. It's the events that led up to the war that are. http://en.wikipedia.org/wiki/Events_leading_to_the_attack_on...
Does anyone know of resources (books, websites, etc.) on historical advances that actually cover the technology?
When Knuth wrote Vol. I, "Fundamental Algorithms", there were almost no computer books published other than manufacturer manuals and a few introductory programming language books. Knuth's series is really a history of algorithms, with each one traced back to its originator. The early history is on the record.
We're now inundated with corporate histories, mostly of applications companies. There's a glut of CEO biographies. There are academic papers on the history of the spreadsheet. There's no longer a lack of history on the user-facing side.
This comes with progress in the field. A book-length history of the electric motor, from Henry to Edison to Sprague to Tesla and up to modern brushless motors, would be interesting - to a very small number of people. Popular articles on motors can be found in mainstream publications from 1880 to 1925 or so. After that, they were just routine items, not newsworthy. Low-level algorithms now belong in that category. A few people still have to study how to do hash tables, just as a few people have to study rotating electrical machinery theory and design new motors.
The important thing is not to lose the history. Campbell-Kelly says "we can't save everything". Well, we can. Disk space is cheap. Future generations (of humans and machines) can re-summarize it.
I personally think Linked Data is a rebranding meant to rescue the Semantic Web dream from its real-world failures, by saying that all interlinked databases are part of Linked Data and ergo the Semantic Web, and my limited research of the historical record backs that up.
But this knowledge doesn't really affect how most people do business now. (Though we end up with highly paid consultants who peddle a false mythos rather than paying historians for the real information.) It's mostly relevant for those points that Knuth enumerated, which are superb humanistic goals that few beyond the academically secure or retired can afford to advance.
While much can be saved, many things are only in people's heads, or in their personal effects. Doing that history now, rather than waiting, means that more of those can be recorded for the future.
It's hard to fight because it is rarely based on outright lies, per se, just pervasive misinterpretations which cast things in a false light. Disputing it makes you look like a nitpicking fool picking at things which barely even exist, but accepting it makes history seem like a teleological process, inevitably aimed at today like an arrow aimed at a target, with all previous processes culminating in what we have now.
1. Too many principals of said history are still alive, still influential, and still kicking too much to allow much in the way of historical analysis. It is much easier for a historian to stick with a subject where those who remember events are safely dead.
2. It would be improper to expect anyone to write informatively about ideas they cannot themselves understand. And it would be improper (as the author of the article writes) to expect a historian to understand an unrelated technical field.
This is unfortunate, because "history", the preferences of historians notwithstanding, is the story of technology. I do not believe there is any great difference between any of us and some goober fingerpainting on the walls of a cave.
So, what the heck was Goober doing with his abilities that were crucial? He found the cave, kept out animals that could hurt him, built a fire to keep it warm, raised families, found food and water, made tools, weapons, and clothes, etc. Apparently he found good uses for his brain.
"The computer and its software nervous system brought a revolution in human development as significant as the steam engine, the automobile or the aeroplane, and even more effective in shrinking the planet"
What would you get from a history of recent revolutions in human development written by someone who doesn't know why steam-powered airplanes never really caught on? It doesn't matter what kind of geopolitical story they can put together; the idea just doesn't work.
Margaret Gowing told C-K, regarding his history of EDSAC, "What you have written is clearly very good. I know practically nothing about computers, but I can tell that what you have written is good history, so far as it goes. However, let me urge you to look beyond programming technology to consider the kinds of people who were using computers and the problems that they were solving." Which is excellent advice, certainly, but part of what he comes up with years later, in his embarrassment over not following the advice the first time, is, "Jim Wilkinson investigated errors and stability in digital numerical methods and developed advanced matrix programs. These were to prove vital in understanding and preventing 'flutter' for the British aircraft industry, which was still reeling from the de Havilland Comet air disaster of 1954," which makes me suspect he knows about as much about numerical methods as I do.
But can C.S. departments manage that as an institutional matter?
That link just redirected to the mobile version on my Galaxy Note 3, but you reminded me to try the "Request desktop site" option which worked fine.
It's funny that the mobile site is completely unreadable on a mobile device, whereas the desktop version formats cleanly on mobile and is much easier to read!