The article states that the system was cooled using a solution of sodium chromate to inhibit corrosion. However, the Wikipedia page for sodium chromate states that it is very corrosive. Is it a typo or something?
It's also mentioned that the computer uses one of the first integrated circuits for miniaturization. Do you know if this can be definitively traced to advances in industrial/consumer products? It's a common trope that military research trickles down, making it a "good" thing, but it's not clear whether this actually happens or whether the progress would have been made eventually without the need for these machines.
Sodium chromate is highly corrosive to humans (as well as carcinogenic; see the movie Erin Brockovich). However, it inhibits corrosion in metals, acting as a passivating inhibitor that forms some sort of protective oxide layer.
I've been doing a lot of research on the impact of Minuteman and Apollo on the IC industry (which led to the current post). The Air Force likes to take credit for the IC industry, as does NASA, but the actual influence is debatable. My take is that both projects had a large impact on the IC industry, with the larger impact coming from Minuteman. However, even in the absence of both projects, there was a lot of interest in and demand for ICs. If I had to take a quantitative guess, I'd say that those projects advanced ICs by maybe a year, but the basic trajectory would have remained the same.
Ken, the story I heard wasn't so much that the MM3 demand created the IC industry, but rather that it created the quality culture that then led to ICs being widely accepted, because they were reliable. The AF had such purchasing power due to the program that it was able to impose quality standards that the industry had not previously been expected to meet.
Chromates are effective corrosion inhibitors for aluminum alloys and some other metals. Here's a brief article about how they work with aluminum:
"Inhibition of Aluminum Alloy Corrosion by Chromates"
When the Wikipedia entry's "Safety" section says that sodium chromate is corrosive, in context it means "destructive to human tissue by contact." That is, like sodium hydroxide (lye) and many other chemicals, in concentrated form it can destroy skin and eyes.
In a serial computer, you have a 1-bit ALU, say a full adder that generates a sum and a carry. Each clock cycle you read two bits and feed them into the adder, and then you write back the sum bit. You hold the carry in a flip-flop to use in the next clock cycle. It's just like doing binary addition with pencil and paper, one bit at a time.
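A minimal C sketch of that loop (the names `serial_add` and `WIDTH` are invented for illustration, and the carry "flip-flop" is just a variable here):

```c
#include <stdint.h>
#include <stdio.h>

#define WIDTH 8  /* number of clock cycles = number of bits */

static uint8_t serial_add(uint8_t a, uint8_t b) {
    uint8_t sum = 0;
    unsigned carry = 0;                      /* the carry flip-flop */
    for (int cycle = 0; cycle < WIDTH; cycle++) {
        unsigned abit = (a >> cycle) & 1;    /* read one bit of each operand */
        unsigned bbit = (b >> cycle) & 1;
        unsigned s = abit ^ bbit ^ carry;    /* full-adder sum */
        carry = (abit & bbit) | (carry & (abit ^ bbit)); /* full-adder carry */
        sum |= (uint8_t)(s << cycle);        /* write the sum bit back */
    }
    return sum;
}

int main(void) {
    printf("%u\n", serial_add(100, 55));     /* prints 155 */
    return 0;
}
```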
Note that you need to start with the lowest bit with a serial computer, which explains why x86 is little-endian. It goes back to the Datapoint 2200, a desktop computer made from TTL chips and running serially. The Intel 8008 processor was a copy of the Datapoint 2200 (as was the Texas Instruments TMX 1795). Although the 8008 was parallel, it copied the little-endian architecture of the Datapoint 2200.
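To make "little-endian" concrete, here's a tiny C check (it assumes you run it on a little-endian machine such as x86):

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint32_t x = 0x11223344;
    uint8_t *p = (uint8_t *)&x;
    /* On a little-endian machine the lowest-addressed byte is the least
       significant one, just as a serial machine sees the lowest bit first. */
    printf("%02x %02x %02x %02x\n", p[0], p[1], p[2], p[3]);
    /* on x86 this prints: 44 33 22 11 */
    return 0;
}
```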
I've often wondered if serial computers could have a useful role again. At very high clock speeds and wide data paths, you hear about trouble controlling signal skew. In contrast, imagine a serial computer clocking data around at 8 GHz vs. an 8-bit computer clocking data at 1 GHz. You have to deal with faster speeds, but there's no skew to manage, and it seems like a 1-bit ALU might be simpler (and faster) than a 64-bit one.
Hmm, I see. How do opcodes and jumps work, then? Do you also read them bit-by-bit and reconfigure the ALU / code paths? Is addressing also single-bit?
Here's a 16-bit bit-serial computer I made and tested on an FPGA: https://github.com/howerj/bit-serial. If you look at `bit.c`, it looks like an ordinary 16-bit accumulator-based virtual machine with a few odd instructions that make more sense when you know how a bit-serial CPU works; nothing special about it. However, the VHDL in `bit.vhd` shows how all those instructions are processed in a bit-serial fashion, how data is fetched and stored in shift registers, etcetera.
The bit-serial CPU in `bit.vhd` is actually customizable: you can quite easily make a 32-bit, a 14-bit, or a 27-bit CPU from that VHDL if you want.
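For illustration, a rough C model of the recirculating shift-register idea (a sketch only, not the actual code from `bit.vhd`; the names `shift_add` and `N` are invented here). The accumulator shifts operand bits out of its low end while sum bits enter at the high end, which is also why the width is easy to change: it's just the register length.

```c
#include <stdint.h>
#include <stdio.h>

enum { N = 16 };  /* register width; change this (and the type) to re-parameterize */

static uint16_t shift_add(uint16_t acc, uint16_t operand) {
    unsigned carry = 0;                      /* the carry flip-flop */
    for (int i = 0; i < N; i++) {
        unsigned a = acc & 1;                /* bit leaving the accumulator */
        unsigned b = operand & 1;            /* bit leaving the operand register */
        unsigned s = a ^ b ^ carry;
        carry = (a & b) | (carry & (a ^ b));
        acc = (uint16_t)((acc >> 1) | (s << (N - 1)));  /* sum bit enters at the top */
        operand >>= 1;
    }
    return acc;                              /* after N cycles, holds the result */
}

int main(void) {
    printf("0x%04x\n", shift_add(0x1234, 0x0042));  /* prints 0x1276 */
    return 0;
}
```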
If you write down two binary numbers on paper and add them, you will almost certainly do the computation bit-serially, adding each pair of corresponding bits with the carry from the previous operation. Serial computers work the same way.
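For example, adding 0110 (6) and 0011 (3), starting from the lowest bit:

```
cycle  a  b  carry_in  sum  carry_out
  0    0  1     0       1       0      (bit 0)
  1    1  1     0       0       1      (bit 1)
  2    1  0     1       0       1      (bit 2)
  3    0  0     1       1       0      (bit 3)
result: 1001 (9)
```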
I've got the book on my desk right now :-) It's a bit of an unusual book because it is full of technical details, but it also has a fair bit of sociological content, like "the construction of technical facts", "technological determinism", and "sociology of technological knowledge". This is in contrast to, say, "Minuteman: A Technical History", which is strictly facts and details. They are both good books, but it is interesting how they have completely different styles and focuses.
I agree with your observations about "Inventing Accuracy". Personally I found the sociology focus a bit unexpected, and maybe a little too strong in some chapters, but still a worthwhile point of view.
I will definitely be taking a look at "Minuteman: A Technical History". Books dealing at least in part with the history of IMUs are few and far between.