https://www.goodreads.com/book/show/7090.The_Soul_of_a_New_M...
which was one of the first computer books I ever read --- I believe in an abbreviated form in _Reader's Digest_ or in a condensed version published by them (can anyone confirm that?)
EDIT: or, maybe I got a copy from a book club --- if not that, must have gotten it from the local college after prevailing upon a parent to drive me 26 miles to a nearby town....
“I am going to a commune in Vermont and will [In my mind I've always heard a 'henceforth' inserted here for some reason] deal with no unit of time shorter than a season”
One of my favorite quotes in the book - when an overworked engineer resigns from his job at DG. The engineer, coming off a death march, leaves behind a note on his terminal as his letter of resignation. The incident occurs during a period when the microcode and logic were glitching at the nanosecond level.
His resume doesn't really indicate he didn't join a commune. Leaving Data General in 1979 and joining ComputerVision in 1979 wouldn't preclude joining a commune for a season. However, a letter to the editor in 1981 [1] provides evidence that he left Data General specifically to work for ComputerVision, with both events happening in Spring.
I don't know about an abridged version, but it's one of the best books about product development ever written. I actually dotted-lined to Tom West at one point, though a fair bit after the events of "the book." (Showstopper--about Windows NT--is the other book from a fairly similar era that I'd recommend from today's perspective.)
I'm about 75% through the audiobook, and it's absolutely fantastic.
The most surprising thing so far is how advanced the hardware was. I wasn't expecting to hear about pipelining, branch prediction, SIMD, microcode, instruction and data caches, etc. in the context of an early-80s minicomputer.
Another great Kidder book is "House", which applies the same perceptive profiling to the people involved in building a single house -- the couple, the architect, and the construction workers. There's an argument that a lot of the best work is done not in pursuit of an external reward but because doing good work is itself rewarding, and one way of viewing these books is as immersive explorations of how that plays out.
Indeed, one of the more memorable set pieces in chapter 1:
"He traveled to a city, which was located, he would only say, somewhere in America. He walked into a building, just as though he belonged there, went down a hallway, and let himself quietly into a windowless room. The floor was torn up; a sort of trench filled with fat power cables traversed it. Along the far wall, at the end of the trench, stood a brand-new example of DEC’s VAX, enclosed in several large cabinets that vaguely resembled refrigerators. But to West’s surprise, one of the cabinets stood open and a man with tools was standing in front of it. A technician from DEC, still installing the machine, West figured.
Although West’s purposes were not illegal, they were sly, and he had no intention of embarrassing the friend who had given him permission to visit this room. If the technician had asked West to identify himself, West would not have lied, and he wouldn’t have answered the question either. But the moment went by. The technician didn’t inquire. West stood around and watched him work, and in a little while, the technician packed up his tools and left.
Then West closed the door, went back across the room to the computer, which was now all but fully assembled, and began to take it apart.
The cabinet he opened contained the VAX’s Central Processing Unit, known as the CPU—the heart of the physical machine. In the VAX, twenty-seven printed-circuit boards, arranged like books on a shelf, made up this thing of things. West spent most of the rest of the morning pulling out boards; he’d examine each one, then put it back.
...He examined the outside of the VAX’s chips—some had numbers on them that were like familiar names to him—and he counted the various types and the quantities of each. Later on, he looked at other pieces of the machine. He identified them generally too. He did more counting. And when he was all done, he added everything together and decided that it probably cost $22,500 to manufacture the essential hardware that comprised a VAX (which DEC was selling for somewhat more than $100,000). He left the machine exactly as he had found it."
That reminds me of the US, during the Cold War, intercepting the Soviet "Lunik" satellite while it was in transit by truck (it was being exhibited in the US(!)) and overnight completely disassembling/reassembling it before letting it go on its way, with the Soviets none the wiser.
I like the idea of identifying ‘bit flips’ in papers, which are (if I am following along) statements that precipitate or acknowledge a paradigm shift.
Perhaps the most important bit-flip of this paper’s time (and perhaps first fully realized in it) might be summarized as ‘instructions are data.’
This got me thinking: today, we are going through a bit-flip that might be seen as a follow-on to the above: after von Neumann, programs were seen to be data, but different from problem/input data, in that the result/output depends on the latter, but only through channels explicitly set out by the programmer in the program.
This is still true with machine learning, but to conclude that an LLM is just another program would miss something significant, I think - it is training, not programming, that is responsible for its significant features and capabilities. A computer programmed with an untrained LLM is more closely analogous to an unprogrammed von Neumann computer than it is to one running any program from the 20th century (to pick a conservative tipping point).
One could argue that, with things like microcode and virtual machines, this has been going on for a long time, but again, I personally feel that this view is missing something important - but only time will tell, just as with the von Neumann paper.
This view puts a spin on the quote from Leslie Lamport in the prologue: maybe the future of a significant part of computing will be more like biology than logic?
'Instructions are data' is an insight I've often seen attributed to von Neumann, but isn't it just the basic idea of a universal Turing machine - one basic machine whose data encode the instructions+data of an arbitrary machine? What was von Neumann's innovation here?
Turing machines' instructions are different from their data; a Turing machine has a number of states that it flips between depending on what it reads from the tape.
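To make the stored-program idea concrete, here is a toy sketch (mine, not anything from the paper): a machine whose instructions and data live in the same memory, so fetching an instruction is just a memory read, and a program could in principle treat its own code as data.

    # Toy stored-program machine: instructions and data share one memory array,
    # so code can be read (or rewritten) like any other data. Illustrative only.
    memory = [
        ("LOAD", 6),    # 0: acc = mem[6]
        ("ADD", 7),     # 1: acc = acc + mem[7]
        ("STORE", 6),   # 2: mem[6] = acc
        ("PRINT",),     # 3: print acc
        ("HALT",),      # 4:
        0,              # 5: (unused)
        40,             # 6: data
        2,              # 7: data
    ]

    acc, pc = 0, 0
    while True:
        op, *args = memory[pc]          # fetching an instruction is a memory read
        pc += 1
        if op == "LOAD":    acc = memory[args[0]]
        elif op == "ADD":   acc = acc + memory[args[0]]
        elif op == "STORE": memory[args[0]] = acc
        elif op == "PRINT": print(acc)  # -> 42
        elif op == "HALT":  break

A Harvard-style machine, by contrast, would keep the instruction list and the data cells in two separate memories.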
> 6.8.5 ... There should be some means by which the computer can signal to the operator when a computation has been concluded. Hence an order is needed ... to flash a light or ring a bell.
(later operators would discover that an AM radio placed near their CPU would also provide audible indication of process status)
Similarly, on-board soundcards are notorious for this. Even in my relatively recent laptop I can judge activity by the noise in my headphones if I use the built-in soundcard, thanks to electrical interference. Especially on one motherboard I had, I could relatively accurately monitor CPU activity this way.
There's also audible noise that can sometimes be heard from singing capacitors[1] and coil whine[2], as mentioned in a sibling comment.
The connection to scrolling is usually increased activity on the bus. CPU feeding the graphics card new bitmaps is just one source of that traffic. In a previous job, I could tell whether a 3-machine networking benchmark was still running based on the sound.
There was a Lenovo docking station compatible with the T420 (around 2012). The headphone jack had a shrieky noise on it whenever the Gigabit Ethernet connection got going. Took a while to figure that one out.
The other two docking stations were built differently and had the wires run somewhere else.
I have experienced two or three computers in my life (at least one laptop) that produced a barely audible sound when CPU activity changed. The most memorable was a PowerBook G4 with a touchpad, and as I slid my finger slowly across the pad in a quiet enough room, I could hear a kind of tick each time the pointer moved a pixel.
Not necessarily. My GPU audibly sings in the kHz range whenever it comes off idle, a long time before the fans actually start up. It could be electrical noise from the fan drivers and/or motor coils if they're running at low speed, but it's definitely not the sound of air movement. And if you're e.g. processing photos on the GPU, you can hear it starting and stopping, exactly synced to the progress of each photo in the UI.
This was very common with the first sound cards (at least on the PC). As I remember, it was only the Creative Sound Blaster that finally ended the era of continuously hearing the "machine soul".
My favourite was troubleshooting an already ancient original IBM XT in 1992... The operator was complaining about the screen giving her headaches.
Sure enough - when I went onsite to assist, that "green" CRT was so incredibly wavy, I could not understand how she could do her job at all. The first thing I tried was moving it away from the wall, in case I had to unplug it to replace it.
It stopped shimmering and shifting immediately. Moved it back - it started again.
That's when we realised that her desk was against a wall concealing the major electrical feeds to the regional Bell Canada switching data centre on the floors above her.
I asked politely if she could have her desk moved - but no... that was not an option...
... So - I built my first (but not last) solution using some tin-foil and a cardboard box that covered the back and sides of her monitor - allowing for airflow...
It was ugly - but it worked; we never heard from her again.
My 2nd favourite was with GSM mobile devices - and my car radio - inevitably immediately before getting a call, or TXT message (or email on my Blackberry), if I was in my car and had the radio going, I would get a little "dit-dit-dit" preamble and know that something was incoming...
(hahahaha - I read this and realise that this is the new "old man story", and see what the next 20-30 years of my life will be, boring the younglings with ancient tales of tech uselessness...)
> My 2nd favourite was with GSM mobile devices - and my car radio - inevitably immediately before getting a call, or TXT message (or email on my Blackberry), if I was in my car and had the radio going, I would get a little "dit-dit-dit" preamble and know that something was incoming...
I remember this happening all the time in meetings. Every few minutes, the conversation would stop because a phone call was coming in and all the conference room phones started buzzing. One of those things that just fades away so you don't notice it coming to an end.
The “GSM chirp”. I never got to hear it much because my family happened to use CDMA phones in that era, but I do remember hearing it a few times. I know it was well known.
At a certain large tech company, at the peak of GSM data-capable phones (e.g. Palm Treo, Blackberry), it was accepted practice for everyone to turn off data or leave their phone on a side table, because the conference speakerphones on the tables amplified the data bursts.
Also, during this era I was on a flight and the pilot came over the PA right before pushing back from the gate, saying exasperatedly "Please turn your damn phones off!" (I assumed the same RF noise was leaking into his then-unshielded headset.)
I had an IBM in an electrical generating station that had a HUGE 480-volt 3-phase feed in the next room, not far from the monitor. The display would swirl about 1/2 of an inch at quite a clip.
The solution I picked was to put the machine in text/graphics mode (instead of the normal character-ROM text mode; this was back in the MS-DOS days), so that the vertical sync rate matched up with the swirling EM fields and there was almost zero apparent motion.
Many years ago, I was a user of an IBM 1130 computer system. It filled a small room, with (as I recall) 16K 16-bit words and a 5MB disk drive, which was quite noisy. I would feed in my Fortran program, then head down to the other end of the building to get coffee. The computer would think about the program for a while, and then start writing the object file. This was noisy enough that I'd hear it, and head back to the computer room just in time to see the printer burst into action.
(Please feel free to reference the Monty Python “Four Yorkshiremen” sketch. But this really happened.)
Much of my earliest computer experience was on an 1130 clone, the General Automation 18/30. Never did the AM radio thing, but you could see what phase the Fortran compiler was up to just by the general look of the blinkenlights.
> When people who can’t think logically design large systems, those systems become incomprehensible. And we start thinking of them as biological systems. And since biological systems are too complex to understand, it seems perfectly natural that computer programs should be too complex to understand.
For some time I've been drawing parallels between ML/AI and how biology "solves problems" - evolution. And I'm also a bit disappointed that the future might lead us in a direction other than the mathematical elegance of solving problems.
I really loved that quote in the article. It’s such an excellent description of the “computer voodoo“ users come up with to explain what is, to them, unexplainable about computers.
You’re right though, we’re basically there with AI/ML aren’t we. I mean I guess we know why it does the things it does in general, but the specific “reasoning“ on any single question is pretty much unanswerable.
Evolution: optimization done by selection; the result is DNA, which does amazing things (we can't match them with traditional programming) but is bloody hard to decipher.
Neural network: created via training; does amazing things, but (this is the area where my knowledge is too small) it's bloody hard to decipher why it does what it does.
"Traditional" programming: you can (as in the article) explain it down to the single most basic unit.
This is a great read. von Neumann was pivotal in the design of the architecture, and I think his contribution is way underappreciated - did you know he also wrote a terrific book, The Computer and the Brain, and coined the term "the Singularity"? https://onepercentrule.substack.com/p/the-architect-of-tomor...
"The attribution of the invention of the architecture to von Neumann is controversial, not least because Eckert and Mauchly had done a lot of the required design work and claim to have had the idea for stored programs long before discussing the ideas with von Neumann and Herman Goldstine[3]"
Yes, von Neumann was tasked with doing a write-up of what Eckert and Mauchly were up to in the course of their contract building the ENIAC/EDVAC. It was meant to be an internal memo. Goldstine leaked the memo, and the ideas inside were attributed to the author, von Neumann. This prevented any of the work from being patented, btw, since the memo served as prior art.
The events are covered in great detail in Jean Jennings Bartik's autobiography "Pioneer Programmer". According to her, von Neumann really wasn't that instrumental to this particular project, nor did he mean to take credit for things -- it was others, big fans of his, who hyped up his accomplishments.
I attended a lecture by Mauchly's son, Bill, "History of the ENIAC", in which he explains how ENIAC was a dataflow computer that, when it was just switches and patch cables, could do operations in parallel. There's a DH Lehmer quote: "This was a highly parallel machine, before von Neumann spoiled it." https://youtu.be/EcWsNdyl264
I think the stored-program concept was also present internally at IBM around those times, although as usual no single person got the credit for that: https://en.wikipedia.org/wiki/IBM_SSEC
I believe it was probably a typo, and one that existed in the original paper. The tricky thing about it is, a person might not notice it until it was pointed out. (I didn't - but I verified afterward that it's in Lamport's paper)
If you've read the article, and thought to yourself... I wonder what it was like back then, and if there might be some other direction it could have gone, oh boy do I have a story (and opportunity) for you!
It's my birthday today (61)... I'm up early to get a tooth pulled, and I read this wonderful story, and all I have is a tangent I hope some of you think is interesting. It would be nice if someone who isn't brain-foggy could run with the idea and make the billion dollars or so I think can be wrung out of it. You get a bowl of spaghetti that could contain the key to petaflops, secure computing, and a new universal solvent of computing, like the Turing machine, as an instructional model.
The BitGrid is an FPGA without all the fancy routing hardware, that being replaced with a grid of latches... in fact, the whole chip would consist almost entirely of D flip-flops and 16:1 multiplexers. (I lack the funds to do a TinyTapeout, but I'd start down that road should the money appear.)
All the computation happens in cells that are 4-bit-in, 4-bit-out look-up tables (mostly so signals can cross without XOR tricks, etc.). For the times you need a chunk of RAM, the closest I've got is using one of the tables as a 16-bit-wide shift register, which I've decided to call isolinear memory[6].
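As a rough illustration of that cell model (my own reading of the description above - 4 bits in, 4 bits out, one 16-entry lookup table per output bit, outputs latched on the clock - not the author's actual implementation; see the emulators linked below for the real semantics):

    # Rough sketch of one BitGrid-style cell: 4 input bits, 4 output bits, each
    # output driven by its own 16-entry lookup table indexed by the inputs, and
    # all outputs latched on the clock edge. Illustrative only.
    class Cell:
        def __init__(self, tables):
            self.tables = tables          # four 16-entry lists of 0/1, one per output
            self.latched = [0, 0, 0, 0]   # outputs neighbours see this cycle

        def clock(self, inputs):
            """Compute next outputs from this cycle's inputs, then latch them."""
            index = inputs[0] | (inputs[1] << 1) | (inputs[2] << 2) | (inputs[3] << 3)
            self.latched = [t[index] for t in self.tables]
            return self.latched

    # Example: output 0 is the XOR of inputs 0 and 1; outputs 1-3 just pass
    # inputs 1-3 straight through (i.e. the cell doubles as routing).
    xor01 = [(i ^ (i >> 1)) & 1 for i in range(16)]
    pass_thru = [[(i >> b) & 1 for i in range(16)] for b in (1, 2, 3)]
    cell = Cell([xor01] + pass_thru)
    print(cell.clock([1, 1, 0, 1]))   # -> [0, 1, 0, 1]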
You can try the online React emulator I'm still working on [1], and see the source[2]. Or the older one I wrote in Pascal, that's MUCH faster[3]. There's a writeup from someone else about it as an Esoteric Language[4]. I've even written a blog about it over the years.[5] It's all out in public, for decades up to the present... so it should be easy to contest any patents, and keep it fun.
I'm sorry it's all a big incoherent mess.... completely replacing an architecture is freaking hard, especially when confronted with 79 years of optimization in another direction. I do think it's possible to target it with LLVM, if you're feeling frisky.
I don't like to rain on anyone's parade, but this is at least the third time I've seen one of these comments, and if it were me I'd want someone to point out a drawback that these young whippersnappers might be too respectful to mention. Isn't it a truism in modern computer architecture that computation is cheap whereas communication is expensive? That "fancy routing hardware" is what mitigates this problem as far as possible by enabling signals to propagate between units in the same clock domain within a single clock period. Your system makes the problem worse by requiring a number of clock periods at best directly proportional to their physical separation, and worse depending on the layout. If I were trying to answer this criticism, I'd start by looking up systolic arrays (another great idea that never went anywhere) and finding out what applications were suited to them, if any.
Not sure why you're saying that systolic arrays never went anywhere. They're widely used for matrix operations in many linear algebra accelerators and tensor units (yes, largely in research), but they are literally the core of Google's TPU [1] and AWS EC2 Inf1 instances [2].
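For anyone wondering what a systolic array actually does, here is a minimal sketch (illustrative only; an output-stationary arrangement, not the TPU's actual design) of a systolic matrix multiply: operands ripple between neighbouring PEs once per clock and every PE does one multiply-accumulate per clock, so no operand ever travels farther than one cell per cycle.

    # Minimal output-stationary systolic-array sketch: row i of A streams in from
    # the left delayed by i cycles, column j of B streams in from the top delayed
    # by j cycles, and PE (i, j) performs one multiply-accumulate per cycle.
    import numpy as np

    def systolic_matmul(A, B):
        n = A.shape[0]
        acc = np.zeros((n, n), dtype=A.dtype)    # each PE's running partial sum
        a_reg = np.zeros((n, n), dtype=A.dtype)  # A operand currently held in each PE
        b_reg = np.zeros((n, n), dtype=B.dtype)  # B operand currently held in each PE
        for t in range(3 * n - 2):               # enough cycles for the array to drain
            a_reg = np.roll(a_reg, 1, axis=1)    # PEs pass A operands to the right...
            b_reg = np.roll(b_reg, 1, axis=0)    # ...and B operands downward
            for i in range(n):                   # feed the skewed inputs at the edges
                k = t - i
                a_reg[i, 0] = A[i, k] if 0 <= k < n else 0
                b_reg[0, i] = B[k, i] if 0 <= k < n else 0
            acc += a_reg * b_reg                 # every PE does one MAC, in lockstep
        return acc

    A = np.arange(9).reshape(3, 3)
    B = np.arange(9).reshape(3, 3) + 1
    assert (systolic_matmul(A, B) == A @ B).all()

This maps well onto a fixed grid because each PE only ever talks to its immediate neighbours, which is exactly the kind of locality the criticism above is concerned with.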
Please also note that the ENIAC slowed down by a factor of 6 (according to Wikipedia) after its conversion to von Neumann's architecture. BitGrid is slower than an FPGA, but far more flexible for general-purpose computation.
Sorry to burst your bubble but modern FPGAs are already designed this way (latches paired with muxes and LUTs across the lattice). Take a look at the specs for a Xilinx Spartan variant. You still need the routing bits because clock skew and propagation delay are real problems in large circuits.
Yes, I know about clock skew and FPGAs. They are an optimization for latency, which can be quite substantial and can make or break applications.
The goal of an FPGA is to get a signal through the chip in as few nanoseconds as possible. It would be insane to rip out that hardware in that case.
However... I care about throughput, and latency usually doesn't matter much. In many cases, you could tolerate startup delays in the milliseconds. Because everything is latched, clock skew isn't really an issue. All of the lines carrying data are short, VERY short, and thus lower capacitance. I believe it'll be possible to bring the clock rates up into the gigahertz range. I think it'll be possible to push 500 MHz on the Skywater 130 process that Tiny Tapeout uses. (Not outside of the chip, of course, but internally.)
[edit/append]
Once you've got a working BitGrid chip rated for a given clock rate, it doesn't matter how you program it: because all logic is latched, the clock skew will always be within bounds. With an FPGA, you've got to run simulations, adjust timings, etc., and wait for things to compile and optimize to fit the chip; this can sometimes take a day.
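A tiny illustration of the throughput-versus-latency point above (a generic latched pipeline, not BitGrid itself): once every stage is latched, a new input can enter on every clock, so throughput is one result per clock after the pipeline fills, while latency grows with the number of stages a value has to cross.

    # Generic fully latched chain: each stage consumes the value its predecessor
    # latched on the previous clock. Throughput: one result per clock once full.
    # Latency: proportional to the number of stages. Illustrative only.
    def clock(stages, latches, new_input):
        out = stages[-1](latches[-1])                  # result emerging this clock
        for i in range(len(latches) - 1, 0, -1):       # shift values one stage along
            latches[i] = stages[i - 1](latches[i - 1])
        latches[0] = new_input
        return out

    stages = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]   # 3-stage chain
    latches = [0, 0, 0]
    print([clock(stages, latches, x) for x in [10, 11, 12, 13, 14, 15]])
    # -> [-3, -3, -1, 19, 21, 23]: three fill cycles of junk, then ((x+1)*2)-3
    #    for x = 10, 11, 12 -- one finished result per clock.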
The goal of an FPGA is to get a cheap prototype, and/or to serve applications that don't need batches large enough for an ASIC to become cost-effective.
Low-latency links are just a byproduct, added to improve FPGA performance (which is definitely bad compared to an ASIC); you don't have to use them in your design.
If you make a working FPGA prototype, it would not be hard to convert it to an ASIC.
Latency is only part of the problem. Unmanaged clock skew leads to race conditions, dramatically impacting circuit behaviour and reliability. That's the reason the industry overwhelmingly uses synchronous circuit designs, with clocks to trigger transitions.
"The terminology of calling computer parts "organs" is gone now, but the "Control" here describes what we now commonly refer to as CPU."
Actually, what was described is currently called the "Control Unit". We divide CPUs into a Datapath (which includes registers, busses, ALU...) and a Control Unit which is either hardwired or microcoded. Decoding instructions is the job of the Control Unit though it might get a little help from the Datapath in some designs.
The block diagram for a modern processor at the end of the article is just the Datapath - the Control Unit is not shown at all which is the most popular current style. It must actually exist in the chip and HDL sources, of course, if the processor is to actually do anything.
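As a rough illustration of that division (a sketch of the idea only, not the article's design or any real CPU): the control unit can be thought of as a table - microcode, in the microcoded case - mapping each opcode to a bundle of control signals, while the datapath (register file plus ALU) simply obeys those signals.

    # Sketch of the Control Unit / Datapath split. The control unit is a lookup
    # from opcode to control signals; the datapath does the actual arithmetic.
    MICROCODE = {                       # control unit: opcode -> (alu_op, write_back, halt)
        "ADD": ("add", True,  False),
        "SUB": ("sub", True,  False),
        "NOP": (None,  False, False),
        "HLT": (None,  False, True),
    }

    def alu(op, a, b):                  # datapath: combinational ALU
        return a + b if op == "add" else a - b

    def run(program, regs):             # regs: the datapath's register file
        for opcode, rd, rs1, rs2 in program:
            alu_op, write_back, halt = MICROCODE[opcode]   # control unit decodes
            if halt:
                break
            if write_back:              # control signals steer the datapath
                regs[rd] = alu(alu_op, regs[rs1], regs[rs2])
        return regs

    print(run([("ADD", 0, 1, 2), ("SUB", 3, 0, 1), ("HLT", 0, 0, 0)], [0, 2, 3, 0]))
    # -> [5, 2, 3, 3]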
It is a reasonable amount for a CISC CPU cache, as it is normal for CISC to have a small number of registers (RISC is definitely a many-register design, as memory operations are expensive in the RISC paradigm).
> it is normal for CISC to have a small number of registers
A very common fallacy. For CISC it's normal to have a small number of programmer-visible (architectural) registers. But nothing stops them from having large register files with a hundred or two physical registers that are renamed during execution. IBM has been doing this sort of thing since the late sixties.
Modern x64 CPUs have IIRC 180 integer general-purpose registers, and I imagine this number can be increased further: say, AMD's Am29000 from 1985 had 192 registers... and honestly, having more than 16 programmer-visible registers is not that useful when 99.999% of the code is written in high-level languages with separate-compilation schemes: any function call requires you to spill all your live data from the registers to memory; having call-preserved registers means that you can do such a spill only once, at the start of the outer function, but it still has to happen. And outside of function calls, in real-world programs most of the expressions that use only built-in operators either don't require more than 4 registers to calculate, or can be done much more efficiently using vector/SIMD instructions and registers anyway.
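A toy sketch of what that renaming looks like (register names and sizes here are made up for illustration): the few architectural names in the instruction stream are mapped onto a much larger physical register file, so back-to-back writes to the same architectural register land in different physical registers and don't have to wait for each other.

    # Toy register renaming: architectural names map onto a larger physical file.
    # A real core also recycles physical registers when instructions retire.
    NUM_PHYSICAL = 180                  # order of magnitude cited above
    free_list = list(range(NUM_PHYSICAL))
    ARCH_REGS = ["rax", "rbx", "rcx", "rdx"]
    rename_map = {r: free_list.pop(0) for r in ARCH_REGS}   # initial mapping

    def rename(dest, src1, src2):
        """Map sources through the current table, then give dest a fresh
        physical register so later readers see the newest version."""
        p1, p2 = rename_map[src1], rename_map[src2]
        p_dest = free_list.pop(0)
        rename_map[dest] = p_dest
        return p_dest, p1, p2

    # Two back-to-back writes to rax get different physical registers, so the
    # second one doesn't have to wait for the first to finish:
    print(rename("rax", "rbx", "rcx"))   # -> (4, 1, 2)
    print(rename("rax", "rax", "rdx"))   # -> (5, 4, 3)  reads the *new* rax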
> For CISC it's normal to have a small number of programmer-visible (architectural) registers
You're comparing apples with carrots.
The question is what you call CISC/RISC. If by CISC you mean early single-chip stubs like the 8080, or even x86 before AMD64, that is one history; but if CISC means the IBM /360, /370, /390 and the DEC PDP-11 or later, that is a totally different drama. The same goes for RISC.
What is really important is that genuine CISC typically has a human-friendly instruction set (even /360 assembler is comparable to the C language), while many RISCs are load/store architectures without rich memory addressing modes, so by definition they are tied to working inside registers. Also, when you compare a similar level of manufacturing and a similar timeline, you easily see that the 68k series has just 8 registers, while same-era RISCs have at least 16.
This analogy, I swear. Yes, I do, and this is an entirely reasonable comparison. Both are foodstuffs, and you can make pies out of both. But which pie would be healthier, and/or more nutritious? Now take the relative prices (or the comparative difficulty of growing your own) of the ingredients into account.
> moving data from a data register to an address register and back did not destroy the top 8 bits
Yes, you could use address registers for data storage, but you cannot do general computations on them, and because of this the wiki definition doesn't count address registers in the list of general-purpose registers.
Ideally, your working set fits into the registers, and that was the biggest winning feature of RISC: having relatively simple hardware (compared to genuine CISC, like IBM 360 mainframes), they saved significant space on the chip for more registers.
- Register-register operations definitely avoid the memory wall, so the data cache becomes insignificant.
Sure, if we compare comparable things, not apples vs carrots.
If you are unlucky and your working set doesn't fit into the registers, then you will surely be dealing with the memory bus and the cache hierarchy. But it's best if you just don't touch memory at all, doing all the processing in your code in registers.
The main topic of this blog post is a paper that describes what is now more commonly known as the "von Neumann architecture". This architecture is used in almost every computing device on earth.
I'd argue that MCUs, the highest volume of which are largely based on Harvard architecture, far outnumber von Neumanns.
I just realized that the word for the organ (the musical instrument) and the organ (the body part) is one and the same in English. (In Dutch they're called orgel and orgaan respectively.) Which of these meanings is being referred to in the article? To me both could make sense.
The definition of organ in this article is closest to the body part definition. The use of organ here relies on a less common definition: roughly a constituent part of some larger whole that performs a specific function.
Not just a specific function, but a necessary function. Usually, without an organ, the organism won't survive. So your fingers aren't organs, because you can survive (with some difficulty) without them, but without your stomach or heart or skin, you'll die.