> John and I went to one of the early microcomputer shows in San Francisco. Steve Jobs was sitting on a wooden peach crate telling everyone that his design, in another box of wires that did not work, was the computer of the future. He said that mass of wires was not working because something had been bumped getting that mess to the show. John, who later became an IBM hardware design engineer, took pity on Steve and designed the Apple II system board, which was the first version that actually worked, no matter what is claimed today.
This comment makes me doubt some of the veracity of the rest of the article and its extraordinary claims. But maybe I’m biased by reading a lot of the accepted history.
Yeah, that's kind of weird, really. It basically ignores the existence of the Apple I, and I've never seen anyone talk about Steve Jobs claiming to have designed either the Apple I or II -- that's always been credited, even by Jobs, to Wozniak.
Maybe this is supposed to be a claim about the Apple I and Wozniak, but that still doesn't fully line up.
The details always matter in these kinds of statements, and I guess there is often exaggeration in claims to have designed things. Someone who made a one-sentence comment on a design then claims to be the one who actually made it work. It’s clear that Woz was a uniquely talented hardware designer, but he also wasn’t living in isolation (although imagine what it would have been like if he’d had the internet).
In Chuck Peddle's oral history at the Computer History Museum, he has this to say about Apple:
> Peddle: Apple II. OK, so Apple I. Peter lived in the Bay Area, and he was my salesman for the Bay Area. So he was selling Atari and all that sort of stuff. And we went to two or three other of his customers and helped them with the ICE. And he called up and said, hey, we got these two young guys working on this computer in their garage, and they can't make it work. So Peter and I took the ICE down, and we made it work.
Peddle designed the 6502 and a lot of 6502 systems; he would have been a great person to have around to get your design going. And an ICE (In-Circuit Emulator) would have helped too.
> In fact, in Tracy Kidder’s book “Soul of a New Machine”, Bill was one of, if not the, university gurus that Data General went to for help when their microprocessor design stumbled.
Hah. I never thought I'd see CSUS here on HN. I briefly went there, but wasn't in CS. In the early 2000s, I worked at a large finance company in Sacramento. Our tech lead spent a not-insignificant portion of his time interviewing people. He always complained that Sac State grads never knew anything software-relevant coming out of that program. They didn't know any web technologies or Java, just a little C, but they were all whizzes at hardware, COBOL, and assembly.
Looks like I underestimated that program in the '90s. This article gives me a bit of a hint of where they were focused.
Sac State grad here, from 2002-2006! After the turn of the century they started teaching more Java classes, but the most interesting classes were definitely the Assembly and Computer Architecture classes. I never ended up using them in my career, but they were a blast.
The article uses the wrong terminology. VLSI is just a generic term for an IC with a very high degree of integration, like a CPU or a real-time MPEG compressor. What he described is actually an early attempt to produce FPGAs.
>"Because IBM dominated the computer market all Sac State computer science students had to learn BAL (Basic Assembly Language) used on the IBM System/360, 370 and later mainframes.
> Students had to prove their mastery of BAL by writing a working assembler, BASIC interpreter, simple compiler, operating system and database.
> To stop cheating, completed student program listings, card decks and outputs were collected and stored in a locked area that faculty members reviewed to ensure students did not copy prior programs."
>"Bill recognized that all he needed was for the Intel 8008 to run the same BAL instructions. Bill had just finished a large graduate student project that created the firmware to make an IBM System 3 minicomputer run on a 4K based VLSI based computer.
Bill simply changed that code so BAL also ran on the Intel 8008.
This ability to run BAL let Bill's team pick and choose between the best student programs. They soon had the Sac State 8008 running DOS (Disk Operating System) which allowed loading and starting programs stored on paper tape, cassette tape, cartridge tape, and even their mobile pluggable 3/2 (three megabytes fixed, two removable) hard disk system. They had it running a simple BASIC interpreter. Because the BAL firmware ran so slow, Bill’s team also built a BAL assembler which instead of putting out one machine code instruction per human readable instruction put out all the code needed to run in 8008 machine code."
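For anyone curious, the interpreter-vs-translator tradeoff described there is easy to sketch. What follows is a hypothetical illustration only: the two-instruction BAL subset and the placeholder "8008" expansions are made up for the example, not taken from the Sac State project.

    # Hypothetical sketch of the two approaches in the quote above.
    # The BAL subset and the "8008" expansions are illustrative
    # placeholders, not the real Sac State firmware or opcodes.

    # Approach 1: firmware interpreter. Every BAL instruction is
    # decoded again each time it runs, which is why it was slow.
    def interpret(program, regs):
        for op, dst, src in program:
            if op == "LR":        # load register from register
                regs[dst] = regs[src]
            elif op == "AR":      # add register to register
                regs[dst] += regs[src]
        return regs

    # Approach 2: translating assembler. Each BAL instruction is
    # expanded once, at assembly time, into the whole run of native
    # 8008 instructions needed to carry it out, so the decode cost
    # is paid once instead of on every execution.
    EXPANSIONS = {
        "LR": ["<move src to A>", "<move A to dst>"],
        "AR": ["<move dst to A>", "<add src to A>", "<move A to dst>"],
    }

    def translate(program):
        native = []
        for op, dst, src in program:
            native.extend(EXPANSIONS[op])  # several 8008 ops per BAL op
        return native

    bal = [("LR", "R1", "R2"), ("AR", "R1", "R3")]
    print(interpret(bal, {"R1": 0, "R2": 5, "R3": 7}))  # R1 ends up 12
    print(translate(bal))                               # five native steps

The translated output is bigger, but it skips the per-instruction decode entirely, which lines up with the speedup the quote describes.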
If the article is correct and Intel really could have made “a commercial microcomputer with BAL/DOS from Sac State”, then it’s quite mind boggling to think of how different things could have turned out. For want of a nail and so on.
I doubt an 8008 micro would have been a commercial success. It ran at 500 kHz, with instructions taking around 20 clocks. The instruction set was very primitive compared to the 8080's, and the address bus was only 14 bits wide. It was best suited for calculators and controllers.
I built an 8008 machine in 1974 with a whopping 256 bytes of memory and toggle switches for program entry in binary machine language. You could watch it execute subroutines in the address bus LEDs!
That's around five times slower than a PDP-8/S, which was possibly the slowest mini around at the time - already obsolete by 1970.
So it's doubtful it would have been successful as a computing product, although it might have had some applications at the more undemanding end of process control.
It would have been more interesting to use the skills, techniques, and marketing contacts collected during the 8008 project to jump-start an 8080 project as soon as the new CPU appeared.
But the S-100 era happened because the Altair had a standard bus that made the system trivially expandable.
Corporate/industrial/academic systems had expensive and relatively complicated proprietary bus and backplane systems, which limited the market and kept prices high.
So... an alternative 8080 system wouldn't have been a game changer unless it was sold with the same open hardware model. At best it would have been an expensive micromini for a niche academic and industrial market.
The 8-level-deep call stack of the 8008 precluded most complex software, though. It was also 4 to 6 times slower than an 8080 running at 2 MHz, and its very small package required extensive demultiplexing, slowing it further.
There were multiple kit computers based on the 8008 in that era, e.g. the Mark-8 and the French Micral, but they were severely hampered compared to something like the Altair 8800, and it's unlikely they could have had a similar story despite launching earlier.
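For a rough sanity check of the speed comparisons in this thread, here is some back-of-the-envelope arithmetic. The clock rates and average clocks-per-instruction are assumptions (rounded, datasheet-style figures), and the ratio is quite sensitive to the instruction mix you assume:

    # Back-of-the-envelope throughput estimate. Assumed figures:
    # 8008 at 500 kHz and ~20 clocks per instruction (as mentioned
    # upthread); 8080 at 2 MHz and ~8 clocks per instruction.
    def instr_per_sec(clock_hz, clocks_per_instr):
        return clock_hz / clocks_per_instr

    i8008 = instr_per_sec(500_000, 20)    # ~25,000 instr/s
    i8080 = instr_per_sec(2_000_000, 8)   # ~250,000 instr/s

    print(f"8008: ~{i8008:,.0f} instr/s")
    print(f"8080: ~{i8080:,.0f} instr/s")
    print(f"ratio: ~{i8080 / i8008:.0f}x")

With these assumptions the gap comes out nearer 10x than 4 to 6x; a lighter 8080 instruction mix or the faster 8008-1 narrows it, which is probably where the range above comes from.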
I think he was probably too humble/Scandinavian to stand a chance.
> Gary Kildall was born and grew up in Seattle, Washington, where his family operated a seamanship school. His father, Joseph Kildall, was a captain of Norwegian heritage. His mother Emma was of half Swedish descent, as Gary's grandmother was born in Långbäck, Sweden, in Skellefteå Municipality, but emigrated to Canada at 23 years of age.