> If backspace is pressed on the keyboard, the Alto does a Ethernet boot
This sounds impressive. Was it common for machines of that era to be able to boot from the network?
Also, aren't those some awfully long wires to make high-frequency measurements with an oscilloscope? Some of that ringing and overshoot in the clock signals might not actually be in the circuit that is being probed, but caused by the long leads.
Ethernet was invented for the Alto, so it was the first machine to boot over Ethernet. (I don't know if any other machines did network boot earlier, using other networks.)
One entertaining thing about Alto boot: you could select a different boot address (i.e. boot something different) by pressing various keys at boot time. Each key controlled a different bit, so you would press many keys at once. This was called "nose boot" since you might need to use your nose to press a key if you ran out of fingers.
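The idea is simple to sketch: every key held down at boot time contributes one bit, and the bits are ORed together into the boot address. A minimal illustration in Python follows; the key-to-bit mapping here is made up for the example, not the Alto's actual assignment:

```python
# Sketch of the "nose boot" idea: each held-down key sets one bit of the
# boot address. This mapping is hypothetical; the real Alto key-to-bit
# assignments are documented in the hardware manual.
KEY_BIT = {"3": 0o100000, "W": 0o040000, "E": 0o020000}

def boot_address(held_keys):
    addr = 0
    for key in held_keys:
        addr |= KEY_BIT[key]  # OR in the bit for every key pressed at once
    return addr

# More bits set than you have fingers? That's where the nose comes in.
print(oct(boot_address({"3", "W"})))
```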
The Alto clock is 5.88 MHz, so the signals aren't as high frequency as modern computers. We could probably get crisper signals with better probing, but since we just want to see the signals, the quality isn't too important.
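A quick back-of-the-envelope calculation (mine, not from the article) shows why sloppy probing is more forgivable here: at 5.88 MHz, one clock period is about 170 ns, leisurely by modern standards.

```python
# Rough sanity check: period of the Alto's 5.88 MHz clock.
# Slow edges like these are far more tolerant of long probe leads
# than the multi-GHz signals in a modern computer.
clock_mhz = 5.88
period_ns = 1000.0 / clock_mhz
print(f"clock period: {period_ns:.0f} ns")
```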
Man, doing that digital work with a scope can be painful at times.
I'd really recommend a Saleae Logic here to scope out your digital lines. You probably don't need the high-level protocol decoding at this point, but for capturing longer blocks of data with a great interface you can't beat this tool.
We (i.e. Marc) have a fancy HP logic analyzer with many, many input lines, so the plan is to use that next session. (Not that I have anything against the Saleae system.) I expect that once we can trace the executing code, we'll be able to figure out the problem quickly.
So ctrl+alt+del did have some precedent...
Good luck with the restoration effort; tracing the instructions using a logic analyzer seems like a daunting prospect, especially with such an arcane instruction set.
Thanks for the detailed articles, it is much appreciated!
"Was it common for machines of that era to be able to boot from the network?"
What network? This was the first machine to have a local area network.
A full Alto network included workstations, a file server, a laser printer, and a gateway to other Xerox networks using the PARC Universal Packet (PUP) protocol over 3 Mb/s Ethernet. They had a complete all-Xerox vision.
Octopus was what we would today call a "Storage Area Network". Around 1968, IBM built the IBM 1360 Photo-Digital Storage System [1][2] for several of the big atomic energy labs. This was 1.2GB of storage. On microfilm. The whole thing was automatic, with hardware to write digital data on microfilm, develop the film, store and retrieve individual filmstrips, and read them. The film was directly written with an electron beam in vacuum. IBM electromechanical technology at its height. The Computer History Museum in Mountain View has one of the seven units built.
Since this was a big, but slow, storage device, it had to be front-ended by a computer with disks, and made accessible to other computers. That's most of what Octopus did. It wasn't peer-to-peer, like Ethernet.
Ha, guess you're right. I was thinking about mainframe clients, but I guess in those days they didn't even have their own CPU? This was way before my time.
The Alto becomes more impressive the more time one spends thinking about it; you have to actively compare it to the state of the art back then to see the innovations that we now take for granted.
It was impressive. I got a tour of PARC in 1975, when the Alto network was just coming up. The PARC vision then was to find out what the future of computing looked like by throwing money at the problem. The Alto/file server/laser printer network was insanely expensive, but someday it would hopefully become cheaper.
Making the technology cost-effective took a while. Around 1980, the UNIX workstations started to appear. Those tended to run around $20K each. By 1982 or so, you could have workstations, file servers, and printers from several vendors. Apollo, Three Rivers, and later Sun got into the business. The big problem was that, in the UNIX command-line tradition, none of these companies could do a decent GUI. (Most windows were either a terminal or an editor. Or a clock. Everybody had a GUI clock, with animated clock hands.) There was very little commercial software. One of the best pieces of software of the era was Interleaf, which was a very good WYSIWYG document editor, sort of like Microsoft Word. But it was sold as a $60,000 system including a workstation and a laser printer. There was no software mass market yet.
So all the key hardware was available years before the Macintosh came out. It just cost too much. And nobody had a good GUI.
Except Xerox, with the Xerox Star, 1981. But they had a different corporate vision - a dedicated system for word processing and document handling. The idea of tens of millions of people having to learn about the internals of complicated computers, just to do ordinary office tasks, was scary. How could Xerox support that? Xerox was into support; copiers were rented and service was part of the rental. So the Xerox system was a closed environment. Users could only run the Xerox-provided applications.
What Xerox didn't envision was a society in which computer literacy was widespread. Society backed into that, via the IBM PC, DOS, and open systems.
For others that have been following his work (and talks), it doesn't really contain much new stuff, but a talk to an audience of programmers brings out some different angles. He touches on the fact that they didn't "just" spend extra money to prototype the Dynabook in the form of the Alto - they built 2000 machines - and part of that is of course so they could invite (among others) whole classes of school children to come and play with the tech.
Every time I delve into Xerox PARC documents I always get the feeling the computing world would be so much better if UNIX hadn't taken off due to how AT&T gave it away.
When I mix the information written there with my own experience using Smalltalk and Oberon, it is quite eye opening in terms of overall experience.
Nowadays I think Mac OS and Windows are the only two major environments whose experience comes close to it.
You're probably right, but you can't imagine the thrill of using the very early Unix system releases (I was an undergrad when the first Bell-external releases were shipped to my office-mate Geoff Steckel at Harvard): imagine an OS written in a higher-level (non-assembly) language! (Well, Multics existed, of course, but that was wildly byzantine in its complexity, being based on PL/I, which itself was a huge language. ;-)
And imagine a fairly powerful OS which was so simple at heart that you could read the entire source and understand it. (Lions' samizdat book helped, too.)
For those of us studying systems at the time, it was magical.
Fascinating. It is indeed very easy to judge, with the benefit of hindsight, but - who could ever expect computer literacy to become mainstream? Especially since GUIs were bad and the command line still ruled, as you've pointed out. It still seems unlikely.
Well, Alan Kay expected computer literacy to become mainstream, or more accurately for computers to become easy enough for everyone to use. He was one of the driving forces behind the Xerox Alto. His 1972 paper on the Dynabook basically describes a vision of the modern laptop or tablet computer. (Starting around page 6, he describes exactly what technology improvements would be necessary, which is pretty interesting.) The Xerox Alto was the "Interim Dynabook", a system to try out his personal computing ideas before the technology was available.
I remember reading that the original IBM PC motherboard got an Ack from the keyboard when it initialized, and if a particular bit was set in the ack character it would read boot code from the keyboard port. Apparently it was just for testing though....
Wow, that is a remarkable amount of stuff that is working. My experience with wirewrapped boards is pretty mixed, some are solid and some are just flaky as heck.
Really curious what they find out about the disk. Nothing at all coming over the interface sounds like a big (and usually simple) issue, like a card in the wrong slot or a cable plugged in backwards or into the wrong connector.
I'm hoping for a problem that's easy to fix, but not something stupid :-)
To clarify the disk issue, we're seeing sector pulses coming from the disk drive, so the drive and cabling are working. The Alto isn't sending any read requests to the drive, so it seems most likely that the microcode isn't seeing any disk request block in memory.
A lot of things need to work correctly to get the request block into RAM. So it could be a problem with a chip on the ALU board, a bad memory chip, something wrong on the disk interface card, a corrupted bit somewhere, or anything. With the logic analyzer, we should be able to see at what point in the microcode things go wrong. The nice thing about the Alto is because it's all TTL chips, it's straightforward to see what's happening.
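To make the "request block" idea concrete, here is a hedged sketch of the general shape of such a structure and the check the microcode effectively performs. The field names and layout below are illustrative guesses for the example, not the Alto's actual disk command block format:

```python
# Illustrative only: a disk command block of the general shape a disk
# controller's microcode might poll for in main memory. Field names and
# layout are my own guesses for illustration, not the real Alto format.
from dataclasses import dataclass

@dataclass
class DiskCommandBlock:
    next_block: int    # pointer to the next block in the chain (0 = end)
    status: int        # written back by the microcode when the transfer ends
    command: int       # operation code: read, write, seek, ...
    disk_address: int  # cylinder/head/sector packed into one word
    buffer_addr: int   # RAM address the sector data is transferred to/from

def pending_request(pointer_word):
    """The microcode only starts a transfer if the pointer word is nonzero."""
    return pointer_word != 0

# A zero pointer means no request block, hence no disk activity at all --
# consistent with the symptom described above.
print(pending_request(0))
```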
I noticed that 3101 SRAMs have the same pinout as the later 7400-series 7489 16x4 SRAMs; I'm not 100% sure that they are fully electrically/timing compatible, but this could end up being a way of testing using known good parts...
I got to play with this a few weeks ago and it was really cool. Some friends and I created and formatted a document.
The whole museum is great and I highly recommend it to anyone who finds themselves in Seattle. There's something intensely satisfying about writing, compiling, and running Hello World on a teletype (paper roll!) hooked up to a PDP-7.
This is an excellent project that's great fun to watch the progress of. Keep up the great work.
I have no experience with Altos, but I have been wondering -- on day one of the restore, it was noted the Alto in question had some modifications from stock issue[1]. Could the booting issues be related to the microcode or wiring changes necessary to support those modifications? Is it trying to boot off of the now-missing Trident drive?
That's an interesting question. From reading the documentation [1], the Trident disk should co-exist with the standard disk. It uses different microcode tasks (3 and 15), requiring additional microcode. No wiring change is mentioned (or discussion of booting off the Trident disk).
It seems like the Trident drive shouldn't be causing us any problems, but it's possible there's a wiring change or different PROMs. I guess we'll find out...
This is one of my favorite ongoing stories in HN. It's like a peek into a team building a time machine, and for all intents and purposes, this IS a time machine -- one that opens a window to the past.