Restoring YC's Xerox Alto day 8: it boots (righto.com)
292 points by dwaxe on Sept 26, 2016 | 95 comments



I love that the first boot screenshot at https://lh3.googleusercontent.com/-JB4VOj7FgnU/V-lCHIADrsI/A... says:

    Date and Time Unknown
Oh, you poor sweet computer, if we told you the answer, it would blow your little mind.


I happened to come across the Alto's time handling code [1] while looking for something else. The Alto counts seconds since Jan 1, 1901 and the time can run until 2036 since it uses a 32-bit value. (Unix time in comparison runs from 1970 to 2038; it runs half as long because it loses a bit by using a signed value.)

The point is that the Alto can handle dates until 2036, but it will run into problems two years before Unix self-destructs in 2038.

[1] http://xeroxalto.computerhistory.org/Indigo/AltoSource/TIMES...
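A quick back-of-the-envelope check of the two rollovers in Python (the exact Alto rollover day depends on how its counter is defined, so treat the first result as approximate):

    from datetime import datetime, timedelta

    # Unsigned 32-bit seconds from the Alto's 1901 epoch
    print(datetime(1901, 1, 1) + timedelta(seconds=2**32 - 1))
    # Signed 32-bit seconds from the Unix 1970 epoch (the famous 2038 limit)
    print(datetime(1970, 1, 1) + timedelta(seconds=2**31 - 1))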


Good to see they at least looked a reasonable distance into the future; in contrast, the Apple Lisa, released 10 years after the Alto and a few years after the IBM PC, had an RTC that only went to 1995:

https://systemfolder.files.wordpress.com/2009/11/lisaem-date...

http://lisafaq.sunder.net/lisafaq-hw-io-cop421_clock.html

(Apart from this shortsightedness, the rest of the Lisa's design is also... interesting. Proprietary custom ICs, a pretty much locked-down user-is-an-idiot GUI, and complete separation between users and developers. It failed then, but unfortunately the same strategy is now considered the norm 30 years later...)


32 bit unsigned seconds spanning 1901-2036 is the same representation as NTP.


The PC had no RTC, so DOS would ask you the date and time every time you booted it:

https://en.wikipedia.org/wiki/File:PC_DOS_1.10_screenshot.pn...

...and if you accepted the default, as many people at the time did, it would be 1980-01-01 00:00:00. DOS and the PC actually had no Y2K problem, storing the number of years since 1980 in a single byte --- which will overflow after 2235. It is notable that there is no year 2038 problem either.
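The arithmetic on that byte is easy to check (a trivial Python sketch):

    # DOS stores the year as an offset from 1980 in a single byte
    print(1980 + 0xFF)   # 2235, the last representable year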

I'm not sure what the Alto's RTC capabilities are. Hopefully a bit better than the PC's?


Yeah, yeah, get me the EGA card/memory update that includes the Dallas real time clock!


It's always weird hardware combinations like that that make me look back at the early PC era and wonder what drove some of those decisions. Like all the sound cards that included a free extra IDE controller for running CD-ROMs: you couldn't boot off them most of the time, but it meant you could play games and music CDs, assuming you had the correct extra audio cable to connect the two (many systems couldn't read the raw data fast enough to play without the drive doing the decoding).

And then I could swear at least one of my video cards added a joystick port.


Both of these make perfect sense. On early PCs everything was on an ISA card: your parallel ports, serial ports, HD controller, floppy controller and video. A Dallas RTC and a video upgrade were probably the two most popular upgrades, but you likely didn't have slots available for both, so putting both on a single card makes complete sense.

When sound cards came out with IDE ports, many computers still had ST-506 hard drives and not IDE drives. Even if they did, they might only have a single IDE port which was already being used, or using it for a CD-ROM required switching the master/slave jumpers on your hard drive. And since people at the time were installing sound cards for games, and many new games required CD-ROM drives, the pairing makes perfect sense. When sound cards with ports for CD-ROM drives first came out they weren't even necessarily IDE ports. Some were SCSI or Sony CD-ROM ports.


In many cases you even saw the soundcards and CD-ROMs bundled together (and a pack-in demo CD or even a game) as that was such a common path to buy a computer, then upgrade it for sound and CD at the same time.


All of my joystick ports were on sound cards. As for the IDE controller, a lot of computers in the late 80s and early 90s only had one IDE channel, and it was pretty common to buy a sound card and CD-ROM at the same time (I remember the first ones my family bought were bundled in a "multimedia upgrade kit"). If you already had two hard drives, you were short on options, except that the sound card provided a solution.


Boggle. I'm amazed my old meat based brain can remember something like that 25 years later.


Probably will crash if you told it correctly. You know, the old Y2K problem. ;-)


As someone who was too young at the time, why would it crash? I thought the Y2K problem was just programs only storing the last two digits. So 2016 would be stored as 16 and displayed as 1916.


Yep, that's pretty much it.

You could run into crashes if something depended on the date always being later than when the machine was built (and hence you'd get a negative number where you shouldn't), or if the machine was alive between 1999 and 2000 (in which case the BIOS or applications might complain because you've gone back in time).

This /mostly/ affected date formats. Well-built systems and programs measured time in seconds since the epoch, on operating systems that supported that time representation.
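A minimal sketch of the failure mode, with made-up values rather than any particular system's code:

    # Years stored as their last two digits: 1999 -> 99, 2000 -> 00
    last_login, now = 99, 0
    print(now - last_login)   # -99 "years" elapsed instead of 1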


Many RTC chips used in PCs behaved weirdly when set to any year >2000 (resp. <80 or so). These behaviors included setting the year to some fixed value, setting it to the correct value but never incrementing the date again, and so on.


Yup, it will overflow its mind!


This series is the best thing on HN in a long time. As a software engineer by trade, I feel like I've been missing out not learning more about electrical engineering. The level of knowledge and skill involved in this restoration is downright awe-inspiring.


I am also a software engineer (in part) and I basically agree with everything you say, except, I don't really feel like I'm missing out.

I have a huge amount of respect for the people doing this, and it does look like great fun for them to do it and I do enjoy reading about them doing it but the idea of hooking up hundreds of logic probes and analyzing code on a logic analyzer sounds like something that is, personally, a nightmare. But, this is coming from a guy who hasn't picked up a soldering iron without burning himself on it.

That said, the EE class I took in college was by far my favorite (and the only one I regularly attended -- before I left), but I was always more entranced by the theoretical than the practical side of it.


I somehow suffer from the software / hardware split. The more I master the former, the more I need to learn about the latter. This is a good series. Kudos to their dedication and success.


At my University (Swinburne, Melbourne, Australia) I studied Software Engineering with a minor in Digital Electronics.

They said all along the goal wasn't for us to be as competent as the Electrical/Electronic Engineers, but to teach us their language and the basics so we could work together. It was great.

My degree is accredited by the Australian Institute of Engineers.


At ANU I took one of our (quite few) digital electronics courses that taught me how to design simple digital logic on FPGAs. It ended up being probably one of my three favourite courses of my degree.

I think what I got out of it was similar - I'm obviously not nearly as knowledgeable as a fully fledged electronics engineer, but I have some basic knowledge which allows me to cross the gap between software and hardware and truly appreciate the fact that it is actually possible to get from just having a pile of logic gates to having something resembling a processor.

To anyone reading this who's currently a CS major - if you have the opportunity to do a course in digital logic design I'd highly recommend it. Regardless of what layer of the stack you end up working in, it's a lot of fun and very satisfying to understand how this stuff works. If you do end up working fairly low in the stack it's particularly helpful - after uni I ended up working on kernel development at IBM where even though I don't work in hardware engineering I've found it helpful to have some of the background knowledge.


To anyone reading this who's currently a CS major - if you have the opportunity to do a course in digital logic design I'd highly recommend it. Regardless of what layer of the stack you end up working in, it's a lot of fun and very satisfying to understand how this stuff works.

Seconded.

There wasn't really a Computer Engineering degree back when I was in school, but I made do. As it turned out, some EE classes would qualify for the lab science credit, so I took those instead of more physics or chemistry.

I had been fascinated by electronics for a long time before starting college, so I had a blast in those classes.

And it has all helped me a lot since then. I'm working on radar systems these days, and it would have been useful to take a little more analog and RF design courses. But those weren't my interest way back then. I've picked up a lot on my own since then though.


If you have the time and interest, get a broken 80s system and a cheap logic analyser. Try to fix it; I learnt the most that way. Or get a working one and a broken one, analyse the working one first, and then the broken one.


A PDP-8 is an awesome machine if you want to get into this kind of thing. Not so big you'll need to extend your house. Front panel that will impress your friends. TTL logic that you can debug using a scope, replace dead chips with your soldering iron. Unless you have a problem in the core memory :)

As with all old iron there is little point to it but I used to love it.


Agreed, just very hard to get one... C64s are still easy to get.

Edit: also, I do not think there is little point: it still lets people stop seeing the computer as magic. Even though it does not completely compare to a modern system, once you have done this you will never think your system is performing magic or that some coding bug is the universe conspiring against you.


FWIW I might have an extra PDP-8/m :-)


I have a museum with over 1000 computers from the 60s, 70s and 80s (the oldest is from '62); what do you want for it? :)


Must...Resist...


I could do vintage computer and electronics restoration all day...if it paid my bills!


Awesome!

All you need is the white lab coat and Gene Wilder's "It's alive!" in the background. Congrats on getting it to this point. As anyone who has brought up a new system for the first time knows, once you can get your central processor to load and run software of your choice, you can use it to tell you what is not working correctly and bring everything else up. It is always a total rush when the system boots for the first time. (or in this case, "boots again" :-))

I would be particularly keen to ensure that the arrow keys work on the keyboard, as a lot of Altos were used to play Mazewar and that was hard on the arrow keys.


As is said every time this blog updates: This is very interesting stuff and I enjoy reading about the progress.

I can't imagine the excitement these folks felt when the machine finally booted up. All of their skilled troubleshooting and hard work were spot on.

I can't wait to read more.


This is an amazing milestone! I have to ask, why did we need a separate boot disk sent over instead of flashing the previous disk with the contents of a boot disk? I suppose swapping out parts is an expected part of restoration though.


We're building an FPGA-based disk emulator that we could connect to the drive to rewrite the junk disk, but we don't have that ready yet. The Living Computer Museum could have rewritten our disk for us, but that would have required sending our disk to Seattle first, so they just sent us a new disk. (The Living Computer Museum has been extremely helpful, so visit them if you're in the Seattle area!)


Recycle PC is another Seattle shop with a cute little museum. It has old Amiga, Apple, Xerox, et al. systems. Unlike the Living Computer Museum, they're not functional (or they might be, but they're turned off).

You can't look inside any of the bigger units (you'll be asked to leave if you do), but you can press the keys on the older mechanical and non-mechanical keyboards if you're a keyboard nerd.


Their assistance was really shrewd; I've been to the CHM in Mountain View before, but had never heard of the LCM and probably would never have gone before this, but now I'm definitely going to go the next time I'm up in Seattle.


I don't think they're assisting because they are shrewd, but because they are genuinely nice and helpful people, and like to see old computers running.


Oh, of course! I didn't mean to suggest otherwise. I'm just saying, it's also great publicity in addition to everything else.


I can highly recommend it. A team I was on did it as a team-building exercise and it was by far the best one of my career.


Yeah, I'm going to wrangle up some nerds and see if we can make a day of it.


To "flash" the previous disk (terribly amusing choice of anachronistic terms there) you need a working disk drive to drive the disk with. (You also need a disk controller, but you could probably cobble one together out of an Arduino and some TTL shift registers or something. Surely Ken could.) It turned out that their disk drive wasn't working reliably, so that would have been tricky.


They don't currently have any way of writing to the disk except using the Alto itself, which would require booting it first. Once they get it working they should be able to create their own boot disk.


They had that test jig in the previous video that could drive the disk and read it (and I think write it). That would have allowed them to do so, I would think, regardless of how slow it might have been.


The FPGA disk emulator/interface isn't quite working yet.

http://www.righto.com/2016/09/restoring-ycs-xerox-alto-day-7...


That's a bit like my old soldering iron - the heating element was soldered to its wiring, so you needed a soldering iron to fix the soldering iron.


Interesting use of the word flash here. It's a magnetic disk though, no flash chips, so you would just say write.

I imagine the issue was a lack of working drives, and Alto systems to run them.


And "flash memory" itself can't be flashed anymore. Isn't it called that because you could erase an EEPROM with a light?


And "flash memory" itself can't be flashed anymore. Isn't it called that because you could erase an EEPROM with a light?

That would be EPROM, not EEPROM. The 'EE' meant Electrically Erasable, so it didn't have (and didn't need) the little window on top of the DIP package that EPROMs had for shining UV light on the chip itself.

Which was still a vast improvement upon PROM, which was write-once.

My first encounter with that stuff was a 2nd gen version of the floppy disk controller for my RS Color Computer. The newer version wasn't compatible with the Deft Pascal [1] compiler I had purchased, so some guys at the user group helped me to program the older 1.0 version of the firmware to see if that would work. It didn't. :-( The text editor was still handy though.

I eventually got OS-9 Level III running and picked up a C compiler on sale though.

[1] http://www.kenandmartha.com/coco/DEFT.html


No, flash is called flash because of the speed, relative to other programmable memories that can be erased.


"Flash memory (both NOR and NAND types) was invented by Fujio Masuoka while working for Toshiba circa 1980. According to Toshiba, the name "flash" was suggested by Masuoka's colleague, Shōji Ariizumi, because the erasure process of the memory contents reminded him of the flash of a camera."

https://en.wikipedia.org/wiki/Flash_memory#History


That's really weird and not what I remember being written back in the day when flash was invented. There is no 'flash' (optical erase process using UV) to go with flash memory, the erasure is electrical.

The 'eWeek' article that that paragraph was sourced from doesn't contain any explanation of why the association between the two was made; maybe early 'flash' memory did have an optical erase process?

Here is the quora link for the question why 'flash' is called 'flash':

https://www.quora.com/Why-is-flash-memory-called-so

Which is apparently sourced from a book about FPGAs, not directly accessible, but roughly what I remember being written about flash at the time it came out.

http://www.linfo.org/flash_memory.html

Gives a similar definition.

Optically erasable memories existed, but I'm not aware of any with erasure times under several tens of seconds in ideal conditions, and minutes in more practical settings; certainly nothing in the sub-second range that would justify the term 'flash'.

EEPROMs (electrically erasable EPROMs) already existed well before 'flash' came along, and flash is an improvement in speed on those, not a revamp of the optically erasable EPROMs.

It would be interesting to contact Toshiba to see if that quote from the inventor of flash memory can be substantiated and how to determine what the association with photography is if it does not refer to optical erasure.


The disk seek problem didn't mysteriously disappear. You need to bust out some contact cleaner and clean those header strip connections, both sides.


Yes it is mysterious.

Was that drive known good then?

Also what is the track sense mechanism?

Optical shaft encoder? Could have been just a bunch of dust.

Also a lot of drives used a Hall sensor or even an opto-coupler with a slot to detect track zero. First thing was to calibrate the head with a track zero seek.

Another thing. Those flat ribbon cables are prone to a lot of RFI over long distances, especially when you have them hanging around on the bench. Notice some of the other ribbon cables are twisted pair. A lot better!

I've wire wrapped my own disk controllers back in the early 80's. This series is serious nostalgia! <3

Curious to know what the final problem will end up being diagnosed as.

The smell of old hardware like that is literally burned in my brain!


This has been a fun series to track. I used an Alto when I was in college. As EE students, we had to wire-wrap a computer using bit-slice and TTL parts. Definitely the hardest lab-component course I took (there were much harder theoretical courses).

So a new Broadwell-EP Xeon chip has like 7 billion transistors. I'm trying to imagine this kind of computer hardware archeology as it will be done thirty years from now.


The video of the Alto booting is now available: https://news.ycombinator.com/item?id=12590137


Really neat. I can't wait to see the video!

The pictures of the screen look unusual, as if every second scan line is missing. Is the Alto outputting an interlaced video signal? That would be very interesting and unexpected.


Yes, the Alto's video is interlaced. I took the photos with my phone (which used 1/60 second speed), which would explain why scan lines are missing. I'll use a real camera next time and see if that helps.


Congrats to everyone who made this happen. It's so cool!


What CPU/ISA was used in the Alto?


To the programmer, the Alto's instruction set is the same as the 16-bit Data General Nova minicomputer, with the addition of a few crazy instructions such as "copy character bitmap from font file to screen".

The Alto's CPU is built from a whole pile of TTL chips on three boards. The Alto's arithmetic-logic unit, like many computers of that era, uses 74181 ALU chips. The CPU runs a crazy multitasking microcode, with one of the tasks emulating the Nova's instruction set. The hardware manual refers to the "microprocessor", meaning the microcode processor, not a microprocessor chip since microprocessors as such were in their infancy at the time.

There's a picture of the Alto's ALU board in one of my earlier articles [1]. You can see the individual CPU registers, as they are made out of multiple latch chips. For all the details of the Alto's CPU, see the hardware manual [2].

1: http://www.righto.com/2016/06/y-combinators-xerox-alto-resto...

2: http://bitsavers.informatik.uni-stuttgart.de/pdf/xerox/alto/...


Just an addition: microcoded processors are conceptually very simple. One can start by considering the 74181, which has four lines to control which operation is performed on the two 4-bit inputs. So each microcode word has four bits to control the ALU, a few bits to select which words out of the register file are presented to the inputs of the ALU, etc.

So conceptually a machine like that Alto is actually something you can grasp without a lot of pain. Very much unlike the early 8-bit processors like the 8080, Z80 and 6502, where they used lots of ticks to keep the transistor count down.
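As a toy illustration of that idea (the field layout here is made up, not the real Alto encoding), a micro-word can be nothing more than a handful of bit fields that drive the datapath directly:

    word = 0b0101_0011_1001_0110      # one hypothetical 16-bit micro-instruction

    alu_op = (word >> 12) & 0xF       # 4 bits straight to the ALU's select inputs
    src_a  = (word >> 8)  & 0xF       # register-file address for ALU input A
    src_b  = (word >> 4)  & 0xF       # register-file address for ALU input B
    dest   =  word        & 0xF       # register that latches the ALU result

    print(alu_op, src_a, src_b, dest)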


> So conceptually a machine like that Alto is actually something you can grasp without a lot of pain.

Have you actually looked at the Alto's microcode? It's brain-explodingly bizarre. The first thing is it has 16 tasks (yes, in the microcode) and what an instruction does depends on what task is running. Second, every micro-instruction includes a computed goto, where the hardware can OR bits into the address. And because this is task-dependent, you can't figure out the control flow - maybe this instruction will proceed linearly, or maybe it will branch 4 ways depending on the status of the Ethernet board. Next, the circuitry uses PROMs in various places. You say the microcode has four bits tied to the 74181, but no, the four bits go into a mystery PROM which generates entirely different bits that go to the 74181. And then there's the constant PROM holding all the constant values used by the microcode. And the processor bus has the property that multiple sources can write the bus at the same time, with the values ANDed together. Not to mention the ALU shifter output is modified by a microcode function literally called MAGIC. And I'm just getting started here...

My point is that even by microcode standards, Alto microcode is bizarre and painful. And if you disagree, I have some microcode that needs explaining - MADTEST (Microcode Alto Diagnostic Test) fails on our Alto and nobody understands what it is doing.
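To picture just the computed-goto part: the NEXT field of each micro-instruction can have extra bits ORed in by the hardware, so the next address depends on live machine state. A purely illustrative sketch with made-up numbers:

    next_field  = 0b0100000                # NEXT field from the current micro-instruction
    branch_bits = 0b0000011                # bits the hardware may OR in, e.g. device status
    print(bin(next_field | branch_bits))   # the micro-address actually fetched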


Oh man I've used a Nova. Never come close to using an Alto.

You guys are rocking it. Really enjoying dropping by and seeing how you are doing.

Really cool!


This took place way before that technology existed. The CPU (ALU + Control Unit) was built from individual chips.

The Alto dates from 1973, and Intel's 4004 microprocessor dates from a year or two before (and was used to run a calculator, not a personal networked workstation)


Wikipedia has a detailed description of the hardware design:

https://en.wikipedia.org/wiki/Xerox_Alto#Architecture


I'm trying to read the article on an iPhone, but every time I zoom out to see the full-width article I get bounced over to an article about toothbrushes?!

For the better part of a minute I was wondering if they had monkey-patched the disk controller with an electronic toothbrush?!


You must have swiped left to my earlier article where I did a teardown of a Sonicare toothbrush.

I've removed Blogger's swipe left/right navigation, which I think is more distracting than useful, and I've re-enabled pinch to zoom, so hopefully the navigation will work better for you now. Also note that you can tap on a photo to get a high-resolution version (better than zooming will give you).


It's a little thing, but thank you for reading feedback and responding like this with blog improvements.


Thanks, it works great now.

I put two fingers down on the screen and then started dragging the fingers in opposite directions. I guess whatever JavaScript was enabled only looked at one of the touches (a zoom gesture is basically two opposite-moving swipe gestures, so...) :)


I think the page navigation is using "slide to the left/right". Probably your zoom movement was detected as a page navigation.


Geez, I thought it was because I'm using a beta of iOS. I figured Apple had goofed something in the gesture recognition in this build. Glad you posted something, I was debating filing a bug.


I love that it's finally booting. I wonder if they've considered taking all the probes off and booting without debugging hardware attached...


We'll leave the probes on until we're done debugging since it's pretty tedious to reinstall them. Is there any particular reason you suggest booting without the probes?


Probably out of fear that the signals might deteriorate with probes attached.

But with a modern-ish logic-analyzer and the ancient original hardware (huge voltage swings, slow clock, very long busses), I doubt that the probes would make a discernible difference, would they?


Yes, you're right about the signals. The Alto has a 5.88 MHz clock which is very slow by modern standards, and the buses are a couple feet of wire-wrapped pins so the probes are unlikely to have any effect.

I'm not sure modern-ish is the right word for the logic analyzer. It's a 1999 Agilent logic analyzer and I don't know how they managed to make it so slow. You can ftp a trace off it over Ethernet at the glacial speed of 10 kilobytes per second, so copying traces for analysis is a big bottleneck.


Is it a 16500? If so, you might see if you can scrounge up a 16505A, which might be able to network much faster (plus, it supports VNC and has a nice (for the time) graphical display). The 16505A is basically a PA-RISC pizza box that hooks up to the 16500 via a SCSI cable on the back, and adds some X11/Motif GUI happiness on top. I took a quick look around eBay and saw one for ~$275, but you might be able to get one cheaper if you look.

Side note: it was called a "Prototype Analyzer" because it was meant for analyzing prototypes, but we had more than one customer tell their salesperson to come back later when it was out of the prototype stage.


The logic analyzer is a 1670G; Marc collects vintage HP equipment, although technically this is Agilent.


You might try mounting it via NFS; it's also quite possible that the serial port might be faster for getting traces off the box than its relatively anemic ethernet support.


The serial port is also slow as molasses. If we want a full download, Marc will run it over the serial port overnight. We haven't tried NFS.


Eh, 1999 was right when the split happened, so regardless of the label, it was still basically an HP product at the time. Still, that doesn't help you much; the 16505A only worked with the 16500 series analyzer. There was a 16700, IIRC, that was basically a 16500C with an integrated 16505A, but that doesn't help you much either.


Have you tried a different ethernet switch connected to the logic analyzer? One cause of really slow transfers is one end getting the full/half duplex negotiation wrong.


That's a brilliant idea, I almost forgot about this. When I installed networking gear in the 90s, common knowledge/best practice was to always fix duplex/link-speed on fixed installed gear (routers/switches) because they frequently messed up autonegotiation.

Try a managed switch where you can choose 10/100 and full/half duplex, and see statistics. Also try to get a "netstat/ifstat"-like display on the logic analyzer to see bad packets/collisions/...


Reminds me of my electronics R&D days on flight sims in the 1980s where the fix for one particularly edge-case timing issue was to replace a 74S TTL part (within a sea of 74S logic) with a 74LS one and INSIST that only 74LS parts were used in that particular socket. Job done, no need for more probing!


Back in the old days, we saw crosstalk at 4 MHz. I think circuits without a pullup can be susceptible. But in general, I agree: you're more likely to break something removing and reinstalling the probes.

Which reminds me, one time back at my first job, the hardware guys were working on a new design, and it only worked with the logic analyzer probes attached to the CPU. They later discovered a missing connection between a pullup or pulldown resistor and the appropriate +Vcc or GND supply.


Really cool stuff. Makes me want to give another go at bringing up my C-64 (much simpler, of course).

If only I had proper tooling and knowledge...


This has really been a great series to read. Does anybody know where this will live when they are done restoring it?


It will probably be at the YCombinator office for a while but we haven't figured out a long-term plan. Any suggestions?


Gateway the ethernet from yours to the one in the museum? After all, they are supposed to be networked workstations...


The Computer History Museum in Mountain View? It's close enough that you could still go visit it too :)

I think you should talk to them anyway as the material in these blog posts would make for a really amazing talk.

http://www.computerhistory.org/


The living computer museum is an amazing place. Highly recommend visiting if you ever come to Seattle.


Is this the only Alto running today?


The Living Computer Museum in Seattle has a fully functioning Alto, as well as a bunch of other vintage systems.


I played with it. It was as amazing as I had hoped.


There are a bunch of running Altos, so we're not exactly in uncharted territory. Fortunately people with Alto experience are willing to help out, for example Al Kossow giving us a disk interface card when we found ours was the wrong type. The Living Computer Museum is probably the best bet if you want to see one in operation.







