Xerox Alto Restoration Part 2: Firing up the monitor [video] (youtube.com)
107 points by andars on June 25, 2016 | 24 comments



I'm really loving following this and I admire the tenacity this must take. For example...

Screen brightness too low? Okay, just whip out some ancient circuit diagrams, trace them to find the capacitor at fault, find that capacitor on the board, desolder it, test it, find that it's faulty, and solder in a new one. Well, that helped, but it's not much better... sigh. Now examine the switch, the CRT tube, etc.

And here I've been known to throw in the towel when having 'this' issues in JS.


I do that kind of thing for antique Teletype machines.[1] The discouraging projects are when you get all the way to a working machine and the typing quality is still too bad to be usable. That machine had been dropped at some point. I could do more work on it, but I could get a better one on eBay.[2]

[1] http://brassgoggles.co.uk/forum/index.php/topic,43672.0.html [2] http://www.ebay.com/itm/272286062507


Teletypes are interesting because they're basically mechanical UARTs. Watching the mechanisms in them operate still fascinates me.
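By "mechanical UART" I mean the cams and levers sample the line at fixed intervals and assemble start bit, data bits, and stop bits, just like the silicon part. Here's a rough software sketch of that framing (the 5-bit Baudot code and 1.5 stop bits are the classic Teletype values, used here purely for illustration, not anything machine-specific):

  import math

  def frame_character(code, data_bits=5, stop_bits=1.5):
      """Line levels (1 = mark/idle, 0 = space) for one framed character."""
      bits = [0]                                           # start bit drops the line to space
      bits += [(code >> i) & 1 for i in range(data_bits)]  # data bits, LSB first
      bits += [1] * math.ceil(stop_bits)                   # stop bit(s) return the line to mark
      return bits

  # Baudot 'A' is 0b00011; the Teletype's selector levers sample these levels at
  # fixed intervals, which is exactly what a UART's sampling clock does in silicon.
  print(frame_character(0b00011))   # [0, 1, 1, 0, 0, 0, 1, 1]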


You may enjoy Bill Hammack's video "IBM Selectric Typewriter & its digital to analogue converter"

https://www.youtube.com/watch?v=bRCNenhcvpw


The IBM Selectric is a magnificent piece of engineering, and close to being the last complex piece of mass-manufactured mechanical logic. The only electrical part is a motor.

But it's not the ancestor of the computer printer.

The Selectric's main ancestor is the Blickensderfer typewriter [1], from 1892. This used a curved type element rather than a typeball, but had a similar method of turning key presses into typing element positions. As with the Selectric, the font could easily be changed. There was a proportional-spacing successor to the Blickensderfer, the Vari-Typer, and IBM imitated that, too, with the IBM Selectric Composer.

The Selectric had a moving print head rather than moving paper, with cables on moving idlers used to transmit position to the print head. That mechanism is from the Teletype Model 28/35, from the early 1950s. Moving the print head rather than the paper was a Teletype concept - it worked much better with roll paper, and the machine was narrower.

The Selectric mechanism was not designed for electrical inputs and outputs. The add-on mechanism for that was an afterthought, and kind of a hack. The unit had to be built into a table, and the mechanism for the keyboard extended below the tabletop. IBM and Remington had previously built typewriters with electrical I/O, and those were used with some early mainframe computers.

Many computers, before and after the Selectric, used Teletype machines as I/O devices. That lasted until daisy-wheel printers and cheap CRT terminals were invented.

(Mechanical line printers have a completely separate history. They descend from the Potter Flying Typewriter.[2])

[1] https://en.wikipedia.org/wiki/Blickensderfer_typewriter [2] https://www.computer.org/csdl/proceedings/afips/1952/5041/00...


They are a glimpse into an earlier age in more ways than one. At that time industrial processes were not as specialized as today and could easily be repurposed. For example, in the 1940s IBM's factories (which had been making mechanical tabulators and similar machines) converted to making guns, bombsights, engine parts, and so on. Can you imagine one of Intel's multibillion-dollar fabs making anything but single-process chips?

The whole world isn't quite like that yet; a few years ago I visited a factory in India that made portable generators. Aluminium ingots and copper wire came in one end and generators came out the other. None of the work was automated. They could have, with some effort, changed products.


I repaired a few TVs back in the 80s with my father. He worked as an electrical technician, so he had all the equipment; when one of our TVs died, he'd open it up, find the fault, and fix it. I had smaller hands, so I'd be the one who would actually use the multimeter to check the voltage drop across various components. By the time the 90s rolled around there were a lot more custom chips, so we haven't done it since.


It's what you can do when:

A. you have the schematics.

B. it's not just a jumbled mass of ICs.


Summary: we got the monitor working (using a signal generator from the Seattle Living Computer Museum (thanks!)), but the display is very dim, probably due to the old CRT. Does anyone in the bay area have experience with CRT rejuvenation?


There are other things that could cause a dim display, not necessarily just a worn tube. Low anode voltage is another possibility that comes to mind. The various television/electronics forums may have people willing to assist with troubleshooting.

If it's really the tube, and it's a truly rare custom tube and not just something you might be able to find a suitable replacement for, maybe these guys can help:

http://www.earlytelevision.org/crt_rebuild.html


I was about to post a link to them. Full CRT rebuilding is a reasonable option for a B/W tube. (Color requires very precise electron gun alignment so the shadow mask lines up with the dots, and that's hard to do as a repair.)

The last commercial CRT rebuilder for ordinary TVs in the US closed in 2010.[1] But there's still a company that does it for military display devices that need to be kept going.[2] "We are your obsolescence solution now and in the future."

It really is a custom tube. PARC built the first tubes in-house (PARC was also a copier R&D facility, so they could build precision optoelectronics) but then sent the job out for production.

[1] http://www.tvtechnology.com/miscellaneous/0008/last-lone-wol... [2] http://www.thomaselectronics.com/


Back in the day, folks used to slightly increase the filament voltage, from 6.3 V to 7 V or more. Sometimes you could even recover the CRT so that it would work again at the nominal filament voltage.

This is a really unique piece of hardware, though, so of course you'd want to try this technique only if you have no other choice.


If you can wait half a year: I own two rejuvenator units and have used them to restore some tubes. Once I move to the Bay Area I'd be happy to lend them out.


Was the camera operator x-rayed by the high-voltage arc in the tube? (7:47 in the video)

At 7:35 he filmed a close-up of the caution label on the back of the tube, with the X-ray warning in the fine print:

  X-RAY WARNING: When picture tubes are operated above 16 
  kilovolts, and when personal exposure is prolonged at close 
  range, special shielding precautions against X-ray 
  radiation may be needed.


Is there some history of refresh rates around? I'm really curious why they used a "weird" rate on the Alto.


The Alto has a "weird" refresh rate because it uses a portrait-format display with 875 scan lines, compared to a "normal" display with 525 lines. The higher scan rate stresses the horizontal drive circuitry and can cause overheating problems.
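A rough back-of-the-envelope on that (the ~30 Hz frame rate is an assumption borrowed from broadcast practice, just to show the ratio, not an exact Alto figure):

  def horizontal_rate_hz(scan_lines, frames_per_sec=30.0):
      """Lines per second = how fast the horizontal deflection has to sweep."""
      return scan_lines * frames_per_sec

  print(horizontal_rate_hz(525))   # ~15,750 Hz: what ordinary TV deflection parts are built for
  print(horizontal_rate_hz(875))   # ~26,250 Hz: a ~67% faster sweep, hence the extra heat
                                   #   in the horizontal drive circuitry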


So, they didn't do what portrait monitors do today? Interesting, thanks for the information.


"what portrait monitors do today" - you mean just turn the monitor sideways? That would have kept the scan rate lower, but the Alto would need to send the pixels a column at a time rather than a row at a time. That would be tricky to do (keeping in mind this was all done with TTL chips and microcode, so they wanted to keep things simple).


> but the Alto would need to send the pixels a column at a time rather than a row at a time. That would be tricky to do

Couldn't the buffer just be column-major?


The Alto could have used a column-major buffer and a sideways display, but that makes software more complex. For instance, text is in rows, so it's easier to render row-major. Using an 875-scanline display was easier - just change some resistors and capacitors in the monitor. (The exact resistors and capacitors are listed on page 36 of http://bitsavers.informatik.uni-stuttgart.de/pdf/xerox/alto/...)
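To make the trade-off concrete, here's a hypothetical sketch (not actual Alto code; the word size and dimensions are only illustrative) of how a 1-bit-per-pixel frame buffer maps (x, y) to a word under each layout:

  WORD_BITS = 16              # 16-bit words, typical for the era
  WIDTH, HEIGHT = 608, 800    # rounded portrait dimensions, for illustration only
  WORDS_PER_ROW = WIDTH // WORD_BITS
  WORDS_PER_COL = HEIGHT // WORD_BITS

  def row_major(x, y):
      """Pixels along a scan line sit in consecutive words."""
      return y * WORDS_PER_ROW + x // WORD_BITS, x % WORD_BITS

  def column_major(x, y):
      """Pixels down a column sit in consecutive words; a scan line is scattered."""
      return x * WORDS_PER_COL + y // WORD_BITS, y % WORD_BITS

  # One 16-pixel-wide row of a character cell lands in 1-2 words row-major,
  # but in 16 separate words column-major -- one per pixel column.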


I'm equally puzzled by this. The complexity argument is IMO bunk; it's no more difficult.

There is however a possible efficiency argument: if you are more likely to write horizontal rectangles of bitmaps, then you'd want the memory word to be horizontal so you have fewer memory locations to update. Was that really the case?
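A hypothetical count of 16-bit word writes for a horizontal strip (roughly a chunk of a text line) under each packing, just to put a number on the argument; the strip size is made up:

  WORD_BITS = 16

  def words_touched(span_along_packing, span_across_packing):
      # A run along the packing direction shares words (roughly span/16, plus one
      # for misalignment); each step across the packing direction starts new words.
      words_per_run = span_along_packing // WORD_BITS + 1
      return words_per_run * span_across_packing

  # A 64-wide x 8-tall strip:
  print(words_touched(64, 8))   # pixels packed along rows:    ~40 word writes
  print(words_touched(8, 64))   # pixels packed along columns: ~64 word writes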

UPDATE: I was using 2016 thinking, my bad. In those days, compared to the complexity of the rest of the system, making the CRT circuits custom was trivial. If this could, even marginally, simplify the rest of the system (not least the software), it would be the right choice.


Meaning what, a landscape or wide monitor turned on its side?


yes


Reminds me of the scene in Alien where they turn on the damaged Ash.



