Y Combinator's Xerox Alto: restoring the legendary 1970s GUI computer (righto.com)
300 points by kens on June 18, 2016 | 105 comments



I haven't used one of those in a long time.

Stanford had several Alto machines, but they didn't have Smalltalk, due to some licensing issue. They just ran standalone Mesa programs. When I was at Stanford, few people wanted to use the obsolete Altos, so time on them was available. So I did a small project on them.

Bravo was used as both the text editor and the word processor. The file format was plain text, then a control-Z, then the formatting info. The compiler stopped at control-Z. So you could use bold and italic in your programs, and make the source code look good.

As in the picture shown, the Stanford machines had the keyboard and display on top of the computer. This isn't required, and it's really annoying to type on. The keyboard is great; it's a massive casting around clicky keys.

Altos talk PUP, the PARC Universal Packet protocol, over 3 Mbit coax Ethernet. Stanford had gateways to connect this to the wider world.

I think I still have some of the Alto manuals.

The vision statement for the Dynabook is in "Personal Dynamic Media".[1] It is worth re-reading every few years.

[1] http://www.newmediareader.com/book_samples/nmr-26-kay.pdf


I just looked in some boxes I haven't opened in decades. I have "Mesa Language Manual, Version 5.0, April 1979". If the people with the Alto need this, let me know.

If the world had used Mesa instead of C, computing would have been far less buggy. Mesa was a hard-compiled language, but it had concurrency, monitors, co-routines ("ports", similar to Go channels), strong type safety, and a sane way to pass arrays around. In the 1970s.

(I should donate this stuff to the Computer Museum. I just found the original DEC Small Computer Manual, many 1960s UNIVAC mainframe manuals, and a reel of UNIVAC I steel magnetic tape.)


Please, both of you, get in touch with Al Kossow at the Computer History Museum about preserving these important pieces of our heritage.


I'll second eschaton's comment about preserving these old documents. Al Kossow has already preserved a lot of Alto documents at bitsavers [1] and I've found them extraordinarily useful for understanding how the Alto works.

[1] The Alto docs are at http://bitsavers.org/pdf/xerox/alto/ - I recommend the hardware reference (AltoHWRef.part1.pdf) and the schematics (schematics/). The Mesa manual mentioned earlier is at http://bitsavers.org/pdf/xerox/mesa/5.0_1979/documentation/C...


That's exactly the version of the Mesa manual I have, so that's been preserved.

Nobody seems to be very interested in early UNIVAC mainframe stuff, which is most of what I have.


Kossow does collect Univac mainframe stuff, though there are gaps. (Last I checked, bitsavers didn't have any Flow-matic docs, which is frustrating considering that language's historic importance -- though whatever exists would be a couple of decades older than the Alto.)


Well, I'd like to see the UNIVAC stuff preserved, even if it doesn't have the name recognition of, say, IBM or Xerox. Maybe I could write a blog post about UNIVAC some day - do you have anything interesting I could write about?


Maybe Modula-3 is too distant from Mesa to make this a valid point, but if not, there's a discussion to be had about why the Mesa-influenced Modula-3 (or the arguably essentially similar Ada) didn't sweep all before it in the 90s.

Is it that the "small set of sharp tools" provided by C, and the "safe and somewhat onerous discipline" provided by Modula-3 represent two points on an evolution that's converging towards the ideal systems programming language? Or maybe the language level is the wrong level at which to be considering this, as if we were analyzing prose at the level of phonemes?


Many of the Mesa researchers left Xerox PARC to join DEC.

There they created Modula-2+, with input from Niklaus Wirth, who had created Modula and Modula-2 after his sabbatical year at Xerox PARC.

Modula-2+ eventually became Modula-3, which was used to create the SPIN OS, distributed-object network protocols and other interesting goodies.

Meanwhile at Xerox, Mesa evolved into Cedar. Niklaus Wirth on his second visit learned about this new system and created Oberon when he returned to ETHZ.

Oberon then gave birth to Oberon-2, Active Oberon and Component Pascal. The Gadgets framework in the System 3 release was great for a mid-'90s workstation OS.

Nowadays I would say that C# is the spiritual successor of Modula-3, with Singularity, Midori and now .NET Native.

With Go taking the role of spiritual successor of Oberon.

Everyone should try to delve into books and papers from Burroughs, Xerox PARC, ETHZ, the UK Royal Navy and DEC for an alternative view of doing systems programming the right way.


No, it's that Compaq bought DEC, and then HP bought Compaq, and that ended the DEC Systems Research Center in downtown Palo Alto. That's where most Modula-3 development took place.[1]

[1] https://en.wikipedia.org/wiki/DEC_Systems_Research_Center


Luckily, HP has kept all the DEC stuff alive, so anyone who wants to delve into Modula-2+, Modula-3 and everything that was built with them just needs to dive into the HP Labs repository.


Wow. Please preserve and write down what you have. It's a pivotal piece of computing history.


We've all been fans of Ken's blog for years, so were thrilled that he took an interest in this project. Not only is Ken doing these amazing writeups, he gathered together the master restorers and engineers, some of whom worked on Altos at the time, who are now working on this one. Seeing them set to it, inspecting the Alto and figuring out what would be needed, was a real lesson in self-organization. I felt honored just to watch from the side.

We have two goals. One is to have the restoration chronicled as it goes along, in a way the HN community can discuss and participate in. Obviously we hit the jackpot there, with one of the best technical bloggers in the world.

The other goal is to do something with the Alto that the community will find interesting once it's running. A couple ideas are to make it fetch and render the front page of HN (we'd happily write whatever code was needed to serve it in a suitable format, since HTML is probably a bridge too far), or if we could find a second Alto to communicate with, play Maze War on them (http://www.digibarn.com/collections/games/xerox-maze-war/#ma...). But we'll be eager to hear any suggestions the community comes up with!


I have a Symbolics XL-1200 that I've been wondering what to do with. Would YC like it? It worked the last time I turned it on, but that was 4 or 5 years ago, probably.

It's like triple-pizza-box size, not a refrigerator nor even the size of the Alto.


I think the answer to that is very likely: hell yeah! But we should wait until the Alto is running. Can we keep in touch about this?


Sure, no problem.


Wasn't the pizza-box one the XL-1201?


Whoops, you're right, it is an XL1201 (without the dash, so I made two mistakes :-). The legend on the case is "XL1201 Compact".


I used the Alto at MIT (and for fun when I worked at PARC -- we had more powerful machines by then).

There are two other things about the alto that have really stuck in my mind. First, the whole thing uses only 300 SSI and MSI TTL chips! No higher order chips (no LSI, much less VLSI).

The other is that the bus bandwidth was only about 3/2 of the bandwidth consumed by the screen refresh. Updating the screen was really important: this was a user-centered, IO-focussed machine, which was super radical for its time. If you wanted to do a lot of computation you could steal cycles from the screen update, causing it to go black (in just the bottom half or so, IIRC, though I probably don't remember correctly).

Error in the article: I do believe the Alto was the origin of the BITBLT instruction, but it was based on the PDP-10 (PDP-6) block transfer instruction BLT, and the expression "blitting" was current before the Alto was developed. In fact PARC had a PDP-10, the standard research computer at the time -- homemade clones, because at that time Xerox was in the computer business and wanted PARC to use an SDS. (Again this is before my time, though MAXC was still running when I was there -- with an Alto as its front end!)

Also contrary to what the article says, the Alto display was not unusual in being portrait mode -- most glass TTYs (think ADM-3A, Hazeltine, VT-52, and I believe the 3270 as well) were taller than they were wide, like a piece of paper. The Alto display was unique, as mentioned, in being bitmapped and black on white. Because of the Alto, bitmapped portrait mode was standard for workstations such as the CADR Lisp machines, Bell's Blit terminal, the Three Rivers PERQ, and of course the later PARC computers we used: Dolphins, Dorados (all ECL logic!), and the Dandelion (sold as the Star). I remember vividly the first landscape machine I used, the Symbolics 3600 in 1985. I didn't, and still don't, appreciate the wasted space of landscape displays.

Three-button mice with the buttons arrayed horizontally were also standard because of the Alto. The first time I saw the Macintosh mouse in '84 I was shocked: how could someone use only a one-button mouse? There was a lot of mouse (originally called the "bug") experimentation in the 70s on button count and layout.

The Alto's microcoded instruction set was compatible with the DG Nova, as those were the computers used at PARC before the Alto was developed (before my time!).

edit: forgot to mention the origin of blitting.


  > most glass TTYs (think ADM-3A, Hazeltine, VT-52, and I believe
  > the 3270 as well) were taller than they were wide
That's mistaken.

http://terminals.classiccmp.org/wiki/index.php/Lear_Siegler_...

http://terminals.classiccmp.org/wiki/index.php/Hazeltine_100...

http://terminals.classiccmp.org/wiki/index.php/DEC_VT52

https://en.wikipedia.org/wiki/IBM_3270

Edit: Yes, I should have replied to the other comment…


(If you want to post a link to the comment you meant to reply to, we'll move it and delete this one.)


Thanks for the info, gumby. As you point out, the amount of bus bandwidth consumed by the display is a big thing. It seems a bit crazy that the processor was running microcode to copy all the pixels to the display in 16-word chunks, 30 times a second. (In a "normal" system, the video hardware fetches characters or pixels from memory. But in the Alto, the processor was running instructions to feed the pixels to the display over and over as they were being sent to the screen.)
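
To get a feel for the scale of that traffic, here's a back-of-envelope sketch. The 606x808 bitmap size is the commonly cited Alto figure (my assumption, not from the article); the 30 frames/sec and 16-word transfers come from the comment above.

  #include <stdio.h>

  /* Rough estimate of the Alto's display-refresh memory traffic.
     Assumes a 606x808 1-bit bitmap, 16-bit words, 30 full frames per
     second, and 16-word block transfers. */
  int main(void) {
      long pixels_per_frame = 606L * 808L;            /* 489,648 bits          */
      long words_per_frame  = pixels_per_frame / 16;  /* ~30,600 16-bit words  */
      long words_per_second = words_per_frame * 30;   /* ~918,000 words/sec    */
      long blocks_per_frame = words_per_frame / 16;   /* ~1,900 16-word chunks */

      printf("words per frame:   %ld\n", words_per_frame);
      printf("words per second:  %ld\n", words_per_second);
      printf("16-word transfers per frame: %ld\n", blocks_per_frame);
      return 0;
  }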

I'm confused about your portrait vs landscape comments, though. The ADM-3A, Hazeltine 2000, VT-52, 3270, as well as Datapoint, Four Phase, Viatron, etc had a horizontal display, not a portrait display.


> It seems a bit crazy that the processor was running microcode to copy all the pixels to the display

It seems crazy today but not in context. There wasn't video hardware in today's sense. There were either mainframes (with channel controllers) with the terminal doing the "rendering" or minicomputers and microcomputers in which the CPU did everything (which is what I guess it was like before the mainframe era).

You can see this reflected in Unix, and therefore in Linux: Unix was developed for the PDP-7 (and later the -11) as a reimplementation of some of the ideas of Multics, which ran on a mainframe. So C's IO was "user mode" (I seem to remember a funny line in the original version, something like, "You mean I have to call a function to do IO?") and the kernel had expensive, primitive IO capabilities and involved the CPU in everything. Well, there wasn't any alternative in the smaller PDP line (FWIW PDP-10s were larger machines than the -7 or -11, though only the later models had channel IO).

Memory-mapped IO was not uncommon on minicomputers.

> I'm confused about your portrait vs landscape comments, though. The ADM-3A, Hazeltine 2000, VT-52, 3270, as well as Datapoint, Four Phase, Viatron, etc had a horizontal display, not a portrait display.

I'm confused / unclear. Those terminals had more columns than rows, true, but the character positions were rectangular. So the ADM-3 and the Datapoint were rather square, actually, because TV tubes were squarish, not paper-like as I claimed. I think I biased my memory because I used a bunch of hacked terminals like AAA Ambassadors, which could cram in 80x48 rather than 80x24 and, because of the rectangular characters, were portrait-ish. Unfortunately it's too late to go back and edit my comment.


Ah yes, Ambassadors! Portrait mode "dumb" terminals, but highly valued for coding given their 48 lines of text. (Twice the norm in the VT-100 days.)

Had a lot of those first at the Columbia computer center back in the DEC-20 days, later at the Fairchild AI Lab startup (DEC-20's and LISPMs), and even later at Imagen (a Stanford TeX project spinoff building the first commercial laser printers before Apple and Xerox released theirs).


I think you also find the memory-mapped approach in early micros.

http://www.c64-wiki.com/index.php/memory_map

If you wanted to do IO on the C64, for example, you manipulated memory addresses between $D000 and $DFFF.
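
For illustration, a minimal sketch of that kind of memory-mapped IO in C (cc65-style); the VIC-II border and background colour registers at $D020/$D021 are the well-documented example, but treat this as a sketch rather than a tested program:

  /* Memory-mapped IO on the C64: the VIC-II video chip's registers sit in
     the $D000-$DFFF IO area, so "doing IO" is just writing to fixed addresses. */
  #define VIC_BORDER     (*(volatile unsigned char *)0xD020)  /* border colour     */
  #define VIC_BACKGROUND (*(volatile unsigned char *)0xD021)  /* background colour */

  int main(void) {
      VIC_BORDER     = 0;   /* 0 = black in the C64 palette     */
      VIC_BACKGROUND = 6;   /* 6 = blue                         */
      for (;;) { }          /* spin so the change stays visible */
      return 0;
  }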


"First, the whole thing uses only 300 SSI and MSI TTL chips! No higher order chips (no LSI, much less VLSI)."

I guess that makes it easier to reverse-engineer the hardware. I know it will be destructive, but does anybody know how many of these machines the visual6502.org team would need to do that?


Why reverse-engineer anything?

Just look up the logic diagram in your copy of The TTL Data Book.


Are all of them standard ones? If so, you're 100% right.

Also, I just saw https://news.ycombinator.com/item?id=11930327, which states: "All the Alto schematics are available at Al Kossow's Bitsavers [1]. If you want to understand how it works, start with the hardware manual [2]"

That must make this one of the easiest pieces of hardware to keep running, at least for the digital parts (the monitor and hard disk are more of a problem, I understand).


I don't know of any nonstandard 7400 series parts (not sure what that would mean) but the low end ones (all the SSI and MSI) must have several suppliers even today.


The first Altos used early dynamic RAM chips which required some really oddball chips to talk to TTL. I imagine that both the memories and the interface chips are probably very hard to find these days. But this particular Alto is five years newer and probably doesn't have these problems.


Interesting points about portrait-mode monitors, and overall. Was the DG Nova the same machine mentioned in The Soul of a New Machine? I read it a while back.


The Soul of a New Machine was about the 32-bit Data General Eclipse (1980), follow-on to the 16-bit Eclipse (1974), follow-on to the 16-bit Nova (1969), which provided the instruction set for the Alto. So not the same machine, but related.

(I find the Nova instruction set strangely similar to the ARM in many ways, but maybe I've just spent too much time looking at the ARM1.)


We had some Altos (and a laser printer) at the Bureau of Standards when I interned there in the late 1970s and early 80s. They had a number of games; one of them was written in Smalltalk, which you could break into and then muck around in. Some screenshots from BYTE and a few papers gave us syntax hints, and we were off.

At one point we had some questions about Smalltalk-76 and called up Xerox PARC. We managed to get hold of Adele Goldberg, who answered our questions but was not terribly amused. I think Alan Kay would have been friendlier to us kids :-)


My first impression is that the portrait-oriented monitor actually looks very stylish. There is something almost futuristic about it.


These days, some displays allow you to rotate them between landscape and portrait orientation.

At work, I have a 24-inch TFT in portrait mode which I use mostly for coding (and other tasks where vertical space is valuable). It is very nice because, e.g. in text processing, a whole page fits the screen nicely.

In a way, we have that with tablets and phones, too.


Operating systems and drivers allow rotation of most monitors, so if autorotation isn't a big deal it mainly comes down to mounting hardware. Monoprice has a stand for about $20.

http://www.monoprice.com/product?c_id=108&cp_id=10828&cs_id=...


There is a non-obvious caveat to using vertical monitors on Windows, though - ClearType font rendering doesn't support vertical subpixel arrangements, so you're stuck with naive anti-aliasing.


Color subpixel font hinting is on its way out. Windows 8 and 10 stopped using it for UWP (including the start menu / start screen); DirectWrite doesn't use it (font rendering in Edge and Firefox); Office 2013 doesn't use it.

Certainly screen rotation (with Windows tablets and phones) would involve a lot of inefficient re-rendering. But I think the official reason was that since the subpixel colouring effect depends on the background colour, it's hard to animate transitions efficiently.

On a high-DPI screen, you'd be hard-pressed to notice the difference compared to greyscale hinting. Colour subpixels were a great hack, but high-DPI is the proper solution to this issue.


Turn it off. It looks better anyway.


I've got 3 Dell 24" monitors that can be used in portrait mode. (U2412M - 95dpi 1920x1200 eIPS - not expensive. But many Dell monitors come with this sort of stand.) When I had a desktop PC with 2 GPUs I had them lined up that way in a row. I have to say it was a bit menacing at first - it's about a 48" diagonal. But, you know... you make do. I took a few pictures at the time but the only one I've got handy is it running Doom: http://ffe3.com/pics/.3monitors/IMG_1327.JPG

In the end I missed the horizontal space often enough that I've now settled on having at least one full-size - i.e., not laptop display - landscape display in the setup. (Currently I've got my laptop, 1 x landscape in front of me, and 1 x portrait to the side.)

If you're going to do this, I guess you want to buy IPS-type monitors to minimize colour discrepancies due to the wide range of viewing angles you'll be using. eIPS is supposed to have a 170° viewing angle, but the colours on mine still aren't quite consistent from side to side (or top to bottom as it is once rotated). If you've got a TN-type display I imagine it will be even worse.

Also might be worth buying all your monitors at once. I bought mine over a couple of years; the colour temperature is very slightly different on each one, even when using the factory-calibrated settings, and one has a noticeably different anti-glare coating.


For years I've used an old 19" LCD in portrait mode next to a new-ish 24". The display heights (after rotating the 19") are identical, and the vertical resolutions are almost identical (1080 vs 1024). It's a cheap/free way of getting a wider screen. Usually I keep some paper or reference document on the 19" and have two side-by-side terminals on the 24".


One of the first Macs I ever saw back in the early 90s had a Radius pivot display which could be switched between landscape or portrait.

As it was mostly used for DTP it made loads of sense.


I worked on screen saver engines that had to support these - we did a lot of direct access to the bit-mapped display for speed, at the cost of compatibility. That being said, there were OS hooks that made responding to the rotation easier.

The problem came in when running applications that didn't respond correctly to the configuration changes - since we had patched the OS (transparently, particularly when not actually driving the display) we got blamed for a lot of application crashes and ended up debugging and sending very explicit bug reports to other software vendors.

When the iPhone came out, I was deathly afraid of the screen rotation for about 6 months, convinced as I was that 50% of the apps wouldn't respond gracefully. (And I personally have never found one that failed here.)


At some point (I believe Windows 7), Windows started allowing the use of ctrl-alt-arrowkey to rotate the display. I think it depends a bit on the graphics drivers, but I also think most "standard" drivers (i.e. Intel, Nvidia, AMD) support it out of the box. It can be a bit confusing if you hit ctrl-alt-downarrow (invert screen) by surprise... (ctrl-alt-uparrow is the shortcut for standard mode).


That happened to me when I was 11 using Windows XP. Since I couldn't figure out what I did I resorted to turning my monitor on its head for about a week.


Those CRT Monitors are probably the weakest part of the system in terms of durability.


There is a lot of documentation about the Alto. Are there also complete circuit diagrams? I am just curious because the processor was made in TTL at a time before the 6502 and Z80 were born.

Sooner or later the last functional Xerox Alto will cease to work (sadly). In that case it could make sense to replace the dysfunctional parts with modern retro circuits. I wonder if a project to build a functional Alto clone (with a TFT as the screen) would take less effort than the famous MOnSter 6502 which was presented recently.


All the Alto schematics are available at Al Kossow's Bitsavers [1]. If you want to understand how it works, start with the hardware manual [2]. The Alto processor could probably be kept running indefinitely, since it uses standard parts that can be replaced. The monitor is more difficult; the Living Computer Museum made some new monitor boards for their Altos, cloning the existing boards. The disk drives are another potential maintenance nightmare; I wonder if it would be possible to build a flash-based disk emulator.

Making a FPGA-based Alto clone would be a possibility.

[1] http://bitsavers.org/pdf/xerox/alto/schematics

[2] http://bitsavers.org/pdf/xerox/alto/AltoHWRef.part1.pdf


LCM built MASSBUS adapters for their PDP-10s, so it's probably possible. (To be clear, the Alto was built entirely from COTS parts; the disks are of a known type used on many other minicomputers, including, I believe, the DG Nova.)


I'm sure they can make an adapter to use SIMM or DIMM RAM chips and bypass the bad memory chips, and use an SD card instead of hard drives to store the OS and programs on.

I guess someone could just write an emulator for the Raspberry Pi and Linux or something, based on the design of the Xerox Alto and how the chips function.

It seems there are emulators or simulators for Xerox Alto out there already:

http://toastytech.com/guis/salto.html

http://altogether.brouhaha.com/

Also helpful would be the source code to the Xerox Alto: http://www.computerhistory.org/collections/catalog/102706061

It was released, and I am sure the team can modify it to run with modern retro hardware.


Thanks to all the posters here for their answers and recommendations!


As it's all TTL (with four 74181 MSI chips), wouldn't it be possible to just grok it from the circuit board?


Total envy! There is an Alto in the Heinz-Nixdorf-Museum, but it is not functional.

Seeing one of these machines in action would be awesome. (Is there an emulator available?)


The Living Computer Museum in Seattle has a running Alto, but I haven't seen it. I've used the Salto emulator: https://github.com/brainsqueezer/salto_simulator


I have seen it, and used BravoX on it. I even gave a demo to standers-by once I figured it out. It's a very interesting system, one that is clearly not too far removed from a modern word processor.


This emulator looks interesting: http://toastytech.com/guis/salto.html


Xerox Alto brings back some memories indeed. I used one at CMU as a frosh to do engineering drawings for the Terragator (http://gizmodo.com/5072167/25-years-of-strange-and-wonderful...). Being new to computers, I didn't really get that a mouse and GUI were revolutionary. It just seemed so obvious. But that is the beauty of innovation done right - that to users it just seems obvious and natural.


The restoration is also being documented on Marc's YouTube channel.

https://www.youtube.com/user/mverdiell


This was a great system.

The more I research Xerox's papers and manuals for the Interlisp-D, Smalltalk and Mesa/Cedar systems, the more I become convinced that the industry's adoption of inferior systems like UNIX was a big step backwards.

Thankfully many traces of those ideas are now in Windows, Mac OS X, Android and iOS, Language Playgrounds and many IDE workflows.


I agree. UNIX has some brilliant ideas, like isolated functions (executables) connected by streams, but beyond that it made a lot of mistakes that we are still dealing with today.

The biggest one being a C-centric view of programming that has cost the world untold billions of dollars when dealing with untrusted data. It could have used a statically analyzable functional middleware of some kind, falling back to micro-optimization only when needed - the way that Clojure works with, say, JavaScript.

The other major failing that I see is overlooking ideas from ZFS, that the filesystem can act as a virtual tree over any storage medium, so UNIX wastes a lot of time on things like dependency hell, permissions, and distinguishing between file and socket streams or local and remote processors. It could have jailed each process in its own sandbox where copies of libraries are reference counted by the filesystem, running in a virtual storage and thread space. We're just now seeing the power of that with Vagrant and Docker (technically it took so long to get here due to virtualization resistance by Microsoft and Intel).

My other main gripe is more about approach than technology. UNIX (and Linux especially) stagnated decades ago due to the RTFM philosophy: the idea being that to be proficient in UNIX, one had to learn the entirety of the operating system. This goes against one of the main tenets of computer science, that we are standing on the shoulders of giants. So I really appreciate how passionately the Alto tried to make computing and programming approachable to the masses.

I keep hoping someone will release a portable Lisp machine that can run other OSs under virtualization and release us from these antiquated methodologies...


> The biggest one being a C-centric view of programming that has cost the world untold billions of dollars when dealing with untrusted data.

That's pretty simple to explain: all those other options were just way too slow to get the kind of performance required out of the hardware available at the time. The difference was simply too large to be ignored.

It's all nice and good to theorize about how the past should have been, but without UNIX you probably wouldn't be writing any of this on the medium you're currently using.

It has its flaws and it is far from perfect but at the time it fit the bill nicely.

The real problem is that we are categorically unable to move on when better options are around. There is a large amount of silliness involved when it comes to making responsible choices in computing, lots of ego, lots of NIH. Those are the real problems, not that UNIX was written in C.


> all those other options were just way too slow to get the kind of performance required out of the hardware available at the time. The difference was simply too large to be ignored.

If you compare with Xerox PARC hardware, not really; the major issue was the price of producing the type of architecture they were building.

As for safe systems programming, Burroughs was already doing it a decade earlier, on computer hardware much weaker than a PDP-11.


Moore's law is essentially about density (and hence about price), not about speed; the speed was a side effect.


Kinda. Density allows for either speed or for price, and going for speed allowed the companies to hold the price fixed.


> The real problem is that we are categorically unable to move on when better options are around

Every 10 years or so the industry just restarts the same loop it's been stuck in since the Amiga (actually the Amiga is probably the first iteration of a loop starting with the Alto), just with different syntax and faster hardware. Software is stagnant; ALL progress is in hardware. And with the end of Moore's Law that is grinding to a halt too.


I have some hope that with the end of Moore's law in sight we will finally be able to concentrate on the software for some progress. All that we've achieved to date seems to be a prettier way to squander cycles.

And in a way that's a real pity. It could be that if Moore's law had been a doubling every 30 years rather than every 18 months, we'd have had a lot more appreciation for writing good software. As it was, the crap won out over the good stuff simply by being bailed out by Moore's law just in time for the next cycle.

But in some alternate universe hardware progress was so slow that any gains had to come from better software.


> All that we've achieved to date seems to be prettier way to squander cycles.

And stack turtles.

Whenever I see a headline about unikernels, I envision Doom running on DOS in a VM on top of Linux on top of some hardware somewhere. How many layers of (potentially leaky) abstractions are we looking at?


I find myself wondering if it is a problem of human-computer IO.

The minimal reliable human-computer interface is a keyboard and a screen; going beyond that quickly adds either noise, delays, or both.


> The biggest one being a C-centric view of programming that has cost the world untold billions of dollars when dealing with untrusted data.

I'm not sure it cost anyone anything. I mean, a lot of the OSes were/are written in it, so if you were going to go down that path you'd have to not totally forget to add a rather large benefit on the credit side. It's hard to imagine, but programming wasn't always about compensating for not quite understanding how the 17 different frameworks you've downloaded from GitHub and dragged into an IDE work by just getting a faster machine. Once upon a time people had to carefully measure how much to unroll the loop, or how small a lookup table they could get away with before the errors became a problem.
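
For anyone who never had to make that measurement, here's a rough sketch of the lookup-table side of the trade-off (the 256-entry sine table and linear interpolation are arbitrary choices for illustration, not anything specific to the systems discussed here):

  #include <math.h>
  #include <stdio.h>

  /* Classic space/accuracy trade-off: approximate sin() with a small table
     plus linear interpolation. Shrink TABLE_SIZE and the worst-case error
     grows; measuring that error is the "how small can I get away with"
     question. */
  #define TABLE_SIZE 256
  #define TWO_PI 6.283185307179586

  static double table[TABLE_SIZE + 1];

  static void init_table(void) {
      int i;
      for (i = 0; i <= TABLE_SIZE; i++)
          table[i] = sin(TWO_PI * i / TABLE_SIZE);
  }

  static double table_sin(double x) {
      double t = x / TWO_PI, pos;
      int i;
      t -= floor(t);            /* wrap the angle into [0, 1) */
      pos = t * TABLE_SIZE;
      i = (int)pos;
      return table[i] + (pos - i) * (table[i + 1] - table[i]);
  }

  int main(void) {
      double worst = 0.0, x;
      init_table();
      for (x = 0.0; x < TWO_PI; x += 0.001) {
          double err = fabs(table_sin(x) - sin(x));
          if (err > worst) worst = err;
      }
      printf("%d entries -> worst error about %g\n", TABLE_SIZE, worst);
      return 0;
  }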


And that was already being done in the '60s with much better languages, but the UNIX revisionists don't like people spending time reading about Burroughs, Algol-68RS, PL/I, Mesa and many other languages provided by mainframe vendors.


Err, are we not taking things out of historical context?


Mass market products are rarely best in class, they are all about offering 'good enough' at commodity prices. Unix won because it was cheap, widely available and got the job done.


The famous article "The rise of 'Worse is Better'" describes the victory of Unix/C over the "right thing" style of design : https://www.dreamsongs.com/RiseOfWorseIsBetter.html

Edit: I've changed the link. Thanks for letting me know about the bad redirect, golergka. The jwz link looked fine to me; it's pretty obnoxious if the site does a NSFW redirect.


Warning: this link now redirects to a NSFW image about HN being a DDOS.


No, the poster didn't forget to check the link or have some joke in mind; jwz's site always does that if you have HN as a referrer.


JWZ may know his stuff, but damn he can be a salty asshole...


I usually like to leave more substantive comments on HN, but I don't think he would be mad if I said "that's kind of his thing."


One thing that puzzles me is how he coined CADT to describe the GNOME development process while also having embraced OS X as some kind of example of good computing, because Apple has been just as much about "churn" and dropping old code for new code, AFAIK.

Heh, and while writing this it dawns on me that it may well be that he is OK with OS X because UNIX. Meaning that OS X is a BSD derivative, which in turn is derived from the historical UNIX, while on the other hand Linux is just a bastard clone.

Because it hit me that I have seen some similar fervent anti-Linux sentiment from other UNIX people, related to either the BSDs or Solaris.


As we all get older, I think it's understandable for some people to be frustrated by waves of what JWZ called (on one occasion) "anklebiters" arriving from HN. DDOS or not.

Wikipedia has an objective and some structure, whereas HN is, for all its uniqueness and value, not an organized activity.


What an interesting fellow! I'd like to subscribe to his newsletter.


Well, if you didn't misplace the /s, you could try his blog. It will be more about DNA Lounge and pizza than about tech, though. Not worth linking it here unless you want more facefuls of hairy eggs...


My uncle worked at PARC when this was being made. I probably wouldn't have gone into tech if it weren't for him.

This makes me want to message him and ask him about his time there.


get him in touch with the restoration group!


Please do, and let us know what he says!


> The disk drive at the top of the cabinet takes a removable 2.5 megabyte disk cartridge. The small capacity of the disk was a problem for users, but files could also be accessed over the Ethernet from file servers.

(...)

> The Alto was introduced in 1973.

2.5 megabytes of removable/swappable storage? In the 70s? I'm amazed that users found it constraining! That's more than even the Amiga managed to fit on 3.5" floppy disks (unlike PCs, which generally could only format to 1.44 MB, the Amiga typically fit ~1.8 MB on the raw 2 MB HD 3.5" floppies).


Keep in mind that although the Alto disk is removable, it is a hard disk, not a floppy; the user normally kept it mounted for a session, rather than using a new disk every time they load something. Thus, the disk needs to hold everything you need. (Although you could access files over the network with FTP.)

The user guide describes in great detail [1] why Alto users found 2.5MB of disk limiting. To summarize, the disk starts with 4800 512-byte pages. Basic OS stuff takes 900 pages. FTP and the editor take 900 pages. Fonts take more, as well as other commonly-needed software. So a non-programmer's disk typically has 1600 pages (800kB) available to work with. Programmers require more tools, so the free disk space is tighter. It seems people ended up needing to manage their disk space closely, even though 2.5MB sounds like a lot.

[1] See pdf page 20 of http://bitsavers.org/pdf/xerox/alto/stanford/StanfordAltoUse... for a discussion of why disk space on the Alto was rapidly used up.
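
To make the arithmetic concrete, a quick tally using only the page counts above (the "fonts and other software" figure is just the remainder implied by the 1600-page number):

  #include <stdio.h>

  /* Disk budget of a typical non-programmer's Alto disk, in 512-byte pages,
     per the Stanford Alto User's Guide cited above. */
  int main(void) {
      int total    = 4800;   /* whole disk: 4800 * 512 bytes = 2400 KB        */
      int os       = 900;    /* basic OS                                      */
      int tools    = 900;    /* FTP and the editor                            */
      int fonts    = 1400;   /* fonts and other software (implied remainder)  */
      int free_pgs = total - os - tools - fonts;    /* 1600 pages             */

      printf("free: %d pages = %d KB of %d KB\n",
             free_pgs, free_pgs * 512 / 1024, total * 512 / 1024);
      return 0;
  }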


I see. That, coupled with the limited RAM (as opposed to, e.g., the Amiga 500, which had 512 KB, or the Amiga 2000, which had at least a full megabyte -- I think ours had 4 MB), does make a big difference.

Interesting to note (as @pjmlp mentions) that 1973 + 15 years puts us at around the time the Amiga 2000 was introduced. I remember ours (a Model B, IIRC) initially had a 20 MB hard drive, later upgraded with another 40 MB for a total of 60 MB of storage. It could fit almost all of my floppies!

Looking at the Wikipedia page for the Amiga 500 I see the RAM latency listed as 150 ns. I'm not sure if that's the actual latency to read data from RAM - if so, it's within an order of magnitude of accessing RAM today! (I'm also a little sad that while my i7 is a lot faster than that ageing 68000 running at 7.5 MHz, it's still not easy to find a single core with 7.5+ GHz of performance :-( ...)


As Alan Kay says in his talks, to do this type of research one needs to imagine how hardware will look 15 years from now and build such hardware, regardless of how expensive it might be.


This point is so important that Alan has emphasized it as one thing we're mostly missing.

The Alto was not meant to be a personal computer but a "personal supercomputer"—hardware so far ahead of the curve that it could run the software of the future. In other words, not a personal computer but a machine for inventing personal computing with, which of course they proceeded to do. Having such hardware is what makes it possible to figure out what the software of the future needs to be. If you can't run it, you can't create it.

For me this sheds light on a couple of Alan's famous lines: that the best way to predict the future is to invent it, and that people who are serious about software need also to be serious about hardware.

Such advanced hardware is necessarily costly but should still be within the reach of a properly funded research lab. It seems to have been an enormous frustration to Alan and his colleagues that they were unable to persuade hardware vendors to make such hardware at any price, especially the Alto's microcode-driven style of hardware. Among other things, that was what had enabled the PARC researchers to make the most of the limited memory they had, since when a task was consuming too much memory they could implement it in microcode, pushing it down to the hardware (at least from the software's point of view) and freeing up RAM for software.


Alan Kay's 1972 paper on the Dynabook [1] is very interesting: he designed the Dynabook from the perspective of what you'd want in a personal computer, then described in detail what technology improvements would be needed to get there, and how to get the price down to $500.

It's important to remember that the Alto was called the "interim Dynabook", a system to try out the Dynabook ideas until technology caught up and made a Dynabook-like system possible (40 years later).

[1] http://history-computer.com/Library/Kay72.pdf - jump ahead to page 6 for the hardware details


> It seems to have been an enormous frustration to Alan and his colleagues that they were unable to persuade hardware vendors to make such hardware at any price

No surprise there. Hardware production is frankly inherently conservative unless one is a startup (i.e., nothing to lose but one's pride). This is because you incur all the up-front cost of tooling and production runs before you even know how well something will sell.


There are so many points in the computer industry where one person's or a small group's decisions changed everything. After listening to "Dealers in Lightning" you wonder about quite a lot of decisions. One of the later ones got me thinking: what would have happened if Xerox had not gotten rid of its pre-IPO Apple stock and had allowed the Lisa team to license Smalltalk? I wonder how the sequence with the Mac would have played out from that point.


Xerox did license Smalltalk-80 to Apple. It was one of the products promised at Lisa's launch, but it was never delivered. It was ported to the Mac and became available to registered developers (I got version 0.7 through a friend who was a developer). In 1996 this evolved into Squeak and Apple was able to release it as free software thanks to its license from Xerox. Some people complained about the Squeak license, so in 2006 Apple re-released its part of Squeak under the Apache license (and the rest was re-released under the MIT X11 license by each part's author).

Xerox gave the same license to Tektronix (which launched its 4404 and 4406 Smalltalk computers), HP (which created Distributed Smalltalk) and DEC. The DEC license was extended to UC Berkeley.


According to 'Dealers in Lightning' they did not license it when Apple asked the first time. They did when they licensed it to everyone else. This was all pre-Squeak.


The first Smalltalk-80 tape was delivered to Apple, Tektronix, HP and DEC on February 17, 1981. So in a way, they all had licenses at this point though it was more a contract to review the system and the book that was being written as well as dedicating at least two engineers each to try to implement it. Three more tapes followed (the last one probably early in 1982).

When Smalltalk-80 Version 2 was released all four companies got a free license for it which allowed them to do anything they wanted, including re-release it under a different license. Which is what Apple did (three times) with Squeak. The Version 2 license is from 1982 while the Squeak licenses are from 1996 and 2006.

But you are right - Xerox did not give Apple any special treatment compared to the other three, though it did compared to the world in general.

Note that Apple had several Smalltalk users involved in the development of the Lisa and the Macintosh, but they felt Smalltalk was too much for the machines they were creating. Steve Jobs felt that was still the case when developing the NeXT, which is why they used the Smalltalk/C hybrid Objective-C.


Thanks for the info. Now, if I understand the timeline for the Lisa, what you have said, and what 'Dealers in Lightning' says, the book must be referring to a pre-1980 decision (or the book is out to lunch)? When did the Lisa actually start?

// sorry for the latency on my responses, I'm driving west on I94


A very good timeline for the Lisa using screen captures is:

http://www.folklore.org/StoryView.py?story=Busy_Being_Born.t...

This contradicts some accounts which say that the Lisa was a text based minicomputer before the Xerox visit.

In practice both the Lisa and the Macintosh projects started in 1979. This is pretty amazing when you consider that the floppy disk for the Apple II had just been made available less than a year before and VisiCalc had not yet been released. That the Apple III was in development at this time seems pretty reasonable, but these more advanced machines were pretty ambitious.

Jef Raskin was very familiar with the Xerox PARC work since he had visited there when he was a professor at UCLA before he joined Apple to work on documentation. He didn't like the mouse or windows, but had always been interested in a fully graphical computer. His ideas for the Macintosh can be seen in his later Canon Cat machine. Steve Jobs didn't like his project and kept trying to kill it. Jef thought that if Steve could see the Alto he would "get it" and leave the Macintosh group alone. That is indeed what happened. But not long after that Steve was kicked out of the Lisa project by the board who wanted someone with more experience to be in charge of such an important project. Steve joined the Macintosh project and reshaped it as a "Lisa Jr", which caused Jef to leave Apple.

In the Lisa timeline you can see the effect of the second visit to PARC (both were in 1979) in the form of windows, though these didn't stay like the Smalltalk ones for very long (this style reappeared in the BeOS for some reason). And the effect of the launch of the Xerox Star can be seen in 1981 in the form of icons which look very much like the ones on the Star.

The Smalltalk project at Apple was started in October 1980 and lasted 18 months. The system first started running on the Lisa in April 1981.


The original Macintosh really seems like it borrowed a lot aesthetically, in addition to the similarity in operation:

http://lowendmac.com/1984/macintosh-128k/

http://www.righto.com/2016/06/y-combinators-xerox-alto-resto...


I'm not seeing the aesthetic similarity between the Mac and the Alto, except that they're both beige, have a mouse, and are taller than wide. What specifically do you see in common?


Monitors back then were wide, and the Alto turned the screen on its side. The Mac somewhat similarly had a screen that wasn't wide. The Mac mouse is also copied from the Alto's if you look at the shape.


I'm not following you here - the Mac had a wide monitor, standard 4:3 CRT, totally unlike the Alto's portrait display. The Mac's mouse had one button, while the Alto's mouse had three buttons. Both had a rectangular shape, but that's the obvious shape to use (see Engelbart's mouse).


What Mac are you talking about? The classic Mac screen was wider than it was tall.


Well, Xerox got pre-IPO Apple stock, so I would hope Apple got something for the money.


Did the Alto have a graphical chat system?


As far as I can tell, no. The Alto had a program called Chat, but this was more like telnet/ssh, allowing you to connect to a remote computer, rather than communicate with a person.



