How the Atari ST Almost Had Real Unix (2011) (dadhacker.com)
153 points by pmoriarty on Nov 27, 2015 | hide | past | favorite | 73 comments

It is possible to make VM work with the 68000. Masscomp did it, but it was pretty brute force: they ran two 68000s in lockstep, one just behind the other, so when the first one faulted it stopped the other one, and the other handled the fault. Something like that. Clem Cole would know the exact details.

They made for a pretty beefy machine, I ran ~20 users on one. That was when we had a pile of students on a Vax 11/780; I had accounts on both and much preferred the Masscomp.

I think there were several manufacturers who used that trick. Apollo was another. That wasn't really an option for consumer-grade machines like the Atari ST, though, since 68000 chips were pretty expensive at the time (I think they were around $500 apiece when they were introduced). By the time the price came down the later 680x0 chips with real MMUs were available.

They must have lost a lot of money on the ST, then, mine cost $800 new (monochrome) IIRC.

As others have noted, the ST came out several years after the 68000, so the price had dropped some. It still was a non-trivial amount, though.

The comparable Apple machine (monochrome, 68000 processor, 512K RAM) was selling for $2,800.

Even given Apple's famously high margins and Jack Tramiel's equally famous "deal-making ability" (which some would call "gouging" :-)), it's clear that the ST retail pricing was pretty close to the bone.

As was the workmanship on some of them. I had to re-solder more than one joint on the STs passing through one of the stores in Amsterdam (ditto for the C64, by the way).

The ST was a very capable machine built on a very small budget. Compared to that other el-cheapo 68K of the time (the Sinclair QL, a 68008 based machine) it worked wonders and it was the first 32 bit capable machine that I could afford.

Wow. $500 for a 68000?!

State of the art high-speed CPU. You don't blink at paying Intel $500 for a Xeon 10-core doohickey, right?

And like everything else in silicon, sizes went down, production went up, costs went down. The Palm PDAs used, essentially, a 68000 core at up to 33MHz (a variant called DragonBall) and started selling at $499 for the system; they were eventually around $79 each.

Now a slow, small Raspberry Pi runs $5, plus $25 in more or less required accessories.

I don't recall the COGS for the ST (they held this stuff pretty close), but in 1985 the 68000 CPU couldn't have been more than $50, probably less. Almost certainly much less, because the Tramiels were absolute wizards at making deals.

1985 was 6 years after the 68K was introduced. By Moore's law (a halving roughly every 18 months, so about four halvings in six years, or 2^4 = 16), the cost of manufacturing should have been about 1/16th of what it was in 1979, so the volume price was probably around 1/16th of the introduction price as well.

Remember that everything on the ST was being costed in early or mid 1984, about a year before the ST shipped. (Which in retrospect was an amazingly compressed schedule...)

I think that the National 32032 would have beaten the 68K in cost; National was pretty hungry and willing to make good deals. But their chips were extremely buggy (which the Tramiels may not have cared about, since making software engineers suffer was part of their model), and the 32032 was a relatively poor performer, clock-for-clock. That made the hardware engineers go "ewwww", and after a while we software guys realized that National wasn't being invited over for meetings any more.

At this point the 68010 had been available for a long time and the 68020 was just released. These still had no MMU but had much better bus error handling than the 68000. In retrospect, it was probably trivial to do 68000/68010 versions of the same CPU by just changing the microcode.

The 68010 and the 68020 both had (external) MMUs available. In those days, the MMU and FPU were almost always separate chips (I can't think of any exceptions, though there might have been some).

Granted, the one for the 68010 sucked pretty bad.

Yes, this is exactly what I mean. There were Motorola's own MMUs, but custom MMUs based on discrete logic were also common.

This is why I wished the 6502 crew had followed up with a successor processor. These prices, plus Intel, pretty much killed any 16-bit or 32-bit follow-on in the under-$200 market after the C64 / Atari 8-bit era.

For some real joy, check the 88000 pricing and see why it failed.

ARM's the closest spiritual successor I can think of, but it didn't have the kinda traction to make a proper successor to the cheap micros...

... at least then. now, there are. :)

It should be noted that certain 680x0 Macintoshes could run a real UNIX, created by Apple no less.


What was unique about it was that it could run System 7 applications alongside your common System V/BSD tools.

I've still got a Quadra 800 at home running it.

It might be worth pointing out that there is a Macintosh II emulator built solely for the purpose of running A/UX.


A mirror of the well-known A/UX software repository, Jagubox, has been published at the Internet Archive.


> This will probably be the last release. I won't be able to work on Shoebill going forward (by contractual obligation), so I wanted to race out one last release.

Really sad that companies put these kinds of restrictions on their employees...

Well considering that the author went on to work at Apple, it sort of makes sense. I don't think he was explicitly told not to continue development, but it is likely that some sort of boilerplate part of his contract restricts him.

I don't think it makes any sense, and the fact that it's a boilerplate clause everyone gets makes it worse, not better.

Employers really need to stop trying to control what their employees do while off the clock.

The project depends on reverse-engineering proprietary (if archaic) Apple hardware/software. Do you think Microsoft would/should allow their employees to contribute to Wine? Or Nintendo employees working on open-source emulators?

Do you think I should allow you to drink tea? You'd probably say, "you can't allow or disallow that, you don't have that power." That's what I think Microsoft's position should be with regard to their employees' free time.

The funny thing is that in this case the Apple stuff involved is decades old by now!

I share your view. However, I can also see the employer's perspective in a scenario where the company's project ships late but the employee's commits to an open-source project they founded or participate in are all on time.

I won't sign a contract that restricts what I do in my free time. Any place that does that is not a place for me.

I can see the employer's perspective as you describe it, as in "psychopaths might think this way," but I can't sympathize with it. I think this is something we need to regulate, just like we say employees can't tell you what god to worship or whether to have children.


If you get a chance to edit that typo I'll delete this comment to reduce clutter. :-)

Thank you for pointing that out. I guess I'll delete mine after you delete yours. I hope it works that way.

Silly me, I didn't get back here in time to delete my comment! But we're probably long off the home page by now, so nobody cares much about a little clutter... :-)

Doing anything else is a form of slavery. Most people are somehow OK with it. That company basically owns you 24/7, and everything you produce, for a handful of cash.

Also, it appears that a mirror of A/UX Penelope is still up, even after its original maintainer has shut it down.


We had a lab of I think Mac IIx's running A/UX at Georgia Tech. My understanding was that Apple only did A/UX so they could satisfy a government procurement requirement of having an available *nix for their hardware. We weren't supposed to let people reboot them into Mac OS so the lab was mostly empty all the time.

I think A/UX 2.0 was much better than 1.x, though, including the Finder and MultiFinder (previously Mac apps had to be launched from the command line). Trivia: A/UX 1.x used separate programs at the command line to read/write MFS/HFS volumes, and 1.0 only included the MFS one.

A/UX was an interesting UNIX, but it did have its flaws.

If the system beep occurred, the kernel would lose interrupts. Needless to say, this was not a good thing.

I'm surprised that would be the case, at least by A/UX 3 the Sound Manager routines were re-routed to call into special A/UX implementations (which I assume wouldn't be disabling interrupts).

What manner of madness caused this?

I imagine, the oscillator was software driven, and play_sound (or whatever they called it in the library) disabled interrupts so the sound wouldn't be modified by an interrupt.

That was a real missed opportunity, the equivalent of OS X decades earlier

It did require a ridiculously expensive computer for its time, well into the workstation range, but with relatively puny hardware compared to what Sun could offer at the time.

More aptly, FPU and MMU hardware were incredibly expensive in the late 1980's and early 1990's, and the amount of RAM to sustain running A/UX would've crept you closer into workstation territory anyway.

Keep in mind that this was a time when one was not guaranteed to have memory segmentation hardware, let alone floating point hardware. We take this for granted in 2015, but in the period between 1985 and 1995, it was a serious luxury for lots of computer buyers.

That said, with 4MB of 60ns RAM and a 68030 costing you $10k to start on a Mac IIfx and an upgrade to take you up to 8MB potentially costing you another couple grand, why not just go SGI and be done with it?

1MB SIMMs were much cheaper than that by then, though the IIfx was famous for requiring special 64-pin SIMMs.

Equivalent? No. Unix is not the secret sauce of OSX. I never used A/UX. Did it have a GUI at all?

Yes, a variation of the System 7 finder. Personally I used MachTen which was a 4.3BSD variation which ran on top of the usual system.

Those were the Mac IIs, which had 68020 CPUs and could resume instructions after faulting.

Trivia: A/UX 1.0 was released just before the famous DRAM shortage and at one point it was common to buy A/UX Mac IIs just to have the 4MB of RAM in them.

They needed a real MMU though (68851). The Amiga also supported real Unix on 68030 machines.

This came about after Sun and AT&T decided to standardise the Unix industry on SVR4. Having lots of cheap Atari, Amiga and other micros running SVR4 would have helped it dominate the industry. In theory, at least.

This prompted DEC (Ultrix), HP (HP-UX), IBM (AIX) and other companies to band together to develop their own standard Unix, which was OSF/1 (Open Software Foundation, or as Scott McNealy said, "Oppose Sun Forever").

This kicked off another round of Unix Wars, which was no doubt much appreciated in Redmond, where Windows NT was being developed.

The Atari SVR4 was ported by Unisoft and there's a video of it booting on a TT 030 workstation (which had a 68030). https://vimeo.com/100657320

I was at the Atari Unix SVR4 (pre-)announcement at CeBIT in Germany, but it's so long ago I've forgotten the details.

> "Having lots of cheap Atari, Amiga and other micros running SVR4 would have helped it dominate the industry."

For those that are interested, SVR4 was released for the Amiga. It's a shame Commodore's management blocked it from reaching its full potential.


I'll just add that I found the author's story of how he started working at Atari in the first place[1] far more interesting than the present article.

[1] - http://www.dadhacker.com/blog/?p=987

Many thanks for that link: it's a great story.

So essentially he re-invented segmented memory for memory protection purposes. Neat hack!

There was a brown box at the time, iirc by Torch (?) that ran a version of Unix on a 68K. Does anybody know which machine that was?

The 'unix like' 6809 OS referenced in the comments is OS-9.


Which is a really neat little OS that you can run on the TRS-80 CoCo and the UK clone of it called the 'Dragon 32'.

I knew about segmented memory. But base/bounds was too expensive and slow.

The trick was to make it work without requiring an adder -- and subsequent carry propagation delay -- in the data path. Thus, "one gate delay" and simple replacement of address lines, and a simple "all zeros or all ones" check on the bounds, rather than a numeric comparison.

The reason I call it 'segmented memory' is that that is more or less exactly the same but this does not have the addition penalty. Super neat :)
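For the curious, here is a rough software model of the address-line-replacement scheme described above (all names are illustrative, and this is a simulation of the idea, not the actual hardware): the top address bits are swapped for the contents of a per-process register instead of being added to a base, so no adder sits in the address path, and the "bounds check" is just a test that the replaced bits are all zeros or all ones.

```c
#include <stdint.h>

/* Rough software model of the scheme described above. The 68000 has
 * a 24-bit address bus; here the top 8 address lines are simply
 * replaced by a per-process register value (no base-register
 * addition, hence no carry-propagation delay), and the "limit check"
 * is just "are the replaced bits all 0s or all 1s". */

#define ADDR_BITS 24
#define TOP_BITS   8
#define LOW_BITS  (ADDR_BITS - TOP_BITS)

typedef struct {
    uint32_t replacement;   /* value driven onto the top address lines */
} fake_mmu;

/* Returns the translated address, or -1 to model a bus error. */
static int64_t translate(const fake_mmu *m, uint32_t vaddr)
{
    uint32_t top = (vaddr >> LOW_BITS) & ((1u << TOP_BITS) - 1);

    /* "all zeros or all ones" check, not a numeric comparison */
    if (top != 0 && top != ((1u << TOP_BITS) - 1))
        return -1;          /* outside the process's window: fault */

    uint32_t low = vaddr & ((1u << LOW_BITS) - 1);
    return ((int64_t)m->replacement << LOW_BITS) | low;
}
```

Replacing lines and testing for all-zeros/all-ones needs only muxes and a wide NOR/AND, which is where the "one gate delay" claim comes from; a base/bounds design would put a full adder and a magnitude comparator in the same path.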

I ran into a related problem, I had a need to drive a whole pile of IO and only an 8 bit centronics port to work with (on the ST as well). So I ended up making a little demultiplexer using 4 bits of the 8 as address and the remaining 4 as data. The strobe was used to latch the data. Not super but it worked well enough to let the 68K drive stepper motors in real time near their maximum working frequency (30K steps / second or so).
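A tiny C sketch of that 4+4 scheme (names are illustrative; the real thing was external latch hardware on the printer port): the top nibble of the byte written to the port selects one of 16 latches, the bottom nibble is the data, and the Centronics strobe clocks it in.

```c
#include <stdint.h>

/* Sketch of the 4-bit-address / 4-bit-data demultiplexer described
 * above. On the real ST the 68K wrote the byte to the Centronics
 * data lines and pulsed the strobe; here the external latch bank is
 * modelled as an array. */

typedef struct {
    uint8_t latch[16];      /* one 4-bit latch per address */
} latch_bank;

/* Compose the byte the CPU writes to the 8-bit printer port. */
static uint8_t port_byte(uint8_t addr4, uint8_t data4)
{
    return (uint8_t)(((addr4 & 0xF) << 4) | (data4 & 0xF));
}

/* What the external logic does on the strobe edge. */
static void strobe(latch_bank *b, uint8_t byte)
{
    b->latch[byte >> 4] = byte & 0xF;
}
```

Four address bits times four data bits turns a single 8-bit port into 16 independent 4-bit outputs, which is plenty to run a few stepper motor drivers.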

The ST was a fascinating machine, lots of ports, relatively cheap and with all kinds of limitations that continuously drove the users to creative hacks. I loved that time. Thanks a ton for all your work.

Oh, one more for the road: we had a product that was copied quite frequently and obviously wanted to make that a bit harder. There was no budget for a dongle so in the end I looped one of the 8 bits of the port back to an input. If that wire wasn't present the software would exit with an 'out of memory' error :) Threw the copyists for a loop because all I did on not finding the wire was allocate all the remaining memory which would then cause a TOS Bomb on the next memory allocation, which could happen just about anywhere. Nasty little trick.
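The trick can be sketched like this (a simulation, not the original ST code; `sim_malloc` is a made-up stand-in for the system allocator, and the wire check stands in for reading the looped-back printer-port bit):

```c
#include <stdbool.h>

/* Simulation of the loop-back "dongle" trick described above. */

static long heap_free = 1024;   /* pretend free memory, in bytes */

static long sim_malloc(long n)  /* fake allocator; -1 on failure */
{
    if (n <= 0 || n > heap_free)
        return -1;
    heap_free -= n;
    return n;                   /* fake success */
}

/* Called once at startup: if the loop-back wire is missing, silently
 * eat ALL remaining memory. The program then carries on normally and
 * only fails at the NEXT allocation, which can happen just about
 * anywhere, far away from this check. */
static void check_wire(bool wire_present)
{
    if (!wire_present)
        sim_malloc(heap_free);
}
```

Because the failure surfaces at some later, unrelated allocation, anyone single-stepping through the check itself sees nothing suspicious, which is what threw the copyists for a loop.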

If it was a "brown box" maybe it was a Cromemco?


No it most resembled a cube each side about 30 cm or so. It's ages ago and my memory about this is hazy, I saw the machine exactly once. Now I really want to find out.

Edit: Ah, got it:


Indeed, the Torch Unicorn, Unix on a 68000. Makes you wonder what they did for an MMU; apparently it ran UNIX System III.

I remember reading about them in what was probably "PC World" - first time I'd read about Unix or seen C code. I was amazed that you could program without line numbers... :-)

Great story. This machine was remarkably capable for its time, for both business and personal use. I was lucky enough to spend a few years with this machine as my primary; I can only imagine what it would have been like to be exposed to Unix on top of the graphics, digital creation and other tools.

When you consider what the Atari ST, Commodore Amiga and Acorn Archimedes were capable of in the late 80s and early 90s, you see how little progress we've really made.

There isn't much point in going beyond "sufficient", so I would not expect much progress beyond what a common user needs. Past that point, if we want to make better computers, we need to make compelling software that needs them to run. This is how GUIs, multitasking, multicore, memory protection, GPUs and 64 bits became mainstream. I am not oblivious to the fact that my Windows machine of 1991 was science fiction when I got my first Apple II, as would be the prospect of having 6 ARM-based UNIX machines on my desk (one of them off because the battery is drained), or the fact that I am running 3 Linux VMs and one LXC container while I work; all of that would have been unthinkable in 1991.

What annoys me is that the machines you mention were remarkably simple and elegant machines (Amiga was elegant, but very simple) while a modern PC is a matryoshka of nested computers all the way down to an 8088-powered IBM PC 5150. There must even be a cassette port (connected to nothing, of course) somewhere in a modern PC chipset.

I had an Atari St something(1200?) in college. I never told any of my professors, or classmates about my computer. I literally felt like I was cheating.

I was afraid someone would detect the matrix print, but no one ever questioned my papers. I did have professors bleed red all over my papers though. My grammer was just horrid. My spelling was great though.

In one English class, a student plugged in a laptop(it was the size of a small suitcase) into a electrical socket. The teacher made some demeaning comment to the student. It was someting like, 'I never thought I would see the day where a student plugged in a computer? Ugh?' I remember thinking that student has some balls, but has a great tool at his disposal. Never liked that English teacher. He accused me of plagiarizing once--in a weird round about way. Ah, back then teachers had a lot of power. They could ruin a student's future.

I remember trying to get my girlfriend interested in my Atari. She was paying 2 dollars a page for someone to type her papers. I showed her the word program. I showed her the Paint program. She didn't like anything about the machine. She didn't even want be near the machine. She is now some big wig at some hedge fund. Her official title is chief of web technology, or something along those lines. I haven't gone one day where I haven't thought about her. Crazy?

I look back, and I couldn't get anyone interested in my computer. It was like people just hated them? Couldn't figure out why. I wasen't a big tech guy either, and only used the word processing program. I remember thinking, I guess the computer thing will never take off. I was so wrong.

I remember my last visit to the retail store that sold me the Atari, and I could tell they were about to close up shop. My last memory is of a big guy with a beard, and he looked so depressed. All the computers were nicely displayed, but no customers. They even turned off the shopping music. There was this weird silence, and a feeling of doom. I think the shop stayed open a few more weeks, and it turned into some shop that made custom doors?

Looking back, I just have fond memories of my college girlfriend. I wish I treated her better. I think I have holiday blues?

> My grammer was just horrid. My spelling was great though.

I'd say it was the opposite ;)

Back then you had fixed hardware platforms, much like gaming consoles. The A500, besides some additional RAM, stayed the same hardware config for the duration.

The PC of today barely has the BIOS in common with the PC of that era.

This seems to have made developers lazy, as performance problems could always be fixed with a hardware upgrade (which is why it seemed that MS and Intel were in cahoots about some kind of tick-tock plan).

Also, the net happened, and with it an increased focus on security (to the point of paranoia, perhaps). Where before a program could pull all kinds of "shenanigans" to maintain performance, these days it would throw multiple security violation errors. The end result is that the computer is "wasting" cycles doing all kinds of checks where before it was full speed ahead.

The arrow of mainstream time is backward most of the time.

I often wonder the same, so much seems like a cyclical re-invention of the same things over and over.

The author comments here at HN pretty often: https://news.ycombinator.com/threads?id=kabdib

Sorry, it's on GoDaddy. Every once in a while my blog gets popular and . . . well, I guess I should move it off of GoDaddy.

> "[What’s that about Linux? Dear child, Linus was probably not out of whatever they use for high school in Finland."

I think not even that... I think the Atari ST came out in 1985, and if I remember right, Linus started working on Linux in 1991, so it was not even in the picture at that time.

Linus had a Sinclair QL at the time, also a Motorola 6800x; sadly the x was in the wrong place (it was a 68008), making the whole computer a pile of garbage.

PCs were too weak to run UNIX in the 1980s. Microsoft was the vendor for Xenix, a 16-bit UNIX. It was so slow compared to PDP-11 UNIX. 32-bit x86 CPUs and Linux came out at about the same time in the early 90s: a successful marriage.

Ah...Microsoft and Xenix. Those were responsible for the most annoying meeting I have ever been forced to attend.

When AT&T and Sun were doing their joint Unix stuff with SysVR4, they had a deal with Microsoft to add 286 Xenix compatibility. Microsoft contracted with Interactive Systems Corporation (ISC) in Santa Monica, CA, to do the work (which made sense, because as part of ISC's contract to do the 386 port of SysVR3 for AT&T, ISC had done a 286 Unix binary compatibility system).

I worked at ISC at the time, and the 286 Xenix compatibility was done by Darryl Richman and myself. At one point Microsoft told us that there was a serious issue that had to be decided, and it was too important to handle over the phone or email. They wanted both Darryl and me to fly to Redmond. This annoyed me, because I strongly avoid flying [1]. (During the flight, though, I did see something interesting. Over Northern California I saw a black aircraft very rapidly ascending. As far as I could tell, we were at around the right latitude for Beale AFB, which was a major starting point for SR-71 and U-2 spy missions, so one of those could have been what I saw.)

So we finally get to the meeting, and they bring up the massive issue that could only be dealt with in person. There were one or two differences in signal behavior between Xenix and SysVR4 that could not be taken care of in our user mode code. They would need to add an option to the kernel to switch to Xenix behavior for programs running under our Xenix emulator. That was no problem--they knew how to do that.

What they did not know, and needed this urgent face to face meeting to discuss, was whether the interface to that option should be an ioctl or a new system call.


We told them to go with an ioctl (if I recall correctly), they said that would be fine, and the meeting was over.

If that wasn't annoying enough, when the project was done Microsoft said they were very happy with the results and they expressed this by sending Darryl a bunch of free Microsoft software.

They sent me nothing.

Now if Darryl had been the project lead or principal engineer or something like that on the project, that would have been fine. But we were equal...and I actually was the more experienced in that area, because the 286 Xenix emulation was based on the 286 Unix emulation, which had been done by Carl Hensler and me.

[1] I believe everyone is allowed, and should have, one stubbornly irrational fear, and I picked for mine flying on aircraft that I am not piloting.

I ran Interactive UNIX on a 16MB 386 in the late 80s, it was a nice OS.

The 808x had no protected mode or extended memory; that came with the 80286, beginning the IBM AT generation. (The 80186, which hardly anybody used in PCs, added integrated peripherals but still no protected mode.)

AT&T had an early UNIX PC series, the 6300 (8086, in a rebranded Olivetti unit) and 6300 Plus (80286). I think the former only ran DOS and Venix, but they had a System V implementation for the Plus, IIRC.

I ran Minix on a 8088 with 640K RAM and a 5.25" floppy drive.
