
I'm a bit sad that the Amiga never caught on much in the US... at least not in the home market. Looking back at computer history, it's almost surreal to see how much better it was than virtually all the competition (with the possible exception of NeXT), and how it still managed to lose the war to Windows.

I've played with the Amiga a lot via emulation, and it's still impressive to me; a home operating system with preemptive multitasking in the 80's?! With something like that, whoever was in charge had to work pretty hard to ruin it.




The Amiga in 1985 had amazing custom graphics and sound chips, a preemptive multitasking OS, and double the RAM of the Mac 128K at half the price (the Mac 128K's MSRP is about $6,000 in 2018 dollars).

It’s really sad how much public hagiography is made over the Mac when almost no middle class family I knew of could afford it, certainly not with a LaserWriter.

The Commodore 64 was way more affordable and got a legion of kids interested in computing and coding, who later went on to adopt Amigas.

Even today if you look at the home brew, hacker, and demo scenes, Commodore dominates. Hardly anyone is doing stuff on old Apple 2s or Macs.

Commodore gets the short shrift in the Twitterati retelling of the personal computer evolution, and today's millennials are completely fixated on Jobs and Apple, ignoring most of what was really happening in the 80s with home users.


> how much public hagiography is made over the Mac when almost no middle class family I knew of could afford it,

Part of the late '80s/early '90s revenue strategy for Apple was to sell into the educational market. The people who fondly remember the Apple computers of this period do so not because they had one at home, but because many of them were young children at this time, playing games on those Apples in school computer labs.


We had an Apple II in the classroom; we were allowed to play with it in pairs for maybe an hour every fortnight.

Our teacher had almost no idea how it worked and there was no manual or instructions, so basically we sat in front of that green screen, confused. I managed to make a star in Logo; that was about it.

I still loved playing with it anyway, this machine that you had control of through arcane commands. It felt like you could make it do anything if you only knew the magic command.


That might have been the case in the US, but Apple was nowhere to be seen around most of the rest of the world.

So the "Twitterati retelling" critic still applies.


Well before there were Twitterati, the Silicon Valley glitterati fell in love with the Mac as a concept: a computer designed from the ground up to be easy to use, present a single, consistent interface, and be an appliance with minimal cognitive engagement from the user. Like a Yoko Ono piece, the vision was the product -- even if the actual hardware was lacking and expensive. And in the Mac conceptual world of the time, generating interest in computers and programming was seen as an anti-feature. You shouldn't have to be interested in those things to leverage the full power of the Mac, and if large numbers of people were getting interested in those things, the wrong things were being optimized for. Programming is just a job, and computers are just a tool to enable you to do your real work. Such was the thinking of the day.

The Amiga was an entirely different class of machine, designed by and for engineers, and it was a bit rougher around the edges UI-wise but it did far more in terms of real, concrete advancements in the state of the art.


For the early '90s, there was nothing rough around the edges of AmigaOS 3.1. It was a fast, elegant, and highly extensible operating system, thanks to DataTypes and shared libraries.


AmigaOS was tremendously powerful, but Macintosh System (as it was called then) had much more UI polish and could be operated with one mouse button (this was important!). The official programmer's reference manual, Inside Macintosh, contained strict rules for how an application should look and behave. By contrast, on the Amiga, some great UI frameworks existed but they looked rougher and a lot of people seemed to roll their own UI and play by their own rules anyway. To me this was part of the Amiga charm, but it was inimical to the vision of computing Apple was selling.

I didn't realize just how weak the Mac APIs were for building actual applications until I tried writing one. You NEED a framework like PowerPlant in order to contend with the very primitive primitives Apple supplied. And even then, you don't get nice things like preemptive MT.


Applications on the Amiga all had different interfaces because intuition.library left infinite possibilities open, since it only implemented graphics primitives on top of graphics.library.

I'm an Apple guy now but I will never get used to a one-button mouse - it's far too limiting, especially when one comes from UNIX, where three buttons are phenomenally useful: I love marking with the mouse and pasting with the middle mouse button - it's the best. I cannot figure out why others haven't implemented that; it's so natural and intuitive, and I love that I don't have to explicitly cut and paste - marking is enough.


The Apple ][ was the school computer and the upper-middle-class computer. Commodore, Atari, Sinclair, Radio Shack, and TI (when the TI-99/4A went on sale) were the computers that introduced computing to a whole generation of families without the income to afford Apple. It's telling that both Atari and Commodore sold more computers than Apple for a lot of years.

It's a shame that both Commodore and Atari forgot what their niche was.


You nailed the description. At my school they had one Apple ][ and this infuriating rule that only kids who had an Apple computer at home could use it. Us poor kids had to use TRS-80 model 3s. I carried an anti-Apple grudge for years and years because of that policy.


That is really a horrible policy. What were they thinking?


At my UK state school in the 80s, they had a similar policy for musical instruments. They only gave music lessons to those that already were having private lessons (!)


The UK is a very class-based society. You must stick to your class.


The kids not familiar with it would mess it up?


I mean good? I learned a lot by messing up stuff. School is a great place to make plenty of mistakes.


Especially on those machines: You could simply press the reset button and it would go back to a pristine state. There was no hard drive or other state that persisted across reboots.

The worst you could possibly do was destroy a program stored on disk or a cassette.


But they were expensive and the teachers didn't know how to fix them. At my school it was a similar setup: access to the computer lab was guarded like the crown jewels, and the one guy who could program never said a word to me. I got a 486 the year after, and at that point I couldn't care less about the old boxes we had at school. Pity they didn't even touch on programming; it was all word processing and making bad computer art (easy to mark, and the winner got to print theirs out in colour!).


I get the feeling my school age self would have been filled with a bit of rage at the teacher and the students who got to use the Apple computers. The TRS-80 Model 3 wasn't bad, but it wasn't exactly the funnest machine.

Well, I would guess that would teach a lesson, but probably not the one they wanted.


I'm also saddened when people omit mention of Sir Clive Sinclair's ZX Spectrum, another affordable little wonder that was responsible for bringing the other half of the world into computing. :)

https://en.wikipedia.org/wiki/ZX_Spectrum


My first computer, and something that is often referenced in the UK, or amongst people of the right kind of age.

The Spectrum was insanely popular, though the development of a new machine (the SAM Coupé) coincided with difficult times.


The SAM Coupé was a technical marvel; a pity it had some issues with the first models and arrived at the very end of the 8-bit generation.


That's because the Sinclair ZX Spectrum's only claim to fame is its rock-bottom pricing: it did not have any revolutionary hardware or operating system like the Amiga. It couldn't do things the competition wasn't capable of. The only thing the Spectrum competed on was price.


Keep in mind that the Amiga in 1985 was hard to get, due to production problems.

Reading the specs doesn't give you much of a story. As the article discusses, around 1986 software availability became the prime driver of platform sales. That meant taking a different approach to designing hardware. It takes a long time to figure out how to work with custom chips well, which is why the Amiga is still popular in the demoscene, but it was also a contributing factor in why there was not as much software available when it came out.

Macs and PCs are pretty boring by comparison. A CPU, some memory, interrupt controllers. But they had software. Macs had educational discounts back then (and still do, but not quite as dramatic).


I actually had a hand-me-down C64 as a kid, so I couldn't agree more. It's a bit bizarre; the C64 and Apple II were around at about the same time, but the C64 had more of, well, everything, for a cheaper price. It sort of baffles me that Apple even stood a chance against them.


The Apple II came out 5 years earlier, an eternity in tech-time.


Ah, looks like you're right. I knew I should have looked it up before I made that comment. My bad!


You aren't wrong; though the Apple ][ came out earlier, it had an extremely long lifespan and coexisted with the C64 for many years. I remember playground debates about which was better.


Really I guess that the Apple ][ was a platform rather than just a single machine.

So you've got the original Apple ][, which the C64 easily bettered with the benefit of 5 years and super-aggressive cost cutting. However, you've also got the Apple IIgs in that family.

I think I'd see that as a much better machine than the C64 (but obviously much more expensive too).


Eh, your memory isn't as wrong as you may think. The C=64 was released during the Apple II+/Apple IIe era, and frankly, both of those machines were barely improvements. To give you an idea, the biggest improvement offered by the IIe was... support for lower-case characters.

The Commodores genuinely were something special for the era.


> Commodore gets the short shrift in the Twitterati retelling of the personal computer evolution

Isn't that just because they had no actual impact on the industry? The Apple II was first as a mass-consumer microcomputer (they even showed a prototype of it to Commodore years before the PET, when Tramiel still thought calculators were the thing). The Mac inspired every popular GUI that came after it (they all look more like the Mac than like the Xerox machines).

You could say that Commodore inspired a generation since they were so cheap and everyone had one, but you could also say that about Dell and Gateway 2000. The Amiga had amazing hardware, but so did the Sharp X68000 - and still, the whole concept of proprietary chips that software had to be written exclusively for has never had a long-lasting impact on the PC space.

Of the PCs and smartphones we're using today, how much can be traced to Commodore? The price? The approach to vertical integration, maybe?


As Apple fans like to say, being first isn't always important, is it? The Apple II may have been first, but the Commodore 64 was both more affordable and more widely adopted.

Apple computers weren’t Personal Computers because precious few people owned them at Home, they were time share systems you got to use at school labs or the office.

What impact did Commodore have? An entire generation of engineers who went on to work on graphics software and hardware you use today were hacking C64s and Amigas as kids.

Do you think Linus Torvalds got started on Commodore hardware or Apple? You can directly trace a lot of the modern hacker ethos of the internet to the kids who grew up in the 80s on non-Apple hardware.

If you want to credit long-lasting inventions that are part of software and hardware today, you can look at Alan Kay and thank him for Smalltalk (which Brad Cox based Objective-C on).

Or you can look at Kernighan/Ritchie/Pike.

Much of what makes modern PCs and smartphones what they are is invisible. There would be no iPhone without those people; no matter how important and revolutionary you think capacitive touch interfaces are, they stand at the top of a deep, deep pyramid of inventions and innovations that did not come from Apple and that, annoyingly, often aren't credited in the historical retelling and hagiography.


A general-purpose operating system. As a graphical business OS it was lovely to develop for. It made sense, with a delightful conciseness that Windows and X could only dream of. Executables of equivalent capability could, quite literally, be an order of magnitude smaller.

There were few warts: AmigaDOS, because of its BCPL roots (it was an extremely last-minute addition when the planned CAOS never materialised), and icons, which were a pain to work with. Thirty years of working on other things and I still believe they got 95% just so. The plug and play of Win 95 was pathetic compared to AutoConfig (IIRC that's what it was called) in Zorro 1. MFM and IDE hard drives compared to SCSI on its own DMA channel, etc. Stuff that took decades to arrive on Windows.

As for a reboot - I can imagine an Amiga-like OS experience on several platforms, but hardware? I find it difficult to even imagine how something could deliver the kind of quantum leap the Amiga was over everything else on the market under $50k.


Are there any architectural features from AmigaOS (whether kernel components, syscall ABI, or OS libraries) that would be an improvement even compared to today's OSes?

Or: if one were building a hobbyist OS today, what would be the key takeaways to pull from AmigaOS?


See this excellent answer by vidarh to the same question in 2015:

https://news.ycombinator.com/item?id=9935892

Highlights:

* DataTypes

* file system assigns (a toy sketch of the idea follows the list)

* AREXX scripting of many/most programs
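
For anyone who never used them: an assign is just a logical name that resolves to a real location, so software can open "FONTS:foo" without caring where FONTS: actually lives. Here is a rough toy of the concept in C - nothing like the real dos.library implementation, and the paths are invented - just to show the idea:

    #include <stdio.h>
    #include <string.h>

    /* Toy table of "assigns": logical name -> real directory. The real thing
       lives in dos.library and can be changed at runtime with the Assign
       command; this only illustrates the concept. Paths are made up. */
    struct assign { const char *name; const char *target; };

    static const struct assign assigns[] = {
        { "FONTS:", "/data/work/myfonts/" },
        { "LIBS:",  "/data/system/libs/"  },
    };

    /* Resolve "FONTS:topaz.font" -> "/data/work/myfonts/topaz.font". */
    static void resolve(const char *path, char *out, size_t outlen) {
        for (size_t i = 0; i < sizeof assigns / sizeof assigns[0]; i++) {
            size_t n = strlen(assigns[i].name);
            if (strncmp(path, assigns[i].name, n) == 0) {
                snprintf(out, outlen, "%s%s", assigns[i].target, path + n);
                return;
            }
        }
        snprintf(out, outlen, "%s", path);   /* no assign matched: pass through */
    }

    int main(void) {
        char real[256];
        resolve("FONTS:topaz.font", real, sizeof real);
        printf("%s\n", real);                /* /data/work/myfonts/topaz.font */
        return 0;
    }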


It's always fun when my old comments get dug up...

Actually, recently I've started thinking about what it would take to create a toolchain that does the minimum needed to provide the pieces of AROS (for the uninitiated: an AmigaOS "reimplementation", though it goes beyond the original in some respects) that might make sense on Linux, plus a compatibility layer to make them work.

AROS itself can run hosted on Linux, but it doesn't integrate well with Linux apps. However, quite a lot of AROS relies on relatively small subsets of the AmigaOS API, and it'd be a fun experiment to e.g. bring DataTypes to Linux, possibly combined with a FUSE filesystem as a fallback to do on-the-fly conversions for apps with no support.

I'd love to see if some of those things could gain traction if suitably modernized.


I am thinking about what would make something revolutionary today. The only, but very important, thing I can think of is that software today has no time constraints. All interfaces feel sluggish at some point.

What if we could make an OS with constraints, or an app store with a vetting process, or both complementing each other, to the effect that:

A widget pressed or touched or interacted with could always be trusted to respond in time - or fail in an understandable manner.

- No launching screens on touch interfaces suddenly being sluggish.

- No waiting for apps to download and install, during which time they can't be used. (Solved by having updates installed quietly in the background.)

- No stutter or slowdowns, ever, no audio lags, ever.

The main thing should be that what you are interacting with must never feel sluggish, any more than water flowing out of a faucet starts to lag or freeze/unfreeze suddenly. The interface should feel so solid and "real" that if it stuttered, you would be as shocked as if a thrown ball in real life stuttered in mid-air.

Possible means

: Give GUI code very high priority. This will have to involve putting some intelligence in GUI code, or the interface will appear to be unresponsive or do strange things when underlying IO or network is being slow.

: Focus on determinism and time budgets, not raw performance throughput

: Vetting of applications

: Constrain apps to hard RAM budgets

: IO budgets for apps?

: Have apps allocate and bid for network performance and available bandwidth

I have a feeling much of this would not need a ground-up rewrite; Android or Linux could probably be used as a basis for such a system. (A rough sketch of the thread-priority idea is below.)
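
On the "give GUI code very high priority" point: Linux already lets you do a crude version of this today. A sketch, assuming a Linux box and accepting the usual caveats (it needs root or CAP_SYS_NICE, and a runaway real-time thread can starve the rest of the machine):

    #include <pthread.h>
    #include <sched.h>
    #include <stdio.h>
    #include <string.h>

    /* UI thread: the event loop would live here - poll input, redraw,
       and never block on disk or network (that work goes elsewhere). */
    static void *ui_thread(void *arg) {
        (void)arg;
        return NULL;
    }

    int main(void) {
        pthread_t ui;
        pthread_create(&ui, NULL, ui_thread, NULL);

        /* SCHED_FIFO priorities run 1-99; pick something mid-range so we
           don't starve kernel threads. Fails with EPERM without privileges. */
        struct sched_param sp = { .sched_priority = 50 };
        int err = pthread_setschedparam(ui, SCHED_FIFO, &sp);
        if (err)
            fprintf(stderr, "pthread_setschedparam: %s\n", strerror(err));

        pthread_join(ui, NULL);
        return 0;
    }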


The big lesson from AmigaOS here is to thread everything and make message passing a cheap and easy mechanism, but also to make developers develop on, or at least test on, very low-end hardware.

Even a simple button press involves half a dozen threads in AmigaOS: the drivers, the handlers "baking" raw events from the drivers into something higher level, the Intuition thread processing the button presses into events related to a specific button, etc. It affects total throughput but increases responsiveness.

I think that if the OS provides responsiveness, and some key apps do, people will demand it. That is what happened with AmigaOS. You didn't get away with making a system sluggish because you'd get compared to apps that were extremely responsive.
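
To make the "cheap message passing" point concrete, here's roughly the shape of the pattern in plain pthreads (this is not the exec.library API, just an illustration): the message itself is never copied, only a pointer changes hands, so handing an event from one thread to the next costs a couple of pointer writes plus a wakeup.

    #include <pthread.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* A toy "message port": a mutex-protected list plus a condition variable.
       The property that matters is that only the pointer changes hands. */
    struct msg  { struct msg *next; int code; };
    struct port {
        pthread_mutex_t lock;
        pthread_cond_t  signal;
        struct msg     *head;
    };

    static void put_msg(struct port *p, struct msg *m) {
        pthread_mutex_lock(&p->lock);
        m->next = p->head;                   /* O(1): link the pointer in      */
        p->head = m;                         /* (LIFO for brevity; a real port */
        pthread_cond_signal(&p->signal);     /*  would preserve FIFO order)    */
        pthread_mutex_unlock(&p->lock);
    }

    static struct msg *wait_msg(struct port *p) {
        pthread_mutex_lock(&p->lock);
        while (!p->head)
            pthread_cond_wait(&p->signal, &p->lock);
        struct msg *m = p->head;
        p->head = m->next;
        pthread_mutex_unlock(&p->lock);
        return m;
    }

    static struct port button_port = {
        PTHREAD_MUTEX_INITIALIZER, PTHREAD_COND_INITIALIZER, NULL
    };

    /* Stands in for the input chain: driver -> handler -> UI. */
    static void *input_handler(void *arg) {
        (void)arg;
        struct msg *m = malloc(sizeof *m);
        m->code = 42;                        /* "button pressed" */
        put_msg(&button_port, m);
        return NULL;
    }

    int main(void) {
        pthread_t t;
        pthread_create(&t, NULL, input_handler, NULL);
        struct msg *m = wait_msg(&button_port);
        printf("got event %d\n", m->code);
        free(m);
        pthread_join(t, NULL);
        return 0;
    }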


Excellent answer. To which I'll add:

IFF standards. In today's world it would be an unthinkably open approach, taken by open source only. Even more surprisingly, it came from a joint venture between Commodore-Amiga and EA! Every single graphics program understood, and via datatypes understood in a standard way, IFF graphics: saving, processing or reading. They were so prevalent and expected that you would be hurting your chances releasing something with a proprietary-only format. The same went for sound and no end of other things. Had the Amiga thrived, there'd no doubt have been an IFF in place of many of the multimedia formats, with a standard OS-level library call to decode them.

The conciseness of approach, necessary in a system providing proper multitasking in 256K, meant all the services other platforms placed at least partially in the .exe were usually in the OS. There were system libraries you could rely on without the absurd version-dependent DLL hell of Windows. I'm sure had the Amiga persisted there'd be some version annoyances, but I can't imagine it reaching the stupidity of now.

Windows had far more than glue code in the exe. If you needed to use a file requester, accept messages, have a window that could be resized, etc., there was tons of unique OS-related code in the exe for all of that. Update the OS, and unless you updated the source and rebuilt the exe, it would clearly and obviously still look like the previous Windows release. Or, as was so often the case, buy the latest Office and the look was clearly that of the next, unreleased Windows. The Amiga had it such that all that rubbish was nearly all external: set up your structures, call the API, get woken up when there's something you need to care about. If you updated the OS, all your window chrome, file requesters, etc. would be of the new OS. No ridiculous dependencies on v3.2.152 of MSVC.dll, with 3.3.x not being acceptable, meaning you end up with 12 different installed versions.

Only apps doing something clever needed their own machinery - like CygnusEd with its hand-written assembler scrolling that remains, 30 years later, my benchmark for "fast and smooth enough" editor scrolling. Essentially nothing has yet matched it, though Sublime is probably closest, just without smooth scrolling. It was really difficult to come to accept - in some sense I still haven't - that I had to do so much of this OS housekeeping for myself each and every time, in every application, on other platforms. I often used to wonder what Windows was, in fact, adding, as it always seemed like I was doing everything myself. I gave up completely on Windows programming pretty quickly as a result. :)

AmigaDOS may have been a bit of a last-minute, ugly add-on, but in use it felt like a lightweight single-user *nix. Proper filenames, priorities, comments, proper scripting, and ARexx if you needed additional integration. Sure, it was far happier on an HDD, but what aside from DOS - more a program loader than an OS - wasn't? :)


IFF actually lives on, in large part thanks to Microsoft and IBM. RIFF is basically a little-endian IFF, and is used for AVI, WAV and WebP, among others.

What hasn't lived on, of course, is a concerted push for an ecosystem of tools for working with the underlying container format instead of the specific formats. This is what made the biggest difference on the Amiga: to a great extent, when coming up with a storage format, the question was increasingly "which IFF chunk types are suitable?" rather than a question of designing a format from scratch.
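
For anyone who has never looked inside one: both formats are just a stream of (4-byte ID, 4-byte size, payload, pad to even) chunks, with IFF storing sizes big-endian and RIFF little-endian. A bare-bones chunk lister, ignoring nested FORM/LIST groups and most error handling (sketch only):

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Minimal IFF/RIFF chunk lister. Chunks are: 4-byte ID, 4-byte size,
       payload, padded to an even length. IFF sizes are big-endian; RIFF is
       the same idea with little-endian sizes. */
    static uint32_t get_size(const unsigned char b[4], int little_endian) {
        return little_endian
            ? (uint32_t)b[0]       | (uint32_t)b[1] << 8 |
              (uint32_t)b[2] << 16 | (uint32_t)b[3] << 24
            : (uint32_t)b[3]       | (uint32_t)b[2] << 8 |
              (uint32_t)b[1] << 16 | (uint32_t)b[0] << 24;
    }

    int main(int argc, char **argv) {
        if (argc < 2) { fprintf(stderr, "usage: %s file\n", argv[0]); return 1; }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        unsigned char hdr[12];                    /* outer ID + size + form type */
        if (fread(hdr, 1, 12, f) != 12) return 1;
        int le = memcmp(hdr, "RIFF", 4) == 0;     /* "FORM" means classic IFF    */
        printf("container %.4s, form type %.4s\n", (char *)hdr, (char *)hdr + 8);

        unsigned char ch[8];
        while (fread(ch, 1, 8, f) == 8) {         /* walk the sub-chunks */
            uint32_t size = get_size(ch + 4, le);
            printf("  chunk %.4s  %u bytes\n", (char *)ch, size);
            fseek(f, (long)(size + (size & 1)), SEEK_CUR);  /* skip payload + pad */
        }
        fclose(f);
        return 0;
    }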


PNG is a very similar format to IFF, though for some reason, despite having essentially the same needs and despite the PNG working group being aware of IFF, they chose to be incompatible with IFF.

Nerdier trivia: Erlang's BEAM VM emits compiled bytecode files in an IFF format. (Which is a strange choice, honestly, since they could have easily chosen to use a purpose-made executable-binary container format like ELF, which would have made .beam files more amenable to analysis using standard compiler toolchain tools.)


Didn't know that about Erlang. The reason is probably that ELF wasn't that widespread until the late 90s. When I started using Linux around '94, a.out was still common. It took several more years for ELF to become dominant.


Don't forget about AIFF, as well as Maya's variant of IFF.


and AT&T, DjVu ebook format is IFF based.


" ... like CygnusEd with its hand written assembler scrolling that remains, 30 years later, my benchmark for "fast and smooth enough" editor scrolling. Essentially nothing has yet matched it ..."

Here it is: https://www.youtube.com/watch?v=L41oIvre9K0

For those who never used CED on an Amiga, this is how it ran on hardware that is now 30 years old, clocked at 8 MHz.


Well "improvement" is a bit of a loaded word, but AmigaOS's exec kernel was a microkernel, and one of the few to pull it off without many problems. Compared to Linux's monolithic, or Windows/Apple's hybrid Mach thing, it's actually something that's still a bit uncommon.

It's still experimental, but RedoxOS is really the only newish OS I know of that uses a microkernel design.


I would argue that a major defining characteristic of AmigaOS was that it ran in the same flat address space and at the same privilege level as the applications. As a result, message passing is lightning fast (just pass a pointer) and applications can easily obtain direct access to hardware. This has obvious downsides as well: an unstable app takes down the whole system, and there is no security whatsoever.


Hardware memory protection only came into being with, IIRC, the 68030 - edit: though it was available as the separate 68851 coprocessor chip for the 020. Early Windows was no different; limited memory protection first came with the 286, wasn't it? BSODs just as often, instead of Guru Meditations - and at least a guru let you into a remote debugger. :)

Without that, it wasn't hard or unexpected for an unstable app to take down the system. It used to happen reasonably frequently on 68000 Unix systems. Certainly, for every time that happened you might expect a couple of caught core dumps, but before hardware protection it was still a wing and a prayer...


Think it was the 386 that first had protected mode...


Indeed it was. And it took until Windows NT and the various PC Unixes to properly utilize it. Windows 95/98/ME were ostensibly running in 32-bit protected mode, but apparently could switch back to the old non-protected 16-bit mode to run old applications and drivers, compromising stability of the entire system.


The 286 had it too, but no common OS used it. Too limited.


Windows and OS/2 used it. A bunch of Unixes did as well. It was only limited if you wanted to run multiple DOS applications at the same time and switch between them.


Whatever Windows did, it was not good enough, and it was way too easy to crash 16-bit Windows from within applications. Yes, I remember now that I heard about 286 OS/2. But it was hardly common, even though it was cool. I was thinking of MINIX, which IIRC could use memory separation on the 286 (but not on the 8088/8086). Still, you could only use 64 kbyte segments, limiting your data set a lot. You could not do the "large" model of up to half a meg or so that you could in DOS.


Versions of MS-Windows before Windows NT used "cooperative multitasking", in which it was the responsibility of each process to yield CPU time to the next process in the task queue. Compare this with the "preemptive multitasking" employed by UNIX, OS/2 and AmigaOS, in which an interrupt causes the OS to save registers, the stack pointer, etc. and transfer control to another process (if needed) after each quantum.

If a Windows 3.1 process failed to yield, it could result in a nonresponsive OS. On Linux, an abusive process would have to try a bit harder to take down the system (fork bomb, hog a bunch of RAM, etc). On AmigaOS, a process could just overwrite part of another process or the OS itself and cause a crash.
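
If it helps to picture the difference, here's cooperative multitasking reduced to a toy: each "task" does a bit of work and yields by returning, and the loop is the entire scheduler. This is nothing like the real Windows 3.1 scheduler, but it shows why one misbehaving app froze everything:

    #include <stdio.h>

    /* Cooperative multitasking in one picture: each "task" is a function that
       does a little work and then yields by returning. The loop in main() is
       the whole scheduler. If any task spins forever instead of returning,
       every other task is starved - which is what a hung Win 3.1 app did to
       the rest of the system. Preemption fixes this by letting a timer
       interrupt take control away whether the task cooperates or not. */
    static void task_ui(void)    { puts("UI: handle one event"); }
    static void task_print(void) { puts("spooler: send one page"); }
    static void task_rogue(void) {
        puts("rogue app: busy loop, never yielding...");
        for (;;) { }                        /* never returns: everything below stops */
    }

    int main(void) {
        void (*tasks[])(void) = { task_ui, task_print, task_rogue };
        for (;;)                            /* round-robin "scheduler" */
            for (unsigned i = 0; i < sizeof tasks / sizeof tasks[0]; i++)
                tasks[i]();                 /* a task "yields" by returning */
    }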


No, protected mode was introduced with the 286.


Ah, yes, the (in)famous Guru Meditation errors. The Amiga had lots of quirky things like that; it's what really made it special to me.


Other platforms had their own.

RISC OS (Acorn) had its infamous "Abort on data transfer" (or "Abort on instruction fetch" if you branched instead of LDR/STR'd). And if you were especially naughty and chased a null pointer, you got "ofla" -- which was the contents of the first four bytes of memory!


Yikes! Would never fly today, I don't think.


A similar architecture is used in embedded operating systems like some versions of VxWorks, and in RTEMS.


I remember telling my computer science teacher how the floppy disk file system worked (a directory was a linked list of sectors, each one of which represented the head of a file, IIRC) and he refused to believe anyone would implement it like that due to the obvious perf issues.
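
To see why he balked: a toy model of the structure as described (not a claim about the exact OFS on-disk layout) - every directory entry costs its own block read, i.e. its own floppy seek.

    #include <stdio.h>

    /* Toy model only: a directory is a chain of blocks, each holding one file
       header plus a link to the next. Listing N files therefore costs N
       separate block reads - N seeks on a floppy - which is why directory
       listings crawled. */
    #define NO_BLOCK (-1)

    struct block { const char *filename; int next_block; };

    /* Stand-in for the floppy: each read_block() call would be one seek. */
    static const struct block disk[] = {
        { "startup-sequence", 1 },
        { "c",                2 },
        { "libs",             NO_BLOCK },
    };

    static void read_block(int n, struct block *out) {
        *out = disk[n];                     /* on real hardware: a slow seek + read */
    }

    int main(void) {
        int reads = 0;
        struct block b;
        for (int n = 0; n != NO_BLOCK; n = b.next_block) {
            read_block(n, &b);
            reads++;
            printf("%s\n", b.filename);
        }
        printf("(%d block reads for %d entries)\n", reads, reads);
        return 0;
    }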


Yeah, that was a terrible idea and was later dropped with FFS.


The ease with which you could drop in new filesystem drivers is another one of those things that was great. Aside from the official FFS, there have been a number of other filesystems, even long after Commodore went under.


A terrible idea because they were backed into a corner. Had the third party producing CAOS delivered, there would have been no need for the insane-timescale port of Tripos that became AmigaDOS. Then floppies wouldn't have got OFS, which was an HDD filesystem hacked to fit in as little time as possible, and dos.library would have escaped the horrible BCPL mucking about with BSTRs and BPTRs too.


The ability to run fast and smoothly on a 7 MHz processor.


It's insane. All these 80s computers have silly fast response times, which put all modern machines to shame.

But since the Amiga is arguably the only one of them with a modern GUI while also being super responsive, it really points out the absurdity of everything modern when you feel it.

It can't be gleaned from YouTube videos, either. You must hold that damn old mouse in your hand and click something or drag a window. To the brain, there's zero latency. NOTHING. You ARE the computer. (I think that is one reason why it's so addictive: it's one of the truly cybernetic devices. My modern Mac comes close, but not quite. Scrolling in some phone apps comes close.)


If software had been as accessible on the Amiga through a ubiquitous marketplace, things might have been different.

For a while, you could buy an Amiga at Montgomery Wards at an outstanding price. Software came from the other end of the mall, at Babbage's or EggHead.

Boxed software at retail was limited and expensive.

It required the invention of entire manufacturing pipelines to make the product happen.

Lots of blame, and lots of reason to think it might be different today, taking advantage of today's manufacturing and compute capability.

Some very stubborn people are still at it, preventing that from happening.


Having had (and loved) a BBC B, part of me wishes I had replaced it with either an Amiga or an Archimedes, rather than the 386 I ultimately got. They both seemed very much in the spirit of the Beeb, whereas the "IBM PC" did not.



