I've played with the Amiga a lot via emulation, and it's still impressive to me: a home operating system with preemptive multitasking in the '80s?! With something like that, whoever was in charge had to work pretty hard to ruin it.
It’s really sad how much public hagiography is made over the Mac when almost no middle class family I knew of could afford it, certainly not with a LaserWriter.
The Commodore 64 was way more affordable and got a legion of kids interested in computing and coding, who later went on to adopt Amigas.
Even today if you look at the home brew, hacker, and demo scenes, Commodore dominates. Hardly anyone is doing stuff on old Apple 2s or Macs.
Commodore gets short shrift in the Twitterati retelling of the personal computer evolution, and today’s millennials are completely fixated on Jobs and Apple, ignoring most of what was really happening in the 80s with home users.
Part of the late '80s/early '90s revenue strategy for Apple was to sell into the educational market. The people who fondly remember the Apple computers of this period do so not because they had one at home, but because many of them were young children at this time, playing games on those Apples in school computer labs.
Our teacher had almost no idea how it worked and there was no manual or instructions, so basically we sat in front of that green screen confused. I managed to make a star in Logo; that was about it.
I still loved playing with it anyway, this machine that you had control of through arcane commands. It felt like you could make it do anything if you only knew the magic command.
So the "Twitterati retelling" critic still applies.
The Amiga was an entirely different class of machine, designed by and for engineers, and it was a bit rougher around the edges UI-wise but it did far more in terms of real, concrete advancements in the state of the art.
I didn't realize just how weak the Mac APIs were for building actual applications until I tried writing one. You NEED a framework like PowerPlant in order to contend with the very primitive primitives Apple supplied. And even then, you don't get nice things like preemptive MT.
I'm an Apple guy now but I will never get used to a one-button mouse; it's too limiting, especially when one comes from UNIX, where three buttons are phenomenally, awesomely useful. I love marking with the mouse and pasting with the middle button; it's the best. I cannot figure out why others haven't implemented that. It's so natural and intuitive, and I love that I don't have to explicitly cut and paste: marking is enough.
It's a shame that both Commodore and Atari forgot what their niche was.
The worst you could possibly do was destroy a program stored on disk or a cassette.
Well, I would guess that would teach a lesson, but probably not the one they wanted.
The Spectrum was insanely popular, though the development of a new machine (the SAM Coupé) also coincided with difficult times.
So you've got the original Apple ][, which the C64 handily outclassed given the benefit of five years and super aggressive cost cutting. However, you've also got the Apple IIGS in that family.
I think I'd see that as a much better machine than the C64 (but obviously much more expensive too).
The Commodores genuinely were something special for the era.
Reading the specs doesn’t give you much of a story. As the article discusses, around 1986 software availability became the prime driver of platform sales. That meant taking a different approach to designing hardware. It takes a long time to figure out how to work with custom chips well, which is why the Amiga is still popular in the demoscene, but it's also a contributing factor in why there was not as much software available when the machine came out.
Macs and PCs are pretty boring by comparison. A CPU, some memory, interrupt controllers. But they had software. Macs had educational discounts back then (and still do, but not quite as dramatic).
Isn't that just because they had no actual impact on the industry? The Apple II was first as a mass consumer microcomputer (they even showed a prototype of it to Commodore years before the PET, when Tramiel still thought calculators were the thing). The Mac inspired every popular GUI that came after it (they all look more like the Mac than Xerox).
You could say that Commodore inspired a generation since they were so cheap and everyone had one, but you could also say that about Dell and Gateway 2000. The Amiga had amazing hardware, but so did the Sharp X68000 - still the whole concept of proprietary chips that software had to be written exclusively for has never had a long-lasting impact on the PC space.
Of the PCs and smartphones we're using today - how much of it can be traced to Commodore? The price? The method of vertical integration maybe?
Apple computers weren’t Personal Computers, because precious few people owned them at home; they were time-share systems you got to use in school labs or at the office.
What impact did Commodore have? An entire generation of engineers who went on to work on graphics software and hardware you use today were hacking C64s and Amigas as kids.
Do you think Linus Torvalds got started on Commodore hardware or Apple? You can directly trace a lot of the modern hacker ethos of the internet to the kids who grew up in the 80s on non-Apple hardware.
If you want to credit long-lasting inventions that are part of software and hardware today, you can look at Alan Kay and thank him for Smalltalk (which Brad Cox based Objective-C on).
Or you can look at Kernighan/Ritchie/Pike.
Much of what makes modern PCs and smartphones what they are is invisible. There would be no iPhone without them: no matter how important and revolutionary you think capacitive touch interfaces are, they stand at the top of a deep, deep pyramid of inventions and innovations that did not come from Apple and, annoyingly, often aren't credited in the historical retelling and hagiography.
There were a few warts: AmigaDOS, because of its BCPL roots (it was an extremely last-minute addition when the planned CAOS never materialised), and icons were a pain to work with. Thirty years of working on other things and I still believe they got 95% of it just so. The plug and play of Win 95 was pathetic compared to AutoConfig (IIRC that's what it was called) in Zorro 1. MFM and IDE hard drives compared to SCSI on its own DMA channel, etc. etc. Stuff that took decades to arrive on Windows.
As for a reboot - I can imagine an Amiga like OS experience on several platforms, but hardware? I find it difficult to even imagine how something could have the quantum leap that Amiga was compared to everything else on the market under $50k.
Or: if one were building a hobbyist OS today, what would be the key takeaways to pull from AmigaOS?
* file system assigns
* AREXX scripting of many/most programs
Actually, recently I've started thinking about what it would take to create a toolchain providing the minimum pieces of AROS (for the uninitiated: an AmigaOS "reimplementation", though it goes beyond the original in some respects) that might make sense on Linux, plus a compatibility layer to make it work.
AROS itself can run hosted on Linux, but it doesn't integrate well with Linux apps. Quite a lot of AROS relies on relatively small subsets of the AmigaOS API, though, and it'd be a fun experiment to e.g. bring datatypes to Linux, possibly combined with a FUSE filesystem as a fallback to do on-the-fly conversions for apps with no support.
I'd love to see if some of those things could gain traction if suitably modernized.
What if we could make an OS with constraints, or an app store with a vetting process, or both complementing each other, to the effect that:
A widget pressed or touched or interacted with could always be trusted to respond in time - or fail in an understandable manner.
- No launching screens on touch interfaces suddenly being sluggish.
- No waiting for apps that are downloading and installing and can't be used in the meantime. (Solved by having updates installed quietly in the background.)
- No stutter or slowdowns, ever, no audio lags, ever.
The main thing should be that whatever you are interacting with must never feel sluggish, any more than the water flowing out of a faucet starts to lag or freeze/unfreeze suddenly. The interface should feel so solid and "real" that if it stuttered, you would be as shocked as if a thrown ball stuttered in mid-air.
: Give GUI code very high priority. This will have to involve putting some intelligence in GUI code, or the interface will appear unresponsive or do strange things when underlying IO or networking is slow.
: Focus on determinism and time budgets, not raw performance throughput
: Vetting of applications
: Constrain apps to hard RAM budgets
: IO budgets for apps?
: Have apps allocate and bid for network performance and available bandwidth
I have a feeling much of this would not need a ground up rewrite. Probably Android or Linux could be used as a basis for such a system.
Even a simple button press involves half a dozen threads in AmigaOS: the drivers, the handlers "baking" raw events from the drivers into something higher level, the Intuition thread processing the button presses into events related to a specific button, etc. It costs some total throughput but increases responsiveness.
I think that if the OS provides responsiveness, and some key apps do, people will demand it. That is what happened with AmigaOS. You didn't get away with making a system sluggish because you'd get compared to apps that were extremely responsive.
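That driver-to-handler-to-Intuition pipeline works because each stage talks to the next through a message port it can post to without waiting. A rough portable sketch of that pattern, using pthreads in place of Exec's PutMsg/WaitPort/GetMsg (all names here are mine, not AmigaOS APIs):

```c
#include <pthread.h>

#define QSIZE 64

/* A tiny "message port": a bounded queue guarded by a mutex/condvar. */
typedef struct {
    int events[QSIZE];
    int head, tail, count;
    pthread_mutex_t lock;
    pthread_cond_t nonempty;
} MsgPort;

void port_init(MsgPort *p) {
    p->head = p->tail = p->count = 0;
    pthread_mutex_init(&p->lock, NULL);
    pthread_cond_init(&p->nonempty, NULL);
}

/* Like PutMsg: the producing stage posts an event and returns at
   once; it never blocks on the consumer's work. */
void put_msg(MsgPort *p, int ev) {
    pthread_mutex_lock(&p->lock);
    if (p->count < QSIZE) {            /* drop on overflow; a real
                                          system would backpressure  */
        p->events[p->tail] = ev;
        p->tail = (p->tail + 1) % QSIZE;
        p->count++;
        pthread_cond_signal(&p->nonempty);
    }
    pthread_mutex_unlock(&p->lock);
}

/* Like WaitPort + GetMsg: the consuming stage sleeps, using no CPU,
   until an event arrives. */
int get_msg(MsgPort *p) {
    pthread_mutex_lock(&p->lock);
    while (p->count == 0)
        pthread_cond_wait(&p->nonempty, &p->lock);
    int ev = p->events[p->head];
    p->head = (p->head + 1) % QSIZE;
    p->count--;
    pthread_mutex_unlock(&p->lock);
    return ev;
}
```

Chain a few of these ports together and you get the shape described above: the input side stays responsive no matter how slow the stages downstream are.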
IFF standards. In today's world it would be an unthinkably open approach, the kind taken only by open source. Even more surprising, it came from a joint venture between Commodore-Amiga and EA! Every single graphics program understood IFF graphics (saving, processing, or reading), and via datatypes understood them in a standard way. They were so prevalent and expected that you would be hurting your chances releasing something with a proprietary-only format. The same went for sound and no end of other things. Had the Amiga thrived, there'd no doubt have been an IFF in place of many of today's multimedia formats, with a standard OS-level library call to decode them, etc.
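Part of why IFF spread so widely is how simple the container is: a file is a FORM whose body is a sequence of (ID, length, data) chunks, with lengths stored big-endian and each chunk padded to an even byte boundary. A minimal reader, sketched from that structure (my own code, not anything shipped with the Amiga):

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* IFF stores all sizes as 32-bit big-endian values. */
static uint32_t be32(const unsigned char *p) {
    return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16) |
           ((uint32_t)p[2] << 8)  |  (uint32_t)p[3];
}

/* Walk the chunks inside an IFF FORM held in memory, printing each
   4-character chunk ID. Returns the number of chunks found, or -1
   if the buffer is not a well-formed FORM. */
int walk_iff(const unsigned char *buf, size_t len) {
    if (len < 12 || memcmp(buf, "FORM", 4) != 0) return -1;
    uint32_t form_len = be32(buf + 4);    /* bytes after this field */
    size_t end = 8 + form_len;
    if (end > len) return -1;
    printf("FORM type %.4s\n", (const char *)buf + 8);
    size_t pos = 12;                      /* first chunk after type */
    int count = 0;
    while (pos + 8 <= end) {
        uint32_t clen = be32(buf + pos + 4);
        printf("  chunk %.4s, %u bytes\n", (const char *)buf + pos, clen);
        count++;
        pos += 8 + clen + (clen & 1);     /* chunks are word-aligned */
    }
    return count;
}
```

A program that only cared about, say, bitmap headers could skip every chunk ID it didn't recognize, which is exactly what let old readers survive new chunk types.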
The conciseness of approach, necessary in a system providing proper multitasking in 256K, meant all the services other platforms placed at least partially in the .exe were usually in the OS. There were system libraries you could rely on, without the absurd version-dependent DLL hell of Windows. I'm sure had the Amiga persisted there'd be some version annoyance, but I can't imagine it reaching the stupidity of now.
Windows had far more than glue code in the exe. If you needed to use a file requester, accept messages, have a window that could be resized, etc., there was tons of OS-related code unique to each exe for all that. Update the OS, and unless you updated the source and rebuilt the exe, it would clearly and obviously look like the previous Windows release. Or, as was so often the case, buy the latest Office and the look would clearly be that of the next, unreleased Windows. The Amiga had it such that nearly all that rubbish was external. Set up your structures, call the API, get woken up when there's something you need to care about. If you updated the OS, all your window chrome, file requesters, etc. would be of the new OS. No ridiculous dependency on v3.2.152 of MSVC.dll, with 3.3.x not being acceptable, meaning you end up with 12 different installed versions, etc.
The only exceptions were apps doing something clever, like CygnusEd with its hand-written assembler scrolling that remains, 30 years later, my benchmark for "fast and smooth enough" editor scrolling. Essentially nothing has matched it yet, though Sublime is probably closest, just without smooth scrolling. It was really difficult to accept (in some sense I still haven't) that I had to do so much of this OS housekeeping myself, each and every time, in every application on other platforms. I often used to wonder what Windows was actually adding, as it always seemed like I was doing everything myself. I gave up completely on Windows programming pretty quickly as a result. :)
AmigaDOS may have been a bit of a last minute, ugly addon, but in use it felt like a lightweight single user *nix. Proper filenames, priorities, comments, proper scripting and ARexx if you needed additional integration. Sure, it was far happier on a HDD, but what aside from DOS - more a program loader than OS - wasn't? :)
What hasn't lived on, of course, is a concerted push for an ecosystem of tools for working with the underlying container format rather than the specific formats. This is what made the biggest difference on the Amiga: to a great extent, when coming up with a storage format, the question was increasingly "which IFF chunk types are suitable?" rather than a matter of designing a format from scratch.
Nerdier trivia: Erlang's BEAM VM emits compiled bytecode files in an IFF format. (Which is a strange choice, honestly, since they could have easily chosen to use a purpose-made executable-binary container format like ELF, which would have made .beam files more amenable to analysis using standard compiler toolchain tools.)
Here it is: https://www.youtube.com/watch?v=L41oIvre9K0
For those who never used CED on an Amiga, this is how it ran on hardware that is now 30 years old, clocked at 8 MHz.
It's still experimental, but RedoxOS is really the only newish OS I know of that uses a microkernel design.
Without that, it wasn't hard or unexpected for an unstable app to take down the system. It used to happen reasonably frequently on 68000 Unix systems. Certainly for every time that happened you might expect a couple of caught core dumps, but before hardware protection it was still a wing and a prayer...
If a Windows 3.1 process failed to yield, it could result in a nonresponsive OS. On Linux, an abusive process would have to try a bit harder to take down the system (fork bomb, hog a bunch of ram, etc). On AmigaOS, a process could just overwrite part of another process or the OS itself to cause a crash.
RISC OS (Acorn) had its infamous "Abort on data transfer" (or "Abort on instruction fetch" if you branched instead of LDR/STR'd). And if you were especially naughty and chased a null pointer, you got "ofla" -- which was the contents of the first four bytes of memory!
But since the Amiga is arguably the only computer with a modern GUI that is also super responsive, it really points out the absurdity of everything modern once you feel it.
It can't be gleaned from YouTube videos, either. You must hold that damn old mouse in your hand and click something or drag a window. To the brain, there's zero latency. NOTHING. You ARE the computer. (I think that's one reason it's so addictive: it's one of the truly cybernetic devices. My modern Mac comes close, but not quite. Scrolling in some phone apps comes close.)
For a while, you could buy an Amiga at Montgomery Wards at an outstanding price. Software came from another end of the mall at Babbage's or EggHead.
Boxed software at retail was limited and expensive.
It required the invention of entire manufacturing pipelines to make the product happen.
Lots of blame, and lots of reason to think it might be done differently to take advantage of today's manufacturing and compute capability.
Some very stubborn people are still at it, preventing that from happening.