I got really into computers in the mid-to-late 90s, by which point we were mostly down to Wintel, with Apple hanging on by a thread.
What I remember from that era is that nothing was compatible with anything else. It took a lot of work to interoperate between two PCs, let alone cross the gap between OSes. So for a long time, I have kind of taken the current world of few OSes that are highly interoperable as being a great thing: you can build your own Linux machine and still do real work with people on Windows and Mac, etc.
But the more I learn about computing in the 80s and early 90s, the more I’m impressed by the variety and diversity of ideas that came out of that era. I see now that today’s standardization has a real cost, which is that we don’t really see new ideas or new paradigms.
For the HN crowd, especially those who are older than me and can remember the earlier era of computing, what do you think about that trade off and where we’ve ended up today?
Are we better off living in a world where all computers can work together with minimal fuss? Or would it be better if we still had a wide range of vendors taking significantly different approaches and innovating at a much faster pace - albeit in incompatible ways?
Personally I prefer the plethora-of-OSes approach, and I have never been a big UNIX fan to start with.
Yes it does have a couple of nice ideas, and it was much better to use than Windows 3.x + MS-DOS, but that is about it.
All the UNIX-based OSes that I really appreciate have moved beyond it, namely NeXTSTEP, Solaris/NeWS, Irix, and Plan 9/Inferno.
Thankfully only BSD and GNU/Linux are stuck being a continuum of UNIX clones without much to add; when you look at their conferences, it always boils down to kernel features or filesystems.
GNU/Linux has the ingredients for an Amiga-like desktop experience, with D-Bus and GNOME/KDE, but the fragmentation and the love for POSIX CLI applications just don't make it worthwhile to try to make it work.
Look at iOS, macOS, Android, Windows (UWP), GenodeOS, Fuchsia, Haiku, and Azure Sphere for the pursuit of modern ideas in OS research. The fact that some of them have a POSIX-like kernel (or a deeply customised Linux kernel) is just an implementation detail of the overall architecture.
With the exception of Plan 9, all the OSes you named are pure Unixes. As an example, FreeBSD is nearly a mirror of Solaris. NeXTSTEP was pure BSD on a Mach kernel (OO development was the big selling point of NeXTSTEP)... Irix? Well, OpenGL and XFS are available under Linux.
>iOS, macOS, Android, Windows (UWP), GenodeOS, Fuchsia, Haiku, Azure Sphere for the pursuit of modern ideas in OS research
Narrowing your understanding to the kernel delivered with the UNIX variants I mentioned only reveals a lack of understanding of what the user and developer experience feels like in those OSes.
Hint, it goes beyond doing CLI applications and daemons.
BeOS is 25 years old and 25 years younger than UNIX, yes, modern.
>Hint, it goes beyond doing CLI applications and daemons.
So please tell me then. All Unixes have more or less the same FHS; Solaris did nothing different from, let's say, BSD in user space, nor did Irix. So tell me: what did Solaris or Irix or even NeXTSTEP do fundamentally differently from any other Unix?
>BeOS is 25 years old and 25 years younger than UNIX, yes, modern.
Now you mix user-space programs with kernels; we were talking about the operating system (Unix). If you see it like that, Windows or Linux is the most modern OS, because it has the most software.
>explored the idea of Java based kernel drivers
They even made a Java OS, a browser, and a chip; we all know what a stupid decision that was. You could have said ZFS, which is against the Unix philosophy (one program, one job), but I think you forgot that.
>Irix had Inventor.
Windows had Paint, and the Mac the almighty HyperCard... but that's NOT the operating system, nor does it make any difference Unix-wise.
>NeXT had the mach kernel, drivers written in Objective-C
Ahh, now you bring up the kernel argument I already made: yeah, drivers written in the beloved Objective-C (is that a feature or a user-space thing?)... you know, Multics was written in EPL waaay earlier.
They are pure and certified Unixes (not NeXT, but Mac OS X is/was), based on Unix ideas... basta.
Looks like you missed the whole systemd and cgroups thing. The two are huge and novel developments in OS design, and much more significant than the GUI flimflam you pine for.
Everything "has been done before" in computing. The difference is a matter of degree and attention to detail.
And your comment about it having been a thing on mainframes underscores my point.
The server story for Linux is evolving (and maturing) greatly, and very quickly. The Linux of today is not at all like the Linux from 2010. To say that Linux is stuck in some sort of legacy UNIX quagmire is just silly.
Did you read my comment? There are no "new" ideas in computing. Everything has already been invented in the 1960's. It's the finesse of execution that matters, not ideas.
I did, and you are wrong: there are plenty of ideas to bring to market when one doesn't constrain oneself to copying mainframes and big-iron UNIXes and catching up with whatever Apple, Google and Microsoft do on UI/UX.
> The older I get, the more our devotion to the CLI feels like gatekeeping.
Nah... you can't take one of the handful of major styles of interaction with computing devices and diminish adherence to that style as just 'gatekeeping'. There are moments where the strengths of GUI interaction are useful (discovery of options, selection of individual objects, etc.), but modern systems are complex and large enough that you can't really walk away from the idea of expressing operations and state linguistically without walking away from tools you need to do your job efficiently (or at all). So yeah, maybe CLIs are less accessible and more difficult for newcomers, but the problems they solve aren't necessarily easy, either.
However... I do think there are many opportunities for both better accessibility in CLI's and better interoperability between CLI's and GUI's. Better discovery of options in CLI would help (autocomplete, etc.), as would a better connection between the GUI and CLI worlds. (Lisp has some good ideas where the "terminal" remembers the object it prints and allows interaction with those objects, as opposed to just remembering a string character representation of them. Microsoft's Office applications similarly will write macro code for you as you interact with the UI - which is a great way to understand the relationship between the two worlds.)
I don't see it that way. My first computer used Win95 and I didn't get into CLI until college and then seriously at work. I'm not a Unix expert, but am blown away by how much more powerful it is than Windows in many uses. I even push files to a Linux server for analysis at times as I can tear through a file with grep/cut/awk/sort/uniq and some pipes very quickly and efficiently. There are many other technologies I know, yet the simple terminal is still the best approach for many of my uses. The design has stood the test of time for a reason and the compositional approach is nice. Nobody is gatekeeping here.
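For example, with a made-up tab-separated log file (the file name and column layout here are purely illustrative):

    # keep lines mentioning ERROR, take column 3, then count and rank the values
    grep 'ERROR' app.log | cut -f3 | sort | uniq -c | sort -rn | head

Each tool does one small job and the pipe does the composition, which is exactly the property I mean.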
I've been running desktop Linux for 20 years, but this isn't a hill I'm gonna die on. Kernel and userland developers treat Linux as a server OS first and foremost, and that's where the funding goes.
The point of "Linux on the desktop" was not that it happens for you and the other commenter, but that Linux becomes mainstream. :) Never happened and unless something extraordinary occurs it likely never will. And Android doesn't count, obviously.
True. But it also has the connotation that Linux is "not good enough" for the desktop, and this is false. Popularity has a lot of factors, sadly. A user-friendly Linux like Ubuntu is indeed "good enough" and has been so for years.
What I'm personally a bit horrified by is the idea that "Unix" is the be-all and end-all of OS design and so on, with some fundamental designs going back to the 70s... like everything being text, which seems quite a mess in the current world of multimedia and increased networking. Yes, we have spent decades hacking to get it to work... but maybe more options and different designs would enrich us a lot more.
And really, we do have a layer that can connect all of the different systems together, namely IP. So even if we had fundamentally different systems, there is no reason why intercommunication wouldn't be possible at this point.
Yes, it's not either/or. The web is a kind of meta-OS with a very minimal and simple API, and as far as the web is concerned everything in userland operates as a thin-client. The OS is only of interest locally.
By the mid-90s I was furious with both Microsoft and Apple for setting computing back by at least a decade.
You could - with extreme effort - make an Atari ST multitask. And of course with the Amiga it was built in.
So why did we throw that away and go backwards to DOS and then single-process Windows - which eventually reinvented multitasking nearly a decade later and sold it as if it was the most astounding development in the history of computing?
Of course there were technical challenges - protected memory, protected processes, and so on. But the earliest versions of Windows didn't have those either.
So it was a disappointing and frustrating time - an alienating difference of philosophy between consumer computing designed for creativity and exploration, and box-shifting commodity computing designed for form-filling and bureaucracy, which might allow you to have some fun after hours if you behaved yourself and the machine didn't crash.
Considering how smart the Amiga team were, it would have been very interesting to see what they could have done with ubiquitous fast networking, high-res graphics and video, and pocketability.
I suspect the result would have been far more open and inspiring than the corporate sand trap we have today.
> The web is a kind of meta-OS with a very minimal and simple API
You may have missed the last 20 years of Web evolution, because the Web is now a very messy and hard to implement platform. Even Microsoft gave up on implementing a browser and switched to Chromium.
> So why did we throw that away and go backwards to DOS and then single-process Windows - which eventually reinvented multitasking nearly a decade later and sold it as if it was the most astounding development in the history of computing?
Because cheap commodity hardware, with a single Intel CPU controlling everything, won out over an expensive custom multi-chip solution.
Every Amiga model was a whole new custom design, with large amounts of engineering effort put into it, until eventually the platform just couldn't keep up with how quickly PCs were dropping in cost.
Sure in 1989 an Intel x86 based system was a joke compared to an Amiga, and it was a joke in 1990, and 1991, and 1992, but it was less of a joke every year.
When it all began, PCs didn't even have anything that could charitably be called sound capabilities, but year after year new cards were released and there was massive competition, at first at the high end; then sound cards got cheaper and cheaper until one day Microsoft demanded that, for a computer to be Windows Certified, it had to include one, and it was up to OEMs to figure out how.
Meanwhile Amiga didn't benefit from that technological explosion.
Same thing happened for graphics.
Same thing happened for networking.
Same thing happened for hard drives, and cd-rom drives, and types of RAM, and literally everything else.
Technology has often been one step back two steps forward, going from mini->micro people complained about the same thing, and going from PCs->Smartphones, it happened again. Remember early smartphones / PDAs? They had a maximum process limit, in the low double digits! Storage that was wiped out if the battery died! Everything ran in the same address space!
And remember the first 5 or so major versions of Android? It wasn't exactly a pleasant system to use.
But it got better.
The thing is, all-in-one custom hardware solutions will always, at first, beat out general-purpose computing. But those custom hardware solutions are expensive, and slow, to engineer. Now if you are Apple and you can manage to find economies of scale to do everything in house, great!
But Amiga didn't have that scale. They had custom in-house everything, and they were competing against not just Microsoft but literally every other consumer PC hardware manufacturer on the planet, all in a fight to the death to drop prices on DOS/Windows PCs peripheral by peripheral.
The sad truth is Amiga hardware barely changed from 1985 to 1992. The A500 was nothing more than a cost reduced A1000 - no new features. The A2000 was nothing more than an A1000 with expansion slots. The A2500 was a couple extra boards, not a new model. The A3000, my favorite machine, was effectively a cost reduced A2500, combining the 68030 accelerator board and SCSI onto the motherboard. Again, nothing new.
So after 7 years, all we had were expansion slots and faster CPUs. The OCS -> ECS chipset upgrade was very minor. That's basically it.
Finally, at the end of 1992, the AGA machines (Amiga 1200 and 4000) were released. We finally had upgraded graphics that didn't even keep pace with SuperVGA. The A1200 was also totally gimped with no fast memory and a slow 68020 that was now over 8 years old...
> You could - with extreme effort - make an Atari ST multitask. And of course with the Amiga it was built in. ... So why did we throw that away and go backwards to DOS
Mine is a US-centric perspective, but it always seemed like Commodore and Atari were selling niche follow-ons to their eight-bit lines, with momentum heavily toward DOS, even in the second half of the 1980s.
Open up the back of a 1987-era computer magazine, and the pages are full of companies that will sell you a DOS machine, expansion cards, operating system, etc. All these participants in the market are sinking money into the platform to try to establish a competitive advantage by making something better. (The Compaq Deskpro 386 and various video cards come to mind immediately as places where the market rapidly outran the original developers of the platform.)
But if you want an Amiga, you're stuck with Commodore. If you want an ST, you're stuck with Atari. Two individual companies both trying to build an entire _platform_ - custom hardware and OS - that's competitive with the collective output of an entire segment of the industry. I guess it's easy to say this now, but both of those companies picked a hard battle to fight - and both did very well, all things considered.
> ... single-process Windows - which eventually reinvented multitasking nearly a decade later and sold it as if it was the most astounding development in the history of computing?
The development wasn't the technology, the development was getting that technology into a delivery channel where it was relevant to a large group of people. This is why Windows 3.0 was such a big deal - there wasn't anything technically novel in it, but it was cheap, worked with what the customer base already owned, and attracted enough investment to overcome its (many) faults.
This reminds me of Robert Pike's "Systems Research is Irrelevant" speech [1]. Now, 20 years after his speech, we are still stuck with the same notions (such as everything being a string). It's not that there aren't plenty of alternatives around; rather, expectations are so high that it's almost impossible to make a new computer system economically viable. On the other hand, the hacker and maker scene is very active, some of them building operating systems and hardware such as tiny Lisp-based machines [2] and OSes [3]. (My only gripe is that most of the new "avant-garde" systems are still text/file-based.)
I'd love to see a next wave in personal computing, starting with a clean slate, building on the research, insights and learning from the mistakes that have been made. I have no doubt that it will happen, the question is only when.
As for interoperability: Even on the same platform there are countless problems getting software to talk to each other, so I don't think that a new system will make the situation any worse.
IMO Unix hasn't taken over as much as people think it has. If you look at an OS more closely, it typically has a POSIX API layer on top of whatever unique ideas it has. Even Linux does quite a lot of its own thing.
People don't know or underestimate how much Unix has taken over. From cellphones to super computers. From consumer devices to industrial control. Unix servers made and continue to run the internet, aka everything.
I think this is largely a factor of PCs getting more powerful and the lag to libre implementations. Trying to build container infrastructure during the pentium 90 days wouldn’t have succeeded.
I think the issue is that, from a user perspective, the implementation details of the kernel and base OS don't matter. As long as it's sanely implemented and doesn't impose onerous limitations, it should just get out of the way. The vendor can implement it however they like, with whatever technologies they like, and as long as it supports featureful applications, users shouldn't need to care.
The reason Unix was so successful was precisely that it was simple, portable and designed to get out of the way. It was as unremarkable as possible, by design. All the interesting stuff is left to application developers. You want to implement a relational file system, go ahead. You want to develop your own GUI layer, fine. Compared to the other major OSes of the day like VMS, Pick, PrimeOS, etc., it's as unopinionated as possible. Likewise with C, which was intended to be as low-level and paradigm-free as they could make it: just an abstraction over assembler, just as Unix is a thin abstraction over the hardware.
Whatever got us the Internet was worth it. To generalize, it's fine to have a variety of components, as long as the interface between them is roughly standardized, and the innovation can come in the components. That we can have a Playstation and an iPhone use the same network is a good thing, and they can do whatever they want with the rest of the stack.
A computer pre-internet felt a lot more like an island. I had an Amiga, but I wasn't aware of 90% of the stuff that was out there, and could barely afford to buy a compiler.
Not always islands, if you had a modem and a phone line. Pre-Internet there were increasingly numerous BBS systems, from big ones like AOL and CompuServe to medium-sized ones like GEnie and many tiny independents. Instead of remembering URLs, you had lists of phone numbers (the smaller ones requiring long-distance calls when those were still expensive) for your little modem to dial. My Atari ST (a contemporary of the Amiga) was online a lot already during those days. I actually connected to the Internet via CompuServe for quite a while before the local phone company began to offer dial-up Internet service. And pirated as well as free software was widely available already back then.
I didn't have a modem for the Amiga for most of the time I owned it (starting in about 1985). I did eventually get one but had trouble getting access to the phone. The BBS sites I did connect to were often full, and many required you to upload something in order to download something. Starting in about 1991 I got access to the Internet at school, and spent most of the time in the lab there, leaving my Amiga behind when I left for college.
Prior to the iPhone we were “standardized” on resistive touch screens, which sucked compared to the iPhone’s capacitive screen. Not because a stylus was a good way to use a portable device, and not because capacitive touch screen technology was unknown - it was invented long ago. It’s because of cost. You’re just conflating standardization with cost cutting.
If there was a cheap and easy to install OS in that era they’d all use it. Oh wait that was DOS & Windows, that was the whole point, that’s what happened, it was adopted because 999/1,000 vendors are interested in cost cutting not innovation.
It’s hard to celebrate cost cutters. History never celebrates the crummy cost cutters. I feel no nostalgia for that.
Resistive digitizers also didn't freak out and refuse to function at all if your fingers were slightly damp. That's why you'll see a lot of point-of-sale systems in bars using resistive digitizers for their displays.
I think there have been many missed opportunities in computing; however, I'd suggest that the Internet and the GPL (open-source licensing in general) would have come regardless of the diversity of operating systems. These two innovations have really bound the software world together - compatibility these days is really at the TCP level (e.g. Docker and APIs) and allows for a massive diversity of architectures (Kafka, Istio, materialize.io, IFTTT, Macrometa etc.) - the bar has moved, in a way.
Having said that, some of the missed opportunities of note are:
MiNT: GEM and TOS weren't really developed much by Atari, but they did buy in MiNT, which had preemptive multitasking and memory protection. With the AES being retargetable (graphics cards), GDOS vector fonts and PostScript printing, TCP and LAN networking stacks, a shell, a global shortcut system, POSIX compatibility and multi-user capabilities, it managed to evolve TOS into effectively a Unix with a usable desktop - i.e. a standard OS X or Linux-on-the-desktop - well before now.
Secondly, choosing A2 instead of Android would have been huge. A compiled multithreaded, multitasking self-hosted OS with GC, zooming UI, 3x faster than Linux and small enough for one person to understand (250k kernel).
Oracle v. Google would have happened eventually. The underlying problem is the law, not the litigants. In any other field that deals with copyright, the idea that copyright protection extends to what is effectively a series of headings and summaries would be obvious. The problem is that copyright is a poor fit for software - our headings and summaries carry function to them, and changing them breaks code. In general, the law doesn't want to grant copyright over purely functional things, or even things where functionality and creativity are frustratingly mixed. That's why boat hulls and hardware have separate ownership regimes - copyright over them was explicitly rejected. But we also explicitly said software is copyrightable, which leaves open the question of where the unprotected function ends and the protected creativity starts.
Maybe in the world where Android never happened, some other company stumbled upon this rich vein of confusion, and we're dealing with SCO/Novell/Attachmate/Whoever v. Apple over who can implement UNIX APIs in their kernels.
This is the way of the world in microcosm. When globalization has smeared all cultures together, and the only languages left are English, Spanish, and Mandarin, the world will be far more interoperable - but at the cost of every other independent niche society. I’m not actually saying it’s not worth it.
The idea that English (and in this case Spanish and Mandarin as well) will kill all other languages is dubious and forgets a very important fact: through most of history and in a lot of places, bilingualism has been the norm. Current examples are some African countries where local languages are spoken and French is used as a vehicular language, or China, where people speak Standard Chinese alongside another Sinitic language. What is likely to happen is more people being proficient in a second or even third language (as is the case in Hong Kong).
The incompatibilities just moved to different places, where the focus of development and competition is. Taken collectively, the various core services of mobile devices, desktop OS's and browsers interoperate very poorly across vendors. Think about stuff like identity, auth, sync; even very basic cases - e.g. user using a couple of Apple devices one of which is a macOS laptop where they run Chrome - are an incomprehensible mess of incompatibility to most non-nerd users.
A big "Yes" to proportional scroll bars. I didn't have an Amiga, but rather an ST. The ST also had proportional scroll bars and for years, I could not understand why the major platforms (Windows, MacOS) did not. It was a pet peeve that really bothered me when I would sit down to use someone's Mac or PC of the time.
I should have realized it could be worse. With the modern era's "mobile first" UX regime, we now have many desktop experiences that hide the scrollbars entirely—revealing them only on interaction—wholly removing any at-a-glance utility we enjoyed from proportional bars.
but you can't interact with them except by bringing the mouse over the right side, and you can't see them so you literally have lost a visual cue of the very thing you want to interact with. It's just amazing.
Yes, but the fact that so few people actually complain probably tells you a lot about the input devices that people use.
Touchpads with two-finger scroll and mice with scroll wheels have turned scrolling into a gesture that doesn't necessarily require a UI element, in pretty much the same way that a physical keyboard is just there for you to type on and doesn't pop into existence when you have a focused input field.
But this is partially missing the original point about proportional scrollbars, which is that they illustrate how large the scrollable content is at a glance. With hidden scrollbars or, as I believe you're describing, simply not even looking at the scrollbar, there's no indication of where you are in the content or how large the content is. I would argue this is one of the chief grievances I have when consuming content on a mobile device (among an armada of other grievances with mobile consumption, at least on the web).
Those of us who like knowing contextual information such as document size at a glance are sad to see (proportional) scrollbars go.
> Yes, but the fact that so few people actually complain ...
I'm curious where you're getting your numbers for that claim.
In Microsoft Word, for example, there are a lot of questions around how to disable the auto-hiding, and I believe the only remedy is to disable auto-hiding of all scrollbars within Windows 10 itself (it's under the accessibility settings hierarchy, weirdly).
There are myriad native and web applications that choose to hide these, and they're an endless source of frustration for people who would like them present, find they are not, discover there's no convenient way to disable the auto-hide, and then lose interest in filing yet another bug / feature request.
Few people complain about them because there are so many new-to-computers users who don't have enough experience to appreciate the value of default-visible, proportionally-sized scrollbar thumbs. We're in some kind of awful eternal-September of user interfaces, where experienced users get to suffer through new developers, eager to leave their mark by making "modern" user interfaces, re-learning all the old lessons.
my anecdata: when my family members complain about confusing things with computers the question "how do I know if there are more things down the page" is rarely the problem.
By far the most frequent issue is "where did that file I just saved go?" followed by "how do I forward this and that together?" and "it doesn't work! I didn't do anything. Yes, it asked me something and I clicked something but I don't remember, but I didn't do anything"
I complained to MS about a serious performance issue. Even finding out how took time. Nothing happened. I reported it slightly differently. Nothing happened. I spent several hours overall, all completely ignored. I no longer consider bug reports to MS to be worth anything.
I think you're right about the touchpad; it's my understanding they removed scrollbar visibility because of that input style. However, ATM I'm using Windows with a trackball that has a scroll wheel, and I very regularly look at the scrollbars when I'm doing C# or browsing in Pale Moon, my two main activities. It tells me where I am (heavily relied on in Visual Studio), which is part of the reason some people (not me) like minimaps.
It’s even weirder when you realize how much more resolution we have now.
A favorite implementation of mine was 4dwm on SGI. You could middle-click on the trough and it would go directly there, rather than the normal paging by left-click. Still try to use it everywhere and am disappointed.
In Cinnamon on Linux, the scrollbars in Firefox, Terminal, and Xreader are all proportional, and jump directly on left-click. Shift-click moves one page.
Middle-click to jump to the clicked point is very common in X11 toolkits, to the point where it is considered a "de facto" standard (some applications that implement their own scrollbars, like Firefox, will even offer that functionality just on X11, even though they are technically able to do it anywhere).
On Windows you get the same functionality with Shift+Left Click. You also get to see which applications use Qt since it isn't implemented there :-P.
On Mac: Preferences > General > "Click in the scroll bar to:" > "Jump to the spot that's clicked". Though with Mac touchpads, I can't remember the last time I ever clicked the scrollbar.
"I could not understand why the major platforms (Windows, MacOS) did not."
I expect that it's because, in order to have correct proportional scrollbars, you need to know the exact height of the entire scrolling area (or have decently good heuristics for it, which is not as much simpler a problem as you might think), and that is an expensive thing to require of a scrollable area. The more content the system can handle, the more expensive that is. It's a big thing to ask of a heterogeneous collection of layout code.
Even today there are some scrollbars that aren't as proportional as they seem. Emacs, for instance, seems to be using some cheating heuristics rather than being precise. Under normal coding or text editing usage you might not notice, but if you fill a buffer with a lot of very differently-sized text you may find the experience of scrolling around is less slick than you might expect. Browser scrollbars can also behave strangely while the page is loading, as I'm sure everyone has experienced even if they haven't noticed. In that case it's perhaps more obvious why a smoothly-scrolling, totally-correct scrollbar is a surprisingly hard problem in the general case.
> A big "Yes" to proportional scroll bars. I didn't have an Amiga, but rather an ST. The ST also had proportional scroll bars and for years, I could not understand why the major platforms (Windows, MacOS) did not. It was a pet peeve that really bothered me when I would sit down to use someone's Mac or PC of the time.
TFA explains that it has it now but didn't during the heyday of the Amiga.
My light experience from a long time ago is it also takes quite a bit of code to implement this on Win32 relative to other platforms. Your link could be used as evidence.
I love how the circuit diagrams were part of the manuals. Admittedly, an Amiga 500 was vastly less complex than a modern computer, but still: what companies like Apple today want to keep under wraps, so that not everybody can just repair anything easily, everyone back then got whether they wanted/needed it or not.
Amiga nostalgia isn't just nostalgia, the Amiga was great. From age 10 to 14 most of my life revolved around my Amiga 500 and then Amiga 4000, and while it was great then, too, I feel doubly blessed in hindsight, considering what tech has become and where it's heading, to have been allowed to catch a brief glimpse of what could have been. And the music. All that music.
As expected, most of the features were rare on home computers at the time, but have become ubiquitous (or unnecessary) on modern computers. HOWEVER, the `Datatypes` feature is really cool and would actually be a nice feature that still doesn't exist in Windows, macOS, or Linux.
Although, yes I do concede it is more complex to implement, the way to do it depends on what kind of data type we are talking about, and it only works properly if the applications actually make use of the OS frameworks instead of doing their own thing.
And as expected with the GNU/Linux "desktop", nothing does exist that comes close to it.
Wow, in 24 years as a Windows user I never knew about Windows Imaging Components, nor did I come across Image Units in the 4 years or so I was doing iOS dev on a Mac. I've never seen an end-user application that used them, or told me as a user that I could expand its functionality with them.
In contrast, on the Amiga, I knew about Datatypes out of the box as they were written up in the user manual, and pretty much the whole developer community got on board with them. If an app used DataTypes it was a feature that the user knew about.
The Amiga community seemed much more unified than today with respect to using all the latest features of the OS. Even the smallest public domain utilities provided AREXX ports so the user could script them.
That seems limited to specific types of media, though. Datatypes is a general purpose framework for creating a hierarchy of parsers for file formats.
It can be images or sound, but also text, or anything else. E.g. here's a small sample of datatypes from Aminet, excluding anything sound and graphics related:
* Datatypes for transparent decompression
* Datatypes for syntax highlighting
* Datatypes for parsing document formats (word etc.)
* Assorted hypertext datatypes
* Assorted datatypes for showing debug information and structure of various binary formats, like executables.
I just didn't want to list everything, hence why I added the remark "the way to do it depends on what kind of data type we are talking about", and yes I do not mean it was as exhaustive as DataTypes, as I don't know all use cases.
The Windows equivalent is/was COM and in particular OLE ("object linking and embedding"). With appropriate DLLs you can make a new file format and drag-and-drop a file into Word and open it inside Word.
The peak of this was ActiveX; in Microsoft's world around 2000, they really wanted you to be able to embed controls and programs in web pages and then put those web pages everywhere, including on the desktop background ("active desktop").
The problem with running third-party code inside your application is not just security but that it tends to crash badly and put the name of the hosting application on the crash. Eventually Microsoft got so fed up with that, especially in the kernel, that they built a system where video drivers could crash and restart while the OS kept on trucking, and use of COM has faded.
> The Windows equivalent is/was COM and in particular OLE [...] it tends to crash badly
The mistake was implementing this stuff in C++, which is way too fragile on its own. Had they built a separate VM to manage requests for these objects, it would have been no problem: VM crashes, host app just restarts it. But Java did not exist yet and VMs were not really a thing, resources were too scarce.
I do think the ideas behind COM/OLE were pretty good, unfortunately real development on desktop OSes basically stopped mid-'00s so we might never know what it could have been.
My favorite feature from the past is RISC OS's "adjust" feature, bound to the third mouse button. This had three functions:
Selecting menu items without closing the menu.
Toggling selection status of selectable icons (same as control+click in modern GUIs).
Changing the length of text selections (same as shift+click in modern GUIs).
Sadly, modern GUIs seem to limit themselves to two mouse buttons. Adding "adjust" would improve productivity if you had a three button mouse, and you could ignore it if you lacked mouse buttons and not be any worse off. Maybe there are other good features from early OSs we have forgotten.
I'd always hoped that extended attributes (xattr) would be used for something like that. Nobody seems to use them, though, since they are optional and no standard naming scheme exists.
One unique feature of the Amiga that I've not seen referenced here is that programs, scripts etc. could reference removable media by label and if the media was not in the drive when needed, a Kickstart-managed requester would pop up asking for it to be inserted promptly. This made it very easy to manage even a single-floppy-drive system. Linux could support this even now, as all the required pieces are there (e.g. automount support) and it would be quite useful for a number of things; however it doesn't, AFAICT.
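The closest approximation I know of on Linux is mounting removable media by volume label plus systemd's on-demand automount; a rough /etc/fstab sketch (label and mount point are invented for illustration, and there's no Amiga-style "please insert volume" requester - access just blocks or fails):

    # mount on first access, located by filesystem label rather than device name
    LABEL=WORKDISK  /mnt/workdisk  vfat  noauto,x-systemd.automount,nofail  0  0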
To be fair, this feature was more useful in the nineties than today :) And for the ones not familiar with Amiga jargon, a requester is what you would call a dialog box on other OSes...
I have vague memories of named pipes between GUI programs that also went along with datatypes.
eg in the file dialog of GUI app 1 saving a file to something (I forget the details) like PIPE:foo, then in GUI app 2 opening that data from PIPE:foo without it having to go to disk (I suppose when you're dealing only with floppies that is a bigger deal than now).
Back in 89 while stationed in Hawaii I worked for a small PC shop whose owner was a HUGE AMIGA FAN. Commodore worked their dealers hard, and just prior to Christmas that year he was required to stack 250 machines in inventory which just about broke the bank.
Meanwhile I was hired to build PC clones for a few hours every day. While I was there he would argue constantly about the superiority of the Amiga vs. the PC, and though I knew he was right (486 PCs had just hit the scene) I would argue right back about the fact that the PCs were winning due to the velocity of change in hardware, storage, video, etc.
Anyway we got to the point with it that his wife started yelling at both of us to cut the crap, so not wanting to piss her off since she signed the checks I shut up and let him drone on.
As Christmas approached we were selling PCs 10 to 1 over Amigas, and his anxiety about not moving the Amigas hit the roof. About a week prior to Christmas I came in one day and started building the PCs on order (about 30+ on order) and he was forced to help me with building them out.
He was pissed, completely off his rocker angry, and started verbally hammering me beyond what I could take. I finally decided enough was enough.
I turned to him and I said "I know one thing the PC can do that the Amiga can't do." And taking the bait like the fish he was, sez to me "Nothing can beat the Amiga". I sez, "There is definitely one thing it does better", and he SHOUTS "WHAT?"
Sez me, "It makes money!" and then I turned to the stack of nearly 200 remaining Amigas sitting on the shop shelves and pointed at them and laughed long and loud.
He fired me on the spot.
Not exactly on point for this article. But a great story in the timeline of "our thing". I should mention that I did own a Commodore 64 and the Amiga before I switched to the PC. I loved them both and if it wasn't for the C-64 assembler module I would not have built the companies I created years later. The 6502C was the best learning chip ever!
Speaking for the Dead is something I really, really hope we can do more of for computing. There are so many pasts that we forget.
I want the condensed goods & bads of CORBA, SOAP, NeXT, ESB, and so many others. It feels like there's only dwindling folklore around so many of these things. At least I can point newcomers to C10K and Apache forking models to discuss some of the web-serving systems architecture work that emerged around 2000; that gives a fairly broad view of the challenges and what was afoot. But I've found few clear stories, clear tellings, for so many of the faded technologies. Nice to see the Amiga here somewhat avoiding that fate, having some of its stories told.
One little thing that I always liked about the Amiga was that it could show human dates.
I never owned one, but my friend had an Amiga 1000, and I remember the directory listing had dates like "Last Thursday," or "Christmas, 1990," or "An hour ago."
I don't know if that was part of the stock Amiga OS, or an add-on, but it was awfully cool.
ARexx ports for scripting IPC / automation, the standardized installer (especially the "Pretend to install" option), and datatypes (which sort of exist in Windows and macOS, but aren't exactly the same thing either) are the three features I'd love to see in current OSes.
All three would need buy-in from application developers, and since using them would tightly-couple applications to the host platform I can't imagine that any developers would take advantage of them anyway.
We've ended up in this world where a vaguely POSIX-flavored feature set is about the best we can hope for if a developer doesn't want to go "all-in" on a given platform.
My favourite is FrexxEd (co-authored by the author of curl) - a text editor that exposed all open buffers as files, so you could run scripts directly on the contents of an open buffer.
For me, ARexx and Datatypes still stand out. A standard scripting language with standard interfaces into all your desktop software is a magnificent tool, and one that common desktop OSes still don't have.
Datatypes meant that every program could open relevant data; again, something you can't rely on even 30 years later on your desktop OS.
Note that a lot of the "shell"/AmigaDOS details were inherited from Tripos[1], along with BCPL to implement that (the BCPL that begat first B and then C at Bell Labs). IIRC, that was a temporary solution until they got their own design running, but as usual, nothing lasts longer than a stopgap.
I very badly wanted an Amiga as a kid, primarily for the Video Toaster which included Lightwave. It was out of reach financially, but also on its way out with the Commodore corporate problems. My parents drove me to B&H Photo in Manhattan to take a look at one, but they were pretty honest about it being near the end of the line for Amiga, sadly.
The closest I got was a subscription to Amiga World and a NewTek magazine if I recall correctly. I must have read and re-read the same articles on the visual effects for Babylon 5 about a thousand times.
Despite the cost, the Toaster plus Amiga was insanely inexpensive compared to the alternatives of the time.
I worked for a production company that went from 100 grand worth of Grass Valley Switchers/Digital Picture Manipulation (DPM-700) down to one Video Toaster 4000 on an Amiga 4000. While I missed some of the cool stuff we could do with the expensive equipment, we were mostly doing local TV commercials and infomercials. So the Toaster ended up working just fine. I barely knew anything about the Amiga at the time other than the thing was rock solid until it crashed one day.
We had to load up the entire box of like 60 floppies of Toaster software and the Amiga itself and drive to some hole in the wall in St Augustine, Florida, to have one of the last remaining Amiga gurus help us fix the machine. He put a new hard drive in it, but we got like 40 floppies into the Toaster software install and it failed. The dude HAD the software on site but told us it would be wrong for him to use his own floppies to do the install. We asked him if there was ANYTHING he could do. He eyed the twisted pair network card we had and said, "I could be persuaded for that". We weren't using it... as far as we knew, it was just another BNC on the back of the computer case. So, he lowered his "morals" and we got our box fixed. Great times.
It is pretty crazy to think that Babylon 5 had better special effects than TNG (in my opinion) and did it for cheap on the Amiga, compared to the Industrial Light & Magic work that TNG used.
It's a weird comparison. Babylon 5's special effects were very early CGI, but TNG has almost no computer graphics at all. (Nearly) everything was done practically, and with models. Personally I think TNG stands up better, and the fact that it was done practically and shot on film now allows a high-definition version to exist, which can never happen for B5.
A couple of years ago, I watched the whole of Star Trek TOS, TNG, DS9, and then Bablyon 5 one after the other.
So I have to ask whether you're using a modern TV and perhaps what your media source for B5 is. Because, frankly, most aspects of B5 pretty objectively haven't aged well (the elevator scene is one exception), and the FX renders are especially bad, at least from the DVD box set.
I fully believe it looked incredible at the time on a standard def CRT. But TNG still looks OK and DS9 looks good.
Are you watching a digitally remastered Star Trek?
I've watched B5 on CRT and various LCD. The station monitors obviously aged poorly, but a lot of ships still looked pretty good. Minbari Sharlin, White star...etc.
He wasn't stating how it looked for the time, he was comparing it to star trek: the next generation, which used models and was shot on film while even high budget CG was still in its infancy.
Yeah I was basically just saying star trek spent a fortune, while B5 was done "on the cheap". I was wrong and forgot TNG still used models, but the overall comment still stands.
> to this day every time we have a conversation he make an Amiga reference.
The Amiga was so far ahead of its time that there hasn't been anything else comparable. I lived through that time and I reference it a lot as well because so many things today are based on the Amiga or still aren't done as well as they were done on the Amiga.
The Amiga was far ahead on several fronts, including digital audio and video, but the Atari ST was the platform of choice for MIDI-using musicians. A couple of today's biggest DAWs (Steinberg Cubase & Apple Logic) have their roots on the Atari ST.
Was the Amiga ever big in the United States? I had this realization the other day (which might be completely wrong), but it feels like the US computing industry is a lot more business-focused than the European computing industry. E.g. most DAWs are built by European companies (Ableton, Logic (before the Apple acquisition), FL Studio, Reason, Bitwig).
Same goes for the demo scene. The demo scene was really big in Europe, but it feels like it was never that big in the US (but I might be wrong).
I legit wonder if part of the reason is due to the popularity of Amiga in Europe.
Yeah, that is my impression too. Always got the impression that PCs, Macs and Nintendo were bigger in the US.
Growing up in Norway in the 80s I remember the Commodore 64 and Amiga really dominated the home computer scene. We used to read computer magazines from Sweden, so it must have been big there too.
Pretty much nobody used Macs. Schools used a home grown Norwegian computer system called Tiki, with these odd keyboards with round orange keys.
I had the impression that Macs were much more prevalent in the US due to school usage. My American wife has fond memories of playing Nintendo games as a kid. But for me it was all about the Amiga.
It was really sad that it died. It was a really enabling system. Compared to people who used Nintendo, we learned so much more. I learned to do BASIC programming on the Amiga. There was even this game-making BASIC called AMOS, which was really ahead of its time.
Using Deluxe Paint to draw was amazing. I learned so much about colors and shading from that. Then there was Assembly programming which was actually kind of nice on the Amiga.
But before all that, I remember making things like bootable disks with menus where you could select your preferred program.
The Amiga was somewhat popular in the US, but I think it was mostly with people who had Commodore 8-bit systems and stayed with Commodore when upgrading to 16-bit systems. There were a lot of Atari and Apple II systems in the US besides the C64, and Atari and Apple II owners likely upgraded to Macs, the Atari ST and IBM PCs rather than Amigas. Beyond home computing, the Amiga got quite a bit of use in the video industry in the US.
In contrast, Commodore systems were huge in Canada, where I live. All the schools here had 8-bit Commodore systems from the PET to the C64 era. This meant a lot of homes had the VIC-20 and C64. I only knew 1 person in my school with an Apple II and one with an Atari 800XL. Many of the Commodore owners upgraded to Amigas, especially once the Amiga 500 came out. Of course, Canada had 1/10th the population so there weren't as many Amigas as in the US, but a higher percentage of the population had them.
AMOS! I swapped my copy of Easy AMOS with a school friend's copy of AMOS Professional. It set me on a path to becoming a professional software engineer. (Later, I swapped AMOS for Amiga E. Anyone remember that?)
I do, and still miss it although today we have really powerful computer languages;
kudos to Wouter van Oortmerssen for creating it.
But on that little 8 MHz machine (mine was actually 7.16 MHz, being a PAL machine), that wonderful little language did wonders. I recall being amazed at how it would compile a several-hundred-line program I wrote back then in a few seconds, without caching. I miss those years so much!
I spent a happy summer writing a point-and-click graphic adventure in Amiga E (with my brother doing sound and graphics). You could drag the game screen down and reveal Workbench behind it. It was the first large program I ever wrote.
It wasn't big, but it did exist. While in Europe Amiga was the "default" games computer, the US adopted games consoles much faster, so most Amigas sold in the US were used for productivity purposes, and mostly video, due to the popularity of the Video Toaster.
In the US Amiga was a niche player. Video production nerds went crazy for it and some people got them for games, but PCs were so clearly the way forward that you had to be something of an iconoclast to go for the Amiga. Macs were another niche player, although some businesses went for Macs which gave them a bit more of a "you can justify this by saying you'll work from home sometimes" purchase. In the US it's harder to convince people to spend thousands on a machine that's only going to be used for games.
After switching from the Amiga to the PC, I missed the RAM disk for a long time for some reason. It has been so long ago now that I cannot remember my workflows that well anymore. But I seem to remember that the RAM disk was a really natural part of my workflow.
I also remember how we Amiga users got really used to being able to set the layering of windows explicitly.
I feel the same way. The RAM disk was so important for daily operations that I couldn't imagine not having it, yet I don't even think about it today.
Speaking of layering windows (screens?) I specifically remember having 3 different resolutions open at the same time: normal (Workbench), HiRes (DPaint), LowRes (terminal). It was kinda magical to do all of those things at once and switch between them. Ahhh, memories...
I sold my very first piece of software (aged 13) leveraging the Amiga RAD disk.
It was called ‘RADBench’ and essentially installed a mini version of Workbench into the recoverable RAM disk, so you could work better with floppy-based files by having key executables in memory.
Nice article that brings back memories. It missed a couple of things surrounding the Amiga that contributed to making it great:
Among libraries, the arp.library, whose windowing functions were something of a precursor to the ASL library.
Scala Multimedia, an absolutely top notch presentation/titling software.
Many local TV stations were using it to add titles to their video material, advertising etc. Its animations were super smooth, the PC world had nothing that could come close.
https://www.youtube.com/watch?v=W5V0J2BB5Ho
Only by cheating: you can save the contents of a tmpfs to disk on shutdown, and then restore it on boot. This can be done with a systemd unit or an init script.
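A rough init-script-style sketch of that cheat (mount point, size and archive path are all made up for illustration):

    #!/bin/sh
    # crude tmpfs "persistence": restore an archive at boot, save it again at shutdown
    RAMDIR=/mnt/ramdisk
    ARCHIVE=/var/lib/ramdisk.tar
    case "$1" in
      start)
        mkdir -p "$RAMDIR"
        mount -t tmpfs -o size=512m tmpfs "$RAMDIR"
        [ -f "$ARCHIVE" ] && tar -xpf "$ARCHIVE" -C "$RAMDIR"
        ;;
      stop)
        tar -cpf "$ARCHIVE" -C "$RAMDIR" .
        umount "$RAMDIR"
        ;;
    esac

Anything written after the last clean shutdown is still lost on a crash, which is the part the Amiga's reset-surviving RAD: handled and this trick doesn't.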
That's only half of the story. The interesting part is when assigning multiple targets to the same logical name and effectively creating virtual "collections"
Yes, for things like PATH.
But ASSIGN did more: you could essentially create a "device" (combining all the ADD'ed targets) that could be transparently accessed as such by other programs. It worked like a version of the Windows SUBST command that accepts multiple arguments.
Yes, a lot like a union mount except that it was read only and easy to modify.
Assigns were often used for things that we today would stuff into an environment variable to represent an array of paths. Instead of an environment variable for setting the search path for shared libraries, on the Amiga you would just setup `LIBS:` and an application could find its libraries by opening `LIBS:MyFoo.library`. Similar story for $PATH. Amiga has `PATH:` which "contained" all of the executables in your path. Unlike unix style environment variables, assigns were system wide. (Amiga was a single user system.)
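From memory, the shell side of this looked roughly like the sketch below (volume and directory names are invented, and the syntax is as best I recall from the 2.x era):

    ; combine two directories under one logical name, then treat it like a volume
    ASSIGN Work: DH0:Projects
    ASSIGN Work: DH1:Archive ADD
    DIR Work:                     ; lists the contents of both directories
    ASSIGN LIBS: Work:Libs ADD    ; the library search path could be extended the same way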
If it is indeed a union mount mechanism, then that's truly a remarkable feature. Reading the documentation makes it sound like something akin to a symbolic link, but perhaps not in the actual filesystem; I still can't tell if it's implemented in the kernel/FS or just the shell (or to what extent the two are integrated). If it's just the shell, then this seems to be a shining example of where the POSIX interface is holding us back. In Linux, they've tried to implement union mounts a number of times (as UnionFS, aufs, and OverlayFS) because it's almost too complicated to get right for Unix-style filesystems and you need a bunch of kernel-level code. Do you happen to know if it was possible to create the ASSIGNs programmatically? Does deleting/renaming files in the ASSIGN work like you'd hope?
Yes, it was indeed implemented at OS-level, you could point at the assigned name/alias from, say, a paint program and find the files/dirs you'd expect to see. Changes to files/dirs too worked as expected as far as I can recall (disclaimer: most recent Amiga experience -> 1994). It was also possible to create assigns programmatically with calls to the Amiga DOS library, see "dos.library/AssignPath" in http://amiga.nvg.org/amiga/reference/Includes_and_Autodocs_2...
NAME
    AssignPath -- Creates an assignment to a specified path (V36)

SYNOPSIS
    success = AssignPath(name,path)
    D0                   D1   D2

    BOOL AssignPath(STRPTR,STRPTR)

FUNCTION
    Sets up a assignment that is expanded upon EACH reference to the name.
    This is implemented through a new device list type (DLT_ASSIGNPATH, or
    some such). The path (a string) would be attached to the node. When
    the name is referenced (Open("FOO:xyzzy"...), the string will be used
    to determine where to do the open. No permanent lock will be part of
    it. For example, you could AssignPath() c2: to df2:c, and references
    to c2: would go to df2:c, even if you change disks.

    The other major advantage is assigning things to unmounted volumes,
    which will be requested upon access (useful in startup sequences).

INPUTS
    name - Name of device to be assigned (without trailing ':')
    path - Name of late assignment to be resolved at each reference

RESULT
    success - Success/failure indicator of the operation

SEE ALSO
    AssignAdd(), AssignLock(), AssignLate(), Open()
Fun trip down memory lane. My first C compiler was Lattice C which I made resident so I wouldn't have to swap floppies on my single drive Amiga 2000. I can see now how it was a great stepping stone to learning UNIX.
Well, it doesn't introduce anything novel, but perhaps it is time for a new personal computer revolution. A Raspberry Pi 400 running RISC OS gives a taste of what the Archimedes was capable of - lots of problems too, but they are certainly addressable, and it is somewhat refreshing to see the performance of modest hardware running an OS designed for a single user, a taste of what a user's experience was like in the Amiga era.
I really miss that, back in the day, there was still innovation going on in the OS space. Sure, we get some features every now and then, but back then computers like the Amiga tried to come up with ideas that were fundamentally different from other systems. It's sad that I still long for some of the solutions the Amiga and BeOS brought to the market, 20+ years later.
An Amiga 1200 with a 14 MHz processor could boot to the desktop in 1.5 seconds...
Don't forget the Jesus on E's demo by the LSD group. https://youtu.be/0MjNsNW8DvM. If I remember rightly, the LSD group also did the Grapevine scene magazine and also the Dox disks.
While AmigaDOS does have a CD command, it's usually not needed: anything you type that evaluates to a valid path will automatically change the current working directory to that path.
It just means you don’t have to type cd before the path to change directories. It’s not a big deal at all, especially in the context of the DOS shells of the day or modern *nix shells.
eg:
> Quarantine/Trojan
Will dump you into the Trojan directory relative to the current path. Unless....
Wow, was reading the bit about resident programs and was reminded of the cold capture vector! Reboot the machine and your program could jump over the reset... fun times ;-)
It's because the default screen resolutions are PAL (or NTSC), so they have double-height pixels to avoid interlacing.
Regular progressive screen modes are also available (some may require the installation of a graphics card), and if you're using a much higher resolution, increasing the workbench font size also makes all the widgets bigger.
I started programming in the 70s, on Z80s and a PDP-11. Then I moved to the Apple IIe. Also got a bit of early exposure to MS-DOS in the early 80s. When I first was able to get my hands on an Amiga 1000 (my friend got one as a graduation present), it completely blew my mind. The Amiga is still my favorite, and I have several that still boot up and work fine, even though they are quite a bit older than my kids, who do not always boot up and work. :) There are a bunch of things I still miss when working with Windows (for "work"), even though it is well documented that Microsoft ripped off a bunch of ideas and features from the Amiga (and early Macs). There are, of course, some things I've gotten used to in the intervening years, like increased CPU speeds, but the UI of my old Amigas is actually sometimes more performant than a brand new machine running a modern OS with thousands (or millions) of times the available resources. Go figure.
I had an Atari 800XL, then a 520STfm, then a Falcon 030.
Back in those days there was quite a bit of tribal rivalry between Atari and Amigas - but I have to admit that the Amiga's OS was way ahead of Atari's, which didn't even get true multitasking until the Falcon (which almost nobody bought).