Ah, yes. Mavericks, California. It's a great little offshore town, just off Pillar Point. I love that town.
Kidding aside, this is a great article.
Related to this story, the Windows 3.0 visual shell was originally not supposed to be Program Manager and File Manager. It was going to be a program called Ruby that I worked on with Alan Cooper and our team.
Ruby was a shell construction kit with a visual editor to lay out forms and components, which we called gizmos. You would drag arrows between gizmos to connect events fired by one gizmo to actions taken on another.
The shell was extensible, with an API for creating gizmos. A really weak area was the command language for the actions taken in response to an event; it was about on the level of batch files, if that. But we hoped the API would allow better command languages to be added along with more gizmos.
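To give a flavor of the wiring model, here's a minimal sketch in Python. The names (Gizmo, connect, fire) are hypothetical stand-ins, not the actual Ruby API:

```python
# Toy model of gizmo event wiring: connecting an event fired by one
# gizmo to an action invoked on another. Purely illustrative.
class Gizmo:
    def __init__(self, name):
        self.name = name
        self.wires = {}          # event name -> list of (target, action)

    def connect(self, event, target, action):
        """Wire `event` on this gizmo to `action` on `target`."""
        self.wires.setdefault(event, []).append((target, action))

    def fire(self, event, *args):
        """Fire an event: invoke every action wired to it."""
        for target, action in self.wires.get(event, []):
            getattr(target, action)(*args)

class Button(Gizmo):
    def click(self):
        self.fire("clicked")

class Label(Gizmo):
    def show_text(self, text="clicked!"):
        print(f"[{self.name}] {text}")

button, label = Button("ok"), Label("status")
button.connect("clicked", label, "show_text")  # the "arrow" between gizmos
button.click()                                 # prints: [status] clicked!
```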
BTW, this project was where the phrase "fire an event" came from. I was looking for a name for the process of one gizmo sending a message to another. I knew that SQL had triggers, but for some reason I didn't like that name. I got frustrated one night and started firing rubber bands at my screen to help me think. It was a habit I had back then, probably more practical on a tough glass CRT than it is today.
After firing a few rubber bands, I knew what to call it.
(As one might guess, I've always been curious to know if the phrase "fire an event" was used before that. I wasn't aware of it, but who knows.)
Anyway, Ruby didn't become the Windows 3.0 shell after all. They went with ProgMan and FileMan instead. To give Ruby a better command language, they adapted Basic and the result was Visual Basic. Gizmos were renamed "controls" (sigh), and my Gizmo API became the notorious VBX interface (sorry about that).
And we still don't have a programmable visual shell in Windows.
Having said all that, what you describe would still be interesting today. I just don't know how feasible it would be to implement. I had a file manager/shell idea in the form of a game construct, something like crates to open or destroy as a delete mechanism. It was never more than a fleeting concept, but I find it interesting that I'm not the only one lamenting what is now Explorer.exe.
Actually, the apology at the end of my post wasn't about Visual Basic as a whole, but the VBX interface specifically. People did use it to build a lot of nifty controls and extensions, but the interface itself wasn't the best-designed thing in the world, and Microsoft eventually replaced it with COM/OCX.
Hmm... Not sure if that was an improvement!
"Control refers to any set of rules describing conditions under which processes may fire an event or switch to a new state."
Other sources confirm that there is a 1979 book with that title. (Google Books sometimes has a dramatically wrong year.) However, I don't have access to the book to verify it for myself.
There's also an interesting reference from particle physics: "Figure 6 shows the number of tubes that fire an event vs. the number of photoelectrons.", Proceedings of Workshop on "Weak Interactions and Related Topics", December 13-15, 1979. While not the same, it feels like a similar construction.
Otherwise, the only relevant hits for that phrase are post-1990.
I guess I can't really claim to be the true originator of that term then. But it still makes a good story...
BTW, here are a couple of rule-based references to firing an event:
1) http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.81.... -- "An event can fire, i.e. it is active, ..." -- Representing procedural knowledge in expert systems: An application to process control (1985).
2) http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.12.... -- "The object manager identifies fireable events, and fires the rules of each of the participants of the event." -- Using Objects to Implement Office Procedures (1983).
Very true, there isn't one. But there are several. None of them all that good, though. Windows Workflow, System Center Runbooks, BizTalk: they've all made their mark. I've seen serious systems built using all of them, but ultimately I have to wonder whether visual programming at that scale is even a good idea.
But it did allow you to get things done very quickly, and that's why you saw it used a lot in several industries for small things that don't require much maintenance but still do useful work.
Ah, it starts to come back! In SQL, "trigger" is a noun, not a verb. It refers to a stored procedure that's executed in response to some event.
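For anyone who hasn't bumped into one, here's a minimal sketch using Python's built-in sqlite3 module; the table and column names are made up:

```python
# Minimal illustration of a SQL trigger: a named object that the
# database executes in response to an event (here, an INSERT).
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE audit_log (order_id INTEGER, note TEXT);

    -- The trigger is a noun: it sits in the schema until its event fires.
    CREATE TRIGGER log_order AFTER INSERT ON orders
    BEGIN
        INSERT INTO audit_log VALUES (NEW.id, 'order created');
    END;
""")

db.execute("INSERT INTO orders (amount) VALUES (9.99)")
print(db.execute("SELECT * FROM audit_log").fetchall())  # [(1, 'order created')]
```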
I knew "trigger" could also be used as a verb, of course, but it seemed that it might be confusing to use it that way given its meaning in SQL. So that's when I started searching for words and firing rubber bands.
I might have fired up something else too, but that's a story for another day...
Were you from Tripod too?
Alan was a good coder before he decided to concentrate on UX design, but like many prototypes where a number of different ideas have been explored, the code got a bit messy and he felt a fresh start was called for.
There's more of the story here:
I don't know of any other Microsoft projects that used the Ruby code or ideas, just Visual Basic.
I called DEC and they too believed it existed, so they (while I was still on the line) called their contact at IBM. After being transferred twice, we arrived at the person who could mail me the driver, but I would have to sign an NDA. The DEC rep and I explained we didn't want source or a beta driver, just the release one. He insisted every customer had to sign. I said I'd think about it. After hanging up, the DEC rep couldn't stop laughing. He asked if I wanted a free copy of NT compliments of DEC. I took it and it had the correct driver.
I tried, but they had no chance.
Hey, I had a Rainbow! It was pretty amazing. IF it had reached its full potential, it would have been the best computer until the Amiga came out.
CPUs - Z80 (8-bit) and an Intel 8080
OS - CP/M and MS-DOS
Could be upgraded to 286 later.
Would have been perfect BUT they didn't get things set up correctly and I was really stuck in Z80 CP/M land.
It was actually an 8088 not an 8080.
Anyone who had to deal with people using those damn Rainbow floppy disk drives has my eternal sympathy. I really, really want to know what they were thinking on both the format and how you inserted those disks.
Somewhere in the Usenet archive is Gordon trolling the OS/2 users for weeks (or months?) on end. I can't remember the exact details, but he had a bet with several people that Windows would have multitasking, or that OS/2 wouldn't have some sort of multitasking before Windows. The bet was to fly the winner to any city of their choice and buy them dinner.
The discussions were quite heated, and it was particularly memorable because he was one of the first 12 employees at Microsoft.
The cover photo is also classic.
When I would take the OS/2 system offline and replace it with a Windows cage, the payment network would sometimes tell me the uptime on the deprecated machines... one network operator claimed 8 years of uptime at one particular machine. I have no way of confirming that, but I definitely felt the OS/2 machines were rock solid, especially compared to the vulnerable Windows machines. Most small banks with NCR machines are running two software packages (APTRA Edge or Advance) with default admin passwords and are really behind on the monthly bug patches. Eek.
The OS/2 machines required you to input config info in hex though, so I was glad I didn't have to work on them in the field too much.
A software upgrade on the XP machines was ridiculous. It usually involved 3-4 hours of loading and booting and restarting with several different CDs. The other techs and I were convinced that NCR padded the installs/updates to take longer since their certified field techs billed out at around $300/hour.
I also worked at a bank, and as the article correctly stated, the entire bank was run on OS/2, most notably the ATMs, except that the ATMs I worked with were running OS/2 2.0.
However, when Windows NT 3.51 came out, that was the game changer. I was the only person I knew who even knew what it was (I read about it in a magazine at the time), and I was able to get a student-priced copy at my college bookstore. I started using it, and it was awesome, everything just worked, except for some games. You couldn't even compare NT 3.51 to OS/2; it wasn't even on the same level. The look and feel of NT was exactly the same as Windows 3.11, and all the programs worked.
Even with what was bleeding edge hardware at the time, I still remember trying to finish writing my senior project using Corel Draw and Microsoft Word on Windows 3.11, and having my computer crash every 30-45 minutes, and my project partner eventually broke down and started crying because of how frustrating it was.
Sadly, the big problem was that it swapped like mad in 4MB, which Windows 95 didn't. It doesn't sound like a big problem today, but the extra memory would have been expensive at the time. In particular, 4MB was a popular option on 386/486 machines, arranged as 4 x 1MB SIMMs, which often filled all the SIMM slots on the motherboard; upgrading to 8MB meant buying 4 x 2MB SIMMs, which doubled the memory cost.
For most of us at the time 4MB was a good usable amount of memory with EMM386 but the upgrade past that didn't give much extra functionality.
At the time, IBM had sent in scores of company reps to train up our floor staff on the advantages of OS/2 over the always-soon-to-be-released Chicago. They did a good job getting all of us to "drink the Kool Aid". I received a free (not pirated, promotional) copy of blue OS/2 Warp 3.0. It was a fantastic operating system for running a DOS based multi-node Telegard BBS and it did well with Win16 applications.
The impact of Windows 95 coming on the scene, though, is difficult to fully appreciate unless you were there. We had been selling pre-orders for months and there were a myriad of promos. I remember some of those preorders were sold under the threat that there wouldn't be enough copies to go around on release day. I had been playing with pirated copies of the betas of Windows 95 for the prior two months. Even in its beta form, it ran circles around Windows 3.0/3.1 in terms of reliability. I even remember reloading my PC with the most recent beta after release because a DOS application I used ran more reliably in it than in the RTM code.
Then launch day came. It was unlike anything I had ever seen in terms of a software release. We closed up at 9:00 PM and re-opened at 12:00 midnight to a line of customers that went around the building --- A line of customers ... for an operating system. We joked at the time that "Windows really was that bad". There were tons of additional promotions to ensure people came and lined up--Some RAM / hard disks selling under "cost" and others. And the atmosphere of the store felt like a party. We had theme music playing (start me up?) and some Microsoft video playing on our higher-end multi-media PCs. It was obvious to us, on the floor, trained by IBMs marketing machine, that Warp died that day.
As an anecdote to the stories about IBM's marketing being a little off: I remember, around the release of Warp 4.0, seeing an advertisement at a subway station with something along the lines of "Warp Obliterated my PC!" -- that tagline evidently meant to be some hip new use of the word obliterated.
I grew up in Dallas, SoftWarehouse (somebody else remembers that name, awesome)/CompUSA's original stomping grounds and this story kept coming back to me throughout the whole article. Windows 95 was considered revolutionary at the time, even to those of us lined up at the Lewisville, Texas store at 10:30PM to buy an operating system. (Incidentally, the first time I ever talked my dad into taking me to an overnight release of anything.) Windows 95 was the first operating system I ever saw non-technical people set out intentionally to buy and I spent months installing it for friends and family.
> We had theme music playing (start me up?) and some Microsoft video playing on our higher-end multi-media PCs.
Yep, it was "Start Me Up" by the Rolling Stones. If I remember correctly, the video was a demo reel of everything new in Windows 95 and was highlighted by the huge Start button popping in at the end of the video, then fading to black. It even had a snippet of the waving Windows 3.1 flag and the bear from the "Help / About" Easter Egg hidden in 3.1.
Even Windows 95 was limited by many system calls being funneled through single-threaded BIOS or DOS 16-bit land.
Back when CD burners were still uncommon, I got a Japanese, SCSI-based one as a gift. With my hardware, I'd lose a CD if I forgot to disable the screensaver: the number of disk seeks required to load the screensaver executable was enough to starve the buffer, and I'd get a buffer underrun every single time.
So, one time I was trying a Linux system and had to do a last-minute presentation, which required files on a floppy. For some reason I had no usable floppies and had to format one, and I couldn't wait for the burn to finish.
So I inserted the disk and, fully expecting to lose the CD, called fdformat. cdrecord didn't even flinch; the buffer was still full when the format finished.
I only ever booted Windows from then on to play games.
Plus it had a heterogeneous architecture, with dedicated chips for sound and graphics.
Just set up the required data structures and let the chip do its work alone. Sound familiar?
You also needed to load the Workbench and related libraries from floppy, so ROM firmware alone wasn't enough.
I don't remember if I ever saw a 1000.
Around that time I had upgraded from a 1200 bps to a 2400 bps to a 14400 bps modem. Mostly the multitasking ran smoothly enough to provide a reasonably speedy BBS experience even for the 14k4 caller. The multitasking was mostly visible when the system was building and compressing the QWK archive for offline message reading (Blue Wave FTW).
In 1996 I had to provide access to that data via the internet, and used the same OS/2 box for both modem access and as an internet server with most of the usual services (mail/web/dns/proxy).
It was very reliable (I really don't remember reboots, except maybe for upgrades). It impressed me when I realized I had created a 16MB vector on that 16MB server with everything still running normally. And I loved WPS (I'm not sure any of today's desktops has such good integration with the filesystem) and REXX (parsing without regexes was good, but I later learned how limited I was).
It was reasonably amazing at the time to be able to run more than one thing concurrently on your PC.
The small things one remembers...
In college I really wanted to like OS/2 2.0 (and later 2.1), but driver problems with the Diamond video card were a constant problem. (If only we'd sprung for the ATI Graphics Ultra Pro!) I had a copy of DeScribe; later sold it to someone through the ISCA BBS.
My impression at the time was that Microsoft executed so much better than its competitors, offsetting its weaker Office products with a better UI, which in turn gave you a reason to run Windows. I later attributed its success much more to its ruthless business practices.
This article brings the focus on the strategic vision: betting big on clones; belatedly embracing the Internet; hammering away at PDAs and tablets, yet losing big to the iPod, iPhone, et al. Sometimes we predict the future, and sometimes we make it.
System V: https://en.wikipedia.org/wiki/System_5
https://en.wikipedia.org/wiki/Xenix (which was created by the eventually infamous Santa Cruz Operation - SCO - and licensed by Microsoft)
Also VMS qualifies as a fully memory-protected, preemptive OS:
The real innovation of Windows and OS/2 was taking fully preemptive OSes and putting them on marginal hardware (at the time) like PCs.
SCO ported Xenix to a few processors for Microsoft, starting with the 8086/8088. It wasn't until 1987 that SCO took ownership of Xenix.
However, we very soon switched to simply having two PCs on our desktops.
It seems like it was on 5-1/4" disks. Can that be right?
And REXX! Ha. REXX.
The IBM PS/2 only came with the smaller disk, so bigger wasn't necessary.
And it explained to me, in clear terms, why Windows was such a buggy pile of shit. It was created of its culture.
http://ereads.com/ecms/book_title/Showstopper (warning, will set affiliate codes on Amazon links)
The idea being that Windows 95 was internally called Windows 4.0 with the codename Chicago.
I keep on searching for it but can't find it anywhere.
And Bill Gates on OS/2 in 1987: "I believe OS/2 is destined to be the most important operating system, and possibly program, of all time."
My favourite codename sniping had to do with Windows NT (codename Cairo) and NeXT. When announcing NeXTSTEP 4.0 (codename Mecca), Jobs quipped, "Why stop at Cairo when you can go all the way to Mecca?"
It talks about Windows 4.0 and other contemporary "next gen" operating systems.
The internals of pre OS X Mac OS are horrific and disturbing. They are in no way superior to the internals of Windows 95 and are at best only perceived superior due to a different user experience peppered with a healthy dose of self-delusion, certainly not through stability or performance. They are infantile compared to OS/2. Moreover, over the next few years after Windows 95 was released the PC improved greatly while the Mac mostly stagnated. Around the Power Mac era things were roughly even in terms of capabilities and performance though the Mac was significantly more expensive, by the Pentium and especially Pentium II era the PC began to become objectively more powerful than the Mac.
This put Apple into a freefall that they were only rescued from by the return of Steve Jobs, who established style as a foundation the company rested on and pushed them into digital media and mobile devices, as well as forcing a hardware architecture migration (to the formerly hated x86 from the PowerPC architecture developed by a consortium of which Apple was a huge part) and a complete OS rewrite (transforming Mac OS from an antiquated bucket of kludges into NeXTSTEP in Apple clothing).
At the time in question though Apple was pushing old, slow hardware at a price premium in a market that was rapidly passing them by.
As for Windows vs. OS/2, it's at best complicated. It's a bit like a microcosm of the PC vs. mainframe debate. The raison d'etre of Windows 95 was backward compatibility with a low memory footprint.
That may seem like a small thing, or at best not a thing to make so many enormous compromises over, but back then it was everything. The problem with doing multi-tasking "right" was that it imposed a ~4MB RAM requirement per 16-bit application being run at the same time. That is nothing today, but back then 4MB was the minimum requirement for installing Windows 95, and it represented a cost of around $100 in 1995. Owning a computer powerful enough to run even a handful of 16-bit apps while running OS/2 or NT was simply above the economic means of a lot of people. And by the time technology caught up and RAM became cheap enough to make proper multi-tasking cost effective, there was too much Windows 9x network effect for competitors to make much headway.
That experience ranked up there with writing my own PPP init scripts and endless hours tweaking my fvwmrc. I didn't play video games during that period, because I was quite seriously having more fun learning every nook and cranny of any unix or unix-like system I could get my hands on.
And then folks complain today about the tiniest things!
Back then we all thought "computers" was a hardware game. Only Microsoft realised the hardware didn't matter, software was the main game. And yes, I realise we have swung back to the "integrated hardware/software platform" thing being important again. Picking winning strategies in platform wars is hard.
Yes. This was one of Bill Gates' many strokes of genius. After just one year of Traf-O-Data (Microsoft's precursor), Gates saw that the future was in software, not hardware, and he created Microsoft centered around this very vision (while Apple bet the farm on hardware).
The clones did expand the Mac market, but Apple was so incompetent at that point that it didn't know how to handle that.
And the thing that probably saved Apple in the late 80s was the desktop publishing business.
I know that is the conventional story (and pretty similar for the Amiga). I'm not convinced.
I think that both the Amiga & the Atari ST were too far ahead of their time. They were multimedia workstations, without anywhere to play that multimedia (except on other Ataris and Amigas).
Like you said, the Mac managed to hit the desktop publishing wave, which was exactly right for the paper-centric late 80's and early 90's.
I never used things like MiNT, but those weren't supported by Atari anyway.
Atari just didn't have the resources to sink into an OS that could compete. They knew how to make cheap hardware, but after a while the PC ate their lunch. Nobody wanted to do biz with the Tramiels, so games were pretty much off the table.
...and education, and government, and academia. Apple was a pretty safe bet in the 80s (The Mac IIfx was designed to government specs and was the fastest desktop PC around). They only really started to lose market after Windows 95.
All I recall from childhood/teen years is "IBM compatible PC" a mantra on every TV commercial IBM or others produced.
A PC was for business and you were rich or a fool to spend thousands on something so useless to a regular Joe.
Apple was even more expensive and even less common than a PC and even so Apple was quirky and drew pictures, ATARI and Commodore were for games.
They were in totally different worlds. Back then I wouldn't have even thought to have one computer that did everything.
If there was any stigma, it was for being on the opposite end of the spectrum -- having a computer whose primary selling point was that it was cheap, like a Commodore 64. The C64 was a fine machine for the price, but nobody was going to ooooh and aaah over it the way they would if you took them into your Dad's study and showed them MacPaint.
I was the only one, among my circle of friends who owned computers, who had a PC instead of an Amiga.
My dad thought Amigas were only good for playing games; for anyone serious about computers, the PC was the way to go.
So I was left reading 68000 Assembly manuals, some Amiga reference books, and playing with them on computer parties we used to organize.
In any case, the only way to buy any of them in my home country was on credit.
I find this hard to believe, given that Rexx was developed by IBM.
The complete userspace is bytecode-based, regardless of the language.
Applications are compiled at installation time, or whenever the generated code is deemed no longer valid.
When the architecture changed to the PowerPC, many installations only required a regeneration of the installed software.
A concept that Microsoft tried with Longhorn and Windows Phone 7. Or we could even say, the model Android almost has as well.
Native Oberon and Inferno also tried a similar approach, to a certain extent.
Unfortunately it wasn't quite that clean in practice. Regen required TIMI to have access to the compilation templates for each program. These were intermediate compilation stages ( bytecode, I suppose ) that it could then translate to the new machine architecture.
However, these templates were often missing, deleted or out-of-sync. So we did an awful lot of recompiling from source, when we could find it...
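Here's a toy model of that regen idea in Python; everything in it (the names, the "template" field) is illustrative only, not how TIMI actually stores things:

```python
# Toy model of install-time compilation with regeneration: programs keep
# a portable "template" (intermediate form) alongside the native code,
# so a new architecture only needs a regen pass -- unless the template
# is missing, in which case you're back to recompiling from source.
PROGRAMS = {
    "payroll": {"template": "IR:payroll-v1", "native": ("cisc", "cisc-code")},
    "ledger":  {"template": None,            "native": ("cisc", "cisc-code")},
}

def run(name, arch):
    prog = PROGRAMS[name]
    cached_arch, code = prog["native"]
    if cached_arch != arch:                           # architecture changed
        if prog["template"] is None:
            raise RuntimeError(f"{name}: template lost, recompile from source")
        code = f"{arch}-code from {prog['template']}" # regen from the template
        prog["native"] = (arch, code)
    print(f"running {name} as {code}")

run("payroll", "cisc")    # native code already matches, just run
run("payroll", "power")   # silently regenerated from the template
try:
    run("ledger", "power")
except RuntimeError as e:
    print(e)              # ledger: template lost, recompile from source
```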
I only did a bit of AS/400 administration back in the summer of 1994; I just logged into the system and started the backup procedure.
I was actually doing Clipper development, and the company where I had a summer job used AS/400 systems for accounting.
Most of what I know about the OS/400 bytecode system I found while looking into compiler technologies a couple of years later.
"No, everything's memory!"
Mainframe environments are weird.
Or the OS/360, which uses virtualization for all OSs, like Hyper-V does on Windows.
The first OS to boot from the hypervisor has master rights, but all OSs are virtualized.
No, the hypervisor was called VM. According to WP it most frequently ran CMS guests, which was a lightweight single-app OS. But it could also run OS/360 guests (which predated VM).
Yep that is also a nice feature.
Which leads us to the great set of names where we have: IBM i, Apple iOS, Cisco IOS
Disclaimer: I'm an Ars subscriber and <3 Ars but when my bullshit detector is already off the charts before the article begins ... it makes me wonder how good the article is.
> In addition, IBM once made a deal with Commodore to license Amiga technology for OS/2 2.0 and above in exchange for the REXX scripting language.
In other words, IBM licensed REXX to Amiga in return for something else (we don't know what).
But who knows if this is true. Apparently IBM and Commodore already had an IP cross-licensing agreement at the time, and had access to their patents. And AREXX apparently did not contain any IBM code.
Put yourself in the shoes of someone who'd grown up on mainframes. The systems you're used to are fabulously expensive, sure, but their operating systems have sophisticated architectures that allow them to easily juggle multiple users running multiple programs while maintaining the security of the system overall. They are the completest expression ever realized of forty years of progress in the field of information technology.
Then, one day, someone drops a PC in your lap. You are horrified. It's a single-user, single-tasking system with absolutely nothing stopping the user from trashing everything by running the wrong program.
To you, this new... thing, whatever it is, is barely worth the name "computer." It feels more like a toy -- like something you would give to a child to play with. Certainly nobody would ever run critical systems on it, you think.
And here's the thing. You're absolutely right! All your concerns are one hundred percent valid. But it turns out that nobody cares; the PC is much, much cheaper, and it lets everyone have their own dedicated hardware right on their desk running any software they care to install, instead of time-sharing a mainframe and begging for permission each time they want to try a new program. Or, put more bluntly, it lets them escape from having to deal with the corporate IT priesthood (i.e. you) anymore.
The market speaks! You are derided as a pointy-headed nerd and swept into the dustbin of history.
Now fast forward ten or fifteen years. People start taking all those PCs they bought and hooking them together into networks... and suddenly all those things you were worried about back in the day come roaring back to bite them. Users discover that their machine stops talking to the network when they click and hold the mouse button, because the OS can't walk and chew gum at the same time. And the complete lack of security makes their machines super easy to compromise.
The PC vendors panic. They scramble to rewrite their old systems into systems that can live comfortably on a network. And when they're done, they roll out systems that look an awful lot like what you were insisting the baseline for a "real computer" was fifteen years ago. The world cheers and lines up to buy back all the sophistication they had happily thrown away before.
In other words, it's not so much that the IBMers were wrong, it's that they were early. When OS/2 arrived, the world didn't understand yet why it needed something like OS/2. And by the time it did, OS/2 didn't exist anymore. But in this business, being early is effectively the same thing as being wrong. The market doesn't give out points for foresight.
A beautiful thing is created. It is very expensive. It is sold to very few customers at very high margins. The very high margins subsidise the further perfection of the beautiful thing.
An ugly thing is created. It is very cheap. It is sold to very many customers at very low margins. The very large revenues subsidise the further papering-over of the flaws of the ugly thing.
Eventually, the ugly thing utterly supplants the beautiful thing. A few wistful old high priests mutter about the beautiful thing. Meanwhile, billions of dollars and millions of hours are wasted working around the flaws of the ugly thing and reinventing, poorly, the features of the beautiful thing.
Indeed; in fact I'd argue the market (and society in general) flat out punishes foresight, especially when it's right. What's that old Heinlein quote about Cassandra? And I think this is why so many nerds/engineers are "unsuccessful", at least by market definitions, or why they end up bitter: they'd rather be right than rich. Just look at the deriding RMS gets, and yet "Right to Read" has more or less come to pass.
History doesn't repeat itself. But it sure does rhyme.
(Although I'd be really happy to know if a "user-level sudo" is possible).
Unless you manage to survive and keep at it until the time turns out to be ripe. NeXT kinda-sorta did, 20 years down the road. It's an exception, though; graveyards are full of great OSes that didn't have the time to wait.
If you think script kiddies on the other side of the planet are a difficult adversary, try inside jobs like corrupt fellow employees: now they're a worthy adversary.
Anyway, the machine IBM gave me to use was a PS/2 Model 80. This was a 1988-era machine that had been brought to the semi-modern era with 20MB of RAM installed via several MCA expansion cards. Against my expectations, the machine ran well, despite the fact that its CPU was perhaps 10% the speed of the then state of the art.
From what I remember, the OS/2 LSE product itself was fairly solid. However, the biggest memory I have from that summer was the afternoon we spent playing around with the Microsoft Windows 95 beta disk we received for compatibility testing. Towards the end of the afternoon, we tried to DriveSpace (compress) the disk. We got bored during the wait for the compress, so we pulled the power on the machine thinking that would be the end of it. However, once we powered the machine back up to install OS/2, Windows 95 just resumed compressing away like nothing happened. A few weeks later, a friend and I went to CompUSA for the Windows 95 launch. Even at midnight, there was a line out the door, winding past the Windows 95 boxes, then the Plus Pack, then Office 95, and then memory upgrades... Didn't hear much about OS/2 after that...
The special effects were created on Amigas: http://www.midwinter.com/lurk/making/effects.html
Also, while looking at Video Toaster's entry on Wikipedia, I found this gem:
"An updated version called Video Toaster 4000 was later released, using the Amiga 4000's video slot. The 4000 was co-developed by actor Wil Wheaton, who worked on product testing and quality control. He later used his public profile to serve as a technology evangelist for the product. The Amiga Video Toaster 4000 source code was released in 2004 by NewTek & DiscreetFX."
(...all the events from this stretch of computing history seem so weird to me, like from a steampunk-like alternate reality movie. There's surely lots of context missing and stories that nobody will ever tell, since most of the decisions taken by all the key players seem so anti-business. Computers may have changed a lot from back then, but business is still business and all the decisions made seem either "irrational" or based on "hidden information" that is not part of the story.)
A workstation back then was defined as an expensive high-end computer that was designed to be used by only one user at a time (i.e. not a multi-user mainframe), yet was suitable for high-performance applications. They were intended mostly for corporations and academia where extra juice was needed and that could afford to buy workstations (think scientific computing, CAD and graphic design in the 1980s). IBM did at this time already manufacture workstations with UNIX as the operating system (see e.g. http://www.old-computers.com/museum/computer.asp?c=867&st=1 ) But they were way too expensive for the home user. PC (personal computer) was the low-end product that you could afford with a normal salary.
I understand UNIX back then was a mainframe and workstation operating system. Licensing was expensive and the hardware requirements beyond that of a PC. Few people had access to UNIX, mostly at universities and at big corporations. These were the very reasons why GNU and Linux were born - to provide a mostly-compatible UNIX clone for the home users with an affordable IBM PC compatible.
So my theory is that IBM was protecting its mainframe business: it did not want to put the powerful UNIX on the PC because it wanted to sell more expensive special hardware to those who wanted UNIX. So it hired a maverick company (Microsoft) to write a low-end, feature-poor operating system for the PC (DOS). It was (and continues to be) a business strategy to bundle better software with better hardware so that you can charge the customers who want only the superior software a higher price (still essentially the business model of a certain Cupertino, California based manufacturer).
One of my cow-orkers at Apple had worked on the OS/2 Presentation Manager at IBM. I tried talking with her about it, but she said the experience had been "absolutely awful" and she didn't want to say much else.
IBM never had a chance.
It worked nicely, and the install wasn't bad; it was quite a mess of many floppy disks, but then Windows didn't fare much better in that regard.
I remember fondly the Team OS/2 meetings where we could geek out over our love for OS/2 and mourn the failure of IBM's marketing to push it.
And then I found Linux.
The SIQ was a "synchronous" input queue, and the problem has been understated in the article and comments. It was really bad. The base OS was incredibly stable, but the GUI shell, not so much, due to the SIQ problem.
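To show the shape of the problem, here's a toy simulation in Python; it's not OS/2 code, just an illustration of how a single synchronous queue lets one hung app stall input for everyone:

```python
# Toy model of a synchronous/single input queue: every keystroke is
# delivered through one queue, and delivery blocks until the target app
# handles it. One misbehaving app stalls input for the whole desktop.
import queue, threading, time

events = queue.Queue()           # the one shared input queue

def app(name, hang=False):
    def handle(event):
        if hang:
            time.sleep(1e9)      # app stops servicing its messages
        print(f"{name} handled {event}")
    return handle

apps = {"editor": app("editor"), "game": app("game", hang=True)}

def input_dispatcher():
    while True:
        target, event = events.get()
        apps[target](event)      # synchronous: blocks on a hung app

threading.Thread(target=input_dispatcher, daemon=True).start()
events.put(("editor", "keypress A"))  # delivered fine
events.put(("game",   "keypress B"))  # game hangs...
events.put(("editor", "keypress C"))  # ...so this one never arrives
time.sleep(0.5)                       # "keypress C" is still stuck behind it
```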
There were a number of Unix and Unix-like systems in addition to the ones already listed: Coherent, Interactive, and SCO are some that come to mind. They were pretty expensive IIRC, around $1000 to license.
"These machines were meant to wrestle control of the PC industry away from the clone makers, but they were also meant to subtly push people back toward a world where PCs were the servants and mainframes were the masters. They were never allowed to be too fast or run a proper operating system that would take advantage of the 32-bit computing power available with the 386 chip. In trying to do two contradictory things at once, they failed at both."
Not quite the same situation, but they have many similarities.
I'm glad it ended up the way it did. Microsoft at the time was betting on openness being a feature, and I think they helped move the computer and software industries in the direction they have gone since: towards greater openness (and thereby professionalism).
People associate Microsoft with closed source, but it is of course relative; in their day they were the vendor banking on openness and courting developers harder than the others.
I own a copy of the OS/2 Galactic Civilizations 2.
I think it was NeXT who got there first on this.
Probably all Windows by now...
A couple of decades later, Dave Cutler is still around at Microsoft and worked on the hypervisor for the Xbox One at the ripe young age of 71, allowing games to run seamlessly beside apps.
>Underneath it all lies the magic — a system layer called the hypervisor that manages resources and keeps both platforms running optimally even as users bounce back and forth between games, apps, and TV.
>To build the hypervisor, Multerer recruited the heaviest hitter he could find: David Cutler, a legendary 71-year-old Microsoft senior technical fellow who wrote the VMS mainframe operating system in 1975 and then came to Microsoft and served as the chief architect of Windows NT.
>It appears his work bridging the two sides of the One has gone swimmingly: jumping between massively complex games like Forza Motorsport 5, TV, and apps like Skype and Internet Explorer was seamless when I got to play with a system in Redmond. Switching in and out of Forza was particularly impressive: the game instantly resumed, with no loading times at all. "It all just works for people," says Henshaw as he walks me through the demo. "They don’t have to think about what operating system is there."
I once attended a session by him about the Windows kernel design; quite interesting.
(So far as I know, both of these are just amusing coincidences.)
I think it is good that the industry enjoys different types of OS architectures and designs.
Just because UNIX managed to spread as it did doesn't mean it is the be-all of OS design. After all, its creators tried to fix UNIX; the industry just did not adopt it.
Bad example. Really bad example: Not even IBM could standardize on a single OS for the System/360.
The System/360 went through a few OS iterations before OS/360 came along: OS/360 was late, as recounted in The Mythical Man-Month, so DOS/360 came along, then BOS/360, then TOS/360, and even PCP, which didn't support multiprogramming. Other OSes were CP-67 (which became VM), MFT, MVT, and still more on top of that.
To this day, there are multiple OSes for the architecture descended from the System/360, including Linux.