Half an operating system: The triumph and tragedy of OS/2 (2013) (arstechnica.com)
123 points by eaguyhn 5 days ago | 87 comments





A couple things to add:

1. Microsoft supposedly got paid about $400 million for its work on OS/2 through v1.3.

Developing Windows through v3.0 cost much, much less. It is safe to say that Microsoft made money from the partnership with IBM.

2. OS/2 installation had a fatal flaw: if your PC had cache memory you had to disable it while the installer copied the files from floppy. It didn't hurt install speed, since floppies were the bottleneck, but it was very confusing to the OS/2 newcomers IBM was trying to attract.

3. The Ziff-Davis magazines were pay-to-play in their editorial content, and they plugged Microsoft as the better choice all the time. And pre-Internet they were a big source of information.


The other fatal flaw of the OS/2 installation was that there were 20+ installation diskettes, so your chance of hitting a bad disk halfway through a 2+ hour install was pretty good.

If you were an OS/2 administrator back in the day, hearing the disk drive start trying to re-read sectors on Disk 18 was a terrifying prospect.


I never got OS/2 installed as a teenager. I had everything set up, and I ran into everything that could go wrong. The computer store I bought it from (and did seasonal sales for, at 13) was sure I was just a stupid kid. I brought it into the store after 3 days and it became everyone's personal project to prove me wrong. It never worked.

Add Turbo Pascal, Borland C++ and Windows 3.x to that floppy salad.

All those afternoons spent installing software, and now they complain that an app takes too long to install.

The first thing I did when I bought software back then was to create backup floppies and always install from the copies. Naturally it did not work with some copy protection schemes.


For Windows you could just copy all the disks to a directory on the HDD and install from there. It went much faster that way.

Did you ever install Windows NT from the 27 floppies it came on?

Yep, I did.


No, my copy already came on a CD-ROM.

Who here remembers installing Slackware 1.0 from 24 floppies? Same story, except at least it was free!

For my first install of Slackware in 1995 or 96, I downloaded 40 floppy disk images and copied them onto floppy disks at University, to install it on a friend's Pentium computer. The very last one ended up corrupted. That was part of the X11 "package". So we ended up without any GUI. But at least we could play Doom on the console!

Haha, right! Quake also ran from the console. I spent my high school years working at an ISP NOC in the 90s, and one of my responsibilities was maintaining the game servers (battle.net & Quake).

A few years later they were passing out free OS/2 CDs as if there were no tomorrow (which, by then, there really wasn't...)

Free as in beer unfortunately.

Many of my teenage years were spent downloading the "A" set of floppies over a 28.8k modem, until I befriended somebody at a university who downloaded the whole thing to a QIC tape and just mailed it to me.

Win 3.1x had 11 diskettes. So while OS/2 was worse, it wasn't an order of magnitude worse. And after you were done, you were less likely to have to do a clean reinstall.

Installing operating systems back then was much more involved than swapping floppies. The amount of involvement in an OS/2 installation could easily make it an order of magnitude worse. That being said, the results were usually worth it. OS/2 was far less likely to be affected by misbehaving applications, which was a serious problem with Windows 3.x (and, to a lesser degree, Windows 9x).

My stubborn refusal to use Windows during my student days was beneficial in a few cases. While my classmates were getting uninformative general protection faults, I was receiving much more insightful error messages.


I remember in '94 doing an install of an Oracle Forms system we had built - it took 13 or 14 floppy disks to install the multiple Oracle products needed.

It took two of us 2 days to install it for 5 systems - we also had to set up a server as they had neglected to buy one :-)

We even took enough cable and NICs to build a small network in case they had no suitable one.


"So a rogue group in Boca Raton, Florida—far away from IBM headquarters—was allowed to use a radical strategy to design and produce a machine using largely off-the-shelf parts and a third-party CPU, operating system, and programming languages."

At IBM around this time, Boca Raton was known as the place where people who weren't competent but couldn't be convinced to leave were transferred.


Yep, never underestimate the capabilities and ingenuity of people, especially those with curiosity and time on their hands. They were basically sent to Boca to rot, so they really did not care what they did; IBM gave them a marginal budget and approved almost any project that did not increase headcount or budget. You would have thought IBM would have caught on then, or at least after Bell Labs, that incubators work and that funding incubators is a good bet. The only strange part is that incubators never seem to be able to retain their culture and seem to be transient in nature.

> The only strange part is that incubators never seem to be able to retain their culture and seem to be transient in nature

Because incubators work only as long as they're not administered and controlled. The moment they produce a successful project that brings in money, you need to control the process and put administration in place.


Not if you spin said project into an independent company, take the key players with it, and replace them with energetic and creative individuals. Incubators need product guys who can see beyond the technicool aspect of a project and spin it into a company. The problem nowadays is that there is just too much cargo-culting around incubation. There is a happy middle between Bell Labs and today's fake-it-till-you-make-it incubation. Don't get me wrong, it takes some believing in yourself, and I think Jobs and Musk had and have the right amount. But what we see more of today is believing plus outright lying and borderline scamming, with no technical assets and just a story. Whereas in the days of Bell Labs you had the tech but no one to turn it into a product, much less an independent company.

There was a brief moment in time when South Florida had a growing software industry. IBM had the T-REX campus (from the air the buildings were assembled in the shape of a dinosaur) where this thing called OS/2 was being developed. As a kid I got to beta test a lot of games and office apps on it. Then one day IBM moved everyone to North Carolina and half my friends disappeared over summer.

"Competent" at IBM was defined around "being able to jump through the correct hoops after filling out the correct forms".

Which is better than 90+% of the places people work. Most big companies I've worked for or experienced can't find the forms and wouldn't know what a hoop is if you shoved it up their ass.

This is based on experience in the 1990's, so may not hold anymore.

IBM normally at least had a procedure. Do you want to start a new project? Okay, fill in this form and give a presentation to that group next month. You've got about a 1 in 3 chance of 12-18 months of funding. We'll review your progress at that point as to whether to kill your project.

That's better than almost anywhere I've worked since, other than startups where I was a principal.

IBM in the 1990's (remember--this is still mostly pre-internet) was still an amazing place if you were motivated. You could pick up the phone and call a world expert on just about any technical subject and just talk to them. There were lots of interesting projects still going on and someone who wasn't lazy could quite easily maneuver into them--even if you were just an intern or co-op. (As a co-op, I sat in an X-ray lithography class and lab! You couldn't buy that course at the time, for any amount of money let alone do stuff in the lab.)

In the 1990's, IBM still also had lots of little enclaves of hypercompetency--generally composed of old greybeards who were within 10 years of retirement. And, if you were motivated, you could attach yourself to one of those and do amazing things.

Or, you could go get an MBA and stab your way up the management chain. That was also a choice and option at IBM for those so inclined and motivated.

The point was that in the 1990's, IBM would let you wallow in laziness if you wanted--the company wasn't going to give you motivation. But if you found your own motivation, it was really quite an amazing opportunity.

(Story time: we were at a bar celebrating the fact that a friend's project shipped and he said: "I've been at IBM almost 12 years and I finally had a project ship." I stared at him in horror: "Dude, I've been here about 6 years and I've had all four of my projects ship. Perhaps you should pick projects that have customers instead of political connections.")


I worked on the OS/2 Workplace Shell, Taligent, and Workplace OS. And later the XCP network protocol (and rode the Pervasive Computing division into the ground).

:-(


I liked OS/2 and used it simply because it meant I could develop 16 bit code using a protected mode operating system, which made development much faster.

Only as the last step was it ported to real mode DOS.


Combine 16-bit mode with pointer magic --> birdshot the OS.

Usually it was the interrupt vector table that got scrambled, and that was enough to scramble your hard disk. I would always simply reboot after any erratic program behavior. This is why OS/2 was soooo much better for development.

Intel made a large mistake putting the interrupt table at 0000. It should have put the boot ROM there.


Is there any known reason why the IDT is initially at address 0 and the boot address is 0xFFFFFFF0?

I don't think there is any technical reason it couldn't be anywhere else, except that they were expecting the IDT to be mapped to DRAM. And you have basically two memory areas: DRAM and the BIOS in ROM. If the addresses don't start at 0 and at max memory, then you end up with RAM being non-contiguous, which is a fat wart. Pick your poison.

Also remember that C and C++ weren't really a thing back in 1977. C and C++ tend to love splatting low memory when you mess up your pointer magic. Other languages didn't have that problem.
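
A minimal sketch of what that splatting looks like in practice, assuming a 16-bit real-mode DOS compiler such as Turbo C (the far keyword and the MK_FP macro from <dos.h> are Borland-specific; a protected-mode OS like OS/2 would trap the access instead of letting it through):

    /* The 8086 keeps its interrupt vector table (IVT) at physical address
       0000:0000: 256 four-byte far pointers, one per interrupt. Nothing in
       real mode stops a buggy far-pointer write from landing right on it. */
    #include <dos.h>

    void scribble_on_ivt(void)
    {
        /* Far pointer to the INT 13h (BIOS disk services) vector slot:
           segment 0, offset 0x13 * 4. */
        unsigned long far *int13 = (unsigned long far *) MK_FP(0, 0x13 * 4);

        /* One careless 32-bit store, the kind a bad pointer computation
           produces, and the disk-service vector now points at garbage.
           The machine limps along until the next disk access, then
           scrambles the drive or hangs; hence "reboot after any erratic
           behavior". */
        *int13 = 0xDEADBEEFUL;
    }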


> IBM rules about confidentiality meant that some Microsoft employees were unable to talk to other Microsoft employees without a legal translator between them.

Haha, this is so ironic. I've worked on projects for MS that were just the same. We had to have code names for their code names and a code name for MS itself. Even our own code names had to be uttered with caution. Maybe they learned from IBM? Edit: Thinking about it some more, maybe they really did. None of the other big tech companies we did work for were that secretive.


There is Arca Noae[1], but I don't know how successful their business is.

[1] https://www.arcanoae.com/


Thanks. Didn't know they existed. From their page:

https://www.arcanoae.com/arcaos/

"ArcaOS is more compatible with modern hardware, makes more efficient use of memory and system resources, and installs more easily than any other OS/2 distribution…ever. Really.

Do you have a system with 16GB of RAM in it? Want your apps to really fly? Configure ArcaOS to utilize all memory above 4GB as a RAM disk, and at bootup, copy your most frequently used applications there. It’s like running your OS/2, Windows, DOS, REXX, Java, and ported Linux applications on air."

And they keep developing it for newer hardware:

https://www.arcanoae.com/arca-noae-progress-report-usb-arcao...

"When IBM left off USB driver development, OS/2 had a working, 16-bit USB 1.x and 2.0 driver stack. Fast forward to 2019, and this is no longer adequate for the needs of today’s hardware.

The Arca Noae USB stack is now fully 32-bit, and USB 3 support development continues to make good progress. Implementing USB 3 support has been tedious because the OS/2 USB architecture didn’t accommodate the peculiarities of USB 3 well."

"entry last updated: October 16th, 2019"


I can't speak to their success, but I have an ArcaOS Personal Edition and am a big fan. It definitely seems positioned as a maintenance/upgrade path for existing OS/2 users more than anything else.

SOM was much more advanced than COM and nowadays it is almost impossible to find any documentation online.

WinRT is closer to it, but still lacks the metaclass capabilities that SOM had.



SOM was pretty cool, for sure. But use more than a couple of SOM objects and it would bring my 486 to its knees.

Interesting. I never got to use it, only wondered at it in awe reading articles and OOPSLA papers about it.

My OS/2 experience is based on playing around with it at trade shows.

When I finally got a PC, I went with a 386SX and was unable to run OS/2, as the only shop in town selling PCs with OS/2 sold it alongside PS/2 systems, which were about 1000 € more expensive (in today's money) than compatibles running the DOS + Win 3.x combo.



(2013)

> This story first ran in November 2013, and it appears unchanged below.


I was pleasantly surprised that Ars re-ran this story. It was one of my favorites to write. Also probably the most popular single article I ever did.

I almost got to use OS/2 for a commercial project. We had to fetch information from a PC desktop and feed it to an IBM mainframe while servicing up to two simultaneous users.

OS/2 1.1 EE had EHLLAPI support but was still a few months from release, and I couldn't wait for it. I was really disappointed I didn't get to explore it fully, because the first look was really impressive.

I ended up using DOS + DESQview, and it worked out fine.


DESQview. Now there’s a name I haven’t heard in 20 years.

I never used it, but I remember being blown away at the time by the screenshots and advertised features.

DESQview API was pretty nice for the time.

> I ended up using DOS + DESQview, and it worked out fine.

This was my standard setup for almost a decade, from my first 486 machine (my previous computer had been an 8088 PC clone, with only two 360K floppy drives and no hard disk) until I replaced it with a Windows 98 machine. I even ran Windows 3.1 occasionally in a DESQview window.


OS/2 was the most impressive "prosumer" OS of the time.

I wasn't as impressed by anything else not from Apple or Microsoft, except NeXTStep and BeOS.


What did you think of the Amiga? Early 90's Amiga OS was contemporary with OS/2 2.x. It had preemptive multitasking, but lacked memory protection.

Like everything about the Amiga: in 1985, implausible; in 1989, excusable; in 1991, unforgivable. And by 1994: punishable by death.

(See also, though with timeline offset and/or scaled: RISC OS, MacOS.)


I liked it. It was extremely focused on games instead of regular computing, though (save for the Video Toaster, possibly the first "killer app").

When Windows 95 launched I was largely underwhelmed. It was like an ugly version of OS/2 with more bugs.

They stole the right-click menu and the Blue Screen of Death.


Stole? You need to read about Windows 95’s development history.

They took IBM's money and spent a bunch of it on developing a competitor to OS/2.

They also baited a bunch of their competitors into developing for OS/2 because it had IBM's support, then made their own products for Windows.

WordPerfect and Lotus 1-2-3 for OS/2 were great products for the wrong OS.


Little-known fact: Windows today is an entirely different OS from early DOS-based Windows, because Microsoft took ownership of the 386-based “OS/2 3.0” codebase it jointly developed with IBM, which became the foundation for Windows NT 3.x, on which all modern Windows is based. This gave them a huge head start in having a modern, enterprise-grade operating system that allowed them to dominate the market.

I don't think that's the case; Cutler's NT has VMS-esque foundations, not OS/2.

Windows NT was originally going to be OS/2 NT. Due to the architecture of NT, it could support many different APIs.

Due to the success of Windows 3.0 and the lack of OS/2 success, Microsoft wisely decided to expand the Windows API to the Win32 API and have it be the default API.

The book Show Stopper has a good account of the early days of NT.

https://www.amazon.com/Show-Stopper-Breakneck-Generation-Mic...

It is an interesting read.


The original NT contained an OS/2 subsystem layer too, like WSL today. It allowed running OS/2 1.x apps.

WSL is vastly different from the old-school NT subsystems, both in the technology and in the result. And rightly so, because while SFU & co. were in some ways more integrated, they effectively formed a separate platform from everything else, and who wants to support that? Arguably the OS/2 subsystem did not have this problem, because the environment/ecosystem to support was way smaller, and it was about making existing binaries built for that system work. So on that last point, yes, WSL is similar to the OS/2 subsystem; but it could not have been like that in SFU, because Linux was not seen as a serious competitor at the time (and, well, it actually was not...); and the price to pay, now that co-evolution did not happen, is a more segregated environment.

Indeed, there's no relationship between OS/2 and the NT core, other than historic naming. NT is a cleaner, OO-style reimplementation of many core concepts behind VMS, to a sometimes funny extent.

The programming API for userland started out as OS/2's, though.


Russinovich claims, with some convincing evidence, that it was an OpenVMS clone and upgrade:

https://www.itprotoday.com/compute-engines/windows-nt-and-vm...

That gave them a huge lead in developing a server-grade OS. They probably just used OS/2 for the OS/2 part that ran on top of it. Fast forward to today, and the result still isn't as robust as its predecessor in cluster configurations, or as the AS/400 Microsoft ran on before using their own product. They did show how much better VMS could've been as a desktop... by dominating on the desktop. :)


I didn't work for Microsoft, but I don't believe that Windows NT and OS/2 Warp had much of significance in common, other than starting as a nominal "OS/2 3.0". The relationship between IBM and MS had fallen apart before 2.0 was released.

Yes, the "OS/2 3.0" was a marketing/naming thing, and was dropped when it became obvious that OS/2 wasn't going anywhere in the market.

There were some allegations that Cutler did a bit of a Levandowski on DEC. Or maybe it should be that Levandowski did a bit of a Cutler on Google.

Anyway, they settled out of court, with MS promising to port Windows to DEC's Alpha.


Eh, there's a big difference between the two. Levandowski basically stole files from Google and gave them to Uber. Cutler just transferred his knowledge of building VMS to building Windows NT. I suspect that's why it was settled out of court: the case that Cutler had stolen something tangible from DEC was much, much weaker.

I'm not so sure about that. DEC may have simply decided to settle because fighting MS in open court would take a long time and deplete them of cash they needed very badly (we all know how they ended up). Alpha was their hail-mary pass, and having NT on Alpha could have strengthened their position.

Cutler didn't steal files, as they would have been pretty useless. But DEC had its feathers ruffled because Cutler uprooted a team he had picked for his attempt at a modern rewrite of VMS, which Digital refused to do.

And it was a good team.


Citation?

Wikipedia pages on NT, DEC, the Alpha microprocessor and Microsoft's general legal strategies would be an excellent starting point.

A friend of mine worked for a company that had a TCP/IP stack for VMS. He said they were asked how hard it would be to port it to NT. The answer was: trivial.

That said I think there is a difference between re-implementing APIs and stealing sources.


During the early days of NT, if you wanted a port, you filled in a form and entered a partnership in which you were responsible for your system-specific code and MS cooperated with you.

The majority of the NT/Alpha work was done by a team at DEC, later Compaq.


That was after the Pentium Pro appeared. Before the Pentium Pro, MS wanted NT to be cross-platform just in case 'RISC' processors became big.

Another little-known fact, which I only discovered this week while listening to the Coding After Work podcast, is that some of the multitasking effort from DOS 4 (which apparently went nowhere) was actually integrated into the OS/2 effort.

That was a totally different DOS 4, which only surfaced relatively recently. (And the shipped versions apparently weren't fully featured either.)

The shipped DOS 4 versions were a consequence of the project's failure.

Like almost everyone else I kept to 3.3, totally ignoring 4.

Afterwards I actually ended up using DR-DOS 5 for a while, until MS-DOS 6 came out.


Yes, I heard that too.

Apple and MS basically bought the current incarnations of their respective OSes at some point in the past.


Microsoft bought the first incarnation of their OS as well, licensing QDOS to become MS-DOS.

This thread brings back the memories - I was an editor at a Ziff-Davis computer magazine back in the day.

Looking back on all this, the only lasting legacy I can identify is the Linux Windows emulation layer (Wine), which exists only because Microsoft was required to make the Windows API public so it could be used by OS/2.

Given the difficulty of getting an ancient version of Windows running on currently available hardware, Linux/Wine is now the only practical way to run a lot of old Microsoft Windows application software. If it weren't for Wine (which was made possible by OS/2), that application software would be unusable.


>Long before operating systems got exciting names based on giant cats and towns in California named after dogs

What operating systems were named after towns in California named after dogs?


Not a town, but I'm guessing a reference to this. https://en.wikipedia.org/wiki/Mavericks,_California#Origin_o...

> In early March 1967, Alex Matienzo, Jim Thompson, and Dick Notmeyer surfed the distant waves of Pillar Point. With them was Matienzo's roommate's white-haired German Shepherd, Maverick, who was accustomed to swimming with his owner and Matienzo while they were surfing.



I was a fan, but looking at the UI now, it's too messy. Has any designer mocked up something like the OS/2 UI, but good?

There was something called Tabworks at some point in time, a reworking of the Windows GUI that I always really liked. Very intuitive and worked well on smaller screens.

Don't you mean Geoworks, which was bundled with Philips PCs?


Ah thanks. I don't remember ever seeing it.

It came pre-installed on the Compaq Aero, a very early notebook format laptop.

The OS/2 team wrote a replacement for Windows' Program Manager called Workplace Shell for Windows. It gave Windows 3.x the OS/2 desktop and was vastly better than Program Manager. WPS4WIN was eventually open-sourced.

Never mind the UI. I wanted OS/2 Warp, but it wouldn’t boot on my machine because the RAM’s parity method was not the IBM way. Or at least that was roughly the error code.


