How the IBM PC Won, Then Lost, the Personal Computer Market (ieee.org)
99 points by headalgorithm 2 days ago | 101 comments

I worked at IBM, '91 - '93, in the PC Business, in the sales channel.

We had a couple of problems that IBM's competitors didn't have. For one thing, we couldn't make too powerful a PC that might cannibalize the minicomputer sales (AS/400). IBM made much more money on an AS/400 than a PS/2 model 95.

Our cost structure was too high. How high? We lost $1 for every $100 we sold.

Part of the reason the cost structure was high was the number of IBMers: I worked in an office of about 15 people supporting the NY/Long Island region. Four years later, it was down to 1 person.

Another reason was our physical costs were high. For example, we couldn't buy RAM from Intel; we had to buy RAM from IBM (which was fabbed in Kingston, NY, not to be confused with the Kingston RAM mfr).

Also, we were struggling to get the correct mix of sales channels. For example, we couldn't sell PCs directly through the IBM sales force (too expensive, IIRC it added 36% to the sales cost). Instead, the IBM sales force would partner with dealerships (which added 21% to the sales cost). The web didn't exist yet, not really, so we couldn't sell through there. We could have done direct-response (an 800 number, like Dell & Gateway were doing), but that took a while to spin up. And it was similar in cost to selling through dealerships. Also we didn't sell through distributors, which was the least-expensive channel (16% IIRC).

I remember looking at the price list. We had two levels of dealers: those who bought large volumes, who typically purchased equipment at 30% off list, and those who worked in smaller quantities, who would get ~20% off list.

As the PC Market became more competitive, we couldn't afford high list prices (we were too expensive), so one of the ways we brought down our list prices was to shrink the discount we gave to the dealers.
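The arithmetic works because the dealer's buy price is list minus discount, so shrinking the discount lets the list price fall without changing what IBM collects per unit. A rough sketch (all dollar figures are invented examples; only the ~30%/~20% discount tiers come from the comments above):

```python
# Hedged sketch: the dollar figures are invented examples; only the
# ~30% / ~20% dealer-discount tiers come from the comments above.

def dealer_cost(list_price, discount):
    """What a dealer actually pays IBM for one unit."""
    return list_price * (1 - discount)

# Before: high list price, deep large-volume discount.
old = dealer_cost(2000, 0.30)   # dealer pays $1,400

# After: shrink the discount and the list price can drop 12.5%
# without changing what IBM collects per unit.
new = dealer_cost(1750, 0.20)   # dealer still pays $1,400

assert old == new == 1400.0
```

The visible list price falls, so IBM looks more competitive against the clones, while revenue per unit through the channel stays the same.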

The dealer was not allowed to sell online or via computer magazines: there had to be in-person contact. That helped prevent the Atlanta dealers from poaching the NYC dealers' business.

It's easy to armchair-quarterback the decisions IBM made, but a lot of those decisions were nuanced. For example, you could say, "Why didn't IBM have an 800 number to buy PCs?". And the answer is that a lot of our biggest customers were corporations, not individual consumers, and they felt more comfortable purchasing through a dealer they trusted.

> we couldn't make too powerful a PC that might cannibalize the minicomputer sales (AS/400).

Agreed, knowing when to cannibalize your own market is one of the key inflection points.

Apple has done a fantastic job (up until now) of introducing something that basically makes a "current at the time" product obsolete.

Classic version was the iPhone when the iPod was a household name. A more conservative company would have figured out how to keep the iPod market extremely healthy, but would have ended up losing out on the whole pie. A lot of smaller/other instances as well.

> IBM made much more money on an AS/400 than a PS/2 model 95.

I wonder if someone high up in IBM made the same mistake for personal computing that Ballmer did with the iPhone - "There's no chance that the iPhone is going to get any significant market share. No chance..."

To be explicit, making the bet to cannibalize your own market is one of the toughest any company can make. High risk, high reward. It's so easy to point at the winners and forget the landscape littered with the ones that lost.

But, being able to do it consistently is what makes a company multi-generational.

>> we couldn't make too powerful a PC that might cannibalize the minicomputer sales (AS/400).

> Apple has done a fantastic job (up until now) of introducing something that basically makes a "current at the time" product obsolete.

An interesting analogue right now is the iPad and the MacBook lineup, now that they both use the same processor. It'll be interesting to see whether they make one or the other obsolete or keep both going. (Yes, I realize there are all sorts of design tradeoffs between direct & indirect manipulation environments.)

Personally, I couldn't wait, nor justify owning two devices - so I migrated to a Windows/Linux tablet after waiting so many years, despite having been invested in the Mac platform for a decade.

The big problem is that they want people to be able to use iPads for 95% of what they use Macs for, but as developers we reside solidly in that 5% of critical functionality. It’s very hard to be productive as a developer on a system as locked down as an iPad. I want Emacs, Git, bash, and the Erlang runtime. That’s all I would need to be happy, but I think we’re a long way off from that on the iPad, if it ever happens at all. I’m not sure they can do it while maintaining their security model either, and they’re not going to give up their security model for anything less than antitrust action.

There's no real reason why Apple can't build a "development sandbox" app (with no access to anything on the iPad outside the sandbox) that includes a terminal, a package manager (possibly acquiring Homebrew to help them make it happen), possibly a wrapping GUI to manage secrets, an IDE-style text editor, etc.

I'm not really sure why nobody's built one yet except for the sheer complexity involved. But it certainly seems doable.

There is iSH, which somehow managed to make it onto the App Store. The only problem is that it's emulating x86, so it doesn't exactly use the full potential of the M1.

AIUI and IIRC, iSH implements just enough of an x86 emulator to be able to interpret x86 binaries and JIT "compile" them to a series of function calls rather than directly to native machine code, and it implements enough of the x86 Linux ABI to be able to run most binaries. In short, it's effectively treating x86 machine code as a scripting language, and Apple does allow scripting languages (e.g. Python) on the App Store.
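That "JIT to function calls" idea can be sketched in miniature: instead of emitting native machine code, the translator turns each decoded instruction into a pre-bound host-language function call. This is an illustrative toy, not iSH's actual code; the two-instruction "ISA" here is invented for the example:

```python
# Illustrative toy, not iSH's code: "compile" machine code into a list
# of host function calls instead of native instructions. The two-opcode
# "ISA" (0x01 = load, 0x02 = add) is invented for this example.

def op_load(cpu, operand):
    cpu["acc"] = operand

def op_add(cpu, operand):
    cpu["acc"] += operand

HANDLERS = {0x01: op_load, 0x02: op_add}

def translate(code):
    """Decode once, then emit a trace of pre-bound function calls -
    the 'JIT to function calls' idea, with no native code generated."""
    return [lambda cpu, h=HANDLERS[op], o=arg: h(cpu, o)
            for op, arg in code]

def run(trace):
    cpu = {"acc": 0}       # a one-register "CPU" state
    for step in trace:     # executing = calling each function in order
        step(cpu)
    return cpu["acc"]

run(translate([(0x01, 40), (0x02, 2)]))  # → 42
```

Since no executable native code is ever generated, the guest binary is effectively being *interpreted*, which is presumably what lets it pass App Store review the same way a Python script does.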

The iSH team did run into some friction with Apple over the fact that the built-in filesystem image is Alpine and thus includes a package manager, but apparently they were able to resolve that situation in a way that allowed them to keep `apk` in the built-in filesystem image: https://ish.app/blog/app-store-removal

Yeah, there are certain critical functions that to me define a "real computer". Accessing external mass storage, system-wide volume control for individual programs, being able to back up the system without a second computer, not having any sort of settings or features that depend on a second computer (like things you can only do on bootloader-locked android using ADB from a PC). I'm never going to have a household without a real computer in it.

> direct & indirect manipulation environments

What do you mean by this?

I am guessing the commenter means computing with and without use of your fingers on touch screens.

Without the fingers? How does that happen?

That cannibalizing one's own market is a viable strategy at all speaks to the inherent flaws of free market capitalism.

I don’t understand what you mean.

Supposedly the advantages of free market capitalism are that it increases efficiency and makes for better products, primarily through competitive pressure. Yet here is a case where a company is actively stifling its own product in order to ensure its more profitable product isn't threatened.

My understanding from the OP's comment was that IBM's failure to cannibalize its AS/400 market led to the failure of its PC business. IBM chose to stifle its own product to keep a more profitable product alive, and as a result IBM's PC business failed. That sounds like exactly how things should work in ideal free market capitalism. The idea isn't that every business will choose to make better products to deal with competitive pressure, but that those who don't will die.

Parent also said:

> To be explicit, making the bet to cannibalize your own market is one of the toughest any company can make. High risk, high reward.

In other words, it is considered a viable strategy, although also a risky one. The fact that it didn't pan out in this instance doesn't change that.

This is a classic case of a business losing a market because they didn't want to let go of a dying but profitable business. The market always wins in the long run.

Don't feel sad. This market dynamic is how startups can get ahead. Otherwise everything would be dominated by the huge incumbents forever.

> Otherwise everything would be dominated by the huge incumbents forever.

One thing to remember is that mainframes are still dominated by IBM, 60 years later. And mainframes are still a somewhat stable market worth billions of dollars.

IT works primarily by accretion: new markets are created, and in those markets IT also seems to tend toward natural monopolies due to network effects, so once a market stabilizes the huge incumbents are there forever.

IBM still dominates mainframes, minicomputers are gone so no one dominates those anymore, workstations are also gone so no domination there either, but PCs are dominated by Microsoft and have been for 40 years, smartphones are split between Android and iOS and both will probably be around as long as we use smartphones.

> PCs are dominated by Microsoft

?? PC (desktop) OSes are certainly dominated by Microsoft, but I don't see any one player dominating the PC market. There are definitely companies that dominate different PC components (OS, CPU, audio, network, etc.).

Anyway, I know that the mainframe market is pretty healthy, but who are the clients?

> Anyway, I know that the mainframe market is pretty healthy, but who are the clients?

Banks, insurance companies, and airlines. The amount of only-documented-in-code corner cases and custom business logic is immense (for the really old systems, up to 60 years' worth!), which means rewrites are immensely difficult, expensive, and prone to failure. The amount of government regulation, and the fines for when things go south (aka people lose money), is equally massive. And these companies are ruled by beancounters by definition, which means that as long as IBM has mainframes and support for them, they will keep their mainframe stuff on the principle of "never touch a running system".

That is part of why fintech start-ups are eating the old banks and insurance companies alive... they can design proper systems from scratch without having to take care of decades of accumulated cruft that needs to be migrated and especially they don't have to retain horribly outdated and complex business logic, workflows and processes that would take old-school companies billions of dollars to re-train their staff - almost 2 million people in 2019, per https://www.statista.com/statistics/250220/ranking-of-united....

PCs are commodities. The PC market has always been dominated by software vendors [1]. PC vendors are by and large disposable.

Oh, and speaking of the rest of the PC market, another natural monopoly on the other side: servers are dominated by Linux.

[1] And historically by Intel, ergo Wintel.

It's not obviously a mistake to me. They may well have been better off just milking that existing business for all it's worth rather than accelerating its demise by moving into barely profitable markets. As far as I know they are still in the mainframe business.

The innovator’s dilemma.

> For one thing, we couldn't make too powerful a PC that might cannibalize the minicomputer sales (AS/400). IBM made much more money on an AS/400 than a PS/2 model 95.

Sure, but it wouldn't be one-for-one. A lot more people can justify buying a PS/2 than can justify an AS/400. Did you make more on an AS/400 than on 100 PS/2s? 1000?

I suspect it's almost impossible for a business run on high margins to compete in a low margin sector. I can't remember an example.

Apple is very successful in the laptop and smartphone markets (both are more or less commoditized and very low margin). Of course due to the openness of the platform IBM couldn't really differentiate their PCs enough to justify higher margins.

True, although Apple does ignore sectors where they can't maintain their margins. They no longer make printers for instance and despite an obvious interest they have never sold a TV.

I think what I'm saying is if you have a model where you make 30% to 50% margins then going after business that can only attain 3% to 5% margins is never going to work.

I think that Apple smartphones and laptops have higher margins than most phones and laptops. Apple has done very well at making a brand, not just generic commodities.

No mention of the PS/2 and Micro Channel Architecture? MCA was IBM's attempt to put the genie back in the bottle and redefine the PC back to being something they wholly controlled, and it utterly failed. It also didn't help that they were late to the market with 386-based computers (Compaq having beaten them to the punch with the Deskpro 386) which already signaled they were losing their market leadership position that allowed them to define the PC platform. In response to MCA, the PC industry formed an independent consortium to define the EISA bus and later Intel itself eventually took over as the de facto standards originator for PCs by developing PCI, USB, ATX, ACPI, AC'97/HD Audio, (U)EFI, and a large part of the other foundational standards on which modern PCs are built.

According to this video[1], Compaq's portable also bested IBM. It was Compaq's first product, and everyone thought IBM's upcoming portable would wipe them out, but they held strong and counted on IBM to have production problems, which they did, so Compaq held onto this segment.

[1] https://www.youtube.com/watch?v=HEMhpInIACk

So handy for pirating software at PCCUG meetings.

PS/2s were overpriced and underpowered, and people I knew told me that a surprisingly high percentage of them arrived DoA. That was basically the end of "no one ever got fired for buying IBM".

Yes, I recall the MCA; it was born a dead horse and never ran a furlong. I recall all the companies that went through the lengthy MCA add-in card signup process - which took a year or more to be granted - whereupon they fielded cards that never sold. I recall the surplus tables at the Dayton hamfest being piled high with MCA detritus - cards that were eventually scrapped for gold. IBM tried mightily to force this dead interface; few bought the idea and made boxes, and few sold. They were dead before they sold one. The supposedly superior MCA bus was soon humbled by a string of faster and better buses. There were a few old fogies who clung to Big Blue - I see them from time to time wandering in the wilderness...

I remember NCR making a few Micro Channel machines during its short not-quite-heyday. I always thought the idea of putting card configurations onto floppy disks was downright stupid. I wondered what the EISA crowd was thinking when they adopted that for EISA systems.

Yes - EISA, VESA, and MCA, whose suns rose and set so rapidly. I had EISA and VESA bus boxes, but soon left them. I had 7 build-your-own-box retail ops in the '80s, so it came in and went out, and as soon as it faded I stopped buying. Never bought MCA - terrible sticker shock; my student base would not go for it. I think that IBM ran with floppy config, EISA followed like a puppy dog, and it died a quick death.

Relevant Computer Chronicles episode from 1988: BUS wars


IBM engineer responding to the accusation that MCA is proprietary (around 14' 15"):

"I really don't understand the term proprietary. To me that would mean that it's not available [snip] and it's available for use by anyone."

That's like when Intel switched to SECC (single edge contact cartridge) with the Pentium II/Pentium III and stuck it in an Atari(NES?)-like cartridge: this was to thwart clones and outstanding cross-licensing with AMD. What a weird decision. I'm not even sure if it worked tactically, because it only lasted until the Pentium 4. Weird days.

The side connector lasted until Socket 370 in the Pentium III product cycle. The new socket took Pentium III chips that had integrated L2 cache (versus earlier PIIIs which had cache on an external chip).

Pentium 4 never had a side connector model, but did have a “socket shrink” in the transition from Willamette (0.18 u) to Northwood (0.13 u) cores. Socket 478 was kept until the Prescott cores (0.09 u). The Prescott cores were also offered with an LGA 775 socket later.

Yup, that's why I said "until Pentium4". P6.0/P6S actually introduced multi-die packages at Intel, not the SECC. Well, actually that's not true either, because I worked on Sidewinder: a 486 that fit in a 386 socket with an ASIC on the top of the package that matched the busses. So it was technically "multi-die".

Wow. Transistor sizes measured in um instead of nm. That takes me back...

I think the SECC1 design was based more on the practicality of the cache that was available at the time; while the connector was 'proprietary,' the bus design itself was probably the more protectable part of locking AMD out.

FWIW AMD's original Athlon used a Slot connector as well, for similar reasons.

Intel and AMD actually used the same physical connector, with the slot keys on the connector opposite for AMD than Intel.

Slot 1, or slot 2 for Xeons. AMD had slot A for a hot second in the early Athlon days, before they went back to a PGA.

It's weird, yeah, but is it worse than aligning an expensive bunch of easily bent metal pins into a grid of small holes?

In terms of usable data/electrical connections, the slot is worse. This could mean you have to use multiple clock cycles to send the same data, due to fewer direct connections.

They did talk in the article about how slow IBM was to adopt the 386.

And that helped kill OS/2, if only because it delayed the release of OS/2 2.0, as that version needed a 386 to run.

OS/2 2.0 used features found in the 386 to greatly improve the MS-DOS experience.

IBM instead kept pushing the OS/2 1.x versions all of which had very bad support for MS-DOS applications.

That delay was a massive bonus for Microsoft as it gave them time to work on Windows 95, which used the same 386 CPU features to deliver similar improvements in running MS-DOS applications.

Yeah, but also remember that Windows NT started as the next version of OS/2, after MS and IBM stopped collaborating. The UI was basically the Presentation Manager from OS/2 until 4.0, and MS provided OS/2 APIs in NT.

I think you may be misremembering. Windows NT 3.51 had a UI that generally mirrored Windows 3.1's.


You're totally right, I am. Thanks.

Released in late 1987, Windows/386 2.0 supported multiple DOS boxes using virtual 8086 mode.

Win3.1 added some features which made windowed DOS boxes more usable.

Win95 came three years after OS/2 and Win3.1 were released.

Hard to adopt the 386 when they basically bought the whole supply of 286's.

"OS/2 finally came out in late 1987, priced at $340, plus $2,000 for additional memory to run it. By then, Windows had been on the market for two years and was proving hugely popular."

No. Windows was basically irrelevant until 3.0 came out in 1990.

I'd go so far and say irrelevant until Windows for Workgroups 3.11. That's when Banyan Vines + Ethernet (as opposed to token ring) + Lotus (ccMail & 123) really started to take over corporate America.

Novell Netware seemed more prominent at my customers. I don't ever remember seeing Banyan Vines.

Yes, it was DOS and Lotus 1-2-3 and maybe WordPerfect that were popular in the 80s.

Yes. Personal Computers were typewriters and calculators for the first 10 years. That's not to diminish their importance - quite the opposite. The impact changed the world. That's why it took a whole decade. Consider that when the IBM PC was launched, offices the world over had typing pools populated by women whose job it was to type whatever they were given.

Spreadsheets changed the world. We went from 1 or 2 types of tuna and 1 or 2 types of olive oil in the supermarket to dozens of each. Why? How? The big enabler was Lotus 1-2-3. Product managers in food companies could manage the ingredients, inventory, branding, marketing and sales performance of dozens of products at a time instead of just one or two. The spreadsheet was a fancy calculator, in the same way that a machine gun is a fancy pistol, and a combine harvester is a fancy hoe.

The first decade of personal computers enabled office work to scale, and it reconfigured literacy by enabling individuals to type letters when they felt like it. The fax machine was right there alongside, like the hammer and the sickle of capitalism. The resulting boost to productivity touched every industry the world over.

Not contradicting, just adding that MS Word for DOS was also somewhat popular.

Seconded. In 1987, Windows was not "hugely popular". It wasn't on anyone's radar. This was still DOS-versus-Mac era, with your Amigas and Ataris and held-over 8 bit micros on the sidelines.

Yes, it didn't feel like a certainty that products like Windows or the Mac would prevail. Early versions of Windows were crude and the Mac was very expensive.

IBM also hindered the roll out of

fried chicken

I remember when IBM launched the PC and it looked ludicrously overpriced and lacking in "modern" features (like graphics and sound). I didn't buy one for myself until the 386 came out.

The article's claim about "IBM’s strategic error in not retaining rights to the operating system" raises the question of whether the PC revolution would have happened had Microsoft not owned it. I personally doubt it. While IBM had success up to the IBM AT, there really wasn't much revolutionary about those machines, and they were stupidly expensive. The clones really made the PC a success. It's just tragic that the crap SW and HW won the market.

> The clones really made the PC a success.

Yes. Price competition is a wonderful thing. They undercut IBM's pricing model. IBM tried to get it back with the PS/2, which was patent protected. But that totally missed the point.

> It's just tragic that the crap SW and HW won the market.

Crap SW and HW made the market. There wasn't much of a market for PCs at the PS/2 price point. There was a much larger market at the PC clone price point. (I'd guess that halving the price led to something like 10 times the volume. The PS/2 was locked out of any meaningful volume just by the price.)

General rule: The PC I can afford is better than the PC I can't. Sure, it may be technically inferior. Doesn't matter. The one I can't afford has zero utility for me, so the one I can afford is actually better (for me), no matter how crap it is.

The other thing that made that market was the clone vendors cooperating to define the EISA bus standard. It created a huge market (all the clone companies put together) for peripherals. And because you could get anything in an EISA peripheral, that created a huge market for EISA PCs. It was mutually reinforcing to create an exploding market. If they had fought each other with competing bus standards, that would have killed the market.

I'm sort of skeptical of how important EISA was from a market perspective. It seems to have been primarily a premium option-- you might see it in the super-expensive early 486DX workstations with SCSI and a fancy video card, but the commodity desktop with a 386SX, 4M RAM, and a small IDE hard drive was just using vanilla ISA.

From that perspective, all the clone makers really did was be intelligently lazy. They just built IBM 5170s with faster CPUs for years. It would probably have been MORE effort to develop a custom bus and break all the existing expansion options. You saw some brief experimentation with that for the early "local bus" machines before the advent of VL-Bus, but that was quite a few years after the debut of MCA, and by then ISA had legitimately run out of gas.

EISA might have been important from the perspective of building a vendor-neutral standards process.

EISA was an enabler which allowed broader growth of the whole PC market. Office LANs were just starting to become really popular. So EISA allowed a single Novell NetWare server PC to have enough bus bandwidth to run a network of cheaper ISA PCs.

EISA was probably more important in that it led to PCI than for the standard itself.

Amending my "crap HW" comment: PCI was a massive step forward and PCIe even more so. PCIe to me marked a turning point where PC IO could be considered enterprise-class. Obviously the CPUs had improved a lot by then with the Pentium Pro marking a milestone and the beginning of Really Good implementations. (The ISA of course remained garbage).

It's a classic worse is better scenario.


But for short-sightedness at DEC and Xerox, things might have been different indeed. Unfortunately, DEC hamstrung their micros until it was too late, and Xerox's pie-in-the-sky machines at PARC were expensive enough that their real-world offering was the 820, a me-too 8-bit CP/M system that came out just in time to be stomped by the PC.

If you have any interest in the origins of the personal computer, you should track down the sublime documentary series "Triumph of the Nerds" by Bob Cringely. You can find various versions on YouTube.

Also I’d recommend ‘Pirates of Silicon Valley’ and ‘Silicon Cowboys’

PC clones killed IBM in the PC market, which matches how I remember it

> Both Microsoft and Intel made a fortune selling IBM’s competitors the same products they sold to IBM. Rivals figured out that IBM had set the de facto technical standards for PCs, so they developed compatible versions they could bring to market more quickly and sell for less.

Lenovo bought IBM's PC business for $1.75 billion in 2004, or about $2.51 billion adjusted for inflation. Lenovo's PC and smart device sales hit a record $12.4 billion in its FY2020, mostly due to Windows-based PCs. I would argue that had IBM ceded software to Microsoft and focused on hardware and PowerPC chip compatibility, the company could have greatly improved on the 20% margins that were the impetus for selling its PC business to Lenovo.

They wanted access to China and Lenovo gave that to them.

All in all, I think IBM did mostly the right thing. Given that they published their hardware and software interface standards, they must have expected to be creating a platform as well as a product. They might not have adequately anticipated the resilience of DOS when they tried to come out with OS/2.

They definitely underestimated the market's requirements for DOS compatibility. Microsoft in contrast understood that which is why Windows (though technically inferior to OS/2 in many respects, at least until Windows 95) ultimately won. The Digital Antiquarian blog has an excellent (if long) series of articles covering the history of Windows from its original conception as a product up through Windows 3.1, including looks at OS/2 and other competing products: https://www.filfre.net/2018/06/doing-windows-part-1-ms-dos-a...

It was technically inferior in many respects, but mostly not ones that mattered. I ran OS/2 2.0 and 2.1 for a while, and it was dog slow and consumed massive amounts of memory compared to Windows.

The main selling point of OS/2 was supposed to be that it had preemptive multitasking and protected memory, which was supposed to make it much more stable compared to the (constantly crashing) Windows 3.0/3.1.

The problem was that the Workplace Shell (the GUI) had some sort of single message queue that could be blocked by a misbehaving program. This would cause the GUI to hang. While it was true that the system would continue to task switch, and you could even telnet into the machine after this happened, from the console, the system was completely unresponsive, so it was functionally equivalent to the OS crashing.
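The failure mode is easy to sketch: with one queue and synchronous dispatch, a single handler that never returns starves every other application's events. This is an illustrative toy in Python, not OS/2 code; all names are invented:

```python
# Illustrative toy, not OS/2 code: one message queue shared by every
# GUI application, with synchronous dispatch. All names are invented.
import queue

gui_queue = queue.Queue()   # the single system-wide message queue

def well_behaved(msg):
    pass                    # handles the message and returns promptly

def misbehaving(msg):
    while True:             # never returns (e.g. blocked on a dead
        pass                # resource) - dispatch below never resumes

def event_loop(handlers, max_events):
    """Deliver messages one at a time; a handler that never returns
    blocks delivery for EVERY application, hanging the whole GUI."""
    delivered = []
    for _ in range(max_events):
        app, msg = gui_queue.get()
        handlers[app](msg)            # synchronous: no timeout and no
        delivered.append((app, msg))  # per-app queue to fall back on
    return delivered

# Enqueue one message for `misbehaving` ahead of the others and the
# loop never reaches them - background tasks may still run, but the
# screen is frozen, which to the user is the same as a crash.
```

Windows later sidestepped this by giving each thread its own message queue, so one hung application couldn't stall input delivery to the rest.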

IBM as an organization didn't seem to understand what features were actually relevant to normal users and prioritize them. It could do a bunch of things like smoothly multitasking multiple DOS apps, that were technically impressive but not that important to users.

OS/2 is one of the only operating systems I know of where a successor version actually had lower hardware requirements than the version it replaced (assuming you already had a 386; if you were on a 286, not much changed). You could upgrade from OS/2 2.1 to OS/2 Warp without upgrading hardware and it would run faster. By that time, though, it was too little, too late. It speaks to how inefficient OS/2 2.1 was.

> The problem was that the Workplace Shell (the GUI) had some sort of single message queue that could be blocked by a misbehaving program. This would cause the GUI to hang.

Windows 9x had very similar issues, and even early versions in the Windows NT series weren't altogether free of them. The viable fix was to hit CTRL+ALT+DEL and force close the misbehaving application.

Windows 9x had driver and stability issues, but you could always kill an application if it stopped responding. Linux today is similar to OS/2: the OS keeps working but the UI is unresponsive, so if you have another computer you can ssh in, but if you don't, you may have to reboot. I know that sometimes you can switch to a terminal using the keyboard and try to kill the application from there, but that is not as easy as CTRL+ALT+DEL, Task Manager, and killing the application. Of course, there are times you don't have a keyboard attached (tablets), and then a frozen UI is equivalent to a crashed OS. And just to add: these days MS has made Windows much less resilient, as misbehaving applications can freeze the UI to the point where CTRL+ALT+DEL and Task Manager no longer respond. So I guess OS/2 has won in the end - two of the three most used desktop OSes emulate the bad UI behavior.

> but you could always kill an application if it stopped responding.

Wasn't Windows famous for the BSOD?

Sure, but those were kernel panics, mostly caused by misbehaving drivers.

Hence why nowadays all drivers have to be certified, and graphics drivers are again in userspace.

At the Win95 time plenty of BSODs came from Microsoft bugs, from userspace accidentally writing over kernel data, or even from running out of resources.

You have to remember the barrier between kernel and userspace was very porous, applications had their own address space but e.g. DLLs were projected at identical addresses and shared some resources. Kernel memory was mapped in userspace, if I'm remembering it right. Basically, apps had their own cubicle, not their own apartment.

Win95 was still a special flower OS due to its backwards compatibility compromises, where you could even kill the kernel with task manager, not really a good idea.

Eventually NT 4.0 became a better option, if you could get all the required software and weren't doing DOS games.

In my view, people started developing for DOS, and it became a standard for small scale and casual developers. Even into the Windows era, a lot of utilities ran under DOS, such as early embedded development tools. Programming any Windowed OS was too hard, and we had only been programming for a few years, in relative isolation from one another and beholden to crappy documentation. Especially with tools like Turbo Pascal, writing "software" that looked like commercial software was relatively easy and got the job done.

Maintaining and growing any kind of platform becomes a whole new ball game when "what are its specs" is replaced by "will it run my apps?"

> Windows (though technically inferior to OS/2 in many respects, at least until Windows 95)

In many respects, maybe, but which aspects matter for success is highly contextual, and some just didn't really matter to the general public. Is the system in protected mode? That didn't really matter during that era. What did matter, however, was resource consumption. And for the speed of legacy programs, OS/2 1.x, which targeted only the 286, was not good compared even to Windows 2.1x, which had the Windows/386 variant.

On the memory front, OS/2 1.1 required 3MB of RAM. Windows 2.x/386 required 640kB. Windows 3.1 required 2MB (or even just 1MB in standard mode). OS/2 2.0 required 4MB (released in '92). Windows 95 required 4MB too...

So OS/2 was arguably technically inferior to Windows on many other points, and those were the points that actually mattered to most people at the time.

Windows NT was "far better" (abstracting away e.g. the resources required) than both, but was out of reach of consumer systems for years...

On the other hand, Mac OS on 68k was quite problematic until its end but it was not what made successes and failures of Mac during that era.

The success of Windows was certainly not only about backward compat. It was also about requirements. And price. And add to that, of course, about what was provided by default by nearly all OEMs...

Sure OS/2 was with a certain modernist view of things "technically better" than some Windows, but you know what else was even better in 92? A SPARCstation 10 :D

> Windows NT was "far better" than both, but was out of reach of consumer system for years...

Microsoft had a far better developer roadmap with "32 bit protected mode with just a recompile." They sandwiched OS/2 v2/3 from the bottom and top. When memory prices dropped, the period when OS/2 had a significant hardware cost advantage over NT was rather short.

I believe you are correct. The IBM PC's direct competition was not so much Apple but 'business standard' CP/M Z80/8080 machines from a variety of vendors.

Also, IBM was under an antitrust consent decree requiring reasonable and non-discriminatory licensing, so they didn't have much choice in the matter. Once this was lifted, they went the Micro Channel route.

Yes - everyone seems to forget that S-100 CP/M business machines had been around since 1976. It wasn't a huge market, but it definitely wasn't negligible. And there were some home users, although most preferred Apple.

So IBM didn't invent the PC. IBM invented mass marketing for its own brand of PC. And also - after a while - IBM started the constant upgrade treadmill.

S-100 was more backward looking. You could get 8086 and 68000 cards for S100, but it was really a Z80 ecosystem. There was no clean upgrade path with backward compatibility. This was a showstopper for most businesses, which needed access to old data and software.

IBM screwed themselves by hugely overcharging for PS/2, and Compaq stole their lunch. As soon as other clones appeared IBM were out.

Award BIOS was probably a huge game changer. Knowing the contents of the BIOS wasn't enough after Franklin lost their lawsuit with Apple; someone had to come up with a functional equivalent. A third-party BIOS meant that anybody could crank out a machine.

I still remember the nonsense cost of the i386 IBM PS/2 models, something like $14,000-$21,000.

IBM approached PC revolution with a dinosauric mainframe mindset and that killed it.

For them.

I still remember ~1985, paying $1,000 for 512KB of DRAM from suppliers listed in the back of Computer Shopper. Even clones were very expensive then, but not PS/2 expensive.

Yes, I was at IBM, doing some AI work in their Research Division, from 1985 to 1993, when IBM had a big crash.

Maybe an old idea I had about PCs has some insight: The idea is that for several years into the rise of the PC, the base, solid as concrete, fundamental, economic productivity reason for the PC was to kill off the typewriters, that is, word processing; as one guy put it, "capture the key strokes". The typewriters didn't "capture the keystrokes" and, thus, were a huge economic waste. Next in line was spreadsheets.

Now the biggie? Okay, replace TVs. We've already essentially replaced newspapers printed on paper; and PDF is replacing a lot of books printed on paper.

The future? See a problem that in terms of economic productivity needs solving, and get one or a few PCs and solve it. How? There is nearly no limit on the new problems to be solved or the new means of solution.

No mention that PC clones happened only because IBM could not prevent Compaq's reverse engineering of the BIOS?

Side note, there are a LOT of die-hard Thinkpad fans, myself included. Sadly it just hasn't been the same since Lenovo took over. I literally grew up playing Counter-Strike Source with the little red dot.

I've got a smallish collection of old IBM ThinkPads I hold on to (most of them with the matching laptop bag). Models include: 701C, 380XD, 380Z, 600X, 770Z, T42P, T43P, X41t. Running Linux on all of them except the T42P, 380XD, and 380Z.

Oddly enough, my current 'laptop' is the Lenovo X1 Tablet, probably the least IBM-like ThinkPad ever (it's like a Microsoft Surface). I just really like the form factor, and unlike the Surface, you can take it apart and replace the battery. With Thunderbolt, I run an eGPU for light gaming and video editing.

Yup, I loved my Thinkpads back in the 2000s.

It just never was the same with Lenovo. After 2010, I completely lost interest and moved away to Apple and using a MacBook.

I was always running Linux on my thinkpads, and still happily doing so on the latest Asus G14 :)

I always ran Linux as well.

I highly recommend Charles Ferguson's "Computer Wars" on this topic, as well as his other book (about startups and the early Web), "High Stakes, No Prisoners."

Microsoft stabbed IBM in the back by licensing DOS to OEMs and bundling Windows and Office with each OEM PC sold. While IBM had OS/2, Microsoft had Windows NT.
