We had a couple of problems that IBM's competitors didn't have. For one thing, we couldn't make too powerful a PC that might cannibalize the minicomputer sales (AS/400). IBM made much more money on an AS/400 than a PS/2 model 95.
Our cost structure was too high. How high? We lost $1 for every $100 we sold.
Part of the reason the cost structure was high was the number of IBMers: I worked in an office of about 15 people supporting the NY/Long Island region. Four years later, it was down to 1 person.
Another reason was that our physical costs were high. For example, we couldn't buy RAM from Intel; we had to buy RAM from IBM (which was fabbed in Kingston, NY, not to be confused with the Kingston RAM manufacturer).
Also, we were struggling to get the correct mix of sales channels. For example, we couldn't sell PCs directly through the IBM sales force (too expensive, IIRC it added 36% to the sales cost). Instead, the IBM sales force would partner with dealerships (which added 21% to the sales cost).
The web didn't exist yet, not really, so we couldn't sell through there. We could have done direct-response (an 800 number, like Dell & Gateway were doing), but that took a while to spin up. And it was similar in cost to selling through dealerships. Also we didn't sell through distributors, which was the least-expensive channel (16% IIRC).
I remember looking at the price list. We had two levels of dealers: those who bought in large volumes, who typically purchased equipment at 30% off list, and those who worked in smaller quantities, who got ~20% off list.
As the PC market became more competitive, we couldn't afford high list prices (we were too expensive), so one of the ways we brought down our list prices was to shrink the discount we gave to the dealers.
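To make the discount-shrinking math concrete, here is a quick sketch. The dollar figure is made up for illustration; only the 30%/20% discount tiers come from the comment above. The idea is that if the dealer's net cost stays roughly constant, a smaller dealer discount lets you advertise a lower list price.

```python
def list_price(dealer_net: float, discount: float) -> float:
    """List price needed so the dealer pays dealer_net at the given discount."""
    return dealer_net / (1 - discount)

dealer_net = 2100.0  # what the dealer actually pays IBM (invented figure)
print(round(list_price(dealer_net, 0.30)))  # 30% off list -> list price 3000
print(round(list_price(dealer_net, 0.20)))  # 20% off list -> list price 2625
```

Same money changing hands with the dealer, but the sticker the market sees drops by over 10%.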
The dealer was not allowed to sell online or via computer magazines: there had to be in-person contact. That helped prevent the Atlanta dealers from poaching the NYC dealers' business.
It's easy to armchair-quarterback the decisions IBM made, but a lot of those decisions were nuanced. For example, you could say, "Why didn't IBM have an 800 number to buy PCs?". And the answer is that a lot of our biggest customers were corporations, not individual consumers, and they felt more comfortable purchasing through a dealer they trusted.
Agreed, knowing when to cannibalize your own market is one of the key inflection points.
Apple has done a fantastic job (up until now) of introducing something that basically makes a "current at the time" product obsolete.
The classic example was the iPhone, launched when the iPod was a household name. A more conservative company would have figured out how to keep the iPod market extremely healthy, but would have ended up losing the whole pie. There are a lot of smaller instances as well.
> IBM made much more money on an AS/400 than a PS/2 model 95.
I wonder if someone high up in IBM made the same mistake for personal computing that Ballmer did with the iPhone - "There's no chance that the iPhone is going to get any significant market share. No chance..."
To be explicit, making the bet to cannibalize your own market is one of the toughest any company can make. High risk, high reward. It's so easy to point at the winners and forget the landscape littered with the ones that lost.
But, being able to do it consistently is what makes a company multi-generational.
> Apple has done a fantastic job (up until now) of introducing something that basically makes a "current at the time" product obsolete.
An interesting analogue at this time will be the iPad and the MacBook lineup now that they both use the same processor. It'll be interesting to see whether they make one or the other obsolete or keep both going. (Yes, I realize there are all sorts of design tradeoffs between direct & indirect manipulation environments)
Personally, I could neither wait for nor justify owning two devices, so I migrated to a Windows/Linux tablet after waiting so many years, despite having been invested in the Mac platform for a decade.
I'm not really sure why nobody's built one yet except for the sheer complexity involved. But it certainly seems doable.
The iSH team did run into some friction with Apple over the fact that the built-in filesystem image is Alpine and thus includes a package manager, but apparently they were able to resolve that situation in a way that allowed them to keep `apk` in the built-in filesystem image: https://ish.app/blog/app-store-removal
What do you mean by this?
> To be explicit, making the bet to cannibalize your own market is one of the toughest any company can make. High risk, high reward.
In other words, it is considered a viable strategy, although also a risky one. The fact that it didn't pan out in this instance doesn't change that.
Don't feel sad. This market dynamic is how startups can get ahead. Otherwise everything would be dominated by the huge incumbents forever.
One thing to remember is that mainframes are still dominated by IBM, 60 years later. And mainframes are still a somewhat stable market worth billions of dollars.
IT works primarily by accretion: new markets are created, and in those markets IT also seems to tend toward natural monopolies due to network effects, so once a market stabilizes, the huge incumbents are there forever.
IBM still dominates mainframes, minicomputers are gone so no one dominates those anymore, workstations are also gone so no domination there either, but PCs are dominated by Microsoft and have been for 40 years, smartphones are split between Android and iOS and both will probably be around as long as we use smartphones.
?? PC (desktop) OSes are certainly dominated by Microsoft, but I don't see any one player dominating the PC market. There are definitely companies that dominate different PC components (OS, CPU, audio, network, etc.)
Anyway, I know that the mainframe market is pretty healthy, but who are the clients?
Banks, insurance companies and airlines. The amount of only-documented-in-code corner cases and custom business logic is immense (for the really old systems, up to 60 years' worth!), which means rewrites are immensely difficult, expensive, and prone to failure. The amount of government regulation, and the fines for when things go south (aka people lose money), is equally massive. And the companies are ruled by beancounters by definition, which means that as long as IBM sells mainframes and supports them, they will stick with "never touch a running system".
That is part of why fintech start-ups are eating the old banks and insurance companies alive: they can design proper systems from scratch without having to migrate decades of accumulated cruft, and especially they don't have to retain horribly outdated and complex business logic, workflows and processes that would cost old-school companies billions of dollars in staff retraining - almost 2 million people in 2019, per https://www.statista.com/statistics/250220/ranking-of-united....
Oh, and speaking of the rest of the PC market, another natural monopoly on the other side: servers are dominated by Linux.
 And historically by Intel, ergo Wintel.
Sure, but it wouldn't be one-for-one. A lot more people can justify buying a PS/2 than can justify an AS/400. Did you make more on an AS/400 than on 100 PS/2s? 1000?
I think what I'm saying is that if you have a model where you make 30% to 50% margins, then going after business that can only attain 3% to 5% margins is never going to work.
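With invented figures (neither the prices nor the margins here come from the thread; they're just plausible orders of magnitude), the back-of-the-envelope version looks like:

```python
def gross_profit(price: float, margin: float) -> float:
    """Gross profit on one unit at a given margin."""
    return price * margin

as400 = gross_profit(250_000, 0.40)  # one AS/400 at an assumed 40% margin
ps2   = gross_profit(5_000, 0.04)    # one PS/2 at an assumed 4% margin

# How many PS/2 sales would it take to match one AS/400's gross profit?
print(round(as400 / ps2))
```

At numbers anywhere in this ballpark, you'd need hundreds of PS/2 sales to replace one lost AS/400, which is why the cannibalization fear was so hard to argue against internally.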
IBM engineer responding to the accusation that MCA is proprietary (around 14' 15"):
"I really don't understand the term proprietary. To me that would mean that it's not available [snip] and it's available for use by anyone."
Pentium 4 never had a slot connector model, but it did have a "socket shrink" in the transition from Willamette (0.18 µm) to Northwood (0.13 µm) cores. Socket 478 was kept until the Prescott cores (0.09 µm). The Prescott cores were also offered in an LGA 775 package later.
FWIW AMD's original Athlon used a Slot connector as well, for similar reasons.
It's weird, yeah, but is it worse than aligning an expensive bunch of easily bent metal pins into a grid of small holes?
OS/2 2.0 used features found in the 386 to greatly improve the MS-DOS experience.
IBM instead kept pushing the OS/2 1.x versions, all of which had very poor support for MS-DOS applications.
That delay was a massive bonus for Microsoft as it gave them time to work on Windows 95, which used the same 386 CPU features to deliver similar improvements in running MS-DOS applications.
Win3.1 added some features which made windowed DOS boxes more usable.
Win95 came three years after OS/2 and Win3.1 were released.
No. Windows was basically irrelevant until 3.0 came out in 1990.
Spreadsheets changed the world. We went from 1 or 2 types of tuna and 1 or 2 types of olive oil in the supermarket to dozens of each. Why? How? The big enabler was Lotus 1-2-3. Product managers in food companies could manage the ingredients, inventory, branding, marketing and sales performance of dozens of products at a time instead of just one or two. The spreadsheet was a fancy calculator, in the same way that a machine gun is a fancy pistol, and a combine harvester is a fancy hoe.
The first decade of personal computers enabled office work to scale, and it reconfigured literacy by enabling individuals to type letters when they felt like it. The fax machine was right there alongside, like the hammer and the sickle of capitalism. The resulting boost to productivity touched every industry the world over.
The article's claim about "IBM's strategic error in not retaining rights to the operating system" raises the question of whether the PC revolution would have happened had Microsoft not owned it. I personally doubt it. While IBM had success up to the IBM AT, there really wasn't much revolutionary about them and they were stupidly expensive. The clones really made the PC a success. It's just tragic that the crap SW and HW won the market.
Yes. Price competition is a wonderful thing. They undercut IBM's pricing model. IBM tried to get it back with the PS/2, which was patent protected. But that totally missed the point.
> It's just tragic that the crap SW and HW won the market.
Crap SW and HW made the market. There wasn't much of a market for PCs at the PS/2 price point. There was a much larger market at the PC clone price point. (I'd guess that halving the price led to something like 10 times the volume. The PS/2 was locked out of any meaningful volume just by the price.)
General rule: The PC I can afford is better than the PC I can't. Sure, it may be technically inferior. Doesn't matter. The one I can't afford has zero utility for me, so the one I can afford is actually better (for me), no matter how crap it is.
The other thing that made that market was the clone vendors cooperating to define the EISA bus standard. It created a huge market (all the clone companies put together) for peripherals. And because you could get anything in an EISA peripheral, that created a huge market for EISA PCs. It was mutually reinforcing to create an exploding market. If they had fought each other with competing bus standards, that would have killed the market.
From that perspective, all the clone makers really did was be intelligently lazy. They just built IBM 5170s with faster CPUs for years. It was probably MORE effort to develop a custom bus and break all the existing expansion options. You saw some brief experimentation with that in the early "local bus" machines before the advent of VL-Bus, but that was quite a few years after the debut of MCA, once ISA had legitimately run out of gas.
EISA might have been important from the perspective of building a vendor-neutral standards process.
> Both Microsoft and Intel made a fortune selling IBM’s competitors the same products they sold to IBM. Rivals figured out that IBM had set the de facto technical standards for PCs, so they developed compatible versions they could bring to market more quickly and sell for less.
The main selling point of OS/2 was supposed to be that it had preemptive multitasking and protected memory, which was supposed to make it much more stable compared to the (constantly crashing) Windows 3.0/3.1.
The problem was that the Workplace Shell (the GUI) had a single message queue that could be blocked by a misbehaving program, which would cause the GUI to hang. While it was true that the system would continue to task-switch, and you could even telnet into the machine after this happened, from the console the system was completely unresponsive, so it was functionally equivalent to the OS crashing.
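A toy sketch of that failure mode (illustrative Python, not OS/2 internals): with a single synchronous input queue shared by every app, one app that stops servicing its messages starves all the others of input, even though the kernel keeps multitasking underneath:

```python
import queue

# One shared input queue for every GUI app (the crux of the design).
input_queue = queue.Queue()
for event in [("editor", "keypress 1"), ("hung_app", "click"),
              ("editor", "keypress 2")]:
    input_queue.put(event)

delivered = []

def handle(app, msg):
    if app == "hung_app":
        return False            # stands in for "never returns from its handler"
    delivered.append((app, msg))
    return True

while not input_queue.empty():
    app, msg = input_queue.get()
    if not handle(app, msg):    # synchronous dispatch: we must wait here...
        break                   # ...so everything queued behind it is stuck

print(delivered)                # only "keypress 1" ever reached the editor
```

The editor is still alive and scheduled, but its second keypress is trapped behind the hung app's unprocessed click, which matches the "telnet works, console is dead" symptom described above.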
IBM as an organization didn't seem to understand what features were actually relevant to normal users and prioritize them. It could do a bunch of things like smoothly multitasking multiple DOS apps, that were technically impressive but not that important to users.
Windows 9x had very similar issues, and even early versions in the Windows NT series weren't altogether free of them. The viable fix was to hit CTRL+ALT+DEL and force close the misbehaving application.
Wasn't Windows famous for the BSOD?
Hence why nowadays all drivers have to be certified, and graphics drivers are again in userspace.
You have to remember the barrier between kernel and userspace was very porous, applications had their own address space but e.g. DLLs were projected at identical addresses and shared some resources. Kernel memory was mapped in userspace, if I'm remembering it right. Basically, apps had their own cubicle, not their own apartment.
Eventually NT 4.0 became a better option, if you could get all the required software and weren't doing DOS games.
Maintaining and growing any kind of platform becomes a whole new ball game when "what are its specs" is replaced by "will it run my apps?"
In many respects maybe, but which aspects matter for success is highly contextual, and some just didn't really matter to the general public. Is the system in protected mode? That did not really matter during that era. What did matter was resource consumption. And for the speed of legacy programs, OS/2 1.x targeting only the 286 was not good, compared to even Windows 2.1x, which had Windows/386.
On the memory front, OS/2 1.1 required 3 MB of RAM. Windows 2.x/386 required 640 kB. Windows 3.1 required 2 MB (or even just 1 MB in standard mode). OS/2 2.0 required 4 MB (released in 1992). Windows 95 required 4 MB too...
So OS/2 was arguably kind of technically inferior to Windows on many other points, and those were the points that were actually important for most people at the time.
Windows NT was "far better" (abstracting away, e.g., the resources required) than both, but was out of reach of consumer systems for years...
On the other hand, Mac OS on 68k was quite problematic until its end but it was not what made successes and failures of Mac during that era.
The success of Windows was certainly not only about backward compat. It was also about requirements. And price. And add to that, of course, about what was provided by default by nearly all OEMs...
Sure OS/2 was with a certain modernist view of things "technically better" than some Windows, but you know what else was even better in 92? A SPARCstation 10 :D
Microsoft had a far better developer roadmap with "32 bit protected mode with just a recompile." They sandwiched OS/2 v2/3 from the bottom and top. When memory prices dropped, the period when OS/2 had a significant hardware cost advantage over NT was rather short.
Also, IBM was under an antitrust decree requiring reasonable and non-discriminatory licensing, so they didn't have much choice in the matter. Once this was lifted, they went the Micro Channel route.
So IBM didn't invent the PC. IBM invented mass marketing for its own brand of PC. And also - after a while - IBM started the constant upgrade treadmill.
S-100 was more backward looking. You could get 8086 and 68000 cards for S-100, but it was really a Z80 ecosystem. There was no clean upgrade path with backward compatibility. This was a showstopper for most businesses, which needed access to old data and software.
IBM screwed themselves by hugely overcharging for PS/2, and Compaq stole their lunch. As soon as other clones appeared IBM were out.
IBM approached the PC revolution with a dinosauric mainframe mindset, and that killed it.
Maybe an old idea I had about PCs has some insight: The idea is that for several years into the rise of the PC, the base, solid as concrete, fundamental, economic productivity reason for the PC was to kill off the typewriters, that is, word processing; as one guy put it, "capture the key strokes". The typewriters didn't "capture the keystrokes" and, thus, were a huge economic waste. Next in line was spreadsheets.
Now the biggie? Okay, replace TVs. We've already essentially replaced newspapers printed on paper; and PDF is replacing a lot of books printed on paper.
The future? See a problem that in terms of economic productivity needs solving, and get one or a few PCs and solve it. How? There is nearly no limit on the new problems to be solved or the new means of solution.
Oddly enough, my current 'laptop' is the Lenovo X1 Tablet, probably the least IBM-like ThinkPad ever (it's like a Microsoft Surface). I just really like the form factor, and unlike the surface, you can take it apart and replace the battery. With thunderbolt, I run an eGPU for light gaming and video editing.
It just never was the same with Lenovo. After 2010, I completely lost interest and moved to Apple and a MacBook.