
The Decline and Fall of IBM - evo_9
http://www.cringely.com/2014/06/04/decline-fall-ibm/
======
DoubleMalt
I was recently told a story by an in-house developer at a company that
outsourced its IT infrastructure to IBM.

Their deployment process for a business-critical Java application is as
follows:

- Someone calls IBM to alert the person (resource) responsible that they want
to do a new deployment. If this person is sick or on vacation, try again
later.

- Then they send over a jar file with the compiled application.

- The responsible person copies the jar file to the appropriate location and
restarts the server.

I rest my case.
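(For contrast, the whole "process" above automates down to a few lines. A
minimal sketch in Python; the host, path, and service names are invented for
illustration, and the restart assumes a systemd-managed service.)

    # Sketch of the hand-off automated end to end; hostnames, paths, and
    # the service name are hypothetical stand-ins.
    import subprocess

    APP_JAR = "build/libs/app.jar"            # the jar they send over
    TARGET = "deploy@appserver.example.com"   # stand-in for the IBM-run host
    REMOTE_PATH = "/opt/app/app.jar"          # "the appropriate location"

    def deploy() -> None:
        # Copy the jar to the appropriate location...
        subprocess.run(["scp", APP_JAR, f"{TARGET}:{REMOTE_PATH}"], check=True)
        # ...and restart the server. No phone call, no waiting on vacations.
        subprocess.run(["ssh", TARGET, "sudo systemctl restart app"], check=True)

    if __name__ == "__main__":
        deploy()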

~~~
incision
This was my experience with IBM hardware infrastructure as well. Just replace
'copy jar files' with everything from 'power on server' to 'create LUN'.

After a major, unexpected power outage we had to stand around waiting for a
technician to be dispatched and drive over to:

1.) Flip the power switch.

2.) Say 'Looks good.'

3.) Leave.

(I was told in dead serious terms that I'd be dismissed for powering up the
box myself.)

~~~
adl
The company I work for (which shall remain unnamed) signed a billion-dollar,
10-year contract with IBM for IT infrastructure.

I needed to log into a Mac Pro that ended up in one of the server rooms IBM
took over in our building.

I basically needed to connect a keyboard, log in, and enable remote desktop so
I could run a few Xcode builds (I don't have a Mac, and I needed to test a few
hybrid Sencha/PhoneGap apps).

After several phone calls involving my boss and his boss, three meetings with
five IBMers, and three months (I kid you not), we gave up and ended up
borrowing a few personal MacBooks from friends to test the apps.

The server room was a few meters from the conference room and it was a
two-minute job, but we could not enter and make the changes ourselves without
IBM's permission.
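(For a sense of how small the job was: on the Mac itself, enabling remote
access comes down to about one command. A sketch in Python, assuming admin
rights and macOS's standard systemsetup tool; turning on Remote Login (SSH)
alone would have been enough to drive xcodebuild over the network.)

    # The two-minute job, scripted: enable Remote Login (SSH) so builds
    # can be kicked off over the network. Assumes admin rights on the Mac.
    import subprocess

    subprocess.run(["sudo", "systemsetup", "-setremotelogin", "on"], check=True)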

The word frustration doesn't even cover it.

(oh, and the 'copy jar files' thing? totally true.)

~~~
kamaal
>> The server room was a few meters from the conference room and it was a
two-minute job, but we could not enter and make the changes ourselves without
IBM's permission.

Sorry, but there are reasons why your dev team is kept out of the server
rooms. Generally, when permission is given, they go in and make a big mess
which someone has to clean up later. Plus there are a thousand other issues,
like data security and theft, that need to be taken care of here.

I've worked for an outsourcing firm here in India that partnered with another
outsourcing firm. While we were there, our partner offered a lower hourly
billing rate and we lost most projects to them. We had a strict server room
policy. They came in with all these policies of 'process elimination' and
'cutting bureaucracy', which ultimately translated directly into cowboy
behavior and a total collapse in discipline.

After some time, chaos ensued. Devs would just walk into server rooms for even
simple things like collecting logs or troubleshooting production software
issues, until someone accidentally pulled out a power cable while toying with
a server. Downtime followed, along with all kinds of data consistency
problems. It took quite a while, and a lot of beating from senior management,
for them to realize rules exist for a reason.

My respect for IBM just increased after I read your story. If I ever have to
outsource some IT infrastructure in the future, I may choose them.

------
TheMagicHorsey
I haven't read Cringely before. Is he a trustworthy reporter, or a
sensationalist? I will buy his book if he's a sober observer. I usually hate
these kinds of books, though, because they are often the product of
sensationalists looking to make a quick buck.

~~~
smacktoward
His book "Accidental Empires" ([http://www.amazon.com/Accidental-Empires-
Silicon-Millions-Co...](http://www.amazon.com/Accidental-Empires-Silicon-
Millions-Competition/dp/0887308554)), and the PBS documentary miniseries that
was based on it, "Triumph of the Nerds" ([http://www.amazon.com/Triumph-Nerds-
Bob-Cringely/dp/B00006FX...](http://www.amazon.com/Triumph-Nerds-Bob-
Cringely/dp/B00006FXQO/ref=sr_1_1?s=movies-
tv&ie=UTF8&qid=1401912503&sr=1-1&keywords=triumph+of+the+nerds)), are both
terrific, must-read/see histories of the dawning and maturity of the age of
personal computing. The doc is especially fascinating now because it was made
in the window of time after Steve Jobs' failure at NeXT, but before his
triumphant return to Apple -- so it provides a glimpse of him humbled and
circumspect, a very different tone from the one he took in nearly every other
public appearance.

Cringely's more recent work has been kind of hit or miss, though.

~~~
kqr2
He also did a miniseries on PBS called "Plane Crazy" in which he tries to
build an airplane in 30 days.

It actually has some good start-up lessons in there!

[http://www.pbs.org/cringely/pulpit/1998/pulpit_19980724_0005...](http://www.pbs.org/cringely/pulpit/1998/pulpit_19980724_000578.html)

~~~
pessimizer
OH WOW. He's that guy? That documentary is one of my favorite things of all
time. Ok, just went to Amazon and ordered a copy; thanks for reminding me of
this.

------
the906
I work for IBM's new design group and honestly I'm pretty optimistic about the
future of the company (and I'm generally a pretty big pessimist/realist).
Aspects of the company have issues, but many of those are being fixed or
changed.

~~~
coldcode
I think IBM is big enough to be both incredibly stupid and smart at the same
time, depending on where you look.

------
na85
A ~300-page post-mortem of a company that is still alive? This isn't
Blackberry we're talking about. IBM has (as the author duly notes) huge
financial reserves and can easily stay relevant for another decade, IMHO.

It seems like markets and investors have become accustomed to the huge,
meteoric rises of a small handful of star companies, forgetting perhaps that
plenty of businesses don't need to become household names overnight to
succeed.

~~~
beat
"relevant for another decade" =~ dead.

One could write a similar article for Microsoft, really. I think they screwed
the pooch with Windows 8 and Surface. They tried to protect both the Windows
and Office monopolies simultaneously in the face of radical change, and lost
both.

IBM has been the biggest player in enterprise IT since the days when it was
done with typewriters and adding machines. For them to become irrelevant, even
if it takes a decade, is a huge deal.

~~~
_delirium
_" relevant for another decade" =~ dead. One could write a similar article for
Microsoft, really._

You have to call the right decade, though. :) The startup community started
declaring Microsoft on its deathbed in the late 1990s...

~~~
coldtea
Actually, it was the mid-00s. And they were right.

(You might be conflating the startup community with the open source community
of the late nineties, members of which did say that Linux would "soon"
overtake Windows on the desktop.)

~~~
_delirium
_You might be conflating the startup community with the open source community
of the late nineties_

Considering the large overlap between those two communities, I'm not sure it's
an unwarranted conflation. The first tech boom included a bunch of Linux-
oriented startups and IPOs whose successful pitch to VCs was essentially,
"Microsoft is on its way out".

~~~
coldtea
I think the Linux-oriented startups were the minority (VA Linux, Eazel,
Ximian, some distros, etc.).

The majority in the late nineties were web startups -- for this newfangled
"web" thing. Think Amazon, pets.com, etc.

~~~
_delirium
True, although a decent number of the web startups _also_ thought Microsoft
was the walking dead, because it had missed the web (which was sort of true,
but ended up not killing MS).

------
mixmax
_Gibbon’s thesis was that the Roman Empire fell prey to barbarian invasions
because of a loss of virtue. The Romans became weak over time, outsourcing the
defense of their empire to barbarian mercenaries who eventually took over.
Gibbon saw this Praetorian Guard as the cause of decay, abusing their power
through imperial assassinations and incessant demands for more pay._

Read that sentence again and think about the US.

~~~
AnimalMuppet
We're not quite there yet. You don't see the Koch brothers trying to
assassinate Obama. (No, attack ads are not the same.) You don't see a shooting
civil war between Koch supporters and Soros supporters. And all that happened
during the Roman Republic.

Moving on to the Empire, you don't see soldiers assassinating the president
(or members of Congress) because they want higher pay.

I know that it feels similar, but we really aren't there... at least not yet.

~~~
coldtea
> _Moving on to the Empire, you don't see soldiers assassinating the
> president (or members of Congress) because they want higher pay._

Perhaps because that's not how politics is played in the 21st century in a
Western country.

That doesn't necessarily mean the decline is not there -- it could be just as
big as Rome's, but different. For one, it won't have much to do with "barbaric
hordes".

------
jlukanta
It's interesting that he claimed IBM has probably been doomed since 2010,
considering that the stock price in 2010 was around $120 and it is now
$187.06. I guess a 50% increase means the company is dying.

~~~
astrodust
Stock price is mostly a matter of public opinion, and often diverges wildly
from fact.

~~~
batbomb
> _...often diverges wildly from fact_

What is it called when somebody writes a book titled "The Decline and Fall of
[company still around]"?

~~~
astrodust
Blackberry is technically still around.

------
fidotron
It strikes me that IBM has become full of the sort of people who can see how
to cut fat from a business, without any of the sort who know how to grow one
-- or at least, if they are there, they don't have much influence.

One of the most important lessons I learned in big corporate America was that
in the end the cutters always lose; even if they can make small gains in the
short term, there is only so much you can cut. The upside is potentially
larger, but finding it is the core of the problem.

~~~
xienze
Your typical software developer at IBM has 7 or 8 levels between him and the
CEO. They're good at cutting, but I'm not sure "fat" is what they know how to
cut.

------
graycat
IBM:

It was, what, 1890 when the US Census took more than 10 years to process one
census that is supposed to happen every 10 years? So the US Census needed a
better tool. Enter H. Hollerith, who borrowed the idea of punched cards from,
maybe, the Jacquard loom.

Tom Watson was with National Cash Register or some such, saw the Hollerith
equipment, concluded that it was the secret to automating routine business
record keeping, especially accounting, and IBM was born.

Watson sold the heck out of his equipment. And he had a team of good electro-
mechanical engineers in Poughkeepsie designing and building the equipment.
That takes us to about 1960. They made a lot of money.

About then, Tom Watson, Jr. visited Columbia University and reported that
there was a guy there doing things, whatever, 200,000 times a second.

An IBM life insurance customer in NYC complained that the IBM punched cards
were taking up too much room. So, some changes were needed, e.g., magnetic
tape.

Due to US WWII work on vacuum tube digital computers and pushed by Univac, IBM
got into the computer business, with magnetic tape, software, etc.

In the late 1950s Watson was told that he had no research division and that to
have a good one he needed a leader good researchers would respect. So Watson
got a chunk of land in Westchester County, NY, and hired Eero Saarinen to
build him a fancy research lab on a hill near Yorktown Heights.

By the mid 1960s, IBM had an idea: a line of compatible computers called
System 360 (the origin of the IBM 'mainframe' computers). A customer could
start with a cheap, small, slow System 360/20, 30, 40, etc., and when they ran
out of capacity, upgrade to a 360/50, 65, 85, 91, 95, etc. and just keep
running the same software.

IBM regarded itself not as an electronics company or a computer company but as
a marketing company. The marketing/sales people had nearly all the power.

Here the focus was still on routine business record keeping, 'business
machines', and not general purpose computing. IBM had some good work in
general purpose computing, but the executives stayed with their 'good IBM
customers'. So, when MIT did Multics in 1969, IBM was slow to respond. When
DEC was selling good general purpose time sharing on the DEC 10, IBM was slow
to respond.

As a time-sharing system for developing operating systems, IBM's Cambridge
Scientific Center did CP67/CMS for the IBM 360/67 virtual memory computer of
about 1967, where CP67 was the virtual machine component. Lots of people loved
CP67/CMS, later VM/CMS, but IBM's marketing people didn't want to sell it.
Internally, VM/CMS ran the company through at least 1994. There was also VNET,
roughly like the Internet except the computers were also the routers, which
carried essentially all of the IBM internal computing through at least 1994.
Use it? Yes. Eagerly sell it? No. 'Dogfooding'? No.

In the 1970s, IBM did move to System 370, a 'mainframe' which had virtual
memory. Customers wanted to do interactive, on-line applications, and IBM
responded slowly. For the communications, IBM did Systems Network
Architecture, which was about as flexible and easy to install as a railroad,
and about as costly. IBM did notice that with so much on-line activity they
could sell a lot more System 370 computers, and did.

IBM was slow to let the capacity of the 370 computers increase as quickly as
customers wanted, but by 1980 IBM had a collection of faster boxes. As 18-
wheel trucks lined up to receive these computers, the line backed up to the
NYS Thruway, and there was a blip in the US GDP.

A lot of IBM customers were doing 'personal productivity', e.g., word
processing and electronic spreadsheets, on IBM's mainframe computers.

But that was 1980, and DEC, Data General, Prime, and others were doing well.
The Prime was a single-board, bit-sliced, virtual memory computer with some
extra register sets for fast process switching -- a darned efficient time-
sharing box, much more efficient, and much easier to use, than anything from
IBM. And in 1980, Prime gave the best ROI on the NYSE.

Also about here came the microprocessors, e.g., Intel 8080 and 8086. Then PCs
began to explode. Then work started moving off IBM's mainframes. By 1986 or
so, DEC was getting more revenue from DuPont than IBM was.

There were well done IBM internal reports on technology and markets that
outlined the future and the threats to IBM's business, but IBM's leadership,
really successful mainframe salesmen, essentially ignored the reports. The
mainframe people had the power and worked to kill off anything else inside
IBM.

At one time, IBM CEO Gerstner said that "IBM is the most arrogant, inwardly
directed, process oriented company I've ever seen."

By 1986, at an internal top management meeting, it was possible to go around
the table and find that nearly no one had made their projections. The
conclusion was that God had ceased to smile on IBM.

IBM laughed at Intel and Microsoft, and those two came to have by far the last
laugh.

For a while, the work of IBM's John Cocke on reduced instruction set computing
(RISC) gave IBM an opportunity to grab the high-end desktop and workstation
markets, e.g., for finance, engineering design, and graphics, but the
mainframe people didn't like the competition. E.g., when an IBM mainframe had
a processor clock of about 10 MHz, Cocke's discrete component board with RISC
had a clock of 80 MHz.

Near 1994, in three years IBM lost $16 billion and went from ~400,000
employees down to ~200,000. The research division phone book went from 4500
names down to 1500, with about 500 of those recent temporary employees.

Then IBM pushed services, e.g., they would run your data center for you.

IBM bought companies with attractive products and put the IBM marketing force
behind those products.

Net, the first and last 'visionary' thing IBM did was to grab Hollerith's
work, although, of course, it was Hollerith who was the real visionary. Since
then IBM has focused on selling 'business machines' for routine business
record keeping for large banks, insurance companies, manufacturing companies,
and governments. That's their 'market'. They haven't tried very hard to look
for other markets. In technology, IBM lets others take the first steps and
maybe does something similar or maybe just buys a company. That's what they
are still doing.

IBM had a lot of opportunities they just dropped, apparently deliberately:

They could easily have had all of Cisco, Intel, Microsoft, and Oracle. At one
time, they ran NSFNet, that is, the Internet; had IBM stayed with that work,
they could have been running some huge fraction of all of the Internet.

IBM was long a leader in laser printing, e.g., for printing bank statements on
a roll of paper moved with a fork lift truck, but in the US it was HP that
made big bucks in that market. In the 1980s IBM saw the need for video servers
and had some working; same for wearable computing. For the Internet of things,
IBM has long had the TCP/IP stack on a chip.

For relational database, IBM invented it, roughly parallel to E. Wong's work
at Berkeley, but now the revenue for relational database goes to, right,
Oracle, etc.

IBM did early work on the tricky 'passive' amplifier wrapped around an optical
fiber to amplify the digital signal without converting from optical to
electrical; likely that amplifier is crucial for the backbone of the Internet,
but IBM is not running that backbone; maybe IBM is getting some patent license
revenue.

IBM did good, early work on giant magnetoresistive disk heads and the
associated vertical recording, and was long the leader in magnetic disks, but
now people buy hard disk drives from Seagate, Western Digital, Maxtor, etc.
IBM did a lot of high-end work on large disk storage systems, but people buy
from EMC, build their own, etc.

It goes on and on: at one time or another, IBM had in its lap the beginnings
of nearly everything we see today in information technology, but dropped it.

IBM remains focused on being a 'marketing' company, but now they are a bit
short on what to market.

Ah, yet another mainframe salesman bites the dust.

~~~
hga
Quibbles:

Jumping on computers was "visionary" in the same way as Hollerith's work, but
perhaps more so, with various complications like an antitrust suit to help
jump-start it. But they did a lot of interesting things in that period:
seriously pursued scientific computing for a while (it was one of the biggest
markets for some time), made relatively affordable machines like the 650 (main
memory a drum; the first mass-produced computer, per Wikipedia) and the 1130
(360 technology with some very clever hacks), and certainly innovated in
computer languages (FORTRAN, maybe PL/1). They taught us a lot about how not
to write software (_The Mythical Man-Month_), but were hardly unique in that,
or in second-system syndrome.

In a pattern I observed at the time, companies that really screwed up their
disk drives, as in too many failures in the field, lost so much brand equity
that they were forced to sell the remnants to another company. Or at least
this is how I interpret IBM's sale of their drive unit to Hitachi in 2002, and
Maxtor's to Seagate in 2006.

Anyway, the point being that for disk drives IBM did worse than the pattern
you describe above; this was a severe execution failure.

Although that also seems to be happening in services.

~~~
graycat
> Quibbles:

You make good points. I typed ASAP. There are a lot of possible "quibbles"
with what I wrote.

You are correct about the 'vision' thing -- Tom, Jr. did a very gutsy thing
pushing out System 360. He essentially bet the company. By then IBM was well
into computing, e.g., the 7094 based on transistors instead of tubes, but 360
was at least two giant steps more.

But I do remember that somehow about then DEC was able to do the PDP 10, nice
system -- hmm. Dartmouth was able to do the system that GE used to sell time-
sharing, etc. MIT did Multics. Net, others were able to write OS software
without betting a Fortune 10 or so company on the work.

You are fully correct about the anti-trust suit -- in a sense it made IBM 'gun
shy' and timid for a long time.

As I recall, for a long time IBM sold the main chips used by Cisco and
Juniper. I can guess that IBM gets patent licensing revenue from disk head
technology, and maybe the heads themselves. But the big bucks are in selling
the drives and the subsystems.

I forgot about the sale of Maxtor to Seagate.

I guess my broad point was that IBM was essentially always a 'marketing
company' run by successful mainframe salesmen. E.g., once, the day after my
wedding, I got an offer from the IBM Chicago branch office, and the head there
gave me that description of IBM. They wanted me to hold the hands of the oil
refining customers using linear programming to make decisions on what inputs
to take and what outputs to make -- big bucks in that, still. They had sold a
360/85 and likely wanted to have sold more. Gee, I might have gotten to work
with Ralph Gomory, later head of IBM Research.

I guess part of my 'broad point' was that they kept throwing little
opportunities off their plates: Intel, Microsoft, Cisco, Oracle, Seagate, EMC,
operating the whole Internet (can you believe that!), Yahoo (IBM had Prodigy:
good idea, bad execution), OS/2 (ahead of anything from Microsoft until
Windows NT or 2000), Netscape (IBM had a decent Web browser early on), and Web
servers ("ah, what's to do with a Web server; trivial, right?" -- build one
that's easy to program and serves 10,000 pages a second, and then tell me
that).

PL/I? Yup! It was done by a committee headed by George Radin in about 1964. I
used PL/I and CP67/CMS to do the first computer-based scheduling of the fleet
at FedEx. Nicely enough, I was paid well to learn PL/I by the US DoD at the
JHU/APL for some work on passive sonar and the FFT. Once I tried to talk to
Radin about operating systems, and he said, "Three times in my career I tried
to help IBM in operating systems, and three times I broke my pick trying."

I remember the sale to Hitachi but didn't know that the IBM drives sucked.
I've heard only good things about Hitachi drives; sorry I didn't mention them
-- another quibble.

IBM had object-oriented programming in microcode as part of 'Future Systems'
in 1980 -- or was it 1970?

But, they were a 'marketing company' selling to 'good IBM customers'.

> Although that also seems to be happening in services.

I can believe that. For a long time IBM had their pick of the job pool,
something like Google seems to today; likely no more.

So be it.

~~~
hga
" _MIT did Multics_ "

And it was a fairly ugly, long, and drawn out second system syndrome
experience; rather famously Bell Labs dropped out of the project but the
experience inspired UNIX(TM), which is Multics with some vital parts missing.
Delivered much later than planned, but it was a quality system. It very
possibly was the only system of the ones you mentioned that had scope on the
order of OS/360 and TSS/360 (5 and 1 man millennia) ... but it wasn't done
with the commercial pressure on those IBM OSes, and of course embraced virtual
memory vs. disdained it.

Multics and the PDP-10 were both crippled by 1 MiB address-space sizes: 18-bit
(half of 36 bits) word addressing, i.e., 2^18 words of 36 bits, about 1.1 MiB.
That was the total for the PDP-6/10/DECSYSTEM-20, but per segment for Multics,
which was much less limiting -- mostly a nightmare for big data sets.

I'd quibble that OS/2 was also a failure of execution, driven by marketing.
IBM told IT managers that the PC/AT was the last PC model they'd have to buy
for a long time, but the 286's protected mode was horribly misconceived
(loading a segment register for a new 64 KiB segment incurred a terrible
performance hit). Therefore early OS/2 had to run well on it ... but really
couldn't all that well. Whereas Windows 3.0 hit a lot of niches very well, and
the rest is history.

Somehow I don't see Cisco, very much not part of their DNA ... but then again,
that's your whole point.

Future Systems would have been circa 1970; it was the ambitious 360 follow-on
(vs. System/370, which was 360s with ICs and DRAM). As I understand it, that
group eventually gifted us with the _very_ advanced AS/400 et al. Like
Multics, every file is part of the address space; it's very neat and worth
studying.

Anyway, thanks a lot for the insights and stories you've shared. I got my
start with the IBM 1130, but then it was UNIX(TM); I watched but didn't really
partake of the by-then-doomed Multics (Honeywell was horribly managed, blamed
a project failure on the decision to microcode the machine, and then tried to
compete with 1 MIPS async processors through the '80s), then PDP-10s and Lisp
Machines, followed by an unending sequence of UNIX(TM) and Unix-alikes. Bleah,
when you know we can do _much_ better.

But it didn't have to be that way. IBM certainly had a chance to win my heart
and mind with their systems in my home town's college (the 1130 and a
370/115), before I got exposed to better systems when I left.

~~~
graycat
The story went that some guys at Honeywell on Multics went to management and
said, "We believe that we can bring up Multics on a superminicomputer, sell
it, and make money," and management said, "We don't believe you can bring up a
Multics on a superminicomputer; if you did, it wouldn't sell; and even if it
did, you wouldn't make any money."

So, those guys did Prime. The OS was written in a slightly tweaked Fortran. I
ran two of them and did my dissertation computing on one of them. For the
first one, we were doing some DoD analytical work on TSO, and our two
programmers were spending $80 K a year. We got a Prime for $120 K, and our
computer usage went through the roof. We just copied over our 500 K TSO
Fortran programs, compiled, and ran -- ran fine. That Prime was just for our
group of 40, but soon the company of 300 wanted one, and I served on the
selection committee. For the second one, I got that for a B-school as a prof.
There the Prime made the central computer group's Amdahl 470/V6 look silly,
and I served on the committee to select a new university CIO.

At one time I had a summer job where at first I was to program an 1130, but
later the job was to design and build some DC power supplies to power some
tape drives IBM had given away to a research lab.

~~~
hga
Software Arts, as in Visicalc, used Prime computers. I see from a biography of
Dan Bricklin, the less technical of the two, that:

 _Prior to forming Software Arts, he had been a market researcher for Prime
Computer Inc., a senior systems programmer for FasFax Corporation, and a
senior software engineer for Digital Equipment Corporation. At Digital, he was
project leader of the WPS-8 word processing software, where he helped to
specify and develop one of the first standalone word processing systems._

And the more technical partner, Bob Frankston, was an MIT type. I remember one
Software Arts employee -- a friend of one or both of them, or maybe Bob's
youngest brother, who was a friend of mine -- mentioning that, among other
things, they appreciated what the system adopted from Multics.

Being bit-sliced, the Prime micro-architecture you mention was microcoded,
which of course fits with their following the example of the successful System
360, many models of which _had_ to be microcoded because the logic family they
used pretty much had only one speed; they made the micro-architectures
narrower, down to 8 bits as I recall, for the slower machines, and the two (?)
fastest had none.

Honeywell's rejection of microcoding helped make the Multics and GECOS systems
terribly uncompetitive, at least by the time Visicalc was being developed,
although the macro-architecture allowed you to easily hook up 6 CPUs in one
system (and 8 with a horrible kludge, as I recall).

------
vorg
> Only bloggers have the patience (or obsessive compulsive disorder) to follow
> one company every day.

> I wrote story after story, [... but] I was naive. My hope was that when it
> became clear to the public what was happening at [...] that things would
> change.

Bet you were called a _stalker_ and _cyberbully_.

------
hindsightbias
Cringely doom in 2007:

"But this week's "job action," as they refer to it inside IBM management, was
as much as anything a rehearsal for what I understand are another 100,000+
layoffs to follow, each dribbled out until some reporter (that would be me)
notices the growing trend, then dumped en masse when the jig is up, but no
later than the end of this year."

[http://www.pbs.org/cringely/pulpit/2007/pulpit_20070504_0020...](http://www.pbs.org/cringely/pulpit/2007/pulpit_20070504_002027.html)

The path of HP, Cisco, Dell, IBM, and all has been pretty clear for a while
now, and Cringely doesn't add any insight that isn't obvious. Scale-up is
drying up and there is no pot of gold in scale-out.

Your turn at being a commodity is coming, and that prediction is more accurate
than anything Cringely has made.

------
yuhong
I wonder who will be the equivalent of Lou Gerstner this time.

------
byefruit
[https://www.google.co.uk/finance?q=NYSE%3AIBM&fstype=ii&ei=A...](https://www.google.co.uk/finance?q=NYSE%3AIBM&fstype=ii&ei=AoKPU_CAOaH4wAOunYDAAg)

If $16bn in net income is IBM falling, I'd love to see what Cringely defines
as success.

~~~
xienze
Yeah, they're still making tons of money, but consider that they haven't grown
revenue in years, all their best talent is leaving in droves, they're selling
off everything that isn't bolted down, and they're betting the future on cloud
and Watson. They're late to the party on cloud (and, in typical IBM fashion,
overpriced), and Watson ain't all it's cracked up to be. They have massive
problems going forward.

Source: former IBMer.

~~~
astrange
I can tell you’ve been at IBM too long, because you say “cloud” rather than
“the cloud” or “cloud computing”.

I have a friend who sends me quotes from their work sometimes, like “Good
clarity on Data, Cloud, Engagement! Loved your passion about IBM’s
competitiveness with Cloud!” or “My key takeaway is to evaluate how I can use
information to stay engaged in analyzing data!”.

Whatever are they so happy about, I wonder. Cocaine stipend?

------
devanti
Not surprised. IBM hires a lot of incompetent software engineers.

