Late 1980s joke: "How many people work at DEC?" "About a quarter of them."
Pay increases were a percentage of each group's payroll. Several groups had CTs (corporate turkeys). These were highly compensated people with a dotted line relationship to Dr. Bell's No Output Division. The deal was, they could work on NOD projects to their hearts' content, but they should not expect raises. That allowed the raise budget to be spent on the other folks.
For a while they had a policy that poor performers had their performance reviews delayed. This meant that you couldn't tell if it was you yourself who was incompetent, or your manager. (Incompetent managers often were late doing performance reviews for their people.)
Now all that's left is the Digital Credit Union. It's sad; they had many great people and some great products.
>There was virtually no organizational structure during Digital’s early years because Olsen was committed to creating an environment much like the research labs at MIT. A temporary position as liaison between MIT and IBM in 1959 convinced Olsen that the hierarchy at companies like IBM did not allow for creativity and the flow of ideas.
I used to work at a fairly big software company that was on about version 6 of a product with a 2 or 3 year ship cycle. They'd spent years trying to get away from the waterfall model, but there were still gaps on the order of maybe 8 months between ship cycles where the teams had nothing to actually work on. (To clarify, what does your QA team do while all of the PMs and architects are off designing stuff and devs are barely scratching out prototypes in the dirt?)
As a result, they would have these planning drives where everyone would get mobilized to work out details for the next release's engineering system. Dozens of committees would be formed, meetings held, PowerPoint decks produced, the Director and his staff would hold tense weekly meetings and curse and growl at each other over graphs and spreadsheets of "progress".
Then finally, the big day would come, day one of the new engineering system, and ----- crickets. A week later everyone would have forgotten the buzzwords and names of the initiatives and would be back to waterfall.
That reminds me of a discussion we had when I was working for * (one of the large tech companies in Silicon Valley): The conclusion arrived at was that if a freak meteor would hit our campus and take out just middle management, the rest of the company could come in tomorrow - hold a mandatory memorial service - and go on with their work, with higher productivity.
This is a profoundly complex question. Or, rather, answer. : )
Everyone knows one Ken Olsen quote, though few people know it's taken hugely out of context[1] - it was quite prescient at the time (and the next few decades) though with a bunch of RPi / IoT / HUE gear in my home now, perhaps he was wrong after all.
Mind, his anti-TCP position is almost as naive / famous -- but aligned with several other CEOs of large organisations that are still with us.
Is there ever a single sound-bite-able response to the question 'what killed company X?'? Probably not. Perhaps they backed the wrong architecture. Or they naively went up against Microsoft (a very different beast to the cuddly MS of today). Or they disrespected their customers just a wee bit too blatantly. Or their QA dropped unforgivably (or at least uncompetitively) low.
DEC was in the minicomputer and workstation businesses. It got squeezed to death between IBM at the high end and the upward expansion of the PC business at the low end.
A $25,000 MicroVAX was cheap and wonderful when it could replace a $250,000 mini, but not very attractive once a $2,500 PC-based server could do the same job.
DEC also had a bit of a NIH problem. For example, it did its own (expensive) PC, which wasn't IBM PC-compatible. Later, it launched Windows NT machines based on its own 64-bit microprocessor, the DEC Alpha. Then it did its own version of the ARM chip, the StrongARM (which later reappeared as the Intel XScale before being sold on again).
DEC also lost its star software designer, Dave Cutler, the man behind DEC's biggest success -- VAX/VMS -- by cancelling his project. Bill Gates promptly hired him, and said he could bring whoever he wanted to Microsoft.
Cutler wrote Windows NT, which ultimately freed Microsoft from its huge problems with IBM and OS/2, and provided a transition path from DOS-based Windows to a real OS.
For younger readers, the top workstation companies were DEC, Sun, HP, Apollo, Silicon Graphics etc. The top minicomputer companies were IBM, DEC, HP, Wang, Data General, Prime, Nixdorf, NCR etc. Not many survivors from that lot, but they were big and powerful at the time.
Alternatively, DEC was built on a 1950s business model where customers (not necessarily final users) were skilled and educated scientists, developers, academics, engineers, and bizops people - basically hands-on hackers of one kind or another.
Its R&D and marketing machine was created to match this market. As long as most of the computer buyers in the world shared that culture, DEC did very well.
DEC had no experience designing and selling commodity/appliance computers to the general public, and not much interest in same. This may have been Olsen's fault. I suspect he just couldn't imagine his wonderful computer engineering machine selling crappy microcomputers direct to ordinary Joes through retail and mail order.
At the high end there was always an interest in taking on and beating IBM, who had a monopoly on the very high end of business and scientific computing, but for whom the PC was just a flukey minor side project. (IBM didn't understand commodity computing either, which is why it was pushed out of the market by the clone makers.) DEC made some headway but never quite understood that the business high end is not the same market.
So the reason there was no DEC PC and we're not all using DEC clones is cultural. Gordon Bell was - as usual - ten years ahead of everyone else, and worrying about this at the start of the 1980s when VAX was well on its way to making DEC a giant. There are DEC memos about this period at bitsavers.org, and they provide some insights into how DEC failed.
DEC engineering, especially in VLSI, was easily the best in the world. Alpha was a thing of beauty, and an affordable Alpha PC would have killed Intel, MS, and maybe even Apple, and advanced the PC market and perhaps the Internet by five to ten years, and created a completely different culture around commodity computing.
> DEC engineering, especially in VLSI, was easily the best in the world. Alpha was a thing of beauty, and an affordable Alpha PC would have killed Intel, MS, and maybe even Apple, and advanced the PC market and perhaps the Internet by five to ten years, and created a completely different culture around commodity computing.
DEC's Alpha PCs were actually quite reasonably priced, but all the software was written for x86. There was zero chance of Alpha killing Intel or Microsoft without broad software support, and even with it, there's no guarantee it would have won.
> It got squeezed to death between IBM at the high end and the upward expansion of the PC business at the low end.
And laterally from Sun, who were much more aggressive about selling workstations, undercutting DEC on price, performance, and delivery schedules. And the Sun machines also ran 4.2 BSD.
I worked with VMS back in the very late 1980s -- it was quite lovely, but that feeling was based on experiences with the C64, CP/M and MS-DOS.
Somewhere in the back room I still have two Multias, which you mention; they were famous for running Microsoft NT (albeit briefly).
Are you confident of the NT back-story? My understanding (and I acknowledge that much of this is hearsay distorted by time) was that MS had long since planned their IBM-unfriendly exit from the OS/2 co-venture (certainly pre-dating Dave Cutler's hiring). The timing doesn't seem to match up -- though I concede my memory is a bit fuzzy, plus things moved at a different pace back then.
Thanks for the comments. I am absolutely certain of the NT back-story. However, you can read the whole story from another source, Show-Stopper, a book by G Pascal Zachary from the Wall Street Journal (1).
When Cutler was hired in 1988, both OS/2 and Windows were failing in the marketplace. The problems arose after Windows 3 sales started to take off in 1990.
Microsoft tried to get IBM to accept "OS/2 NT" as a replacement for the 16-bit OS/2, which was dead in the water. It refused, IBM and Microsoft divorced, and NT got rejigged for Windows compatibility instead of OS/2 compatibility.
Remember that Microsoft got its power from IBM: it would have been nowhere if IBM hadn't used its DOS and Basic in the IBM PC. IBM then resented Microsoft for "stealing" a small part of IBM's rightful monopoly.
Microsoft would go to any lengths to hang on to the IBM connection, which Ballmer called "riding the bear". The in-house Microsoft strategy for IBM meetings was BOGU (for Bend Over, Grease Up). Hence OS/2.
You may recall that, at one time, Microsoft saw Unix as the potential replacement for DOS. It did Xenix, which was the most popular Unix of its day. However, way back then, IBM had an implacable hatred for Unix and for AT&T, and owning OS/2 EE (not available from Microsoft) was the cornerstone of its plan to bring the PC industry under IBM's control (OS/2 EE, PS/2, MCA, SAA etc).
Yeah, I'm not sure how big a role Cutler leaving played.
DEC (and the other minicomputer makers) generally missed the shift from vertical integration to horizontal integration but what exactly that should have meant for the strategy of a company like DEC is a complicated question. With 20-20 hindsight, there are a number of paths they could have followed although many seem counter to the DNA of a company like DEC such as becoming a Dell.
As an interim step, they could probably have invested more in Unix early-on. Ken Olsen's "Unix is snake oil" quote seems to be as lacking in context as his quote about computers in the home. [1]
> DEC (and the other minicomputer makers) generally missed the shift from vertical integration to horizontal integration
Great comment, and absolutely right. I seem to recall that Andy Grove wrote a whole book about that ;-)
With hindsight, I should have recast my NIH statement, because those developments may have been a function of the desire to own as much as possible of the vertical stack.
I forget which book by Grove it was but I remember reading it when I was doing some strategy research for a client once.
You may be right that DEC's NIH was something of an outcome of vertical integration. It was widespread in that sector and just generally there wasn't anything like the sort of coopetition that would come later. I still remember at Data General getting an absolutely flaming email from a sales rep because we had OEMd a LAN board from DEC for a system after we had switched to some sort of industry standard bus. The email was to the effect that "We're fighting with these guys every day and you guys in corporate turn around and stab us in the back by buying their products."
For a while in the 90s Japanese companies had NODs for exactly the same reason Olsen gives: they didn't want to admit that they were firing people. The form varied, but typical examples would be an assignment to guard a tree or wait in a specified room for someone to give you work. The idea was to enforce isolation, boredom and loss of status to the point where the unfortunate employees would voluntarily quit; a kind of 9-5 solitary imprisonment.
It's actually a really brutal method of downsizing without paying redundancy money. Be careful what you wish for.
In Japan I believe the phrase is 'window gazing' -- the employee / victim is provided an office with a nice view (presumably of things they could be doing instead of draining resources from the company) and left to draw their own conclusions.
In Australia many years ago -- early to mid 1990s -- we had a spate of large public organisations being privatised, with a predictably large number of HR casualties. There was typically a brief queueing system, as some token effort at redeployment was considered ... but in reality, once you'd been side-moted to a Window Gazer role, you knew precisely where you stood.
The phrase used locally was 'being led into / sitting around in Flight Deck' -- that being the name of the frequent flyer awards club of one of our two major airlines at the time. (That airline's completely gone now -- make of that what you will.)
Yes, I've heard about it. It was even mentioned in the "Silicon Valley" TV series. I guess it did the trick before the internet era; now you can be paid for looking at cute cat pictures all day ;) Joking aside, I was skeptical about this being a way to force someone to quit, but after some unbelievably boring projects I totally get it. You just can't spend long stretches playing billiards and table football and browsing HN.
It is brutal and, in some countries, likely to result in a lawsuit, as it can be construed as 'constructive dismissal'. Extraordinarily damaging to the person on the receiving end. It's better to do it the right way: make the position redundant and enable everyone to move on.
This is known in Anglophone countries as "constructive dismissal", and is not a good idea if your area has anything resembling employee rights. It is treated as equivalent to firing someone, but will generally be taken as constituting malice by deliberate creation of a hostile work environment.
It seems that Konami still does something similar: unwanted software developers are assigned to blue-collar jobs. http://www.giantbomb.com/articles/report-reveals-restrictive... The Giant Bomb article describes it as a punishment, but I assume the main intention is to have the programmer take the hint and file his resignation.
I am in two minds about this. It is great to build a culture where nobody is fired, but at the same time if the weight of carrying dead wood gets too high - well you go the way of DEC.
On top of this a lot of dead wood is not dead wood in a different environment - it might be better all round to let people move on to new opportunities.
> It is great to build a culture where nobody is fired
It is great to build a culture where you never need to fire anyone because you are good enough at hiring, and at placing, moving, and counseling (a two-way process!) those you have hired to be able to effectively utilize them. And, if you've done that, then it's good to actually not fire anyone.
But, outside of that, I don't see the value of a culture where nobody is fired; especially if it is achieved through internal segregation, where it cannot avoid being visible to people within the organization that the "not firing" is largely a charade.
I seem to recall reading somewhere (I can't recall where unfortunately) that Ken Olsen was averse to letting people go, and when DEC was forced to downsize in the 90s, the redundancy packages were so generous that it cost the company more to lay people off than to retain them.