
The rise and fall of AMD: How an underdog stuck it to Intel
http://arstechnica.com/business/2013/04/the-rise-and-fall-of-amd-how-an-underdog-stuck-it-to-intel/
======
continuations
It's not so much that AMD "stuck it" to Intel as that Intel shot itself in the
foot. The only reason AMD was successful was that Intel made a gigantic
mistake with NetBurst ("10 GHz or bust!").

When Intel finally rectified that mistake and released Conroe it was game over
for AMD. AMD simply can't compete with Intel whether in architecture or
fabrication.

What AMD could've done was to concede the PC market to Intel and focus on the
emerging mobile device market - the classic disruptive attack. ARM went that
route and now has a market cap 10X that of AMD. Qualcomm adopted a similar
strategy with Snapdragon and now has a market cap equal to Intel's.

Lesson: never fight a dominant incumbent in its own game. You will get killed.
Play a different game. Better yet, invent a new game where you know the rules
best.

~~~
kinkora
>>Lesson: never fight a dominant incumbent in its own game. You will get
killed. Play a different game. Better yet, invent a new game where you know
the rules best.

What about Google[1], Apple[2], Facebook[3], etc? Not disagreeing with you but
I'm genuinely curious about what you think of these companies. I could
probably think of many more examples, but these are the ones that sprang to
mind when thinking of companies that "fought the dominant incumbent in its own
game".

~~~
continuations
1) Google - when Google came into existence in the late '90s there were no
dominant search engines. The orthodoxy at the time was that there was no
money in search. Accordingly, nobody wanted to be a search engine and everyone
tried to be a portal. The dominant player back then - Yahoo - emphasized its
use of human editors and partnered with AltaVista and Inktomi for search.

So Google never really defeated a dominant search engine - it merely took
advantage of a giant vacuum in the market.

2) Apple - Apple never defeated Microsoft in the PC game. Instead it focused
on emerging markets: iPod, iPhone, iPad. Eventually these new products
(smartphones & tablets) became powerful enough to erode the PC market. Even
now Microsoft is still dominant in the PC market. It's just that the PC is no
longer where the action is.

3) Facebook - there were no dominant players in social networking when
Facebook was founded. Myspace was the biggest player, but "big" was a relative
term back then. Even at the height of its popularity the vast majority of the
population were not Myspace users. Sure, if you were a 16-year-old in 2004
then you were probably on Myspace, but outside of that demo Myspace was far
from dominant. Most people were not on any social network. In short, it was a
wide open market.

I'm sure there are examples where a challenger took on a dominant incumbent
head on and won. But those are the edge cases. You pretty much need all the
stars to align perfectly to have a chance. It's not a strategy I'd bet on.

~~~
k__
1) Google > Apple in mobile 2) Apple > Nokia in mobile 3) Facebook > Google in
ads

~~~
thedrbrian
What exactly do you mean by google > apple in mobile?

~~~
k__
After Apple hit the market everyone was buying an iPhone. Now most people have
Android phones.

~~~
fpgeek
Not really. While the iPhone has been influential and a huge success for
Apple, it didn't really kill any of the existing smartphone incumbents.
"Everyone" was only buying an iPhone for fairly limited values of "everyone"
(e.g. excluding several continents). Instead, Android killed off those
incumbents, taking advantage of the opening the iPhone had created.

So it's really Google > Nokia (Symbian), Microsoft (Windows Mobile), Palm and
RIM, with Apple off to the side somewhere. And the complex, multi-faceted
Apple vs Google battle in mobile isn't an upstart vs an incumbent, it is a
battle of two upstarts who have become the incumbents.

------
kps
>AMD began life as a second-source supplier for companies using Intel
processors.

Ars should know better. AMD started in 1969, making their own logic parts. In
the late minicomputer era, their Am2900 series of bit-slice[1] components was
king, being used to build CPUs for models of DEC PDP-11 and PDP-10, DG Nova,
Xerox Dandelion ("Star"), Wirth's Lilith, Atari vector arcade machines, and
countless other machines.

[1] <https://en.wikipedia.org/wiki/Bit_slicing>
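For readers unfamiliar with bit slicing: each Am2900-class part implemented a narrow slice of an ALU, and a system designer built a full-width CPU datapath by cascading slices through their carry pins. A minimal Python sketch of the cascading idea (function names are illustrative, and real slices also provided logic operations, shifts, and register files):

```python
# Toy model of bit-slice design: a chip implements a narrow (4-bit)
# slice of an ALU, and wider datapaths are built by chaining slices
# through their carry-in/carry-out pins.

def alu_slice_add(a4, b4, carry_in):
    """One 4-bit adder slice: returns (4-bit sum, carry out)."""
    total = a4 + b4 + carry_in
    return total & 0xF, total >> 4

def wide_add(a, b, width=16):
    """Cascade 4-bit slices to add two `width`-bit numbers."""
    result, carry = 0, 0
    for shift in range(0, width, 4):
        s, carry = alu_slice_add((a >> shift) & 0xF,
                                 (b >> shift) & 0xF, carry)
        result |= s << shift
    return result & ((1 << width) - 1), carry

# wide_add(0x1234, 0x0FFF) -> (0x2233, 0): the carry ripples through
# four slices, just as four 4-bit parts can form a 16-bit datapath.
```

In the same spirit, the 16-bit machines listed above could be built from four 4-bit slices, with microcode sequencing the slices' operations.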

(Not especially relevant disclaimer: I work for Intel.)

~~~
jfb
Ars hasn't known better in a good long time.

~~~
enraged_camel
Agreed. Their quality has gone down a lot over the past couple of years. Every
article they write is full of mistakes and inaccuracies. Honestly, the only
reason I still visit the site is to see Aurich Lawson's artwork.

~~~
gwern
It's funny that you mention mistakes and inaccuracies, because I'm actually
waiting for the author of the OP article (Cyrus) to get around to replying to
an email I sent him 4 days ago pointing out that the claims about Silk Road in
[http://arstechnica.com/tech-policy/2012/06/fbi-halted-one-
ch...](http://arstechnica.com/tech-policy/2012/06/fbi-halted-one-child-porn-
inquiry-because-tor-got-in-the-way/) are completely false and this can be
proven by merely reading the PDF used as the source for the article.

Thinking about complaining to him on Twitter, maybe if I start being more
public about it he'll bother to do something about a year-long mistake...

~~~
gwern
I pinged him on Twitter and he fixed it within the hour. I guess I've learned
a lesson here.

~~~
enraged_camel
Yeah. I find it kind of sad just how bad people are with email. I hate the
idea of having to rely on Facebook or Twitter to get people to respond.

------
szager
What a wave of nostalgia. I took a job at AMD in 2005, right at the zenith of
their success. I was totally enamored of the great technology that went into
K7 and K8, and I was ready to help this underdog company stick it to Intel and
turn the microprocessor world on its ear.

I worked there for six years, through fumble after disaster after boondoggle.
I could go on and on about all the reasons I believe AMD went down the tubes
-- and I'm really looking forward to reading the second installment -- but I
think a lot of it reduces to the dysfunctional corporate culture alluded to in
this piece.

During my time at AMD -- and the old-timers confirmed that it was ever thus --
there was _always_ the sense that every project was make-or-break for the
company, that we were always on the brink of disaster. Long-term strategic
planning is simply not in the company's DNA. We lurched around like a headless
chicken, and when -- through a combination of good products and missteps by
Intel -- AMD finally got a taste of success, we squandered it in the most
ham-handed and disastrous (and predictable) way.

P.S. Bulldozer project was a total horror show, beginning to end.

~~~
StringyBob
> P.S. Bulldozer project was a total horror show, beginning to end.

I'm intrigued. In your opinion what were the pain points - Management?
Schedule? Technical? Foundry? Methodology (EDA synthesis vs old school hand-
layout etc)? Something else?

All of the above?

~~~
szager
The whole project was over-scoped and over-ambitious. Flush with the success
of K8, AMD decided to undertake a completely new from-scratch processor
design. New architecture, new cell libraries, new design methodologies, new
tools -- we chucked everything out and started over completely.

It's like AMD had their first taste of champagne with the success of K7/K8,
and we immediately got drunk and fell on our face.

I would also like to single out for opprobrium Bruce Gieseke, the technical
director of the project, who shoved an utterly impractical and labor-intensive
design methodology down our throats, and would not relent even when it became
clear how much it handicapped the project. We should have been trying to
synthesize much more of the chip from day one.

Overall, the project was plagued by delays, bugs, and dead ends. It was way
over budget and well past schedule; in the end, it came to market at least two
years too late to have an impact. By the time Bulldozer-based products reached
market, the technical innovations of the new architecture had already been
bested (or at least matched) by Ivy Bridge. And of course, Intel has Haswell
on deck; AMD, having poured all its resources into Bulldozer, has nothing left
in the tank.

But Bulldozer was also disastrous for all the resources it leeched away from
other projects, and for the way it focused the company's energy on a product
whose market was at best flat, if not yet shrinking. There were some really
promising projects that got cancelled so that AMD could throw more engineering
resources at Bulldozer.

management: fail

schedule: fail

technical: would have been awesome in 2009

foundry: Working with Global Foundries was not entirely smooth, but it would
be inaccurate to lay too much blame here.

methodology: fail++

~~~
StringyBob
Thanks for the write-up. In particular, I'd seen comments[1] that seemed to
imply big disputes between the old school (highly tuned custom chip layout) vs
new school (progress via faster iteration with synthesis/automation) design
styles.

[1] e.g.
[http://www.xbitlabs.com/news/cpu/display/20111013232215_Ex_A...](http://www.xbitlabs.com/news/cpu/display/20111013232215_Ex_AMD_Engineer_Explains_Bulldozer_Fiasco.html)
(ignoring the misleading interpretation in the article itself).

~~~
szager
I worked with the engineer cited in that article. He left the company, on less
than amicable terms, pretty early in the bulldozer project. I'll just say he
lacks credibility, and leave it at that.

------
TikiTDO
From my perspective, AMD really screwed up the AMD/ATI merger in the worst way
possible. When AMD bought ATI they were well positioned to beat everyone to
market with their APU chips. However, the problems started almost immediately
when neither AMD nor ATI did anything to combine resources. They should have
put in the time and money to merge teams at all levels of the company. Instead
they built a few small groups that included senior personnel from both units,
then left most of the lower-level teams to work on whatever they had been
working on previously. Never mind the fact that there were a lot of really
clever people all over the place ready to contribute really great ideas.

In other words, there was zero global direction down the ranks. It was just
business as usual: keep doing what you've always been doing, and maybe we'll
show you some nice slides a few times a year about how great APUs will be.
This lack of organization meant that no one had any idea what anyone else was
doing. Worse -- even if you wanted to find out, there was absolutely no
company-wide documentation or organization on anything. Your only hope for
getting information was that one of your co-workers had bookmarked some
magical page with the info you required. This just got worse when you
accounted for the problem of elitism. The hardware teams were just so much
better than the software teams -- after all, software is easy, so what sort of
useful input could those code monkeys offer? And far be it from the software
teams to actually talk to someone from the QA teams; those QA people were
beneath notice. Finally, add in a very wide distribution of personnel
seniority, insane levels of paranoia about job security, grade-school-level
office politics, and completely disparate management styles, then hit blend.

So really, the results are not at all surprising. You can't have two companies
pretend to be one while playing tug-of-war, and still be competitive.

~~~
marshray
> They should have put in the time and money to merge teams at all levels of
> the company

How would forcing the teams to merge quickly rather than gradually achieve
anything other than messing up everyone's development schedules?

~~~
qompiler
Yes because teams are like liquids, just pour them in a cup and stir that
stuff up. _sigh_

~~~
oxide
that's not what was said at all. _sigh_

------
JVIDEL
How I remember those days! After the complete disaster that was the first
PIIIs came the Athlon. It was just so amazing, so unbelievable that you could
get that much power without breaking the bank.

And it was the same story until about 2006: you could get an Athlon or an FX
and have more than enough CPU power to run almost anything on the market at
the time. Even the Sempron, which was the cheap option, was _good enough_. I
remember guys in forums getting the mobile versions and OCing 'em to the very
limit.

But then came the Core 2 Duo, and AMD literally had nothing against it. The
first Phenom sucked, big-time; there is no other way to put it. The Phenom II
was much better but not good enough, and the only talk about it was how you
could turn on the disabled cores in the X2 and X3 variants, since they were
the same silicon as the quads.

I really wanted the FX to be as good as the Athlon-era FX, but it wasn't, not
even close.

------
AlexDanger
Hang on, what about consoles? AMD are doing the CPU and GPU for the PS4 and
presumably the next Xbox if rumours are true.

Surely this is quite a windfall? Particularly if they're pumping out the same
part for ~5 years, much longer than the average CPU stays on the market. Is
this not a cash cow?

I'm disappointed AMD are no longer competitive in the desktop CPU performance
market, but I'm not sure I understand _why_ they can't compete with Intel on
CPUs given the stable revenue offered by consoles and their competitive GPU
line. Is it pure R&D budget? Are Intel too far ahead tech-wise? Do Intel have
the best engineers? What is it?

~~~
seanmcdirmid
Console parts are commodities, not cash cows. I'm sure AMD will make a profit,
but not much. NVIDIA and Intel never made much money here either.

------
cyraxjoe
Apparently AMD will have the upper hand in the next-gen video game consoles:
PS4 -> <http://en.wikipedia.org/wiki/PlayStation_4#Hardware> Xbox 720 ->
[http://www.digitalspy.co.uk/gaming/news/a471564/xbox-720-to-...](http://www.digitalspy.co.uk/gaming/news/a471564/xbox-720-to-
use-amd-processor-lacks-backwards-compatibility.html) or at least that's the
rumor.

~~~
SG-
The margins are likely extremely low, and performance doesn't seem to be
anything amazing; it's likely current- or slightly last-generation hardware.
Don't forget that consoles still have to sell for <$500 and make some profit.

~~~
yvdriess
Indirectly, it does make AMD CPUs and GPUs more attractive to PC gamers:

[http://www.eurogamer.net/articles/digitalfoundry-future-
proo...](http://www.eurogamer.net/articles/digitalfoundry-future-proofing-
your-pc-for-next-gen)

~~~
deelowe
Which is another shrinking market. AMD needs a mobile and/or GPU/CPU
integration strategy. I think ARM is about to eat everyone's lunch for
consumer products. Not sure what may happen with Intel, as there's still some
life left for servers.

~~~
continuations
> Not sure what may happen with Intel, as there's still some life left for
> servers.

For now. But sooner or later ARM will become powerful enough to run servers.
The chips are cheaper. The operating costs will also be lower as ARM draws
lower power. Before you know it data centers will switch over to ARM, first as
a trickle, then as a deluge.

ARM will do to x86 what Intel did to SPARC & Alpha & PA-RISC & MIPS. Intel
will have to keep retreating to higher and higher end niches.

~~~
thornkin
What do you think is the moat that will keep Intel from doing to Arm what they
did to the RISC chips? Intel has better process and very capable designers.
They are quickly moving down-market. Why assume that Arm moves into Intel's
territory before Intel moves into Arm's?

------
adventured
AMD has only stayed in the game to begin with because Intel was 'forced' to
license x86 to them in order to avoid government anti-trust prosecution.

~~~
Osiris
And we should be glad that they did, or history may have turned out very
differently. I doubt that an Intel without any real competition would have
innovated as quickly as it has.

~~~
adventured
It's a very interesting notion, that we'll never get to test unfortunately.

Without the artificially propped-up AMD perhaps the market would have served
up a far superior competitor than a half-the-time mediocre AMD that is now on
life support.

ARM is tremendous evidence that the market can produce competition to beat
Intel. I think the ARM model is kind of like how Windows & Android both
managed to beat Apple in market share, with regard to the business model (ARM
being distributed amongst numerous competitors, advancing the market faster
than it would with a solo supplier).

It's also worth considering that it nearly always takes a new epoch /
inflection / radical shift in markets to dislodge industry standards (and that
as a consequence AMD may never have stood a real chance). Which is also why
ARM now has a chance to rock Intel.

I'm not aware of too many standards that have been killed off in tech without
a big shift in the underlying technology segment in question.

------
chollida1
I found it interesting that Bill Gates had a hand in AMD buying NexGen, the
company that gave AMD its K5/K6 technology and brought them Raza.

I don't really know what was going on between Microsoft and Intel around
1995, but it seems pretty clear that Bill wanted to have at least two x86 chip
suppliers available for PC manufacturers.

------
richardjordan
A book recommendation relevant here: Inside Intel is a great book about the
founding and the early-days sparring with AMD. It's old -- I bought my copy in
the late nineties en route to Silicon Valley from London -- but it's a great
scene-setter and backgrounder for these sorts of stories. Great read.

~~~
AlexDanger
So whats its called and where can we find it?

~~~
caw
It's called "Inside Intel", and appears to be available at Amazon.
[http://www.amazon.com/Inside-Intel-Worlds-Powerful-
Company/d...](http://www.amazon.com/Inside-Intel-Worlds-Powerful-
Company/dp/0452276438/ref=sr_1_1?ie=UTF8&qid=1366642565&sr=8-1&keywords=inside+intel)

------
ams6110
AMD also had a niche in supercomputers, for example the Cray XT/XE series. Not
sure the absolute volume really made a difference there though, the number of
Cray sales would be dwarfed by ordinary PCs and servers.

~~~
wmf
Cray's next generation is going Intel BTW.

------
rorrr2
This is a very scary image for AMD:

<http://i.imgur.com/pCiDEKP.png>

Their revenue is falling, and falling consistently.

They've pretty much lost as a CPU manufacturer in every area - desktop,
laptop, tablet, phone.

Their only option at the moment is to keep making the best damn APUs and GPUs,
and try to invent something new. I wish they came out with a 1000-core CPU or
something.

~~~
lucian1900
Unified CPU and GPU memory is an awesome concept that game developers in
particular love (but not just them). We'll see if unified memory in both the
next-gen consoles will spur something interesting in PCs. I'm not holding my
breath though, since Intel have no interesting GPU story.

~~~
macspoofing
It makes a lot of sense for laptops and tablets and console-like PCs (e.g.
SteamBox). There may be a market there for AMD.

~~~
lucian1900
But for the former and the latter, people are likely to wish for Intel CPUs
instead (even if they don't actually need them).

