
How AMD Won, Then Lost - geerlingguy
http://hackaday.com/2015/12/09/echo-of-the-bunnymen-how-amd-won-then-lost/
======
systemshacker
This article is a recounting of the public story. The real truth is much more
nuanced, with plenty of internal reasons (AMD veteran here).

AMD made a lot of profit with x86-64 (the Opteron family) around 2004-2005.
However, that got to management's head and there was a series of missteps:

* Inorganic growth: The company went from small teams with a startup culture to larger teams with many projects. AMD acquired large teams from HP in Fort Collins and Sun in Boston (the Millennium chip team) in one fell swoop. This slowed projects a lot while assimilating and learning to work together across very different cultures and methodologies.

* Mid-management from IBM: Since the company was growing larger, a bunch of VPs from IBM were hired. They tried to bring in IBM-style processes, which do not work when you do not have a captive market like IBM and your competitor is Intel :)

* Too many projects: The growth in people and management resulted in everyone wanting their own chip project instead of working on derivatives of existing projects. Too many projects were conceived, had cycles spent on them, and were then cancelled.

* Paid too much for ATI: Bought them for $5.4 billion in 2006 when they could have waited till 2008 and bought them for $1 billion :) They had to write most of the ATI value off their books and took a charge for it.

~~~
qb45
> Paid too much for ATI: Bought them for $5.4 billion in 2006 when they could
> have waited till 2008 and bought them for $1 billion :) They had to write
> most of the ATI value off their books and took a charge for it.

But was it really that predictable back in 2006?

2008 sounds like the year when Intel killed the 3rd-party chipset market.
Which, afaik, wasn't expected by anyone.

~~~
FreakyT
Ah, that was the year that happened? I still remember the dark ages of having
to pick one of several chipset manufacturers, each with their own set of
strange problems.

Really, I think moving to chipsets made by the CPU manufacturer was an overall
improvement.

~~~
qb45
Part of those chipsets' appeal was integrated GPUs. Not everyone wanted to buy
a discrete GPU, and Intel's own IGPs of the early '00s were rather poor.

------
superbatfish
This article omitted one interesting part of the story, which is how AMD led
the way to the x86_64 architecture while Intel was attempting to migrate to a
conceptually radical architecture in its own products (Itanium). As I
understood it, AMD's success in the early 2000s was largely due to their
leadership in 64-bit, and Intel had to play catch-up to adopt AMD's approach.
(But when they did, they reclaimed the throne.)

~~~
burnte
They were able to lead with AMD64 because of their incredible
price/performance at the turn of the century. In the late '90s the K6-x chips
were breathing new life into Socket 7 motherboards and existing systems, and a
couple of years later the Athlon was saving people a hundred or more dollars
to get a system that outperformed the Pentium 4 and produced significantly
less heat. This helped them grow their market share significantly, which meant
that when we needed 64-bit computing, we had two viable companies to pick
from, rather than one giant and one minor also-ran.

Intel still acted like they were the market dictator (which they weren't at
that time) and said "IA64 is the way forward, dump x86!" This was astounding
coming from Intel, as for decades they had been the ones banging the
backwards-compatibility drum, making sure each new x86 CPU could run the old
existing code. Now suddenly they were saying dump decades of installed base.
AMD took advantage of their increased market share and said, "Hey, we have a
64-bit solution for x86 that keeps your old code and lets you create 64-bit
code for the future without learning a new, complicated architecture." IA64
was EPIC, meaning Extremely Parallel Instruction Computing. AMD64 was x86 with
a whole new set of 64-bit extensions but with x86 compatibility and
familiarity, which is exactly what Intel did when they extended x86 into the
32-bit world with the i386. Those two factors combined are why AMD64 became
the de facto standard. Neither would have helped without the other.

~~~
mnw21cam
EPIC - Explicitly Parallel Instruction Computing. See
[https://en.wikipedia.org/wiki/Explicitly_parallel_instructio...](https://en.wikipedia.org/wiki/Explicitly_parallel_instruction_computing)
That means that the compiler works out the parallelism rather than the CPU,
which is meant to reduce the amount of silicon wasted on instruction
decode/parallelism.

Anyone remember Transmeta?

~~~
yvdriess
Yes, Transmeta was bought by Intel. x86, despite its quirks, does not seem to
be holding back the hardware too much. The silicon for decode/ILP does a
fantastic job.

~~~
Symmetry
Intel just licensed their patents, the same way that NVidia did. NVidia later
hired much of the team, and their Project Denver core is essentially the
Efficeon 2.0.

------
smegel
I remember the days of the Athlon X2 versus the Pentium 4... AMD was truly the
king back then. Combined with nVidia producing an awesome AMD chipset
(nForce), they were glorious days. Then AMD went weird and started focusing
on these dual-socket platforms instead of just improving their stand-alone
CPUs... the Intel Core came out... and the rest is history. Sad really.

Later (~2006) I remember the Athlon X2 being produced in limited numbers, and
the eBay price went through the roof as people tried to upgrade from
single-core without having to replace every other component in their system to
jump to Phenom.

~~~
radoslawc
Not to mention the first Duron processors, originally clocked at 600 MHz,
which by shorting some bridges could run at 900 MHz or even 1 GHz; at the time
there wasn't a CPU that fast (consumer grade) and that affordable. The other
thing was the bare die, which could be crushed while mounting the heat sink.
Later on my hands shook even more while mounting the heat sink on the
next-generation Barton, which could be overclocked from 2500+ to 3200+. Ah,
the old days.

~~~
krylon
> The other thing was the bare die, which could be crushed while mounting the heat sink.

A friend of mine did that once. Broke off the tiniest bit from the corner of
the die while trying to mount the heatsink, but that killed the CPU. He had
been saving money for that CPU for months and was understandably unhappy.

~~~
SixSigma
I almost know his pain. I snapped the plastic posts on my SIMM connectors
once.

------
atomicbeanie
Paying a customer not to buy a competitor's product is a form of price
fixing. Other Intel customers would then not know what the real market price
was. Intel was therefore undermining AMD's opportunity without the market
being able to take advantage of the price war between them.

Intel seems over-ripe for anti-trust action at this point. I think the
government is in all likelihood coddling them. This is not a technical issue;
I suspect it is far more a case of Intel taking advantage of its strong
political clout.

~~~
raverbashing
But Intel did have an anti-trust action against them and settled

------
rdc12
While I don't think the ICC situation was fair, I am not convinced that it
played a big role in the story of AMD. Was ICC ever really used outside of
niche markets, like HPC?

I would bet that most of the media players at the time would have been using
hand-written assembler for the bits that really needed SSE.
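
(To illustrate, here's a minimal sketch of a hand-rolled SSE hot loop using C
intrinsics; the function name and the gain-scaling example are made up, and
real media players often used separate .asm files instead, but the idea is the
same: the author pins the SIMD instruction selection by hand instead of
relying on whatever the compiler's dispatcher decides to emit.)

    #include <xmmintrin.h>  /* SSE intrinsics (GCC/Clang/MSVC) */

    /* Hypothetical hot loop: scale a buffer of float samples by a gain,
       four at a time, with explicitly chosen SSE instructions. */
    static void scale_samples_sse(float *buf, int n, float gain)
    {
        __m128 g = _mm_set1_ps(gain);        /* broadcast gain to 4 lanes */
        int i;
        for (i = 0; i + 4 <= n; i += 4) {
            __m128 v = _mm_loadu_ps(buf + i);         /* load 4 floats */
            _mm_storeu_ps(buf + i, _mm_mul_ps(v, g)); /* multiply, store */
        }
        for (; i < n; i++)                   /* scalar tail for leftovers */
            buf[i] *= gain;
    }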

~~~
Lan
There were notable instances of benchmarking suites being compiled with ICC.
So even if most applications didn't exhibit the issue, reviews for AMD
processors did. And it was a prominent enough issue that a tool was written to
search your PC for software compiled with ICC and patch it to work correctly.

------
metrix
According to this article, Zen has not taped out yet:

Devinder Kumar, AMD CFO : Zen was a clean sheet design that started a few
years ago. We are in the final stage of executing and you know the milestone
that you want to hear us talk about is Zen taping out, which should be over
the next several months, and then putting samples in the hands of our
customers and then starting first full year of revenue in 2017. And by the
way, because we have this reuse approach for cores, you will see us with Zen
cores in the high-end desktops first and then the servers from our overall
products standpoint.

Read more: [http://wccftech.com/amd-confirms-zen-coming-highend-
desktops...](http://wccftech.com/amd-confirms-zen-coming-highend-
desktops-2016/#ixzz3tsoNfrvK)

------
RUG3Y
I love my AMD CPUs; on paper they're not as "good", but they do what I need at
a great price.

------
scott_karana
This seems to be missing some of the technical side of Intel's failure and
rebirth:

Downturn: the heat- and power-intensive single-core Pentium 4s, and the
stillborn, binary-incompatible IA64/Itanium lineup, which together prevented
them from competing on both of AMD's marketing bullet points, multicore and
64-bit.

Rebound: dedicated, efficient "premium" laptop chips in the Centrino lineup,
to which AMD had no competitor, and which were subsequently reunited with the
P4's Hyperthreading, _plus_ multicore, in the "Core" lineup/microarchitecture
that debuted in the newly-Intel, no-longer-PowerPC MacBooks.

(I seem to remember hearing that Core-era Xeons were good server-side; Nehalem
_definitely_ was, just afterwards.)

------
technofiend
I guess the real question is: will AMD have a compiler to match Intel's when
Zen is released?

~~~
S_A_P
I'm surprised nobody's hacked the Intel compiler to compile for AMD chips. I'm
sure it's for fear of litigation.

~~~
berkut
You don't need to these days (since 2011) - there have been numerous
benchmarks over the years showing ICC generating binaries that perform better
than MSVC++'s on AMD processors.

~~~
throwaway7767
Interesting. I guess they must have finally stopped intentionally crippling
ICC for non-Intel processors[0].

[0]
[http://www.agner.org/optimize/blog/read.php?i=49#49](http://www.agner.org/optimize/blog/read.php?i=49#49)

------
ksec
I am, perhaps naively, thinking we may have hit the wall of single-threaded
performance. Skylake wasn't that much of a difference, even though I believe
that with some compiler optimisation and recompiling, some software may get
another 10% benefit.

We could run at higher clock frequencies, but the CPU will heat up.

And when most of the world doesn't need a high-performance PC any more, maybe
AMD does stand a chance this time around, as long as Zen is within 10-15% of
Intel's performance. They have lots of headroom to work with in the server CPU
sector, and with their APUs at the lower end.

The problem with AMD is that they have never been good at execution, so even
if Zen is good on paper, AMD will likely misposition their product, fail to
market it, or get fxxked by GF production issues.

~~~
krylon
> And when most of the world doesn't need a high-performance PC any more

Well, that is because we _do_ have very high performance PCs these days.
Somewhere on Wikipedia, there is a quote by somebody (I forget who...), that a
supercomputer is a device for turning a CPU-bound problem into an IO-bound
problem. By that definition, even modest PCs these days _are_ supercomputers
for the majority of programs ordinary people run on them.

~~~
BuildTheRobots
The quote is from Ken Batcher (I liked it enough that I went digging).

[1]
[https://en.wikipedia.org/wiki/Ken_Batcher](https://en.wikipedia.org/wiki/Ken_Batcher)

~~~
krylon
Thanks!

------
golergka
I wonder how the Western mentality can bend morality rules to always favor the
underdog.

The "immoral" practices the author accuses Intel of are developing a compiler
that favors its own chips and offering low prices and better deals to OEMs.
Let me tell you something: when you give someone one billion dollars, you
don't hold them to "ransom"; they're upholding an end of the deal that they
voluntarily agreed to. And losing money to offer better prices and gain a
bigger market share is not something even remotely immoral.

Now, about compilers: Intel have never pretended that the Intel compiler is
supposed to work just as well on other manufacturers' CPUs as on Intel's. It
says so even in the marketing benchmark picture provided with the post. It was
the individual developer's decision: whether he wanted to get equal
performance on different platforms, or whether he preferred to sacrifice
performance on AMD in order to get more on Intel. Microsoft, Borland, gcc,
LLVM and other compilers exist; if a developer chooses the Intel compiler
instead, it's his decision to make his software run slower on AMD. How is
offering such an option immoral?

However, the author completely glosses over the fact that AMD
reverse-engineered Intel's product and released its clone. To me, this
actually seems like something not only immoral but quite possibly illegal, and
definitely something worse than Intel's deeds. But of course, since AMD is
smaller, it doesn't have to adhere to the same moral standard.

~~~
ikurei
> How is offering such an option immoral?

Not an expert, but _if_ Intel were intentionally crippling the performance of
its compiler on AMD CPUs, I'd say that'd be immoral. That's unfair
competition.

If it's just that they didn't spend any time testing and optimizing for the
competition's CPUs, then... I'm not sure what to think of it, but I doubt I'd
come out on the immoral side of the argument.

EDIT: s/disloyal competition/unfair competition/g.

~~~
TazeTSchnitzel
It wasn't just AMD; Intel was crippling performance on any non-Intel CPU. All
x86 CPUs have an instruction, CPUID, that allows you to check for the presence
of various features and instruction sets (MMX, x86-64, SSE, etc.). The
standard way to generate backwards-compatible code that uses newer
instructions is to check the flag and run the version with the new feature if
the flag is there; otherwise, use the older instructions.

Intel, though, would check not only whether the CPU supported a feature, but
also whether the CPU was made by Intel. Essentially, Intel's compiler would
generate code like "if Intel, run fast; if not Intel, run slowly."

