
Has AMD's Clock Run Out? - dlevine
http://blog.thirdyearmba.com/has-amds-clock-run-out
======
KeyBoardG
Times are tough, but I sure hope not. First to 64-bit, first to multicore x86,
first to an on-chip memory controller, first to an APU. We need them to keep
pushing the envelope.

I'd love to see AMD at a 22nm process with Intel and then compare them.
They've managed to stay close even two process generations behind (the chip
that just came out today is at 32nm).

------
beagle3
Sad.

If AMD hadn't been there in 2004, forcing Intel to (effectively) drop the
Itanic and do what customers actually wanted - the AMD64 platform - we'd
either be much farther back now in terms of x86, or much farther along in
terms of ARM (or both).

AMD's gutsy move to do AMD64 helped us all.

~~~
masklinn
Not so much the Itanium (which simply nobody cared for) as the P4 dead-end,
forcing Intel to come back to the P6 architecture (via the Pentium M).

They showed the path to better integration and performance (on-chip memory
controller, significantly better interconnect, better multicore integration)
as well.

It's just sad how much they lost their way since the Athlon 64, and how
Intel's Core just curb-stomped them.

~~~
beagle3
Actually, it was very much the Itanium. At the time, Intel had no plans
whatsoever to introduce a 64-bit x86 architecture - they were the only game in
town for a long time, and believed they could force the 64-bit market to be
Itanium. HP and Compaq/Alpha had already given up on their 64-bit offerings at
that point, and either blessed Intel as the 64-bit heir (HP) or sold theirs to
them (Alpha). The only other game in town was SPARC, which Intel wasn't really
facing in the 32-bit market.

While it is true that the P4 was going nowhere, it was the 64-bit market that
forced Intel to reconsider their road map; if it hadn't been for AMD, 32-bit
might have sped along to Core, or stayed at P4, but we'd be nowhere close to
where we are today.

------
programminggeek
I am planning on building a mini-ITX system with a 65-watt A10 chip sometime
soon, but AMD's Trinity didn't launch with mini-ITX motherboards, and the
65-watt chips aren't on Newegg yet.

The small-form-factor gaming/HTPC market could be a good one for AMD, but they
haven't been able to get other companies to build such machines in volume.
Usually you get the EeeBox or something underpowered using an E-350. Heck, HP
is sticking those in full desktop cases, which is absurd.

All the innovation left in the PC market seems to be in tablets and
ultrabooks, not on the desktop at all. AMD hasn't pushed as hard on either
form factor, and it's hurt them.

Also, why hasn't AMD done what Nvidia did and become an ARM chipmaker? Tegra
has sold well enough for Nvidia, and an ARM desktop box could be quite
competitive in the next few years for the average user.

~~~
dlevine
The problem with AMD making ARM chips is that they would be cannibalizing
their own market.

Their best move would probably be to license the ARM platform and use their
chip designers to make custom chips (like Apple and Qualcomm have), but I
don't think this is realistic. Also, I'm not sure whether any of their GPU
technology is low-power enough to be useful in ARM designs.

~~~
stephengillie
Since they didn't cannibalize their market themselves, someone ate it out from
under them.

------
mtgx
I don't know if AMD has anyone to blame but themselves. Over the past few
years, I kept listening to the statements of their CEOs, and I got the
impression that they "don't get it" and aren't very visionary.

When they were supposed to do something about the mobile market, they said
they would "wait and see". That was even after Nvidia made the right move and
created Tegra. Nvidia was clearly a more visionary company than AMD, and
Nvidia will survive because of this. It might even out-survive Intel because
of its move to ARM. AMD won't. They'll be crushed by both Intel and the ARM
chip makers.

~~~
fnordfnordfnord
Who in their right mind in the chip industry would consider taking the CEO
job at AMD? In order to thrive, you'd need to be smart, lucky, and able to
convince Intel to let you. Intel, on the other hand, seems interested in the
survival of AMD, but only inasmuch as it keeps the FTC and the Justice Dept.
from sending nasty letters about monopolistic business practices. Add to
that, they both have to worry about ARM eating their lunches.

tl;dr Nobody with any sense would take the CEO job at AMD.

~~~
seunosewa
Not true, considering how much CEOs are paid regardless of whether the
companies they manage ultimately succeed or fail. It would be financially
irresponsible to turn down a CEO job offer at a high-profile company.

------
lunarscape
If game consoles do well in the next few years, AMD will do okay. The Wii U
has AMD graphics, the PS4 will, and the Xbox successor is rumored to have AMD
too.

~~~
SkyMarshal
Yes, turns out that ATI acquisition may save them. Nice hedge against
struggles on the CPU side.

~~~
ekianjo
It turns out ATI's death sentence was effectively pronounced when it was
acquired by AMD. ATI is now far behind Nvidia, and there is no catching up
now that they depend so much on AMD's situation.

~~~
rhengles
"Far behind"? Please elaborate. This may apply to the CPUs, which is
unfortunate. But the GPUs are very competitive.

------
darrennix
I wonder what the market effect will be if AMD exits the picture entirely.
Once Intel has near-total market share for PC chips, wouldn't they become a
prime antitrust target?

~~~
asdf333
ARM processors are starting to move up into the higher end as well, like
servers. On an energy/computation basis they make more sense than x86
processors right now.

The only issue is that software needs to be optimized for ARM processors but
if the savings are there, this will happen pretty quickly.

So I'd actually be worried for Intel as well. Not just AMD.

~~~
iyulaev
>> On an energy/computation basis they make more sense than x86 processors
right now.

Does anyone have ANY numbers that back this up? I've heard this refrain many
times, but I've never seen any hard numbers to prove it. On a processor-per-
processor basis, sure, an ARM SoC beats an Intel Xeon on power. But per FLOP
or per Dhrystone, x86 destroys ARM processors. In a virtualized world, where
the number of physical systems doesn't have to match the number of servers,
x86 still appears to hold the lead.
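One way to make the refrain concrete is to normalize a benchmark score by power draw. A quick sketch of that arithmetic; note that the chip names and all figures below are illustrative placeholders, not measurements:

```python
# Back-of-the-envelope energy/computation comparison.
# NOTE: every figure here is a made-up placeholder, not a measured number;
# substitute real benchmark scores and measured wall power to get anything
# meaningful.

def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark score normalized by power draw."""
    return score / watts

# (name, hypothetical benchmark score, hypothetical power draw in watts)
chips = [
    ("arm_soc", 2_000, 5.0),     # low absolute score, very low power
    ("x86_xeon", 60_000, 95.0),  # high absolute score, high power
]

for name, score, watts in chips:
    print(f"{name}: {perf_per_watt(score, watts):.0f} points/W")
```

Depending on the numbers you plug in, either side can "win": per socket the Xeon dominates, but per watt the comparison can flip, which is exactly the ambiguity being pointed at here.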

~~~
cmccabe
ARM cores tend to be more power efficient for three main reasons: they operate
at a lower clock frequency, they don't have the CISC legacy baggage, and the
low-level software guys have spent a lot more time on power management there
than on x86.

It has to do with the evolutionary heritage of both systems. Most x86 systems
are still sold to individuals or small businesses that will plug them into the
wall and forget about power dissipation. A typical x86 desktop machine will
draw between 300-500 watts. ARM evolved more for the cell phone and tablet
market, and typical power consumption for one of those systems would be under
5 watts.

~~~
jrabone
_A typical x86 desktop machine will draw between 300-500 watts._

With a beefy GPU, perhaps. The new i3-3220T is only 35W max TDP flat out - a
typical (i.e. non-gamer) rig is looking more like 150W max, and a lot less
idle (20W should be possible). Not 5W, but nowhere near 500W.

~~~
cmccabe
20W, 200W, what's the difference? Either way, you can't get that out of the
battery on a mobile device. And consumers don't generally buy PCs based on
power dissipation. Sad, but true.

And if you think your typical beige-box PC can handle a power supply that is
specc'ed for 150W-- go ahead and put one of those in there. I DARE you.

~~~
jrabone
The lower figure of 20W is pretty standard: go take a look at the battery in
your laptop. Most Dells have 65 Wh batteries - i.e. 20W for about 3 hours of
battery life (to a first approximation; Li-ion is a bit more complicated than
that).
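The arithmetic above is just capacity divided by runtime; as a sanity check (first-order approximation only, ignoring real Li-ion discharge behavior):

```python
# Average power draw from battery capacity and runtime, to first order.
# 65 Wh over 3 hours works out to roughly 21.7 W -- i.e. "about 20W".

def average_draw_watts(capacity_wh: float, runtime_hours: float) -> float:
    return capacity_wh / runtime_hours

print(round(average_draw_watts(65, 3), 1))  # -> 21.7
```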

The ubiquitous small form factor PCs like the Optiplex 780
([http://www.dell.com/downloads/global/corporate/environ/compl...](http://www.dell.com/downloads/global/corporate/environ/comply/optix_780_dccy.pdf))
use a 235W (max) power supply, which will be overspecced to trade off failure
rates for manufacturing cost. Those machines actually draw less than 150W flat
out. And they're everywhere. A certain large e-tailer with an emphasis on
frugality used to use them as developer desktops(!).

Who knows what's in a typical consumer beige box, but it isn't pulling 500W
_continuously_ , unless they're playing, say, Skyrim 24/7 with a big graphics
card - in which case of course one would specify the correct (safe) component
for the design. I'd argue that they're not typical by that point; most people
won't spend £300 on a graphics card (I do).

~~~
cmccabe
So to recap:

* You point to a 235W power supply as an example of the bare-minimum PC power supply-- not too far from my 300W round number.

* You point out that a power supply rated for X isn't drawing X continuously-- a true statement, but it responds to an argument nobody made. You have to pick a power supply rated for your max load-- everyone knows that, or should. It still doesn't change the fact that both max load and average load for x86 are orders of magnitude greater than for most ARM devices.

------
JVIDEL
The thing about AMD is that for every advantage it has two or more
disadvantages. For example, Trinity rocks, but the number of laptop models
available with it is staggeringly low, and in most cases you have to make do
with low-end specs: crappy screens, no SSD option, and a case which feels
like it was made from recycled Compaq PCs from the 1990s.

Which is ironic, since Trinity could drive a retina-like display without a
discrete GPU on the side, yet I could only find a handful of Trinity laptops
with optional 1080p displays, and two of them weren't available stateside.
The only performance unit I could find was made by MSI.

There's nothing like the Zenbook or the ENVY15 available, so in the end the
problem isn't a compromise on CPU power alone but on nearly everything else
too. So you have to choose: either you get a good laptop or an AMD laptop,
and that's not fair.

I guess AMD should start working more closely with OEMs to make sure its APUs
are available on products that are not all bargain-bin units but at least some
mid-to-high end units with good features and build quality.

That or do like MS with the Surface and make their own highend laptops and
tablets.

------
zanny
I don't buy this story at all, mainly because AMD never could fight Intel in a
straight-up fight. AMD is at least an order of magnitude smaller than Intel -
so much so that Intel spent more money on R&D
([http://newsroom.intel.com/community/intel_newsroom/blog/2011...](http://newsroom.intel.com/community/intel_newsroom/blog/2011/01/13/intel-
reports-record-year-and-record-fourth-quarter)) than AMD made in total
revenue ([http://phx.corporate-
ir.net/phoenix.zhtml?c=74093&p=irol...](http://phx.corporate-
ir.net/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1652123&highlight=)) last
year.

That isn't even about monopolistic business practices, decisions, or market
forces. You are comparing two companies operating on effectively different
planes of existence. Intel owns the instruction set, has the most advanced
silicon fabs in the world (and still makes their chips in house) and spends
more on R&D than AMD even makes. And _all_ Intel does is make CPUs.

Meanwhile, AMD bought ATI and took a tremendous gamble on APUs. They are
_just_ starting to mature their APU line with Trinity in the last few weeks,
and are _still_ reeling from integrating two large companies like that. They
had to sell off their own fabs, and couldn't even make their most recent
generation of GPUs at GlobalFoundries because it isn't keeping up anymore. On
that front, the 7000 series graphics cards (from my objective viewpoint)
basically crushed Nvidia for the first time in a while. They were first to
market, as a result didn't have major shortages, and cut prices at the
appropriate times to keep their products competitive. It took Nvidia almost
half a year to get their GPU line out after AMD's, and at competitive prices
their chips are almost exclusively OpenGL/graphics devices: in GPGPU
operations they are beaten by Nvidia's own old 500 series and easily by AMD's
7000 series, because Nvidia went with many limited-pipeline cores instead of
the more generic cores (in the 500 and 7000 series) that were better at
arbitrary GPU compute tasks.

So they are doing _really_ well in graphics. And their APUs are _really_ good
graphics chips too. The only flaw in AMD right now is that they are
floundering on the CPU fabrication front as badly as Nvidia did with their
graphics line. Their CPUs eat power, they are effectively 1.5 generations of
fab tech behind, and the Bulldozer architecture is weak in floating-point and
serial operations.

That doesn't ruin a company. Hopefully next year is the year they really start
moving forward, because I really think AMD is the company to finally merge GPU
and CPU components into some kind of register/pipeline/ALU soup that could
really revolutionize the industry (imagine SIMD instruction extensions to x64
that behave like OpenCL parallel operations, with the normal processor cores
working on register ranges and vectors like a GPU, rather than just having a
discrete GPU and CPU on one die).
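To picture the register-range idea in the parenthetical, here is a toy software analogy for that programming model - purely illustrative, not anything AMD has shipped or announced:

```python
# Toy analogy: a scalar loop (one element per "instruction") versus a
# conceptually "wide" operation over a whole register range, the way an
# OpenCL-style or SIMD unit would execute it. Purely illustrative.

def scalar_add(a, b):
    # One add per element, like an ordinary CPU loop.
    out = []
    for x, y in zip(a, b):
        out.append(x + y)
    return out

def vector_add(a, b):
    # Conceptually one operation over the whole vector; hardware SIMD
    # (or a GPU) would execute the lanes in parallel.
    return [x + y for x, y in zip(a, b)]

print(vector_add([1, 2, 3], [10, 20, 30]))  # -> [11, 22, 33]
```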

Even barring that kind of pipe dream, Steamroller is shaping up to be sound.
It finally gets the die shrink AMD desperately needs to stay competitive, if
only to 28nm, and it finally puts GCN into their APU graphics instead of the
6000-series-era VLIW architecture.

They can't really stand up and fight Intel head-on anymore, because Intel got
back on the ball, and Intel's CPUs are crushing AMD's in a lot of use cases,
especially in power usage. But AMD still has significantly better graphics
and is leveraging it, and they are finally getting over the ATI growing
pains, so I'd wager they are still in the game, if only barely. They have a
lot of potential still.

Footnote: I really think the market is a big reason AMD is falling behind.
The Ultrabook campaign is stealing wealthy PC buyers from them, and that is
where chip makers get the majority of their profits (look at the high-end
mobile i7 chips selling for a thousand bucks). Desktop sales are abysmal
outside of OEM systems and businesses. Intel wins at getting business
contracts by size alone; they just have more reach. Desktop enthusiasts can
bank on AMD being a cost-effective platform, but the wow factor lies with
Intel chips, even at a premium, and Intel steals that market too. AMD doesn't
even do well in the cheap HTPC market because their chips burn so much power.
They are at a crossroads where all their target markets are either becoming
obsolete or losing ground - not because they have bad products, but because
their perception and influence are getting worse.

Right now, AMD is really strong in the mid-range. Mid-range laptops with a
Trinity APU are really good and extremely cost effective (I had a friend buy
an A8-based Toshiba because it was $500 cheaper than a comparable Intel
machine that could run League of Legends). Piledriver is good enough in the
desktop space to recommend one of the 4- or 6-core variants to friends
looking for a budget PC gaming experience, because with a proper overclock
they are more than enough for anything major. But AMD has (from my
experience) a bad image right now, as a dying company and a maker of budget
goods, even though their GPUs kick butt and their desktop CPUs can (at least
according to the recent Phoronix Piledriver FX benchmark) hold their ground
against even Intel's best Ivy Bridge offerings in some cases, at almost half
the price.

TLDR: I guess after graduating college I had withdrawal on writing essays.
This is a really long wall of text, holy bacon.

~~~
oijaf888
Actually, AMD owns the x86-64 instruction set and licenses it to Intel in
return for the x86 license.

Also doesn't Intel also make GPUs?

~~~
zanny
They made GPUs, and they are getting back into it, but any review of HD 4000
graphics vs anything from the Trinity line reinforces that Intel is still way
behind on the graphics front.

They are saying Haswell will be an improvement, but AMD has the architectural
cohesion to pair discrete and integrated cards in their Hybrid CrossFireX,
and they have a decade's worth of GPU experience from the ATI acquisition, so
they are better positioned to exploit heterogeneous cores. It's the same way
Nvidia's Tegra in the mobile world is a gaming/video powerhouse because its
GPU is so strong.

Also, x86-64 was/is a specification extension by AMD, not an instruction set,
so I don't think Intel licenses it. The two even spent a few years calling it
AMD64 and Intel 64, even though both were written to the same spec. AMD still
licenses x86 from Intel, though. It is like how SSE and other instruction set
extensions are not cross-licensed between the two.
<https://en.wikipedia.org/wiki/X64#History_of_Intel_64>

~~~
enraged_camel
I don't know how you reached the conclusion that AMD is doing really well in
graphics and the 7000 series "crushed" NVidia's 600 series. Aside from
numerous inaccuracies in your analysis (for example, the 600 series was
released _three_ months after the 7000 series, not six - March 22nd 2012 vs
December 22nd 2011), in most benchmarks Nvidia fared really well against AMD
with equal framerates and lower power consumption/temperatures.

~~~
zanny
The 680 came out really quickly, but it was consistently out of stock for
about two months after launch.

But I'm talking more about the 77xx vs the 66x lines, which are the most
mainstream discrete cards in each series; the 77xx came out in February and
the 66x cards came out in August. So about six months.

For all the hype around the 7970 vs the 680, in the end the vast majority of
their OEM card sales will be of less expensive hardware, which is why I think
AMD "won" even if the 600 cards give better FPS at lower power usage in video
games. They basically controlled the market for mid-range cards for almost
_half_ the year, and judging by the price cuts they have been making, they
have been taking a really sizable profit off sales until Nvidia brings out a
competitor.

I just want to mention I'm not an AMD fanboy - I have an i7 920 and a gtx 285
right now. My "last" build was around 2006 and was an Athlon x2 with a 1950.

------
JanneVee
Perhaps, but we don't know yet. We don't know if "Hondo" (AMD) will compete
with "Clover Trail" (Intel) for Win 8 tablets, because they might really be
competitively priced. Further, we don't know why those two are currently
Windows 8 only. There might be a Linux tablet in the future for "Hondo", but
"Clover Trail" might not get one because of its PowerVR SGX graphics core.

Also, I've spent most of my evening reading reviews of the new AMD CPU they
released, and it is looking good for budget enthusiasts.

------
jamesjguthrie
I agree with this article. I don't even consider buying AMD-equipped machines
these days, whereas in the past I always bought AMD.

------
vondur
I hope not; I just purchased an AMD FX-8150 8-core processor, a Gigabyte
motherboard, and 16GB of RAM for $399. So far it seems like a good machine.
Performance is not as good as an i7, but it's quite a bit cheaper.

~~~
Symmetry
At the moment Intel owns the high end, but AMD is a pretty good value if
you're not going to use your computer mostly for lightly threaded things like
video games.

[http://techreport.com/review/23750/amd-fx-8350-processor-
rev...](http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/14)

------
TheCondor
Did Athlon and Opteron really do well, or did NetBurst do really badly?

I want AMD to continue to compete and push, but one bad architecture takes a
while to overcome. Bulldozer++ needs to deliver.

~~~
zanny
NetBurst was bad because Intel hit a MHz ceiling on the architecture and
couldn't push the thermal envelope any higher, while AMD went with dual cores
and x64, which proved to be the correct path. They just hit a home run.

What nobody ever seems to consider is that AMD's revenue is less than what
Intel spends on R&D. The fact that they can even compete with Intel on the
scale they do is a testament to their success. Intel was always way too big
for AMD to compete against directly, and after Athlon's success they tried to
move into the big leagues and fight Intel one on one, and lost just due to
raw funding (I'd argue, at least). It is why they had to sell off
GlobalFoundries to buy ATI, and such.

------
guyzero
This is great news - clockless architectures are the way of the future.

------
cmccabe
Another way of looking at it is that AMD was always David to Intel's Goliath.
Intel screwed up big time with the Pentium 4: they designed the architecture
for an extremely high clock rate and ran headlong into a thermal-dissipation
brick wall. The Itanium, Intel's strange attempt to kill the x86 architecture
that had brought them so much money, was another huge blunder. AMD exploited
these opportunities. But unless Intel makes another big mistake, capitalism
will do its thing and force AMD out of the market.

There was an announcement that AMD will make ARM-based chips in the near
future. That niche might have some more air supply than the one they're in
currently. However, they'll have a lot of competitors in the ARM space, so who
knows.

~~~
lostlogin
I wonder if AMD would be able to fill, or partially fill, a custom Apple chip
order, given that rumours suggest Apple is shopping around for a supplier?

~~~
Symmetry
Well, AMD will be making the chips in the next Xbox, the next PlayStation,
and the graphics for the next Wii, so that's something at least.

------
systematical
I see what you did with that title.

~~~
mckilljoy
lol I came here to post that. Upvotes for you!

