

Intel's Sandy Bridge to Deliver 2% of AMD's Top Graphics Performance - mrb
http://blog.zorinaq.com/?e=31

======
acqq
There's one obvious reason for the difference:

AMD's top graphics card needs up to 300W alone (not counting the CPU power).

http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-5000/hd-5970/Pages/ati-radeon-hd-5970-overview.aspx

The maximum TDP for Sandy Bridge (_for CPU and GPU together_!) is 95W:

http://www.anandtech.com/show/3922/intels-sandy-bridge-architecture-exposed

And of course there are many additional trade-offs once you design for a
limited power budget and for the overall complexity of the chip.
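
Just to put rough numbers on that gap, here's a back-of-envelope sketch. The
10-30% GPU share of the package TDP and the assumption that graphics
performance scales roughly with the power spent on it are my own guesses, not
figures from the links above:

    # Back-of-envelope: compare the power available to Sandy Bridge's
    # integrated GPU with the power budget of AMD's top discrete card.
    # Assumptions (mine, not from the article): the GPU gets only a slice
    # of the shared 95W package TDP, and graphics performance tracks that
    # power budget roughly linearly.

    HD5970_TDP_W = 300        # AMD's top card, graphics power alone
    SANDY_BRIDGE_TDP_W = 95   # shared budget for CPU cores + GPU

    for gpu_share in (0.1, 0.2, 0.3):   # hypothetical slice of the package TDP
        gpu_watts = SANDY_BRIDGE_TDP_W * gpu_share
        print(f"GPU share {gpu_share:.0%}: ~{gpu_watts:.0f}W, "
              f"or {gpu_watts / HD5970_TDP_W:.0%} of the HD 5970's budget")

Even with a generous 30% share, the integrated GPU gets less than a tenth of
the HD 5970's power budget, so a very low performance figure shouldn't be a
surprise.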

------
Aron
It's a logical direction for Intel to take. They are continuing to ramp up
transistor counts via process shrinks, but the value of multiplying cores is
weak for the consumer right now, so it makes sense to pull in whatever other
more specialized functions they can manage.

This trend doesn't kill or even significantly maim Nvidia, but it weakens
their sub-$50 segment.

~~~
Tamerlin
I suspect that Intel is also intent on entering the high-end GPU market. The
margins on mainstream processors are decreasing, and the new ARM cores aren't
going to help with that. Margins on high-end GPUs, on the other hand, are
quite high.

The first Larrabee processor probably won't be much of a competitor to AMD's
and nVidia's GPUs in graphics, but that's just the first generation... who
knows what they'll pull off in the second?

(Of course, that could also end up being crap. Intel's history with GPUs
isn't exactly exemplary...)

