

AMD reveals Fusion CPU+GPU, to challenge Intel in laptops - ssp
http://arstechnica.com/business/news/2010/02/amd-reveals-fusion-cpugpu-to-challege-intel-in-laptops.ars?utm_source=rss&utm_medium=rss&utm_campaign=rss

======
zokier
What took them so long? Plans for Fusion were announced when AMD acquired ATI,
and that was 2006. And now they announce that they are shipping it in 2011,
five years later?! Meanwhile, Intel has already released several integrated
CPU+GPU solutions (Pineview and Arrandale/Clarkdale).

~~~
wtallis
There are a couple of ways that a Fusion-style core can be used.

You could take advantage of the high-bandwidth, low-latency connection between
CPU and GPU to get higher performance, especially in GPGPU applications, but
GPGPU hasn't really taken off yet. There's also the limitation that now CPU
and GPU are sharing the same heatsink.

You could use it to offer a low-end system with good integrated graphics,
because there's no longer a penalty for the GPU sharing the CPU's RAM.
However, this approach pretty much requires a new socket, which probably
eliminates any potential cost savings from not having to buy a discrete GPU.
This approach also sacrifices some in flexibility, as adding a discrete
graphics card later on makes the on-die GPU worthless unless you have a good
asymmetric SLI/Crossfire system, and even then you're limited to the features
supported by the on-die GPU.

The approach taken so far by Intel and AMD is to use it to make a better low-
power platform. Intel's obviously been way ahead here. Ever since the Pentium
M, AMD's had trouble keeping up in the mobile market. ATI's GPUs haven't been
that great for the mobile market, either. It was obvious in 2006 that this was
going to be the hardest approach for AMD to pull off, but it also seems to be
the only one that might pay off.

------
yread
I don't think it will be such a big hit. Current IGPs from Intel can decode
_2_ full HD video streams ([http://software.intel.com/en-us/articles/quick-
reference-gui...](http://software.intel.com/en-us/articles/quick-reference-
guide-to-intel-integrated-graphics/)). The next generation is supposed to have
2 such (or better) GPUs side by side on a chip
(<http://www.fudzilla.com/content/view/17576/1/>). These IGPs will be enough
for even more people than today's are. How many people need to play games at
uber resolutions, anyway?

------
dirtbox
Laptop graphics are always a minefield of barely adequate hardware. I really
hope this, as well as Nvidia's ION chipsets, puts an end to these dark days.

~~~
pyre
The ION seems to be pretty good, but decoding HD video doesn't work very well
without using the GPU.

~~~
wtallis
That's a problem inherent in the nature of HD codecs. Decoding with a general-
purpose CPU is too inefficient. Any platform that has acceptable power
consumption for the ultra-portable market is going to have to use dedicated
decoding hardware. Once you take HD decoding off the list of things the CPU
has to be fast enough for, you have the option to use a small, low-power CPU
like the Atom. I don't see any room for improvement by using a different
arrangement. Software writers will just have to get used to using the OS-
provided decoders so that any accelerators present can be used.

~~~
vetinari
Getting used to using the OS-provided codecs can be a problem, because it
forces you into a specific codec that you may want to avoid, and prevents you
from using the codec you actually want.

For example, I can't imagine Bink being hardware accelerated anytime soon.

