
Intel will ship processors with integrated AMD graphics and memory - lelf
https://arstechnica.com/gadgets/2017/11/intel-will-ship-processors-with-integrated-amd-graphics-and-memory/
======
eganist
The last time this conversation came up, a slew of people called it a
strategic mistake for AMD. I'm actually willing to bet it's the opposite
(disclosure: I'm long AMD).

The last time AMD got into a fight with Intel, Intel pulled out all the stops.
The only relief AMD got--quite a ways down the road--was in the form of
damages.
[https://en.wikipedia.org/wiki/Advanced_Micro_Devices,_Inc._v...](https://en.wikipedia.org/wiki/Advanced_Micro_Devices,_Inc._v._Intel_Corp).

Something tells me AMD decided to make strange bedfellows this time around
against a common, growing threat in multiple spaces (Nvidia), including
graphics as well as machine learning. It's one agreement, but for AMD it's
both a revenue source and a foot in the door for potential collaborative
efforts, which could give it some stability as it broadens its offerings and
looks at new avenues for value creation in the face of headwinds such as the
increasing difficulty of further transistor shrinkage.

~~~
rangibaby
I think the real elephant in the room is ARM. The iPad Pro already has the
performance of (at least) a low-end “real” laptop. Sure, Apple is expensive,
but very soon you are going to see cheap and actually good ARM PCs. That can’t
be good for either Intel or AMD.

~~~
mr_toad
My money would be on ARM based Macbooks rather than Windows machines.

~~~
arthurfm
The first ARM-based Windows 10 PCs are due to be released soon.

[https://www.theverge.com/2017/5/31/15711334/microsoft-window...](https://www.theverge.com/2017/5/31/15711334/microsoft-windows-10-arm-qualcomm-pcs-asus-hp-lenovo)

[http://www.zdnet.com/article/windows-10-on-arm-hps-leak-hint...](http://www.zdnet.com/article/windows-10-on-arm-hps-leak-hints-that-new-snapdragon-laptops-nearing-release/)

~~~
kennydude
ARM-based Windows machines already exist --
[https://en.wikipedia.org/wiki/Surface_(2012_tablet)](https://en.wikipedia.org/wiki/Surface_\(2012_tablet\))

They worked out well

------
martin1975
I'm a huge fanboy of open source, and this move is a boon for Linux, given
Intel HD's and AMD's open-source GPU drivers on the laptop as well as the
desktop. Not to mention it will further emphasize the division between gaming
and non-gaming rigs, meaning NVidia vs AMD(ATI?)/Intel, and hopefully
extinguish NVidia completely from business/work-purpose computers (or even
gaming rigs if possible, since AMD/ATI's GPUs are rather decent too!). Forgive
my severe, palpable dislike for NVidia in this post.

Looks like it's time to upgrade my ThinkPad once Lenovo integrates this setup
into their T line.

Well done, Lisa Su!

~~~
monkmartinez
I like OSS as well, but (correct me if I am wrong) aren't AMD GPUs far
inferior to NVidia's for ML/deep learning? I am moving away from macOS to a
ThinkPad. I am specifically waiting for an 8th-gen Intel CPU paired with an
NVidia discrete GPU for small-scale ML and light gaming.

~~~
mr_toad
What is the size of the ML market compared to the gaming market? I get the
impression that the ML market for GPU chips is growing much faster than the
gaming market.

~~~
Nexxxeh
PC gamers have generally suffered because the cryptocoin boom has been eating
GPU supply and inflating prices.

That doesn't really apply directly to the laptop market, since people probably
aren't buying laptops for mining, but presumably it has some impact.

------
TazeTSchnitzel
This is news from two weeks ago:
[https://news.ycombinator.com/item?id=15635249](https://news.ycombinator.com/item?id=15635249)

~~~
beefsack
This article is actually from the 7th.

(Thought it was from July for a minute, silly US dates.)

------
chx
> and halves the power usage of a traditional design.

Um, really? Rumors talk of a 65W (and a 100W) Kaby Lake G CPU. The CPU is
4C/8T @ 3.1/4.1 GHz; that's about 10% higher clocked than the 7700HQ, which is
a 45W part. The GPU side is rumored to be 24 CUs at 1000 MHz with 4GB of HBM2
@ 700 MHz.

Now, the Ryzen 5 2500U is a 15W part with 8 CUs at up to 1100 MHz. The CPU
surely eats some of those watts, so I do not think it's too outlandish to
claim that 24 CUs eat less than 30W, especially at a lower frequency.

All in all, I absolutely can't see the gigantic power savings Intel purports
here. It looks like 10%, perhaps 15%, based on the above data. What am I
missing?
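
For what it's worth, here is the back-of-envelope I'm doing, as a quick Python
sketch. The 30W figure for the 24-CU GPU is just my guess extrapolated from
the 2500U numbers above, and the rest are the rumored figures, so treat it as
an estimate, not data:

    # All figures are rumors/estimates from the comment above, not measurements.
    cpu_45w = 45.0                    # 7700HQ-class 45 W quad-core mobile CPU
    gpu_est = 30.0                    # guess for 24 CUs @ ~1000 MHz plus HBM2
    traditional = cpu_45w + gpu_est   # CPU + comparable discrete GPU, ~75 W
    kaby_lake_g = 65.0                # rumored package TDP of the 65 W part
    saving = 1.0 - kaby_lake_g / traditional
    print(f"{saving:.0%}")            # roughly 13%, nowhere near "half"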

~~~
aeleos
I understood that statement as saying they were halving the power usage
compared to a traditionally designed chip interconnect that uses ordinary PCB
traces rather than the EMIB silicon bridge described in the article. Not
halving the power usage of the entire chip, just the power required for the
chip-to-chip connection.

~~~
chx
Bewildering. Does a chip interconnect consume anything worth mentioning?
Wouldn't that be converted to heat? I have never heard of the motherboard
itself heating up...

Edit: yes, the VRM and the chipset itself have beefy heatsinks, but not the
interconnect between them.
[https://images.evga.com/products/gallery/131-SX-E295-KR_XL_5...](https://images.evga.com/products/gallery/131-SX-E295-KR_XL_5.jpg)
Look at the bazillion traces running on the PCB.

~~~
dbcurtis
It is the driver circuits needed to go off-chip that consume power and create
heat. Look at a die photo. Those wee little transistors in the middle? Logic
and memory. Those gigantic cow turds around the edge? I/O drivers.

------
mfrw
And it starts to make more sense now:
[https://arstechnica.com/gadgets/2017/11/intel-poaches-amds-t...](https://arstechnica.com/gadgets/2017/11/intel-poaches-amds-top-gpu-architect-to-build-its-own-discrete-graphics-chips/)

~~~
IamNotAtWork
What the! There's gotta be some non-compete clause or something.

~~~
masklinn
No, there should not, and in California there is not.

"You're a GPU specialist, you don't get to quit anymore"? Fuck that, it's
insane.

------
Jhsto
Could this have something to do with Apple shipping their own chips in their
desktop computers? I would think that Apple would be interested in using these
new Intel/AMD chips in their laptops.

~~~
mistahchris
I had the same thought. Apple seems like a natural big customer for these
chips.

~~~
geetfun
It definitely looks like a survival move to counter the upcoming Apple AXX
chips over the next few years. The new iMac Pro apparently has an A10 in it
for the boot-up sequence.

Source: [https://www.macrumors.com/2017/11/19/imac-pro-a10-chip-hey-s...](https://www.macrumors.com/2017/11/19/imac-pro-a10-chip-hey-siri/)

------
spikels
Anyone understand why Intel would do this? Have they given up on their own
GPUs? Is this a response to the Nvidia threat?

~~~
Radle
Yes and no. Intel integrated graphics has always been a graphics solution
executed by the CPU. This solves multiple problems: A) the computer has a
dedicated graphics chip; B) the computer has incredibly high bandwidth between
CPU, GPU, and main memory; C) the CPU will be able to use the special
3D-stacked graphics memory (HBM), which is patented by AMD.

This is a far more powerful solution than integrated graphics and thus puts a
lot of pressure on Nvidia in laptops and mobile, maybe even in office
computers.

~~~
jra101
Intel integrated graphics are not executed by the CPU; there is an
Intel-designed GPU on the same die as the CPU.

[https://www.anandtech.com/show/9582/intel-skylake-mobile-des...](https://www.anandtech.com/show/9582/intel-skylake-mobile-desktop-launch-architecture-analysis/6)

HBM is not exclusive to AMD; NVIDIA also ships GPUs using HBM (the Tesla P100
and V100).

------
analyst_9
Why wouldn't they do this? Intel is a fab company first and an architecture
company second. The main reason they bought Altera was EMIB. EMIB frees them
from spending huge amounts of resources on cutting-edge architecture and
microarchitecture, letting them focus on pushing the boundaries of their
foundries (which are falling behind). Using EMIB enables Intel to incorporate
whatever IP will sell in the market, theirs or somebody else's. The
alternative for Intel is to drop their foundries and use GF or TSMC (I don't
see that happening). I can see Intel using Arm, AMD, or Nvidia IP (if they'll
let them; Nvidia seems to be pushing the discrete path hard... that'll fail
given the high latency of PCIe).

Arm has already arrived in the HPC market; SC in Denver's unofficial theme was
Arm (both Cavium and Qualcomm). There are even awesome desktop machines now
([https://www.avantek.co.uk/store/avantek-32-core-cavium-thund...](https://www.avantek.co.uk/store/avantek-32-core-cavium-thunderx-arm-desktop.html)).
HPC is the vanguard for servers: Arm-compatible chips are already providing
performance greater than Skylake Intel server parts at a cheaper price point,
so why pay for Intel when you can have a Qualcomm Centriq or Cavium ThunderX2?
IBM Power is yet another option, but hugely expensive. It is definitely useful
for GPGPU-accelerated applications, but like Intel, IBM is going to be given a
run for its money by AMD in sheer number of PCIe lanes.

~~~
arcanus
> SC in Denver's unofficial theme was Arm

I don't agree. SC's unofficial theme was AI/ML. Every single vendor and HPC
center was talking AI; not all were talking ARM. The student cluster
competition was dominated by GPUs, and not a single one ran ARM.

~~~
redshirt
We don’t really care about ML. It just so happens that ML and AI are easy
compared to what HPC normally deals with. ML is just statistics and learned
functions, which in turn are dominated by linear algebra. It’s hard to hear
that when you like buzzwords, but it’s just algebra, and pretty simple at
that. That’s why things like grad-student-built systolic array processors
(popular in the ’60s) dominate here. It’s also why SC is showing lots of
ML...because the community knows how to do that better than anybody.
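
To make "it's just linear algebra" concrete, here is a minimal NumPy sketch
(shapes are purely illustrative): a dense neural-network layer's forward pass
is a matrix multiply plus a cheap pointwise function, i.e. exactly the GEMM
workloads HPC has been optimizing for decades.

    import numpy as np

    # A dense layer's forward pass: one GEMM plus a pointwise nonlinearity.
    # The matrix multiply dominates the cost, which is why ML workloads
    # reduce to the dense linear algebra HPC already knows how to do.
    x = np.random.rand(64, 512)      # batch of 64 inputs, 512 features each
    W = np.random.rand(512, 1024)    # learned weight matrix
    b = np.random.rand(1024)         # learned bias
    y = np.maximum(x @ W + b, 0.0)   # y = ReLU(xW + b), shape (64, 1024)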

------
foota
I'm curious to see how these will be priced. They could save on costs over a
dedicated GPU by sharing cooling systems and other parts, which would be
exciting.

------
phkahler
This whole thing looks a lot like AMD's EHP concept from a couple of years
ago. In fact, I've been waiting for the Threadripper/EPYC approach to put a
GPU and HBM in the same package with Ryzen. That would be the perfect module
to put into AMD's Project Quantum concept PC. OTOH, the Intel solution is
targeted at laptops. Let's hope AMD does the same for the desktop, and soon.

~~~
gdwatson
What's the gain over discrete GPUs in a desktop, where space and power are not
at a premium the way they are in a laptop?

~~~
gmueckl
Omitting a bottleneck like the PCIe bus between the CPU and GPU makes memory
sharing between the two much more viable. This allows a totally different kind
of offloading of work from the CPU.
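
For a rough feel of why the copy matters, here is a small Python sketch; the
~16 GB/s figure is the nominal bandwidth of a PCIe 3.0 x16 link, and the 4 GB
working set is just an example, not anything from the article:

    # Time to move a working set across PCIe before the GPU can touch it.
    pcie_bw_gb_per_s = 16.0                 # ~PCIe 3.0 x16 nominal bandwidth
    working_set_gb = 4.0                    # example working set size
    copy_seconds = working_set_gb / pcie_bw_gb_per_s
    print(copy_seconds)                     # ~0.25 s each way, per transfer
    # With the CPU and GPU sharing one memory pool in-package, that copy (and
    # the copy back) disappears, so much finer-grained offloading becomes viable.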

------
hishnash
The title is not quite accurate:

Intel will ship a package that combines an AMD GPU + HBM with an Intel CPU,
but they are not the same chip. Basically, Intel is buying already-fabricated
GPU dies from AMD and putting them in the same package as its own CPU, selling
the result as one unit.

------
joosters
This seems doubly bizarre, in that high-end Intel processors already tend to
have integrated graphics hardware, yet high-end AMD processors don't.

For instance, I built a new computer using an AMD Ryzen processor recently
(solely for its CPU speed). I hadn't realised that the CPU had no graphics
hardware at all, and that I would need a separate graphics card in order to
get any output on a monitor, whereas previous builds using i7 CPUs needed no
extra hardware.

~~~
opencl
It seems to largely be a cost issue, since AMD does not have huge amounts of
cash to pour into lots of different chip designs for specific purposes the way
Intel does.

AMD wanted to release 8-core parts that competed on price with Intel's 4-core
parts. Roughly half the die area (and thus manufacturing cost) of Intel's
quad-core desktop CPUs is the GPU, so by cutting the GPU they were able to
make 8 cores at a similar cost to Intel's 4 cores + GPU (rough arithmetic
sketched below).
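
As toy arithmetic, with the 50/50 die-area split above as the only assumption
(illustrative units, not measured areas):

    # Toy die-area arithmetic behind the cost claim (illustrative units only).
    four_core_cluster = 1.0                          # area of a 4-core CPU cluster
    igpu = 1.0                                       # Intel's iGPU: roughly the same area
    intel_quad_plus_gpu = four_core_cluster + igpu   # 4 cores + GPU  -> ~2 units
    ryzen_eight_no_gpu = 2 * four_core_cluster       # 8 cores, no GPU -> ~2 units
    # Comparable die area implies comparable manufacturing cost per chip.
    print(intel_quad_plus_gpu, ryzen_eight_no_gpu)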

It also allowed them to reuse the same chip for desktop, workstation, and
server parts dramatically reducing R&D costs for the whole lineup. And really
"high end" Intel processors, as in everything with 8 cores or more, also lack
the GPU anyway.

The Raven Ridge CPUs are 4 cores + GPU (with much faster GPU than anything
Intel has). I imagine they'll do a higher core count part with GPU once
Globalfoundries starts making 7nm chips.

------
hellbanner
But these still come with the Management Engine?

~~~
phkahler
When I slip on my tinfoil hat, I wonder if that's the reason for this. There's
nothing here that AMD couldn't offer on its own, so why combine their graphics
with Intel CPUs if not for the ME?

~~~
eropple
...because Intel will pay them money, which can be exchanged for goods and
services, for letting them do so?

~~~
phkahler
Any customer who would buy this would probably buy an AMD APU instead, and AMD
would take home more of the money without giving their competitor anything.
AMD also has the capability to produce almost exactly the same product using
their own processor instead of Intel's, so again, why help your competitor by
sharing the profits?

I'd say either AMD doesn't have the production capacity for whoever the
customer is (Apple?), or it's something more nefarious, like preserving the
ME.

Given that AMD is still a close second in graphics, maybe it really is a way
to increase volumes for their GPUs. But that still suggests there's some
reason they can't deliver the same thing with their own CPUs.

~~~
subwayclub
Intel has more of the existing partnerships with OEMs, and that isn't going to
reverse immediately with a strong showing from AMD. The new Ryzen Mobile
notebooks are low-to-midrange entries and groundbreaking in their category,
but Intel still covers the higher-end segment.

It's easier to see this from Intel's perspective: Intel's biggest threat is
the onrush of GPU-driven computing, and Nvidia is the market leader there. The
classic play is to starve them of oxygen by leveraging existing channels to
push them out of the gaming notebook market. Thus comes this weird saga with
AMD and Raja, which is in fact a win-win deal: Intel gets ammo to fight Nvidia
now and a key hire for its own development later, and AMD gets another source
of cash flow and market share, plus a graceful exit from what has been
reported as shaky, conflicted executive management at RTG. Although they've
delivered decent hardware and software recently, there have been numerous PR
flubs from the group, and the business unit's performance is questionable
overall. There is an open question of what happens to RTG later on, but
perhaps the answer is simply to survive in Intel's shadow again.

~~~
phkahler
I can buy that. I didn't know there were issues at RTG. The point about OEMs
not switching quickly is also a reality. It still seems really strange. The
next move could be Nvidia making their own high-performance CPUs to integrate
with their GPUs. Think RISC-V here: they've already embraced it, and the
potential and freedom to innovate are wide open.

------
sdfjkl
Cool. I hope Intel writes the drivers though.

------
pcr0
I guess there was more to Raja Koduri's departure to Intel from AMD.

------
qwerty456127
Are current Intel GPUs really this bad?

~~~
sounds
Yes, they don't have the stability to be used for 9-to-5 work, and they don't
have the performance to be used for gaming.

