
Rivals Intel and AMD Team Up on PC Chips to Battle Nvidia - MekaiGS
https://www.wsj.com/articles/rivals-intel-and-amd-team-up-on-pc-chips-to-battle-nvidia-1509966064
======
ksec
This is like, "Hell has frozen over" kind of news

Am I the only one who smells this as something very "Apple"-wanted?

I don't think AMD will be giving up any GFX secrets; more likely this is AMD
shipping Intel a mobile GFX die to be integrated within the same CPU package.

But in any case, Why not just have Intel ship a Mobile CPU without iGPU and a
Separate GPU.

And AMD, why now? When Zen is doing great, has a great roadmap and potential,
along with much better GFX than Intel. Why?

Edit: OK, I didn't read carefully; while this is WSJ, it is still a rumor,
nothing has been confirmed... yet.

Edit2: It is confirmed now.

Edit3: Yes AMD will be shipping die to Intel, and it is EMIB at work.
[https://www.pcworld.com/article/3235934/components-processors/intel-and-amd-ship-a-core-chip-with-radeon-graphics.html](https://www.pcworld.com/article/3235934/components-processors/intel-and-amd-ship-a-core-chip-with-radeon-graphics.html)

~~~
microcolonel
> _But in any case, Why not just have Intel ship a Mobile CPU without iGPU and
> a Separate GPU._

Power, manufacturing costs, integration and testing costs, design size...

> _Am I the only one who smell this as very "Apple" wanted?_

Why would Apple care about NVIDIA? They're already beating them in the kind of
performance that matters on products Intel will be competing for.

~~~
bryanlarsen
My theory on why Apple prefers AMD over nVidia:

AMD hardware at a specific price point is more powerful than nVidia hardware
at the same price point. However, nVidia has superior drivers that eliminate
the difference.

And since Apple prefers to use their own drivers, nVidia loses their main
point of differentiation.

But of course the "Apple" drivers for video cards are basically vendor drivers
with Apple doing QA & release management. But the driver is nVidia's secret
sauce; they're not going to show it to Apple, since Apple is now a very
competitive GPU manufacturer. At some point Apple will probably put their own
GPU's inside Macs, so nVidia doesn't want to give them a head start.

AMD cares less about giving Apple a head start because they care more about
the short term than the long term, and about competing against nVidia.

~~~
walshemj
The new Vega boards lag Nvidia's 1080 Ti, apart from some edge cases of
interest to cryptocurrency miners.

~~~
EdgarVerona
Though to be fair, they also are significantly cheaper than the 1080ti. Well,
if you can find one at MSRP at least.

~~~
hotmail
Coding with Vulkan directly seems to eliminate the nvidia perf advantage in
many cases, so...is it still the hardware or the software lagging?

~~~
joombaga
Vs. coding for nvidia or vs. coding for DirectX independent of card-maker?

------
bryanlarsen
Much better source: [https://newsroom.intel.com/editorials/new-intel-core-processor-combine-high-performance-cpu-discrete-graphics-sleek-thin-devices/](https://newsroom.intel.com/editorials/new-intel-core-processor-combine-high-performance-cpu-discrete-graphics-sleek-thin-devices/)

credit to trynumber9:
[https://news.ycombinator.com/item?id=15635771](https://news.ycombinator.com/item?id=15635771)

------
andy_ppp
This seems like a lose lose and a net negative for everyone.

1) AMD must be preventing Intel from building, in future, similar integrated
GPUs using anything like AMD's GFX patent portfolio.

2) Intel will be able to 100% stop vendors from moving to Ryzen for its
presumably faster integrated graphics.

3) Nvidia doesn't have an x86 CPU and last time I looked - all PCs and Mac
laptops were using x86.

As some have speculated maybe this is Apple telling their suppliers to jump
and Intel and AMD said how high...

I'm going to be interested to see if we ever get a jump to Apple using ARM and
internally designed GPUs; I have a feeling that Jony Ive must be wetting
himself over making a reasonably powerful laptop that thin.

~~~
saas_co_de
No, I think it is win-win.

1) AMD can sell chips in high end systems where it can't compete with Intel on
CPUs or Nvidia on GPUs.

2) Intel can ship a high end graphics experience without dGPU which gets them
a level of graphics in a form factor they couldn't otherwise achieve.

3) Ryzen is a low/mid range chip. Even if it could match Intel's performance
it could never match Intel's brand, and Intel doesn't want to match AMD's
price, so they will stay in different market segments. Ryzen sales will not be
hurt.

4) AMD gets valuable brand recognition by getting Radeon into more premium
devices which could actually boost sales for cheaper Ryzen/Radeon devices down
the line.

5) Shafts Nvidia which is a win for both sides.

6) Opens lines of communication for possible future merger or fab deal, which
is not so much an issue from an anti-trust standpoint when you look at the
total competitive landscape of ARM, Nvidia, Apple, Qualcomm, etc and the
shrinking relevance of x86 in the big picture.

~~~
sambe
Regarding 3): most benchmarks I've seen quote 2-7% difference in IPC to Intel.
I wouldn't call that low/mid-range. Am I misunderstanding something?

~~~
coldtea
From the parent: "Even if it could match Intel's performance it could never
match Intel's brand".

~~~
dahauns
Not with that attitude.

(i.e. argued that way, this would be more like "setting up for failure" on
AMD's part.)

~~~
saas_co_de
Brand perceptions can definitely change but the reality right now is that for
the past 5 years AMD has been shipping non-competitive parts and has had zero
premium design wins and very little money spent on advertising.

For the average consumer, every high end system they have seen in recent
memory is Intel, and every ad they have seen is Intel, and if they have seen
AMD at all it has always been positioned as a budget product.

That is going to take a long time for AMD to turn around assuming they can
sustain performance parity with Intel.

That is also why AMD is focused on servers and semi-custom where they are
selling to technical people who are evaluating based on price/performance and
not based on brand perception.

~~~
myrandomcomment
I am pretty sure the current PS4 and Xbox count as premium design wins for
AMD.

~~~
freeflight
There's nothing "premium" about those gaming consoles; maybe the PS4Pro/XboneX
could be considered "premium" for the console sector.

Imho parent was talking about the PC sector and in that regard, AMD has been
sadly trailing way behind Intel/Nvidia for quite a while now. AMD has
struggled to oppose Intel's i5/i7 dominance. Similarly, they still have no
real competition for the high-end Nvidia GPUs, like the 1080.

If money is not a limiting factor, then you will be hard-pressed to come up
with a build that does not include an Intel CPU and at least one Nvidia GPU.

~~~
wolfgke
> If money is not a limiting factor, then you will be hard-pressed to come up
> with a build that does not include an Intel CPU

If money is not a limiting factor, you probably look at AMD EPYC now.

~~~
StudentStuff
Yeah, the PCIe lanes alone are killing it, never mind the increase in CPU
performance. Intel doesn't have much to offer to oppose this dominance, and
they're at a wall with process technology (can't go much smaller than 7nm
without massive power leakage) with a 3 to 4 year pipeline just to get a new
chip out. Making one chip takes 9 months end to end, assuming the design is
ready now, hence the long pipeline.

------
itissid
Some thoughts from anandtech: "The agreement between AMD and Intel is that
Intel is buying chips from AMD, and AMD is providing a driver support package
like they do with consoles. There is no cross-licensing of IP going on: Intel
likely provided AMD with the IP to make the EMIB chipset connections for the
graphics but that IP is only valid in the designs that AMD is selling to
Intel"[1]

[1] [https://www.anandtech.com/show/12003/intel-to-create-new-8th-generation-cpus-with-amd-radeon-graphics-with-hbm2-using-emib#comments](https://www.anandtech.com/show/12003/intel-to-create-new-8th-generation-cpus-with-amd-radeon-graphics-with-hbm2-using-emib#comments)

~~~
mozumder
If EMIB is that valuable, then they should expand it to main system memory as
well as NVMe flash/XPoint storage.

~~~
itissid
EMIB requires that all components can fit on a die that can be connected to
the other components like the CPU using an embedded channel through the
substrate. Expansion of the substrate to accommodate other components is not
without costs.

~~~
mozumder
For laptop purposes, wouldn't Intel just stack 8 or 16GB of main memory DRAM,
like they would HBM?

~~~
itissid
Umm, probably, but NVMe is still connected via PCIe. The whole point of this
deal is the EMIB connecting RAM and GPU. And it's hard; EMIB connections are
mostly about power dissipation and throughput.

I am probably out of my depth guessing why companies aren't doing this with
NVMe. Maybe it's just that the market does not exist for it, maybe it's just
the cost...

------
bitL
Intel + AMD should get together on offering a CUDA alternative/compatibility
layer for Deep (Reinforcement) Learning/AI, where NVidia has been experiencing
exponential growth for the past few years while the two of them are simply
non-existent, full of half-baked efforts.

~~~
microcolonel
AMD and Google IIRC are already working on a CUDA implementation for AMD GPUs.

~~~
bitL
How long do I need to wait until TensorFlow can run stably and at the same
speed as on cuDNN on AMD hardware? I can't even contemplate buying AMD right
now (gaming is not very important to me).

~~~
jdietrich
AMD's performance deficit is about to get a lot worse. Nvidia's upcoming Volta
architecture is massively optimised for deep learning - they're touting a 12x
performance increase for training and 6x for inferencing over Pascal.

I think Intel have a better chance of catching up with Nvidia at this stage.
They've been on an acquisition spree and have picked up a huge amount of DL-
related IP. They have immense R&D and fab resources at their disposal.

[https://wccftech.com/nvidia-volta-tesla-v100-gpu-compute-benchmarks-revealed/](https://wccftech.com/nvidia-volta-tesla-v100-gpu-compute-benchmarks-revealed/)

~~~
dragontamer
That's only if you use Volta's Tensor cores however.

AMD's 16-bit packed performance with Vega is more than respectable vs NVidia's
16-bit packed performance in Pascal.

In the future, all AMD needs to catch up to Volta's Tensor cores is to build
Tensor cores themselves. That doesn't seem like a major technical hurdle. I'm
fairly certain that Google would be the primary patent holder on Tensor-cores.

------
cjensen
Worth noting: Skylake and Kaby Lake have been horror shows for Intel
integrated GPU driver crashes. To add to this, some paths (like DXVA) are
actually slower than on Haswell.

I wonder how much of this is a fix for GPU reliability rather than GPU
performance.

~~~
craftyguy
> Skylake and Kaby Lake have been horror shows for Intel integrated GPU
> driver crashes

My experience with Intel iGPUs on skylake and kaby lake is quite different.
They work very well on Linux.

~~~
cjensen
I guess I should have clarified: Win7

~~~
minikites
Skylake and Kaby Lake aren't supported by Windows 7:
[https://arstechnica.com/information-
technology/2017/03/micro...](https://arstechnica.com/information-
technology/2017/03/microsoft-is-getting-ready-to-block-windows-updates-for-
old-windows-on-new-chips/)

~~~
cjensen
Skylake is supported, as mentioned in the article you linked to.

Kaby Lake's GPU is not a change from Skylake, so I expect the same problems,
just in Win10 instead of 7.

~~~
minikites
>only security fixes for Skylake systems running Windows 7 and 8.1

I don't think a security fix is going to address your GPU issues. You should
update to a supported operating system.

~~~
cjensen
It's only been two months since the last Intel driver was released in Windows
Update on Win7. The GPU is supported just fine thanks.

Also, we build systems which run our application suite controlled by a custom
control panel, and we have yet to manage to fully tame Win10's intrusiveness
into the operator's workflow. Even if we do fully tame it, maybe the next
version will change the number of things needing to be tamed. I'd use Windows
Server
instead, but it isn't supported by ASUS on desktop motherboards. We're in a
dead end and seriously thinking about Linux at this point.

~~~
craftyguy
I suggest just using the superior OS at this point: Linux.

------
tombert
I'm not super familiar with the law in this area, but doesn't AMD+Intel
teaming up get into antitrust territory? The x86 architecture still basically
has a monopoly on the desktop, laptop, and server space. Could someone a bit
more familiar in this area elaborate?

~~~
simion314
I think this would be Intel + the Radeon division, not the AMD CPUs; Intel
would still compete with the AMD CPU division.

------
grogenaut
Any chance this is try before you buy for Intel?

~~~
Strom
Close to zero chance of Intel being allowed to buy AMD, unless they open up
x86 for others. Although if alternative architectures like ARM start gaining
some actual ground on desktops/laptops, say in 20 years, then it might not
trigger anti-monopoly bells as hard.

Then there's the option of Intel buying just the Radeon parts of the company.
However seeing as the GPU market seems to be growing fast, it would have to be
quite the offer for shareholders to agree.

~~~
maaark
They might be allowed to buy Radeon Technologies Group, though.

------
johnvega
For those who are excited about the AR/VR Mixed Reality headsets that
Microsoft recently released, I think being able to use them with more laptops,
or maybe even thin-and-light laptops, will be a great option. If you haven't
tried it, the floating 2D desktop inside a 3D VR view of the computer you're
using is an interesting experience.

------
dna_polymerase
AMD should focus on getting great support for their existing tech. If their
current GPU lineup were better supported by TensorFlow & the like, that would
greatly increase their adoption. Also, more RAM per GPU, guys. Memory is a big
bottleneck for many models I use.

~~~
dman
Check out the Radeon SSG. It's a GPU with a directly addressable 2TB SSD built
in.

------
contingencies
Next new desktop / graphics card purchase (hell, throw in a VR headset/gun-
style controller, an android horse[0] and a real Mongolian saddle[1]) = _Mount
& Blade: Bannerlord_[2], IMHO set to be the best game in decades. No bandwidth
for thought until then. :)

[0] Played a cowboy game on one of these in a Shenzhen VR house, most fun I've
had in years! Just don't look at the video of yourself playing afterwards!

[1] Cheap realism.

[2]
[https://www.taleworlds.com/en/Games/Bannerlord/](https://www.taleworlds.com/en/Games/Bannerlord/)

------
shmerl
Is this really going to be Intel with AMD GPU, or Intel simply paying to AMD
for patents instead of Nvidia, while still making their own GPU?

Somehow I doubt AMD wants to give away its APU competitive advantage to Intel.

~~~
mastax
[https://arstechnica.com/gadgets/2017/11/intel-will-ship-processors-with-integrated-amd-graphics-and-memory](https://arstechnica.com/gadgets/2017/11/intel-will-ship-processors-with-integrated-amd-graphics-and-memory)

AMD GPU Die + Intel CPU die on one package.

~~~
shmerl
Interesting. Given it's Intel and AMD, I hope it will work seamlessly on
Linux, unlike the Optimus horror.

~~~
SXX
PRIME offloading on laptops with the same combination has worked flawlessly
for years. It obviously works on desktops as well, between GPUs with open
source drivers.
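
As a rough sketch of what that offloading looks like in practice (assuming the
open source Mesa stack and the `glxinfo` tool from mesa-utils; the exact
renderer strings depend on your hardware):

```shell
# Show which GPU renders by default, then force PRIME offload for one process.
# Guarded so it degrades gracefully on machines without glxinfo or a display.
if command -v glxinfo >/dev/null 2>&1 && [ -n "${DISPLAY:-}${WAYLAND_DISPLAY:-}" ]; then
    glxinfo | grep "OpenGL renderer"              # typically the integrated GPU
    DRI_PRIME=1 glxinfo | grep "OpenGL renderer"  # the discrete GPU, via PRIME
else
    echo "glxinfo or a display session unavailable; skipping"
fi
```

`DRI_PRIME=1` is the standard Mesa environment switch; any GL application can
be prefixed the same way to run on the secondary GPU.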

~~~
shmerl
Not on combination of Intel + Nvidia though.

------
unnawut
Genuinely curious: I wonder how a partnership between competitors like this
usually starts off. Does someone near the top (i.e. management) of each
company say, "Hey, we are behind on this. Can we borrow it?" Or do their staff
talk and suggest it upward? Or something else? Does this happen all the time
but only a few make it through? Or do these discussions rarely happen?

------
m3kw9
This proves how dominant Nvidia is; Intel and AMD are literally shaking in
their boots if they're even considering partnering up to attempt to defeat a
common enemy.

~~~
sclangdon
Is Nvidia that dominant? I don't have any numbers, but I'd be surprised if
they had more GFX cards installed than Intel. In the high-end market, sure,
but that's not a very big part of the whole story.

And as for game consoles, both the Xbox One and PS4 use AMD. Only the Switch
uses NVidia.

~~~
friedman23
Comparing dedicated graphics cards to integrated graphics and then claiming
intel has more chips installed than Nvidia is asinine.

------
nerpderp83
Ok, I haven't read the article, on mobile... But Intel and AMD shouldn't be
aligning on anything. This sounds extremely anti-competitive.

Nvidia should get an automatic x86 license.

Actually, I have a proactive anti monopoly idea. Any company that is the
predominant player in a market cannot use patents to limit the ability of a
competitor to make a compatible product.

This would mean anyone could make x86 chips w/o a license.

~~~
ac29
There are rumours and speculation about this, but my reading into it is that
this is going to be a single SKU (maybe 2 or 3 variants), possibly for a
single customer.

My best guess is Apple -- they were already using Intel CPUs with AMD GPUs,
and they probably opened up their checkbook to make this collaboration happen.
It should give them performance and power benefits, and Intel and AMD sell the
same number of CPUs and GPUs, respectively, as before.

Not sure why you think this is so anti-competitive. Who knows if NVIDIA was
offered the same deal and declined? They are much less prone to making "semi-
custom" parts for third parties, while AMD already has a track record of doing
this for Sony with the PS4 and Microsoft for the Xbox One.

~~~
jrs95
Given that gaming was specifically mentioned, I'm assuming it probably isn't
Apple. It's probably for products more similar to the Razer Blade Stealth. The
concept of something like that is cool, but the integrated graphics are
definitely the bottleneck.

~~~
adamson
I doubt gaming is really their endgame here; there are many reasons beyond
gaming to want an integrated GPU and CPU on the same die that would be
relevant to Apple for desktop and mobile (integrated pipelining, higher memory
bandwidth)

------
Lramseyer
This is fantastic for AMD! Intel has the largest market share in GPUs thanks
to its integrated graphics. While I haven't looked into it too much, it sounds
like an opportunity for AMD to capture some of Intel's market share on top of
what they already have. Let's just hope that Intel doesn't pull any legal
tomfoolery and steal AMD's IP.

------
floatboth
Oh, so still not confirmed officially.

Why would AMD agree to this? This would massively eat into Raven Ridge laptop
sales.

~~~
mmrezaie
Probably next Apple release cycle is the main motivation, but who knows?

~~~
mtgx
I'm actually quite surprised Apple didn't go with Ryzen/Threadripper already
in their Mac Pros, considering what huge multi-thousand-dollar margins that
would have offered them (at least if they went with the same absurd prices as
the Xeon Mac Pros). And they could've still claimed a significant performance
boost for their Mac Pros compared to the old Intel chip in the last
generation.

They could've also replaced all of their dual-core laptops with quad-core
Ryzen APUs for about 2-2.5x increase in both CPU multi-thread performance and
GPU performance. And I don't think it would've cost them more, or not
significantly more at least. AMD seems to price their cores at 50-60% of
Intel's cores.

~~~
pault
Aren't thickness, and therefore heat dissipation and power consumption,
paramount in Apple's laptop line? AMD struggles in all those areas as far as I
know.

~~~
mtgx
AMD's latest chips and APUs are more efficient than Intel's. You should do a
hard reset on everything you know about AMD's chips that is older than a year.

------
darklajid
I just got a G702ZC - basically a desktop Ryzen in a somewhat portable
package.

The keyboard is crap. It has a dedicated key for a 'Look! This is my current
system load. You can even connect a mobile to see it on your phone!' ROG key
where a NumLock would be. It doesn't have an End key, which infuriates me.
There's a power button in the top right of the keyboard which I WILL press in
the future. It's just a failure about to happen. The keyboard is utterly
flawed.

Well.. I at least have a Ryzen 7 with 8 cores, right? Yeaaaaah... Right now
the braindead EFI interface (you can switch between 'ez' (sic) and advanced
mode) doesn't seem to expose a method to enable AMD-v. So ... Yeah. I have
8(16) cores to .. I don't know. Watch YouTube I guess.

AMD and Asus really dropped the ball here. The keyboard is unacceptable. The
AMD-v issue is .. low. So fucking low.

Don't buy this laptop?

------
neurotech1
Non-Paywall link [http://archive.is/X3vJN](http://archive.is/X3vJN)

------
habeebtc
So, this might seem like strange bedfellows, but recall that AMD and Intel are
both competitors and each other's customers.

AMD licenses x86 from Intel, and Intel licenses x64 from AMD (because Itanium
failed to win the 64 bit market).
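
That cross-licensing history even survives in platform naming: the 64-bit
extension Intel licensed back from AMD is still reported as "x86_64" on
Linux/macOS and "AMD64" on Windows. A minimal sketch (note that
`platform.machine` simply reports the OS's name for the ISA, so the exact
string varies by system):

```python
import platform

# The 64-bit x86 ISA that Intel licensed from AMD still carries AMD's name:
# Linux/macOS report it as "x86_64", Windows as "AMD64" (the same AMD64/x86-64).
arch = platform.machine()
print(arch)
```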

------
sabujp
First of all, Intel and AMD are not rivals. The only reason AMD is alive is
because Intel lets it live. Intel would pretty much become a monopoly in the
server line if AMD were to disappear.

~~~
sumedh
> The only reason AMD is alive is because Intel lets it live.

Didn't Intel play dirty with AMD by forcing OEMs to use Intel chips instead of
AMD's?

~~~
sabujp
That was a long time ago when AMD might have been a threat.

------
quickben
And the stock goes wild in exactly 10m on market opening.

~~~
skinnymuch
AMD has a decent bump, up over 6% so far today (market's been open for 15
min). I was going to go in but read this news too late. AMD is too volatile
and random to bet short term that they'll go up more than the current 6% bump.

------
Abishek_Muthian
Superman carrying Kryptonite with him?

P.S. AMD's K series was named after kryptonite, as Intel's Pentium series was
considered to be Superman.

------
martin1975
Great news. The only way we will see progress w/ OpenGL, and eventually
Vulkan, as well as Wayland and its litany of Wayland-enabled compositors, is
with open source drivers.

Intel and AMD seem to have seen the light of Linux. I wonder who's next -
Microsoft perhaps ditching Windows for Linux and building a super GUI on top
of Linux to make it an OS X killer?

------
nicktelford
I honestly can't see this going anywhere; Intel have no skin in the discrete
graphics game, so what do they have to gain from this? They're not really
competing with nVIDIA, except perhaps in the emerging AI hardware market?

~~~
mtgx
Intel has already displaced Nvidia in most laptops because its integrated
graphics got "good enough". Some OEMs continue to add low-end Nvidia graphics
to Intel-based laptops, even though they have the _same_ or _less_ performance
than Intel's graphics, simply because they don't want Intel thinking it is
their exclusive supplier and can raise prices at will.

What this deal will do is allow Intel to become the de-facto leader in the
_growing_ "laptop gaming" market, displacing both Nvidia and AMD from that
market. AMD had a chance to dominate that market now with multi-core Ryzen
CPUs and its dedicated GPUs, but it seems they've just decided to hand that
market over to Intel on a silver platter.

Such a huge mistake from AMD. This is why I would rather AMD be acquired by
someone like Samsung or Broadcom, to give it the money it needs, than have it
do stupid deals like this one with Intel because it's so strapped for cash.

~~~
bitL
I guess AMD is betting on EPYC; desktop is not progressing much and they are
non-existent in laptops either, possibly blocked by Intel at the vendor level,
so it's better for them to get a few % of profit from this segment instead of
none.

~~~
lettergram
Exactly my thoughts. I posted in a comment below (also responding to mtgx).
Essentially, the laptop chips AMD can produce are still slightly better (well,
cheaper) - it's just they can't get into the market for a few years until
vendors are up again.

------
nailer
Why doesn't NVIDIA build a CPU? They'd have to license x86 from Intel/AMD
(since most Windows apps are built for x64) but AFAIK CPUs are far less
complex (in sheer amount of transistors) than GPUs.

~~~
littlecranky67
Because x86 is a patent minefield. AMD has a license, but other companies
would have to negotiate with Intel over those patents before they can do x86.

~~~
lawl
And then AMD for x64.

~~~
nailer
Yes, that's why I mentioned that in the question.

As an aside: I know it's not the done thing to complain about down votes, but
I do find an honest, polite question - not even a statement - being modded to
oblivion somewhat unusual.

------
mtgx
This will probably go down as AMD's biggest strategic mistake in the past
decade (other than the Bulldozer architecture).

Did they at least revise their licensing deal where Intel basically adds a
requirement that AMD can't be sold to other companies? If not, then AMD's
leadership must be clueless. They should've revised that clause the first
chance they got to make another deal (like this one!) with Intel.

~~~
jjawssd
As a "strategic mistake" counter-argument: It is possible that AMD would go
under without this deal.

~~~
DoofusOfDeath
Leading AMD must be emotionally exhausting. Can you imagine leading a company
that has been fighting for survival for so long?

~~~
manigandham
This is basically every startup. Many of the quiet successes take decades.

~~~
qeternity
This is most _businesses_ in general where healthy and creatively destructive
competition takes place.

