
Intel Core with Radeon RX Vega M Graphics Launched: HP, Dell, and Intel NUC - mrmondo
https://www.anandtech.com/show/12220/how-to-make-8th-gen-more-complex-intel-core-with-radeon-rx-vega-m-graphics-launched
======
a012
OP added "ATI" by themselves rather than keeping original one. What a shame.

Anyway:

> It provides an additional six displays up to 4K with the Intel HD graphics
> that has three, giving a total of nine outputs. The Radeon Graphics supports
> DisplayPort 1.4 with HDR and HDMI 2.0b with HDR10 support, along with
> FreeSync/FreeSync2. As a result, when the graphics output changes from Intel
> HD Graphics to Radeon graphics, users will have access to FreeSync, as well
> as enough displays to shake a stick at (if the device has all the outputs).

Yes, if those NUCs/HTPCs actually expose all of those outputs; otherwise it's
just marketing words. In reality, I'd guess only the top-end models have more
than one DisplayPort.

~~~
prewett
This way does emphasize the incongruity of Intel and AMD in one product. I
like it.

However, I read the article hoping to figure out why Intel is doing this, and
no luck there. Are they giving up on the Intel graphics?

~~~
a012
No, they aren't. They'll fuse AMD's Vega core with their CPU (including their
IGP) to make this. The AMD GPU will be used for heavy tasks like gaming and
rendering, while the Intel IGP will be used for lower-power tasks like display
output or H.264/H.265 encode/decode.

~~~
dsr_
"fuse" is a little strong. There's a VEGA GPU in a multi-chip module,
connected with a 8x PCIe lane. It's not like Intel licensed the GPU for
integration into their own silicon.

We've already seen laptops with Intel/AMD hybrid graphics; this just moves it
from the motherboard to the other side of the socket, without actually giving
you the high-speed interconnect that a GPU-on-CPU design gets.
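
For a rough sense of what that link gives you (a back-of-the-envelope sketch,
assuming the PCIe 3.0 signalling these parts use):

    # Theoretical bandwidth of the CPU-to-Vega PCIe link.
    GT_PER_SEC = 8        # PCIe 3.0: 8 gigatransfers/s per lane
    ENCODING = 128 / 130  # 128b/130b line-encoding overhead
    LANES = 8

    # One bit per transfer per lane, per direction.
    gbytes_per_sec = LANES * GT_PER_SEC * ENCODING / 8
    print(f"x{LANES} PCIe 3.0: ~{gbytes_per_sec:.2f} GB/s each way")
    # -> ~7.88 GB/s, a fraction of what an on-die interconnect offers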

------
rwx------
"but still no ECC": are these CPUs going to be useless without ECC memory?

~~~
dna_polymerase
These are designed for mobile computers, notebooks for the most part. You
don't need ECC in notebooks (consumer hardware). So no, they are not useless.

~~~
adrianratnapala
So here is the odd thing: servers are (often) managed by Trained
Professionals, and have backups and failovers.

Personal computers are, well, not like that. And yet they often have people's
important creative works, correspondence, etc. on them. ECC is probably more
useful there; it's just that it is harder to make any benefit _visible_ to the
customer.

(That doesn't mean customers don't care about reliability. It is just that
they have no sane way of distinguishing a product that really is reliable
from one with advertising that lies about being reliable.)

~~~
dna_polymerase
The need for ECC has nothing to do with maintenance or proper administration.

You really don't need ECC as a normal user; most of the time bit flips won't
really hurt you. However, if you run long-running tasks like 3D rendering or
physics simulations, you may want ECC just to be sure your OS won't be killed
by a bit flip in the wrong section. Your photo gallery or music collection,
however, will most likely never be hurt by something like that, so consumers
still don't need to waste their money on overpriced ECC memory.

~~~
ianhowson
> Your photo gallery or music collection

On the contrary, these are generally highly compressed. A single bit flip has
major consequences.

My personal photo and music collections are what I care _most_ about bit flips
for, as I intend to keep them for a lifetime.
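
That fragility is easy to demonstrate (a minimal sketch using zlib; a
compressed image or audio stream fails analogously):

    import zlib

    data = b"irreplaceable family photos, " * 100
    packed = bytearray(zlib.compress(data))

    packed[len(packed) // 2] ^= 0x01  # flip a single bit mid-stream

    try:
        zlib.decompress(bytes(packed))
    except zlib.error as e:
        print("one flipped bit, whole stream rejected:", e)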

I suspect that the target market for this product is mobile gaming, and
therefore ECC would be a negative for the system as a whole.

------
Tsiklon
I could imagine an OEM like Apple or Dell looking at these to free up space in
the board layout of their notebooks or small form factor computers.

In such a scenario ECC memory is not really a high priority, no?

Looking at the PCI-Express lanes available, those other eight CPU lanes look
ripe for a Thunderbolt 3 controller, with the rest of the peripherals being
driven from the PCH (or does that make no sense at all?).

------
mozumder
Not sure why Intel doesn't just release a full system-in-package for a basic
laptop, with 16GB or 32GB of main memory and 256GB or 512GB of NVMe flash.

I think Intel still makes flash chips, and they originally started out as a
DRAM manufacturer as well.

~~~
mrweasel
Because that would mean discarding an entire package if just one component
fails or is damaged during production.

It would also mean that manufacturers would have to order different parts for
two systems that are identical except for memory or storage. In your example,
with 16 or 32GB of RAM and 256 or 512GB of storage, you'd end up having to
order four different SKUs from Intel, and Intel would have to manufacture four
different SKUs, plus one without memory and storage.

Logistically I don't think it makes sense to add storage and memory to the
package, it just adds inflexibility and more SKUs.
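
The blow-up is easy to see (a toy enumeration; the part-number scheme is made
up):

    from itertools import product

    ram = ("16GB", "32GB")
    storage = ("256GB", "512GB")

    skus = [f"SIP-{r}-{s}" for r, s in product(ram, storage)]
    skus.append("SIP-bare")  # the no-memory, no-storage variant

    print(len(skus), "SKUs:", ", ".join(skus))
    # -> 5 SKUs from just two memory and two storage options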

~~~
nnq
Yeah, it would be harder to make and sell "handicapped" configurations to
users for high prices, forcing them to prematurely upgrade... hence everybody
would milk less money from the end-users. Also, some still sell 5400 RPM HDDs
(!!!) in some versions of current-gen machines, and _some people actually buy
them!_

All systems you see today with something like 6-8GB of RAM and <500GB of SSD
storage will practically _force_ their users to upgrade in under 3 years,
whereas if you sell the average home user a system with >=16GB RAM and >500GB
SSD, they could keep using it for up to 6 years with no chance of it feeling
underpowered... so _very_ bad for business!

~~~
pjc50
... but it costs twice as much?

------
sleavey
Why are AMD helping to sell a competitor's product? The article says it's
strictly business, but I would have thought the quick buck made today by
selling graphics chips to Intel would be outweighed by the long-term benefit
(e.g. growth in the combined CPU/GPU market) this provides to Intel.

~~~
frou_dh
Maybe the likes of Apple said to Intel: "Look, if you don't find a way to
supply GREAT integrated graphics, we're going to do our own laptop chips".

So Intel goes cap-in-hand to AMD for their tech.

~~~
nolok
If I remember the latest numbers right, Apple's laptop sales share is around
10% at best, so if anything like that story happened, it would not be them but
HP (25%)/Lenovo (20%)/Dell (15%).

~~~
frou_dh
...do they have in-house CPU and GPU designs?

~~~
nolok
Are you suggesting that companies like HP or Lenovo couldn't get into
producing ARM chips if they wanted to?

Intel isn't threatened by ARM in performance laptops yet.

Of course, one could reply that laptops don't need that much power anymore to
do what most users use them for today (hence even people here using tablets as
their main tool), but that's another issue entirely, one of "much cheaper and
good enough", where Apple isn't Intel's main opponent either (I would even
wager that in that one Intel is its own enemy, given how terribly they've
handled the Atom brand and its performance to protect margins).

~~~
frou_dh
Those companies could conceivably do a lot of things, but that's a lot
different from having in-house CPU and GPU design teams already staffed and
firing on all cylinders for years.

Not sure I get you re Apple being an opponent to Intel for "cheap" parts. It
goes without saying that Apple would not sell their own stuff to anyone else.
And for their own use, I'm sure the hardware would be very cheap to
manufacture.

~~~
nolok
Doesn't change what I said, though. Unless you expect Apple to take over the
laptop market (if they didn't get above 10% during the last pro-Apple decade,
they're not going to do so now, especially given their current issues in the
field), or to start selling their custom chips to third parties (they've never
once done so, unless I'm mistaken) AND those chips taking over the market,
they're not the threat Intel has to worry about.

Intel has to worry about Nvidia taking over graphics and machine learning, AMD
punching above its weight in workstations with EPYC and Threadripper, and ARM
being cheap and good enough in all the low-power fields and growing.

A middle-of-the-line, ARM-based, performance-oriented laptop chip is no threat
to the i5/i7, except indeed for the risk of losing their place in Macs.

~~~
frou_dh
I just don't see what other laptop manufacturers are doing as being
particularly relevant to Apple's decision making.

If they ditched Intel, that would _very_ publicly tear a strip off Intel,
regardless of absolute market share. That's not something Intel wants, and it
is what I contended Intel has gone to unusual lengths, integrating AMD
graphics, to avoid.

------
justinclift
Hopefully the new Mac Minis use these. If the Mac Mini line ever does get
updated, that is. :)

------
jacksonsabey
The new AMD APUs don't have ECC support either; I'm hopeful that future
versions will.

~~~
sp332
Why would they disable ECC support for the APU version?

------
make3
Imagine if Intel ended up buying AMD... That'd be weird (and bad).

~~~
bartread
In the UK that sort of merger would almost certainly be blocked by the
Competition Commission: I'd be interested to hear what the situation is in the
US, where Intel and AMD are based.

~~~
tracker1
FTC/SEC would likely block such a merger... even in the current administration
it would be nearly impossible to get done.

~~~
bartread
Thanks - reassuring to know.

------
Shivetya
Seems like a combination tailor-made for Apple, yet nothing I have read points
to Apple actually using it; Dell and HP I have seen mentioned.

------
leandrod
One question: free drivers?

------
gsich
ATI was bought by AMD in 2006.

------
capisce
AMD, not ATI.

------
mrmondo
Whoops - s/ATI/AMD

------
pricetag
I wonder what this will do for $INTC?

------
itsnotvalid
Also just wondering whether this CPU is free from Meltdown.
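
On a Linux box with a recent enough kernel (4.15+) you can at least check what
the kernel reports; a quick sketch:

    import glob

    # Per-vulnerability status files exposed by the kernel (Linux 4.15+).
    for path in sorted(glob.glob("/sys/devices/system/cpu/vulnerabilities/*")):
        with open(path) as f:
            print(path.rsplit("/", 1)[-1] + ":", f.read().strip())
    # e.g. "meltdown: Mitigation: PTI", i.e. the silicon is affected
    # but the OS is working around it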

------
smallnamespace
How ironic that AMD itself doesn't have an integrated full-performance CPU/GPU
package (cut-down APUs notwithstanding).

Also who negotiated the deal to let Intel slap only its logo on the packaging,
even though the AMD die is clearly larger?

~~~
TazeTSchnitzel
It's a “semi-custom” deal, like with the AMD silicon in PS4 and Xbox One, so
the company getting it made for them (Intel) gets their name on the chip.

------
dna_polymerase
Are Meltdown and Spectre pre-installed, or do I have to download them via the
Intel ME? /s

I don't know how they have the audacity to release new CPUs while all their
products just went to shit...

~~~
prewett
Yeah, because all they have to do is just tweak some code and re-release... /s
It takes 2-4 years to design a chip. If Intel doesn't release the new chip
they started designing 2-4 years ago, they waste that development cost, fall
behind AMD, and you complain that they are behind the times. If they do
release the chip, you complain they haven't fixed the bug you just learned
about a week ago. It took a year for everyone to fix their _software_, and
you want Intel to magically come out with new hardware? Sheesh.

I think all the "Intel sucks" comments should have a disclaimer "Full
disclosure: I've never designed hardware in my life, I have no idea how to run
a business, I really just hate Intel and this is an excuse to vent my hatred."

~~~
dna_polymerase
Nope. This bug is far from new; Intel has known about it since 2012 [0]. I
don't expect them to release a fix within a year or anything. I know that CPU
design is far from easy or doable within weeks. But how they handled the whole
matter shows that they don't care. Intel sucks, that's true. They should admit
that they produced shit, start talking to their customers, and show that they
care.

Oh and also, who in their right mind would buy a CPU with those bugs?

[0]:
[https://twitter.com/TheSimha/status/949361495468642304](https://twitter.com/TheSimha/status/949361495468642304)

~~~
whywhywhywhy
>They should admit that they produced shit and start talking to their
customers and show that they care.

Someone correct me if I'm wrong, but isn't almost every modern CPU vulnerable
in some way to Spectre? If so, isn't every modern CPU manufacturer "producing
shit" in your eyes?

~~~
dna_polymerase
Most modern AMD CPUs are not vulnerable, or at least not to the full extent,
so no, not all CPU manufacturers are producing shit.

