
Intel with Radeon RX Vega Graphics: Core I7-8809G with 3.1 GHz Base, 100W TDP - DiabloD3
https://www.anandtech.com/show/12207/intel-with-radeon-rx-vega-graphics-core-i78809g-with-31-ghz-base-100w-target-tdp-overclockable
======
sspiff
Weird how they include an HD 630 on the die. It eats up a fair chunk of die
area and eats into the thermal budget. It also means the displays will
probably be wired to the HD 630, so the Radeon can power down to save power.
Taking out the HD 630 could have lowered the power usage at full load to a
level more palatable for a laptop.

This means you have to pay a penalty to render on the Radeon but send its
framebuffer back to the HD 630 for display. This is also the case with
current Nvidia Optimus or AMD Enduro systems. Drawbacks of this approach are
frame tearing / lack of proper V-Sync and added latency.

If this ends up being used by Apple, however, it is likely that a hardware
mux will be used to connect the display to both graphics chips and switch it
back and forth on demand. Other vendors don't do this because it is more
expensive, and passing the display from one GPU to another requires the pixel
clock rate (among other details) to be shared between the different GPUs,
which is not something the vendor (Intel/Nvidia/AMD) driver stacks support.

Note that such a hardware mux is used in previous generations of MacBook
Pros, and it is not publicly documented, which means very poor and/or very
late support in open-source operating systems such as Linux.

Some system builders (Lenovo and Asus have done this, probably others) wire
the internal display of the laptop to the Intel GPU and the external HDMI/DP
to the discrete graphics. In the past, this allowed you to power more
displays (when Intel GPUs were restricted to 2 displays max), and it still
lets you avoid the Enduro/Optimus hassle for external displays, when you are
likely to be powering the laptop from the A/C adapter anyway. It also meant
that vendor/OEM-specific drivers needed to be used in some cases, though.

~~~
DiabloD3
In Windows 10 and Linux, this can be done nearly for free without any
additional hardware. Linux calls this dma-buf, and Windows 10 basically
requires it for any WDDM 2.x-compliant driver: GPU-to-GPU direct DMA mapping
without CPU involvement (routed directly over PCI-E rather than through the
CPU memory controller when possible).

Basically, it means Optimus-style and hardware-mux-style setups are obsolete:
the display framebuffer and the render framebuffer are naturally decoupled,
and no overt cross-architecture support is required (e.g., Nvidia Optimus
support in Nvidia's driver requires explicit Intel support; dma-buf solutions
don't).
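
On the Linux side, the userspace face of this is DRM PRIME. Here's a rough
sketch of the mechanism (not the exact path a real compositor takes; the
device paths and the dumb-buffer stand-in are assumptions, and error handling
is minimal):

    /* Sketch: share a buffer between two GPUs via dma-buf / DRM PRIME.
     * Assumes /dev/dri/card0 is the render GPU (the Radeon) and
     * /dev/dri/card1 drives the display (the Intel iGPU).
     * Build: cc prime.c -o prime $(pkg-config --cflags --libs libdrm) */
    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int main(void)
    {
        int render_fd  = open("/dev/dri/card0", O_RDWR | O_CLOEXEC);
        int display_fd = open("/dev/dri/card1", O_RDWR | O_CLOEXEC);
        if (render_fd < 0 || display_fd < 0) {
            perror("open");
            return EXIT_FAILURE;
        }

        /* A dumb buffer stands in for a rendered frame; a real renderer
         * would allocate through its own driver (amdgpu/radeon here). */
        struct drm_mode_create_dumb create = {
            .width = 1920, .height = 1080, .bpp = 32,
        };
        if (drmIoctl(render_fd, DRM_IOCTL_MODE_CREATE_DUMB, &create)) {
            perror("create dumb buffer");
            return EXIT_FAILURE;
        }

        /* Export the buffer from the render GPU as a dma-buf fd... */
        int prime_fd;
        if (drmPrimeHandleToFD(render_fd, create.handle,
                               DRM_CLOEXEC, &prime_fd)) {
            perror("export dma-buf");
            return EXIT_FAILURE;
        }

        /* ...and import it on the display GPU. No CPU copy is involved;
         * the display driver gets a handle to the same memory. */
        uint32_t display_handle;
        if (drmPrimeFDToHandle(display_fd, prime_fd, &display_handle)) {
            perror("import dma-buf");
            return EXIT_FAILURE;
        }

        printf("shared: render handle %u -> display handle %u\n",
               create.handle, display_handle);

        close(prime_fd);
        close(display_fd);
        close(render_fd);
        return EXIT_SUCCESS;
    }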

Presumably, OSX can also leverage this.

~~~
sspiff
Sure, but then you still don't have V-Sync, which causes terrible and easily
noticeable tearing on almost anything (at least in the case of Nvidia; it
seems to be much better with AMD in my anecdotal testing). I doubt Apple
would find tearing acceptable on their products, honestly.

~~~
DiabloD3
That isn't related to dma-buf-type usage, though. That would purely be one of
the drivers not signalling the end of the frame to the other (all graphics
APIs have suitable methods for this, and WDDM itself natively understands the
concept).

Also, in Windows 10, the window manager's compositor (or its equivalent)
handles all the event signaling fine; i.e., you could easily have a situation
where a game is rendered on one GPU, displayed on the other, the display GPU
+ monitor has VESA Adaptive-Sync enabled, and the game correctly triggers new
sync windows as if it had been running on the same GPU.

Does this work in practice? Who knows, but it's supposed to, especially with
the fixes that 1703 and 1709 added.
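
On the Linux side, the equivalent frame-completion signal is an explicit
fence: the render GPU hands over a sync_file fd, and the display GPU's page
flip waits on it. A hedged sketch of the display-side half using libdrm's
atomic API (the property and object IDs are assumed to have been looked up
elsewhere, and the fence fd to have come from the rendering driver):

    /* Sketch: queue a page flip on the display GPU that waits on a
     * fence produced by the render GPU, so the flip happens only once
     * the frame is actually finished (no tearing, no stale frames). */
    #include <stdint.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int queue_flip_with_fence(int display_fd,
                              uint32_t plane_id,
                              uint32_t prop_fb_id,    /* "FB_ID" prop id */
                              uint32_t prop_in_fence, /* "IN_FENCE_FD" id */
                              uint32_t fb_id,         /* fb to display */
                              int render_fence_fd)    /* from render GPU */
    {
        drmModeAtomicReq *req = drmModeAtomicAlloc();
        if (!req)
            return -1;

        /* Attach the new framebuffer and the fence to the plane; the
         * kernel holds the flip until the fence signals. */
        drmModeAtomicAddProperty(req, plane_id, prop_fb_id, fb_id);
        drmModeAtomicAddProperty(req, plane_id, prop_in_fence,
                                 (uint64_t)render_fence_fd);

        int ret = drmModeAtomicCommit(display_fd, req,
                                      DRM_MODE_ATOMIC_NONBLOCK |
                                      DRM_MODE_PAGE_FLIP_EVENT, NULL);
        drmModeAtomicFree(req);
        return ret;
    }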

~~~
mrec
Is this intended/expected to work on desktop setups with discrete GPUs too? I
believe Optimus _et al_ were always laptop-only, but some desktop users might
care about heat/power/fan noise.

~~~
DiabloD3
It's expected to work. I've tested it on mine: a Haswell i7's own iGPU plus a
full-sized GCN Radeon, with a game rendering on the Radeon but displaying on
the iGPU.

A multi-monitor setup with 2 displays on the Radeon (normally all 3) and 1 on
the iGPU seemed to work fine. No real visible lag either, from what I could
tell with limited testing.

------
fredsted
I wonder if this was designed for an Apple product? Maybe a new iMac or Mac
mini?

~~~
ksec
Mac mini - TDP too high. iMac - lots of space for a separate CPU + GPU.

We all thought the whole reason for sticking a dGPU close to the CPU was
space, hence a notebook-type machine. Could anyone enlighten me as to why I
would want a desktop with an integrated dGPU rather than a CPU + GPU combo,
which is likely cheaper and easier to cool?

Not to mention you are stuck with a 4-core Kaby Lake instead of the newer
6-core Coffee Lake.

~~~
deafcalculus
For the same reason companies make SoCs: it saves money. Intel's idea with
EMIB (multiple dies in one package) is to save money by using an older
semiconductor process for things such as the southbridge (e.g., USB) that
don't need the latest and most expensive process. This way, they also get
more value from their investment in previous process nodes.

Routing HBM to the GPU using PCB traces is expensive, and in the low-to-mid
end, saving money on PCB and cooling matters. If EMIB is going to be used
anyway, why not put the GPU in there as well? With 1024-bit 4 GB HBM on
package, memory bandwidth won't be a problem, and the i7-8809G will likely be
faster than a PS4, which means it'll be good enough for 1080p gaming.
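
Back-of-the-envelope on the bandwidth claim (assuming HBM2 at 1.6 Gb/s per
pin, and the PS4's 256-bit GDDR5 at 5.5 Gb/s per pin; figures from memory):

    #include <stdio.h>

    int main(void)
    {
        /* bandwidth (GB/s) = bus width (bits) * per-pin rate (Gb/s) / 8 */
        double hbm2 = 1024 * 1.6 / 8.0; /* = 204.8 GB/s on package  */
        double ps4  =  256 * 5.5 / 8.0; /* = 176.0 GB/s for the PS4 */
        printf("Vega M HBM2: %.1f GB/s, PS4 GDDR5: %.1f GB/s\n", hbm2, ps4);
        return 0;
    }

So even rough numbers put the on-package HBM ahead of the PS4's memory
system.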

KabyLake was probably chosen because this design was finalized before Coffee
Lake. The next iteration will likely have more cores.

~~~
ksec
It only saves money if:

1\. Intel were actually manufacturing the GPU themselves as well. In this
case they don't. And EMIB is an Intel-specific technology; AMD doesn't use it
when packaging HBM with its dGPUs.

2\. AMD makes a profit before selling it to Intel, and Intel has to make a
profit on integrating it with EMIB. I doubt the cost savings of using EMIB
together with HBM, if real, outweigh the profit margins of both companies.

------
stevefan1999
With a little more polishing of this combination, I can see the future: the
thinnest ultrabooks with desktop-grade performance and experience. Packing a
monstrous top-tier 4C/8T CPU and a high-end graphics chip into an ultrabook
is definitely great for everyone. It would have been even better for me if a
Ryzen CPU were in the running.

This is going to be the future; no, it should already have been!

~~~
craftyguy
With a 100W TDP, it's going to need more than just a 'little more polishing'
to be viable for a mobile ultrabook product.

~~~
stevefan1999
Agreed, I must have confused ultrabooks with notebooks. But rumor has it this
hybrid chip will end up in the 2018 MBPs. We might see how well it does
first-hand from Apple.

------
vesak
Incidentally, HP is already selling an AMD Ryzen + Vega convertible laptop:
[http://store.hp.com/us/en/pdp/hp-envy-x360-convertible-lapto...](http://store.hp.com/us/en/pdp/hp-envy-x360-convertible-laptop-15z-touch-1za07av-1)

I'm waiting for the reviews to come in before making any decisions, but it
looks like quite a nice machine.

~~~
dragontamer
That's a Vega with 8 compute units, fully integrated with the CPU (the iGPU
and CPU share a single memory controller to DDR4 RAM). The one discussed in
the article is a Vega with 24 compute units and extremely high-speed
dedicated HBM memory. The performance difference would be night and day.

\-------------

With that said, I purchased the laptop you're talking about. A brief
overview:

1\. Terrible screen -- something like 60% sRGB coverage (maybe worse), and it
shows, with dithering and banding artifacts. It's also noticeably dimmer than
other laptops.

2\. Horrible HP Store -- I bought the laptop during Black Friday, and HP
shipped me a bricked laptop. Then they wouldn't replace it because I got a
"custom" SSD or some bullS!@#!@ excuse. My choices were to repurchase at a
non-sale price or get a refund. I opted for the refund and had to wait until
the next sale to get a decent price. I suggest you buy from a reputable
retailer with a decent return policy, just in case the computer is broken or
some other issue comes up. You do NOT want to be dealing with HP's online
store customer service.

3\. Minor default driver issues -- Fullscreen YouTube has occasional
glitches, and the screen has shut off randomly at least once. Based on other
owners, this seems to go away once you reinstall Windows and do some of HP's
legwork for them. You should expect to reinstall Windows anyway because of
the amount of bloatware (McAfee, Candy Crush, etc.) that comes with the
laptop.

4\. AMD driver issues -- There's no Raven Ridge version of the latest "AMD
Adrenalin" driver set. AMD's driver rollout for this laptop seems to be slow.
I figure that as Raven Ridge APUs become more widespread, AMD's driver
situation will improve.

5\. Defaults to a hard drive -- Why, HP? Just... why? But it has an M.2 slot,
so it's super easy to just stick a Samsung 960 Evo in there and fix that
issue.

\-----------

Aside from the poor-quality screen (which stands out), it's a good laptop
overall. I think I'll be fine with it. It's got a numpad, the "360" 2-in-1
form factor definitely works, and the stylus support is great.

The laptop is also fully expandable, with an M.2 slot, SATA slot, and easily
accessible DDR4 DIMMs:
[http://h10032.www1.hp.com/ctg/Manual/c05819570](http://h10032.www1.hp.com/ctg/Manual/c05819570)

So you can upgrade the laptop to your heart's content. You are going to be
stuck with a dim, dithered, banding screen, however. It's sufficient for
office applications, but holy crap, the color irregularities are quite
obvious. So this laptop's screen definitely can't be used for any serious art
(which is a shame, since the stylus support is so good).

It's definitely a $700-ish laptop, with all of the issues typical of that
price range (poor screens are seriously common at this price point). You'll
get very good price/performance and reasonable gaming on the go, since you
get 4 cores / 8 threads and a decent iGPU (it's not as good as the MX150, but
it can play the newest Doom at 720p and Rocket League at 57 FPS / 720p).

I guess... the x360 as purchased isn't a very good laptop. But it's actually
very easy to turn it into a good one (aside from the screen). Reload Windows,
fix up the drivers a bit, stick an M.2 SSD in there, and it's actually great.

~~~
vesak
Thanks a million! I've also had a mediocre experience with a
previous-generation x360. In my opinion it is frustratingly close to being a
very good machine; if only a few details were of higher quality. But I
suppose it couldn't be quite as cheap that way.

------
deafcalculus
I suppose this marks the death of low-end discrete GPUs.

~~~
olegkikin
That's a good thing. Low-end GPUs suck anyway, but add so much weight and
power consumption.

~~~
baldfat
Once you get to a $70 GPU, the argument starts to fall apart. Sure, the
sub-$50 cards are only worth it for someone who just wants to drive another
monitor (but then buy a DisplayPort card and monitors); the
value-to-performance of these $50 cards is low.

Looking at the GT 1030, this $70 card's performance-to-value improves
greatly. It has the same performance as an AMD 480.
[https://www.newegg.com/Product/Product.aspx?Item=N82E1681413...](https://www.newegg.com/Product/Product.aspx?Item=N82E16814137140&cm_re=gtx_1030-_-14-137-140-_-Product)

~~~
paradisechris
Did you mean something else? After a quick look, a 480 has around twice the
performance. Nearly bought a 1030 ;)

~~~
baldfat
460!!!! I made a mistake :(

