
Marrying Vega and Zen: The AMD Ryzen 5 2400G Review - pella
https://www.anandtech.com/print/12425/marrying-vega-and-zen-the-amd-ryzen-5-2400g-review
======
Certhas
tl;dr: "With the Ryzen 5 2400G, AMD has completely shut down the sub-$100
graphics card market. As a choice for gamers on a budget, those building
systems in the region of $500, it becomes the processor to pick."

~~~
zrm
> With the Ryzen 5 2400G, AMD has completely shut down the sub-$100 graphics
> card market.

Well, almost. Integrated graphics generally take the place of additional CPU
cores. For example, there is no 8-core Ryzen with integrated graphics. So if
you need cores (like for compiling code) then you end up with a discrete
graphics card even if you don't need fast graphics.

I actually wish Intel would make a cheap discrete graphics card for that
purpose, because they have the most stable Linux graphics drivers.

~~~
fyi1183
To be fair, AMD's Linux drivers have been pretty solid for a while now. I
don't have an objective comparison, but the general perception definitely has
some catching up to do. For those Ryzens you'll probably need a very recent
kernel, but that's to be expected with new hardware.

~~~
jandrese
It depends. We got a few machines with AMD RX 460s (or similar, I can't
remember exactly) and the users wanted Ubuntu on them.

So I install 16.04, the most recent LTS release available, because I don't
want these machines to turn into a pumpkin after less than 9 months, but the
graphics are a total fail: SVGA 800x600 only.

So the kernel on that release is too old, and updating the kernel on Ubuntu is
fraught with peril. So I look into third party drivers and find that support
for these cards was dropped entirely, even from the third party Ubuntu
repositories. So I go to ATI/AMD's website and find the driver package, but it
turns out to be a tarball full of .debs with zero instructions on how to
install or enable them. I was having flashbacks to working on Linux installs
in the late 90s and having to hack everything to get it working.

In the end I had to install 17.10 and apply the hacks to make it work[1] just
to get the graphics running, and now I'm going to have to revisit those
machines in a couple of months to upgrade them to 18.04.

[1] Disabling the new systemd DNS resolver that crashes on literally every
lookup attempt on our network. Also had to learn how to program the modeline
equivalent for Wayland for when you're going through a KVM and don't have
EDID. This was no picnic in later versions of X, but I had already figured it
out and had earlier versions as reference. I still miss xconfig sometimes.

~~~
y4mi
Yes, using the latest hardware is painful with Ubuntu.

It's gonna be a breeze in a few months though. And once that happens, the open
source AMD drivers are going to be more stable than the Nvidia blob again.

~~~
jandrese
Given that the nVidia blob apparently doesn't work with Wayland, it's going to
be a problem in the near future.

------
lykr0n
I think it would be amazing if they were also able to stick a GB or two of
HBM onto the same chip as a kind of L4 cache / VRAM for the GPU. Performance
would be amazing, especially considering that Zen's performance gets better
with RAM speed and GPUs are almost entirely RAM starved.

~~~
wastedhours
I've never considered that as an idea before, but now am left wondering why it
hasn't been done. Guessing the manufacturers make more money from full-unit
upgrades than they'd expect to make from incremental memory unit purchases
(even if they made the RAM slot proprietary).

~~~
bryanlarsen
Intel has a new chip coming out with AMD graphics and HBM memory:
[https://www.anandtech.com/show/12220/how-to-make-8th-gen-mor...](https://www.anandtech.com/show/12220/how-to-make-8th-gen-more-complex-intel-core-with-radeon-rx-vega-m-graphics-launched)

~~~
dkarbayev
But that’s not a desktop CPU. I would gladly buy it if it were.

~~~
wmf
With a 100W TDP it basically is a desktop CPU.

------
walrus01
Pending good motherboard support for 2160p60 (4K 60Hz), the $169 CPU looks
like a great choice for a 4K home theatre PC. Just a few months ago I built a
Ryzen 1500X based HTPC with a passively cooled GeForce 1030 card.

The GeForce 1030, a cut-down version of the same architecture as the
1050/1070/1080, gets me hardware accelerated H.265/HEVC and the capability to
play most MKV files without dropping frames at up to very high bitrates.
Typical playback of something like a 2160p60 24GB copy of the original Blade
Runner runs at about 18% CPU usage across all four cores using a recent VLC beta.

But the total money spent was something like $89 motherboard + $89 video card
+ $165 CPU. It would be nice to not have to buy the video card.

~~~
DCKing
Be aware that for 4K, almost no motherboards exist at this time that can pair
one of these APUs with an HDMI 2.0 output. Almost all AM4 motherboards come
with HDMI 1.4 outputs that limit 4K output to 30Hz - probably fine for movies,
noticeable in the UI, and not great for games (even if rendered at lower than
native res). There are some current gen AM4 boards with HDMI 2.0 and native
4K60 output, but it's somehow a premium feature that is bizarrely reserved
for high end gaming motherboards.

The next generation of motherboard chipsets from AMD should remedy this, but
motherboards with them will not launch before April.

~~~
gascan
I'm a bit fuzzy on this, but I think any motherboard with DisplayPort 1.2 or
better will do the job. DisplayPort has been ahead of HDMI on the 4K60Hz game.

Picking a random NewEgg motherboard with AM4 & DisplayPort, it looks like it
would do the job (see Specs):

[https://www.newegg.com/Product/Product.aspx?Item=N82E1681313...](https://www.newegg.com/Product/Product.aspx?Item=N82E16813132988)

So does the cheapest board:

[https://www.newegg.com/Product/Product.aspx?Item=N82E1681314...](https://www.newegg.com/Product/Product.aspx?Item=N82E16813144135)

~~~
walrus01
I did some research on this before building my current system (see parent
comment). At the time I bought the parts, about two months ago, the A8 and A10
CPUs (pre-Ryzen generation) for AM4 looked like terrible options. Looking at
CPUs, the whole series of then-shipping Ryzens didn't have onboard video, and
I wanted a 1500X because I knew I could overclock it by at least 200-300 MHz
with little effort.

Almost went with an Intel 8th generation Core architecture and Intel's
integrated video. But the performance difference between Intel onboard video
and an $80 to $90 GeForce 1030 was significant. I wanted to build something
that would still be able to play ridiculously high bitrate HEVC.

------
Symmetry
I was going to say "I sort of wish I could get a Thinkpad or Latitude with one
of these in it", then I googled and realized that both will be coming out this
year.

~~~
karimf
Remember that this is the desktop APU, with a 65W TDP. In the article they
mentioned that AMD doesn't have any plans for a laptop version of this APU.

Edit: oops, they already released laptop APUs earlier, like the Ryzen 7 2700U.

~~~
masklinn
cTDP down to 46W sounds about OK for a "workstation"-type laptop.

Unclear whether it will be BGA-packaged though.

~~~
pjmlp
I can imagine seeing it on a Thinkpad P model.

------
myrandomcomment
I have been waiting for this for a new HTPC build that can also do a good job
as a gaming system. Now comes the fun of reading each motherboard spec in
detail to figure out if it can output 4K@60 (HDMI 2.0?). By fun, I mean pain,
as this info is hard to find on the boards I have looked at so far.

~~~
nnain
For 4K@60 or 1440p@144, you would need a motherboard with DisplayPort, not
just HDMI. That should eliminate quite a few.

~~~
zokier
Wrong.

[https://www.hdmi.org/press/press_release.aspx?prid=133](https://www.hdmi.org/press/press_release.aspx?prid=133)

> _September 4, 2013_ – IFA 2013 – HDMI Forum, Inc., a non-profit, mutual
> benefit corporation, today announced the release of Version 2.0 of the HDMI
> Specification. This latest HDMI Specification, the first to be developed by
> the HDMI Forum, offers a significant increase in bandwidth (up to 18Gbps) to
> support new features such as 4K@50/60 (2160p),
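
The bandwidth arithmetic backs this up. A rough sketch in Python, assuming uncompressed 24-bit RGB, ignoring blanking intervals, and applying the ~80% payload factor from HDMI's 8b/10b TMDS encoding:

```python
# Rough pixel data rates for 4K versus HDMI link bandwidth.

def pixel_rate_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s for an uncompressed RGB signal."""
    return width * height * fps * bits_per_pixel / 1e9

rate_4k60 = pixel_rate_gbps(3840, 2160, 60)  # ~11.9 Gbit/s
rate_4k30 = pixel_rate_gbps(3840, 2160, 30)  # ~6.0 Gbit/s

# Effective payload after 8b/10b encoding (~80% of the raw link rate).
hdmi_1_4 = 10.2 * 0.8   # ~8.16 Gbit/s
hdmi_2_0 = 18.0 * 0.8   # ~14.4 Gbit/s

print(f"4K60 needs ~{rate_4k60:.1f} Gbit/s; "
      f"fits HDMI 1.4: {rate_4k60 <= hdmi_1_4}, HDMI 2.0: {rate_4k60 <= hdmi_2_0}")
print(f"4K30 needs ~{rate_4k30:.1f} Gbit/s; fits HDMI 1.4: {rate_4k30 <= hdmi_1_4}")
```

So 4K60 overflows HDMI 1.4's payload but fits comfortably in HDMI 2.0's, which is why 1.4-only boards top out at 4K30 (DisplayPort 1.2's ~17.3 Gbit/s payload also clears the bar, hence the earlier suggestion).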

~~~
nnain
Ah, thanks for the check. I would still recommend getting a motherboard with
DisplayPort (DP) to future proof the core components. I bought a gaming
monitor 2 months back, and it came with only a DP cable. I see DP recommended
by most game rig builders.

------
hs86
I really like that AnandTech's print view is just the same layout but
everything on a single page.

------
nnain
This is such a breath of fresh air for PC gamers who have been waiting to
build a budget gaming machine, ever since the crypto-mining frenzy messed up
the GPU market. Most GPUs are selling at a 30%ish price hike right now!

~~~
xigency
PC building is a nightmare right now between RAM and GPU prices. It's clear
that they are not a valued demographic to parts manufacturers.

Anyway, I would recommend a Ryzen 3 + GTX 1050 at a minimum for a gaming
computer. The performance here on GTA V, a several-years-old game, is pretty
poor.

~~~
revelation
Actually, they are. Arguably it's the cryptominer fools that the part
manufacturers want nothing to do with, because chances are you ramp up
manufacturing now (so 3-6 months until you actually see higher supply) and by
then they are on to the next coin fad.

------
stcredzero
What are the chances that AMD will ever try to shut down the $200-$250
graphics card market?

~~~
jtuente
If they can socket a Vega 16+ APU, they might be able to put a dent in the
sub-$200 market. The Intel G platform uses Vega20 and Vega24 with HBM, but
they aren't socketed.

~~~
mizzack
Not gonna happen with DDR4. Clock for clock, the Vega8 and Vega11 go toe to
toe when OC'd, pointing to bandwidth starvation.

Maybe with a bigger package they could throw HBM2 on there... Something like a
Ryzen 7 2 CCX configuration (8c/16t) + Vega ~20 + 4GB HBM2 in a package that
fits the Threadripper/Epyc socket.

At that point you're stuck paying a premium for the X399 platform, though, so
you should probably just buy a dGPU :)
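
To put a rough number on the bandwidth starvation: a back-of-the-envelope sketch in Python. The dual-channel DDR4 figures follow from the standard (8 bytes per transfer per channel); the GT 1030 GDDR5 figure of ~48 GB/s is my recollection of its spec sheet, so treat it as an assumption.

```python
# Back-of-the-envelope memory bandwidth comparison: APU system RAM vs. a
# low-end discrete card's dedicated VRAM.

def ddr4_dual_channel_gbps(mt_per_s):
    """Peak bandwidth in GB/s: megatransfers/s * 8 bytes/transfer * 2 channels."""
    return mt_per_s * 8 * 2 / 1000

ddr4_2400 = ddr4_dual_channel_gbps(2400)  # 38.4 GB/s, shared by CPU and IGP
ddr4_3200 = ddr4_dual_channel_gbps(3200)  # 51.2 GB/s, shared by CPU and IGP

# Assumed from the GT 1030 spec sheet: 64-bit GDDR5 at 6 Gbps -> ~48 GB/s.
gt1030_gddr5 = 48.0

print(f"DDR4-2400 dual channel: {ddr4_2400:.1f} GB/s (shared)")
print(f"DDR4-3200 dual channel: {ddr4_3200:.1f} GB/s (shared)")
print(f"GT 1030 GDDR5:          {gt1030_gddr5:.1f} GB/s (dedicated)")
```

Even a fast DDR4 kit leaves the IGP with bandwidth in the same ballpark as a bottom-tier discrete card, and it has to share that with the CPU cores, which is why a couple of GB of HBM2 on package would change the picture.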

~~~
nerpderp83
Would you recommend overclocking memory for the best perf increase?

~~~
mizzack
RAM speed is definitely important for CPU and GPU perf, but much like other
Ryzen parts it hits a point where higher end kits provide little to no
benefit. 3200MHz is about where you start hitting seriously diminishing
returns, especially considering cost of high end RAM kits.

You can see a much larger gaming benefit by overclocking the IGP. In most
tests I saw today, an overclocked Vega8(1100MHz) in the $99 part easily
catches up to the stock Vega11(1250MHz) and can be pushed to ~1500MHz.

[https://www.kitguru.net/wp-content/uploads/2018/02/xRaven-3-...](https://www.kitguru.net/wp-content/uploads/2018/02/xRaven-3-Time-Spy-rev.png.pagespeed.ic.QuYRXcT33J.webp)

[https://tpucdn.com/reviews/AMD/Ryzen_5_2400G_Vega_11/images/...](https://tpucdn.com/reviews/AMD/Ryzen_5_2400G_Vega_11/images/memory_scaling2.jpg)

------
shmerl
Are there any Dell laptops with Ryzen APUs?

~~~
yndoendo
Not that I know of so far; they still only sell AMD A-XX laptops. I'm also
looking forward to an AMD Ryzen Mobile (U) laptop that can replace an XPS 13,
since Intel is not an option.

------
singularity2001
The one piece of news I'm hoping to hear from AMD is that they're producing a
deep learning TPU.

------
Quequau
I'm sorta surprised that AMD hasn't made a "smallest Zen with double the
biggest Vega" to appeal to the cryptocoin crowd.

~~~
simion314
I think everyone is expecting mining to fall, and they don't want to invest in
the R&D, set up the manufacturing, and build the hardware only for it not to
sell in the end. Mining seems like a gamble, so I would personally not risk
it; if the miners want, they can buy the regular GPUs.

~~~
cslarson
I recently (yesterday) calculated[0] that Ethereum is roughly 64% of the GPU
mining market by power. That's a lower bound - it's likely much higher, maybe
75+%. Hybrid POS/POW is expected in the fall and will likely come with a
healthy drop in the block reward. Following that will be full POS (2019?)
after which, at least on the main Ethereum chain, there will be no mining. I'd
like to see the block reward dropped much sooner but that's a long shot
request.

[0]:
[https://www.reddit.com/r/ethereum/comments/7wufiq/ethereum_i...](https://www.reddit.com/r/ethereum/comments/7wufiq/ethereum_is_64_of_all_gpu_mining_by_power_09gw_of/)

------
polskibus
What's the cryptocurrency mining efficiency vs price ratio like? Is this going
to be the new Rx 580 in terms of popularity?

~~~
jlebrech
Rubbish for crypto, but good value for games, hopefully.

~~~
baldfat
Why can't we just have Intel or AMD design the ultimate mining chip, a
single-purpose part that does nothing but mine? :)

~~~
jlebrech
Or a free socket, or maybe an NVMe mining chip?

