
The AMD Radeon RX Vega 64 and RX Vega 56 Review: Vega Burning Bright - robin_reala
http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review
======
lhl
I ordered a $499 RX Vega 64 from Amazon at 9AM this morning, mainly to show
support for AMD's open source Linux GPU efforts [1] and because there should be
pretty good compute potential, but I have to admit to being fairly
disappointed by the power consumption and general performance (only a ~30%
improvement over 2-year-old Fiji despite a shift from 28nm to 14nm and much-touted
architectural improvements [2], vs Nvidia's ~70% uplift launched over a year
ago).

Interesting notes from the reviews: HBCC is not enabled by default, but it
doesn't give much of a performance boost anyway [3].

According to Computerbase [4] (DE), primitive shaders are still inactive:
[https://www.computerbase.de/2017-08/radeon-rx-vega-64-56-tes...](https://www.computerbase.de/2017-08/radeon-rx-vega-64-56-test/)
(an RTG engineer says they should be used automatically in the future [5]).

One interesting note: while initial supplies look to have sold out, early
testing shows that out-of-the-box Vega is pretty awful for mining right
now [6]. Vega 64 is hashing at about 30 MH/s at about 300W, while a flash-BIOS'd
RX 470 can get 28-30 MH/s at ~150W.
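
A quick back-of-envelope efficiency comparison using just the figures above (a sketch, not measured data; real hash rates vary with BIOS, memory clocks and drivers):

```python
# Rough hashing-efficiency comparison from the approximate figures above.
cards = {
    "RX Vega 64 (stock, launch drivers)": (30, 300),  # ~30 MH/s at ~300 W
    "RX 470 (flashed BIOS)":              (29, 150),  # 28-30 MH/s at ~150 W
}

for name, (mh_s, watts) in cards.items():
    print(f"{name}: {mh_s / watts:.2f} MH/s per watt")

# Approx. output: Vega 64 ~0.10 MH/s/W vs RX 470 ~0.19 MH/s/W, i.e. launch-day
# Vega does roughly half the mining work per watt of a cheap flashed RX 470.
```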

[1] [https://www.phoronix.com/scan.php?page=article&item=rx-vega-...](https://www.phoronix.com/scan.php?page=article&item=rx-vega-linux1&num=1)

[2] [http://www.pcgamer.com/the-amd-radeon-rx-vega-56-and-vega-64...](http://www.pcgamer.com/the-amd-radeon-rx-vega-56-and-vega-64-review/)

[3] [http://www.guru3d.com/articles_pages/amd_radeon_rx_vega_56_8...](http://www.guru3d.com/articles_pages/amd_radeon_rx_vega_56_8gb_review,31.html)

[4] [https://www.computerbase.de/2017-08/radeon-rx-vega-64-56-tes...](https://www.computerbase.de/2017-08/radeon-rx-vega-64-56-test/)

[5] [https://mobile.twitter.com/ryszu/status/896304786307469313](https://mobile.twitter.com/ryszu/status/896304786307469313)

[6] [https://nl.hardware.info/reviews/7517/23/amd-radeon-rx-vega-...](https://nl.hardware.info/reviews/7517/23/amd-radeon-rx-vega-56--64-review-minder-euros-meer-watts-testresultaten-amd-radeon-rx-vega-voor-coin-mining)

~~~
Nursie
And a non-flashed 1070 can get around 30 at 120W(ish)...

I guess the miners are hoping that a driver update will unleash the rumoured
70+ MH/s. If it doesn't materialise, I can imagine these being sold on fairly
quickly, given that power consumption.

~~~
ihsw2
I wouldn't be surprised if the performance is artificially hindered until
later driver updates, to prevent them from being gobbled up by miners.

~~~
riffraff
Why is it an issue if miners gobble them up? A sale is a sale, no?

~~~
mastax
Miners tend to have very high rates of warranty returns, due to running the
cards 24x7x365. In some markets, manufacturers are trying to sell dedicated
mining cards which lack display outputs and carry short warranties. Many
markets (US, EU) don't let you void a warranty just for using a product
24x7, so that approach doesn't work everywhere.

~~~
mamon
They could base GPU warranties on usage, the same way car part warranties are
based on mileage. In fact, Samsung already does this for SSDs: they have a
data write limit of 400 TB.

------
slizard
RX Toaster. They need to pull some magic out of the chip with upcoming
drivers, otherwise this won't be the big return everyone expected, I feel.

I just hope the MI25 cards will be far cheaper than the $6k Tesla P100s and
that the ROCm stack is good enough, so that for compute the architecture may
still be a viable and very competitive option.

~~~
microcolonel
On the Open Source Linux OpenGL drivers (but not the proprietary ones), it is
competitive with the 1080Ti at considerably lower MSRP and real retail price
(once stocks recover from the most hyped GPU launch since... the GeForce
9800 GTX? I don't even know). Power-to-performance ratios are also not well
represented by the current shipping proprietary drivers; frankly, I don't
understand how AMD can't seem to get that team to deliver stellar day-one
drivers for every launch, but it is what it is, I guess.

~~~
slizard
> On the Open Source Linux OpenGL drivers (but not the proprietary ones), it
> is competitive with the 1080Ti at considerably lower MSRP

Yaaay! Really, that's cool but it's a negligible fraction of the market.
Unless Linux gaming becomes a market force overnight, it won't matter a lot, I
think.

> Power to performance ratios are also not well shown with the current
> shipping proprietary driver

I doubt they can close the large power gap that the initial benchmarks show
(compared to the 1080 and esp. 1070), but I'm hoping the power margin will get
narrower and a bit of performance increase might compensate too.

Indeed, I am tempering my expectations. I'm most hopeful that in the high-end
GPGPU compute arena they might not be too late, and will provide enough
perf/W/buck to gain traction.

------
gcp
"Surprisingly, native FP16 operations are not currently exposed to OpenCL"

Given that these cards have the potential to crash the price/perf of current
deep learning solutions, this is more than regrettable.
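
For context, "exposing" FP16 in OpenCL usually means the driver advertises the cl_khr_fp16 extension so kernels can do arithmetic in the half type. A minimal way to check what a given driver reports (a sketch, assuming pyopencl is installed):

```python
# Minimal check for native half-precision support in OpenCL drivers.
# Assumes the pyopencl package is installed; output depends on your driver.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        extensions = device.extensions.split()
        has_fp16 = "cl_khr_fp16" in extensions
        print(f"{device.name}: cl_khr_fp16 {'present' if has_fp16 else 'missing'}")

# If cl_khr_fp16 is missing, a kernel that enables the extension and uses
# `half` arithmetic will fail to build, and FP16 work has to go through
# float conversions instead of the native double-rate path.
```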

~~~
gwern
This is the sort of decision which scares away deep learning people: when the
only hardware advantage is not supported at launch, it tells us that AMD
really is not serious about good support and that the stack cannot be trusted.

~~~
gcp
I regrettably agree there. I spent time doing OpenCL implementations of some
of our stuff, but most recent decisions from AMD's side seem to indicate
they'd rather get rid of it as well, in favor of some CUDA compatibility
layers.

This isn't very encouraging to continue supporting AMD cards. Might as well
drop them entirely until their CUDA compatibility is more mature.

------
nsxwolf
Impossible for me to get excited about this. What are these really going to
sell for after the introductory stocks run out - $1000?

Can't wait for this crypto madness to end.

~~~
manaskarekar
I feel the other way: I want power users to buy these in bulk and dump them
for cheap on eBay once they're no longer powerful enough for mining.

Of course, many factors affect the final price on eBay and it's also about
whether you need one today.

~~~
nsxwolf
Right now it looks like the RX 480s and 580s get dumped for MSRP. After being
abused like that.

------
phkahler
Where are Quantum [1] and the EHP [2]? Since we have Epyc with 4 dies, and
Threadripper with 2 dies and 2 placeholders, it's time for AMD to make a
multi-chip module with Ryzen, Vega, and 2 stacks of HBM2 memory. That could be
packaged into an awesome SFF PC, much like their old Quantum concept.

[1] [http://wccftech.com/amd-project-quantum-not-dead-zen-cpu-veg...](http://wccftech.com/amd-project-quantum-not-dead-zen-cpu-vega-gpu/)

[2] [http://wccftech.com/amd-exascale-heterogeneous-processor-ehp...](http://wccftech.com/amd-exascale-heterogeneous-processor-ehp-apu-32-zen-cores-hbm2/)

And please don't illuminate it with plain LED light, use laser diodes so we
get that ethereal look of the interference pattern ;-)

------
Matthias247
Does anyone else also think the bundles are mostly a bad deal for the
customers, and maybe only a good one for AMD?

First of all, the monitor. The chances that I need one are fairly small, and
if I did, why would I want exactly that one? I personally think the resolution
(and especially the pixel density) is below what I would want now, especially
with so much GPU power available. A 4K screen with FreeSync would be more
attractive.

Then the CPU/board. The chances that one could use that are a little higher,
but I'd guess fewer than 40% of buyers actually need a CPU and board. If we
restrict it to Ryzen 1700 options, the number is even lower, since a 1600X
might be the more interesting option for many gamers, and of course some may
prefer Intel. Even if we treat the CPU/board as the main benefit of the
bundle, it's still mostly a zero-sum game: you pay $100 extra to get a $100
discount. I could most likely get a better overall deal by buying the GPU and
CPU separately, each from the shop with the best price.

When we finally factor in the games, they might be a small benefit for a small
number of buyers. But their value is really small, since Prey has been out for
a while and can already be had at a discount.

If demand for the bundles is as low as I think, and AMD really wants to sell
most of the cards in bundled form (as AnandTech predicts), then prices for all
cards will mostly trend towards the bundle prices due to availability.

~~~
benchaney
The reason for the bundles is that they are trying to prevent purchases by
miners. I do agree that they aren't a perfect solution.

------
JohnTHaller
"Relative to the GeForce GTX 1080, we’ve seen power measurements at the wall
anywhere between 110W and 150W higher than the GeForce GTX 1080, all for the
same performance."

The Vega 64 has similar performance to the GTX 1080 for a similar price (if
not for the fact that the new Radeons will never be in stock due to mining),
but with much higher power consumption.
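
For scale, here's what that 110-150W wall-power delta costs in electricity (a rough sketch; the hours per day and price per kWh are assumptions, not figures from the review):

```python
# Rough yearly electricity cost of drawing an extra 110-150 W at the wall.
# ASSUMPTIONS (not from the review): 4 hours of gaming per day, $0.12/kWh.
hours_per_day = 4
price_per_kwh = 0.12

for extra_watts in (110, 150):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    cost = kwh_per_year * price_per_kwh
    print(f"+{extra_watts} W -> ~{kwh_per_year:.0f} kWh/year, ~${cost:.0f}/year")

# At these assumptions: +110 W is ~161 kWh (~$19/year), +150 W is ~219 kWh (~$26/year).
```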

------
shmerl
Why are they so power hungry? Vega looks very promising for Linux gaming, but
that power consumption is just way overboard.

~~~
zanny
It is a trend that AMD overvolts their cards to push clocks high enough to
compete with Nvidia. Undervolted, every GCN generation comes in line with
Nvidia on performance per watt; the problem is that the actual performance is
then too low to be impressive.

In the end, the numbers win. AMD generally makes one series of dies and sells
them to both the consumer and enterprise segments, whereas Nvidia disables
features at the hardware level when a die goes into a gaming card, to eke more
efficiency out of it. So if you're at a slight disadvantage, cranking up the
power draw to reach performance parity almost surely sells more units than
being just as efficient but performing worse per die area (because wafers are
a near-fixed cost).

------
013a
This is unfortunately very disappointing. It practically can't even compete
with a 1080 at the same price, and that card has been out for a year now.

Really goes to show how far ahead Nvidia is.

~~~
bryanlarsen
Only on DirectX 11 games, which I could care less about. When I buy a video
card, I expect it to last 2-3 years, which means that the most important
benchmarks are the DX12 and Vulkan ones. Vega has a significant lead on those.

~~~
overcast
If you could care less, then you do care at least somewhat. There are still
PLENTY of DX11 games to play, considering a lot of us have backlogs the size
of Texas. DX12 titles are very limited right now.

~~~
shmerl
Vulkan titles will overtake them.

~~~
dingo_bat
Why do you think game studios will support Vulkan when they can just target
DX12 and get PC and Xbox support?

~~~
shmerl
They already do. There are more games out with Vulkan support than with DX12,
as far as I know. The "why" has many variables, but the simple answer is that
Windows 7 is still a common target (and it has no DX12 support). More deeply,
forward-looking studios don't want to fall into hard MS lock-in, and a growing
number of studios support Linux as well.

There was a good write-up from the Star Citizen developers on this topic; they
decided to ditch DX12 for Vulkan.

~~~
lagadu
Can you link a few? I feel like I'm missing something here because the only
AAA game out with Vulkan support I can recall is Doom.

~~~
shmerl
Just some:
[https://en.wikipedia.org/wiki/List_of_games_with_Vulkan_supp...](https://en.wikipedia.org/wiki/List_of_games_with_Vulkan_support)

Plus, more games for Nintendo Switch will be using Vulkan. The same goes for
games built on common engines like Unity, Unreal and CryEngine (that's for the
future; they have basically just added, or are adding, Vulkan support).

------
Nursie
Well I bit the bullet and ordered one of these, at the "special introductory
price" of _only_ as much as an nVidia GTX 1080...

The compute power seems pretty good from that review. Yes, I will be mining
with it, so I'm hopeful of good results. I won't just be mining with it,
though, so I hope the game performance is worth it too!

~~~
sgarman
Don't you have to carefully balance your mining speed and power consumption to
be profitable? This seems like the worst card for that.

~~~
Nursie
If you're buying the card purely for mining, sure.

If you want one anyway (i.e. you don't necessarily put the card cost in your
sums), then the returns ought to be better than just the electricity costs.
Especially if (say) mining to hold rather than sell straight off.
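
A sketch of that "card cost is sunk" arithmetic (the revenue-per-MH figure is a placeholder assumption; real revenue moves daily with coin price and difficulty):

```python
# Sketch of the "card cost is sunk" mining argument above.
# ASSUMPTIONS: ~30 MH/s at ~300 W (launch-day figures), $0.12/kWh, and a
# placeholder revenue rate -- real USD per MH/s per day changes constantly.
hash_rate_mh = 30
power_watts = 300
price_per_kwh = 0.12
revenue_per_mh_per_day = 0.10   # placeholder assumption, USD per MH/s per day

daily_revenue = hash_rate_mh * revenue_per_mh_per_day
daily_power_cost = power_watts / 1000 * 24 * price_per_kwh

print(f"revenue ~${daily_revenue:.2f}/day")
print(f"power   ~${daily_power_cost:.2f}/day")
print(f"margin  ~${daily_revenue - daily_power_cost:.2f}/day (ignoring the card cost)")
```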

There are also rumours of outrageous hash rates being possible - multiples of
the current nVidia hash rates. But we'll see if they ever come to fruition.

------
popol12
I really wonder if miners are going to buy the whole stock despite the game
bundle program and the low power efficiency.

~~~
jrs95
I doubt it; a 1070 is still better performance per dollar once power
consumption is figured in.

------
coldcode
I wonder which model Apple will use in the iMac Pro.

~~~
johnbehnke
I think Apple quotes 11 TFLOPS single precision and 22 TFLOPS half precision,
which puts the GPU at roughly a Vega 56, I think? I don't think Apple has said
how many GPU configurations there will be, so you may be able to bump it to a
64.
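
Those figures are consistent with the usual peak-throughput formula, peak FLOPS = 2 (FMA) x shaders x clock, with Vega's packed math doubling it for FP16. A quick back-solve (a sketch; the 3584-shader count is the desktop Vega 56 configuration, and the clock is derived from Apple's number rather than a published spec):

```python
# Back-solve the clock implied by Apple's 11 TFLOPS single-precision figure,
# assuming the desktop Vega 56 shader count (an assumption for the iMac Pro part).
shaders = 3584
fp32_tflops = 11.0   # Apple's quoted single-precision number

clock_ghz = fp32_tflops * 1e12 / (2 * shaders) / 1e9   # 2 FLOPs per shader per clock (FMA)
fp16_tflops = 2 * fp32_tflops                           # Vega packed math: 2x rate at half precision

print(f"implied clock: ~{clock_ghz:.2f} GHz")       # ~1.53 GHz
print(f"FP16 peak:     ~{fp16_tflops:.0f} TFLOPS")  # 22 TFLOPS, matching Apple's figure
```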

~~~
aseipp
You're correct -- the specs page of the new iMac Pro is intentionally vague,
but they do confirm this much at least: [https://www.apple.com/imac-pro/specs/](https://www.apple.com/imac-pro/specs/)

(The 64-variant will actually have 16GB HBM2, instead of 8, though.)

------
yohann305
Their stock price jumped 8% today; I wonder if it's too late to get in?

~~~
yclept
AMD has been bouncing up and down between ~10-14 since the Ryzen hype.

