
AMD Radeon VII Review: An Unexpected Shot at the High End - gbrown_
https://www.anandtech.com/show/13923/the-amd-radeon-vii-review
======
tutanchamun
EDIT: if the chart in the AnandTech review is accurate, I have to take back what I
wrote below. I didn't know the AMD Radeon Instinct MI50 (6.7 TFLOPS FP64, 16
GB of memory with ECC, and the same 1 TB/s bandwidth) only costs 1000 USD; I
expected it to be priced closer to the FirePro cards. So if you don't need
active cooling, the MI50 seems like an even better choice.

It's good that AMD reversed their decision on FP64, which means the card does 3.5
TFLOPS of double precision.

The only real competitor in the consumer space seems to be the Nvidia Titan V,
which has 6.9 TFLOPS, 12 GB of memory at 652.8 GB/s, and costs 3000 USD.

The Radeon VII has only half the FP64 TFLOPS, but 16 GB at 1024 GB/s, and
costs only 700 USD.

I don't have a use case for double precision but I'm sure people who need it
will like the card.
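
For anyone curious where the 3.5 TFLOPS figure comes from, here's a back-of-the-envelope sketch. The shader count, boost clock, and 1:4 FP64 rate are my assumptions about Vega 20, not numbers taken from the review:

    // A minimal sketch, assuming 60 CUs x 64 stream processors, an ~1.8 GHz
    // boost clock, and FP64 at 1/4 the FP32 rate (assumptions, not article figures).
    #include <cstdio>

    int main() {
        const double stream_processors = 60 * 64;   // 3840
        const double boost_ghz         = 1.8;
        const double flops_per_clock   = 2.0;       // one FMA = 2 FLOPs
        const double fp64_ratio        = 1.0 / 4.0;

        const double fp32_tflops = stream_processors * flops_per_clock * boost_ghz / 1e3;
        const double fp64_tflops = fp32_tflops * fp64_ratio;
        printf("FP32 ~%.1f TFLOPS, FP64 ~%.1f TFLOPS\n", fp32_tflops, fp64_tflops);
        // Prints roughly 13.8 and 3.5, in line with the numbers above.
        return 0;
    }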

~~~
throwaawyw
The only disappointment is that AMD's software stack is very lacking for
compute. While Nvidia commands absurd prices for the 2080 Ti with a mere 11 GB
of GDDR6, Radeons with 16 GB of HBM2 are closer to the memory spec of the
Titan V / V100.

OpenCL, while being a much cleaner API, suffers badly from the lack of quality
libraries. The requirements aren't even that large for the killer application
of DL: BLAS, a few convolution kernels, and random number generation. Darknet
(of YOLO fame), for instance, interfaces with CUDA through fewer than 10-15
functions.
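
To make that concrete, here's a rough sketch of what such an interface looks like. The names and signatures below are hypothetical, just meant to show how small the GPU-facing surface of a Darknet-style framework can be:

    // Hypothetical declarations, roughly the shape of the GPU primitives a
    // Darknet-style framework needs from its backend.
    #include <cstddef>

    void gemm_gpu(bool trans_a, bool trans_b, int m, int n, int k,
                  float alpha, const float* a, int lda,
                  const float* b, int ldb,
                  float beta, float* c, int ldc);                  // BLAS matrix multiply
    void im2col_gpu(const float* img, int channels, int h, int w,
                    int ksize, int stride, int pad, float* cols);  // lowers conv to GEMM
    void axpy_gpu(int n, float alpha, const float* x, float* y);   // y += alpha * x
    void scal_gpu(int n, float alpha, float* x);                   // x *= alpha
    void fill_gpu(int n, float value, float* x);                   // constant fill
    void rand_gpu(int n, float* x, unsigned long long seed);       // RNG for init/dropout
    void activate_gpu(int n, float* x);                            // elementwise nonlinearity
    void softmax_gpu(const float* in, int n, int batch, float* out);
    void push_gpu(float* device, const float* host, size_t n);     // host -> device copy
    void pull_gpu(const float* device, float* host, size_t n);     // device -> host copy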

AMD's efforts in this regard have not been very satisfactory IMO. In fact,
third-party work such as CLBlast is of much higher code quality than AMD's own
clBLAS (or was). Sadly, AMD over the past few years seems to have come under
the belief that building a transpiler from CUDA code to AMD's would be the
magic bullet. It remains to be seen whether this will suffice, but it does
seem like a quixotic solution to a straightforward problem.

Had such foundational software libraries (which, mind you, require a lot of
hardware-tailored optimization) been perfected over the past few years (even
while being multiple years behind NVDA), they could easily have turned the
game, more so considering NVDA's shady tactic of banning GTX/RTX cards in the
cloud.

I don't know what went wrong. It's not like AMD didn't have engineers working
on ML applications; the direction just seems misguided. AMD's engineers
(unlike Nvidia's) seem to have focused on porting frameworks onto low-quality
libraries rather than securing the foundation. I'm afraid the ROCm effort will
go the same way.

~~~
dman
Could not agree with you more -- tracking the AMD compute story has been
painful. Talk about upstream TensorFlow / Torch support has been going on
forever. Reading between the lines, OpenCL no longer seems to be the favored
way to do compute, so it's unclear, if I were starting today, what the
canonical forward-looking path is for writing compute on AMD cards.

~~~
dragontamer
> if I were starting today, what the canonical forward-looking path is for
> writing compute on AMD cards.

HCC and HIP seem to be the canonical compute path forward, with ahead-of-time
compiled OpenCL supported as well.

There are some "#pragma omp target" tests in the GitHub repos, so AMD is
clearly working on that as well. If HCC gets OpenMP support, it will be
significantly easier for typical programmers to write GPU code.
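
For reference, target offload looks something like this. A minimal sketch; whether AMD's toolchain compiles it today is exactly the open question:

    // Minimal OpenMP 4.5+ target-offload sketch: a vector add that the
    // compiler may run on the GPU if a target device is available.
    #include <cstdio>
    #include <vector>

    int main() {
        const int n = 1 << 20;
        std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);
        float *pa = a.data(), *pb = b.data(), *pc = c.data();

        #pragma omp target teams distribute parallel for \
            map(to: pa[0:n], pb[0:n]) map(from: pc[0:n])
        for (int i = 0; i < n; ++i)
            pc[i] = pa[i] + pb[i];

        printf("c[0] = %f\n", pc[0]);   // 3.0 if the offloaded loop ran
        return 0;
    }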

\--------

If I were starting a project today, it'd be OpenCL 1.2 on the AMDGPU-PRO
drivers (the old proprietary driver is more stable for sure), but with a
careful eye on HCC / HIP advancements. HCC and HIP have gotten a huge amount
of attention / GitHub commits in the past year, so that's clearly where AMD's
main effort is going.
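
For anyone who hasn't looked at HIP: it's essentially the CUDA programming model with the names swapped. A minimal sketch, compiled with hipcc, error checking omitted:

    // Minimal HIP vector add -- the HIP runtime calls mirror the CUDA ones.
    #include <hip/hip_runtime.h>
    #include <cstdio>
    #include <vector>

    __global__ void vadd(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n, 0.0f);

        float *da, *db, *dc;
        hipMalloc(&da, n * sizeof(float));
        hipMalloc(&db, n * sizeof(float));
        hipMalloc(&dc, n * sizeof(float));
        hipMemcpy(da, ha.data(), n * sizeof(float), hipMemcpyHostToDevice);
        hipMemcpy(db, hb.data(), n * sizeof(float), hipMemcpyHostToDevice);

        hipLaunchKernelGGL(vadd, dim3((n + 255) / 256), dim3(256), 0, 0,
                           da, db, dc, n);
        hipDeviceSynchronize();

        hipMemcpy(hc.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
        printf("c[0] = %f\n", hc[0]);   // 3.0

        hipFree(da); hipFree(db); hipFree(dc);
        return 0;
    }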

------
IloveHN84
For me, $700 is still high end.

Not long ago the high end started above $500, and prices climbed from there
thanks to cryptomining.

With $200-300 you could buy a modest graphics card that would last 3 years
playing top titles. Now, in that price range, you can keep up for maybe a year
and then it's basically trash.

~~~
Synaesthesia
I think an RX 580 or Nvidia 1060 is still perfectly good for modern gaming and
can be picked up for $200 or less.

~~~
holtalanm
The RX 580 8GB is good. The 4GB version lags behind a little -- VRAM spills
into system RAM regularly.

I assume the 1060 3GB has a similar problem.

Source: I have the 4GB version and am currently saving up to upgrade.

Edit: also, the RX 580 4GB generally runs at a lower clock than its 8GB
counterpart, which affects performance as well.

~~~
skohan
Which titles are using more than 4GB these days?

~~~
dragontamer
[https://techreport.com/r.x/2019_02_06_AMD_s_Radeon_VII_graph...](https://techreport.com/r.x/2019_02_06_AMD_s_Radeon_VII_graphics_card_reviewed/memory.jpg)

That chart shows per-game VRAM usage. There have been some YouTubers looking
into the "pop-in" effect on GPUs with less VRAM. Basically, GPUs will render a
lower texture level if the full texture doesn't fit in VRAM, so the failure
case is pretty smooth these days and hard to notice.

Basically, the mountain range in the background may only be a 2k texture, but
may pop into a 4k texture after a bit. Because the 2k texture is a mipmap of
the 4k texture, it's actually hard to notice. So you can stretch lower-VRAM
cards these days better than you'd expect.
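
Rough numbers for why falling back to a lower mip saves so much. The texture size and format here are illustrative, not from any particular game:

    // Rough VRAM cost of one RGBA8 texture including its full mip chain.
    #include <cstdio>
    #include <cstddef>

    size_t mip_chain_bytes(size_t w, size_t h, size_t bytes_per_texel = 4) {
        size_t total = 0;
        for (;;) {
            total += w * h * bytes_per_texel;
            if (w == 1 && h == 1) break;
            w = w > 1 ? w / 2 : 1;
            h = h > 1 ? h / 2 : 1;
        }
        return total;
    }

    int main() {
        const double mib = 1024.0 * 1024.0;
        printf("2k chain: %.1f MiB\n", mip_chain_bytes(2048, 2048) / mib); // ~21.3 MiB
        printf("4k chain: %.1f MiB\n", mip_chain_bytes(4096, 4096) / mib); // ~85.3 MiB
        // Streaming in the 4k top level roughly quadruples the cost of that one
        // texture, which is why engines quietly keep the lower mips when VRAM is tight.
        return 0;
    }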

Another thing: just because VRAM is allocated doesn't necessarily mean it's
being used. If you look at a wall in any video game, all the textures behind
it are going unused. Taken to the extreme: there are people and objects
"behind" a lot of geometry that you'll never see in the game, but whose
textures are still allocated in VRAM. As such, a game may allocate 9GB total
but not really need the last 2 or 3 GB. So even if a game reports 9GB or 10GB
of usage, you may get by with an 8GB card.

\-------------

In any case, we're long past the 4GB VRAM mark. Practically speaking, games
require 6GB at least and may break 8GB in the near future IMO. 8GB seems to be
sufficient for now, though (thanks to pop-in, VRAM overallocation, etc.).

------
piinbinary
It seems that this has a better process (7nm vs. 12nm), more memory (and
bandwidth), and more power draw than the RTX cards.

So why does it have equal/lower performance? Is it just that it has fewer
transistors (13.2B vs. 18.6B)? (And if so, why doesn't the smaller process
allow more transistors?) Or is there also a difference in the architecture
where a better design could get more performance out of the same amount of
silicon?

~~~
metalliqaz
There are so many variables. A couple you didn't mention in your comment are
the number of shaders/cores and the core frequency.

However there's one other thing to consider for games. Essentially every big
video game has a special profile in the Nvidia drivers that optimizes
performance for that specific game. Gaming card reviews almost always use
popular AAA games for benchmarking. It's possible and even likely that each
was developed with Nvidia cards in mind, in coordination with Nvidia to
optimize those drivers.

~~~
penagwin
It depends on the game; many games (especially older ones) would say POWERED
BY NVIDIA or whatnot. Usually specific series get special treatment: Hitman
goes AMD, The Witcher goes Nvidia, and so on. To my knowledge that generally
means A) they're using HairWorks or something similar, and B) they likely got
to work with that company for extra optimization.

Then I believe the drivers also optimize games without the developers'
involvement (but I would assume this isn't as effective as working directly
with that company).

But generally certain games "work better" on AMD or Nvidia based on whoever
they worked with / built for, though the majority are "better" on Nvidia for
various reasons.

~~~
rasz
In Nvidia's case it also means the game will force certain features known to
not work as well on AMD -- for example, tessellation in Crysis 2 going down to
many triangles per pixel just to make sure AMD cards struggle
[https://techreport.com/review/21404/crysis-2-tessellation-
to...](https://techreport.com/review/21404/crysis-2-tessellation-too-much-of-
a-good-thing/2), or invisible tessellated water under the main geometry
[https://techreport.com/review/21404/crysis-2-tessellation-
to...](https://techreport.com/review/21404/crysis-2-tessellation-too-much-of-
a-good-thing/3). Other times Nvidia will "help" the developer drop a DirectX
feature it doesn't like, as with Assassin's Creed's removal of DX10.1 after
joining Nvidia's "The Way It's Meant to Be Played" program.

Optimizing drivers is one thing. Sabotaging libraries, and then forcing said
libraries on developers under the guise of "support" / cross-promotion (read:
Nvidia pays the developer in advance), is another.

------
bryanlarsen
The big surprise is the disabled PCIe 4.0 support. It makes sense in
isolation, but since they plan on launching Zen 2 ASAP with PCIe 4.0 as one of
its flagship features, the two would have been very complementary. Doesn't the
left hand of AMD talk to the right?

~~~
dragontamer
AMD's gotta have a reason to sell the higher-end MI50.

~~~
bryanlarsen
They could have sold a ton of Zen 2 / Radeon VII bundles. Instead they chose
to sell a few dozen more MI50s.

------
shmerl
The pricing puts it outside of what I'd consider a gaming-oriented card. The
same can be said about Nvidia's RTX cards, by the way.

Nice performance though; it's the highest-performance "gaming" card with open
drivers today.

------
tutanchamun
Taken from the other thread:

Other reviews:

\- (guru3d) [https://www.guru3d.com/articles-pages/amd-radeon-
vii-16-gb-r...](https://www.guru3d.com/articles-pages/amd-radeon-vii-16-gb-
review,1.html)

\- (tomshardware) [https://www.tomshardware.com/reviews/amd-radeon-vii-
vega-20-...](https://www.tomshardware.com/reviews/amd-radeon-vii-
vega-20-7nm,5977.html)

\- (techpowerup)
[https://www.techpowerup.com/reviews/AMD/Radeon_VII/](https://www.techpowerup.com/reviews/AMD/Radeon_VII/)

\- (arstechnica) [https://arstechnica.com/gaming/2019/02/amd-radeon-
vii-a-7nm-...](https://arstechnica.com/gaming/2019/02/amd-radeon-vii-a-7nm-
long-step-in-the-right-direction-but-is-that-enough/)

\- (gamersnexus) [https://www.gamersnexus.net/hwreviews/3437-amd-radeon-vii-
re...](https://www.gamersnexus.net/hwreviews/3437-amd-radeon-vii-review-not-
ready-for-launch)

\- (gizmodo) [https://gizmodo.com/amds-radeon-vii-is-a-solid-gaming-
card-b...](https://gizmodo.com/amds-radeon-vii-is-a-solid-gaming-card-but-
thats-just-1832412200)

\- (gamespot) [https://www.gamespot.com/articles/radeon-vii-review-can-
amds...](https://www.gamespot.com/articles/radeon-vii-review-can-amds-new-
card-handle-4k-pc-g/1100-6464872/)

\- (digitaltrends) [https://www.digitaltrends.com/computing/amd-radeon-vii-
revie...](https://www.digitaltrends.com/computing/amd-radeon-vii-review/)

\- (extremetech) [https://www.extremetech.com/computing/285286-amd-radeon-
vii-...](https://www.extremetech.com/computing/285286-amd-radeon-vii-review-
this-isnt-the-7nm-gpu-youre-looking-for)

\- (pcper) [https://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-
VII-...](https://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-VII-Review-
Supercharged-Vega)

\- (techspot) [https://www.techspot.com/review/1789-amd-radeon-
vii/](https://www.techspot.com/review/1789-amd-radeon-vii/)

\- (tweaktown) [https://www.tweaktown.com/reviews/8894/amd-radeon-vii-
review...](https://www.tweaktown.com/reviews/8894/amd-radeon-vii-review-team-
red-back-enthusiast-gpus/index.html)

\- (engadget) [https://www.engadget.com/2019/02/07/amd-radeon-vii-review-
vi...](https://www.engadget.com/2019/02/07/amd-radeon-vii-review-
video-4k-benchmarks/)

\- (pcmag) [https://www.pcmag.com/review/366382/amd-radeon-
vii](https://www.pcmag.com/review/366382/amd-radeon-vii)

\- (techradar) [https://www.techradar.com/reviews/amd-radeon-
vii](https://www.techradar.com/reviews/amd-radeon-vii)

\- (hothardware) [https://hothardware.com/reviews/amd-radeon-vii-review-and-
be...](https://hothardware.com/reviews/amd-radeon-vii-review-and-benchmarks)

\- (kitguru) [https://www.kitguru.net/tech-news/zardon/no-custom-radeon-
vi...](https://www.kitguru.net/tech-news/zardon/no-custom-radeon-vii-at-
launch-amd-say-partners-free-to-make-them/?PageSpeed=noscript)

\- (hardwarezone) [https://www.hardwarezone.com.sg/review-amd-radeon-vii-
review...](https://www.hardwarezone.com.sg/review-amd-radeon-vii-review-
finally-high-end-card-amd)

\- (rockpapershotgun) [https://www.rockpapershotgun.com/2019/02/07/amd-
radeon-7-rev...](https://www.rockpapershotgun.com/2019/02/07/amd-
radeon-7-review/)

\- (hexus) [https://hexus.net/tech/reviews/graphics/126752-amd-radeon-
vi...](https://hexus.net/tech/reviews/graphics/126752-amd-radeon-vii/)

\- (bit-tech) [https://bit-tech.net/reviews/tech/graphics/amd-radeon-vii-
re...](https://bit-tech.net/reviews/tech/graphics/amd-radeon-vii-review/1/)

\- (legitreviews) [https://www.legitreviews.com/amd-radeon-vii-16gb-video-
card-...](https://www.legitreviews.com/amd-radeon-vii-16gb-video-card-
review_210489)

\- (overclock3d)
[https://www.overclock3d.net/reviews/gpu_displays/amd_radeon_...](https://www.overclock3d.net/reviews/gpu_displays/amd_radeon_vii_review/1)

\- (techgage) [https://techgage.com/article/amd-radeon-
vii-1440p-4k-ultrawi...](https://techgage.com/article/amd-radeon-
vii-1440p-4k-ultrawide-gaming-performance/)

\- (computerbase - google translate)
[https://translate.google.com/translate?sl=auto&tl=en&u=https...](https://translate.google.com/translate?sl=auto&tl=en&u=https%3A%2F%2Fwww.computerbase.de%2F2019-02%2Famd-
radeon-vii-test%2F3%2F%23abschnitt_benchmarks_in_1920__1080_und_2560__1440)

\- (pc games hardware - google translate)
[https://translate.google.com/translate?sl=de&tl=en&u=http%3A...](https://translate.google.com/translate?sl=de&tl=en&u=http%3A%2F%2Fwww.pcgameshardware.de%2FRadeon-
VII-Grafikkarte-268194%2FTests%2FBenchmark-Review-1274185%2F)

\- (harwareluxx - google translate)
[https://translate.google.com/translate?sl=de&tl=en&u=https%3...](https://translate.google.com/translate?sl=de&tl=en&u=https%3A%2F%2Fwww.hardwareluxx.de%2Findex.php%2Fartikel%2Fhardware%2Fgrafikkarten%2F48510-7-nm-
gpu-und-16-gb-hbm2-die-radeon-vii-im-test.html)

\- (golem - google translate)
[https://translate.google.com/translate?sl=de&tl=en&u=https%3...](https://translate.google.com/translate?sl=de&tl=en&u=https%3A%2F%2Fwww.golem.de%2Fnews%2Fradeon-
vii-im-test-die-grafikkarte-fuer-videospeicher-liebhaber-1902-138906.html)

\- (hardware.info - google translate)
[https://translate.google.com/translate?sl=nl&tl=en&u=https%3...](https://translate.google.com/translate?sl=nl&tl=en&u=https%3A%2F%2Fnl.hardware.info%2Freviews%2F9051%2Famd-
radeon-vii-review-nieuwe-kans-voor-vega-op-7-nanometer)

~~~
petercooper
\- (linus tech tips)
[https://www.youtube.com/watch?v=alhEgNvzv50](https://www.youtube.com/watch?v=alhEgNvzv50)

------
bluedino
Could this be the piece Apple was waiting for to release the new Mac Pro?

~~~
minikites
At this point I expect the Mac Pro to go the way of AirPower, with Apple never
mentioning it again and calling the iMac Pro "good enough".

~~~
auggierose
To be honest, for many purposes it is good enough. I love this machine. The
Vega 64 inside the iMac Pro even has 16GB instead of the standard 8GB.

------
twtw
I'm sorta surprised that the specs are so similar to the MI50's. Seems like
this might cannibalize MI50 sales for applications that don't use FP64 (i.e.
all ML/DL workloads).

~~~
dragontamer
Well, that's the thing. It's clearly a discount MI50. PCIe 4.0 and the
coherent GPU fabric, however, are MI50-only features, so anyone buying bundles
of GPUs for hardcore compute will want to stick with the MI50 for sure.

------
vkaku
TL;DR:

\- Matches or beats RTX 2080 performance at 1440p/4K at stock.

\- Needs driver updates for overclocking and to catch up at 1080p.

\- Poor thermal design on the reference card.

\- Expensive and mostly sold out.

I'll wait for Navi.

------
tracker1
Happy to see at least _SOME_ competition. It would be nice to see AMD offer
something competitive with the RTX 2080 Ti by the end of the year, which
should at least drive some pricing competition. I wouldn't be surprised to see
a $100-200 drop for the RTX 2080 and 2080 Ti because of this.

------
baybal2
300W power consumption, and it still only catches up to Nvidia cards drawing
less than 200W in gaming performance.

~~~
snvzz
> 300W vs 200W

That's not a measurement; that's the manufacturer spec, which is best ignored.

NVIDIA's TDP is not the thermal design number you'd expect, but rather a
"typical" figure, as in measured while running some arbitrary loads.

It's dodgy, but that's how NVIDIA rolls.

Always refer to measurements for actual numbers to compare. Manufacturer specs
are best not trusted.

~~~
m0zg
Discovered this the hard way when building quad-GPU servers. The 1080 Ti has a
TDP of 250W per spec, yet most "1600W" power supplies will crap out and shut
down your system if you fire up four 1080 Tis simultaneously. The only brand
I've tried that didn't crap out was EVGA. Supposedly the Corsair 1600W PSU is
also OK, but I haven't tried it. Even in nvidia-smi (which I think still
averages things out somewhat and never shows the momentary peak power), a 1080
Ti will often jump way over the 250W "limit" at 100% utilization.
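
A rough budget for why a nominal 1600W unit ends up with no headroom. The transient factor and the "rest of the box" figure below are assumptions, not measurements:

    // Back-of-the-envelope power budget for a quad-1080Ti box.
    #include <cstdio>

    int main() {
        const double gpu_tdp_w   = 250.0;  // 1080 Ti spec TDP
        const int    gpu_count   = 4;
        const double transient   = 1.3;    // assumed short spikes above TDP
        const double rest_of_box = 300.0;  // CPU, drives, fans, losses (assumed)

        const double sustained = gpu_count * gpu_tdp_w + rest_of_box;             // 1300 W
        const double peak      = gpu_count * gpu_tdp_w * transient + rest_of_box; // 1600 W
        printf("sustained ~%.0f W, worst-case peak ~%.0f W\n", sustained, peak);
        // At the peak there's zero margin left on a "1600W" supply, so
        // over-current protection trips on some units.
        return 0;
    }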

