

The R9 Fury is AMD’s best card in years, but just who is it for? - Audiophilip
http://arstechnica.com/gadgets/2015/07/the-r9-fury-is-amds-best-card-in-years-but-just-who-is-it-for/

======
ComputerGuru
The R9 Fury is for me and anyone else whose main machine is a virtual one.
nVidia has locked its consumer graphics cards out of being passed through to a
virtual machine unless you buy their multi-thousand-dollar workstation
graphics cards, while AMD lets you pass your graphics cards through to your
VMs, no problem.

Edit: For clarification, if you want to use a virtual machine directly (i.e.
not in a window or a virtual terminal on another machine running a VM
viewer), with keyboard and mouse input going straight into the VM and the VM
sending its output directly to a video card/monitor, you must "pass through"
the video card so the VM "owns" it exclusively and the host machine does not.
Regardless of whether you want gaming-level graphics or just basic VGA output,
you need to give the VM an actual, physical graphics card (there are
exceptions with newer nVidia cards that understand VMs and let you "create" a
graphics card out of a slice of a more powerful one, but at prohibitive
prices). AMD's entire graphics card lineup, from their $20 to their $700 GPUs,
will let you do this. nVidia's consumer cards detect that they're being passed
through to a VM and refuse to install their drivers, unless you perform a
(risky) hardware hack or use their professional graphics cards, which a)
aren't designed or optimized for consumer graphics use, and b) cost at least
an order of magnitude more than their regular cards.
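
For anyone wondering what the setup actually looks like, on Linux/KVM it
boils down to enabling the IOMMU and binding the guest GPU to vfio-pci before
the host driver grabs it. This is just an illustrative sketch; the PCI IDs
below are example Fiji-class IDs, so substitute your own from `lspci -nn`:

```
# /etc/default/grub -- enable the IOMMU (use amd_iommu=on on AMD CPUs)
GRUB_CMDLINE_LINUX="intel_iommu=on iommu=pt"

# /etc/modprobe.d/vfio.conf -- claim the GPU and its HDMI audio function
# for vfio-pci at boot (example vendor:device IDs, check lspci -nn)
options vfio-pci ids=1002:7300,1002:aae8

# then hand both PCI functions to the guest
qemu-system-x86_64 -enable-kvm -m 8G \
  -device vfio-pci,host=01:00.0 -device vfio-pci,host=01:00.1 ...
```

The monitor plugged into that card then shows the VM's output directly,
exactly as described above.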

~~~
Fr0styMatt8
I really wish AMD would promote this kind of thing more heavily in general
(having advanced features that Nvidia restricts to its high-cost cards). I
seem to remember a similar situation with AMD being fantastic for Bitcoin
mining; from memory, it was because Nvidia gimped certain features that were
on the cards, while AMD did not.

(On that note, I'm constantly annoyed that despite it being Nvidia-specific,
everyone supports CUDA for everything cool.)

I understand companies have to do this to segment their products, and that
it's impractical to run separate fab lines within a chip family for this kind
of difference. It's still something I think gives AMD a competitive
advantage, and they really need to push it (unless they just don't want to,
so as not to undercut their own high-end professional cards).

A worse example of this is Nvidia cards not running in systems that have AMD
graphics cards in them, which means, for example, you can't buy a low/mid-
range Nvidia card to sit in your system as a dedicated PhysX/CUDA card. I'd
consider such a setup for my system, and I imagine there are probably others
like me out there.

~~~
woadwarrior01
The bitcoin mining edge that AMD cards had over Nvidia cards came down to the
AMD ISA having a single-cycle hardware rotate instruction, which the NVIDIA
cards lacked. The AMD philosophy of many dumb cores vs. Nvidia's philosophy
of fewer, smarter cores also made a significant difference.

~~~
optimiz3
Not true - AMD GCN also has bitselect, which does

dst = (A & B) | (~A & C).

This is equivalent to the SHA-256 Ch "Choose" function.

It also can be used to compose the SHA-256 Maj "Majority" function with an
additional XOR.

Additionally, the AMD ISA doesn't have a "rotate" instruction per se; BFI_INT
is "bitfield insert", which can be used to compose arbitrary-length rotates
but is not strictly a rotate in and of itself.

AMD ISA also has instructions for computing integer add with carry, which is
great for crushing elliptic curves.

NVidia really, really, sucks for integer compute. Friends don't let friends
buy NVidia!

------
lqdc13
It is very much application-dependent. For games they might be similar, but
that's about it.

AMD cards tend to have much better integer performance, which is good for
cryptography-related tasks.

NVidia has way more RAM and better float performance. Plus, since many libs
only implement a CUDA backend and not OpenCL, you're pretty much stuck with
only one viable option.

~~~
dogma1138
NVIDIA was only faster in half/mixed precision compute, in full precision AMD
was faster (with the exception of Titan cards besides the Titan X which
doesn't have good full precision support either).

AMD has also always loaded the cards with much more RAM (8GB vs 3GB in the
previous gen) and with much higher memory bandwidth (512bit vs 384bit), NVIDIA
countered it with effective memory management and compression support.

It's a bit funny that in the previous gen AMD boasted that 8GB is needed for
4K while NVIDIA was shipping cards with only 3GB; now NVIDIA is pointing with
a smirk while AMD can't ship its high-end cards with more than 4GB due to
current HBM limitations, and NVIDIA is shipping 6 and 12GB cards.

------
ploxiln
The reason I tend to be more interested in AMD graphics cards is that I want
basically workable open source drivers on Linux. AMD pays a couple of people
to maintain the upstream Linux driver for their cards; NVidia doesn't. So if
you have an AMD card that wasn't released in the last 2 months, you get a
working desktop at the proper resolution with no hassle. (So for this
strategy to work with the Fury, you'd probably have to wait another month or
two.)

I still dual-boot Windows just to play recent games; it pretty much always
works better than the alternatives, like the closed source Catalyst driver,
Wine, or even native ports. An NVidia card would probably work a bit better
for this purpose, and their closed source drivers for Linux are also better,
but for me the open source drivers are the more significant factor.

~~~
dogma1138
NVIDIA releases Linux and BSD drivers every month together with the Windows
drivers (the Linux drivers are usually released a week earlier).

They aren't GPL, but the shader compilers in the drivers are considered
"trade secrets" in this industry; AMD doesn't release those under the GPL or
any other OSS license either.

If you want, there's the nouveau driver, which NVIDIA sorta supports, but
it's a piece of crap compared to the "retail" ones.

From testing both NVIDIA and AMD drivers for SteamOS, I have to say that
NVIDIA is light-years ahead of anything AMD or ATI has ever put out on Linux.
The drivers are stable, allow overclocking, have full NVAPI support and a
much better OpenGL driver (the only ones to support 4.3), and are even aware
of WINE and similar solutions and optimize for them in the driver.

Now with NVIDIA buying Transgaming (probably for Tegra-based consoles), the
support for non-native games will probably get even better. I won't be
surprised if enough legal loopholes are found (Wine has some issues with
requiring certain DLLs ;)) for NVIDIA to support running games with no
official Linux support through their GeForce Experience software, which is
now rumored to come out after the Transgaming purchase.

~~~
ploxiln
I'd agree that the closed-source nvidia drivers for Linux are better than the
closed-source AMD drivers. I'm just saying the open source AMD drivers in the
upstream kernel are better than the open-source nvidia (nouveau) drivers in
the upstream kernel, even if neither is good enough for serious games from
the last decade. I boot to Windows 7 for games.

I've had nvidia cards in the past, as well as UT2004, which had a native
Linux version, and it worked well enough, but performance was still less than
half of what it was in Windows (with the closed-source nvidia driver in
Linux). And what with xrandr, kernel mode setting, etc., I prefer proper
integrated Linux support (even without enough 3D acceleration for games).

------
fegu
I wish reviewers would include some measure (real, not on-paper) of GPGPU
performance. WPA hashes per second is a widely known reference number for
integer performance; neural network training is a similar one for floats.

~~~
raverbashing
Yeah, it seems this is becoming more relevant nowadays than FPS in a game.

------
andy_ppp
The performance difference is a few percent, right? About 10-15 FPS? I guess
nvidia are still considered the best, but that means AMD have to reduce
prices, which is better for me.

Reminds me that the nvidia graphics card in my MacBook is going wrong. Hmmm.

~~~
digitalzombie
Depending on the gen, there was a recall for MacBooks a few months back, btw.

My buddy's was recalled and they gave him his money back, IIRC.

~~~
andy_ppp
Yes, even though mine is not involved in the recall it exhibits all of the
same signs as those that are. Hmmm.

~~~
STRML
Same here. Late 2013? Exact same problems, somewhat mitigated in Yosemite
(fewer complete crashes) but still the occasional garbled screen or inability
to switch to dGPU.

Weirdest problem: can't start up the MacBook (physically) cold without garbled
video. Have to start it up, let it warm up a bit and get the juju flowing,
then hard reboot.

~~~
andy_ppp
Oh, I should be careful what I'm saying: my problems are limited to
intermittent screen craziness (flickering of the top half of the screen), and
GPU switching is a non-issue because I have a program that forces the
discrete GPU.

And most importantly no crashes recently.

------
linky123
CUDA is so important to us, we don't even consider AMD.

~~~
Zardoz84
You're locked in to a single vendor and their proprietary language? You
should move to OpenCL ASAP.
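
For what it's worth, the kernel side of that move is usually not the hard
part; a trivial OpenCL C kernel looks almost identical to its CUDA
counterpart. This is just an illustrative sketch (the names are mine, not
from any particular codebase):

```c
// OpenCL C kernel: element-wise vector add, the "hello world" of GPGPU.
// A CUDA version differs mainly in spelling: __global__ instead of
// __kernel, and blockIdx/threadIdx arithmetic instead of get_global_id().
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *out)
{
    size_t i = get_global_id(0);  // one work-item per element
    out[i] = a[i] + b[i];
}
```

The real porting cost tends to be the host-side boilerplate and any
CUDA-only libraries you depend on (cuBLAS, cuDNN, etc.), not the kernels
themselves.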

~~~
linky123
Yes. It provides the performance we need. Nothing else comes close.

