
AMD says Nvidia’s GameWorks “completely sabotaged” Witcher 3 performance - mwill
http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/
======
bhouston
This has always been the point of NVIDIA's proprietary game engine technology
for more than a decade now: give it away for free, or, very often for big
titles, have NVIDIA developers integrate it into the game engine themselves
(they often come to work on site). Then, once the game is released, mention
that it works/looks better on NVIDIA technology than on competitors'
(AMD/Intel) GPU technology. And of course blame any poor showing on AMD/Intel
on their "inferior" technology.

NVIDIA must be loving this controversy, as this was the game plan all along.

~~~
SloopJon
I read something very similar in this anonymous Slashdot post when Assassin's
Creed Unity came out:

[http://games.slashdot.org/comments.pl?sid=6048819&cid=483821...](http://games.slashdot.org/comments.pl?sid=6048819&cid=48382141)

Choice quote: "Nvidia literally doesn't care if bouncing ten simple particles
on your screen uses 30% of your GPU performance, so long as the same effect on
an AMD GPU takes 80%. Nvidia is this dirty."

~~~
ikeboy
>So Nvidia 'invented' TXAA- an horrifically bad AA method both in appearance
and 'hit' on performance- but a method that runs far better on new Nvidia
hardware than it does on new AMD hardware.

Why is that, though? Why can't AMD do whatever Nvidia's doing to make it just
as fast on their system?

~~~
tarpherder
Because Nvidia is playing to their relative strengths. Differences in hardware
can't always be overcome in software.

------
chao-
There was an article back in March wherein a Stack Overflow (or Quora?) user
commented about the hilariously porous boundary between game engines, drivers,
and actual hardware on GPUs. I'll see if I can find it, because as much as I
love a good ole shitshow of industry skulduggery, I can completely see a best-
intentions effort leading to these sorts of unintended consequences (doing
their best to get tessellation working, and so forth).

 _Disclaimer of non-fanboy-ism: I don't play video games anymore, and only
have graphics via Intel on my machine. No stock in Nvidia or AMD either._

Edit: Found it, and my memory completely sucks. It was an article on
gamedev.net about lower-level graphics APIs:
[http://www.gamedev.net/topic/666419-what-are-your-
opinions-o...](http://www.gamedev.net/topic/666419-what-are-your-opinions-on-
dx12vulkanmantle/#entry5215019)

~~~
nextw33k
I remember reading that link and thinking about how the standards are not
working.

It also concerns me for the future of the open source graphics stack. If the
proprietary drivers are being updated to fix game-specific problems, how are
the open source drivers going to fare when the general rule for open source is
not to have hacks within a code base?

I can only hope that the Mesa 3D stack comes to Windows in a meaningful way to
ensure a common open 3D graphics library. Vulkan might shift the problem from
the driver to the game engine, but that's a long way off yet.

~~~
lsadam0
> If the proprietary drivers are being updated to fix game-specific
> problems, how are the open source drivers going to fare when the general
> rule for open source is not to have hacks within a code base?

I'm an outsider, so it's easy for me to sit back and make blanket statements.
That being said, the responsibility should lie with game developers to ensure
their games behave properly on hardware. Making proprietary driver changes to
address specific games is a bad development practice. Hardware makers should
be responsible for ensuring their drivers accurately support the standards.

It just sounds like the whole games industry has fallen into some really bad
development practices.

~~~
chao-
The thing is, as referenced in the article I linked, there is a blurred line
between the functionality the GPU offers via what the driver exposes, and what
the GPU could do if only the vendor wrote something for you to help massage it
a bit and enable it in the way you need.

Without access to the vendor engineers' knowledge of the GPU's hardware, it's
hard for the open source developers to even know what they ought to be able to
implement. It follows that game studios (and consumers) would be crazy to rely
on the coin-flip odds that an open source implementation will figure out the
right thing to do for a particular optimization of the GPU's internals.

I am also an outsider, but have had a few friends work on various graphics
projects (game engines, CAD software; one of them used to have a map of the
OpenGL state machine above his bed). To hear them tell it, trying to maintain
absolute "best practices" and only use truly cross-platform, cross-vendor code
in your libraries while still hitting deadlines for games is like the mythical
web startup that somehow ships despite also investing time in 100% unit test
coverage, 100% integration tests, with a little bit of mutation testing and
generative testing on top to help test-drive the test suite. That is: it's
more of an ideal to strive for than a reality to be lived.

As a Linux user I weep at this state of affairs, but given the way GPUs are
made, I don't know what could change.

------
bd
HairWorks is not something to be particularly upset about. The vast majority
of players, even those on high-end GPUs, just turn off those "premium" IHV-
specific features because they are so heavy.

BTW, AMD kinda started this with their TressFX hair (notably used in the new
Tomb Raider).

It had similar characteristics to HairWorks: an extremely expensive feature
with relatively small overall visual impact, running better on AMD than on
Nvidia GPUs, but still running pretty badly even on high-end hardware.

Something to keep in mind: those super-duper features you see at GPU launch
events are mostly just marketing, "halo effect" features that, for almost
everybody, are not practical enough to actually use in games.

--------

We are speaking about a minuscule fraction of all players who have GPUs
powerful enough to run all the bells and whistles.

The Titan X, the only GPU that can run Witcher 3 fast with HairWorks enabled
[1], exists in such small numbers that it doesn't even show up in the Steam
Hardware Survey [2].

The GTX 980, the next most powerful GPU, is owned by less than 0.5% of all
people who use Steam. And the GTX 980 runs 1080p ultra with HairWorks at just
~37-40 fps.

The GTX 970, the next most powerful GPU, the most popular Maxwell GPU and a
huge bestseller, is owned by ~3% of Steam users. It's the only remaining GPU
that can stay above 30 fps with HairWorks on.

So something like 95+ percent of all gamers are not affected by this issue at
all.

--------

[1] [http://pclab.pl/art63116-27.html](http://pclab.pl/art63116-27.html)

[2]
[http://store.steampowered.com/hwsurvey/videocard/](http://store.steampowered.com/hwsurvey/videocard/)

~~~
sjwright
> BTW AMD kinda started this

The difference is AMD's TressFX source code is open, which allowed Nvidia to
resolve the performance disparity without much fuss.

Whereas Nvidia's HairWorks source code is closed, leaving AMD unable to pitch
in engineering resources for a diagnosis or a solution.

~~~
agapos
Also note the dates: Tomb Raider's TressFX happened in 2013, while Crysis 2
and the 'flat-surface tessellation' happened in 2011 (with Patch 1.8, I
believe).

Not something one can call a start.

------
white-flame
So this is the screenshot with HairWorks enabled:
[https://cdn.arstechnica.net/wp-
content/uploads/sites/3/2015/...](https://cdn.arstechnica.net/wp-
content/uploads/sites/3/2015/05/the-witcher-3-wild-hunt-pc-screenshot-002.png)

Is there a comparable one with it turned off? Because that untextured hair
looks completely out of place next to all the other nicely textured & shadowed
parts of that scene (well, the chainmail does look amusingly flat where it
goes over the shoulder). The hair might move nicely in gameplay, but it looks
plain, garish, and low-tech in a still.

I suspect this works much better with dark hair, where the texture is not as
noticeable as the silhouette.

~~~
bhouston
> Because that untextured hair looks completely out of place next to all the
> other nicely textured & shadowed parts of that scene

The hair has a nice shape and I believe it has physics, but it totally isn't
shaded properly. I'm not sure if they are not using a proper Marschner shader
([http://www.ephere.com/plugins/autodesk/max/ornatrix/docs/con...](http://www.ephere.com/plugins/autodesk/max/ornatrix/docs/concepts/expensive_shader.html))
or if the issue is that they are not integrating the environmental lighting
properly - it could be both.
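
To expand on what "proper Marschner" shading means structurally: the model
splits a fibre's specular response into three lobes (R, TT, TRT), each a
tilted Gaussian over the longitudinal half angle multiplied by an azimuthal
profile. Here is a minimal C++ sketch of that structure; the constants and
the simplified azimuthal terms are my own illustrative guesses, not what
Witcher 3 or HairWorks actually ships:

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    static float dot(const Vec3 &a, const Vec3 &b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    static Vec3 sub(const Vec3 &a, const Vec3 &b) {
        return {a.x - b.x, a.y - b.y, a.z - b.z};
    }

    static Vec3 scaled(const Vec3 &a, float s) {
        return {a.x * s, a.y * s, a.z * s};
    }

    static Vec3 normalized(const Vec3 &a) {
        return scaled(a, 1.0f / std::sqrt(dot(a, a)));
    }

    // Unnormalized Gaussian used for the longitudinal lobes.
    static float gauss(float width, float x) {
        return std::exp(-x * x / (2.0f * width * width));
    }

    // Scalar specular response of one hair fibre for one light.
    // T = fibre tangent, L = direction to light, V = direction to
    // viewer, all unit length. Constants are illustrative guesses.
    float hairSpecular(const Vec3 &T, const Vec3 &L, const Vec3 &V) {
        const float kPi = 3.14159265f;

        // Longitudinal angles, measured from the plane perpendicular
        // to the fibre tangent.
        float thetaI = std::asin(std::clamp(dot(T, L), -1.0f, 1.0f));
        float thetaR = std::asin(std::clamp(dot(T, V), -1.0f, 1.0f));
        float thetaH = 0.5f * (thetaI + thetaR);  // half angle
        float thetaD = 0.5f * (thetaR - thetaI);  // difference angle

        // Azimuthal angle between L and V projected onto the fibre's
        // normal plane.
        Vec3 Lp = normalized(sub(L, scaled(T, dot(T, L))));
        Vec3 Vp = normalized(sub(V, scaled(T, dot(T, V))));
        float phi = std::acos(std::clamp(dot(Lp, Vp), -1.0f, 1.0f));

        const float alpha = -5.0f * kPi / 180.0f;  // cuticle tilt
        const float beta  =  7.0f * kPi / 180.0f;  // primary lobe width

        // M terms: tilted Gaussians in the longitudinal half angle.
        float mR   = gauss(beta,        thetaH - alpha);         // primary
        float mTT  = gauss(beta * 0.5f, thetaH + alpha * 0.5f);  // transmission
        float mTRT = gauss(beta * 2.0f, thetaH + alpha * 1.5f);  // secondary

        // N terms: crude azimuthal profiles; the full model integrates
        // Fresnel and absorption over the fibre cross-section.
        float nR   = 0.25f * std::cos(0.5f * phi);
        float nTT  = gauss(20.0f * kPi / 180.0f, kPi - phi);  // backlit peak
        float nTRT = std::cos(0.5f * phi);                    // glint omitted

        float cosD = std::cos(thetaD);
        return (mR * nR + 0.5f * mTT * nTT + 0.3f * mTRT * nTRT)
               / std::max(cosD * cosD, 1e-4f);
    }

Even this stripped-down version costs several transcendental evaluations per
strand per light, which is part of why real-time hair so often looks flatly
shaded compared to offline renders.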

------
fabian2k
From everything I've read, the game itself performs very well on AMD cards.
The only exception is the HairWorks feature. But that has such a large impact
even on high-end Nvidia cards that it is the first thing one disables. It is
probably the single most expensive setting in the game, and for a relatively
small benefit.

------
Arzh
I'd like to point out the Slightly Mad quotes about working with AMD. I have
never heard of a developer having a good experience with AMD. Even Carmack,
probably the game programmer with the most clout, has had a hard time getting
help from them. Their support for developers has been why I've been choosing
them for my PCs ever since I upgraded from my 9800 Pro.

~~~
TazeTSchnitzel
> Their support for developers has been why I've been choosing them for my PCs
> ever since I upgraded from my 9800 Pro.

Don't you mean why you _haven't_ been choosing them? Or are you choosing AMD
_because_ they (supposedly) aren't nice to developers?

------
yAnonymous
Nvidia technologies have been heavily used in both these games:

* Watch Dogs

* The Witcher 3

Both suffer from technical problems with the graphics, don't perform as well
as they could (across all systems), and look considerably worse than in the
previews.

I like Nvidia cards and drivers, but as a developer I'd steer well clear of
Nvidia-exclusive technologies. I guess the business guys must like the free
advertising.

------
phn
I hope the new low-level API kids on the block, like Metal and Vulkan, can
shift the fight toward actually competing on solid GPU performance, and
benefit consumers in the end.

Then maybe a GPU-independent HairThing™ or similar intermediate-level toolkits
can actually flourish.

~~~
tarpherder
Why not just DX12?

------
jcastro
It should be noted that HairWorks crushes the entire game no matter what GPU
you have (Nvidia or otherwise) except for one, the Titan X, which retails for
just north of a thousand dollars.

Everyone else is pretty much forced to turn down some setting or another to
hit 60+ fps. And that's just at 1080p! I thought my GTX 970 would handle it no
problem, but there are still tweaks to be made.

------
superskierpat
What about that mention of recent updates making 7xx series graphics cards
perform worse? I have a 750 in my laptop, and yesterday I booted Pillars of
Eternity and was having far more performance problems than usual.

------
danmaz74
"...AMD's chief gaming scientist..." this is some fancy position

~~~
varelse
Well he does have a _fancy_ degree in chemistry and that makes him a
_scientist_!!!

------
zamalek
Solution: AMD optimizes HairWorks. Reality: complaining is easier.

NVIDIA have always claimed that no exclusivity deals are signed with
GameWorks. Others have [supposedly] always been welcome to assist game
developers during the phases that NVIDIA gets proactively involved in.

It's just an extension of the systemic AMD problem: they [very likely] have
the best hardware, but their Windows drivers are horrific, and not only do
they lack developer support, they actually play the victim because others take
the initiative. For crying out loud, even Intel has game developer
support/code. _Correction: it turns out that AMD do actually have some open
source libraries for game devs._

The problem here isn't NVIDIA. NVIDIA set out to optimize the game as best
they could for their hardware. AMD didn't set out to do anything: once again
they sabotaged themselves by not getting involved. If I'm lazy and I get left
behind, that is my fault, not the fault of the other guy who did what I didn't
do.

~~~
the_mitsuhiko
> It's just an extension of the systemic AMD problem: they [very likely] have
> the best hardware, but their Windows drivers are horrific, and not only do
> they lack developer support

You are strongly misrepresenting the actual situation. AMD is painfully aware
that their drivers lag behind Nvidia's, but they disagree about the reason.
AMD does not want to put nearly as many hacks into their drivers as Nvidia
does. They went quite far (by making Mantle) to make the drivers leaner and
more predictable.

Nvidia traditionally detects misbehaving game code and patches it up, to
varying degrees, in the driver.

> AMD didn't set out to do anything: once again they sabotaged themselves by
> not getting involved.

AMD took the right step and backed the only thing that will move the
situation beyond the status quo: Mantle, and now Vulkan/DX12. Otherwise they
would always end up playing this game of magical drivers that Nvidia supports
but nobody likes.

~~~
zamalek
> Mantle and now Vulkan/DX12

The issue here isn't that at all. The issue is a turn-key hair rendering
solution. ~~Which AMD does not provide.~~ Correction: AMD provides TressFX,
which had bad NVIDIA performance before NVIDIA stepped in and fixed it. ~~In
fact I can't find any graphics code samples on their website, contrasted with
the abundance on the NVIDIA website.~~ Correction: AMD does have samples.

~~~
the_mitsuhiko
> The issue here isn't that at all. The issue is a turn-key hair rendering
> solution.

That is incorrect. The issue here is hair rendering, which AMD has a turn-key
solution for: TressFX.

~~~
zamalek
Right, I stand corrected. However, do keep in mind that NVIDIA had similar
performance issues during the Tomb Raider launch - and instead of complaining
about it, NVIDIA stepped in and, as I've been saying, _helped the game
developer improve TressFX compatibility with their cards._

~~~
chronid
Do keep in mind that NVIDIA received the code for the game so they could
optimize their driver for it [1].

There is no chance of the same happening with AMD and TW3 since, as of the
beginning of 2014 at least:

    According to Nvidia, developers can, under certain licensing circumstances,
    gain access to (and optimize) the GameWorks code, but cannot share
    that code with AMD for optimization purposes.

See [2] for the source of that statement.

The situation is _really_ that different.

[1] [http://www.eurogamer.net/articles/2013-03-07-nvidia-
apologis...](http://www.eurogamer.net/articles/2013-03-07-nvidia-apologises-
to-tomb-raider-pc-players-plagued-by-geforce-issues)

[2] [http://www.extremetech.com/extreme/173511-nvidias-
gameworks-...](http://www.extremetech.com/extreme/173511-nvidias-gameworks-
program-usurps-power-from-developers-end-users-and-amd)

~~~
zamalek
It seems as though that might have changed [1] (dated a year after yours). In
truth, only those in the AAA industry may be privy to what is actually going
on, as I'm seeing a lot of PR bullshit/he-said-she-said from both sides.

If NVIDIA is actively restricting access to the code for HairWorks, that is
indeed very scummy; however, they would technically still be within their
rights.

[1]: [http://www.pcper.com/reviews/Editorial/NVIDIA-and-AMD-
Fight-...](http://www.pcper.com/reviews/Editorial/NVIDIA-and-AMD-Fight-over-
NVIDIA-GameWorks-Program-Devil-Details/NVIDIAs-Response)

~~~
chronid
Yes, the FUD seems to be everywhere, I concur. And we'll never know the truth
unless a GameWorks contract gets leaked in full, aka never. Business as usual.
:)

Intel was still within their rights (technically) when they forced their own
compiler to emit non-optimized code for AMD CPUs. That does not make it right
for us, the customers; that is the point I wanted to make.

~~~
zamalek
> That does not make it right for us, the customers; that is the point I
> wanted to make.

That's a discussion that's likely never to come to a sensible conclusion :).
The AMD/NVIDIA situation is definitely one of the negative outcomes of an
open/competitive marketplace.

------
snowballsteve
I was getting stoked about playing The Witcher 3 when I get some free time.
All this talk is making me think twice about the game, not about my AMD card.

------
jokoon
Don't graphics drivers already optimize themselves for specific games? I
don't really know what's optimized underneath, though...

~~~
pjc50
From elsewhere in this thread, [http://www.gamedev.net/topic/666419-what-are-
your-opinions-o...](http://www.gamedev.net/topic/666419-what-are-your-
opinions-on-dx12vulkanmantle/#entry5215019)

"Optimisation" may include "throw away the shaders provided by the game and
replace them with entirely different ones written by NVIDIA for this specific
game".

------
shmerl
So what are they going to use on Linux? Supposedly Witcher 3 is being ported.
Weirdly, HairWorks uses DirectCompute, which makes it unportable anywhere
outside Windows.

Maybe with the emergence of Vulkan, developers won't need to use these
vendor-specific APIs, since they'll have low-level enough access to implement
their own physics logic efficiently on all GPUs.

------
Shengbo
If AMD's platform is really as good as they claim, why does everything run on
the nVidia one? Also, I don't think this is an issue worth getting upset
about, especially if you can just turn the feature off in-game. If the
particular engine I'm using runs better on nVidia cards, then of course I'm
going to prioritize their platform as a base for my hair simulations or
whatever fancy extra stuff I want to have in there for people with high-end
GPUs.

~~~
the_mitsuhiko
> why does everything run on the nVidia one?

What is "everything"? PC Gaming is nVidia centric because they put a lot of
money behind it. AMD supplies the chips for all next gen consoles so by pure
market penetration in the gaming market, I'm pretty sure they are doing quite
well.

~~~
Shengbo
You're right, my wording wasn't clear. I was talking about the market they're
complaining about.

I just feel like it would be more beneficial to improve communication and
developer support instead of generating negative PR by saying things like
Nvidia "completely sabotaged" performance, when in fact, according to
developers, all they really did was have a good product and always be
available.

------
frik
Witcher 3 looked awesome in all the press photos and in-game videos until
last week. The finished game looks the same on all platforms; a high-end PC
runs at near idle (30% GPU usage). Witcher 2 Enhanced Edition (2011) has
pretty similar graphics. Assassin's Creed Unity looks like a game from the
future. Witcher 3 is a good game, but the PR wasn't that good.

@downvoter: do you mind sharing your thoughts?

~~~
yAnonymous
True. The game is great and still looks very good, but considerably worse
than the previews - especially the lighting and the over-saturated colors.

The PR team did well, though, because visually the game has the same issues
as Watch Dogs, but it isn't getting anywhere near as much hate for it.

