
AMD says DirectX is hobbling PC graphics - iwwr
http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1
======
ENOTTY
Some of the Slashdot comments[1][2][3] for this clarify what he really means.
It seems like the guy from AMD overgeneralized his remarks for the press.

1:
[http://games.slashdot.org/comments.pl?sid=2044704&cid=35...](http://games.slashdot.org/comments.pl?sid=2044704&cid=35541158)

2:
[http://games.slashdot.org/comments.pl?sid=2044704&cid=35...](http://games.slashdot.org/comments.pl?sid=2044704&cid=35541474)

3:
[http://games.slashdot.org/comments.pl?sid=2044704&cid=35...](http://games.slashdot.org/comments.pl?sid=2044704&cid=35541106)

------
Zak
_High-end PC GPUs are ten times better than console GPUs, so why don't PC
games look ten times better?_

Game companies can't optimize games primarily for the best possible PCs.
Hardcore gamers who build their own high-end gaming systems pay the same $50
for a game as do casual gamers who run them on laptops[0]. There's a point of
diminishing returns in making a game look spectacular on the best hardware
while still letting it degrade gracefully to what the majority of the market
actually has. Compounding the effect, it's no longer necessary
to constantly upgrade a computer to keep it useful, so casual gamers aren't
upgrading hardware for non-gaming purposes the way they used to.

[0] Game companies try to get a bit more money out of hardcore gamers through
special editions, but that's limited.

~~~
lutorm
I think game companies _can_ optimize for high-end PCs, for the same reason
that car companies can make high-end models: a game doesn't get a reputation
for being kick-ass by running OK on OK hardware; it gets that by being
kick-ass on hardware most people wouldn't even dream of buying, because the
people who write reviews _do_ run such systems.

~~~
Zak
Actually, I think you get a reputation for being a kick-ass game by having
awesome gameplay and (when appropriate) a good story. Sure, looking pretty is
nice, but I don't think graphics matter quite as much as the gaming industry
thinks.

------
dexen
Once upon a time there was the Blit terminal [1], which allowed processes to
display content in independent windows. One process managed the windows; the
others could draw only in their own windows, and the terminal enforced that.

In an ideal future, the GPU could be extended to support concurrent access
from several processes, each with its own context. The GPU would enforce the
separation; each process would draw to its own window, without going through
syscalls...

----

[1] <http://en.wikipedia.org/wiki/Blit_(computer_terminal)>

------
kabdib
Okay, AMD: Stop making new graphics cards. Stop the market churn, let it get
uniform and predictable and then let developers catch up and start stretching
the metal. Remember, content has to catch up, too, and that's several years
worth of effort.

But in the meantime, your competitors are going to eat your lunch.

That's what consoles do. They provide a large, uniform ecosystem which makes
it possible for developers to stretch performance without breaking the bank on
testing or risking tens of millions of dollars on a buggy dud.

Any time you want to declare your graphics cards "a console," feel free to
stop introducing higher powered hardware.

------
ChuckMcM
I think the hardware guys want to offer an alternative, one that doesn't
penalize them for sucking at some aspect of the DirectX API model.

In the way, way back times there was a spunky startup called 3dfx which made
the 'Voodoo' video card. It was fast, and it had enough fill rate that
mip-mapped textures really flew. There was another company called nVidia that
made a new graphics engine based on NURBs. 3dfx published a straightforward
API called 'Glide' which was pretty close to the metal, and nVidia partnered
with Sega to publish an API to this new engine.

It was a great time to program graphics, I had both cards and really liked the
Glide API (even wrote a toy 3D graphics engine on top of it). But one of the
things that I didn't like was that if you got a game it would come in the
'Voodoo' version (as was the case with Tomb Raider 1 and 2) or the 'Sega'
version. They would look great with the right hardware and they would look
like crap without it (falling back to software rendering).

So Microsoft created an API to rule them all and said "do your best with this
API," and while games were not as impressive as they might have been with a
direct/dedicated API, they were at least workable on several different configs
(and the PC has a _bazillion_ configurations). Once the graphics hardware
crossed the minimum acceptable visual threshold, somewhere in the Nvidia TNT2
/ PowerVR / Voodoo2 era with the crappy but functional DirectX 3, the
convenience of not having to patch trumped best fidelity for most of the
market. All of the private APIs stopped being worked on at that point.

There was the OpenGL/DirectX debate, but even Carmack suggests that without an
active SGI pushing OpenGL forward, DirectX has eclipsed it in terms of
capability.

So now that DirectX is so dominant, we come to AMD's issue. AMD (actually the
old ATI, now a part of AMD) has some really killer graphics architect types.
They can imagine really, really cool ways of connecting CPUs to GPUs to memory
which would allow stunning realism with less work on the part of the
programmer (which is code for 'even lame programmers would look good').
Except that programmers won't program to a graphics card feature if it isn't
in DirectX, because then they have a fork in their code base, or it sucks on
vendor Y's hardware which doesn't support that feature, etc.

So for a graphics architecture type at ATI(AMD) to get his or her cool feature
in production, they have to design it, build it, convince Microsoft to add it
to DirectX, possibly give up rights to some of the IP so that others can
implement something like it, and then wait for everyone to catch up so that
when DirectX version n+1 ships you can use it on your card, which is now 2
years old.

Kinda sucks doesn't it?

If ATI(AMD) offered their own API, they could update that API at the same
rate they rolled out new silicon features and be much more
agile. Hence our friend Huddy in the referenced article trying to make the
argument that you (the game developer) would be better off if you weren't
"held back" by Direct X.

I've talked with folks like Huddy in the past and have suggested that one
strategy for making this happen would be to go 'open' with the API: that is,
publish source code that could be compiled on open source systems so that
non-Windows OSes could support high-performance 3D acceleration (all
features) without having to run a Windows driver or the Windows OS. The hope
is that in doing so we might create a 'better' gaming/graphics experience
outside of Microsoft's control, which in turn would put pressure on Microsoft
should that loss of control be perceived as threatening their core business.

The challenge for _that_ strategy is the whole "we are stuck in the X11-model
stone age" mindset that most of these OSes are mired in. (I am so tempted to
resurrect SunTools; it probably sucks more than I remember, but it was really
fast even on a Sun3, so it should feel practically prescient on a modern
machine.)

So you've got the graphics card vendors looking for more flexibility, and that
is a good thing. We should try to leverage it to get better (usable) open
source drivers out of them.

~~~
cookiecaper
The interesting thing about this comment is that OGL allows such
vendor-specific extensions without totally destroying everything else. There's
no need to sit around for 2-3 years and wait on MS; people can start
programming for that hardware relatively easily. My understanding (not too
involved in 3D) is that vendors usually mimic any unique extensions their
competitors might have, but even so it's easy to test caps and turn off a
feature if the hardware doesn't support it (in fact, all games do this, even
DX games; you still have to test caps and profile hardware to know what code
paths to run and what features to enable). A well-architected engine is
flexible and lets you write effects without much hassle even if only one card
on the market supports the feature, and this is a lot easier to do in OGL with
its extensions model than in DX, afaik.
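
To make the caps-testing idea concrete, here's a rough sketch (my own, not
from any particular engine) of the classic pre-GL-3.0 way of probing for an
extension at runtime and falling back when it's missing; the extension name
and function names are just illustrative:

```c
/* Rough sketch: runtime extension check in legacy OpenGL (pre-3.0 style).
   Assumes a GL context is already current; the extension picked here is
   only an example. */
#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

/* Naive substring check; a strict version would tokenize on spaces so
   that e.g. "GL_EXT_foo" doesn't match "GL_EXT_foo_bar". */
static int has_extension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

void pick_render_path(void)
{
    if (has_extension("GL_EXT_framebuffer_object")) {
        printf("FBO path available, enabling the fancy effect\n");
        /* ... set up the effect using the extension ... */
    } else {
        printf("extension missing, falling back to the plain path\n");
        /* ... disable the effect; the game still runs ... */
    }
}
```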

Again, not too heavy on the 3D, please correct anything I got wrong.

I agree about open drivers, though. There is such a wide field of
possibilities with open drivers if vendors would only take them seriously and
quit being so paranoid about "their IP" and all that. I think great OSS
drivers are one of the most important goals for desktop Linux atm.

------
fendrak
In an ideal future, we'd have something like x86, but for graphics hardware.
Standardizing the assembly language would go a long way toward encouraging
good compiler design and third-party libraries for rendering.

~~~
bx_lr
That future might be happening already. Both AMD and Intel have products that
pair a CPU with a GPU, ARM has Mali, and NVidia's Tegra has an on-chip GPU.

I think discrete GPUs will become niche products in the future. Once
mainstream GPUs are on-chip, the variety of different GPU architectures will
probably be reduced. The next step might be a standard ISA for GPUs.

It is hard to say where GPUs will be in three years, but at least the industry
is getting interesting again. It has been more of the same for so many years
in discrete GPUs, but now the on-chip GPUs are potentially game changing.

~~~
st0p
Even though your vision is probably right, I'm not entirely happy with it. I
love choice; I love being able to choose a certain processor and GPU and
upgrade one of them after a year.

I'm probably in the minority though, so business-wise it makes sense.

------
baddox
Obviously, games "looking then times better" is poorly defined. Even still, I
would argue that PC gaming actually _is_ many times better graphically than
the 360. Most PC gamers have at least 1080p resolution, and even modest PC
hardware can handle that. Modern "HD" consoles, however, almost invariably run
their games (especially the blockbuster action titles) at a much lower
resolution like 600p or 660p. Also, PC games have had antialiasing for many
years, something many (if not most) console titles noticeably lack.
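
To put rough numbers on it (assuming 16:9 frames, so the exact sub-HD widths
are approximations): 1920x1080 is about 2.07 million pixels, while 600p at
16:9 is only about 0.64 million, so a 1080p PC frame carries roughly three
times as many pixels before antialiasing or texture quality even enter into
it.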

~~~
alnayyir
4x AA is standard for 360 games.

~~~
baddox
After playing Halo 3 on a 1080p display, I find that hard to believe.

~~~
alnayyir
4x AA can't fix 640p resolution on a 1080p display.

