
Announcing Microsoft DirectX Raytracing - mxfh
https://blogs.msdn.microsoft.com/directx/2018/03/19/announcing-microsoft-directx-raytracing/
======
bhouston
The only issue is that a lot of this is already possible in the current
renderers at a reduced cost.

\- Soft shadows work in WebGL fairly well using PCSS:
[https://clara.io/player/v2/8f49e7c3-7c5e-43f0-a09c-33a55bb1b...](https://clara.io/player/v2/8f49e7c3-7c5e-43f0-a09c-33a55bb1baf6?wait=true&configurator.show=true)

\- Translucency can be faked effectively using a few different methods.
[https://clara.io/view/5c7d28c0-91d7-4432-a131-3e6fd657a042](https://clara.io/view/5c7d28c0-91d7-4432-a131-3e6fd657a042)

\- Screen space ambient occlusion, if using SAO, is amazing. SAO test:
[https://clara.io/view/2e1637a7-a41d-4832-923a-e6227d1ebaaa](https://clara.io/view/2e1637a7-a41d-4832-923a-e6227d1ebaaa)

\- Screen space reflections also work. Ours look like this:
[https://clara.io/player/v2/b55f695a-8f4a-4ab0-b575-88e3df8cd...](https://clara.io/player/v2/b55f695a-8f4a-4ab0-b575-88e3df8cd89c?wait=true)

\- Fast high quality depth of field:
[https://clara.io/player/v2/ce7d91ed-1163-4cbc-b842-929adc4ef...](https://clara.io/player/v2/ce7d91ed-1163-4cbc-b842-929adc4efbf6?wait=true)

\- Real-time global illumination:
[https://www.siliconstudio.co.jp/middleware/enlighten/en/](https://www.siliconstudio.co.jp/middleware/enlighten/en/)

So while I think that raytracing is awesome, it generally will not increase
existing real-time render quality that much. In my experience with the game
industry, even if you have a better way of doing things, if it takes more
CPU/GPU cycles than a hack that achieves basically the same quality, it will
not be adopted. It is that simple.

~~~
jasonwatkinspdx
While the above is accurate, I think it's a bit one-sided.

Physically Based Rendering has become the dominant approach in VFX because
it's a huge simplification and productivity boost. Integrating tons of special
purpose hacks into one renderer isn't just hard for the renderer devs, the
control parameters exposed to artists become an absolute nightmare. Combining
the PBR perspective with raytracing greatly simplifies both sides of this.

Not every game is trying to be the next Far Cry. Ray tracing _will_ be adopted
by teams that value that unification and simplification over getting the very
last bit of performance possible by a pile of hacks. As the hardware improves,
which still seems likely, we'll see the balance point of who makes that call
shift in favor of tracing IMO.

~~~
AboutTheWhisles
I don't think pure raytracing and ray-traced global illumination are going to
be adopted by many games if there is still such a giant problem with noise.
That is far from a solved problem, even in multi-hour-per-frame visual effects
renders.

~~~
kbwt
This paper shows pretty good results from essentially throwing an RNN at it:

[1]
[http://research.nvidia.com/sites/default/files/publications/...](http://research.nvidia.com/sites/default/files/publications/dnn_denoise_author.pdf)

~~~
AboutTheWhisles
Where will the 1000-frame fly-through and the 10 noisy frames that they use
come from? I find that paper very disingenuous when they don't count these
things in their filtering time.

Not only that, but they compare everything with "1 sample per pixel", knowing
that the SURE filter uses per-pixel variance statistics that require more than
one sample per pixel to start working.

------
joosters
[https://www.youtube.com/watch?v=LXo0WdlELJk](https://www.youtube.com/watch?v=LXo0WdlELJk)

I'm pleased to see that the tech video includes spherical mirrors. As I
understand it, all raytracing demos are obliged by _universal law_ to include
at least three reflective balls in any promotion of the technology.

One day, someone will figure out a game where these super-shiny ball bearings
are a critical part of the gameplay, and at that point, raytracing will
finally take off...

~~~
Aardwolf
Even in AAA games of today with amazing graphics, I still see polygons in
cylindrical, conical and round objects (I'm not talking about raytracing but
regular rendering). Everything looks so realistic, but a well or bucket [1] or
some other cylindrical object you encounter looks faceted rather than round,
breaking the suspension of disbelief.

Raytracing may bring more roundness, but why not quadric surfaces or some
other fast-enough-to-render method of real roundness in rasterizers? It may be
expensive (although one surface would replace many polygons...), but it'd
solve one of the last remaining uglinesses :)

[1] [https://imgur.com/gallery/AdDyT8B](https://imgur.com/gallery/AdDyT8B)

~~~
Tobba_
That's just because the industry is stuck using triangles in the art pipeline
(and doesn't put as much effort into the low-polys as it used to). If they
were willing to use some variation of a patch-based pipeline so that they
could utilize the tessellation hardware that has existed for years, it
wouldn't be a problem. I've heard artists are scared of it due to how
amazingly awful some editors managed to make the NURBS editing experience, but
with a decent editor it really wouldn't be a problem.

Sadly tessellation largely just gets used to screw over AMD performance (for
those sweet Nvidia kickbacks; if you're wondering why some games like Crysis 2
tessellate flat surfaces into an insane number of polygons, this is why).

~~~
virtualritz
> I've heard artists are scared of it due to how amazingly awful some editors
> managed to make the NURBS editing experience, but with a decent editor it
> really wouldn't be a problem.

That's simply not true. I started getting into (high-end) 3D in the early
'90s. I've probably used every NURBS or higher-order surface modeling tool
under the sun since then.

In fact, there are many amazing modeling tools for working with NURBs or other
higher order surfaces.

They just all don't even come remotely close to polygon modeling when
precision is secondary and workflow/ease of use is paramount.

So the answer in the VFX world, for years, has been subdivision surfaces. They
have almost all of the good of bi-cubic patch modeling and almost all of the
good of polygon modeling. What's more, existing polygon modeling tools can
easily be upgraded to subdivision surface modelers by merely enforcing
2-manifold topology and adding the ability to display an [approximation to]
the limit surface in real time.

Some of the schemes have nice properties. For example, after one step of
Catmull-Clark, the entire surface consists of quads. And when treating each
local grid of quads as the control polyhedron of a cubic b-spline patch (not a
NURB but an UNRB, a uniform, non-rational b-spline patch), the surface of the
patch is equal to the limit surface that would be obtained by the subdivision
scheme.
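
The quad-grid/b-spline relationship above is easy to check numerically. A
minimal sketch (plain Python; the function names are mine): evaluating a
bi-cubic uniform b-spline patch from a 4x4 control grid, which, away from
extraordinary vertices, is exactly the Catmull-Clark limit surface described
above.

```python
# Evaluate a bi-cubic uniform B-spline patch from a 4x4 control grid.
# Away from extraordinary vertices, this equals the Catmull-Clark
# limit surface. Illustrative sketch, not a full subdivision surface
# implementation.

def bspline_basis(t):
    """Uniform cubic B-spline basis functions at parameter t in [0, 1]."""
    s = 1.0 - t
    return (s * s * s / 6.0,
            (3 * t**3 - 6 * t**2 + 4) / 6.0,
            (-3 * t**3 + 3 * t**2 + 3 * t + 1) / 6.0,
            t**3 / 6.0)

def eval_patch(ctrl, u, v):
    """ctrl is a 4x4 grid of (x, y, z) control points."""
    bu, bv = bspline_basis(u), bspline_basis(v)
    x = y = z = 0.0
    for i in range(4):
        for j in range(4):
            w = bu[i] * bv[j]          # tensor-product weight
            px, py, pz = ctrl[i][j]
            x += w * px; y += w * py; z += w * pz
    return (x, y, z)

# A flat control grid reproduces the plane exactly (the patch
# interpolates linear data): expect roughly (1.5, 1.5, 0.0) here.
grid = [[(i, j, 0.0) for j in range(4)] for i in range(4)]
print(eval_patch(grid, 0.5, 0.5))
```
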

~~~
phkahler
>> the surface of the patch is equal to the limit surface that would be
obtained by the subdivision scheme.

That's not true if any of the vertices is extraordinary, i.e. a vertex shared
by more or fewer than 4 quads. The good news is that subdividing each of those
quads into 4 smaller ones will result in 3/4 of the surface area meeting the
right criteria; put differently, the hard-to-handle area becomes 1/4 the
size.

The ultimate coolness of subdivision surfaces is that every vertex on a
subdivided mesh is a linear combination of the original vertices. The weights
do not change during animation. The weights only change if the topology does.
Another nice thing is that meshes at different LOD contain the vertices of the
lower LOD mesh.

They really do everything with one exception - they can't exactly represent
quadric surfaces which are so important in CAD tools.
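
The fixed-linear-combination property is easy to see with the simplest
possible scheme. A toy sketch (Python; names are mine): one round of midpoint
subdivision of a closed polygon, expressed as a weight matrix that depends
only on topology, so the same weights serve every animation frame.

```python
# Every subdivided vertex is a fixed linear combination of the
# original vertices (the point made above). Here: one round of
# midpoint subdivision of a closed n-gon, as a weight matrix that
# depends only on topology, never on vertex positions.

def midpoint_weights(n):
    """Weight matrix W (2n x n): new = W @ old, for a closed n-gon."""
    w = [[0.0] * n for _ in range(2 * n)]
    for i in range(n):
        w[2 * i][i] = 1.0                # original vertex is kept
        w[2 * i + 1][i] = 0.5            # midpoint of edge (i, i+1)...
        w[2 * i + 1][(i + 1) % n] = 0.5  # ...gets half of each endpoint
    return w

def apply(w, pts):
    """Multiply the weight matrix by a column of coordinates."""
    return [sum(w[r][c] * pts[c] for c in range(len(pts)))
            for r in range(len(w))]

W = midpoint_weights(3)
frame1 = apply(W, [0.0, 2.0, 4.0])  # one animation pose (x coords)
frame2 = apply(W, [1.0, 5.0, 9.0])  # another pose, same weights W
print(frame1)  # [0.0, 1.0, 2.0, 3.0, 4.0, 2.0]
```
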

------
AareyBaba
AMD has a free ray tracing renderer called Radeon ProRender
[https://pro.radeon.com/en/software/prorender](https://pro.radeon.com/en/software/prorender).
Supported by Maya and Blender among others ...

~~~
KaoruAoiShiho
It's too slow to be relevant.

~~~
OCASM
Not anymore:

[https://www.anandtech.com/show/12552/amd-announces-real-time...](https://www.anandtech.com/show/12552/amd-announces-real-time-ray-tracing-for-prorender-and-radeon-gpu-profiler-12)

------
mnw21cam
> From each light source within a scene, rays of light are projected, bouncing
> around until they strike the camera.

Which is definitely not how raytracing is done if you want to get an image
before the end of the world occurs. Rays are traced from the camera, and then
back to the light sources. Some amount of pre-lighting can be done by tracing
photons forwards from lights, but not the main image generation.
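
The camera-first direction described here fits in a few lines. A minimal,
hypothetical sketch (Python; names are mine): one primary ray from the eye,
tested against a single sphere; a real tracer would then continue toward the
lights from the hit point.

```python
import math

# Backward ray tracing as described above: rays start at the camera
# (eye) and are tested against scene geometry, rather than being
# emitted from the lights. Minimal sketch: one ray vs. one sphere.

def intersect_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t, or None on a miss.
    direction is assumed normalized."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (direction[0]*ox + direction[1]*oy + direction[2]*oz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4.0*c
    if disc < 0.0:
        return None                    # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0   # nearest intersection
    return t if t > 0.0 else None

# Camera at the origin looking down -z; unit sphere 5 units away.
eye = (0.0, 0.0, 0.0)
ray = (0.0, 0.0, -1.0)
print(intersect_sphere(eye, ray, (0.0, 0.0, -5.0), 1.0))  # 4.0
```
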

~~~
bhouston
In path tracing
([https://en.wikipedia.org/wiki/Path_tracing](https://en.wikipedia.org/wiki/Path_tracing)),
which is what the VFX industry uses primarily, both directions happen.

~~~
oh_sigh
Does anyone use path tracing for live renderings? I think that is the
question.

~~~
WillKirkby
Demosceners are starting to, for sure.

[https://www.pouet.net/prod.php?which=69642](https://www.pouet.net/prod.php?which=69642)

~~~
tzahola
Video version: [https://youtu.be/9r8pxIogxZ0](https://youtu.be/9r8pxIogxZ0)

~~~
ino
epilepsy warning

The demo is amazing on its own as a piece of art, let alone that it's
generated programmatically, let alone that it's rendered in real time, let
alone that it's just 4k.

------
oneplane
That's great... if you are on Windows. I get that Microsoft works on
Microsoft technology, but it really doesn't help adoption or global
development as a whole. DirectX doesn't work anywhere except Windows PCs and
the Xbox. That's no biggie for Microsoft, but it's bad for everyone else.

~~~
naikrovek
> DirectX doesn't work anywhere except Windows PC's and the Xbox.

And most new phone chipsets. And a lot of old phone chipsets. And a lot of
ARM-based single board computers. And a lot of non-Xbox consoles like the
Switch have hardware support for DirectX...

DirectX is in a lot of places you would not expect.

~~~
lpghatguy
What? All of those places have hardware support for DirectX, but none of them
(barring Windows phones) actually have DirectX running on them.

------
VikingCoder
Cute video:

[https://www.youtube.com/watch?v=LXo0WdlELJk](https://www.youtube.com/watch?v=LXo0WdlELJk)

~~~
CyberDildonics
This doesn't even look like it runs at a full 60 frames per second unless they
are speeding it up in the editing.

~~~
VikingCoder
I haven't seen evidence one way or the other, but it's specifically labeled
"real-time GPU raytracing."

Do you have evidence it doesn't run at 60 fps?

~~~
taejavu
Well, the video is clearly dropping frames, so it's obviously not maintaining
60fps...

~~~
kevingadd
"Real-time" has never meant 60fps. It typically just means interactive
framerates - fast enough to drag a camera around or move an actor. You can get
away with calling 20fps realtime, and lots of stuff runs at 24 or 30fps.
(Obviously nobody wants a game like Doom or Battlefield to run at 20 fps,
though.)

------
snvzz
Friendly reminder: DirectX is a proprietary, Windows-specific API, with its
associated lock-in. Don't fall for such a simple trap.

~~~
gaius
Have fun shipping a real product on OpenGL without a proprietary binary driver
blob.

~~~
dfox
Both nouveau and amdgpu are good enough for every Linux game in my Steam
library.

~~~
headsoup
Same here, except for all the newer games I'd probably like to add to my
library but can't because they're Win/DX only...

Nouveau is still problematic because of the lack of access to the NVIDIA
proprietary code, i.e. the extra bits like fan management, monitors, etc.
That, and its performance still just isn't there against the proprietary
drivers. AMD drivers are pretty good, and Wine performance has increased
significantly over the last year or so.

Hopefully Vulkan encourages more widely supported game dev; it's a terrible
thing that gamers are practically locked into the Windows OS by way of a
proprietary graphics stack and the old catch-22 of 'no market for Linux
games / no games on Linux.'

~~~
pjmlp
Just like they are locked into every games console ever built and are
completely fine with that.

The gamer culture is not the same as the FOSS one; cool games, getting hold of
IP and belonging to a specific tribe (e.g. PS owners) are more relevant than
freedom of games.

Currently Vulkan only matters on Linux and flagship Android devices.

It remains to be seen whether Microsoft will ever allow ICD drivers on the
Store, or what the actual use of Vulkan vs NVN on the Switch is.

------
tomovo
Does anyone remember Intel's Larrabee project? It featured HW-accelerated
raytracing. They had Quake Wars running on it:
[https://en.wikipedia.org/wiki/Quake_Wars:_Ray_Traced](https://en.wikipedia.org/wiki/Quake_Wars:_Ray_Traced)
or video:
[https://www.youtube.com/watch?v=mtHDSG2wNho](https://www.youtube.com/watch?v=mtHDSG2wNho)

~~~
orbital-decay
Imagination and OTOY also experimented with hardware raytracing on mobile:
[https://www.imgtec.com/legacy-gpu-cores/ray-tracing/](https://www.imgtec.com/legacy-gpu-cores/ray-tracing/)

------
psyc
Graphics programmers have been predicting this revolution for almost as long
as I can remember. It's quite exciting that so many of the big players seem to
believe it is almost time.

~~~
vardump
> Graphics programmers have been predicting this revolution...

Mostly it has seemed like it's been everyone except the actual graphics
programmers.

~~~
Tobba_
Yep. There's a billion and a half reasons you wouldn't use full-blown path
tracing for games. It'll always perform worse regardless of what you do, the
frame budgeting is likely going to be a total nightmare, the tree search you
need to trace rays requires too much branching to map well to GPUs (it's
perfectly possible, just another absolute nightmare as your complexity goes
up), console GPUs are (and have always been) extremely underpowered in terms
of compute (but good with textures), etc.

Limited uses? Sure. Otherwise, hell no. It's never been more than a gimmick
GPU vendors use to show off (in real time, that is; offline rendering is
something else entirely).

~~~
bhouston
Maybe GPU vendors will offer specialized spatial look-up trees or something.

~~~
Tobba_
To clarify: that's more of a problem in programming the GPU in a portable
manner, and that _may_ or may not turn into a problem depending on the
implementation. GPUs can do loops that run a different number of times per
thread just fine, at the cost of having to wait for the slowest to finish.
Things start to turn sideways if you want diverging branches though (at least
AMD _kind of_ has support for this, but it's limited). If you want animations
to work there's a good chance your search would become pretty complicated and
you run the risk of smacking into that.

Also, I doubt GPU vendors would add special-purpose hardware like that. They'd
much rather just add some new shader instructions to help doing it.
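
The lockstep cost being described can be modeled in a toy way (Python; purely
illustrative, with no resemblance to real GPU scheduling): every lane in a
warp effectively pays for the slowest lane's loop trip count.

```python
# Toy model of SIMT lockstep execution: lanes in a warp run a loop
# with different trip counts, but the warp only retires when the
# slowest lane finishes, so every lane pays max(trip_counts) cycles.
# Illustrative only; real GPU scheduling is far more involved.

def warp_cycles(trip_counts, cycles_per_iter=1):
    """Cycles a warp spends if each lane loops trip_counts[i] times."""
    return max(trip_counts) * cycles_per_iter

def utilization(trip_counts):
    """Fraction of lane-cycles doing useful work (1.0 = no divergence)."""
    total = warp_cycles(trip_counts) * len(trip_counts)
    useful = sum(trip_counts)
    return useful / total

uniform   = [8] * 32          # every lane traverses 8 BVH nodes
divergent = [1] * 31 + [64]   # one lane hits a deep subtree
print(utilization(uniform))   # 1.0
print(utilization(divergent)) # roughly 0.046, most lane-cycles idle
```
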

~~~
monocasa
> Also, I doubt GPU vendors would add special-purpose hardware like that.
> They'd much rather just add some new shader instructions to help doing it.

The PowerVR guys added special purpose hardware for doing spatial lookups for
raytracing.

~~~
zamalek
And it's _awesome._ [1] Getting shadows right for the specifics of a game is a
nightmare; balancing shadow acne and performance. Just being able to do one-
size-fits-all shadows on desktops would be a major step forward.

[1]: [https://www.imgtec.com/blog/ray-traced-shadows-vs-cascaded-s...](https://www.imgtec.com/blog/ray-traced-shadows-vs-cascaded-shadow-maps/)

~~~
Tobba_
For reference, VSM/4MSM are a set of shadow mapping techniques that don't
suffer from acne, at the cost of suffering from bleeding problems that also
need biasing.

Shadows are probably one of the places where raytracing makes a lot of sense
though. It's one of few shadow rendering techniques that don't blow a ton of
precious fillrate, and it shouldn't take too many rays.
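
For reference, the VSM test mentioned above reduces to a Chebyshev bound on
stored depth moments. A hedged sketch (Python; the real technique runs in a
pixel shader and filters the moments, and the function names here are mine):

```python
# Variance shadow mapping (VSM) in miniature: the shadow map stores
# per-texel depth moments (E[d], E[d^2]); being able to filter those
# moments is what avoids the acne/bias fight of classic shadow maps.
# The visibility test itself is Chebyshev's inequality.

def vsm_visibility(mean, mean_sq, receiver_depth, min_variance=1e-4):
    """Upper bound on the probability that receiver_depth is lit."""
    if receiver_depth <= mean:
        return 1.0                  # receiver is in front of occluders
    variance = max(mean_sq - mean * mean, min_variance)
    diff = receiver_depth - mean
    return variance / (variance + diff * diff)  # Chebyshev upper bound

# Fully lit when the receiver is closer than the stored mean depth:
print(vsm_visibility(mean=0.5, mean_sq=0.26, receiver_depth=0.4))  # 1.0
# Mostly shadowed when it's well behind the occluders:
print(vsm_visibility(mean=0.5, mean_sq=0.26, receiver_depth=0.9))
```
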

------
IvanK_net
It reminds me of the ray-traced game I made a few years ago :)
[http://powerstones.ivank.net/](http://powerstones.ivank.net/)

If you don't move, the image "improves" over time. You can change the
resolution in the top left corner. Play it in fullscreen. It works on phones,
too.
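
The "improves over time" behavior is plain progressive accumulation: average
one noisy sample per pixel per frame and reset when the camera moves. A
minimal sketch (Python; the class and method names are mine, not from the
game):

```python
# Progressive refinement as in the game above: each frame contributes
# one noisy sample per pixel; the displayed value is the running mean,
# which converges as frames accumulate. Moving the camera resets the
# accumulation, which is why the image degrades and then re-refines.

class ProgressivePixel:
    def __init__(self):
        self.mean = 0.0
        self.count = 0

    def add_sample(self, sample):
        """Incremental running mean (numerically stable form)."""
        self.count += 1
        self.mean += (sample - self.mean) / self.count
        return self.mean

    def reset(self):
        """Call on camera movement: start accumulating from scratch."""
        self.mean, self.count = 0.0, 0

px = ProgressivePixel()
for s in (1.0, 2.0, 3.0, 4.0):  # four frames of noisy samples
    shown = px.add_sample(s)
print(shown)  # 2.5, the mean of all samples so far
```
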

------
pkilgore
Raytracing is sort of like nuclear fusion: it's been a few years away for my
entire life. It's not clear yet what has changed.

~~~
MBCook
Well, I imagine today's GPUs would be more than capable of doing an
amazing-looking game at 320x240 or maybe 640x480 at a reasonable frame rate.

The problem is at this point people want it at 1080, 4k or 8k.

~~~
gnode
The goalposts have also moved in terms of quality. Raytracing these days no
longer just means point light sources and mirrors, but full global
illumination with bidirectional path tracing.

Rasterization has improved a lot over the years, so the meaning of
"raytracing" has to improve to be competitive.

------
dharma1
And here's a nice demo from Remedy
[https://youtu.be/70W2aFr5-Xk](https://youtu.be/70W2aFr5-Xk)

~~~
CyberDildonics
This has a lot of artifacts and doesn't look better to me than full-fledged
games that have already been released.

~~~
kevingadd
It's a tech demo. It's not supposed to look better than retail games, it just
shows off some new experimental technology. If you check out the slide deck
for the demo they compare each technique with existing solutions (i.e. SSAO vs
traced AO, SS reflections vs traced reflections) and the advantages are very
obvious.

If you want to see how much existing techniques suffer compared to tracing,
just check out the absolutely miserable, incredibly ugly, just disgusting
screen space reflections in the brand-new Crytek game Hunt: Showdown. The
water is a nightmare.

------
tgb
The article says that MS (like everyone else) expects there to be fewer and
fewer "fixed-function" features in a GPU with more shifting over to software-
defined shader code. But isn't this just an introduction of more fixed-
functionality to the pipeline? So what's the advantage of doing it this way
versus writing a ray tracer in the current compute shaders?

~~~
kevingadd
APIs like a ray tracing API can be implemented in driver software (on-CPU) or
in device firmware (on-GPU, but programmatic, not in silicon). Technically
this also applies for the old fixed-function pipeline, of course, but it's
worth considering the difference.

A raytracing API also isn't forced into the pipeline for every rasterized
polygon like old features - hardware T&L, geometry shaders, etc - were. It's
something you use on-demand in a compute or fragment shader.

------
detritus
I can't help but feel this is aimed at making future Augmented Reality (ie.
Hololens) applications integrate into their host environments better.. or
perhaps I'm just giddily over-extrapolating possibilities!

~~~
phkahler
Good point. Ray tracing makes compensating for lens distortion trivial.

------
userbinator
I'm not sure what the big deal here is; the demoscene has been doing realtime
raytracing for many, many years:

[https://news.ycombinator.com/item?id=11848097](https://news.ycombinator.com/item?id=11848097)

~~~
egypturnash
Large scene graphs full of dynamically allocated objects with arbitrary motion
and animation are a very different problem than a flyover of a static
landscape or a Mandelbulb that you’re tweaking parameters on.

------
kyleperik
In my experience with Blender, you can create very detailed, photorealistic
images without raytracing, but using raytracing makes it much easier to make
the same images. It takes longer, but it's probably the direction we're
generally going with 3D rendering, given Moore's law.

Anyway, this seems a little overhyped.

------
keyle
Unreal Engine 4's post on this: [https://www.unrealengine.com/en-US/blog/unreal-engine-4-supp...](https://www.unrealengine.com/en-US/blog/unreal-engine-4-supports-microsoft-s-directx-raytracing-and-nvidia-rtx)

------
kev009
So I guess this is to mesh with the new HW support from Nvidia and AMD?
Because ray tracing wasn't usually the province of GPUs.

[https://www.anandtech.com/show/12546/nvidia-unveils-rtx-tech...](https://www.anandtech.com/show/12546/nvidia-unveils-rtx-technology-real-time-ray-tracing-acceleration-for-volta-gpus-and-later)

[https://www.anandtech.com/show/12552/amd-announces-real-time...](https://www.anandtech.com/show/12552/amd-announces-real-time-ray-tracing-for-prorender-and-radeon-gpu-profiler-12)

I haven't touched a PC game in ages; it would be fun to come back and see some
epic like HL3 done up in this.

------
synaesthesisx
This is quite exciting for augmented reality applications - I think it's a
reasonable assumption that the next iteration of Hololens will have some sort
of hardware acceleration for this (like Nvidia's RTX w/ Volta announced
today).

Relevant:

[https://arstechnica.com/gadgets/2013/01/shedding-some-realis...](https://arstechnica.com/gadgets/2013/01/shedding-some-realistic-light-on-imaginations-real-time-ray-tracing-card/?comments=1&post=23723213)

"I am 90% sure that the eventual path to integration of ray tracing hardware
into consumer devices will be as minor tweaks to the existing GPU
microarchitectures." \- John Carmack

------
electricslpnsld
The article's title says raytracing, but the actual text describes path
tracing, no?

------
laythea
Can this be applied to path tracing? If so, would this be able to accelerate
something like OctaneRender?

I stumbled across this a year or so ago and was amazed at how the realistic
image is built up after rotating:
[https://home.otoy.com/render/octane-render/](https://home.otoy.com/render/octane-render/)

------
newnewpdro
I can't help but imagine the massively power-hungry machines hosting the GPUs
necessary for real-time rendering of Pixar-level movie graphics.

Won't this resemble the power demands of cryptocurrency mining rigs, except
actually used for gaming? As if the amount of energy we use per capita weren't
already bad enough...

------
piracykills
So where are we realistically with ray tracing? What kind of performance can
be expected with modern GPUs?

------
ingen0s
Also don't forget to link the SDK:
[http://forums.directxtech.com/index.php?topic=5860.0](http://forums.directxtech.com/index.php?topic=5860.0)

~~~
ingen0s
[https://1drv.ms/u/s!AmttvHQpGvyVbOmqJ4vut9zN3Ss](https://1drv.ms/u/s!AmttvHQpGvyVbOmqJ4vut9zN3Ss)

------
blauditore
The Seed tech demo advertises depth of field - so does this renderer simulate
a lens? Is this common in raytracing?

~~~
pavlov
Yes, simulating lenses is standard for physically-based rendering.
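
Concretely, a thin-lens camera replaces the pinhole: jitter each ray's origin
across the aperture and aim it at the corresponding point on the focal plane,
so geometry off that plane blurs. A 2D sketch (Python; the names are mine, not
from any particular renderer):

```python
import math, random

# Thin-lens depth of field: instead of firing every ray from a single
# pinhole, pick a random point on the lens aperture and aim the ray so
# it still passes through its point on the focal plane. Geometry off
# that plane then smears into a disc (bokeh). 2D for brevity.

def thin_lens_ray(pinhole_slope, aperture_radius, focal_distance,
                  rng=random):
    """Return (origin_x, dir_x, dir_z) for a camera looking down -z.
    pinhole_slope is the x-slope of the ideal pinhole ray."""
    # The point this ray must pass through, on the focal plane:
    focus_x = pinhole_slope * focal_distance
    focus_z = -focal_distance
    # Jittered origin on the (1D) aperture; zero aperture = pinhole.
    origin_x = (rng.uniform(-aperture_radius, aperture_radius)
                if aperture_radius > 0.0 else 0.0)
    dx, dz = focus_x - origin_x, focus_z
    norm = math.hypot(dx, dz)
    return origin_x, dx / norm, dz / norm

# A zero-size aperture degenerates to the ordinary pinhole ray:
print(thin_lens_ray(0.0, aperture_radius=0.0, focal_distance=5.0))
# (0.0, 0.0, -1.0)
```
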

------
ingen0s
Awesome. Can't wait to test this out.

------
Tloewald
Ray Tracing is also a lie! Are they giving us an unbiased renderer?

Looks interesting anyway.

------
polskibus
Can this somehow work with AutoCAD etc., so that construction fragments could
be previewed in real time with raytracing and global illumination turned on?

~~~
electricslpnsld
"CAD"-like programs from the animation and VFX worlds (Houdini, Modo, etc)
already do this, so if Autodesk really wanted to they could almost certainly
add real-time path tracing to AutoCAD.

------
braderhart
Is this open source?

~~~
romanovcode
No, DirectX is proprietary closed-source technology from Microsoft.

------
mtgx
Why not path tracing?

------
enzanki_ars
More discussion at
[https://news.ycombinator.com/item?id=16620515](https://news.ycombinator.com/item?id=16620515)

~~~
dang
We merged that discussion here.

------
jlebrech
How much of this is due to crypto? Do they have a raytracing chip that's much
less general-purpose and specific to raytracing (and incompatible with
mining)?

------
shmerl
So why DX12 and not Vulkan? It's time for MS to start helping the industry
instead of proliferating lock-in and complicating things for engine
developers.

~~~
swebs
I don't know why you're getting downvotes. DirectX is harmful for consumers
and only serves to lock in games and users to Microsoft Windows.

~~~
thelittleone
Microsoft is the dominant platform for games. What financial gain would they
derive from pursuing a non lock in strategy?

~~~
headsoup
Sometimes I just wish Valve* would pop up and announce: 'oh so HL3 is coming
out soon, and it will be a Steam Linux exclusive for the first 6 months.'
While not likely, something like that would be a really interesting gauge of
how likely genuine Linux adoption could be...

* I use Valve due to their apparent Linux push with SteamOS (what's the go with that btw!?)

~~~
romanovcode
Most people are not Linux fanatics like you. 99.9% of people don't mind using
Windows as long as it works.

~~~
swebs
According to the Stack Overflow developer survey, the percentage of developers
using Windows has dropped below 50%. Nobody actually likes using Windows
(except for a few Stockholm syndrome sufferers and zealots who drink the MSDN
kool-aid); it's just that people are forced to use it due to third-party
application support like video games. It seems like every week there's a new
article on how Windows 10 is shit for users, whether it's spying on them with
telemetry, serving unwanted ads, or resetting user settings after forced
updates.

Windows is such an anti-brand that they couldn't even get customers to buy a
Windows phone after a $500 million advertising campaign.

[https://insights.stackoverflow.com/survey/2018#technology-de...](https://insights.stackoverflow.com/survey/2018#technology-developers-primary-operating-systems)

~~~
pjmlp
Sorry to disappoint you, but I'm a developer with Stockholm syndrome who left
Linux for Visual Studio, C++ and .NET.

Also former IGDA member and attendee of a few GDC conferences, game developers
only care about shipping games and their IP.

AAA studios don't care 1 second about APIs to make a better world.

Adding a new rendering backend to a games engine is a trivial task, when
compared to the pile of features a game engine needs to support.

Also most Windows developers don't care about Stack Overflow surveys.

~~~
shmerl
_> Adding a new rendering backend to a games engine is a trivial task_

You keep saying this, but it remains false. If it were so trivial, studios
wouldn't have a hard time adding such backends and wouldn't need to hire
third-party porting experts when they decide to do it. You can see how long it
takes major engines like Unreal to make a fully functional backend (since
features added to such engines are publicly communicated). It's very clear
it's not trivial at all.

And MS and Co. obviously do all they can to keep this difficult, that's the
main idea of their lock-in which they designed to tax developers with.

What's bad though, is your justification of this practice.

~~~
DoveBrown
It's trivial in the sense that it's low technical risk. I've not worked in
games for a while. At one company with an in-house engine, the rendering
backend was initially DirectX9, written mostly by one person. He then
implemented the Xbox 360 backend. Another person did the backend for PS3
(OpenGL-based). I don't have the exact timings, it was ten years ago, but
after the initial material/geometry/lighting pipeline was done (and that's
independent from the backend), the engine guy was never on the critical path.

They added Wii, DirectX10, iOS and Android backends while I was there. None of
these were ever considered risky and none had more than one person working on
it. Each console/platform has its own quirks in how to optimize the scene for
rendering, but getting something rendering on screen is pretty much trivial
once you have the machinery in place.

I can't speak for Epic, they are making an engine for every possible game and
every possible rendering scene which is a harder problem than what we were
doing. But the rendering backend isn't the hard part.

~~~
shmerl
_> It's trivial in the sense that it's low technical risk._

The problem is not in the risk, but simply in the cost itself. It's an extra
tax to pay. However quality can also suffer, see below.

 _> I can't speak for Epic, they are making an engine for every possible game
and every possible rendering scene which is a harder problem than what we were
doing. But the rendering backend isn't the hard part._

The story of Everspace illustrates my point. They were bitten by multiple
issues in the OpenGL backend of UE4, and it took Epic a long time to fix some
of them. Their resources are limited, and they are obviously more focused on
the more widespread backends. Which is exactly the result lock-in proponents
are trying to achieve.

~~~
DoveBrown
Sure, there is effort involved, but from my point of view it's a small one (a
couple of man-months on a project with 50+ coders). I'm going to have to make
adjustments between the platforms because the hardware is different. Even on a
DirectX PC, you can often have a lot of differences between the rendering
scene for Nvidia, AMD and integrated Intel GPUs.

I don't know exactly what issues Everspace had with UE4, but if you want to
have a fun night, go out with some Epic licensees and get them to tell you war
stories of issues they had when they tried to do something Epic hadn't done in
their games. You're paying Epic for the "battle testing", and often they
didn't fight those battles.

Part of the reason I left the games industry is that once you work at studio
with an internal engine it is extremely frustrating to work on AAA games
without the freedom to walk over to the engine programmer and get them to move
the engine closer to what you need.

~~~
shmerl
_> Part of the reason I left the games industry is that once you work at
studio with an internal engine it is extremely frustrating to work on AAA
games without the freedom to walk over to the engine programmer and get them
to move the engine closer to what you need._

Internal engines are also, on average, less cross-platform, simply because big
publishers and shareholders don't want these very expenses that creep into
development because of lock-in. That's why many Linux releases of such games
use source or binary wrappers rather than proper native rendering to begin
with. This highlights my point above.

~~~
DoveBrown
I disagree with the characterisation that internal engines are less
cross-platform because of lock-in; the big publishers don't care about
lock-in. It's not part of the calculus in deciding whether to support a
platform or not.

A port of a game is more than changing the low-level APIs used to control the
hardware. It's the hardware of the platform that decides the complexity of
producing the port.

Linux is a special case because it's the same hardware as Windows. Your
market is people who want to play the game but aren't dual-booting. Most of
the issues with producing your port are going to come down to driver
incompatibilities and the fact that every Linux system is set up a little bit
differently (the reason Blizzard never released their native Linux WoW
client[1]). It's not a big market and there are loads of edge cases.

For big publishers and AAA development, they're not looking to break even or
make a small profit. They need to see multiples of return on their money or
they aren't going to do it. Using a shim is cheap and doesn't hurt sales
enough to matter to them.

[1]
[https://www.phoronix.com/scan.php?page=news_item&px=OTA0NQ](https://www.phoronix.com/scan.php?page=news_item&px=OTA0NQ)

~~~
shmerl
I'm looking at publishers who do release Linux games using internal engines.
Most of them use binary or source wrapping. Only a minority are implementing
proper native rendering in those engines. And I bet it's based on cost
considerations like I said above. How would you explain it otherwise?

And I'm sure that cost plays a role when a small market is evaluated. The
higher the cost, the less likely such a publisher is to care, because
prospects of profits are also reduced. So it goes back to my point: lock-in
proponents like MS and Co. benefit from lock-in by slowing down competition
growth.

~~~
DoveBrown
I agree that cost is a consideration of doing the port. From my experience,
what rendering API is used at the bottom is a very small factor in that cost
calculation.

I think where we disagree is that I don't think of the lower level API as
being much of a lock in. The better graphic programmers I know have pretty
extensive experience of the various flavors of DirectX and OpenGL. The general
principles are the same and good programmers move between them easily.

~~~
shmerl
_> I think where we disagree is that I don't think of the lower level API as
being much of a lock in._

Lock-in here doesn't mean they have no technical means of implementing other
graphics backends, it means that implementation is hard.

A lot of common middleware supports Linux just fine. It's graphics that's
usually the biggest hurdle. People have expertise to address it, but it's
still a tax to pay. And different distros support is a very minor thing in
comparison.

If graphics is not the biggest issue, what is then in your opinion?

~~~
DoveBrown
> If graphics is not the biggest issue, what is then in your opinion?

Graphics is the biggest issue, but the issue isn't at the API level. It's in
the driver and hardware differences below that layer.

The "tax", as you call it, comes mostly from the hardware drivers leaking
through the abstraction. Part of this is AAA game developers' fault, since
they are attempting to use all of the GPU with edge-case tricks to eke out
more performance.

