The trend in renderers is extremely clear: nearly everybody is building a GPU version, and the makers and their users all report pretty big speedups with GPUs...
> Because anyone can pipe their shit through Intel Open Image Denoise after the fact. It’s free even.
The same is true of the OptiX denoiser. I’m not sure what point you’re making?
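Either way, the denoise step is just a post-process on image buffers and doesn't care which device rendered them. A minimal sketch against the Open Image Denoise 1.x C API (dummy buffers and resolution here; in practice you'd feed in the noisy beauty pass from whatever renderer produced it, CPU or GPU):

    /* Post-process denoise with Intel Open Image Denoise (1.x C API).
       The buffers are dummies standing in for a renderer's noisy output. */
    #include <OpenImageDenoise/oidn.h>
    #include <stdbool.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const int width = 1920, height = 1080;
        float* color  = calloc((size_t)width * height * 3, sizeof(float)); /* noisy beauty */
        float* output = calloc((size_t)width * height * 3, sizeof(float)); /* denoised result */

        OIDNDevice device = oidnNewDevice(OIDN_DEVICE_TYPE_DEFAULT);
        oidnCommitDevice(device);

        OIDNFilter filter = oidnNewFilter(device, "RT"); /* generic ray tracing filter */
        oidnSetSharedFilterImage(filter, "color",  color,  OIDN_FORMAT_FLOAT3, width, height, 0, 0, 0);
        oidnSetSharedFilterImage(filter, "output", output, OIDN_FORMAT_FLOAT3, width, height, 0, 0, 0);
        oidnSetFilter1b(filter, "hdr", true); /* beauty pass is HDR */
        oidnCommitFilter(filter);
        oidnExecuteFilter(filter);

        const char* err;
        if (oidnGetDeviceError(device, &err) != OIDN_ERROR_NONE)
            fprintf(stderr, "OIDN error: %s\n", err);

        oidnReleaseFilter(filter);
        oidnReleaseDevice(device);
        free(color);
        free(output);
        return 0;
    }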
> Not least in the amount of noise it produces using comparable sampling settings. Without using any GPU compute.
The amount of noise has nothing to do with GPU vs CPU.
I am reading between the lines here, but I think he is saying that the 'trend' you are talking about is a figment of your imagination combined with that of some companies' marketing folks.
I.e. it has no substance from a user's POV a priori (his point), and it has no substance from the POV of someone looking at the numbers a posteriori (my point, elsewhere in this thread, which I am happy to back up any time).
I mean, a Quadro RTX 8000 can stash 48GB. Standard on 3D artist workstations is 128GB today. Even my freelancer friends have that in their boxes now, at the very least.
Go figure what the standard RAM size on render-farm rigs is these days, based on that...
And that's not even considering compute restrictions on these GPU rigs that make them simply unfit for certain scenes.
Please do. Instead of claiming you can back it up, just do it. What are you waiting for?
The list of offline renderers adding GPU ray tracing support is pretty long. If you think the trend isn't real, then are you saying you believe the list isn't growing? If you think it's imagination, maybe you could produce the list of serious commercial renderers that are not adding GPU support, and perhaps evidence they're not currently working on it.
RenderMan, Arnold, Blender, Vray, Modo, RedShift, Iray, Clarisse, KeyShot, Octane, VRED, FurryBall, Arion, Enscape, FluidRay, Indigo, Lumion, LuxRender, Maxwell, Thea, Substance Painter, Mantra... pretty sure there are a whole bunch more... not to mention Unreal & Unity.
It's quite true that memory limits are a serious consideration, which is why, currently, GPU renderers that swap aren't generally a thing. They will be in the future, but right now you get CPU fallback, not swap, so seeing the claim about swapping in the comment makes it suspect. Memory limits will continue to be a factor for a while, even as hardware and software improve, but that doesn't change the trend. It means that preview is currently a bigger GPU workflow than final frame.
V-Ray on GPU will swap in the sense that it offloads textures out of the GPU and then re-uploads them later for another bucket while still rendering the same frame.
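In other words it's a per-bucket paging scheme: upload what the bucket needs, evict something else when VRAM is full. A toy sketch of that idea (this is not V-Ray's code; residency is just a flag per texture, and the pretend GPU here only fits two textures):

    /* Hypothetical out-of-core texturing during bucket rendering.
       upload/evict are simulated with prints; a real renderer would also
       avoid evicting textures the current bucket still needs. */
    #include <stdbool.h>
    #include <stdio.h>

    #define NUM_TEX  4
    #define VRAM_CAP 2              /* pretend the GPU holds only 2 textures */

    static bool resident[NUM_TEX];
    static int  resident_count = 0;

    static void ensure_resident(int tex)
    {
        if (resident[tex]) return;
        if (resident_count == VRAM_CAP) {
            for (int v = 0; v < NUM_TEX; ++v) {      /* evict some other texture */
                if (resident[v] && v != tex) {
                    resident[v] = false;
                    resident_count--;
                    printf("  evict texture %d\n", v);
                    break;
                }
            }
        }
        resident[tex] = true;                        /* "upload" to the GPU */
        resident_count++;
        printf("  upload texture %d\n", tex);
    }

    int main(void)
    {
        /* Each bucket touches two textures; the whole frame needs all four. */
        int buckets[3][2] = { {0, 1}, {2, 3}, {0, 3} };

        for (int b = 0; b < 3; ++b) {
            printf("bucket %d\n", b);
            for (int i = 0; i < 2; ++i)
                ensure_resident(buckets[b][i]);
            /* render_bucket(b) would run here */
        }
        return 0;
    }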
And you know, just because everyone is adding GPU support doesn't mean that professionals will turn their entire pipeline and render farms upside down just to use it.
I acknowledge that they have GPU support and that some people like it, but I personally usually can't use it, so it is also not a purchase decision for me.
Plus, people already have large farms of high-memory high-CPU servers without GPUs, so switching would require lots of expensive hardware purchases.
And you usually render so many frames in parallel that it doesn't really matter whether a single frame takes 5 minutes or 50 minutes. You just fire up 10x more servers and your total wait time stays the same.
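Back-of-the-envelope, with made-up numbers, ignoring cost and queueing and assuming frames are embarrassingly parallel:

    #include <stdio.h>

    int main(void)
    {
        const double frames = 1000;

        /* Renderer A: 5 min/frame on 100 servers. */
        double fast = 5.0  * frames / 100;   /* 50 minutes wall clock */

        /* Renderer B: 50 min/frame, so fire up 10x the servers. */
        double slow = 50.0 * frames / 1000;  /* 50 minutes wall clock */

        printf("A: %.0f min, B: %.0f min\n", fast, slow);
        return 0;
    }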
> just because everyone is adding GPU support doesn't mean that professionals will turn their entire pipeline and render farms upside down just to use it.
You’re right, it doesn’t. The fact is that it’s already happening with or without you. Widespread GPU support being added is a symptom of what productions are asking for, not the cause.
> The trend in renderers is extremely clear: nearly everybody is building a GPU version, and the makers and their users all report pretty big speedups with GPUs...
The 'trend' of the US government under president Trump is also extremely clear. Sorry, I couldn't resist. :)
TL;DR: This 'trend' is not economically viable except for two parties: makers of GPUs and companies renting out GPU rigs in the cloud.
Aka: it's just a trend. It's not that anyone sat down and really looked at the numbers. Because if they did, this trend wouldn't exist.
It's also history repeating itself for those that do not learn from it. It will not go anywhere. Mark my words.
I've been there, in 2005/2006, when NVIDIA tried to convince everyone to buy their Gelato GPU renderer. I can elaborate on why that went nowhere and why it will go nowhere again. But it's a tad off topic.
Feel free to elaborate; I have no idea what your point is here, what you mean by your non sequitur about the government, or how that relates to developers of 3D renderers in any way. I don't know what you mean by "it's just a trend." The fact is that there's evidence for my argument, and you're attempting to dismiss it without any evidence.
Comparing Gelato to RTX seems bizarre; they're not related, other than both being Nvidia products. Are you trying to say you distrust Nvidia? RTX already is a commercial success: there are already dozens of games and rendering applications using RTX on the market, and hundreds more building on it. RTX already went somewhere.