I used to develop professional 3D tools, and CUDA is used to accelerate features in 3D software including scene manipulation, simulation, real-time rendering, ray tracing, shader effects, and video encoding.
Support for all the CUDA code that's already out there. Not that I've used Metal, but that seems to be the reason behind OpenCL's lack of adoption.
Sure. But there is also a lot of Metal code out there making iPads fly. And they really do fly; I was amazed at what kind of performance I could get out of them. For example, I had an algorithm running on my iMac Pro, programmed in Swift and fully utilising all of its 18 cores. It took about 5 minutes on a typical example. Then I recoded the algorithm for Metal, and the same example runs on my iPad Pro in under 10 seconds.
So, my bold prediction: Apple is going to shred Nvidia to pieces within the next 10 years.
The video says the accelerator card is for 4K & 8K RAW video. So I wouldn't expect it to do anything at all other than decode 4K & 8K RAW footage, kinda like the RED ROCKET-X.
I've heard that CUDA can help decrease render times in Adobe Premiere Pro and several other programs. I'm guessing that includes 3D modeling software like Fusion 360 or something, to help with various modeling and simulation features.
Not F360 that I'm aware of, but for 3D, 3DS Max, C4D, and Maya all support GPU-based rendering using CUDA, which can be a big performance boost. On the video side, Premiere Pro, Vegas, and AVID tend to have better support for CUDA as well. However, AMD support has been catching up in the last couple of years.
FWIW, I've done some pretty complex part simulations in F360, and it's not that slow on my 2018 Vega MacBook Pro. But if it is, Fusion has online simulation and rendering services that are pretty affordable.
Probably has to do with other software like Maya or other simulation-heavy programs. In fact, F360 worked flawlessly on my 2016 dual-core 2.0 GHz CPU with 8 GB of RAM.
Tin-foil-hat thought: what if CUDA on macOS allowed Adobe products to outperform Final Cut's rendering times, and Apple doesn't want to jeopardize its own offering?
Could be, but Apple optimizes like crazy across its own hardware and software, so Apple products get crazy performance anyway. On top of that, Nvidia and Apple aren't on good terms, so they obviously weren't going to build a very powerful workstation around Nvidia hardware.
Raytracing, rendering, certain kinds of vector operations, physical simulations.
CUDA/OpenCL is all about writing massively parallel code that runs on the GPU instead of the CPU, so many trivially parallel problems usually get offloaded to it.
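As a rough illustration (not code from any of the apps mentioned above, and the sizes/launch configuration are just placeholder values), this is what a trivially parallel problem looks like as a minimal CUDA kernel: every GPU thread computes one array element independently, which is the same shape of work renderers and simulation solvers hand off to the GPU.

```cuda
// Minimal SAXPY sketch: y = a*x + y, one element per GPU thread.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];   // each thread handles one independent element
}

int main() {
    const int n = 1 << 20;               // ~1M elements, arbitrary size for the sketch
    float *x, *y;
    // Unified memory keeps the example short; explicit cudaMemcpy works too.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);         // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```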
What do 3D modelers use CUDA for? I know it gets used for machine learning, but I'm not familiar with what else.