There is already at least one WebGPU implementation in Rust (the one Firefox uses), so they could use that if they wanted to. I guess it's probably better for the overall health of the ecosystem if there are multiple implementations, though.
Yup, wgpu is already a thing. Ironically, while it's widely used on desktop, it's less mature in the browser context. Like, there is an open-world 3D MMORPG using wgpu for its graphics, yet WebGPU still isn't enabled in stable Firefox.
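To give a feel for the API, here's a rough sketch of requesting an adapter with wgpu (using the `wgpu` and `pollster` crates; descriptor fields and return types shift between wgpu releases, so treat it as illustrative rather than definitive). On desktop this resolves to Vulkan/Metal/DX12; compiled to wasm in a browser with WebGPU enabled, the same call goes through the browser's implementation.

```rust
// Illustrative sketch only -- wgpu's API details change between releases.
fn main() {
    // Create an instance with default settings (all available backends).
    let instance = wgpu::Instance::default();

    // Ask for any available GPU adapter; pollster::block_on drives the
    // async request to completion on native targets.
    let adapter = pollster::block_on(
        instance.request_adapter(&wgpu::RequestAdapterOptions::default()),
    )
    .expect("no suitable GPU adapter found");

    let info = adapter.get_info();
    println!("adapter: {} / backend: {:?}", info.name, info.backend);
}
```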
I'm not sure whether many different implementations is inherently good, though.
You can say the same thing about something as simple as "shared memory" -- normal multiprocessing computers have had shared memory since time immemorial, but browsers literally disabled SharedArrayBuffer from 2018 to 2020, and anyone using it to communicate with Web Workers had to find another way. Browsers run a 24/7 onslaught of extremely untrustworthy code, whereas games only run themselves.
Firefox has not enabled WebGPU via wgpu for the same reasons Chrome Security has done an in-depth review of Dawn. It is a component that must be hardened. For anyone out there trying it out by enabling config flags, remember to disable it once you are done. It will be ready in time.
I would love to hear about an implementation of multiplayer that receives code from hostile opponents and executes it, but I do not anticipate you'll find many examples.
> SV_SteamAuthClient in various Activision Infinity Ward Call of Duty games before 2015-08-11 is missing a size check when reading authBlob data into a buffer, which allows one to execute code on the remote target machine when sending a steam authentication request. This affects Call of Duty: Modern Warfare 2, Call of Duty: Modern Warfare 3, Call of Duty: Ghosts, Call of Duty: Advanced Warfare, Call of Duty: Black Ops 1, and Call of Duty: Black Ops 2.
In case this needs to be pointed out, an RCE in a game is an accident, not the way they designed their multiplayer to work. I was describing why the Firefox team might wait for a feature to be security-hardened before releasing it. The answer remains the same -- they design and market the thing to be secure even when it executes untrusted code. Activision does not advertise their games as able to "securely execute RCE gadgets from maliciously crafted Steam authentication packets". This part may be surprising: the Chrome and Firefox teams do, in fact, try to ensure that even when someone gains RCE, the code runs in a sandbox and can't get very far.
I am not attempting to claim that games do not have security issues or cannot experience remote code execution, just that this is not a normal pattern of behaviour they plan for, so it is natural that a game author would ship wgpu long before Firefox does (while Firefox spends a lot of effort on fuzzing, etc.). If anything, a terrible CVE that Activision has apparently expended zero resources on fixing is a very good example of what I'm talking about.
https://veloren.net/
I'm a bit partial since I'm a former contributor, but I think it's super cool.
Aside from that, the Bevy game engine also uses wgpu outside the web, but afaik no game of particular significance or player base has shipped with it yet. I think the biggest user of it is actually a software tool for mining (the hardhat kind), but it's a "call us for a quote" kind of thing, so it's hard to tell how big it is.
That's kind of irrelevant to the adoption potential of WebGPU.
Those examples you gave are not comparable at all: ActiveX and Flash are way, way higher level and don't operate anything like graphics API middleware.
PNaCl was an alternative design to WASM, which thankfully lost out, as WASM is much more flexible.
My point is that WebGPU is way better positioned than any middleware: it has industry backing and official support from all relevant platforms (or plans for it).
It's also benefitted a huge amount from hindsight.
Just wanted to point out that wgpu has both WebGPU and WebGL2 backends, so currently most Rust apps running in Firefox use the WebGL2 backend via wgpu.
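To make that concrete, here's a rough sketch of how a wasm-targeted Rust app might choose between the two (the `make_instance` helper and `prefer_webgpu` flag are made-up names for illustration, and the exact `InstanceDescriptor` shape varies between wgpu versions):

```rust
// Sketch only: field names, and whether the descriptor is passed by value or
// by reference, differ between wgpu releases.
fn make_instance(prefer_webgpu: bool) -> wgpu::Instance {
    let backends = if prefer_webgpu {
        // The browser's native WebGPU implementation, where it's actually enabled.
        wgpu::Backends::BROWSER_WEBGPU
    } else {
        // The GLES backend, which maps to WebGL2 when compiled for wasm
        // (needs wgpu's `webgl` cargo feature on that target).
        wgpu::Backends::GL
    };
    wgpu::Instance::new(wgpu::InstanceDescriptor {
        backends,
        ..Default::default()
    })
}
```

On stable Firefox today that generally means taking the GL/WebGL2 path, even though the app is written against the WebGPU-shaped wgpu API.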