
WebGPU Shading Language - pjmlp
https://gpuweb.github.io/gpuweb/wgsl.html
======
throwaway34241
Some context: browser vendors are working on a successor to WebGL, and there's
been a debate over what shading language to use.

A text based language would be easier for beginners (since they wouldn't need
to install a separate shader compiler) and would avoid sites that need dynamic
shaders having to download and run a shader compiler in the browser.

Using SPIR-V bytecode would benefit from a large amount of work put into an
existing standard and an open source compiler ecosystem that many companies
are already using.

It looks like a decision has been made to accept a text based language, but
one that is defined based on SPIR-V semantics and is easily convertible back
and forth from SPIR-V. This seems like a great trade off since if it works out
it should have the main advantages of both approaches.

~~~
Athas
Is there really such a great need to make the lowest level shader API more
accessible to beginners? You can always build things on top that make a low-
level API more friendly, but you can _never_ build something below the lowest
available level (short of inventing a new standard). What kind of beginners
are these that want to do low-level shader programming, but without any
tooling? Don't web programmers already accept the use of heavyweight build
tools even for their plain old ordinary JavaScript?

My enthusiasm for seeing yet another GPU language is very limited. I simply
cannot see the rational arguments in favour of this design. We were _so_ close
to having a standard IR for this stuff.

~~~
interartdev
Knowing Vulkan, WebGL, and OpenGL, and having worked a bit with some of those
technologies while hobbying with the others, I expect WebGPU to take about a
decade to be fully adopted, and even then only by the fraction of companies
willing to spend the manpower to develop something that interfaces with
low-level graphics directly.

~~~
username90
WebGPU is important for game engines, but it is currently missing a lot of
features, so you can't write the shaders you want.

------
Jasper_
So this is in a weird place. It's trying to be a floor wax and a dessert
topping, that is, both a compile target and a human-writable language, and the
committee is having trouble deciding on the vision and goals for the language.

You can see some of that contention here,
[https://github.com/gpuweb/gpuweb/issues/586](https://github.com/gpuweb/gpuweb/issues/586)
, along with other issues filed.

This is going to be a bumpy ride. I hope it turns out okay.

~~~
fulafel
Based on the current version of the "Goals" text, it's more of a SPIR-V
surface syntax: "Trivially convertible to SPIR-V"

Sounds like a good thing at first blush, if shaders can then be decompiled to
this in debuggers etc.

~~~
Jasper_
Apple is pushing back on that wording.
[https://github.com/gpuweb/gpuweb/issues/582](https://github.com/gpuweb/gpuweb/issues/582)

------
kvark
The funniest (and saddest) thing about WGSL is that we realized there is a lot
of confusion and tried to make a FAQ quickly to address that, but it got split
and stuck in reviews:

[https://github.com/gpuweb/gpuweb/pull/562](https://github.com/gpuweb/gpuweb/pull/562)

[https://github.com/gpuweb/gpuweb/pull/576](https://github.com/gpuweb/gpuweb/pull/576)

------
shmerl
So it's a new language for generating SPIR-V? Is this driven by shortcomings
of GLSL and HLSL?

~~~
ericflo
It's driven by Apple hating Khronos, so we all have to come up with a new,
non-Khronos thing if it's to be used in browsers made by Apple.

~~~
thewebcount
That seems a bit harsh. Isn't WebGPU to WebGL what Vulkan is to OpenGL? It's
the next generation of a more efficient 3D rendering pipeline.

~~~
ericflo
Yes, but if you follow the discussions, the clear consensus was to use SPIR-V.
Except Apple, who came up with all kinds of technical reasons not to use it,
before eventually admitting they have a legal dispute with Khronos group. Now
we get this weird language that is SPIR-V in text form but not by Khronos, and
suddenly there is consensus.

~~~
shmerl
_> before eventually admitting they have a legal dispute with Khronos group._

Interesting, that's the first time I've heard about it. Do you have some links on
the topic? What is the essence of the dispute?

~~~
ericflo
From the meeting notes [1]:

MS: Apple is not comfortable working under Khronos IP framework, because of
dispute between Apple Legal & Khronos which is private. Can’t talk about the
substance of this dispute. Can’t make any statement for Apple to agree to
Khronos IP framework. So

1\.
[https://docs.google.com/document/d/1F6ns6I3zs-2JL_dT9hOkX_25...](https://docs.google.com/document/d/1F6ns6I3zs-2JL_dT9hOkX_253vEtKxuUkTGvxpuv8Ac/edit)

~~~
shmerl
Weird, but at least it explains why they are causing a problem.

~~~
pjmlp
Apparently it goes back to how Apple wanted to evolve OpenCL after releasing
it to Khronos.

------
OceanSunfish
A webkit blogpost from 2018 claims SPIR-V may not be possible to fully secure
in a web context. I'm uncertain of the progress or if the original claim was
legitimate, but I believe the current experimental implementations of WebGPU
do not protect against raw gpu memory access.

[https://webkit.org/blog/8482/web-high-level-shading-language...](https://webkit.org/blog/8482/web-high-level-shading-language/)
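For context, the usual way implementations try to make shader memory access safe is to have the translator inject a bounds clamp on every indexed access, so an out-of-range index can never reach neighboring memory. A rough Python sketch of that idea (the function names are illustrative, not any real browser API):

```python
# Sketch: how a shader translator can guard buffer reads.
# A raw access trusts the index; a "robust" access clamps it into range first.

def raw_load(buffer, index):
    # In a real shader, an out-of-range index here could read adjacent GPU
    # memory; in this Python stand-in it would simply raise an IndexError.
    return buffer[index]

def robust_load(buffer, index):
    # Clamp the index into the buffer's bounds before loading, so every
    # access lands inside the buffer no matter what index was supplied.
    clamped = max(0, min(index, len(buffer) - 1))
    return buffer[clamped]

heap = [10, 20, 30, 40]
print(robust_load(heap, 999))   # clamped to the last element: 40
print(robust_load(heap, -5))    # clamped to the first element: 10
```

Whether such clamping (or the alternative of trapping/zero-filling) can be applied to arbitrary incoming SPIR-V without unacceptable cost is essentially the question the blog post raises.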

------
grenoire
Are we missing f64 and 64-bit integer types because of... JavaScript? Again!?

~~~
Jasper_
GPUs rarely have double-precision arithmetic.

~~~
rrss
Which GPUs do not have double precision arithmetic? As far as I'm aware, all
of AMD's and Nvidia's from the last decade have double precision support. It
may not be fast, but it's there.

~~~
Jasper_
Right, the desktop GPUs have support for float64, but to the best of my
knowledge it's not exposed in HLSL/GLSL. "precision highp float" in GLSL will
get you 32-bit on all vendors AFAIK; you have to use OpenCL/CUDA for 64-bit
floats, and even there it's way slower and sometimes actually emulated.

None of the mobile GPU vendors (Apple, ARM Mali, Qualcomm, PVR, Broadcom)
support it. You can see that here, where the only vendors supporting
shaderFloat64 in Vulkan are desktop GPUs.
[http://vulkan.gpuinfo.org/listdevicescoverage.php?feature=sh...](http://vulkan.gpuinfo.org/listdevicescoverage.php?feature=shaderFloat64&platform=android)

~~~
yoden
64-bit support has been around on the desktop for a decade at this point
(through OpenGL 4 / GL_ARB_gpu_shader_fp64 / cl_khr_fp64).

Unfortunately it is not exposed in Firefox or Chrome, even if the underlying
hardware supports it. I think this is creating a chicken and egg problem - if
WebGL were to actually support the capabilities of the underlying hardware,
maybe there would be more interest in deploying 64-bit extensions on other
platforms like mobile. Even emulated support would be a win, compared to
having every developer emulate fp64/i64 themselves in GLSL.

So I am also disappointed in this proposal's failure to mention 64-bit
support, even as an extension. At least they're reserving the keywords, I
guess.
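For reference, the hand-rolled fp64 emulation mentioned above is typically the "double-single" trick: a value is stored as an unevaluated sum of two f32s, combined with error-free transformations such as Knuth's two-sum. A rough Python sketch with NumPy float32 standing in for GPU floats (illustrative only, not any shipping shader library):

```python
import numpy as np

f32 = np.float32  # stand-in for a GPU's 32-bit float

def split(x):
    """Represent a Python double as hi + lo, where both halves are f32."""
    hi = f32(x)
    lo = f32(x - float(hi))  # captures the rounding error of hi
    return hi, lo

def two_sum(a, b):
    """Knuth's error-free addition: returns (s, e) with a + b == s + e."""
    s = f32(a + b)
    v = f32(s - a)
    e = f32(f32(a - f32(s - v)) + f32(b - v))
    return s, e

# A double-single value recovers far more of pi than a lone f32 does.
pi = 3.141592653589793
hi, lo = split(pi)
single_err = abs(float(hi) - pi)
double_single_err = abs((float(hi) + float(lo)) - pi)
```

The same two-sum building block extends to full double-single add/mul routines, which is roughly what shader authors end up writing by hand in GLSL when the hardware's fp64 isn't exposed.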

------
sandov
That syntax is freaking ugly. Should've made it more like C and GLSL.

~~~
blackpawn
That was my first thought as well. I wonder what the rationale for this
divergent syntax is.

------
nullifidian
We need optional SPIR-V shaders in Chrome's and Firefox's WebGPU
implementations, so that websites that use it wouldn't work on retrograde NIH
text-based Apple platforms, forcing them to implement SPIR-V eventually.

Such standards politicking and creation of unneeded entities/standards is
basically a "tech sin". They are sinners!

~~~
kvark
There are no technical issues with us having an extension that allows SPIR-V
binaries to come in. It's a political issue, however, since it weakens WGSL's
position. We'll be talking about this more with Google.

------
mschuetz
I just wish it had support for double and int64. There's so much great stuff
you can do with 64-bit atomic operations that isn't possible with 32-bit
atomics.
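For example, one well-known trick that 64-bit atomics enable is packing depth into the high 32 bits and a payload (say, a color) into the low 32 bits, so a single atomicMin per pixel resolves the nearest fragment. A rough Python sketch of the packing logic (`min()` stands in for the GPU atomic; all names and values are illustrative):

```python
# Sketch: depth + payload packed into one 64-bit word, resolved with min().
# On a GPU this would be one atomicMin on a u64 per pixel; with only 32-bit
# atomics there is no room for both depth and payload in a single word.

def pack(depth_bits, color):
    # Depth goes in the high 32 bits so integer ordering == depth ordering.
    return (depth_bits << 32) | color

def unpack(word):
    return word >> 32, word & 0xFFFFFFFF

pixel = 0xFFFFFFFF_FFFFFFFF  # "infinitely far" initial value
for depth, color in [(300, 0xFF0000), (100, 0x00FF00), (200, 0x0000FF)]:
    pixel = min(pixel, pack(depth, color))  # stands in for atomicMin

nearest_depth, nearest_color = unpack(pixel)
# the nearest fragment (depth 100, color 0x00FF00) wins
```

This is the kind of software-rasterization scheme that simply can't be done race-free with 32-bit atomics, since depth and payload would have to live in separate words.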

