
WebGL2 Fundamentals - indescions_2018
https://webgl2fundamentals.org/
======
mrspeaker
This is really a fantastic resource - it doesn't hide too much away behind
"helper" libraries so you understand where everything is coming from. It's the
only beginner resource I've found that covers drawing _multiple_ things: that
was always my stumbling block with WebGL!

When you've exhausted this and are looking for next steps to try, I recommend
this youtube series by Sketchpunk Labs:
[https://www.youtube.com/watch?v=LtFujAtKM5I&list=PLMinhigDWz...](https://www.youtube.com/watch?v=LtFujAtKM5I&list=PLMinhigDWz6emRKVkVIEAaePW7vtIkaIF)

~~~
davidwparker
Shameless self-plug: you should check out my YouTube channel. I have a
playlist where I cover WebGL (not 2):
[https://www.youtube.com/watch?v=oDiSqQT_szo&list=PLPqKsyEGhU...](https://www.youtube.com/watch?v=oDiSqQT_szo&list=PLPqKsyEGhUnaOdIFLKvdkXAQWD4DoXnFl).
I've done 108 WebGL screencasts, everything is built up slowly over time from
scratch (so nothing is hidden away), and it covers a lot for beginners,
including drawing multiple things. Cheers!

------
typon
Can someone explain why the API design is such that it takes roughly 100
lines of code and understanding of many different concepts (buffers,
vertex/fragment shaders, uniforms, attributes, etc.) to draw a monochrome
triangle on a canvas?

I'm not trying to be snarky or say WebGL sucks (I have very little experience
with it), I'm just wondering why the design choices have culminated into such
an API?

~~~
Jasper_
Because this is what a modern programmable pipeline looks like. You can go
from that same ~100 lines of code to 80,000 triangles and have it still run
well.

You need:

1. A way to send opaque data to the GPU (this is a buffer)

2. A way to programmatically transform that data (this is a shader)

3. A way to specify input parameters to those transforms not found in the
data (a uniform)

4. A way to decompose that opaque data into tangible values (an attribute)

With not much more code, you can go from that to something like e.g.
[http://magcius.github.io/pbrtview/](http://magcius.github.io/pbrtview/) using
the same exact primitives. This is an example of a custom WebGL2 engine (not
using three.js) in action. Leaf through
[https://github.com/magcius/pbrtview/blob/gh-pages/src/models...](https://github.com/magcius/pbrtview/blob/gh-pages/src/models.ts) and you'll still see the same createBuffer, bufferData,
drawArrays. There's not much more that I added on top.
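
Those four primitives map to only a handful of calls. Here's a minimal sketch of the monochrome-triangle case, assuming a page with a `<canvas>` element; the GLSL names (`a_position`, `u_offset`) and the color are made up for illustration:

```javascript
// Minimal WebGL2 triangle: one buffer, one shader pair, one uniform,
// one attribute. Shader sources are plain GLSL ES 3.00 strings.
const vsSource = `#version 300 es
in vec2 a_position;      // attribute: per-vertex data pulled from the buffer
uniform vec2 u_offset;   // uniform: one value shared by every vertex
void main() {
  gl_Position = vec4(a_position + u_offset, 0.0, 1.0);
}`;

const fsSource = `#version 300 es
precision highp float;
out vec4 outColor;
void main() { outColor = vec4(1.0, 0.5, 0.0, 1.0); }  // monochrome orange
`;

// Three 2D vertices of the triangle, interleaved as x0,y0, x1,y1, x2,y2.
const positions = new Float32Array([0, 0.5, -0.5, -0.5, 0.5, -0.5]);

// The GL calls only make sense in a browser, so they're guarded here.
if (typeof document !== 'undefined') {
  const gl = document.querySelector('canvas').getContext('webgl2');

  // 1. Buffer: opaque bytes uploaded to the GPU.
  const buf = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buf);
  gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);

  // 2. Shaders: programs that transform that data.
  const prog = gl.createProgram();
  for (const [type, src] of [[gl.VERTEX_SHADER, vsSource],
                             [gl.FRAGMENT_SHADER, fsSource]]) {
    const sh = gl.createShader(type);
    gl.shaderSource(sh, src);
    gl.compileShader(sh);
    gl.attachShader(prog, sh);
  }
  gl.linkProgram(prog);
  gl.useProgram(prog);

  // 3. Uniform: an input parameter not found in the vertex data.
  gl.uniform2f(gl.getUniformLocation(prog, 'u_offset'), 0, 0);

  // 4. Attribute: how to decompose the buffer into per-vertex values.
  const loc = gl.getAttribLocation(prog, 'a_position');
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

  gl.drawArrays(gl.TRIANGLES, 0, 3);
}
```

Error checking (compile/link status queries) is omitted for brevity; a real program would check both.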

~~~
meritt
You seem to have a good grasp on modern 3D programming -- Do you have any
books recommendations that provide a solid overview of the topic?

~~~
Jasper_
My two biggest shelf pulls are Real-Time Rendering [0] by Akenine-Möller,
Haines, and Hoffman, and 3D Math Primer for Graphics and Game Development [1]
by Dunn & Parberry.

[0] [https://www.amazon.com/Real-Time-Rendering-Third-Tomas-Akeni...](https://www.amazon.com/Real-Time-Rendering-Third-Tomas-Akenine-Moller/dp/1568814240)

[1] [https://www.amazon.com/Math-Primer-Graphics-Game-Development...](https://www.amazon.com/Math-Primer-Graphics-Game-Development/dp/1568817231/)

I also read a ton of presentations and papers. Highly recommend the famous PBR
SIGGRAPH course notes [0], especially the intro to light & physics by Naty
Hoffman. GPU-Driven Rendering Pipelines [1] is another recent goodie.

[0] [http://blog.selfshadow.com/publications/s2013-shading-course...](http://blog.selfshadow.com/publications/s2013-shading-course/)

[1] [http://advances.realtimerendering.com/s2015/aaltonenhaar_sig...](http://advances.realtimerendering.com/s2015/aaltonenhaar_siggraph2015_combined_final_footer_220dpi.pdf)

------
transcranial
See [https://webgl2fundamentals.org/webgl/lessons/webgl2-whats-ne...](https://webgl2fundamentals.org/webgl/lessons/webgl2-whats-new.html) for
an overview of new (well, to WebGL at least) features.

Direct texel lookups and the expanded texture formats have been amazing for
using WebGL 2 for GPGPU purposes.
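
For the unfamiliar: `texelFetch` is the GLSL ES 3.00 function that enables this. It reads a texel by integer coordinate, with no filtering and no normalized UVs, so a float texture behaves like a plain 2D array. A sketch of the pattern, with the shader held as a JS string; the uniform name and the computation are made up:

```javascript
// GPGPU-style fragment shader: treat an RGBA32F texture as a data array,
// read each element exactly with texelFetch, write a transformed value out.
const gpgpuFragment = `#version 300 es
precision highp float;
uniform sampler2D u_data;   // e.g. an RGBA32F texture holding raw floats
out vec4 outColor;
void main() {
  ivec2 texel = ivec2(gl_FragCoord.xy);    // exact integer coordinate
  vec4 v = texelFetch(u_data, texel, 0);   // direct lookup, mip level 0
  outColor = v * 2.0;                      // some per-element computation
}`;
```

One caveat worth knowing: uploading float textures is core WebGL2, but rendering the results into a float color attachment still requires the EXT_color_buffer_float extension.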

------
tw1010
WebGL has been around for quite a while now, but I never see any websites
using it. Surely they exist somewhere? What have I been missing? Anyone have a
good source of links to things using WebGL?

~~~
tlackemann
Most browser games not written in Flash are likely taking advantage of
WebGL [1].

My personal favorite is the Canada Banknote[2]

[1] [http://www.pixijs.com/](http://www.pixijs.com/)

[2]
[https://www.bankofcanada.ca/banknotes/banknote150/](https://www.bankofcanada.ca/banknotes/banknote150/)

~~~
Theodores
From that I learned about Nunavut, this was not on the maps I had as a kid.

[https://en.wikipedia.org/wiki/Nunavut](https://en.wikipedia.org/wiki/Nunavut)

Fascinating.

------
franciscop
The text on the main page is difficult to read with random 3D text on its
background.

~~~
kennethkl
i agree -- couldn't bear trying to read it, closed the tab immediately.

------
Exuma
This is really awesome, I will be going through this for sure.

------
Jyaif
Does anybody know what the penetration numbers are for WebGL and WebGL2?

~~~
jsheard
According to webglstats.com, WebGL is at 97% and WebGL2 is at 39%.

Unfortunately, neither Safari nor Edge supports WebGL2 by default yet.

~~~
pjmlp
And the support is pretty lame on most mobile devices.

It might be supported, but fast it is not.

My Asus, LG, Nokia and Samsung devices don't have any hiccups with OpenGL ES
3.x games, however most WebGL demos are janky on them.

~~~
jsheard
I wonder how much of that performance difference is due to "Three.js vs
Unity/Unreal" rather than "WebGL vs Native GLES".

AFAIK Three.js is still missing standard optimizations that are taken for
granted in a "real" engine, like batching and occlusion culling, and some
optimizations it does support are scarcely used due to poor tooling. Pre-baked
lighting and compressed textures are technically supported but there's no easy
workflow to pre-generate the necessary data.

~~~
debacle
Are there any other libraries that aren't Unity/Unreal that are better than
Three.js?

~~~
jsheard
I think PlayCanvas is the most robust engine that's built specifically for the
web. Only the core runtime is open-source though, to get the full benefit you
need to use their proprietary editor and asset pipeline.

------
tamriel
I would actually just recommend diving into THREE.js.

It's truly one of the best documented and most "obvious" codebases out there.
If all you've got is a solid understanding of JS, you can just start hacking
up something nontrivial on THREE.js and you'll come out of it knowing most of
WebGL.

And in doing so you'll eventually realize that even WebGL itself is quite
"high level" -- there is a _huge_ amount of abstraction that you take for
granted in the browser.

~~~
wwwtyro
> you can just start hacking up something nontrivial on THREE.js and you'll
> come out of it knowing most of WebGL.

Sorry, but this is wildly untrue. I don't really know where to begin, but you
wouldn't even need to touch shaders to accomplish that, so no, you wouldn't
know most of WebGL. THREE.js is very much a thick abstraction over WebGL.

> And in doing so you'll eventually realize that even WebGL itself is quite
> "high level" -- there is a _huge_ amount of abstraction that you take for
> granted in the browser.

Also not accurate. WebGL is a -very- thin wrapper over OpenGL ES, which is
itself as low level as you can go without stepping into something like Vulkan.
There's almost no abstraction.

~~~
TazeTSchnitzel
> WebGL is a -very- thin wrapper over OpenGL ES

This is true. From experience I can say that the core experience of using
WebGL in JS is very similar to using OpenGL ES in C.

That being said, WebGL and the browser do a little more work for you than the
GL API does in C, things you'd otherwise need a helper library for. In
particular, there's no shared library or header file hell, context creation is
easy, image import is easy, and JS will handle memory allocation for you. It
eliminates a lot of the annoying stuff needed to get to writing the actual GL
code, though you still have to go through the state machine hell of GL itself
:)
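
To illustrate the image-import point concretely: the browser decodes the file format for you, so a texture loader is a few lines. A sketch; the helper name is made up:

```javascript
// Upload an image file as a WebGL texture. The browser handles PNG/JPEG
// decoding via the Image element -- no image library needed, unlike in C.
function loadTexture(gl, url) {
  const tex = gl.createTexture();
  const img = new Image();
  img.onload = () => {
    gl.bindTexture(gl.TEXTURE_2D, tex);
    // The decoded Image can be passed straight to texImage2D.
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);
    gl.generateMipmap(gl.TEXTURE_2D);
  };
  img.src = url;
  return tex;   // valid immediately; filled in once onload fires
}
```

Context creation is similarly a one-liner (`canvas.getContext('webgl2')`), versus the window-system plumbing (EGL/GLX/WGL) you'd deal with in C.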

~~~
pjmlp
While true, only beginners actually use raw GL calls, even in C.

Anyone with experience quickly builds up a mini-engine to handle loading
shaders, images, fonts, and models, tying everything together per model in a
mini scene graph, and working around driver- and GPU-specific bugs...
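
The shader-loading piece of that mini-engine is usually the first thing everyone writes. A sketch of the typical WebGL version, with compile/link error reporting instead of silent failure (the helper name is made up):

```javascript
// Compile a vertex/fragment shader pair and link them into a program,
// surfacing driver error logs as exceptions instead of failing silently.
function buildProgram(gl, vsSource, fsSource) {
  const compile = (type, src) => {
    const sh = gl.createShader(type);
    gl.shaderSource(sh, src);
    gl.compileShader(sh);
    if (!gl.getShaderParameter(sh, gl.COMPILE_STATUS)) {
      throw new Error(gl.getShaderInfoLog(sh));  // driver-specific message
    }
    return sh;
  };
  const prog = gl.createProgram();
  gl.attachShader(prog, compile(gl.VERTEX_SHADER, vsSource));
  gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fsSource));
  gl.linkProgram(prog);
  if (!gl.getProgramParameter(prog, gl.LINK_STATUS)) {
    throw new Error(gl.getProgramInfoLog(prog));
  }
  return prog;
}
```

From there the layers accumulate: caching programs by source, wrapping uniform lookups, and so on, which is how every project grows its own half-engine.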

------
madez
Seriously, we are having immense trouble with securing CPU virtualization and
now we are talking about GPU virtualization for unreviewed, unaccounted,
automatically downloaded and executed code? Stop this madness.

Use native code and native APIs like OpenGL or Vulkan.

~~~
roblabla
Care to explain how GPU virtualization is somehow worse/harder than CPU
virtualization ?

~~~
madez
I didn't say that. It's more fuel on the fire.

~~~
albertgoeswoof
Brb, turning off all my cpus and gpus, can’t be adding more fuel!

~~~
madez
You seem to misunderstand. It's not about not using CPUs or GPUs, but about
what code runs on them and how that code is accounted for. Who vetted it,
and who is responsible for it?

