You need a browser with WebGL support. Tested on IE11, Firefox, Chrome, Edge, and Chromium.
I assume only a subset of OpenGL calls can be sent, and things like 3D volume textures would eventually require WebGL 2?
Or the WebGL polyfill created with Mesa and Emscripten.
I had to search to find out for sure what he was talking about: a Qt application runs a small, local web server, streaming UI calls to a WebGL context on a page you view in your browser of choice. http://blog.qt.io/blog/2017/02/22/qt-quick-webgl-streaming/
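From the blog post, the feature is enabled at launch time via Qt's platform plugin switch rather than in application code. A minimal sketch of how you'd run it (the app name and port number here are placeholders):

```shell
# Build the Qt Quick app as usual, then launch it with the WebGL
# platform plugin instead of the native one (requires a Qt build
# with WebGL streaming support, Qt 5.8+ tech preview):
./myapp -platform webgl:port=8998

# Then open a WebGL-capable browser on any machine on the network:
#   http://<device-ip>:8998/
```

The same binary still runs natively if you omit the `-platform` switch, which is what makes this attractive for embedded devices.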
This differs from Electron in that your backend is C++ instead of JS, your UI is Qt instead of HTML, and the runtime doesn't embed the browser. I'm just guessing, but I suspect you also don't develop any code in the browser client context.
I've also personally found value in running Electron from a page in a local HTTP server process, so I don't find that part too different, but YMMV.
Maybe it's more appropriate to say this is like per-app RDP via WebGL.
(It also doesn't seem useful to compare it to Electron-based apps. The Qt comparison to an Electron app would be a Qt/QML app that uses Qt's WebEngineView, a control that embeds the Chromium browser.)
It removes the need to develop a remote API/RPC interface, the need to make the app so modular that you can detach its UI completely, the need to build, test, and distribute desktop releases, etc. It is also easier for the customers ("set password on device, put IP address in browser, done") and likely requires less support. It probably also works much better on tablets (which are becoming very common in many industries, at least where the work environments aren't too dirty); industrial control UIs have used touch input for a long time, and tend to use screens of sizes comparable to a tablet.
At the time, what I wanted to build was an app store that worked across browsers and platforms.
The method I tried was a mixture of VNC and H.264 frames: for interactive surfaces, such as a QOpenGLWidget, I sent H.264, but for static content like a button, I used a VNC-like approach.
All UI interaction requires a round trip from the browser to the server, with a whole new set of GL commands sent back to the browser. It's a fancier version of VNC ultimately.
It would never work well over the Internet; it might be tolerable on a LAN.
I'm not giving it a hard time; I legitimately can't see the use case and want to know.
You see, not all software is ideal to implement with pure web technology, although the limitations are quickly going away.
One example is expensive 3D modelling tools, such as Maya and 3ds Max. They are still desktop apps, and they suffer from high piracy rates due to their prices. This is a negative feedback loop: the higher you price them, the more people pirate them.
So this streaming Qt thing is a solution for quickly turning a conventional desktop app into a web service you can charge a subscription for.
It combines the power of a server machine with the convenience of the web, since the web is not as capable as a desktop app at 3D rendering, threading, and direct access to GPU hardware.
Streaming the software UI is already a viable solution for SaaS; as far as I know, both Office 365 and the online version of LibreOffice use this approach.
What I wanted to do was build a platform to make this easy. But I eventually gave up, because I think the biggest developers who would adopt it are heavy 3D graphics companies, which narrows down to Adobe and Autodesk, as computer graphics is very labor-intensive and monopolized. Both are big enough to develop their own solutions.
Just to clarify, Qt WebGL Streaming is not a video streaming solution, so the rendering still happens in the browser via WebGL. The hardware requirements to run a graphics app with this feature would be no lower than they would be to just run the Qt app natively, so the only benefit is the portability and accessibility of the web.
It doesn't take much horsepower just to open an application like Blender, but it might be useful to run it on a powerful server and access it from a lighter, more power-efficient laptop.
If you search this page, you will find my other threads mentioning my approach, which was based on video streaming.
This way you can present, preview, annotate, and comment on a model, texture, or game level with, say, a third-party contractor, and create a channel for communicating changes, visual diffs, notes, etc.
2-3 years ago I had to retrofit an existing game engine's tools into a small package (I was able to get it down to 1 GB) and distribute it to each contractor.
This might be better.
They never mentioned that they don't support OpenGL 4.5 with this WebGL backend, so I guess they achieved it by rendering the OpenGL 4.5 context into a texture and sending that texture's contents to the web. But this is just a guess.
I see that going down well with your actually paying customers :)
Remember, you're not gaining money when you screw over paying customers to hunt people who weren't going to pay anyway.
Plenty of apps benefit from splitting work between the client and the server.
With really shitty interaction latency.
Is lack of viable adoption by users the only reason you gave the project up?
Is this a video stream of a Qt UI being rendered on the server ad hoc for the client?
Is this a Qt UI being converted to WebGL (planes, meshes, etc.) on the fly?
Is this a Qt UI compiled to WebGL?
The program's UI state and data all live server-side, but the actual rendering happens client-side. Instead of sending the client a stream of X11 commands, it sends OpenGL/WebGL drawing instructions, and the client sends input events back to the server.
It's basically a way to remotely view a Qt app. Like VNC but faster and better integrated into browsers.
What are the instructions being sent to the browser? "Paint this text here", "paint this surface here", etc.?
It would be interesting to see a local demo that pairs with WebAssembly, though.