- Firefox 40.0: 1,000 bunnies at 12 FPS
- Firefox 40.0: 10,000 bunnies at 12 FPS
- Firefox 40.0: 20,000 bunnies at 12 FPS
- Firefox 40.0: 30,000 bunnies at 12 FPS
- Chrome 44.0: 1,000 bunnies at 60 FPS
- Chrome 44.0: 10,000 bunnies at 60 FPS
- Chrome 44.0: 30,000 bunnies at 30 FPS
Why is there such an extreme difference between Chrome and Firefox?
about:support in Firefox says "GPU Accelerated Windows: 0/3", so I guess it is somehow not using the GPU, although then I'm surprised it can render 30,000 bunnies at 12 FPS. On the other hand, it says "WebGL Renderer: NVIDIA Corporation -- GeForce GT 630/PCIe/SSE2", so I guess it is using the GPU. I would also expect this WebGL demo to not run at all without FF using the GPU. Strange. Any ideas?
On Windows, both Firefox and Chrome use ANGLE [0] to translate WebGL into DirectX calls, so this will be an issue in the browser, not the driver or hardware.
The bunnies are being rendered on your GPU, but the final frames are being slowly composited in software. There is no fast path to quickly composite WebGL output on Linux yet.
The FPS counter doesn't appear on my smartphone (falcon, CM12.1 Browser), but I would estimate the performance at about 10-15 FPS with the default number of sprites, let alone the full 300,000.
For me (nVidia GTX 980), Chrome beats Firefox on Windows 8.1 by a lot when rendering 300,000 bunnies.
- Chrome 44: 100 FPS
- Firefox 40: 60 FPS
- IE 11: 10 FPS
I don't think that FF is limiting itself to 60 FPS, as I have a 144 Hz monitor and other WebGL demos can render at frame rates well in excess of 60 FPS in Firefox.
That is odd, because the three.js demos I run report well in excess of 60 FPS in FF when my monitor's refresh rate is set to 144 Hz. The stats.js component only measures up to 100 FPS, and the simple demos often exceed that.
Edit: I also have G-SYNC enabled, and I noticed that the bug thread says FF syncs to vsync on Windows. Could nVidia's G-SYNC be interfering with vsync and allowing higher than 60 FPS on supported monitors?
Chrome 44.
The trick is to create one mesh with 300,000 or more planes and render them all at once.
I pass the x/y data to a texture and sample it in the shader.
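Roughly, the idea looks like the sketch below. This is not the author's actual code, just a minimal illustration assuming a recent three.js API; the names N, TEX_SIZE, posTexture, and spriteIndex are made up for the example. All sprites live in one BufferGeometry, every vertex carries the index of its sprite, and the vertex shader reads that sprite's x/y from a DataTexture, so the whole batch is drawn as a single mesh.

    import * as THREE from 'three';

    const N = 300000;      // number of sprites
    const TEX_SIZE = 1024; // 1024 * 1024 texels >= 300,000, one texel per sprite

    // RGBA float texture holding per-sprite positions (x in .r, y in .g).
    // Note: float textures need OES_texture_float under WebGL 1.
    const posData = new Float32Array(TEX_SIZE * TEX_SIZE * 4);
    const posTexture = new THREE.DataTexture(
      posData, TEX_SIZE, TEX_SIZE, THREE.RGBAFormat, THREE.FloatType
    );
    posTexture.needsUpdate = true;

    // One geometry containing N unit quads (two triangles each); every vertex
    // stores the index of the sprite it belongs to.
    const corners = [
      [-0.5, -0.5], [0.5, -0.5], [0.5, 0.5],
      [-0.5, -0.5], [0.5, 0.5], [-0.5, 0.5],
    ];
    const positions = new Float32Array(N * 6 * 3);
    const spriteIndex = new Float32Array(N * 6);
    for (let i = 0; i < N; i++) {
      for (let c = 0; c < 6; c++) {
        const v = i * 6 + c;
        positions[v * 3] = corners[c][0];
        positions[v * 3 + 1] = corners[c][1];
        spriteIndex[v] = i;
      }
    }
    const geometry = new THREE.BufferGeometry();
    geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
    geometry.setAttribute('spriteIndex', new THREE.BufferAttribute(spriteIndex, 1));

    // The vertex shader offsets each quad by its sprite's position from the
    // texture, so the whole batch renders as one mesh.
    const material = new THREE.ShaderMaterial({
      uniforms: {
        posTexture: { value: posTexture },
        texSize: { value: TEX_SIZE },
      },
      vertexShader: `
        attribute float spriteIndex;
        uniform sampler2D posTexture;
        uniform float texSize;
        void main() {
          vec2 uv = vec2(
            (mod(spriteIndex, texSize) + 0.5) / texSize,
            (floor(spriteIndex / texSize) + 0.5) / texSize
          );
          vec2 spritePos = texture2D(posTexture, uv).xy;
          gl_Position = projectionMatrix * modelViewMatrix *
                        vec4(position + vec3(spritePos, 0.0), 1.0);
        }
      `,
      fragmentShader: `void main() { gl_FragColor = vec4(1.0); }`,
    });

    const mesh = new THREE.Mesh(geometry, material);
    mesh.frustumCulled = false; // sprite positions are computed in the shader

Each frame you would rewrite posData and set posTexture.needsUpdate = true, so moving all 300,000 sprites costs one texture upload instead of rebuilding any geometry.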
I think it's meant to be a stress test of their particular way of rendering Haxe sprites, not a measure of WebGL or browser performance.
It also seems ripe for tuning; it appears to leak a DOM object every frame and churn through memory at a pretty epic rate :) (at least according to Chrome dev tools).