Learning WebGL (learningwebgl.com)
47 points by franze on April 26, 2011 | 32 comments


How do you "install" a game?

For example, let's say you have 150MB of art assets. You wouldn't want players to have to re-download the assets each time they go to play.

So how would you cache: 1) a texture, 2) a vertex+index buffer?


Check out the HTML5 LocalFileSystem APIs. They aren't widely available yet (Chrome 9+, behind a flag) but they are exactly what you want for this kind of thing. http://www.html5rocks.com/tutorials/file/filesystem/
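For the curious, here's a sketch of what that looks like with the prefixed Chrome API (the asset name and helper names are made up; this only works where the flag mentioned above is enabled):

```javascript
// Save a downloaded art asset into the sandboxed filesystem so it
// survives reloads. Sketch only: 'textures/brick.png' is a made-up
// asset name, and the API is Chrome-prefixed and behind a flag.
function saveAsset(name, blob, onDone) {
  var requestFS = window.requestFileSystem || window.webkitRequestFileSystem;
  requestFS(window.PERSISTENT, 150 * 1024 * 1024, function (fs) {
    fs.root.getFile(name, { create: true }, function (fileEntry) {
      fileEntry.createWriter(function (writer) {
        writer.onwriteend = onDone;
        writer.write(blob); // blob holds the downloaded bytes
      });
    });
  });
}

function loadAsset(name, onLoaded) {
  var requestFS = window.requestFileSystem || window.webkitRequestFileSystem;
  requestFS(window.PERSISTENT, 150 * 1024 * 1024, function (fs) {
    fs.root.getFile(name, {}, function (fileEntry) {
      fileEntry.file(onLoaded); // hands back a File (a readable Blob)
    });
  });
}
```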


Also, I should mention the Application Cache (around 5 MB of offline storage). Very useful for static game assets: http://www.html5rocks.com/tutorials/appcache/beginner/
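A cache manifest for static assets might look like this (the file names are illustrative); the page opts in via `<html manifest="game.appcache">`:

```
CACHE MANIFEST
# v1 -- bump this comment to force clients to re-download everything

CACHE:
textures/brick.png
models/ship.json
shaders/basic.vert

NETWORK:
*
```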


Eh, it's moderately useful. 5 MB is nothing these days — that's smaller than the L3 cache on a modern processor. If your game is only 5 MB, you hardly even need caching.


The browser caches assets like that automatically.

Alternatively, if you want something more install-like, you might have luck with local storage: http://diveintohtml5.org/storage.html

Local storage seems to currently max out at 5 MB, but that's easy for the browser vendors to change if they see the need.
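As a sketch, caching a small asset as a data URL might look like this (asset and helper names are made up; the in-memory fallback at the top exists only so the snippet also runs outside a browser):

```javascript
// Cache a small asset (e.g. a texture as a base64 data URL) in
// localStorage so it survives reloads. Sketch only.
var store;
if (typeof localStorage !== 'undefined') {
  store = localStorage;
} else {
  // In-memory stand-in so the snippet runs outside a browser.
  var mem = {};
  store = {
    getItem: function (k) {
      return Object.prototype.hasOwnProperty.call(mem, k) ? mem[k] : null;
    },
    setItem: function (k, v) { mem[k] = String(v); }
  };
}

function cacheAsset(key, dataUrl) {
  try {
    store.setItem('asset:' + key, dataUrl);
    return true;
  } catch (e) {
    return false; // quota exceeded (~5 MB in most browsers)
  }
}

function loadAsset(key) {
  return store.getItem('asset:' + key); // null on a cache miss
}
```

Note that localStorage only holds strings, so binary assets have to be base64-encoded first, which inflates them by about a third.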


Cache size is (fortunately) limited, and I wouldn't want a webapp to be able to store 100+MB of data in local storage without explicitly asking for my consent beforehand.


This hits on one of the biggest obstacles to people perceiving web apps as "real apps". Most web pages feel, and in fact are, very disposable. You read an article at a site once; you don't want those assets to persist on your hard drive where they'll never be reused. Many people clear their caches regularly.

What we need is a way for a site to say "I'm an app", which will tell the browser to store the cache in a separate bucket. Then when users go to clear their cache, there is a separate checkbox for web apps which is unchecked by default (just as the passwords checkbox is).

This is what Mozilla should be working on with its web apps project. The installation part is just mascara. We need the browsers to do the distinguishing, not the users.


This sounds like a good idea, but as someone who runs a website, why wouldn't I say that it's an app even if it isn't?


Good point. The obvious answer is to prompt the user when a website says it's an app and wants to use special storage options. But wording that prompt so it doesn't confuse the user might be difficult. It's not as easy to understand as "this website wants to use your geolocation" or "this website wants to send you notifications".


I think I heard this was going to be a feature of HTML 5.


I think this is what is being planned. The browser will ask you whether you want to allow the app to use local storage (and hopefully let you set a limit). If not, large WebGL games are going to be a pain if you have to wait 30 minutes before you can play. I'm not saying 10 GB, but 1-2 GB might not be a bad limit.


You can't rely on the browser cache: it is limited in size, and other websites compete for the same space.

localStorage is useful but, as you say, limited.

The real solution is IndexedDB. Partial (but useful already) support is in Firefox and Chrome. Other browsers are expected to follow soon.
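As a sketch of what that looks like, here is a minimal asset cache on top of IndexedDB (the database and store names are made up, and this uses the event-based API; the prefixed `mozIndexedDB`/`webkitIndexedDB` names may be needed in practice):

```javascript
// Cache downloaded geometry/texture bytes in IndexedDB. Sketch only:
// 'game-assets' and 'assets' are made-up names.
function withAssetStore(mode, onStore) {
  var idb = window.indexedDB || window.mozIndexedDB || window.webkitIndexedDB;
  var req = idb.open('game-assets', 1);
  req.onupgradeneeded = function () {
    req.result.createObjectStore('assets'); // maps key -> ArrayBuffer
  };
  req.onsuccess = function () {
    var tx = req.result.transaction('assets', mode);
    onStore(tx.objectStore('assets'));
  };
}

function cacheBuffer(key, arrayBuffer) {
  withAssetStore('readwrite', function (store) {
    store.put(arrayBuffer, key);
  });
}

function loadBuffer(key, onLoaded) {
  withAssetStore('readonly', function (store) {
    var get = store.get(key);
    get.onsuccess = function () { onLoaded(get.result); }; // undefined on miss
  });
}
```

Unlike localStorage, IndexedDB can store ArrayBuffers and Blobs directly, so the 150 MB of art assets never need to be base64-encoded.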


> You can't rely on the browser cache, it is limited in size and depends on other websites as well.

And as if that wasn't enough, browsers don't trust expiration dates in HTTP, either. Press reload enough times, and all resources will be reloaded.


I cannot wait for WebCL to be released as well. In the meantime, how would you suggest using the image buffer in WebGL to help simulate many complicated particles?


Yeah, WebCL will be really useful, at least for what I'm doing. I know they have a private list right now; hopefully they will open it up soon.

One issue with WebCL is device support. WebGL is capable of running on pretty much anything, but WebCL (unless they decide to make a fallback software version for systems that don't support it in hardware) is going to be limited to a small subset of clients for a while.


The most important pragmatic capability missing from WebGL that limits its usefulness for more general-purpose compute is read-out of floating point values from renderbuffers. It was decided by the WebGL standards body to not support this as it is not supported by vanilla OpenGL ES 2 and therefore might disenfranchise Khronos members' mobile products and/or fragment the WebGL ecosystem.

I don't know if you've noticed, but most WebGL users are on personal computers and not mobile devices. The only mobile implementation I've seen is FF4Mobile and it's quite inadequate for all but the simplest WebGL programs. Most PC GPUs have had floating point readback for a few years now. Additionally, the irony of the situation is that ALL of JS numbers are doubles and ALL OpenGL ES 2 GPUs use floating point internally. It is really only the standards that have not caught up and force devs to do int-float conversion across the GPU boundary. Add to this the performance constraints of 3d rendering and any sane dev will want to do as little in JS and as much in GLSL (on GPU) as possible.

I asked Kenneth Russell, the Chrome WebGL lead, about floating point readback support at GDC and I was told that I'd "just have to do the pee-pee dance".

With this single capability, many tasks "needing" WebCL become possible with WebGL. OES_texture_float (according to WG members) only allows loading float textures, not reading out of them.

I wonder if Adobe and Microsoft know to use floating point for GPU computing? Here's hoping that a little competition fixes this blatant oversight.


I agree with you, but I believe there is nothing preventing browser vendors from making the extension available. I do wish the WebGL standard supported extensions a little better. I've also seen a lot of requests for things like 3D textures. We will have to see how it goes.

This is after all version 1.0 of the standard. I don't see why 1.1 can't include optional support for OES_texture_float.

Does the iPad support it? I know a lot of Android phones do. So there's no reason it should be barred.


Sorry if what I wrote was unclear.

OES_texture_float is a current WebGL extension that is implemented by several browsers. The issue is that OES_texture_float does not specify readPixels support. As far as I can tell, even with native OpenGL ES 2, there is no extension that specifies readPixels functionality for floating point textures. I believe this is a bug in the OES_texture_float specification as binding a texture format to a texture engine is entirely type-level and no types are required after the resource is bound. If there is a hardware limitation, you're doing it wrong.

So the current state of support is: some browsers implement OES_texture_float, and you can load floating point textures. Some browsers, when you have enabled OES_texture_float, also allow writing floating point textures. Unfortunately, the only thing you can do with a written floating point texture is re-read it in a later vertex shader (if you can get vertex texture fetch support, which is still lacking for WebGL on Windows due to the DirectX impedance mismatch) or fragment shader.

If you want to read back the results of your float computations, you have to pack them into 4 x 1 byte pixel color channels and then unpack them on the JavaScript side back into floats. Of course, when converting the GPU native floating point values into color channel integers, implementations "helpfully" clamp to [0..1] and then multiply by 255 (!) so that 0 -> 0 and 1 -> 255. This plays havoc with floating point precision as 255 is not exactly representable. Complicating matters further, ES2 and its corresponding GLSL have no provision for integer or bitwise operations, and so the implementor is reduced to using floating point operations to pack floating point values into 4 [0..1]-values that then get molested into 4 byte arrays which can then be read back into JavaScript, which can then waste browser/CPU time rebuilding floating point values from pixel byte channels with accompanying (technologically unnecessary) numeric noise.
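To make that packing dance concrete, here is a CPU-side simulation of the round trip (helper names are made up; in a real pipeline the packing half would live in a GLSL fragment shader): a float in [0, 1] is peeled into four scaled-by-255 channels the way an RGBA8 readback delivers them, then rebuilt in JavaScript:

```javascript
// Simulate packing a float in [0, 1] into four 8-bit color channels
// (what a fragment shader would write into an RGBA renderbuffer) and
// unpacking it back on the JavaScript side. Helper names are made up.
function packFloat01(v) {
  var bytes = new Uint8Array(4);
  var r = v;
  for (var i = 0; i < 4; i++) {
    r *= 255;                 // implementations scale [0,1] -> [0,255]
    bytes[i] = Math.floor(r); // keep the integer part in this channel
    r -= bytes[i];            // carry the remainder to the next one
  }
  return bytes;
}

function unpackFloat01(bytes) {
  var v = 0;
  for (var i = 3; i >= 0; i--) {
    v = (v + bytes[i]) / 255; // Horner-style base-255 reconstruction
  }
  return v;
}
// The round trip is only approximate: whatever remainder is truncated
// at the last channel shows up as numeric noise after unpacking.
```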

I've not seen any WebGL implementation running on the iPad and Apple won't say anything about it. If I had to guess, I'd say that WebGL is attractive to Apple in the HTML5 sense but not in protecting their native application advantage. The Steve will probably decree that native 3d apps have superior performance by virtue of not being written in a sloppy, GC'ed language and run in a giant sandbox. I don't think native iOS apps allow float texture read-back, either.

The real question is: Is WebGL's lowest-common denominator (phones) approach to 3d a bug or a feature? On the bug side, it's absurd that I can't read float values off of my desktop GPU in 2011. On the feature side, it means that even your iPhone 3G will be able to poorly execute and render a WebGL application at an unbearably low framerate!


Just thinking out loud, if WebGL renders on to a canvas, would it be possible to use the canvas context to read the image data?
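Roughly, yes, with a detour: a canvas only hands out one kind of context, so you have to copy the WebGL canvas into a second 2D canvas first. A sketch of both routes (note that either way you only get clamped 8-bit channels, which is exactly the limitation discussed above):

```javascript
// Read back pixels from a WebGL canvas. Sketch only: both paths
// return clamped 8-bit RGBA, never the underlying float values.
function readViaReadPixels(gl, w, h) {
  var pixels = new Uint8Array(w * h * 4);
  gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
  return pixels;
}

function readVia2dCanvas(webglCanvas) {
  // A canvas that already has a "webgl" context can't also give you
  // a "2d" one, so rasterize it into a scratch canvas instead.
  // (Needs preserveDrawingBuffer:true, or a copy right after rendering.)
  var copy = document.createElement('canvas');
  copy.width = webglCanvas.width;
  copy.height = webglCanvas.height;
  var ctx = copy.getContext('2d');
  ctx.drawImage(webglCanvas, 0, 0);
  return ctx.getImageData(0, 0, copy.width, copy.height).data;
}
```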


Nice, but until Microsoft supports WebGL, it doesn't seem like it'll ever be that useful...


Let me disagree. Within certain demographics, such as high-tech-loving makers, you can sometimes get away with using WebGL: http://tinkercad.com

Our stats indicate that IE is much less of a problem than expected.


Yeah, because MS is necessary for anything... Oh wait: Android, Opera, Chrome (>10% market share), Firefox (at least 20% market share), Safari, and iOS. People can now make a living without caring what MS does or doesn't do. Just add "You need a standards-compliant browser" with links, and care only about the people who do.

MS uses this trick with Hotmail: you need IE version X to benefit from the amazing features Hotmail has.


Useful for what? Grandparents and stuffy offices stick to IE, but the majority of technical users are running FF/Chrome/Safari/Opera. It would be a bad idea to write a B2B or photosharing app that requires WebGL. But games, art, visualization, and optional site bling can benefit a great deal. Like most new tech, it's a good plan to target techies first and let the late adopters come in after the kinks have been worked out.

Even though the spec has hit 1.0, the dust hasn't nearly settled on the implementations. Lots of demos only work on specific browsers or have been broken by browser updates. Meanwhile, MS is apparently extremely devoted to not breaking sites that used to work when they put out new updates. I'm personally OK with MS waiting until WebGL solidifies before putting out their first implementation (hopefully...).


Useful enough for me and my crowd (neuroscientists). I've not had many users complain (none, actually) that they had to install Firefox or Chrome, mainly because the tools were useful enough that the effort was worth it to them.

I think if the games and toys are enough fun, people won't care that they have to run it in a different browser. Hell look at how many people buy new computers just to run the latest games.

3D on the web, to my mind, will be most useful for games and, I believe, scientific visualization and engineering applications, whose users usually don't mind installing a different browser.


I would think most WebGL apps would apply mainly to entertainment, which people will do at home. Home users can simply be directed to download a browser that supports WebGL. They are all free.


By that logic, until Microsoft supports Android and iPhone they will never be useful.


I think the GP is wrong, but that's a very unfair characterization. A website is not comparable to a mobile phone OS.


You're right of course - it's not that great of a comparison.

On the other hand, have you looked into modern UI architectures? Particularly on mobiles, they look a lot like operating systems aspiring to be web sites...



That's more than a year old. This site is worthy of a repost.


Site is down for now.




