
Chrome dev on WebGL security and Microsoft bullshit - robin_reala
http://games.greggman.com/game/webgl-security-and-microsoft-bullshit/
======
yaakov34
Passionate and unconvincing. You can't get around the fact that WebGL lets a
website control the graphics driver in a way which is much too direct for
comfort. The central problem is the poor quality of driver code compared to OS
code. If these were of equal quality, then indeed this would be no more
dangerous than a JVM. [EDIT: and for extra danger, the driver code generally
runs at elevated privilege.] You can presumably exploit a hole in the OS by
executing something in a JVM, but that's been an extremely rare exploit
because of the high quality of the JVM and OS code in the relevant parts of
their interface. This thing opens a huge attack surface against much more
vulnerable code. Driver code is buggy, usually prematurely released in a rush
to market, and deliberately closed and obscured. Maybe the answer is to make
that code better across all the graphics companies, but until this happens, I
fear WebGL.

The author himself gives examples of having to work around known driver bugs
that can be exploited with formally valid inputs. But what about the unknown
bugs? Those are the ones that will get us, no?

Now, I don't know if Flash 11 or Silverlight are just as bad. Maybe they are
(although I see claims in this thread that this is not so for Silverlight).
But this is not a good reason to support a hugely insecure technology. It's a
reason to fear these other technologies as well.

~~~
icefox
I am not familiar with Flash's API. Can anyone point us to the Flash docs on
how to access the shaders, or to a specific example?

~~~
ken_railey
The shader language is called AGAL, here is an intro:
[http://blogs.adobe.com/flashplatform/2011/04/intro-to-
adobe-...](http://blogs.adobe.com/flashplatform/2011/04/intro-to-adobe-
graphics-assembly-language-agal.html)

~~~
malkia
Thanks, never heard of it. But why would they choose something so low-level
that looks like assembly (reminds me of the old shaders - I guess I have to
read up; maybe it allows easier translation for the CPU if there is no GPU?).

~~~
Impossible
Adobe also has a tool called Pixel Bender 3D where you can write shaders in
their flavor of HLSL. It's an extension to Pixel Bender (which was for 2D
shaders only). Pixel Bender 3D has to be used offline, though, so if you want
to embed shaders in a swf they have to be written in assembly. The reasoning
for this is that the Pixel Bender shader compiler is larger than any single
part of the Flash runtime, so including it would greatly bloat the runtime for
a little bit of developer convenience. Adobe has done a poor job of showing
that you can use high-level shader code in their demos, though; Pixel Bender
will end up being the preferred way to author shaders for Molehill, and the
assembly stuff will only be used by a few people for tests/quick hacks.

------
JonoW
I'm a fan of WebGL and I really hope it does gain traction, but I think people
are being really irresponsible in ignoring what MS is saying (perhaps out of a
bit of anti-MS mentality?). There ARE avenues for attacks through WebGL (e.g.
[http://blog.mozilla.com/security/2011/06/16/webgl-
graphics-m...](http://blog.mozilla.com/security/2011/06/16/webgl-graphics-
memory-stealing-issue/)), you'd be crazy to ignore them.

Also, if Silverlight is susceptible to the same vulnerabilities, then that
makes MS hypocritical, not wrong! And comparisons with Java and Flash getting
hardware shaders is beside the point - MS doesn't have control over which
features those products choose to include or not, they only have control over
IE and Silverlight.

I hope a reasonable scenario develops where IE does get WebGL, perhaps with a
whitelist of drivers that adhere to higher degrees of security for use with
WebGL.

~~~
cgranade
It would be crazy to ignore MS on this point, yes. It would also be crazy,
though, to take them at face value. While there are avenues for attacks via
WebGL, I think it's clear that MS is being at least somewhat disingenuous about
the implications one can draw from the existence of such attacks. I am not
enough of an expert to tell lies and facts apart in MS's statements, and so I
cannot derive any real value from what they say. I am hence forced to derive
my opinions on the matter from other, more trustworthy sources, such as the
Mozilla blog that you linked to.

~~~
extension
Mozilla and Google have their own agendas and biases around WebGL. If you want
an independent opinion, look to security experts and low level graphics
programmers who don't have a big stake in the browser vs native platform
controversy.

~~~
tptacek
You're going to find that security researchers are going to agree with
Microsoft's take on this. People have been finding actual flaws in this stuff.
Web browsers are the hardest piece of software we have to secure in 2011.
Coupling them directly to video drivers is disquieting.

~~~
cgranade
There's a big difference between saying that something is presently insecure
and that it is by nature not securable. I understand and agree that WebGL is
not presently secure at all, but Microsoft is making a set of claims far
beyond this: that not only is WebGL insecure, it can never be made secure, and
moreover, that Silverlight is somehow immune to the problems facing WebGL.
That additional set of claims is why I say that MS is being disingenuous, if
not outright lying. More than a good-intentioned warning about insecure
software (which would be exceedingly ironic, given the source), MS's statement
comes off as opportunism.

~~~
tptacek
I see WebGL people saying essentially the same thing: that there is a
reasonable degree of security that can only be assured by obtaining
cooperation from driver vendors; that, in other words, the security of WebGL
is not entirely under their control.

------
nextparadigms
_"It’s even more telling that Microsoft hired a firm, ContextIS, to create
this bogus FUD so they’d have someone to hide behind. It’s pretty convenient
that only a few minutes after ContextIS’s latest post Microsoft already had a
well manicured response. If ContextIS had any credibility they’d be posting
about the same issues on Silverlight5 (warning: might crash your machine),
Flash 11 and Unity3D. "_

I always wondered about those IDC and Gartner predictions that WP7 will become
the #2 mobile OS by 2015, when it can't even get past 2% right now and its
growth is slowing down. How did they come up with those numbers all of a
sudden, when they couldn't even predict _half_ of Android's growth when
Android was actually growing fast? It seemed very fishy to me at the time.

If Microsoft is paying companies to do this kind of "publicity" that favors
them, then I wouldn't be surprised if they are behind those so-called
predictions either.

~~~
bad_user
Microsoft is known to hire other companies for such things, but they aren't
really hiding it.

And I am sure that the ContextIS report is accurate and that WebGL represents
a security risk, but TFA is also right - if WebGL is a security risk, then so
are Silverlight 5, Flash, and Java applets, which are and will be everywhere
anyway.

But because WebGL will get multiple competing implementations, with everything
in public review, I'm also sure that it will end up better than the various
other browser plugins that allow access to the GPU. The only problem with this
picture is IE: its upgrade cycles are so awful that I'm kind of hoping they
won't implement WebGL at all, rather than come up with something half-baked
that ruins the experience for everybody.

I also couldn't care less about Gartner and IDC predictions: they never
predicted anything worthwhile anyway. But WP7 will be popular. Maybe less than
both Android and iOS, but popular nonetheless.

~~~
rbanffy
> but they aren't really hiding it.

That's not exactly true. I've never seen a Microsoft press release clearly
state that they funded a report they are referring to. They usually read like
"$firm has published a study showing $product destined to dominate" or
something like it.

------
hubriz
What a bunch of lies this is. Silverlight has nothing to do with XNA linked by
the author. But even if we assume that SL5 uses XNA, XNA does not expose raw
buffers in the code the way WebGL does and it does not allow one to write
shaders on the mobile devices (one of the attack vectors mentioned in the MS
article). SL itself does let you write PS (FS if you're from the OGL crowd),
and yes, this is an attack vector. But comparing this to WebGL is a false
equivalence.

I find it amusing that even John Carmack (strong OGL supporter) agrees with MS
on this one yet it is some random Google employee who gets his word spread.
This author knows FUD very well, I give him that. He's proficient in spreading
it.

~~~
daleharvey
> Silverlight has nothing to do with XNA linked by the author. But even if we
> assume that SL5 uses XNA

I am not amazingly familiar with Windows development, but "The core of the XNA
Games Studio 4.0 graphics libraries is now included in Silverlight 5 Beta and
is used to create 3D graphics" doesn't seem like a large assumption.

> "John Carmack (strong OGL supporter) agrees with MS on this one yet it is
> some random Google employee who gets his word spread"

I have a feeling the Chrome developers have a better idea of how to sandbox a
browser than John Carmack does. As awesome as he is, this is just arguing from
authority when there is a long list of points the OP made that haven't been
refuted.

~~~
hubriz
> doesn't seem like a large assumption

It is, for the reason I've mentioned before: XNA abstracts out pretty much
everything. With XNA you're not programming to the metal anymore, which is
unfortunately what you do with WebGL. The greatest concern seems to be with
the way OGL handles various buffers. For example, in WebGL it is possible to
call bufferData stating that the buffer is of size X while providing only Y
values, with Y much smaller than X. What you get is a buffer full of stale
data from memory. Bad idea.

Also, there's much less "stuff" you can break in XNA with the lower profile
levels (feature sets) which are suitable for the web. E.g. you can't write
shaders for mobile devices today, for both security and perf reasons. You get
a preset set of them and that's it. Yes, it's limiting, but you can't choke
the GPU to death and you can't submit a shader payload that's known to crash a
given driver.

It's all about the false equivalence the author makes in his blog entry.
Problems exist with pretty much every piece of advanced software. But you can
add a small attack vector, or a large one. They are not equal, and making it
seem otherwise is bogus.

JS is amazing and I love the 2D canvas. But WebGL is trying to put low-level
stuff into high-level code. It's awesome in terms of performance, but it's
inherently fragile and may be easily abused. It seems it's not sensible
programmers MS is afraid of but malicious attackers, and it's very unlikely
that WebGL won't be happily abused.

As for Carmack, he has stated on various occasions that he's closely following
web graphics. He's also more than knowledgeable about the issues with drivers,
how they can be hit even accidentally, and how (un)likely it is for a hardware
vendor to fix them in a timely manner.

Also aside from the security, drawing from my experience, it is extremely
difficult to make semi-advanced graphics code perform well and look acceptably
similar on different GPUs. This is something APIs exposed in the high-level
languages (which JS I think undeniably is) should hide from the developer, not
dump on him. The fact that there are two different 3D canvas contexts - webgl
and experimental-webgl - doesn't really help.

~~~
magicalist
> The greatest concern seems to be with the way OGL handles various buffers.
> For example in WebGL it is possible to create bufferData stating that it's
> of size X and provide Y values with Y much smaller than X. What you get is a
> buffer full of stale data from the memory. Bad idea.

Please at least check these things before asserting them. Assuming good will,
I can see how it might be easy to assume that some things that are true in
desktop opengl will be true in webgl, but both the linked blog post and the
webgl spec explicitly state that all buffers are initialized upon creation,
and all access calls are bounds checked. Tests of this are also part of the
webgl conformance suite.

If you want to see where it is specified, see here:
<http://www.khronos.org/registry/webgl/specs/latest/#4.1>
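To make the distinction concrete, here is a toy sketch in plain JavaScript (a hypothetical helper, not the actual browser implementation) of the two guarantees the WebGL spec adds over desktop OpenGL: buffers are zero-initialized on creation, and uploads are bounds-checked against the declared size.

```javascript
// Sketch of WebGL's bufferData guarantees (helper name is hypothetical):
//   1. a new buffer is zero-filled, never left holding stale memory;
//   2. an upload larger than the declared size is rejected.
function bufferDataSketch(declaredSize, values) {
  if (values.length > declaredSize) {
    throw new RangeError("upload larger than declared buffer size");
  }
  var buf = new Float32Array(declaredSize); // typed arrays are zero-initialized
  buf.set(values); // a partial upload leaves trailing zeros, not garbage
  return buf;
}

// Declaring size 8 but supplying only 2 values yields [1.5, 2.5, 0, 0, ...],
// not whatever happened to be in memory.
var b = bufferDataSketch(8, [1.5, 2.5]);
```

This is the behavior the conformance suite exercises: the "stale data" outcome possible in desktop OpenGL is exactly what the WebGL spec rules out.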

~~~
hubriz
That is only partially correct. As mentioned before, there are two types of
WebGL canvas context out there: webgl and experimental-webgl. The former does
not put the GLX_ARB_create_context_robustness extension in place, which is
basically what you're referring to.

------
rbanffy
> Microsoft has never supported OpenGL

That's not exactly true. OpenGL support was introduced with Windows NT, IIRC.
At that time, Microsoft considered OpenGL support critical for NT to compete
in the Unix workstation segment.

They later started emphasizing Direct3D as the preferred way to do 3D,
possibly because supporting OpenGL would make porting 3D-heavy games to other
platforms easier.

~~~
eropple
Or, _possibly_ , because it's a better API and more developer-friendly
environment.

Working at a low level with OpenGL is really painful, and there are no
Direct3D-quality libraries on top of it to blunt that pain.

~~~
stephen_g
Do you have any comparisons or examples? I've found OpenGL to be a fairly
straightforward API (if you ignore all the deprecated stuff)...

~~~
eropple
Just off the top of my head, because it's been a while since I rolled a list
like this. One is a complaint from friends of mine that I haven't personally
tripped over; others are my own issues from assorted hilarious Failures Of
Graphics Programming.

* A total lack of typing. Everything is a GLuint. By the time I've bodged together sufficient type safety to be comfortable, it looks like DirectX.

* Extensions suck. Abjectly suck. While Direct3D has its problems, it does a pretty good job of saying "you must support these things". OpenGL attempts to vaguely say the same thing, but the difference is that Direct3D enforces support of things I want to use. It seems that you end up with many more code paths for OpenGL if you want to properly handle a lot of stuff.

* Difficult to query about GLSL problems, if possible at all. (An older example that's stuck with me is the noise() function, which nobody implemented the last time I dealt with this stuff. They returned a constant. Detecting this failure mode was nontrivial.)

* Tooling. As usual, Microsoft is way ahead in this area.

A project of mine uses OpenGL instead of D3D, but that's primarily because I'm
not the graphics guy on that project. My own stuff just uses XNA, as it's 2D
stuff I want to deploy to the 360.

------
robin_reala
The author puts forward the (unsubstantiated) assertion that Microsoft paid
Context to search for WebGL security issues. Not that this is a bad thing
(more eyes = better security long term), but it would have been good of
Context to provide this information in their recent writeups.

------
latch
It's nice to see people this passionate. You could tell that he started off
quite frustrated and slowly cooled down as he was writing, which I think
always makes things more genuine/fun.

He's got me sold.

~~~
bonch
So emotion sways you rather than fact-based rationality, gotcha.

~~~
latch
When it comes to arguments of logic, passion is a compelling and important
emotion yes. However, the article was full of facts as well. Combined, it's
pretty win.

------
patrickaljord
Two relevant links on the issue:

* DOS vulnerability in Silverlight 5's 3D (similar to the WebGL DOS vulnerability) <http://news.ycombinator.com/item?id=2680001>

* Microsoft architect Avi Bar-Zeev: "Why Microsoft and Internet Explorer need WebGL (and vice-versa)" <http://news.ycombinator.com/item?id=2667332>

~~~
yaakov34
Avi Bar-Zeev says that the solution to WebGL security will be to ask the user
to explicitly enable it per site (so it runs only on trusted sites), to sign
shaders in the same way drivers and plugins are signed now, and maybe to
maintain a whitelist of sites allowed to use shaders (or any WebGL at all).
OK, that will work, but I don't feel that there is much chance of that
happening. I still remember when JavaScript was a new-fangled untested
technology, and IT departments would disable it by default on untrusted sites.
Some people still surf only with scripts disabled. But really, there was
immense pressure to give web apps a lot of the features of native apps, and
everyone pressed ahead despite the security concerns, which ended up being
very serious and costly. I suspect that the same thing will happen with WebGL
- it will end up being on by default. The Khronos white paper on WebGL
security is a very short and superficial dismissal of extremely complex
issues, and Avi Bar-Zeev is being unrealistic about browser vendors being able
to work around it all.

~~~
magicalist
I don't believe that is what he is saying. He's saying he thinks that extreme
would still be worth it if it were required to run WebGL.

------
ChuckMcM
First note:

"I work at Google on Chrome ... I was on Microsoft’s side in the Java lawsuit,
the Internet Explorer lawsuit and several others."

I would expect Google to get a demand notice _from Oracle_ to make the poster
available for deposition in their suit :-)

Secondly, this bit:

"So imagine my disappointment when I start seeing the FUD from Microsoft about
IE9 vs other browsers. Cherry picking benchmarks, cherry picking conformance
tests and generally basically lying."

This has been a standard of tech marketing in some circles for so long, it's
astonishing that you are just now seeing it: from HP claiming Memorex disk
media would cause disk head erosion (these were flying heads) and invalidate
your warranty, to Oracle lying about DB2 performance or the configurations, to
storage vendors benchmarking on systems where they used thousands of disk
drives so that none of them actually had to seek.

When there are only 'standards', Microsoft's browser team has to out-execute
other browser development teams. Market share declines suggest that this isn't
a 'winning' strategy for them. When there are proprietary 'standards' for
which other browser teams have incomplete information, browser dominance is
assured. And as Microsoft is fond of saying, "Windows is 'open' because you
can get Windows-based computers from any vendor."

Microsoft's goal is to make you look stupid, your goal should be to make what
they think irrelevant. Complaining about their tactics just wastes time.

------
twp
From the article:

"[The GPU process] validates that the shaders submitted to the GPU use only
the minimal features allowed by the spec. That means no dynamic indexing of
sampler arrays. It means no infinite loops."

Halting problem solved. News at 11 :-)

~~~
masklinn
> Halting problem solved. News at 11 :-)

Not necessarily. WebGL could simply restrict the shader language to not be
Turing-complete. No need to solve the halting problem if you start from a
total language.

~~~
magicalist
Yep. Because WebGL is designed to work on platforms that may have to fully
unroll loops in shaders (because of its mobile baseline), WebGL shaders are
not Turing-complete in and of themselves.
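As a rough illustration (a sketch of the loop restrictions in the GLSL ES profile WebGL builds on, not the actual spec text): only loops whose bounds are compile-time constants are guaranteed to be supported, which is what lets an implementation unroll them fully.

```glsl
// Guaranteed under the minimal profile: constant bounds, so the
// compiler can statically unroll the loop.
float sum = 0.0;
for (int i = 0; i < 8; ++i) {
    sum += weights[i];
}

// Not guaranteed to compile: the bound depends on a uniform, so the
// loop cannot be unrolled at compile time and could in principle run
// for an unbounded number of iterations.
// for (int i = 0; i < uCount; ++i) { ... }
```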

------
yread
He has some good points and I like the description of the safe-ish
implementation of WebGL in Chrome. But I think the main point he misses with
the Silverlight/Flash comparison is manageability from a security point of
view. Running a plugin is different from using a feature of the browser -
people often disable plugins and are conscious of them being an attack
surface, so admins often deploy policies disabling their use.

~~~
tintin
Are you saying it's very hard to implement a checkbox to disable WebGL? A lot
of browsers protect you from opening or resizing windows via JavaScript. How
could disabling WebGL be more complicated? My browser even warns me when a
script is running slowly.

~~~
saucetenuto
I mostly agree, but there's a bit more to it than that. For example, you want
to be able to report back to the page that WebGL is disabled, so that whatever
content is supposed to be there can degrade semi-gracefully.
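That pattern can be sketched in a few lines of plain JavaScript (the helper name is hypothetical): try each context name in order, and return null so the page knows to swap in fallback content.

```javascript
// Try each WebGL context name in order; return null if none is available
// so the caller can degrade to a 2D canvas or a static image.
function getWebGLContext(canvas) {
  var names = ["webgl", "experimental-webgl"];
  for (var i = 0; i < names.length; i++) {
    var ctx = canvas.getContext(names[i]);
    if (ctx) {
      return ctx;
    }
  }
  return null; // caller shows fallback content instead
}
```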

~~~
tintin
You mean something like:

    var canvas = document.getElementById("canvas");
    if (canvas.getContext("webgl") || canvas.getContext("experimental-webgl")) {
      // WebGL is available
    }

;)

------
majmun
I stopped reading at this line: "they might have a little more credibility if
they weren’t promoting a technology, Silverlight 5, that provides the EXACT
SAME FEATURES with all the same issue"

First of all:

\- Silverlight might be a completely different division of Microsoft than IE,
and have nothing in common with it.

\- I don't recall Silverlight being included in IE either.

~~~
masklinn
> \- Silverlight might be a completely different division of Microsoft than
> IE, and have nothing in common with it.

Not relevant. Silverlight is shipped by the same company and provides the same
capabilities with no significant differences/restrictions.

> \- i dont recall silverlight is included in IE either.

Not relevant either, silverlight is shipped by Microsoft and some microsoft
websites prompt for its installation, so Microsoft as a company does not seem
to have much problem with Silverlight and its 3D capabilities.

~~~
majmun
Read the related MS article "WebGL considered harmful". There it's stated that
they won't include it in the BROWSER because it is considered harmful. So
Silverlight is not included either, even though it is exactly the same as
WebGL.

~~~
nkassis
But they were pointing out that what WebGL does would never pass their
security reviews. From my understanding, this isn't even the IE team but the
security team at Microsoft making this statement. I'm sure they work on
multiple projects.

------
malkia
FYI: Chrome uses D3D to render OpenGL on Windows (the ANGLE project)

~~~
jra101
I believe that is only true for specific GPU/driver combinations with known
problems in their OpenGL driver.

~~~
malkia
I think it's always using D3D on Windows. Take a look with depends.exe:

depends "C:\Users\<user-
name>\AppData\Local\Google\Chrome\Application\14.0.797.0\libglesv2.dll"

It shows that D3D.DLL is used. There is simply no reason to complicate the
life of QA and have two renderers.

Besides, tracking down what works and what doesn't is very hard, and often
depends on other things - drivers, settings, etc. (I have some experience with
this; I work at a gamedev company and we do port to PC regularly.)

I myself like OpenGL much more than D3D, but my graphics peers think that D3D
gives you more power close to the metal. Now ANGLE is trying not to be close
to the metal, but to use what seems to work a bit better than OpenGL.
(Although in all honesty I haven't seen problems with desktop OpenGL on my
Vista and XP machines - I have to use Maya, MotionBuilder and others, and they
do use OpenGL.)

------
smackfu
How odd to cite his experiences internally at Google in a post not officially
from Google. And to use "we" in his responses in the comments.

~~~
magicalist
I see one "we" in his comments ("We're happy to accept the criticism when it
is valid. This isn't."). I'm also not sure what experiences implementing WebGL
he's supposed to talk about except for his experiences at Google. Making an
argument based on actual implementation details is what you'd want here.

------
underwater
Mirror of text:
[http://viewtext.org/article?url=http://games.greggman.com/ga...](http://viewtext.org/article?url=http://games.greggman.com/game/webgl-
security-and-microsoft-bullshit/)

------
swah
I wonder how many Chrome developers develop in Win7 vs OSX.

~~~
bengoodger
A sizeable chunk of people on the team are developing on Linux because of the
build/link speed advantages. Some of us stick it out with Windows because we
either like the UI better or because that's where most of our users are right
now :-) I use Windows because I like the MSVS IDE the most, but only because
for me it's the best of the worst. I really feel like we're pushing the
envelope of how large a solution can be developed in it - with hundreds of
vcprojs it takes a long time to load and de-jank itself. I wish Microsoft
would use it themselves for things like Windows or Office, I would imagine
it'd improve a lot if they ate their own dogfood.

~~~
hyperrail
> I would imagine [Visual Studio would] improve a lot if they ate their own
> dogfood.

We do - just not VS :)

I used to work on Microsoft SQL Server. Our build system back then was
substantially the same as Windows's. We used the Visual C++ command-line
compiler, but we didn't use Visual Studio projects. Instead, our build utility
was the BUILD.EXE program in the Windows driver development kit
(<http://msdn.microsoft.com/library/ff542351>). BUILD is a wrapper for make
that defines various useful macros, and enforces certain conventions for
makefile contents (for example, the Sources file contains all and (mostly)
only the source filenames to be compiled in the current directory). Our build
environment wasn't the public DDK, but both have the same origin.

Relatively few people used Visual Studio, mainly because its C++ Intellisense
was very slow (it took half an hour one time for the VS debugger to load a
just-in-time crash dump). Instead, I used the Source Insight editor (a
proprietary third-party product, but site-licensed by Microsoft), and used the
WinDBG debugger from the public Debugging Tools for Windows. Source Insight is
very fast at code browsing, fast enough for SQL Server's code base - indeed
that is its main marketing bullet point. WinDBG isn't particularly fast, but
it's no slouch either.

------
smogzer
Here, see MS security at stake: <https://github.com/mrdoob/three.js/>.

------
tzs
He's not making his case when he essentially asserts it is OK because they've
put in workarounds for a large list of known driver bugs.

