Hacker News new | comments | show | ask | jobs | submit login

This is what I meant by "naive approaches." Take your example of the Pepper API layer versus NPRuntime. When you try to use a sandboxed NPRuntime plugin, you hit a hard performance wall from synchronous dispatch overhead. You also introduce a huge potential for deadlocks that can be very difficult to detect. This isn't the kind of thing you notice in a simple proof-of-concept or casual discussion, but it becomes painfully obvious when you try to implement a real-world plugin.
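To make the deadlock hazard concrete, here is a minimal sketch (all names invented, not any real plugin API) of why blocking, synchronous dispatch between a page and a sandboxed plugin process is dangerous: each side can service only one blocking call at a time, so if each is waiting on the other, neither ever returns.

```javascript
// Hypothetical model: two endpoints (page and plugin) connected by a
// synchronous channel. A sync call blocks the caller until the peer's
// handler returns. If the peer is itself blocked waiting on us, we have
// a cross-process deadlock; here we detect it and throw instead of hanging.
class SyncEndpoint {
  constructor(name) {
    this.name = name;
    this.busy = false; // true while blocked inside an outgoing sync call
    this.peer = null;
  }
  call(handler) {
    if (this.peer.busy) {
      // The peer is blocked waiting on a call into us: neither side can
      // make progress. A real IPC layer would simply hang here.
      throw new Error(`deadlock: ${this.name} -> ${this.peer.name}`);
    }
    this.busy = true;
    try {
      return handler(this.peer);
    } finally {
      this.busy = false;
    }
  }
}

const page = new SyncEndpoint("page");
const plugin = new SyncEndpoint("plugin");
page.peer = plugin;
plugin.peer = page;

// The page makes a sync call into the plugin; the plugin's handler
// synchronously calls back into the page, which is still blocked.
let deadlocked = false;
try {
  page.call(() => plugin.call(() => "callback into blocked page"));
} catch (e) {
  deadlocked = true;
}
console.log(deadlocked); // true
```

An asynchronous, message-based protocol avoids this entirely, because neither side ever blocks waiting for the other.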

So extend the Web APIs to handle this. You don't have to do synchronous calls.

Robert O'Callahan talked in that thread about extending the capabilities of Web Workers so that the kinds of things you want to do could be done asynchronously. I know I'd love the ability to render to a 2D or 3D canvas context in a Web Worker, for example (exactly the kind of thing sandboxed plugins want to do), but all the effort that could have gone to that went to this weird plugin- (and NaCl-)specific API instead.

It's kind of sad, because I would like to use these APIs in my web content, but I can't. The cynic in me would say that it's because Google wants to push NaCl. I don't want to program C++ to get access to these goodies; I want to program in CoffeeScript.

PPAPI is a platform for plugins to safely port existing native code, and implement performance critical components. The PPAPI platform capabilities (audio, canvas, webgl, etc) already exist as regular JavaScript APIs in Chrome. So, if you want to use them on normal web content you can. However, in doing so you incur both the strengths and limitations of the web platform (e.g. JavaScript dispatch, IPC overhead, and a single-threaded execution context).

You gave the example of using canvas from a web worker. Unfortunately, the whole web platform is built on the assumption of a single-threaded execution pipeline. Web workers get around this by using postMessage and by not having access to the DOM, shared variables, and other stateful parts of the platform. Canvas and WebGL, as spec'd and implemented, are tied to those stateful parts of the platform (the DOM, etc.), which is why you cannot use them from a worker.

So, your desired change would involve either a major overhaul of the web as spec'd and implemented, or a change to the Canvas and WebGL standards. The first is probably not going to happen. The second is achievable and implementable, but wouldn't help address the use cases I described in my first sentence.

I'm well aware that, at the moment, the canvas context is restricted to a single thread. But this could be changed. All you have to do is to provide an API to move a canvas context to a different thread/worker. This is quite simple. There's a Mozilla bug on it: https://bugzilla.mozilla.org/show_bug.cgi?id=709490

It requires no more engineering effort than it took to implement the corresponding code in the PPAPI, and probably a lot less. Moreover, non-NaCl and non-plugin code could benefit from it.
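For what it's worth, the API shape that bug asks for is roughly what browsers much later shipped as OffscreenCanvas. A sketch of how handing a canvas to a worker looks with that API (browser-only; these functions assume a DOM and Worker environment, and "render-worker.js" is a placeholder filename, so nothing is invoked here):

```javascript
// Main thread: detach rendering control from a canvas element and transfer
// it to a worker. After the transfer, the main thread can no longer draw to
// this canvas; the worker owns it.
function handOffCanvas() {
  const canvas = document.querySelector("canvas");
  const offscreen = canvas.transferControlToOffscreen();
  const worker = new Worker("render-worker.js"); // placeholder script name
  worker.postMessage({ canvas: offscreen }, [offscreen]);
}

// Inside render-worker.js: the worker gets a drawing context and renders
// without touching the DOM or blocking the main thread.
function workerSide(event) {
  const ctx = event.data.canvas.getContext("2d");
  ctx.fillStyle = "rebeccapurple";
  ctx.fillRect(0, 0, 100, 100);
}
```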

As explained before, what would need to be done is simply to expose native versions of the Web APIs. These APIs would mirror the Web APIs more or less exactly (maybe using pointers to buffers instead of typed arrays, for example, but otherwise they'd be the same). In the cases in which Web APIs are not sufficient, both the native and JavaScript versions would be extended. In this way, "performance critical components" of native code and Web content would both benefit.

What are you talking about? Are you suggesting that Adobe implement Flash via Web APIs and JavaScript? We're talking about the challenges of having a plugin interface for Chrome and how NPAPI is limited/flawed, and you're suggesting... Web APIs?

Maybe I'm really confused because I don't understand.

The proposal that roc and smfr put forth is this: Native code could call Web APIs, slightly tweaked for the benefit of native code and to satisfy the security/isolation guarantees that Chrome wants to enforce. These same APIs would have JavaScript versions, so that ordinary Web content could use them as well. For example, the same asynchronous 3D command stream API that Pepper exposes would be available to Web Workers.
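The "asynchronous command stream" idea can be illustrated with a toy sketch (all names invented, not the actual Pepper interface): instead of one blocking call per drawing operation, callers record commands into a buffer and submit the whole batch in a single asynchronous hand-off, which is equally usable from a worker or from native code.

```javascript
// Hypothetical command buffer: push() records commands cheaply and never
// blocks; flush() submits the accumulated batch in one asynchronous
// hand-off (in a real setup, submit would be something like
// worker.postMessage or an IPC send to the GPU process).
class CommandBuffer {
  constructor(submit) {
    this.commands = [];
    this.submit = submit;
  }
  push(op, ...args) {
    this.commands.push({ op, args });
  }
  flush() {
    const batch = this.commands;
    this.commands = [];
    this.submit(batch); // one message for the whole frame
    return batch.length;
  }
}

// Record a frame's worth of commands, then submit them as a single batch.
const sent = [];
const buf = new CommandBuffer((batch) => sent.push(batch));
buf.push("clear", 0, 0, 0, 1);
buf.push("drawArrays", "TRIANGLES", 0, 3);
console.log(buf.flush()); // 2 -- two commands submitted in one batch
```

The same batching interface works whether the producer is JavaScript in a worker or native code in a sandboxed process, which is the symmetry roc and smfr were after.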

Here's some further reading:

* https://mail.mozilla.org/pipermail/plugin-futures/2010-May/0...

* https://mail.mozilla.org/pipermail/plugin-futures/2010-May/0...

Note that Darin's followup message included this:

"I do agree however we want to keep Pepper APIs in sync with existing Web APIs. That will obviously be a challenge if they are not one and the same, but I don't think it is impossible or even that difficult to achieve with them being separate APIs."

This has already failed. Web Workers cannot render even 2D content asynchronously today, while Pepper can (which advantages plugins and NaCl over ordinary Web content). This is exactly the problem that roc and smfr were trying to avoid.

I don't think that Darin's example, of an <embed> tag being untouchable by Web content, is an insurmountable problem. In fact, I think this is exactly the kind of thing you want JavaScript content to be able to do -- high-fidelity JS versions of PDF and Flash would be great. And the issue of Web Workers and audio APIs needing shared memory is a Chrome problem (because Chrome puts each worker in a separate process), but not a problem for browsers generally.

Also, the argument that "it's an existing plugin, you don't want them to rewrite their code" doesn't hold water. It's an entirely new set of plugin APIs. Adobe has to rewrite its code anyway. It's just that the other browser vendors wanted the Web as a whole to benefit from this, not just Adobe.
