Mozilla Research Projects (mozilla.org)
440 points by yuashizuki on Dec 24, 2014 | 92 comments



A neat thing about Shumway is just how much of it is written in TypeScript[0]. It's great that Mozilla is getting behind the project.

- [0]: https://github.com/mozilla/shumway/tree/master/src


It's really fantastic to see Mozilla become a sort of steward and incubator of the open web, considering that Mozilla originally came from a revolutionary (at the time) and somewhat desperate attempt by Netscape to counter the monopoly power of Microsoft/IE. If I'm remembering correctly, this was the first high-profile corporate OSS dump (years before Java, for example), and shortly thereafter it was largely considered a failure, since Netscape became irrelevant and the Mozilla project didn't stop the MS juggernaut. As a result, it would be a number of years before another company was willing to take a risk like that.

Of course, with hindsight we can see that the Mozilla folks played the long game. Quietly working in the background, they produced a product (Firefox) that actually did largely kill IE's dominance. You can argue about the role Chrome had in this, but my opinion is that Firefox created the market for non-IE browsers. Without that trailblazing, Chrome would not exist.

Congrats to everyone responsible, from the beginning to the present day.


Keep up the great work, Mozilla! I'm excited to see where a lot of these projects go, especially asm.js.


I hadn't heard of about half of these - it's great that there's a sort of directory for them.

Sweet.js is pretty awesome (I was going to say it's sweet, but that's stupid). Broadway.js and Shumway look really interesting too; I'm going to check them out tonight.
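For anyone who hasn't tried sweet.js: macros pattern-match on syntax and expand into templates. A from-memory sketch of the rule-based form (check the project docs for the real thing):

    macro swap {
      rule { ($a, $b) } => {
        var tmp = $a;
        $a = $b;
        $b = tmp;
      }
    }
    var x = 1, y = 2;
    swap (x, y)  // expands into the three statements above, hygienically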

Regarding Parallel JavaScript, does anyone know how this relates to Khronos' WebCL project? Hardware manufacturers seem really interested in WebCL, but software developers aren't.
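(For context, going by the docs: the Parallel JavaScript work exposes data parallelism in Nightly builds through a ParallelArray type, roughly like this:)

    // Elemental functions passed to ParallelArray methods may be run
    // across cores; the engine falls back to sequential execution if not.
    var pa = new ParallelArray([1, 2, 3, 4]);
    var doubled = pa.map(function (x) { return x * 2; });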


Unlike OpenGL, OpenCL allows writes to arbitrary addresses computed by the OpenCL kernel. To be safely run on the web, you have to solve a very difficult validation problem to prove that an arbitrary kernel won't do bad things. I don't think anyone has adequately solved that problem. As stated at <https://bugzilla.mozilla.org/show_bug.cgi?id=664147#c30>, "The future for GPU compute in Firefox is ARB_compute_shader, which is now part of ES 3.1."


Yep, I'd heard browsers wanted to use compute shaders instead of WebCL. I think there are a couple of Node packages that implement WebCL, but they're not in the browser environment and they're really only used with trusted code.

Looking through some discussions on Google Groups related to chromium development, it looks like they thought WebCL might be safe to use eventually, but not in the near term.


I donated some money to Mozilla this year. Maybe you could too?


Donated to Mozilla using Bitcoin. Feels good, man.


I "donated" to Mozilla using Code.


LLJS is listed here, but the last commit was over a year ago. Having spent the last month hand-coding arrays of structs in js, I'm really feeling the need for better low-level constructs. Looks like I'm stuck waiting for rust + emscripten to be a valid option.


Structs are coming to JS proper in the form of Typed Objects. For a whole load of details, check the spec draft (https://github.com/dslomov-chromium/typed-objects-es7) or this very readable paper: http://smallcultfollowing.com/babysteps/pubs/2014.04.01-Type...

Implementation of Typed Objects is progressing very nicely in Firefox; if you want to play around with them, just download a Nightly build.
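A minimal sketch of the drafted API, using the TypedObject global that Nightly exposes (exact names may still change):

    // Struct fields are stored unboxed at fixed offsets, so the engine
    // can skip heap allocation and property lookup on each access.
    var Point = new TypedObject.StructType({
      x: TypedObject.float64,
      y: TypedObject.float64
    });
    var p = new Point({ x: 1, y: 2 });
    p.x += 0.5;  // writes a raw float64, no boxed Number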


Typed Objects are a huge step forwards but they still have some limitations. I frequently use things like:

    solverStates = {
      numVars: uint32,
      numConstraints: uint32,
      numStates: uint32,
      solverStates: solverState[numStates]
    }

    solverState = {
      los: uint32[numVars],
      his: uint32[numVars],
      constraintStates: *void[numConstraints]
    }
Even using C, dealing with types like this is kind of a pain and would benefit from some macro magic. With Typed Objects it looks like I have to choose between variable sized arrays and pointers - I can't have both because then the gc won't know where to look for the pointers. According to http://wiki.ecmascript.org/doku.php?id=harmony:typed_objects working with such types is an explicit non-goal:

> In particular, binary formats often need expressive and dynamic data dependencies that are decidedly out of scope for this API, such as being able to specify an array whose length is determined by a preceding integer

That's probably the right decision for js, given the constraints of sane gc implementation. It still looks like my best option is to use asm.js and have full control over layout.
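Concretely, "full control over layout" means hand-packing everything into one typed array and computing the offsets myself; a rough sketch with made-up sizes:

    // All solver state packed into one flat Uint32Array; the offsets the
    // type system won't track for me get computed by hand instead.
    var numVars = 64, numStates = 1024;
    var stateStride = numVars * 2;                    // los + his per state
    var u32 = new Uint32Array(numStates * stateStride);
    function lo(state, v) { return u32[state * stateStride + v]; }
    function hi(state, v) { return u32[state * stateStride + numVars + v]; }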

Either way, I am grateful that Mozilla continues to push for a fast, open and portable language. I don't have much love for js, but it's a damn sight better than trying to deliver portable native code to non-technical users.


And they are using GitHub!

I fixed some issues on Firefox, but stopped because the Bugzilla/Mercurial workflow was so bad.

I did some fixes for other OSS projects on GitHub later and it worked like a charm.


Full support for contributing to Firefox via git is coming: http://glandium.org/blog/?p=3413


Doesn't mean we're going to use GitHub.


Well, if you optimized the patch process (creating PRs on GitHub is much faster than creating patches and uploading them to Bugzilla), it would lower the barrier.


It is a bit of a pain to get everything all set up, but once you do, you can use moz-git-tools [1] to just do:

    git bz attach -e HEAD

...and then it will figure out which bug to attach it to, bring up a little file to edit so you can change the commit message, and it will automatically attach the patch to the bug. You can also set reviewer flags, obsolete other patches, etc., at the same time.

[1] https://github.com/mozilla/moz-git-tools


We're moving away from the attach-patches-to-bugs model, but in another direction: Bugzilla integration with ReviewBoard, with changesets pushed to a review server and review requests triggered automatically.

See https://mozilla-version-control-tools.readthedocs.org/en/lat... for details


Sounds nice :)

The old model felt rather clunky. I always had the impression that the coding was the easy part; getting the patch creation right felt like a chore.


Hmmm... would it make sense to talk about a JVM written in Rust? Could that make it easier to write a safe JVM that would be less susceptible to exploits? It would be wonderful if we could get there and have a Mozilla browser with "out of the box" Java support without needing a separate plugin.


Aren't most or nearly all of the Java exploits related to running sandboxed code? Instead of relying on a code security model, use containers or other external permission systems.

For Microsoft, all the RCEs were either in C/C++ code or from loading .NET into partially trusted scopes (like a browser).

I think Rust would perhaps have fixed a few of the managed-runtime vulnerabilities, but probably not the majority. Forget sandboxing and you're done.


>Aren't most or nearly all of the Java exploits related to running sandboxed code?

Yes. Memory corruption flaws are still found with some regularity in the JVM, but they probably represent less than 10-20% of browser security issues.


Rust can't really make JIT compiled code substantially safer. JIT compilers are pretty much the unsafest things ever.


To be specific: Rust's memory safety guarantees only ensure that the Rust code itself is free from memory safety violations; they can't guarantee that the output of that code is sane. That said, maybe the other features, like ADTs and explicit compile-time tracking of ownership, make writing correct JITs (slightly) easier than plain C++, but that's yet to be seen (and the benefit seems like it would be small in any case).


There are already JVMs written in Java, like JikesRVM, Graal/SubstrateVM, and a few others.

There is little that Rust's safety could add over Java's memory safety.

There has been talk that HotSpot might eventually give way to Graal, but no plans were ever announced.


So I take it this means that Mozilla is reopening "Mozilla Labs".

It's interesting to see Shumway on there as I was under the impression the project was put on hold.


> So I take it this means that Mozilla is reopening "Mozilla Labs".

No. Mozilla Research and Mozilla Labs were distinct entities. From http://www.ianbicking.org/blog/2014/09/professional-transiti...:

"Note that Mozilla Labs is distinct from Mozilla Research – Research is the home of projects like Rust and ASM.js. Mozilla Research is still going strong. To make a broad distinction between the two groups: Research has worked on foundational technologies for the web, especially related to programming languages, while Labs was product-focused. Also Research has been led by Brendan Eich and now David Herman, with what appears to be a fairly clear vision and succession. Labs was led by a number of people with different interests and different visions – some people with an eye to external validation, some looking to spur disruptive (also uncomfortable) changes in Mozilla, some hoping to enable and include external contribution."

> It's interesting to see Shumway on there as I was under the impression the project was put on hold.

Shumway is definitely under active development. See https://github.com/mozilla/shumway/ for details and commit history.


Oh, thank you for clarifying that. I hope Mozilla comes up with another, more successful Labs-like thing. I'd like to see them experimenting with P2P technologies and products.


> It's interesting to see Shumway on there as I was under the impression the project was put on hold.

Having interned at Mozilla Research last summer, I can assure you that there is a small, passionate team of paid developers working hard on it.


I'm surprised that pdf.js is described as a "success". I don't think I've ever heard a user of it say anything good about it. Any time I have heard about it, it's always somebody else in the office cursing loudly about it taking forever to open a PDF, or that it locked up Firefox, or that it's rendering something improperly, or that they wanted Foxit Reader or some other app to open the PDF instead of Firefox. Just because it's integrated with Firefox doesn't make it a "success", as far as I'm concerned. Australis is bundled with Firefox, too, and it's almost universally hated.


> Australis is bundled with Firefox, too, and it's almost universally hated.

My own anecdotal evidence is completely the opposite: I've never heard anything but praise. Being able to open PDFs right in my browser with no stupid plugins is downright heavenly. There are only a couple of PDFs I've ever seen that made pdf.js choke. I absolutely love it and am very glad I don't have to mess with some external PDF viewer or annoying plug-in ever again.


Ditto. I'm happier with pdf.js than with my external Acrobat Pro. Also, it takes a few sessions to get used to it, but I don't see much of a difference in how I use FF with Australis (i.e., it's kinda good but nothing to think about too much).


I'm happy enough with PDF.js that I went out of my way to install it in Chrome (which is my primary browser for reasons).


I absolutely hate it. It only takes one or two PDFs that crash or freeze Firefox before you install a completely separate viewer that is much better. It depends on how often it happens to you, but I probably had three crash-or-freeze issues before I switched. I also had dozens of documents with invisible text (wrong styling in pdf.js), which really soured it for me.


I've had a few electronics datasheets that caused PDF.js to fail or run slowly, but in the main I've found that it works perfectly well.

For the ones that do fail it's easy enough to download them and open them up in a different viewer.


One could argue that it was shipped in Firefox too early. But the performance and rendering quality/completeness has improved greatly since then.

("Initially flaky software improves over time" is a common story, albeit a boring one, and bad first impressions can be hard to shift.)

I'm always happy when it fires up and saves me having to switch to an external PDF viewer.


I've never had any real issues with pdf.js - I love it, to be honest. It's great not needing to have another Adobe plugin installed.


> Australis is bundled with Firefox, too, and it's almost universally hated

[citation needed]. Remember that people bitching on web forums doesn't represent the general population's impression. Firefox has extensive telemetry in the browser -- I'm certain that if Australis hurt usage metrics, there'd be changes, or at least discussion in the Mozilla community (which, of course, is essentially all publicly visible).


An objective look at Firefox's metrics shows that it went from around 35% of the browser market a few years ago to under 10% today, and that number is still dropping. In addition, Firefox has almost no usage on mobile systems.

Those metrics tell us everything we need to know about Mozilla's efforts these past few years: people in general are not happy with them, to the point of abandoning Firefox.


Several things:

As of recently, that trend has finally stopped: Firefox has started gaining back (very) small amounts of market share. The point where user attrition began slowing down actually came slightly before the Australis launch, but after the launch that recovery sped up considerably.

Thus, it looks like Australis helped substantially with winning back users.

Plus, a few years ago, the competition was essentially IE6/7. Not hard to gain market share against that. Now there are several great browsers to choose from. Which I'd say is a fantastic development, even though I work for Mozilla.

Finally, there are the billions of dollars that e.g. Google pours into advertising for Chrome, which might have something to do with how market share developed.


I don't hate Australis, and I'd be surprised if that hatred were anywhere near universal; it's very Chrome-esque, and people don't seem to have such a problem with that. What seems more likely is that you're hearing a vocal minority that hates it; I'd bet most people couldn't care less so long as it works.


You need to expand your universe.

pdf.js is pretty good, and people generally like it without thinking twice about the fact that they're not using, say, Adobe Reader.

As for Australis, there were loud outcries when it happened (as there always are with big UI changes; see 3.x to 4), but it quieted down quite fast.

That's because "loud" doesn't equal "majority", and this vocal minority got used to it, moved elsewhere, or simply used some addon to keep things how they liked them.


pdf.js is better than a plugin, in my opinion. It probably doesn't offer as good a UX as a plugin at the moment, but it's pretty promising for the future.


Are your friends using late-2000s netbooks? I routinely open large PDFs (greater than 5 MB) with no problem. And Firefox is very customizable; if you don't like the Australis look, you are free to change it (I use Tree Style Tabs personally). This comment is just unnecessary trolling.


My experience with excessively large PDFs is the same, albeit they are largely just text and tables with no complex vector graphics.


Are you sure those people are cursing PDF.js and not Adobe's Acrobat Reader? The terrible performance and security vulnerabilities in Acrobat Reader were a major motivation for Mozilla creating PDF.js in the first place.


That's because we have a habit of complaining more than praising; the 90% of the time it just works doesn't get a mention. Though personally I use an external viewer (SumatraPDF on Windows).


Same here. It usually hangs, renders boxes instead of characters, and has hiccups when switching pages.

I usually configure PDFs to be saved, so that I can enjoy a native PDF viewer instead.


Then let me be one of the first. I like pdf.js and Australis.


I don't like Australis but I like pdf.js.


I hope Shumway will arrive before the major shift to Wayland on the desktop.

Daala is a very exciting project. The current mess of codec support on the Web is just horrible.


Why would the two be related? One is a replacement for Flash, the other for X?


Flash depends on GTK2, and Wayland requires GTK3. So ideally Firefox for Wayland shouldn't have any dependencies on GTK2 (mixing the two in one process isn't really a good idea, and even if you somehow managed to mix them in separate processes, GTK2 would still require X to function, so you'd probably have to fall back to XWayland, which is not ideal either).

So Shumway can solve this (at least in context of Flash) replacing the Flash plugin altogether.

For more details, see https://bugzilla.mozilla.org/show_bug.cgi?id=627699


Firefox already uses an external process to run Flash so leaving that on GTK2 while the browser uses GTK3 isn't an issue. The equivalent of XEmbed (to put the flash content in the page) in Wayland is writing a nested Wayland compositor which would be compatible with an XWayland window. Basically, Firefox itself can be a native Wayland app while still using the X11 dependent Flash player.


Maybe Adobe Flash doesn't work with pure Wayland, but needs X? So with Shumway you wouldn't need an X compositor (at least for Flash).


I think the user was just listing things they are excited for. Not drawing relationships between them.


What about Mozilla Brick and Web Components?


Web Components are based on a collection of proposed web standards that are still being implemented by browser vendors; they're not something by Mozilla. You can learn more about the current direction and updates regarding Web Components in this blog post: https://hacks.mozilla.org/2014/12/mozilla-and-web-components...

As for Brick, it is mostly made by people from the Apps team, and last time I checked it was going through a rewrite to base it on platform.js instead of x-tags. You can check with someone from that team in #apps on irc.mozilla.org. Brick is not research; it is just a collection of Web Components.


Thanks for the link! I'm very happy to see Mozilla's commitment to implementing Custom Elements and Shadow DOM, and I hope IE and Safari will catch up some day.


So x-tags is getting phased out? Unfortunately, Brick feels a bit lacking pitted against Polymer.


No pdf.js?


I think the takeaway is that pdf.js is no longer a Mozilla Research project. It has been included with Firefox since v19.


"Following on the success of pdf.js, a high-fidelity PDF renderer written in pure HTML and JavaScript, the Shumway project..."


"...aims to implement an emulator for the Flash multimedia platform."


The point is not that Shumway is directly connected with pdf.js, but rather that the wording in its description suggests pdf.js is no longer a research project.


But I thought Shumway was about converting Flash to HTML, not PDF - or is it now PDF and Flash?


(Director of Strategy for Mozilla Research here.)

pdf.js was the brainchild of Andreas Gal, now our CTO but at the time a cofounder of Mozilla Research along with Brendan Eich and me. That project started briefly in Research but was shepherded to product pretty quickly.

Shumway is a separate project but similar in spirit. It's not a converter per se, but rather an emulator including a full ActionScript Bytecode (ABC) JIT, implemented in pure JavaScript. Right now we are interested in getting Shumway to the point where it can be used in Firefox as an alternative to the native Flash plugin for certain kinds of web Flash content, to provide better security, stability, and performance. Over time as Shumway matures the ultimate goal would be to eliminate the need for the Flash plugin entirely, but we'll walk before we run.

But Shumway is usable as a standalone project as well, and others have begun taking notice. For example, Prezi has invested in Shumway as a library for rendering vector graphics:

https://medium.com/prezi-engineering/how-and-why-prezi-turne...


Separate projects. But, pdf.js is integrated into Firefox and is past the research stage. I believe they're just sort of saying, "Hey, we made that other seemingly impossible idea work when we made pdf.js actually work pretty well, so you probably shouldn't laugh when we say we're implementing Flash in JS and HTML5."


It is. pdf.js does pdf, shumway does flash. Read the linked page.


It looks like that's a Mozilla Labs project, not a Mozilla Research project.

https://mozillalabs.com/en-US/pdfjs/


Also, if you folks enjoy these projects, take your time (and money) to donate some bucks to Mozilla.

Mozilla is the only independent vendor pushing technology and principles focused on people over profit.

You can find the donation page at https://sendto.mozilla.org/page/contribute/givenow-seq#page-...


> Mozilla is the only independent vendor pushing technology and principles focused on people over profit.

That it is the only independent vendor doing so is categorically false.

That it always puts people over profit is questionable. Its relationship with Google and reliance on ad revenue run counter to those principles. Perhaps it is doing so only as a survival tactic, but they do compromise on the principle.


Does it still have a relationship with Google?


For a while, the Thunderbird default search engine was pegged to Bing, and there was no easy way to change it. A marketer came to Bugzilla to explain that they would not fix it, for business reasons.


Which seems backwards because before Chrome, Google would pay Mozilla pretty large amounts of money [0] just to have their search as an easily-changeable default.

[0] http://www.pcmag.com/article2/0,2817,2398046,00.asp


I'm not sure if it still has relationships with Google, but it did form a partnership with Yahoo recently [https://blog.mozilla.org/press/2014/11/yahoo-and-mozilla-for...].


They diversified. I think they provide Yahoo search by default in most places, Yandex in Russia, and some other one in China.


Yahoo is the default for the U.S., Yandex for Russia, and Baidu for China. All the others default to Google. And yes, you can always change the default search engine to your preferred one in the preferences.


Nah, there is too much JS on their agenda. Locking the web to legacy languages is not cool. Please do not donate.


Not the biggest fan of js myself. For something cooked up in 10 days during the summer of '95, its syntax is brutal. The original intention was to put "Scheme in the browser". The road to perdition is paved with good intentions...


Despite the parent comment getting downmodded, I think it makes a valid point.

Why don't we see a true plurality of programming languages supported within the browser?

I'm talking about proper implementations, too. Not hacks that use something like Emscripten to mangle C code down to JavaScript (which is all that asm.js is, after all).

NaCl and PNaCl are a much more general and sensible approach than trying to contort JavaScript into a pseudo-bytecode.

If Mozilla really does care about openness and freedom, then we'd see more emphasis being placed on languages other than JavaScript. But CmonDev is right, we just don't see that happening. We see a monoculture developing around JavaScript and only JavaScript. In general, monocultures of any type are an unhealthy thing, and lead to stagnation.


NaCl is not portable, and PNaCl will never be a standard: it's very complex, very implementation-specific, and the other browser vendors will never adopt it. Quite the contrary, I view PNaCl as being the hack you're talking about - Google's version of ActiveX.

The great thing about JavaScript is that it can be used as a compilation target, like bytecode. There are already many compilers that do that, like ClojureScript and Scala.js; I'm working on a mixed JVM/JS project in Scala right now. And I get the secure sandbox, the portability of JS, and the tools for free. That's the advantage of a standard; otherwise I might as well go native.

And given that you can compile C++ to JavaScript with really good results, I really don't get what problems you're trying to solve. And yes, Mozilla is responsible for making this happen, getting everybody on board with asm.js, even Microsoft. I find that to be a great development.


Compile C to LLVM bytecode. Compile LLVM bytecode to JavaScript.

Compile C to a subset of LLVM bytecode.

I fail to see how the second is harder than the first. In both cases there are major implementation issues to overcome. The difference is that LLVM bytecode was designed to deal with this, while asm.js is contorting an already problematic language and making unofficial (non-spec) guarantees.

The argument that compilers to JS exist is a non-issue, because there already exist compilers from JS to LLVM (e.g. JavaScriptCore). LLVM already has advanced optimizers that are well tested, so compilers targeting LLVM get a performance head start for free.

The argument that the bytecode is somewhat implementation-specific does not matter, for three reasons. First, a monopoly on implementation does not matter unless the choice was wrong (and even today, all the major JS engines share a lot of similarity in their high-level structure). Second, there are multiple LLVM JITs and native compilers around, each with its own take on implementation, which seems to indicate that LLVM isn't that closely tied to a single way of doing things. Finally, in JavaScript itself, new features are not designed purely with the programmer in mind: a major factor is whether the big 4 find them easy to implement, which shows that JavaScript itself is implementation-specific to those 4 companies' opinions. Couldn't a bytecode do the same if necessary?

As there exists a way to compile LLVM code to JS code, backward compatibility with older browsers is simply a matter of adding an additional step to the compilation chain. The JS could check whether LLVM was supported, and browsers without support would simply ignore script tags that use LLVM.
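To sketch that hypothetical fallback (window.LLVM here is a made-up capability flag, purely for illustration):

    // Browsers ignore <script> tags with types they don't recognize, so a
    // page could ship an LLVM build plus a JS build and pick at runtime.
    if (typeof window.LLVM === "undefined") {
      var s = document.createElement("script");
      s.src = "app.compiled.js";       // same source, compiled LLVM-to-JS
      document.head.appendChild(s);
    }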

As to the question of "why not just stick with JavaScript": I program in JS every day and really like most of the good parts of the language. That does not excuse the awful parts of the language, nor the problems it causes in real-world companies where most programmers aren't "rock stars".

For a programmer new to JavaScript (even one with lots of prior programming experience), getting up to speed takes a disproportionately long time compared to other languages, and until then the programmer is probably installing land mines that will need to be fixed later. Every fundamental feature of JavaScript (with the possible exception of closures) has a gotcha that you will run into - and not esoteric gotchas; many of these will be hit very early on.
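A couple of the non-esoteric ones I mean:

    var fns = [];
    for (var i = 0; i < 3; i++) {
      fns.push(function () { return i; });  // every closure shares one i
    }
    fns[0]();           // 3, not 0 -- var is function-scoped, not block-scoped
    [10, 1, 2].sort();  // [1, 10, 2] -- sort() compares as strings by default
    0.1 + 0.2;          // 0.30000000000000004 -- every number is a double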

Even the most advanced of us find reasoning about JS inheritance to be nearly impossible for non-trivial systems and teaching this is even harder.

Aside from ubiquity, what reason is there to keep the language around? The only awesome features are that it's very function-centric (minus proper tail calls) and that its combined dict/object take on prototypal inheritance is very nice to work with (unless you actually need to use the inheritance part).


LLVM bytecode was not designed to be a universal bytecode; it still contains architecture-specific information for code generation.

How long would it take for Mozilla, Google, Apple, and Microsoft to standardize on a common bytecode and then write compatible implementations for x86, ARM, and other platforms? NaCl and PNaCl are contender bytecodes, but Google is the only browser vendor that is interested.

JS is a standard language that already exists, so asm.js is backwards compatible with browsers that were shipped even before asm.js existed. The arguments for a universal bytecode over asm.js are mostly about elegance and "Better is Better." asm.js is definitely "Worse is Better" design.

http://mozakai.blogspot.com/2013/05/the-elusive-universal-we...
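To make the backward-compatibility point concrete: an asm.js module is ordinary JavaScript in which type coercions double as annotations. A minimal sketch:

    function MyModule(stdlib) {
      "use asm";                       // validated subset; still plain JS
      var sqrt = stdlib.Math.sqrt;
      function norm(x, y) {
        x = +x;                        // parameter typed as double
        y = +y;
        return +sqrt(x * x + y * y);   // return typed as double
      }
      return { norm: norm };
    }
    // Any engine can run this; asm.js-aware engines compile it ahead of time.
    MyModule(window).norm(3, 4);       // 5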


One thing I want to comment on here is a mistake I often make myself, which is to focus on the language/compiler/interpreter/jit/translators/etc. and to lose sight of the platform APIs -- user interfaces, input devices, OS services, data sources, and so on. As fun as it is to talk about languages on HN, APIs are arguably a bigger part of the big picture in many platforms. LLVM itself doesn't come with any platform APIs, so it's typically just one part in a larger and much more complex system.


LLVM's lack of APIs is part of the appeal to me. It gives us a chance to remove the cruft in the APIs (because LLVM wouldn't be backward compatible anyway) and do them right, or at least the way we now believe they should be done after screwing them up so many times.


JavaScript is open and free.

Also, with Rust, they are already developing another language.

You might be right about the monoculture problem, but JS is a big thing with many developers; it would be dumb not to use that resource.


Do you think it's also worrying that we all communicate in English? Is the English language a stagnated monoculture?

Pandering to language fanboys isn't a good idea. JavaScript is a good enough language, and if you don't like it, write in something else and compile it to JavaScript.


I'm not smart or experienced enough to know the answer to this; it's an honest question. Doesn't compiling things to JavaScript inherently limit what we can do? For example, if a feature is supported in some other language but not in JavaScript, then I imagine that even if there were a way to compile it to JavaScript, there might be a severe performance impact.

I guess what I am saying is that it seems like we are stuck with JavaScript for historical reasons, and it's a shame that we aren't exploring other options. Of course, I can also completely see that sticking with JavaScript is the most practical option.


> Doesn't compiling things to javascript inherently limit what we can do?

Yes, any concrete compilation target has limitations, including JavaScript.

For example, JavaScript has no 64-bit integers, which can slow down some code. As another example, strings on .NET and in Python are different, causing Python on the CLR to either behave differently or run more slowly.
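For instance, a compiler targeting JS has to legalize 64-bit integer math into 32-bit halves, roughly like this (simplified sketch):

    // Emulated 64-bit unsigned add built from 32-bit halves, the kind of
    // code Emscripten-style compilers emit for i64 arithmetic.
    function add64(aLo, aHi, bLo, bHi) {
      var lo = (aLo + bLo) >>> 0;            // unsigned 32-bit add
      var carry = lo < (aLo >>> 0) ? 1 : 0;  // did the low word wrap around?
      var hi = (aHi + bHi + carry) >>> 0;
      return { lo: lo, hi: hi };
    }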

I think the important thing to realize is that there isn't a "perfect" compilation target. What we really want is a target that is (1) fast on all languages, (2) portable to run on all OSes and architectures, and (3) safe to run in a sandboxed manner. There is no target I am aware of that maximizes all 3. All we can do is try to get close.

I would say that JavaScript is doing pretty well in that respect already (that is, like all other options, it isn't perfect, but it's relatively decent), and remaining missing features are being worked on.

> it's a shame that we aren't exploring other options.

I don't think that's true at all - such experiments have been done! We had or have ActiveX, Java, Flash with its multiple VMs, Silverlight/.NET, NaCl, PNaCl, and others I forgot.

Given the performance of JavaScript engines today, those experiments of other VMs don't show a big enough reason to prefer them. asm.js code runs fast enough to run AAA game engines, which is a very good test.


Thanks for the response. I'm glad to see that I was wrong; asm.js looks extremely promising.


Think of JavaScript as the JVM, CLR, or ABC of the Web runtime. Now imagine someone trying to add a separate bytecode/intermediate language to those runtimes. There are security issues and architecture issues to consider.

The language monoculture problem exists just as much on other runtimes as it does on the Web. It's just that nobody is expected to write JVM or CLR bytecode by hand.

That's why projects like asm.js and Emscripten are important. These projects show vendors what is needed to make JavaScript a better compile target, so that someday writing in other languages for the Web will be as natural as it is on the JVM and CLR.



