

Mozilla Research Projects - yuashizuki
https://www.mozilla.org/en-US/research/projects/

======
bjz_
A neat thing about Shumway is just how much of it is written in TypeScript[0].
It's great that Mozilla is getting behind the project.

[0]: [https://github.com/mozilla/shumway/tree/master/src](https://github.com/mozilla/shumway/tree/master/src)

------
bkeroack
It's really fantastic to see Mozilla become a sort of steward and incubator of
the open web, considering that Mozilla originally came from a revolutionary
(at the time) and somewhat desperate attempt by Netscape to counter the
monopoly power of Microsoft/IE. If I'm remembering correctly, this was the
first high-profile corporate OSS dump (years before Java, for example), and
shortly thereafter it was largely considered a failure, since Netscape became
irrelevant and the Mozilla project didn't stop the MS juggernaut. It would
therefore be a number of years before another company was willing to take a
risk like that.

Of course with hindsight we can see that the Mozilla folks played the long
game. Quietly working in the background they produced a product (Firefox) that
actually did largely kill IE dominance. You can argue the role that Chrome had
in this, but my opinion is that Firefox created the market for non-IE
browsers. Without this trailblazing Chrome would not exist.

Congrats to everyone responsible, from the beginning to the present day.

------
ChrisAntaki
Keep up the great work, Mozilla! I'm excited to see where a lot of these
projects go, especially _asm.js_.

------
thomasfoster96
I hadn't heard of about half of these - it's great that there's a sort of
directory for them.

Sweet.js is pretty awesome (I was going to say it's sweet, but that's stupid).
Broadway.js and Shumway also look great; I'm going to check them out tonight.

Regarding Parallel JavaScript, does anyone know how this relates to Khronos'
WebCL project? Hardware manufacturers seem really interested in WebCL, but
software developers aren't.

~~~
derf_
Unlike OpenGL, OpenCL allows writes to arbitrary addresses computed by the
OpenCL kernel. To run it safely on the web, you have to solve a very difficult
validation problem: proving that an arbitrary kernel won't do bad things. I
don't think anyone has adequately solved that problem. As stated at
[https://bugzilla.mozilla.org/show_bug.cgi?id=664147#c30](https://bugzilla.mozilla.org/show_bug.cgi?id=664147#c30),
"The future for GPU compute in Firefox is ARB_compute_shader, which is now
part of ES 3.1."

~~~
thomasfoster96
Yep, I'd heard browsers wanted to use shaders instead of WebCL. I think there
are a couple of Node packages that implement WebCL, but they're outside the
browser environment and really only used with trusted code.

Looking through some discussions on Google Groups related to chromium
development, it looks like they thought WebCL might be safe to use eventually,
but not in the near term.

------
ape4
I donated some money to Mozilla this year. Maybe you could too?

~~~
exo762
Donated to Mozilla using Bitcoin. Feels good, man.

~~~
k__
I "donated" to Mozilla using Code.

------
jamii
LLJS is listed here, but the last commit was over a year ago. Having spent the
last month hand-coding arrays of structs in JS, I'm really feeling the need
for better low-level constructs. Looks like I'm stuck waiting for Rust +
Emscripten to be a valid option.
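For the curious, hand-coding structs over an ArrayBuffer typically looks something like this (a minimal sketch; the two-field layout and the function names are made up for illustration):

```javascript
// Minimal sketch of hand-packing an array of structs into an ArrayBuffer.
// Hypothetical layout: each "point" struct is { x: uint32, y: uint32 }.
const STRUCT_SIZE = 8; // two uint32 fields, 4 bytes each
const count = 3;
const buffer = new ArrayBuffer(count * STRUCT_SIZE);
const view = new DataView(buffer);

function writePoint(index, x, y) {
  const base = index * STRUCT_SIZE;
  view.setUint32(base + 0, x, true); // true = little-endian
  view.setUint32(base + 4, y, true);
}

function readPoint(index) {
  const base = index * STRUCT_SIZE;
  return {
    x: view.getUint32(base + 0, true),
    y: view.getUint32(base + 4, true),
  };
}

writePoint(0, 10, 20);
writePoint(1, 30, 40);
console.log(readPoint(1)); // { x: 30, y: 40 }
```

Every field access goes through manual offset arithmetic like this, which is exactly the bookkeeping that Typed Objects (or real structs) would take off the programmer's hands.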

~~~
tschneidereit
Structs are coming to JS proper in the form of Typed Objects. For a whole load
of details, check the spec draft ([https://github.com/dslomov-chromium/typed-objects-es7](https://github.com/dslomov-chromium/typed-objects-es7))
or this very readable paper:
[http://smallcultfollowing.com/babysteps/pubs/2014.04.01-TypedObjects.pdf](http://smallcultfollowing.com/babysteps/pubs/2014.04.01-TypedObjects.pdf)

Implementation of Typed Objects is progressing very nicely in Firefox; if you
want to play around with them, just download a Nightly build.

~~~
jamii
Typed Objects are a huge step forwards but they still have some limitations. I
frequently use things like:

    
    
    solverStates = {
      numVars: uint32
      numConstraints: uint32
      numStates: uint32
      solverStates: solverState[numStates]
    }

    solverState = {
      los: uint32[numVars]
      his: uint32[numVars]
      constraintStates: *void[numConstraints]
    }

Even using C, dealing with types like this is kind of a pain and would benefit
from some macro magic. With Typed Objects it looks like I have to choose
between variable-sized arrays and pointers - I can't have both, because then
the GC won't know where to look for the pointers. According to
[http://wiki.ecmascript.org/doku.php?id=harmony:typed_objects](http://wiki.ecmascript.org/doku.php?id=harmony:typed_objects),
working with such types is an explicit non-goal:

> In particular, binary formats often need expressive and dynamic data
> dependencies that are decidedly out of scope for this API, such as being
> able to specify an array whose length is determined by a preceding integer

That's probably the right decision for JS, given the constraints of a sane GC
implementation. It still looks like my best option is to use asm.js and have
full control over layout.

Either way, I am grateful that Mozilla continues to push for a fast, open and
portable language. I don't have much love for js, but it's a damn sight better
than trying to deliver portable native code to non-technical users.
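For reference, the kind of asm.js code that gives full control over layout looks roughly like this hand-written sketch (real asm.js is normally generated by Emscripten, and this toy module may not pass strict asm.js validation, though it runs as ordinary JavaScript either way):

```javascript
// Toy asm.js-style module: sums n uint32 values laid out at the start of
// the heap. The "use asm" prologue lets supporting engines compile it
// ahead of time; other engines simply run it as plain JavaScript.
function SumModule(stdlib, foreign, heap) {
  "use asm";
  var u32 = new stdlib.Uint32Array(heap);
  function sum(n) {
    n = n | 0;
    var p = 0;
    var total = 0;
    // Walk the heap in 4-byte steps; p is a byte offset, p >> 2 an index.
    for (p = 0; (p | 0) < (n << 2); p = (p + 4) | 0) {
      total = (total + u32[p >> 2]) | 0;
    }
    return total | 0;
  }
  return { sum: sum };
}

const heap = new ArrayBuffer(0x10000); // asm.js heaps are power-of-two sized
const { sum } = SumModule(globalThis, null, heap);
new Uint32Array(heap).set([1, 2, 3, 4]);
console.log(sum(4)); // 10
```

The programmer (or compiler) decides exactly where every value lives in the heap, which is the "full control over layout" being traded against the convenience of Typed Objects.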

------
k__
And they are using Github!

I worked on some Firefox issues, but stopped because the Bugzilla/Mercurial
workflow was so bad.

I did some fixes for other OSS projects on GitHub later and it worked like a
charm.

~~~
tschneidereit
Full support for contributing to Firefox via git is coming:
[http://glandium.org/blog/?p=3413](http://glandium.org/blog/?p=3413)

~~~
khuey
Doesn't mean we're going to use github.

~~~
k__
Well, if you optimized the patch process (creating PRs on GitHub is much
faster than creating patches and uploading them to Bugzilla), it would lower
the barrier.

~~~
tschneidereit
We're moving away from the attach-patches-to-bugs model, but in another
direction: bugzilla integration with reviewboard, with changesets pushed to a
review server and review requests being triggered automatically.

See [https://mozilla-version-control-tools.readthedocs.org/en/latest/mozreview.html](https://mozilla-version-control-tools.readthedocs.org/en/latest/mozreview.html)
for details.

~~~
k__
Sounds nice :)

The old model felt rather clunky. I always had the impression that the coding
was the easy part; getting the patch creation right felt like a chore.

------
bobajeff
So I take it this means that Mozilla is reopening "Mozilla Labs".

It's interesting to see Shumway on there as I was under the impression the
project was put on hold.

~~~
Jakuv2000
I'm surprised that pdf.js is described as a "success". I don't think I've ever
heard a user of it say anything good about it. Any time I have heard about it,
it's always somebody else in the office cursing loudly about it taking forever
to open a PDF, or that it locked up Firefox, or that it's rendering something
improperly, or that they wanted Foxit Reader or some other app to open the PDF
instead of Firefox. Just because it's integrated with Firefox doesn't make it
a "success", as far as I'm concerned. Australis is bundled with Firefox, too,
and it's almost universally hated.

~~~
__david__
> Australis is bundled with Firefox, too, and it's almost universally hated.

My own anecdotal evidence is completely the opposite. I've never heard
anything but praise. Being able to open PDFs right in my browser with no
stupid plugins is downright heavenly. There are only a couple of PDFs I've
ever seen that made pdf.js choke. I absolutely love it and am _very_ glad I
don't have to mess with some external PDF viewer or annoying plug-in ever
again.

~~~
zzleeper
Ditto. I'm happier with pdf.js than with my external Acrobat Pro. Also, it
takes a few sessions to get used to Australis, but I don't see much of a
difference in how I use FF with it (i.e., it's kinda good but nothing to
think about too much).

------
shmerl
I hope Shumway will arrive before major shift to Wayland on the desktop.

Daala is a very exciting project. The current mess of codec support on the
Web is just horrible.

~~~
arianvanp
Why would the two be related? One is a replacement for Flash, the other for X?

~~~
shmerl
Flash depends on GTK2, and Wayland requires GTK3. So ideally Firefox for
Wayland shouldn't have any dependencies on GTK2 (mixing the two in one
process isn't really a good idea, and even if you manage to run them in
separate processes, GTK2 would still require X to function, so you'd probably
have to fall back to XWayland, which is already not ideal).

So Shumway can solve this (at least in context of Flash) replacing the Flash
plugin altogether.

For more details, see
[https://bugzilla.mozilla.org/show_bug.cgi?id=627699](https://bugzilla.mozilla.org/show_bug.cgi?id=627699)

~~~
amaranth
Firefox already uses an external process to run Flash, so leaving that on GTK2
while the browser uses GTK3 isn't an issue. The equivalent of XEmbed (used to
put the Flash content in the page) in Wayland is a nested Wayland compositor,
which would be compatible with an XWayland window. Basically, Firefox itself
can be a native Wayland app while still using the X11-dependent Flash player.

------
tkubacki
what about Mozilla Brick and WebComponents ?

~~~
soapdog
WebComponents are based on a collection of proposed web standards that are
still being implemented by browser vendors. It's not something by Mozilla. You
can learn more about the current direction and updates regarding Web
Components in this blog post:
[https://hacks.mozilla.org/2014/12/mozilla-and-web-components/](https://hacks.mozilla.org/2014/12/mozilla-and-web-components/)

As for Brick, it is mostly made by people from the Apps team, and last time I
checked it was going through a rewrite to base it on platform.js instead of
x-tags. You can check with someone from that team in #apps on irc.mozilla.org.
Brick is not research; it is just a collection of Web Components.

~~~
tkubacki
Thanks for the link! I'm very happy to see Mozilla's commitment to
implementing Custom Elements and Shadow DOM, and I hope IE and Safari will
catch up some day.

------
mindcrime
Hmmm... would it make sense to talk about a JVM written in Rust? Could that
make it easier to write a safe JVM that would be less susceptible to exploits?
It would be wonderful if we could get there and have a Mozilla browser with
"out of the box" Java support without needing a separate plugin.

~~~
Jweb_Guru
Rust can't really make JIT compiled code substantially safer. JIT compilers
are pretty much the unsafest things ever.

~~~
dbaupp
To be specific: Rust's memory safety guarantees only cover the Rust code
itself; they can't guarantee that the output of that code is sane. That said,
maybe other features like ADTs and explicit compile-time tracking of
ownership make writing correct JITs (slightly) easier than plain C++, but
that's yet to be seen (and the effect seems like it would be small in any
case).

------
mp3geek
No pdf.js?

~~~
MegaDeKay
"Following on the success of pdf.js, a high-fidelity PDF renderer written in
pure HTML and JavaScript, the Shumway project..."

~~~
pedalpete
But I thought Shumway was about converting Flash to HTML, not PDF - or is it
now PDF and Flash?

~~~
dherman
(Director of Strategy for Mozilla Research here.)

pdf.js was the brainchild of Andreas Gal, now our CTO but at the time a
cofounder of Mozilla Research along with Brendan Eich and me. That project
started briefly in Research but was shepherded to product pretty quickly.

Shumway is a separate project but similar in spirit. It's not a converter per
se, but rather an emulator including a full ActionScript Bytecode (ABC) JIT,
implemented in pure JavaScript. Right now we are interested in getting Shumway
to the point where it can be used in Firefox as an alternative to the native
Flash plugin for certain kinds of web Flash content, to provide better
security, stability, and performance. Over time as Shumway matures the
ultimate goal would be to eliminate the need for the Flash plugin entirely,
but we'll walk before we run.

But Shumway is usable as a standalone project as well, and others have begun
taking notice. For example, Prezi has invested in Shumway as a library for
rendering vector graphics:

[https://medium.com/prezi-engineering/how-and-why-prezi-turned-to-javascript-56e0ca57d135](https://medium.com/prezi-engineering/how-and-why-prezi-turned-to-javascript-56e0ca57d135)

------
soapdog
Also, if you folks enjoy these projects, take your time (and money) to donate
some bucks to Mozilla.

Mozilla is the only independent vendor pushing technology and principles
focused on people over profit.

You can find the donation page at
[https://sendto.mozilla.org/page/contribute/givenow-seq#page-1](https://sendto.mozilla.org/page/contribute/givenow-seq#page-1)

~~~
CmonDev
Nah, there is too much JS on their agenda. Locking web to legacy languages is
not cool. Please do not donate.

~~~
RobinInTheRain
Despite the parent comment getting downmodded, I think it makes a valid point.

Why don't we see a true plurality of programming languages supported within
the browser?

I'm talking about proper implementations, too. Not hacks that use something
like Emscripten to mangle C code down to JavaScript (which is all that asm.js
is, after all).

NaCl and PNaCl are a much more general and sensible approach, rather than
trying to contort JavaScript into a pseudo-bytecode.

If Mozilla really does care about openness and freedom, then we'd see more
emphasis being placed on languages other than JavaScript. But CmonDev is
right, we just don't see that happening. We see a monoculture developing
around JavaScript and only JavaScript. In general, monocultures of any type
are an unhealthy thing, and lead to stagnation.

~~~
bad_user
NaCl is not portable, and PNaCl will never be a standard because it's very
complex and very implementation-specific, and the other browser vendors will
never adopt it. Quite the contrary: I view PNaCl as the hack you're talking
about, being Google's version of ActiveX.

The great thing about JavaScript is that it can be used as a compilation
target, like bytecode. There are already many compilers that do that, like
ClojureScript and Scala.js; I'm working on a mixed JVM/JS project in Scala
right now, and I get the secure sandbox, the portability of JS, and the tools
for free. That's the advantage of a standard; otherwise I might as well go
native.

And given that you can compile C++ to JavaScript with really good results, I
really don't get what problems you're trying to solve. And yes, Mozilla is
responsible for making this happen, getting everybody on board with asm.js,
even Microsoft. I find that to be a great development.

~~~
hajile
Compile C to LLVM bytecode. Compile LLVM bytecode to Javascript.

Compile C to a subset of LLVM bytecode.

I fail to see how the second is harder than the first. In both cases, there
are major implementation issues to overcome. The difference is that LLVM
bytecode was designed to deal with this while asm.js is contorting an already
problematic language and making unofficial (non-spec) guarantees.

The argument that compilers to JS exist is a non-issue, because there already
exist compilers from JS to LLVM (e.g. JavaScriptCore). LLVM already has
advanced optimizers that are well tested, so compilers targeting LLVM get a
performance head start.

The argument that the bytecode is somewhat implementation-specific does not
matter, for three reasons. First, a monopoly on implementation does not
matter unless the choice was wrong (and even today, all the major JS engines
share a lot of similarity in their high-level structure). Second, there are
multiple LLVM JITs and native compilers around, each with its own take on
implementation, which seems to indicate that LLVM isn't closely tied to a
single way of doing things. Finally, in JavaScript itself, new features are
not designed simply with the programmer in mind: a major factor is whether
the big 4 find them easy to implement, which shows that JavaScript itself is
implementation-specific to those four companies' opinions. Couldn't a
bytecode do the same if necessary?

As there exists a way to compile LLVM code to JS code, backward compatibility
with older browsers is simply a matter of adding an additional step in the
compilation chain. The JS could check if LLVM was supported and browsers
without support would simply ignore script tags that use LLVM.

As to the question of "why not just stick with JavaScript": I program in JS
every day and really like most of the good parts of the language. This does
not excuse the awful parts of the language, nor the problems it causes in
real-world companies where most programmers aren't "rock stars".

For a programmer new to JavaScript (even one with lots of prior programming
experience), learning it takes a disproportionately long time compared to
other languages, and until then the programmer is probably installing land
mines that will need to be fixed later. Every fundamental feature of
JavaScript (with the possible exception of closures) has a gotcha that you
will run into (and not esoteric gotchas -- many of these come up very early
on).

Even the most advanced of us find reasoning about JS inheritance to be nearly
impossible for non-trivial systems, and teaching it is even harder.

Aside from ubiquity, what reason is there to keep the language around? The
only awesome features are that it's very function-centric (minus proper tail
calls) and that its combined dict/object take on prototypal inheritance is
very nice to work with (unless you actually need to use the inheritance
part).
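As a concrete example of the inheritance gotchas mentioned above (a minimal sketch, with made-up names): mutable state placed on a prototype is silently shared by every instance.

```javascript
// Shared-prototype gotcha: `tags` lives on the prototype, so every
// instance mutates the same array.
function Post() {}
Post.prototype.tags = []; // looks like a per-instance default, but isn't

const a = new Post();
const b = new Post();
a.tags.push("js");
console.log(b.tags); // [ 'js' ] -- b was never touched

// The fix: give each instance its own array in the constructor.
function SafePost() {
  this.tags = [];
}
const c = new SafePost();
const d = new SafePost();
c.tags.push("js");
console.log(d.tags); // []
```

Nothing in the first definition looks wrong at a glance, which is exactly why reasoning about prototype chains in non-trivial systems is so hard.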

~~~
sunfish
One thing I want to comment on here is a mistake I often make myself, which is
to focus on the language/compiler/interpreter/JIT/translators/etc. and to
lose sight of the platform APIs -- user interfaces, input devices, OS
services, data sources, and so on. As fun as it is to talk about languages on
HN, APIs are arguably a bigger part of the big picture on many platforms. LLVM
itself doesn't come with any platform APIs, so it's typically just one part of
a larger and much more complex system.

~~~
hajile
LLVM's lack of APIs is part of the appeal to me. It gives us a chance to
remove the cruft from the APIs (because LLVM wouldn't be backward compatible
anyway) and do them right -- or at least how we now believe they should be
done, after screwing them up so many times.

