
Fay -- A proper subset of Haskell that compiles to Javascript - nandemo
http://chrisdone.com/fay/
======
perfunctory
The combined man-hours spent on producing various compilers targeting JS would
be enough to:

      * Build the next generation browser that supports a proper programming language.
      * Write migration instructions.
      * Convince major web-app providers that they need to migrate.
      * Help with migration.

~~~
jballanc
Browsers don't need a programming language. Browsers need a standardized
bytecode. The problem is, bytecode can have a significant impact on potential
performance. So, as long as browsers are still competing on JS performance,
they are actively disincentivized from pursuing standardization of a
bytecode...

~~~
lerouxb
Bytecode on the web is a shitty idea. The minifiers and obfuscators out there
are already breaking "view source" and with that the spirit of the web.

Besides: I can't think of any language-neutral bytecode project that ever
worked. Remember Parrot?

~~~
exDM69
> Bytecode on the web is a shitty idea.

In my book, some kind of byte code or other intermediate language is a lot
better idea than distributing the source code of web apps. There are a lot of
disadvantages to delivering programs (for immediate execution) as source code.
Note: this is orthogonal to open source and licensing.

So what would be needed is an intermediate program representation that:

      - Is well defined
      - Is a good compilation target for various source languages
      - Is easy to validate for correctness and security
      - Is fast to interpret and compile
      - Is distributed in a fast-to-read binary format
      - Suits dynamic languages

In my opinion, LLVM IR or CLR bytecode are the most technically suitable of
the existing alternatives.

The web has quite a long legacy of Javascript and a tradition of supporting
deprecated browsers so this kind of revolution is unlikely to happen any time
soon.

> Besides: I can't think of language-neutral bytecode projects that ever
> worked. Remember Parrot?

JVM, CLR and LLVM are all popular compilation targets, each used with dozens
of different source languages. The bytecode and program representation were
never the hard part; the hard part is the frameworks of the OS/platform
underneath.

~~~
knowtheory
I strenuously disagree that distributing the source for web apps is orthogonal
to open source.

That the web is open by default creates social dynamics quite different from
those of systems like Java applets, which are closed by default.

Open source can exist regardless of whether the web is open or not, but you
are kidding yourself if you think whether or not source is distributed _with_
a web page doesn't affect the open source landscape.

~~~
exDM69
Today, JavaScript is distributed in minified and/or obfuscated form. And the
JavaScript code might have originally been generated by a compiler from a
different source language. So calling the stuff your web browser downloads and
executes the "source" is not really a valid argument.

Instead, JavaScript is used as a program representation with small size and
immediate execution in mind, and it really sucks in that role.

Even if the code is distributed in source code form with white space,
identifier names and comments intact (bandwidth isn't cheap, you know),
licensing is still the factor that legally determines if it is free software
or just open source by default.

~~~
knowtheory
The ability to figure out how a piece of code works is different than being
able to take that code and reuse it elsewhere.

View source is important both for the sake of debugging in the case of 3rd
party javascript, and also for the sake of being able to teach others.

Regardless of the minification issue, it's possible to use a code formatter to
expand a minified file and at least identify where declarations are being made
and what functional behavior is being specified. Minification makes viewing
source inconvenient, but that's not equivalent to closed.

~~~
moe
_it's possible to use a code formatter to expand a minified file_

If you go to that length you can just as well use a decompiler, no difference.

~~~
beagle3
Huge difference.

Minified files, more often than not, retain the original identifiers and the
original code structure (e.g. you can still tell a _for_ from a _while_).
With a decompiler, you lose all of that.

Google Closure is a compiler (and thus loses a lot of this data), but most
minifiers are not semantically destructive.

~~~
moe
Many other minifiers also munge the symbols (e.g. YUI compressor). And why
should this process retain anything anyways? If you want your source-code
available for download then... put up a download link.

~~~
beagle3
> And why should this process retain anything anyways?

Because JavaScript has some reflection, and because of weird scoping rules
(with, eval, etc.), you can never be sure where a symbol comes from unless you
have access to the exact version of every other script on the page (e.g.
jQuery).

> If you want your source-code available for download then... put up a
> download link.

I agree - but there is still a huge difference between deminified and
decompiled source readability - which is all I was saying.

------
drcode
This really looks fantastic: All code is run through the GHC compiler for
static type checking and then translated to javascript. This means:

       1. It has full Hindley-Milner type inference
       2. It produces readable javascript that is reasonably debuggable

I could imagine using this for real projects in the near future.

~~~
epidemian

> It produces readable javascript that is reasonably debuggable

The examples on the page don't seem readable or reasonably debuggable to me.
The simple square definition:

    square x = x * x

Compiles into:

    var square = function($_a) {
      return new $(function() {
        var x = $_a;
        return _(x) * _(x);
      });
    };

Which, in turn, is riddled with "$" and "_" indirections.

This is basically why I think that trans-compilation to JS from a semantically
very different language is not a good idea, unless a way of debugging _in the
source language_ is provided. The success of CoffeeScript, I think, comes from
it not differing too much from JS semantically; yes, it adds things like
classes or everything-is-an-expression semantics, but the step of trans-
compiling those things into JS is pretty trivial.

I hope the advent of source maps will improve the ease of use of these
languages.

~~~
sic1
I could not agree more with all your points. Source maps are the key, and I
look forward to them very much.

I am a coffeescripter, and I already have mixed emotions about the extra layer
of complexity it puts on my code for others to get up to speed on. Taking it
further from the actual language (JavaScript) just adds a layer of obfuscation
that does not really help JavaScript and its community at all (and many times,
your peers). It may make you feel better about what you are doing at the time
(e.g. "I must have typed vars", or whatever you people say), but it's just
making you feel better; it still is JavaScript, and you still have to debug
JavaScript across browsers - that's just that. You still need to know all the
JavaScript; you can't just use X-to-JS and only know X. The only reason I get
away with CoffeeScript is when I know I'm in modern-browser world, not old IE
and the greater world of browser bugs. Otherwise I'm writing JavaScript.

------
Xcelerate
A lot of people seem to be questioning whether we _need_ a new programming
language. Nobody _needs_ a new anything most of the time. At least for me, I
enjoy it when people create all sorts of new programming languages. It gives
me a new way to try things and interesting insights into something I would
have never thought about before. I think all of these languages that compile
to Javascript are great.

------
spicyj
I don't know very much about Haskell, but this looks really great from a quick
glance. I wonder how performance compares to plain JS – surely all the
laziness and resulting JS closures have a cost.

~~~
mekwall
What do you mean? The performance will be exactly the same as if you
implemented it in plain JS. The laziness and closures surely have an impact,
but they serve their purpose, and the resulting application wouldn't work the
same without them.

~~~
spicyj
If you always implemented it the most general way, sure, but most people don't
write lazy, curried functions when using JS (and often needn't when writing
Haskell even though it makes more sense in that case).

~~~
mekwall
As always, it's a matter of preference. Fay is for devs who prefer the syntax
and semantics of Haskell over those of JS, and it will not solve anything
beyond that. The overhead is likely negligible in most cases, and is probably
worth it if it lowers development time.

------
clvv
I wonder if there are any plans to make it self-hosting. It would be more
appealing if it could run in browsers and Node like CoffeeScript.

~~~
nandemo
Not sure if I understand your suggestion... I suppose if one wants to write
server-side code in a Haskell-ish language then one would simply use standard
Haskell plus Yesod or another web framework.

~~~
clvv
It is always better to have alternatives. Being self-hosting and compiling to
a widely-used language like JavaScript is definitely a plus. Another useful
case is that an interactive "Try Fay" page can be set up (like the "Try
CoffeeScript" page).

~~~
echaozh
Elm is actually self-hosting by your standard.

If the average IQ of the world population were higher, I think Haskell would
become a widely-used language. Haskell is a language that is much more than
its syntax, and it takes a lot of effort to learn.

~~~
jaekwon
That's why it sucks, sir.

------
fuzzythinker
Instead of comparing it to Roy/Elm, which I have never heard of (has anyone?),
he should compare it to LiveScript, a fork of CoffeeScript that has quite a
few syntactic similarities to Haskell (it's inspired by it).

<http://gkz.github.com/LiveScript/>

~~~
tikhonj
I think this is targeted at Haskell people looking to compile to JavaScript
rather than JavaScript people looking to use a new language.

Roy and Elm are new languages that people who follow Haskell are probably
familiar with. Comparing to them makes more sense than comparing to LiveScript
if your audience is Haskell programmers.

LiveScript is not really anything like Haskell at all except in some entirely
superficial ways. Even the syntax isn't all that similar. Essentially, it's a
slightly more functional CoffeeScript. While certainly interesting to the same
people and for the same reasons as CoffeeScript, it's not very interesting to
people who primarily use Haskell.

~~~
jaekwon
Can you be more specific about how LiveScript is nothing like Haskell?

Granted that LiveScript isn't Haskell, how are the syntactic features that are
derived from Haskell not at all like Haskell?

~~~
tikhonj
Which parts of the syntax _are_ like Haskell? As far as I can tell, the syntax
for calling functions is _similar_ to Haskell, but that's about it. Maybe some
of the operators are vaguely like Haskell's, except they're baked into the
language instead of being definable by the user.

The vast majority of the syntax seems based on CoffeeScript.

~~~
GeZe
LiveScript has many features over CoffeeScript that may not be exactly as in
Haskell, but are inspired by it. You can define curried functions, use
partially applied operators (addTwo = (+ 2)), use operators as functions (sum
= fold1 (+)), use infix functions ("hi" `startsWith` 'h'), compose functions
(h = f . g), and have proper list comprehensions; its standard library,
prelude.ls, is based on Haskell's Prelude module (inclusion is optional if you
want to use underscore.js or something else). For more information check out
[http://gkz.github.com/LiveScript/blog/functional-
programming...](http://gkz.github.com/LiveScript/blog/functional-programming-
in-javascript-using-livescript-and-prelude-ls.html)

------
fuzzythinker
I noticed the use of $, _, and globals like enumFromTo in the compiled JS. It
says $ means "thunk" and _ means "force". Does this mean the compiled JS is
further compiled, so that $, _, and the globals are replaced with something
like Fay$$...?

~~~
chrisdone
Those two symbols are merely to make the code easier to read as there are a
lot of thunks and forcing. Previously it was Fay$$force and Fay$$thunk.

Technically these are global within the Fay output, but that is all wrapped up
in a closure, so it won't interfere with anything outside.

Additionally, _ is not a valid function name in Haskell, and $ (and any other
non-letter symbols) are encoded as $123$ where 123 is the unicode point.

So these two symbols are free to use, if that bothered you at all. enumFromTo
is defined in the Prelude.

------
bdg
Lovely article, but I'd like to ask that we call things that "compile to
javascript" transpilers. JS is not "web assembly", and if we keep
communicating that message in the community, it will come to be believed.

~~~
tree_of_item
Projects like this are motivated by a belief that JavaScript _is_ web
assembly, though.

------
it
This is great. Now Haskellers can not only develop in a single language on the
browser and server, but it looks like this could make it possible to leverage
existing Node.js code on the server side. chrisdone, is there anything more
that needs to be done to support using Node modules?

------
angeladur
What does getSquare (Math _ square _) = square mean?

~~~
chrisdone
That's a pattern match. Math is a data constructor; it makes a Math object. It
takes three arguments, so (Math 1 2 3) is a Math object.

A pattern is a way of deconstructing an object into its constituent parts and
bringing some of those into scope. So (Math x y z) is a valid pattern, which,
matched against (Math 1 2 3), would bring these values into scope: x=1, y=2,
z=3. In this case I'm bringing only the second argument into scope. In a
pattern, _ means "ignore this argument".

Here is a short interactive lesson on pattern matching:
<http://tryhaskell.org/#19>
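A rough JavaScript analogue of the same idea (illustrative only; `MathData` is
a made-up name here to avoid clobbering JS's built-in `Math` object):

```javascript
// `Math x y z` becomes an object holding three fields, and the
// pattern match `getSquare (Math _ square _) = square` becomes an
// accessor that keeps only the second field and ignores the rest.
function MathData(a, b, c) {
  this.slot1 = a;
  this.slot2 = b;
  this.slot3 = c;
}

var getSquare = function (m) { return m.slot2; };

console.log(getSquare(new MathData(1, 2, 3))); // 2
```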

------
notime
Hearing the words "compiling to Javascript" here, from Google, and elsewhere
drives me nuts. Generating code in a dynamic and uncompiled language is NOT
compilation! It is just a type of translation. If you want to make up a word,
call it "relanguifying"- I don't care- just don't call it compilation.

~~~
edsrzf
Wikipedia describes a compiler as "a computer program that transforms source
code written in a programming language into another computer language".

That certainly seems to describe what's going on here.

~~~
notime
Actually, Wikipedia on the disambiguation page says that compilation is: "In
computer programming, the translation of source code into object code by a
compiler."

On the main wikipedia page, you cut off the full definition: "A compiler is a
computer program (or set of programs) that transforms source code written in a
programming language (the source language) into another computer language (the
target language, often having a binary form known as object code). The most
common reason for wanting to transform source code is to create an executable
program."

Note how it says "the target language, often having a binary form known as
object code." and "The most common reason for wanting to transform source code
is to create an executable program."

If you then go down into the description, you'll see: "The front end checks
whether the program is correctly written in terms of the programming language
syntax and semantics. Here legal and illegal programs are recognized. Errors
are reported, if any, in a useful way. Type checking is also performed by
collecting type information. The frontend then generates an intermediate
representation or IR of the source code for processing by the middle-end.

The middle end is where optimization takes place. Typical transformations for
optimization are removal of useless or unreachable code, discovery and
propagation of constant values, relocation of computation to a less frequently
executed place (e.g., out of a loop), or specialization of computation based
on the context. The middle-end generates another IR for the following backend.
Most optimization efforts are focused on this part.

The back end is responsible for translating the IR from the middle-end into
assembly code. The target instruction(s) are chosen for each IR instruction.
Register allocation assigns processor registers for the program variables
where possible. The backend utilizes the hardware by figuring out how to keep
parallel execution units busy, filling delay slots, and so on. Although most
algorithms for optimization are in NP, heuristic techniques are well-
developed."

So, just stating that it is "a computer program that transforms source code
written in a programming language into another computer language" is
inadequate. There is more to it than that, and unfortunately so many just
don't get it.

------
combataircraft
Very cool, except it has no NodeJS example. By the way, JavaScript does not
suck. It has module and package systems; check out NPM, and OneJS for using
all NodeJS utilities on the client side: <http://github.com/azer/onejs>

~~~
jhuni
> By the way, JavaScript does not suck.

The depths to which JavaScript sucks are well documented & well understood:
[http://wiki.theory.org/YourLanguageSucks#JavaScript_sucks_be...](http://wiki.theory.org/YourLanguageSucks#JavaScript_sucks_because)

~~~
jaekwon
Well, yes, actually. But most of the ways JavaScript sucks are _fixable_. The
core of how the language works (imperative, prototypical) is _great_.

~~~
spacemanaki
> The core of how the language works (imperative, prototypical) is great.

Some might disagree with this assertion. I don't think I'd ever put
_imperative_ in the pros column for JS, and in any case its imperative nature
doesn't really distinguish it from most popular languages.

