
JavaScript Modules - steveklabnik
http://jsmodules.io/
======
jonpaul
The problem with introducing new keywords and doing this in a
backwards-incompatible way is that we'll be running with three module systems
for years: CommonJS, AMD, and ES6. As far as I can tell, and please correct me
if
I'm wrong, it looks like ES6 modules aren't solving any new problem that
CommonJS did not solve. Browserify does an excellent job of bringing CommonJS
to the browser and it's gaining a lot of momentum. With ES6 now pushed back
until June 2015... I'm not sure what to think anymore.

~~~
jlongster
It adds a lot of stuff, and I think this site is going to add more explanation
about that. I'm not fully qualified to go into everything, but off the top of
my head:

* The dependencies are static, so you can't do stuff like `if(Math.random() < .5) { require('foo') } else { require('bar') }`. That is a _good_ thing. A lot of people at this point agree that a statically analyzable dependency graph is a huge win.

* Bindings are truly exported. If you exported `const foo = 5`, imported it in another module, and tried to change it, you would get a static error. Semantics and identity of bindings carry over. This also means they are mutable; if the module exporting `foo` changes it everyone else will see it too.

* A fleshed out module loader API, which is seriously lacking in node
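
The live-binding point can be sketched in today's JavaScript. This is a simulation with plain objects and a getter, not real module syntax; it just shows the difference in semantics:

```javascript
// CommonJS-style: the importer gets a snapshot of the value.
var cjsModule = { exports: { foo: 5 } };
var cjsFoo = cjsModule.exports.foo;   // copied once at "require" time
cjsModule.exports.foo = 6;
console.log(cjsFoo);                  // still 5

// ES6-style live binding, approximated with a getter.
var state = { foo: 5 };
var es6Importer = {};
Object.defineProperty(es6Importer, "foo", {
  get: function () { return state.foo; }
});
state.foo = 6;
console.log(es6Importer.foo);         // 6 - the importer sees the update
```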

There are other advantages too. The simplistic approach of CommonJS works, but
long-term it's not the best solution for the wide use-cases of JavaScript.

EDIT: and this will be backwards-compatible. I don't know all the details, but
this can be implemented in a browser today without breaking things (I know
people on the teams working on this).

~~~
notduncansmith
> The simplistic approach of CommonJS works, but long-term it's not the best
> solution for the wide use-cases of JavaScript.

Could you elaborate on this point a little more? I've worked on Node.js
projects varying greatly in size, and even on the larger ones I've yet to feel
encumbered by the CommonJS module system.

~~~
Touche
By "wide use-cases" he means the Web and server. CJS doesn't work well on the
web. ES6 modules were designed from the start to support both.

~~~
domenicd
This is a common strawman, with several examples to disprove it:

- [https://github.com/cujojs/curl](https://github.com/cujojs/curl)
- [https://github.com/montagejs/mr](https://github.com/montagejs/mr)
- [https://github.com/substack/wreq](https://github.com/substack/wreq)

~~~
Touche
To support CommonJS in the browser you either 1) require that it is wrapped
(and therefore isn't really CommonJS) or 2) you XHR it, wrap it yourself, and
eval. That it's possible to work around the problem doesn't discount that it
"doesn't work well" in browsers.

------
russellsprouts
While a module system in JavaScript will be useful, this seems overly
complicated.

I like Lua's approach to modules. In Lua, a file is basically equivalent to a
function body. It accepts arguments and can return values. In JavaScript, this
is not allowed.

In Lua, a function can accept multiple arguments in a tuple called `...`.
Every file is implicitly a vararg function, so we can unpack the `...` to get
arguments.

    
    
        > add.lua
          local a,b = ...
          return a + b 
    
    
        > file.lua
          local add = load 'add.lua'
          print(add(1,2)) --> 3
    

load takes a file path and returns that source file as a function. Most files
ignore arguments, but they do return values.

    
    
        > add2.lua
          return function(a,b) return a + b end
    
        > file2.lua
          local add = require 'add2'
          print(add(1,2)) --> 3
    

`require` loads a source file, executes it, and returns the value that it
returns. It is similar to `return load(fileName)()`, but it also has a search
mechanism, and caches the result. So, the Lua equivalent of export default is
simply returning a value at the end of a source file.
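
That load-execute-and-cache behavior can be sketched in JavaScript. This uses an in-memory table of module bodies instead of real file I/O, so the sketch stays self-contained; the registry and `fakeRequire` names are made up for illustration:

```javascript
// Each "file" is a function (like Lua's file-as-function-body),
// and results are cached by name after the first execution.
var registry = {
  "add": function () {
    return function (a, b) { return a + b; };
  }
};
var cache = {};

function fakeRequire(name) {
  if (!(name in cache)) {
    // Execute the module body once and remember its return value.
    cache[name] = registry[name]();
  }
  return cache[name];
}

var add = fakeRequire("add");
console.log(add(1, 2));                  // 3
console.log(fakeRequire("add") === add); // true - the result is cached
```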

The named exports are also easy to replicate. You simply return an object with
the keys you want.

    
    
        > math.lua
          return {
            add = function(a,b) return a + b end,
            sub = function(a,b) return a - b end
          }
    
        > other.lua
          local math = require 'math'
          local add, sub = math.add, math.sub
    

You can also have something similar to the `export function name(){}`
declaration.

    
    
        > hello.lua
          local exports = {}
    
          function exports.hello()
            print "Hello World"
          end
    
          exports.VERSION = '0.0.1'
    
          return exports
    
        > file3.lua
          local hello = require'hello'.hello
          local version = require'hello'.VERSION
    
          hello() --> Hello World
    

In all, I think Lua's require is a much cleaner way to implement modules.
Unfortunately, it cannot handle circular references. It may be possible to add
this feature using Lua's coroutines, however. I will have to research how it
is handled in JavaScript to be sure.

~~~
pcwalton
> In all, I think the Lua's require is a much cleaner way to implement
> modules.

The idea behind ES6 modules is to add _static_ name resolution to JavaScript.
Because of the incredibly dynamic nature of JS, you can never know what any
global name resolves to, as it could change at a moment's notice. The same is
true in Lua's module system: `_G` is completely mutable and it (or its
metatable!) can change at any time, altering the bindings in the program.

I personally feel that, given how JavaScript is getting used in larger and
larger codebases, some measure of static checking was long overdue.

------
jiyinyiyong
It would have been great if published before CommonJS existed. It's a bit
annoying now that people have embraced CommonJS. It already took me a lot of
time choosing between CommonJS, AMD, and CMD. Please end this war as soon as
possible.

------
modarts
Hate to come off as a Luddite, but why should I care about this when we have a
really nice solution in the form of CommonJS + Browserify?

~~~
thealphanerd
I think this is due to the fact that not everyone has the ability / desire to
live in the console. Having a development process that can work without a
compile step is important.

~~~
warfangle
browserify + watch ... run it and forget it

bonus points because you can hook a linter into it.

~~~
Touche
Run it for every project you mean. The nice thing about the web is that with a
simple http server you can just start coding. Compile-steps take away that
incredible advantage.

~~~
warfangle
> Run it for every project you mean.

Yep. Is that so bad? You know you can have multiple Terminal.app's open at
once, right?

> The nice thing about the web is that with a simple http server you can just
> start coding.

The nice thing about browserify/node is that I can install node, clone my seed
script, execute it, and just start coding.

> Compile-steps take away that incredible advantage.

While giving you the huge advantages of TDD/unit testing; the npm repository;
being able to use the CommonJS module pattern...

If you're just woodshedding a toy, or trying to learn - by all means, set up
lighttpd and boot sublime and go at it. But if you're trying to build
production-quality software... hone your craft; be better than a codemonkey.

~~~
Touche
> While giving you the huge advantages of TDD/unit testing

You don't need a compile step to do unit testing. Or even to use CommonJS for
that matter.

> If you're just woodshedding a toy, or trying to learn - by all means, set up
> lighttpd and boot sublime and go at it. But if you're trying to build
> production-quality software... hone your craft; be better than a codemonkey.

You don't need a compile-step in development to create good software. Client-
side loaders have all of the advantages of server-side loaders _but you can
also use them without a watch task_. ES6 modules are just going to make it all
easier.

~~~
warfangle
> You don't need a compile step to do unit testing.

Okay, I guess I could have a window with karma open and keep refreshing it
every time I change code. Having an eshell open running my unit tests every
time I save is a lot faster, though. Unfortunately, most of what I do is
remote over SSH instead of having a local development environment. And that
doesn't even begin to cover building a jenkins (or other CI) script to run
your non-compile-time unit tests.

> Or even to use CommonJS for that matter.

Which client side loaders would you recommend? I've used AMD/RequireJS in the
past and its headaches were not worth the benefits. The only other project in
the running is Webpack.

> but you can also use them without a watch task

I suppose I don't understand the watch task hate. It's there. It runs. It
versions my build. It lets me know when I've saved something stupid. It's
faster than tabbing to a browser and going through the steps to reproduce the
iota of code I just wrote, or refreshing a testrunner page.

How do you do continuous integration and save-time unit tests without a
preprocessing step and without tying your project to a bloated IDE?

> ES6 modules are just going to make it all easier.

When ES6 modules come around to being available in IE9 (or IE9 finally dies),
I'll believe that you can do all this without a processing step. Even when ES6
modules become available in modern browsers, you'll still have to run
everything through a preprocessing build step to get backwards compatibility.
And woe to whomever tries to do dynamic dependency loading on mobile... that
200-2000ms per-http-request overhead on 3G will bite you right in the bounce
rate.

------
aikah
A language's module system shouldn't depend on a build tool or a third-party
library, so JavaScript modules are more than welcome.

The transition from ES5 to ES6 will be something interesting to live through.
ES6 is a paradigm shift for JavaScript.

------
phleet
Can someone explain to me the migration path towards ES6 and syntax like this?
This isn't syntactically valid ES5, so is there a reasonable path to making
the browsers run ES6 directly?

If I just include it in a <script> tag directly, it'll throw syntax errors in
any non-ES6-compliant browser. So I'd have to transpile, and send down the ES5
version of the code.

But if I'm transpiling, what's the point? I still have a build step, so why
use ES6 over another compile-to-JS language? The eventual promise, that years
and years and years from now, all the non-ES6-compliant browsers will be dead,
so I can send down the ES6 code directly?

~~~
ripter
The script tag supports a new type, module:

    
    
        <script type="module">
            // loads the 'q' export from 'mymodule.js' in the same path as the page
            import { q } from 'mymodule';
    
            new q(); // -> 'this is an es6 class!'
        </script>
    

You can use it today with things like:
[https://github.com/ModuleLoader/es6-module-loader](https://github.com/ModuleLoader/es6-module-loader)

------
pmontra
The syntax is better than the CommonJS one. Check the side-by-side comparison
at [http://jsmodules.io/cjs.html](http://jsmodules.io/cjs.html). However,
CommonJS is here now and JS engines might take a while to implement the new
syntax. Example: the => (which is in the same ES6 draft) works in FF since
version 22 and not in V8 yet.
[http://wiki.ecmascript.org/doku.php?id=harmony:specification...](http://wiki.ecmascript.org/doku.php?id=harmony:specification_drafts)
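
For reference, the arrow syntax in question is a quick sketch like this (it runs only in engines that have already implemented ES6 arrows):

```javascript
// ES6 arrow functions: shorter function syntax, and `this` is
// bound lexically rather than at call time.
var square = (x) => x * x;
console.log(square(3)); // 9

var doubled = [1, 2, 3].map(n => n * 2);
console.log(doubled); // [2, 4, 6]
```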

~~~
178
That is not an honest comparison, though.

    
    
        import { getCodec } from "iconv-lite";
    

would be

    
    
        var getCodec = require("iconv-lite").getCodec;

~~~
couchand
Or in CoffeeScript,

    
    
        {getCodec} = require "iconv-lite"
    

Explicit dependencies without introducing new syntax? Yes, please.

~~~
cwmma
or in ES6: `var {getCodec} = require("iconv-lite");`

------
dreamdu5t
Pointless. CommonJS rocks. Browserify with a watcher and forget about it.

~~~
alessioalex
true that, CommonJS and Browserify are so simple and proven that I don't even
get the effort for ES6 modules.

------
efnx
Would it be impossible for browsers to include support for more than one
scripting language? I've never looked at the architecture of browsers but it
would be fun to run pychrome or haskafari.

Just because we all got on the js train doesn't mean we have to ride it till
the end.

I guess this is why there are so many cross compilers from x -> js. Thanks be
to those guys. Cheers.

~~~
kyrra
It is a lot of work. WebKit didn't want it in their code base, but Blink is
experimenting with it with Oilpan[0]. It would also take a lot of work in
Firefox. The big issue is having both runtimes executing in the same DOM;
handling garbage collection becomes a lot more complicated.

[0] [http://www.chromium.org/blink/blink-gc](http://www.chromium.org/blink/blink-gc)

------
canadev
Modules are a good thing.

I quite like npm as far as package managers go (the default installation
directory being ./node_modules is a big improvement over the global rubygem
directory, IMO).

All the same, I don't mean to be overly negative but code like this is why I
am not a fan of JS:

    
        var isNode = typeof process !== "undefined" &&
                     {}.toString.call(process) === "[object process]";

I mean, I am no JS expert but it took me a couple of minutes just to parse
this thing, and this is the next generation. I just don't like it.
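
Unpacking the check into steps may help (a sketch; the exact string is a detail of how the host environment tags its objects):

```javascript
// Step 1: is there a global named `process` at all?
var hasProcess = typeof process !== "undefined";

// Step 2: does the default Object.prototype.toString identify it as
// Node's process object? Another library could define its own
// `process` global, but it is much harder to fake this string.
var looksLikeNode = hasProcess &&
  {}.toString.call(process) === "[object process]";

console.log(looksLikeNode); // true when run under Node.js
```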

~~~
nawitus
The main problem with npm is that it's a module manager for JavaScript files.
As applications scale, you need to start packaging other files in addition to
.js files, and npm doesn't support it.

You can try to create workarounds and hacks, but that's pretty much building a
new package manager on top of npm.

The core problem is that if your package depends on another package, it's
impossible to know the location of that package (no, it's not always or even
often under node_modules).

There's of course a lot of other problems with npm I've noticed..

~~~
couchand
Can you elaborate with more details? In my experience it's flexible and
predictable. Folks seem to be able to package C++ extensions to Node well
enough on npm. What else would you like to package? What other problems have
you noticed?

~~~
nawitus
Well, it can be anything. TypeScript definition files, for example, can't be
packaged with npm, since you can't know their location in relation to the
package that needs them.

Other problems include that npm is not really supported on Windows, and
specifying a 'git'-based URL always installs the package even if it exists at
a lower level (which works completely differently from other version types).
There are also weird cache bugs from time to time (npm cache clean helps
there). And if you have a single package dependency that's missing, you can't
force npm install to finish.

~~~
couchand
Forgive me, I have very little experience with TypeScript and searching
doesn't immediately reveal how definition files work. Is that a TypeScript
source file that you'd want to include in another TypeScript project? Perhaps
I'm missing something, but it seems strange to ask npm to distribute that. I
prefer CoffeeScript but I still get annoyed when npm packages distribute their
CoffeeScript source.

I can understand why installing a package from a version control URL should
skip checks, since the very fact that you're using the direct URL kind of
indicates that you want that specific commit.

I've literally never run into a missing dependency with npm. How did you get
there? Are you installing packages from several different repositories? What
good would forcing do if you're missing a dependency? If you know you don't
really need it can't you stub it?

TBH I'm not that concerned with support on Windows. _shrug_

~~~
nawitus
>Forgive me, I have very little experience with TypeScript and searching
doesn't immediately reveal how definition files work. Is that a TypeScript
source file that you'd want to include in another TypeScript project?

Yes, that's right. Since the application I'm working on is composed of
multiple TypeScript-based Node.js modules, the type information has to be
carried from one module to another. Since npm doesn't do this well, a new
package manager was actually written (TSD, TypeScript Definition manager).

>I can understand why installing a package from a version control URL should
skip checks, since the very fact that you're using the direct URL kind of
indicates that you want that specific commit.

No, the version URL doesn't need to point to a specific commit. npm really
should use the same semantics for all version URLs. If I have two packages
that both point to, say, "myrepo.git", then there shouldn't be two duplicates.

>I've literally never run into a missing dependency with npm. How did you get
there? Are you installing packages from several different repositories? What
good would forcing do if you're missing a dependency? If you know you don't
really need it can't you stub it?

The missing dependency can be caused by a single Git repository being down. I
encountered this when our private Git repository wasn't accessible for a while
due to firewall issues, but I still wanted to keep development going until
that got fixed. The only solution is to remove that dependency and install it
manually, then revert the changes when the Git repository becomes accessible.
npm install --force would be better.

>TBH I'm not that concerned with support on Windows. shrug

I mainly develop on Linux, but customers want Windows support. The main
problem is documentation. The Node.js download page doesn't say that Windows
support is not reliable. There are various problems on Windows; one main one
is apparently being worked on
([https://github.com/npm/npm/issues/3697](https://github.com/npm/npm/issues/3697)).

------
drderidder
I'd have preferred support for CommonJS (or closer to it), which doesn't seem
to suffer from the namespacing and syntax issues raised by this new spec.
Importing to the global namespace by default seems kinda dodgy, and the
mechanisms to avoid name collisions are syntactically awkward, imho. I think
this proposal might actually have cemented the long-term popularity of
browserify for years to come.

------
jlongster
I kind of wish this was posted tomorrow morning so a lot more people would
see it. It's crucially important for JavaScript.

~~~
roryokane
According to [http://kangax.github.io/compat-table/es6/](http://kangax.github.io/compat-table/es6/),
absolutely no browsers support JavaScript Modules yet; only the Traceur
compiler[1] and EchoJS compiler[2] do. I would not call this feature
“crucially important” yet.

[1] [https://github.com/google/traceur-compiler](https://github.com/google/traceur-compiler)
[2] [https://github.com/toshok/echo-js](https://github.com/toshok/echo-js)

~~~
ollysb
The ember community has really embraced ES6 modules and now with the
introduction of broccoli[1] and the broccoli-es6-transpiler[2] it's really
easy to start using modules and the other ES6 features right now. If you're
already using SASS or Less then an ES6 transpiler seems like a natural next
step.

[1] [https://github.com/broccolijs/broccoli](https://github.com/broccolijs/broccoli)
[2] [https://github.com/sindresorhus/broccoli-es6-transpiler](https://github.com/sindresorhus/broccoli-es6-transpiler)

~~~
couchand
...and if you're already using browserify this seems like a natural step
_backwards_.

------
guiomie
"We’re going to build a simple asap module, which lets you schedule some work
to happen as soon as possible, but asynchronously. In Node.js, you can do this
via process.nextTick," ... I thought process.nextTick wasn't truly
asynchronous, and that async could only be done in native modules ... Can
someone correct me or the author?

~~~
Touche
process.nextTick is implemented natively, and the callback really does run
asynchronously. You can test this yourself:

    
    
        var a = 1;
        process.nextTick(function() {
          console.log(a);
        });
        a = 2;
        // logs 2: the callback runs after the current code finishes

------
eleith
Coursera is hosting a tech talk on 7/23 in mountain view where james burke
(author of require.js) will be speaking about javascript modules.

sign up here: modular-js-coursera-tech-talk.eventbrite.com

------
pje

        > var isNode = typeof process !== "undefined" &&
        >              {}.toString.call(process) === "[object process]";
    

WHAT DO YOU THINK YOU'RE RUNNING FROM?! THE DISEASE IS INSIDE OF YOU!

~~~
pekk
Can you clarify?

~~~
oblio
Well, back off from your Javascript knowledge a little and try to figure out
if the intention of the code is readable.

As an outsider it seems... cryptic.

~~~
Touche
It's definitely cryptic, but stuff like this always happens in dynamic
languages. They couldn't have created a process.thisIsNodeJS_NoForRealItIs
because anything could fake it. By making it part of the toString it can't be
overwritten.

~~~
voyou
Who cares if it's being "faked", though? The code should just check directly
for the existence of process.nextTick, not use a bunch of obfuscatory nonsense
to check that the string representation of some object matches some arbitrary
string from which you might be able to conclude that you're running on node,
from which you can then conclude that you _might_ have access to
process.nextTick.
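
The direct check being suggested would look something like this (a sketch, assuming a setTimeout fallback when no nextTick exists; whether this is sufficient is exactly what the reply below disputes):

```javascript
// Feature-detect the capability instead of the platform: if a
// process.nextTick function exists, just use it, whoever defined it.
var hasNextTick = typeof process !== "undefined" &&
  typeof process.nextTick === "function";

var asap = hasNextTick
  ? function (fn) { process.nextTick(fn); }
  : function (fn) { setTimeout(fn, 0); };

asap(function () { console.log("ran asynchronously"); });
```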

~~~
Touche
Because Node doesn't own process.nextTick. It's valid for a library to create
one in the browser. If I'm writing JS that can be run in Node or the browser,
I want to know if it REALLY IS Node or not.

