The problem with introducing new keywords, and doing so in a backwards-incompatible way, is that we'll be running with three module systems for years: CommonJS, AMD, and ES6. As far as I can tell, and please correct me if I'm wrong, ES6 modules aren't solving any problem that CommonJS hasn't already solved. Browserify does an excellent job of bringing CommonJS to the browser and it's gaining a lot of momentum. With ES6 now pushed back until June 2015... I'm not sure what to think anymore.
It adds a lot of stuff, and I think this site is going to add more explanation about that. I'm not fully qualified to go into everything, but off the top of my head:
* The dependencies are static, so you can't do stuff like `if(Math.random() < .5) { require('foo') } else { require('bar') }`. That is a good thing. A lot of people at this point agree that a statically analyzable dependency graph is a huge win.
* Bindings are truly exported. If you exported `const foo = 5`, imported it in another module, and tried to change it, you would get a static error. Semantics and identity of bindings carry over. Exports are also live: if a module reassigns a `let` binding it exports, every importer sees the new value (see the sketch after this list).
* A fleshed-out module loader API, which is seriously lacking in Node
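To illustrate the live-binding point, here's a minimal sketch (file names are mine, but this is how the spec behaves):

    // counter.js
    export let count = 0;
    export function increment() { count++; }

    // main.js
    import { count, increment } from 'counter';
    console.log(count); // 0
    increment();
    console.log(count); // 1 -- the import is a live view of the exported binding
    // count = 5;       // error: imported bindings are read-only on the importer's side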
There are other advantages too. The simplistic approach of CommonJS works, but long-term it's not the best solution for the wide use-cases of JavaScript.
EDIT: and this will be backwards-compatible. I don't know all the details but this can be implemented in a browser today without breaking things (I know people on the teams working on this)
> The simplistic approach of CommonJS works, but long-term it's not the best solution for the wide use-cases of JavaScript.
Could you elaborate on this point a little more? I've worked on Node.js projects varying greatly in size, and even on the larger ones I've yet to feel encumbered by the CommonJS module system.
To support CommonJS in the browser you either 1) require that the code be wrapped (at which point it isn't really CommonJS) or 2) XHR it, wrap it yourself, and eval it. That it's possible to work around the problem doesn't change the fact that it doesn't work well in browsers.
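For the curious, option 2 boils down to something like this sketch (the function name and details are illustrative, and it deliberately ignores caching and path resolution):

    // Fetch a CommonJS file over XHR, wrap it in the Node-style function
    // shell, and execute it to get its exports.
    function loadCommonJS(url) {
      var xhr = new XMLHttpRequest();
      xhr.open('GET', url, false); // synchronous, because require() must return a value
      xhr.send();
      var module = { exports: {} };
      var wrapped = new Function('module', 'exports', 'require', xhr.responseText);
      wrapped(module, module.exports, loadCommonJS); // nested require()s recurse
      return module.exports;
    }

The synchronous XHR is exactly the "doesn't work well" part: it blocks the page on every require.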
> The dependencies are static, so you can't do stuff like `if(Math.random() < .5) { require('foo') } else { require('bar') }`. That is a good thing. A lot of people at this point agree that a statically analyzable dependency graph is a huge win.
yeah but the 80% solution of loading them both statically but only executing them when the require is run works ridiculously well in practice.
ES6 modules are actually a nice combination of the asynchronicity of AMD (really required in a browser) and the code clarity of CommonJS, so it's not really the case that they don't solve anything CommonJS doesn't. They also deal with circular dependencies better than CommonJS.
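As a contrived sketch of the circular-dependency point (file names hypothetical):

    // a.js
    import { b } from 'b';
    export function a() { return 'a' + b(); }

    // b.js
    import { a } from 'a';
    export function b() { return 'b'; }

    // Imports are linked as live bindings before either module body runs,
    // so a() can safely call b() even though the two files form a cycle.
    // CommonJS instead hands one side a partially-populated exports object.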
On top of that, module loaders provide hooks, so it's quite possible to use a module loader (e.g. SystemJS) that can load CommonJS or AMD modules.
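With SystemJS, for instance, that looks roughly like this (module name and export are hypothetical):

    // System.import returns a promise for the module's exports, even when
    // the file on disk is written as CommonJS or AMD.
    System.import('legacy/commonjs-module').then(function (m) {
      m.doSomething(); // hypothetical export
    });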
Browserify is brilliant, and if I couldn't use ES6 I'd be using it, but I'm pretty excited about ES6 modules.
Won't people use a pre-processor for ES6 modules, like they do for CommonJS and AMD? Otherwise, it'd be a lot of HTTP requests. So you wouldn't have to wait for browser support to start using ES6 modules.
Probably. This is indeed how I see most ES6 development going for a while, but many servers already support SPDY, which allows the server to push things into the client's cache that it knows they will need. This basically removes the need for bundling, while keeping the advantages of separate files.
This. I can't imagine, except maybe in a development mode, wanting all of these different modules loading at different times, especially if they're loading later rather than being referenced in a script tag in the HTML's HEAD.
I'm sort of assuming it can be ignored for now. Like you said, it doesn't add anything, but also doesn't preclude anything that we can do today, so it seems like a wash.
While a module system in JavaScript will be useful, this seems overly complicated.
I like Lua's approach to modules. In Lua, a file is basically equivalent to a function body. It accepts arguments and can return values. In JavaScript, this is not allowed.
In Lua, a function can accept multiple arguments in a tuple called `...`. Every file is implicitly a vararg function, so we can unpack the `...` to get arguments.
> add.lua

    local a, b = ...
    return a + b

> file.lua

    local add = loadfile 'add.lua'
    print(add(1,2)) --> 3

`loadfile` takes a file path and returns that source file as a function.
Most files ignore arguments, but they do return values.
> add2.lua

    return function(a,b) return a + b end

> file2.lua

    local add = require 'add2'
    print(add(1,2)) --> 3
`require` loads a source file, executes it, and returns the value that it returns. It is similar to `return loadfile(fileName)()`, but it also has a search mechanism, and it caches the result. So the Lua equivalent of `export default` is simply returning a value at the end of a source file.
The named exports are also easy to replicate. You simply return an object with the keys you want.
> mymath.lua

    return {
      add = function(a,b) return a + b end,
      sub = function(a,b) return a - b end
    }

> other.lua

    local mymath = require 'mymath'
    local add, sub = mymath.add, mymath.sub

(The module is named `mymath` rather than `math` because `require 'math'` would just return Lua's built-in math library, which is already loaded.)
You can also have something similar to the `export function name(){}` declaration.
> hello.lua

    local exports = {}
    function exports.hello()
      print "Hello World"
    end
    exports.VERSION = '0.0.1'
    return exports

> file3.lua

    local hello = require('hello').hello
    local version = require('hello').VERSION
    hello() --> Hello World
In all, I think Lua's `require` is a much cleaner way to implement modules. Unfortunately, it cannot handle circular references. It may be possible to add this feature using Lua's coroutines, however. I will have to research how this is handled in JavaScript to be sure.
> In all, I think Lua's `require` is a much cleaner way to implement modules.
The idea behind ES6 modules is to add static name resolution to JavaScript. Because of the incredibly dynamic nature of JS, you can never know what any global name resolves to, as it could change at a moment's notice. The same is true in Lua's module system: `_G` is completely mutable and it (or its metatable!) can change at any time, altering the bindings in the program.
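A contrived sketch of the problem:

    var require = function (name) { return {}; }; // any script can do this
    // From here on, no tool can statically know what require('foo') resolves
    // to; the global binding was silently replaced at runtime.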
I personally feel that, given how JavaScript is getting used in larger and larger codebases, some measure of static checking was long overdue.
> Unfortunately, it cannot handle circular references.
Wouldn't this be bad design in 99% of circumstances anyway?
I say 99% because I don't want to be categorical, personally I think circular references/dependencies are always bad on some level.
It would have been great if it had been published before CommonJS existed. It's a bit annoying now that people have embraced CommonJS. It already took me a long time to choose between CommonJS, AMD, and CMD. Please end this war as soon as possible.
CommonJS is not an official standard that is part of the language. It works, but it's also competing with AMD.
ES6 modules officially add modules to the language. They give a standard that should work in browsers and in Node, and they're heavily inspired by CommonJS and AMD.
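For a sense of the lineage, compare the rough equivalents (module and binding names hypothetical):

    // CommonJS
    var foo = require('foo');
    module.exports = bar;

    // ES6
    import foo from 'foo';
    export default bar;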
I think this is due to the fact that not everyone has the ability / desire to live in the console. Having a development process that can work without a compile step is important.
Run it for every project you mean. The nice thing about the web is that with a simple http server you can just start coding. Compile-steps take away that incredible advantage.
Yep. Is that so bad? You know you can have multiple Terminal.app's open at once, right?
> The nice thing about the web is that with a simple http server you can just start coding.
The nice thing about browserify/node is that I can install node, clone my seed script, execute it, and just start coding.
> Compile-steps take away that incredible advantage.
While giving you the huge advantages of TDD/unit testing; the npm repository; being able to use the CommonJS module pattern...
If you're just woodshedding a toy, or trying to learn - by all means, set up lighttpd and boot sublime and go at it. But if you're trying to build production-quality software... hone your craft; be better than a codemonkey.
> While giving you the huge advantages of TDD/unit testing
You don't need a compile step to do unit testing. Or even to use CommonJS for that matter.
> If you're just woodshedding a toy, or trying to learn - by all means, set up lighttpd and boot sublime and go at it. But if you're trying to build production-quality software... hone your craft; be better than a codemonkey.
You don't need a compile-step in development to create good software. Client-side loaders have all of the advantages of server-side loaders but you can also use them without a watch task. ES6 modules are just going to make it all easier.
> You don't need a compile step to do unit testing.
Okay, I guess I could have a window with karma open and keep refreshing it every time I change code. Having an eshell open running my unit tests every time I save is a lot faster, though. Unfortunately, most of what I do is remote over SSH instead of having a local development environment. And that doesn't even begin to cover building a jenkins (or other CI) script to run your non-compile-time unit tests.
> Or even to use CommonJS for that matter.
Which client side loaders would you recommend? I've used AMD/RequireJS in the past and its headaches were not worth the benefits. The only other project in the running is Webpack.
> but you can also use them without a watch task
I suppose I don't understand the watch task hate. It's there. It runs. It versions my build. It lets me know when I've saved something stupid. It's faster than tabbing to a browser and going through the steps to reproduce the iota of code I just wrote, or refreshing a testrunner page.
How do you do continuous integration and save-time unit tests without a preprocessing step without tying your project to a bloated IDE?
> ES6 modules are just going to make it all easier.
When ES6 modules come around to being available in IE9 (or IE9 finally dies), I'll believe that you can do all this without a processing step. Even when ES6 modules become available in modern browsers, you'll still have to run everything through a preprocessing build step to get backwards compatibility. And woe to whoever tries to do dynamic dependency loading on mobile... that 200-2000ms per-HTTP-request overhead on 3G will bite you right in the bounce rate.
Can someone explain to me the migration path towards ES6 and syntax like this? This isn't syntactically valid ES5, so is there a reasonable path to making the browsers run ES6 directly?
If I just include it in a <script> tag directly, it'll throw syntax errors in any non-ES6-compliant browser. So I'd have to transpile and send down the ES5 version of the code.
But if I'm transpiling, what's the point? I still have a build step, so why use ES6 over another compile-to-JS language? The eventual promise that, years and years and years from now, all the non-ES6-compliant browsers will be dead, so I can send down the ES6 code directly?
    <script type="module">
      // loads the 'q' export from 'mymodule.js' in the same path as the page
      import { q } from 'mymodule';
      new q(); // -> 'this is an es6 class!'
    </script>
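The `mymodule.js` it loads might look like this (a hypothetical counterpart matching the comment above):

    // mymodule.js
    export class q {
      constructor() {
        console.log('this is an es6 class!');
      }
    }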
Would it be impossible for browsers to include support for more than one scripting language? I've never looked at the architecture of browsers but it would be fun to run pychrome or haskafari.
Just because we all got on the js train doesn't mean we have to ride it till the end.
I guess this is why there are so many cross compilers from x -> js. Thanks be to those guys. Cheers.
It is a lot of work. WebKit didn't want it in their code base, but Blink is experimenting with it via Oilpan[0]. It would also take a lot of work to do in Firefox. The big issue is having both runtimes executing against the same DOM; handling garbage collection becomes a lot more complicated.
Dartium (https://www.dartlang.org/tools/dartium/) is a version of Chromium with both JavaScript and Dart VMs running side-by-side, so it's certainly possible. Of course, getting every major browser developer to adopt a new scripting language is a completely different story.
I quite like npm as far as package managers go (the default installation directory being ./node_modules is a big improvement over the global rubygem directory, IMO).
All the same, I don't mean to be overly negative but code like this is why I am not a fan of JS:
    var isNode = typeof process !== "undefined" &&
        {}.toString.call(process) === "[object process]";
I mean, I am no JS expert but it took me a couple of minutes just to parse this thing, and this is the next generation. I just don't like it.
The main problem with npm is that it's a module manager for JavaScript files. As applications scale, you need to start packaging other files in addition to .js files, and npm doesn't support it.
You can try to create workarounds and hacks, but that's pretty much building a new package manager on top of npm.
The core problem is that if your package depends on another package, it's impossible to know the location of that package (no, it's not always or even often under node_modules).
There are, of course, a lot of other problems with npm that I've noticed.
Can you elaborate with more details? In my experience it's flexible and predictable. Folks seem to be able to package C++ extensions to Node well enough on npm. What else would you like to package? What other problems have you noticed?
Well, it can be anything. TypeScript definition files, for example, can't be packaged with npm, since you can't know their location relative to the package that needs them.
Other problems include that npm is not really supported on Windows, and that specifying a git-based URL always installs the package even if it already exists at a lower level (which works completely differently from other version types). There are also weird cache bugs from time to time (npm cache clean helps there). And if you have a single package dependency that's missing, you can't force npm install to finish.
Forgive me, I have very little experience with TypeScript and searching doesn't immediately reveal how definition files work. Is that a TypeScript source file that you'd want to include in another TypeScript project? Perhaps I'm missing something, but it seems strange to ask npm to distribute that. I prefer CoffeeScript but I still get annoyed when npm packages distribute their CoffeeScript source.
I can understand why installing a package from a version control URL should skip checks, since the very fact that you're using the direct URL kind of indicates that you want that specific commit.
I've literally never run into a missing dependency with npm. How did you get there? Are you installing packages from several different repositories? What good would forcing do if you're missing a dependency? If you know you don't really need it can't you stub it?
TBH I'm not that concerned with support on Windows. shrug
>Forgive me, I have very little experience with TypeScript and searching doesn't immediately reveal how definition files work. Is that a TypeScript source file that you'd want to include in another TypeScript project?
Yes, that's right. Since the application I'm working on is composed of multiple TypeScript-based Node.js modules, the type information has to be carried from one module to another. Since npm doesn't do this well, a new package manager was actually written (TSD, TypeScript Definition manager).
>I can understand why installing a package from a version control URL should skip checks, since the very fact that you're using the direct URL kind of indicates that you want that specific commit.
No, the version URL doesn't need to point to a specific commit. npm really should use the same semantics for all version URLs. If I have two packages that both point to, say, "myrepo.git", then there shouldn't be two duplicates.
>I've literally never run into a missing dependency with npm. How did you get there? Are you installing packages from several different repositories? What good would forcing do if you're missing a dependency? If you know you don't really need it can't you stub it?
The missing dependency can be caused by a single Git repository being down. I encountered this when our private Git repository wasn't accessible for a while due to firewall issues, but I still wanted to keep development going until it got fixed. The only solution is to remove that dependency and install it by hand, then revert the changes once the Git repository becomes accessible again. An npm install --force would be better.
>TBH I'm not that concerned with support on Windows. shrug
I mainly develop on Linux, but customers want Windows support. The main problem is documentation: the Node.js download page doesn't say that Windows support is unreliable. There are various problems on Windows; one major one is apparently being worked on ( https://github.com/npm/npm/issues/3697 ).
I'd have preferred support for CommonJS (or something closer to it), which doesn't seem to suffer from the namespacing and syntax issues raised by this new spec. Importing to the global namespace by default seems kinda dodgy, and the mechanisms to avoid name collisions are syntactically awkward, imho. I think this proposal might actually have cemented the long-term popularity of Browserify for years to come.
According to http://kangax.github.io/compat-table/es6/, absolutely no browsers support JavaScript Modules yet; only the Traceur compiler[1] and EchoJS compiler[2] do. I would not call this feature “crucially important” yet.
The ember community has really embraced ES6 modules and now with the introduction of broccoli[1] and the broccoli-es6-transpiler[2] it's really easy to start using modules and the other ES6 features right now. If you're already using SASS or Less then an ES6 transpiler seems like a natural next step.
Besides the points made by the sibling comment, this feature is crucially important to JavaScript. It hasn't been implemented yet because the spec itself is just now being finalized. This page was made in an effort to spread the word about the upcoming modules, which people really need to see now and understand because it's going to be a big change.
I don't see how it can't still be crucially important.
"We’re going to build a simple asap module, which lets you schedule some work to happen as soon as possible, but asynchronously. In Node.js, you can do this via process.nextTick," ... I thought process.nextTick wasn't the equivalent to an async, and that async could only be done in native modules ... Can someone correct me or the author?
It's definitely cryptic, but stuff like this always happens in dynamic languages. They couldn't have created a process.thisIsNodeJS_NoForRealItIs because anything could fake it. By making it part of the toString, it can't be overwritten.
Who cares if it's being "faked", though? The code should just check directly for the existence of process.nextTick, not use a bunch of obfuscatory nonsense to check that the string representation of some object matches some arbitrary string from which you might be able to conclude that you're running on node, from which you can then conclude that you might have access to process.nextTick.
Because Node doesn't own process.nextTick. It's valid for a library to create one in the browser. If I'm writing JS that can run in Node or the browser, I want to know whether it really is Node or not.
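Putting the two halves together, a minimal sketch of the kind of asap module the article describes (not its exact code; setTimeout is a crude browser fallback, and real libraries use tricks like MutationObserver or postMessage for lower latency):

    // Detect a real Node environment, per the check discussed above.
    var isNode = typeof process !== "undefined" &&
        {}.toString.call(process) === "[object process]";

    function asap(task) {
      if (isNode) {
        process.nextTick(task); // runs after the current operation, before I/O
      } else {
        setTimeout(task, 0);    // fallback with coarser timing
      }
    }

    asap(function () { console.log("runs asynchronously, but soon"); });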