Why I Left Gulp and Grunt for Npm Scripts (medium.com/housecor)
73 points by codeaddslife on Jan 20, 2016 | 72 comments

I wrote a related blog post [0] and provided a sample starting point [1] a few months back.

I'm amazed more people haven't heard of webpack, it solves a ton of these problems.

[0] https://github.com/cesarandreu/blog/blob/master/a_reasonable...

[1] https://github.com/cesarandreu/web-app

I am so happy to have left the web and moved back into native development, just about when gulp, grunt, npm, yeoman and friends were picking up steam.

It is not enough to juggle DOM libraries, CSS generation, JavaScript frameworks, and browser compatibility headaches; one also needs to use the build tool of the day.

If you're using any of those tools above just because they're en vogue, you're doing it wrong. You use those tools because they solve specific problems that you're facing and to make your job easier and not because the "cool kids" on the block are using them.

I'd say that nearly all the tools listed in your comment serve a purpose, solve specific problems, and ease certain pain points, so I deem them useful assets in my workflow.

It strikes me that this is really at the root of at least some of the 'front-end' problems we've been discussing in recent submissions.

It feels a bit like going to an all-you-can-eat restaurant and eating both too much, and insisting on trying everything they have on the menu. As a result, you might get sick from overeating and from mixing food that you shouldn't mix. And while such behavior is human nature, and while it can be argued that this alone is a good reason to avoid such a restaurant in the first place, it can also be argued that much of the problem could be avoided by applying a little more restraint.

Gulp might be great in some cases, but why not use the 'npm script' approach until you need it? Installing packages for everything is seductive and just one command away, but it's not all that different from the cut-and-paste-from-stack-overflow programming that most of us learned to avoid.
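To make the "npm script" approach concrete, here's a minimal package.json sketch (the tool names and paths are illustrative, not prescribed by anyone in this thread):

```json
{
  "scripts": {
    "build": "browserify src/main.js -o dist/bundle.js",
    "start": "http-server dist"
  }
}
```

Each entry is just a shell command, run with `npm run build`, `npm run start`, and so on.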

Now I'm not saying that the problems in the 'front-end ecosystem' aren't real. I'm sure they are as I still run into them even though I try to be very conservative these days, and I'm sure many of the complaints are from people who are better coders than I am.

But at least for myself, my Node.js work has become significantly more fun, and has involved significantly less 'grunt work' (heh) by simply avoiding the urge to add yet another dependency, API or tool "just because it's there".

We don't choose our tools; the customer's IT does. Such are the wonders of consulting when you work on existing projects.

And they choose them, because they follow fashion.

Do your customers really dictate that you use grunt or gulp in your build process in their project?

I seriously find this to be very implausible.

> I seriously find this to be very implausible.

Why? If they're already using gulp and have people in-house that understand gulp, why is it so unreasonable that they don't want their consultants adding a new build system to the mix?

Well, I was implying that the projects in question are greenfield, not inherited ones. But even in the case of inherited projects, I think devs/consultants have bargaining power when negotiating contracts to work out a compromise on non-core details of the project like build systems, and that project owners or managers could accommodate the consultants' requests on those points.

A lot of consultants work with existing teams. As someone who has worked on such projects from the existing-team side, it would be very off-putting for the consultants to come in and demand to change this sort of trivia before they've contributed any value. Having said that, if they come in and contribute a ton, and then point out that they believe they and the rest of the team could be more productive by switching to such-and-such tool, then they will probably be listened to. Consultants don't come in with credibility, they earn it.


As another HNer already replied, most of the projects are existing ones.

Our customers are the enterprise, not startups.

The consultants are expected to bring value within the customer's IT stack.

Usually even the dev environments are provided by IT, to be returned on project termination.

If the customer hires us to assess their stack and provide feedback on what to change, that is another matter.

Which still needs to be approved by their IT anyway.

Makefiles still work!
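For anyone who hasn't driven a front-end build with make, a minimal sketch of the idea (the tools and paths are illustrative, mirroring the npm-script examples in this thread):

```make
# Illustrative Makefile driving the same CLI tools an npm script would call.
build:
	browserify src/main.js -o dist/main.js

lint:
	eslint src

.PHONY: build lint
```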

That's not hip enough. Why use a solid technology that's been around for almost 40 years when instead you can use a flavor-of-the-month tool written in a language originally designed for doing form validation and pre-loading rollover images of dancing kittens?

Not for matching bullet points on job adverts.

Besides I never liked them that much.

The worst is when the same developer uses the same tools in his build process for two different projects but they need to be executed in a different order for each.

This. A thousand times this. Embrace the unix philosophy of small tools and npm scripts and watch your baroque frontend build process with all its plugin dependencies reduce to amazing simplicity. Here's another good write up with some significantly more useful examples:


> Package.json also doesn’t support variables

Not sure what he means by this; npm scripts do support env vars from package.json config values:
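A sketch of the mechanism: values under the top-level "config" key are exposed to scripts as $npm_package_config_* environment variables (the tool being run here is illustrative):

```json
{
  "config": {
    "port": "8080"
  },
  "scripts": {
    "serve": "http-server -p $npm_package_config_port"
  }
}
```

Users can even override these per-package with `npm config set <pkgname>:port 9090`.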


Gulp does embrace the unix philosophy of small tools and piping output.

Unfortunately it embraces it so much that there are several different variations of connecting the parts together, and even then some important tools don't play very nicely with the rest of the Gulp framework.

Consequently, it takes several lines of awkward boilerplate just to run a simple Browserify job that is a one-liner at the CLI. I've also seen Gulp adapters for popular tools, such as ESLint, that break suddenly when something apparently unrelated in the NPM set-up is updated, even though ESLint itself also still runs just fine with a one-liner at the CLI.

In fact, those were the two examples I used when I put my foot down over build tools on one project I work on a little while back. Within 20 minutes we'd replaced well over 100 lines of Gulp code with about 10 one-liner CLI jobs that did all of the same things. The new version has the added advantages of actually working, which the Gulp-based code used to but didn't any more, and of requiring a total of 0 changes since then to remain working, other than when the tools themselves changed their interfaces.

For every small tool you want to use in Gulp you either have to write your own Gulp task in JS, or add a plugin wrapper to the tool as an additional dependency to your project. Doesn't seem ideal. Just by switching to npm scripts you could probably reduce by at least 50% the number of 3rd party dependencies in your build.

Also streaming stuff around inside a Gulp process is good in terms of speed, but imo quite complicated conceptually, and doesn't match the simplicity of piping stdout between one-liner CLI commands.

> When you use npm scripts, you don’t search for a Grunt or Gulp plugin. You choose from over 227,000 npm packages.

This statement is sensationalist and I don't think it presents a useful argument. Sure I could write a script that uses one of those packages but that has nothing to do with whether or not any of those packages actually helps me achieve my goal.

Completely agree. Comparing these numbers doesn't make sense; of course npm has more modules, but why would I care about having e.g. node's express framework available to my build tool/task runner?

You're right. You wouldn't care about everything in npm. The point is this: When using Gulp/Grunt, we have to search for a plugin. This greatly limits our selection.

I see your point, it makes sense. But I have to note that in gulp (at least) you aren't restricted to plug-ins. You can use "generic" npm packages, as long as they output a stream (e.g. browserify).

It's definitely hyperbole, but the point still stands. Back when I used Gulp, I ran into quite a few situations where a particular package I wanted to use didn't support Gulp, but did have CLI support. And even when both were supported, adding a line to package.json was faster and simpler than using the gulp approach.

That said, I have nothing against Gulp and it's quite possible I might use it in the future. For example, the approach of using npm scripts can be problematic if you need to work on different operating systems. And I'm sure there are plenty of cases where the complexity is best managed with Gulp or its ilk.

Using npm instead of Grunt/Gulp/etc. has been extensively discussed by Keith Cirkel at http://blog.keithcirkel.co.uk/how-to-use-npm-as-a-build-tool.... Would've been nice to at least acknowledge that, if not join efforts and try to address some of the inherent shortcomings (e.g. the readability of long lines - https://github.com/keithamus/npm-scripts-example/issues/30).

It has been acknowledged in the post: it is one of many links prefaced by "I'm far from the first person to suggest this. Here are some excellent links:"

I'm moving in a similar direction for similar reasons - but by shifting functionality out of the node ecosystem altogether. I recently released devd, a small, dev-focused HTTP server with built-in livereload, which has taken the place of gulp livereload and node-based dev servers for many of my projects (https://github.com/cortesi/devd).

I'm now working on the next step. It's not quite ready to be announced yet, but I'm cooking up modd, a similarly focused tool for monitoring the file system and responding to changes (https://github.com/cortesi/modd). Modd has already supplanted gulp entirely for many of my use cases, and has replaced supervisor and a bunch of other tools to boot. Many of the actions triggered by modd for front-end projects are precisely invocations of npm scripts as described in the article. A few more features (desktop notifications with Growl/notify, and script access to the list of changed files), and modd will be ready for me to ask for public feedback.

Both modd and devd are small, single-purpose tools written in Go, released as statically compiled binaries with no external dependencies. I've tried to make them tight and focused, and if I get it right, they will hopefully be a refreshing change after gulp and grunt.

Why create devd when you could have just used live-server? It automatically injects the LiveReload stub and watches the directory for changes.

I guess, if your end goal is to move everything away from the node ecosystem. To each their own.

Well, first, I think it's perfectly fine to have multiple takes on the same problem space. This is how we progress - people try different things, and we gradually figure out what works. Live-server does a specific set of things, and devd is a different take on a slightly wider set of things, and I think that's just dandy. The overlap is far from precise, and I think there's enough room for both devd and live-server (and for the many other tools that do similar things).

Second, not everything is node. Devd has lots of uses outside of node and even outside of front-end development.

Third, I probably wouldn't have written it if I didn't believe that devd did at least some things better than the current tools (at least for some people). You should try it - maybe you'll like it, and if you don't then that's fine too. As you say - to each their own. ;)

grumble reinventing wheels grumble

Live-server is capable of a lot more than I mentioned. There's also lite-server which is geared more toward SPAs.

Anyway, if you enjoy hacking on devd I won't fault you for it.

I'm not really looking to switch to go. I'm one of those weirdos who actually really likes JS. The massive NPM ecosystem just makes everything that much more fun.

How to Use npm as a Build Tool - Keith Cirkel http://blog.keithcirkel.co.uk/how-to-use-npm-as-a-build-tool...

Using npm scripts to build an asset pipeline - Modulus http://blog.modulus.io/using-npm-scripts-to-build-asset-pipe...

Why don't they provide a good working example, using BrowserSync or LiveReload, Sass, etc.?

I haven't found a good build process using npm as a task manager or build tool; I haven't seen a good example in any article I've read so far. That's what gulp was built for, and it's much clearer in my opinion!

If you're doing front-end development:

Use JSPM for module management, transpiling (Babel, Traceur, Typescript), and bundling

live-server for LiveReload. With zero config it injects the LR stub and sets up a file watcher on the directory it's run in.

BrowserSync can be loaded via an npm command: https://gist.github.com/addyosmani/9f10c555e32a8d06ddb0

SASS can be set up with: https://medium.com/@brianhan/watch-compile-your-sass-with-np...

Here's what I use to build my Angular2+ES6 app: https://github.com/evanplaice/evanplaice.com/blob/master/pac...

Using npm as the default task runner is becoming more common, so it's not too difficult to find good examples through Google.

Here's something I've been using recently:

  $ jq .scripts package.json
  {
    "start": "npm-run-all --parallel client server live",
    "client": "beefy client/index.coffee:index.js $npm_package_config_beefy_port --cwd client -- --transform coffeeify --extension '.coffee'",
    "server": "BEEFY_PORT=$npm_package_config_beefy_port LIVE_PORT=$npm_package_config_live_port PORT=$npm_package_config_port nodemon --watch server --watch template --ext coffee,jade server/app.coffee",
    "live": "live-reload --port=$npm_package_config_live_port --delay=3000 client server template"
  }
No sass in there, but one can see it could easily be added.

A small tip for those who use sass and use node-sass (and npm scripts):

I ran into a strange problem where, if you use node-sass to watch a directory, adding a new partial via an @import statement doesn't quite work as it should. If you change something in the imported file, node-sass won't recompile, so you don't see your changes. The only way around it is to re-save the file that does the @import, or to restart the process.

The solution I found is to use the more generic 'watch' package to watch the directory, and make it simply re-run node-sass without the watch flag. Aside from solving the problem, it also ended up feeling cleaner to use 'watch' everywhere I needed it instead of relying on all the different watch approaches of the different tools.
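A sketch of that setup as npm scripts (the paths are illustrative; 'watch' here is the generic npm watch package described above, re-running node-sass without its own watch flag):

```json
{
  "scripts": {
    "css": "node-sass styles/main.scss dist/main.css",
    "watch:css": "watch 'npm run css' styles/"
  }
}
```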

The one downside is that certain tools' own watch capabilities are faster or do partial updates. In the few cases where that is true I would just use the built-in watch functionality of course. But it's rarely been an issue for me.

It isn't 100% up to date anymore, but I made a web app repo [0] that gives you everything you need for a frontend app. This repo is very close to what we're using in my job for our app.

Furthermore, I wrote a related blog post [1] about this a few months back. You don't need gulp.

[0] http://github.com/cesarandreu/web-app

[1] https://github.com/cesarandreu/blog/blob/master/a_reasonable...

The article mentions two good working examples that do everything you mentioned and more:

https://github.com/coryhouse/react-slingshot

https://github.com/kriasoft/react-starter-kit

Personally I ended up with a mix.

Leave Gulp to be gulp and do the heavy lifting, while using npm scripts to run the gulp tasks and anything that doesn't live inside gulp.

It gives the simplicity of letting people unfamiliar with the setup look at the package.json file and no further if they just want to use it, while still allowing all the "plug and play" style plugins from gulp.

Npm scripts work for very superficial tasks, but fail to deliver even for such a simple task as live reload:

- You need to run both server and watcher from the same shell, and the proposed way is to use parallelshell, which is not as robust a tool as Gulp. Specifically, trying npm run dev as suggested leads to an error that, after termination, leaves 4 processes running in the background that you have to kill manually, or else you can't access the same ports. Not fun.

- It requires you to "hijack" your source files with script tags that I don't feel belong there: <script src="//localhost:9091/livereload.js"></script>

Why are Gulp plugins better?

- Fast: they use streams, no temporary files.

- Gulp plugins have a uniform API: stream in, stream out; no massive command line options.

- Convenient and expressive node-glob abstraction to select files/directories to be watched.

- Less magic, more control and understanding of what is going on, less chance of and dependence on bugs.

Here is the absolutely basic LiveReload setup that I wasn't able to achieve with Npm scripts:
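A basic Gulp LiveReload setup of the sort being described typically looks something like this (a sketch assuming gulp 3 and the gulp-livereload plugin; not necessarily the commenter's exact code):

```javascript
// gulpfile.js - minimal watch + livereload sketch (requires the
// gulp and gulp-livereload packages to be installed).
var gulp = require('gulp');
var livereload = require('gulp-livereload');

gulp.task('watch', function () {
  livereload.listen(); // start the livereload server
  gulp.watch('src/**/*.js', function (event) {
    livereload.changed(event.path); // notify connected browsers
  });
});
```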


I may not have been in the node and npm space long enough to fully understand the point of this, but it sounds like the endless argument of "stop using jQuery and go with raw Javascript instead".

The argument goes that modern Javascript has moved far enough along that a library like jQuery is not as necessary. But the majority of the examples I've seen of how to avoid jQuery means essentially coding your own custom version of jQuery.

This argument of dropping Grunt/Gulp seems the same to me. "Don't use a pre-coded builder or task runner, just build it yourself". I'm not sure I agree that's necessarily the best route for every project and/or coder.

Right now I've been playing with node and gulp, just for fun. So far I've found the most comfortable route for me is to have gulp kick off the process and then use non-gulp packages when it feels right. For instance, I don't use a gulpified version of Handlebars, I just use the actual npm package. I can't say this is the "best standards" way to go about it, but it feels right for me.

> ...it sounds like the endless argument of "stop using jQuery and go with raw Javascript instead".

I respectfully disagree.

Using jQuery is helpful because it makes things work reliably where raw JS sometimes runs into browser quirks, and because it makes the code for various common functionality cleaner and more concise.

The trouble with tools like Gulp in the current ecosystem is that while they were supposed to do something similar, in practice they often seem to have the opposite effect: they introduce quirks and instability even when the tools they are running work just fine, and they make things that should be simple one-liners take a bunch of boilerplate and actual thought to set up.

> This argument of dropping Grunt/Gulp seems the same to me. "Don't use a pre-coded builder or task runner, just build it yourself".

But here your default assumption seems to be that this kind of tool is necessary, and that not using one is the active change.

Why do I need a task runner to automate running something like

    browserify src/main.js -o dest/main.js

    eslint --parser babel-eslint --plugin react src
at all? I used these exact examples a while back to argue for dropping Gulp from a project I work on, and at that point the Browserify logic in the Gulp file ran to nearly a dozen lines of boilerplate, which had been substantially changed at least twice since the start of the project, while the ESLint task had stopped working, for reasons that we never determined but which certainly didn't involve ESLint itself not working.
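For comparison, wrapping those same one-liners as npm scripts is just (a sketch):

```json
{
  "scripts": {
    "build": "browserify src/main.js -o dest/main.js",
    "lint": "eslint --parser babel-eslint --plugin react src"
  }
}
```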

>> The trouble with tools like Gulp ...

I've heard similar arguments about jQuery as well. But are you saying there's a fundamental problem with gulp itself, or how people are using it?

As for your browserify example - keep in mind I'm not heavy into node/gulp at the moment - I would say that it is rather simplified compared to what gulp actually offers. From my way of thinking your example only handles the single file; what happens when you complicate the matter a bit more? Do you end up recreating gulp in the end?

> But are you saying there's a fundamental problem with gulp itself, or how people are using it?

As far as I can see, Gulp relies on plugins to be useful, and too many plugins for important tools seem to be either unstable or overly complicated.

The Gulp ecosystem is also a smaller illustration of the general Node/NPM culture problem that almost anything, no matter how simple, seems to get its own package to install. This very fine-grained packaging results in its own kind of dependency hell and a sort of artificially forced update schedule where updating one package to fix a bug or get some useful new feature has a knock-on effect that forces updates of numerous other things to remain compatible (or not, if those updates themselves break things or introduce new bugs).

> From my way of thinking your example only handles the single file, what happens when you complicate the matter a bit more? Do you end up recreating gulp in the end?

Yes, that kind of command would be for a web app with a single entry point JS file generated from a single starting source file. But even that case, the simplest possible, requires a lot of boilerplate for no benefit at all in Gulp. Worse, the required boilerplate has changed several times within a year or two just to keep the Gulp infrastructure working.

I'm not sure what you have in mind for "when you complicate the matter", but I have yet to see an example related to Browserify that Gulp made any easier, even on projects with multiple targets each with their own starting point for building. So no, I don't see how any greater complexity I've encountered on a real project would wind up recreating a Gulp-like tool in the end (though of course that's just my own experience and doesn't mean no-one else has such a project).

>> As far as I can see, Gulp relies on plugins to be useful, and too many plugins for important tools seem to be either unstable or overly complicated.

Could you not say the same for any system that relies on or offers third-party additions? If I use an npm package that's badly coded, do I blame node or npm?

As for the rest of your comment, I appreciate the feedback. But by "complicate the matter", I mean multiple files with multiple tasks to perform on those same files, resulting in a single output for each original file. A simple example from a gulp task I've been playing with: start with a folder structure of markdown files; iterate over each, reading the front matter for options and then removing it; convert each markdown file to HTML; compile it with the Handlebars template listed in the front matter; change the name of the file to match the front matter title (although I'm thinking of skipping this step and keeping the original markdown file name, which means I do nothing); and output to a single file in a build folder, keeping the original folder structure. I also have a watch task that looks for any markdown file being updated and runs this task for me. Now, I'm assuming all this can be done without gulp, but does using gulp not help with this task?

> Could you not say the same for any system that relies on or offers third-party additions?

Of course. Any external dependency is a trade-off between the extra value you get, typically through saving time and/or solving a problem better than you know how to do yourself, against the overheads.

The key point in this case is that I just don't see much extra value in Gulp and its plugin ecosystem. As shown in my examples above, it's a whole load of extra mechanics to replace literally one-liner CLI operations. On most projects I've seen try to adopt Gulp, it wasn't clear that the Gulp-based tooling was actually more effective even when everything did work properly, which is why as you've probably figured out by now I'm not much of a fan.

> But by complicate the matter, I mean multiple files with multiple tasks to perform on those same files that results in a single output for each original file. [...] Now, I'm assuming all this can be done without gulp, but does using gulp not help with this task?

I don't think I can give an informed opinion about your specific example without knowing more about the details. All I can say with confidence is that in the more complicated multi-stage build processes I've worked with, I've yet to find Gulp an advantage over plain old scripting of the kind we've been using for a decade or five. Shell scripting was made for this kind of automation work, and if you prefer more general programming tools or need cross-platform portability then you can write similar logic straightforwardly in any number of programming languages, without all the complication that these "task runners" introduce.

At that point, you have far more flexibility as well. That makes a noticeable difference if you want to do things not quite within the target area of a tool like Gulp, such as processing of two different types of file together. How do you wrangle Gulp's stream-based approach into, say, hashing CSS or JS filenames for cache-busting purposes and then incorporating the appropriate hashed filenames into your HTML files' headers? You can do it -- it's ultimately just JS code, after all -- but is Gulp's framework really helping at that point, or are you just coding your way around it? In my experience, it's been the latter in 100% of cases, but again, YMMV.

I see, thanks.

It is nothing like that. Npm/node is the task runner, and you are not doing anything yourself that you wouldn't have to do with grunt/gulp anyway. They are the equivalent of making a function like `function add(a, b) { return a + b; }` instead of just using the `+` operator that is already there; it is nothing like jQuery vs "raw javascript".

Not that I disagree, but it seems you simplified things a bit much as compared to the subject at hand.

Plus, I've never heard npm/node described as a task runner. I would appreciate an explanation of what you mean by that.

Did you even read the article? You can define tasks in package.json and use `npm run ...` instead of defining them in gruntfile and using `grunt ...`

I don't understand the negativity here, I simply stated I've never heard of it described that way. I'm only asking for information here. I guess I have to apologize that I don't necessarily and automatically equate scripts as the same as a task runner. Please refer to my original post where I clearly stated I'm new to this environment. That would include terminology as well.

I can understand the frustration because the whole point of the article and the discussions here is that using the scripts functionality of NPM is equivalent to using a task runner, and, according to some here (including myself) superior in most cases.

But you're right that it's maybe not immediately obvious unless you've already played around with the scripts functionality and task runners in general. And I suppose you do need to have a basic familiarity with the concept of piping data from one unix command to another, running commands in sequence or parallel, the difference between "&", "&&", ";", and "|" etc. Not to mention the difference between STDERR and STDOUT which I still don't fully master. Thankfully this has almost never been an issue in practice.
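To make those operators concrete (standard shell behavior, nothing npm-specific):

```shell
echo first ; echo second   # ';'  runs both, regardless of exit status
true && echo ran           # '&&' runs the second only if the first succeeds
false || echo fallback     # '||' runs the second only if the first fails
printf 'b\na\n' | sort     # '|'  pipes stdout of one command into the next
echo oops 1>&2             # stderr (2) is a separate stream from stdout (1)
sleep 1 & wait             # '&'  backgrounds a command; 'wait' blocks for it
```

Any of these lines can be dropped straight into a package.json scripts entry.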

That said, I strongly suggest you take this opportunity to continue diving into this issue and maybe trying out both approaches yourself. My own experience getting into the node ecosystem and back-end stuff has been that I wasted countless hours on figuring out task runners, solving weird bugs and issues, switching from grunt to gulp, where I could have avoided all that by learning how to use the npm scripts functionality instead.

As a bonus, becoming more familiar with *nix approach to stringing together commands has been incredibly useful for many other things! For example, I've been able to replace certain scripts (backups, ftp synchronization, data conversion) with much simpler solutions that rely on this OS-wide functionality.

Actually, I'd say it's the other way around. Don't use a task runner because in many cases the operating system and the tools themselves already support what you need: parallel and sequential execution, grouping a complex set of commands under one 'name', etc.

Where your analogy does make some sense though, is cross-platform support. In the same way that jQuery solves cross-browser issues, a task runner solves the problem where your npm script commands might not work on non-*nix systems, or vice-versa. But that's a bit like using jQuery exclusively to solve XHR incompatibilities, instead of a library like superagent.

There is definitely a level of that in the article. "Use command line tools. Or, if it gets complex, make a .js file and use ShellJS" is basically suggesting you create your own version of the plugins already available.

Yeah, there's some Not Invented Here (NIH) sentiment to this behavior, and it's holding the community back from moving forward and exploring new frontiers and solutions; instead we reinvent the wheel, or tear down structures to rebuild them only to tear them down again shortly after, and so on and so forth.

Most of the issues with Grunt and the like can be done away with by following good coding practices. i.e. keep your code outside the Grunt context and write unit tests for it. See the Thin Grunt Manifesto:


I'm not sure I buy fleeing to the command line as a viable alternative. You are, in essence, going to have to write a bash script at some point, because the command will bloat beyond what can sanely be contained in a JSON property. Then you have to write tests for it. So why not just do that in Javascript, since you already have the infrastructure sitting there?

I think it is the case that most groups simply write lazy Grunt code in ways that make it hard to test. This is a chronic problem throughout the devops space; the code is frequently terrible. So don't do that. Don't drag the Grunt instance out into your code. Separate your concerns. Write testable functions and classes.

> You are in essence going to have to write a bash script at some point because the command will bloat beyond what can be sanely contained in a JSON property.

or do it in python like I did - http://kopy.io/yvEGl - it works amazingly and gives you endless flexibility without requiring bash.

Many a time I have begun a node project, and many a time I have started under the basic premise of "I'll just use an npm task for this one little build task". Task runners do seem to add a bit of complexity to a project, which sometimes creates friction for new contributors. As the project grows and I add more and more tasks - for linting, building sub-projects, testing, etc. - I end up writing "glue" js files which get called by npm to get the desired output of a library or tool, or to compose a set of tools in the way I need. After a while you realize it's been done before, and in a composable way - you're writing gulp and grunt tasks without actually calling gulp or grunt, and you've given up reusing any existing code for it. At that point I normally realize I can pull in gulp and discard a large chunk of my handwritten tasks in favor of easily composing task runner plugins.

Also, a minor nit: the author claims `&&` is "cross platform". I regret to inform him it is not. It may well work for bash and cmd.exe, but it has been disallowed as a command separator in PowerShell for a few versions. Aiming for cross-shell compatible commands is more of a minefield than you'd expect - I actually knew someone who'd rewired a ruby DSL to use as a shell. I've found it best for crossplat to write as much in JS as possible and just have npm tasks call out to scripts, using as little of the shell as possible.

This has been my experience too. I have every intention of using npm tasks but wind up pulling in gulp when my js "glue" gets too lengthy. I then wonder yet again why I wanted to omit gulp. The tasks are super readable and, I think, easier for new contributors than the pipes and ampersands and glue scripts that my npm tasks were growing.

I am using npm scripts for everything too. Check out my article "tiny npm package": http://g14n.info/2015/12/tiny-npm-package/

Check out my postversion hook!

By the way, even though JSON does not support comments, sometimes I add a "#TODO:fooscript" prop in the scripts section and use it to hold comments.

Another interesting trick is that a script can call other npm scripts.
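For example, a composed script might look like this (the script names are hypothetical):

```json
{
  "scripts": {
    "lint": "eslint src",
    "test": "mocha",
    "check": "npm run lint && npm run test"
  }
}
```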

Also another nice feature is that command in node_modules/.bin folder are added to PATH.
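So a locally installed tool can be invoked without spelling out its path (the version here is illustrative):

```json
{
  "devDependencies": { "eslint": "^1.10.0" },
  "scripts": { "lint": "eslint src" }
}
```

Because npm prepends node_modules/.bin to PATH when running scripts, `eslint` resolves to ./node_modules/.bin/eslint with no global install required.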

Yes, npm scripts rock!

I was struggling to integrate Webpack, Browsersync, hot reloading, Mocha and much more using Gulp.

I'm a little unclear on why you'd try to integrate Webpack into Gulp - Webpack is more than sufficient by itself. And I've tried using NPM-only scripts, but they haven't yet matched the dev server, hot reloading, production-readying functionality inside Webpack.

Webpack configuration files are absolutely gross, though.

The `cross-env` package used by the author seems useful - I didn't know about it. Without it, I'd usually write a separate bash script when I wanted to execute something like

  FOO=1 some-command
because this syntax doesn't work in `package.json` on Windows.
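With cross-env, the same thing can live in package.json directly; a hypothetical sketch (script and command names made up):

```json
{
  "scripts": {
    "build": "cross-env NODE_ENV=production webpack"
  },
  "devDependencies": {
    "cross-env": "^1.0.0"
  }
}
```

cross-env translates the `NODE_ENV=production` assignment into the right form for the current platform, so the script works in cmd.exe as well as bash.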

BTW, if you're not into bash for writing long scripts, it's pretty easy to write shell scripts in node.js. Node 0.12+ has a native `execSync`, which is nice for writing build code in JS in a synchronous way. I wrapped it with two helper functions to get something like `execAndPrint` and `execAndReturn`:


It's not as powerful as a bash script (it doesn't support the `|` syntax for piping, etc.), but if you need that, you can extract it into a helper bash script.

See also https://github.com/shelljs/shelljs

That title I would call sensationalist. Our tooling choices don't have to take this king-of-the-molehill, all or nothing, "did X just kill Y?" extreme stance. (I thought the article made some reasonable points, though.)

With so much conflicting wisdom to sort through - "yuck, monolith!", "omakase!", "YAGNI!", "not enough abstraction!", "too much abstraction!" - we should admit that we often end up leaning on instinct/emotion when making these decisions.

Just remember that you don't have to pick one philosophy and stick with it for all time exclusively. Often down the road you'll see the value in something you thought you had sworn off.

> I’ve found Gulp and Grunt to be unnecessary abstractions. npm scripts are plenty powerful and often easier to live with.

The main problem with this line of reasoning is that as things get more complex and expansive over time (and they will, as surely as night follows day), the developer will likely resort to building abstractions on top of npm scripts to hide all the tangled wires and simplify the dev process - and then we're back to square one.

Abstractions are a necessary evil, but they should be built and managed efficiently and wisely to avoid the common pitfalls and pain points and to make the development process easier and less annoying.

I'm glad to see more people following this approach.

The real consequence of the community diverging on build tools is the added work required of tool creators to support them all. I created a tool that included a grunt wrapper just before gulp became the new 'hotness', and I quickly became fed up with the whole mess.

Soon after, I discovered that npm scripts can call the bin scripts of locally installed dependencies. I haven't looked back since.

NPM scripts make composing tasks nice... and there's nothing stopping you from adding custom configs to the package.json object.
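For example, npm has a dedicated `config` field whose values are exposed to scripts as environment variables (the port value and file name here are made up):

```json
{
  "name": "my-app",
  "config": { "port": 8080 },
  "scripts": {
    "serve": "node server.js"
  }
}
```

When run via `npm run serve`, server.js can read the value as `process.env.npm_package_config_port`.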

I went the other way as well and used Python with Envoy to call everything from the command line.

It's quick and easy to debug (arguments to commands can be dumped in debug mode, run manually, etc.).

Also for certain tasks Python annihilates Gulp plugins.

This is a production example (not pretty code per se) http://kopy.io/yvEGl when combined with intellij watchers this makes it very easy to rebuild on change.

It adds yet another dependency just to get a project to initialize, though, which is a small negative. I guess npm needs Python anyway for node-gyp, but then you have to stick to Python 2.

I deploy dev environments via vagrant/ansible so that's a slight upfront cost but something I never have to touch.

Using virtualenvs I can also keep one project isolated from another in terms of jumping around the filesystem.

The convenience is a major win and since I use python for everything other than the web framework which is PHP it actually ties together nicely.

Unconventional but it works really well.

> Misconception #3: Gulp’s Streams Are Necessary for Fast Builds

The author fails to acknowledge that gulp is also asynchronous. This is what makes it "fast". Streaming is just a nice abstraction for passing data through a pipeline.

Sure, you can do async on the command line, but that's not trivial, and the author doesn't address it.

'make -j' seems trivial enough to me.
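For instance, a Makefile with independent targets (the target and command names are illustrative) lets `make -j` run them in parallel:

```makefile
.PHONY: all lint test bundle

# `make -j3 all` runs the three independent targets concurrently.
all: lint test bundle

lint:
	eslint src

test:
	mocha

bundle:
	webpack
```

Make parallelizes any targets that don't depend on each other, so no explicit async plumbing is needed.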

The next article will be entitled "Why I Left npm Scripts for Make"

really simple automation I use: https://github.com/martingallagher/gawp

this reminds me of 'real programmers' by xkcd[1]


JavaScript is way too complicated these days :S
