This is slightly related, and I don't want to sound like I'm trying to steal its thunder, because this looks really cool: I work on the asset pipeline that comes with the Dart SDK, and it's built on many of the same principles.
Any transformation step can read in many input files and produce many output files. The built-in dev server tracks the entire asset dependency graph and only rebuilds the assets that are dirtied by a source file changing.
We have a plug-in system, and it's built on top of the same package management system that the SDK uses, so you can get transformer plug-ins as easily as you can get any other dependency.
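For what it's worth, wiring up a transformer plug-in looks roughly like this in pubspec.yaml; this is only a sketch, and "some_transformer" is a made-up placeholder for a real package name:

    # pubspec.yaml -- sketch; "some_transformer" stands in for a real package
    name: my_app
    dependencies:
      some_transformer: any
    transformers:
    - some_transformer

pub fetches the transformer like any other dependency, and the build system picks it up from the `transformers:` list.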
We still have a lot of work to do to fully flesh things out, but it already does a lot, including supporting complex scenarios like transformers whose own code is the output of a previous transformer.

More here: https://www.dartlang.org/tools/pub/assets-and-transformers.h...
This is a great question. There are a few main reasons:
1. The asset build system is built on top of a bunch of policies and conventions specific to our language's package manager. The build system needs to be able to locate plug-ins somehow, and it's hard to define that "somehow" without making some assumptions about how dependencies are located and organized.
2. Like other platforms, we want to minimize external dependencies. If we reuse grunt, then every user has to have node and npm installed. There's nothing wrong with that, of course, and many users do already, but we'd like to avoid forcing that dependency.
3. Since the build system is plug-in based, we need a way for the build tool and plug-ins to communicate with each other. We want that API to be simple so that it's easy to create plug-ins. Since they're sending entire assets through that communication channel, we need it to be efficient too. It's hard to do that in a language-agnostic way.
4. We want to make it as easy as possible for people to contribute to the build system and the various build plug-ins. Since we already assume users know Dart, writing those components in Dart too increases the chances that they'll be able to pitch in and help.
I don't like walled gardens or feeling like we're reinventing the wheel (though I'll note that before this announcement, I wasn't aware of many other build systems that were as many-to-many based as ours), but there is value in having things be internally consistent.
There's always value in being able to stick to one language. I've worked on a project where we had a Gemfile, Gruntfile, composer.json, bower.json, .bowerrc, and package.json all within the same repository.
> Broccoli is a new build tool. It’s comparable to the Rails asset pipeline in scope, though it runs on Node and is backend-agnostic.
This first line is a little disingenuous. Technically, it's not backend-agnostic, since it depends on Node being installed on the backend (in the same way that Sprockets [1] depends on Ruby). The Rails asset pipeline is a framework-specific integration of Sprockets. In much the same way, you could more closely integrate Broccoli with Rails if you wanted and call it a new Rails asset pipeline.
The project itself looks great; it's just that the first line was confusing, since the docs start off by comparing apples to oranges.
A better comparison would probably be, "It's comparable to Sprockets (which powers the Rails asset pipeline), but runs on Node instead of Ruby."
Someone commented below (but their comment is dead, so I can't respond directly):
> The Rails asset pipeline requires a JS runtime, so Broccoli doesn't have any more dependencies than asset pipeline in my mind.
Technically they're right that the Rails asset pipeline requires a JS runtime by default. But it's important to note that it doesn't strictly require one; the runtime is only needed for the coffee-script gem, which happens to be included by default in new Rails apps. If you're not using CoffeeScript for any of your assets, then no, it doesn't require a JS runtime.
But isn't Broccoli more of a frontend build tool? The primary motivation seems to be to speed up frontend development in terms of the save-file, build, wait, view-in-browser loop. I don't know much (anything) about the Rails asset pipeline, but isn't it used by the actual backend framework? As far as I can tell, Broccoli is used to compile assets before the frontend files get anywhere near a backend... so in this way it's backend-agnostic.
Sprockets is a frontend build tool. It works basically the same way.
The main difference between Sprockets and grunt / lineman / broccoli is that Sprockets does not watch files. It's a web application that compiles your assets on the fly[0][1] when you need them.
So it's lazy, and blocking, meaning that you do not have to guess whether your assets have been recompiled yet. Your browser just asks for them, and gets them as soon as they are ready.
[0] In development mode of course
[1] In fact it caches the output, but whatever.
Bingo. But keep in mind that Sprockets doesn't have to be used just as a Rack server compiling on the fly; it can also be used to precompile to static files. You could then combine it with something like the watcher gem, and you'd have automatic precompiling whenever you save your source files. In fact, there's the guard-sprockets gem which does all this for you [1].
No, it is correct. Backend agnostic means that it doesn't require your frontend project to be served by a rails app, a node app, a python app or whatever. It is written in node, but it is backend agnostic. This is not disingenuous at all if you use the same definition of backend agnostic as the author is implying.
What I'm saying is that the author's definition is incorrect. Backend agnostic would mean that it doesn't matter what software or language your backend is running. Backend agnostic would be a term you'd use e.g. for a front-end library. Broccoli is not that. Broccoli requires that your backend has Node.js installed and running. Even if Broccoli were written in Bash, it still wouldn't be backend agnostic.
Think about it this way. Heroku, for example, can either have Ruby or Node installed on a single server, not both. If you have a Ruby app deployed on Heroku, you could not use Broccoli in production, because it requires Node to be installed on the server (not that you'd want to anyway; it's a precompiler, so you'd run it locally before you deployed to Heroku). The fact that you can have a backend setup on which this could not run means it's not backend agnostic.
Anyway, that's not the disingenuous part. The disingenuous part is implying it's better and more flexible than the Rails asset pipeline, which is an apples-to-oranges comparison. This is comparable to Sprockets. Sprockets does this sort of thing (with a slightly different implementation), but is written in Ruby. Then someone created the sprockets-rails gem, which set up and configured Sprockets to run more tightly coupled with Rails (making it easier to use within a Rails app). That is what then became the Rails asset pipeline.
You can still use Sprockets with a Rails app, a Node app, a Python/Django app, whatever, as long as you have Ruby installed on the server, just the same as you can use Broccoli as long as you have Node installed on the server. This tool is comparable to Sprockets, not the Rails asset pipeline.
Now, Sprockets, in addition to precompiling, also happens to have a live Rack server that can run in development mode, which you'd have to configure with your app in order to use. In that case, yes, Sprockets has more interdependency with the specific implementation of your web app. But if you just used Sprockets for its precompiling and added the watcher gem, you'd essentially have what I understand Broccoli to be (though I could be wrong, as I haven't actually delved much into the Broccoli source code).
I'm not being down on Broccoli. Quite the contrary, I think it looks really cool. I'm just saying that comparing it to the Rails asset pipeline implies that the existing alternatives aren't as flexible or configurable as they really are.
I've also been playing around with gulp, and while I'm still in the 'trying to use it like grunt' kind of mindset, I have seen some merit in using these stream-based build tools.
Have to wonder why this all requires new build tools to achieve though.
It's obviously technically possible (npm shows grunt-gulp which does exactly what you'd think), so is the grunt architecture so firmly rooted in files that streams could not be leveraged, or is it something that will come in time?
Personally, I've managed to get a nigh-on perfect build process using grunt-watch, so there's no rush to move away.
Nice to see build tools still moving forward though.
I think the irksome point of grunt is simply that you tend to have many temporary directories for various states of files, especially if you use CoffeeScript or other intermediate languages, not to mention merge/minify actions and their temporary spaces.
Grunt takes an almost entirely declarative approach, which becomes really hard to reconcile with situations where the order of execution really matters, because you are basically trying to map a flat(ish) array of commands onto a tree-like structure to trick the grunt internals into working the way you need them to.
You can either do this explicitly (by setting up task aliases), or implicitly (by all the temporary-file watch jiggery-pokery).
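As a concrete (and entirely made-up) illustration of the temporary directories and alias-based ordering being described here:

    // Gruntfile.js -- a minimal sketch; paths and task names are assumptions
    module.exports = function (grunt) {
      grunt.initConfig({
        // CoffeeScript compiles into a temporary directory...
        coffee: {
          compile: { files: { 'tmp/app.js': ['src/**/*.coffee'] } }
        },
        // ...which concat reads from and writes back into tmp...
        concat: {
          dist: { src: ['tmp/app.js', 'vendor/**/*.js'], dest: 'tmp/bundle.js' }
        },
        // ...before uglify finally produces the real output.
        uglify: {
          dist: { files: { 'dist/bundle.min.js': ['tmp/bundle.js'] } }
        },
        // Watching re-runs the whole chain on every change.
        watch: {
          scripts: { files: ['src/**/*.coffee'], tasks: ['build'] }
        }
      });

      grunt.loadNpmTasks('grunt-contrib-coffee');
      grunt.loadNpmTasks('grunt-contrib-concat');
      grunt.loadNpmTasks('grunt-contrib-uglify');
      grunt.loadNpmTasks('grunt-contrib-watch');

      // The execution order lives in this alias, not in the config above.
      grunt.registerTask('build', ['coffee', 'concat', 'uglify']);
    };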
It's unlikely that I will start any new projects with grunt, even though I know it a lot better than gulp right now.
Yep- a search on npmjs shows not quite a dozen tasks for Broccoli while Grunt has hundreds of results. Not that quantity necessarily == quality but when it comes to a task runner, I need to, you know, run tasks.
Definitely awesome that people are trying to optimize in the devops space but just not sure I could be convinced to switch to something that isn't at least somewhat mature.
I think the trouble comes in that 3/4 or more of Grunt plugins are things that really have very little reason to be plugins in the first place. Given how flexible node really is, and how npm scripts work, I'm more and more inclined to simply have a ./scripts/ directory with a file per task registered in package.json, so I can simply `npm run taskname` and have it correlate to `./scripts/taskname.js|coffee` ...
With node.js as a host environment, it's easy enough to do pretty much whatever you want as a script/task; it's pretty much cross-platform and works well. Shell scripts come close, but ignore the elephant in the room (Windows).
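A rough sketch of that layout, with a made-up `clean` task:

    // scripts/clean.js -- just an ordinary node script, no plugin required
    var path = require('path');

    ['dist', 'tmp'].forEach(function (dir) {
      // a real script would delete the directory; this sketch only reports it
      console.log('would clean', path.resolve(dir));
    });

and the matching package.json entry:

    {
      "scripts": {
        "clean": "node ./scripts/clean.js"
      }
    }

`npm run clean` then invokes it directly, with no task-runner plugin in between.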
For the record, I'm biased as all hell, like node.js and JavaScript in general.
> Shell scripts come close, but ignore the elephant in the room (Windows).
I use MinGW and make, and I've never had a problem on Windows; it's not like MinGW is heavy to download. Most people use Git, and Git comes with a Bash on Windows too, so there are zero excuses not to learn shell scripts.
Honestly, half the battle is having a great, easy-to-use plugin style that gets lots of people making plugins, and given the incredibly brief mention of plugins (which makes them seem quite complex), I'd be quite wary of Broccoli at this point.
I also wonder how it compares to Gulp (mainly in terms of performance in practice). I've just started to use Gulp, migrating away from Sprockets and so far it has been a joy.
Fast builds aren't the entire Broccoli offering. I would trade slow builds for accurate, consistent, and durable builds; amazingly, with Broccoli I get both.
I actually don't believe comparing grunt/gulp with Broccoli makes terribly much sense, as Broccoli aims to be an accurate/stable/fast build pipeline; it does not aim to replace your task runner. Its primary goal is to be the best possible build pipeline, and it should be programmatically accessible to your existing task runner.
Anyways, since I didn't really compare grunt/gulp with broccoli, let me explain what you get:
* primitives make sense; I was able to get a fairly non-trivial (if ordinary) pipeline set up really quickly
* builds so fast, you don't notice them.
* accurately handles changes (deletions and git branch changes often cause issues for similar tools)
* doesn't lose changes that occur while building
* a true pipeline, e.g. describe the transforms and the system handles the rest; no need to construct makeshift pipelines yourself
* immediately usable as server (error reporting, locking)
* development builds.
* minified source mapped production builds.
* the Broccolifile API is well suited for constructing custom pipelines
And finally:
The above you essentially get for free; if you need customizations, the Broccolifile provides an API well suited for the task.
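For a sense of what a Broccolifile looks like, here is a minimal sketch using the broccoli-merge-trees plugin; the directory names are made up, and the beta API discussed in this thread exposes helpers like `makeTree` rather than plain directory paths, but the shape is the same: describe trees, transform them, export the result.

    // Brocfile.js -- a minimal sketch; directory names are assumptions
    var mergeTrees = require('broccoli-merge-trees');

    // A plain directory path acts as an input tree
    var app    = 'app';
    var styles = 'styles';

    // The exported tree is what `broccoli build` writes out and what
    // `broccoli serve` serves, rebuilding only what changed
    module.exports = mergeTrees([app, styles]);

Each plugin takes trees in and hands a new tree back, so the whole pipeline stays one expression; there are no intermediate directories to manage by hand.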
As the author mentions, the real insight seems to be to switch from the file-based unit to the tree-based unit. This means asset building can be made more intelligent.
Now, I wonder if the same approach can be taken in gulp using vinyl-fs? Otherwise, I wonder if it might be worth plugging this tool as a specialized asset pipeline into existing grunt/gulp files.
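For contrast, here is roughly what the per-file stream model looks like with vinyl-fs; the paths are made up, and the pass-through transform just stands in for a real plugin:

    // vinyl-fs streams individual File objects, one at a time
    var vfs = require('vinyl-fs');
    var through = require('through2');

    vfs.src('src/**/*.js')
      .pipe(through.obj(function (file, enc, cb) {
        // a real plugin would rewrite file.contents here
        cb(null, file);
      }))
      .pipe(vfs.dest('build'));

The unit flowing through the pipe is a single file, so tree-level concerns like "what else is affected by this change?" have to live outside the stream, which is exactly the question raised above.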
I have been working on replacing the whole production build pipeline as well. And since Grunt isn't a thing you can ignore these days I created a simple grunt wrapper that collapses the entire 200+ lines of config down to two config options: https://github.com/Munter/grunt-reduce
I'm happy to see this reach beta version, it's a great step in the right direction. Grunt is too generic as a tool and we've all seen Gruntfiles reach enormous lengths, to the point where it's really hard to figure out what is processing what.
One thing that has room for improvement, though, is the syntax, which in my opinion doesn't reveal the intention behind some methods and is a bit too coupled to the implementation. What does `makeTree('lib')` mean? If it's taking a folder and its files, then why not rename it to something like `broccoli.requireFolder('lib')`? Another thing that might improve usability would be chaining compilers instead of calling them directly with the tree as a parameter.
These are just minor things anyway, I'm sure the library will improve over time. Congrats joliss, great fan of your work!
I would like to request renaming broccolifile.js to broccoli.js. I believe broccolifile.js is too long for a standard build file name. Gruntfile.js always bothered me. Compare it to pom.xml, build.xml, or package.json and it feels out of place.
I like the Broccoli name quite a bit (and the whole concept and implementation - great job Jo!), but I also prefer `Brocfile` to `Broccolifile`. Does that make me a broccoliphile against broccolifiles?
I want to mention mincer[1], which I have used in the past for compiling assets, and it has been an entirely painless process. Definitely take a look at it as an alternative; it has been around longer, and (according to the README) the folks behind sprockets[2] assisted in creating a similar API.
The fun part is when your non-technical coworkers ask what you're reading so intently and you read something like this to them with no context: "Run broccoli serve to watch the source files and continuously serve the build output on localhost. Broccoli is optimized to make broccoli serve as fast as possible, so you should never experience rebuild pauses."
Other possible tasks include initializing the database (for a new developer), regenerating some artifact that isn't normally regenerated during the build, and launching a local web server running the project.
Yet ANOTHER build tool. I've started placing bets on when repositories will flip to NEW-HOTNESS-BUILD-TOOL at the cost of actual product development time.
Engineers will constantly run toward shiny baubles at the expense of everything else.