To be fair, if you ever install and execute someone else's software without either reading the code yourself or making an attempt to verify that the source is who you intend (e.g. at least checking an MD5 sum), then you're guilty of precisely the same security gaffe.
A lot more can go wrong with a shell script pulled from some third-party domain than with an installed app, especially given sandboxing in modern OSes and app stores, and given how much easier (and more likely) it is to reach for `sudo` than to run an app with elevated privileges.
Anyone who values the user experience over the tiny chance of a security breach.
I'm not defending it as a good practice, but that seems to be the main reason, and I've not heard of Pow or Homebrew causing any problems in this regard... yet ;-)
Never mind the security aspect, how am I going to uninstall the damn thing or upgrade its components (which other things in my system might use) with a custom installer?
There's a reason I don't install things that don't use apt.
You are piping the output of an untrusted command into a shell that will execute it. This leaves no way to verify that the code you are running is legit: no MD5 verification, no cursory inspection of the script, and so on. There could even be hijacked DNS pointing to a server that specifically does a bad thing. Maybe not a huge deal for small projects, but it's not a great idea in general.
HTTPS is an entirely different story, I don't know that people would necessarily like installs like "curl https://my.script.ly | sh", but there's at least a mechanism to verify the identity of the source.
You are asking developers to download a shell script from $random_site and run it immediately without any thoughts whatsoever as to what effects it might have.
The biggest problem is validating the download prior to installing it. Whether you use make install, install.sh, or some other similar script, the following should be done:
Download package
md5 package
verify md5 == published md5 of package
extract
make install / install.sh / etc
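For the curious, those steps sketch out in shell roughly like this. Filenames are placeholders, and the "downloads" are faked with a local file so the sketch is self-contained; in real life both the package and the published md5 come from curl, ideally from different sources:

```shell
set -e

# Stand-ins for the real downloads: in practice package.tar.gz comes from
# `curl -O` against the project site, and package.md5 holds the checksum
# the author published (ideally fetched from somewhere else).
printf 'fake package contents\n' > package.tar.gz
md5sum package.tar.gz > package.md5

# The verification step: md5sum -c exits non-zero on a mismatch, so with
# `set -e` a tampered download stops here, before extract / make install.
md5sum -c package.md5
echo "checksum ok: safe to extract and run make install / install.sh"
```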
With "curl package.github.com | bash" the validation is missing. I don't mind the curl x | bash for my dev machine or testing/dev VMs, but that is not happening on production. And if I need said software on production, I have to find a different way to install.
If the bad guy has intercepted the DNS, they can just provide the md5 of the bad script, no?
That's enough extra work, and unreliable enough, that the attacker might not bother. Why work so hard to sabotage the user who checks md5sums when you can just wait for a user that doesn't? Just because thieves can carry lockpicks doesn't mean that you shouldn't bother locking your car: Protection against lazy, opportunistic thieves is still better than nothing.
The other advantage of the MD5 plan is that you can download the MD5 from a different site than the script, at a different time and over a different internet connection (or, perhaps, over https). A specific, important version of that use case is: If you're installing the script over and over again in an automated fashion, you can download its MD5 in advance, cache it, and then check it against every future download of the script to verify that the script hasn't changed. When the script gets updated and the MD5 legitimately changes, you audit the diff and then update your copy of the MD5 for the future.
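That cache-and-compare loop is only a few lines of shell. This is a sketch with placeholder filenames; in reality install.sh would come from curl on every run, and install.sh.md5 is the checksum you cached after auditing an earlier download:

```shell
set -e

# Stand-ins: install.sh would really be re-downloaded with curl each run,
# and install.sh.md5 is the checksum cached after a manual audit.
printf 'echo installing\n' > install.sh
md5sum install.sh > install.sh.md5

# Every automated run checks the fresh download against the cached checksum:
if md5sum --status -c install.sh.md5; then
    echo "script unchanged, running it"
    # sh install.sh
else
    echo "script changed upstream: audit the diff, then refresh the cached md5" >&2
    exit 1
fi
```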
Meanwhile, using curl-over-HTTPS seems like it couldn't hurt, but better make sure 'curl' is really checking the cert and aborting on cert mismatch, because tools can be very sloppy about this. Also, you're still trusting the third party site, and once their site gets hacked it's game over… unless you have another canonical source for the MD5 sum.
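On the "make sure curl is really checking" point, a strict fetch helper might look like this (the URL in the usage line is a placeholder). Certificate verification is on by default in curl, so the main thing is to never pass -k/--insecure; the extra flags guard against downgrades:

```shell
# --proto '=https' refuses plain-HTTP URLs and redirects, --fail aborts on
# HTTP error pages instead of saving them, and --tlsv1.2 sets a floor on
# the protocol version. Cert checking stays on because we never pass -k.
fetch() {
    curl --fail --proto '=https' --tlsv1.2 -sSL -o "$2" "$1"
}
# Usage: fetch https://my.script.ly/install.sh install.sh
```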
One ultimately realizes why real packaging systems have signed packages, with private keys assigned to developers.
Isn't that just a suggestion to get up and running quickly (for example on a dev VM)? There's nobody forcing devs to blindly follow the instruction and pipe it directly into bash. You are free to curl -L -o install.sh get.whatever.com if you want to inspect the install script.
It's not an argument at all. It's a manner of speech. To be more verbose: I can't think of a compelling reason why you would blindly type whatever instructions you see on the screen, into your terminal. Unless you were being forced.
EDIT: It looks like the new FAQ addresses this issue: http://yeoman.io/faq.html > "How does Yeoman differ from tools like Brunch or BBB?" and "How does Yeoman differ from Grunt?". Post left up for other users who are wondering how this compares to Grunt or Brunch.
However, the answer seems a little... underwhelming. Apparently it's "we've made making your own scaffolds easier". Are there any other, unmentioned benefits?
EDIT2: Partial updates as I discover this myself (hope that's okay): one huge advantage is how it uses Bower beneath the surface. No more manual AngularJS/jQuery/other dep upgrading!!
It has many capabilities useful to the modern web dev workflow.
Of most note to me, it acts as a "project creation" tool, a.k.a. a scaffolding generator.
It will pull things like HTML5 Boilerplate, jQuery, Backbone.js, etc down from github, and properly generate the project files you need to start a project with those dependencies.
You don't have to worry about how any of it fits together; it will get you up and running with the latest version of everything from a single command line.
It also does things like minify CSS and JavaScript, as well as compiling LESS/SASS and CoffeeScript.
They are unfortunately trying to make a single tool that solves many problems, when they might be better served by making many tools that are each good at one thing... I'm still waiting for them to launch this so I can see how that all pans out.
If you have already built a few web apps, wouldn't you already have your scaffolding that can be copied into a new app? And as a bonus, you will have scaffolding that you understand very well.
"You don't have to worry about how any of it fits together..." That doesn't seem like an advantage.
I agree with this. I like knowing exactly what all the pieces of my stack are, and how they mesh together.
If someone is churning out websites right and left I could see this being a more interesting tool, as it seems to be more powerful than any "boilerplate generation" scripts I would write myself.
The pain point of managing dependencies is indeed non-trivial in my experience, so I will keep an open mind for tools that look to solve this.
Random stream-of-consciousness idea: I create a new directory, and in it a text file containing the following on separate lines: "jQuery html5boilerplate AngularJS". I then run a build command to pull all these resources together in a sane way. This would allow me the fine-grained control I prefer, help ease the tedium of fetching dependencies, and obviate the need for a stream of "yes/no" questions at the terminal. This functionality may exist already, and it seems like it could be built by leveraging the logic behind Yeoman, but with a different "UI".
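For what it's worth, the plumbing for that idea is only a few lines of shell. The package names and the fetch step are placeholders here (the echo stands in for something like `bower install`, assuming the names match the registry), so this sketch runs anywhere:

```shell
set -e

# deps.txt: one dependency name per line, as in the idea above.
printf 'jquery\nhtml5-boilerplate\nangular\n' > deps.txt

# The "build command": walk the list and fetch each entry. The echo/tee is
# a stand-in for the real fetch, e.g. `bower install "$dep"`.
while read -r dep; do
    echo "fetching $dep" | tee -a fetched.log
done < deps.txt
```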
Can this tool connect to a mysql database and generate html5 CRUD forms based on the database? I have seen a tool like phreeze.com and I am wondering why there is not more in that direction (building web app basic pages based on existing database design)?
This is a tool for front-end developers. It would have to support a myriad of server-side languages for that, which is completely out of scope, and tools that do it already exist.
From what I can guess, if you happen to be on a non-supported linux distribution (or any non-OSX OS with sh, really) and have clang installed, the script will go as far as running https://raw.github.com/mxcl/homebrew/go which should fail fast for lack of a /usr/bin/sw_vers binary.
I'm not brave enough to actually run the script and check, though.
When I saw the Bower announcement yesterday I was a bit concerned that with Yeoman coming up in the next weeks, there would soon be two brand new competing front-end package managers. I'm pleasantly surprised that Yeoman actually builds on top of Bower.
I'm using Brunch[1] for a current project, and have used Backbone Boilerplate[2] in the past. Brunch is pretty great, though bbb is better hooked up with requirejs out of the box, if you're into that kind of thing.
I really want Yeoman to take off, in particular because I really want one of these tools to become the breakout hit. I'd rather everyone be focusing on improving one tool instead of all this effort going diffuse across a bunch of different solutions. I'm a python dev, and I thank the stars that pip became so standard. (Not that this is a direct analog to pip, but pip provides a subset of this functionality when it comes to packaging.)
One note: the Ember init is waaaay basic. I know that these are going to be community maintained, but I would have hoped that the Ember generator bundled with this initial release would have included at least commonjs or requirejs integration.
Why all the hate? If it's that bad, help out and make it cool.
Make it a brew package, or npm, or maybe a gem: alternative installers for those who don't want the curl script | sh thing.
I think it's a great idea!
*edit Actually just noticed that it IS an npm package that can be installed using `npm install yeoman`
I have a similar more light-weight project template that requires no installation (except node).
Pith auto-generates test module dependencies and uses Backbone too :)
I was able to get it running on Windows, so I am happy with it. It needs a more in-depth guide so I know what it provides and the reasoning behind your "sane defaults".
A quick note, the bottom of that script has this section:
if [ "$COMPASS" -eq 0 ]; then
  echo ""
  echo "Install hiccup: no compass"
  echo "Sorry chap, compass wasn't setup because there was a problem with your ruby setup. You can check the documentation here for help: [link to documentation]."
fi
You should probably make that an actual link for when people have problems.
Very nice install script by the way, you clearly had a lot of fun making it.
Ah, I've been building something like this for my own use in static sites. I use Django templating with variables loaded from a config file, thinking about supporting some sort of database for more complex sites. Also planning on compiling (less), combining and minifying css/js files automatically on file changes. Like a jekyll of my own on steroids. It's a cool little side project to have that you can add bits and bytes to when you need them.
I've been watching/waiting for Yeoman for quite a while now. I can't help but wonder how much Twitter's Bower pushed you towards launching today. This looks to fit a lot more of the workflow than just package management though and does a lot of the things I'm currently doing myself through a huge mashup of other tools.
EDIT – totally missed that this in fact uses Bower for the package management, so probably couldn't even launch until that was released.
If anyone's interested in something similar for PHP that goes a few steps further, I've got a project I'm looking for feedback from some advanced developers on -- our website doesn't convey it well though yet: http://emr.ge
This is great. I would like it to support spinejs; maybe it is easy to add, haven't looked yet.
This reminds me how much we need a global package manager for most of the open source languages. Why would each framework need its own way of handling dependencies?
To the authors: Please make sure to remove trailing whitespace. I see it all over the code base. Run 'git diff 4b825dc642cb6eb9a060e54bf8d69288fbee4904 --check' to catch those. That SHA is that of the empty tree, applicable to any git repo.
I don't know who to reply to: There's a lot of correct comments from people saying they don't want to run a script through curl from a site. Fair enough.
But why not just fork the repo on Github (or onto your own infrastructure) and run the script from there, where you can verify any tampering?
I spent weeks explaining to junior devs to not do that on production servers. I thought it was obvious, but apparently it isn't...