"The project has been inspired by PBS of Andrew Moffat, and has borrowed some of his ideas (namely treating programs like functions and the nice trick for importing commands). However, I felt there was too much magic going on in PBS, and that the syntax wasn’t what I had in mind when I came to write shell-like programs. I contacted Andrew about these issues, but he wanted to keep PBS this way. Other than that, the two libraries go in different directions, where Plumbum attempts to provide a more wholesome approach."
Plumbum is the Latin name for lead (periodic table symbol "Pb"), perhaps memorable enough considering Plumbum was inspired by an earlier project called PBS.
The module documentation doesn't mention error handling. Perl has no consistent exception system and this makes it very difficult to build reliable programs in the language. The standard library has inconsistent error handling. And CPAN modules like this usually have none at all.
I like PBS (now sh.py) for certain use cases. If I'm writing an actual shell script, I think it is brilliant. It keeps the script focused on the task at hand instead of Python's somewhat painful process communication.
On the other hand, if I have an application that needs to communicate with a subprocess as a small piece of the whole, I'll use other methods that are less "magical". It's not that I'm inherently against magic, but rather that, in that use case, I generally want very explicit control over what is happening.
This tool hijacks the import mechanism by directly writing to Python's look-up tables. After all, when you do
from sh import git
there's no module 'sh' invoked in the normal sense. Instead, the library generates a wrapper for the shell command 'git' on the fly. While that kind of monkey patching may be neat, it's also a bit brittle and a potential security issue.
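The basic trick can be sketched in a few lines (a simplified illustration of the idea, not sh's actual code): a module can replace its own entry in sys.modules with an object whose __getattr__ manufactures wrappers on demand, so the from-import above resolves lazily.

    import subprocess
    import sys


    class _ShellModule(object):
        """Stands in for the module: unknown attributes become command wrappers."""

        def __init__(self, real_module):
            # Keep the real module alive; replacing sys.modules drops the only
            # other strong reference to it.
            self._real_module = real_module

        def __getattr__(self, name):
            def command(*args):
                # Run the program of the same name and return its stdout as text.
                return subprocess.check_output([name] + list(args)).decode()
            command.__name__ = name
            return command


    # Self-replacement: after this line, `from <this module> import git` is
    # served by _ShellModule.__getattr__ instead of a normal name lookup.
    sys.modules[__name__] = _ShellModule(sys.modules[__name__])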
Python has a lot of built-in support for overriding how imports work, and modules have always been just namespaces. As hacks go, this isn't very hacky in Python, and it isn't particularly brittle, as it's just implementing existing interfaces.
It is also not "monkeypatching", which ought to be reserved for things that involve reaching into an existing class and modifying things. This on-demand loading, which doesn't have anywhere near the same evil factor, but is rather more like dynamic programming languages working-as-designed.
You may find this distasteful. I do, actually, though I'm not 100% sure why. But it's not because it's some sort of abuse of Python. Python is very nearly designed to do this, and the last little bit that it isn't designed for isn't that big a deal, especially compared to something like the "import python modules through zipfiles" functionality, which now ships with the core.
It's true that it is another place that you can get a python script to execute code outside of its environment, but you get a ton of those for free with the stdlib.
Well, the Python and Ruby communities have different engineering cultures and this is part of that. "Explicit is better than implicit" is one of the mantras on the Python side.
You expect import to import names that have actually been declared somewhere: names you can find and whose definitions you can read. This library essentially hijacks the import process to let you import any name; each name is then actually a wrapper around a mechanism for running shell commands. Definitely a hack, but a cool and useful-looking hack.
That is a reasonable assumption in the common case, but it isn't always safe to assume. The same is true of obj.v(): you would normally expect a v attribute to be defined on the object, but it might really be coming dynamically from __getattr__().
This is something that Python specifically has language-level support for; I wouldn't say it's a hack just because it's using a feature that isn't taught in Python 101.
Rightly or wrongly, I expect import to be simple, to fail only when packages haven't been installed properly, and more generally to depend only on PYTHONPATH. When I'm debugging I don't normally even look at the import statements. Other replies have described what this does; I think I'd rather offer such functionality as something that looks like a method call that might fail, something like git = sh.getProxyForShellCommand("git") (probably not that verbose; my head's in Java-land at the moment, but you get the idea).
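A sketch of that explicit style, assuming a made-up factory name and using shutil.which (Python 3.3+) so the lookup fails early when the program isn't on PATH:

    import shutil
    import subprocess


    def proxy_for_shell_command(program):
        """Explicit factory: fail immediately if the program can't be found on PATH."""
        if shutil.which(program) is None:
            raise FileNotFoundError("no such program on PATH: %s" % program)

        def run(*args):
            # Raises subprocess.CalledProcessError on a non-zero exit status.
            return subprocess.check_output([program] + list(args)).decode()

        run.__name__ = program
        return run


    git = proxy_for_shell_command("git")
    print(git("status", "--short"))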
To the parent and anyone else who hasn't experienced the wonders of `pip`, check out the article "Tools of the Modern Python Hacker: Virtualenv, Fabric and Pip" [1]. All three of these tools are invaluable for even the most trivial of Python projects, and make developing in Python much more enjoyable.
I just learned Python a week ago, and PIP was an important part of that.
For those of you who are just learning, PIP will automatically download, compile (where needed), and install any Python modules you point it at. Just make sure you have the compilers the module recommends installed first.
I have found this extremely useful and have used it to write many things, from a set of scripts that bootstrap a Chef server onto a node from scratch, to a file-chunking program that uses multiprocessing and this library to align log files with Hadoop block sizes. It made a lot of things very easy.
This version introduces many positive changes, especially 'Iterating over output', which I have been waiting on for a long time.
Andrew wants to increase his support for MacOS and would like to have test results from "python setup.py test" (to run the whole test suite). One identified bug is: http://bugs.python.org/issue15898
I would love to see more people use this to simplify their work!
If anyone is interested in looking into the scripts I wrote to see what's possible, let me know.
Just look at how they're used; they're not the same. sh imports shell commands as functions, while envoy lets you run a shell command string very easily, like Perl's backticks.
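Roughly, from the two projects' documented usage (details from memory, so treat them as approximate):

    # sh: the command is imported and called like a function; arguments stay a list.
    from sh import git
    print(git("log", "-1", "--oneline"))

    # envoy: you hand over a command string, much like Perl's backticks, and get
    # back an object carrying the exit status and captured output.
    import envoy
    r = envoy.run("git log -1 --oneline")
    print(r.status_code, r.std_out)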
Definitely neat, but of course platform-dependent.
Due to the cross-platform needs of Mozilla's PDF.js build scripts, we've been writing a Node.js lib on top of Node's APIs that enables you to write shell-like scripts that run seamlessly on multiple platforms:
This seems only tangentially related. The point of something like sh.py is to ease the use of external commands you need to use. The functions that shelljs partially implements (cd, pwd, ls, find, cp, rm, mv, mkdir, test, cat, sed, grep, which, echo, exit, env) are already trivially accomplished in Python.
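For instance, most of that list maps straight onto the standard library (illustrative one-liners, not a drop-in replacement for shelljs):

    import os
    import shutil
    import sys

    os.chdir("/tmp")                 # cd
    print(os.getcwd())               # pwd
    print(os.listdir("."))           # ls
    shutil.copy("a.txt", "b.txt")    # cp
    shutil.move("b.txt", "c.txt")    # mv
    os.remove("c.txt")               # rm
    os.makedirs("d/e")               # mkdir -p
    print(os.environ.get("HOME"))    # env
    print(shutil.which("git"))       # which (Python 3.3+)
    sys.exit(0)                      # exit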
It's probably not ready for prime time; past its initial use cases it hasn't been tested much. Things like sh.py and jash are a really neat solution for some problems.
I was hoping for a Node.js alternative. One question, have you thought about using https://github.com/samshull/node-proxy instead of searching through the environment to find binaries? Using .get() you would only need to identify programs that are requested, rather than knowing them all up front.
Founder of Commando.io (http://commando.io) here. The tutorial on SSH was particularly interesting, since we are doing some of the same sort of things to help with orchestration of servers. Currently we are using `libssh2` via a PHP module, but switching to a sparkling new node.js interface for the SSH and SCP connections and executions shortly.
Throwing exceptions when a command returns a non-zero exit status is very useful indeed. However, this isn't very different from using the shell's own && operator.
I still believe that wrapping shell commands with functions is the way to go. Functions can intelligently check their arguments and prevent propagation of dangerous (or otherwise obviously incorrect) arguments.
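A hedged sketch of what that can look like with sh, where ErrorReturnCode is the exception sh raises on a non-zero exit status (the validation rule here is just an example I made up, not anything sh provides):

    import re
    import sh


    def delete_branch(branch):
        # Validate the argument before it ever reaches the shell command.
        if not re.match(r"^[A-Za-z0-9._/-]+$", branch) or branch in ("master", "HEAD"):
            raise ValueError("refusing to delete branch: %r" % branch)
        try:
            return sh.git("branch", "-d", branch)
        except sh.ErrorReturnCode as exc:
            # sh raises instead of silently handing back a non-zero status, which
            # is what takes it beyond simply chaining commands with the shell's &&.
            print("git failed with exit code", exc.exit_code)
            raise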
Surely implementing the shell commands natively is the way to go?
Otherwise you then have to parse the output of ifconfig, say.
E.g. I have been doing this for Lua [1]; you can do:
> i = nl.interfaces()
> print(i.lo)
lo Link encap:Local Loopback
inet addr: 127.0.0.1/8
inet6 addr: ::1/128
UP LOOPBACK RUNNING LOWER_UP MTU: 16436
RX packets:261454 errors:0 dropped:0
TX packets:261454 errors:0 dropped:0
> print(i.eth0.macaddr)
f0:da:f0:38:36:39
The functionality is now reasonably comprehensive: you can rename interfaces, add addresses, and so on. It is still a work in progress, as there is a lot of shell functionality left to implement, but it is useful incrementally.
It saved my day: I expected partials with a cwd parameter, and they were there. I used this instead of GitPython plus manual popen for some git management tasks.
Lua has a function called os.execute() with which you can call programs and utilities from the command line. It's cross-platform as well; I've got more than one Lua script that works at the command line and can handle being on Unix or on Windows.
A common example of this that I use quite often is "clear" or "cls". The script determines which OS it's running on, then issues the appropriate command to clear the terminal window.
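The same pattern is easy to mirror in Python, for comparison (a minimal sketch):

    import os
    import subprocess


    def clear_screen():
        # Pick the platform-appropriate command: 'cls' on Windows, 'clear' elsewhere.
        command = "cls" if os.name == "nt" else "clear"
        # shell=True lets the shell resolve 'cls', which is a cmd.exe built-in.
        subprocess.call(command, shell=True)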
http://plumbum.readthedocs.org/en/latest/index.html
Here's the explanation on the differences:
"The project has been inspired by PBS of Andrew Moffat, and has borrowed some of his ideas (namely treating programs like functions and the nice trick for importing commands). However, I felt there was too much magic going on in PBS, and that the syntax wasn’t what I had in mind when I came to write shell-like programs. I contacted Andrew about these issues, but he wanted to keep PBS this way. Other than that, the two libraries go in different directions, where Plumbum attempts to provide a more wholesome approach."