
Better bash in 15 minutes - wsxiaoys
http://robertmuth.blogspot.com/2012/08/better-bash-scripting-in-15-minutes.html
======
phaemon
If you're using:

    
    
        set -o errexit # or set -e if you prefer
    

Then you probably also want:

    
    
        set -o pipefail
    

Otherwise, it only checks that the last command succeeds, so something like:

    
    
        ls *.ssjkfle | wc -l
    

will actually continue as success despite the "ls" failing.
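A minimal sketch of the difference (the glob here is deliberately one that matches nothing):

```shell
#!/usr/bin/env bash
set -o errexit

# Without pipefail, a pipeline's status is the status of its LAST command.
# ls fails here, but wc succeeds, so errexit never fires:
ls ./no-such-prefix-*.txt 2>/dev/null | wc -l
echo "reached: the failed ls was silently ignored"

# With "set -o pipefail" the same pipeline would carry ls's failure
# status, and errexit would abort the script at that line.
```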

------
rshm
'set -o nounset' is a must-have. I just got bitten by this script from the
Samsung Printer Settings Utility: sudo ./uninstall wiped /opt

    
    
      DEST_PATH=/opt/$VENDOR/$DEST_DIRNAME
      #remove destination
      VERSION=`cat "$DEST_PATH/bin/.version"`
      if rm -fr "$DEST_PATH"; then
      	echo "INFO: $APP_NAME (ver.$VERSION) has been uninstalled successfully."
      ...
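A sketch of how `set -o nounset` would have stopped this, reusing the variable names from the snippet above (and assuming VENDOR and DEST_DIRNAME are unset in your environment):

```shell
#!/usr/bin/env bash
set -o nounset

# VENDOR and DEST_DIRNAME are never set. Without nounset the path silently
# expands to "/opt//", and "rm -fr /opt//" destroys /opt. With nounset,
# the expansion itself is a fatal error (shown here in a subshell):
if ( DEST_PATH="/opt/${VENDOR}/${DEST_DIRNAME}" ) 2>/dev/null; then
    echo "expansion succeeded: the dangerous rm would have run"
else
    echo "nounset aborted before any rm could run"
fi
```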

~~~
laurent123456
Wow, nounset or not, doing a `rm -rf` on a variable without any check is quite
irresponsible. Especially if they expect the script to be run as sudo.

------
narsil
Another useful capability I use to do cleanup is `trap`.

    
    
        function cleanup {
            ...
        }
        trap cleanup EXIT
    

See more here: [http://linux.die.net/Bash-Beginners-Guide/sect_12_02.html](http://linux.die.net/Bash-Beginners-Guide/sect_12_02.html)
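A slightly more concrete sketch; the temp-file cleanup is a hypothetical example of what such a handler might do:

```shell
#!/usr/bin/env bash
set -o errexit -o nounset

# Create a scratch file; cleanup must run whether we exit normally,
# fail under errexit, or are interrupted.
scratch=$(mktemp)

cleanup() {
    rm -f "$scratch"
}
trap cleanup EXIT

echo "working with $scratch"
# ... rest of the script; $scratch is removed on any exit path.
```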

~~~
eik3_de
to clean the command prompt line after CTRL-C:

    
    
        trap "{ echo; exit 1; }" INT

~~~
e12e
I don't know; if you're fearing a ctrl-c in the middle of one of those
newfangled moving, colourized progress bars (hello, npm), "reset" might be
more appropriate?

~~~
eik3_de
sure, if you do coloring/bold/underline/inverse in the script, reset would be
appropriate. but I wouldn't do it by default, because it's quite slow ('time
reset' takes 1s here)

~~~
shabble
"tput reset" has much the same effect (I think; there are some cases where I've
noticed it not fully restoring things) without the delay.

------
kirubakaran
Use:

    
    
      #!/usr/bin/env bash
    

Instead of:

    
    
      #!/bin/bash
    

This makes the script more portable as you don't rely on bash (or any
executable) to be in /bin.

[http://en.wikipedia.org/wiki/Shebang_(Unix)#Portability](http://en.wikipedia.org/wiki/Shebang_\(Unix\)#Portability)

~~~
sgentle
Are there any situations where you wouldn't be able to find bash in /bin/bash?
I haven't ever seen such a system, but I'd like to know if it's something I
might run into...

The other thing is, if you're targeting obscure systems, wouldn't it be just
as likely that there wouldn't be a /usr/bin/env, /usr/ might not be mounted,
or that bash might not be installed at all?

I suppose what I'm asking is: for practical purposes, is the lowest common
denominator /usr/bin/env or is it /bin/sh?

~~~
thwarted
The lowest common denominator is /bin/sh. The POSIX shell can get the job
done, but bash has a lot of niceties, and that lowest common denominator is
pretty low. You're still going to be dealing with annoying differences between
the platforms anyway.

I don't use env on my shebang lines specifically because it's then PATH
ordering dependent (there used to be a thing where /usr/local/bin and GNU
tools were _last_ in root's path, but _first_ in non-root users' path). I'm
more confident (perhaps incorrectly) that /bin/bash or /usr/local/bin/bash is
what I expect it is vs the first thing found in PATH that is named bash is
what I think it is (however, this applies moreso to coreutils-like things,
that have different SysV vs BSD semantics/options, vs a shell such as bash,
which is known everywhere to be GNU bash). Some tools, like ssh, can propagate
environment settings based on local, remote or config file settings, and I'd
rather not be surprised.

This used to be a bigger deal on systems that put reasonable (where
"reasonable" == "what you're used to") tools in /usr/ucb or /opt/gnu, rather
than system paths. If you're going to create something that's intended to be
"portable", you've got bigger fish to fry than if and where bash is available,
and it's wise to abstract system differences to different scripts
(run.linux.sh, run.freebsd.sh, run.osx.sh, run.aix.sh, etc) than try to keep
everything in one massive script anyway.

~~~
dmytrish
Yes, using PATH for determining bash location is quite vulnerable to all kinds
of security exploits.

------
earless1
I'm glad this included the "Signs you should not be using a bash script"
section. Bash is a very good solution for many cases, but it becomes downright
unruly when dealing with a lot of string manipulation and more complex
objects.

~~~
chubot

        your script is longer than a few hundred lines of code
        you need data structures beyond simple arrays
        you have a hard time working around quoting issues
        you do a lot of string manipulation
        you do not have much need for invoking other programs or pipe-lining them
        you worry about performance
    

It's not a sign you shouldn't be using bash. It's a sign you shouldn't be
using ONLY bash.

People who insist on rewriting an ENTIRE program in Python, Perl, or Ruby fail
to understand the Unix philosophy (this is a real misunderstanding I've
encountered in my work, not a straw man).

You can just write the complex part in another language, but keep the rest in
bash. In other words, main() stays in bash. Python et al. are used as just
another process. bash is a language for coordinating processes.

You don't want a 2000 line bash script. But it can be worse to have a 5,000
line Python script that shells out to tons of other utilities, or awkwardly
and verbosely reimplements 'xargs -P' or 'sed -i'. Often you can do the same
job with 500 lines of bash and 500 lines of Python (or C), and that is the
ideal solution.

Python and bash are pretty radically different languages, and they are not
interchangeable (the advice to "just rewrite it in Python" seems to imply they
are). You can use each one for the things it is good at.
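A toy sketch of that split, assuming python3 is on PATH (the inline Python is a hypothetical stand-in for the "complex part"):

```shell
#!/usr/bin/env bash
set -o errexit -o pipefail

# The complex part lives in another language, invoked as just another process.
complex_part() {
    python3 -c 'import json, sys; print(json.dumps({"args": sys.argv[1:]}))' "$@"
}

main() {
    # bash coordinates processes; Python does the fiddly data work.
    complex_part one two | tr 'a-z' 'A-Z'
}

main "$@"
```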

~~~
nikatwork
Modularity is nice, but it's generally easier to debug a program in a single
context.

I work with some deployment systems that chain together small scripts in a
bunch of different languages. They are a nightmare to troubleshoot.

I'd much rather the spaghetti was all on one plate than follow it from table
to table...

~~~
chubot
If the tools are coherently designed, it should be easier to debug, because
you can just use -x to log the commands being run and paste them in to see
what went wrong. It's debugging with a REPL.

The biggest mistake I see people making is to hard code paths, ports, user
names, configuration, etc. inside Python/Perl scripts. All that stuff belongs
in shell. All the logic and control flow goes in the scripts. If it's factored
this way, then you have a very natural way of testing the scripts with test
parameters (i.e. not production parameters).

I don't doubt that there are many multi-language shell scripts that are
spaghetti. Factoring into processes is definitely a skill that takes thought
and effort, and I don't think anyone really teaches it or writes about it. The
only books I can think of are The Art of Unix Programming by ESR and The
Unix Programming Environment.

But it's definitely made me way more productive once I started thinking like
this. People talk about the "Unix philosophy" for a reason. It's a real
thing :) It's not an accident that Unix is wildly popular and has
unprecedented longevity.

------
ygra
I love the very last list »Signs that you should not be using a bash script«.
That should be a required part of every language/tool introduction/tutorial.

So very often people lose track of when to use what tools. (Although
admittedly, so very often people are forced into some tools by external
constraints.)

------
sleepydog
I found this web page, from the author of musl libc, very insightful:

[http://www.etalabs.net/sh_tricks.html](http://www.etalabs.net/sh_tricks.html)

Shell scripts are great, I use and write them every day (and quite advanced
ones, too). But it's very hard to make a shell script robust.

Unfortunately it's hard to find a replacement that is stable and installed
everywhere. Perl is pretty close. And python too, if you are careful about
making your script compatible with all the different versions.

------
oneandoneis2
A link on HN about improving bash and it wasn't instructions on how to install
zsh. I'm pleasantly surprised :)

~~~
weaksauce
I like bash and all, but a well tuned zsh is amazing. I was hesitant for a
while but it really improved my workflow in the shell.

~~~
TylerE
Yep!

And thanks to oh-my-zsh ([https://github.com/robbyrussell/oh-my-zsh](https://github.com/robbyrussell/oh-my-zsh))
it really requires very little fiddling, maybe 10 minutes worth.

~~~
themoonbus
I'm a recent zsh + oh-my-zsh convert, and I have a severe case of "why didn't
I do this before"s.

Here is a great gallery of oh-my-zsh themes:
[http://zshthem.es/all/](http://zshthem.es/all/)

~~~
TylerE
Yea, it's pretty awesome.

What's really neat is you can even use it on Windows.

My personal setup is Cygwin Zsh inside Console2 (tabbed cmd.exe shell) proxied
via ansicon.exe (makes cmd.exe ansi color escape aware). It's not quite as
nice as say Konsole on linux or iTerm2 on a mac, but it's the best I've found
for windows.

Setup guide:

Setup console2 like this
[http://www.hanselman.com/blog/Console2ABetterWindowsCommandP...](http://www.hanselman.com/blog/Console2ABetterWindowsCommandPrompt.aspx)

This post sets up ansicon: [http://www.kevwebdev.com/blog/in-search-of-a-better-windows-console-using-ansicon-console2-and-git-bash.html](http://www.kevwebdev.com/blog/in-search-of-a-better-windows-console-using-ansicon-console2-and-git-bash.html)

~~~
NateEag
I'm stuck on Windows at work, and landed on ConEmu as my terminal emulator
(running bash, though - I've never made the jump to zsh):

[https://code.google.com/p/conemu-maximus5/](https://code.google.com/p/conemu-maximus5/)

I haven't tried Console2, so I can't provide a comparison, but here's Scott
Hanselman deciding to switch:
[http://www.hanselman.com/blog/ConEmuTheWindowsTerminalConsolePromptWeveBeenWaitingFor.aspx](http://www.hanselman.com/blog/ConEmuTheWindowsTerminalConsolePromptWeveBeenWaitingFor.aspx)

------
dingaling

      complete -r
    

disables Bash 'smart tab completion', which in theory is a great idea (use
tab to complete arguments, or list only the files applicable to the program)
but which never seems to work properly for me.

Disabling it saves a lot of frustrated tab-banging.

~~~
dfc
Something is terribly wrong with your setup if command line completion is not
working.

~~~
aidenn0
A lot of installs come with over-complete smart completion configurations that
make <tab> take several seconds (or even 10s of seconds) to complete in fairly
common situations.

~~~
dfc
Please give me one or two examples of "fairly common situations" where it
takes >=20 seconds for bash to respond to the tab.

~~~
claudius
Filename completion on remote systems using ssh/scp. It is a fairly common
situation for me, although I wouldn’t say it takes 20s. Maybe 2-5s?

~~~
shabble
Setting up SSH multiplexing with ControlPersist[1] can help quite a bit here,
since after the first connection you don't have to go through the
init/connection phase for subsequent completions.

You can even preemptively fire up a master connection to commonly accessed
hosts to avoid the initial delay.

[1]
[https://en.wikibooks.org/wiki/OpenSSH/Cookbook/Multiplexing](https://en.wikibooks.org/wiki/OpenSSH/Cookbook/Multiplexing)
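For reference, a minimal sketch of what that can look like in ~/.ssh/config (the host pattern and timings are illustrative):

```
Host example-host
    ControlMaster auto
    ControlPath ~/.ssh/cm-%r@%h:%p
    ControlPersist 10m
```

With this in place, the first ssh/scp to example-host opens a master connection that later tab completions can reuse.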

~~~
claudius
Ah, that’s interesting! I assumed that something like that was going on, as
subsequent completions take much shorter (in the same command), but I did not
know about preemptively connecting to common hosts.

Thank you very much, I’ll try to play around with it :)

------
mateuszf
Nice here document feature I have found recently is heredoc with pipe, e.g.

    
    
      cat <<REQUEST_BODY |      
      {
        "from" : 0,
        "size" : 40
      }
      REQUEST_BODY
      curl http://localhost -d @-
    

It lets you pass the heredoc text to the standard input of the next command.

~~~
lotheac
This is a prime example of a useless use of cat. A heredoc already means "pass
this as stdin"; there's no need to pipe it. Your example without cat:

    
    
        curl http://localhost -d @- <<REQUEST_BODY
        {
          "from" : 0,
          "size" : 40
        }
        REQUEST_BODY

~~~
pdkl95
I like modularity:

    
    
        request_body() {
            cat <<REQUEST_BODY
            {
                "from" : 0,
                "size" : 40
            }
        REQUEST_BODY
        }
    
        get_url() {
            curl http://localhost -d @-
        }
    
        request_body | get_url
    

I find it helps readability when you come back to it a year later. Of course,
it's also easy to parameterize the body, if needed.

/readability sometimes trumps YAGNI

------
rtpg
Are we going to get a better bash at some point? I've always felt like the only
thing bash scripts are good at describing is I/O redirection. But
conditionals, dealing with variables, pretty much everything else is
frustrating and error-prone.

I use fish as my main shell and it's slightly better, but just testing things
on variables can be a huge mess.

~~~
Goosey
I enjoy using fish as my main shell, but I ran into a "curl one-liner install"
that failed in fish (in my case Homebrew's, at the bottom of
[http://brew.sh/](http://brew.sh/)) and required me to jump into bash. I enjoy
fish so much that I continue to use it, but I am in a state of fear that I will
at some point encounter some failing shell script that leaves my system in a
broken state.

Do you have any recommendations to make fish play better with shell scripts
intended for bash?

~~~
cpenner461
I used fish for about 6 months or so and loved it - with this exception.
Initially I'd just hop into bash to do whatever I needed (as another comment
suggests), but the "last straw" for me was not being able to use some of the
convenience functions that virtualenvwrapper exposes for working with
Python virtualenvs. My solution was to switch to zsh/oh-my-zsh - I'm pretty
happy with it so far, although I do miss a few things from fish (namely the
auto suggest/complete when typing previous commands).

------
cynik_
I'd really recommend using `set -x` or `bash -x script` to sanity check all
the commands and expected output.

See [http://www.tldp.org/LDP/Bash-Beginners-Guide/html/sect_02_03.html](http://www.tldp.org/LDP/Bash-Beginners-Guide/html/sect_02_03.html)

~~~
scott_karana
The article mentioned both -x and -v, didn't it? ;)

------
Karunamon
_Try moving all bash code into functions leaving only global variable
/constant definitions and a call to “main” at the top-level._

One of my main complaints with bash: the file is evaluated in order, so you
can't call a function before the line where it's declared.

This fails:

    
    
        #!/usr/bin/env bash
        bar='bar'
        foofunction
    
        foofunction(){
          echo 'foo'
          echo $bar
        }
    

Basically you have to write your entire script in reverse, and I'm unaware of
a good way to get around it.
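For what it's worth, the usual workaround is the pattern the article suggests: define everything first and put a single call at the bottom. Function bodies are only resolved at call time, so "main" can still come first in reading order:

```shell
#!/usr/bin/env bash
set -o errexit

main() {
    foofunction
}

# Functions are looked up when called, not when parsed, so main may
# freely reference functions defined further down the file.
foofunction() {
    echo 'foo'
    echo "$bar"
}

bar='bar'
main "$@"
```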

~~~
nfoz
Well, what would you do instead? It's an interpreted language. When you type
"foofunction" at a prompt, you don't want it to wait around in case you define
that to have meaning later.

You probably want the interpreter to be smart: "Am I loading a script from a
file, or am I receiving instructions interactively on the command-line?" But
now there's two modes of execution, and code in a bash script won't work if
you type it into the prompt yourself. That's a bit uncomfortable.

------
zx2c4
If you'd like to see a decently written piece of bash that incorporates many
of these suggestions, check out pass, the standard unix password manager.

Project page: [http://www.zx2c4.com/projects/password-store/](http://www.zx2c4.com/projects/password-store/)
Source: [http://git.zx2c4.com/password-store/tree/src/password-store.sh](http://git.zx2c4.com/password-store/tree/src/password-store.sh)

------
hyp0
Excellent, _bash the good parts_. More than 15 minutes though.

Googling _bashlint_ and _shlint_ turns up some discussion (bash -n, ksh -n, zsh
-n, some github projects), but I doubt they cover this article's specifics -
though most (all?) of it could be automatically checked. I think _some_ could
be automatically added (e.g. _set -o nounset_) - perhaps a bash subset (or a
CoffeeScript-style language) is possible...

~~~
curiousbiped
Try out shellcheck - there's an online checker at
[http://www.shellcheck.net/](http://www.shellcheck.net/), and if you like it,
the source for it is on github at
[https://github.com/koalaman/shellcheck](https://github.com/koalaman/shellcheck).

------
knyt
The author uses ${} a lot more than I see in most code. Is it helpful to
always use the ${var} syntax instead of simply writing $var?

I can see universal application of ${} being advantageous in avoiding
accidental "$foo_bar where you meant ${foo}_bar" situations, and ${} makes it
clearer that you're referencing a variable. The only cost would seem to be
more typing.
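A quick illustration of the failure mode described above (variable names are made up):

```shell
#!/usr/bin/env bash
foo='backup'

# "$foo_bar" is parsed as one variable named foo_bar (unset here, so empty);
# "${foo}_bar" is the variable foo followed by the literal string _bar.
echo "without braces: '$foo_bar'"
echo "with braces:    '${foo}_bar'"
```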

------
voltagex_
I also like [http://google-styleguide.googlecode.com/svn/trunk/shell.xml](http://google-styleguide.googlecode.com/svn/trunk/shell.xml)
but of course some things that work well for Google might not work for you.

------
njharman
>> This will take care of two very common errors:
>> Referencing undefined variables (which default to "")
>> Ignoring failing commands

Better is subjective... About half my scripts depend on those features - for
default arguments, and to fail early.

~~~
skywhopper
He's advocating failing early with "set -e". And as for the other, you can
allow overrides with syntax like:

    
    
        COULD_BE_SET=${COULD_BE_SET:-altvalue}
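
A small sketch of how the `${VAR:-default}` form interacts with `set -o nounset` (the variable name and default are illustrative):

```shell
#!/usr/bin/env bash
set -o nounset

# ${VAR:-default} is safe under nounset even when VAR is unset:
# an environment override wins, otherwise we fall back to the default.
COULD_BE_SET=${COULD_BE_SET:-altvalue}
echo "$COULD_BE_SET"
```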

------
dsfadadsffd
There is no need for long set flags, e.g. use

    
    
      set -e
    

and not:

    
    
      set -o errexit
    

etc.

~~~
nfm
They are functionally equivalent, but the longhand versions are clearly more
readable/greppable/google-able. The longhand versions have the exact same
benefits as calling a variable eg. `target_file` instead of `a`.

~~~
stormbrew
Usually true but not always. Calling a for loop iterator "index" instead of
"i" just adds unnecessary noise, for example.

I think "set -e" in bash should become common on the same level, it's pretty
rare that you really want a script to continue after an unguarded error
return.

~~~
twic
I set both those options in every script i write. So i do it like this:

    
    
      #! /bin/bash -eu
    

Because i do it in _every_ script, readability and greppability are not
important to me; i just need to apply the flags and get on with the script.
Taking up two whole lines for them just adds noise.

If i was more selective in my use of those flags, then i would agree that the
long forms were preferable, for the reasons given.

~~~
mturmon
One drawback with this is that if you, or someone, does

    
    
      % bash script.sh
    

to run your script, then the shebang line will never be seen, and your script
will run with -e off. If the "set -e" is explicitly given, this won't happen.

As you can guess, I've done this by mistake. One case is after transferring or
unarchiving files, where execute flags get turned off by mistake. Another is
using utilities, like job schedulers, that are inconsistent about whether they
run the script as an executable or via a shell interpreter.

------
dllthomas
One thing I've liked is throwing ${PIPESTATUS[*]} at the front of my PS1.
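For reference, a sketch of what PIPESTATUS reports and how it might sit in PS1 (the prompt string is illustrative):

```shell
#!/usr/bin/env bash
# PIPESTATUS holds the exit status of every command in the most recent
# pipeline, so a prompt showing it reveals failures that the pipeline's
# overall status (the last command's) would hide.
false | true
echo "pipeline statuses: ${PIPESTATUS[*]}"   # -> "1 0"

# In ~/.bashrc (illustrative) -- single quotes defer expansion to prompt time:
PS1='[${PIPESTATUS[*]}] \$ '
```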

------
celebril
Or you can just use Zsh, which is superior in every way to Bash. ;)

~~~
vacri
Except existing deployment spread :)

