
Use the unofficial Bash strict mode - redsymbol
http://redsymbol.net/articles/unofficial-bash-strict-mode/
======
embolalia
Better than set -e is trapping ERR. You can set up a trap that prints out the
command that failed, and the line it's on, rather than just dying quietly like
set -e does. That's much easier to debug than trying to work forward from the
last command that produced output. For bonus points, when you have a bunch of
scripts together, put the trap code in its own file and source it from all the
other scripts (rather than duplicating code).

~~~
_pmf_
It's also amazing that you can trap SIGTERM to perform clean-up tasks when the
user Ctrl-Cs a script.

~~~
bdonlan
That would be SIGINT - SIGTERM is sent when you kill the process without
specifying a specific signal to send, or when the system is shutting down.
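A common sketch of that cleanup pattern, covering normal exit, Ctrl-C (SIGINT), and SIGTERM (the scratch directory is illustrative):

```shell
#!/usr/bin/env bash
set -euo pipefail

workdir=$(mktemp -d)

cleanup() {
    rm -rf "$workdir"
}

# The EXIT trap does the actual work; the INT/TERM traps just turn the
# signal into an exit, so the EXIT trap runs exactly once either way.
trap cleanup EXIT
trap 'exit 130' INT
trap 'exit 143' TERM

echo "working in $workdir"
```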

------
daveloyall

        The set -e option instructs bash to immediately exit if
        any command has a non-zero exit status. You wouldn't
        want to set this for your command-line shell,
    

...but, it's a great addition to your buddy's `.bashrc`. For maximum
effectiveness, be physically present, perhaps with a camera. :)

~~~
redsymbol
Oooh, this is my new favorite. Previously it was "echo exit >> ~/.bashrc" ;)

~~~
jasonjayr
Place this entire line @ the bottom of your target's .bashrc

    
    
        echo 'sleep .1' >> ~/.bashrc
    

By week 2 my poor friend had conditioned himself to work within 1 session
since spawning new sessions became unbearably slow :)

------
pflanze
The disappointment with set -e is that it does not work everywhere:

- With an unset 'hello' variable, try:

    
    
      set -eu
      echo "`echo $hello` world"
      echo "huh still going?!"
      foo="`echo $hello` world"
      echo "not reaching this as expected"
    

Fun, now you have to manually write your whole program in what's basically SSA
(static single assignment) form.

- It's deactivated in if/then context, which at first makes sense, but then
when you try to force it back on explicitly, like:

    
    
      set -eu
      if (set -eu; false; true); then
          echo "huh why still true??"
      else
          echo false
      fi
    

your declaration is just ignored, and you begin to wonder how many other
places set -e really misses.

------
quotemstr
I don't think that setting IFS this way is a good idea. If your variables do
happen to contain tabs and newlines, you still get unwanted word splitting.
It's much better to just use the double-quoted expansion forms that always
expand each element to exactly one word: "${foo[@]}"

~~~
staticshock
Agreed. The IFS trick is shortly followed by an explanation, which includes a
bunch of anti-patterns, such as bare ${foo[@]} and $@ usage.

Rule of thumb: quote all your variables.

[http://mywiki.wooledge.org/Quotes](http://mywiki.wooledge.org/Quotes)

~~~
redsymbol
That's the common answer, and while I practice it myself, I have to disagree
with it as guidance. Maybe you and I are fastidious enough to always remember
to quote all our variables, but many are not - I know I was writing shell
scripts for a couple of years before I realized its importance, and it's
_very_ common that even experienced engineers don't know, or don't care, to do
it. If someone writing or editing the code forgets to quote a variable, that
allows subtle bugs to sneak in.

It's unfortunate that the semantics of bash don't have variable references
behave as if they were quoted by default. I really wish they did.

~~~
dap
While setting IFS=$'\n\t' may make problems less likely when strings contain
spaces, it's still not correct. File names (and other strings) can contain
tabs and newlines, too. That's relatively rare, but the quoting approach
always works.
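This is easy to see with bash's $'...' quoting, which produces a real tab character (as opposed to the literal backslash-t you get inside ordinary double quotes):

```shell
#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'

# A single array element containing a real tab character.
items=( $'a\tb' )

count=0
for item in ${items[@]}; do        # unquoted: the strict-mode IFS splits on the tab
    count=$((count + 1))
done
echo "unquoted: $count words"      # 2

count=0
for item in "${items[@]}"; do      # quoted: stays one element
    count=$((count + 1))
done
echo "quoted: $count words"        # 1
```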

~~~
redsymbol
Hm, is that true? When I run this script:

    
    
      #!/bin/bash
      items=(
          'a'
          'b c'
          "d\te"
          "f\ng"
      )
      
      echo "Unquoted:"
      for item in ${items[@]}; do
          echo -e ".  $item"
      done
      
      echo "Quoted:"
      for item in "${items[@]}"; do
          echo -e ".  $item"
      done
      
      set -euo pipefail
      IFS=$'\n\t'
      echo "Unquoted strict mode:"
      for item in ${items[@]}; do
          echo -e ".  $item"
      done
    

... I get this output:

    
    
      Unquoted:
      .  a
      .  b
      .  c
      .  d    e
      .  f
      g
      Quoted:
      .  a
      .  b c
      .  d    e
      .  f
      g
      Unquoted strict mode:
      .  a
      .  b c
      .  d    e
      .  f
      g
    

Note the output for "Quoted" and "Unquoted strict mode" are identical.

(GNU bash, version 4.2.37(1)-release (x86_64-pc-linux-gnu))

------
spbnick
Aside from the comments above, I would add "shopt -s shift_verbose" which
enables "overshifting" detection.

E.g. when a function receives fewer arguments than necessary and you retrieve
them like this:

    
    
        function f()
        {
            declare -r x="$1";  shift
            declare -r y="$1";  shift
            declare -r z="$1";  shift
        }
    

with fewer than three arguments passed and shift_verbose on, you will get an
error message the moment you shift past the last argument, and with "set -e"
in addition, execution will be aborted.

See [https://www.gnu.org/software/bash/manual/html_node/The-Shopt...](https://www.gnu.org/software/bash/manual/html_node/The-Shopt-Builtin.html)

Also, "set -o noclobber" might be useful. If you try to redirect to an
existing file with ">", it will fail. If you explicitly want to overwrite the
file without triggering the error, use ">|".

See [https://www.gnu.org/software/bash/manual/html_node/Redirecti...](https://www.gnu.org/software/bash/manual/html_node/Redirections.html#Redirecting-Output)
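A quick sketch of the noclobber behaviour (the temp file is just for illustration):

```shell
#!/usr/bin/env bash
set -o noclobber

tmpfile=$(mktemp)              # mktemp creates the file, so it already exists

# Plain > fails because the file exists and noclobber is on.
if ! echo "first" > "$tmpfile" 2>/dev/null; then
    echo "plain > refused to overwrite"
fi

echo "second" >| "$tmpfile"    # >| overrides noclobber
cat "$tmpfile"                 # second

rm -f "$tmpfile"
```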

------
pjungwir
When using `set -eu`, how do you check the number of arguments?

    
    
        if [ -z "$1" ]; then
          usage
          exit 1
        fi
    

If I'm using `set -eu`, that dies on the `$1` with a nasty error message,
rather than printing my usage message. I've resorted to moving `set -eu` to
after these kind of checks, but that makes me uneasy.

~~~
michaelmior
Use the `$#` variable which contains the number of arguments.

------
bumbledraven
Good info for writing bash scripts that are large enough to need debugging,
but why would one do that in the first place? Python or Perl would be better
choices at that point.

~~~
SEJeff
When you are fundamentally running shell commands, using python/perl really
doesn't make a lot of sense. Use the best tool for the job. Note that for my
dayjob I write python almost fulltime, but if I'm almost exclusively running
shell commands, I'll write a shell script. Just because you can do something
one way doesn't mean you should.

I'm of the opinion that if you need arrays and associative arrays, bash is the
wrong tool for the job. A recent bash has both of those, but they just seem
wrong in such a clunky language with awful scoping.

~~~
eropple
I write a lot of bash scripts, but lately I've been doing most of my shell
interop in Ruby instead. Backticks are pretty good, and I can munge things in
easier ways than in bash or Python. And with `ruby -n`, the script is invoked
once per line of input, which is perfect for processing piped content.

~~~
SEJeff
Use the best tool for the job. If you don't know bash/bourne shell well, use
ruby/perl/python/etc. I started out as a sysadmin years ago and know bourne
shell/bash very very well. It is all about what works best for the problem.

~~~
eropple
Sure. And don't get me wrong, I write a lot of bash scripts. =) I think any
logic beyond string replacement is probably edging out of where it's a good
idea, if only because other people then have to read my stuff later, but it's
totally fine for that. I'm saying more that I think Ruby (or Perl) makes more
sense than Python, given the tools they provide.

------
nodesocket
The problem with the `-e` flag is when you want to output custom errors, say
JSON. I prefer to have bash inspect status codes:

    
    
        if [ $? != 0 ]; then
            echo "{\"error\": \"Failed to connect to the database.\"}" >&2;
            exit 1;
        fi

~~~
matt_kantor
Just do it all in the condition instead of separately:

    
    
        if ! command_that_may_error; then
            echo "{\"error\": \"Failed to connect to the database.\"}" >&2;
            exit 1;
        fi

------
john398053
Regarding setting $IFS:

    
    
      for arg in $@; do
    

A better way to do it is to quote it:

    
    
      for arg in "$@"; do
    

because then arguments containing newlines and tabs survive intact. Bash
automatically expands "$@" into one word per positional parameter, even though
it is a single quoted variable.

~~~
js2
See footnote 2. [http://redsymbol.net/articles/unofficial-bash-strict-mode/#f...](http://redsymbol.net/articles/unofficial-bash-strict-mode/#footnote-2)

~~~
pflanze
I wonder why not set IFS='' then. I haven't used this myself in production,
but quick testing seems to do what I expected: it makes $foo behave like "$foo".
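A quick check of that (note that unquoted $foo still undergoes pathname expansion even with an empty IFS, so it is not a complete substitute for quoting):

```shell
#!/usr/bin/env bash
foo=$'a b\tc\nd'    # spaces, a real tab, and a real newline

IFS=''              # empty IFS disables word splitting entirely
count=0
for word in $foo; do
    count=$((count + 1))
done
echo "words: $count"    # 1 -- the unquoted $foo stayed in one piece
```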

