
Use the Unofficial Bash Strict Mode (Unless You Love Debugging) - gkst
http://redsymbol.net/articles/unofficial-bash-strict-mode/
======
devit
The article is dangerously wrong in its discussion of IFS.

What you should do to avoid the problem of mishandling spaces is use proper
quoting (for i in "$@"; do ...), not changing IFS; setting IFS to \n\t will
still break embedded tabs and newlines.

In general, in bash scripts any use of $ should always be between double
quotes unless you have a reason to do otherwise.
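To make the difference concrete, here's a small sketch (the sample arguments are made up):

```shell
# Two sample arguments: one containing a space, one containing a tab
set -- "one two" $'tab\tsep'

# Wrong: unquoted $@ gets word-split on IFS (space, tab, newline)
for arg in $@; do printf '<%s>\n' "$arg"; done
# prints <one>, <two>, <tab>, <sep> on separate lines

# Right: quoted "$@" preserves each argument intact,
# no matter what IFS is set to
for arg in "$@"; do printf '<%s>\n' "$arg"; done
# prints <one two> and <tab	sep>
```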

~~~
gdavisson
Agreed. In addition to still having trouble with tabs and newlines, setting
IFS still leaves the other big problem with unquoted variables: unexpected
expansion of wildcards. The shell considers any unquoted string that contains
* , ?, or [ to be a glob expression, and will replace it with a list of
matching files. This can cause some really strange bugs.

Also, an unquoted variable that happens to be null will essentially vanish
from the argument list of any command it's used with, which can cause another
class of weird bugs. Consider the shell statement:

if [ -n $var ]; then

... which looks like it should execute the condition if $var is nonblank, but
in fact will execute it even if $var _is_ blank (the reason is complex, I'll
leave it as a puzzle for the reader).

Setting IFS is a crutch that only partly solves the problem; putting double-
quotes around variable references fully solves it.
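For what it's worth, a quick sketch of the glob pitfall (the scratch directory and filenames are made up):

```shell
# Work in a scratch directory so stray files don't affect the result
cd "$(mktemp -d)"
touch notes.txt todo.txt

pattern="*.txt"
echo $pattern      # unquoted: the shell globs it → notes.txt todo.txt
echo "$pattern"    # quoted: prints the literal string *.txt
```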

~~~
stantona
The test command has certain rules depending on the number of arguments. The
most pertinent rule is: For one argument, the expression is true if, and only
if, the argument is not null.

In this case

    
    
        [ -n $var ] 

is the same as

    
    
        test -n $var
    

$var is not quoted, so when this command is run it is expanded and word-split;
if $var is empty, the expansion vanishes entirely, leaving `test -n` with the
single argument -n. That argument is non-null, so under the one-argument rule
above the test succeeds.

Therefore, always quote your variables.
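A minimal demonstration of the one-argument rule in action:

```shell
var=""

# Unquoted: $var vanishes, leaving `test -n` with the single
# argument -n, which is non-null, so the test is (surprisingly) true
if [ -n $var ]; then echo "unquoted: true"; fi

# Quoted: test -n "" has two arguments and correctly evaluates to false
if [ -n "$var" ]; then echo "quoted: true"; else echo "quoted: false"; fi
```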

------
ishtu
It's also a good idea to check your complex scripts with the awesome
shellcheck tool before running them. [http://www.shellcheck.net/](http://www.shellcheck.net/)

~~~
sshykes
Thanks for this!

[https://github.com/koalaman/shellcheck](https://github.com/koalaman/shellcheck)

and `brew install shellcheck`

~~~
Redoubts
To be honest, the build time required for this (and usually GHC as well) gets
really annoying. Especially for simple updates.

~~~
chei0aiV
Does homebrew not support precompiled binaries?

~~~
bsandert
Yes, but for use in a CI it's a real drag

~~~
chei0aiV
Why is that? Are the download servers slow?

------
Hello71
[http://mywiki.wooledge.org/BashFAQ/105](http://mywiki.wooledge.org/BashFAQ/105):

> Why doesn't set -e (or set -o errexit, or trap ERR) do what I expected?

> set -e was an attempt to add "automatic error detection" to the shell. Its
> goal was to cause the shell to abort any time an error occurred, so you
> don't have to put || exit 1 after each important command. That goal is non-
> trivial, because many commands intentionally return non-zero.

[http://mywiki.wooledge.org/BashFAQ/112](http://mywiki.wooledge.org/BashFAQ/112):

> What are the advantages and disadvantages of using set -u (or set -o
> nounset)?

> Bash (like all other Bourne shell derivatives) has a feature activated by
> the command set -u (or set -o nounset). When this feature is in effect, any
> command which attempts to expand an unset variable will cause a fatal error
> (the shell immediately exits, unless it is interactive).

pipefail is not quite as bad, but is nevertheless incompatible with most other
shells.

~~~
gdavisson
The example given in the article:

grep some-string /some/file | sort

is a good example of why -e and pipefail are dangerous. grep will return an
error status if it gets an error (e.g. file not found) _or_ if it simply fails
to find any matches. With -e and pipefail, this command will terminate the
script if there happen to be no matches, so you have to use something like ||
true at the end... which completely breaks the exit-on-error behavior that was
the point of the exercise.

Solution: do proper error checking.
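One way to do that checking (a sketch; `some-file` is a stand-in): grep exits 0 on a match, 1 on no match, and greater than 1 on a real error, so you can capture the status and branch instead of tacking on a blanket `|| true`:

```shell
set -euo pipefail

# The || guard keeps set -e from killing us; pipefail makes the
# pipeline's status reflect grep's failure, not just sort's success
status=0
matches=$(grep some-string some-file 2>/dev/null | sort) || status=$?

case "$status" in
    0) printf '%s\n' "$matches" ;;
    1) echo "no matches found" ;;
    *) echo "grep failed with status $status" >&2 ;;
esac
```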

------
makecheck
To be honest, I think traditional shells are now only good for environments
where you know you aren’t doing anything too weird and all the most _likely_
inputs work as expected without a lot of effort. This is spending time wisely;
just because anything except a NUL byte (and the slash) can technically be
part of a Unix filename doesn't mean that I want to invest hours or days
making damn sure everything works for pathological cases.

If I actually _do_ want to guard against every case imaginable, I immediately
switch to Python or some other language that at least knows how to quote
things unambiguously without a lot of effort.

~~~
mbrock
Shell is a lot better than the other languages I know at many tasks involving
I/O redirection, spawning programs, etc. It's kind of arcane, but so are the
APIs for doing that stuff in other scripting languages. I'm eagerly awaiting
some new contender in the system scripting language arena though.

------
1amzave
Fails to mention what is in my opinion _the_ most devious, subtle potential
pitfall with `set -e`: assigning (or even just a bare evaluation of) an
arithmetic zero. `foo=0` won't do anything surprising, but `let foo=0` will
return 1, and thus abort your script if you're not careful.
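A minimal demonstration of the `let` pitfall (wrapped in `bash -e -c` so the abort is contained in a child shell):

```shell
bash -e -c '
    foo=0        # plain assignment: exit status 0, nothing happens
    echo "plain assignment survived"
    let foo=0    # the expression evaluates to 0, so let returns 1
    echo "never reached"
' || echo "child aborted with status $?"
# prints "plain assignment survived", then "child aborted with status 1"
```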

Also, as an alternative to the proposed `set +e; ...; set -e` wrapper for
retrieving the exit status of something expected to exit non-zero (generally
cleaner in my opinion, if slightly "clever"):

    
    
        retval=0
        count=$(grep -c some-string some-file) || retval=$?

------
lisivka
I wrote a library for shell scripts with the design goal of working properly
in strict mode: [https://github.com/vlisivka/bash-modules](https://github.com/vlisivka/bash-modules)

------
alwaysdownvoted
One solution is not to use Bash. There are simpler shells that are equally
POSIX-compliant, or more so.

------
mhd
I still don't get why bash (or zsh) don't try to integrate more Korn shell (88
& 93) scripting features. But there the focus seems more on more colorful
prompts and autocompletion handholding…

And even despite more free licenses (AFAIR, IANAL), you can't depend on actual
Korn shells being available on Unices. At least the dependent app situation
has been getting a lot better, mostly by the death of workstations and their
proprietary OSs (try depending on almost any grep/awk/sed option/switch when
it has to run on Solaris/AIX/HP-UX). Although "all the world's a GNU/Linux"
seems the new plague upon our lands here…

So after all these years, I'd say we're still in pretty much the same
situation that birthed Perl. Which still would be my preferred choice if I'd
actually have to distribute scripts and we're not talking about my own
private, context-specific shortcuts, scripts and functions.

~~~
ta0967
which particular ksh features do you miss in zsh?

------
mchahn
I use set -e or not based on the needs of the script. Many times I want the
script to continue past errors, and sometimes I don't. Sometimes I don't set
it until partway down the script, so the top isn't strict. I wouldn't want to
set it on every script.

~~~
skuhn
I've encountered a fair number of people who blindly set -eu on every script
as a matter of course, as suggested by the author here. While this article
goes through a bunch of the pitfalls, in my experience people often fail to
account for the many (sometimes unintuitive) ways this can cause an abrupt
exit. Sometimes stopping halfway through is just as bad as continuing under
false assumptions, particularly if the script is so simplistic that it isn't
obvious to the user that it exited midway through.

I almost never use -e, and as a result I have to stay vigilant and test return
codes all over the place. I prefer that to the kludge of forcing everything to
return zero and I think it produces overall better results. Ultimately you
want to handle an error, not just abort, and -e doesn't do much to promote
handling properly.

------
teddyh
Instead of “|| true”, I prefer “|| :” – it’s shorter, and : is always a built-
in, even in bare-bones shells where “true” is an external command.

Since “|| :” is always written at the end of lines, it should be short and
visually unobtrusive.

~~~
NegativeLatency
':' isn't valid as a command in the fish shell (unless you make an executable
named ':' and put it in your path)

    
    
      user@host ~> :
      fish: Unknown command ':'
    

edit: format

~~~
teddyh
The _fish_ shell? Does it claim POSIX /bin/sh compatibility?

[http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3...](http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html#tag_18_16)

------
chei0aiV
[http://www.shellcheck.net/](http://www.shellcheck.net/)

~~~
ubercow
Shellcheck is great when you integrate it with your editor.

I use it all the time when I have to write shell scripts.

------
mhw
A simpler solution is to use the Plan 9 rc shell for scripting. It has more
sensible syntax and doesn't rescan input, so many of the issues raised in the
article just don't occur.

------
plugnburn
What nonsense.

Instead of handling non-zero exit statuses in a correct way, the article
suggests interrupting the script right in the middle, probably leaving some
temporary files and processes hanging around that can't be cleaned up if
something goes wrong.

The same BS goes through the entire article.

Has the author actually written anything bigger than echo "Hello world!" in
Bash?

~~~
zurn
In bash you clean up in the exit handler:

    
    
      trap mycleanupfunc EXIT
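A fuller sketch of the pattern (the temp file and function name are made up):

```shell
set -euo pipefail

# cleanup runs on normal exit and on a `set -e` abort;
# add INT/TERM to the trap list if you also want signal coverage
tmpfile=$(mktemp)
cleanup() {
    rm -f "$tmpfile"
}
trap cleanup EXIT

echo "working data" > "$tmpfile"
# ... rest of the script; $tmpfile is removed however we leave
```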

~~~
plugnburn
I've read it, don't worry. So basically we create a problem with the use of -e
flag and then solve it with traps... And what if our logic depends on exit
statuses, for example when we check whether some utility or a file is present
in the system? I don't want the script to exit, I want it to go another logic
branch!

P.S. No, temporarily disabling the option is not a solution, it's another
workaround for the problem created out of nothing.

~~~
matt_kantor
Exit traps for cleanup are a good idea in any case. Scripts can be killed by
signals and whatnot as well.

I'm not sure what scenario you're imagining with your other concern. This does
what you would expect:

    
    
        set -o errexit
        if [ -f somefile ]
        then
            echo "File exists."
        else
            echo "File does not exist."
        fi

------
lottin
Or simply start with

    
    
        #!/bin/sh
    

and stick to POSIX-compliant code.

I'm not sure that writing scripts that rely on bash-specific features is such
a great idea.

~~~
the_why_of_y
Exactly how does that improve the error handling in your shell script beyond
"ignore all errors", as discussed by the OP?

I am genuinely curious how you would write a command with a pipe in plain
POSIX /bin/sh such that a non-zero exit status from the program that writes
into the pipe is detected (as can be done in bash with "set -o pipefail" or
"$PIPESTATUS").

------
leni536
Note that these options don't propagate to subshells. So be aware of your
commands between backticks (command substitutions).

~~~
jwilk
They do propagate to subshells. The only exception is that bash (unlike other
shells) clears -e in command substitutions.

~~~
bboreham
They might propagate to subshells. Someone did a lot of checking:

[http://www.in-ulm.de/~mascheck/various/set-e/](http://www.in-ulm.de/~mascheck/various/set-e/)

------
iuyoynp
there's also this insane idea of not scripting in bash. ffs!

