
Safe ways to do things in bash - signa11
https://github.com/anordal/shellharden/blob/master/how_to_do_things_safely_in_bash.md
======
freedomben
I've written a ridiculous amount of shell script in my day, especially when
doing "devops" before we had a term like "devops" to describe it. I've fallen
in almost every pit bash has. With that background, here is my opinion.

1\. This article contains excellent advice and should be starred for later
retrieval.

2\. Having basic scripting skills will make you a _way_ better programmer.
Many times I've done huge refactors and needle-in-hay-stack searches using
only shell commands.

3\. Shell is the universal language.

4\. Bash isn't that bad once you get used to it (seriously. I'll grant you tho
that arrays are still nasty ;-) ).

5\. Bash is not that dangerous if you follow best practices. Don't be lazy!

6\. You will not regret getting really good at shell script. You'll have to
take my word for it now because you don't know what you're missing.

~~~
h1d
I think it's time people start using something better than bash/zsh that is
decades old, like fish or even come up with a more modern shell.

Even by looking at these examples, you can see it's less verbose: no "then"
and "do", you can reference arguments as $argv instead of the cryptic $@, and
the exit status as $status instead of $?, which is easy to confuse with $! and
the like.

[https://blog.codeship.com/lets-talk-about-shell-scripting/](https://blog.codeship.com/lets-talk-about-shell-scripting/)

[https://fishshell.com/docs/current/tutorial.html](https://fishshell.com/docs/current/tutorial.html)

Shell is such an integral part of admins' and programmers' workflows, yet I
find it hard to believe this field has been so slow at improving. Even the
fish site jokingly states "Finally, a command line shell for the 90s",
implying the others are even older.

~~~
zingmars
I think a big issue is that bash is available everywhere, while fish might not
be. There's also the fact that a lot of us have fancy dotfiles for our
work/home computers, and switching to another shell would mean having to
rewrite them in the target shell language.

~~~
h1d
So, you'd rather use something that is decades old because you can't/don't
want to (ask to) install 1 new program on the server and take a weekend to
rebuild your config file?

You can simply import aliases as is and in some cases, you may be able to even
simplify parts of your config.

Personally, it was easy for me as I'm administering the servers, and it was
just a matter of installing fish on every server (some dozens).

~~~
znpy
> So, you'd rather use something that is decades old because you can't/don't
> want to (ask to) install 1 new program on the server and take a weekend to
> rebuild your config file?

Absolutely, yes. Because long story short, bash is TriedAndTrue® technology.
Stuff that works.

It takes a bit of dedication to be mastered at a decent level but it pays off
immensely. It's so ubiquitous it is one of those tools that you can learn
once, use for the rest of your life, and use in a lot of contexts.

It's so widespread that it can bring you very far with very little.

All this being said, as someone who used to write code for a living (and now
works as a System Engineer, using the bash shell everyday) I must say that if
you do not do input verification (according to the language of your choice)
and something goes wrong then it's your fault.

~~~
opinionator1
"works"

More like "better the devil you know" (if you can even say that much).

Bash doesn't scale. Every shop larger than 1 has been burned by bash gotchas.
Use a real scripting language and shell out to the commands and builtins when
necessary.

At my shop bash is strictly disallowed in production environments and we're
all better off for it.

~~~
znpy
> Bash doesn't scale.

Thanks mate, I had a good laugh.

I wouldn't expect bash to scale anyway. That's not what it's meant for. It's
meant for system administration task automation.

On a more serious note ...

On many occasions, though, the performance you get depends on how you tackle
the problem you have. Even using bash and the tools from the unix toolbox, you
can sometimes gain significant improvements in how you manage your data.

Anecdotal: I cannot remember the details, but I remember that by rearranging
the order of sorting, searching and duplicate removal (mainly sort, sort -u,
grep, uniq) I saw a significant speedup.

Anecdotal (2): I cut the execution time of a night-running job from hours to
minutes (tens of minutes, to be honest - but still less than an hour) just by
slicing the size of a problem into smaller parts and by handling each slice in
parallel (the machine had 48 cpus, but the problem was being solved
"sequentially" on one cpu alone). I wrote some 30-50 lines of python, just to
implement parallelism control: the rest of the problem was still handled with
bash script. Partial results were reassembled at the end. Bash has
coprocesses, so I might have handled that in bash as well, but python was more
handy at the time (meh, i just wanted to optimize that problem).

What I am trying to say is that sometimes the "scaling" you get is justified
by the size of the problem, sometimes it's not.
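
The slicing approach in the second anecdote can be sketched in pure shell with
`split` and `xargs -P` (the workload and file names below are made up for
illustration; the real job used 48 CPUs and a small Python driver):

```shell
#!/usr/bin/env bash
# Toy sketch: slice the input, process slices in parallel, reassemble.
cd "$(mktemp -d)"
seq 1 1000 > input.txt
split -n l/4 input.txt slice.     # 4 line-based slices (GNU split)
# one worker per slice, at most 4 running at a time
printf '%s\n' slice.* | xargs -P 4 -I{} sh -c 'rev "$1" > "$1.out"' _ {}
cat slice.*.out > result.txt      # reassemble partial results
wc -l < result.txt                # 1000: nothing lost in the round trip
```

Swap the worker count and the `rev` stand-in for whatever your real job does.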

~~~
znpy
It's worth noting that bash is really a glue language to call other programs.
If you mainly use the tools from the unix toolbox (i'm thinking of grep, for
example) you really get the "scaling" (the performance) of native executable
code.

Again, it really depends on how you handle your data.

Having a number of filters chained via pipes is really efficient, for example,
when compared with looping over an array and executing some python/perl/ruby
one-liners every time.

------
nodesocket
Highly recommend shellcheck[1]. There is a SublimeLinter plugin[2] that
automatically checks your shell scripts as you code them. It generally makes
best practice suggestions including quoting.

[1]
[https://github.com/koalaman/shellcheck](https://github.com/koalaman/shellcheck)

[2] [https://github.com/SublimeLinter/SublimeLinter-shellcheck](https://github.com/SublimeLinter/SublimeLinter-shellcheck)

~~~
aequitas
Not only does it make suggestions, almost all of the 'error codes' have
extensive documentation on why something is wrong and often multiple
alternative solutions for each use case (eg:
[https://github.com/koalaman/shellcheck/wiki/SC2086](https://github.com/koalaman/shellcheck/wiki/SC2086)).
I learned more bash from Shellcheck than all tutorials and references
combined.
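
For anyone who hasn't met SC2086 yet, the bug it flags is easy to reproduce:
an unquoted expansion is split into words (and glob-expanded) before the
command ever sees it.

```shell
#!/usr/bin/env bash
f='my file.txt'
set -- $f     # unquoted: word splitting turns one name into two arguments
echo "$#"     # 2
set -- "$f"   # quoted: the name stays intact
echo "$#"     # 1
```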

~~~
freedomben
Same. I was a bit cocky when I first tried out shellcheck because I had been
doing bash for years. Shellcheck flagged something I'd been using for a while,
and after reading the docs I realized shellcheck's suggestion was a much
cleaner and just-as-safe way to do what I was doing. Really impressive piece
of software.

------
koolba
From the article:

> Should I use curly braces?
    
    
        Bad: some_command $arg1 $arg2 $arg3
        Extra bad (cargo culting unnecessary braces): some_command ${arg1} ${arg2} ${arg3}
        Correct: some_command "${arg1}" "${arg2}" "${arg3}"
        Better: some_command "$arg1" "$arg2" "$arg3"
    

> In the "extra bad" and "correct" examples, braces compete with quotes under
> the limits of tolerable verbosity.

> Shellharden will rewrite all these variants into the "better" form.

I prefer the "${bracey}" form for all variable usage. Yes it's marginally more
verbose but it has the advantage of being consistent, easier on the eyes due
to less overall quoting when part of full string interpolation[1], and cleaner
diffs[2] as converting "${foo}" to "${foo}-bar" only leads to word-diff of
"-bar".

[1]: "${foo} bar baz" v.s. "$foo"" bar baz"

[2]: _You are source controlling your shell scripts right?_

~~~
saagarjha
> You are source controlling your shell scripts right?

My personal heuristic for shell scripts is that if I care enough about them
to put them under source control, they shouldn't be a shell script.

~~~
JauntyHatAngle
My personal experience would disagree, and I feel that is throwing caution to
the wind.

As any software person knows, you really can't tell what's going to happen to
the code/scripts you put out, not committing it to source control is a
dangerous game to play.

If you have a simple shell script sitting on a server doing some basic task,
why would you not have it under source control, where it can be viewed by
future teams and where the changes made to it over time are visible? Seemingly
simple problems can be caused by minor changes, which are very visible if it's
under source control.

Just because it's simple doesn't make it any less important. Complexity is not
a good measure of importance.

Especially when you start trying to implement IAC in legacy areas...

~~~
occams_chainsaw
At that point, you care enough to put it under source control, and it
shouldn't be a shell script. That's the entire point of the comment you
replied to

~~~
JauntyHatAngle
It's that arbitrary distinction that I am responding to. I don't agree that
you should reserve source control for the complex.

A shell script can be the appropriate tool for many important tasks.

~~~
sverhagen
I agree that simpler scripts deserve to be in source control too. I read it to
mean what I also would say, myself: if it's anything but the most trivial of
scripts (so, rather _programs_), they shouldn't be in BASH. Some people in
this discussion are clearly very well versed in BASH. Great. For average
developers, though, it's hard to build (good) programs in BASH, and the rabbit
hole swallows them. Every time.

------
Xcelerate
I always think — when a programming/scripting language requires _this much_
bizarre knowledge just to write basic code that performs basic tasks, perhaps
it is time for that language to be retired.

I really don't understand why bash still exists. I've switched to fish and am
much happier with the change.

~~~
LeoPanthera
Because it is ubiquitous. You can virtually guarantee that bash will be found
on any arbitrary unix-like system.

~~~
h1d
I keep seeing this statement but this absolutely does not apply to everybody.

How often can you just install fish or use other programming language instead
of being forced to use bash?

It's sad that people keep using the default just because they're afraid the
next system they touch might not have it, wasting a tremendous amount of
productivity by not using something better.

I know a guy who used vim with the default config for that reason. Utter
nonsense.

~~~
znpy
No one ever thinks of interoperability between _people_. Bash is known, to a
variable depth, by pretty much every sysadmin (and a big part of being a
sysadmin is being able to write shell scripts).

So I might prefer fish, and whoever came before me might have had a preference
for zsh. So now I have to deal with three different shells: bash (default),
fish (for the scripts I'll be writing from now on) and zsh (for
compatibility). Congrats.

> I know a guy who used vim with default config for that reason. Utterly
> nonsense.

I started using less and less emacs/vi(m) customization and learn more and
more of the defaults for the same reason: whenever I log on a client system I
am instantly proficient with the editor without stupid complaints like "but on
my box is different..." .

Anecdotal: I have seen people lose their editing speed/proficiency because
all of a sudden they were in a clean vim session and had none of their shiny
and colorful plugins.

~~~
h1d
Why can you not modify the editor environment that you use often?

It's hard to see why I'd need to deal with environments where I can do
nothing but keep the defaults every time.

Once you've reached the speed of a setup tuned to your liking, the defaults
feel like walking with your legs strapped.

~~~
znpy
> Why can you not modify the editor environment that you use often?

Because I am not the only one accessing those environments (please note the
plural here).

Currently there are about 13 other sysadmins in my team and we manage clients'
infrastructures among other things (managed services). From time to time
someone from another team accesses those environments (not a sysadmin, but
still familiar with the bash shell). Sometimes the client accesses those
systems (rare; we try to discourage and avoid that).

Can you even imagine what a mess it would be if we all started applying our
own favorite settings?

(edit: fix grammar)

~~~
h1d
How hard is it to create a user for each of the system admins?

It seems it's not a good practice to share a single account as it makes it
hard to tell who did what.

------
jwilk
> POSIX mandates /bin/sh

Nope. On the contrary, it says: _Applications should note that the standard
PATH to the shell cannot be assumed to be either /bin/sh or /usr/bin/sh_

Source:
[http://pubs.opengroup.org/onlinepubs/009695399/utilities/sh....](http://pubs.opengroup.org/onlinepubs/009695399/utilities/sh.html)

A pedantically-compliant shell script should not have a shebang at all.

~~~
Twisol
What's wrong with `#!/usr/bin/env sh`?

~~~
_kst_
I've worked on systems where env was installed as /bin/env and not as
/usr/bin/env . (I think it was SunOS 4.)

For that matter, under Termux on Android it's
/data/data/com.termux/files/usr/bin/env (but termux has a hack to make normal
shebangs work).

~~~
dredmorbius
Termux rewrites scripts via termux-fix-shebang.

~~~
rkeene2
It now also does rewriting on the fly for certain shebang lines, as part of
its C library, so unmodified scripts can run.

~~~
dredmorbius
I didn't know that, thanks.

~~~
rkeene2
No problem. I wrote it because I version control my scripts and didn't want to
fork them for running them on my laptop (Chromebook with Termux).

------
nisa
If you reached a point where you need to require bash and not a posix shell
and need to enforce these rules just use python or lua if you can or some
scheme or whatever else... it's not worth the wasted time hunting bash cruft.

if you are on busybox with ash none of this is helping (except shellcheck
which is great).

~~~
mixedCase
Sometimes you just want some 30-50 lines of piping a few commands and a couple
of conditionals.

Python (the language with the most community traction to replace bash for
scripts) is a royal pain to use for this without libraries that no default
system ships, and even with those it often ends up more verbose than it
should be.

~~~
nisa
> Sometimes you just want some 30-50 lines of piping a few commands and a
> couple of conditionals.

maybe I was not clear - nothing against shell-scripting but doing some weird
dancing like in the article is imho useless, also depending on bash is a
stupid idea imho.

posix sh + shellcheck is all you need. if you can't solve your problem in
posix sh, rethink your code/approach and simplify until it works.

~~~
thisacctforreal
I agree.

I've used this page successfully as reference for portable syntax:

[http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3...](http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html)

Some sections on features I use heavily:

\- Parameter Expansion (specifically :-, %, %%, #, ##)

\- Special Parameters (specifically "$@", $#, $?)

\- set --, this lets you set the $1, $2, etc. variables. I use this with "$@"
for arrays, primarily for building command strings.

Here's a small shell script demonstrating some of them:

    
    
        #!/bin/sh
        
        # err function
        err() { echo "$1" >&2; exit 1; }
        
        # init variables
        unset src
        unset dst
        dry_run=false
        
        # get arguments
        while [ $# -gt 0 ]; do
            case "$1" in
                -d|--dry-run) dry_run=true ;;
                --) shift; break ;;
                -*) err "unknown option: $1" ;;
                *)
                    if [ -z "$src" ]; then src="$1"
                    elif [ -z "$dst" ]; then dst="$1"
                    else err "unexpected argument: $1"
                    fi
                ;;
            esac
            shift
        done
        
        # sanity checks
        ## TODO: print usage
        if [ -z "$src" ]; then err "source not specified"; fi
        if [ ! -d "$src" ]; then err "source does not exist"; fi
        if [ ! -r "$src" ]; then err "cannot read source directory"; fi
        
        if [ -z "$dst" ]; then err "destination not specified"; fi
        
        if ! rsync --version >/dev/null 2>&1; then
            err "missing rsync(1)"
        fi
        
        # build rsync command
        set -- rsync -aq "$src" "$dst"
        
        # log command
        echo "copying $src to $dst"
        echo "    $@"
        if ! $dry_run; then
            if "$@"; then
                echo "success"
            else
                # rsync will have printed an error message
                err "rsync exited with error code $?"
            fi
        else
            echo "dry run; not executing"
        fi
    
    

Also be sure to read the man page for test (the [ command).

------
Aelius
Interesting stuff, but given bash's

\- Relative unportability

\- Poor, noncompliant sh interpretation

\- Poor performance (can be 4x slower than POSIX sh shells like dash for
certain tasks)

I personally think bash is always the wrong choice. Use POSIX sh (or a real
language). POSIX sh is 98% the same thing, there's no good reason to even use
bash over sh in the vast majority of cases. It's just this blight that won't
go away.

Given that bash is mostly sh anyway, most of this writeup applies to sh, too.
AFAIK the only thing bash-specific here is arrays.

~~~
rascul
I stick with bash to avoid implementation differences due to POSIX ambiguity
[0]. I know that all my bash scripts will run on at least bash-4.0 but I have
no idea what shell /bin/sh is going to be for any given system. I haven't
written a single shell script where the performance difference matters, nor do
I expect to. And there are a few nice bashisms besides arrays. I'm not sure
what portability issues you're referring to.

[0]
[https://stackoverflow.com/a/16376043](https://stackoverflow.com/a/16376043)

------
mehrdadn
A little heads-up: "$var" does what you think, but "$(cmd)" likely does not do
what you think:

\- The former just gives you a string whose contents are identical to that of
var.

\- The latter would do similar for the output of cmd, except that it _strips
away the trailing newline_. This is often not an issue, but can be crucially
important in some cases, and can catch you off-guard.

The point I'm making here is that it's actually _quite_ difficult to get a
string that literally has the contents you want. The fact that *nix lets you
put pretty much any characters in file names (even newlines) means that, just
like in Windows, your scripts can actually fail even when you try to quote
things "properly".
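
The newline-stripping is easy to demonstrate by comparing lengths:

```shell
#!/usr/bin/env bash
s=$'hello\n\n'        # 7 characters: 5 letters plus 2 trailing newlines
out=$(printf '%s' "$s")
echo "${#s} ${#out}"  # 7 5 -- command substitution ate both newlines
```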

~~~
dozzie
> The point I'm making here is that it's actually _quite_ difficult to get a
> string that literally has the contents you want.

It's not difficult, it's just tedious.

    
    
      foo=$(whatever; printf .)
      foo=${foo%.}
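
A self-contained sketch of the sentinel idea: emit an extra character inside
the substitution so the real trailing newlines are no longer at the end, then
strip it off (`cmd` here is a stand-in for whatever command you're capturing):

```shell
#!/usr/bin/env bash
cmd() { printf 'line\n\n'; }
foo=$(cmd; printf .)   # output now ends in '.', so nothing is stripped
foo=${foo%.}           # remove the sentinel
echo "${#foo}"         # 6: "line" plus both trailing newlines survive
```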

~~~
mehrdadn
That's not general POSIX, right? I seem to recall it's Bash-specific? (P.S. I
think you forgot quotes?)

The other problem (which I guess I accidentally brushed under the rug when I
singled out "variables") is that having to do this actually means you _need to
put it in a variable_. If you're nesting subshells, this gets pretty darn
tedious, easy to forget about, and difficult to read pretty quickly... it
seems you wouldn't consider that "difficulty" and think of it as "just
tediousness", but I think if something is too easy to do incorrectly and too
tedious to get right, that's also a kind of added difficulty.

~~~
dozzie
> That's not general POSIX, right? I seem to recall it's Bash-specific? (P.S.
> I think you forgot quotes?)

Wrong. (And no, I did not.)

POSIX, or rather SUS (I've never had access to POSIX), mandates the ${foo%...}
syntax and its three cousins. And assignment is not subject to word splitting
for variable expansion.

~~~
oblio
Regarding POSIX:
[http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3...](http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html#tag_18_06_02)

------
nunez
Shellcheck made me a MUCH better Bash developer.

Also, I prefer using [ condition ] for tests instead of the less-portable [[
cond ]] syntax despite the latter being more feature-rich. Didn’t see that one
in there.

~~~
melq
Why do you prefer the former?

~~~
nunez
Portability

------
theamk
If you get to the point when you need to write

    
    
        IFS=$'\v' read -d '' -ra a < <(printf '%s\v' "$s")
    

it is a good sign that it's time to switch to some other language. For example
in python, it will be just

    
    
        a = s.split("\v")
    

(And yes, sometimes you have a legacy system, or are writing an initrd script.
But how often does that happen? And why does your initrd have full bash
anyway, as opposed to dash or sh?)
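
For reference, the bash one-liner does work as advertised; a quick check
(note that `read -d ''` returns nonzero on EOF, which is harmless here):

```shell
#!/usr/bin/env bash
s=$'one\vtwo\vthree'
IFS=$'\v' read -d '' -ra a < <(printf '%s\v' "$s") || true  # EOF => read returns 1
echo "${#a[@]}"   # 3
echo "${a[1]}"    # two
```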

------
skywhopper
Most of this is fine, but not all. In particular, command failure conditions
are a huge source of bugs, but `set -euo pipefail` is not going to solve all
your problems. `set -e` is just as likely to cause problems because it can
cause scripts to silently fail. And pipefail can pass through errors that
aren't relevant.

These are great tools to have, but don't blindly invoke them as magic spells
if you don't actually understand the implications.
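
One concrete case of pipefail reporting an irrelevant "error": a consumer
that exits early sends SIGPIPE upstream, and the pipeline's status becomes
nonzero even though nothing went wrong:

```shell
#!/usr/bin/env bash
set -o pipefail
yes | head -n 1 > /dev/null
echo "exit: $?"   # typically 141 (128+SIGPIPE): `yes` was killed, not broken
```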

------
ljm
> Gotcha: Errexit is ignored depending on caller context

It proves the point that it's a gotcha but those examples seemed sensible to
me. As far as I understand it, `set -e` doesn't turn every unchecked, non-zero
exit code into an exception because there's no way of knowing whether the
function or sub shell you're invoking was written by you or pulled in
elsewhere, and as a result you don't know if a non-zero exit code is a
legitimate, show-stopping failure.

Those functions and subshells might as well be mini inline executables and in
that context it makes sense to only check the final output. If that's horribly
wrong and confusing, maybe there should be a move to make `set -e` the default
so _all_ error handling is explicit, but you've got other languages for that
that don't involve throwing `|| true` onto the end of every unimportant
command you run.

I also realise that this doesn't make a case for Bash being intuitive.
Precisely the opposite. But I suppose you have that with a shell where it's
more important to be adaptive to the person behind the keyboard at the expense
of the purity of the implementation. Especially considering the history of it
all.
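
The caller-context gotcha in miniature: `set -e` is suspended inside anything
tested by `if` (or on the left of `&&`/`||`), including every command in a
function called from there:

```shell
#!/usr/bin/env bash
set -e
f() { false; echo "still ran after false"; }  # errexit is off inside `if f`
if f; then echo "f looked successful"; fi     # f returns 0 (its last echo)
echo "script reached the end"                 # errexit never fired
```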

------
p3llin0r3
Recently I've replaced most of my bash scripting with the python library
invoke.

It's written by the same people who wrote the python ssh scripting framework
fabric.

[http://www.pyinvoke.org/](http://www.pyinvoke.org/)

Works great!!

~~~
TheGrassyKnoll
I'll have to try that one. I've done a few things with: Python process
launching [http://amoffat.github.com/sh](http://amoffat.github.com/sh)

~~~
theamk
Be careful with the sh library! It runs the programs under a tty by default,
so you get random effects like ANSI color sequences and truncated git output.
There is an option to change this, but it's not the default, and it is easy to
forget.

------
thomasjames
Backticks are very error-prone; $(cat foo.txt) is much more explicit and
visually clearer for command substitution.

~~~
schizoidboy
And this way also allows nesting.
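
A quick illustration of the nesting point, assuming nothing beyond plain bash:

```shell
#!/usr/bin/env bash
modern=$(echo "$(echo nested)")   # $() nests cleanly, quotes and all
legacy=`echo \`echo nested\``     # backtick nesting needs backslashes
echo "$modern"   # nested
echo "$legacy"   # nested
```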

------
fredley
A tip from me, based on a mistake I made yesterday: don't `source
~/.bash_history` instead of `~/.bash_profile`.

(Luckily it entered vim relatively soon, from where I could kill the process).

~~~
xyproto
At least it's not recursive.

------
ta4354546444
Bash strikes me as a bit of a mess, as in people threw the kitchen sink into
it for 'portability'.

Things like being able to open a socket e.g. using the /dev/tcp/<ip>/<port>
stuff give me the willies a bit.

I lean towards Ruby if it's going to be anything longer than a few lines, or
requires anything but the simplest of logic/commands. Otherwise I was always
told /bin/sh is likely to be the most portable, so tend to use that in the
absence of any other good reason.

~~~
BenjiWiebe
I've written two network/telnet game-bots in Bash. It's terribly hacky but fun
in its own way. Using the /dev/tcp trick and arrays and lots of bashisms.

------
devhead
nice write-up, starred it for future handouts to my friends.

I do find that once my bash script is going over a hundred lines or so it's
likely a good time to move to python.

I love bash, it's great; but at some point, setting up a proper script (in
bash) with arguments, options, logging and/or validation, you end up spending
more time getting it to work than you do on solving the actual problem; enter
your favorite programming language here.

------
floatingatoll
The advice to double-quote everything is an interesting way to circumvent
"detailed knowledge may be required". I wish it noted that you can't quote
regular expressions, though:

    
    
      $ cat - > foo
      #!/bin/bash
      a="BCD"
      [[ $a =~  .C.  ]] && echo 1
      [[ $a =~ ".C." ]] && echo 2
      ^D
      $ bash foo
      1
      $

~~~
Hello71
this is intentional, since you can store your regular expression in a
variable:

    
    
        $ a=BCD; b=.C.; [[ $a =~ $b ]]; echo $?; [[ $a =~ "$b" ]]; echo $?
    

otherwise, interpolating variables in regular expressions as text would
require other syntax (more confusing).

also, "cat -" is redundant, use "cat". this behavior is specified by POSIX.

~~~
floatingatoll
I use “cat -” so that my code makes more sense to people. I want STDIN
declared somehow, and the dash is effective. Technically I shouldn’t have
bothered with the cat at all in an HN code snippet, but it was a courtesy to
provide a familiar environment for the block I wanted to convey. It worked so
well that you linted it! I really appreciate the thought.

------
JepZ
While I agree with the guide, there is one thing I was missing while writing
bash scripts with 'set -e', and that was some kind of stack tracing. So I
added a nice trap function to my personal bash script template. Be aware that
this version does not include all the best practices described in the guide.

    
    
      #!/usr/bin/env bash
      #--------------------------------------------
      # Default Bash Script Header
      set -eu
      trap stacktrace EXIT
      function stacktrace {
              if [ $? != 0 ]; then
                      echo -e "\nThe command '$BASH_COMMAND' triggered a stacktrace:"
                      for ((i = 1; i <= ${#FUNCNAME[@]} - 2; i++)); do
                              j=$((i + 1))
                              echo -e "\t${BASH_SOURCE[$i]}: ${FUNCNAME[$i]}() called in ${BASH_SOURCE[$j]}:${BASH_LINENO[$i]}"
                      done
              fi
      }
      
      SCRIPT_DIR="$(dirname "$(readlink -f "$0")")"
      #--------------------------------------------

------
elbear
I notice a pattern in all articles about BASH. Some people say it should die.
Others praise its ubiquity and versatility at handling certain tasks. Of
course, both sides are right.

From my point of view, BASH will never disappear, because it's not a living
thing that runs out of food or habitat and dies. That doesn't happen, unless
there's a major revolution in computing that makes current paradigms obsolete,
something on the scale of the disappearance of the dinosaurs.

Until that happens, I welcome any projects that aim to decrease the amount of
buggy BASH in the wild. I avoid it as much as possible, but if someone's going
to use it, at least they'll have safety nets to reduce the possible damage.

PS: I have a feeling my metaphors are all over the place, but I hope that
doesn't detract from the message.

------
Groxx
Overwhelmingly a great resource, though I will nitpick at this one:

> _Globbing is easier to use correctly than find._

For very simple purposes or small file trees, sure. Outside that: find is
_incredibly_ more powerful, and worth using in many cases. If nothing else,
learning to use `find . -path ./ignore_this_path -prune -o -name '*.ext'
-print` can change e.g. a Go project script from visibly slow to instant.
(e.g. the latest place I used this went from 1-5+ seconds (hot vs cold) to
5-50ms)

------
codedokode
How does one properly quote an argument for a command that can contain spaces?

For example:

    
    
        parent_dir = "$(basename $dir)"
    

How to quote $dir here? What if it contains spaces or other special
characters?

That's what I dislike about Bash.

Also, always start your scripts with

    
    
        set -e
    

This prevents the script from continuing after an error, though without
printing any message.

Also, I always make mistakes when using [ and [[.

~~~
zeroimpl
You just add quotes around $dir. Quotes inside $( ) start a fresh quoting
context, so the inner quotes nest properly instead of closing the outer ones.

    
    
        parent_dir = "$(basename "$dir")"

~~~
codedokode
So I finally found the answer. Thanks.

~~~
8xde0wcNwpslOw
Bear in mind (pointed out by the article as well) that variable assignment
doesn't allow spaces around the equals sign, and that in this case the outer
quotes are unnecessary anyway.
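
Putting both corrections together, a working sketch (directory name invented
for the example):

```shell
#!/usr/bin/env bash
dir='some path/with spaces'       # no spaces around = in assignments
parent_dir=$(basename "$dir")     # inner quotes nest safely inside $()
echo "$parent_dir"                # prints: with spaces
```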

------
kazinator
> _Quoting inhibits both word splitting and wildcard expansion, for variables
> and command substitutions._

The result of a variable substitution isn't subject to wildcard expansion,
whether quoted or not.

If your only reason to quote "$foo" is because you think _foo_ expands to a
globbing pattern, and no other reason is justified, you can drop the quotes.

~~~
koala_man
It _is_ subject to wildcard expansion. You can verify this with

    
    
        var="/*"; echo $var

~~~
kazinator
Ooops, you're right! I somehow mixed this up with special contexts.

    
    
      case $var in
      '*' )
        echo asterisk ;;
      esac
    

But wildcard expansion is suppressed there unconditionally. Same as in
assignment context: other=$var.

~~~
Hello71
those both suppress word splitting too.

------
andreyv
> Furthermore, set -u must not be used in Bash 4.3 and earlier.

This is dangerous advice. set -u indeed requires more verbose array handling
in Bash < 4.4, but it also catches code like this:

    
    
      rm -rf "${OOPS_UNDEFINED}/"
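
The protection is easy to see in a throwaway subshell (nothing is removed;
the expansion itself is the fatal error):

```shell
#!/usr/bin/env bash
# with -u, expanding an unset variable aborts before rm could ever run
if msg=$(bash -uc ': "${OOPS_UNDEFINED}/"' 2>&1); then
  echo "expanded silently"   # this is what plain bash would do
else
  echo "aborted: $msg"       # e.g. "OOPS_UNDEFINED: unbound variable"
fi
```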

------
nerdponx
Step 1 to write safer shell scripts: use a safer shell. Zsh gives the user
much more control, has safer defaults, and is itself quite portable (even if
Zsh scripts are not portable to other shells).

~~~
Hello71
unfortunately, zsh is installed on probably about 1% of Linux systems
worldwide. maybe it can be installed almost anywhere, but the fact is it
isn't, and you might as well use Python or something at that point.

~~~
h1d
And how many on the system that you actually touch? How hard is it to (ask to)
install a new shell? Obviously if you're distributing your script publicly no
one would write that in zsh compatible way.

------
vinceguidry
Reason number 650 why I love Ruby: it offers ridiculously gradual transitions
from shell scripts. One time I built up a pretty weighty conditional-heavy
script in Bash and didn't want to keep adding to it. So I simply made all the
bash calls use backticks. Took me all of a few minutes to get option parsing
working again.

One day I'll learn the Ruby way of doing shell one-liners and that'll
hopefully keep me out of man pages for that sort of thing forever.

------
wocram
Is this project ready to share?

It's not published to crates.io, and doesn't have a license or Cargo.toml

------
chubot
I don't agree with the advice to use arrays and set -u together. Sometimes you
just need to get something done, and being pedantic works against you.

This advice only works in bash 4.4, and many common distros are on bash 4.3,
like Ubuntu 16.04 LTS. (bash 4.4 was released September 2016.) Because of this
bug, the section "how to begin a bash script" is version-specific and awkward
IMO.

If you need to process untrusted filenames, use arrays, but otherwise it's
probably more trouble than it's worth. [1]

I want my scripts to work on older versions of bash, and I think 'set -u' is
important, so I mostly get by without arrays. It's not ideal, but shell is
full of compromises.

A workaround is to use ${a[@]+"${a[@]}"}, which avoids the bug in bash 4.3,
but that seems too ugly to recommend.
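
For reference, that workaround expands to zero words for an empty array
instead of tripping set -u, and behaves the same on bash 4.3 and 4.4 (a
sketch):

```shell
set -u
a=()
# An empty array contributes zero words instead of an "unbound variable"
# error on bash <= 4.3:
printf 'got: %s\n' ${a[@]+"${a[@]}"}
a=(x 'y z')
printf 'got: %s\n' ${a[@]+"${a[@]}"}   # prints: got: x / got: y z
```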

More details in this comment I wrote on the same article:
[https://lobste.rs/s/4jegyk/how_do_things_safely_bash#c_kmldw...](https://lobste.rs/s/4jegyk/how_do_things_safely_bash#c_kmldw6)

[1] _Thirteen Incorrect Ways and Two Awkward Ways to Use Arrays_
[http://www.oilshell.org/blog/2016/11/06.html](http://www.oilshell.org/blog/2016/11/06.html)

------
BeetleB
I gave up long ago on trying to learn shell scripting _and_ remembering it.
Switching to xonsh for my shell made shell scripting easier.

[http://xon.sh/](http://xon.sh/)

------
ausjke
For further portability you can convert bash shell to C and compile it into
binaries.

[https://github.com/neurobin/shc](https://github.com/neurobin/shc)

------
jonnycomputer
if it needs to be safe, use python.

please don't down vote me

~~~
tootie
Safe bash is what Perl was invented for. Don't downvote me.

~~~
jonnycomputer
someday i might even add it to my toolkit

------
meow_mix
Why ever resort to shell scripts when we have languages like ruby / python for
anything more complex than installs?

Clarity is king people

~~~
scbrg
Because

    
    
      foo | awk '/bar/ {print $3}'
    

is more clear than

    
    
      import subprocess
      foo = subprocess.Popen(['foo'], stdout=subprocess.PIPE,
                             universal_newlines=True)  # text mode, not bytes
      for line in foo.stdout:
        if 'bar' in line:
          try:
            print(line.split()[2])
          except IndexError:
            print('')
    

Sometimes shell scripts are more clear. Especially for tasks that involve
running lots of external commands.

~~~
meow_mix
Nobody doubts that pipes and languages like awk are great for one liners, but
I think that's a little beside the point of this post, which is advocating
for things like the use of bash arrays:

    
    
      array=( a b )
      array+=(c)
      if [ ${#array[@]} -gt 0 ]; then
        rm -- "${array[@]}"
      fi
    

versus in Ruby:

    
    
      array = [a, b]
      array << c
      array.map { |i| `rm #{i}` } if array.length > 0
    

There's also nobody stopping you from using text processing tools like awk and
sed, or bash one liners in ruby/python either, but I think we should leave the
logic and arrays for scripting languages, no?

~~~
scbrg
You're moving the goalposts! Previously it was "anything but install scripts".
Now it's "logic and arrays".

I actually think we agree with each other, but we express it differently. I
never write _bash_ scripts, I write POSIX shell, so all the array juggling of
bash is something I never deal with. As you say, by the time you need arrays,
you should have switched languages already.

That said, I think there's a fairly large domain of problems - apart from
install scripts - that are better solved with shell scripts, _because of their
clarity_. Anything that relies on invocation of lots of other tools, and in
particular problems that fit the pipe model (take output from this tool,
extract interesting bits from it, and feed it to that tool, etc). And this I
say as an otherwise almost slightly fanatical Pythonista :-)

------
shmerl

        if [ ${#array[@]} -gt 0 ]; then
    

a better way to write it:

    
    
        if (( ${#array[@]} > 0 )); then

~~~
Hello71
the idiomatic way:

    
    
        if (( ${#array[@]} )); then

~~~
shmerl
I prefer not to use implicit boolean conversions. It's also less readable.

------
xixixao
If your everyday language is JavaScript, use shelljs. It’s great. You’ll get
portable scripts in no time. Sure it’s “slow”, in a way that likely doesn’t
matter because of what you’re calling from the script.

------
IshKebab
From the author of "Russian roulette: how to probably not die", "Staying
healthy with fast food" and "Self-immolation for dummies".

If you need safety don't use Bash.

~~~
waynecochran
Amen. Writing bash scripts (I have been programming in Unix since 1985) is an
unnatural, fragile, and error-prone endeavor. If it is more than 5 lines long,
I use Perl (which is still far from ideal). When an article is almost solely
about what not to do, that tells you something.

------
tzahola
Obligatory: [http://redsymbol.net/articles/unofficial-bash-strict-
mode/](http://redsymbol.net/articles/unofficial-bash-strict-mode/)

------
Annatar
" Should I use backticks?

Command substitutions also come in this form:

    
    
        Correct: "`cmd`"
        Bad: `cmd`
    

While it is possible to use this style correctly, it looks even more awkward
in quotes and is less readable when nested. The consensus around this one is
pretty clear: Avoid."

This is how one can tell a rookie just learning to program in shell: usage of
the $() syntax is limited to the Bourne family of shells, which implement that
particular aspect of the POSIX specification.

Backticks, on the other hand, although they incur a performance penalty since
they spawn a subshell, make one's code instantly portable across all UNIX-like
operating systems and even across disparate shell families, as they work
exactly the same in C-shells. (Whether one should program in a C-shell family
is a different discussion.)

The subshell performance penalty is negligible in 99% of the cases as this
1970's technology has tiny memory and processor overhead due to the fact that
it's been developed on systems with small memory and a slow CPU, so it had
been optimized for performance.

Over my 30+ years of shell programming, I know of only one documented instance
where the $() construct (which doesn't spawn a subshell) made a difference,
and it was the only time it was actually a valid requirement:

[https://www.joyent.com/blog/building-packages-at-
scale](https://www.joyent.com/blog/building-packages-at-scale)

but even then, the author ended up using dash, not bash.

For maximum portability and closest adherence to POSIX, program in Korn shell,
ksh93. (Modern versions of ksh implement ksh93 functionality.) Then you may
safely use $() and be assured it will work in all Korn shells across different
operating systems (even in ksh88).

Otherwise, DON'T avoid using backticks, because you will be giving away
portability for no good reason. Don't program in bash, but in original Bourne
shell (sh) for maximum portability across different operating systems; don't
assume that you can use bash constructs in Bourne shell (as /bin/sh on
GNU/Linux tells bash to run in Bourne shell emulation mode, but that mode
isn't implemented completely or correctly, since bash constructs are still
accepted). Always test your shell code on a traditional UNIX like HP-UX or a
Solaris derivative like SmartOS if possible, with a real Bourne shell.

~~~
posix_me_less
> _" This is how one can tell a rookie just learning to program in shell:
> usage of $() syntax is limited to Bourne family of shells which implement
> that particular POSIX specification aspect. Backticks, on the other hand,
> although they incur a performance penalty since they spawn a subshell, make
> one's code instantly portable across all UNIX-like operating systems and
> even across disparate shell families, as they work exactly the same in
> C-shells."_

The '$()' notation is a de facto standard shell feature. The fraction of
people who care about their script working on all UNIX-like operating systems'
default shells is very close to 0. The recommendation is fine - this notation
is more readable and nestable.
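
A quick illustration of the nesting point (the path here is made up):

```shell
# $() nests without escaping; backticks need a backslash at each level.
p=/tmp/a/b
with_dollar=$(basename "$(dirname "$p")")
with_backticks=`basename \`dirname "$p"\``
echo "$with_dollar $with_backticks"    # prints: a a
```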

> _" Always test your shell code on a traditional UNIX like HP-UX or a Solaris
> derivative like SmartOS if possible, with a real Bourne shell."_

Only if your script needs to be "original Bourne shell" compatible, which is
almost never the case for most script writers.

~~~
Annatar
“Long live the GNU/Linux hegemony and monoculture, the only truth and true
religion”.

Lovely.

~~~
posix_me_less
Long live the inventiveness and free spirit of the contributors who bring us
useful improvements and progress to what would otherwise be a cumbersome
legacy computer interface.

~~~
Annatar
Which interface? Are you sure you could not have made a broader generalization
and a more nondescript statement?

