
Shell script mistakes - reinhardt
http://www.pixelbeat.org/programming/shell_script_mistakes.html
======
barrkel
The single biggest shell script mistake is not handling whitespace in file
names correctly, and it's almost impossible to get right if you have weird
file names: embedded newlines, leading and trailing spaces, embedded tabs.
Embedded quotes can be tricky too, especially if you're writing a script that
generates a script.

That bit, writing a script that generates a script, happens surprisingly often
in bash. It's cheaper to pipe a stream through sed that converts it into shell
commands than it is to iterate over all the lines and individually pluck out
the arguments for the commands you want to execute. Leaving the script as
something that outputs shell commands also lets you inspect what it does
before committing to it (by piping it to bash).
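A minimal sketch of that pattern (the file names and the gzip command are
just for illustration; note that names containing spaces or quotes would need
extra quoting in the sed program, which is exactly the trickiness mentioned
above):

```shell
# Turn a stream of names into a stream of shell commands with sed,
# and inspect the result first:
printf '%s\n' alpha.log beta.log |
    sed 's/^/gzip -k /'
# prints:
#   gzip -k alpha.log
#   gzip -k beta.log

# Once the generated commands look right, commit by piping to sh:
#   printf '%s\n' alpha.log beta.log | sed 's/^/gzip -k /' | sh
```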

~~~
yogsototh
The problem with file names is the main reason I use zsh.

    
    
        for fic in **/*(.); do
            doSomethingWith $fic
        done
    

will take way more time than using find, but it will work even on files with
strange characters.
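For comparison, find can be made safe for odd file names too, at the cost of
NUL terminators (`-print0` is a GNU/BSD extension rather than strict POSIX,
and `wc -l` stands in here for the real command):

```shell
# Scratch directory with an awkward file name:
dir=$(mktemp -d)
printf 'one line\n' > "$dir/a b.txt"

# -print0 emits NUL-terminated names and xargs -0 consumes them, so
# spaces, quotes and even embedded newlines survive intact:
find "$dir" -type f -print0 | xargs -0 wc -l    # counts "a b.txt" correctly

# find can also spawn the command itself, one file per invocation:
find "$dir" -type f -exec wc -l {} \;
```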

~~~
q_revert
and another reason to use zsh is the incredibly nice looping syntax, for
example, this is the same as

    
    
      for fic (**/*(.)) doSomethingWith $fic
    

I find that the fact it fits on a single line makes me much more likely to
use it in everyday shell use, though I'd still revert to bash for scripting.

------
BCM43
I find that after a shell script gets to be over 3 or so lines, it's easier to
switch over to python or perl. Do others feel the same?

~~~
laumars
I don't agree with this. I love Perl, it's up there as one of my favourite
languages (despite its many shortcomings), but in my opinion shell scripts
make more sense if you're writing scripts which depend on a number of
additional programs for the bulk of their processing.

For me, the point at which Perl makes more sense is when your script requires
more internal logic than it depends on spawning other programs.

For example, if I'm writing a routine to auto-snapshot ZFS / Btrfs volumes and
delete any over a certain age, the script would be dependent on your file
system's CLI tools. So it makes more sense to have an 80+ line shell script
than it does to write that in Perl / Python.
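A rough sketch of what the shell side of that might look like. The dataset
name, snapshot naming scheme and retention window are all hypothetical, the
actual zfs calls are left commented out, and GNU date is assumed for `-d`:

```shell
#!/bin/sh
# Hypothetical dataset and retention window.
DATASET="tank/data"
KEEP_DAYS=30

# Name snapshots auto-YYYY-MM-DD so their age can be parsed back out.
snapshot_name() {
    printf '%s@auto-%s' "$1" "$(date +%F)"
}

# A snapshot is too old if its embedded date predates the cutoff.
too_old() {
    snap_date=${1##*@auto-}
    cutoff=$(( $(date +%s) - KEEP_DAYS * 86400 ))
    [ "$(date -d "$snap_date" +%s)" -lt "$cutoff" ]
}

# The real work would be delegated to the file system's CLI tools:
# zfs snapshot "$(snapshot_name "$DATASET")"
# zfs list -H -t snapshot -o name -r "$DATASET" |
#     while read -r snap; do
#         too_old "$snap" && zfs destroy "$snap"
#     done
```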

However if I was writing a routine which requires users inputting details,
where those details need to be sanity checked and then stored somewhere (such
as a database), then the core logic of that program resides within your script
(where you'd have to read inputs, do your sanity checks and then write to the
database). So a Perl or Python script makes more sense.

Obviously you can do either of those examples in each of those languages
(crudely speaking as I know shell scripts aren't technically a programming
language); but that's just a basic example of where I personally draw the
line.

I also think this is one of those occasions where it doesn't massively matter
which approach you take just so long as the code works and is maintainable
(though I draw the line at one maintenance script I saw last year. It was a
Python script where every other line was _os.system_. It just struck me as
rather pointless starting a Python interpreter if you're just going to use it
like a shell script - you might as well do the whole lot in the shell to begin
with).

~~~
jsight

    *(though I draw the line at one maintenance script I saw last year. It
    was a Python script where every other line was os.system. It just struck
    me as rather pointless starting a Python interpreter if you're just going
    to use it like a shell script - you might as well do the whole lot in the
    shell to begin with)*

FWIW, one of my favorite features of Perl syntax is that you can do things
like this. A lot of my quick-and-dirty sysadmin scripts end up being Perl
scripts with lots of backticks. It's handy for when they grow (as they often
do) into more full-featured scripts.

------
gwu78
This example

    
    
      for file in *; do wc -l "$file"; done
    

could be reduced to

    
    
      for file in *; { wc -l "$file"; }
    

in some POSIX-like shells.

Is the for loop even necessary?

    
    
        echo wc -l * |sh 
    

But...

[http://www.in-ulm.de/~mascheck/various/argmax/](http://www.in-ulm.de/~mascheck/various/argmax/)
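Besides the ARG_MAX limit, the `echo wc -l * | sh` form also circles back to
the whitespace problem from the top of the thread: sh re-splits names that the
loop would have passed as single arguments. A quick demonstration in a scratch
directory (the `|| true` just keeps the demo going past the failure):

```shell
dir=$(mktemp -d) && cd "$dir" || exit 1
touch 'a b.txt'

# The loop hands wc each name as one quoted argument:
for file in *; do wc -l "$file"; done    # prints: 0 a b.txt

# But sh re-parses the generated text, so wc is given two
# arguments, "a" and "b.txt", and fails on both:
echo wc -l * | sh || true
```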

------
bloat
This is a great page in the same vein - bash specific, but quite a bit more
comprehensive.

[http://mywiki.wooledge.org/BashPitfalls](http://mywiki.wooledge.org/BashPitfalls)

------
knweiss
I recommend the shell script static analyzer ShellCheck:
[https://github.com/koalaman/shellcheck](https://github.com/koalaman/shellcheck)

------
sateesh
One subtle shell script mistake I was unaware of: if a shell script is
modified, currently running instances of the script might fail [1].

1\.
[http://stackoverflow.com/questions/2285403](http://stackoverflow.com/questions/2285403)
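One common mitigation discussed in that Stack Overflow thread is to make sure
the whole body is parsed before anything runs, e.g. by wrapping it in a
function that is only called on the last line (the function body below is just
a placeholder):

```shell
#!/bin/sh
# The shell reads a script incrementally as it executes, so editing the
# file mid-run can shift the read position into garbage.  Defining the
# body as a function forces it all to be parsed up front; only the
# final line is read after execution has started.
main() {
    printf 'step 1\n'
    printf 'step 2\n'
}
main "$@"
```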

------
memracom
Let's not forget unit testing. After all, a shell script is code, and code
should be unit tested.

[https://code.google.com/p/shunit2/](https://code.google.com/p/shunit2/)

~~~
LukeShu
I prefer
[http://bmizerany.github.io/roundup/](http://bmizerany.github.io/roundup/) .
It works in fewer shells, but when you know which shell(s) you are targeting,
that doesn't matter. It is far less magic than shunit2--writing the tests
actually feels like writing shell.

