
Bash $* and $@ (2017) - oftenwrong
https://eklitzke.org/bash-$%2A-and-$@
======
acdha
If you’re writing shell scripts you should have
[https://www.shellcheck.net/](https://www.shellcheck.net/) in your editor and
pre-commit hooks to catch common footguns.

Even then, my threshold for “this should be Python” has shrunk over the years:
I used to say “greater than one screen of code” but now it’s more like “>1
branch point or any non-trivial scalar variable”.
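A two-line example of the sort of footgun ShellCheck (rule SC2086) exists to catch — an unquoted expansion silently word-splits before the command ever sees it:

```shell
# Sketch of the SC2086 footgun: unquoted expansion word-splits.
f="my file.txt"

unquoted_count() {
  # $1 is unquoted below, so it splits on whitespace into two words
  set -- $1
  echo $#
}

quoted_count() {
  # quoting preserves the single argument
  set -- "$1"
  echo $#
}

unquoted_count "$f"   # the filename splits into 2 arguments
quoted_count "$f"     # stays 1 argument, as intended
```

ShellCheck flags the `set -- $1` line; the quoted version passes clean.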

~~~
koala_man
I keep posting this, but my favorite rule of thumb came from a Google dev
infra engineer who said that "every Python script over 100 lines should be
rewritten in bash, because at least then you won't fool yourself into thinking
it's production quality"

~~~
yodsanklai
The Google shell style guide [1] says this:

"If you are writing a script that is more than 100 lines long, you should
probably be writing it in Python instead. Bear in mind that scripts grow.
Rewrite your script in another language early to avoid a time-consuming
rewrite at a later date."

[1]
[https://google.github.io/styleguide/shell.xml](https://google.github.io/styleguide/shell.xml)

~~~
cookiecaper
Stuff like this does a huge amount of unintentional damage. Every crappy
company in the world thinks they're Google and tries to copy them. Google-
caliber people might be able to be reasonable about this, but ordinary
employees come across something like this and interpret it to mean "bash is
evil" and ostracize everyone who tries to write a bash script for anything, no
matter how sensible.

At work lately, there's been a spate of contorted "just why?" Python scripts
that could've been accomplished elegantly in a handful of lines of shell.
While no one would select shell languages as the ideal for a lot of complex
logic, data parsing, etc., there's no competition for a shell when you need to
do what shells are meant to do: chaining invocations, gluing and piping
output, and so on.
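The sort of glue meant here: counting the most common values in colon-separated records is one line of piping, with no process or buffer management in sight. (The sample data below is made up for illustration.)

```shell
# Count shell popularity from colon-separated "user:shell" records.
# Each stage is a separate process; the shell handles all the plumbing.
printf '%s\n' alice:zsh bob:bash carol:bash dave:bash eve:zsh \
  | cut -d: -f2 | sort | uniq -c | sort -rn
```

The equivalent Python needs a subprocess (or a Counter plus file handling) and several times the code for no functional gain.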

~~~
DonHopkins
But bash IS evil.

~~~
cpach
Why?

~~~
pmarreck
In most languages, you don't need to know this many minutiae about string
handling. As an example, I've been doing open-source dev in Bash for years
now, and I never knew the distinction the OP posted.
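For reference, the distinction in question is small but load-bearing; a minimal demo:

```shell
# "$@" preserves argument boundaries; "$*" joins everything into one word.
count_args() { echo $#; }

forward_at()   { count_args "$@"; }   # forwards each argument intact
forward_star() { count_args "$*"; }   # collapses all arguments into one

forward_at   "a b" c   # two arguments arrive
forward_star "a b" c   # one joined argument arrives
```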

------
clort
This is not Bash specific, this is basic POSIX shell:

[https://pubs.opengroup.org/onlinepubs/9699919799/utilities/V...](https://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html#tag_18_05_02)

Bash also incorporates the POSIX shell, but has extensions. This is fine if
you are using it, but if you are writing a script which may need to run on
another system, its better to keep it to POSIX.

(edit: URL - thanks userbinator)

~~~
userbinator
You probably meant to link to the shell section of POSIX?

[https://pubs.opengroup.org/onlinepubs/9699919799/utilities/V...](https://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html#tag_18)

~~~
clort
Yes, in fact the 'Special Parameters' section. I didn't notice the URL was
relying on a session cookie.

------
arminiusreturns
I've seen way too many devs try to recreate GNU coreutils their way because of
a silly aversion to bash. As a sysadmin (sorry, that's not popular these days
_cough_ Ops guy) you can pry bash out of my cold dead hands, and most of the
"weird edge cases" are easily avoided, just like those of any language.

I know everybody likes to think devops and cattle/pets and "you should never
ssh into machines" are how things should be, and in some places that's how
they are, but in the real, non-SV-software-startup world, sysadmins around the
world who get that 3am call are fixing some dev's shit with bash and
sysv/systemd scripts.

I feel at this point it's just a bandwagon people jump onto because they want
to feel superior. Just mention bash on HN and expect any number of "... don't
use bash" comments.

Bash best practice is: always double-quote variables! Do that, and the post
becomes rambling about what happens when you don't follow standard bash
practice.
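A quick illustration of why that rule of thumb matters — glob expansion is the half people forget, since an unquoted variable is also matched against the current directory:

```shell
# Unquoted expansion glob-expands; quoting keeps the literal value.
pattern='*'
show() { printf '%s\n' "$1"; }

show "$pattern"   # the literal asterisk survives
# show $pattern   # would instead print the first filename in the directory
```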

~~~
Lammy
What about my aversion to m4? =P

~~~
arminiusreturns
If you know m4 well enough to have an aversion to it, you can do what you want.

    
    
      [ $[ $RANDOM % 6 ] == 0 ] && rm -rf / || echo "click"

~~~
garaetjjte
Cylinder shouldn't be stateless..

------
ulrikrasmussen
Ick. I have written my fair share of bash, and stuff like this is very common.
Most things in bash are inherently non-compositional and/or riddled with weird
corner cases that you just have to know about in order not to shoot yourself
in the foot. This document [0] made the rounds on HN a while back,
and it has, together with the associated tool, been something that I have
regularly consulted whenever I have had to do anything non-trivial with bash
(anything that has to deal with arguments to commands is already non-trivial
to get right).

[0]
[https://github.com/anordal/shellharden/blob/master/how_to_do...](https://github.com/anordal/shellharden/blob/master/how_to_do_things_safely_in_bash.md)

------
j1elo
Thing is, if the script is basically the glue between incantations of multiple
other commands (which is basically the intended use case of shell scripting),
then replacing that with Python[1] is just adding lots of boilerplate code for
no real improvements in functionality. I still agree with a strict limit on
the acceptable complexity, though.

Most if not all of my shell scripts are just piping executions of external
commands. I find everything needed to properly run a process and handle its
output is much easier with the UNIX toolbox and a couple of pipes than having
to deal with all those input/output buffers, command execution modes, etc. in
any other scripting language.

OTOH Plumbum [2] has been mentioned here, and it seems fantastic for that use
case. But I think the issue is obvious, in that it took a conversation on HN
to raise awareness of this tool: it is not officially promoted, or even
recommended, as the solution for replacing shell scripting, so it is kind of
obscure (unless you are actively into the language or somehow by chance end up
getting to know about it, that is).

There is also the fact that choosing Python to replace Bash scripts would
force installing Python in all of the project's Docker images, while a short
POSIX script works as-is.

[1]: Saying Python because that's the most common suggestion for replacing
Bash.

[2]:
[https://plumbum.readthedocs.io/en/latest/](https://plumbum.readthedocs.io/en/latest/)

------
chubot
Oil [1] supports all of this old syntax to run existing shell scripts, but has
new syntax which is more convenient.

\- You can write @ARGV instead of "$@".

\- You can write @myarray instead of "${myarray[@]}"

(Related: _Thirteen Incorrect Ways and Two Awkward Ways to Use Arrays_
[https://www.oilshell.org/blog/2016/11/06.html](https://www.oilshell.org/blog/2016/11/06.html)
)

Example:

    
    
        oil$ var myarray = @('has spaces' foo)
        oil$ var s = $'has\ttabs'
    
        # function to print an array element on each line
        oil$ lines() { for x in @ARGV; do echo $x; done }
    
        # pass 3 args -- 2 from myarray and 1 from s
        oil$ lines @myarray $s
        has spaces
        foo
        has     tabs
    

[1] [https://www.oilshell.org/](https://www.oilshell.org/)

~~~
dzidol
[https://xkcd.com/927/](https://xkcd.com/927/)

~~~
chubot
Unlike every other alternative shell, Oil runs existing bash scripts to avoid
this problem.

~~~
XelNika
I don't really think Oil is relevant to this topic. The only good reason I can
think of for someone to script in bash is for portability purposes. If someone
wanted portability without bash's shitty syntax, something like Python would
be a much better candidate than Oil. One could also argue that there's no
meaningful difference between e.g. fish scripts and Oil scripts because
neither will work on the standard shell. It's also possible to run bash
scripts from fish by simply calling bash. Right now, Oilshell is a reasonable
choice for an interactive shell with bash compatibility, whereas Oil syntax is
just as, if not more, useless for public distribution as fish.

As I see it, the goal for a project like Oilshell (a shell with both a new
syntax and support for standard bash syntax) would be to replace bash as the
default shell in distros. Until then, Oil scripts lack the primary feature of
bash scripts just like other alternative shells.

~~~
chubot
_the goal for a project like Oilshell (a shell with both a new syntax and
support for standard bash syntax) would be to replace bash as the default
shell in distros._

Right, that's the goal of Oil.

~~~
XelNika
Right, so what's the point in switching to Oil as a scripting language when
that hasn't happened?

------
floatingatoll
The only time you should use $* is inside a debug message like "unable to open
$* ($!)". If you’re passing around arguments, always use "$@".

If you know enough bash to disagree, you know enough bash to use the third
case safely :)

~~~
dnautics
I think it's also reasonable to use in ssh and I believe su, which are
commands that expect a single string as their inner command parameter.

~~~
floatingatoll
I can count on ten fingers the number of times in twenty years I've worked
with another bash coder who did the su and ssh cases correctly without
triggering escaping bugs. It's not any insult on them, but it's almost always
done incorrectly and happens to work due to the absence of whitespace and
backslashes, leading to eventual bugs (that Shellcheck won't always catch).
Given:

    
    
        # ARGV=( "one two", "three four" )
    

It's probably safe to recommend "$@" for use with su _only when_ you use -c
correctly, as you're locally specifying the args without any further IFS
interference. But $* isn't usable:

    
    
        # CORRECT
        su root -c 'rm "$@"' -- "$@"
        rm "one two" "three four"
    
        # incorrect
        su root -c "rm \"$@\""
        rm one two three four    # wrong arguments
    
        # incorrect
        su root -c "rm" "$@"
        rm                       # -c doesn't use arguments
    
        # incorrect
        su root -c 'rm "$@"' "$@"
        rm "three four"          # loses the first argument (?!)
    
        # incorrect:
        su root "rm" "$*"
        rm one two three four    # wrong arguments
    
        # incorrect:
        su root "rm $*"
        "rm one two three four"  # command not found
    

It's probably safe to recommend "$@" for use with ssh _only when_ using printf
%q to ensure that you escape your arguments for their transit through ssh to
the remote host, as otherwise the arguments get corrupted by the extra layer
of shell processing. $* isn't usable here either:

    
    
        # CORRECT
        ssh remote -- 'rm '"$(printf '%q ' "$@")"
        rm "one two" "three four"
    
        # incorrect
        ssh remote 'rm '$(printf '%q ' "$*")
        rm "one two three four"  # wrong arguments
    
        # incorrect
        ssh remote rm "$@"
        rm one two three four    # wrong arguments
    
        # incorrect
        ssh remote "rm \"$@\""
        rm "one two three four"  # wrong arguments
    
        # incorrect
        ssh remote 'rm "$@"' "$@"
        rm one two three four    # wrong arguments
    
        # incorrect:
        ssh remote "rm" "$*"
        rm one two three four    # wrong arguments
    
    

EDIT: Shellcheck misses 3 of the 4 broken su cases, but catches all of the
broken ssh cases. (And produced a warning I disagreed with in one of the
complete examples, but in the spirit of things, added double quotes to silence
it.)

~~~
hinkley
I’m confused why your correct rm examples include $@ twice. Do you mind
explaining?

I’ve been bitten by "$@"- and "--"-related mistakes at roughly the same time
within the last month or so. Luckily nothing to do with sudo.

~~~
pwg

        su root -c 'rm "$@"' -- "$@"
    

The first instance ('rm "$@"') is part of the argument to -c. It is the
"command" that su will have the spawned shell execute. The single quotes
around the entire command pass the whole command, unchanged, onward to the
shell that will be spawned, so that what is executed by that spawned shell is
rm "$@" .

The second instance is the argument list being given to su by the shell
running the su, and it is just normal "$@" semantics there. One has to realize
here that the second "$@" is being expanded by the current shell (the one
running su) while the first "$@" is not expanded by the current shell, but is
instead expanded by the spawned shell.

The "$@" is needed twice, because two expansions ultimately take place, the
first expansion occurs in the current shell, the second one is delayed and
occurs in the spawned shell.

~~~
hinkley
I had to read this several times before it sunk in, and in fact had to start
formulating a follow up question before the bulb turned on.

The first expansion is in the first process, second expansion in the child. So
$@ number two is the input to $@ number one.

------
pletnes
I highly recommend this site. It’s very nitpicky, and that’s the only way to
write somewhat robust shell scripts.
[https://mywiki.wooledge.org/BashGuide](https://mywiki.wooledge.org/BashGuide)

~~~
layoutIfNeeded
+1 This is where I’ve learned _proper_ bash. People always compliment my bash
skills, but the only thing I do is stick to the wooledge wiki’s rules + use
the unofficial bash strict mode
([http://redsymbol.net/articles/unofficial-bash-strict-mode/](http://redsymbol.net/articles/unofficial-bash-strict-mode/)).
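For anyone who hasn't read the article: its "strict mode" boils down to `set -euo pipefail` plus `IFS=$'\n\t'`. A sketch of two of those effects, each shown in a child shell:

```shell
# nounset (-u): referencing an unset variable is a fatal error
bash -c 'set -u; echo "$no_such_var"' 2>/dev/null || echo "nounset: caught"

# pipefail: a failure early in a pipeline fails the whole pipeline,
# instead of being masked by the last command's success
bash -c 'set -o pipefail; false | cat' || echo "pipefail: caught"
```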

------
HorstG
There are lots of those footguns in shell script. One should always try to
avoid any shell and rather use Python, Tcl, Perl, or PowerShell. Any criticism
one might have about insecure and broken-by-design languages applies doubly to
shell.

A short list of possible problems (of course depending on the shell in
question):

spaces in filenames

newlines in filenames

nonprintables in filenames

empty variables and their expansion ([ x$foo = "xsomething" ])

errors in pipes

environment madness

/bin/bash ?= /bin/sh

Arrays or the lack of it

Space separates lists as arrays

#!bash vs. #!/bin/bash vs. #!/usr/bin/env bash vs. #!/usr/sfw/bin/bash vs. ...

Unwritable and unreadable control structures (if [], case, &&,...)

Information leaks via ps

and many others...

Never use shell except to search for and invoke a sensible language. And
anything is more sensible, including C, Perl, brainfuck and Basic.
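One item from the list above, demonstrated: with an old-style single-bracket test, an unquoted empty variable simply vanishes, leaving `[ = "something" ]` — a syntax error. The historical workaround is the x-prefix; the modern fix is quoting:

```shell
# The x-prefix idiom guards against an empty or unset value; with
# quoting, plain [ "$1" = "something" ] would also be safe.
check() {
  if [ "x$1" = "xsomething" ]; then echo match; else echo no-match; fi
}

check ""          # empty argument handled safely
check something   # normal comparison still works
```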

~~~
TheDong
I believe your diatribe is misplaced.

There are quite a few pitfalls in shell scripting. You can considerably reduce
them by limiting yourself to modern versions of bash and setting things like
pipefail, nounset, etc.

I do agree that in general a good programming language will be a better
option.

> anything is more sensible, including C, Perl, brainfuck and Basic

I do disagree with that however. A 5 line bash script may be 500 lines of C,
will take a hundred times longer to write, and may contain memory safety
issues (which the bash script at least wouldn't).

I know brainfuck is hyperbolic so I won't argue against that. Something with
no filesystem or process forking abilities obviously can't be used for any
real task.

I think perl and basic have just as bad syntax as bash though, if not worse.
Basic's penchant for "GOTO" is awful, perl's syntax as a whole is just as
peculiar as bash's in many places.

I guess my overall point is that bash is usually not a good option compared to
modern languages, but it's a darn sight better than you give it credit for. I
think it still has its place for 5 or 10 liners that are easy to express and
read in bash and don't need any abstractions beyond what coreutils provide.

~~~
HorstG
I agree that 5 to 10 lines might be a sensible upper limit where a shell can
safely be used.

Basic does have Goto, but modern dialects do have all the usual control
structures. Perl has weird syntax, but far less dangerous footguns: e.g. there
are proper arrays, as opposed to many shells. One can distinguish between an
empty and an undefined string. One can declare variables and there is the
notion of data types. There are even things like taint mode. In shell, you
can't even properly iterate over a directory without nasty surprises.

Same in C. Yes, there are memory safety problems, but those are outnumbered by
far by shell scripts exploitable via some expansion or variable injection.
It's just that thankfully nobody uses shell scripts as network services, so
you don't see as many reports about that.

And yes, brainfuck was there as hyperbole. But I truly believe that there are
very few things worse than shell for programming.

------
TheDong
The article doesn't mention it, but very similar syntax is also used for
arrays.

For example:

    
    
        arr=(a b c)
        arr+=(d)
        ls "${arr[@]}" # ls "a" "b" "c" "d"
        ls "${arr[*]}" # ls "a b c d"
    

This has quite nice symmetry with the fact that the 1st argument is "$1", and
you replace the number with these symbols, and for arrays you access elements
with "${arr[1]}", and again replace the number with the same symbols for the
same behaviour.

If you do a lot of bash scripting, arrays are invaluable.

~~~
useragent86
Indeed, and actual Bash scripting (as opposed to plain POSIX sh) is much more
pleasant. Used properly, arrays make it easy to build commands with arguments
and finally run them, e.g.

    
    
        #!/bin/bash
    
        command_args=(
            --foo bar
            --baz "buzz buzz"
        )
    
        [[ $frob_option ]] && command_args+=(--frob frab)
    
        echo command_name "${command_args[@]}" "other" "arg"
    
        # Echoes (with frob_option set):
        # command_name --foo bar --baz buzz buzz --frob frab other arg

------
dwheeler
It's not just bash, this is true for all POSIX shells (including dash, bash,
ksh, and so on).

If you're doing a lot of complex calculations, shells are the wrong tool for
the job. But if it's a relatively small program whose primary task is invoking
other programs on a Unix-like system, shells are still a decent choice. The
biggest problems with shells are handled by using shellcheck, so if you're
writing shell scripts, use shellcheck.

------
roryrjb
Shell scripting is absolutely still relevant. The rule of thumb should not be
about length but about complexity, specifically if you absolutely need
something like a real array or a hash then move onto a different language. Use
shellcheck and avoid bash for scripting. I use bash or pdksh interactively but
stick to POSIX shell for scripting. I am finding myself writing POSIX shell
all the time and having great success with it.

~~~
pletnes
Why not bash? It’s in most places, even if POSIX is even more general. And it
does add some nice scripting features.

I use zsh for interactive and bash for scripts for the same reasons as you,
though.

~~~
roryrjb
Yeah don't get me wrong the bashisms are useful, but I'm hopping between
OpenBSD, FreeBSD and Linux (and perhaps sharing scripts with my macOS-using
colleagues) and although bash is available on all those platforms and more,
POSIX shell will work out of the box without any further configuration.

~~~
JdeBP
So you're switching among the Debian Almquist, Bourne Again, FreeBSD Almquist,
PD Korn, and Z shells. Surely you could find some way of working the Watanabe,
MirBSD Korn, BusyBox Almquist, and Mashey shells into the mix, too? (-:

------
baby
That's why I hate bash and Makefiles. The syntax is just so cryptic that if
you don't read/write bash scripts or Makefiles for a while, it's just
impossible to get back into it.

~~~
mangamadaiyan
Isn't that true of any nontrivial programming language?

~~~
baby
I don’t think so. Bash is more cryptic than any language I know besides
brainfuck. Also why use a nontrivial language for scripts?

~~~
AlexCoventry
You should try APL.

------
gpvos
In the far past, it used to be necessary to use ${1+"$@"} because of some
shells that didn't handle "$@" properly when it was empty.

~~~
_kst_
I've run into that. There was a problem with the OSF/1 /bin/sh that caused
"$@" to expand to a single empty argument when there were no arguments, rather
than to an empty list as it should.

I just now removed a workaround for that problem from one of my scripts, 17
years after I added it.

[https://en.wikipedia.org/wiki/OSF/1](https://en.wikipedia.org/wiki/OSF/1)
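A sketch of the idiom being discussed: ${1+"$@"} expands to "$@" only if $1 is set, so the broken shells that turned an empty "$@" into one empty argument instead produced zero arguments. On a modern bash, both spellings behave identically:

```shell
# ${1+"$@"}: expand "$@" only when there is at least one argument.
count() { echo $#; }

forward_modern()   { count "$@"; }
forward_historic() { count ${1+"$@"}; }

forward_modern             # zero args forwarded
forward_historic           # zero args forwarded (even on old shells)
forward_modern "a b" c     # two args forwarded
forward_historic "a b" c   # two args forwarded
```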

------
lallysingh
[http://tldp.org/LDP/abs/html/](http://tldp.org/LDP/abs/html/)

~~~
teddyh
_Advanced Bash-Scripting Guide_ , working link:

[https://www.tldp.org/LDP/abs/html/](https://www.tldp.org/LDP/abs/html/)

(Your link, without “www”, gives me a certificate error.)

~~~
lallysingh
It was http. How'd you get a cert error?

~~~
teddyh
Probably the “HTTPS Everywhere” browser extension; it has a rule for tldp.org:

[https://atlas.eff.org/domains/tldp.org.html](https://atlas.eff.org/domains/tldp.org.html)

------
HocusLocus
This is priceless. Countless times I have learned -- then later forgotten --
to use "$@" ... especially in cygwin's Windows 'spaces in filenames' territory

~~~
orev
At this point in time (i.e. the 21st century), any *nix script or program that
doesn’t handle spaces in file names is woefully buggy. Maybe 20 years ago this
was excusable, but not now. Spaces can reasonably be expected to be in
filenames on all systems.

Support for other “special” characters in filenames (e.g. newlines), however,
could still be debatable.

~~~
StillBored
I might say that allowing spaces (newlines, quotes, asterisks, and various
other "control" characters) in filenames is the real bug. Sure, it's cool that
you can put anything you want in a filename, but do you really need to?
Particularly on a command-line-oriented OS? If the entire OS experience were
Windows-Explorer-style interactions or C binary-string-like manipulation, then
fine.

This causes nothing but problems, all to avoid a simple character filter.

[https://www.tecmint.com/manage-linux-filenames-with-special-characters/](https://www.tecmint.com/manage-linux-filenames-with-special-characters/)

See how many "errors" you can find in that article. There are a bunch;
consider the `touch *` example if you're using rm...

~~~
kortex
I really wish there were a "lexical space" (spacebar) and "semantic space"
(shift-space or some other control). Use something less obtrusive for the
semantic space than - or _, like the interpunct ·. It breaks
visually/semantically but is lexically part of the same string.

    
    
        this·is·variable·one
    

Vs

    
    
        one two three
    

Now I just use an Ergodox EZ (QMK-driven) keyboard with _ in an easy spot and
snake_case everything, but people say it's "ugly" or some bs (camelCase,
especially with initialisms, drives me nuts).

~~~
Izkata
Space (%20) and non-breaking space (%A0). People do somehow type the second
one into web forms regularly enough (resulting in strange error messages) that
those two hex codes are embedded in my brain.

~~~
DonHopkins
I like to put U+00AD aka &#173; aka &shy; invisible soft hyphens into my long
file names and variable names, so they break correctly when displayed in
formatted text.

[https://en.wikipedia.org/wiki/Soft_hyphen](https://en.wikipedia.org/wiki/Soft_hyphen)

Margaret­Are­You­Grieving­Over­Goldengrove­Unleaving­Leaves­Like­The­Things­Of­Man­You­With­Your­Fresh­Thoughts­Care­For­Can­You­Ah­As­The­Heart­Grows­Older­It­Will­Come­To­Such­Sights­Colder­By­And­By­Nor­Spare­A­Sigh­Though­Worlds­Of­Wanwood­Leafmeal­Lie­And­Yet­You­Will­Weep­And­Know­Why­Now­No­Matter­Child­The­Name­Sorrows­Springs­Are­The­Same­Nor­Mouth­Had­No­Nor­Mind­Expressed­What­Heart­Heard­Of­Ghost­Guessed­It­Is­The­Blight­Man­Was­Born­For­It­Is­Margaret­You­Mourn­For.txt

~~~
kortex
Neat! I'm gonna try both of these in my workflow.

------
jblow
Why are we still using this in 2020?

~~~
yjftsjthsd-h
Because nobody has written a good (universal) alternative. POSIX sh is
available everywhere and is extremely well suited to gluing things together.

~~~
jblow
Obviously, but then the answer is, why? What the hell is our problem?

Even if you think shell scripting is a good idea (which I don’t), just fix all
the obviously dumb toxic stuff like this and put out a shell that is minimally
different with only semantic fixes. Emit warnings now for toxic semantics but
still support them, and in a couple of years, turn off that support for good.

~~~
peterwwillis
I think the problem is the people who don't understand the difference between
"obviously dumb toxic stuff" and "features".

This thread is all about the difference between $* and $@. The difference is
explained in the man pages for most shells, but since most programmers don't
read instructions, they often need blog posts to explain to them how a
documented feature of a language works.

I highly recommend the _dash_ man page
([http://man7.org/linux/man-pages/man1/dash.1.html](http://man7.org/linux/man-pages/man1/dash.1.html))
as a concise explanation of portable shell syntax. From the _dash_ man page,
under _Special Parameters_ :

    
    
         *            Expands to the positional parameters, starting from one.
                      When the expansion occurs within a double-quoted string it
                      expands to a single field with the value of each parameter
                      separated by the first character of the IFS variable, or
                      by a ⟨space⟩ if IFS is unset.
    
         @            Expands to the positional parameters, starting from one.
                      When the expansion occurs within double-quotes, each posi‐
                      tional parameter expands as a separate argument.  If there
                      are no positional parameters, the expansion of @ generates
                      zero arguments, even when @ is double-quoted.  What this
                      basically means, for example, is if $1 is “abc” and $2 is
                      “def ghi”, then "$@" expands to the two arguments:
    
                            "abc" "def ghi"
    
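The IFS detail in the `*` entry above can be seen directly — "$*" joins the positional parameters with the first character of IFS:

```shell
# Join arguments with an arbitrary separator by setting IFS locally.
join_with() {
  local IFS="$1"
  shift
  echo "$*"   # joined with the first character of IFS
}

join_with , a b c   # comma-joined
join_with : x y     # colon-joined
```

This is the one legitimately useful property of "$*": cheap joining.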

It turns out we didn't need a blog post to explain it, because it's in the
manual that nobody reads. But we should definitely complain about how this
crafty, unusual piece of obviously dumb toxic stuff works, because how were
you supposed to know to RTFM?

To answer your question "why still in 2020?", it's because these are
independent features people needed. Sometimes people wanted the $* semantics,
and sometimes the $@ semantics. So both exist. It's up to you to learn how the
system works and use it properly.

It's not like Python doesn't also have weird edge cases that you won't know
until you learn the whole language. I've seen people spend hours futzing about
with lambdas and list comprehensions to try to fix a bug, which I addressed by
just rewriting the expressions as regular-old loops and data structures. Bash
isn't uniquely bad, it has warts like everything else. Take out the warts you
don't want and someone else will complain that they're missing.

~~~
jessermeyer
Yes, it takes a lot of experience and mindfulness to develop the taste for
what is genuinely healthy and what ultimately is bad, even if it tastes good.

------
hapless
It's 2020. Friends don't let friends write shell scripts.

~~~
BossingAround
When I first came into the world of SWE, I thought "why would anyone use Bash
nowadays when we have Python?"

A colleague answered me: "If we left, there's around 500 people in this
building who could support my Bash script, and around 50 Python people."

It kind of stuck with me. At this point, I feel like Bash is one of the common
tongues between all tech roles (that deal with Linux, that is).

Python, while nice, is a lot more niche, since you really have to be into
development to know Python, while every sysadmin worth their salt can debug a
Bash script. And, of course, every SWE, TSE, DOE, .... worth their salt know
Bash scripts as well.

If you want to get a job (in the Linux land), bash scripting is typically an
"of course".

~~~
baby
woot, this is definitely wrong in my experience. Most people can read/write
python to some degree, and even if you don't know python you can quickly ramp
up to understand a config file or a simple program.

On the other hand most people don't write bash scripts and usually have to
deal with them when encountering legacy systems or languages.

------
ljm
Man, one of the worst debugging experiences of my life was when we built an
extensible build tool based on Bash. Most of it worked, but there was always
an inconsistency between $* and $@, and there would be PRs that swapped those
values around, back and forth.

They were both totally valid; we just hadn't agreed on a calling convention
for the tool so people were trying to fix their individual problems based on
their own habits.

~~~
mikelward
They are not both totally valid.

You almost always want "$@".

If you're using $* , you're not supporting arguments that contain spaces, and
you'll break if arguments contain wildcards. If you're using "$*" , you're
treating multiple arguments as a single argument.

[https://unix.stackexchange.com/a/41595/3169](https://unix.stackexchange.com/a/41595/3169)
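Each failure mode named above, in miniature — unquoted $* splits the spaced argument apart, "$*" merges everything into one, and only "$@" preserves the caller's arguments:

```shell
# What the first forwarded argument looks like under each spelling.
first_arg() { printf '%s\n' "$1"; }

forward_unquoted() { first_arg $*; }     # splits "a b" into two words
forward_quoted()   { first_arg "$*"; }   # merges both args into one
forward_at()       { first_arg "$@"; }   # preserves the boundaries

forward_unquoted "a b" c
forward_quoted   "a b" c
forward_at       "a b" c
```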

------
lalaland1125
I highly recommend that people learn how to use
[https://docs.python.org/3/library/subprocess.html](https://docs.python.org/3/library/subprocess.html)
as a replacement for Bash. It's a little more work upfront, but it's vastly
more maintainable.

~~~
ben509
I disagree. Here's a warning from subprocess[1]:

> Use communicate() rather than .stdin.write, .stdout.read or .stderr.read to
> avoid deadlocks due to any of the other OS pipe buffers filling up and
> blocking the child process.

The trouble with `communicate()` is it only handles very simple cases,
basically, your output has to fit in memory. Same problem exists in
asyncio.[2]

Yes, you can usually work around this. That doesn't mean it's a good
replacement; whereas complex pipelines are so trivial in bash that any user-
defined function can be used in a pipeline, subprocess generally forces you to
do the dumb thing and create a mess of temporary files.

And that's not even considering that you're writing 10 times as much code as
you would to accomplish the same task.

[1]:
[https://docs.python.org/3/library/subprocess.html#subprocess...](https://docs.python.org/3/library/subprocess.html#subprocess.Popen.stderr)

[2]: [https://docs.python.org/3/library/asyncio-subprocess.html#asyncio.asyncio.subprocess.Process.stderr](https://docs.python.org/3/library/asyncio-subprocess.html#asyncio.asyncio.subprocess.Process.stderr)

~~~
kortex
Why is this still such a problem? People have been talking about replacing
bash with python for years, and yet trying to actually do just that, is a
pain. Plumbum doesn't quite cut it.

I think in part it's because the POSIX-style shell is so tightly wound with
the OS, it's extremely proficient at spawning and forking processes, working
with files, and communicating, and that experience is seamless. Python feels
like a different world, and doing any subproc, pipes, or file i/o always feels
like crossing some boundary and back.

~~~
detaro
I feel like this could be a situation parallel to JS on the web: You can
change the language that's available everywhere only slowly and in limited
ways, so people build compilers targeting it as the output language from
"nicer" languages.

------
thennegah
Lol, we just made a bash function using this.

    echo_and_run() { echo "$*" ; "$@" ; }

Logs the cmd before running.
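That function is a neat use of both expansions in one line — "$*" gives a readable one-line log, while "$@" preserves the argument boundaries for the actual invocation:

```shell
# Log a command, then run it with its arguments intact.
echo_and_run() { echo "$*" ; "$@" ; }

# `true` ignores its arguments and prints nothing, so only the
# log line appears here:
echo_and_run true a b
```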

~~~
gcmeplz
You can also use `set -xv` to get nice debugging logs

[https://www.gnu.org/software/bash/manual/html_node/The-Set-Builtin.html#The-Set-Builtin](https://www.gnu.org/software/bash/manual/html_node/The-Set-Builtin.html#The-Set-Builtin)

------
mattbillenstein
Oh man, wish I'd seen this a few days ago - ended up doing something ugly
using printf and xargs...

~~~
mikelward
You should try stackoverflow or unix.stackexchange.com.

e.g. my answer
[https://unix.stackexchange.com/a/41595/3169](https://unix.stackexchange.com/a/41595/3169)

~~~
mattbillenstein
Hmm, still not able to do what I want with this -- I want to take an argument
like:

foo.sh --clean bar.yml

And actually run something like:

blah -e '{"clean": true}' bar.yml

where -e and the thing in '' are two separate args...

------
3xblah
[https://www.in-ulm.de/~mascheck/various/bourne_args](https://www.in-ulm.de/~mascheck/various/bourne_args)

------
perl4ever
I advise picking up the money that's closest and running while your dog holds
off the nymph and the rest.

------
throw7
At one time, I did think python would replace my bash shell.

Then I tried it. Either I'm an old dog or I was naive.

Both are true, I think.

------
marcacohen
Here's an easy way to see how this works. Run this script:

    echo dollar-star:
    for i in $*; do echo $i; done
    echo dollar-at:
    for i in $@; do echo $i; done
    echo quoted-dollar-star:
    for i in "$*"; do echo $i; done
    echo quoted-dollar-at:
    for i in "$@"; do echo $i; done

Then:

    ./x.sh a "b c" d
    dollar-star:
    a
    b
    c
    d
    dollar-at:
    a
    b
    c
    d
    quoted-dollar-star:
    a b c d
    quoted-dollar-at:
    a
    b c
    d

~~~
_kst_
[https://news.ycombinator.com/formatdoc](https://news.ycombinator.com/formatdoc)

Blank lines separate paragraphs. Text surrounded by asterisks is italicized,
if the character after the first asterisk isn't whitespace.

Text after a blank line that is indented by two or more spaces is reproduced
verbatim. (This is intended for code.)

Urls become links, except in the text field of a submission.

