
Things I Wish I'd Known About Bash - zwischenzug
https://zwischenzugs.com/2018/01/06/ten-things-i-wish-id-known-about-bash/
======
jordigh
Using readline is a great thing to know about too.

My favourite little-known readline command is operate-and-get-next:

[https://www.gnu.org/software/bash/manual/html_node/Miscellaneous-Commands.html](https://www.gnu.org/software/bash/manual/html_node/Miscellaneous-Commands.html)

You can use it to search back in history with C-r and then execute that
command with C-o and keep pressing C-o to execute the commands that followed
that one in history. Very helpful for executing a whole block of history.

For some reason, this documentation is hard to find! It's not here, for
example:

[http://readline.kablamo.org/emacs.html](http://readline.kablamo.org/emacs.html)

I'm a bit saddened when readline replacements don't implement C-o. For
example, the Python REPLs don't have it.

~~~
lillesvin
I've overridden ctrl-r in my local Bash to search with fzf[0] and I'm using my
history so much more now.

Didn't know about ctrl-o though, it sounds great! I hope that my ctrl-r
override doesn't somehow break it.

[0]: [https://github.com/junegunn/fzf](https://github.com/junegunn/fzf)

E: Fixed link.

~~~
CaptSpify
That link is a 404 for me?

~~~
tokenizerrr
It had a trailing >:
[https://github.com/junegunn/fzf](https://github.com/junegunn/fzf)

------
sethrin
I wrote a book on Bash too. The most important thing for anyone to know about
Bash is that it's intended as a command language, not a general purpose
scripting language. If it's longer than 10 lines, or if it uses two or more
variables, you should probably have written it in something other than Bash.

~~~
chucknelson
Would you consider non-trivial install scripts as an exception to this general
rule? I mean, you wouldn't write some install script in ruby or python, right?

~~~
ghettoimp
I don't know why not?

At the least, rather than Bash, you might consider Perl as a default, lowest
common denominator for scripts that need to run anywhere.

- It's nearly as ubiquitous as bash.

- It has approximately the same kinds of file/path operations built in.

- It has reasonably good support for strings/regexes/etc. all built-in, so
you don't have to call out to tools like sed/awk/grep all the time and hope
that they are available and compatible across your target platforms.

- It provides reasonably good arrays and hashes, which are horribly horrible
in bash.[1]

- You can use syscalls very easily if you really need to, but usually you
don't.

[1] Of course, no language can save you from the file system disaster
([https://www.dwheeler.com/essays/fixing-unix-linux-filenames.html](https://www.dwheeler.com/essays/fixing-unix-linux-filenames.html)),
but being able to know that "foo bar" is a string instead of two array
elements is a good start.

Mostly this all applies to Ruby or Python too, modulo perhaps the degree of
ubiquity.

~~~
gbacon
Great point about ubiquity.

Perl regexes are the best of breed that everyone else replicates — far better
than “reasonably good.”

The Perl erasure in this HN thread is startling.

~~~
grzm
> _"The Perl erasure in this HN thread is startling."_

_"Erasure"_ to me implies some active effort to remove Perl from discourse.
I don't see anything like that here: indeed, there are a number of positive
mentions, and no negative ones I see. Granted, Python and Ruby are both
mentioned more often, but neither of those is at Perl's expense. Am I
misunderstanding what you mean by 'erasure'?

~~~
gbacon
This is what Perl is designed to be. Perl unlike the others is almost certain
to be on any Unix or Linux installation. Several commenters leaving out Perl
in discussions of the next step up from bash scripts is truly strange.

I suppose being ignored beats the typical _herp-derp_ anti-Perl bigotry, but
I’d prefer all-around civility.

~~~
grzm
> _"Several commenters leaving out Perl in discussions of the next step up
> from bash scripts is truly strange."_

I'm having a hard time following you here. Do you think that they're doing so
for any reason other than that Perl is no longer their go-to tool? There are
communities where Perl is still used: PostgreSQL for example uses Perl for
some of its scripting, as well as its build farm tool, in particular because
of its portability on older systems.

That said, from what I've seen over the past 10 years or so, Perl hasn't had
much of a presence in areas where a lot of computer work in tech is being
done. For example, in cloud computing, or scientific computing, or machine
learning, or web frameworks. Please don't read this to mean that Perl
_couldn't_ be or isn't being used in these cases or wouldn't be a better fit. (As an
aside, I think Perl missed out a lot while a large portion of the community
was focused on Perl 6: there's only so much energy in a community, and that
absorbed on Perl 6 wasn't focusing on evangelism. But that's not something I'm
interested in litigating here.) Or that there isn't something a bit
frustrating in seeing the wheel reinvented time and time again. And so many
examples on the web use bash as a common denominator. This puts Perl further
out of mind if it's not already part of your everyday workflow. And how many
developers today have come of age without seeing Perl in their everyday
environments?

Consider the current forum. What's the percentage of front-page posts that are
about Perl or tools where Perl is a part of the tool chain? It would be
understandable for the people who frequent HN to not view Perl as their go-to.
I don't consider it uncivil for people to neglect to mention some other
language when it's not something they'd actually think of reaching for. It
seems the solution would be to share examples of where Perl provides
advantages, both in the comments here and in submissions to HN.

~~~
Too
Well said. Perl might be the theoretically best match in specifically this
problem domain, but the thing is there are only so many programming languages
one can learn.

If I had to choose _only one_ of Ruby/Python or Perl, I would choose the former
and it would be able to cover my bases _both_ as glue code and for larger
programs. Perl would maybe make the glue code a bit easier, but instead I would
be much less employable and have a much harder time finding other people who
can read the glue. I'm not qualified to have an opinion on Perl's capabilities
for other programs, but I'm sure there are valid reasons most people prefer
other alternatives.

------
joshbaptiste
Hang out in #bash on Freenode IRC and you will become a Bash jedi.
[http://mywiki.wooledge.org/BashFAQ](http://mywiki.wooledge.org/BashFAQ) is the
best resource IMO for quick Bash syntax lookups; I always need to refer to the
BashFAQ to remember parameter expansion sub-string retrieval.

    
    
      parameter     result
      -----------   ------------------------------
      $name         polish.ostrich.racing.champion
      ${name#*.}           ostrich.racing.champion
      ${name##*.}                         champion
      ${name%%.*}   polish
      ${name%.*}    polish.ostrich.racing
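The table can be checked directly in a shell; a minimal runnable version (a
common mnemonic: on a US keyboard, # sits to the left of $ and % to the right,
matching prefix vs. suffix removal):

```shell
#!/bin/bash
name=polish.ostrich.racing.champion
echo "${name#*.}"    # shortest match stripped from the front: ostrich.racing.champion
echo "${name##*.}"   # longest match stripped from the front:  champion
echo "${name%%.*}"   # longest match stripped from the back:   polish
echo "${name%.*}"    # shortest match stripped from the back:  polish.ostrich.racing
```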

~~~
itwy
They are needlessly rude and mean at #bash. A bunch of scumbags, actually.

~~~
LambdaComplex
I think it's a result of constantly dealing with people who ask for help,
receive good advice, and then ignore it

~~~
viraptor
If that annoys them, they can always quit. Being rude in this situation is
either a choice or a lack of ability to cope with stress. Neither excuses being
rude...

------
unixthrowaway50
The sections on quoting and globbing suggest this author, though obviously
trying to be helpful, isn't really knowledgeable enough to be writing such a
guide.

I suggest reading this instead:
[http://www.grymoire.com/Unix/Sh.html](http://www.grymoire.com/Unix/Sh.html)

Granted, it's about the Bourne shell, but since Bash, Korn, and every other
standard UNIX shell is supposed to be compatible with it, it's well worth learning.

And IMO, if you need more than what the Bourne shell provides, you should be
using a proper programming language like Python instead.

------
amelius
Bash has a huge number of little shortcuts that are difficult to learn. When
one encounters a sequence of symbols like $(...), it is difficult to Google
for its meaning. The reason shells nevertheless have these shortcuts is of
course because they _are_ shells: from the commandline it can be very
convenient to use shortcuts.

But, in my opinion, that's where it should stop: one shouldn't use a shell
language for scripting. In scripts, it is simpler to use more verbose and
clear constructs, because most editors are very powerful and provide shortcuts
themselves.

~~~
yorwba
Yeah, but why are you trying to use a search engine that cares less and less
about exact matches, when there's a manual?

    
    
        >man bash
        /\$\(  # search pattern needs escaping
        ...
        value is evaluated as an arithmetic expression even if the $((...)) expansion is not used (see Arithmetic Expansion below).   Word  split‐
        ...
        n      # go to next match
        Command Substitution
           Command substitution allows the output of a command to replace the command name.  There are two forms:
    
                  $(command)
           or
                  `command`
        (detailed description follows)
    

I'm still in favor of using more verbose and especially more clear constructs,
but not because they are easier for Google, but because they ideally hold
enough information on their own that you don't even need to look it up to know
what it does.

~~~
orev
Man pages are specifically reference documents, not tutorials or guidebooks.
To say one should use a man page is to say that one must completely digest the
entirety of the tool prior to ever actually using it. That’s just simply not
feasible, nor should it be expected of anyone beyond trivial tools. Man pages
simply don’t provide the context for solving a problem like a guidebook or
tutorial would, which is why there are so many sites that start with a problem
and then explain the tools.

~~~
tomsmeding
Which is why the parent advised to treat the man page like a reference
document, by searching in it. Some man pages are just badly written and are
indigestible even when searching for a specific thing, but in general, that
approach works quite often.

~~~
hyperpape
Is there some trick to searching man pages that I don’t know? Because my usual
experience is:

    
    
      type man foo
      type /-p
      type n n n n n n n n n
    

as there are a bunch of matches like “...does bar when combined with -p...”

A presentation of man pages that used hypertext would make me a lot happier.

~~~
Pete_D
If foo has a Texinfo manual (GNU tools like bash usually do) then you can try
`info foo` and search the index with i or I for -p. Texinfo manuals also have
hyperlinks you can press enter on.

info is a greatly underused system and I'd recommend that any *nix user spend
some time learning how to navigate it.

~~~
hyperpape
Thanks. Is there a good way to open that in a browser, rather than a console?

~~~
teddyh
In general, the best way to read Info documentation is inside Emacs.

~~~
oblio
"If you are a bash newbie, you should read the bash manual. If you want proper
search for the manual, you should use info. If you want proper use of info,
you should use Emacs."

Kind of a deep rabbit hole, isn't it?

~~~
teddyh
Not in this case; the bash manual is not in Info form, but a regular old-style
Unix man page.

------
chriswarbo
For me, the biggest gotcha in bash is whether or not a sub-process/shell will
be invoked, which can affect things like mutable variables and the number of
open file handles. For example:

    
    
        COUNT=0
        someCommand | while read -r LINE
                      do
                        COUNT=$(( COUNT + 1 ))
                      done
        echo "$COUNT"
    

This will always print `0`, since the `COUNT=` line will be run in a sub-
process due to the pipe, and hence it can't mutate the outer-process's `COUNT`
variable. The following will count as expected, since the `<()` causes
`someCommand` to run in a sub-process instead:

    
    
        COUNT=0
        while read -r LINE
        do
          COUNT=$(( COUNT + 1 ))
        done < <(someCommand)
        echo "$COUNT"
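
A third option, if you can rely on bash 4.2 or newer: `shopt -s lastpipe` runs
the final element of a pipeline in the current shell. Note it only takes effect
when job control is off, i.e. in scripts rather than interactive shells:

```shell
#!/bin/bash
shopt -s lastpipe   # bash >= 4.2; no effect in interactive shells
COUNT=0
printf 'a\nb\nc\n' | while read -r LINE
do
  COUNT=$(( COUNT + 1 ))
done
echo "$COUNT"       # prints 3: the loop ran in this shell, not a subshell
```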
    

Another issue I ran into is `$()` exit codes being ignored when spliced into
strings. For example, if `someCommand` errors-out then so will this:

    
    
        set -e
        FOO=$(someCommand)
        BAR="pre $FOO post"
    

Yet this will fail silently:

    
    
        set -e
        BAR="pre $(someCommand) post"

~~~
maksimum
I think I've run into the first issue you describe, and I'm having a hell of a
time trying to understand it. Would you mind taking a look at my example and
helping me out?

Consider the following:

    
    
      cd /tmp/
      echo -e "hello world\nhello world\n:)" >> hello.txt
      cat hello.txt
    

Outputs:

    
    
      hello world
      hello world
      :)
    

Then running

    
    
      bash -c 'sed s/"hello"/"hiiii"/ hello.txt | tee hello.txt'
      cat hello.txt
    

yields

    
    
      hiiii world
      hiiii world
      :)
    

Resetting to the original `hello.txt` and running

    
    
      ssh localhost -t 'cd /tmp && sed s/"hello"/"hiiii"/ hello.txt | tee hello.txt'
      cat hello.txt
    

yields an empty file.

Replacing the command with

    
    
      ssh localhost -t 'cd /tmp && sed s/"hello"/"hiiii"/ hello.txt >> hello.txt'
    
    

yields

    
    
      hello world
      hello world
      :)
      hiiii world
      hiiii world
      :)
    
    

I'm trying to figure this out so I can publish a script to set up a test env
for a package I'm trying to publish, and I'm somewhat stuck on this step...

~~~
mkl
You're using hello.txt as both an input file and an output file, which seems
like it's just asking for race-condition problems.

What about using sed -i to make changes to the file, and not doing any bash
redirection?

Alternatively, if the real problem is more complex than that, try using a
different name for the output file and rename after it's finished.
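
A sketch of both suggestions (hello.txt and the pattern are placeholders; note
that BSD/macOS sed spells the in-place flag `-i ''`):

```shell
# In-place edit: sed handles the temporary file and rename for you (GNU sed)
sed -i 's/hello/hiiii/' hello.txt

# Or rename explicitly, which avoids the read/write race entirely
sed 's/hello/hiiii/' hello.txt > hello.txt.new && mv hello.txt.new hello.txt
```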

------
lvillani
shellcheck ([https://www.shellcheck.net](https://www.shellcheck.net)) is an
absolute godsend when writing Bash/POSIX sh scripts.

It catches so many errors that I think it's a must have in every programmer's
toolbox. It even catches bash-isms when you are targeting POSIX sh. It saved
me many many hours of grief trying to debug shell scripts I wrote and changed
the way I write them for the better.

~~~
gkya
Targeting POSIX as much as possible is really important if you don't want to
force Bash on people, especially with open-source public code. Many OSes don't
have Bash in the default install, but many just assume that bash is available
on all the target systems.

~~~
Aloha
like which?

I'm hard pressed to think of a modern unix that doesn't include bash by
default. Solaris maybe?

~~~
nisa
Everything embedded: OpenWRT/LEDE uses busybox with ash; Android uses mksh,
which only supports a subset of bash features. Every Debian/Ubuntu has dash as
/bin/sh, which only supports POSIX.

~~~
Aloha
I'd argue that writing scripts for an embedded target and for a regular
server/workstation target are fundamentally different problems: almost
anything I'd write for those platforms would be targeted for them, not for
general purpose unix.

Portability is only desirable if you need portability; otherwise it
frequently adds complexity for little return benefit.

------
tomxor

        `` vs $()
    

I really appreciate authors who attack ambiguity or non-obvious equivalence in
a subject head on. It's one of those aspects of explaining that takes to heart
the perspective of the learner.

Another recent example I encountered is from the online book
[http://neuralnetworksanddeeplearning.com](http://neuralnetworksanddeeplearning.com),
there are some confusing ambiguities and plainly misleading contradictions in NN
terminology, one being: multi-layer perceptrons do not use perceptron
neurons. These things can really screw with you while you're learning,
especially the more subtle ones.

Some authors seem to have the ability to have full empathy for the novice
while retaining expert knowledge and deep understanding by being able to both
predict and answer relevant questions at the right point in an explanation at
the right level of detail.

------
jwilk

        rename -n 's/(.*)/new$1$2/' *
    

There's a chance this won't work on your system.

There are two incompatible versions of rename in the wild:

1) The Perl one: [https://metacpan.org/pod/distribution/File-Rename/rename.PL](https://metacpan.org/pod/distribution/File-Rename/rename.PL)

2) The one from util-linux: [http://man7.org/linux/man-pages/man1/rename.1.html](http://man7.org/linux/man-pages/man1/rename.1.html)

Debian (and derivatives) ship the former; other Linux distros likely the
latter.

~~~
falsedan
I think this is not specific to the shell you are using, so it happens in
bash, fish, ksh, zsh, and so on.

~~~
gbacon
It’s misleading because rename is a command that accepts regex arguments. The
shell isn’t doing anything more sophisticated with the quoted argument than
passing it to rename as a positional parameter.

The shell of course does expand unquoted globs.

------
dorfsmay
Two important things missing:

1) This is a huge pet peeve of mine, but it kills me when my coworkers "bash"
emacs and sing the praises of vim, then proceed to explain to me that ctrl-R in
bash searches for the last command with a given pattern. They also often
refuse to believe me that they're basically using emacs controls (because, you
know, emacs is dirty). So, please, if you're in love with vim and use ksh or
bash, learn about "set -o vi". Oh, and stop preaching!

2) "help xxx" for any xxx bash functionality, right there from the command
line!

~~~
throwanem
> they're basically using emacs controls

But not well implemented! You're supposed to be able to edit your search
string, and I've never found a way.

~~~
LukeShu
Huh? Backspace works the same for me in Bash C-r as it does in Emacs C-r.

~~~
throwanem
It never does anywhere for me. Good to know it's supposed to, though; maybe
I've got something mapped weird, or the versions of the shell I'm using are
old enough to be unwelcoming in this way, or I don't know what, but knowing
it's not by design means there's a fix to be found for it. Thanks!

~~~
jmiserez
You should try hstr
([https://github.com/dvorka/hstr](https://github.com/dvorka/hstr)). It
replaces CTRL-R with a full page interactive history search that really works.

Demo GIF here:
[https://unix.stackexchange.com/a/375914](https://unix.stackexchange.com/a/375914)

~~~
dorfsmay
If you use either vim or emacs, there's something to be said about using that
knowledge for your command line history.

~~~
jmiserez
hstr has a vi mode.

------
Sir_Cmpwn
#0: If you're writing scripts that are destined for other users, use POSIX sh
instead.

~~~
reacharavindh
Do you know of any learning resources that are strictly POSIX sh instead of a
specific shell? I'm looking to learn and it will be very helpful.

~~~
Sir_Cmpwn
How about the standard?

[http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3...](http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html)

------
jwilk

        $ grep somestring file1 > /tmp/a
        $ grep somestring file2 > /tmp/b
        $ diff /tmp/a /tmp/b
    

You shouldn't do that, but not because it's not neat enough. /tmp is world-
writable, so you might be writing to somebody else's file, or over a symlink
that was set up by someone else. Use mktemp¹ for creating temporary files.

¹ [http://man7.org/linux/man-pages/man1/mktemp.1.html](http://man7.org/linux/man-pages/man1/mktemp.1.html)
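
A minimal safe version of the same comparison, using mktemp and a cleanup trap
(file1/file2 are placeholders):

```shell
#!/bin/sh
a=$(mktemp) || exit 1
b=$(mktemp) || exit 1
trap 'rm -f "$a" "$b"' EXIT   # remove the temp files on exit

grep somestring file1 > "$a"
grep somestring file2 > "$b"
diff "$a" "$b"
```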

~~~
erk__
Could you not do that with pipes instead, something like:

    
    
        $ diff <(grep somestring file1) <(grep somestring file2)

~~~
eduren
That's what the article is recommending.

------
jwilk
> if [ x$(grep not_there /dev/null) = 'x' ]

This is still wrong if the command can output spaces or meta-characters. You
should quote the left operand, and then you don't need to prepend x:

if [ "$(grep not_there /dev/null)" = '' ]

~~~
gkya
Does the second only work with bash? Because IIRC it didn't work with
FreeBSD's /bin/sh, you needed the initial x too.

~~~
jwilk
No, this should work in any POSIX-compliant shell.

------
y7
The author mentions the substitution !:1-$ to insert all the arguments from
the last command (like !$ substitutes the last argument and !! the full
command). Note that !* does exactly the same thing, which is a bit easier to
type/remember.

------
Nimitz14
..it'd be nice if what was actually happening was explained instead of just
statements like "[ is the original form for tests, and then [[ was introduced,
which is more flexible and intuitive"

~~~
falcolas
I recommend reading the section on Conditional Expressions in `man bash`. It's
remarkably readable and useful for a man page. It also cleanly explains the
difference between the two.

------
luckydude
+1 for the parts that are portable to Bourne shell/ksh/zsh.

-1000 for the parts that are specific to Bash. Stuff like that has been a huge
pain in my ass over the years. Some clever programmer uses some bash-ism and
the build breaks on some ancient hardware that doesn't have bash.

I realize my complaint sounds like Henry Spencer's Ten Commandments and
perhaps it feels outdated but trust me, you don't want to wade into thousands
of lines of shell to track down why something doesn't work on the stupid AIX
box.

~~~
coldtea
> _Stuff like that has been huge pain in my ass over the years. Some clever
> programmer uses some bash-ism and the build breaks on some ancient hardware
> that doesn 't have bash._

Shouldn't the problem be the "ancient hardware that doesn't have bash" itself?

~~~
stephenr
A brand new macbook pro will not have bash4.

~~~
lillesvin
The machine may not be ancient but I would still argue that the problem there
is the ~10 year old version of Bash that Apple has decided to ship rather than
the programmers that use features added to the shell within the last 10 years.

(I should add that I don't know what version High Sierra ships with but Sierra
seemed to ship with 3.2.5x-ish which was 9 years old at the time.)

~~~
stephenr
Well put it this way:

You can assume everyone has a modern bash, and make it the end users problem
if they don't, or you can write portable shell scripts and know it will work.

Honestly the things you can't do in posix shell compared to bash border on
"use a fully featured language" anyway.

~~~
lillesvin
Thing is that Bash is something you can assume to be reasonably widely
available[0] — like Perl or Python — but I wouldn't expect to have to avoid
any features from the last 10 years of either of those two. Sure, a certain
grace period is to be expected but I think 10 years is way past that.

[0]: I know POSIX is supposed to be even more widely available but depending
on what you're targeting then it may not be the best option [source:
[https://en.wikipedia.org/wiki/POSIX#POSIX-oriented_operating_systems](https://en.wikipedia.org/wiki/POSIX#POSIX-oriented_operating_systems)].

~~~
stephenr
Afaik neither perl nor python is gpl3-only licensed.

That's the blocker on macOS.

------
teddyh
If you use a thing a lot, you should probably invest some time in reading the
manual for it.

The manual for bash consists of its Unix-style "man" page, and is therefore
more of a reference than an instruction manual. I suggest using the _Advanced
Bash-Scripting Guide_
([http://tldp.org/guides.html#abs](http://tldp.org/guides.html#abs)).

------
jorams
I really wonder why the author didn't just leave off the first two, and call
it "Eight Things I...". Starting off with "backslash escapes can be confusing"
and "/* doesn't just match things consisting of 0 or more slashes!" makes no
sense if you're later going to skip over !! because it's "obvious".

------
Exuma
This is the most helpful bash diagram ever... why didn't I search for this
before! [https://zwischenzugs.files.wordpress.com/2018/01/shell-startup-actual.png](https://zwischenzugs.files.wordpress.com/2018/01/shell-startup-actual.png)

~~~
tzs
As given in the article, it's also the most annoying bash diagram ever... it
needs an explanation of what the 7 different colors of arrows mean. All that
was given is:

> It shows which scripts bash decides to run from the top, based on decisions
> made about the context bash is running in (which decides the colour to
> follow).

> So if you are in a local (non-remote), non-login, interactive shell (eg when
> you run bash itself from the command line), you are on the ‘green’ line
> [...]

With a bit of Googling I believe I found the origin of that diagram:

[https://blog.flowblok.id.au/2013-02/shell-startup-scripts.html](https://blog.flowblok.id.au/2013-02/shell-startup-scripts.html)

The author there explains how the colors work:

> Fortunately, I’ve read the man pages for you, and drawn a pretty diagram. To
> read it, pick your shell, whether it's a login shell, whether it's
> interactive, and follow the same colour through the diagram. When the arrows
> split out to multiple files, it means that the shell will try to read each
> one in turn (working left to right), and will use the first one it can read.

~~~
kerny
This diagram is not entirely correct.

The remote bash startup order is further complicated by the existence of a
compile time flag SSH_SOURCE_BASHRC. This flag determines if a remote non-
interactive shell will load the ~/.bashrc file.

This flag is turned off by default and stays off in some distributions (like
Archlinux), but is turned on in others (Debian, Fedora, ...) to replicate very
old rsh behaviour.

------
seanwilson
> !$ - I use this dozens of times a day. It repeats the last argument of the
> last command.

Press ESC then full stop instead. Less key presses.

~~~
tux1968
Fewer key presses, but doesn't work if you're using vi bindings.

~~~
mkl
ESC _ or M-_ does work in vi mode though (and does the same thing as ESC
./M-.). Search for "yank-last-arg" in the bash manual.

~~~
tux1968
Thank you.

Went searching how to do it in Zsh as well when the Zshell Line Editor (zle)
is configured in vi mode:

$ bindkey -M viins '\e.' insert-last-word

Will make ESC-. work from insert mode.

------
JepZ
One of my favorite Bash patterns is the following. I call it 'feed the fish':

    
    
      . <(curl https://example.com/trusted.sh)
    

While it is _extremely dangerous_ I found it so easy to remember, that it
stuck in my head. It downloads the script and executes it in the current
shell. So if anything unexpected happens you probably have a real problem ;-)

Background: A few years ago I was writing a Bash based OS installer. So after
booting from a live CD I had to fetch the installer and execute it, which lead
me to using that pattern frequently.

While I love it, I can't stress enough how dangerous it is.

------
wodenokoto
What was the surprising part about

    
    
        echo '*'
        echo "*"
    

? Both print an asterisk. Is '*' some sort of BASH variable?

~~~
Sniffnoy
The asterisk * is a glob. Since double-quotes allow variables inside to be
dereferenced rather than always quoting everything literally, presumably he
was expecting that double-quotes might still allow globs to work rather than
causing them to be quoted, when in fact all quotes will quote globs whether
single or double. So he was expecting it might print out "a".
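
All three behaviours can be seen in a scratch directory:

```shell
cd "$(mktemp -d)"   # empty scratch directory
touch a b
echo *              # unquoted: the glob expands to the file names: a b
echo '*'            # single quotes: a literal *
echo "*"            # double quotes: still a literal * (only $, `, \ stay special)
```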

------
earenndil

        if [ x$(grep not_there /dev/null) = 'x' ]
    

See now, I never get why people do this. -z has existed forever.
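
The `-z` version of the same test (true when the string is zero-length), with
the substitution quoted:

```shell
if [ -z "$(grep not_there /dev/null)" ]
then
  echo "no match"
fi
```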

~~~
cjhanks
Because its acronym is very unclear. I think it stands for "zero", but I am
not sure.

------
mlinksva
re 9) I feel better about always feeling at least slightly confused about what
files are being sourced.

The graph included seems to originate from
[https://blog.flowblok.id.au/2013-02/shell-startup-scripts.html](https://blog.flowblok.id.au/2013-02/shell-startup-scripts.html)

~~~
kerny
That graph is not entirely correct:
[https://news.ycombinator.com/item?id=16088866](https://news.ycombinator.com/item?id=16088866)

------
hiisukun
I didn't see it mentioned, but I use it frequently so here is my tip. I'm not
sure if it is bash specific (I just use it!).

Instead of typing !$ for the previous command's final argument, you can use
the keyboard shortcut alt+. (alt+period). Pressing it multiple times will go
to the last argument of previous commands. I use this quite a bit and found it
easier than !$, because you can see which command it will be : )

I still don't always understand exactly what is happening with subprocesses vs
subshells (chriswarbo's post is very useful in pointing out how wrinkly this
can be), so I try to keep my bash usage simple.

------
aplorbust
Blog states that author has 20 years of development experience.

Blog post suggests he knew little of basic shell scripting until recently.

Blog also reveals he is selling a book on shell scripting with Bash, "Learn
Bash the Hard Way."

~~~
grzm
> _"Blog post suggests he knew little of basic shell scripting until
> recently."_

I don't see how you could come to this conclusion based on the title. From the
article, the author elaborates:

> _"Recently I wanted to deepen my understanding of bash by researching as
> much of it as possible."_

There seems to be a lot of good commentary in this thread indicating that
people are finding the post useful. There are points here that I have found
useful myself. Do you have specific disagreements with the post contents?

------
gumby
If you want all the args to the previous (or earlier) command just use ! _.
(e.g. a common idiom for me is cat `!_ `). Or you tried a git mv out of habit
but the dir isn’t being managed by git: !gi: _

~~~
gumby
Oops that’s the asterisk (star character) that HN interpreted as italics. In
hacker news formattingnmy two examples are

    
    
      `!*` and !gi:*

------
BeetleB
This is why I use xonsh ([http://xon.sh](http://xon.sh)). It lets me use
Python as a shell, and I don't have to remember too much awkward syntax.

------
brianlund
I wasted a lot of time on this one:

If you declare a local variable and set it in the same step, e.g. local
MYVAR=$(/bin/false), the return code you get is from the local declaration,
not from assigning a value to the variable. It can be quite confusing when you
afterwards check the return code with $? and it returns 0. Avoid it by
assigning the value in a separate command.
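
A small script showing the difference (the function name is arbitrary):

```shell
#!/bin/bash
f() {
  local combined=$(/bin/false)   # $? is the status of 'local' itself
  echo "combined: $?"            # prints: combined: 0

  local separate
  separate=$(/bin/false)         # now $? comes from the command substitution
  echo "separate: $?"            # prints: separate: 1
}
f
```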

------
walshemj
Interesting, but from a modern perspective isn't perl a better scripting
language to learn?

I have never used bash scripts professionally, i.e. as a language rather than a
simple script with just a command in it.

Likewise, back in 87 or so I got trained in sed and I have only used it once
since, and that was when I was playing around with early Linuxes (when they
came on a huge number of floppies)

~~~
taormina
You have a terminal if you're on a Mac or Linux machine. It takes bash
commands by default. Knowing perl is good, but knowing more bash is always
good.

I hadn't known that <(echo "hi") is treated as a file containing the command's
output, which simplifies commands that take files as arguments.
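
Concretely, `<(cmd)` expands to a path (typically under /dev/fd) that the
outer command can open like any other file (file1/file2 below are
placeholders):

```shell
# cat receives a file name argument, not data on stdin
cat <(printf 'hi\n')

# handy wherever a command insists on file arguments
diff <(sort file1) <(sort file2)
```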

~~~
walshemj
I meant using bash as a scripting language not as a shell.

------
kerny
9) The remote bash startup order is further complicated by the existence of a
compile time flag SSH_SOURCE_BASHRC. This flag determines if a remote non-
interactive shell will load the ~/.bashrc file.

This flag is turned off by default and stays off in some distributions (like
Archlinux), but is turned on in others (Debian, Fedora, ...) to replicate very
old rsh behaviour.

------
saagarjha
I've been looking for something like <() for a long time. Thanks for sharing
it!

Regarding :h, is it any different than just using dirname?

------
martincmartin
If you "set -e", you probably also want to "set pipefail". By default, a
pipeline returns the return value of the last element. pipefail means that if
any element of the pipeline fails, then the pipeline as a whole will fail. I
discovered this the hard way when I had:

make run-asan-test | c++filt

And even if the tests failed, the script would succeed.
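A minimal demonstration of the difference:

```shell
#!/bin/bash

false | true
echo "default:  $?"    # prints 0: only the last command in the pipe counts

set -o pipefail
false | true
echo "pipefail: $?"    # prints 1: any failing stage fails the whole pipe
```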

------
indigodaddy
Can someone explain :h? The article's description didn't make clear to me
what was going on there.

~~~
mmjaa
Yeah, I couldn't make that work for me (on Linux or macOS), although I'd love
it if there were a way to quickly get 'just the directory' or 'just the
filename' in bash with a shortcut, instead of having to resort to $(dirname
blah) and so on. I'm sure there is some way, but :h doesn't look to be the
shortcut as expected.

~~~
lloeki
Assuming _blah_ is in a var named _name_ , you can use:

    
    
        dirname => "${name%/*}"
        basename => "${name##*/}"
    

Sadly you can't use it with !$ since it's not a variable. The closest you can
do is:

    
    
        $ ls foo/bar/baz
        ls: foo/bar/baz: No such file or directory
        $ last=!$; echo "${last##*/}"
        baz
        $ echo "${last%/*}"
        foo/bar

------
i_feel_great
What are the best alternatives to bash for writing fairly large automation
scripts? I have already looked at Python, Lua and Guile, the last two
especially since I like using them and they have some sort of POSIX
interfaces. I haven't looked at Perl 6.

~~~
gbacon
Why didn’t you consider Perl 5?

------
mehrdadn
The trickiest part of Bash I know is that "$(command)" results in the
truncation of the output of the command before the newline. It's both handy
and damning depending on what you're trying to do.

~~~
bewuethr
I don't think that's correct, do you have an example? This works for me:

    
    
      $ var=$(echo $'a\nb')
      $ echo "$var"
      a
      b

~~~
mehrdadn
Oh, I meant before the _trailing_ newlines, sorry for being unclear. Try using
a\nb\n\n\n instead of a\nb and observing that the output doesn't change.
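A short sketch of the behavior, plus a common workaround (the sentinel trick) for when the trailing newlines matter:

```shell
#!/bin/bash

var=$(printf 'a\nb\n\n\n')
echo "${#var}"    # prints 3: command substitution stripped ALL trailing newlines

# Workaround: append a sentinel character, then strip it off afterwards
var=$(printf 'a\nb\n\n\n'; printf x)
var="${var%x}"
echo "${#var}"    # prints 6: the three trailing newlines survive
```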

~~~
bewuethr
Ah, okay, got it. Yes, that can be surprising.

------
emmelaich
I haven't used the ! history for many years. It's simply easier and faster to
use command line editing.

! was useful before command line editing; but not much since.

------
thibran
Does fish-shell have an equivalent for '<()'?

~~~
0942v8653
[https://fishshell.com/docs/current/commands.html#psub](https://fishshell.com/docs/current/commands.html#psub)

~~~
thibran
Thanks a lot. Shells have so many features it's easy to miss one.

~~~
lloeki
Watch out, there are some limitations to fish's psub that prevent it from
working like bash's >()

[https://github.com/fish-shell/fish-
shell/issues/1786](https://github.com/fish-shell/fish-shell/issues/1786)

------
luord
I didn't know a few of these, and I think I'll find `<()` especially useful.

------
wand3r
I understand "the hard way" is a commonly used phrase but it does seem a bit
infringing on Zed Shaw's entire series Learn Code the Hard Way. Easily
confusing. Other than that, good work

------
ausjke
bash, or a minimal relative such as ash, is critical for embedded systems
where python/perl etc. are too fat.

------
zbentley
The thing I wished I had learned earlier is "quick and dirty assertions". If
you write lots of functions in Bash, you quickly end up getting tripped up by
cases where an argument is omitted and the function does something totally
batshit given the missing (empty string) argument. Now, the canonical way to
handle this is to put validators on your input, (and make sure those
validators don't crash with cryptic errors if someone calling your function is
using "set -u") like so:

    
    
      myfunc() {
        local foo="${1:-}"
        if [ -z "$foo" ]; then
          echo "Invalid first parameter!" >&2
          return 127
        fi
        ...
      }
    

...but _man_ , that's time consuming when you have lots of parameters.

Instead, the quick and dirty way is to just "assert" via [parameter
expansion]([https://www.gnu.org/software/bash/manual/html_node/Shell-
Par...](https://www.gnu.org/software/bash/manual/html_node/Shell-Parameter-
Expansion.html)):

    
    
      function myfunc() {
        local foo="${1:?First parameter must be provided}"
        ...
      }
    

Much quicker, especially when throwing things together in a hurry. It has a
gotcha, though: ":?" assertion doesn't cause a function to return early, it
_shuts down the whole interpreter_ after outputting the error. So it's more
like a true assert() statement than an input validator. If you'd only ever
call your function in a subshell, this won't matter (because the subshell will
exit early with a nonzero code, big deal), but otherwise it can be a nasty
surprise to users when an argument-validation issue inside a function shuts
the program down. Then again, the "return 127" in the first example would also
shut the program down if someone was using "set -e".
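A small sketch of that difference, using a hypothetical myfunc like the one above:

```shell
#!/bin/bash

myfunc() {
    local foo="${1:?First parameter must be provided}"
    echo "got: $foo"
}

myfunc hello    # prints: got: hello

# In a subshell, the failed ':?' expansion only kills the subshell...
( myfunc ) 2>/dev/null || echo "subshell exited with status $?"

# ...so the parent keeps going. Called directly with no argument,
# myfunc would have shut down this entire script instead.
echo "parent still running"
```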

...and while we're on the subject of "set -e", I think that the ["unofficial
Bash strict mode"]([http://redsymbol.net/articles/unofficial-bash-strict-
mode/](http://redsymbol.net/articles/unofficial-bash-strict-mode/)) (putting
"set -euo pipefail" and "IFS=$'\n\t'" at the top of your scripts) has been a
bigger bug-prevention and rapid-development aid to me than anything else. To be
clear, I think it's a means of detecting _some_ kinds of bugs. I've read
Wooledge and others' objections to those patterns, especially "set -e", and
agree with the point that this does not make your programs _objectively safer_
and shouldn't be counted on as a crutch. Then again, neither does a linter,
but it still helps you detect and avoid some kinds of bugs, so why not use it?
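For reference, the whole "strict mode" preamble is just a few lines at the top of a script:

```shell
#!/bin/bash
set -euo pipefail   # exit on error, on unset variables, and on pipeline failure
IFS=$'\n\t'         # word-split only on newlines and tabs, not on spaces

echo "strict mode enabled"
```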

~~~
williamdclt
I usually do something like that:

    
    
        [ -z "$1" ] && echo "Invalid first parameter!" >&2 && exit 127
    

as a precondition. I must admit that it's longer to write, but I can write a
bunch of preconditions for my function and then write the logic with an
appeased mind.

------
meow_mix
It's 2018 and we're still talking about writing non-trivial bash scripts?

~~~
mbrock
It's 2018 and bash is still the most convenient language for an enormous
number of tasks. Funny, that...

------
zbentley
Something I wish people teaching intermediate or advanced Bash tricks would
emphasize more is how to make your program compatible with other shells. With
the rise of Zsh's popularity, and the switch to Dash for Ubuntu/some Debian
derivatives, I see a lot of people repeating bashisms in code they share
without the knowledge that a) their code may not work for an unexpectedly
large number of people, and b) switching to compatible equivalents doesn't
make their code worse or less performant in many/most cases.

The most common bashisms and ways to avoid them are:

\- Double brackets ([[) around conditions. Yes, I know that [ is a program
(don't believe me? "which ["). That doesn't mean Bash runs that external
program; it uses a builtin [ which covers (almost) everything you'd use [[
for. Prefer plain [ and your code will work in zsh/dash/all other POSIX
shells. And while you're at it, stop using "which" as an authority for "is
this a shell builtin or not?"
[type()]([http://linuxcommand.org/lc3_man_pages/typeh.html](http://linuxcommand.org/lc3_man_pages/typeh.html))
is your friend.

\- When comparing strings for equality, use a single equals sign "=", not "=="
(e.g. 'if [ "$foo" = "some string" ]'). I know it feels dirty if you've
programmed in any other language, but it changes nothing about your code's
behavior and makes it compatible with several other shells.

\- Don't use "function funcname()" syntax. It adds nothing over the basic
"funcname()" syntax, but prevents your code running in many/most non-Bash
shells. And consider putting your function-opening brace on a separate line
(someone once told me that there are shells that won't accept any other
function declaration style, but I've never seen one, so ymmv).

\- Don't use "local" if you need to interoperate with ksh. Abandoning "local"
pollutes global namespaces, though, so your call.

\- Don't use substring expansion (e.g. extracting the 4th-10th characters of a
string via 'substr="${somevar:3:7}"'). That's not supported in many other
shells. Alternatives include sed/awk/etc., or, if invoking external programs
is absolutely unacceptable to you, something horrific like:

    
    
        substr()
        {
            local input="${1:?String is required}"  
            local dist_from_start="${2:?Start position is required}"
            local dist_from_end="${3:-${#input}}" # Here, it's actually 'offset', not distance.
            local start_nulls=
            local end_nulls=
    
            dist_from_end=$(( 5 * (${#input} - ($dist_from_start + $dist_from_end)) ))
            dist_from_start=$(( 5 * $dist_from_start ))
    
            # Make a string of the regex for "any not null character" that "masks" the
            # characters in the input before the start point, and the characters after
            # the end of the substring. This is disgusting, and is only done because the
            # parameter expansion statements can't contain repetitions (e.g. [^\0]{5})
            # without the bash-only 'extglob' shell option.
            # The not-null character is used because it will never match in a shell
            # string.
            while true; do
                if [ "${#start_nulls}" -lt $dist_from_start ]; then
                    start_nulls="${start_nulls}[^\0]"
                elif [ "${#end_nulls}" -lt $dist_from_end ]; then
                    end_nulls="${end_nulls}[^\0]"
                else
                    break
                fi
            done
    
            input="${input#$start_nulls}"
            echo "${input%$end_nulls}"
    
        }
    
        substr "$@"
    

...actually, please never use that code. Ew.

Anyway, some more bashisms:
[https://mywiki.wooledge.org/Bashism](https://mywiki.wooledge.org/Bashism)
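Putting those points together, a hedged sketch of a small function written without any of the bashisms above; it should run under dash, ksh, zsh and bash alike (names are illustrative):

```shell
#!/bin/sh
# Portable sketch: no [[, no ==, no "function" keyword, no "local",
# no ${var:offset:length}.

greet()
{
    name="${1:-world}"                # no "local": ksh88 doesn't have it
    if [ "$name" = "world" ]; then    # single [ and single =
        echo "hello, world"
    else
        echo "hello, $name"
    fi
}

greet           # prints: hello, world
greet shell     # prints: hello, shell
```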

------
tzahola
My take on the same topic:

\- use the unofficial strict mode: [http://redsymbol.net/articles/unofficial-
bash-strict-mode/](http://redsymbol.net/articles/unofficial-bash-strict-mode/)

\- use parameter substitutions like ${foo#prefix}, ${foo%suffix} instead of
invoking sed/awk

\- process substitution instead of named pipes: <(), >()

\- know the difference between an inline group {} and a subshell ()

\- use printf "%q" when passing variables to another shell (e.g. assembling a
command locally and executing it via SSH)
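For the last point, a sketch of what printf "%q" buys you; the ssh invocation is left commented out as a hypothetical:

```shell
#!/bin/bash

file='my file; rm -rf /'    # a hostile-looking value

# Naive interpolation would let the remote shell re-split and
# re-evaluate it:
#   ssh host "ls $file"     # would run rm -rf / on the remote side!

# %q re-quotes the value so it survives one extra round of parsing
cmd=$(printf 'ls -- %q' "$file")
echo "$cmd"                 # one safely escaped word after 'ls --'
# ssh host "$cmd"           # hypothetical remote invocation
```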

~~~
oblio
Parameter substitution is far from intuitive. Every time I have to open one of
my older shell scripts (written >3 months ago), I'm thankful I wrote comments.
Otherwise I'd have to man/google things again. I really wish they'd used
function names or something instead (sub, etc.).

~~~
tzahola
My "mnemonic" is that # means _prefix_ , because every shell script _starts_
with a shebang too. From this I can deduce that ## means longest prefix,
therefore % and %% means suffix.
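The mnemonic in action (the path is just an example):

```shell
#!/bin/bash

path="dir/subdir/file.tar.gz"

echo "${path#*/}"     # shortest prefix removed: subdir/file.tar.gz
echo "${path##*/}"    # longest prefix removed:  file.tar.gz
echo "${path%.*}"     # shortest suffix removed: dir/subdir/file.tar
echo "${path%%.*}"    # longest suffix removed:  dir/subdir/file
```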

------
carapace
Damn it, this boils down to RTFM, specifically the bash man page.

~~~
falsedan
The real 'Learn bash the hard way' is to read the man page top to bottom every
6 months. Ye gods, I've read it so many times…

Also: lol at HN downvoting advice to read a tool's docs. Bash is terrible for
a lot of reasons, but not because of its lack of informative documentation.

~~~
jwilk
From the HN guidelines:

 _Please don't comment about the voting on comments. It never does any good,
and it makes boring reading._

~~~
falsedan
I've read the guidelines; you can just downvote instead.

------
clishem
If you ask me, you shouldn't call your book 'Learn X the Hard Way' if it
isn't (also) freely available online.

