
Surprising Bash Variables - zwischenzug
https://zwischenzugs.com/2019/05/11/seven-surprising-bash-variables/
======
noisy_boy
Interesting list (not very surprising to me personally, though it had been a
long time since I'd seen $REPLY mentioned anywhere). One that surprised me
recently (in a how-come-I-didn't-know-about-it-for-so-long way) was
$PIPESTATUS: if you are running a pipeline "cmd1 | cmd2" and would like to
know the return status of cmd1, you can use ${PIPESTATUS[0]} to get it.
Useful, e.g., when you are tee-ing the output.
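
A minimal sketch of that: the pipeline's own exit status is tee's, but
PIPESTATUS still holds the status of every stage. Note it must be copied
immediately, since the very next command resets it.

```shell
false | tee /dev/null
# Copy right away: running any other command overwrites PIPESTATUS.
statuses=( "${PIPESTATUS[@]}" )
echo "false exited ${statuses[0]}, tee exited ${statuses[1]}"
```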

~~~
teddyh
I would recommend _against_ using $REPLY, for two related reasons:

1\. It’s confusing to do a plain “read” and later have a magic $REPLY variable
show up, seemingly from nowhere. It has shades of Perl, and I mean nothing
positive with that statement.

2\. Names are good. By setting a name to the thing which is read, you make the
later code easier to read. It’s easier to understand “echo $filename” than
“echo $REPLY”, so if you’re reading filenames, do “read filename”. (Or,
rather, you probably want to do “read -r filename”, but that’s a different
subject.)
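
To illustrate the difference (using here-strings rather than interactive
input, and a filename I made up):

```shell
read -r <<< "notes.txt"            # value lands in the magic $REPLY
echo "$REPLY"
read -r filename <<< "notes.txt"   # a name documents what was read
echo "$filename"
```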

~~~
daveFNbuck
I don't do a lot of bash programming, but I imagine using $REPLY also
increases the danger of having the value overwritten before you use it as your
code changes.

~~~
mar77i
Now that you mention it, I've never hit a case where I read two different
times in the same function/subshell scope, and I'd never have considered this
a problem. But maybe I'm just preempting it by half-subconsciously applying
meticulous scoping?

------
gumby
WARNING: this page has a dangerous typo. _do not put '.' in your path_.

"This is similar to the confusion I felt when I realised the dot folder was
not included in my more familiar PATH variable… but you should _[not]_ do that
in the PATH variable because you can get tricked into running a ‘fake’ command
from some downloaded code."

~~~
fiddlerwoaroof
Today, the threat model for a personal computer makes this a lot less
important: to exploit this someone needs to be able to create a file on your
computer and set its mode to +x: but if someone has that kind of access to
one’s personal computer, it’s already game over.

~~~
tsbinz
You mean like checking out some repository from GitHub, or unzipping a file
that you downloaded, or ...?

~~~
fiddlerwoaroof
Mount those directories noexec

~~~
marcosdumay
Most people want to execute things on their homedir.

~~~
fiddlerwoaroof
I usually mount a separate file system on ~/Downloads

~~~
dredmorbius
If you're going to go this route, whitelisting is safer than blacklisting:
mount /home noexec, and mount ~/bin exec.

I don't go this far yet, but _do_ strongly recommend nodev and nosuid except
where absolutely needed. This is also a strong argument for multiple
partitions rather than single-partition formatting.
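
For illustration, a sketch of what that might look like in /etc/fstab (the
device names and username are placeholders, not a tested configuration):

```
# /home as a whole: no executables, device nodes, or setuid binaries
/dev/vg0/home  /home           ext4  defaults,noexec,nodev,nosuid  0 2
# a small exec-allowed volume mounted on top for personal scripts
/dev/vg0/bin   /home/user/bin  ext4  defaults,nodev,nosuid         0 2
```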

~~~
fiddlerwoaroof
This is really interesting. I mount ~/Downloads separately anyway, because I
put it on an NFS mount, but I've never considered a scheme like what you're
talking about. Maybe I'll go for this on my next Linux machine.

Also, there’s a better argument for not putting . in $PATH: if your $PATH is
relatively static, you have less to think about when looking through your
shell’s history for a particular command: if . is on the PATH, you always have
to wonder whether a particular history entry is running a system-wide command
or a command in $PWD

------
robinhouston
If you set $CDPATH, then for goodness’ sake don’t export it. It changes the
behaviour of the cd command to make it output the absolute path of the
directory changed to, which breaks a common shell-scripting pattern for
converting relative directory paths to absolute paths, viz:

    absolute=$(cd "$relative" && pwd)

Conversely, if you’re _writing_ a bash script and it needs to be robust
against people who _do_ export CDPATH, you can do it like this instead:

    absolute=$(CDPATH=. cd "$relative")
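
A quick way to see the breakage for yourself (a sketch; the directories are
throwaway ones created for the demo):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/sub"
cd "$tmp"
CDPATH=$tmp
# cd finds "sub" via CDPATH, so it echoes the resolved path; the
# captured value now has an extra line in front of pwd's output.
captured=$(cd sub && pwd)
echo "$captured"
```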

~~~
jsjohnst
Or just learn the right method to do that and use `readlink` or `realpath` or
similar solution.

~~~
robinhouston
That’s a very Linux-centric view. Neither of these works on BSD. It’s probably
fine if you’re writing scripts that only need to run on Linux, but I wouldn’t
call it the “right method”.

~~~
jsjohnst
Readlink is available on FreeBSD, OpenBSD, and NetBSD.

[https://www.freebsd.org/cgi/man.cgi?query=readlink&sektion=1](https://www.freebsd.org/cgi/man.cgi?query=readlink&sektion=1)

So how is that a Linux-centric view again?

~~~
robinhouston
Sure, readlink exists. And it lets you read symlinks. But it can’t be used to
convert relative to absolute directory paths, which I thought was the use you
were proposing.

~~~
jsjohnst
From the man page:

> if the -f option is specified, the output is canonicalized by following
> every symlink in every component of the given path recursively. readlink
> will resolve both absolute and relative paths, and return the absolute
> pathname corresponding to file. In this case, the argument does not need to
> be a symbolic link.

This is the case for *BSD, GNU/Linux, and IRIX. If macOS prioritized being
consistent with the other *nix OSes, it would be supported there too, but
alas, Apple doesn't.

------
teddyh
As much as people use the shell, it’s regrettable how few have actually read
the manual. Anything you use for extended periods of time on a regular basis
is worth sitting down and reading the manual for.

~~~
mistrial9
"The manual" is a form of communication for humans. Something *nix has been
de-prioritizing for thirty years! In my experience, the *nix culture has been
"if you do not understand this, you should not be here," with a low-level
hostility towards questions and learning; it comes from some science gestalt
in 1800s Germany or something.

"*nix is friendly, it is just selective in who its friends are" .. is a
dot-signature from that era.

A manual is a reading experience and a working reference system, and it relies
on the visual context it is presented in. There are vast differences in the
quantitative and qualitative contents of manuals covering the same material!
The BASH man page is .. improving?

Specifically, compare an alphabetized list of every option, where "change the
preferred shortcut" and "let the sea-water in, thereby killing everyone if you
are underwater" sit right next to each other, to modern manuals in, say, the
Python culture. Or compare the huge amount of text supplied for some obscure
option that is not at all needed to the terse one-liner for something crucial
that is used every day.

The parent comment cheerfully "blames the victim," with implied guilt for "not
reading the manual," as if more time and effort on the part of the user were
the answer to the communication problem posed by such a dense and subtle
realm.

Last thing in this rant: a small, interesting article on something specific is
an antidote to the largely unsolvable challenge here, so good on that.

~~~
js2
> Something nix has been de-prioritizing for thirty years! By experience, the
> nix culture has been "if you do not understand this, you should not be here"

This is an open-source thing. Hardly anyone wants to write documentation who
isn't paid to do so. There's no glory in it. I first learned Unix on
SunOS/Solaris and the man pages were excellent. You could start with "man
intro" and learn your way around the entire system from there.

BTW, if you think you’re any good at documentation, open-source needs your
help. It’s a great way to contribute.

~~~
jonahx
> Hardly anyone wants to write documentation that isn’t paid to do so. There’s
> no glory in it.

There actually is glory in it. Many (granted, not all) popular, celebrated
open source projects got that way by having a great user experience with
intuitive, accessible docs.

I actually think that most programmers (even many very good ones) are simply
bad at writing docs. They actually can't do it well. Or it would be a
monumental effort for them to do it well.

It really is a different skill set.

~~~
BurningFrog
Is writing docs really that different a skill from writing readable code?

Or are these "great programmers" not actually that good at that?

~~~
jonahx
> Is writing docs really that different a skill from writing readable code?

It is correlated (though probably not perfectly) with writing _readable_ code.
But _that_ is a different skill from writing useful, correct software. That
is, you can write useful, correct software whose code is difficult for others
to understand and maintain.

------
nerdponx
_This is similar to the problems seen when the dot folder is not included in
the more familiar PATH variable_

FWIW I don't think I've ever seen this done in any system anywhere. My usual
expectation is that local executables are run as "./foo".

~~~
gizmo686
Windows does this.

------
maxxxxx
My problem with bash is that things are not very discoverable. There is so
much cool stuff you find out even after ten years of using it daily. I wonder
if there is a way to have all its features in a less obscure language.

~~~
lucb1e
Could that be a property of command-line things in general? Is that why GUIs
are so popular with non-professionals who don't want to take the time to learn
to use a system properly? There is so much in terms of data manipulation that
you can do with basic tools, but people will open up Excel to find a graphical
button they need instead of using, I don't know, maybe `man -K` or `apt
search` to find the right tool.

For years I've been trying to figure out what exactly the difference is.
Terminals are seen as difficult and outdated, yet the more nerdy you get, the
more likely it is that you end up using (and the more time you will spend in)
a terminal. There has to be a reason terminals haven't replaced GUIs
altogether, and GUIs haven't replaced terminals altogether, despite the near-
complete overlap in things you can do with them. The only thing that really
needs a mouse/touchscreen is when you have something spatial like photo
editing, but conversely, the extreme end of the power user tools almost always
use a terminal and can't really work with a GUI.

~~~
maxxxxx
Possible. I would say that the shell is even worse than most command-line
tools. I have started using PowerShell on my Mac and I find it easier to find
stuff there than in bash. As soon as I don't use bash for a month, I forget
all the little obscure tricks.

------
gavinpc
"There are dark corners in the Bourne shell, and people use all of them."

[https://books.google.com/books?id=KJQRAwAAQBAJ&pg=PA36&lpg=P...](https://books.google.com/books?id=KJQRAwAAQBAJ&pg=PA36&lpg=PA36#v=onepage&q&f=false)

~~~
teddyh
Why on Earth would you use a Google Books link for something in the TLDP? Here
is the equivalent _canonical_ link:

[http://tldp.org/LDP/abs/html/exit-status.html](http://tldp.org/LDP/abs/html/exit-status.html) (HTML)

[http://tldp.org/LDP/abs/abs-guide.pdf#page=57](http://tldp.org/LDP/abs/abs-guide.pdf#page=57) (PDF)

~~~
gavinpc
Thanks, I did search a bit and at any rate couldn't find an actual source for
the quote itself.

~~~
teddyh
A simple search gave me the apparent original source for the quote; an article
titled _Bash — The GNU Shell_ , first published in the _Linux Journal_ in
1994:

[https://www.linuxjournal.com/article/2800](https://www.linuxjournal.com/article/2800)

The latest revision of that text seems to be available here:

[https://tiswww.case.edu/php/chet/bash/article.pdf](https://tiswww.case.edu/php/chet/bash/article.pdf)

------
DonHopkins
>4) SHLVL: [...] This can be very useful in scripts where you’re not sure
whether you should exit or not, or keeping track of where you are in a nest of
scripts.

If you're writing recursive shell scripts where you're not sure whether you
should exit or not, you shouldn't be writing shell scripts.

Leave it to bash to go out of its way to make it easier to write scripts that
break encapsulation and implement spooky mysterious action-at-a-distance by
implicitly depending on how they were run, and purposefully behave differently
whether they're invoked from a command line or another script, and are so
confused about what they should do that they have to make guesses about
whether or not to exit.

~~~
Pawamoy
What about a simple "die" function that echoes a message and returns an error
code? It could be used both in scripts and in the interactive shell. It would
use "exit" when SHLVL is more than 1, and "return" otherwise.
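
A sketch of such a function (the name, message handling, and default status
are my own choices):

```shell
# Exit when we're in a nested shell (e.g. running a script),
# return when we're at the top-level interactive shell.
die() {
    echo "$1" >&2
    if [ "${SHLVL:-1}" -gt 1 ]; then
        exit "${2:-1}"
    else
        return "${2:-1}"
    fi
}
```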

~~~
DonHopkins
It's not about the best way to behave differently at runtime and sometimes
call exit depending on how the script was invoked, it's about the idea that a
scripting language would SUPPORT and ENCOURAGE writing that kind of script,
with a special magic built-in variable and a section in the already-complex
documentation encouraging its use.

Scripts should not behave differently depending on if they were invoked from
other scripts, and modular reusable libraries should never take it upon
themselves to call "exit" when there's an error.

The way to control the flow of recursive functions is by passing explicit
parameters (like the recursion depth, or a data structure to walk, or an error
callback), not reflecting on the depth of the runtime stack or looking at how
the function was called.

If bash is too weak to handle errors, or exceptions, or recursion, or passing
functional callback parameters properly, or even file names with spaces in
them, then use a better scripting language, don't just throw up your hands and
exit.

[https://stackoverflow.com/questions/14199689/how-can-i-handle-exit-calls-in-3rd-party-library-code](https://stackoverflow.com/questions/14199689/how-can-i-handle-exit-calls-in-3rd-party-library-code)

[https://github.com/texane/stlink/issues/634](https://github.com/texane/stlink/issues/634)

------
mef
@zwischenzug in case you’re not aware, visiting your site on mobile sometimes
forwards on to a spammy third party ad site.

~~~
zwischenzug
Thanks, have reported to Wordpress.

------
ivanbakel
> This is similar to the problems seen when the dot folder is not included in
> the more familiar PATH variable.

Is this implying that you _should_ put `.` on PATH? I've always heard that to
be a security problem.

What's interesting is that so many of these variables are incredibly tersely
named. What's the reasoning behind not using a full name for them to make Bash
actually readable? Surely not the minor performance scrape?

~~~
johannes1234321
Yes, with `.` in the PATH you have potential security issues: if your working
directory happens to be world-writable (e.g. while doing cleanup in /tmp), an
attacker could put evil replacements onto your PATH. "cd /tmp; sudo rm
some_file" might execute /tmp/sudo and do really evil stuff.

Sure, on an otherwise 100% secure system accessed only by trusted users there
is no problem, but not having . in PATH is a trivial mitigation.
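
The hazard is easy to demonstrate in a throwaway directory, shadowing a
harmless command (`ls`) rather than anything destructive:

```shell
dir=$(mktemp -d)
printf '#!/bin/sh\necho "I am a fake ls"\n' > "$dir/ls"
chmod +x "$dir/ls"
cd "$dir"
export PATH=.:$PATH   # the dangerous setting
hash -r               # drop any cached command locations
# The planted file now shadows the real /bin/ls:
output=$(ls)
echo "$output"
```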

~~~
tetha
> Sure, on an otherwise 100% secure system accessed only by trusted users
> there is no problem, but not having . in PATH is a trivial mitigation.

All systems are secure and sufficiently hardened until a botnet demonstrates
otherwise, aren't they? And a botnet is the friendly hostile takeover because
it'll just deploy a bot or a crypto miner and might even patch your system.

Just don't put . on your PATH. And if you do, _append_ it: `PATH=$PATH:.`.
This way, /usr/bin/foo will win against ./foo.

------
dorfsmay
A lot of those variables come straight from Korn shell and are well documented
in the Bolsky and Korn book.

One very useful one not mentioned on this page, possibly because considered
well-known: $RANDOM

[http://tldp.org/LDP/abs/html/randomvar.html](http://tldp.org/LDP/abs/html/randomvar.html)

------
zouhair
There is also BASH_REMATCH[0], used in conjunction with the regex comparison
"=~" to capture subpatterns.

Here is a function to convert HH:MM:SS to seconds:

    
    
      toseconds() {
          # https://stackoverflow.com/a/12986612/1469043
          Time="$1"
          while :; do
              if [[ $Time =~ ^([0-9]{2}):([0-9]{2}):([0-9]{2})$ ]]; then
                  # The fields are zero-padded two-digit strings, so a
                  # lexical comparison is enough to range-check them.
                  if [[ ! ${BASH_REMATCH[3]} > 59 ]] \
                  && [[ ! ${BASH_REMATCH[2]} > 59 ]] \
                  && [[ ! ${BASH_REMATCH[1]} > 23 ]]; then
                      break
                  fi
              fi
              read -r -p "Wrong format. Please use the HH:MM:SS format: " -e Time
          done

          echo "$Time" | awk -F: '{ print ($1 * 3600) + ($2 * 60) + $3 }'
      }
    

[0]: [https://www.gnu.org/software/bash/manual/html_node/Bash-Variables.html](https://www.gnu.org/software/bash/manual/html_node/Bash-Variables.html)

------
nishparadox
PROMPT_COMMAND is one of the powerful ones. I have modified my own to log
commands into timestamp-based log files.
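
A hedged sketch of that idea (the log path and line format are my own
choices, not the commenter's actual setup):

```shell
# Append the most recent history entry, timestamped, to a per-day file.
log_last_command() {
    local logfile="$HOME/.bash_logs/$(date +%F).log"
    mkdir -p "${logfile%/*}"
    # `history 1` prints the latest entry, e.g. "  42  ls -l"
    printf '%s %s\n' "$(date '+%F %T')" "$(history 1)" >> "$logfile"
}
PROMPT_COMMAND=log_last_command
```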

~~~
Pawamoy
I use it in combination with a trap on DEBUG signal to compute information
about each interactive command (start/stop timestamps, type, working dir,
return code, and more) and save it in a big log file. I can then transform the
data into pretty charts!

It's also used by the `z` command line tool to maintain its directory
"frecency" list.
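
A minimal sketch of the DEBUG-trap half of that, recording only wall-clock
duration (the variable and function names here are illustrative):

```shell
# The DEBUG trap fires before each command; remember when it started.
trap '_cmd_start=${_cmd_start:-$SECONDS}' DEBUG
# PROMPT_COMMAND runs when the prompt is redrawn, i.e. after the command.
report_duration() {
    _elapsed=$(( SECONDS - ${_cmd_start:-$SECONDS} ))
    unset _cmd_start
    echo "last command took ${_elapsed}s"
}
PROMPT_COMMAND=report_duration
```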

------
Pawamoy
I have to say that the LINENO variable does not have the expected behavior in
process substitutions, especially when using multi-line commands (with
backslashes at the end, or using multi-line strings).

LINENO will use the last line number of a multi-line "$(...)" command,
offsetting all the commands' line numbers within the substitution. Since
multi-line strings are concatenated by Bash internally, LINENO will only
increase by one for those.

You end up with line numbers greater than the number of lines in your file, or
overlapping with empty lines or comments. In this matter, Zsh's LINENO is way
better as it will have the expected value (it will always be the line where
the command was started, not an offset).

------
ianamartin
Bash seems like a language written by people who never intend to do the same
thing twice.

------
nibbula
I find it surprising that the fellow who coined the word POSIX agreed to
censoring the POSIX_ME_HARDER variable.

It's also surprising that the first version of bash committed source-code
suicide via a bug in globbing, which makes its name seem more poignant, and
was why there was a file named "-i" in its source tree for many years.

------
yakattak
HISTTIMEFORMAT is something I’ve wanted for so long but never knew existed. I
should probably read man pages more!
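
For anyone else who was missing it: it's just a strftime(3) format string
that `history` prepends to each entry, e.g.:

```shell
export HISTTIMEFORMAT='%F %T  '
# An interactive shell's `history` output then looks something like
#   1234  2019-05-11 09:30:02  ls -l
```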

~~~
nickjj
On the flip side, I added this to my dot files quite a while ago but I didn't
originally find out about it from the docs.

I ran into a situation where I thought "wow it would have been useful if I
knew when this command was run" while going through a massive amount of
history on a foreign server.

So then I Googled for "add timestamp to bash history" and immediately found
it.

------
andrepd
Overall, I feel like "Surprising" is the word that best describes "Bash".

------
chrisweekly
Cool list, couple or three were new to me. Thanks for posting!

------
foobarian
Surprised to see that $RANDOM was not there. Who would have known!

