
Pure Bash Bible - ausjke
https://github.com/dylanaraps/pure-bash-bible
======
taviso
This seems like a good time to mention my (ridiculous) project, a ctypes
module for bash.

[https://github.com/taviso/ctypes.sh/wiki](https://github.com/taviso/ctypes.sh/wiki)

There are some little demos here:

[https://github.com/taviso/ctypes.sh/tree/master/test](https://github.com/taviso/ctypes.sh/tree/master/test)

I even ported the GTK+3 Hello World to bash as a demo:

[https://github.com/taviso/ctypes.sh/blob/master/test/gtk.sh](https://github.com/taviso/ctypes.sh/blob/master/test/gtk.sh)

~~~
singron
I found ctypes.sh to be legitimately useful for managing resources in nix-
shell.

E.g. flock a lock file and set CLOEXEC so that subprocesses don't hold the
lock open after the shell exits.

E.g. Use memfd_create to create a temp file and write a key. Then pass
/proc/$$/fd/$FD to programs that need the key as a file. When the shell exits,
the file can no longer be opened.

You can do similar things with traps, but they aren't guaranteed to execute,
whereas these OS primitives will always be cleaned up.
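
A rough sketch of the second trick without ctypes.sh (this uses mktemp plus unlink instead of memfd_create, so it needs Linux's /proc but nothing outside bash and coreutils):

```shell
#!/bin/bash
# Sketch: keep a secret in a file that disappears with the shell.
tmp=$(mktemp)
exec 9<>"$tmp"        # open fd 9 read/write on the temp file
rm -f "$tmp"          # unlink: no name on disk, fd 9 keeps the data alive
printf 'secret\n' >&9

# Other processes can be handed the key as a path:
cat "/proc/$$/fd/9"   # prints: secret
# When this shell exits, fd 9 closes and the data is unreachable.
```

Opening `/proc/$$/fd/9` gives a fresh descriptor at offset 0, which is why `cat` sees the whole file even after the write.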

------
dyanaraps
Hello, I'm the author of the Pure Bash Bible. Happy to answer any questions
you may have.

Here's an example of what bash is capable of:
[https://github.com/dylanaraps/fff/](https://github.com/dylanaraps/fff/) (a
TUI file manager written in bash)!

~~~
hnarn
This might be an odd/off-topic question, but in Telegram this article has an
auto-fetched thumbnail of a cat smoking a cigarette and a text similar to
'heavy metal music playing', I'm just curious where this picture is from, if
you have any idea? I checked the README for the repo, pictures of the
contributors etc. but I'm unable to figure out where it's coming from.

~~~
dyanaraps
That's a very very very old GitHub avatar of mine, I wonder why Telegram
hasn't pulled a later one.

~~~
folex
You can use [https://telegram.me/webpagebot](https://telegram.me/webpagebot)
to refresh avatar cache

------
jmnicolas
I have a personal dislike for regexes and non-human-readable code; they give me
maintenance headaches. That's why I avoid shell scripts as much as possible.

The first example is not human readable. If the name of the function were a
lie, I'd have no idea what this piece of code does:

    
    
      trim_string() {
        : "${1#"${1%%[![:space:]]*}"}"
        : "${_%"${_##*[![:space:]]}"}"
        printf '%s\n' "$_"
      }

~~~
serhart
Just because you seem to be unfamiliar with bash doesn’t mean it’s unreadable.
Almost any language will look cryptic if you don’t know it. That function is
mostly just parameter expansion and very common in most bash scripts. I bet if
you read the manual you would easily be able to figure it out. You just have
to learn the language.

~~~
mankyd
I'm with jmnicolas on this. I've been using bash on and off for years now
(approaching decades). I still get it very wrong almost always. It's the only
language where this is a persistent problem for me.

Just yesterday I was struck by the difference between if [[ ]]; and if [ ];

I didn't even bother grokking the difference in the end. I simply found
something that worked and moved on with my day.
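
For anyone else who stopped there, the short version: `[` is an ordinary command (so normal word splitting applies to its arguments), while `[[` is bash syntax with its own parsing rules. A quick illustration (hypothetical filename):

```shell
#!/bin/bash
f="my file.txt"
touch "$f"

# [ is a builtin command: an unquoted variable is word-split, so this
# expands to [ -f my file.txt ] and fails with "too many arguments"
[ -f $f ] 2>/dev/null || echo "single-bracket needs quotes"

# [[ is shell syntax: no word splitting happens inside it
[[ -f $f ]] && echo "double-bracket is fine unquoted"

# [[ also adds pattern matching, which [ can't do
[[ $f == *.txt ]] && echo "matches *.txt"
```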

~~~
serhart
I'm in no way defending bash as a language. There are lots of gotchas and
weird constructs. I avoid bash too. It's just that trim function isn't that
cryptic or "unreadable" if you know the syntax.

~~~
cblades
Maybe "unreadable" and "readable" aren't the best ways to approach this
conversation.

It's certainly less readable than, say, my_str.strip()

~~~
joombaga
Are you comparing a call to a definition? `trim_string "$my_str"` is no less
readable than `my_str.strip()`.

~~~
mankyd
I mean, I would say it is. Why are there quotes around the variable? What
happens when I remove the quotes?

The quotes don't function like the parens. If this were a two argument
function, you wouldn't put one pair of quotes around the whole thing. They're
clearly transforming the variable somehow, but I'm not sure how and/or why
they're necessary.
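
To make the quoting concrete (`count_args` is a toy helper, not from the book): the quotes don't transform the variable, they stop the shell from splitting its expansion into multiple words:

```shell
#!/bin/bash
count_args() { echo "$#"; }

s="  some   padded string  "
count_args "$s"    # 1: quoted, passed as a single argument, spaces intact
count_args $s      # 3: unquoted, word-split on whitespace into 3 arguments
```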

------
tannhaeuser
Didn't realize at first that "pure" refers to features available in bash
without calling out to external processes, when I would've thought _purism_ in
this context should refer to avoiding bashisms and writing portable (ksh,
POSIX shell) scripts.

~~~
dyanaraps
I understand your concerns about bash features and POSIX compatibility. The
bash bible was written specifically to document the shell extensions bash
implements.

My focus for the past few months has been writing a Linux distribution (and
its package manager/tooling) in POSIX sh.

I've learned a lot of tricks and I'm very tempted to write a second "bible"
with snippets that are supported in all POSIX shells.

(I created the bash bible).

~~~
tannhaeuser
You can add me to those interested in a pure POSIX shell bible. Whenever I'm
about to write a moderately large script, I'm actively avoiding bashisms as
they don't buy me much yet make my script unportable. But, given you've spent
so much time on this subject, are there any bashisms that are truly essential
and you don't want to live without?

~~~
akdor1154

      set -o pipefail

~~~
scbrg
That, and local variables. I avoid bash for the same reasons as GP, so I tend
to (reluctantly) live without pipefail and work around the lack of local
variables with subshells (which I assume has a performance impact - but hey).
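
The subshell trick looks like this: defining a function body with `( )` instead of `{ }` runs it in a child process, so assignments can't leak out (at the cost of a fork per call). Function and variable names here are illustrative:

```shell
#!/bin/sh
# POSIX sh has no 'local', but a subshell body isolates variables
greet() (
    name="inner"            # only exists inside the subshell
    echo "hello $name"
)

name="outer"
greet                       # prints: hello inner
echo "$name"                # prints: outer -- unchanged
```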

------
buraequete
I expected a tool on bash where you can access the _pure_ "Bible"

hallelujah.sh

~~~
sodaplayer
On that note, does anyone have a favorite cli Bible reading program?

~~~
Yajirobe
Why is this getting downvoted?

~~~
amdavidson
Because it is super offtopic. I would also downvote "does anyone have a
favorite CLI recipe manager?"

------
heuxbzjnz
no mention of /dev/tcp?!

yes, it _looks_ like a device node in /dev, but it's really a pure bashism for
opening tcp connections to arbitrary hosts and ports
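
For readers who haven't seen it, the syntax looks roughly like this (a sketch; it needs bash, because the redirection only works when bash itself intercepts the /dev/tcp path):

```shell
#!/bin/bash
# Open fd 3 as a TCP connection to example.com port 80
exec 3<>/dev/tcp/example.com/80

# Speak minimal HTTP/1.0 over it
printf 'HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n' >&3

# Read the status line back
read -r status <&3
echo "$status"       # e.g. HTTP/1.0 200 OK

exec 3>&-            # close the connection
```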

~~~
dyanaraps
I've been meaning to get around to it. I hadn't messed around with /dev/tcp
prior to writing this bible.

I've since implemented a very bare-bones and very featureless IRC client using
/dev/tcp and bash.

[https://github.com/dylanaraps/birch](https://github.com/dylanaraps/birch)

I will get around to it eventually. The one hurdle I want to get over before
writing a piece about it is the handling of binary data using bash.

This is something a little tricky to do with bash but it'd allow for a
'wget'/'curl' like program without the use of anything external to the shell
(no HTTPS of course).

I want to really understand the feature before I write about it, though; in the
meantime I could just write a reference to the syntax/basic usage. :)

------
thih9
I only use bash occasionally, so this was useful.

In particular the "obsolete syntax" section, which I wasn't aware of.

[https://github.com/dylanaraps/pure-bash-bible#obsolete-
synta...](https://github.com/dylanaraps/pure-bash-bible#obsolete-syntax)

------
jbrnh
I don't quite get the recommendation to always use env bash over #!/bin/bash?
If I use the full path, it is to get just that - the system's Bash. If it is
missing or overruled in $PATH then I most likely don't want the script to run
in the first place.

~~~
YawningAngel
'System bash' isn't a universal or clear concept. For example, if you're using
Modern OS X, you likely have bash via brew or some other userspace package
manager but no system bash. Presumably you (or at least, most people) would
still like your scripts to run in this case.

~~~
jbrnh
But if I write the script for a bunch of RHEL servers then OS X and brew are
irrelevant, and the full path is better (IMHO). It's the 'always use env..' I
object to.

~~~
moviuro
It's RHEL... today. `/usr/bin/env` is a POSIX standard. Maybe one day, RHEL
will put bash in /usr/local/bin. Or maybe you'll switch to FreeBSD one day,
and suddenly everything goes boom.

~~~
jbrnh
env and sh are both POSIX but AFAIK the path is not specified for either of
them. If POSIX has an opinion on how you should start a script I would be
happy for a link?

Bash in RHEL is in /usr/bin and /bin, as /bin is symlinked to /usr/bin. I think
it is equally unlikely that RHEL (Debian, SLES..) will move either /bin/bash or
/usr/bin/env, as it would break a million scripts out there.

If we should migrate to FreeBSD while, for some reason, reusing linux oriented
bash scripts, changing the path to /usr/local/bin/ would be the least of my
headaches.

I agree that 'env' can make good sense if you don't know who/where/when your
script is used. For internal projects, I don't really see the advantage.

------
faragon
Related: pure Bash LZ77 data compression/decompression, crc32, hex enc/dec,
base64 enc/dec, binary head/cut, in a single 13KB file:

[https://github.com/faragon/lzb](https://github.com/faragon/lzb)

~~~
dyanaraps
This is really neat, thanks for sharing.

------
iandinwoodie
Dylan, I was wondering if you could share your process for going from chapter
text files to a full ebook? This looks like a really approachable way to
writing a book. Is the conversion something scripted or is it a more involved
process?

------
croo
I write a bash script or two every month, so I thought I was okay. But then
came along the very first example:

    
    
      trim_string() {
          # Usage: trim_string "   example   string    "
          : "${1#"${1%%[![:space:]]*}"}"
          : "${_%"${_##*[![:space:]]}"}"
          printf '%s\n' "$_"
      }
    

Ok, so the : is somehow a temporary variable... Then there is a variable
starting at $ and you lost me :D Can someone break down that line for me? What
the hell is going on here?

    
    
        : "${1#"${1%%[![:space:]]*}"}"
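
For reference, the same logic unrolled with named variables (a paraphrase, not from the book): `:` is the null command, which evaluates its arguments and does nothing else, and `$_` expands to the last argument of the previous command, so the `:` lines chain the expansions together.

```shell
#!/bin/bash
s="   example   string    "

# ${s%%[![:space:]]*} deletes the longest suffix that starts with a
# non-space character, leaving only the leading whitespace
lead="${s%%[![:space:]]*}"

# ${s#"$lead"} then strips that exact prefix, i.e. the leading spaces
s="${s#"$lead"}"

# Mirror image for the other end: grab the trailing whitespace...
trail="${s##*[![:space:]]}"
# ...and strip it off the back
s="${s%"$trail"}"

printf '[%s]\n' "$s"   # prints: [example   string]
```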

~~~
pxtail
And that's the problem with bash scripting - very quickly it gets very cryptic,
difficult to follow and understand without knowing various "clever tricks" and
gimmicks. This is all fine and dandy for personal usage, but god forbid other
developers ever need to change something inside this kind of "clever" code.

~~~
ImaCake
Bingo. I am a pretty casual level scripter. My programming is always just a
means to an end (usually biology related) and I would much rather write easy
to read code that takes 30 secs to run than whatever the pure bash thing in
the parent post is.

------
NegativeLatency
Kinda cool, I don’t write or read much bash and tend to stick to sh compatible
stuff.

There are some neat tricks in here but they don’t seem very readable compared
to perl/awk/sed.

~~~
hagreet
Yeah, I also don't look at things like

    
    
      trim_string() {
          # Usage: trim_string "   example   string    "
          : "${1#"${1%%[![:space:]]*}"}"
          : "${_%"${_##*[![:space:]]}"}"
          printf '%s\n' "$_"
      }

and think: "I should use bash more".

Bash is nice for making simple things simple but for complicated things it's
just shitty. I used to think that this is due to the complicated quoting rules
which make the simple things simple but tcl does a much better job at that.

In either case I prefer the clean rules of a Python or Perl for anything
larger.

------
Sir_Cmpwn
Please limit your shell scripting to POSIX sh for the sake of broad
compatibility with current and future operating systems. There's a quick
reference here:

[https://shellhaters.org/](https://shellhaters.org/)

If you find yourself frustrated by the lack of bash extensions, your program is
probably complex enough that you shouldn't be writing a shell script.

------
al_form2000
Very nice recipes. I'd avoid shadowing actual executables with bash function
names (see head in the book) as a harbinger of great grief.

------
bwang008
As someone that writes bash scripts regularly for my job, one of the big
struggles I've had is remembering what I or someone else wrote when I look
back 6+ months later haha.

I had the same issue with reading other people's perl as well.

I think the great and terrible thing about both languages is that there are
literally a million different ways to skin the cat / write a regex.

------
nxpnsv
Neat. Sometimes bash is the right choice, and then a guide like this is a
great complement to google / stackexchange.

------
0x0aff374668
Very useful. I hate having to pull in external processes to do something
simple in BASH that the manpage doesn't cover. Expecting PERL to be on every
machine (or the right version of SED.. posix? gnu?) is not reliable in my line
of work.

------
ethanpil
Would love to see a ZSH equivalent of this... especially with the impending
move to ZSH in macOS.

------
Jenz
Looking at the very first example:

    
    
      trim_string(){  
          # Usage: trim_string "   example   string    "
          : "${1#"${1%%[![:space:]]*}"}"
          : "${_%"${_##*[![:space:]]}"}"
          printf '%s\n' "$_"
      }
    
    

This reads horrible. I see no reason to prefer this over programs like sed,
bash is after all a shell, intended firstly for running external
programs/commands.

~~~
Tepix
> This reads horrible.

Just use the shell function by its descriptive name...

The reasons are explained in the foreword:

 _Calling an external process in bash is expensive and excessive use will
cause a noticeable slowdown. Scripts and programs written using built-in
methods (where applicable) will be faster, require fewer dependencies and
afford a better understanding of the language itself._
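
A hedged way to see the point: the pure-bash expansion gives the same answer as forking `sed`, without a process per call (variable names here are illustrative):

```shell
#!/bin/bash
s="   example   "

# Pure bash: strip leading whitespace with parameter expansion
a="${s#"${s%%[![:space:]]*}"}"

# External: same job, but forks a sed process every time
b=$(echo "$s" | sed 's/^[[:space:]]*//')

[ "$a" = "$b" ] && echo "same result"   # prints: same result
# In a loop of thousands of iterations, the sed version pays one
# fork+exec per trim; the expansion costs none.
```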

~~~
james_s_tayler
The slowdown caused to any developer who has to try and read that is probably
orders of magnitude more worth optimizing for than how fast a bash script runs.

I'll take the grep/sed/awk version.

~~~
rhizome31
Not defending this particular example, but there are contexts where avoiding
an external call can make a difference. Think a script calling an external
program in a deeply nested loop. This sort of optimization could be used as a
last resort, after identifying a real performance issue and evaluating the
possibility of restructuring the code.

~~~
IronWolve
I ran into this decades ago with file servers. With half a million files, it
takes too long to do a simple for loop that calls third-party processes like
awk/sed when I was just using them to format/search text. Breaking it down to
mostly pure bash scripts reduced the run time and ended all the pauses.

~~~
Jenz
I was going to argue that it would be better to simply not use a shell for
that, but

> decades ago

Frankly, I can only imagine how the environment then would be. Thinking back
with your current experience, what do you think you would have done if you had
to fix it again?

~~~
IronWolve
Things have changed so much, from the app side laying out data, to the file
system/storage side, to hardware speeds, that people are much lazier with
applications due to the environmental improvements.

I used to make lists first, then process the lists; I still do this sometimes
since it's faster. If you have to run a query every time, you're probably doing
it wrong, but for small stuff everything is so fast I can chain GNU apps and be
done. I'm not a programmer, I'm a sysadmin, so I mostly deal with fixing things
like auditing or fixing data on a file system (or maybe a db).

------
mistrial9
I have inherited about 4K LOC Bash, which mostly works as advertised. No one
wants to touch it! suggestions ?

~~~
tjpnz
Assuming it's all in one file, I would move related pieces of functionality
into separate files and then source them when necessary. That should make
things more manageable for people only wanting to make small changes.

~~~
mistrial9
no, its 27 directories and 947 files, with a plugin architecture (pipeline)

~~~
mistrial9
my mistake, the project is closer to 40,000 LOC, now that I count it...

------
vinceguidry
I have moved to Crystal for the times when I need the speed that Ruby can't
offer. I could have leveled up my bash skills, and was indeed using bash for
that purpose, but I think Crystal is easier.

------
sigzero
Kudos for using shellcheck. I use it all the time.

------
tjpnz
Can anyone recommend something similar for Make?

------
deviantfero
Great read dylan!

~~~
dyanaraps
Thanks! Fancy seeing you here. :)

------
no-dr-onboard
Oh cool. Same fellow who made pywal.

------
ape4
Looks more like a "cookbook"

------
yread
for f in *; do is cute but what about those idiots who put spaces in the
filenames?
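
For what it's worth, `for f in *` is actually the safe glob form; it's `for f in $(ls)` that breaks on spaces. Quoting `"$f"` keeps each name intact:

```shell
#!/bin/bash
dir=$(mktemp -d)
touch "$dir/plain.txt" "$dir/two words.txt"

# Globbing produces one word per file, spaces and all;
# the quotes around $f preserve that through the loop body
for f in "$dir"/*; do
    printf '<%s>\n' "$f"
done
# prints one <...> line per file, "two words.txt" included as one name

rm -rf "$dir"
```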

------
eatbitseveryday
The title made me think you can query The Bible in bash...

~~~
dredmorbius
Not bash, but CLI:

[https://packages.debian.org/buster/bible-
kjv](https://packages.debian.org/buster/bible-kjv)

------
tutfbhuf
How about performance? Is it as fast as Rust or as slow as Python?

------
jigglesniggle
While this is interesting I see doing anything but launching programs with
simple text-substituted arguments as too much for bash or sh. Run shellcheck
on some of your own code, or the code of even a simple project to see how hard
it is to really use bash.

Why I think people gravitate towards it is because languages such as python
add too much pomp to launching a shell process. A language like perl is
usually easier to use but everyone hates it now.

~~~
sambe
The overhead of repeatedly launching subshells/processes to do simple
operations can add up quickly, especially if it is happening in loops or in
parallel. Yes, you shouldn't be using bash for performance, we all know that.
But scripts often grow over time and suddenly are found to be slow/resource
hogs. I have seen people demand that proper logging be added to a bash program
and it then get minutes behind because of the overhead of all the processes
called to do the logging. I was able to make that 10000x faster using pure
bash.

As to why people like it: it often feels more natural when you are automating
what you would type interactively. Unix pipes/coreutils/etc. also feel like a
better fit when that automation is mostly about connecting other programs
(it's the auxiliary stuff that you'd maybe want in pure bash). Reading the
subprocess Python documentation does not exactly fill me with joy. I've heard
libraries like Plumbum make it a bit neater - but then you have to ask why
learn a bunch of new libraries when I already know bash? In the end it's about
the best tool for the job. The danger with bash is going too far, especially
if you don't actually know it very well.
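
One concrete example of that kind of win (assuming bash 4.2+; `log_slow`/`log_fast` are made-up names): timestamps are a classic place where a fork hides in every log line.

```shell
#!/bin/bash
logfile=app.log

# Forks date(1) once per message
log_slow() { echo "$(date '+%H:%M:%S') $*" >> "$logfile"; }

# Pure bash: printf's %(fmt)T formats a time in-process;
# the -1 argument means "now" (bash 4.2+)
log_fast() { printf '%(%H:%M:%S)T %s\n' -1 "$*" >> "$logfile"; }

log_fast "started"
tail -n1 "$logfile"    # e.g. 14:03:07 started
```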

~~~
sumosudo
My favourite way to log: redirect stdout and stderr (`&>`) into a process
substitution (`>(...)`) running `tee`, which prints the output and writes it to
the log file as well. `exec &> >(tee "${__DIR}/${DOC_LOCAL}/${LOG_LOCAL}")`

~~~
lordfoo
Would you mind expanding on this with an example? I'm trying to improve
performance of my bash logging

~~~
unixhero
+1. My goal is to learn how to robustly do any logging whatsoever. Eager to
learn new tricks.

~~~
flukus
Do you just want to log the scripts execution or do you want something more
structured? If it's the former you can redirect the output from within the
bash script with this (apologies for any condescension, I'm not familiar with
your skill tree):

    
    
      if [ ! -t 1 ]; then
        exec > /my/log/file 2>&1
      fi
    

The if statement tests whether you're at an interactive prompt; if you're not,
all output from the script gets redirected to /my/log/file. The above poster is
instead redirecting into a process substitution "> >(tee)" that will both print
the output and log it.

It should be noted that often the bottleneck is the terminal itself, try
running your scripts with a "> /dev/null" to suppress output and verify the
slow part is actually the script.

