Ask HN: How can I get better at bash?
286 points by cocolos on June 26, 2017 | 185 comments
I am interested mainly in bash helping me with the workflow of apps I run on the command line.



As silly as it sounds, when I was a new Unix sysadmin, I read the entirety of "man 1 bash", which includes all bash builtins. I found that it improved my bash-fu 100x, simply by knowing about so many of the utilities. I also took some cliff notes for things that seemed generally useful.

I did it for an hour or so a night for a week or so.

That being said, a few of my personal favorites to memorize:

* Parameter expansion: https://www.gnu.org/software/bash/manual/html_node/Shell-Par...

* All of test(1) as you can use them in any if statement (/usr/bin/[ is a real command!): https://linux.die.net/man/1/test

* Knowing most of the bash internal variables: http://tldp.org/LDP/abs/html/internalvariables.html

* Keyboard shortcuts and how they are useful. A few examples: CTRL-l (no need to ever use /usr/bin/clear), CTRL-k, CTRL-u, CTRL-e, CTRL-a, CTRL-w, CTRL-left-arrow, CTRL-right-arrow, CTRL-r (history reverse search with find-as-you-type autocomplete)
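To make the parameter-expansion item concrete, here is a minimal sketch (the variable `f` and its value are just made up):

```shell
f=/home/user/report.tar.gz

echo "${f##*/}"             # report.tar.gz (strip longest */ prefix, like basename)
echo "${f%/*}"              # /home/user (strip shortest /* suffix, like dirname)
echo "${f%%.*}"             # /home/user/report (strip everything from the first dot)
echo "${#f}"                # 24 (length of the string)
echo "${missing:-default}"  # default (fallback when unset or empty)
```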

The best way you can learn the shell is by using Linux as your primary desktop for at least a few months. You'll get very proficient very quickly by doing that.


Also CTRL-_ which is "undo whatever you just typed"


That is a great one I didn't know (and I've been a 'nix monkey professionally since 2005). Thanks!


I was using CTRL+C to open a new prompt instead of undoing. Thanks for this.


Ok, but where is the "redo" option?


pressing CTRL-w a few times feels easier to me, less awkward key combination


CTRL-w just deletes a word at a time. CTRL-_, it appears, doesn't just delete the whole line you typed; it undoes the last thing you typed, counting each backspace as "one thing". So it can do some things you can't achieve with CTRL-w: you can even get back text you just erased.

I never knew this shortcut before. (I will probably never use it...)


Some good advice there.

One thing I have found that fewer people seem to know is that the Unix metacharacters are expanded by the shell (bash etc.), not by individual commands. This implies that any command, whether built-in or written by you (in C, bash, Python or any other language), gets metacharacter support automatically. That is, things like the file globbing/wildcard characters *, ?, and [ ranges ].
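A quick way to see this for yourself (the demo directory and file names are invented):

```shell
# the shell expands *.txt before printf ever runs,
# so printf receives each match as a separate argument
cd "$(mktemp -d)"
touch a.txt b.txt
printf 'arg: %s\n' *.txt
# arg: a.txt
# arg: b.txt
```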

This was not originally true on DOS (as a counterexample), and I'm not sure whether it is true on Windows today (haven't checked), though I did notice that more commands seem to support wildcards on Windows nowadays.

Also, for some years now, Windows too has had redirections like:

command >file.txt 2>&1

(redirect stderr (2) to the same destination that stdout (1) is pointing to, i.e. file.txt), which Unix had from the start.
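One caveat worth a sketch: redirections are processed left to right, so the position of `2>&1` matters (the file names below are just for the demo):

```shell
# stderr follows stdout into the file: stdout is pointed at the
# file first, then 2>&1 copies that destination for stderr
{ echo out; echo err >&2; } >both.txt 2>&1    # both.txt gets both lines

# here 2>&1 is applied while stdout still points at the terminal,
# so stderr stays on the terminal and only stdout reaches the file
{ echo out; echo err >&2; } 2>&1 >only.txt    # only.txt gets "out"
```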


    keith@illy:~$ man bash | col -b | wc
       5738   46385  323374
A slim novella's worth on Debian Sid's man page. Never thought of just reading the whole thing, so thanks.


Yes, the man page is very good. The online gnu documentation is good, too. Some of the information will not be categorized as bash.[1] Also check out sed and awk.

[1] http://www.gnu.org/software/coreutils/manual/html_node/index...


>All of test(1) as you can use them in any if statement (/usr/bin/[ is a real command!):

Yes, and not only in an if statement. You can also use the test or [ command in commands of the form:

test condition && command2

or

test condition || command2

which will only work if the condition is true or false, respectively (IIRC, need to check this).
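For the record, a couple of runnable examples (the paths are chosen only because they exist, or don't, on most systems):

```shell
# command2 runs only when the condition is true
test -d /tmp && echo "/tmp is a directory"

# command2 runs only when the condition is false
[ -e /no/such/file ] || echo "no such file"
```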


That is correct, no need to check on this. I've done this for years :)


Thanks. Well, I was nearly 100% sure too (I've been using Unix for years as well, from before Linux was created), but "nearly" is not 100% (since I don't use that feature often), and I don't like to make claims I'm not sure about without a disclaimer, hence I made one :)


+1 for the shortcuts.

Also one should take a look at rlwrap after becoming comfortable with the keyboard shortcuts.


Most of the responses here so far that do not include some sort of a guide are not the responses you're looking for (imho).

Mind your pipes and quotes. Guard your variables with braces. Do not export everything, test for and (try to) handle return codes and conditions and keep it simple (emphasis simple) but most of all just write it.

BASH (or Bourne) is ubiquitous when dealing with systems (vs programs). You don't need to be on the fashionable lang of the day by any measure. BASH, for most cases, will always be there, always ready and, in most cases, is the default human interface for deployed systems. As scripting languages go you don't need "better", you need dependability, zero dependencies with no requirement for modules or any other whizbangwoohoo plug-in. Language Fashionistas and personal preferences aside at least some level of fluency with BASH should be mandatory for anyone interfacing with a system.
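A small sketch of those habits in one script (all the names here are invented):

```shell
#!/bin/bash
set -u                       # treat unset variables as errors, not silent empties

dir="backup"
stamp="01"

# brace-guard: without braces, $dir_$stamp would look up a variable
# named "dir_", which doesn't exist
name="${dir}_${stamp}"

# quote expansions, and test the return code instead of assuming success
if mkdir -p "$name"; then
    echo "created $name"
else
    echo "mkdir failed with status $?" >&2
    exit 1
fi
```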



Google Whack!


Technically no. A Googlewhack requires 2 dictionary words. Dave Gorman won't be visiting us.


nice, you made a Googlewhack


If there's a term for photobombing a Googlewhack, I think I just did that.


I am surprised by how little time that took! Google's crawlers are hard working.


1000 times this.

You're going to get a lot of snark from people saying things like "don't", or "learn python instead".

This epitomizes "a little knowledge is a dangerous thing".

Bash has many cringeworthy aspects, but we have to deal with the world as it is, not the world as we would like it to be, and the reality is that bash is the default shell on 99.9% of unix boxes you encounter — even if you use an alt shell on your machine.

Coworker's machine? Bash. Default AWS AMI? Bash. init script to bootstrap $DAEMON? Bash. ssh to a server at your workplace? Bash. Random O'Reilly Linux tutorial? Assumes bash.

My advice?

Take some time.

Sit down.

and read "man bash"

cover-to-cover.

at least once.

A lot of the illogical things in bash make a lot more sense once you understand its parsing/expansion rules. And once you understand what is in the language vs an external terminal program in your PATH.

Since that sounds unappealing (and I scoffed at that very advice for many years), I've also found the wooledge bash guide to be very helpful.


Bash is unbeatable as a functional concept for chaining command-line tools together. Once you start getting functions or even a lot of if/while constructs, it's usually time to switch to Python/Perl.


I wrote a guide to the shell (posted elsewhere here). I concur with this statement wholeheartedly. Bash as a command language is exceptional, as a scripting language it is sub-par at best. I've been using Ruby a lot lately in command line scripting, little things like

$ ls /*.orig | ruby -e 'while f = gets do ... end'

It's not quite as easy as the shell tools for little things, but I feel like the crossover point where the lack of programming constructs begins to outweigh the initial ease of Bash scripting is about ten lines of Bash. Which is not to say that it's not useful -- I do think that Bash is something that every developer should know -- but that you really need to have Bash and another command line scripting language in your toolbox, and know when to use each.


I used to think of it in terms of "number of lines" but then I found a task that worked quite well in a longer bash script (creating a .deb file). In fact, it worked better than if it were a python script, because all the commands were right there.

Now my metric is, "if it needs a function, or even a complex while loop, switch to perl/ruby"


I think that what I would tell new people would be "more than ten lines, OR using more than two variables, OR any flow control constructs". Bash can make even simple conditional statements difficult, and while my rule is absurdly restrictive, Bash can still be amazingly useful within those bounds. Probably mine is the lower bound for "anything less complex than this is fine to write in Bash" and yours is "anything more complex than this should definitely not be written in Bash", with the middle ground being "have a good reason why Bash is the best tool for the job".


I've been using Linux since 1992, and if there's one thing I can't stress enough, it's to use full paths and not anything abbreviated. After 25 years, I still find myself slipping up, overlooking some minute detail, causing data loss.


I use a variety of operating systems, many of which put things in different locations. Hell, even across linuxes, locations differ.

Don't do this. $PATH exists for a reason.


I've heard this suggestion to use full paths for a long time. Why use full paths?


I'm sure there are other reasons, but the big one is that your scripts may get called in an environment other than your normal logged in shell.

Cron, for example, doesn't have the same $PATH as your login shell. So no full paths means you can fail to run some commands, or run the wrong copy of one.

There's also the security aspect. If your script has "." in its $PATH, or something else writable, I may be able to coerce it to run an imposter command.


This is terrible advice, if you expect your PATH to be something, just set it at the top of the script and be done with it.

No need to make it 100 times harder to read.
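i.e. something along these lines at the top of the script (the PATH value here is just an example):

```shell
#!/bin/bash
# pin PATH once so every command resolves predictably,
# even when run from cron or another minimal environment
PATH=/usr/local/bin:/usr/bin:/bin
export PATH

date    # found via the pinned PATH, no /bin/date needed
```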


Why even expect binaries to be in the same place on different systems with different default PATHs? Know how many times I've seen problems caused by (#!/usr/bin/bash|#!/bin/bash) pointing to the wrong place?

After reading this thread though, I'm not sure I want to suggest people write `#!/usr/bin/env bash` either... since that depends on the path being correctly set, and bash being in the path!


I've started to dislike /usr/bin/env too. At least use -i. Otherwise you're one PATH manipulation away from executing a malicious program as your intended shell.


A PATH, though, is global.

Situations exist where you need more granularity.

There are also things like "watch" and "at" you might use in a shell script. They don't inherit the parent's environment, so setting PATH doesn't help.

You're correct in that "full paths" isn't a definitive answer, though. I suppose the more generic advice not to depend on your environment being set up right is better.


I have written a simple tool called mann (https://github.com/soheilpro/mann) to help me remember little things that I learn when working in Bash/Zsh.

Basically, every time I learn something useful about a command, I add it to its mann page and then whenever I need it in the future, I simply run 'mann <command>' to find it.

Here's the current output of my 'mann sed', for example:

  # Add char to beginning of each line
  sed 's/^/#/'

  # Replace with newline
  sed 's/<oldvalue>/\'$'\n''/g'

  # Replace newline
  sed -e ':a' -e 'N' -e '$!ba' -e 's/\n/<newvalue>/g'

  # Plus sign
  sed -E 's/foo+/bar/'

  # Digit
  sed -E 's/[[:digit:]]/bar/'

  # Inplace
  sed -i'.bak' -e <pattern> <file>


This looks great. I've always kept those types of little snippets in the README of my dotfiles repo - and always keep a printed out copy on my desk. But this seems way more practical.

https://github.com/ben174/dotfiles/blob/master/README.md


If you're not already familiar with it, I would suggest learning about the basic Unix process model -- fork, execve, wait, open, pipe, dup2 and friends.

Bash is essentially a DSL for these. A lot of the weirdness you see in the language is due to these abstractions leaking through. For example:

* Quoting is building execve's argv parameter. It's hard to quote correctly if you don't know what exactly you're working towards.

* Redirections are opening and copying file descriptors. It explains their scope, order and nesting behavior.

* Variables are modifying and passing the environment, and their weird scope is due to forks imposed by the process model.

Once you know how you can do whatever you want in C through the basic syscalls, Bash is an extremely efficient and far less surprising shortcut to do it.
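Two tiny demonstrations of that fork-imposed variable scope (behavior as in bash; other shells can differ):

```shell
x=1
( x=2 )          # parentheses fork a subshell; its copy of x dies with it
echo "$x"        # prints 1

echo hello | read line   # each pipeline stage is its own process in bash
echo "${line:-empty}"    # prints "empty": the parent never saw the assignment
```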


> Quoting is building execve's argv parameter.

can you expand on this? does this elucidate, e.g., variable substitution and the difference between single- and double-quotes? or does it just help demonstrate when you need quotes for an argument that may contain whitespace?


The behavior of quotes can and should be described in terms of how they affect the words they expand to (i.e. the argv you build), so yes, it clarifies all this.

For example, all the various wrong ways of quoting var="My File.txt" or otherwise incorrectly using such a name will result in variations on a wrong argument list:

  execlp("cat", "$var", NULL);            // cat '$var'
  execlp("cat", "My", "File.txt", NULL);  // cat $var
  execlp("cat My File.txt", NULL);        // cmd="cat $var"; "$cmd"
  execlp("cat", "'My", "File.txt'", NULL);// cmd="cat '$var'"; $cmd
  execlp("cat", "My\\", "File.txt", NULL);// cmd="cat My\ File.txt"; $cmd
  execlp("cat", "'My File.txt'", NULL);   // var="'$var'"; cat "$var"
  execlp("cat", "\"$var\"", NULL);        // arg='"$var"'; cat $arg

Meanwhile, all the correct ones result in the same, correct argv:

  execlp("cat", "My File.txt", NULL);  // cat "$var"
  execlp("cat", "My File.txt", NULL);  // cat 'My File.txt'
  execlp("cat", "My File.txt", NULL);  // cat "My File.txt"
  execlp("cat", "My File.txt", NULL);  // cat My\ File.txt
  execlp("cat", "My File.txt", NULL);  // cmd=("cat" "$var"); "${cmd[@]}"
  execlp("cat", "My File.txt", NULL);  // arg="'My File.txt'"; eval "cat $arg"
If you don't know which argument list you're aiming for, you basically have to go by guesswork and superstitions.


What is a good learning resource for the basic Unix process model (i.e. what you are describing)?

Do you recommend a book or something like that? And preferably for a beginner?


Stevens - Advanced Programming in the Unix Environment

Love - Linux System Programming

The Unix-Haters Handbook

(Unfortunately, the best resource ever for this kind of stuff, from which I learned, does not have an English translation.)


What resource was that?


The notes by my OS lab professor, written in Romanian, with self-learning in mind, with detailed explanations of the workings of major system calls and featuring numerous working examples and pitfalls.

I might translate them some day and put them up on the Internet (if I get his permission and some free time).


Thanks man.


Not the first thing to look for, but I've found ShellCheck[1] to be pretty helpful when it comes to correcting typical mistakes.

[1]: https://github.com/koalaman/shellcheck


Seconded. Running ShellCheck integrated with your editor (easy with vim/Syntastic) is a really great way to improve your bash skills.

It will highlight common mistakes, and their wiki explains each one in detail, along with the alternate, better implementation you should use.


This is a great addition while writing bash scripts. Catches a lot of the bad stuff.


Read Greg's wiki - BashGuide: http://mywiki.wooledge.org/BashGuide


Yeah, reading his BashFAQ helped me progress a lot: http://mywiki.wooledge.org/BashFAQ


Besides the obvious answers of just reading the manual, looking up howtos, and stackoverflow I can recommend some habits that might increase your uptake of bash.

1. If you are not running a Unix as your default OS, switch to one (e.g. Linux or Mac).

2. Create a bin directory (~/bin) in your home directory for all your shell scripts and put it under source control. Any script you ever write goes in that directory, even if it's not bash (e.g. python, perl). I find it critical to be able to look at how you did things previously; it helps you learn, and it saves time.

3. Any complicated one-liner that you create or find on the internet: turn it into a script and put it in the bin directory mentioned above.

4. Optimize your personal bin directory and review frequently.

5. If you run Linux, build your system from scratch (i.e. read Linux From Scratch).

6. Bonus to the above: Automate the creation of your personal system through Packer and Bash!

7. Find where things are not automated.

8. Bash is more than just the shell. A knowledge of GNU coreutils as well as tmux/screen is worthwhile and highly recommended.

9. Learn the readline shortcuts. Particularly "ctrl-r".
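For item 2 above, the one-time setup is roughly this (assuming ~/.bashrc is where your shell config lives):

```shell
mkdir -p ~/bin

# put ~/bin first so your own scripts win over system commands;
# this line belongs in ~/.bashrc or ~/.bash_profile
export PATH="$HOME/bin:$PATH"

# and put the directory under version control
cd ~/bin && git init
```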


Little late here but do you have any links for #6?

I've not been able to find anything good on setting up a dev machine image with packer.


How about you don't? Bash as scripting language is rather mediocre.

Anything that is not simple in bash gets hard to read and debug and probably is wrong on some subtle levels.

I have a rule of thumb that any shell script that grows beyond a screenful of lines gets redone in a proper scripting language.


Bash as a scripting language is actually pretty amazing. It gives you everything you need to perform some quick-and-dirty tasks with minimal overhead. If you need only work on sequential data, files and processes, it's a perfect match.

It's not a full-fledged programming language by any stretch of the imagination (lacking structures more complex than associative arrays), but it's damn good for scripts of all sorts.

As an example, I've reimplemented a subset of Ansible (a command able to send "modules" on multiple machines via SSH and capturing+caching their output for subsequent queries) in ~150 lines of Bash. Considering that the size of Ansible, written in the more proper Python, is ~15000 LOC, I'd say Python is the much lesser scripting language.

Edit: to answer the OP's question, the documentation I've found most helpful for learning Bash is the one on the Linux Documentation Project, with the page for arrays deserving special mention: http://tldp.org/LDP/abs/html/arrays.html. I spent a lot of time reading the manual before stumbling upon that documentation, and none of it really clicked until I had a few examples before my eyes.


Typically a program like Ansible will have the majority of its use cases implemented in a minority of its code, while the rest of the code will be there to support special cases, edge cases, rare use cases, etc. So that contributes to the disparity in size.


Ansible has to take into account numerous edge cases, operating systems, backwards compatibility, etc. etc. Of course it's much, much bigger than your 150 lines bash script.


OTOH, if you can structure your problem as a composition of pipelines, it can be quite a bit faster in bash than in a "proper" language. You get to choose optimized tools, and they run concurrently.

Writing efficient bash code forces you to think about your problem differently. It's a very similar process to thinking functionally; e.g. you don't want to deal with lines of a file one at a time in a loop, you want to do filters and maps in languages like grep, sed and awk to deal with data in a streaming fashion with a minimum of forked processes.
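For example, finding the most frequent visitors in a (hypothetical) access.log as one streaming pipeline, with no bash loop at all:

```shell
# grep filters, awk maps to one column, sort | uniq -c reduces;
# every stage runs concurrently on the stream
grep 'POST /login' access.log \
  | awk '{print $1}' \
  | sort \
  | uniq -c \
  | sort -rn \
  | head
```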


Is there a "proper scripting language" somewhere that supports the same first-class access to Unix programs and the same piping | syntax that shells in general do?

I would love something like that.


Perl was created to do this stuff. Perl is like duct tape that holds everything together. Most people call it ugly because of that, but Perl is the best language for quick, dirty one-liners that work. The only problem is that if you look at it later, you will have trouble understanding it.


Agree. Perl is terrific for quick text processing. You can make surprisingly powerful one-liners.


In most platforms you can install the rc shell from plan9. It's what I use exclusively for my shell scripts. It is only when I want to share some script with other people that I consider using /bin/sh, and even then I've gone for rc nevertheless. Here you can read about it: http://doc.cat-v.org/plan_9/4th_edition/papers/rc


Thanks for the suggestion. I've also been after a saner shell, and have been disappointed in one way or another with the approach of either using another language idiomatically (Python, Scheme, Haskell, Scala, etc.), since running commands, piping, etc. are all rather awkward; or using such languages with embedded shell-like libraries, which seem to have awkward edge-cases.

As a "real" shell, it looks like rc maintains the command, file and piping niceties of bash, whilst avoiding the edge-cases of shell-like embeddings.


In Perl, IPC::Run [1], while a bit on the quirky side sometimes, can do that sort of thing.

    use IPC::Run qw(run);
    IPC::Run::run(["echo", "abc"], "|", 
                  ["tr", "a-z", "A-Z"], "|", 
                  ["cat"],
                  \$out);
    print "out: $out";
will yield

    out: ABC
Useless use of cat here just to make a point, of course.

It's easy to use Perl strings as input or output, or files, and there's also ways to interact with streams as they go along.

Generally speaking the slightly more verbose interaction with shell is made up for by the savings when you can use Perl directly to do something rather than spawn a shell process for something simple. (Shell composition can be powerful, yes, but on the other hand spawning a full process and setting up a pipe for things like "tr" or "wc" is often just silly.)

I also personally believe there's a win in the syntax; you may say "What? 'tr a-z A-Z' is way more convenient than '["tr", "a-z", "A-Z"]' but I say one of the biggest and most pervasive errors in shell is to incorrectly set up the arguments by having something interpolated in incorrectly. Having an unambiguous syntax for separating arguments has often made it much safer for me to write the code that has to use the shell. Used correctly it can even be made to work in the face of user-supplied input, something that should generally not be combined with any form of implicit argument separation. (Although bear in mind that "used correctly" encompasses more than just "separating the arguments correctly.")

I am also NOT claiming exclusivity; this is just the thing I know off the top of my head. I'm sure the other scripting languages have nice libraries, too. Though watch out for them being too nice. There's a bit of an impedance mismatch between shell and scripting language. Anything that smooths it over too well is probably either giving up critical features or introducing subtle bugs, which can become security bugs in the face of arbitrary user input.

[1]: http://search.cpan.org/~toddr/IPC-Run-0.96/lib/IPC/Run.pm


Perl of course, even if the adjective "proper" seems weird for Perl ;-)

As a rule of thumb, if a script starts to need arrays or to handle spaces in filename, I migrate it to Perl.

Error handling is also easier and more natural in Perl.


Just found out that xonsh (http://xon.sh/tutorial.html) does this.


Tcl lacks the piping syntax by default, but there are various ways to implement it: http://wiki.tcl.tk/17419


Then you have the age old problem: what if one of the tasks in your pipeline(s) fails?


  set -o pipefail -o errexit
Add more options to taste.


https://stackoverflow.com/questions/29532904/bash-subshell-e...

Not pipefail as such, but Bash's error handling semantics (or lack thereof) are pretty lame.


Indeed. While you can get the rules regarding file name escaping right, the language does nothing to help you get it right every time, so unless you're careful someone will leave a file with a space (or, God help you, a newline) in the name in exactly the wrong place and blow it up.


Totally agree.

Also, even if you manage to become better in Bash, you are bound to lose your skills at some point when you have been programming in other languages for a while.

I always have to look up how to do even basic things in Bash. I just don't use it often enough for these things to "stick".


It's okay to not have everything memorized. As long as you know what to look for, you're only a google search away.


How about OP does anyway? Bash as a scripting language is fucking great!

Since when is simplicity an argument against writing programs? Whether scripts or frameworks? "Hard to read" is not necessarily an inherent trait[1] of the language, and more likely wrong on some PEBKAC level.

I have a customised environment at near 10k lines of bash in 5 projects, all of it in the correct tool for the job, aka a proper scripting language, so I can suggest another use for your thumb :-)

1: https://www.reddit.com/r/commandline/comments/2kq8oa/the_mos...


I would say that a screenful is already on the large side. Then again, there is a lot you can do in a screenful in bash.


If you have really limited resources, then bash or one of the other shells is the only tool at hand. Embedded devices might not give you enough disk space for perl, ruby or python.


There is this one case for which shell scripting is particularly interesting: dependencies simplification.

https://github.com/EtiennePerot/parcimonie.sh/blob/48044f913...


Figure out a problem and try solving it in bash - bash for beginners guide on tldp site can get you started. You get better as you use it.

http://www.tldp.org/LDP/Bash-Beginners-Guide/html/

EDIT : Additional links -

Advanced - http://tldp.org/LDP/abs/html/

Bash programming - http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO.html


ABSG is always my answer to how to learn shell


Yes :) ABSG is good.. but do check out the other links too :)


Just googled ABSG and all I got was 'Adult Bible Study Guide'


To really understand Bash, one's soul must be in tune with God.


http://tldp.org/LDP/abs/html/

Next time, try to google "absg shell".


It's the Advanced Bash-Scripting Guide.


Train yourself to take the 20 minutes required to learn to "do it the right way" every time you need to. It's so easy not to bother because you are busy, but in the long run you will save time.


I think this is the right answer, assuming OP actually wants to get better at bash and not route around his original question by learning stuff that isn't bash.

I use bash/shell scripts frequently and have many running 'in production' as cron jobs that run various jobs or manipulate data for other jobs to run on.

One thing I really like about pure shell is that it's extremely portable and transparent about what it's doing.

I still have to re-learn control structures almost every time I write a new script. I don't try to memorize [[ ]] vs [ ] and all the weird ways equality can work; I just google each time, and the answers are always on top (once you know what you're looking for).
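For anyone else in the same boat, the distinction that finally stuck for me, sketched below (the variable is made up): `[` is an ordinary command whose arguments undergo word splitting, while `[[` is bash syntax with no splitting and glob-style matching.

```shell
name="My File.txt"

# [ is a command: the quotes are mandatory, or $name splits into two words
[ "$name" = "My File.txt" ] && echo "[ works when quoted"

# [[ is shell syntax: no word splitting, and == does pattern matching
[[ $name == My* ]] && echo "[[ matches the pattern"
```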


I couldn't begin to say how many times I've typed "help test". It's a lot. I usually get the answer I'm looking for, when it comes to conditionals.


TIL. even faster than the google


and keep a set of text files for yourself: after learning to do it the right way, write it down. I find that with bash/command-line things especially, I don't solve the same problem often (which would help with remembering), but often enough that re-learning it every time is annoying, so it really pays off to have some documentation to refer to. org-mode and vimwiki are quite good for that.


Read the manual front to back and install shellcheck. Doing both has paid off for me a thousand times over. The rest is practice. Complete the bash exercises on HackerRank. Bash is fantastic in its domain, but it does require serious study in my experience.


I can second the shellcheck recommendation. Shellcheck makes great recommendations, and every suggestion has a corresponding code you can google to get to a detailed explanation of why that is considered a warning or error. Hell, even if I just considered the times I forgot to put quotes around a variable and got warned by shellcheck I would be happy that I use it.



The Tcl programming language is what shell scripting should be, basically. It is not just a language with all the features you need, it has explicit Unix shell alike scripting capabilities and strong DSL abilities. An example two-liner:

    set files [glob /etc/*.conf]
    foreach f $files {file lstat $f file_info; puts "$f: $file_info(size)"}


    /etc/asl.conf: 1051
    /etc/autofs.conf: 1935
    /etc/dnsextd.conf: 2378
    ... and so forth ...
Also there is an `exec` command that supports pipes, redirections, and so forth:

    set result [exec cat /etc/passwd | grep Directory]
The pipe has no special meaning in Tcl, but because of its DSL capabilities you can do things like that. Exec is a DSL basically.


For scripting, I recommend the rc shell from plan9, which is the one I use for my shell scripts. It is only when I want to share a script with other people that I consider using /bin/sh, and even then more often than not I've gone for rc.

I invite you to read about it: http://doc.cat-v.org/plan_9/4th_edition/papers/rc.

I find the control structures simpler and more elegant, and overall its design feels more consistent. For example, consider an if statement in bash:

  if [ condition ]; then
    ...
  else
    ...
  fi
And now in rc:

  if (condition) {
    ...
  } else {
    ...
  }
Or a case statement in bash:

  case $1 in
    "bar")
      ... ;;
    "baz")
      ... ;;
  esac
And expressed in rc:

  switch ($1) {
    case "bar"
      ...
    case "baz"
      ...
  }
In the past, I've used it as my shell too, but now I use it only for scripting. I think you can install it in most platforms.


I recommend starting w/ Gary Bernhardt's excellent "Tar Pipe" blog post.

https://web.archive.org/web/20161227222637/http://blog.extra...

From there, move on to using the shell as your IDE. How? First, understand the Unix philosophy. I think Ted Dzubia describes this pretty well in his Taco Bell Programming blog posting:

http://widgetsandshit.com/teddziuba/2010/10/taco-bell-progra...

Great, so now you understand that there are a bunch of useful tools out there and you can string them together to do great things. Now you need to discover the tools themselves.

If you're a "read the dictionary" kind of person, go ahead and start off w/ the Gnu Coreutils documentation. https://www.gnu.org/doc/doc.html

However, if you're like me you'll learn fastest by watching other people work. In this case, I have to point back to Gary Bernhardt again. Specifically, his "Composing a Unix Command Line" screencast will open your eyes wide and very quickly introduce you to a range of incredibly useful coreutils programs in the context of solving a very specific problem. This content is $29/mo, but I'd argue it's money well spent. https://www.destroyallsoftware.com/screencasts/catalog/compo...


Flame war between bash/fish/zsh/powershell is almost meaningless to beginners, because the basic skills are common to all shells. (That said, you will love zsh once you use it)

I learned to use the shell about 7 years ago by reading O'Reilly's "Classic Shell Scripting". It is well written and teaches you things you can hardly learn from Google. But don't try to remember everything, especially the advanced string manipulation syntax, because one would usually use a scripting language such as ruby for advanced jobs.


> I learned to use shell, about 7 years ago, by reading O'Reilly "Classic Shell Scripting".

I can second this recommendation. :)


Helping you how? If you actually have a problem you are trying to solve, then do just that. My experience with command line came from solving problems I had. Today I do a lot in the command line and I am learning new things all the time. However if I just wanted to get "better" at it, then I don't even know where to start because there is no clear goal.


Well said. It's an ongoing process; there is always something new to learn, some use case we may not have come across yet.


I wanted to get better at bash too, but instead I ended up getting everything[1] done in fish, which is cool and much better as a language, but no environment has fish pre-installed nowadays.

[1]: https://github.com/fiatjaf/react-site


I used to lean on Python for much of my scripting needs, mainly because the more advanced bash syntax was pretty daunting. Getting better at bash has a trickle-down effect, especially in this container age. Env var scoping, loops, and the various variable-expansion methods really made it click for me. Shelling out to various tasks (and grabbing the results) is effortless in bash scripts. With bash on Windows now, it's pretty much ubiquitous. My advice is to ask why the task at hand can't be done in bash, because often it can, with much more portability.


In both my personal and professional life, I have the opposite conclusion. Bash is only for the simplest scripts and pipelines, and everything else (assuming you're going to use it more than once) gets written in Python or Go.

The Bash syntax is not daunting, or if it is, that's never been the problem with Bash. The problem is that Bash, or shell programming in general, gives you a million ways to shoot yourself in the foot and maybe one or two obscure, funny-looking ways to do what you want. Like iterating over the files in a directory, for example. If you think that's easy in Bash, you either have a funny definition of easy or you're forgetting a few corner cases. Or getting the output from a program - did you know that $(my_prog) or `my_prog` modifies the program output?

For containers we do everything declaratively.
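To make the $(my_prog) caveat concrete, here is a minimal, self-contained sketch of how command substitution strips trailing newlines (the sentinel trick at the end is one common workaround):

```shell
#!/usr/bin/env bash
# Command substitution strips ALL trailing newlines from the output:
out=$(printf 'two lines\n\n\n')
echo "${#out}"             # -> 9: just "two lines", newlines gone

# One way to preserve them: append a sentinel, then strip it afterwards.
out=$(printf 'data\n\n'; printf x); out=${out%x}
echo "${#out}"             # -> 6: "data" plus its two newlines survive
```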


> Like iterating over the files in a directory, for example. If you think that's easy in Bash you have either have a funny definition of easy

Maybe I'm overlooking something...but why wouldn't this work:

for file in $(ls); do {<looped command>}; done


  $ mkdir "hi there"
  $ mkdir "how are you"
  $ ls -l
  total 8
  drwxr-xr-x 2 xxxxxxxx xxxx 4096 Jun 26 14:54 hi there
  drwxr-xr-x 2 xxxxxxxx xxxx 4096 Jun 26 14:54 how are you
  $ for f in $(ls); do echo $f; done
  hi
  there
  how
  are
  you


You can use

  find . -type d

or set IFS:

  IFS='
  '

or

  IFS='\n'

then reset it later. Perhaps store the existing IFS in a variable so you can put it back. The \n vs. an actual line feed has to do with an edge case in Cygwin and DOS files.


ls | while read file; do echo "$file"; done

... works better for the easy edge cases, but still probably has some issues. Personally I think klodolph called it; once you get into anything that has a few interesting edge cases bash becomes pretty unwieldy.


The easiest way is still: for file in *; do ...; done

This works fine with spaces and other strange characters, no need to come up with more complicated ways.

If you must use find, the correct way is really tricky and uses obscure bash features like setting IFS to the empty string to split on null bytes, and process substitution so that the loop doesn't run in a subshell (which would prevent it from modifying variables in the current environment):

while IFS= read -r -d '' file; do ... done < <(find /tmp -type f -print0)

See http://mywiki.wooledge.org/BashFAQ/020


I like that, I've never thought of or seen it, although it should've been intuitively obvious given how bash expansion works.

I think this demonstrates the point pretty well though - it takes a discussion among three of us before arriving at something that works moderately well, and we found four fairly different ways of doing it...


Fair enough.


for file in *; do { ... }; done

is both much easier and much safer than piping the output of ls.

It does the correct thing with filenames containing spaces or weird characters, and the intent is much more visible, so there is really no reason to use $(ls).

More details: http://mywiki.wooledge.org/ParsingLs


Fair enough! I'm more in the devops/platform space so worrying about python versions and setting GOPATH is wasted time for my purposes. What I love is the portability with minimal effort.


I'm in a similar position. Unfortunately I've seen far too many non-portable Bash scripts fail to run in production and I can't trust them! This includes simple errors like #!/bin/sh at the top, missing programs, or differences between GNU and BSD versions of programs (macOS and Linux). Chalk it up to differences between dev workstations and production. Python has some problems too but they tend to be pretty well-understood and navigated by our developers.

GOPATH isn't actually necessary except when you compile, and Python version problems mostly come up with larger programs you'd never dream of writing in Bash. 2.7 is dirt common even if you're using a slow-as-molasses-update-cycle LTS Linux.


From a 14+ year Linux/Unix admin:

Get a very brief reference book of every common UNIX command. Read all the commands, what they do, what options they take. Start using them.

Shells are most useful when they are used to tie together other programs. In order to do this, you have to know what all the command-line tools you have at your disposal are. Learn the tools, then start writing examples using them. Keep the examples and the docs somewhere to reference them later.

For quick reference, the command 'whatis' will give the blurb from the top of a command's man page: 'whatis ls' gives "ls (1) - list directory contents". View many at once with "(cd /usr/bin; whatis * | grep -v noth)". Many often-used commands come in the "util-linux" and "coreutils" packages. Read man pages completely when convenient.

It may also help to have a VM or desktop which has no GUI, where you will be forced to use the command-line. When I was starting out I used a desktop with no X server for months. You can get a lot more done than you think (and 'links -g' provides a graphical browser if you need images)

To learn more about Bash itself, you can look for server installation software packages made with Bash, or in the "init" tools distributed with big distros like RedHat, SuSE, etc before they used systemd. But it's better to get used to more shell-agnostic scripting using UNIX commands than it is to use Shell-specific language/syntax.


"(and 'links -g' provides a graphical browser if you need images)"

Without X presumably needs framebuffer or something?


It has a variety of drivers, such as for X, framebuffer, svgalib, OS/2 PMShell and AtheOS. It also bundles antialiased fonts.


Don't listen to the guys who are saying not to learn bash. In the right circumstance bash is much better than any verbose python script.

I'd say learn the following topics:

pipe grep sed awk find

Once you feel comfortable using and combining these tools you should be able to find out the rest by yourself.


At the risk of being pedantic, grep sed awk and find are not bash, they are their own programs. You can run find from any shell, or without a shell at all.


I prefer Perl over Awk.


I'm ok at bash, but I do not default to complicated bash scripts for my needs. I make little reusable tools. For example I have a tool aliased that makes it easy to apply quick ruby code. For example

  echo "345.44
  544.50" | rg "€#{l}: €#{(l.to_f * 3).to_i}"

Produces the following output:

  €345.44: €1036
  €544.50: €1633

Based on this code:

https://gist.github.com/zachaysan/4a31386f944ed31a3f8a920c85...

I find it's much faster to be productive like this than it is to try to do the same with ruby -e because I really only want to manipulate single incoming lines. I don't want to have to write the looping code or the code that sets variables or what have you.

Also, it sometimes gets confusing which tools are bash functions or aliases and which are scripts, so if you ever forget a tool's definition, just type:

type toolname

As for actually answering your question, look at your friend's dotfiles on their github account to learn which tools and tricks they use and when you don't know how something works ask them questions. People will usually point you in the right direction.


Any particular reason you don't use `ruby -p`? Looks like it's doing pretty much the same thing.


One thing I would say is first check that you are running the latest version! There has been a lot of development to bash over the years. If you feel like customizing things a lot I'd check out zsh and http://ohmyz.sh/. It has a lot compatible with bash with (IMO) some saner scripting support.

Aliases are something I use a lot - it's very basic but just having "big long command with options" aliases to something easy to remember makes it much more likely I will not make mistakes, can repeat it easily in loops.

Another thing that complements using bash effectively are using other applications config files. As soon as I have a new host to interact with I add it to my ssh.config file - then any scripting I need to do I don't need to deal with any special files. Other files like ~/.netrc or ~/.pgpass make my shell sessions that much more productive. For some reason many people don't bother ever updating these and rely on the shell history to do anything more than once.

CommandlineFu (http://www.commandlinefu.com/commands/browse) has some nice one liners and there's often some gems on ServerFault (https://serverfault.com/questions/tagged/bash) - just browsing those for topics matching your workflow can be very fruitful.

I've found the more I do anything at the shell of any complexity I end up writing a small python command line client to do any heavy lifting. Argparse (https://docs.python.org/3/library/argparse.html) makes this trivially easy and then I can use regular shell "glue" to combine a few commands together.


In addition to what others said, I can recommend just reading through the manpage once (probably in multiple sittings). Even if you don't remember the exact syntax, you will have an idea what bash can do, and know enough of the jargon to find it again in the manpage when you need it. For example, when I need to replace a substring, I used to do

  FOO="Hello World"
  ...
  BAR="$(echo "$FOO" | sed "s/World/Hacker News/")"
until I remembered that bash can do string replacement by itself. A quick search for "pattern" and "substitute" in the manpage turned up the right syntax,

  BAR="${FOO/World/Hacker News}"


I would also recommend reading the Single UNIX Specification on shell syntax, and then avoiding bashisms whenever possible. Bash has a history of subtly changing its behaviour in these areas, which leads to scripts suddenly breaking, without any modification, on a system update.


Could you give some examples? http://wiki.bash-hackers.org/scripting/bashchanges and http://wiki.bash-hackers.org/scripting/obsolete aren't particularly enlightening.


What I was hit with were subtle changes to how [[ ]] worked, along the lines of implicit anchoring in pattern in previous version not taking place in succeeding version. It took quite a while to debug what became a logic error.

After that I started avoiding bashisms in my scripts, especially that they usually provide very little benefit at the cost of gambling against breaking silently on a barely related update.

Nowadays it's also worth noting that #!/bin/sh often (Debian derivatives) points to something that is not Bash, and people writing in shell/Bash usually don't understand the difference.


I'd recommend reading the manpage far more than once.

It runs about 70-80 pages, so counts as a small book.

Hrm. Make that 107 pages:

    man bash | pr | grep 'Page [0-9][0-9]*' | tail -1
46,000 words.


Forget the Bash man page (just temporarily, that is) and read this:

http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3...

The POSIX specification of the Shell Command Language.

Also, don't overlook that there is a GNU Info manual for Bash, not just the manual page:

https://www.gnu.org/software/bash/manual/html_node/index.htm...


I was about to post:

> Add, "#! /usr/bin/python" to the top of your scripts, it will make your life easier.

However, after reading the rest of the thread, it seems Python and similar langs are not actually great for the kind of things people use Bash for, and Perl is the way to go!

Great, another language to learn...

edit, re: python:

fiatjaf suggested xon.sh:

"shell language and command prompt [..] based on Python, with additional syntax added that makes calling subprocess commands, manipulating the environment, and dealing with the file system easy."

http://xon.sh/tutorial


I learned bash primarily for practical reasons (ie. navigating in Linux, parsing files, searching for text) and nothing more. Definitely not from a DevOps or SysAdmin point of view as I'm sure fits the background of many commenters in this thread.

If your use case is pragmatic in nature, I would recommend my post on the topic: http://alexpetralia.com/posts/2017/6/26/learning-linux-bash-...


Take a look at this tutorial [1]. It will teach you some shortcuts and a bit of shell expansion, and help you set sane defaults in bash. One that I'm particularly fond of is setting

  "\e[A": history-search-backward 
  "\e[B": history-search-forward 
in your ~/.inputrc. So, if you are typing a command which begins with "git", it will only search in history for commands that start with git (instead of returning all commands that may include the string 'git' like Ctrl+r). Having trouble trying to remember that option you passed to `git log`? Just type in `git log` and press the up arrow to find your last usages.

I think it is also helpful to learn Emacs or vim keybindings. I use Emacs keybindings in bash a lot (enabled by default). I have summarized the ones that I used more often in a previous comment [2].

[1]: https://www.ukuug.org/events/linux2003/papers/bash_tips/

[2]: https://news.ycombinator.com/item?id=13404262


Just like how you learn any other programming language: use it to solve your problems.

Anyway, here's a few steps that I would recommend:

1. Go through http://tldp.org/LDP/abs/html/ and http://www.tldp.org/LDP/Bash-Beginners-Guide/html/ , or at least go through the table of contents so that you have a feeling of what bash is capable of. A few important things are: if, while, for, switch, functions, string manipulation, pipe, subshell, command substitution

2. Understand the execution model. Variables in subshell cannot be accessed from the parent shell, this is a common mistake

3. Learn to avoid common pitfalls. I always recommend that my colleagues quote variables in double quotes, use "$@" instead of "$*", use double square brackets instead of single square brackets for testing, and use echo to pass return values from functions instead of assigning to global variables

4. Learn awk, sed, grep. Bash can be quite limiting when it comes to data processing and these tools can be quite powerful. You can use bash to glue different filters together at a higher level.

Bash is a fantastic language, and there are quite a lot of things that can be done much more quickly in bash than in other "proper" languages. A lot of people say that it's too difficult to maintain a shell script beyond a "critical mass", but I believe that if you follow good practices and write modular code, shell scripts can be very manageable.
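As a minimal sketch of the quoting pitfalls in point 3 (the count_args and demo functions here are made up for illustration):

```shell
#!/usr/bin/env bash
# "$@" preserves each argument as a separate word; "$*" joins them into one.
count_args() { echo "$#"; }

demo() {
    count_args "$@"   # forwards the arguments unchanged -> prints 2
    count_args "$*"   # collapses them into a single word -> prints 1
}

demo "one two" three

# Unquoted variables undergo word splitting; quoting keeps them intact:
f="two words"
set -- $f;   echo "unquoted: $#"   # -> unquoted: 2
set -- "$f"; echo "quoted: $#"     # -> quoted: 1
```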


If you're interested in getting better at the terminal, I recommend you learn how to customize it. it'll really help you learn by doing and figuring out what it is you want to learn how to do first.

It's what worked for me, though. There are also some workflow ideas that have really helped a lot. Autocompletion and being able to look through your history for commands are super helpful too.

  cat $HOME/.bash_history | grep -E 'command|argument'

https://github.com/zsh-users/zsh-autosuggestions

I just finished a guide on my site about my terminal setup. I hope to read yours once you've customized the pixels out of it.

Aside from things to get you interested in the internals of your shell via bash scripting, you should also consider writing more shell scripts specifically around your workflows. I keep mine in a .files repo on GitHub. Take a look at the install script. It took me over a year to get fluent enough in bash scripting to keep getting better and better at it.

Good luck on your journey!


Bookmark the following url and come back to it as often as you need:

http://tldp.org/LDP/abs/html/

Also this one to learn some cool tricks:

http://www.commandlinefu.com/commands/browse


Yes for ABSG, especially the manipulating strings and process substitution.

That and shellcheck(+syntastic if you use vim) ramps up the skill level quite fast.


First, if your bash script grows beyond about ten lines, it's time to consider rewriting it in a cleaner language. Python's a common one. I used to use Haskell for that kind of scripting as well, which was astonishingly good at it.

Here's my study suggestion:

0. Learn to use variable interpolation and backticks.

1. if blocks and the [ built-in function. Go read about the grammar and look at the flags that [ takes. Memorize the most common couple (file exists, is a directory), and know how to look up the others when needed. Find examples of variable interpolation tricks needed to make this function.

2. for and while blocks. Learn the grammar. for is mostly useful with `seq ...` or a file glob.

3. Learn some of the options to make bash fail early and loudly like pipefail.

4. Most of the power of bash is in the programs you call, and they aren't always the same ones you use interactively. Other folks have mentioned some of these. find, xargs, wait...
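A small sketch of the fail-early options from point 3 (set -e is shown commented out so the failing pipeline below can be demonstrated):

```shell
#!/usr/bin/env bash
# Fail early and loudly:
set -u          # error on any use of an unset variable
set -o pipefail # a pipeline fails if ANY stage fails, not just the last one
# set -e        # exit on any non-zero status (left off so the demo below runs)

# Without pipefail this pipeline would "succeed", because tail exits 0:
if false | tail -n 1; then
    echo "pipeline passed"
else
    echo "pipeline failed"   # printed: pipefail surfaces the failure
fi
```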


I agree that writing long, monolithic scripts isn't optimal, but there's no reason you can't break a large script into multiple smaller scripts. And there are certain tasks where <programming language of choice> won't offer any advantage over a regular ol' shell script. If one really needs/wants to get into Bash, I'd recommend learning it well enough to make educated decisions about when Bash starts becoming a pain and the benefits of <programming language of choice> start to kick in. In some cases, a <programming language of choice> will be more of a hindrance than a help, especially in areas where the functionality you want cannot be found in the standard library.


> if your bash script grows beyond about ten lines, it's time to consider rewriting it in a cleaner language

Just because you don't feel comfortable writing long scripts doesn't mean you should discourage others.

There are many many justifications for sticking to shell, for example if you need to write a portable installer that works across every UNIX variation.


> Just because you don't feel comfortable writing long scripts doesn't mean you should discourage others.

And deprive them of the hard learned lessons from decades of experience?

> There are many many justifications for sticking to shell, for example if you need to write a portable installer that works across every UNIX variation.

Do you mean every Linux variation? Because trying to write shell portable from old HP-UX to Darwin is an exercise in insanity.


Set your default shell to bash, and every time something annoys you, look up how to fix it (stack overflow is basically complete at this point, if you know what to search for) and put it in your dotfiles.

I used to rely on fish, but after a couple of bugs (either in fish or my fingers, not sure) I switched back to bash at my job (on a Linux desktop).

After a few months I had built up a good set of aliases and functions (my most used function is rgrep, see below) and was confidently ^R reverse searching and so on. These things are great because as you jump systems (e.g. to macOS) they continue to work.

TLDR: Practise practise practise!

    # the rgrep function
    
    # recursive text file search from the current directory.
    function rgrep {
        if [ -z "$1" ];
        then
            echo "please supply a search string."
            return 1
        fi
        grep -rn "$1" .
    }


I can recommend tools called Silver Searcher and/or RipGrep for text searching.

  https://github.com/ggreer/the_silver_searcher
  https://github.com/BurntSushi/ripgrep


As an alternative, you could also look into PowerShell. It's open source and cross-platform. I use it because it's really powerful on Windows.

In any programming language, you learn by practice. Given that your shell does so much, that's the easiest place to find tasks to practice on. I have been leaning on my shell scripts to do a lot of automation. The list is long and I just pick something from that list to work on for most days.

If you don't have system automation that you want to work on, then you probably have a lot of personal data that you can work with. I have scripts setup to manipulate data exports from the various services I consume and then remix that data in my own database. My shell scripts can get the data, operate on it and then shove it into a DB. Then I'll use something else to display that data.


https://www.gnu.org/software/bash/manual/bash.txt

I like bash for the same reason I like emacs, in that no matter what the environment is like, I can usually count on my bash scripts to work. I keep them in emacs org-mode files, where I store them in src code blocks. I can tangle multiple code blocks into single executable scripts in different directories. Check out org-mode babel, tangling, and noweb. Keeping all my bash code in a single file solves my issue of having to dig for that one script I wrote that one time because I forgot how to do this one thing ...

If you aren't running Linux on your desktop yet, consider it. Full immersion is a fast way to learn.


Learn to move around efficiently: End of Line, Beginning of Line, Move by word so you aren't just abusing your keyboard.

Knowing how to reverse search (Ctrl-R) and run last command !vim or !curl to rerun last instance of vim or curl command with args so you don't have to search every time.


I wrote a thing for this! People should read it! Focus is on usage, not scripting.

What Kai Thinks Every Developer Should Know About the Shell

https://github.com/tenebrousedge/shell_guide


This script can be useful to save man pages (not just the bash man page, any man page) as text - removing all the control characters which are used for printing with formatting (bold, etc.):

m, a Unix shell utility to save cleaned-up man pages as text:

https://jugad2.blogspot.in/2017/03/m-unix-shell-utility-to-s...

I've been using it since earlier Unix versions, where these formatted (nroff/troff) man pages were more of an issue. It also works if you want to open the text form of a man page in vi or vim for reading, searching, etc.


1. Use it.

2. Conceive of use-cases you can't already solve, and see if you can find a way to do them using Bash.

3. Consider that perhaps Bash isn't the best tool for every job. (It most certainly isn't, though you can abuse it frightfully.)

4. Books. Jerry Peek's guides are getting rather dated, but they're still a good introduction.

5. Read the manpage. Frequently. Find some part of it that doesn't make sense, or that you haven't played with before, and play with it. Shell substitutions, readline editing, parameter substitution, shell functions, math, list expansions, loops, tests, are all high-payoff areas.

6. Take a hard look at zsh, which does a great deal Bash doesn't.


Do some puzzles(wargames). For example: http://overthewire.org/wargames/

It will make you search for several special use cases and will give you some experience with the command line. Basically, you SSH into a box, solve a problem and the solution for that problem is the password for the next SSH connection for the next problem.

This one is for beginners: http://overthewire.org/wargames/bandit/


This doesn't work for everyone, but if you allow yourself the time, trying to solve specific problems you face (asking: could I automate this? Could this be simpler?) is a great way to get better - and it's how I learnt my way through bash.

I recently released https://terminal.training (paid course for 4 hours) which is just for this kind of question, but I've also started a free mini email course (same URL) that tries to share some of the CLI shortcuts I've come to rely on over the years.


Depending on your level, there's https://terminal.training/ (by the same chap who runs jsbin).


In my opinion the most important thing to know about shell scripting is when not to use it. The shell is very powerful but also clumsy for tasks that exceed the basics.


A while back, I wrote a quick Bash guide called "Adventures in Data Science with Bash" (https://gumroad.com/l/datascience).

It covers basic Bash commands (head, less, grep, cut, sort, uniq, curl, awk, join), but also pipes, for loops, variables, arrays, and command substitution.


Work through the Advanced Bash Scripting Guide (http://freecode.com/projects/advancedbashscriptingguide/). It's a great resource and is one that I turn to when I need to remind myself of how to do something.


The best I can really recommend is practice. A lot of "bash" skill comes not from bash itself, but rather from the tools around it.

If you're not already comfortable with input/output redirection (including pipes, as well as reading from / writing to files via <file / >file, respectively), then that's where I'd start.


The shell language itself is pretty simple and featureless. It relies on the system utilities for most things so once you know the basics of how the shell works you should probably focus on learning those rather than bash. Also I find that when it comes to interactive use having good key bindings and some nice aliases makes a lot of difference.


Use it daily. Use it instead of your graphical file explorer options.

Everything will take longer but imho it's the only way to get better.


This article posted here last year has some excellent tips for advancing your bash-fu; many have been mentioned here, but it has some extras as well:

http://samrowe.com/wordpress/advancing-in-the-bash-shell


I made a search engine to show the best learning paths for learning any topics.

Here is the path for learning bash : https://learn-anything.xyz/operating-systems/unix/shells/bas...


GNU BASH is a branch of the Bourne-shell family.

Korn shell is a much more complete and capable variant of Bourne. BASH partially implemented many Korn features, but not everything.

The standard reference is the Korn and Bolsky book (2nd edition). I'm not aware of any free/online resources that are profoundly good.

Korn is the very best for scripting.


I'm not sure if this answers your question since I'm not sure what the implications of your workflow are, but I got a huge jump in productivity with Oh My Zsh. It has a bunch of features like tab completing past commands, it includes various shortcuts, etc. It's also compatible with bash.


It's considered kind of a security risk, but I like to save ALL of my commands (put this in .bashrc):

  export PROMPT_COMMAND='echo "$(history 1)" >> $HOME/.basheternalhistory'  
Now you can search it later for arcane commands you've forgotten how to use.


I recommend also learning a different shell's quirks and syntax. I started using fish a couple years ago, but I still have to write any 'production' scripts in bash. Learning fish cleared up lots of misunderstandings I had with bash and has made my bash much improved.


I found this useful for getting deeper with using bash: "Pro Bash Programming : Scripting the GNU/Linux Shell, Second Edition." It is kind of a brain dump type of book but it called out a bunch of little things I had missed in looking at other information sources.


1) Force yourself to do as much as possible from the command line.

2) When you Google for help and find a solution, keep Googling until you understand what you're doing.

That worked for me. You could also read the man pages, but step #1 is crucial regardless.


If you like learning by doing, play along with this:

https://web.stanford.edu/class/cs124/kwc-unix-for-poets.pdf

(text analysis in bash)


Check out ShellJS if you want to build scripts. For me at least it made things a lot less archaic: https://github.com/shelljs/shelljs


Anytime I'm stuck with a script I usually end up on https://bash.cyberciti.biz

Vivek (founder) has been writing these tutorials for 17+ years, he knows his stuff.


This covers 90% of bash: Learn Enough Command Line to Be Dangerous https://www.learnenough.com/command-line-tutorial


I have a couple of resources for you:

http://www.commandlinefu.com

http://www.shell-fu.org


Bash has tons of quirks, but as others said, reading man bash helps quite a bit.

There is also this great resource: http://wiki.bash-hackers.org


Understand and utilize functions in your scripts.

If there is one thing I wish I had understood sooner, that would be it.

Top to bottom programmatic flow is one thing. Conditional execution and branching are on another level.


One of my most common uses for shell scripts is writing completion functions for my aliases and custom commands. Hitting tab for a list of all options for a command is a huge timesaver.
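For anyone curious what such a completion looks like, here is a minimal sketch; the deploy command and its options are hypothetical:

```shell
# Complete a hypothetical "deploy" command from a fixed word list:
complete -W "staging production --dry-run" deploy

# For dynamic candidates, supply a function instead:
_deploy_complete() {
    local cur="${COMP_WORDS[COMP_CWORD]}"
    COMPREPLY=( $(compgen -W "staging production --dry-run" -- "$cur") )
}
complete -F _deploy_complete deploy
```

After sourcing this (e.g. from ~/.bashrc), typing `deploy st<TAB>` would complete to staging.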


I really enjoyed http://guide.bash.academy/ but it's not completed unfortunately.


Pick up a copy of UNIX Shell Programming by Stephen Kochan. Very approachable and lots of good practical examples. Augment it with 'man bash' and Googling.


Help develop and test this: https://github.com/exercism/bash


If it's really about helping you with the command line, then use fish.

If you need to automate something use your favorite programming language.


man bash. read, digest, apply, repeat. you can also look at existing scripts, figure out what they do, why, make copy, alter, see the changes in behavior. there are also books on bash.

larger point: how do you learn more about X? or get better at doing X? figure that general pattern out and you can re-apply it for anything, not just bash.


Give up? Seriously.

I used to force myself to do all of my ad-hoc scripting in bash, but I got sick of the clunky way of parsing arguments, dealing with arrays, looping over data objects, etc.

I got pretty good at it, but at some point I decided just to stick to a language I knew well (R) to string together various pipelines and construct commands. Any high-level language would work, though. I'm much more productive now.


Learn how return and echo behave in functions; this often bites people.

Learn from this wiki, which has many tutorials and good examples: http://wiki.bash-hackers.org/bash4
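A short sketch of the return-vs-echo distinction that bites people:

```shell
#!/usr/bin/env bash
# "return" sets an exit STATUS (0-255) only -- it cannot return a string.
# To hand data back from a function, write it to stdout and capture it.
get_name() {
    echo "alice"     # the "return value" travels via stdout
    return 0         # the status, checked with $? or an if
}

name=$(get_name)     # command substitution captures the stdout
echo "got: $name"    # -> got: alice

if get_name >/dev/null; then
    echo "status was success"
fi
```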


Write and distribute tutorials targeting a wide audience with varied amounts of experience.


HackerRank has various bash challenges: https://www.hackerrank.com/domains/shell/bash


Have you read the manual?


$ help fc

Using ${EDITOR} to build command lines is awesome. 'nuf said.


Can't wait to see this thread commented on n-gate.


among other things make sure you know how to use ! and ^ -- they will save you tons of time compared to searching or editing
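A quick cheat sheet of these shortcuts (the first four are interactive-only; $_ also works in scripts):

```shell
# Interactive history shortcuts (type at a prompt, not in a script):
#   !!            re-run the previous command (classic: "sudo !!")
#   !vim          re-run the last command starting with "vim"
#   !$            the last argument of the previous command
#   ^http^https^  re-run the previous command with the first "http" -> "https"
# In scripts as well as interactively, $_ holds the previous command's
# last argument:
mkdir -p /tmp/histdemo
cd "$_" && pwd            # $_ expanded to /tmp/histdemo
```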


ctrl-r is my favourite.


As you're focused on the command line, I won't mention things I generally use in shell scripts like compound commands (if, for, while) or shell parameters. I will also skip things I don't often use.

First, there is moving around in bash - the arrow keys, or backspace/delete to remove a character, or ^A to go to line start, or ^R to search command history, or tab to complete a command. ^L clears the screen, although from habit I still type clear.

I use shell/bash builtins cd, and pwd often enough. Sometimes export, umask, exit, ulimit -a, echo.

I use shell variables like PS1, HOME, and PATH. I set them in $HOME/.bashrc, which sometimes references files like $HOME/.bash_aliases. I often set a larger than default history file size. I use ~ tilde expansion as an abbreviation for $HOME. For long commands I type regularly, I put an alias in the run control (or run control delegated) file.

I use job control commands like bg, fg, jobs and kill. You should know how bash job control complements and diverges from the system process commands. & starts a command as a background process, and prefixing it with nohup tells it to ignore hangup signals.
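A small non-interactive sketch of these job-control pieces (long_task in the last line is hypothetical):

```shell
#!/usr/bin/env bash
sleep 5 &                 # & runs the command as a background job
pid=$!                    # $! holds the PID of the most recent background job
jobs                      # list this shell's jobs, e.g. "[1]+ Running sleep 5 &"
kill "$pid"               # signal by PID; interactively, "kill %1" also works
wait "$pid" 2>/dev/null   # reap the job; wait returns its exit status
echo "background job cleaned up"
# nohup long_task &       # nohup lets a job survive terminal hangup (SIGHUP)
```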

You should know how single quotes work, and how to escape characters when they are needed.
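The core quoting rules can be shown in a few lines (a minimal sketch):

```shell
#!/usr/bin/env bash

name="world"

echo 'Hello $name'      # single quotes: everything literal -> Hello $name
echo "Hello $name"      # double quotes: variables expand  -> Hello world
echo "Hello \$name"     # backslash escapes $ inside double quotes

# A single quote cannot appear inside single quotes; close the string,
# add an escaped quote, and reopen:
echo 'it'\''s a trap'   # prints: it's a trap
```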

Then there are pipes (| - "pipelines"), and redirecting of stdin, stdout, and stderr. I use this a lot. Also redirecting or appending output to a file (>, >>). I don't use tee often but sometimes do.
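The pipe and redirection operators in one place (file paths here are just illustrative):

```shell
#!/usr/bin/env bash

# Pipe: stdout of one command feeds stdin of the next
printf 'b\na\nc\n' | sort | head -n 1    # prints: a

# Redirect stdout to a file (> truncates, >> appends)
echo "first"  > /tmp/demo.log
echo "second" >> /tmp/demo.log

# Redirect stderr (fd 2); 2>&1 merges it into stdout
ls /no/such/dir 2> /tmp/errors.log
ls /no/such/dir > /tmp/all.log 2>&1

# tee writes to a file AND passes the stream through
echo "logged" | tee /tmp/tee.log
```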

Then there are commands used with the shell a lot. Such as parallel, or xargs.
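xargs builds command lines from whatever arrives on stdin; a few common shapes:

```shell
#!/usr/bin/env bash

# Turn stdin lines into command arguments
printf 'a\nb\nc\n' | xargs echo            # prints: a b c

# -n limits arguments per invocation (here: two runs of echo)
printf '1\n2\n3\n4\n' | xargs -n 2 echo

# -I names a placeholder for each input item
printf 'x\ny\n' | xargs -I{} echo "item: {}"

# Classic combo with find: -print0 / -0 handle filenames with spaces
find /tmp -name '*.log' -print0 | xargs -0 echo

# GNU parallel (if installed) runs the jobs concurrently:
#   printf '1\n2\n3\n' | parallel echo
```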

Also nice, which modifies process scheduling priority.

Script keeps a log of your shell session (saved by default to a file named typescript).

Screen allows for multiple shell sessions. Useful on remote hosts especially (tmux is an alternative).

Then there are the standard file and directory commands I often use like pwd, cd, ls, mv, cp, df, chmod, du, file, find, locate, mkdir, touch, rm, which, and wc.

I manipulate these with commands like awk, sed, tr, grep, egrep, cat, head, tail, diff, and less.

I edit with vim or emacs -nw.

Commands like htop, ps, w, uptime and kill let me deal with system processes.

Then there are just handy commands like bc or cal if I need to do some simple addition or see which day of the week the first of the month is.

Man shows you manual pages for various commands: "man command" will show the manual page. For a command like kill, the default shows the command - "man kill", which is specifically "man 1 kill". But "man 2 kill" would show the kill system call instead. You can see what these manual sections are with "man man" - 1 is executable programs, 2 is system calls, and so on.

All of it is a process. I mentioned awk. It is one of the commands handy to use with the shell. I have seen entire programs written in awk. Some parts of awk I can use from memory, some I use occasionally and have to look up the flags to refresh my memory, some parts I have never used at all. As time goes on you pick up more and more as you need it.
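A taste of awk as its own little language - each input line is split into fields ($1, $2, ...) and run through pattern/action rules (the sample data below is made up):

```shell
#!/usr/bin/env bash

# Sum the second column, grouped by the first column
printf 'alice 3\nbob 5\nalice 2\n' |
  awk '{ sum[$1] += $2 } END { for (k in sum) print k, sum[k] }' |
  sort
# prints:
#   alice 5
#   bob 5

# One-liners like this cover most day-to-day use, e.g. pulling a column:
ps aux | awk 'NR > 1 { print $2 }' | head -n 3   # first few PIDs from ps
```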


Some suggestions:

Direct your focus to plain Bourne sh as much as possible, moving on only after you understand what enhancements Korn or Bourne-Again actually offer over vanilla sh.

Pick up Manis, Schaffer, and Jorgensen's "UNIX Relational Database Management" and work through the examples to get a feel for the philosophy behind large, complex applications written in shell.

Join a Unix community (sdf.org) and try to do useful things with shell (e.g. cron-schedule stock quote e-mail notifications via shell scripts, etc).



