It also doesn't beg you to buy books he didn't write... :P
- grep derp | sed whatever is the same as sed '/derp/ whatever' (you might be interested in sed -r for that matter)
- sed whatever | sed whatever_else is the same as sed -e whatever -e whatever_else (a single 'whatever; whatever_else' script also works)
This list is by no means complete, but it already saves a lot of external processes. And seriously: think about writing the whole thing in pure bash or awk. If you aren't composing the many features these tools provide, there may be too little gain to justify forking all of them.
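A quick sketch of both consolidations, using a made-up log file and throwaway patterns:

```shell
# Hypothetical sample input, just for illustration:
printf 'a ERROR b\nplain line\n' > /tmp/demo.log

# Two processes: grep filters, sed substitutes...
grep 'ERROR' /tmp/demo.log | sed 's/ERROR/WARN/'

# ...one sed does both: -n suppresses default output, and the p flag
# prints only the lines where the substitution actually happened.
sed -n 's/ERROR/WARN/p' /tmp/demo.log

# Two chained seds...
printf 'foo baz\n' | sed 's/foo/bar/' | sed 's/baz/qux/'

# ...collapse into one invocation with multiple -e expressions
# (a ';'-separated script works too):
printf 'foo baz\n' | sed -e 's/foo/bar/' -e 's/baz/qux/'
```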
The only time I have to use more complicated constructions is to get around the stupid problem that passing filenames with pipes is almost impossible to do safely.
As a result, I did as much as I could using only bash internals, and very rarely did I use pipes, because they always fork()'d a bash subshell in addition to whatever the process was.
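For what it's worth, the usual workaround for the filename problem is NUL-delimited records; a minimal sketch (the /tmp/demo_files directory is made up):

```shell
# A filename with a space -- the classic thing that breaks naive pipes:
mkdir -p /tmp/demo_files && touch "/tmp/demo_files/a file.txt"

# find -print0 emits NUL-terminated names; read -d '' consumes each one
# whole, so whitespace (even embedded newlines) can't split a name:
while IFS= read -r -d '' f; do
    printf 'found: %s\n' "$f"
done < <(find /tmp/demo_files -type f -print0)
```

Feeding the loop via process substitution rather than a pipe also sidesteps that subshell fork, so variables set inside the loop survive it.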
In about 7k lines of bash script written to completely automate the iterative development of two games with a shared game engine featuring dockerized server containers, I was able to avoid using sed in almost all of it. When I did use it (pretty sure only two places), it was basically for mass substitution of variables inside of text files, and the multi-line syntax worked very nicely for this, as I could form the sed line with a loop over all the variables I wanted to replace.
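The loop-built substitution described above might look roughly like this (the variable names and the @KEY@ placeholder convention are my invention, not the original script's):

```shell
# Hypothetical template variables:
declare -A vars=([HOST]=db01 [PORT]=5432)

# Accumulate one -e expression per variable, then run sed exactly once:
args=()
for k in "${!vars[@]}"; do
    args+=(-e "s/@${k}@/${vars[$k]}/g")
done

printf 'host=@HOST@ port=@PORT@\n' > /tmp/template.conf
sed "${args[@]}" /tmp/template.conf   # host=db01 port=5432
```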
It turns out for most bash scripts, you can do the most common sed substitutions using bash's rich variable substitution expressions. For instance, you see a lot of bash scripts calling commands like "dirname" and "basename" to get the directory and filename of paths. There's a much faster way to do this in bash:
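The standard parameter-expansion idioms for this (a sketch; the edge cases around trailing slashes and slash-free paths differ slightly from the external commands):

```shell
path=/usr/local/bin/bash

# External processes, one fork each:
dirname  "$path"     # /usr/local/bin
basename "$path"     # bash

# Pure bash, no fork:
echo "${path%/*}"    # strip shortest trailing '/...' -> /usr/local/bin
echo "${path##*/}"   # strip longest leading '.../'   -> bash
```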
Considering the commented-out valid code, it looks like he was saving time by just adding another "|" and checking the output, instead of improving the existing command and checking its output. Something that I do very often as well.
A question to those who do freelance work: sites like Upwork often have low budgets, where do you guys find better projects with better compensation?
I ditched it, joined some communities online, did a tiny bit of networking, and announced I was looking for work related to X - that was far more fruitful than sending 10 pitches a day for gigs on those sites and getting nothing in response because someone in India will do it for a dollar. Although I did love seeing the same projects re-appear a week later to fix the crap that their offshore team delivered.
edit: And the catalog was 1048 pages long not counting the index.
There probably aren't any hard and fast answers. Take Dracut, for example: it's a successful utility written entirely as a collection of bash scripts. The answer probably depends on the specifics of your needs and on specific benchmarks.
One thing that helps is putting the spiritual equivalent of "use strict" at the top of your Bash script:
shopt -s -o errexit pipefail nounset noclobber
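A quick illustration of what two of those options buy you (run in throwaway subshells here so the failures don't kill anything):

```shell
# errexit: a failing command aborts the script immediately.
bash -c 'set -o errexit; false; echo "not reached"'   # prints nothing

# Without it, execution blunders on past the failure:
bash -c 'false; echo "reached"'                       # prints "reached"

# nounset: referencing an unset variable (e.g. a typo) is a hard error
# instead of a silent empty string:
bash -c 'set -o nounset; echo "$undeclared_var"'      # errors, exit != 0
```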
* Bash Hacker's Wiki 
* ShellCheck 
* Debugging Bash Scripts 
* Shell Scripts Matter 
Moving to Python, say, makes the control flow and the "software engineering" easier, for sure. However, don't underestimate the power of grep, sed and especially awk. I wouldn't want to reimplement a half-arsed, presumably non-bug-free version of awk in Python when I could have just used awk in the first place.
Among many other things I use it to test Kubernetes/OpenShift clusters in Vagrant for Chef recipes:
and here are some others in a more 'native' Python format:
> oh! I wrote a tool to do that!
That might just be my favorite HN exchange ever. Y'all remind me of emacs users. Between this and the unusual number of Perl mentions on HN today, my cockles are suitably heated.
That is a compliment. Nary a day passes where HN doesn't give me a reason to keep clicking links and learning.
I started off giggling, and now I've moved from my tablet to a laptop so that I can investigate your shutit - it looks like a handy tool for learning. The scales look the most interesting.
I have been here for a while, but only recently (past couple of months) decided to comment. I lurked for like 12 months, just to see if I'd fit in. Why? HN continually has commentary about things I haven't yet learned.
In short, this is my awkward attempt to thank you. I'll be spending the afternoon trying to enjoy your shutit scales.
You can find more info here on my blog:
do ping me with your experience.
The other thing I've noticed amongst peers is that I'm usually the only one (or one of a limited few) that remotely understand the shell script too. So I'd serve the team better writing it with Python. YMMV with that of course.
Anyway, just for future reference, you can use bash's `help` command to get documentation about builtins. In the case of conditionals, we want to look at `test`:
$ help test
I suspect some of this is just a difference in background or culture; shell conditionals look completely obvious and intuitive to me, and I'm pretty sure my team can handle shell better than python.
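For anyone on the other side of that divide, here are the two conditional flavors side by side (a toy file, nothing more):

```shell
f=/tmp/demo.txt
echo hello > "$f"

# POSIX-style test / [ ]; every operator here is listed under 'help test':
[ -s "$f" ] && echo "non-empty"

# bash's [[ ]] adds pattern matching and skips word-splitting of unquoted
# variables, so it's harder to get wrong:
if [[ -f $f && $f == *.txt ]]; then
    echo "a regular .txt file"
fi
```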
Perl may be as dead as sed and awk are now, but that was the initial use case for Perl before CGI (as in Common Gateway Interface - the first primitive interface for web apps) came along.
Shell with awk/sed/grep is very powerful and very composable for small parsing tasks like this. (It could have been done with even less code than this, but I don't blame the author for not figuring out all the necessary incantations to do so, as these tools can be fiddly if you don't use them every day.)
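As a sketch of that composability (the log format here is invented, not the author's): counting status codes from a tiny access log takes one short pipeline.

```shell
# Fake three-field access log: method, path, status
printf 'GET /index 200\nPOST /login 500\nGET /img 200\n' > /tmp/access.log

# grep narrows to the lines of interest; awk aggregates; sort fixes order:
grep -E ' (2|5)[0-9]{2}$' /tmp/access.log |
    awk '{count[$3]++} END {for (s in count) print s, count[s]}' |
    sort
```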
hey.. we got more unixes here. can somebody grep them out the way?
As you say, the way to make money (with any tool) is to get good enough at using it to be useful to someone who wants to pay you to do so.