
Unix Commands I Abuse Every Day - BCM43
http://everythingsysadmin.com/2012/09/unorthodoxunix.html
======
andrewvc
My #1 abuse is xargs -n1.

A lot of people like writing bash for loops; I try to avoid that as much as
possible. xargs -n1 is the shell equivalent of a call to 'map' in a
functional language.

For instance, let's say you want to create thumbnails of a bunch of jpegs:

    
    
      find images -name "*.jpg" | xargs -n1 -IF echo F F | \
          sed -e 's/\.jpg$/_thumb.jpg/' | xargs -n2 echo convert -geometry 200x
    

Additionally, it's fully parallelizable as xargs supports something akin to
pmap.
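That pmap-style parallelism is GNU xargs's -P flag; a minimal sketch, with
echo standing in for the real convert work:

```shell
# Map "echo got" over three inputs, one argument per invocation (-n1),
# running up to two processes at a time (-P2); output order may vary.
printf '1\n2\n3\n' | xargs -n1 -P2 echo got
```

With -P0, GNU xargs runs as many processes in parallel as it can.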

~~~
BrandonM
Why do that when a loop is clearer, safer, and shorter?

    
    
      find images -name '*.jpg' | while read jpg; do
          convert -geometry 200x "$jpg" "$(echo "$jpg" | sed 's/.jpg$/_thumb.jpg/')";
      done
    

This version works even when there are spaces in a filename, whereas yours
will break.

~~~
_ZeD_
false

    
    
        $ ls -Ql
        total 4
        -rw-r--r-- 1 zed users 33 Sep  6 07:30 "    spaces    "
        $
        $ find -name '*spaces*' | while read text; do
            cat "$text";
        done
        cat: ./    spaces: No such file or directory
        $
        $ find -name '*spaces*' -print0 | xargs -0 cat
        while read is broken with spaces
        $

~~~
ralph
IFS needs some care and attention (set it empty so read doesn't strip leading
and trailing whitespace) and read should have -r so backslashes aren't mangled.

    
    
        $ ls -Q
        "   spaces   "
        $ ls | while IFS= read -r f; do ls "$f"; done
           spaces   
        $
    

For lots of grim detail see David A. Wheeler's
[http://www.dwheeler.com/essays/fixing-unix-linux-filenames.html](http://www.dwheeler.com/essays/fixing-unix-linux-filenames.html)
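Combining all the fixes with NUL delimiters handles every legal filename,
including ones containing newlines; a sketch assuming bash and find's -print0:

```shell
# NUL-delimited names survive any byte a filename can contain;
# IFS= keeps leading/trailing whitespace and -r keeps backslashes.
find . -name '*spaces*' -print0 | while IFS= read -r -d '' f; do
    cat "$f"
done
```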

------
3amOpsGuy
I like this but I can't decide if it's technically abuse or not. The paste
command will happily parse - (meaning read from stdin) multiple times, so for
transposing a list into a table:

    
    
      file.txt:
      line1
      line2
      line3
      line4
      line5
      line6
    
      $ paste - - - < file.txt
    
      line1 line2 line3
      line4 line5 line6
    

Combine with the column command for pretty printing. I seem to find a use for
this pretty frequently.
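A quick sketch of the fold-plus-align combination:

```shell
# Fold six lines into rows of three, then align into a tidy table.
printf 'line1\nline2\nline3\nline4\nline5\nline6\n' | paste - - - | column -t
```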

I like the simplicity of this one but it's not very useful day to day:

    
    
      $ echo *
    

As a replacement for "ls".

~~~
tlack
Back in the crusty old days, FreeBSD used to take forever to install over the
network, but would start an "emergency holographic shell" on pty4. The 'echo
*' trick and various other shell built-ins were very useful for exploring the
system before /bin and /usr/bin were populated.

------
btilly
Random note. The most commonly "abused" Unix command is cat. The name is short
for "concatenate", and its intended purpose was originally to combine 2 or
more files at once.

Therefore every time you use it to spool one file into a pipeline, that is
technically an abuse!
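A side-by-side sketch of the "abuse" and the redirection it replaces (the
output is identical; the second form just saves a process and a pipe; the
file path here is a hypothetical scratch file):

```shell
printf 'hello\n' > /tmp/demo.txt   # hypothetical demo file
# the classic "useless use of cat":
cat /tmp/demo.txt | tr a-z A-Z
# the same thing as a plain input redirection:
tr a-z A-Z < /tmp/demo.txt
```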

~~~
jlgreco
I have long thought that some sort of zsh completion that detects that abuse
of cat and converts it into the more appropriate `< file` might be a good
idea. If it did it silently it probably wouldn't be worth it, but if it
actually performed the substitution in front of you then it might help users
get more comfortable with the redirection syntax.

~~~
ori_b
Does it really matter that you're starting an extra process?

~~~
veyron
It doesn't matter for a 1 KB file. For a 10 GB file, every process matters.

For the downvoters: please time how long it takes to do something like `cat
$file | awk '{print $1}' ` and `awk <$file '{print $1}'`

~~~
njs12345
So the two are different because awk's call to read() is effectively the same
as a read directly from a file, whereas copying is taking place through the
pipe with the pipeline approach?

~~~
jlgreco
Basically you see a linear increase in time. If it was going to take a coffee
break's worth of time one way, it will take a slightly longer coffee break
worth of time the other. It is fairly rare that the additional time involved
matters and there isn't something else that you should be doing anyway.

------
stefanu
Stop watch:

    
    
        time read
    

Press Enter to read the elapsed time. If you type your activity at the prompt
and repeat this for each activity, you get a nice time log that you can copy
and paste straight from the terminal.

~~~
dave1010uk
I have the time (H:m:s) in my prompt. That way I can easily time commands and
I can also find things more easily in my scroll buffer.
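In bash this is a one-line prompt setting; a sketch (the \t escape expands to
the current time as HH:MM:SS):

```shell
# Show the 24-hour clock, working directory, and $ on every prompt.
PS1='\t \w \$ '
```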

------
pjungwir
For all the pipe lovers in this thread, here is a Perl utility I wrote to help
debug shell pipelines. I call it `echoin`, and whatever it takes on stdin, it
prints to stdout (presumably the terminal) while also treating its arguments
as a command (sort of like xargs) and repeating its input for that command's
stdin. So I can do:

    
    
        foo | echoin bar
    

This is like `foo | bar`, but I can see what's passing between them. It's a
bit like `tee`, but reversed. It's what I irrationally want `foo | tee - |
bar` to do.

    
    
        #!/usr/bin/perl
        my $args = join ' ', @ARGV;
    
        open OUT, "|$args"  or die "Can't run $args: $!\n";
        while (<STDIN>) {
          print $_;       # echo the line to our own stdout
          print OUT $_;   # and feed the same line to the command's stdin
        }
        close OUT;

------
CGamesPlay
I use the "grep with color for lines plus the empty string" so frequently that
I have a function for it:

    
    
      function highlight() {
        local args=( "$@" )
        for (( i=0; i<${#args[@]}; i++ )); do
          if [ "${args[$i]:0:1}" != "-" ]; then
            args[$i]="(${args[$i]})|$"
            break
          fi
        done
        grep --color -E "${args[@]}"
      }
    

This is only to be used as a filter, since it mangles filenames.

I'm curious, does ack support a highlight-only mode?

~~~
super_mario
Yes, ack --passthru does what you want. I have highlighting turned on in my
.profile (export ACK_COLOR_MATCH="bold red").

------
sordina
I've come to use the following command so often that I've written a script for
it in my ~/bin - 'narrow':

    
    
        #!/bin/bash
        xargs -n 1 grep -l "$@"
    

This takes a list of files on stdin, then greps for the argument in all the
files and spits out the matching files.

The perk is that it can be chained:

    
    
        find *.txt | narrow dog | narrow cat | narrow rabbit
    

This will find all the files that contain dog, cat, and rabbit.

~~~
ralph
But why run so many greps? I don't think there's a need for -n1 here.

    
    
        xargs -rd'\n' grep -l "$@"

------
ezy
Witless uninstall #1 (perfect for those crappy tarfiles that extract
everything into the current directory):

    
    
        tar -tf tarfile.tar | xargs rm
    

Witless uninstall #2. Pick a file that changed just before you ran "make
install" into the wrong spot (config.log is usually a good candidate), then
remove everything newer than it:

    
    
        find /bogus/installdir -newer config.log  | xargs rm
    

Yes, this is totally unsafe. But it's an abuse... so, there you go

------
dredmorbius
Another 'less' abuse: using it as an interactive grep via '&' line filtering.

It's a newish feature of less (those of you with stale RHEL installs won't
find it). Type '&<pattern><return>' and you'll filter down a listing to match
pattern. Regexes supported. Prefixed '!' negates filter.

Wishlist: interactive filter editing (similar to mutt's mail index view
filters), so you don't have to re-type full expressions.

------
jlgreco
Simpler than using grep to highlight output is using ack with the --passthru
option. (If you have ack anyway)

~~~
dmayle
I use a custom function:

    
    
        function hl() { local R=$1; shift && egrep --color=always "|$R" "$@"; }

~~~
philsnow
egregious use of color (uses 256-color term support):

    
    
        function hl() { local R=''; while [ $# -gt 0 ]; do R="$R|$1"; shift; done; env GREP_COLORS="mt=38;5;$((RANDOM%256))" egrep --color=always "$R"; }
    

then do e.g.

    
    
        whatever pipeline | hl foo bar baz | hl quux | hl '^.* frob.*$' | less -R

results in foo/bar/baz highlighted in one color, quux in another, and whole
lines containing frob in another. hopefully the colors aren't
indistinguishable from each other or from the terminal background :\

I use a somewhat similar setup in emacs, where a key binding adds the word
under point to a syntax highlighting table, but the color is computed as the
first 6 characters of the md5sum of the word.

------
ggchappell
It should be noted that grep-dot only prints filenames if you give it more
than one file.

Also, it skips blank lines. But of course that might be in the feature-not-a-
bug category; and if you really want to see every line, there's always grep-
quote-quote:

    
    
      grep '' *.txt
    

In any case, fun article. :-)

------
hesdeadjim
I don't think I could live without ack:

<http://betterthangrep.com/>

------
thebigpicture
fmt 1 < file seems like it might be handy.

I don't understand his first one though. I thought both Linux and BSD greps
had the -H option (show filename).

Here's some more:

    
    
       1. sed t file instead of cat file
    
       2. echo dir/*|tr '\040' '\012' instead of find dir
    
       3. echo dir/*|tr '\040' '\012' |sed 's/.*/program & |program2 /' > file; . file instead of xargs or find
    

(of course this assumes you keep filenames with spaces or other dangerous
characters off your system.)

    
    
       4. same as 3. but use split to split file into parts.  then execute each part on a different cpu.

~~~
lmm
The point of the first one is he's abusing grep as cat; what he really wants
is a cat that shows filenames, but since there's no such thing he uses "grep
." as a substitute.
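A quick sketch of grep-as-cat-with-filenames (the filename: prefix only
appears once grep is given two or more files, and blank lines vanish since .
needs at least one character to match; the file paths here are hypothetical):

```shell
printf 'one\ntwo\n' > /tmp/a.txt   # hypothetical demo files
printf 'three\n'    > /tmp/b.txt
grep . /tmp/a.txt /tmp/b.txt       # each line prefixed with its filename
```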

------
btschaefer
My number one used command: grep -Hnri <text> <file or *>

~~~
recoil
This works really well with the suggested --color=always, because it
highlights the line number, the filename and the separators in different
colours.

------
hypervisor
I sense there is some cat abuse going on!

------
bitwize
cat foo | less

Normally I'd have typed something like "grep -i 'something' foo | less", and
wanted to just up-arrow to the previous line and change the grep part to cat.
I don't know why; it doesn't really save me anything. Maybe it's the hackerish
"because I can, that's why" instinct at work.

~~~
jaylevitt
It'll save you a few more keystrokes if you do

    
    
        cat !$ | less
    

!$ is "the last argument to the previous command".

~~~
mrud
If you do

    
    
        less <ESC>.
    

it will save you even more keystrokes.

~~~
voltagex_
Do you remap ESC to another key?

~~~
mrud
No, but for me <ESC> acts like <META> and the actual keybinding is M-. or M-_.

The command you are looking for is yank-last-arg for bash and insert-last-word
for zsh.

~~~
graywh
You have that backwards--Meta actually means "prepend Escape" (or 8th bit on,
but that's another discussion).

------
tehabe
Isn't that what is hacking all about? Using thing in a different way than it
was intended?

------
raldi

        You can do "head -n 0" on Linux to mean "all lines". 
    

No you can't.

~~~
AgentConundrum
`head -n 99999` seems like a weird way to do it anyway. Wouldn't it make more
sense to do `tail -n +1`? The output is the same from both commands, but
`tail` doesn't require you to assume arbitrary limits.

Honest question, btw. I'm relatively inexperienced with Linux, and I certainly
haven't used BSD. I'd appreciate any critiques you may have to offer.
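A quick check of both forms (tail -n +K starts output at line K, so +1 prints
every line):

```shell
printf 'a\nb\nc\n' | tail -n +1   # all three lines
printf 'a\nb\nc\n' | tail -n +2   # starts at the second line: b, c
```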

~~~
Nick_C
> Wouldn't it make more sense to do `tail -n +1`

Yes (or perhaps tail -n +0 as that is idiomatic, which makes it clear to
anyone what you are intending).

------
praveenhm
Nice Unix commands.

