Hacker News
My Favorite One Liners (muhammadraza.me)
269 points by mr_o47 on May 3, 2021 | 190 comments



> This command allows you to convert your shell output into an image as this makes it much easier than taking a screenshot of your shell if you want to share your output with someone.

Why is this becoming a more common practice? There is nothing more annoying than a picture of a block of text that I can't copy, quote, or modify and send back to someone.


> There is nothing more annoying than a picture of a block of text

Oh, but there is: Videos of bug reports where a person first has to explain that they have found a bug, and they will try to reproduce it now, and if you're still watching after 3 minutes you might see the important 5 seconds of the video, that could have been expressed in a few well-written sentences!


“A picture is worth a thousand words” is an adage many would agree with.

People intuitively can ‘show’ you what’s wrong but most non-developers will have a hard time phrasing it.

“I press the button and nothing happens” is representative of written bug reports I’ve seen. What button? What’s supposed to happen? What page are you on? How can I reproduce this?

Almost all issues I’ve seen with a video I can reproduce. But I agree that they should be succinct.


> Almost all issues I’ve seen with a video I can reproduce.

Oh yeah, totally, my experience has been that video bug reports are literally a thousand times better than text bug reports. I've worked in console game dev where we even required reproducer videos because they're so much better at showing the context of what happened. Often the text description was wrong and/or wildly incomplete and/or full of assumptions the user made about how things work or what's going on, and the video would demonstrate the issue and allow me to not have to engage in a lengthy back-and-forth conversation trying to understand the situation.

> “A picture is worth a thousand words” is an adage many would agree with.

I do too, but until now, I had never read that the other way around, that a picture might cost a thousand words. ;)


That can depend on what it's a picture of. A picture of a block of text is usually not as good as the text as text.


> Almost all issues I’ve seen with a video I can reproduce. But I agree that they should be succinct.

As long as we retain those opening, over-amplified metal riffs. Jump scares are why I watch tech vids.


But which well-written sentences?

If you do a minimum of testing, bugs often depend on context. For example, in case of a UI bug, maybe the user zoomed in, maybe he is using a non-default font, maybe he is using a different locale, maybe he did some action without realizing it (ex: scrolling, resizing the window), maybe the time is incorrect, etc...

"I click that button and it crashed" isn't going to help you there, you know it doesn't crash, you use that button all the time...

If it is as straightforward as it looks, watching that 3 minute video is your punishment for not testing. Yes, I know, I don't test either, but don't blame the reporter for it.


I agree that a video can provide more context than text does. However, most videos I have received as bug reports in the past failed to do that. I just saw the user doing random stuff leading to random events. I'd say that often the most important thing to have is actually not a well-written bug report or a well-made video, but rather having access to the user's settings file and logs (if that exists for the application), as they often contain the explanation of why you don't see what the user sees.


There was a vendor we had, they used the free version of a popular screen recorder for ALL their support tasks.

Put in a ticket asking how to do something? They sent you a link to a very slow-moving screencast of going in and turning an option on/off or configuring a setting. No audio!

Their knowledge base articles were just a collection of links to screen recordings! It was so annoying.

Guess what happened when that company shut the free version of their app down...


My favorite version of this trope is a video of a computer screen taken from a poorly held phone. Bonus points if the video is in portrait, cropping off important bits of text.

Sure, not everyone's comfortable with a computer, and they think they're clever with a solution. I still get annoyed when I get one of these videos.


Don't forget that the image will also be in HEVC format and will be >5MB.


I love video reports, even bad videos since you can just ask for clarification and usually they point out what they were trying to show and you get the context from the video.

I like it so much that I've considered implementing always-on screen recording in our QA debug builds.


Sounds like you work at H1 :)


My personal theory is that fewer and fewer people spend time on actual general computing devices. When confined to straitjackets like phones and tablets, people resort to the one functionality that is easy to reach for, namely screenshots. I'm guessing the habit spreads?

Just look at the amount of garbage "here's my code as a screenshot, what's wrong?" questions Stack Overflow has to remove each and every single day.


I remember reading a few years ago about a teacher who was teaching a class that involved using the school computers, and a good number of the kids seemingly didn't comprehend how to use a desktop OS with a windowing system. I can't remember the specifics, but they seemed to only use it within the constraints of what they were used to from smartphones and iPads, like only using one app at a time in fullscreen, and they didn't really understand dragging and dropping between different windows.


As a tiling WM user, I virtually never drag & drop between windows. And 98% of time I have maximized browser / editor / terminal.


It's faster to do "screenshot area to paste buffer" then paste it than to select the text you want (text selection requires more precision than just snagging a zone of the screen) and fiddle with text formatting on the receiving program (Slack, email, whatever). I do it all the time when I just want to show someone something short, and don't expect them to need to copy it. Bonus: precisely the same workflow works for non-text.


Maybe just me. I tried the command:

  ps aux | convert label:@- process.png
And got an image that was useless. Nothing readable.


Did you zoom in enough? On my system, it produces an image with pixel width of 11000 and file size of 3 MB, so the text appears too small when the default image viewer shows it at 12% zoom in order to make it fit within the available display size. Zooming in to show it at 100% zoom makes the text readable.

While this command works fine and is readable too at 100% zoom, I wonder why one would do this. Isn't copy-pasting the text output of 'ps aux' more convenient than creating a large image out of it that is not easy to read, filter, etc.?
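
If you really do want a readable image out of it, one option (just a sketch, not from the article) is to trim the output before it reaches convert and bump the font size:

  ps aux | head -n 20 | cut -c1-120 | convert -pointsize 14 label:@- process.png

That keeps the image to a manageable width, at the cost of truncating long lines.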


I had better luck with the netpbm tools:

ps aux | pbmtext | pnmtopng > process.png


pretty sure it just puts the image of all the lines from the screen on top of each other, so it's pretty useless for me too. I've been trying to get this to work so I can automate screen grabs that are required for some audits we do.


Oh. Now I know what kind of one-liners he meant.


100% agree but I think the motivation is not to purposefully be inaccessible but rather to get aesthetically pleasing stylings like code highlighting, font choice, editor theme, etc. People will always choose the pretty solution over the accessible one, unless forced to do otherwise. So, we need to make it easier (easier than screenshotting actually) to share code over text and yet keep all those sick styles you want. Just look at Medium for what NOT to do. Never have I seen a more hostile forum for code sharing. Look at GitHub for a decent example. There really isn’t a good example I’m aware of. Even highlight.js I have trouble getting my code to “scroll overflow” rather than wrap (which for code can be more problematic than it is for regular writing).


Someone should write a browser plugin that OCRs the text in an image and makes it selectable.


There are a few of those extensions around. This is a popular one https://projectnaptha.com/ It is kinda like magic if you ask me.


TextSniper[0] is great on macOS, and is part of Setapp now. I also wrote my own script that hooks into Google cloud vision API for the really tough stuff. I use this many, many times a day. It's changed my life - never realized what a problem this was and how much time could be saved by grabbing bits of previously un-copyable text from stubborn places.

[0] https://textsniper.app


This is built into Android 11. Just go into the app switcher and you can select text from screenshots or anywhere else.



It's better for places where you can't control text wrapping or text is limited. Twitter is the main one.


The only reason to post an image instead of a pastebin is for engagement. I understand the engagement practices we are subject to on attention marketplaces like twitter, but is it that hard to provide a supplemental pastebin link?


It's not engagement. It's usability. It's way better having the image in context than having an external link to text written in a fixed-width font. Obviously you should ALSO provide the link to the text in the tweet so one can easily copy-paste (if interested in the code/text after seeing the image preview).


Some locations and people have sites like pastebin blocked, but image hosting allowed. Also in case of say technical blog posts, a combination of certain text characters might trigger a false positive for malware on your site. You don't get that problem with images.


Here is how I like to share logs/outputs these days:

  ps aux | nc termbin.com 9999


unless it's gonna hug-of-death them, we should promote termbin and similar hosts


It’s convenient to share in a Slack (and similar) channel. At least I do it all the time. Also, convenient to attach in an RCA doc that’s shared with higher ups who don’t have time to play around with metrics dashboard.


We're required to provide screen shot images for system audits. Since somehow those are harder to manipulate (much laughter).

And... that command doesn't work for me when I'm using ssh to access my servers or using x2go.


It's pretty handy for gifs where you actually see the problem getting reproduced. But for the rest I agree it is getting rather annoying. And I honestly don't know why tech-savvy people who work on software themselves would do it. It is slower and less convenient.

For others it's pretty clear though: taking a smartphone picture and emailing it around is probably always faster than other means. Plus I'm pretty sure less tech-savvy people simply have no idea Ctrl-C works nearly everywhere, including shells and dialog boxes.


You cannot tweet a piece of formatted text. You can sort of quote it in Facebook, but any formatting like color will be lost, and color is often important in terminal output.

So, sadly, the picture is made exactly with the purpose of proper quoting, unfortunately losing the textual content :(


I do this all the time for validation effort etc. where they need an artifact.


I dunno, but google photos does an amazing job of taking a picture and then just allowing you to select text right out of it. Even text with a weird font (possible handwriting?). Kinda blows my mind.


> There is nothing more annoying than a picture of a block of text that I can't copy, quote, or modify and send back to someone.

Don't forget search.

Too many StackOverflow posts start with a screenshot of yum errors or mysql logs.

Ugh.


I imagine that Google can search for text inside images (?)


Of course it can. Heck, Evernote did 10 years ago.


It definitely has some utility, especially for design/layout problems, but if you need to type the same thing to reproduce it, it's a pain.


friction

sharing a snapshot is guaranteed to be easy to do in two clicks

sharing text is less so, selection is more work, finding a host is more work


Passive aggressiveness.


Passive aggression ;)


Great post idea!

"python3 -m http.server" is really useful when building simple JavaScript pages, or when you need a simple static file server for testing.

"ss -p" is great, but if you just want to see what's hogging your bandwidth, iftop and nethogs are much better.

My favourite one-liner is "open .", to open the current directory in Finder. The open command can also open URLs and other files.

I also have an alias for "osascript -e 'display notification "'"$1"'"'", which will display the text you choose in a MacOS notification. It's useful when you need to be notified at the end of a long-running task. "printf '\a'" is also useful if you need the terminal to "ding".


For those that don't know, "start ." is the Windows version of "open .", opening Explorer in the current folder.

Speaking of Windows, "pushd \\servername\sharename" is a simple way to mount a share in Windows, if you have the login creds cached, and set the new drive letter as your active drive. Useful if you need to bounce around a bunch of shares.


Oh, I always used: explorer.exe . I didn't know open was aliased to that.


For what it's worth:

It's not "start" is an alias to "explorer", it's that "start" runs the args through "ShellExecute".

I mention this only because it has some other uses. For instance, Windows apps can install run handlers so that something like Start->Run->Excel will launch Excel. Just running "excel" on a command prompt won't work, but "start excel" will work, since that kicks off the right path.

You can also use something like "start /w filename.txt" to start an instance of your editor, and wait for it to close (well, mostly, some apps do weird things and break this)


Also, "notepad FILENAME" will open the file in notepad... etc.


Also, "vim FILENAME", where Vim is an alias set in $PROFILE that point to the vim executeable.


Note that you don't have to specify an alias for it, if the vim executable is located in your $PATH.


> My favourite one-liner is "open .", to open the current directory in Finder.

For linux one can use `xdg-open .` to use the associated application. Works with pdf, image files, videos, etc.

I have the alias x=xdg-open in my .bashrc, so that I use `x somefile` to open files.


Thank you, thank you, thank you! "open ." is post-it note worthy.


Also «open somefile.txt» can open, well, any file or directory, just as if you double clicked it. It works on multiple files, too. And possibly URLs, but I don’t have a mac anymore to check.


A slightly upgraded bash/zsh function for displaying a notification:

  noti() {
    osascript -e \
      'on run argv
        display notification (item 1 of argv) with title "Notification"
       end run' \
      $1
  }

  # Example:
  noti 'Hello world!'
This fixes any escaping issues by passing the notification string to osascript as an argument instead of embedding it in the text of the program (which makes things like quotation marks not work correctly).

Sticking it in your .bashrc or .zshrc will make it available at any time in your shell.


I often like alert better (osascript -e 'display alert "'"$1"'"'"); this way the message doesn't go away before you click OK.

Can also combine with a vocal message: sleep 2; osascript -e 'say "Hello World!"'; osascript -e 'display alert "hello world"'

The last one copied from https://code-maven.com/display-notification-from-the-mac-com...


I stopped using "say" because it made me jump in my chair too many times.


Most Linux-based OSes will probably have busybox installed, which includes its own web server. This can be nice if you don't have python (or if you think it's overkill).


Not a one-liner by itself, but sticking pbcopy/pbpaste into a pipeline is great for quick text processing. Wish you had regex support in a text field? `pbpaste | sed s/needle/NEEDLE/g | pbcopy`


> I also have an alias for "osascript -e 'display notification "'"$1"'"'", which will display the text you choose in a MacOS notification

This one is great. Very, very useful.


I use something similar, but it leans on at(1) to popup reminders at predefined intervals. I tend to forget things once I'm buried in my Vim window.

https://github.com/jvinet/dotfiles/blob/master/bin/rin

Examples:

  $ rin +30 Check the turkey
  $ rin '14:30 tomorrow' Watch baseball
I currently have it setup so it pops up a modal dialog using Zenity. As well, it uses my cheap-and-cheerful bespoke notification doohickey that I have running in waybar, another script called notify: https://github.com/jvinet/dotfiles/blob/master/bin/notify

This way, I have a nagging badge/icon in my waybar system tray until I finally do check that turkey and watch that baseball.


If you use WSL, you can get the same functionality as "open ." with

  explorer.exe $(wslpath -w "$PWD")
It's a bit lengthy so this is always aliased in my .bashrc when using WSL


"explorer.exe ." has always worked for me from inside WSL.


It might not be on PATH, so I have an alias and use "open":

   alias open="/mnt/c/Windows/explorer.exe"
Edit: Added formatting


The linux (ubuntu?) version of `open .` is `xdg-open .`


Nice list, but the descriptions are written like they describe the next command, not the previous one. I would not recommend doing this to cat a bunch of files ;):

> If you want to cat bunch of files at once you can this command.

> rm -f !(test.txt).


I added `margin-top: 3em` to `ul, ol, dl` and it is much more readable.


I'd strongly recommend horizontal rules or something else that will more definitely demarcate each item.


Totally got the same confusion.

I hope whoever follows this blindly doesn’t lose too many files they wanted to cat.


On MacOS

   ./very-long-task.sh && say 'success' || say 'fail'
I use it when I need to context switch and leave something running. `espeak` is an alternative for `say` on Linuxes.


Note ShellCheck's SC2015, especially if you do stuff with permanent consequences in the place of "say":

"A && B || C is not if-then-else. C may run when A is true":

https://github.com/koalaman/shellcheck/wiki/SC2015


Interesting! Good to know, thanks. I never mix && and || in the same line, but didn't know exactly why that is!

The problem is that "In this case, if A is true but B is false, C will run."

I guess with B as "say", B will never be false, but generally it's really not the same as if-then-else.


I like this idea a lot, I just dumped a tiny shell script named 'tellme' on my mac:

    #!/bin/bash
    "$@" && say 'The command finished.' || say 'Danger!  Something blew up!'
To give me a quick way to run commands and hopefully enough words to catch my attention.


As an audio-free alternative to `say 'success'`, you could use `tput bel`. Konsole, and many other terminals, can flash the screen or show a normal OS notification when a terminal bell is triggered in a non-active terminal window. This doesn't do the success/fail thing though.
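
For example, a minimal sketch:

  ./very-long-task.sh; tput bel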


I do something similar for long jobs on a remote server: an alias to send me an email + a rule in Outlook which opens a popup when an email with this subject is received (I haven't managed to get sound from VNC).


Here's a few functions / aliases I have set up that I use on a pretty regular basis:

    # Generate a random password and copy it to the clipboard.
    pw () {
        pwgen -sync "${1:-48}" -1 | xclip
    }

    # Get the current weather.
    weather () {
        curl https://wttr.in/"${1}"
    }

    # Get the numeric value of a file / directory's permissions.
    alias octal="stat -c '%a %n'"
All of my aliases are listed in my dotfiles at: https://github.com/nickjj/dotfiles/blob/master/.aliases


My version of `pw', might be useful to those without `pwgen':

  function genpasswd() { 
    openssl rand 150 | LC_CTYPE=C tr -dc '[[:alnum:]!@#$%^&]' | tr -d iIlLOo0 | cut -c 1-${1:-12}; 
   }


Oh yeah I forgot to mention in the alias comment that it depends on pwgen. Thanks for the reminder, I just updated it.

I made a whole video on generating random password from the command line and compared a few options (including using openssl) at: https://nickjanetakis.com/blog/generate-a-random-secure-pass...

The TL;DR for not using openssl is that you're not going to get the same number of characters every time. For example if you run yours with 48 characters instead of 12 it won't generate 48 characters every time you run it. With pwgen (something you can apt or brew install) you know what you're getting every time.


If you change `openssl rand 150` to a bigger number, say 1000 or so, you can generate longer passwords.


Ah, I see.

Did you test your solution on macOS btw? I know macOS uses an ancient version of Bash and has different binaries that act differently vs Linux (such as sed). I don't have a Mac here but I do try to make my dotfiles compatible with it. If your solution works on macOS it would be nice to drop the pwgen dependency.


I use it on macOS and it works fine. There is a brew recipe for pwgen, but that's one more dependency to worry about.


My favorite one liner for a pretty and compact graph-like git log:

  git log --graph --abbrev-commit --decorate --format=format:'%C(bold blue)%h%C(reset) - %C(bold cyan)%aD%C(reset) %C(bold green)(%ar)%C(reset)%C(bold yellow)%d%C(reset)%n'' %C(white)%s%C(reset) %C(dim white)- %an%C(reset)' --all
Add a global alias:

  git config --global alias.lg "log --graph --abbrev-commit --decorate --format=format:'%C(bold blue)%h%C(reset) - %C(bold cyan)%aD%C(reset) %C(bold green)(%ar)%C(reset)%C(bold yellow)%d%C(reset)%n'' %C(white)%s%C(reset) %C(dim white)- %an%C(reset)' --all"
Usage:

  git lg


Usage:

  git branches
Code for your ~/.gitconfig

  branches = branch -a --sort=-committerdate --format='\
  %(HEAD) \
  %(if)%(HEAD)%(then)\ 
  %(color:"#329664")%(objectname:short)%(color:reset) \
  %(color:"#222222")[%(committerdate:short)]%(color:reset) \
  %(color:green)%(refname:short)%(color:reset) \
  %(else)\
  %(if)%(upstream)%(then)\
  %(color:"#999999")%(objectname:short)%(color:reset) \
  %(color:"#222222")[%(committerdate:short)]%(color:reset) \
  %(refname:short) \
  %(color:"#ffc662")%(upstream:track)%(color:reset) \
  %(else)\
  %(color:"#963232")%(objectname:short)%(color:reset) \
  %(color:"#222222")[%(committerdate:short)]%(color:reset) \
  %(color:red)%(refname:lstrip=1)%(color:reset)\
  %(color:"#ffc662")%(upstream:track)%(color:reset) \
  %(end)\
  %(end)\
  '
Meaning

  HASH Commit_date Branch_Name

  A white branch_name means it is local.
  Red means it is remote.
  Yellow branch_name is the current branch.
  It will also tell you if your checked out branch is ahead or behind the remote branch.


This broke my Alacritty build:

  $ cargo build --release
  error: failed to get `bitflags` as a dependency of package `alacritty v0.9.0-dev (/home/myuser/sources/alacritty/alacritty)`
  
  Caused by:
    failed to initialize index git repository
  
  Caused by:
    failed to parse config file: invalid configuration key (in /home/myuser/.config/git/config:57); class=Config (7)


Yep, definitely among my top git aliases.

Google/DDG search for "git lg2" should also reliably take you to this SO answer to copy the alias from: https://stackoverflow.com/a/9074343


On macOS you can read and write to the clipboard with pbcopy and pbpaste:

    $ ls | pbcopy

    $ pbpaste > out.txt
You can also put this in a function in order to get the path of the frontmost Finder window:

    osascript 2>/dev/null -e '
      tell application "Finder"
        return POSIX path of (target of window 1 as alias)
      end tell'
Get the current Finder selection:

    osascript 2>/dev/null -e '
      set output to ""
      tell application "Finder" to set the_selection to selection
      set item_count to count the_selection
      repeat with item_index from 1 to count the_selection
        if item_index is less than item_count then set the_delimiter to "\n"
        if item_index is item_count then set the_delimiter to ""
        set output to output & ((item item_index of the_selection as alias)\'s POSIX path) & the_delimiter
      end repeat'


On Windows I use alias: 'alias pbcopy="clip.exe"' for consistency as I use Mac and Windows daily.

   ps auxw | pbcopy # After alias works on WSL too.


On MacOS you can also drag-and-drop a file from Finder onto your terminal and it will type out the full path for you.


You can also select one or more files and use the hotkey Command + Option + C to "Copy N items as Pathname", which copies the full pathname of each item to your clipboard.


Wonderful! I didn't know that it was possible to copy the file path of an arbitrary selection, thanks!


Also, for quick formatting a blob of JSON you've copied:

  pbpaste | json_pp | pbcopy


On macOS you can use “pbcopy” and “pbpaste” to manipulate or read from the clipboard.

For instance, format the JSON in your clipboard: pbpaste | jq .


If you’re on Windows, Set-Clipboard in PowerShell does this too and is super handy.


jq is lovely. python3 has a json formatter built in as well

pbpaste | python3 -m json.tool


clip.exe when on WSL (alias to pbcopy, if Mac way is the way).


xclip and xsel for X11: xclip -o | jq


Or wl-copy / wl-paste on wayland.
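
Same pattern as the macOS example, e.g. (a sketch):

  wl-paste | jq . | wl-copy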


I use autohotkey to script some of my favorite things, which is another great way to get good use out of the command line on various systems where I don't necessarily need/want to set up a .bashrc.

hotkey: "du-m"

Use: Sort directories by usage:

du -m --max-depth=1 | sort -n

hotkey: "wpinfo"

Use: gets basic WordPress site info:

code for AHK

  ::wpinfo::
  (
  echo Home URL $(wp option get home) ; echo Site URL $(wp option get siteurl);echo "### 
  Plugins ###";  wp plugin list;echo "### Themes ###"; wp theme list;echo "### Users ###"; 
  wp user list; echo "### Roles ###"; wp role list; wp core verify-checksums
  )

hotkey: awk(1-5)

use: saves me typing something I fat finger every. single. time.

awk '{ print $1 }'

awk '{ print $2 }'

awk '{ print $3 }' etc

AHK:

  ::awk1::awk '{{} print $1 {}}'
There are others but these are the ones that come to mind. The main thing isn't the scripts but the automation. And AHK can save a ton of error-prone typing or copy/pasting in these scenarios.


Of all the things awk can do, I will never understand the popularity of echoing an argument.

The same thing in bash would be:

  $ arg1 () { echo $1; }
  $ arg1 one two three
  one
Splitting fields is easy with cut:

  $ echo one two three | cut -f 1 -d " "
  one
Normally you just want to stuff it into a variable in which case read is much easier:

  $ read arg1 arg2 arg3 < <(echo one two three)
  $ echo $arg1
  one
Symbolic names are normally much easier to read than numeric positions.

That last example looks a bit weird with <() instead of a pipe, but that's just because the right hand side of a pipe is a separate shell so setting variables in it is a bit useless. It's nothing specific to read.


A couple of useful ones in here.

However, instead of:

  git log --format='%aN' | sort -u
I'd recommend using git shortlog, as it'll provide you with counts.

  git shortlog -sn


As far as I'm aware, rm -f !(text.txt) isn't enabled by default, at least in bash. You have to enable extended globbing by running 'shopt -s extglob' first (or add it to your profile).

Great feature though, it adds a ton of extra pattern matching. Not (!) is definitely my most used though.

https://www.linuxjournal.com/content/bash-extended-globbing
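
A minimal sketch of what that looks like in an interactive bash session:

  shopt -s extglob
  rm -f !(test.txt)   # careful: deletes everything in the current directory except test.txt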


Even funner quirk with stuff like that:

    $ bash -c 'shopt -s extglob; rm -- !(x)'
    bash: -c: line 1: syntax error near unexpected token `('
    bash: -c: line 1: `shopt -s extglob; rm -- !(x)'
Because of the strangeness of the bash parser, the glob has to come on the preceding line:

    $  bash -c $'shopt -s extglob\nrm -- !(x)'


Nice list!

> If you want to cat bunch of files at once you can this command

You can use cat itself for this (and save typing a few characters):

    cat *


The use of grep the author proposes will have each line of output prefixed with the name of the file it came from, followed by a colon. That is only if the glob expands to more than one file, though; if it expands to a single file, it is equivalent to cat. The -h option should be passed to grep if cat-like output is something the author wishes to guarantee.
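
For example (a sketch):

  grep -h . *   # -h suppresses the filename prefixes, though empty lines are still skipped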


Even when it expands to a single file, it has different output when the file contains empty lines.


I think

    grep ^ *
would fix that and is just as short.


`grep . *` has different output than `cat` if `*` expands to more than one filename: every line is prefixed by the filename it exists in.


That one gave me a giggle too - perhaps the author got told off for useless uses of cat too many times?


If it was grep -R . * it’d make more sense. Maybe they just missed the -R?


That meme "useless use of cat" is a major pet peeve of mine...

It makes no sense. Starting your pipe with cat is alright. I make a point of always starting all my pipes with cat, even in the cases where it is a bit unnatural.


It's one more process to run, so `cat single_file | ` can be easily substituted with `<single_file `.


I was going to point this out too.

For those who may not know, you can put the I/O redirection anywhere in the command, not just at the end.

So you can write

    < input_file.txt grep some_pattern | grep another 
Which has the same effect as

    grep some_pattern < input_file.txt | grep another
but puts the input redirection at the beginning so you can easily follow the data flow from left to right.

I use "redirection first" in other cases such as

1>&2 echo "Error: missing file"

To make it clear up-front that I'm writing to stderr.


Yep. Writing "<file " is just as good as "cat file |". Still I'm more used to writing cat. The only (marginal) advantage of cat over explicit redirection at the beginning is maybe that you can split the lines more cleanly with cat:

    cat file       |\
        program1   |\
        program2   |\
        ...
If you use a redirection, since the command does not start with a pipe the symmetry is broken.

Redirections at the start of the line are fun. I have an alias

    alias null='>/dev/null 2>/dev/null'
this allows to run GUI programs without cluttering the terminal with useless GTK/QT warnings:

    null evince file.pdf


Pro tip, if a line ends in a | you don't need a trailing backslash. The same is true if a line ends in || or && as those all imply continuation.

Edit:

    cat file       |
        program1   |
        program2   |
        ...


Wow, that's great. Now what I'm missing is being able to put comments after each such line.


I just tested it and, in bash at least, it actually works fine.

    echo foo | # echoing foo
        tr 'a-z' 'A-Z'
    FOO


this is incredible, I had already given up on this, and the solution was in front of my eyes the whole time! You have just greatly enhanced the elegance of several scripts in my lab.


Wow. Thank you for explaining this! I'd never understood why `cat` was called useless when (so far as I then knew) there was no other way to "start a left-to-right pipe by reading from a file" (other than the aforementioned `grep <filename> *`). This helps!


Or plain `>&2` since the `1` is implied for '>'


There's a natural workflow that tends to get ignored in these discussions: when you're not sure of all the details of the pipeline yet and your file is big enough to slow you down.

I typically start with (say) `head file | grep '2020'`. Once it does what I want I move to the next step, and so on until it's done. At that point, and having chained 5-6 commands, replacing `head` with `cat` is faster and less likely to break due to me putting the '<' in the wrong place. The simplicity and extra peace of mind is well worth an extra process IMHO.


Why always start with cat? Not using cat when you don’t need to means less typing and more efficient, more robust code. Why use it if you don’t need to?


It's easier to change the start of a chain that starts with cat than if you're passing a filename into your first program. Even though it's _technically_ misuse, cat can improve composability.


I often prefer it in scripts for easier parsing later.

cat file.txt | somemonstrousoneliner

immediately tells me it's reading file.txt and doing something with the contents.

Burying the filename deeper in the string makes it harder to figure that out.


In many situations it's quicker to just rely on muscle memory than coming up with the most efficient way to read a few 50kB files.

It's a good thing to be aware of the differences when needed, but often the execution speed is not the reason for writing a bash script.


I don’t think efficiency matters much in this case, but more robust and concise code is better. Most commands either take a file as an unnamed argument or you can just do something like <filename. But I agree, there may be commands where passing the file name is awkward (e.g. jq).


We should have a "My Favorite One Liners" thread every month like we do "Who's Hiring". Some of this is going straight into some aliases!



It would be great to have an aliases library with sections, where one could copy a section for a given chunk of task types.


> grep . *

I believe this command is present to prefix each line of output with the filename. But this ignores all the blank lines. That can of course be fixed easily with:

  grep ^ *
Here is what I normally use myself if I want to show the content of files with their filenames:

  tail -n +1 *
This shows the name of the file once at the beginning of each file and no more. Here is an example:

  $ tail -n +1 *
  ==> bar.txt <==
  bar 1
  bar 2
  bar 3

  ==> baz.txt <==
  baz 1
  baz 2
  baz 3

  ==> foo.txt <==
  foo 1
  foo 2
  foo 3


Though you get error messages if there are any directories. This will filter those out:

    tail -n +1 `find * -maxdepth 0 -type f`
Getting long enough to make an alias...


> mkfifo hello; script -f hello

> This command will allow you to share your terminal session in real time.

Using `script` for this seems nice for when you don't want them to be able to control the same computer. Like person foo on their computer would

  nc -l 9000
then you could

  script -f >(nc foo 9000)
to show them your session. One could also add some encryption to the pipe.

If both controlling the same computer is not a problem or is even desired, then it's probably simplest to just share a tmux session. One runs `tmux` and the other `tmux a`.
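
With a named session it's a bit more explicit (a sketch, assuming both users share the same account on the machine):

  tmux new -s pair      # person sharing
  tmux attach -t pair   # person joining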


Neat trick with the named pipe and the script command. Note that visual editors (and anything else like ls that are aware of the terminal size) will not work well unless all terminal windows are the exact same size. And type, obviously.


And even then there might be issues. emacs' line numbers don't update correctly for one.

At least for vim and htop, having a larger terminal window doesn't seem to be a problem as far as I can see.


> One runs `tmux` and the other `tmux a`.

Here's how to do this with screen instead of tmux.

Terminal 1:

  screen -mS hello # create screen named "hello"
Terminal 2:

  screen -x hello # attach to shared screen


Just

Terminal 1:

  screen
Terminal 2:

  screen -x
seems to be enough, too.


Yes, but it gets confusing if you already have previous screens running (-x opens the right one if you only have one) so I provided a more "foolproof" version.

I always use named screens since it's quicker and more organized than typing `screen -ls` before every time and then making a decision on whether or not to name the new one (I still think it's better to name the first one in any case in the vast majority of (my) cases).


One I used to use when website performance seemed bad. Prints out hits in the current log with a count for each unique client ip:

  awk '{print $1}' < apache.log | sort -n | uniq -c | sort -rn |less
Produces output like:

    4482 66.249.73.135
    3264 46.105.14.53
    3157 130.237.218.86
    2073 75.97.9.59
    1013 50.16.19.13
    ...
And fairly easy to throw in a "grep" for specific fetched urls, slices of time, etc.


I use cut instead of awk, at least for logs like Apache's or ones with a fixed number of spaces between fields, because it means less parsing work, especially when summarizing long log files. I also use cat of the log as input, because it's the same pattern for zcat/grep/zgrep if the input is compressed or if I did some selection of records beforehand.

Some of the records you may have to search could have a port attached (e.g. output of netstat, haproxy logs or others), so to strip it I add

  rev | cut -d ":" -f 2- | rev

on the list of IPs so it doesn't get messed up with IPv6 records.
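
Putting that together, a cut-based version of the parent's pipeline might look like this (a sketch; the field position depends on your log format):

  cat apache.log | cut -d ' ' -f 1 | sort | uniq -c | sort -rn | less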


-> grep * : "If you want to cat bunch of files at once you can (use) this command."

Yes but I don't understand the advantage of that instead of simply using :

cat *.*

or if you want to display the filenames :

tail *

works fine too. Maybe someone can explain.


Less-common one-liners that I use for work:

  # Use jq to craft json documents
  jq -en --rawfile key some_id_rsa \
    '{ "ssh_private_key": $key, "hosts": [ 
         { "type": "EC2", "name": "backenddev", "id": "i-0123456789", "address": "1.2.3.4", "region": "us-west-1" } 
     ] }' > metadata.json
  
  # Use jq to import Terraform resources from a state file
  jq -er '.modules[].resources | to_entries | .[] | [ "terraform", "import", .key, .value.primary.id ] | @sh' \
    generated/aws/route53/terraform.tfstate \
  | xargs -L1 env
  
  # Use Bash to check for a TCP connection (like 'nc -z')
  timeout 1 bash -c 'cat < /dev/null > /dev/tcp/127.0.0.1/3306'


Shameless plug for bashojs (bashojs.org), for the JavaScripters:

A one liner to update all your npm deps to latest. I use this quite a lot.

  basho --import './package.json' pack -j pack -j 'Object.keys(x.dependencies)' -m x -e 'npm install ${x}@latest'
Explanation of how this works:

1) Import package.json, call it 'pack'. 2) Put pack in the pipeline. 3) map pack to pack.dependencies. 4) flatMap (-m x) to remove nesting. Because pipeline is an array, and it contains pack.dependencies - another array. 5) Exec command with '-e'


My favorite is piping stuff into awk to make bar graphs. Here's system load from atopsar for the current day:

  atopsar -p | tail -n +7 | grep '\S' | awk '{printf("\n%s %6.2f ",$1, $5); for (i = 0; i<$5; i++) {printf("")}}'; echo
sample: https://gist.github.com/defulmere/bec1aef40ca4cddb5c421ffa5f...



Thanks!


Dangit, there should have been an ASCII block character in that second printf. Here's a version with an asterisk instead:

  atopsar -p | grep ":[0-9]\{2\}\s\+[0-9]" | awk '{printf("\n%s %6.2f ",$1, $5); for (i = 0; i<$5; i++) {printf("*")}}'; echo


Among my favorites is this little zinger (awk):

    !seen[$0]++
which gives the union of the lines of all the input files, with none of that 'sort | uniq' business.

It's not a one-liner, but here's intersection:

    ! buf[$0]++ { acc[$0]++ }

    ENDFILE {
        delete buf;
        files++
    }

    END {
        for (k in acc) if (acc[k] == files) print k
    }
and here's set difference:

    ! filenum { acc[$0] = 1    }
    filenum   { delete acc[$0] }

    ENDFILE { filenum++ }

    END {
        for (k in acc) print k
    }
I keep these in my ~/bin with evocative names (e.g., 'union' and 'set-diff'). Once I did that, lots of other things became one liners.
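
For reference, the union one is invoked like this (a sketch; note the intersection and difference versions rely on ENDFILE, which is a GNU awk extension):

  awk '!seen[$0]++' file1.txt file2.txt > union.txt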


The comment about sharing a terminal using mkfifo reminds me:

Screen has a multi-display mode you can use with "screen -x" that lets multiple clients connect to the same session. Useful if you want to walk someone through a process.


I thought these would be jokes!


The first joke oneliner I learned, circa 1980:

1: gotta light?

No match.

Quickly followed by:

2: Could you beat up superman?

No match.

Oh, the days when csh was cool and not retro.


I remember someone showed this to me on a vintage Mac:

$ bill gates

Did you mean “kill gates”?


    $ %blow
    No such job
We had fun times.


There are a couple of jokes in there.


I sometimes wish there was a Linux command that, when piped into, turned anything into a readable JSON format, so `ps | json-pipe` would output something like [{pid: 133, name, etc}, ...] or `ls | json-pipe` would turn into [{name, size, etc}].

It would handle most known commands and maybe had plugin support for the rest.

Just a thought I had. Don't really know if it's possible or feasible. But it would make working with command line much more predictable and easier especially if it also supported jmespath! What do you guys think?
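
For a single known command you can get surprisingly far with plain awk, though. A rough sketch (not robust against quotes or spaces in command names):

  ps -eo pid,comm | awk 'NR == 2 {printf "["} NR > 2 {printf ","} NR > 1 {printf "{\"pid\": %s, \"name\": \"%s\"}", $1, $2} END {print "]"}'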


I saw this mentioned a few weeks ago:

https://stedolan.github.io/jq

“Like sed for JSON data”


jq is a great tool for working with json data, but the author asked for a "command output in table format to json converter."

The output would then be appropriate for jq.


For this to work on the command line, the json-pipe command would need some heuristic for parsing arbitrary input, could be tricky.

Interestingly enough, the Arista CLI actually implements something like this, but it's not a real pipe or a separate command - each command implements its own `<command> | json` handler.


Not a Linux command, but PowerShell (on Linux) has a ConvertTo-Json cmdlet. It can serialize .NET objects, so 'Get-ChildItem | ConvertTo-Json' will give you the output you want, but it is less successful with plain text, so 'ls -alh | ConvertTo-Json' will give you an array of lines rather than distinct objects.


Nushell comprehensively addresses this: https://www.nushell.sh

(I don't have an opinion yet. Been meaning to try it for a year now but a shell is a difficult thing to switch)


Rails dev. This runs the most recently modified test in your spec/ directory:

    bin/rspec $(find spec -type f -exec stat -f '%a %N' {} \; | sort -r | head -1 | awk '{print $NF}')
So if you are working on a spec & save it, running this will execute that spec & only that spec. I have it hooked up to a mapping in neovim, so hitting `<leader>rt` runs the test.

Needs some tweaking, and I'm aware that guard exists. This is simpler. Quick & does the job.


Using "wc -l" or "sort | uniq -c | sort" to count and classify things.

Count processes and threads by user on Linux:

# ps -Teo euser=,comm= | sort | uniq -c | sort -nr


Another application of this is counting unique IPs in Apache access logs.

  zgrep -Po '^[0-9.]+' /var/log/apache2/test.access.log | sort | uniq -c | sort -n


Sending typescript to a FIFO seems odd. Sure you can see live input rather than following a line at a time with tail -f but you also lose persistence. To share something live I'd just use tmux or screen, granted if you want to ensure the observer has read only access there may be a couple more steps than using a FIFO.


One can always dig their own IP address

  dig TXT -4 +short o-o.myaddr.l.google.com @ns1.google.com


That's neat! I can also find my upstream DNS server's recursive resolver IP by leaving the @ns1.google.com off. Looks like my upstream resolvers are using IPv6 :)

Also, I can find the resolver IP for popular public resolvers this way too.

This is probably the coolest thing I will learn about today.


import png:-|xclip -selection clipboard -t image/png -i

Copy a selected rectangle of the screen into clipboard


What is import? Does it read text from an image?


import is part of ImageMagick and "saves any visible window on an X server and outputs it as an image file. You can capture a single window, the entire screen, or any rectangular portion of the screen."


nice! I extended it to a little shell script so it gets written to a file, too.
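
Something like this, maybe (a sketch of such a wrapper; the path and naming scheme are just placeholders):

  #!/bin/sh
  # Grab a rectangle of the screen, keep a copy on disk and in the clipboard.
  out="$HOME/Pictures/grab-$(date +%Y%m%d-%H%M%S).png"
  import "$out"
  xclip -selection clipboard -t image/png -i < "$out"
  echo "$out"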


Speaking of one liners... is there a way to get w3m or lynx to print a PRETTY formatted output to the console and exit? Seems all the dump commands are for plain text. Hoping someone here knows as this arcane knowledge doesn’t seem to exist elsewhere.


I get decent output by using headless chrome to dump to a pdf, then pdftotext (from poppler-utils) to format that as text.

  google-chrome --headless --disable-gpu --print-to-pdf=out.pdf https://www.cnn.com
  pdftotext -layout -nopgbrk out.pdf - | less


Some of these are useful. Some of these have better alternatives:

> grep . *

you can "cat *"

> ps aux | convert label:@- process.png

copy-pastable text is a better way to exchange text.

> mkfifo hello; script -f hello

You forgot to mention that the other side needs to cat this file. But also, tmux might be better for this.


  > > grep . *

  > you can "cat *"
The advantage of `grep . *` is that it omits empty lines and includes the source file on each matching line by default.


fvi - find vi - recursively grep for a pattern and pass the successful files into the editor. I use this to find examples in a codebase. Sure, it would be nice to set editor's pattern and have it jump to the first occurrence but that has been a bridge too far. Maybe I'll try again with VSC. (Yeah, VSC doesn't seem to have a way of setting the search pattern.)

  function fvi { grep -rl $1 . | xargs zsh -c 'code "$@" <$0' /dev/tty }


Maybe some of these could be shared at http://www.bashoneliners.com :-)


I use this as a one liner script in my PATH to review code before I commit it.

  svn diff "$@" | colordiff | less -R -x4


svn? Blast from the past! Why are you using it?


I knew this one was coming. I've been using it for a very long time and it hasn't let me down. I have considered using git but found no compelling reason to migrate.


Same here, I still use svn on my 20 year old projects and haven't felt the need to change. I use git for my newer projects but occasionally run into issues that cause me to have to search for a solution.


In general, for a very large old project that has no major problems with SVN, it's easiest to just leave it. Lots of open source projects are still on SVN (some even on CVS...)


Any sccs users out there?


I thought this was going to be about jokes :(


Was really hoping for a list of jokes when I clicked that link. A bit disappointed.


A politician, a cleric, and a homeless guy walk into a bar, and the bartender says "What is this, some kind of joke?"

You're welcome.


I pulled history and my favorite seems to be: pmount sdb1


ffmpeg -i <sth.mp4> <sth.mp3> ## converting videos to mp3


I have something similar but a little more elaborate (not a one-liner) at my ~/bin to ensure that there isn't a severe loss of quality during the conversion: https://github.com/susam/dotfiles/blob/e434b7c/bin/xmp3


git checkout -f develop



