I'm a Vim-only dev, I spend all day in a tmux session, and I use the command line probably a lot more than average.
The single best improvement I've made to my command-line workflow in the last few years has been switching shells. First I moved from Bash to ZSH, but about 2 years ago I switched again, to Fish.
Why I love it:
- history completion searches by default, so I can just type part of a command, hit arrow-up and look through only the relevant results (ZSH also did this, but Fish is better)
- tab completions are shown to you in advance (in light gray in front of your cursor)
- Basic configuration (which is good enough, honestly) can be done via a web browser. Just run "fish_config", and it starts a web server, pops open your browser and any changes you make there are saved to your actual config files. No more looking up syntax.
- aliases (which are called abbreviations) are actually expanded after typing, so the full command appears in your history. You can also edit the full command before running it if you need to.
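To give a flavour of the last point, a minimal sketch (the abbreviation name is just an example):

  abbr -a gco git checkout
  # typing "gco<space>" now expands in place to "git checkout",
  # and the expanded form is what ends up in your history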
Can't agree more with this. I too am a (neo)vim + tmux dev, and I made the same progression in shells. Fish just works out of the box. I actually stopped using autojump (see Better CD in the article) because Fish's context-aware auto-completion is that good.
For me, yes, I still write my scripts as shebanged (ba)sh scripts; the only fish I write is my config.fish. Fish might be an objectively better language, but I don't want to learn it and prefer to keep writing (ba)sh, as
1. (ba)sh is an evil I know...
2. ... that I must know anyway to work on servers (while fish's "utility" is low, as it's used nowhere but inside the fish shell)...
3. ... and shellcheck makes it (a bit) less of a footgun.
When I do need to write a bash one-liner, I just run it in bash :) . But it's a once-in-a-blue-moon occurrence, since pipes are the same, fish added syntax for && and || a few years ago (before that, they didn't exist and you had to use `and` and `or`), and it added support a few months ago for `FOO=bar ...` direct env var setting (before that you had to write `env FOO=bar ...`). So your typical `thing --blah | grep stuff | awk stuff | whatever && something` generally works in fish.
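A quick before/after sketch of those changes (fish syntax; the commands themselves are made up):

  grep -q foo bar.txt; and echo found   # old fish: `and`/`or` combiners
  grep -q foo bar.txt && echo found     # current fish: && and || work
  env FOO=bar mycmd                     # old way to set an env var inline
  FOO=bar mycmd                         # current fish: direct assignment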
Quick note: aliases and abbreviations are separate things in Fish; using the `alias` function will create a wrapper function that does not expand after typing.
Have you seen gron[1]? It can't do everything that jq can, but it is much more in the unix philosophy than jq. It simply flattens and unflattens json out into lines so you can use standard unix tools on it.
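For anyone who hasn't tried it, a minimal sketch of the flatten/filter/rebuild loop (the JSON here is made up):

  echo '{"user":{"name":"ada","admin":false}}' | gron
  # json = {};
  # json.user = {};
  # json.user.admin = false;
  # json.user.name = "ada";
  echo '{"user":{"name":"ada","admin":false}}' | gron | grep name | gron -u
  # => JSON containing only {"user":{"name":"ada"}}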
I personally have settled on fx[1]; there's no need to learn yet another tool's syntax, specificities, and quirks:
- JS anonymous functions are valid inputs to the tool.
- When the transformation turns out to be more complex than expected, I can just copy and paste what I've made so far into a nodejs script.
- You can also configure a .fxrc file to automatically import npm packages that you might find useful, shortcuts, or your personal functions.
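For example (the sample JSON and the reduce step are made up; fx applies each argument as a pipeline stage):

  echo '{"items":[{"price":3},{"price":7}]}' \
    | fx 'x => x.items' 'x => x.reduce((s, i) => s + i.price, 0)'
  # => 10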
I am sorry for spamming this comment, but I wanted to share it with all the fzf users because I found it so game-changing:
I use "ESC-c" a lot. It calls fzf and fd, you type a part of your directory name, hit enter, and it cd's to that directory.
export FZF_ALT_C_COMMAND='fd --type directory'
That's in my .zshenv.
It's incredibly useful if you have multi-layered project directories. Like ~/Projects/Terraform/Cool_Project - you'd hit esc-c, type cool, hit enter and you're in the place you wanted to be.
If I want to cd to, let's say, "~/work/john/some-project", I just type 'fcd' in the console and then "w j s" and ENTER. Most of the time it works as expected and is really fast.
This is a mode provided by a helper that I wrote for fzf to make it easier to use and more useful. In essence it's just a shell script with a bunch of fzf tricks pre-configured.
entr lets you watch files and re-run a command any time they change. Whenever I'm working on a script, or go tests, or whatever test-like thing I'm doing that's not in its own bloated test harness, I reach for entr. Great software, does what it's supposed to every time.
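A typical invocation, for anyone who hasn't used it (the Go test command is just an example; -c clears the screen between runs):

  find . -name '*.go' | entr -c go test ./...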
I used to use entr extensively for Sketch plugin development (the vector editing tool by Bohemian Coding). However the app used to crash randomly and I felt my plugin was causing it. I spent an inordinate amount of time trying to debug it, replacing libraries, changing code and so on. Thankfully I noticed at some point that the app was behaving well for long durations of time when I started it without entr. I got rid of it from my toolchain and things have been well ever since.
I still don't know the root cause, but now I know one more place to look for heisenbugs.
It looks like entr does a bunch of weird, potentially incompatible things, like redirecting stdin from /dev/tty and leaving FDs open in the child. However, my guess is that your build process is non-atomic, and your program is crashing due to necessary files being partially updated. This could probably be hacked around by doing something like `entr /bin/sh -c "sleep 1; exec realcommand"`.
I didn't know of entr or inotify at the time. Years ago, I wrote a script that did mostly that, but I've found myself more often using a different script to rerun things based on manually triggered global hotkeys. It scratches a different itch, but in case you want to check it out: https://github.com/swarminglogic/shell-scripts/blob/master/r...
In short, you set up a global hotkey to trigger the rerun of a "key"-ed command. Then you can quickly run a command, which can be rerun with that hotkey.
So, a global hotkey is set up to trigger `runrerun -r foo`.
Then in a terminal you can use `runrerun -b foo -- <COMMAND>`.
It is a bit over-engineered with multiple sessions, different kill signals for restarts, etc. But that's the gist of it.
'entr' will use inotify where available; the utility is that it's 1) cross-platform and 2) easy to integrate with shell scripting (rather than having to write against inotify or other platform-specific APIs yourself).
As an example, I use 'guard' [0] in a docker container, and on test failure it writes out to a file which is shared outside the container. It's easy for me to fire up a shell script that uses 'entr' to watch those files and pop up notifications when tests fail.
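Roughly like this (the paths are made up; entr's -p postpones the first run until a file changes, and /_ expands to the file that triggered it):

  ls ~/shared/test-results/*.txt | entr -p notify-send "Tests failed" /_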
> if the directory has many files or sub-directories, tree becomes much less helpful: you only see the last screen full of information as files scroll past you.
> broot solves this problem by being aware of the size of your terminal window and adapting its output to fit it.
What? You can just pipe `tree` into `less`. The solution isn't to download Yet Another Binary that does this specific thing and doesn't come installed on most platforms.
Using broot as a tree replacement isn't necessary, of course; `tree | less` works for that. But as hinted in the article, broot contains a lot more features: fuzzy finding, preview, multi-window copy/paste, renaming, directory navigation, etc.
What's wrong with dired mode? OK, you'll get each directory in its own buffer. But, thinking of that, it shouldn't be too hard to make dired mode recurse into subdirectories inside the same buffer. (And fold/unfold like orgmode does. ;-)
dired-subtree does this, albeit on demand, not eagerly descending into subdirectories. dired-hacks more generally has lots of good stuff like filtering etc.
You can edit files that you've just found and renamed. And when you edit files, there is RCS integration and… linting, spellcheck, printing, you name it. It's just the unix way. Is broot ISO/IEC 9945:2009 compliant?
I generally just use zsh globbing for this by the way:
ls **/*(.)
To list all files, recursively, or some variation thereof depending on what I want: **/*(/) for directories, **/*(*) for executable files, etc. With a global alias it can be piped to less with LL: "ls **/*(.) LL". You can actually do a lot of stuff with it, and it's basically a good replacement for find, except with less typing. The syntax looks a bit daunting at first, but especially for basic stuff it's not that hard (e.g. / and * are the same as what ls -F uses).
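For reference, the global alias is one line of zsh (-g makes it expand anywhere on the line, not just in command position):

  alias -g LL='| less'
  ls **/*(.) LL    # all regular files, recursively, paged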
I wish there was a better pager than less though, but haven't really found it (and no, Rust project X with 115 different colours that don't even work on a light background and/or "fuzzy" matching that never gives me what I want is not it).
Well if we wanna be grumpy this morning... I don't understand why people don't learn how to use `find` properly. And just doing simple things correctly like the `-print0` integration with `xargs -0` to be able to safely send a list of files into rm even if they've got spaces or whatever. I never have much use for `tree` unless I'm trying to take a screenshot and make it look kind of nice.
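For instance (the *.orig pattern is just an example):

  # NUL-separated so filenames with spaces or newlines survive the pipe
  find . -name '*.orig' -print0 | xargs -0 rm --
  # or let find do it directly
  find . -name '*.orig' -exec rm -- {} +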
> this is the opposite of the unix way of doing things
Which makes sense, this is an article for people on Apple hardware (judging by the installation instructions), so no interest in keeping things "the unix way of doing things"
I have no dog in this fight (I use both Linux and macOS as daily drivers) but it could be argued that macOS has more of a claim to be Unix than Linux since the foundation of macOS comes from NeXTSTEP and FreeBSD whose codebases both ultimately descend from the original Bell Labs Unix, whereas Linux started life completely independently.
I'd argue that Linux is more Unixy than macOS in terms of philosophy, though. I remember reading that Dennis Ritchie considered Linux as much of a Unix as the other offerings on the market at the time, which were the BSDs and commercial Unix variants; his opinion is obviously worth thousands of mine!
Brains have recently become underrated. The way mcfly works is cool, but it offloads your habits to some "neural network" that you can't control. There are few methods slower than looking at unpredictable line-by-line output, and almost nothing more anxiety-inducing than the possibility of a false choice made without thinking. This tool basically replaces a deterministic mismatch with a non-deterministic one. Your brain is good at predicting the former (because it is a neural network connected to itself) and bad at the latter, because two neural substances rarely agree.
My guess is that the author wanted history-search-{for,back}ward in their .inputrc for the up/down keys but missed that somehow. The default history behavior is undoubtedly tedious. https://unix.stackexchange.com/a/20830
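For reference, the bindings look like this in ~/.inputrc (the escape codes assume a typical xterm-style terminal):

  "\e[A": history-search-backward
  "\e[B": history-search-forward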
The only thing I'd change about history-search-* is to show a list of variants (like vim's wildmode=list:longest or similar) instead of presenting them one by one. Pretty sure it is already possible.
>history-search-{for,back}ward in their .inputrc for up-down keys
Once I found this, I rarely use Ctrl+R since most of the time I'm searching based on starting command name.
If I'm using something a lot of times, it gets added as alias/function. I also maintain a file with commands that I think may be useful later, easier to search that compared to online search.
Yes! I have an OBTF (one big text file) with all the "devops/etc" stuff that I need once a quarter and can never remember, with headers and explanations, and everything in select-and-paste-ready format.
Author here, thanks for reading. I totally agree that non-determinism is tricky. The big thing that got me to try McFly was that the suggestions are path specific. I hear fish shell has this feature built in, but I haven't tried it out.
I may be unique in this regard but the types of actions I do in one path are very different from those I do in others: working on my blog is very different from the types of things I do when working on code and it was nice to have history aware of this.
I haven't actually used McFly a lot though, so I haven't quite seen if it's going to become a permanent thing for me. The UI of using FZF for history is nicer; I just didn't like the suggested matches.
You are right though, I don't have my arrow keys setup like this. I'll try that out as well. Thanks for the tip!
`fzf` can be combined with `find` (or `fd`) and then you can just press Ctrl-T and have a fuzzy searching prompt for all files in and under the current directory. It's super convenient.
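If you want fd as the source, fzf reads the command from an environment variable (this is fzf's documented knob for the Ctrl-T binding):

  export FZF_CTRL_T_COMMAND='fd --type file'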
This is productivity porn for programmers. While these tools are obviously deep and powerful, there is also a learning curve, and experimenting with whether you can adapt your workflow to them or vice versa. Next year we will have another 5 shiny command-line tools, and so on.
My point being: only replace your weapons of choice once you are confident you are using them to their full potential and you identify needs that these known tools won't solve.
Your main point doesn't do alternative tools justice. It is totally fine to use a newer tool as a novice so you are learning something that will serve you better in the long run, even if you aren't an expert in some other tool from the start
> Your main point doesn't do alternative tools justice. It is totally fine to use a newer tool as a novice so you are learning something that will serve you better in the long run, even if you aren't an expert in some other tool from the start
If you are a novice, you don't start with alternative tools, because they are much more likely to lose popularity, stop being developed, and lose relevance.
I've historically spent all my days SSH'd into a remote server in a screen/tmux session writing code in vim and doing various admin/db tasks.
Recently I started using vscode with the remote-ssh plugin and it has surprisingly been a pleasant experience. I have an integrated terminal to the remote box and I have full intellisense for coding (I do 80% coding / 20% admin, probably).
It took a gestalt flip for me to realize that with this plugin installed, vscode is basically a very (very) fancy terminal emulator.
The only problems I've encountered so far: if you paste large strings or quickly scroll through your bash history, characters will sometimes drop from the strings. Also, sometimes you have to reload the window because intellisense stops working; it seems the language server installed on the remote box has died.
Coming from remote development Emacs TRAMP, vscode + remote-ssh mercifully removed the cognitive load of dealing with Emacs. Not only do you not have to manually install & configure everything, vscode will tell you what you need and give you a working installation automatically.
I used to keep iterm windows and tabs open but shifted to using pycharm’s integrated terminal.
I use multiple spaces in macOS, with entirely different project contexts. It helped having the labeled terminal tabs coupled to a Pycharm project window.
However, this also led me to realize that the git commands I was using could be accomplished through PyCharm's interactive source code management tools.
This helped me realize that there were more powerful things I could do there than in a terminal.
I still find myself using the terminal to debug remote environments. But for building, the IDE has taken over a lot of my past terminal use.
I think productive programmers would get way further by learning powerful scripting and one-liners than by picking different pre-existing binaries. It's a little old-school, but the pipe is the most powerful concept on the command line, and it's wildly productive.
Though I agree that new is not necessarily better, and programmers should absolutely try to scratch their own itch with the existing tools (bash, sed, awk and the likes), some new tools are a great improvement and should be recognized as such when due.
FZF, for instance, is a major novelty in terms of interface, and it really follows the UNIX philosophy and shows immense composability.
I think fzf is still a useful recommendation (better fuzzy matching for Ctrl-R). Or just Ctrl-R for reverse search in bash/zsh is a good thing to know.
Yes, I've found that the most significant improvements for me came with bash aliases, a few little scripts, and nnn for anything that has to do with folder navigation or file browsing. Its search-based navigation speed is unmatched.
Another big upgrade was dmenu. Just slap it in some bash script and now it's interactive. Combined with xbindkeys you can even bind those menus to keys, and it'll work on any Xorg-based desktop.
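The pattern is just options in on stdin, the choice out on stdout; a minimal sketch (the actions here are examples):

  #!/bin/sh
  choice=$(printf 'lock\nsuspend\nreboot\n' | dmenu -p 'power:')
  case "$choice" in
    lock)    loginctl lock-session ;;
    suspend) systemctl suspend ;;
    reboot)  systemctl reboot ;;
  esac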
Pipe is not old-school. Yeah, 20+ years ago I learned that feature, and it's much older. But! Daily since then I pipe to grep, sed, awk, cut and occasionally perl, php, ruby, and now we have JavaScript CLI tools, wow!
There are those edge cases where I want to edit and get something out of a unix-pipe-based one-liner I made, but in certain scenarios I find myself having to write complicated code for just one or two line edits to the piped output.
For those I recommend vipe (it's part of moreutils): it lets you put your favourite text editor in between your unix pipes, so you can edit things manually from stdout midway through a pipe and let the rest of the pipe take it from there.
It's quite handy; this way I can sometimes use vim's macro functionality inside my pipe one-liners when it's not worth coding it with sed, cut, awk or grep.
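For example (the pipeline itself is made up):

  # hand-edit the file list in $EDITOR before the next stage runs
  ls *.log | vipe | xargs wc -l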
> However, if the directory has many files or sub-directories, tree becomes much less helpful: you only see the last screen full of information as files scroll past you.
Recently i’ve taken to using nnn for anything like this: https://github.com/jarun/nnn - i never used cli file managers but now, it’s up there with my shell, vim & tmux as a core productivity tool.
Show me files in this dir:
$ n
Show/hide details for each:
.
Filter / search file list:
/
Navigate subdirs (or parents) with vim or arrow keys.
Cd into sub dirs by typing unique chars from dir names:
ctrl-n
Drop back to the shell I started n from, but cd into this new dir:
ctrl-g
Exit n without cd'ing to the new dir I navigated to:
q
If you’re already in vim, why not use whatever equivalent to NERDTree is popular? Unless you’re only interested in the structure and names, and not the file contents?
It seems like they even have a neovim plugin; you might consider using it.
But if you're happy with ranger, I'm not sure it's worth the switch; they're very similar. nnn is quite a bit faster than ranger, but other than that, I think ranger has more community support.
Thanks for reading the article. broot does more than just piping tree to less; it gives you a TUI version of tree that you can navigate around in. The fuzzy finder in it works like FZF, but it keeps everything in a nice compact tree view. It might be overkill, but I like it.
There'll always be multiple ways to skin the proverbial cat.
Shameless plug, but I've written my own $SHELL called `murex`, as I kept running into pain points with Bash as a DevOps engineer. The shell doesn't have `tree` built in, but it does have FZF-like navigation built in.
I've been using it as my primary shell for a few years now, and while I'm not going to pretend that it isn't beta, it does work. However, it's not POSIX, and some of the design decisions might rub people the wrong way (given how opinionated people's workflows are). But if you're curious, then check it out.
(I couldn't work out exactly what the last two did so I'm not sure my translation is accurate. Even if it's not quite right I doubt the intent would be impossible to express with basic tools.)
That first pkill/killall example is a bit silly, but for the others I can see how a "ruby-each-line" can be useful, especially if you're already familiar with Ruby (although I'd probably name it "el", as that's much less typing). I never really learned awk properly myself either, in spite of being quite familiar with most other shell tools. "ruby-each-line" does more or less the same thing as awk.
After all, there's just one person using your shell: you. So whatever works well for you is a "good" solution.
Yeah, like I said: I can see how it could be useful for someone who knows only Ruby.
> I never really learned awk properly myself either
It's really worth it. It's simple: pattern/action blocks, BEGIN/END, hashes, match() and gsub() cover 90% of my uses. I very often find myself writing something like this:
awk -v id="$id" '
$1 == id {
    go = 1
}
{
    if (go && $4 == "b667226") {
        km += $5
        secs += $6
        n++
    }
}
END {
    printf("%d rides, %d km (approx %.1f hours)\n", n, km / 1000, secs / 3600)
}'
This program takes STDIN, reads each line and:
* sets a flag indicating interesting data has been found if the first column matches an ID passed on the command line
* if the flag is set and column 4 contains "b667226", add columns 5 and 6 to running totals, and add 1 to a count of matching lines
* after all the input has been read, prints out a summary of the data
Of course, any language can do something like this, but awk is succinct, easy to iterate on, and available almost everywhere.
awk just always seemed a little too domain-specific to really invest time in. The number of times I think "gee, I really wish I knew better awk" are few.
This week I've been doing some Lua programming; I had done some Lua several years ago for something else, but found that I had forgotten most things. Even for Ruby I've forgotten quite a lot, yet for two years I programmed Ruby every day for a living. It's just that I haven't done much Ruby since, and when I did some Ruby several months ago I had to look up quite a lot of basic syntax things because I had roughly remembered how it worked but not enough to actually get stuff done in it.
In general, I find that effective practical programming skills are a bit of a "use it or lose it" thing. I don't think I'll use awk enough to not "lose it", even though it's a fairly small language, and most problems it solves can also be solved in other ways.
For his use case of funky, I use ~/.bash_aliases; that way I can easily share them between machines. E.g., to quickly add a new one: alias als="nvim $HOME/.bash_aliases && source $HOME/.bash_aliases"
Be careful with quoting though; I got myself into a situation where every new terminal asked for the root password. Shellcheck found the reason quickly.
(edit) Another benefit is that those aliases abstract over differences of invocation on different distros. Here are some examples:
# PACKAGE MANAGEMENT
alias pcl="sudo zypper cc && sudo zypper purge-kernels"
alias pin="sudo zypper install -y"
alias pla="zypper search"
alias pli="zypper packages --installed-only | rg "
alias plu="zypper list-updates && zypper list-patches"
alias pre="sudo zypper remove"
alias pup="far too long" # upgrade all packages and ask for confirmation
alias pups="far too long + shutdown" # upgrade all packages without confirmation && shut down
# QUICK EDITS
alias als="nvim ~/.alias && source ~/.alias"
alias brc="nvim ~/.bashrc && source ~/.bashrc"
alias egr="sudo nvim /etc/default/grub && sudo update-grub"
"cd -" changes to the previous working directory. Your function calls cd multiple times, so calling "cd -" won't bring the user to the previous working directory they expect. ozym4nd145's function only calls cd once.
I used to alias .. etc., but then found that Fish shell has helpful commands and behaviours for directory navigation built in:
> Note that the shell will attempt to change directory without requiring cd if the name of a directory is provided (starting with ., / or ~, or ending with /).
I'm also a heavy user of aliases and can't see how I'd use funky. Behavior dependent on the current directory is rarely a good idea, and when it is, you can just create an explicit ./script in that directory.
The funky tool sounds like a potential security issue, especially if you clone repos and then enter them without checking the contents first. I suppose it could be made safer by requiring review when adding a new directory or the settings for a directory changing.
I'd love a runonce tool: run a script or tool just once and forbid further executions. Useful for ~/.xsession and such.
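A minimal sketch of such a tool, assuming a stamp file per command name is good enough:

  #!/bin/sh
  # runonce: execute "$@" only if it has never been run before
  stamp="$HOME/.cache/runonce/$(basename "$1")"
  mkdir -p "${stamp%/*}"
  [ -e "$stamp" ] && exit 0
  touch "$stamp"
  exec "$@"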
Also, stuff you'd like:
- sox, audio swiss-army knife
- ffmpeg, ditto for video
- ncdu, check disk usage
- rclone, mount and rsync everything
- f3, check (and avoid getting burned by) those fake USB drives
- udfclient, UDF but better, as it eases compatibility. Ditch fat32/NTFS for cross-computer media sharing
- trickle, for network/web programmers: it can force really slow connections on software, such as 2G/ISDN speeds and even below. Perfect for testing bad conditions
- pdftotext -layout, dumps a whole PDF into a text file. Useful for copy/paste or adapting the format to anything else
- unzip -qcp "$EPUB_BOOK" "*ml" "*.htm" | lynx -force-html -dump -stdin -nolist > book.txt, an EPUB-to-UTF-8 dumper. Trivially adaptable to be used with less, for example. Use "$1" instead of "$EPUB_BOOK" to use it in a script.
I don't like the idea of generating the list of modified files in commit messages; it's not very readable to me, plus I could just generate such info with git log. I tried to write a custom git-wip script to include the output of "git status --porcelain=v1", but it turns out it's just not necessary, since "git log --name-status" can already show it:
git log --name-status
# or show the change history of a specific file
git log -p path/to/file.ext
Some interesting tips on this list (and in the comments). It’s definitely worth taking a step back every now and again and investing a little in picking up some newer tools for better workflows.
I recently switched to the latest neovim and redid my config from scratch using more modern plugins. Took a while to understand the ecosystem and figure out how to get things working together nicely. But now, between telescope, ripgrep, trouble and better overall LSP etc, I’ve gone from fast to faster.
The balance of payoff probably still isn’t there yet, but it’s made me want to code more.
If you are a Linux neophyte like me, Midnight Commander[0] can be very useful for learning directory structure and basic commands. After starting it (`mc`), you can quickly toggle its terminal UI on and off with `ctrl-o`.
One of the most productive tools that I have is being able to search my command history for commands typed in the current directory. It becomes a knowledge database. I rolled my own, but I haven't seen any tool doing it, which surprises me. Perhaps the McFly tool mentioned here does.
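For bash, a rough sketch of the idea (the history file name is made up):

  # record the cwd alongside every command...
  export PROMPT_COMMAND='echo "$PWD :: $(HISTTIMEFORMAT= history 1)" >> ~/.dir_history'
  # ...then search by directory
  grep -F "$PWD ::" ~/.dir_history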
I use "ESC-c" a lot. It calls fzf and fd, you type a part of your directory name, hit enter, and it cd's to that directory.
export FZF_ALT_C_COMMAND='fd --type directory'
That's in my .zshenv.
It's incredibly useful if you have multi-layered project directories. Like ~/Projects/Terraform/Cool_Project - you'd hit esc-c, type cool, hit enter and you're in the place you wanted to be.
I just read the description of funky, and it is like Windows 98 CD-ROM autorun.exe, but the next level. If you clone my repo and enter the directory - I own your system. Funky indeed.
Off topic hijack - I'm trying to find a tool I feel was probably mentioned here at one point - a REPL for shell.
If I recall correctly, you would open up the repl and start typing a command, and the output would refresh as you typed. Does this ring a bell for anyone?
One tool that I don't think gets enough love is the "git-fixup" add-on. It looks at your commit log and recommends relevant commits to fix-up based on your staged files.
If you are using fixup commits in your PR review process it's essential.
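For anyone unfamiliar, the stock flow it streamlines uses two standard git features:

  git commit --fixup=<sha-of-the-commit-to-amend>
  git rebase -i --autosquash main   # squashes the fixup into place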
The one CLI tool that saves me the most time is "gh", and its predecessor "hub". Not having to interact with a website to submit a PR, issue, or just fork a repo is a blessing. I wish more sites would ship a canonical tool to use their APIs...
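A few examples of what that looks like (these are real gh subcommands; the arguments are placeholders):

  gh repo fork --clone owner/repo
  gh pr create --fill       # PR from the current branch, title/body from commits
  gh issue create --title "..." --body "..."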