I really enjoy using a command-line interface instead of a graphical user interface, and was wondering if anyone had any solid recommendations for applications that one can use in a terminal window.
I love fish-shell, but after years of using it I switched to Oh My ZSH!. The bash compatibility makes it a whole lot easier to use when you're the type of person who has to look up "how to do X on the command line" on the internet more often than you'd like to admit (that would be me).
But the whole reason I use fish is because I DON'T have to look things up on the internet because it uses sane syntax instead of some arcane combination of brackets, symbols, and the word “if” spelled backwards.
fwiw, we are shipping a number of features in fish 3.0 (any day now!) that are specifically targeted at maximizing compatibility with most bash scripts (still not POSIX).
That includes support for && and || and a few other things that should go a long way towards making most code you find drop-in ready.
Really? Doesn't that go against the stated fish design document and ridiculousfish's intention for fish?
I'm more concerned about making fish scripts work well than making fish run bash scripts. (Maybe fish 3.0 will improve that too.) I've tried to use fish for writing somewhat serious scripts several times, and I've always run into frustrating problems that sent me back to bash for scripting. Fish is an excellent interactive shell, of course.
You can see the discussions on GitHub but @ridiculousfish authored that series of commits fwiw.
However, with regard to your other point: I personally just finished rewriting the job control code to close out a lot of long-standing bugs affecting correctness and child process behavior in complicated scripting cases, and other members like @faho have done an amazing job fixing up some of the builtins and making scripting-related improvements, as well as putting insane effort into the interactive behavior of the shell and its huge library of completions.
Try the current fish master builds and see how it fares. If you have any specific concerns, please file a GitHub issue. It’s an open source project and it can only succeed with the help of the entire community.
fwiw I find the sanity fish brings to process substitution with sane tokenization and automatic escaping of shell substitution output to be a huge boon to productivity over writing in (ba)sh. These days, my scripts are either written in ninja or bmake if they are rule/output driven or fish otherwise.
Thanks for the updates. I have contributed to related discussions on GitHub for several years, but I haven't been able to keep up with all the issues.
> fwiw I find the sanity fish brings to process substitution with sane tokenization and automatic escaping of shell substitution output to be a huge boon to productivity over writing in (ba)sh.
I completely agree, this is why I would much prefer to use fish for scripting when feasible. I'm glad to hear that the scripting is being improved.
I also love zgen and used to use some parts of OMZ with it. But after debugging weird issues a bunch of times and always finding a funny opinionated config in OMZ I changed everything to Prezto. Zgen supports Prezto natively.
that would rather defeat the point, since fish breaks compatibility by design, that's part of what i like about it.
i can however appreciate that breaking bash compatibility is not for everyone.
i mix use of bash and fish, but more often than not i find myself switching to fish because it makes some complex command easier, rather than the reverse.
There are a few plugins you can use with oh-my-fish and similar tools, for example `fish-foreign-env`, which provide a command (`fenv`, in this case) which runs a given command in bash and then transfers environment variable changes to the fish environment from which it was called.
It's not bash compatibility, but it is good enough for one-offs and even for many scripts and tools that expect to be sourced in your dotfiles from bash.
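For example, to run a bash-only setup script and pull its exported variables into fish (the dotfile path here is just an illustration):

    fenv source ~/.profile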
I'm in the same boat as you. Also used Fish for several years and loved it, but there are some incompatibilities here and there (don't remember exactly what; I think it was a difference in how to pipe streams vs bash/zsh).
In the end I switched to zsh & oh-my-zsh and haven't run into issues since (made the switch maybe 6 years ago). I'm a heavy terminal user, so this is one part of my setup I'm not experimenting with again.
File manager: https://github.com/jarun/nnn
Google search: https://github.com/jarun/googler
DuckDuckGo search: https://github.com/jarun/ddgr
Bookmark manager: https://github.com/jarun/Buku
Multicore resize & rotate: https://github.com/jarun/imgp
Calculator: https://github.com/jarun/bcal
Date diff and timers: https://github.com/jarun/pdd
Kernel keylogger: https://github.com/jarun/keysniffer
The developer (https://github.com/jarun) writes cmdline utilities only. His tools are highly optimized for performance and integrate seamlessly with the DE. Most of them are available on popular distros and Homebrew.
Since @sharkdp's work was originally inspired by @burntsushi's rg, you might be interested in xsv for CSV viewing (and manipulation): https://github.com/BurntSushi/xsv
I might be revealing my lack of experience here, but why would you need an app to view your CSV? What is the benefit of this over loading the data into a data frame and running df.head()?
Visidata is much more than a simple CSV viewer. It can open xls/xlsx/sas7bdat/spss files, and allows you to quickly edit the in-memory copy of the table using Vim-like keyboard shortcuts. It also allows you to create plots using Braille Unicode symbols. It's a great tool!
The thing I really love about ngrok, though, is the web dashboard where you can inspect your tunnelled traffic and even replay requests: https://ngrok.com/docs#inspect
I keep a VPS running with Nginx on it for a specific port. Then I forward that port via SSH using remote forwarding to enable this exact functionality. It's free if you've already got a server running.
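A minimal sketch of that setup (names and ports are made up): Nginx on the VPS proxies a public vhost to a loopback port, and SSH remote forwarding carries that port back to the dev machine:

    # on the laptop: expose local port 3000 as port 8000 on the VPS
    ssh -N -R 8000:localhost:3000 me@my-vps.example.com
    # the Nginx vhost on the VPS then just needs: proxy_pass http://127.0.0.1:8000;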
god, I had this same idea and thought about making it! I'm so glad you shared this, I had no idea someone else already did it! and did a great job with it, too, by the looks of it
Excuse my very basic question, but how do you manage spam when using mutt? Do you interface to a cloud-based service in mutt, or have some kind of other spam filter?
I’ve always liked the idea of moving my email to mutt and using personal domains, but not convinced I could manage spam well.
the same way you manage it with a GUI mail client. if you use a mail service that includes a spam filter, you can use mutt's imap support to access it, and thus access the filtered mailbox like any GUI or webmail client would.
for your own personal domain, use a service that supports hosting custom domains (gmail does, but there should be others) and you are covered again.
if you want to host your own mail server, then you'll probably want to run your own spam filter on that server. it works by having the mail server forward the mail to the filter (which can be local or remote) and then deal with the mail based on the response from the service.
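as a concrete sketch of that forwarding step: with procmail as the local delivery agent, the classic SpamAssassin recipe looks roughly like this (paths and folder names vary by system):

    # ~/.procmailrc: pipe each message through spamc, then file tagged spam separately
    :0fw
    | /usr/bin/spamc

    :0:
    * ^X-Spam-Status: Yes
    spam/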
On one hand it's right there in the acronym ("command line interface"), and on the other, "was wondering if anyone had any solid recommendations for applications that one can use in a terminal window" really looks like he meant console apps, not CLI apps. I think you are correct.
sure, conceptually it does, but that's not the point.
what matters for most is the fact that i can run the application in a text only terminal, on any machine (remote or local), from any device (be it linux, windows, mac or even a mobile phone (ok, that's rarely practical, but it's possible))
> what matters for most is the fact that i can run the application in a text only terminal, on any machine (remote or local), from any device (be it linux, windows, mac or even a mobile phone (ok, that's rarely practical, but it's possible))
> a GUI does not offer that advantage
Sure, not in the terminal, but X forwarding is a thing and works on every system I've had to use it on.
it doesn't work on the majority of systems i have to work with, which are servers that don't have the necessary tools installed.
it's also very susceptible to latency and most applications don't handle slow connections in a usable manner. (they are designed with the expectation that the gui always responds instantly)
to get something of a tmux/screen like experience, xpra is available, which is an awesome piece of work, but it doesn't help with the latency. even over just local wifi i have some applications become unusable over xpra when they work ok over plain remote X.
the problem is not necessarily X but in part GUI in general. i can't click the mouse anywhere until the respective UI item is visible, so i have to wait for that.
on a commandline on the other hand in most cases i can keep typing even on extremely slow connections because i can anticipate what will happen and i know what keys are appropriate to type next.
using mosh i even get something like editable typeahead which is a marvel and very hard to imagine on X.
Oh, I never said I _enjoyed_ X forwarding, or even use it frequently. My point was just that remote access in a similar way to a shell is possible with a GUI, and could be made even better if X had a means of not having to draw everything but left it up to the toolkit on the other end.
But yes, in general, a shell is just much better:)
It's basically apt/brew on Windows, but it's particularly awesome because it does not manage dependencies. It just downloads and extracts installers, which works because every self-respecting program that works on Windows distributes binaries as an installer (or just a zip) somewhere. Result: everything just works, every program is self-contained, no admin rights needed, no package maintainers needed either.
So far I've been using Chocolatey
(https://chocolatey.org) for doing this (and ninite though that's not command line). But this looks pretty cool too. Thanks for sharing!
I created a small CLI tool that I package for Homebrew, Snap, Docker, Chocolatey and Scoop, and I prefer the packaging experience for Scoop, which is the easiest and most straightforward one.
Nice tool btw! I wish I'd known about it 5 days ago, when I taught a class that involved a little bit of web development. I used devd, but getting it in the PATH was a non-trivial task for some of the trainees who used Windows. Serve has an installer, which presumably solves that even without scoop or chocolatey.
Cli-junkie brofist! I only drop out of the command line for occasional one-off browser stuff.
Depending on your current familiarity with things, I can't recommend enough simply combing through the tools in coreutils, util-linux, find-utils etc. If you take just three or so a day, by the end of a few months you'll have a solid survey-understanding of what almost every *nix shell has available.
> git-annex is not a filesystem or Dropbox clone. However, the git-annex assistant is addressing some of the same needs in its own unique ways. (There is also a FUSE filesystem built on top of git-annex, called ShareBox.)
So I assume you made it work like Dropbox for you anyway. How did that go?
The "a better Dropbox" comment was quite tongue in cheek. You can think of git-annex as a way of managing files in git without managing the contents, which is exactly what we want for binary blobs like PDFs, word documents, whatever.
The synchronization between computers is done essentially how one would do it with git. Personally, I sync to a private VM host so the content is available "in the cloud".
At the moment I just sync everything manually, but it's possible to set up git hooks to make it automatic. git-annex is nice for power users, since it allows fine-grained control of where to keep what content. The file metadata is stored in git, but you choose when and how to pull or push the actual files themselves.
That makes it possible to sync cross sections of an entire library, for example. This is how I manage what to put on my kindle and what to keep off it.
Thanks - I'm subbed to that ticket. Will most likely switch over when it's done, since I vastly prefer a single binary rather than having to install python + deps.
It relies on Google’s translation API in GCP which isn’t free but does have a trial period. I assume YouTube would also rely on this service internally for auto captions.
Some great contributions here ... some that are missing (or at least scarce) which deserve more attention:
units - for a broad range of conversions without having to resort to Wolfram Alpha or a search engine
bc -l - the -l option really empowers this awesome little calculator.
mpv - awesome media player - point it at a media URL (it uses youtube-dl in the background, so it's perfect for viewing or listening with no ads), but it can also play from open directories / SMB shares and online radio, play a series of images as video, and access any FFmpeg/Libav libavformat protocol or filter (including generating tones and images), and more. And once it's playing, it has great interactive keyboard controls for many useful functions.
>bc -l - the -l option really empowers this awesome little calculator.
Right. Since bc is a Unix filter, you can also pipe expressions into it, to be evaluated, so it can be used in general Unix pipelines. E.g.:
echo *arithmetic_expression* | bc -l
and the output comes on stdout, so it can be further piped.
Edit: You have to quote any shell special characters in arithmetic_expression with single quotes, or better, just enclose the entire arithmetic_expression in single quotes. If asterisk (for multiplication) is not quoted, for example, it will be treated as a filename wildcard instead, and likely lead to an error. A wrong and a right example:
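    echo 2 * 3 | bc -l     # wrong: the unquoted * is a shell glob and expands to filenames
    echo '2 * 3' | bc -l   # right: bc sees the expression and prints 6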
For working on servers, tmux + mosh is a killer combination: mosh makes your connection resilient to network issues and tmux gives you (back) scrollback and cross-session persistence.
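For example (host and session name are placeholders), the two combine into a single command:

    mosh user@host -- tmux new-session -A -s main   # -A attaches to "main" if it already exists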
mutt is hard to beat as an email client and circe (an emacs package) + znc on a digital ocean instance or equivalent makes for a great persistent irc session.
I use git so much that I made these aliases in zsh:
alias g="git show"
alias gh="git show HEAD"
alias gs="git status"
alias gl="git log"
alias gco="git checkout"
alias gd="git diff"
alias gbl="git branch -v"
alias gbd="git branch -D"
alias gri="git rebase --interactive"
alias grc="git rebase --continue"
alias gra="git rebase --abort"
alias gst="git stash"
alias gsta="git stash apply"
alias gx="gco -- \*; git reset HEAD \*"
alias gcp="git cherry-pick"
alias gcpc="git cherry-pick --continue"
alias gcpa="git cherry-pick --abort"
A few hundred instances of "gs" in my .zhistory :)
I’ve been using short aliases for my most frequently used git commands for a few years now. I can’t imagine not using aliases.
alias st='git status'
alias dp='git diff'
alias di='git diff --cached'
alias aa='git add -A'
alias cm='git commit -m'
alias pu='git push'
Having these aliases makes these actions something that you don’t need to spend any mental effort on at all.
I highly recommend that others define aliases that make sense to them for their most-used git commands.
Mine are named the way they are because the two letter alias represents how I think of the action that the command performs:
* st - status
* dp - diff pending
* di - diff
* aa - add all
* cm - commit
If you define your aliases this way (speaking to the other readers of this thread, not the person I am replying to), you will find that the combinations become muscle memory quickly.
Notice also that unlike a lot of other people including the person I am replying to, my aliases are not prefixed with “g”. Again this boils down to your personal way of thinking. I’ve used other version control systems before, switched to git several years ago, and use git pretty much exclusively. I am of course aware at all times that I am working with git, but at the same time when I think of actions to perform, I don’t think of them as “git this or that”, I only think of the “this or that” part. If that makes sense :P
But yeah, find aliases that work and make sense for you.
alias ..="cd .."
alias ..2="cd ../.."
# ... you get the idea, mine goes up to 5
function mkcd () { mkdir -p "$@" && eval cd "\"\$$#\""; }
alias h="history|grep"
SCM Breeze has aliases like that, plus a bunch of other convenient things like numbered shortcuts for branches and files: https://github.com/scmbreeze/scm_breeze
hahah I have gs and gb (branch), just because I'm constantly typing them (sometimes for no reason at all); for others I end up typing the full commands tho
Since git is by far the command a lot of people type the most, this shell comes in really handy. It supports all regular git aliases, has autocompletion, and you can set default commands to be run when you just press return. It's amazing how much typing it saves.
I used to use ag (the silver searcher) [1] before, and `ag` was practically a muscle memory to me, but then I came across ripgrep and it was so much faster. I've now installed a static binary of ripgrep in every machine I have SSH access to.
This doesn't seem to take commands after a pipe into account, so IMHO it doesn't accurately reflect command usage. For example, running the above, I was surprised awk wasn't in the list, but when I investigated I found I'm almost always piping something to awk.
Hmm, yeah. And I often type multiple commands separated by semicolons, which it bunches together as one. That, plus long paths, meant mine was very unpretty. Speaking of AWK, this one uses the same flawed method, looks nicer (for my history), and is shorter:
history | awk '{a[$2]++} END{for (i in a) print a[i]"\t"i}' | sort -n | tail
Just for curiosity: you have moved away from Emacs org-mode?
If yes may I ask why?
I've been using Emacs for a while now; it's even my WM, and org-mode is its companion for nearly all docs and doc-related stuff I use. I've never found anything even close to as comfortable and powerful as that combo...
Thanks, just curious, because you are the first person I've heard of who used Emacs and then decided on something else.
For org-mode outside Emacs, I don't know the VSCode version, but I've used the Vim port in the past and agree: outside Emacs, org-mode support is simply too limited to be considered...
This is an abbreviated list of my command-line applications, taken from the Nixpkgs overlay I use for configuring macOS, NixOS, and non-NixOS Linux:
pxc.common.tui.pkgs = with self.pkgs; [
# cli basics
htop # top, but nicer
httpie # curl, but much nicer except for lacking a man-page (--help is very complete)
ranger # file manager
ripgrep # my favorite grep alternative/replacement
tree # prints directory trees
# stuff my fish config uses and some goodies I want
fish # very nice shell, replaces bash
grc # generic colorizer: uses regular expressions to colorize terminal output
tmux # terminal multiplexer (tiling window manager for the terminal)
byobu # some tmux config that makes things a little nicer. Great for new tmux users, hardly used anymore
fzf # fuzzy filtering: used for changing directories, selecting files to open, browsing shell history
fasd # frecency tools: jump to directories, open files, etc., based on fuzzy matches sorted by frecency
keychain # simplified SSH and GPG key loading/management
direnv # per-directory environments for barebones project setups, pretty nifty
gitAndTools.hub # GitHub CLI extensions for Git
gitAndTools.tig # curses TUI for browsing git logs
# makes tmux pretty with `powerline-config tmux setup`
pythonPackages.powerline
findutils # macOS comes with a weak find command
gawk # macOS comes with ancient gawk, tmux-fingers wants a newer one
pass # GPG+git-based CLI password manager
pwgen # for use with pass
p7zip # archiving tool that can do pretty much everything
# chat
weechat # nice terminal-based IRC app
### extras-ish ###
mediainfo # tell me things about multimedia files
asciinema # record TTY/terminal sessions as replayable moving text for the browser
cowsay
graphicsmagick # image manipulation. lots of fun for quickly uploading custom emoji to Slack
];
I am a convert from Spacemacs to Doom, and I have found that one of the main differences that sets Doom apart is easier configurability than Spacemacs. In Doom, there is still the concept of "layers," but IMO it is much easier to figure out what the layers are doing and customize them from there.
Also, I have found Doom to be much quicker (takes about 2 seconds to load, not using server/client) and less buggy than Spacemacs. I would highly recommend giving Doom a try!
This. Also, the developer is responsive on his Discord server should you have any issues with the editor. Henrik is patient, kind and reliable, rare traits for open source projects these days.
Aria2: https://github.com/aria2/aria2 an excellent download manager. Supports many features such as downloading torrents, multi-connection downloads, and resuming incomplete downloads. It also supports RPC calls, which enables it to be an excellent download-manager back-end.
I saw Aria2 has two packages in F-Droid. I installed them and looked at the docs, but I couldn't figure out how to download stuff with Aria2. A small example of its use would be highly appreciated. Thank you.
you need both. One of them only supplies a UI, while the other supplies the binary. I personally recommend running aria2c via termux and using ziahamza.github.io/webui-aria2
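for a small example in the meantime (URLs are placeholders), basic aria2c usage from a shell such as termux looks like:

    aria2c 'https://example.com/file.iso'       # plain download, resumable with -c
    aria2c -x4 'https://example.com/file.iso'   # use up to 4 connections to the server
    aria2c 'file.torrent'                       # torrents work too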
It's really worth the time to learn. It's pre-installed basically everywhere, it's amazingly useful. It has the features of a modern text editor all in a command line.
I use vim exclusively, and use a terminal for most things.
FZF - lightning-quick fuzzy file finding; I use this as a ctrl+p replacement in vim too.
Kubectx - works well with FZF. Quickly change k8s context, and define namespaces easily too (Kubens)
There are so many to list that I'll skip the more obvious ones (they're well represented in the comments) and pick a few that I've been enjoying a lot lately:
Netflix-Skunkworks/go-jira[0] - Jira interface written in Go that is incredibly flexible.
bat[1] - cat with code highlighting
climate[2] - Shell swiss-army knife of tools
That last one deserves a post on its own. It's certainly not the most portable thing -- requiring that certain binaries be available for some of its functions, but it works as-is for the Linux distributions that I run and provides simple commands for getting information about things that isn't always trivial to remember (such as getting an external IP address of a host from the command-line). Silly things like remembering the CLI options for 'du' (along with the necessary '| sort ... | head -5') are replaced by 'climate biggest-files'. It replaces a mess of tools, but does so, often, by simply calling those tools with the "correct" parameters that I've forgotten.
Yes, a well authored set of aliases for the most common commands would do the same thing, but I end up using 'climate' in cases where the information I need is something I rarely need for a given task and getting that information is going to be a non-obvious exercise involving a few pipes. It's a "kitchen sink" utility with a set of commands that, for the most part, you'd find a hard time grouping them into a single theme -- a command for getting the current weather (after all, it is named climate) next to one to download all files on a web page, next to one to get disk usage stats in a variety of ways. Normally, I don't like tools like this, but this one lands as a big exception to that rule for me.
One function it provides is displaying the public IP address of a host behind NAT.
Another way to do this: `dig @resolver1.opendns.com ANY myip.opendns.com +short`[0] ... it's just trickier to remember. And since dig is a tool for performing DNS lookups, `man dig` doesn't hint at this particular trick.
In fact, I didn't actually remember the command that displays that for `climate`[1], because if I had, I would have written it as `public-ip`, which `climate help` informed me of. :)
[0] ... or if you prefer a more interactive and profane experience, try `lynx wtfismyip.com`
[1] My public IP is displayed on terminal start since I need it with some frequency; Of course, it's displayed as "External IP:", which is something I'll have to remedy.
on the contrary for the first point. i suspect that most enjoy notmuch with their traditional interface because there are not many alternatives. probably most people looking for interface alternatives switched to gmail.
commandline UI development in particular seems very conservative, and attempts to explore new interfaces are rare. that's why i love this topic and am getting excited about any attempt to change that.
i obviously can't comment on what it takes to integrate notmuch with lumail, and i grant that one may possibly have to develop a mailclient with notmuch in mind for it to work well.
httpie is the nonstandard command I use the most I think (https://httpie.org/). Syntax highlighting for everything that comes down the pipe is pretty awesome. I'm also a sucker for bat (https://github.com/sharkdp/bat) which I just found out about through this thread.
Among them, the ones I use the most are `json` and `yaml` for pretty printing both formats, as well as `urlencode` / `urldecode` / `urlarray` which I recently added.
If you need to handle XML, xmlstarlet (http://xmlstar.sourceforge.net) is fantastic, and much more powerful than the standard line-oriented tools like sed and awk for this task.
The only issue I have with xmlstarlet is that it can be sort of a pain when there are xml namespaces in the document. Most of the time, you just have to pass it the long option that ignores them altogether.
rsync. It’s installed everywhere and makes it super easy to keep source code in sync between laptop and remote servers. It’s so fast that I completely ignore more advanced solutions based on fanotify or similar, and literally include rsync before every “build/run” command during development.
I wrote a Python cli tool for getting the git status/history over multiple local repositories. I am often working across several repos on a single project, but generally I find it useful to see what updates are available, and also what my team are up to.
tail -f works by opening the file and remembering the byte offset; if it's a log file that gets flushed/wiped, then tail -f won't see anything until the log fills enough to exceed the previous file size.
Unless you pass the option --follow=name in which case tail will if necessary reopen the file and follow it even when log-application has closed and moved it elsewhere and opened a new one of the same name.
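For example (the log path is illustrative):

    tail --follow=name --retry /var/log/app.log   # keep following even across rotation
    # GNU tail shorthand for the same thing: tail -F /var/log/app.log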
If you aren't using tmux or screen, you probably should be, especially over SSH sessions. They allow you to have multiple consoles open and switch between them, split window, and disconnect from SSH and then reconnect with all your things still open and running (as long as the server stayed running).
I have an EC2 instance whose main purpose is to run a screen session. I ssh to my EC2 and load that screen. I have every server I manage in its own "window" inside this screen. I run screen on each of these machines, so I can hop between inner and outer screens.
As someone who hasn't invested much time into tmux/screen, what are the benefits of it over just using, e.g., multiple terminal tabs or the terminal's built-in split windowing? I'm sure there's probably a benefit but I dunno what it is
If it's all on one device and no networking, I don't know of any great benefit. But if you sometimes connect to the machine remotely then Tmux allows you to pick up an existing session, e.g.
You open tmux on your desktop, open some command line things, then walk away and pull out your laptop or desktop, SSH to your computer, "tmux attach" and have the same session open - and it's open on both devices, both seeing the same updates at the same time, like a terminal-vnc.
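Concretely (the session name is arbitrary):

    tmux new -s main      # on the desktop
    tmux attach -t main   # later, from the laptop over SSH; both see the same session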
I agree with everyone else that it's great on remote systems, but the reason I use it locally too is because I can set up all a project's "tabs" with just one command. I have dozen-line scripts for each client/project that will either create or attach to a tmux session, and when creating set up all the windows how I want: setting their names, cd'ing, opening psql, tailing a log, changing the bottom color (green for development, yellow for staging, red for production), etc. For Rails work I typically have separate windows for models, controllers, views, stylesheets, and javascripts. It makes it really easy to get started after a reboot, or just hide projects for a while and bring them back quickly.
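A minimal sketch of such a launcher script (the session/window names and commands here are invented for illustration, not the poster's actual setup):

    #!/bin/sh
    # attach to the project session if it exists, otherwise build it
    tmux has-session -t myproj 2>/dev/null && exec tmux attach -t myproj
    tmux new-session -d -s myproj -n editor
    tmux new-window -t myproj -n logs 'tail -f log/development.log'
    tmux new-window -t myproj -n db 'psql myproj_development'
    tmux attach -t myproj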
occasionally i need to restart the GUI. using tmux locally is very helpful for that. also sometimes i log into my workstation from remote, like when i am travelling. it's nice to be able to access the running session.
Both tmux and screen allow you to attach/detach. Thus you can log off, and you can pick up your session later.
For example, yesterday I logged into a Linux server at DigitalOcean via SSH. I ran tmux, started a long-running job, detached the session. I closed my laptop to go home for lunch. When I returned at my desk, I logged into the server and reattached the tmux session so I could see the results.
One useful reason to use it over a terminal is portability. If you're happy mostly working in the terminal then tmux will be the same on every OS your work on even if your favourite terminal isn't available.
It persists your session, so if you get disconnected you can just log in again and 'tmux attach' to resume from where you left off. Any long running commands will still be running.
I use some scripts around screen (and tmux would do the job at least as well, just haven't had reason enough to switch) to manage contexts for my shells. The key is that that process will be the parent of any shells I spawn in contained windows, and so if I set an environment variable (I use SESSION) before kicking off screen it will be inherited, and visible in my bashrc. This lets me set up a lot of context specific things - path and functions and aliases - but the single best thing about it is a separate bash history per context.
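A sketch of the mechanism described (the variable and file names are illustrative):

    # start a context; every shell spawned inside inherits SESSION:
    SESSION=clientA screen -S clientA

    # in ~/.bashrc:
    if [ -n "$SESSION" ]; then
        export HISTFILE="$HOME/.bash_history.$SESSION"   # separate history per context
        [ -f "$HOME/.bashrc.$SESSION" ] && . "$HOME/.bashrc.$SESSION"
    fi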
I really appreciate the people who include descriptions and links for their list. My list has been pruned to avoid duplication (though perhaps knowing what's popular is a good metric too).
asdf: https://github.com/asdf-vm/asdf - installs many different programming languages and supports installing different versions concurrently. I use it for go installs since distribution updates aren't usually fast enough.
solaar: https://github.com/pwr/Solaar - handles configuring a logitech unifying receiver in Linux. Not really something you'd use multiple times, but am thankful it exists (also has a GUI too I believe).
Goaccess is a great little realtime log file analyzer in a similar vein to webalizer. Not only does it work in the console, it can also output HTML.
Three tools I make sure are installed on everything I use via CLI: rsync, pv, & progress.
rsync (usually over SSH): the kitchen sync of file transfer commands. In Linux-land it is quite common to find it already installed, and if not it is in the standard repos. Available for Windows too.
pv (pipe viewer): drop it into a pipeline to see the throughput and progress of the data flowing through it.
progress (https://github.com/Xfennec/progress): for similar reasons to pv, wanting to see how a long-running task (that isn't giving its own progress information) is getting on.
Used to use that but stopped after a while; figured it's easier to just type a name, or alias/symlink it if it's used often -- then it's deterministic in a non-interactive way, i.e. I can type it with my eyes closed and I know the end result.
With tools like z I'll have to type fewer letters in the end, but it requires interaction, and interaction/attention is more expensive than keypresses, at least in my world :)
Agreed, highly recommended. I dreaded writing shell scripts for the longest time because of the weird syntax and many gotchas. Having emacs flycheck my shell buffers with shellcheck gives me confidence that I'm not doing anything stupid.
If I had to pick a single utility, it would be pv[0]. It's very simple, but I find myself using it all the time. Basically you stick it in a pipeline of commands to get a progress bar. The simplest usage is just to replace cat or input redirection with a call to pv. Saves a lot of time when I can quickly find out if the command I wrote is going to take 30 seconds or 30 minutes.
Ultimate Plumber is a tool for writing Linux pipes with instant live preview
It's a new CLI tool I recently wrote, and it reached the #1 spot on HN just after its public release.
It works with other classic CLI tools by boosting the speed with which they can be composed together. It's especially useful for quick exploration of logs, CSV files, and other text files or pipelines.
Um; I feel it's kinda as if you asked whether a dog is like a rhinoceros... How do I even answer that? The simplest answer is: "no, it's not"; though if you squint your eyes super hard, you could see some kind of vague resemblance, maybe?
Have you seen the readme? There's an animated gif showing how it works, and a more detailed description.
> interactively and incrementally explore textual data
Just like less.
> This is achieved by boosting any typical Linux text-processing utils such as grep, sort, cut, paste, awk, wc, perl, etc., etc., by providing a quick, interactive, scrollable preview of their results.
Yep, that's less.
> use PgUp/PgDn and Ctrl-[←]/Ctrl-[→] for basic browsing through the command output;
Sure that's not like less? You can even start searching results of a high output command immediately after the pipe.
Ahahah, lol, ok, get it :D So, apparently I totally reinvented less, I concede :)
To say even more, it's actually very much a crippled less (not all of its functions available!). With just one feature added, namely:
> in the input box at the top of the screen, start writing any bash pipeline; then press Enter to execute the command you typed, and the Ultimate Plumber will immediately show you the output of the pipeline in the scrollable window.
At least for me personally, it was painfully worth reimplementing the crippled less functionalities for this single little addition. Though I do admit I felt the lack of the slash-to-search feature of less already yesterday when using up...
And, to explain myself a bit, I really tried my best at explaining the tool in the readme... it's just that I can't currently think of any better wording :( If by any chance you had some idea/suggestion how you think I could improve it, I'd be super eager to hear it! I'm kinda too deep in the trenches to be able to look at it from a distance, so even seemingly obvious (to you) comments (as the one with the comparison to less, after you explained it) are sincerely valuable to me!
I haven't used the command, but from what I understand, it is like a shell with less-like interface that you can pipe to, and work on a copy of the output stream.
I guess one way to articulate what you are doing is building /up/ longer commands in a step-by-step fashion with immediate feedback. That is not necessarily a bad thing, especially if processing or preparation times for the input are long. However, you could also accomplish the same thing by using a smaller sample size and repeatedly applying your commands in a regular shell instead of an interactive less.
I'm a cli junkie but I've gone from using very minimal tiling WMs to vanilla Gnome. So I've stopped using bitlbee for example, but bitlbee is something I'd use a lot in the days of 100% terminal.
It's an IRC proxy for other protocols like Skype or Slack.
I have no idea how it stands up today but I always liked the concept of having one service that handled conversion of all the modern protocols to text based.
Other than that I think people have already given amazing recommendations so I have nothing to add.
Curious what made you leave tiling WMs? I think about switching back occasionally, but still use dwm.
I also used to use bitlbee, but once AIM died I dropped it.
Slack dropped their irc gateway a while ago. I recommend wee-slack[1] which is a weechat[2] python plugin that uses the Slack API instead of the old IRC gateway, which also means it is a lot more feature rich than the IRC gateway was.
My focus has always been on being able to work without issue. Without OS locking up or without tools malfunctioning.
Used to have laptops with 256MB RAM for example, or 1024x768 pixel resolution. So I'd do anything to cram as much as possible into a tiny screen using as little resources as possible.
Now my laptop has 8G of RAM and an i7 CPU at 2.5GHz and 4 virtual cores. On top of that Linux and Gnome have made a lot of advances so it runs like a dream and I can work without having ~100 lines of configuration in my WM.
More seamlessly go from my own to someone else's computer without forgetting that caps lock is my Meta key. It all pretty much resembles Windows and Mac OS more.
I just don't see any purpose with torturing myself using a minimal resource desktop environment when I don't have to.
The main program is still gnome-terminal and tmux, I still do a majority of my work there. Habitually use cli for as much as possible, even simple operations that I could do in the Gnome file browser.
But I do it all with a vanilla Gnome configuration and I still don't miss a thing.
It's probably not what you'd use often/at all but I recently was glad to discover Google's import-mailbox-to-gmail[0] script.
Technically a CLI I suppose, it imports an mbox file into Gmail. I had switched to G Suite and wanted to import some old emails. It took a while but this did the trick!
Oh, I wrote an article on this a while ago! It's mostly about smaller utilities, not full-fledged apps. Also, I was focusing on the ones that are lesser known. Still could be useful for some: https://code.kiwi.com/lesser-known-tools-we-love-at-kiwi-com...
abcde - A Better CD Encoder (http://lly.org/~rcw/abcde/page/) I bought three cheap USB CD drives and concurrently ripped all of my CDs to FLAC over the course of a couple of weekends. abcde gets the best possible copy.
ddrescue - GNU ddrescue (https://www.gnu.org/software/ddrescue/) not to be confused with dd_rescue (http://www.garloff.de/kurt/linux/ddrescue/) This piece of software is outstandingly good for rescuing bits from failing spinning rust. It doesn't care about the filesystem, it just tries really hard to get data from a raw device. Once you've got all the bits onto a safer medium, you can use testdisk to explore the disk image.
I am always impressed by the usefulness of the git command line. If you wanted to write a GUI for it, you would usually expect a library to link to, but the GUIs I have seen (e.g. SourceTree) use the command line and still get good performance.
Another one is ffmpeg. A ton of tools are built on it.
byobu (http://byobu.co/) is an amazing thing to have on your remote server: sessions and switchable windows + splits make this a must-have tool. Think of it as tabs, but for your SSH session, plus many more features.
Just a quick note on this - byobu isn't a tmux alternative exactly. It actually uses tmux (or screen) on the back-end. So it's more like a nicely configured tmux.conf/.screenrc with standardized bindings.
Lots of great stuff here! Key takeaway though (for me at least) is that a lot of this stuff turns into cumbersome management if you're not doing the bulk of your work on localhost, but instead dealing with a few dozen or hundreds (in my case) of different servers. Yeah, you can use ansible to automate installation/setup, but then again, if you suddenly end up on a fresh server without root access you're gonna be/feel crippled, being used to aliasing everything and replacing standard shells with something extra fancy.
Some level of balance is needed, which is what I've been trying to do. That said I always setup oh-my-zsh and `git config --global alias.kurwa status` wherever I go.
localhost or remote does not make a difference, any manual setup required is annoying either way.
most of the tools don't need manual configuration though and so it's a single command to install all of them that you can have tucked away somewhere to use when needed.
Maybe the best part of the shell for me is AWK - so useful, and enjoyable to use. I use ffmpeg a lot for making movies from images, video conversion etc. I use Sage on the command line, mostly for programming in Cython and mathematical stuff with the Sage/Python REPL. It was 2.7GB (!) and comes with an absurd number of libraries and packages[0], mostly for maths and graphics. I started by using it in a browser notebook, but switched to the command line and never went back.
I agree. A lot of them come with a UNIX system, although there are others. I use some of the moreutils stuff. For email, I use Heirloom-mailx. For text editing, I use vim. For database I use SQLite. For typesetting I use TeX. For processing pictures I implemented my own package called Farbfeld Utilities. For making MOD/XM musics, I wrote AmigaMML which uses a text file rather than the GUI that many other programs do. Hopefully other people on here can mention even more, because I am interested in the answer too, and probably my answer is insufficient to the original asker anyways.
a tool that looks very promising is xiki: http://xiki.org/
i haven't been able to try it myself yet, but i am curious to hear if anyone else has played with it.
first GUI on top of git that I didn't hate. Primarily because it doesn't break the flow in command line. Instead of diff/add/commit I open this, but for the rest I still use git directly (or through aliases).
I wonder if there's any 2FA code storing/generating cli application? I know they aren't supposed to be used like that but storing them all on just a phone doesn't seem wise in the long run. Might be useful while building some automated scripts as well.
I am getting fonder and fonder of xargs. It basically reads some input and appends it to a pre-supplied command line, then executes the newly built command.
Trivial example:
ls -1 | xargs -L1 echo processing
I am finding it quite useful to load and delete a bunch of netapp volume snapshots in a single line:
With xargs (and find), I wish I would've found this sooner:
find test -print0 | xargs -0 file
"This allows file names that contain newlines or other types of white space to be correctly interpreted by programs that process the find output. This option corresponds to the -0 option of xargs."
As a windows admin, docker has actually been super handy for running CLI tools from platforms (linux) that I can't easily get on windows and have no will to compile natively myself.
Grep might be fine 99% of the time, but there's no reason to not drop a ripgrep binary on every machine I use regularly.
Ripgrep is faster across the board and has better defaults for working in Git repositories. Since I'm working with a large codebase managed in Git, it's a no-brainer.
grep is usable on very large code bases because, after the first scan, all the sources are loaded into the file system cache, so successive greps are done from RAM and are very fast.
So much so that I almost never use ctags despite the improved semantic search (you can find more easily the exact identifiers and distinguish them by category). grep is just as fast and precise enough.
Sort of on topic...Does anyone know of a good interactive resource for learning some of the tools that are included by default on UNIX systems? RTFM is useful, of course.
Not sure what you mean interactive - trying things out on bash itself is a good way to learn. I put together a (quite long) single-webpage not-so-quick guide to bash shell,[0] which covers a lot of the basic stuff (well, with separate pages for grep/sed and AWK), at the bottom of the page are links to the best bash websites and names of the best books that I've found. Hope that's of some use. But yeah, the man pages become perfectly adequate guides/refreshers once you've learnt the basics of a command/tool - especially using them with "/" for searching for what you want.
Various grepalike tools like ack, ag and rg have been mentioned. Most comments seem to be based on which tool runs simple searches faster, but note that their feature sets vary widely.
Here's a handy chart that compares the features of each of these tools, along with git grep and plain ol' GNU grep.
Since no one's mentioned it yet: I don't necessarily think nano is great, but it's always there and it opens immediately. Almost all small, quick edits I make (where opening VSCode/Emacs/whatever doesn't make sense) are done in nano. Pretty useful.
* fzf - a general-purpose "selector" UI for choosing "things" from a list, such as your history (bound to ctrl+r) or files in a directory (bound to ctrl+p, with input generated by bfs or fd)
* ffmpeg - except I am much more likely to use it via one of the wrapper fish scripts I've written such as ffencode, fftrim, ffconcat, ffmute, etc. because it is (necessarily) so damn verbose and infinitely tunable.
* tmux - if I'm working remotely or on a headless machine where I can't just ctrl+shift+n to open a new terminal window (which I prefer)
I use mcabber and irssi for chat. Mcabber doesn't really support much in the way of features (so even on a modern XMPP server you're missing simple things like fetching history), but its vi style shortcuts and nice layout mean I can't give it up. Combined with jmp.chat I can even send text messages from it.
I use `ack` (https://beyondgrep.com/) as a replacement for `grep`. It is simpler, has colors by default, and is really fast and very useful for programming.
and it allows you to open multiple views at once, keeping state in each.
it also supports saved searches as a view.
it's not actively maintained right now, but it is stable. i haven't had any problems yet.
sup is the ancestor of the notmuch libraries which can be integrated into mutt. but i don't know how well that works. perhaps https://news.ycombinator.com/item?id=18483833 can shed some light on that.
> notmuch-mutt, which will create a "virtual" maildir folder with search results whenever a search is made. The upside is that you can search all your folders simultaneously; the downside is that your modifications in the results listing do not carry over, also having to switch folders comes with some more annoyances.
sup does not have that problem. i have used mutt for more than a decade. switching to sup was a revelation. i would really love to see any alternatives that can compete with it.
imagemagick - CLI tool for changing images. When you need to turn an entire folder of pictures into a single PDF, encode them so that the file size isn't huge, and scale them to avoid moiré patterns, it's the tool of choice.
I would gladly see a list of useful aliases for omnipresent CLI utilities like `grep`, `ls`, `cat`, `less`, `cut`, etc. They're surprisingly powerful but going through their man pages is like reading a whole library.
for a software keyboard, since I've seen several lists of aliases:
alias ,v='mv'
alias r,='mv'
Otherwise:
ttyclock
tmux/byobu
pv
ncdu
htop
megadl
webcomix and/or dosage - download webcomics into cbz files for local viewing. Installable via pip.
elinks, links2 - Browsers, especially useful if compiled with javascript support.
nnn
aria2c
grc
most
mlocate (provides the locate and updatedb commands. I can't recall why I like it more than slocate or findutils)
I love ImageMagick. We were able to write a script to join thousands of images from command line. The manual GUI way of joining images would have been incredibly slow.
One that I use a lot and I haven't seen mentioned here is `dict` which is fine for looking up word definitions and acronyms if a bit limited for translations.
You can run updatedb more frequently. By default it is run on a daily cron job and has the smarts to not run when on battery. (see /etc/cron.daily/mlocate)
You can also exclude directories in /etc/updatedb.conf that you don't want locate to index. I think in your case a more frequent cron job makes sense. If run often enough, updatedb finishes very quickly, which will give you the instant search you're looking for.
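For example (the values are illustrative, not defaults):

    # /etc/updatedb.conf
    PRUNEPATHS="/tmp /var/tmp /media /mnt"
    PRUNEFS="nfs nfs4 sshfs tmpfs"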
Second the usefulness of rename, but beware there is some confusion about the name of the tool. It's called 'rename' on debian and derivatives and 'prename' on RH and derivatives.
Since git is by far the command a lot of people type the most, this shell comes in really handy. It supports all regular git aliases, has autocompletion, is super useful for setting temporary names when pair programming, and you can set default commands to be run when you just press return.
I found zsh to be quite easy to setup with oh-my-zsh. The installation is a few commands and the defaults are sane, I use it with minimum modifications. YMMV
I'm not big on applications, period. Activities should be decomposed (or at least decomposable) into individual actions that I can stitch together in the shell.
I agree, we can use pipes to put them together. That is how I do such things, and design these programs, at least.
For example, one program is "playmod", which reads a module music file from stdin and writes the raw audio data to stdout. The program "amigamml" takes MML code from stdin and writes the module music file to stdout. And then the program "aplay" plays the audio data from stdin on the speaker. Another example: "ls" to list files, "shuf" to put them into a random order, and "xargs" to run another program for each one; a shell script might then do "playmod" and "aplay", and you can play the music at random!
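A sketch of that random-play pipeline, assuming playmod behaves as described above (aplay may additionally need format flags such as -f cd for raw data):

    ls *.mod | shuf | xargs -I{} sh -c 'playmod < "{}" | aplay'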
There's a frequent (but by no means universal) distinction drawn between "applications" and "utilities". In the former case, you have repeated interaction with a single process, typically with some measure of "ownership" of its display port. On the other hand, a utility typically does a single thing, produces some output, and then exits.
dtrx - Do The Right Extraction. It's a tool "for Unix-like systems that takes all the hassle out of extracting archives": https://brettcsmith.org/2007/dtrx/
In Debian/Ubuntu you can install via apt.
Also, I'd second tldr, which has been mentioned here already. It provides simplified man pages with common usage examples that normally fit in a few lines on screen: https://tldr.sh/
Yeah, it looks rather unmaintained, but it can still be installed via apt on Ubuntu 18.04 LTS. So it should be fine for a few years to come if you are an Ubuntu user (the system I use; I have no idea about other distros).
here is a list of all tools mentioned in the thread. unfortunately, including descriptions and links made the comment too large, so for now just the names. the number in front is the number of mentions.
i just discovered some awesome tools in the moreutils package mentioned elsewhere in this discussion:
sponge: soaks up stdin, so you can pipe it back to the same file you just read. compare:
grep -v "^me:" /etc/passwd > /etc/passwd # failed attempt to remove user *me*.
grep -v "^me:" /etc/passwd | sponge /etc/passwd # this works because sponge will wait until the grep is completed before writing back to the file.
another is:
vipe: insert vi into a pipe to do some manual editing of the data while it's handled in a pipe. could also be useful to preview pipe contents while they are being processed.
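for example, hand-editing a file list mid-pipeline before anything destructive happens:

    ls *.log | vipe | xargs -d '\n' rm   # delete only the files left after editing in vi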
moreutils also includes vidir, an alternative to qmv.
Doesn't matter, because every re-implementation of that old unix tool is made for the same purpose and mostly accepts the same arguments.
For example, you can make a tar archive and pipe it to nc. On the other server, nc would accept the data and pipe it to tar for unpacking. This shows the real power of unix pipes.
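Spelled out (host and port are placeholders, and the -l syntax varies between netcat variants):

    # on the receiving server:
    nc -l -p 9999 | tar xf -
    # on the sending machine:
    tar cf - somedir | nc receiver-host 9999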
I wouldn't call it among the best, but it has a real-life use, and was, and maybe still is, used in production by a large motorcycle manufacturer, for whom I originally wrote it:
fish-shell: https://github.com/fish-shell/fish-shell/
ranger: https://github.com/ranger/ranger/
tig: https://github.com/jonas/tig/
ag or rg: https://github.com/ggreer/the_silver_searcher https://github.com/BurntSushi/ripgrep
And all my configurations, especially for fish-shell: http://github.com/c02y/dotfiles