Ask HN: What are the best and worst command-line interfaces you have used?
33 points by de_keyboard 7 days ago | hide | past | favorite | 83 comments
I am interested in learning what makes a great (or terrible) command-line interface.

What are some of the best and worst command-line interfaces you have used?

Nearly everything that packs or unpacks stuff. tar, gunzip, and the rest. I can never remember if they are going to unpack everything into a subdirectory or spill everything into my home, and conversely I never know if after zipping there's gonna be a directory inside or not. Luckily dtrx and atool do the thing you want most of the time.

From the time I was a teen I think I remember that mount was difficult to use, and mounting a CD image required passing some arcane options, though maybe I just didn't know how to use it (on the other hand I didn't know how to use Daemon Tools either, and it just worked).

I don't have a problem with git though. I don't think it's the command-line interface that's hostile, it's git itself if you don't know what you're doing. After taking some time to learn it I actually enjoy solving difficult problems with git.

Long ago someone taught me this memory trick for tar: imagine an angry German stereotype saying “eXtract Ze Vucking File”, thus -xzvf. No idea what it stands for but it does what I need most of the time.

x = extract, z = compress/decompress, v = verbose (list files added or extracted), f = filename follows.

You can leave off the 'z' if it's a tar without compression. If it's compressed, the extension is usually tgz, tar.Z, bz2, or something else other than tar.

You can leave off the 'v' if you don't want to see a list of which files are being added or extracted.

In recent versions of (maybe only GNU?) `tar` you can leave off the `z` flag and it will still decompress based on the filename.

Or just replace the z with an a. That's what I mainly use.
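For anyone who wants to try the mnemonic out, here is a runnable sketch of those flags, assuming GNU tar and a scratch directory so nothing lands in $HOME:

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
mkdir src && echo hello > src/a.txt

# c = create, z = gzip, v = verbose, f = archive name follows
tar -czvf demo.tgz src

# x = extract; recent GNU tar also auto-detects compression,
# so plain -xvf works on .tgz files too
mkdir out && tar -xzvf demo.tgz -C out
cat out/src/a.txt   # -> hello
```

Because the archive was created from the directory `src` rather than `src/*`, everything extracts neatly under one subdirectory.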

Yeah, extractors/compressors are so bad I ended up hacking my own interface over the common utilities in Python (I called it tzar because it's tar, zip, anything really).

A very handy tool for this is https://www.nongnu.org/atool/

Relevant xkcd comic: https://xkcd.com/1168/

That's something I never understood. I learned tar -xvf young, and that's the only command I've ever used, and it works every time.

I interpreted the comic as meaning not any old tar command line, but a command line for a specific intended purpose. I had a job interview once with a test consisting of a whole list of 20 unix questions, one of which was "you need to do x, y and z in tar, what's the command line?". I got about 5 of them, but spent the 5 minutes critiquing the test and suggesting ways to improve it. I got the job. Apparently my 5 answers were the best anybody got anyway.

That's interesting, I always interpreted it as "enter any valid tar command".

That's the most obvious interpretation, but it doesn't really make much sense. At the end of the day, I suppose it's just a joke that tar command line arguments are bizarre and arcane.

tar --version

tar -czf ./file.tgz ./*

edit: i do hope that is correct without checking :-)

So the dash in `-czf` is optional? Which reminds me: why does ps have two sets of options again?

Don't laugh, it's actually three sets of options now:

- UNIX options, which may be grouped and must be preceded by a dash.

- BSD options, which may be grouped and must not be used with a dash.

- GNU long options, which are preceded by two dashes.

Reference: https://man7.org/linux/man-pages/man1/ps.1.html
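The three styles side by side, assuming procps-ng `ps` on Linux:

```shell
# UNIX options: grouped, single dash
ps -ef | head -n 3

# BSD options: grouped, no dash
ps aux | head -n 3

# GNU long options: double dash
ps --pid 1 --format pid,comm
```

Mixing them is where it gets fun: `ps -e f` (UNIX `-e` plus BSD `f` for a process forest) is not the same thing as `ps -ef`.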

As an alternative to atool I've always used The Unarchiver's cli version[0] which has been able to handle basically anything I've thrown at it correctly.

[0]: https://theunarchiver.com/command-line

May I suggest using “zipinfo” or “tar -t” to check the contents of the archive before extracting.
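For example (throwaway archive, GNU tar assumed):

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
mkdir pkg && touch pkg/readme
tar -czf pkg.tgz pkg

# t = list: check whether everything sits under a top-level
# directory before extracting, instead of finding out the hard way
tar -tzf pkg.tgz
```

Every entry in the listing starts with `pkg/`, so extraction will stay inside a single subdirectory rather than spilling into the current one.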

This is a pain in the GUI tools as well

The ImageMagick utility mogrify.

By default (without specifying extra command-line options) it overwrites the input files. [1]

In decades of computer use, that is the only command line tool that tricked me into destroying my data because I didn't realize it works differently from all the other tools I use that don't do that.

Usually, a command line utility with less options specified is relatively "safe" and you have to add extra syntax to make it "unsafe".

It was definitely a violation of: https://en.wikipedia.org/wiki/Principle_of_least_astonishmen...

EDIT reply to: >The whole reason for mogrify's existence is that it overwrites the original image file.

Sure, I understand that but many other destructive tools will have safety UI features such as creating ".bak" backup files. That's what many other command line tools do including image utilities. So something like mogrify could have been designed with hypothetical syntax as "mogrify -nobackups". The principle is to type extra syntax to make it more dangerous while still accomplishing the (observable) goal of changing the original files.

[1] https://superuser.com/questions/1575004/imagemagick-how-to-a...

> mogrify - resize an image, blur, crop, despeckle, dither, draw on, flip, join, re-sample, and much more. Mogrify overwrites the original image file, whereas, convert(1) writes to a different image file.

The whole reason for mogrify's existence is that it overwrites the original image file.

It would be much safer to take the sed approach and put mogrify's functionality under `convert -i[backup extension]`, and dispense with standalone mogrify altogether.

Imagine having the command “sedd” aliased to `sed -i`. It would be a usability nightmare.
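For comparison, GNU sed's in-place flag accepts an optional backup suffix (attached directly, with no space), which is exactly the safety valve being asked for here:

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
echo colour > words.txt

# -i.bak edits in place but keeps the original as words.txt.bak;
# bare -i would overwrite the file with no backup at all
sed -i.bak 's/colour/color/' words.txt

cat words.txt       # -> color
cat words.txt.bak   # -> colour
```

So the dangerous behaviour (no backup) costs the same keystrokes as the safe one, and you opt into each explicitly.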

> The whole reason for mogrify's existence is that it overwrites the original image file.

Then it should not exist. To save one "rm", doesn't look like a good idea to me.

> Then it should not exist.

It definitely should.

This becomes critical when you are batch processing large amounts of images on a cloud instance or somewhere else where disk space is a constraint.

I have used mogrify and similar custom solutions when trying to batch-edit gigabytes of images while cleaning data for computer vision training.

well, use convert instead. It's the only difference...

Not the worst (I think `tar` and `unzip` probably take the cake) but `find` is pretty atrocious IMO. The order in which you specify certain flags like `-maxdepth` relative to other option flags matters (not just relative to plain arguments, but relative to other option flags starting with `-`).

`find` is definitely one of my least favourite among the traditional Unix tools in terms of interface. `dd` is also a little weird with its syntax but at least I rarely have to use `dd`.
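A small demonstration of the ordering quirk, assuming GNU find:

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p deep && touch a.txt deep/b.txt

# Global options like -maxdepth belong before tests like -name
find . -maxdepth 1 -name '*.txt'    # only ./a.txt

# Swapped order still runs under GNU find, but it prints a warning
# that -maxdepth should have come earlier on the command line
find . -name '*.txt' -maxdepth 1 >/dev/null 2>warning.log || true
```

So `-maxdepth` behaves like an option but is parsed as part of the expression, which is why its position matters in a way that surprises people used to ordinary flag parsing.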

I don’t know, once you take the time to learn `find` it’s ok. You always use mostly the same flags anyway.

`fd` is a worthy successor of `find` though.

>I don’t know, once you take the time to learn `find` it’s ok.

this could apply to literally anything. my time is not infinite.

Find is fine. Grep is fine. I do wish they took arguments in the same order.


For interactive shell scripting:

- AWK. Old enough to be part of the POSIX standard and yet considerably more awesome than almost any other UNIX command.

- socat for any kind of network/socket testing stuff.

- sponge and entr just for their genuine usefulness with minimal interface.

For interactive use:

- SSH. I mean, obviously. It's also astounding how well OpenSSH integrates into the whole *NIX TTY landscape. You can use it for years and still learn neat things about it.

- Vim. Yes, it's obscure to learn, but once you do, boy do you get a lot out of it. Also: Since `:terminal`, working on one-off scripts has become considerably more awesome.

- htop and btop++ are good examples of TUIs for interactive use.

- tig is also quite cool for many git interactions (especially partial staging).

- lnav for log analysis tasks.

For batch jobs you set up and trigger as needed:

- Beets (see: beets.io). It's impressive how streamlined it works once you've set it up.


- Pretty much anything Microsoft. I really don't understand how it's possible to consistently produce bad CLI programs for such a long time.

- As a prime example, have a look at `sc sdshow` and `sc sdset`.

Edit: Perhaps to clarify about the sc SDDL commands:

If you look them up, you won't find much "bad" in their documentation. Because nothing about the SDDL syntax they are built on is documented. The best you can do is scrape together what you can from third-party blogposts with a perceived signal:noise ratio of .000001.

And SDDL in itself is phenomenally badly designed. And, of course, this being Microsoft, the tool isn't designed to be used together with other programs.

My niggle is with commands that cannot be composed, i.e. consumed by other tools or shell scripts or built into pipelines. There are lots of small things that break composition: not exiting with a non-zero code on error, sending error messages to stdout, not producing output that can be easily parsed by another command. Too many commands suffer from one of these problems, unfortunately.

One that's unfortunately quite common is unconditionally outputting ANSI colour escape sequences even when the output isn't the terminal.
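The conventional fix is to check whether stdout is actually a terminal before emitting escape sequences; a minimal shell sketch:

```shell
# Only colorize when stdout is a terminal; piped or redirected
# output stays plain so downstream tools can parse it
if [ -t 1 ]; then
  red=$(printf '\033[31m') reset=$(printf '\033[0m')
else
  red='' reset=''
fi
printf '%sFAIL%s: widget test\n' "$red" "$reset"
```

Well-behaved tools usually pair this with a `--color=always|auto|never` flag so users can override the detection either way.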

Honestly that's the drawback of unix-ish interfaces. Don't get me wrong, text first is usually a sane default, but there is also value in an interface like Powershell, where you would write your program as a Cmdlet that returns an object-like response.

worst is git by far.

i'm used to it now but when I was learning, "what the fuck did `git reset --hard` do? where is --soft? Is there a no-flag version? What the fuck did I leave mercurial for. God damn I miss svn."

I still kind of miss Mercurial and TortoiseHG.

The worst thing about git is how commands are inconsistently overloaded.

`git reset` both unstages files (opposite of `git add`, straightforward enough) but also can move the current HEAD pointer to an arbitrary commit. You’d think it could then also discard unstaged changes, but nope, that’s `git checkout -- …`, which incidentally has nothing to do with `git checkout`’s other functionality of switching branches.

There are so many other examples of this. `git rebase` can not only rebase but also combine and reorder commits via `rebase -i`, which seems totally unrelated to rebasing. `git branch` and `git checkout` have overlapping but also distinct functionality. I could go on for hours.
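Those overloads can be watched in action in a throwaway repo (git assumed installed; the user.name/email settings exist only so the commit is allowed):

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo

echo one > f.txt
git add f.txt
git commit -qm first

# git reset with a path unstages it (the opposite of git add)...
echo two >> f.txt
git add f.txt
git reset -q -- f.txt        # still modified, just no longer staged

# ...while discarding the working-tree change needs git checkout --,
# which has nothing to do with checkout's branch-switching role
git checkout -- f.txt
cat f.txt                    # -> one
```

The newer `git switch` and `git restore` commands were added precisely to split these overloaded verbs apart.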

For me, the fact that so many people really like git makes it worth recommending a cheat-sheet (or putting up with it) for the rest of us, just to stay compatible and have fewer tools to deal with and learn overall. But yes, there are reasons some prefer svn, hg, or fossil. There is also "got" (currently on OpenBSD, at least) and other convenience tools.

I also found myself making myself a "gh" script to make finding/opening the right git manpage easier/quicker.

I always thought the UI of "git commit --interactive" was at a sweet spot of being both helpful and efficient, once I got familiar with it the first time. It is easier to remember than command-line options. Related UI details in another comment on this page.

Some day I will burn a day of my life to learn it inside out and be done with it. Meanwhile I use magit.

Mercurial was much better IMO but people argued about performance, because, you know, the CPU is the bottleneck when I merge code. How much time has been wasted navigating Stack Overflow?

100% agree with this. The nomenclature of the commands always felt too similar for different things to me, i.e. reset vs revert, checkout vs fetch vs pull. Feel like there could be much more descriptive and distinct names.

If someone comments “git is fine once you get to know it” I might reach through the internet and slap them.

Agree, it is one of the most unintuitive CLIs I have ever used. If it's really just manipulating a DAG, I wonder if a rewrite is possible with this in mind.

Worst: sqlplus by Oracle. No autocomplete, no backspace, no history function. It's like nobody ever inside Oracle ever had to use this craptastic piece of software to accomplish anything.

EDIT: Oops backspace works. Might have been a combination of putty and sqlplus. Nevermind.

A lot of people forget that the default shells of many *nix systems through at least the mid 90s did not support any of these. No line editing, no interactive history (primitive !-based history doesn’t count), no completion.

Occasionally I’ll still encounter an environment that only has a minimalist implementation of sh (e.g. some busybox builds) and be totally stymied. It amazes me how much people accomplished on old *nix systems with such abysmal usability.

Sure but it's been over 20 years. Readline already existed back then and sqlplus is still the default command line tool for their DBMS.

The best: an old CD player called Workbone on Slackware. It made great use of the number keys for pause/play/ff/rw/eject.

The worst: I used to work with an awful custom-built industrial computer.

You had to enter the 10-character alphanumeric codes of 98 rail cars into a command-line interface that didn't permit backspacing. One mistake. Car 97. Do it all again.

Many of the worst ones I've used have been embedded in physical devices - ethernet or fibre channel switches, disk arrays or controllers, etc. I've been around long enough that I don't expect every CLI in the universe to have things like history, command line editing, or autocomplete, so that's OK. On the other hand, many of these interfaces are wildly inconsistent and that bugs me. Some commands are object-verb and some are verb-object. Some are abbreviated, some are verbose. One command produces an identifier in format X, but the next command consumes it in format Y so you can't even use your terminal's copy/paste function. Excessive modality is another common problem in these interfaces. Which sub-sub-sub-mode was the command I needed in, again? Can't know without actually entering that mode, and then the next related thing you have to do is in a completely different part of the tree. Using such interfaces directly can be painful, and automating them is often inordinately difficult too.

Why is nobody hating on megacli? It's the only tool I use where I actually need a wrapper script. There are other horrible command line experiences, but nothing anyone here listed I would describe as horrible...

Megacli is definitely one of the examples I was thinking of when I said many of the worst CLIs are embedded into physical devices. What a piece of trash.

The Windows Resource Kit binaries come to mind. They were distributed together in the same ZIP, but:

1. They used differing return values for success and failures. Sometimes, a 0 was a success, sometimes 1 was a success, and sometimes only 4 was a failure, but everything else was a success.

2. They used different command-line switch formats. Some utilities used a hyphen, others used a forward slash.

ps(1). It has two incompatible theories of options from the two parent tools that spawned it, both equally horrible. ps -e f is not the same as ps -ef or ps ef. Also, implementations vary wildly across platforms.

Two of my favourites are Watson and Homebrew. Very well thought out.



Watson seems interesting, I’ll take a look at that. Thanks for sharing!

I detest the MikroTik cli. I suppose the cli itself isn't terrible, but the fact that it's coupled with the mikrotik scripting language leaves a very bad, nearly rotten, taste in my mouth.

Does something simple like htop count? I believe there's a bunch of other good stuff that uses ncurses as well.

To me, openssl and nmap are the worst.

Kubectl is among the best: commands make sense and have internal logic, interface is discoverable, output is configurable and supports different formats.

Nah. Kubectl is so verbose that it reminds me of typing xslt in my younger years.

k9s would save us all from carpal tunnel syndrome, but it's a curses UI, not a CLI.

openssl reminded me of gpg, wow, this one is horrible, and manuals don't help at all.

I'm rather fond of pgcli at the moment, though I haven't used it long enough to get a brilliant feel for it yet.

Pgcli is amazing. It has everything you need, including vi mode!

I find it very easy to remember many terrible command line user interfaces, they are already mentioned in the discussion, but very hard to name a single great one.

Maybe a good command line user interface is one you don't notice nor remember. It just works smoothly. So I will start with cat.

Maybe cli's aren't good for interacting with computers? Heresy I know. ;)

Anything with the `command --global-flags subcommand --sub-flags command --flags parameters` shape.

Git does this but is workable because it's only two levels deep and you don't usually need to specify flags; the defaults are usually OK.

R's dplyr is one of the best interfaces I have ever seen for manipulating data. I think it is one of Hadley's biggest contributions to the world of data science. It feels like my brain can breathe out dplyr code.

anything that doesn't have (preferably zsh) shell completions.

I mostly discover flags with tab completions and the help attached to them (maybe that's only a zsh feature; I'm not sure if other shells show help for flags on completion).

Bonus points to very smart completion, e.g. with kubectl, if I do `kubectl -n foo get pods [TAB]` it will complete pod names from the namespace foo, not from the default/current namespace.

Tangentially, I'm continually surprised at the lack of trivial GUIs for CLI tools. Like, command builders (output is a command you can run/paste into terminal) with procedurally-generated (or hand-built but without much design consideration) widget interfaces - a simple checkbox or line input for each option, with helpful concise tooltips. Training wheels for CLI.

I worked in a campus bookstore with an old terminal system to access the inventory database. It was a simple text interface. What I loved about it was that the commands to navigate were quick to learn (key was first chars in command) and the response time was instant. So you could jump to any part of the system with a few keystrokes ingrained in muscle memory.

Do you have more details about software?

I don't have details about that software, but there was something which I remember from the early 90's or so: software our church had for clerks -- "MIS" for membership info, and "FIS" for finance. A clerk would use one or both of those weekly or more, and they fit the above description perfectly. I missed them when they moved to a graphical UI.

Similarly, I always thought the UI of "git commit --interactive" was at a sweet spot of being both helpful and efficient, once I got used to it.

(All the above inspired the UI of the knowledge manager I wrote/provide at http://onemodel.org -- AGPL, I use daily and rely on it for many things, currently requires user to perform postgresql installation & upgrades; hoping to move to sqlite someday, when health allows. But most people, especially non-keyboard-oriented people, probably wouldn't like its UI. It is perfect for me: very efficient/effective and everything you need to know is on-screen.)

And vim & tmux are things of beauty, after the initial investment (ongoing, for vim).

I recommend reading this site: https://clig.dev/

The combination of mosh and screen has been a big productivity booster for me. The shell alias server1='mosh root@server1 -- screen -xRR -D' keeps the connection forever even if I change networks.

The only pity is that mosh breaks compatibility with earlier versions once in a while.

Why use screen when tmux, an improved version, exists?

screen vs tmux is like vim vs emacs. A matter of taste.

Why do you prefer screen?

I generally dislike interfaces which don't react well to terminal resizing, usually because they're trying to do something fancy with progress bars.

rsync is probably my least favourite. I literally don't think I have ever correctly written a command that does what I intended on the first attempt.

I do like anything that's a nice well-structured representation of an API though – things like the AWS CLI are pretty good in practice, IMO.

Surprised to see nobody calling out ffmpeg as one of the most hostile UIs anywhere

It's like a mean rubik's cube, but so so useful!

> worst command-line interfaces

If you mean TUI wise as in curses, nearly all of them.

Command line parameters however, there are so many examples. Tar, unrar and unzip take the cake, all in the same category of tools. "unrar x" whatever it is, I always have to look it up. dd because of the way its parameters are specified: dd if=/dev/zero of=foo.bar bs=4096 count=1024 . It's all historical and I can live with it, but many of them at least have the saving grace that they have excellent manpages or you've used them so often it becomes second nature.

There should be a special place in hell for tools that combine short and long style command line options with a single dash. Like "foo -b a -r -baz 123" with "-baz" being a single option, because I will automatically add an extra dash there out of habit.

Coworkers discovering ncurses or some library around it tend to go on a TUI frenzy for a while, and it inevitably ends up being some convoluted mess nobody wants to use. I'll admit there are a few indispensable curses-based tools, for instance top (and some variants along that line), but they're few and far between. At most dialog comes to mind for dealing with prompts for end users, but I personally abhor it in anything but a setup or installation context.

Those very same coworkers also make tools with indecipherable command line options, often because they don't know the language they're working in has a standardized option parser library or module.

> best

I don't remember which tool it was, it could be "crm" (for failover, think like heartbeat and pacemaker) but I might be mistaken. It's been years since I've used it. The command line had options like so:

tool [options] section action resource_src resource_tgt

But the best part was, if you just started the tool without any options, it'd drop you into a pseudo-shell. It'd show you a blank prompt ending in ">". If you then typed "section" you'd end up with a "section>" prompt.

I remember for what I was doing with it, it felt really intuitive. At any point I could type "help section" and it'd list the available actions for that section, together with short example.

I've forgotten the real syntax, but you'd have commands like:

tool node status

tool node standby servername

tool service stop dns

tool service migrate dns secondary

If you were in the pseudo-shell, you could do:

$ service

service$ migrate dns secondary

service$ up

$ node

node$ status

node$ exit

lazygit and lazydocker - both are really great

Best: darcs

Best: probably that of Docker. https://github.com/docker/cli

Why: it didn't force you to read man pages or look up documentation, but instead allowed every command to explain what it does to you, either when you'd run it with --help, or just no parameters (in case it expects any). Furthermore, invocations of these commands weren't just a long string of arguments, but rather commands that are logically grouped and can essentially be navigated as a tree. All of that made it extremely useful and pleasant, at least in my eyes.

It just feels like it's made to actually be used by developers and to help them as much as possible. Whether you agree with me on that or not, I suggest that you have a look at this excellent talk by Dylan Beattie, "Life, Liberty and the Pursuit of APIness: The Secret to Happy Code", which talks more about the discoverability of systems and the developer experience: https://www.youtube.com/watch?v=lFRKrHE8oPo

Nowadays, you can actually use something like Typer for Python to create similarly useful interfaces, which I strongly advise you to have a brief look at: https://typer.tiangolo.com/


  $ docker
  Usage:  docker [OPTIONS] COMMAND
  A self-sufficient runtime for containers
    ... (a list of items)
    -v, --version            Print version information and quit
  Management Commands:
    ... (a list of items)
    image       Manage images
    ... (a list of items)
  Run 'docker COMMAND --help' for more information on a command.
  To get more help with docker, check out our guides at https://docs.docker.com/go/guides/
  $ docker image
  Usage:  docker image COMMAND
  Manage images
    ... (a list of items)
    ls          List images
    ... (a list of items)
    pull        Pull an image or a repository from a registry
  Run 'docker image COMMAND --help' for more information on a command.
  $ docker image pull
  "docker image pull" requires exactly 1 argument.
  See 'docker image pull --help'.
  Usage:  docker image pull [OPTIONS] NAME[:TAG|@DIGEST]
  Pull an image or a repository from a registry

  $ docker image pull alpine:3.15
  3.15: Pulling from library/alpine
  59bf1c3509f3: Pull complete
  Digest: sha256:21a3deaa0d32a8057914f36584b5288d2e5ecc984380bc0118285c70fa8c9300
  Status: Downloaded newer image for alpine:3.15

  $ docker image ls
  alpine       3.15      c059bfaa849c   11 hours ago   5.59MB
The worst: tar

Why: https://xkcd.com/1168/

In short, it's the exact opposite of the previous example. Frankly, without memorizing the flags, I still have no idea how to work with archives with it. Say I want to create a compressed archive with it.


  $ tar
  tar: You must specify one of the '-Acdtrux', '--delete' or '--test-label' options
  Try 'tar --help' or 'tar --usage' for more information.

  $ tar --usage
  Usage: tar [-AcdrtuxGnSkUWOmpsMBiajJzZhPlRvwo?] [-g FILE] [-C DIR] [-T FILE]
              [-X FILE] [-f ARCHIVE] [-F NAME] [-L NUMBER] [-b BLOCKS]
              [-H FORMAT] [-V TEXT] [-I PROG] [-K MEMBER-NAME] [-N DATE-OR-FILE]
              ... (a really long list of items)

  $ tar --help
  Usage: tar [OPTION...] [FILE]...
  GNU 'tar' saves many files together into a single tape or disk archive, and can
  restore individual files from the archive.
    tar -cf archive.tar foo bar  # Create archive.tar from files foo and bar.
    tar -tvf archive.tar         # List all files in archive.tar verbosely.
    tar -xf archive.tar          # Extract all files from archive.tar.
   Main operation mode:
    ... (a list of items)
    -c, --create               create a new archive
   Operation modifiers:
    ... (a list of items)
   Local file name selection:
    ... (a list of items)
   File name matching options (affect both exclude and include patterns):
    ... (a list of items)
   Overwrite control:
    ... (a list of items)
   Select output stream:
    ... (a list of items)
   Handling of file attributes:
    ... (a list of items)
   Handling of extended file attributes:
    ... (a list of items)
   Device selection and switching:
    ... (a list of items)
    -f, --file=ARCHIVE         use archive file or device ARCHIVE
   Device blocking:
    ... (a list of items)
   Archive format selection:
    ... (a list of items)
   FORMAT is one of the following:
    ... (a list of items)
   Compression options:
    ... (a list of items)
    -z, --gzip, --gunzip, --ungzip   filter the archive through gzip
   Local file selection:
    ... (a list of items)
   File name transformations:
    ... (a list of items)
   Informative output:
    ... (a list of items)
    -v, --verbose              verbosely list files processed
   Compatibility options:
    ... (a list of items)
   Other options:
    ... (a list of items)
  Mandatory or optional arguments to long options are also mandatory or optional
  for any corresponding short options.
  The backup suffix is '~', unless set with --suffix or SIMPLE_BACKUP_SUFFIX.
  The version control may be set with --backup or VERSION_CONTROL, values are:
    ... (a list of items)
  Valid arguments for the --quoting-style option are:
    ... (a list of items)
  *This* tar defaults to:
  --format=gnu -f- -b20 --quoting-style=escape --rmt-command=/usr/lib/tar/rmt.exe

  $ # Copied from the Internet, because the documentation is overwhelming
  $ tar -vczf new-archive.tar.gz ./files-i-want-to-archive

  $ # Consider the full format instead, maybe we should actually use the full parameters more often?
  $ tar --verbose --create --gzip --file=new-archive.tar.gz ./files-i-want-to-archive
In short, using tar does not inspire joy and it feels overcomplicated no matter how you look at it, possibly because creating archives is a complicated domain (though the zip tool might not necessarily support that claim), or because the tool has grown over time and no longer does just one thing well.
