
Maybe We Should Stop Creating Inscrutable CLIs - kmdupree
https://www.philosophicalhacker.com/post/towards-readable-clis/
======
chubot
GNU utilities have addressed this problem for a long time. Maybe he's on OS X?
You can install them there though. You don't need anything like PowerShell.

His

    
    
        cut -sf 2 -d \| 
    

can be written

    
    
        cut --only-delimited --fields 2 --delimiter '|' 
    
    

Likewise

    
    
        xargs -L 1 -I % jira attach rm %
    

can be rewritten

    
    
        xargs --max-lines=1 --replace=% -- jira attach rm %
    

These options are easy to find with cut --help and xargs --help, or 'man cut'
and 'man xargs'. (Though honestly it's weird that xargs doesn't accept
--max-lines 1, only --max-lines=1 ? )
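To make the long forms concrete, here's a quick sketch with invented input standing in for the jira output (GNU coreutils/findutils assumed; these long options don't exist in the BSD/macOS versions):

```shell
# --only-delimited (-s) drops lines that contain no delimiter at all
printf 'a|b|c\nno delimiter here\n' \
  | cut --only-delimited --fields 2 --delimiter '|'
# prints: b

# --replace (-I) substitutes each input line into the command,
# running it once per line
printf 'one\ntwo\n' | xargs --replace=% echo item %
# prints:
#   item one
#   item two
```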

---

I plan to add a mode to [http://www.oilshell.org/](http://www.oilshell.org/)
to support longopts in all the builtins, e.g. instead of

    
    
        read -r -d ','
    

you could write

    
    
        read -raw -delimiter ','
    

I wrote a blog post last night that shows other benefits you can try now:

 _Oil's Stricter Semantics Solve Real Problems_
[http://www.oilshell.org/blog/2019/08/16.html](http://www.oilshell.org/blog/2019/08/16.html)
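(For readers unfamiliar with the short flags in that `read` example, a minimal bash sketch of what they do:)

```shell
# bash's read: -r = raw mode (backslashes not special),
# -d ',' = read up to the first ',' instead of a newline
printf 'first,second' | bash -c '
  read -r -d "," field
  echo "$field"
'
# prints: first
```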

~~~
Aqueous
In the article he talks about the longer flag names. But the commands
themselves are inscrutable. 'xargs'? 'cut'? In neither case can you accurately
infer from the name what it does. Definitely a good pattern to just name
things what they do.

~~~
dang
That's only the case for an operator that is rarely used. Once it becomes a
primitive of your language, a heavier name gets in the way. You've integrated
its meaning now, so you no longer need it explained, but the cost of the extra
length continues forever. This is why we don't write, say, "x multiply y".
Given this tradeoff, I'd say xargs and cut are good names for those commands:
rich enough to convey a large portion of what they do, but short enough to be
wieldy and—critically for shells—not to take up too much of a line.

~~~
liability
I think that argument made sense years ago when people thought completion
systems were slow or bloated, but on modern computers anybody who still thinks
that way is antiquated or weird. The defaults shells are shipped with are a
problem of course, but that could/should be addressed by distro packagers
IMHO.

Consider the long identifiers used in various modern lisp dialects for
instance, particularly scheme dialects. It doesn't bother people because Emacs
and vim both have completion systems.

A bigger issue with renaming all these utilities is it would negatively impact
people who've already learned them. Alienating your existing userbase to
appeal to newbies might be in-vogue these days, but I think it's crap.

~~~
Izkata
Waiting for a completion system breaks the flow of thought, no matter how fast
it is. After a certain point, they tend to start getting in the way.

~~~
liability
Completion fast enough to be faster than a human's reaction time has been
feasible for many years now. Furthermore there are completion systems that let
you type discontiguous parts of your target word, essentially giving you the
best of both worlds.

~~~
Izkata
To truly be fast enough not to break the flow of thought->text, it has to (for
me) be ~2 tokens ahead of where the cursor is, not completing the current one.
And even then I doubt it'd work completely, because it requires rapid context
switching to check the completion and pick the right one.

------
dwheeler
This seems to be primarily an argument that "CLIs should support GNU-style
double-dash options". I agree with that argument, and in most cases it's easy
to add support for them. Most option-processing systems support double-dash
(long option names), and I find them much easier to explain later.

The problem is that, at least historically, many of the *BSD developers, and
thus the POSIX standards community, strongly resisted standardizing support
for double-dash. So while utilities are _allowed_ to have long option names,
the POSIX standard doesn't directly support them and doesn't standardize them.
That means that if you want to write _portable_ invocations of common tools,
the only form that always works is the short (single-letter) names.

It may be time to revisit this decision. If the *BSD developers (and so on)
are now willing to accept long names, and the POSIX standard is finally
modified to support it, that would eliminate at least one roadblock to
actually using them.

~~~
vimax
Every time I wind up doing

    
    
      > java -v
    
      > java --version
    
      > java -version

~~~
lerey
Don't forget -V (since some applications use -v for verbose)

~~~
vimax
This is java, it would be

-D-XX:VerboseRuntime=true

~~~
needusername
-XX is for implementation-specific VM flags and generally uses + or -

-D is for system properties and uses key=value

so it would either be

-XX:+VerboseRuntime

or

-Djdk.runtime.verbose=true

------
dcolkitt
One thing the author's missing is a consideration of how often a system is
used vs. how often it's learned. Let's just take one example from the article.
Would it have been better if `sed` had been named `streamEditor` from the
start?

sed was written in 1973. Think about that: it's been around for half a
century. It's regularly used by millions across generations. Imagine how many
countless man-hours have been saved by naming it "sed" instead of
"streamEditor". And I'm not just talking about typing code, but also reading

Yes "sed" is pretty inscrutable if you don't know Unix. But pretty much any
Unix developer or sysadmin instantly recognized "sed" the same way they would
a common English word. Very few people just use the Unix command line tools a
few times, then forget about them. Once you learn Unix, chances are you'll
keep using it for decades. And for those people, their brains almost certainly
parse "sed" much faster than they would "streamEditor".

So, I think here's the lesson. The utility of terseness is very much related
to the relative balance of expert users vs. novice learners. If you expect
your system to mostly be repeat power-users, then terseness is very good.
Systems that are around for decades, used widely by thousands, and
general-purpose frameworks tend to fall in this category. In contrast, a high
percentage of your users' man-hours may just be people getting acquainted with
the system.
Maybe it's an application that solves some niche corner case, and most people
use it once or twice then forget about it. Or maybe it's some internal
software, where corporate turnover means that most devs working on it at any
given time are neophytes. In this case verbosity and explicitness is a very
good thing. You're not trying to cater to power-users. You want things to be
as clear as possible, even if it means a lot of typing and spelling errors and
wasted screen space.

The challenge here is that you mostly need to make these decisions at the
development stage. You have to try to predict what your user-profile will look
like, before you actually have any users. I doubt the original authors of sed
could have ever imagined how popular and long-lasting it would become. But
given that it has, there's no doubt they made the right choices.

~~~
bsaul
On the other hand, imagine if the original computers on which Unix commands
were designed weren't restricted to 80-character terminals with a few
kilobytes of memory and almost no storage or CPU. I'm pretty sure the author
would have favored much more expressive commands.

I think the design of those command lines was driven first and foremost by
huge underlying system constraints. Now that we have tab autocompletion in
shells and IDEs, the length of a command pretty much doesn't matter anymore.

~~~
teddyh
> _I'm pretty sure the author would have favored much more expressive
> commands._

The very first compiled computer language was designed to be very expressive
and able to be read almost as English, and be simple to read and understand.
Would this be what you wanted? Would you change your mind when I tell you its
name is COBOL?

~~~
madhadron
> The very first compiled computer language was COBOL

FORTRAN and Lisp both predate COBOL.

~~~
teddyh
Lisp was never compiled that early; that came much later.

But sure, I stand corrected; FORTRAN was technically the first compiled high-
level language.

------
a2tech
That bash code he posted as ‘unreadable’ is perfectly clear to me, and I don’t
even know anything about the jira command line interface.

I think the real problem here is people always have preferred tools for jobs.
The author obviously spends more time with JavaScript and finds that to be
clearer and easier to write. I spend more time with bash and for me, it’s very
clear. It doesn’t make it ‘inscrutable’.

~~~
dwaltrip
You have to admit the bash code is less readable to someone who hasn't used
those particular CLI arguments or flags in a while (or ever) than the
JavaScript version is to someone who doesn't know much about JS or the library
the code uses.

You might argue that this is an acceptable trade-off to make in order to gain
terseness and ease of typing. But that is different from saying it's just a
matter of tool familiarity.

~~~
ddingus
Yes, but how much less?

I find the bit of bootstrapping I need to make sense of shorter length
operators and options pays off when writing and or reading later.

------
overgard
(Arguably?) Powershell seems to take this approach. I have to admit, I like it
philosophically, but whenever I try to use it there's some part of it that
feels icky to me. It might just be microsoft's
ConventionOfOverusingCapitalizationEverywhere though.

Here's an example of what powershell looks like:

    
    
        Install-Module DockerMsftProvider -Force
        Install-Package Docker -ProviderName DockerMsftProvider -Force
        (Install-WindowsFeature Containers).RestartNeeded
        Restart-Computer
    

IDK, it's undeniably easier to read than the Unix version. I have to admit
some part of me just doesn't love how it looks though.

~~~
chrismorgan
For a case like this, idiomatically I believe it’d end up as something like
this in Powershell (though I’ve never seriously used Powershell):

    
    
      Get-JiraAttachment -Ticket {{args.ticket}} | Remove-JiraAttachment
    

… because Powershell pipes are structured, rather than just being strings.

If you were just wishing to emit the attachment IDs, you’d pipe the first bit
through something like `Select Id`.

------
stirner
As other comments have pointed out, it seems like the author is looking for
PowerShell. I have used a fair amount of PowerShell and it always leaves me
feeling like I should just fire up Visual Studio and learn C#. I think that
shell languages _should_ prioritize terseness at the expense of readability,
allowing experienced users to accomplish their goals as quickly as possible.
Any code that you expect others (or yourself) to maintain in the future does
not belong in a shell script.

~~~
solipsism
Couldn't disagree more.

 _Any code that you expect others (or yourself) to maintain in the future does
not belong in a shell script._

Why? A shell script is often the best tool for the job. If your reason is that
shell scripts aren't maintainable, then this is a circular argument basically
amounting to "we shouldn't make shell scripts maintainable because we
shouldn't ever maintain shell scripts because shell scripts aren't
maintainable".

 _allowing experienced users to accomplish their goals as quickly as possible_

Having both short and long versions of each flag seems to give you the best of
both worlds.

~~~
comex
I agree that ideally shell scripts should be maintainable, and that designs
which allow both succinctness and maintainability – like supporting both short
and long flags – are preferable. But there will always be cases where you have
to choose between the two, and in those cases I think a shell should choose
succinctness. For all of the Unix shell’s weaknesses as a programming
language, it’s an incredibly powerful tool for interactive programming,
writing one-liners at the shell prompt. And when you’re trying to program as
close as possible to the speed of thought, every keystroke counts.

~~~
solipsism
_And when you’re trying to program as close as possible to the speed of
thought, every keystroke counts._

And when is that ever important? I've been in development and sre roles for
decades, and I've never been in a situation where I have to program at the
_speed of thought_.

The only time I've seen milliseconds matter is watching movie depictions of
hackers.

------
swagasaurus-rex
People here in these comments are claiming that terseness in CLI commands is a
good thing.

Perhaps there is a way to do one better? I could imagine an interactive shell
that can show you the relevant portion of the man page for that specific flag.

Man pages have never sat right with me; I would love to have a shell that can
show me (intellisense?) the most common flags and options commands are run
with.

The command line is powerful but the barrier to entry is high. Anybody who
does not fathom the complexity hidden behind pages of manuals is doomed to be
overwhelmed by the sheer ignorance and lack of discoverability a shell
provides.

~~~
kmdupree
>Perhaps there is a way to do one better? I could imagine an interactive shell
that can show you the relevant portion of the man page for that specific flag.

I've had a similar idea before. I think this is a great way to think about the
problem.

~~~
oldmanhorton
Just like so many other comments in this thread, you're describing powershell.
There's autocomplete for parameter names in the basic powershell.exe prompt,
but VS Code, VS, and PowerShell ISE (integrated script environment) all show
argument types and docs through intellisense.

------
vnorilo
I think Powershell tried to do something like this article suggests. I
appreciate the efforts, but do agree with the common criticism of verbosity.

Also, it goes to show that no matter how bad aspects of bash are, the power
accumulated over the years is hard to match, and a lot of that power is
strongly coupled to the arcane and terse conventions.

~~~
6thaccount2
Powershell is almost good, yet it misses the mark in about a dozen ways.

A lot of very simple things are harder to do (text is easier than objects up
until a point) since Powershell treats everything as objects and LinuxCLI/Bash
does everything as text being piped around to specific commands.

The biggest complaint I have with Powershell is that it is so obnoxiously slow
at parsing text files that it just can't do basic scripting tasks on even
medium-sized files (~100MB) without tons of StreamReader[] overhead or without
calling a pre-compiled C# binary directly. This is the biggest fail. Nearly
all the obvious ways of appending to files or parsing them stop working once
the file size gets bigger than tiny. This isn't a problem with Linux commands
that just pipe text to highly optimized C binaries. I'd drop Python for 90% of
my scripting tasks on Windows if Powershell were faster in this area, but it
is just painfully slow (if all the obvious ways of doing something in
Powershell take ~5min when even Python can do it in ~4sec, you have a
problem).

However, you can usually write some very beautiful and easy to read code by
using the built-in objects once you learn them. I also like how powershell's
functions allow you to create commands with parameters in a super easy way so
you can easily build DSLs in a way. Everyone feels differently, but calling my
commands in this manner (InvokeCommand -param1 "blah") reads a lot more
naturally to me than (Object.Method(param1)). Also, Powershell can handle
things like filesystem interaction in a way that doesn't feel like a bolted on
afterthought as Python's does.

~~~
vnorilo
Thank you for the insight. I don't do a lot of powershell daily, but the last
time I had a big batch job I was disappointed in how clunky it was to make a
foreach-object -like pipeline utilize a sensible number of processes. I have
forgotten the details, but I think I needed some sort of add-on that split
jobs across processes. Also, the pipeline wasn't _pipelined_ in the
parallelism sense.

~~~
6thaccount2
No problem. I'm certainly not an expert, but that is mainly due to the
deficiency I mention above. I'd use it for a lot more use cases if it could
handle those scenarios.

I think Powershell "jobs" can help run things in parallel, but I've never done
more than just read about it. The Powershell in Action book (3rd edition)
covers that functionality I think. The first chapter to that book is pretty
awesome.

My dream as someone forced to use Windows would be for Microsoft to use some
dollars to make Powershell a one stop shop for data science, simple games
(think commodore sprites), numerical computing...etc. That sounds a little
wacky I know, but I think people would appreciate every computer shipping with
a powerful environment like that. Since Powershell is built on .NET, surely
someone could make a library for Powershell that abstracts away some of the
.NET complexity to where I can just do something like:

Process-Data -data "blah.csv" | Create-PieChart -output "chart.png"

Not having to install Anaconda and import a ton of libraries on the target
computer would be helpful. Note that I'm not saying to reproduce all of Matlab
in Powershell, but putting some of the most popular routines as built-in
commands would be pretty awesome.

~~~
jodrellblank
I like the sound of your dream, but I don't think there's much commercial
return on making a fun to play with 8-bit home computer environment. That's
what Raspberry Pi wanted to be, and instead it became "cheap web browser host"
and geek paperweight. The PS team is small, their real goal seem to be to
encourage you to Azure through managing it with PS from any OS.

It would be up to "the community" to make modules to do this kind of thing,
which would be possible if anyone wanted to, but unlikely to ship with
Windows.

> putting some of the most popular routines as built-in commands would be
> pretty awesome.

Microsoft's push is to avoid more "bloat" by leaning more heavily on optional
modules installed from the PS Gallery. But `group-object` is a builtin for
doing a kind of SQL GROUP BY operation, and Measure-Object just gained a
-StandardDeviation option, and there's now ConvertTo/From-Json, and
ConvertFrom-Markdown. If there is a basic routine that would be popular and
fit many use-cases, you could request it at
[https://github.com/powerShell/PowerShell/issues/](https://github.com/powerShell/PowerShell/issues/)

There is a GPL licensed scientific computing framework called Accord.NET, it's
not wrapped for PS but can be used almost directly, and some of these others
might be viable as well:

[http://accord-framework.net/](http://accord-framework.net/)

[https://www.reddit.com/r/PowerShell/comments/5ijcj7/kmeans_c...](https://www.reddit.com/r/PowerShell/comments/5ijcj7/kmeans_clustering_with_accord_framework_net/)

[https://en.wikipedia.org/wiki/List_of_numerical_libraries#.N...](https://en.wikipedia.org/wiki/List_of_numerical_libraries#.NET_Framework_languages_C.23.2C_F.23.2C_VB.NET_and_PowerShell)

~~~
6thaccount2
Thanks for your thoughts. You're probably right about the pipe dream.

I agree that there isn't much commercial incentive, but I honestly don't see
why major OS providers can't include some basic graphics primitives in the OS.
I'm not talking about embedding the unreal engine or anything. It could be
used for a lot more than primitive games btw.

I understand the desire to avoid more bloat. I can't believe Windows 10
requires the storage and RAM that it does.

The XML & JSON objects are pretty cool.

I'll check out Accord.NET framework too.

------
mamcx
The problem is that the shell is frozen in time. It has not evolved and it
shows.

The shell is just BAD. Anti-USER and even anti-APP! Composing commands in
shell is just BAD. It's the worst API you could cobble together in a hurry.

--- I love the interactivity of it, but frankly FoxPro DOS 2.6 had a MUCH
better experience. And in fact many other, much better interactive experiences
exist (Smalltalk, JS dev tools, Jupyter notebooks, etc.).

I have dreamed about this, and I think it would be nice to have a shell that:

1. Instead of just a BLOB of binary-hopefully-it's-text-maybe, interfaces with
SHAPES and TYPES:

    
    
        enum Data {
            Stream(Data),
            Lines(Scalar), //utf8, bytes, ints, bools, etc...
            Table(Header, Data),
            Tree(BTreeMap(Data, Data)),
            Graph(Nodes< Data >)
        }
    

This abstract interface would support most of the stuff we need. Then with
mime-types you could request "Tree as JSON" and get that.

This is similar to HTTP headers, but instead of HTML and such, you ask for a
shape and its type. Of course some utilities will only support some of them
and can reject others. And at minimum they can return BLOB for
interoperability with old shell utils.

When returning, you get the mimetype/shape, so you'll know the blob is really
a JPG and can use that fact for display purposes and type inference.

With types and shapes you get some static type validation.

2. Rich output (images, graphs, etc).

3. Uniform help and API discoverability, like swagger.

4. A richer and more uniform API for the language that glues it all together.

For example, it's nuts that each utility must reimplement sorting, filtering,
etc.

Instead, the shell could have filter/map/group/etc. as built-in commands,
together with shapes and types:

    
    
        csv "sample.csv" filter: .. group: .. 
        |> map:..
        |> zip "sample.zip"
    

i.e. it's like a database interface

~~~
ale22
You mean using something like TypeScript as a shell language? That could work:
it would leverage the JavaScript runtime and yet provide a rich type system
and its benefits.

~~~
mamcx
My focus is not on the language itself but on how utilities interface. The
shell is a place for a light scripting layer and for making apps in many
languages talk to each other.

The focus instead is on what the utilities say to each other. As in a REST
interface, you declare the kind of data (table, tree, lines of text, a binary
blob) and this unlocks which combinator commands are available (so for
tables/lines you get relational commands, for example), and these are global.
The utilities are mostly concerned with their inputs/outputs.

Probably, for efficiency, the utilities would get a REQUEST call like:

    
    
        GET /list: WHERE name = "hello"
    

so they have the opportunity to do the processing internally.

------
ArmandGrillet
"Maybe we just haven’t rethought CLIs since their inception in an
environment". And there is a reason for that: backward compatibility. CLIs get
used in scripts, and the worst that can happen is to see the script breaking
because the command changed.

Let's take the example given in the article: it uses sed (created in 1974),
cut (created in 1985), and xargs (created before 2001). All these tools are
older than Jira and I'm quite sure that using a version of sed from 10 years
ago or the latest one with basic flags will give the same result. CLIs are
often ancient pieces of software that started as small projects and grew
because features were requested, while the way they worked could not be
broken, since millions of developers use sed in their bash scripts and expect
it to work on any kind of Linux machine, old and new.

For context, I'm working on a CLI for an enterprise product and we got in
trouble just because we were changing error outputs. You just cannot do like
on a UI where as long as there is a way to perform the same action as before
users will not complain.

The second issue is that a CLI is used in two very different contexts: as an
application by engineers and as a part of scripts by those same engineers.
This makes it tricky as a CLI developer to know what to do in terms of naming,
as a daily user will adopt shortcuts quite fast but a new developer reading a
script using our CLI might not understand the purpose of a command. IMO if a
CLI has good documentation (in-app and online) you should focus on making
your daily users happy, since someone seeing the command in a script can check
the docs online and hopefully understand what's happening in less than a
minute.

~~~
pm215
Random historical note -- xargs dates back to 1977 or 78 --
[https://groups.google.com/forum/m/#!msg/comp.unix.questions/...](https://groups.google.com/forum/m/#!msg/comp.unix.questions/vG2keVnENmo/LkZN5orQrwcJ)
has a post in 1988 from the original author saying they wrote it at Bell Labs
a decade before, and that it actually predates the Bourne shell itself.

~~~
chubot
Wow, nice find :) I'm a fan of xargs, so I saved it here:

[http://www.oilshell.org/archive/xargs-history-
usenet/LkZN5or...](http://www.oilshell.org/archive/xargs-history-
usenet/LkZN5orQrwcJ)

------
llarsson
Humble suggestion: use the terse options when writing iteratively and the long
form in scripts that you save to a file. The latter will likely be read by
others or yourself in a few months from now, and it could be nice as immediate
documentation if you did something fancy with a tool or option that you seldom
use.

For instance, I know the handful of options that I personally encounter and
use for the "grep" program, but the man page lists plenty that I do not
immediately recognize. So typing them out in the script helps a lot with
readability.
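For instance, with GNU grep the two spellings map one to one; a sketch of the same search both ways (flags and the throwaway file chosen purely for illustration):

```shell
# set up a throwaway file to search (for demonstration)
mkdir -p src && printf 'ok\n# TODO later\n' > src/a.txt

# terse form, fine at an interactive prompt:
grep -rniE 'todo|fixme' src/
# prints: src/a.txt:2:# TODO later

# identical search spelled out, kinder to future readers of a script:
grep --recursive --line-number --ignore-case \
     --extended-regexp 'todo|fixme' src/
```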

Going overboard with terseness or verbosity is obviously not helpful to the
(human) reader, regardless of the direction.

------
ale22
I think there is another post on HN right now about how computers are more
like appliances today and no longer as readily programmable as they were
before.

And looking at that problem, you realize that a shell is a huge part of the
tinkerability of an OS and bash has dropped the ball in that regard. It's like
it's frozen in the 70s.

We should be moving towards something like a Lisp machine where the whole
system is exposed through your shell itself. Perhaps we can start using the
Chrome dev tools console as something like a shell; a good first step might be
using TypeScript for the shell.

------
gumby
Short options make it quick and easy to use commands you use all the time.

Your little bashism was quick for me to read because it was compact,
scannable, and used popular programs I use frequently (cut, sed etc). Yes, you
need to look up the options for programs you don't use often but you need to
look them up anyway _because_ you don't use them frequently.

Conversely, the longer example you used had longIdentifiersThatNeedParsing
(and are too long to fit in your fovea) and, worse, are easy to accidentally
confuse with, say, longIdentifiersThatNeedNoParsing -- an error you are likely
not to see when reading the code and an error you are especially likely to
make when your tooling has a pop-up of suggested completions.

The GNU approach of long + short options can help; it would be better if
getopt() would write a completion grammar into a section of the ELF file that
could be mmap()ed and used by readline(), similar to the TOPS-20 JSYS that
inspired it.

------
wazoox
Perl allows this. Perl allows easy string stream manipulations, file
manipulation, and runs external programs easily with backticks or qx(). As
soon as a shell script has more than a single loop and conditional, you can
rewrite it in Perl instead, just as terse, more readable and way faster. Just
use Perl, the Swiss Army Chainsaw.

~~~
vvillena
Yes! Perl is what I turn to when I have a Bash script that is becoming too
big or too complex. It's easy to turn Bash code into bad Perl code that is
still better than Bash. From there, it's easy to create a Perl script that is
more elegant, concise, and maintainable. It's a fantastic glue language that
will tie all the components of your project together neatly.

And if you are thinking "yeah, but then I need to learn Perl", don't worry.
The amount of Perl you need for translating Bash into Perl can be learned in a
few afternoons.

------
c3534l
They have this. It's called powershell and I hate it. If you're going to write
code, use a real programming language.

~~~
chopraaa
Can you elaborate on this?

I've been a huge fan of PowerShell ever since I started using it (over two
years now) and it's my go-to scripting language on my workstation. You can
write integrations in the form of modules - VMware has Power CLI, Azure and
AWS both have great PowerShell modules.

Why would you say it's not a real programming language? It's as powerful as
Bash and seems to do the job quite nicely in most cases for scripting.

~~~
c3534l
Oh. It's certainly better than Bash, but it's still better to use Python or
C#. Powershell makes so many weird decisions and it's so verbose and wordy
that it becomes unreadable in the opposite way of Bash. I don't think the
shell _scripting_ space is desirable. Either you should be typing in commands
manually or you should be writing a program. Powershell and Bash exist in a
weird space in between them and wind up being crappy programs rather than
readable shell commands.

------
madhadron
I remember sitting down at a VMS terminal for the first time (a university
library's terminal had trouble with its OPAC and dropped me into a logged in
VMS shell). In fifteen minutes I had learned how to work with it, had the
machine show me the university's network, and started to feel comfortable. I
cannot imagine a similar experience if exposed to a Unix shell for the first
time.

------
thinkersilver
The poster is holding a line of bash to the standard of code, illustrating
that readability should be the goal and a way of bringing bash commands up to
a standard of readability for something like a PR. Readability is really there
to show _intent_. I would say, though, that if you are bringing this up to the
code standards of today, then it should really be wrapped in some kind of unit
test ([https://github.com/sstephenson/bats](https://github.com/sstephenson/bats))
for it to pass the PR. That would make the code a bit more maintainable, and
it can be integrated as a stage in your CI/CD pipeline.

If we do that, then the intent is clarified by the input and the expected
output of the test. Then the code would at least be maintainable, and the
readability problem becomes less of an issue when it comes to technical debt.

I've done this plenty of times with my teams and it's certainly helped.

~~~
kmdupree
That’s interesting. Hadn’t thought about the testing angle. Thanks for sharing
this.

------
eschaton
Sounds like the author wants the VMS shell with its concept of command tables
and attributes that provide fully consistent syntax and standardized online
help. It even makes it nearly trivial for any utility to provide its own
command line that follows all the same syntax, help mechanisms, and
conventions.

------
craigsmansion
> Consider another way we might express the same functionality in a language
> like javascript

If only someone would write some sort of practical extraction and reporting
language, that would really be a killer app for these types of operations.

I can't believe nobody ever thought of that. Definitely a good idea for a
startup.

~~~
m000
You mean like awk?

------
tomc1985
Ugh, I don't want to type that much at a CLI. Terseness is a virtue

~~~
human20190310
Terseness can be a virtue, but when you arrive at ${#arr[@]} to get the length
of the array _arr_, you're practically dealing with hieroglyphs.
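For anyone decoding the hieroglyph: in bash, ${#...} means "length of" and [@] means "all elements":

```shell
# bash array-length syntax
arr=(alpha beta gamma)
echo "${#arr[@]}"   # number of elements in arr -> 3
echo "${#arr[0]}"   # string length of the first element, "alpha" -> 5
```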

~~~
tomc1985
Which is fine with me. I have always hated how people seem to want computers
to be more human-like. It's a computer, not a human being. I like that this
stuff takes some skill to learn/use

------
vedantroy
I think the larger issue is that shell scripts are intrinsically bad for
writing readable code.

Consider the "cut" line in the shell script vs the "map" line in the JS
script. The map line is easy to understand because it uses a chain of simple
functions (map, split, array access, and lastly trim). The "cut" line has to
cram all of this functionality into a few command line arguments instead of
composing basic building blocks together.

I mean, look at this: what does "only delimited" mean? What does "fields"
mean? I might just be an illiterate boor, but I don't remember what delimiter
means off the top of my head (something like separator, I assume).

cut --only-delimited --fields 2 --delimiter '|'

The JavaScript version is easier to read because it uses fundamental building
blocks in the language that are

1\. Super intuitive. (Split seems pretty easy to understand, especially when
you know the input is a string; similarly, everyone knows [1] means array
access.)

2\. Widespread. You have probably encountered all of those JS functions
before; I'd say the chance of having encountered the "cut" utility is a bit
lower.

Summary: shell just fundamentally sucks because it outsources a lot of
functionality to various executables, each of which has its own command line
arguments. The ideal method is to use a programming language with consistent,
composable building blocks.
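
To make the contrast concrete, here is the same field extraction written twice
in shell: once with cut's packed flags, and once composed from smaller named
steps (the sample line is invented):

```shell
line='KEY-1 | attachment.txt | 2019-08-16'

# Packed: all the behavior is encoded in cut's flags.
packed=$(printf '%s\n' "$line" | cut -s -f 2 -d '|')

# Composed: split on '|' into named pieces, then pick the second.
IFS='|' read -r _key second _rest <<EOF
$line
EOF

# Both produce the field with its surrounding spaces intact:
echo "=$packed="    # = attachment.txt =
echo "=$second="    # = attachment.txt =
```

Neither version trims whitespace; that is what the sed stage in the article's
pipeline is for.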

------
wolco
I feel like in my time we wanted to learn everything. These days people don't
find as much value in learning everything and prefer to take on as little new
knowledge as possible.

~~~
BeetleB
In the current time, there's a lot more to learn. Few people are going to hire
you based on your expertise in bash or UNIX command line utilities.

~~~
wolco
It's a skill that will get you out of trouble, allow you to do things others
can't and because of that it might get you promoted which will get you hired.

~~~
BeetleB
In my experience, while what you say is true, if I were to rank the skills
that "might get me promoted", there are far too many that are way higher than
this.

Conversely, look around you at those who are at your "level" in the company,
or at tech people who are higher than your level. How hard is it to find
someone who does not use sed, awk, cut, etc?

~~~
wolco
I've always been a senior developer, at the highest level, wherever I've
worked. No one I've worked with over the years has ever used those tools. I
do, and it gives me an edge in response time.

But if you want to move up: every role above senior developer involves dealing
with people and managing projects. Those skills get people promoted. More tech
skills don't.

~~~
BeetleB
>Those skills get people promoted. More tech skills don't.

Which was my point.

But frankly, even if I were to rank the tech skills that can get someone
promoted, expertise in sed/awk is pretty low.

------
argd678
Better IDE support would also help here, for example if it could annotate what
each flag means and provide inline help and autocompletion. The nice thing
about shell is that the commands are very short and easy to type at the
command prompt; PS is too verbose and cumbersome for that use case, while more
full-featured languages have all the upsides plus better tooling.

~~~
goatinaboat
PowerShell has abbreviations (aliases) for common commands and fantastic
autocompletion for flags and arguments passed to cmdlets.
Where-Object|Format-Table is just ?|ft, for example.

Also remember that in PowerShell there's very little munging: half of any
serious shell script is sed/cut/awk/tr and so on to munge the output of one
command into the input of the next. That more than makes up for PowerShell's
individual commands or cmdlets having more characters in them.

------
oneplane
I get that some people don't like or invest in it, but I'm fine with the way
the default shells work on macOS, GNU/Linux with coreutils or BSD.

Sure, you can make a version that is usable by people who aren't normally
using it, but then you get the inverse later on, where you need a special
shell for people that use it a lot and want shorter commands...

There is no one size fits all, and instead of trying to bend what's there to
suit a larger group you could simply do what others have been doing and make
more 'entry level' shells for those that want/like/need it. There is no reason
we can't have both.

p.s. the examples on the webpage are super easy to read, but the javascript
ones are probably much more prone to aging and breakage than the bash ones...
I'm pretty sure the JS won't work everywhere, whereas the cut/sed/bash/xargs
ones will work anywhere with the utils in the base system, as they have for
the past 20 years.

------
localhost
It would be interesting to have an option in CLIs that will translate all
short options into long options.

This way you can take somebody's

    
    
      cut -sf 2 -d \|
    

and turn it into

    
    
      cut --only-delimited --fields 2 --delimiter '|'
    

Sadly this would require a convention along the lines of a --whatif
--convert-to-long in every CLI ...
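
Lacking such a convention, one can approximate it per command with a
hand-written flag table. A small shell sketch (the helper name and its table
are invented, and it only knows the handful of cut flags used here):

```shell
# Hypothetical short-to-long expander for cut, driven by a hand-written table.
expand_cut_opts() {
    out=''
    for arg in "$@"; do
        case $arg in
            -sf) out="$out --only-delimited --fields" ;;
            -s)  out="$out --only-delimited" ;;
            -f)  out="$out --fields" ;;
            -d)  out="$out --delimiter" ;;
            *)   out="$out $arg" ;;
        esac
    done
    printf '%s\n' "${out# }"   # drop the leading space
}

expand_cut_opts -sf 2 -d '|'
# → --only-delimited --fields 2 --delimiter |
```

The pain point is exactly the one identified above: every command would need
its own table, so this only becomes practical if the convention ships with the
CLIs themselves.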

------
zilchers
I'd say the counterexample to this is PowerShell - if you've ever used it,
it's quite verbose and can be less than pleasant to write. I think having both
options is best: when you're getting started it's nice to have verbose,
highly self-explanatory commands. As you get more familiar with the
environment, it's nice to move to something more concise.

    
    
        Import-Csv -Path $src | foreach {
            $file = $_.column8
            Write-Verbose "Writing to $file"
            Export-Csv -Path (Join-Path -Path $dstDir -ChildPath "$($file).csv") -InputObject $_ -Append -Encoding ASCII -NoTypeInformation
        }

~~~
jodrellblank
At the interactive shell, the verbosity collapses:

    
    
        ipcsv $src|group column8|%{$_.group|epcsv "$dstdir\$($_.name).csv" -noty -e ascii -v}
    

The bad part is that when writing "good" PowerShell the verbosity is
exponentially worse, and it turns into

    
    
        try
        {
            if (-not [string]::IsNullOrEmpty($_.Column8))
            {
    
                $fullPathName = Join-Path -Path $dstDir -ChildPath $_.Column8
                $pathTestResult = test-path -LiteralPath $fullPathName -ErrorAction Stop
    
    
                # This is a hashtable of the parameters to a cmdlet
                # the only purpose of this 'splatting' is that
                # powershell commands get too long
                # and can't be line-wrapped in any good way
    
                $exportParams = @{
                    Encoding = 'ASCII'
                    NoTypeInformation = $true
                    Append = $true
                    LiteralPath = $fullPathName
                    Verbose = $true
                }
    
                Export-Csv @exportParams
            }
        }
        catch [whatever]
        {
            
        }
    

and on and on and on, ugh.

------
hhsuey
I think bash is terse for at least 2 reasons:

1\. It's quicker to type repeated variations of commands at a CLI.

2\. Memorization of flags isn't as much of a concern, since at the CLI you can
often easily look up the meanings of options and flags with -h or man pages.

It may be slightly more difficult in the beginning, but once you're familiar,
terseness saves significant time.

I would relate shell languages more to vim or emacs in that regard. They have
a higher learning curve but pay off in efficiency in the long term.

~~~
kmdupree
Yeah, shame on me for not mentioning the "terseness is faster" argument.

Do you think that terseness saves _significant time_ as compared to just tab-
completing longer flags?

~~~
hhsuey
It's the only benefit I suppose.

I neglected to think about tab-completion, but I see you mentioned it at the
end. I've never used it for bash. It really depends on whether I can guess or
remember the first letter(s) of the option. If it's a command I am new to, I'd
still need to print out the available options anyway.

------
geoka9
> Programs must be written for people to read, and only incidentally for
> machines to execute.

That's true of programs that are written to be used and maintained over a
period of time. Shell programs like that are mostly system configuration and
installation scripts; they are not written that often compared to throw-away
one-liners that most shell users write every day. I'd hate to use even a
regular programming language for the daily tasks, let alone a verbose one.

------
bediger4000
I'd buy the argument that using single-letter flags is confusing, except...

for the huge popularity of hotkeys in GUIs, etc., which have the additional
demerit of not being particularly discoverable or guessable. We accept
non-discoverable hotkeys in GUIs, so this argument probably results from
unfamiliarity.

------
chewbacha
Anyone else put off by camel case at the command line?

Jumping back and forth between JS files and functions in camel case and
Unix-standard snake case is more confusing for me than memorizing the couple
of highly used flags.

I know I'm being subjective, but we're talking style, so it's all kind of
subjective.

~~~
Riverheart
Powershell is case insensitive. Use the style that resonates most with your
heart.

------
0x8BADF00D
On the one hand, I understand how it might be harder to read single-char
cmdline flags. But it does save keystrokes. Nobody does

    
    
      for (int iterator; iterator++; iterator < some_val)
    

That would be quite silly.

~~~
BeetleB
>for (int iterator; iterator++; iterator < some_val)

Not sure what you're trying to say. That loop will likely not terminate.

~~~
0x8BADF00D
That’s my point. That code is broken but still readable. The OP is trying to
equate readability with correctness.

------
randallsquared
Hm.

    
    
        | sed 's/ //g'
    

and

    
    
        .trim()
    

only do the same thing if there are no interior spaces, which filenames for
attachments might well have.
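
A quick shell demonstration of the difference, using an invented attachment
name with interior spaces:

```shell
name='  my attachment.txt  '

# sed 's/ //g' deletes every space, interior ones included:
squashed=$(printf '%s\n' "$name" | sed 's/ //g')
echo "=$squashed="      # → =myattachment.txt=

# A trim()-like equivalent strips only the edges (POSIX parameter expansion):
trimmed=$name
trimmed=${trimmed#"${trimmed%%[! ]*}"}   # remove leading spaces
trimmed=${trimmed%"${trimmed##*[! ]}"}   # remove trailing spaces
echo "=$trimmed="       # → =my attachment.txt=
```

To match trim() exactly, sed would need something like
's/^ *//; s/ *$//' instead.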

------
zzo38computer
I happen to like the short options, though (that is at least my opinion). (Of
course, some programs can do both, but some programs only have long options.)

------
mr_vile
The cut, sed, and xargs utilities are available on all major unix derivatives
and have a pretty consistent set of command line flags... I don't see why we
need to suddenly slap camelCasedParameters on everything just because the
author needed to read some man pages...

------
BeetleB
This is part of the reason I use xonsh instead of bash.

------
fredrik_skne_se
PowerShell's way of doing CLI is the best way yet.

------
tyzerdak
Same shit.

------
tempsolution
This might be one of the core reasons why Linux never took off as a desktop
solution. Also, looking at the comments here just shows how out of touch
people are with the real world...

If you relish terse commands that don't mean anything to anyone who hasn't
spent their last 20 years working on the command line, then go for it. Just
realize that most of us (talking about the already small population that is
software developers) are not interested in spending any time dealing with this
junk. This is not to mention the other 99% of the population who don't even
know what Linux or Bash is, and good riddance for them...

The JavaScript code the author posted is immediately intelligible to anyone
who has been keeping in touch with programming language developments over the
past years. It doesn't even matter if you come from C#, Python, JS, Java, etc.
It should be familiar to all of them. You might not know exactly what it does,
but it seems oddly familiar.

I may be making a leap here, but I would go as far as saying that this very
attitude of command-line elitism is the primary reason why Linux has had zero
chance to take hold in the desktop space. The developers have no clue why
"normal" people would not want to use the command line (and I am including
myself among them), and their efforts at creating a GUI (just look at the junk
that is KDE, Gnome, and Unity... proudly sold as GUIs) have been so out of
touch with reality that I can't even.....

Anyway, to each their own. I mostly use the command line to start a build
process. While I am able to do a lot of more advanced stuff, including
properly using VIM, I actually regret spending the time on learning it. These
days I write most of my "bash" scripts in Python instead (just to clarify: I
write scripts in Python, for which it seems well suited; I don't see Python as
a viable choice for creating actual software). There are some scenarios in
which Bash is still a better choice, but thankfully they are few and far
between.

~~~
Dylan16807
The Linux Desktop has nothing to do with the command line.

And most of the desktop environments are significantly less of a mess than
Windows 10's mishmash of five separate GUI styles. Even if I don't like Unity
either.

