

Improving the UX of (command-line) tools - hbbio
http://blog.opalang.org/2012/03/programming-tools-ux-experience-how-we.html

======
stcredzero
Opa has some brilliant design decisions. Also, with the advent of Source Maps
in Chrome and Firefox, the time is right for an end-to-end integrated web
development language. Though I do find it strange, as a long-time OO person,
that "tightly coupled" is used as a positive trait in the "Hello Opa" post.

As I predicted years ago, static and dynamic typing are starting to lose their
distinctions. I wonder if someone has started producing an as-you-type type
inferencer? Given the right language design, I don't see why the editor can't
keep track of all of these details, revealing types on mouseover and greying
out vars that are ambiguously typed, much as syntax errors are highlighted in
programmers' editors.

~~~
jules
Already exists, e.g. F# in Visual Studio. Note however that type inference and
dynamic typing, although similar on the surface in that you don't have to
write type annotations, are completely different things.

The OO tenet says "Don't couple unrelated things". The converse of this is
"Don't decouple related things", which is equally valid yet often ignored.
Decoupling X and Y means that in the cases where you want to use X and Y
together you need to connect them together again. This is a bad thing if in
99% of the cases you want to use X and Y together.

An example of this is the Java file API. In an effort to decouple a reference
to a file, a reference to a stream of the file's contents, a reference to an
arbitrary stream, reading from a stream, and reading from a stream in a
buffered way, we end up with FIVE classes (File, FileInputStream,
DataInputStream, InputStreamReader, BufferedReader) that will almost always be
used together simply to read the contents of a file. This is of course an
extreme example. A better design is to "couple" these things together in a
single gimmeTheContentsAlready(filename) call.
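
For contrast, the shell's "gimmeTheContentsAlready" has existed all along:
reading a file is one command, while the decoupled pieces (descriptors, pipes,
redirection) are still there underneath. A minimal sketch with a throwaway
file:

```shell
# Make a small file, then read it back with no stream plumbing.
printf 'line 1\nline 2\n' > /tmp/demo.txt

cat /tmp/demo.txt       # the whole contents in one step
wc -l < /tmp/demo.txt   # redirection gives any tool the same one-step access
```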

Likewise, the Opa guys are expecting that most people who use their stuff will
use their database interface, and their language on both the server and the
client, their easy client/server networking, and their http server. Shipping
these as separate components would induce a lot of work to integrate them
together again.

~~~
stcredzero
_The OO tenet says "Don't couple unrelated things". The converse of this is
"Don't decouple related things", which is equally valid yet often ignored._

Orthogonal areas of concern that are used together are just that: _used
together_. They can be used together and packaged in a way that's super
convenient. That still doesn't mean I want them _tightly coupled_.

NB: Works well together != tightly coupled.

 _An example of this is the Java file API. In an effort to decouple a
reference to a file...we end up with FIVE classes_

Yeah, I freakin hate that too. With one Java JMI API I used, I had to
instantiate _seven_ entities from as many different classes just to send a
message. But not all Java libraries are paragons of good object design. (You
can leave off the "Java", actually.)

 _A better design is to "couple" these things together in a
gimmeTheContentsAlready(filename)._

Sorry, but you're mistaken here. Things are just as decoupled in the
VisualWorks Smalltalk class library. There are equivalents to: (File,
FileInputStream, DataInputStream, InputStreamReader, BufferedReader) and much
more, but you can also still do:

        'file.dat' asFilename readStream

If you wanted to, you could just add a method to String named
gimmeTheContentsAlready that would just return you the whole contents of the
file with that pathname as a String. (I'd name it fileContents, though.)
Either would take three short, easy-to-read lines of code, or one messy one,
and about two minutes to write and save in your image.

Convenient methods and well designed APIs aren't the same as "tight coupling."
If you think they are, then you need to read some better OO code.

EDIT: Another example -- at the same place there was a Sun message bus we used
in the same product line, with the same functionality as JMI, just better
performance. That one needed only _two_ objects instantiated to send a message.

~~~
jules
It just depends on your definition of tightly coupled. By your definition
Opa's implementation isn't tightly coupled either (the http server doesn't
depend on the database, etc.). A better term is probably tightly integrated,
and as far as I can see they also use this term on the rest of the website.

~~~
stcredzero
_It just depends on your definition of tightly coupled._

Back in my day, "tightly coupled" was somewhat like the opposite of "modular."

------
chimeracoder
At the risk of sounding like a broken-record aging neckbeard (reality: I'm a
college student with light stubble), I think the title of this thread is very
misleading, even if the blog post itself contains valuable material.

First, I'm just barely old enough to remember using DOS on a machine without
Windows -- which I believe is the last time that a GUI/DE wasn't viewed as
'essentially essential' for an end-user computing environment. However, I'm
just young enough to remember growing up with Windows for most of my computing
life, so it's not like I was indoctrinated into the ways of the command line
from an early age. Far from it - I'd forgotten all the DOS commands I knew by
the time I turned 10, and I actually only began learning Unix very recently
(while in college).

I don't think command-line tools have a UX problem. If anything, we have a
teaching problem. Command-line tools have a _slight_ learning curve, but
really nowhere near as much as many people think. The story of how I got into
using the command line for all (well, almost all) my daily tasks is a bit too
long to post here (unless people are interested), but in short: I went from
not knowing the difference between cd and dd to doing everything at the
command line, all in the span of a month. (Thankfully, I did _not_ have to
learn those two commands the hard way!)

Why was I able to do it? Because Unix has a _ridiculously_ uniform design, and
is _unbelievably_ modular. UI designers today should strive for the uniformity
that my GNU system has. (I use GNU, but this comment applies for the most part
to other systems, like the BSD-based varieties, and is largely, though not
completely, true _even when comparing two different systems!_). There are a
few wrinkles around the edges, but when you consider the very
distributed/patchwork way that modern POSIX systems came to be, it's almost
miraculous that the single-dash switches vs. double-dash options distinction
works in most cases the way you'd expect. The difference between required
arguments and optional parameters is implemented very well in most utilities
(it's rare that I have to specify unnecessary flags for required parameters,
for example). Combining switches also works the way you'd expect on most
utilities, and even the naming is consistent (-h, -v, -l, -a, --help,
--verbose, etc. follow the Principle of Least Surprise (POLS) in all but very
few cases). Even if there are differences between, say, ls -G on BSD and ls
--color on GNU, that's a matter of the way those basic utilities were
implemented, and the _interface_ itself is still fairly uniform. (And I'm
willing to grant some leeway for such fundamental utilities as 'ls', since
these tools develop over a period of time).
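
That uniformity is easy to check directly: combined short switches behave
identically to the separated form, and common switches have long-option
spellings (a sketch assuming GNU-style tools; BSD long options differ):

```shell
# Set up a small directory with a visible and a hidden entry.
mkdir -p /tmp/uxdemo && touch /tmp/uxdemo/a /tmp/uxdemo/.hidden

ls -la /tmp/uxdemo               # identical output to: ls -l -a /tmp/uxdemo
printf 'foo\n' | grep -i 'FOO'   # same match as: grep --ignore-case 'FOO'
```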

That's not to say that everything has to follow the Unix-like model. I'm not
saying that Opa's efforts aren't valuable simply because they're not standard
POSIX. On the contrary, I think that tailoring a command-line tool to the task
at hand can be very valuable - SQL queries fall into this category as well.

But they're valuable because they provide a modular (scriptable) tool that
modifies a familiar interface to make it easier to use in a domain-specific
context with minimal violations of the POLS. That's fundamentally different
from _improving_ the UX of the command line; instead, it's _applying_ the
general UI principles of command line tools to a specific context. All
instances of a general UI violate the POLS in some way by definition; the goal
is to minimize that.

In the end, I can't think of any other UI that follows the Principle of Least
Surprise as well as Unix does, and it's the most fundamental command-line tool
in my life. We may have a problem with the command line, but the command line
does not have a problem with UI.

(Edit: Does someone know how to write an asterisk without causing it to be
parsed as the opening of an italicized block, as in <asterisk>-nix to refer to
the wildcard 'Unix-like'? I'm on choppy in-flight internet and can't figure
this out reliably with trial-and-error).

~~~
fcurella
I think there is much more that can be done to make tools more user-friendly.

An example that pops to mind is when you misspell a git command, and git
suggests what you could've intended:

        $ git pill
        git: 'pill' is not a git command. See 'git --help'.
    
        Did you mean this?
            pull

That's already one step towards a better UI, but it could go even one step
further if I could answer the question, i.e.: I press 'y' and git pulls.
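
For what it's worth, git has a knob that gets close to this already:
help.autocorrect (a sketch; the value is in tenths of a second, and the exact
behavior varies a little across git versions):

```shell
# With help.autocorrect set, git waits the given delay and then runs
# its own best guess instead of only printing the suggestion:
git config --global help.autocorrect 20   # 20 tenths = a 2-second pause

# 'git pill' now prints the warning, pauses, then actually runs 'git pull'.
# Setting it back to 0 restores the suggestion-only behavior:
git config --global help.autocorrect 0
```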

Of course I'm not criticizing git (which I think has a better UI than many
tools already); it's just an example.

~~~
chimeracoder
I'm not saying that the UI _couldn't_ be improved - I'm just saying that what
they're doing isn't really an improvement on the command line as a UI; it's
just a domain-specific application. (And, I implied this but didn't state it
explicitly: domain-specific applications should aim to be as 'compatible' as
possible from a UI standpoint with the generic tool - i.e., Unix - deviating
only when there's a specific need to, and defaulting to the generic toolset.
I'd need to see more to judge if Opa fits this; it's just a general
principle.)

More importantly, though: what you're looking for can already be done! Just
use zsh (or, even simpler: a bash correction tool).

Because Unix is so modular, you can make these kinds of UI changes _without
breaking or even changing the underlying toolset_. That means that you and I
can tailor the UI to our specific needs, and we don't even really have to
touch the general purpose toolset. Other examples include using Vim
keybindings for the shell instead of Emacs (try modifying keyboard shortcuts
with most pre-compiled GUI applications!) or scripting and aliasing.
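
Both of those swaps are one line of per-user configuration (rc-file
fragments; setopt and bindkey are zsh, set -o vi is the bash spelling):

```shell
# zsh (~/.zshrc): offer to correct mistyped command names, e.g. 'sl' -> 'ls'
setopt CORRECT

# zsh (~/.zshrc): vi keybindings for line editing instead of the emacs default
bindkey -v

# bash (~/.bashrc): the equivalent keybinding switch
set -o vi
```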

Thanks for mentioning that, actually, because it illustrates my point very
well! The command-line UI's strength is that it's literally as modular and as
minimal as possible (at least, as far as I can conceive of). That's what I
personally love the most about it. It defines only the bare minimum of tools
necessary to satisfy two conditions:

1. By default, [almost] everything works as expected, with no customization
(but assuming knowledge of the system).

2. Those who want to customize will find that it provides _all_ of the tools
necessary to create the exact interface they want, without modifying the
underlying toolset as well. (This customization can be done to an essentially
arbitrary level -- the technical limitations are well beyond what a reasonable
person would consider 'sane'.)

You may make the argument, then, that the command-line isn't even an interface
at all! It's more like an _abstract_ interface that lets you stitch the tools
together to define the interface you want, all the while providing a 'default
interface' - a sensible set of defaults for those who can't be bothered to
customize.

(Edit: I understand that tools aren't perfect - the git-pull/git-push
asymmetry is an example - but the inconsistencies are vastly outnumbered by
the overarching uniformity. And again, with such a modular model, you can
create the UI you want without waiting for Linus to change the behavior of
git).

~~~
mabdt
Another example of catastrophic UX is certainly the unix command 'find', as
in:

        find . -name "*.opa" -exec cat \{\} \; | wc -l

I wonder if any Unix beginner has ever managed to find the syntax for -exec
without copy-pasting the examples :)
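
For what it's worth, the escaping is aimed at the shell, not at find: quote
the glob so the shell doesn't expand it, and escape (or quote) the ; so the
shell doesn't treat it as a command separator. A few equivalent spellings,
run against a made-up sample tree:

```shell
# Build a tiny sample tree: two .opa files, three lines in total.
dir=$(mktemp -d)
printf 'a\nb\n' > "$dir/x.opa"
printf 'c\n'    > "$dir/y.opa"

# All three pipelines print the same total line count (3):
find "$dir" -name '*.opa' -exec cat {} \; | wc -l   # one cat per file
find "$dir" -name '*.opa' -exec cat {} +  | wc -l   # batched: fewer forks
find "$dir" -name '*.opa' -print0 | xargs -0 cat | wc -l   # the xargs idiom
```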

~~~
thwarted
One aspect of having composable components is knowing how they get composed.
The fact that {} and ; _may_ need to be escaped is something having to do with
the shell, not with find and its -exec syntax.

In fact, this is one of those things that is obvious once you learn about the
shell and how command lines get expanded and executed. The problem is that
most people don't learn about the shell until forced to, by being bitten by
something goofy and unintuitive like -exec arguments. What causes the
catastrophic failure is the order in which people learn these things, combined
with how hard they are to discover: I can suggest a way to "verify" a command
line before running it, but that doesn't help if you don't know how to
interpret the results, which itself requires experience with the system as a
whole.
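
One cheap way to "verify" a command line is to let the shell expand it and
print the resulting words, one per line (printf here is just a stand-in for
the real command):

```shell
# Each argument the shell would hand to find lands on its own line;
# note that \; arrives as a bare ; and the quoted glob stays literal.
printf '%s\n' find . -name '*.opa' -exec cat {} \;
# prints:
#   find
#   .
#   -name
#   *.opa
#   -exec
#   cat
#   {}
#   ;
```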

That these may need to be escaped is even documented in the find(1) man page:

        -exec command ;
        Execute command; true if 0 status is returned.
        All following arguments to find are taken to be arguments to
        the command until an argument consisting of `;' is encountered.
        The string `{}' is replaced by the current file name being
        processed everywhere it occurs in the arguments to the command,
        not just in arguments where it is alone, as in some versions of
        find.  Both of these constructions might need to be escaped
        (with a `\') or quoted to protect them from expansion by the
        shell.

------
nullflux
Let's also not forget that the whole flags/arguments debate is often not
really that useful in today's *nix command line, either. I'd love to see more
command-line utilities work like the Heroku gem and less like sed. (Yes, I
know it breaks the whole Unix philosophy of one tool == one action, but in
my experience it is far easier to get new people to learn.)

