
Rich Command Shells - BruceM
http://waywardmonkeys.org/2014/10/10/rich-command-shells/
======
zorbo
Text is the universal interface. You can do things with it. You can strip it,
cut it, transform it, send it to other places. Humans can read it, programs
can read it, your printer can output it. It can be sent to web APIs, it can be
stored anywhere. It's compressible, can be colored and can be copy-pasted and
is infinitely extendable. Thousands of protocols run over it.

The command line works with text. The command line remains the best interface
I've ever used. It's user friendly, composable and available everywhere. It's
easy to automate and easy to extend.
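That composability is easy to demonstrate with a throwaway pipeline (the input text here is made up):

```shell
# Count the most frequent word: each stage does one small job,
# and the stages compose over plain text.
printf 'apple banana apple cherry apple banana\n' |
  tr ' ' '\n' |    # one word per line
  sort |           # group identical words together
  uniq -c |        # count each group
  sort -rn |       # most frequent first
  head -n 1        # keep the top entry
```

Every stage reads and writes plain text, so any stage can be swapped out or extended without touching the others.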

I wish the "command line with pictures" idea would just go away already. It
adds nothing for the general public. I can already view pictures on remote
machines with X forwarding.

Command line with pictures never made it, because there are ten competing
standards. With text, everybody just agreed on ASCII and now Unicode/UTF8.
Text has hundreds of ugly kludges layered on top of it (Extended ASCII, ANSI
escape codes, etc.). It still works. It's still simple. It has its problems, but
nowhere near as many problems as GUIs.
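ANSI escape sequences are a good example of such a kludge that nonetheless works almost everywhere:

```shell
# An ANSI SGR sequence: ESC [ params m ... ESC [ 0 m
# Most terminal emulators render this as bold red text.
printf '\033[1;31mbold red\033[0m\n'
```

The in-band control bytes are ugly, but because they are still just a byte stream they survive pipes, files, and SSH unchanged.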

Those who don't understand Unix are doomed to reimplement it... poorly.

~~~
anon1385
Text isn't the universal interface in Unix, byte streams are. You can quite
happily send non-textual control characters and such around in Unix, or pipe
data containing NULLs from one process to another. 'Text' is a very seductive
abstraction, but it's one of the most brutal to work with once you start
interacting with the real world and have to give up on ASCII and deal with
encodings and Unicode and so on.

Putting commands and data inline is a recipe for disaster and a million
command injection exploits. The Unix philosophy has broken the minds of
generations of programmers. It leads them to doing things like concatenating
strings to build SQL queries or doing IPC with ad-hoc regex-parsed protocols
or using a couple of magical characters to indicate that the contents of a
variable should be parsed and executed instead of just stored. Take a read of
some of the earlier threads on HN about _Shellshock_ , and you will find
numerous people blaming Apache for not "escaping" the data it was putting in a
shell variable. As if it even could.

Even Unix nerds have at least partially internalised the dangerousness of the
paradigm -- "don't parse the output of ls" and so on. The fact that the Unix
paradigm (passing everything as strings with magical characters and escape
sequences) is broken for the most fundamental computing tasks like working
with file names ought to be a damning indictment of the paradigm. Sadly people
merely parrot the rote learned lesson "don't parse ls because file names can't
be trusted", without thinking about all the other untrusted data they expose
to unix shells all the time.
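The "don't parse ls" lesson is easy to demonstrate; a sketch, using a throwaway temp directory:

```shell
# A file name containing a newline defeats line-oriented parsing:
dir=$(mktemp -d)
touch "$dir/plain" "$dir/evil
name"                               # a single file whose name spans two lines
ls "$dir" | wc -l                   # reports 3 "files" for 2 real ones
# NUL-delimited output avoids the ambiguity, since NUL cannot
# appear in a file name:
find "$dir" -mindepth 1 -print0 | tr -cd '\0' | wc -c   # reports 2
rm -r "$dir"
```

The fix only works because one tool grew a `-print0` flag and its consumers grew matching flags; the underlying everything-is-delimited-text convention is what made the bug possible.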

Just this week Yahoo got exploited. At first people thought it was
_Shellshock_ , but no, it was just a routine command injection vulnerability
in their log processing shell scripts. A problem blighting just about every
non-trivial shell script ever written.
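The class of bug being described looks roughly like this (a hypothetical log-processing fragment, not Yahoo's actual code):

```shell
# A field pulled from a log file, which an attacker can influence:
host='example.com; echo PWNED'
# UNSAFE pattern (left commented out): re-parsing data as code,
# which would also execute the injected 'echo PWNED':
#   eval "ping -c 1 $host"
# Data stays data when it is passed as a single quoted argument:
printf 'would look up: %s\n' "$host"
```

The injected `;` is only dangerous because the shell re-tokenizes strings into commands by default; one missing set of quotes turns data into code.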

The usual reply is "don't use shells with untrusted data". But auditing where
any particular bit of data came from can be just about impossible once it has
been across several systems through programmes written in a variety of
languages, stored on a file system, read back and so on. The only sane
solution is to never use shell scripts.

Like the C memory and integer model makes writing secure C code borderline
impossible, the Unix "single pipe of bytes that defaults to being commands"
paradigm makes writing secure shell scripts borderline impossible.

Unix needs to be taken out back and shot.

~~~
rsync
"Like the C memory and integer model makes writing secure C code borderline
impossible, the Unix "single pipe of bytes that defaults to being commands"
paradigm makes writing secure shell scripts borderline impossible. Unix needs
to be taken out back and shot."

What alternative do you advocate/propose ?

Genuinely curious ...

~~~
anon1385
For the interactive general purpose data munging and quick execution of simple
commands that the shell is best at, I really don't know what a better system
would look like. It seems like a really hard problem. Anything purely text
based ends up being fairly cumbersome to use for simple commands if it has to
use real data structures (consider having to type _(["a", "b"])_ instead of
_a b_ to pass arguments with JSON-style syntax or whatever). At least that was
my experience of trying to write a very simple shell. There are a hell of a
lot of people a hell of a lot smarter than me though.

It seems to me that a lot of shell scripts could be ported to other languages.
Does DHCP on Linux need to use a shell script instead of Python or something
like that? The benefits of the shell grammar and semantics, which are designed
to make interactive use easy, seem more like hindrances in a lot of those
kinds of use cases. I assume it's largely done to make it easier for sysadmins
to customise things. If I were a sysadmin, I'd much rather learn Python (and
feel like I actually understood it) than the crazy, byzantine grammar of bash.
Maybe that's why I'm not a sysadmin.

This paper by Rob Pike might also be of interest:
[http://doc.cat-v.org/bell_labs/structural_regexps/se.pdf](http://doc.cat-v.org/bell_labs/structural_regexps/se.pdf)

> The current UNIX® text processing tools are weakened by the built-in
> concept of a line.

~~~
nitrogen
The reason the shell is used everywhere is that it's guaranteed to be
installed (although DHCP used Bash explicitly), it's much faster to start and
run a simple script than Python, and its syntax is the command line that
everybody should already be familiar with.

------
josephg
I miss TermKit[1]. It's a real shame that the developer abandoned it after it
got a lot of hype. There's a huge opportunity for someone to come along and
make either a new terminal encoding that allows rich, interactive output, or
extend VT somehow to allow the same. If you could extend the terminal
protocol, you might even be able to get it to work over SSH.

HTML+JS seem like the obvious way to do it - even though the web is an awful
platform, it's standard, cross-platform and fully featured. It's the perfect
worse-is-better solution for this.

I think the hardest part would be figuring out how to reconcile browser-like
UI events and file streams. Maybe you'd need to make a standardized event
serialization format so your process could receive serialized events via stdin
or something like that. I think it'll be a really hard sell if we have to
abandon our unix pipes entirely.
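A minimal sketch of that idea, assuming a made-up one-JSON-object-per-line framing (the event fields are invented for illustration):

```shell
# Hypothetical event protocol: the terminal serializes each UI event
# as one JSON object per line; the process reads them off stdin like
# any other pipe.
printf '%s\n' '{"type":"click","x":10,"y":20}' |
while IFS= read -r event; do
  printf 'got event: %s\n' "$event"
done
```

Because the framing is still just text on a pipe, existing line-oriented tools (and SSH) would keep working unmodified.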

[1] [https://github.com/unconed/TermKit](https://github.com/unconed/TermKit)

Previous Hacker News discussion of TermKit:
[https://news.ycombinator.com/item?id=2559734](https://news.ycombinator.com/item?id=2559734)

~~~
lallysingh
> HTML+JS seem like the obvious way to do it
>
> I think the hardest part would be figuring out how to reconcile browser-like
> UI events and file streams.

How about Chrome Apps? They've got a mix of file I/O capability and HTML/JS.

------
tubelite
If I might plug a rich command shell I'm developing:
[http://pigshell.com](http://pigshell.com) (Source at
[https://github.com/pigshell/pigshell](https://github.com/pigshell/pigshell))

It provides

- A shell for the web: runs in the browser, pure client-side.

- File-like abstractions for URLs and other entities exposed by web APIs.

- Unix-like composition of commands using pipes.

- Visualization using HTML.

For instance, cp -r /gdrive/<username> /home will back up the contents of your
Google Drive to your desktop (see
[http://pigshell.com/v/0.6.2/doc/gdrive.html](http://pigshell.com/v/0.6.2/doc/gdrive.html)
for details). Replace /home with /dropbox/<username>, and the same command
will back up GDrive to Dropbox. And so on.

While it is already useful, there is still some work to do, especially around
hardening the file abstraction/APIs (and reams of documentation to write),
before it can be horizontally expanded to support a bunch more APIs and cloud
stores. I am actively working on these and expect they'll take ~2 months.

~~~
qznc
Interesting. Why do you think that files are a good abstraction for a web shell?

~~~
tubelite
It's the old "if you have a hammer, everything looks like a nail". Since I'm a
filesystem engineer, everything looks like a file :)

Seriously though: the file is a powerful and familiar abstraction. The primary
motivation for pigshell was to provide a common minimum abstraction across
different web APIs which would enable basic data movement and backup.

That said, pigshell commands actually pass objects across the pipeline; files
are a kind of object. For instance,

    
    
      cat http://pigshell.com/sample/life-expectancy.html | table2js -e "tr" foo country data | head | printf
    

extracts data from an html table, converts them to plain Javascript objects
and prints their json representation.

------
BruceM
I will probably write a "More Rich Command Shells" to cover things I missed
here that are important in some way (like Apple's MPW).

The next thing I want to write though is about building something that has a
command shell UI today and thinking about how to do so in a flexible way that
works with multiple output devices.

~~~
agumonkey
Thanks for that article.

I'm often thinking about how to present/"unify" these:

- command (statements validated by some token)
- repl (expressions; less side-effect driven than the previous one)
- unary/immediate mode (each input event is interpretable right away; a direct mapping between input and action)
- n-ary/non-immediate mode (some parsing occurs, gathering a few tokens into a higher-level construct)
- non-keyboard UI/UX (some software like Maya has proper separation; most mouse events translate directly into object methods)

------
akavel
Given all of that, and also Oberon (the '80s too), and even Engelbart's Demo
('68!), I wonder every day why, in the 2010s, _all_ of the "modern" OSes still
provide developers with nothing but text-only consoles. Okay, maybe for end
users there _was_ a jump in interfaces and features, and maybe it was
significant (movie editing, 3D modelling/sculpting, instant communication
with a significant fraction of all the people in the world, to name a few).
But it still feels like even for them, some things were there and now are not
(sorry for the vagueness; I don't have much time right now to think of
examples. When I finally start this blog thing, one day...).

Any theories, anyone? I'm really curious. I still believe this can be
improved, and I work on some ideas in my free time, but I often wonder why I
have to, and why I can't already use those beautiful ancient features.

~~~
richardjdare
That's a very interesting question.

I'd guess that the technical decisions that differentiated early unix from
the lisp machines etc. were based on hardware and cost limitations (the
development of unix is something I'd be interested in reading more about).
Then, as unix took off, these technologies became entrenched, with a kind of
apologetics developing amongst users who liked those systems, or never used
anything else.

There's an interview with Bill Joy where he said that Vi's famous modal
interface was merely a result of his poor-quality terminal and network
connection, and now it's used by millions of programmers. I'm not saying that
these users are wrong about the benefits of modal editing or the unix text
interface etc., but that these are _discovered_ benefits that are then used to
retroactively construct a narrative of justification that often walls people
off from alternatives.

I'd also hazard a guess that many of these technologies were simply
inaccessible and/or unknown to the mass of post-pc-revolution developers who
create the bulk of our present systems. The number of devs who had access to
Lisp machines was small compared to the number who came in with Unix with its
success in the workstation market, and tiny compared to the number of devs who
came in through home computers/PCs.

I myself grew up with '80s home computers, moving to the PC and Linux in the
late '90s. Lisp machines, Oberon etc. were unknown to me until a few years
ago, and information about them is still hard to come by. Nearly all the lisp
machine sites are web 1.0 with broken links, etc. Very few of them try to
market the benefits of these systems to people in the wider community.

After I got interested in lisp and lisp machines (thanks pg!), I spent a
weekend getting that notorious leaked Symbolics Genera distribution working on
Linux, inside VirtualBox on my Mac. After a few hours in the Listener, with
its rich output and superior incremental help system I wanted to show it to
everybody I knew. It was one of the most incredible things I had ever seen on
a computer. For the next few weeks at work I was occupied with the constant
thought, "This is not as good as Genera". It's had a huge impact on how I
think about software, even my indie game development.

~~~
a3n
> the development of unix is something I'd be interested in reading more about

[https://en.wikipedia.org/wiki/Unix](https://en.wikipedia.org/wiki/Unix)

[http://cm.bell-labs.com/cm/cs/who/dmr/cacm.html](http://cm.bell-labs.com/cm/cs/who/dmr/cacm.html)

[http://www3.alcatel-lucent.com/bstj/vol57-1978/articles/bstj57-6-1905.pdf](http://www3.alcatel-lucent.com/bstj/vol57-1978/articles/bstj57-6-1905.pdf)

[http://cm.bell-labs.com/cm/cs/who/dmr/chist.html](http://cm.bell-labs.com/cm/cs/who/dmr/chist.html)

[http://www.unix.org/what_is_unix/history_timeline.html](http://www.unix.org/what_is_unix/history_timeline.html)

[http://www.faqs.org/docs/artu/historychapter.html](http://www.faqs.org/docs/artu/historychapter.html)

------
ivanca
Strange he didn't mention Xiki: [http://xsh.org/](http://xsh.org/)
[http://xiki.org/](http://xiki.org/)

------
mozmark
If you found this interesting, you may also like GCLI (as featured in the
Firefox dev tools command line - or you can play with it here:
[http://mozilla.github.io/gcli/](http://mozilla.github.io/gcli/) ).

~~~
BruceM
GCLI is interesting, but I didn't mention it because I feel like it focuses
more on the issue of command parsing / completion.

That said, that's a great topic to cover! I think GCLI does a pretty good job
of command completion. The Lisp Machine did as well. I have friends who love
some of the router CLIs, but I'm not sure which one(s).

~~~
lispm
Cisco's IOS is an example.

------
lispm
If you come to Freiheit.com next week (16 Oct 2014) for the Clojure User Group
meeting in Hamburg, Germany, I'll demo the Dynamic Windows user interface of a
real Symbolics Lisp Machine, including its command shell, the 'Dynamic Lisp
Listener'.

[http://www.meetup.com/ClojureUserGroupHH/events/207314372/](http://www.meetup.com/ClojureUserGroupHH/events/207314372/)

~~~
cpach
Cool! Do you know if the demo will be recorded?

~~~
lispm
It won't.

~~~
agumonkey
Oh, the sadness. Please harass someone with a capable smartphone to at least
attempt it :)

------
vinodkd
Slightly off-topic, but the recent surge in responsive UI had me thinking
thus: If we are now driven to merge UI logic across different graphical
devices, can we think of apps that span both textual and graphical devices?

After all, the core functionality of the application remains the same. To use
a simple example: when you search for a product online, get a search result
list and then select one from the list, couldn't this flow be modeled just the
same in both gui and text interfaces?

I'm thinking back to the Turbo Pascal-style applications of the past that
produced full-blown IDEs in text, or the WordStars/WordPerfects of yore: the
UI model that sits behind those apps cannot be much different in principle
from their modern-day equivalents.

Even farther back, there was a time (and probably still is for college
assignments) when cli applications had a prompt-read user input-respond cycle,
replete with text-based choices to select from and so forth.
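Bash's built-in `select` is a tiny surviving example of exactly that cycle (the menu items here are made up):

```shell
# select prints a numbered menu and a prompt (both to stderr),
# reads a choice from stdin, and binds the chosen item --
# prompt, read user input, respond. Requires bash.
printf '2\n' |
select item in "search" "details" "quit"; do
  printf 'you chose: %s\n' "$item"
  break
done
```

The same choose-from-a-list interaction could plausibly be rendered as buttons by a GUI front end while the program logic stays unchanged.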

What if we were to merge the two worlds instead of trying to get one to
conform to the other?

------
pjmlp
And this is why UNIX shells feel so primitive.

------
fiatjaf
What about the Aurora/Eve thing?
[https://www.youtube.com/watch?v=L6iUm_Cqx2s](https://www.youtube.com/watch?v=L6iUm_Cqx2s)

It is a shell, isn't it?

------
josh-wrale
How about a declarative and idempotent shell? Well, more declarative and
idempotent than bash + the GNU core utils.

~~~
_mhr_
How would this be useful?

~~~
josh-wrale
This comment thread discusses the methodology.
[https://news.ycombinator.com/item?id=7378764#up_7380392](https://news.ycombinator.com/item?id=7378764#up_7380392)

------
robbles
CLOS and CLIM always sounded to me like the names of programs you'd expect to
see competing at disc wars, or racing in a lightcycle grid.

