> commands should be usable as library subroutines, or vice versa
I was quite surprised when I started using Unix that this was not available, as I had gotten used to it on Multics. Coming from MACLISP (on ITS) and Multics, I was used to a more flexible and interactive environment.
Architecturally it was much more natural to do so in those environments than in the far less flexible Unix, where every shell program and command was hermetic. But Unix was also fitting into the constraints of even smaller machines.
I learned from that article that the shell had been around in Multics for 5 years before UNIX inherited the concept! There were no pipes, but it seems like there were command-line arguments even in Louis Pouzin's shell. Does anyone know what it was called? Wikipedia only knows the Thompson shell [0].
"i decided that it was close to a time sharing system, just
lacking an exec call, a shell, an editor, and an assembler.
(no compilers) the exec call was trivial and the other 3
were done in 1-week each" ... "in mid to late 1969" - Ken Thompson [1]
".rc" is derived from RUNCOM. Bell Labs engineers were part of the Multics design team in the 60s and used CTSS. Bob Sobecki (BTL), Glenda Schroeder (MIT), and Karolyn Martin (MIT) wrote early design documents for the Multics shell, in about 1966.
The Multics team suggested many features and changes to the command language as the system design evolved. The prototype Multics had slow performance, and the shell implementation was reworked and simplified several times. For a while we implemented a program called the "mini shell" that lacked the fancy features but was much faster: it passed complex commands on to the "full shell" if necessary. Eventually the PL/I compiler improved, we learned to avoid slow constructs, and we went back to a single shell program.
> Direct access hems users in a static framework. Evolution is unfrequent and controlled by central and distant agents. Creativity is out of the user's hand.
> Time sharing, as it became popular, is a living organism in which any user, with various degrees of expertise, can create new objects, test them, and make them available to others, without administrative control and hassle. With the internet experience, this no longer need be substantiated.
This is why we need open computing, and it is why large entities often oppose it.
I get the sense that they knew they were on a new frontier and excitedly building new things, just because it was cool.
Reminds me of early Web 2.0 and the recent JavaScript Cambrian explosion. It’s cool to realise that the humble CLI, something I use all the time, was once just like that too.
I wonder if shell programs should start emitting JSON objects now?
Oftentimes, plain text output separated by whitespace feels too limiting, whereas structured JSON objects might be easier to parse and manipulate.
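A rough sketch of the difference (the filenames are made up, and I'm assuming jq is installed):

    # awk splits on whitespace, so a filename containing a space gets mangled:
    $ printf 'my file.txt\n' | awk '{ print $1 }'
    my

    # Encoding each line as a JSON string sidesteps the quoting problem
    # (jq -R reads raw lines, -s slurps them into a single array):
    $ printf 'my file.txt\nnotes.md\n' | jq -R . | jq -s .
    [
      "my file.txt",
      "notes.md"
    ]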
There are tools like jo [1] which will encode the output of shell commands into JSON.
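If I remember the syntax right, jo builds JSON from key=value arguments and guesses the types, something like this (the keys and values here are invented):

    # -p pretty-prints; a nested jo call in $(...) becomes a nested object:
    $ jo -p name=web port=8080 owner=$(jo name=alice admin=true)
    {
       "name": "web",
       "port": 8080,
       "owner": {
          "name": "alice",
          "admin": true
       }
    }

    # I believe -a with no arguments reads array elements from stdin:
    $ seq 3 | jo -a
    [1,2,3]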
As for native shell support for structured data, PowerShell is probably the best implementation. In PowerShell, objects can be piped between commands and serialized to JSON, XML, CSV, and even HTML [2].
I’d be a bit sceptical about JSON everywhere, as it would mean throwing out all the existing shell tools and having everything parse JSON, which seems like overkill.
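That said, you can probably get partway there today by only converting at the pipeline boundaries, something like this (hypothetical, assuming jo and jq are both installed):

    # Emit JSON at the source, filter structurally, then drop back to
    # plain text at the end so the ordinary tools can keep consuming it:
    $ ls | jo -a | jq -r '.[] | select(endswith(".txt"))'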
https://en.wikipedia.org/wiki/Run_commands