
Crush: A command line shell that is also a powerful modern programming language - robin_reala
https://github.com/liljencrantz/crush
======
moonchild
I think that these 'improved shell' projects (incl. crush and nushell) that
try to copy powershell somewhat miss the point of the shell in the first
place. If I want a high-level language that's tightly integrated with the unix
environment, I already have perl and tcl. A little more power (maybe less
integration, but greater capacity to build new integration) and I can use raku
or lisp. The point of shell is that it interoperates freely with _all_
programs and _all_ programs share a language. If now, suddenly, I can only use
commands that have been explicitly rewritten to support your environment; or
regular commands are second-class citizens and need support code to interact
with your environment's higher-level functionality——

Then your environment is useless to me. Because I have better programming
languages which, despite being able to interact with the system and its
commands, don't pretend to be one with the system. As a result, the core high-
level language constructs mesh better, because they're not trying to serve two
goals at once. It's not possible for something that's not a 'real' shell
(including fish, zsh, even csh) to integrate with the system the way a real
shell does, because the _system itself_ was designed around the shell.

~~~
pritambarhate
Not many young people know Perl or Tcl (or even lisp for that matter).

>> The point of shell is that it interoperates freely with all programs and
all programs share a language.

That also doesn't mean people should stop trying to come up with these new
shells. Maybe some "new" shell will catch up and tomorrow's "all" programs
will be compatible with that shell. For example, on Mac OS X, zsh is now the
default shell.

~~~
_jal
> Not many young people know Perl or Tcl

There are more young people who know Perl or Tcl or Lisp than know Crush. I've
never understood this reasoning - "Hardly anyone knows X. Let's replace it
with something nobody knows." At most, you're making an argument about a lack
of sunk cost.

> Maybe some "new" shell will catch up and tomorrow's "all" programs will be
> compatible with that shell

Think of it like transitioning to life in a wheelchair. You can't "catch up"
with your house - it just wasn't designed for wheelchair use. Renovating all
the narrow corridors, random steps, tight angles and vertical reaches probably
costs more than rebuilding.

In this comparison, the mobility mode is your shell.

~~~
Riverheart
"Hardly anyone knows X. Let's replace it with something nobody knows."

Because there's a value proposition. Just like TCL, Perl, and Lisp had value
propositions over other available languages. It's fine if you don't see the
value proposition in Crush but progress sometimes forces us to learn new
things instead of sticking with what we have.

------
pcr910303
While I do like the new wave of structured shells (elvish, uxy, nushell,
crush, etc.), I don't think any of them will actually succeed unless there's a
big change in the operating system.

(I'm not suggesting these aren't valuable - these are valuable experiments,
but I'm saying for them to succeed, a new OS is needed.)

The native OS doesn't really embrace structured communication, so commands
will still have to treat the structured shell as a second-class citizen.

In an ideal world, for the structured shell to work, 'npm list' should return
a table (or whatever the native data structure is) and you should be able to
do 'npm list | where {deduped==false}'. In the real world, 'npm list' won't
support the structured shell, so you would be writing 'npm list --json | jq
"<obscure commands>" | json:from | where {deduped==false}'. And then 'npm list
| grep -v deduped' is simpler, so you don't really feel the advantages of the
structured shell.

I think for this to be solved, the OS's primary communication system should be
structured, so you can guarantee that every command will be a table. Then the
programmers for the tools of that OS will output a table, and the users will
be able to take advantage of this. Until then... it's basically a fun
experiment.
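The gap described above can be made concrete. In the sketch below, a canned JSON document stands in for `npm list --json`, and python3 stands in for jq so the pipeline runs without extra tools; both substitutions are illustrative, and the package names are fabricated:

```shell
# Canned stand-in for `npm list --json` (fabricated for the example),
# filtered the "structured" way: parse JSON, then select on a field.
echo '{"dependencies": {"a": {"deduped": false}, "b": {"deduped": true}}}' |
python3 -c '
import json, sys
deps = json.load(sys.stdin)["dependencies"]
for name, info in sorted(deps.items()):
    if not info.get("deduped"):
        print(name)   # only packages that are not deduped
'
```

Compare that ceremony with `npm list | grep -v deduped`: the structured version wins on robustness, not on keystrokes, which is exactly the point being made here.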

~~~
cnity
It doesn't have to be this black and white, in my opinion. The most common
data structures I've interacted with on the command line are _newline
separated_ or _JSON_. What if your shell allowed parsing of JSON natively?

npm list --json | ({ json }) => json.somePackage.author.name

What if it allowed parsing of XML natively, allowing instant scrapers to be
written in a single command?

curl www.example.com | ({ xml }) => xml.find('.importantInformation') >
scraped_info.txt

~~~
mywittyname
jq makes handling json in existing bash scripts simple. It can even do some
simple data wrangling via built-in functions like map. It's not a universally
standard command yet, but it's available for most modern systems.

I'm not sure about similar xml cli tools, but they probably exist.

Point is, both of these should be cli tools, not shell built-ins.

~~~
mdaniel
> I'm not sure about similar xml cli tools, but they probably exist.

I :heart: xmlstarlet (sometimes installed as that name, sometimes as just
"xml"):
[https://formulae.brew.sh/formula/xmlstarlet#default](https://formulae.brew.sh/formula/xmlstarlet#default)
=> [https://xmlstar.sourceforge.io/](https://xmlstar.sourceforge.io/)

The xsltproc that ships with libxslt1 is also pretty handy, although more
verbose

While not quite xml, pup is super handy for doing quick html selections:
[https://formulae.brew.sh/formula/pup#default](https://formulae.brew.sh/formula/pup#default)
=>
[https://github.com/EricChiang/pup#readme](https://github.com/EricChiang/pup#readme)

------
varbhat
Your shell is not for me. Here's why.

1) I don't want my shell to be a full-blown high-level programming language.
POSIX/Bash scripting is enough for my shell uses.

2) For anything high level, or for tasks where a shell script would become
complex, we have dozens of fully matured languages. I would rather use those
instead of this (or any) shell script.

3) Features desirable in a shell include syntax highlighting, autosuggestions,
a command-not-found helper, tab autocomplete, and flag completion. My shell
already has them.

4) I like bash and zsh as shells. Bash is simple and I like it, but it lacks a
few of the features I mentioned, so I don't use bash as my main shell. zsh is
compatible with most bash/POSIX scripts and has all the features mentioned in
(3). fish also has these features, but its syntax is different, so I don't use
fish. zsh has all the features I like in fish too (mentioned in (3)).

~~~
rsa25519
> Your shell is not for me.

And that's okay. Personally, I'm thrilled by the idea of a new shell. I don't
plan to use Crush in production (yet?!) but I definitely plan on installing it
on my laptop.

I'm excited to see how things go. I understand, for a while, it won't have all
the bells and whistles of zsh. But that's okay. Everything has to start
somewhere. I remember back when alacritty didn't even have easy scrollback!

Best of luck to Crush and its author, liljencrantz!

~~~
mywittyname
I'm curious if you've taken the time to learn Powershell?

It's a supercharged version of bash that works with Windows, but instead of
piping character-only data, it can pipe structured data between commands in
the form of objects. The reason I ask is because, while it adds a ton of
flexibility to shell scripting (to the point where I consider it to be a
language), it is also quite complex.

The benefit of shell scripting is that it is simple enough to have as a
tertiary language that you can pick up and use after a six-month hiatus. Once
you get into more powerful language territory, there are a lot more components
to memorize and it becomes difficult (for me at least) to keep those idioms in
my head if they go unused for months.

~~~
BeetleB
> The benefit of shell scripting is that it is simple enough to have as a
> tertiary language that you can pick up and use after a six-month hiatus.

I definitely did not find that to be the case. I've learned and relearned
shell scripting a bunch of times. I always forget it and need to relearn it.
Worse than perl.

------
frou_dh
Speaking of fancy alternative shells, I tried [http://xon.sh/](http://xon.sh/)
for a while, whose selling point is that its language is a superset of Python.

While it was neat for interactive use to have stuff like proper string
manipulation and list comprehensions, writing scripts with it (.xsh files) was
horrible, because the sorta-but-not-really-Python-ness meant that no tooling
worked. Syntax highlighting was messed up, code autoformatters barfed, linters
barfed, autocompletion didn't work.

I actually find writing scripts in Bash, aided by the amazing linter
[https://www.shellcheck.net/](https://www.shellcheck.net/) to feel almost like
it's now a compiled language. It goes from being a fraught endeavor, to being
kinda fun and educational. It's like pair programming with someone who has an
encyclopedic knowledge of the language quirks. Hook it up to automatically run
on-save in your editor!

~~~
Skunkleton
I check in on Xonsh every once in a while. The first test I do is try and
suspend a sleep into the background. This has never worked on xonsh afaict. I
am excited to give Xonsh a spin, but I can't until it gets its job control
story sorted out.

~~~
davidhalter
They are unfortunately not implementing posix correctly. Instead of using
proper processes they use threads to control certain things which makes
<ctrl-z> not work properly.

This would actually be possible to fix, but it's quite a bit of work.

------
robin_reala
The author, Axel Liljencrantz, was also the initial developer of fish, so has
form in shell development.

~~~
cies
Had a quick look: Fish is in C++, Crush is in Rust.

I also feel Crush addresses some issues I had with Fish. Still on Zsh though
(I like pastable scripts).

~~~
gwd
I was curious how crush compared to fish (having glanced cursorily once or
twice at the latter). Want to expand on the "issues you had with fish"?

------
j88439h84
A similar idea is Mario, for Python code in shell pipelines. Its most novel
feature is (optional) async commands, so you can pipe in a list of urls, make
many concurrent requests, and then process them in a single line.

    
    
        cat urls.txt | mario async-map 'await asks.get ! x.json()' apply len
    

[https://github.com/python-mario/mario](https://github.com/python-mario/mario)

Nushell also seems cool, it's somewhat similar to Powershell.
[https://github.com/nushell/nushell](https://github.com/nushell/nushell)

------
pjmlp
> I also feel that tying a shell to COM objects is a poor fit.

This is not correct. Powershell allows calling into COM, .NET or straight DLLs
for that matter.

It is also what I see as a missed opportunity in most UNIX shells, which
failed to build on the Xerox PARC, Lisp Machine and ETHZ experiments: the
ability to plug directly into shared libraries, UNIX IPC, and modern
mechanisms like DBUS and gRPC, alongside live debugging capabilities, thus
exposing the whole OS stack to the CLI and making it a proper REPL
experience.

Other than that, it looks quite cool.

~~~
liljencrantz
I agree 100 %, shells should provide convenient, high level native access to
all major RPC mechanisms.

There should simply be a grpc command that instantiates a connection to a grpc
server and lets you do discovery, send commands, etc.

It's on my todo list for crush to do that for grpc and dbus.

------
onli
Oil shell should be mentioned here, an alternative approach to a modern shell.
It tries to be a (more sane?) upgrade path from bash:
[https://www.oilshell.org/](https://www.oilshell.org/)

~~~
chubot
Thanks for the mention! That's an accurate way of describing it. I updated the
home page with the one page description: _Oil is our upgrade path from bash to
a better language and runtime._

The ability to run existing POSIX and bash scripts is the biggest difference
between Oil and other new shells (something I'm often asked). Unfortunately
that's also why it's such a long implementation effort. But in reading the
comments here, I still think that was the right choice.

\----

FWIW Oil's approach to structured data is to use interchange formats over
pipes. You can already pipe JSON, HTML, and CSV/TSV over pipes, and people do.
There are many command line toolkits that do this.

[https://github.com/oilshell/oil/wiki/Structured-Data-in-Oil](https://github.com/oilshell/oil/wiki/Structured-Data-in-Oil)

However the shell is also missing a few things which would help that approach,
so I'd like to build some native understanding of those formats into it,
without requiring any special "plugin interface". Plugins are simply
_processes_ in Oil.

For example, Oil already has a QSN "Quoted String Notation" to properly escape
byte strings like filenames (this happens to be the format of Rust string
literals, which are similar to C and JSON string literals). So you don't have
to do parsing and splitting.

The approach to tables will likely be built on QSN in TSV or QSV. And then you
can enhance "npm list" and "dpkg" just to print that to stdout, without having
to integrate into any special "plugin system". It just works with normal Unix
processes and the kernel.

[https://github.com/oilshell/oil/wiki/TSV2-Proposal](https://github.com/oilshell/oil/wiki/TSV2-Proposal)
(not implemented)

~~~
chubot
Update: I just wrote a draft of "Why Use Oil?"

[http://www.oilshell.org/why.html](http://www.oilshell.org/why.html)

Let me know what you think! (feedback link in the article)

------
gonzus
I believe a good fit for Unix commands is using a pipeline where the output of
your command (say, ls) is converted to JSON, so that later in the pipeline you
can use jq to filter / count / etc in a structured, non-whitespace-dependent
way. A combination of jc
([https://github.com/kellyjonbrazil/jc](https://github.com/kellyjonbrazil/jc))
and jq fits the bill for me:

    
    
        ls -l /usr/bin | jc --ls | jq '.[] | select(.size > 50000000)'

~~~
pedro84
$ ls -l /usr/bin | awk '$5 > 50000000'

-rwxr-xr-x 1 root root 51859776 Jan 26 2018 pandoc

~~~
kbrazil
can you do that with netstat? :) (ls was one of the simpler parsers to
write, aside from some corner cases)

------
mongol
We need SQshell, a shell that uses SQL for operations such as listing files,
processes, all that stuff where a structured query language is useful. SQL
join syntax is not that different from piping. In my opinion, SQL knowledge is
timeless, most other IT knowledge seems to have a "best before" date, but SQL
never expires.
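For a taste of that without any new shell, a directory listing can be loaded into an in-memory SQLite table and queried with plain SQL. The sketch below uses Python's bundled sqlite3 module and fabricates two files of known size so the query result is predictable; "SQshell" itself remains hypothetical:

```shell
# "SQL over the filesystem" approximated with Python's bundled sqlite3.
result=$(python3 - <<'PY'
import os, sqlite3, tempfile

# Fabricate a directory with files of known sizes for the example.
d = tempfile.mkdtemp()
for name, size in [("big.bin", 3000), ("small.txt", 10)]:
    with open(os.path.join(d, name), "wb") as f:
        f.write(b"x" * size)

# Load the listing into a table, then query it like any other relation.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE files(name TEXT, size INTEGER)")
con.executemany("INSERT INTO files VALUES (?, ?)",
                [(e.name, e.stat().st_size) for e in os.scandir(d)])
for (name,) in con.execute("SELECT name FROM files WHERE size > 1000"):
    print(name)
PY
)
echo "$result"
```

Tools like fselect (mentioned elsewhere in this thread) package roughly this idea as a standalone command.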

~~~
sumtechguy
The thing with SQL is that many devs seem to be in one of three states with
it: allergic to it and unwilling to touch it; knowing enough to be dangerous
and make very bad decisions with it; or amazing at it and fixing others' poor
decisions. I am not sure what it is about SQL, but it really seems to rub
people the wrong way. I think it is the order of the statements. If they had
switched the FROM and SELECT clauses around, people would 'get' it more.

SQL is a neat tool for what it is. But command line bashing does not seem like
a good fit to me. I get what you are saying though. I just would be kind of
miffed to have to do something like 'select * from
folders(/foldera/folderb/folderc) where lastcreatedate > getutcdate('1970')
order by lastcreatedate desc' when I can currently do something like 'ls
-ltar'. It is not that I can't do that. But it takes a cognitive load to do it
for a command I use a lot. Of course I could make a program to do it. But then
I am kind of back where I started? I could get behind SQL in addition to what
I have. But in replacement I think that would be a tough sell.

~~~
smichel17
I am definitely in the "enough to be dangerous" category, but isn't it
technically possible to write SQL queries the way you describe? IE, the only
reasons to write them in SELECT .. FROM order are (1) tradition / what others
expect, and (2) optimizations made by the dbs that assume the traditional
order? But in terms of returning correct results, FROM .. SELECT should work
fine, I think.

~~~
sumtechguy
Having not written a SQL parser, I would assume there is a grammar parser in
there that puts the items into some IR-like language or structure that runs
it. The big ones call it a query plan. So in theory you could flip them around
as long as the query plan was the same. You would just need to convince MS and
Oracle and all the other DB projects to do/try it. Not sure they would see the
benefit though. In my head they are backwards, but in their design it may be
fine. Ideally the keywords could be in any order, but that would probably
break some implicit ideas those keywords convey (looking at you, HAVING and
GROUP BY).

~~~
smichel17
Hmm, I thought I remembered reading something to the effect that this was
_currently_ possible, but now trying to find more information about it I'm
coming up empty.

~~~
sumtechguy
With oracle if it exists would probably be a compiler directive. In MSSQL
would be some sort of DBCC command. Not sure with the other DBs as I do not
use them as much and have not needed to tweak them at that level.

------
nathell
For Clojure aficionados, there's a flavour of Clojure made for scripting and
the command line: Babashka
[https://github.com/borkdude/babashka](https://github.com/borkdude/babashka)

And a proper shell, Closh:
[https://github.com/dundalek/closh](https://github.com/dundalek/closh)

------
dimator
If you like the sql-like filesystem interaction, check out:

[https://github.com/jhspetersson/fselect](https://github.com/jhspetersson/fselect)

I've integrated this into my workflow, it's a polished tool.

~~~
codethief
This is really cool! I've been waiting for something like this for a long time
since I don't use `find` often enough to remember its command line arguments.
Finally

~~~
gen220
If you like the idea of find, but can’t be bothered to remember the args (me
neither), might I recommend `fd`! It’s like find, only faster and with a more
intuitive, grep-like interface

------
frabert
This sounds similar to the ideas found in PowerShell, which I really quite
like in principle

------
caymanjim
This is a shell for people who either don't know Unix or don't like Unix. The
normal Unix tools do all these things, often with less typing, and they're a
lot more flexible and transferrable to other environments.

I get that many Unix commands are obscure, and the syntax is often
inconsistent, but once you're familiar with them, you can do everything this
shell does and a whole lot more. And for anything more complex you've got a
million other languages to choose from.

~~~
liljencrantz
I am the author of Crush. I designed Crush for myself: a very happy Unix user
of 23 years who can honestly say he loves Unix and the Unix philosophy. As
such, I feel I can say with some authority that your first statement is
wrong.

The rest of your comment is subjective, but I find that being able to see e.g.
what user has used a lot of CPU and has many processes running via the command
`ps | group ^user cpu={sum ^cpu} proc_count={count}` is both useful and far
harder to do in a traditional shell. YMMV.

------
dimator
On further thought, the fact that running and integrating non-built-ins is so
clunky makes this less a shell and more just a shell-like programming
language, and at that point why not Python?

Like it or not, the Unix pipe model of command integration is what enables
programs to "know" about each other without ever really knowing about each
other. Crush looks like it's forgoing that.

~~~
mst
The author was very clear that they're focusing on the programming language
part first and the running non-built-ins part is just a proof of concept to be
fleshed out later.

------
hestefisk
I like these attempts at innovating the shell. Hardcore Unix users will say it
is practically impossible to replace universal text streams and sh. But what
if we had said the same thing before Unix was a thing? Then we wouldn’t
have come this far today. One advantage projects like Nushell and Crush have
is that they have second-mover advantage after seeing what PowerShell didn’t
get right. For instance whilst PowerShell is highly consistent, it’s also very
verbose as a result (unless you tweak it to mimic bash with aliases and then
you might as well use bash anyway). Another example is that, whilst the idea
of objects in pipes is theoretically interesting, it creates a lot of
complexity in creating meaningful pipes (it’s a lot faster to do xargs over a
series of words rather than remembering how to do a for loop in pwsh). Maybe
these people are striking the right balance with the flexibility of text and
the rigour of tables / columns.

~~~
majkinetor
All nonsense. Second mover? This would be more like the 100th mover... a
testament that it's very hard to maintain such a shell, which is the reason
nobody else has succeeded so far.

This is repeated ad infinitum by the anti-posh crew, while good arguments are
presented against that attitude all the time.

Also, xargs is simply awful compared to what posh gives; awful beyond
comparison.

~~~
hestefisk
xargs is quite powerful and, most importantly, simple.

~~~
majkinetor
Not simpler than the PowerShell way. And not even close in power.

------
chriswarbo
Since there are lots of alternative shells being mentioned, I thought I'd give
my tuppence. I want two things from a shell:

\- Easy process execution, including branching on exit codes, overriding env
vars, providing a working directory, etc. This must support concurrent
processes (e.g. pipelines).

\- Easy file and stdio management (i.e. piping). This must support streaming
(passing along chunks of data as they arrive, rather than gathering them into
big intermediate values).

Scripting language REPLs provide everything else I care about (string
handling, looping, variables, etc.). Their built-in support for these two
things is usually really bad, but libraries can plug that gap. I've tried a
few for Python, Haskell, Scala, etc., but the best experience I had was
Racket's shell-pipeline library
([https://docs.racket-lang.org/shell-pipeline](https://docs.racket-lang.org/shell-pipeline));
it's also used by the Rash shell, which I've not used
([https://docs.racket-lang.org/rash](https://docs.racket-lang.org/rash)).
This makes really good
use of Racket features, e.g.

\- Quasiquoting for mixing commands with normal functions in a simple, sane
way

\- S-expression syntax works well for shell-like code, e.g. no comma-
separation, distinguishing between (unquoted) symbols for names, flags, etc.
and quoted strings for data, allowing special characters in names (e.g. to
allow things like > in our pipeline), etc. Of course, Racket can be written
with I-expressions, sweet-expressions, etc. too, but I've never bothered.

\- Stdio redirection works with Racket's existing "ports" mechanism

\- Env vars can be managed using Racket "parameters" (dynamically-scoped
variables)

(Note that the API seems to have changed a bit since the version I used, so
I'm not sure how the new stuff like laziness fits in)

------
geophile
The improved-shell-piping-objects space is getting crowded. My entry is
marcel: [https://marceltheshell.org](https://marceltheshell.org).

Trying to create a combined shell/programming language is a mistake, in my
opinion. Marcel exposes bits of python on the command line, e.g. for writing
predicates. For more serious programming, marcel offers a Python API that
allows the use of shell commands inside Python, neatly integrated into the
language. For example you can write a series of commands, connected by pipes,
and the result defines a Python iterator so that you can use the pipeline
results in a for loop.

See the website for more details and discussions.

------
PaulHoule
Looks like a unix-ish take on PowerShell.

I love the idea behind PowerShell, but the verbosity turns me off.

PowerShell, of course, has the pipelines but also COM integration, which is a
big deal for Windows (you can script "the desktop" via COM; for instance you
could create a new instance of Microsoft Word and call objects inside Word to
do something to a document.)

Unix has dbus and some similar things, but nothing quite so critical as COM is
to Windows.

~~~
AnIdiotOnTheNet
The verbosity is one of PowerShell's best features. There is so much less
guess work about what a command or argument does, or whether it is '-a' or
'-A', or any of that garbage. You can usually guess how to do things because
there's a reasonable consistency to the Verb-Noun structure. The extra typing
is mitigated by everything being tab-complete including commands, arguments,
and even variables. Not to mention you can just alias commonly used commands
into shorter versions anyway!

Did I mention it also makes reading scripts at least 100 times easier?

~~~
PaulHoule
Around the time powershell came out I got into the (bad) habit of writing
super-concise bash scripts that write bash scripts, for instance:

    
    
       ls *.txt | awk '{print "mv " $1 " /somewhere/"}' | bash
    

You're really not supposed to do this because if your character escaping is
less than perfect somebody can corrupt the bash script that your script writes
(like SQL injection) but it is so much fun...
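For contrast, the injection-safe shape of the same move is to let the shell expand the glob itself instead of generating a second script; the directory and file names below are made up for the demo:

```shell
# Same bulk-move idea, but quoting-proof: no generated script, so
# filenames containing spaces (or worse) cannot corrupt anything.
dir=$(mktemp -d)
cd "$dir"
mkdir somewhere
touch "a file.txt" b.txt

for f in *.txt; do mv -- "$f" somewhere/; done
```

Less fun than a script that writes scripts, admittedly.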

------
HelloNurse
Publishing a project at this level of immaturity doesn't do it any good. While
it definitely "works", it works as an experiment about techniques to write an
interactive command interpreter in Rust, not as a shell for people to use.

Currently there is no documentation beyond the introductory GitHub page. But
this isn't a fork of sh that only needs documentation of the minor ways in
which it departs from POSIX compliance: there is an unknown set of internal
commands, each with a complex table format for its output (it is necessary to
know all relevant column names to do anything), which depends on a large set
of non-text types, which have complex object-oriented behaviour, with
"serialization" to a variety of file formats, with an unspecified script and
command line syntax.

Such complex features, on par with serious frameworks in full-fledged
programming languages, need to be defined and documented very thoroughly, or
nobody will understand, trust and use them. Unfortunately, there are many
hints that not only is documentation vastly behind schedule, but that
important aspects and practical considerations haven't been addressed.

For example: how can I quote file names containing nasty characters? How can I
use environment variables? What is my home directory? Why is executing an
executable program difficult, with complex and unclear mechanisms that mess
with the command line? Given a very simple and very portable traditional shell
script (e.g. ps -eaf | grep foo | sort) how can I do the same in Crush with
minimal effort? If I load a very large file, does the shell choke attempting
to load it all to memory, or does it process it in a proper streaming fashion?

Other issues are more fundamental. How can I perform relational queries
between multiple tables, e.g. a join between ls output and a file listing
users? If I can't, what can I do with tables that cannot be done more easily
and flexibly with awk, jq, and other established tools?

~~~
appleflaxen
I have zero interest in this project, but disagree that it "doesn't do any
good" to share. People with similar inclinations and interests can collaborate
on software that isn't ready, but that scratches their itch.

There's nothing wrong with that, as long as you don't create unrealistic
expectations.

~~~
HelloNurse
The problem with novel shells, compared with other kinds of itch to scratch,
is that existing shells have had about 50 years of evolution: doing better in
a few months of work is quite unlikely.

A new shell must be similar to existing ones for compatibility with an
enormous ecosystem of command line tools that work well in a typical shell; it
must do almost everything that traditional shells do equally well (this
includes documentation; documentation is non-negotiable) but it must also have
some compelling advantage.

The right way to collaborate on designing a shell is to discuss possible
designs of command syntax and behaviour, then if some promising idea comes out
of the brainstorming refine it to a full specification, then if the idea
survives (without mutating into a less ambitious scripting language, or a
library for an existing language or shell, or a small change to an existing
shell) begin an implementation. Starting with a proof of concept without a
well developed and well documented design is the wrong way to offer a
"product" but also the wrong way to learn.

~~~
da39a3ee
You either have no idea about open source software, or you have no idea what
it feels like to love programming, or both.

------
danielbigham
I love this. Yes, there are higher level languages that can be used for
accomplishing these things, but people find themselves at the command line a
lot, and having the ability to efficiently express your intent using a
familiar language, without getting handcuffed by a poor underlying
representation, seems very useful.

------
Wandfarbe
I'm struggling today as well with this shit:

I just wanna write a few small lines of a shell script which downloads stuff.

It is very, very stupid what loops you have to jump through to do this with
curl and bash script in a good way, including readability and error handling.
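For what it's worth, a minimal bash shape for careful downloading looks something like this (the function name and URLs are invented for the sketch): `set -eu` aborts on unhandled failures, and curl's `--fail` turns HTTP errors into nonzero exit codes instead of silently saving the error page.

```shell
set -eu

# Download one URL to a file; report and fail cleanly on any error.
# --fail makes HTTP errors (404, 500, ...) exit nonzero.
fetch() {
    url=$1 out=$2
    if ! curl --fail --silent --show-error --location "$url" -o "$out"; then
        echo "download failed: $url" >&2
        return 1
    fi
}

# Usage (hypothetical URL):
#   fetch https://example.com/release.tar.gz /tmp/release.tar.gz
```

Still loops to jump through, but at least they fit on one screen.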

Is a perl script really better?

Perl 5, 6 and 7 did not make it very sane for me; managing, installing and
working with perl is also less trivial than having a shell like bash, sh or
dash available to you.

Something in between bash and Go is missing; I like the approach of Crush. I
think that's the biggest mistake in bash: playing around with strings and auto
expansion and substitution etc. It's weird, it's error prone, and it doesn't
have a proper, clean, well-defined ecosystem.

~~~
adamnew123456
If you're open to learning it, Tcl is actually a good middle ground here. The
best way I can describe it is "structured shell script" \- still very string
oriented, but the syntax and command set are much more well thought out, and
it's better for building libraries in (having namespacing, among other
things). You even have a choice among object systems if you need one.

You don't lose much in terms of shell interop though - the native exec command
understands a lot of the basics (file redirection, pipelining, background
execution). In the simplest case, it's just one more word:

    
    
      exec wget $URL
    

You do have simple exceptions too, which terminate by default unless you catch
them (unfortunately, the stack traces aren't really useful). AFAIK exec throws
on a non-zero exit, and you can use it in your own code too:

    
    
      # Dead
      error "IO error"

      # Not dead
      if {[catch {error "IO error"} error_or_result]} {
        # The error
        puts "Caught: $error_or_result"
      } else {
        # The result
        puts "OK: $error_or_result"
      }

------
qchris
This looks to be in a similar vein to the Ion shell, which was developed for
Redox [1]. Both written in Rust, both not (seemingly) perfectly POSIX-
compatible, etc. I'm not sure I'd personally choose to go with either one over
shell if I knew anyone else was going to need to use it, but it's cool to see
iteration and experimentation on things that I usually consider to be the
fundamentals of the computing environment.

[1] [https://github.com/redox-os/ion](https://github.com/redox-os/ion)

------
black3r
if the only problem with PowerShell is that it feels clunky and annoying, why
not try to extend/improve it (as Kotlin did with Java)? Why start a new project
that will maybe catch up in X years?

~~~
kevin_thibedeau
PowerShell is tied to the CLR APIs. You shouldn't have to run a Mono instance
just to get basic shell behavior.

~~~
nickcox
Pretty sure you don't need Mono to use any modern version of PowerShell.

------
aniou
Nice idea that corresponds with my idée fixe: a sane, usable output format
for Unix utilities. There are differences, of course - I prefer a text-table
format as the universal panacea, but I'm glad that more and more people are
noticing the mess in the Unix environment ("simple" doesn't mean "consistent",
and even Plan9 suffers from that).

See also
[https://github.com/aniou/cof/wiki/Draft](https://github.com/aniou/cof/wiki/Draft)

------
iflp
Reminds me of elvish (elv.sh), which has a similar focus.

------
chrisweekly
Curious how this compares to Xonsh (IIUC, a pretty similar idea, though Xonsh
is actual Python, vs Crush using its own new lang).

------
AnIdiotOnTheNet
I really want something like this as a replacement for sh in tiny busybox-
backed use-cases, so I find it really annoying that developers insist I
install rust and compile it instead of offering a statically compiled binary I
can use.

And if it can't do that because of a reliance on shared libs, then that's even
worse!

------
blame_lewis
This looks absolutely brilliant.

------
antb123
I have recently been partial to Python and the sh library. This and the other
associated projects look interesting, but are they secure, and will they catch
on?

------
snarfy
The main innovation I see is tokenization into columns instead of character
streams. If I want to do that in a traditional shell, I use awk.
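
A rough sketch of that awk approach (assuming the usual long-listing layout,
where the permission string is the first field and the file name is the last,
and that names contain no spaces):

```shell
# Treat each ls -l row as columns: keep rows whose first field
# marks a directory ('d' prefix), then print the last field (the name).
ls -l | awk '$1 ~ /^d/ {print $NF}'
```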

------
kanobo
A better PowerShell? I'll take it! What's the story behind the name 'Crush'?

~~~
cies
cRUSh shares a few letters with Rust (the language the project is written in).
The author previously made Fish, another new shell+lang, in C++.

------
intelleak
Somebody just give me a Rust REPL already (one that isn't slow as snails like
evcxr)

------
sitkack

        crush> ls | where {type == "directory"}
    

Thank you!

~~~
betimsl
What is wrong with:

    
    
      ls -l | grep '^d'

~~~
geodel
It is minimal, so it does not meet the modern aesthetics of the computing world.

~~~
Riverheart
Until you want something more specific, like listing the absolute paths of
those directories without the metadata:

    
    
      ls -l | grep '^d' | awk '{print $9}' | xargs realpath

Or:

    
    
      ls -ld "$PWD"/* | grep '^d' | awk '{print $9}'

Guessing Crush could just add another property called Fullpath that you can
echo in the next pipe after filtering:

    
    
      ls | where {type == "directory"} | select %Fullpath

Maybe he adds a --dir flag:

    
    
      ls --dir | select %Fullpath

~~~
inshadows
    
    
      % find . -maxdepth 1 -type d | xargs realpath

How long have these tools existed? Over 20 years?
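
For what it's worth, the xargs/realpath step can be dropped entirely by giving
find an absolute starting point (a sketch; -mindepth 1, a GNU/BSD extension
like -maxdepth, keeps the starting directory itself out of the output):

```shell
# An absolute starting point makes find print absolute paths,
# so neither xargs nor realpath is needed.
find "$PWD" -mindepth 1 -maxdepth 1 -type d
```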

~~~
Riverheart
The example works, but you need to understand the idiosyncrasies of all the
tools involved, plus the text format. Once you're passing around structured
data and accessing it the same way everywhere, it becomes trivial to get at
the data.

~~~
inshadows
Yes, you have to learn the idiosyncrasies of the tools involved. My point was
that these are standard tools that have been around for more than 20 years, and
they work just fine. People should just RTFM and learn by experience instead of
inventing something new that looks shiny on the surface but almost certainly
has unhandled hidden edge cases and serious limitations.

------
tomerbd
For people who use PowerShell on Mac: as someone who has used neither, which
one would you recommend integrating into my workflow, PowerShell or Crush?

~~~
viraptor
Crush seems to be at a very alpha stage right now (no terminal support), so for
actual usage PowerShell may be your only current option.

------
cbreezyyall
Doesn't AWK already do all of this?

------
niek_pas
Has anyone got this to compile on macOS?

~~~
pritambarhate
It's not working for me either. I also ran the

    
    
      curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

command, which upgraded my rustc to 1.45.2 (d3fb005a3 2020-07-31).

This is one problem I have faced with Rust quite frequently: clone a random
(not famous) project from GitHub and try to run it, and it doesn't compile.

I love Java when it comes to this. It's easy to see which version of the JDK
is needed from the Maven file, and if you have that version of the JDK, the
project almost always runs.

------
ashishmax31
Neat! Definitely going to try this out.

