Hacker News
The case for Nushell (jntrnr.com)
105 points by lukastyrychtr on Aug 31, 2023 | hide | past | favorite | 141 comments


This may be obvious, but what I realized recently, when I tried to switch to one of the “alternative” shells (xonsh, based on Python), is that the experience of using a shell is (at least) four (overlapping) experiences:

1. The basic interactive stuff at the command-line: typing commands, tab-completion, aborting a half-written command when you change your mind or remember something else to do first, cut-and-paste, ….

2. Basic scripting at the command-line: writing a loop, piping one command into another, defining a function (over multiple lines?), ….

3. Scripting for job control: getting the return value of a command, waiting for a command to finish, putting one in the background, sleep, …

4. Programming as in other languages: if/else statements, assigning to variables, what arrays look like, doing more complicated logic and processing, ….

[There's some overlap, and note that (3) or (4) may happen either at the command-line itself or in a separate .sh (or whatever) file in a text editor.]

I found that `xonsh` seemed to be focusing on making (4) better—basically replacing the `if … fi` and `case … esac` of Bash/Zsh with Python—but it turns out I care about that less, and I'm too used to 20 years of (1) to give it up: when Alt-. doesn't insert the last argument of the previous command, or I cut some line(s) with Ctrl-k and Alt-y doesn't maintain the kill ring to paste them back (so they're gone??), and half a dozen other issues like that, it was annoying enough for me to switch back in less than a day or two. I don't think of myself as an old/long-time user (when I started in 2003 I was a newbie), but I guess I am one by now: don't mess with standard readline conventions. And (3) is core to why someone would use shell scripts; if you need me to "import subprocess" or do anything nontrivial to pipe (for example), you've already lost me.

My understanding is that fish has focused on (1) with good results, and the "ls | where size > 10kb" example here, or what Powershell is good at, is focusing on another (5) that I didn't write about (the commands themselves, the flags to them, parsing their output etc).

Anyway, interesting post, I'll try out nushell. As the post doesn't seem to link to it: https://www.nushell.sh/ (install it, and then start it with `nu`, not `nushell`.)


The issue I have is that I don't really get the value of solving (5) at the shell level: anything doable with richer datatypes is doable with a version of coreutils and peripheral utilities that use better serialization formats. And I'd rather the shell's piping be less intelligent in favor of offloading the intelligence to where it's easier for users to extend: external binaries on $PATH.


One of the other comments (https://news.ycombinator.com/item?id=37335912) mentions that it's possible to use `nu` for its (5) solutions without switching to it as a shell:

    nu --commands 'ls | where size > 1MiB'
So it seems possible to use nu and its coreutils equivalents / filters as an “island” having structured output and easy parsing etc, without switching one's shell.


I strongly agree, the dumbness of the Unix pipe is instrumental to its composability. The actual issue is the lack of an agreed-upon standard, default, stable, machine-friendly output format across the "give me a list of x" commands. I can easily imagine a parallel universe where df, ls, find, etc. would output JSON by default, and jq (or some other JSON manipulation utility) would be just as important as the shell itself.
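A sketch of what that parallel universe might look like. The `ls --json` flag is hypothetical and the JSON is hand-written for illustration; python3 stands in for jq as the generic query tool:

```shell
# Pretend this is the output of a hypothetical `ls --json`:
echo '[{"name":"a.log","size":20480},{"name":"b.log","size":512}]' |
  python3 -c '
import json, sys
# A generic structured query: no column counting, no whitespace splitting.
for entry in json.load(sys.stdin):
    if entry["size"] > 10240:
        print(entry["name"])'
```

The filter never touches the text layout of the producing tool, which is exactly the property the dumb-text pipeline lacks.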


I think there's a (6) here too: the terminal app that the shell runs in. I still don't fully understand how the two interact, but I've never been quite happy with any terminal I've used, and I wish someone would just marry the shell and the terminal so they can have nice rich integrations that aren't limited to text. Like why can't I just pop into a file picker to complete a path if I want to? Why can't I get a drop-down of options like in my IDE? Tooltips? Images? Why do I have to remember what magic keystroke I need to background a process without losing it forever so that I can run another one while I wait?


FWIW I think wezterm (https://wezfurlong.org/wezterm/) offers some hope in this direction.

I switched to wezterm (a couple of years ago by now, I think) in part motivated by its support for "Kitty graphics" protocol (and others) but also because its author seemed to be interested in pushing the envelope in terms(!) of what a terminal could be & do.

The recent addition of a Command Palette[0] really helps with discoverability of features and can also have user customised items added[1] to it.

Other recent additions include the ability to prompt the user to choose from a list of options[2] or input text[3].

This is barely scratching the surface of the feature set but you might like to take a look at wezterm to see if it might be closer to your terminal ideal or at least a project to watch that's innovating in the terminal space. :)

[0] https://wezfurlong.org/wezterm/config/lua/keyassignment/Acti...

[1] https://wezfurlong.org/wezterm/config/lua/window-events/augm...

[2] https://wezfurlong.org/wezterm/config/lua/keyassignment/Inpu...

[3] https://wezfurlong.org/wezterm/config/lua/keyassignment/Prom...


Interesting on (1), I’ve always used vi mode (like I don’t even know those commands, maybe they work in vi mode too?) and from memory Xonsh had as good a vi mode as bash along with some baked in improvements over vanilla bash.

I’m more or less waiting to move to Xonsh full time to see how I go once I’ve got some time to invest moving, largely because it seems it will be a win on all 4 points for me.

Otherwise, the bigger point 5 you allude to is probably the real issue we’re all avoiding.


That feature comes from emacs; hit ctrl-y to get ("yank") the last thing you copied, hit alt-y to get the previous thing you copied, and keep hitting alt-y in succession to cycle through the clipboard, which is called the "kill ring" in emacs.

I don't know if vim itself supports that feature, let alone vi mode on the command line. "Real" vim supports adding text to registers - does vi mode support it?


This is the first I’ve heard of a “kill ring” thing, and it sounds wonderful. Pasting in Vim is generally a PITA because I always cut some word, then delete something to make room for it, which is itself another cut, and then I have to remember what register the thing I want is in now, and how to get at it. Double quote and then 9? I can’t even remember and I’ve been using Vim every day for a decade now.


> Double quote and then 9?

Interesting ... I checked the help using :h " and got to the bottom of it.

So "0 is the most recent thing you copied - I use that all the time, it's very useful. Including ctrl-r 0 in insert mode.

The other numbers 1 - 9 hold the most recent things that were cut, either from a delete or a change, where "1 is the most recent, "2 the next most recent, and so on.

The trick, though, is that it has to be //more than 1 line// to automatically end up in a numbered register - unless you explicitly use a numeric register when cutting.


I set my vim clipboard to use the system one, and then use a system-level clipboard manager. It's great:

https://github.com/TermiT/Flycut


Yeah, vi mode is a must for me at this point. I have the readline shortcuts on a post-it on my monitor for the odd time i find myself in a sqlite repl or something.


5. it takes more than a day or two to change to something better or find workarounds to preserve the old habits in the new environment


FWIW, I like libedit's vi mode more than readline's.


Maybe I'm not using my shell to the fullest, but 95% of my commands are pretty simple, not complicated queries. `ls` might have a bajillion flags, but `-lah` goes a long way. Because of fish's autocomplete, I don't have to remember tar flags. That's the killer feature for me.

Along those lines, a quick way to drive adoption could be a huge "how do i do x" or recipes page to Ctrl+F through. If I have to search the internet for how to do x in nushell/fish/etc, I might as well stick to arcane bash - at least you know someone has had the same problem before.

Make it easy for me to get stuff done by showing me how to get stuff done, and I'll grok the shell bit by bit.

I did see https://www.nushell.sh/cookbook/ , but it isn't really a searchable reference. There's some setup stuff, other short pages with basic tutorials (?), a link to a repo with example scripts, and for one-liners you are instructed to join a discord. None of these are easily searchable.


> If I have to search the internet for how to do x in nushell/fish/etc, I might as well stick to arcane bash - at least you know someone has had the same problem before.

I feel the opposite way: if I don't really know Bash, I'm not throwing away a bunch of my hard-earned knowledge and experience to start again from zero by switching to Fish. It's just a lateral move, and an obvious win if the shell it takes me to is simpler, smaller, or otherwise easier to learn.


As long as you interact with others' software, the chances are, you will have to read bash in the future. So sure, you can learn fish, but you will likely have to learn some bash (or even sh) anyway. Might as well start from the common option.


I try to avoid bash because it keeps biting me whenever I try to learn it - `x=y` is different than `x = y`, for instance. I want this to die, and the only way for it to die is for people to stop using it.
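To spell out that particular footgun:

```shell
x=y        # assignment: the variable x now holds the string "y"
echo "$x"  # prints: y

x = y      # NOT an assignment: this runs a command named "x" with the
           # two arguments "=" and "y" (usually "command not found")
```

The whitespace sensitivity exists because bash can only tell an assignment from a command invocation by the absence of spaces around `=`.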

Yes, it'll never truly die, but that's also true of COBOL and m4 - and you haven't learned either of those languages I'd guess.


To be fair, that same argument could be made against any new programming language designed to replace an old, firmly entrenched one.


Sure, and that's why Zig or D are not in any of the top 10 lists.

You have to offer a lot of benefits to justify switching to a new default against the current critical mass.

(and this is especially true for shells: a large amount of code is written in sh because it's assumed to be installed on all users' systems; so unless enough software starts requiring "fish" that it gets into the default system install, we'll be seeing sh for a long, long time)


And it is, I use that argument regularly when people want to spin up a new project in a new language when the tooling doesn't support the new language and it's an uphill battle to make it work.


No, "lack of good tooling" is a completely different argument.

The argument put forth by the comment I replied to is that there is basically no point learning a new language, because to work with existing code you still have to know the old language.

Which I think is a massive fallacy. Yes, Kotlin programmers still have to understand Java to be effective. That doesn't make Kotlin useless.


Shell is pretty special though - it is a very frequent secondary language. In fact, even Kotlin itself uses bash:

https://github.com/JetBrains/kotlin/blob/master/scripts/buil...

So sure, replace Java with Kotlin, C++ with Rust, Python with go/julia/ruby/whatever.. but bash or sh is here to stay. (And that would be bash or sh, not fish or nushell or something...)


There is a 'Coming to Nu' section of the nushell book which covers 'how do I do x' coming from other shells: https://www.nushell.sh/book/coming_from_bash.html


> Along those lines, a quick way to drive adoption could be a huge "how do i do x" or recipes page to Ctrl+F through. If I have to search the internet for how to do x in nushell/fish/etc, I might as well stick to arcane bash - at least you know someone has had the same problem before.

https://github.com/tldr-pages/tldr


> Recently, I had a chat with some of my friends about Nushell and why they stuck with traditional shells like bash/zsh or the "new" hotness like fish rather than using Nushell.

Umm... because Nushell is pre-1.0, with major language changes every few weeks?

Living on the bleeding edge isn't everyone's cup of tea, and certainly not with something as fundamental to the system as a shell. Rest assured I will switch the moment 1.0 is released and a stability promise is published.

That being said, yeah – Nushell is the real deal. It's the Unix philosophy, except this time it actually works.


> It's the Unix philosophy, except this time it actually works.

How is Nushell compatible with the Unix philosophy? The core Unix philosophy is about enabling workflows that combine a set of independent tools in novel ways. Nushell implements a monolithic ecosystem with commands that support its fancy features, but using any external command not written for Nushell will be cumbersome at best, and incompatible at worst. This goes against the open philosophy of Unix, which is partly what has allowed it to grow and succeed.


> The core Unix philosophy is about enabling workflows that combine a set of independent tools in novel ways.

You mean like

    ls | rm
deletes the files in a directory?

Uh wait, that doesn't actually work.

Because the Unix tools actually don't implement the Unix philosophy at all.

Because as it turns out, text streams are not a universal interface after all.

Nushell succeeds where the coreutils have failed for decades precisely because its commands are designed to work together. They pass structured data around, which means that pipelines can actually work without requiring weird hacks to make one underspecified text format conform to the other. That is the Unix philosophy: Tools that work together.

But don't worry, the old Unix tools can be used from Nushell – and they work together just as poorly in Nushell as in every other shell, not one bit worse.


This is a strawman. It's trivially easy to do what you're describing.

    ls | xargs rm
And in fact, a better implementation is

    rm *
The Unix philosophy doesn't say "Every program interfaces intuitively and correctly with every other program". It says that programs should be able to work together. There are arguments that coreutils such as ls aren't following the Unix philosophy, but they lie much more in that ls has dozens of arguments than they do in your contrived example not working.


The fact that the first example has a bug (it doesn't handle file names with spaces correctly) sort of proves the point of the commenter you responded to.

I personally don't use nushell (or another alternative, oil) because bash for scripting and zsh for interactive shell is good enough for me, and compatibility with other people is valuable. However, whenever you start to write slightly complex scripts in bash, you have to think carefully about spaces in file names and other similar problems. Of course, you can argue you should just write those scripts in Python, but shell is a lot quicker to write and would be more elegant if it had fewer footguns and less esoteric syntax.
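A minimal demonstration of that bug, run in a scratch directory; the NUL-delimited `find -print0 | xargs -0` handoff is the standard workaround:

```shell
# scratch directory with a single file whose name contains a space
tmp=$(mktemp -d) && cd "$tmp"
touch "a file.txt"

# xargs splits on whitespace: rm is called with "a" and "file.txt",
# neither of which exists, so the real file survives (hence the || true)
ls | xargs rm 2>/dev/null || true
ls    # still prints: a file.txt

# NUL terminators preserve the name intact across the pipe
find . -type f -print0 | xargs -0 rm
ls    # (empty)
```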


`ls | xargs rm` is still wrong. Maybe you can accept the point...


We've had this discussion before[1]. :)

It's a huge leap to assert that Unix tools don't implement the Unix philosophy because a specific example doesn't work in the exact way you expect it to. You could just as well implement a version of `rm` that parses stdin in a specific way by default, but that would likely make it work for that specific use case, and not in the generic and composable way the tools are meant to be used.

The whole point of Unix is for tools to be independent of each other _and_ of the environment they run in, while giving the user the freedom to compose them in any way they need. What Nushell has built instead is a closed ecosystem of tools that interoperate well with each other, but not with the outside environment. This means that a) the ecosystem is not extensible unless one contributes to Nushell, or directly implements Nushell features, and b) external tools will always be second-class citizens, that might even be incompatible with Nushell.

To give you an example, how would I use GNU ls within Nushell? Suddenly, I can't use any of the `where` fanciness, and my entire pipeline breaks. I would have to resort to another Nushell-specific helper to integrate the command, which needs to be written in a very generic way to support a wide range of use cases, or switch my entire pipeline to use external commands only, which defeats the purpose of using Nushell to begin with.

This is a contrived example, but if you compound this with the amount of CLI tools that exist and have yet to be written, it's a usability and maintenance headache.

So I'm glad that Nushell exists, as it challenges existing notions of what a shell can be, but let's not disparage existing shells or Unix to prove a point. The Unix ecosystem is so widespread and successful today _because_ of these early decisions, and will more than likely continue to exist because of them as well. That doesn't mean that we can't do better, but I'd argue that a monolithic shell with a strict contract between commands is not the way to build a sustainable and future-proof ecosystem.

[1]: https://news.ycombinator.com/item?id=36706617


> That doesn't mean that we can't do better, but I'd argue that a monolithic shell with a strict contract between commands is not the way to build a sustainable and future-proof ecosystem.

I don't think it is far fetched to imagine that most popular command line tools will have support for JSON output within let's say 5 years. With that, I think nushell's value proposition becomes a whole lot stronger. Granted, there will never be a time when all tools integrate well, but I can see a critical mass evolve.


Sure, but why JSON? :) What happens when a new format comes along that is better, more efficient, or whatever? Would all tools then need to be updated to support it? What if a change in the format is introduced? Would all tools need to be updated, and have to maintain backwards compatibility indefinitely? This would be an even bigger UX problem, while also adding much more maintenance work for tool developers.

It might seem like unstructured text as an exchange format is a usability nuisance, but it is what makes independent and disparate tools work well together. Choosing a strict contract is only sustainable if a single project maintains all tools, and even then it's a huge maintenance effort.


> Would all tools then need to be updated to support it?

The effort to translate JSON to another structured data format is orders of magnitude easier than converting the unstructured mess we have now into JSON.

So, it does not matter if we choose JSON, XML, TOML, etc. Once we get structured data, then things progress quickly.

The nice thing about JSON at this moment is that it is a very common and easy to use format.


JSON seems to have established itself for this to a large extent already. I can imagine a binary format like MsgPack to work well for this too but I think JSON is good enough and popular enough that it could actually be embraced widely.


> Choosing a strict contract is only sustainable if a single project maintains all tools

You're obviously not working in the web space, where literally hundreds of thousands of sites have somehow successfully standardized on interoperating via JSON input/output for their HTTP API endpoints despite using many, many different technology stacks, over the last 10 years or so.

It isn't perfect, but it's easy to use and common and the tooling is mature.


I absolutely think that treating data piped from or to another process as JSON by default, and treating data going back to the terminal as text (by transforming the JSON to tabular data etc.) is the way to go (also because it's backwards-compatible). To the point that I wanted to write some wrappers for the standard commands that automatically did all this via `jq`. I know commands can detect whether they're being piped to another command or to a terminal, since certain commands will automatically skip ANSI coloring when being piped elsewhere...
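The detection mechanism mentioned is the isatty(3) check, exposed in shell as `test -t`; a minimal sketch of a command switching output mode on it:

```shell
# A command can check whether its stdout is a terminal or a pipe/file:
if [ -t 1 ]; then
  echo "pretty tabular output for humans"
else
  echo '{"output": "machine-readable JSON"}'
fi
```

Run bare it prints the human branch; inside a pipeline or command substitution, fd 1 is no longer a tty, so the JSON branch fires. This is the same mechanism tools use to skip ANSI coloring when piped.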


> I wanted to write some wrappers for the standard commands that automatically did all this via `jq`.

If you're not already aware of it, you may wish to check out `jc`[0] which describes itself as a "CLI tool and python library that converts the output of popular command-line tools, file-types, and common strings to JSON, YAML, or Dictionaries. This allows piping of output to tools like jq..."

The `jc` documentation[1] & parser[2] for `ls` also demonstrates that reliable & cross-platform parsing of even "basic" commands can be non-trivial.

[0] https://github.com/kellyjonbrazil/jc

[1] https://kellyjonbrazil.github.io/jc/docs/parsers/ls

[2] https://github.com/kellyjonbrazil/jc/blob/4cd721be8595db52b6...


This is interesting, but I like neither the TUI nor the fact that it's written in Python, lol.

My idea was to have namespaced wrappers for the commands which were designed to generate output as JSON and/or accept input as JSON. So for example "ls" would have a wrapper "qls" (queryable ls) or maybe "jsls" (json LS), etc. But in thinking about it, many questions remain- Would some of the structs or struct elements have types, like "file_path" or "file_name" for example?

Like for example they do:

> $ jc dig example.com | jq -r '.[].answer[].data'

and my API would be more like:

> $ jsdig example.com -- '.[].answer[].data'

(which would then pass those additional arguments to "jq -r" without having to pipe)

The core idea is that you can have seamless integration of structured pipe data with regular piped data without having to change shells.


Those commands that strip color very often get it wrong too. I have to force color so damn often...


I think standardized JSON output makes the value of nushell weaker, not stronger: more structure on stdin and stdout (e.g. switching them to unix sockets instead of fifos and passing FDs around) means the shell composing commands can be less sophisticated. The only way a monolith like nushell adds value is if you compare it to the current standard of unstructured bytes everywhere.


> What Nushell has built instead is a closed ecosystem of tools that interoperate well with each other, but not with the outside environment.

IMO "closed ecosystem" feels like a mischaracterization of the project--there's multiple features that are designed to support interoperability outside the nushell environment.

> how would I use GNU ls within Nushell?

To specifically answer this question, there are at least 3 possible approaches (including basic parsing of output based on the "Parse external ls command and combine columns for datetime" example[0]):

    # Option 1 (use `^ls`)
    $ nu --commands '^ls -lb | detect columns --no-headers --skip 1 --combine-columns 5..7 | select column4 column8 | rename size name | update size {|it| $it.size | into filesize} | last 5 | where size > 10KiB'

    # Option 2 (use `run-external`)
    $ nu --commands 'run-external --redirect-stdout "ls" "-lb" | detect columns --no-headers --skip 1 --combine-columns 5..7 | select column4 column8 | rename size name | update size {|it| $it.size | into filesize} | last 5 | where size > 10KiB'

    # Option 3 (use `--stdin`)
    $ ls -lb | nu --stdin --commands 'detect columns --no-headers --skip 1 --combine-columns 5..7 | select column4 column8 | rename size name | update size {|it| $it.size | into filesize} | last 5 | to nuon' | nu --stdin --commands 'from nuon | where size > 10KiB'
You could also use `jc` to parse the ls output to JSON & pipe that into nushell.

> I would have to resort to another Nushell-specific helper to integrate the command, which needs to be written in a very generic way to support a wide range of use cases

Well, one person would need to write that, once (or just use `jc`).

As a matter of interest, how would you do this example task ("display name & size for which of the last 5 files in a directory list are larger than 10KiB", I think) with bash/coreutils command pipeline? (Presumably some combo of `ls`, `cut`, `tail` and others?)

[0] https://www.nushell.sh/commands/docs/detect_columns.html#exa...


You're right, I'm being a bit harsh towards Nushell, partly because I don't see the problem these tools are trying to fix as particularly significant, and because the legacy Unix design decisions are what helped make it successful today.

My point is that an open ecosystem can't prosper if all tools are part of a single project, and they depend on each other to work. There's a good reason why even GNU tools, and even those part of coreutils, are not built with an assumption that they will be part of the same pipeline. All of them are independent, and "do one thing well", and how they integrate into a pipeline is entirely up to the user. All external tools that follow this same principle can generally be composed in the same way. The benefit of this is that the user doesn't depend on a single project, and each tool is easily replaceable by another.

Thanks for those Nushell examples. At first glance, they don't look readable or intuitive to me, but maybe it's because you're parsing the output of `ls`, which is generally a bad idea. I only mentioned `ls` as a contrived example, but imagine you have to integrate one or more external commands, and not just at the start of the pipeline, but somewhere in the middle. Doing those conversion steps from/to Nushell would become very cumbersome.

I wouldn't use `jc` or any structured exchange format for this task. `find` can do most of the legwork:

    find -type f -size +10k -printf '%k\t%P\n' | sort -n -k1 | tail -5
This doesn't show the results in a nice table as Nushell would probably do, but IMO it's much simpler, very clear what each tool does, and the pipeline is easily extensible at any point.


I think these are both orthogonal arguments. @p-e-w is arguing for tools that neatly and simply compose workflows, while you're arguing for independence of the toolset.

Let's explore further.

> It's a huge leap to assert that Unix tools don't implement the Unix philosophy because a specific example doesn't work in the exact way you expect it to.

Fair statement, but it's never just one example; there are tonnes that can be made. There are groups of tools whose sole function is to help stitch other commands together (i.e. getting ls and rm into the same workflow). It works, but the point being made is that it could be better.

> You could just as well implement a version of `rm` that parses stdin in a specific way by default, but that would likely make it work for that specific use case, and not in the generic and composable way the tools are meant to be used.

I would argue this is _exactly_ what's happened - just that it isn't only for rm, but for a whole common subset of tools, to bootstrap a new way of working. Your exact argument still works for nushell: if someone wants a new tool to replace rm (or the legacy one), they could still plausibly do that - and your point above would still hold, since it would completely defeat the purpose of nushell, as suddenly `where` doesn't work. That same argument applies to current shells, however: if I suddenly swap out rm, my scripts at some point are going to start failing. Unfortunately we're tightly bound to an environment and ecosystem based on legacy decisions. There are plenty of examples of how that's played out with technology in other ways (for good and bad) - the question should be: is this the right evolution?

> That doesn't mean that we can't do better, but I'd argue that a monolithic shell with a strict contract between commands is not the way to build a sustainable and future-proof ecosystem.

I would point out that no good reason has been given for the absence of a contract in existing shells. Or maybe we should observe that there actually is a contract, just a far less structured one: the contract is an agreed historical format that popular individual programs are expected to follow. Regardless, a contract isn't a bad thing, especially when it's done correctly to balance flexibility and structure.

I'm not convinced nushell is the way to go, but I would prefer the debate not to be muddled. Sure current shells have done fine. We expect future shells to do better. Will they imbue the Unix philosophy perfectly? Well like any good philosophy that will be up for debate.


> There are groups of tools whose sole function is just to help stitch other commands together

This is an example of the Unix philosophy working: replace smarts in the tools with smart compositions of tools; i.e. don’t make programs have to determine whether to read file names from arguments or stdin, use arguments by default and `xargs` to convert stdin into arguments.


Completely agree, but I think we get stuck in a loop at this point. Why don't we use the composition tools instead to take _structured data_ and make it print in a pretty fashion on screen, rather than pretty display data and have to compose tools to make it structured?

We've still custom created smarts in the tools to do something, it's just that those smarts are dealing with printing the data to console in a pretty fashion (which was the priority at the time) rather than the manipulation and handling of the core data they have been created for.

It's not necessarily a philosophy change, just a perspective/priority change. We less and less need to experiment and understand the systems, and more and more want to connect and streamline them.


You just can't parse the output of ls, and every bash guide will tell you that.

This means that ls has no place in shell scripts, unfortunately.
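The canonical counterexample is a file name containing a newline, which makes the `ls` listing ambiguous; a sketch in a scratch directory:

```shell
tmp=$(mktemp -d) && cd "$tmp"
touch "one"$'\n'"two"     # ONE file, with an embedded newline in its name

ls | wc -l                # reports 2: indistinguishable from two files

# NUL is the only byte that can't appear in a path, so it's the only
# safe delimiter; counting NULs here correctly reports 1 entry
find . -mindepth 1 -print0 | tr -cd '\0' | wc -c
```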


In that sense, perhaps that facet of capabilities of Nushell would be more comparable to a Foreign-data Wrapper for SQL rather than a shell with pipes.


So long as a tool can output a machine readable format, it can be consumed by nushell. If it can't then... that's actually a big problem when composing tools in bash too. Parsing arbitrary unstructured text is not good for composition and requires making something bespoke for each and every tool. As well as each and every argument for each and every tool for those arguments that happen to change the output in different ways.


> Parsing arbitrary unstructured text is not good for composition and requires making something bespoke for each and every tool.

Right, but therein lies the flexibility of using unstructured text as the exchange format. Each tool doesn't need to be aware of the format it needs to consume or output; it just cares about what makes sense for itself and the user. It's up to the user to determine the best way of composing each tool within a specific pipeline. This might seem cumbersome, but there are a limited set of ways tools can be composed, and generic helper tools exist for a variety of use cases (xargs, sed, awk, grep, cut, paste, etc.). If one tool breaks, then it's easy to either replace it, or fix the pipeline to support that specific tool.

In contrast, by making a strict contract of the exchange format, each tool needs to support a specific contract, and any deviation or update of it means that all tools need to be updated as well, while also maintaining backwards compatibility. This might be fine for a monolithic environment where a single project maintains all tools, but it's unsustainable if one wants to build an open ecosystem of tools that work independently, but can still be composed in any pipeline.


Except that means tools aren't doing "one thing" as in the Unix philosophy. Or at least, I know of very few Unix tools that actually do that. At a minimum they need to bundle an argument parser (or, more often, reinvent their own). They also need to do the same for generating output, often in multiple different display formats depending on the precise flags used.

And I think you're overstating the contract argument. If `ls` changed its output format in any way (unless behind a flag) it'd break a heck of a lot of bash scripts. With a well-structured output format, it's better able to maintain backwards compatibility while adding new features.
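A quick sketch of why structured output degrades more gracefully: a JSON consumer picks fields by name and ignores ones it doesn't know, whereas a column-counting text parser breaks as soon as a column is added. The "v2" record and its `owner` field are hypothetical, and python3 stands in for the consuming tool:

```shell
# v1 output, and a hypothetical v2 that added an "owner" field:
v1='{"name":"a.log","size":20480}'
v2='{"name":"a.log","size":20480,"owner":"root"}'

# The same field-by-name consumer handles both without modification:
for line in "$v1" "$v2"; do
  echo "$line" | python3 -c 'import json,sys; print(json.load(sys.stdin)["name"])'
done
```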


> If `ls` changed its output format in any way (unless behind a flag) it'd break a heck of a lot of bash scripts.

The `jc` ("CLI tool and python library that converts the output of popular command-line tools, file-types, and common strings to JSON") documentation[1] & parser[2] for `ls` also demonstrate that reliable & cross-platform parsing of even the current output can be non-trivial, depending on the filenames encountered, flags used & host platform--which means there's probably a non-zero number of bash scripts that are already broken without their authors knowing it.

[1] https://kellyjonbrazil.github.io/jc/docs/parsers/ls

[2] https://github.com/kellyjonbrazil/jc/blob/4cd721be8595db52b6...


> but using any external command not written for Nushell will be cumbersome at best, and incompatible at worst.

As someone who uses nu as his main shell for the past few years, this is not my experience. Nu has great tools for adapting programs not written for it. And if your tool already produces a standard format as output, like json or toml, you don’t even need those things.


>And if your tool already produces a standard format as output, like json or toml, you don’t even need those things.

So already we have two (not one) formats that nushell presumably has to support? Or am I not getting something?


"like" is doing some work here. nushell supports:

* 16 formats to convert from https://www.nushell.sh/commands/docs/from.html#subcommands

* 10 formats to convert to https://www.nushell.sh/commands/docs/to.html#subcommands

by default. I believe that via its plugin system, anyone can write a plugin to support whatever additional formats they want, but I haven't needed to investigate that deeply.

And that's if you want to support a format specifically. Otherwise you can do something similar to how you'd manipulate plaintext output on UNIX with other commands. For example, see this example of parsing the output of git log: https://www.nushell.sh/cookbook/parsing_git_log.html

The 'lines' command will take any text and produce a nu table where each row is a line. At that point the data has been converted; no specific "git log" format is needed.
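A minimal sketch of that pattern (nushell syntax as of recent versions; the column names here are my own choice):

```nu
# 'lines' turns raw text into a table of rows; 'split column' then
# extracts named columns from each row -- no bespoke parser needed.
open /etc/passwd | lines | split column ":" user pass uid gid desc home shell
```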


Okay. Sounds interesting, should check it out.


This makes a terrible case for Nushell as a shell. It makes a reasonable case for Nushell as a programming language.

If I want a programming language I use a programming language. A shell is an interactive environment and only a programming language by way of the Turing-complete nature of its grammar. However, if I am stringing more than a couple commands together something has gone terribly wrong.

The features I care about are features that make that interactivity better, not a type system.


>If I want a programming language I use a programming language. A shell is an interactive environment

That's the mindset behind us still having crappy shells...


fish proves otherwise.

the reason we still have bad shells is actually compatibility of scripts. so those people who prefer real programming languages are not the problem: they don't write shell scripts. those who do, and insist on being compatible with a bad syntax from 50 years ago, are.


If being a good programming language is what makes a shell good, use the Python repl


The false dichotomy is the issue. There's no reason (compatibility aside) a shell can't be both a great interactive environment and offer a great language (one 10x more well designed and consistent than what it offers now).


you just made the claim that people who prefer real programming languages are the reason for bad shells. if there is a false dichotomy, that would be it. sure it is possible to have both, but the focus of the shell needs to be on interactivity. i don't at all care if the shell has a good programming language. i only care about good interactive features.

there is a lot of overlap though, as a good interactive shell does benefit from a better language syntax and from things like types, arrays, etc.


>you just made the claim that people who prefer real programming languages are the reason for bad shells

No, I made the claim that people who think shells and "real (sic) programming languages" are distinct categories, and can't be otherwise, and use that to support the idea that current shells are fine, are the reason for bad shells.

>sure it is possible to have both, but the focus of the shell needs to be on interactivity

As if those are contradictory goals?

A shell can have a much much better real programming language AND BE interactive.

>i don't at all care if the shell has a good programming language. i only care about good interactive features.

Hence my false dichotomy complaint.


Uhm, so what about REPLs? Jupyter notebooks? Are they wrong?

What if the typedness of the shell made the interaction better? Can you make an argument that this does not hold? The article does make a good argument in favor


REPLs and notebooks and all the rest are great.

NuShell might be a great REPL, I don't know, don't really care.

The fact that no one uses a REPL as a shell, and that there's no demand to add direct exec() to REPLs, demonstrates they're different categories.

Let programming languages do what programming languages are good at. Let shells do what shells are good at. Please do not write programs in a shell. Do not create a half-assed programming language in a shell to encourage people to write programs in said shell.


Huh? Using a shell interactively is a REPL. REPL means Read-Eval-Print-Loop. That's what you do when you type commands at a bash/zsh/fish/nushell/... prompt. But it's also how you type commands at a gnuplot or Mathematica or APL or Python or Lisp prompt.

Some REPLs have nicer interactive features (such as completion) than others. Some REPLs have nicer or more powerful syntax or different data types. But as long as you have a Read-Eval-Print-Loop, it's a REPL.


The article admits basically exactly this, but also makes arguments as to why?

```

He proceeded to explain that he could recreate many of the techniques we showed as part of a C++ library that people could use. At this time, I wasn't sure how to respond other than "but you don't have to, we already built this language" but he couldn't be swayed. If it wasn't C++, he didn't want it.

Fast forward a couple years, and I'm standing in front of a JavaScript audience giving a similar talk, this time promoting TypeScript. I remember the kind of politely confused looks on people's faces as I showed off the features TypeScript offered. There was a similar sense of "why do we need to leave JavaScript?".

To answer whether Nushell can overcome this kind of inertia, I'll pose two questions:

    Is Nushell compelling enough for a single person to adopt it?
    Would adopting Nushell broadly as a community move the needle?
Let's tackle the first question. Time and again, as people try Nushell, they come back with quotes like "this is the most excited I've been about tech in 15 years". It has a fanbase that loves it, and that fanbase is growing. It reminds me of the early days of Rust, just after hitting 1.0.

To the second question: would adopting Nushell broadly actually improve things noticeably? Without a doubt. I say this without any reservation. Thinking of our shells as structured, interactive processing engines opens up the doors to a much wider array of things you can do with them. The commands would be far simpler than their POSIX equivalents and would compose far better. They'd benefit from the full knowledge of the data being shared between them. Adaptors could be made to connect to all parts of the system, allowing you full, structured interaction with everything you have access to.

```

I enjoy nushell, but I am far, far from the average user/target audience. I don't do sysadmin work or anything like it; I mostly use it to quickly navigate the filesystem and, with nushell specifically, occasionally to work with data I've gotten from other things. I haven't gotten quite to the point where I would write scripts in it yet, but at the point where I'm considering throwing up an .fsx file or going into a repl, I like that I might just be able to use nushell instead for the smaller tasks.

Is this useful? To me, sure, to everyone else, hell if I know.

edit-

Article even touches on this mindset:

> I don't need to do heavy data processing everyday, but it's nice to not have to shift what I'm doing at all when I need to do it. I don't have to download new utilities or switch languages. It's all right there. Need to write a script to load some files and handle some directory processing? Still right there. Need to throw together some web query that outputs the top download results for a github repo? You guessed it, all still right there.


The comparisons provided are borderline non-sequiturs

The C++ library and Chapel were solving the same problem, but Chapel (in the author's framing) did it better. JavaScript and TypeScript solve the same problem, but TypeScript (in the view of people who care about type systems) does it better.

Nushell is not trying to solve the same problems as a shell:

> Nushell is really an interactive, data-focused scripting language with shell capabilities.

Great, but that's not the problem I give a damn about. If I want an interactive, data-focused scripting language I will use Python. And against Python, Nushell is not nearly as attractive.

Nushell is trying to blend two domains, shells and programming languages, which I see distinct advantages in keeping separate. I do not want the world to be built on the back of shell scripts, regardless of how good you make the type system. I do not want to ask more of shell scripts, and I view that as a negative.


I use python extensively. I've used bash (+awk+xargs+sed...) extensively.

Nushell is already a great improvement over bash _as a shell_. It is even better when using it to compose _preexisting text based programs_. I would say it is better in every way I can think of, except for:

- not (yet) coming pre-installed, and

- stability of interfaces and language.

Its already better enough to be my default shell on my daily driver, though I keep bash around because some things really assume it. I very much look forward to one day having a userspace with no traditional shells at all.

nushell is not yet a strict improvement over python, but it might one day be, and it is already better at:

- munging text, json, dates and tables

- quickly creating nice CLIs callable from the shell (even if that shell isn't nu!)

- fun of programming in it
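A sketch of the kind of munging meant here (the endpoint is GitHub's public releases API; treat the exact pipeline as illustrative rather than canonical):

```nu
# Fetch structured JSON over HTTP, keep two columns, take the newest three.
http get https://api.github.com/repos/nushell/nushell/releases | select tag_name published_at | first 3
```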

> Nushell is trying to blend two domains, shells and programming languages, which I see distinct advantages in keeping separate.

Interesting, though, how many PL features the most popular shells tend to have...

> I do not want the world to be built on the back of shell scripts, regardless of how good you make the type system.

If I had read that before knowing nushell, I would strongly agree. Yet, it turns out you can make a shell so good I wouldn't mind if... not the world, but _a lot more_ was built on it.


My main criticism of nushell is that it seems to be more developed as a programming language than as a shell.

Looking at the bug tracker, someone asked 4 years ago how to separate multiple commands, as in `git pull; git push`.

It turns out, this is hard with nushell. At some point a ; syntax was added in nushell, but it is NOT the separator you expect. It's actually a strange cross between bash && and ;

    `echo a | grep b; echo ok` does NOT run the second command, so here nushell ; is like bash &&

    `false; echo ok` DOES run the second command, so here nushell ; is a plain ;

There is no exact && equivalent. There is no || equivalent syntax at all. And the closest form of a plain bash ; I could find is |1; (piping into a constant value, which guarantees a 0 return code, so that ; will never fail).
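For reference, the three bash behaviors being compared here:

```shell
# bash's three separators, which the parent says nushell's single ';' conflates:
false ; echo always        # ';'  runs the second command regardless (prints "always")
false && echo never        # '&&' runs it only on success (prints nothing here)
false || echo on-failure   # '||' runs it only on failure (prints "on-failure")
```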


which programming language doesn't allow multiple statements on one line? ok, there may be a few that strictly require linebreaks to start a new statement, and lisp, where everything is one huge statement with nested expressions. but not supporting this in a shell looks like a deal breaker.

the primary function of a shell is to execute a sequence of commands.


Today I learned (after disappearing down a rabbit hole after reading the linked article) that it's actually possible to begin to use & benefit from nushell's structured data pipe feature without changing one's current shell.

Structured data pipes have always been my primary reason for keeping an eye on nushell's development but after looking at the project's documentation again today it all still seemed "too much initial effort with uncertain outcome".

Because I don't want to switch my shell (not because bash is good but because it's not a priority to justify the expenditure of effort), I just want to have structured data in pipes within bash!

Turns out it's as easy as:

  nu --commands 'ls | where size > 1MiB'
(Where `nu` is the nushell binary being called from your existing shell prompt.)

Or, as a more complete flow-of-data example:

  echo "[1,2,3]" | nu --stdin --commands 'from json | to json' | cat
Now you can fit nushell within your existing workflow wherever it's useful enough for you--without needing to commit to changing your entire shell.

(And this isn't the only or necessarily the best way to arrange things for the communication with bash--there's "^" & "externals" & "command signatures" & "from ssv" etc too.)

And nushell does have some nifty tools such as `explore` with `:try` to interactively build a processing pipeline.

But this information doesn't seem to be documented anywhere in the "book" or other introductory material. It only seems to be documented in the help message of the `nu` binary--which I almost didn't even get as far as downloading today.

But then I found the help text in the source, so decided to try it again: https://github.com/nushell/nushell/blob/fd4ba0443d01e67f6304...

If structured data pipes are one of the main appeals for you, maybe try this approach out?


As a follow-up on this comment, I wrote up[0] some more of my discoveries with regard to using nushell structured data pipes from other shells (e.g. bash).

Another variant to allow bash piping but preserving (most of) nushell's data type is with the use of `from/to nuon` commands rather than `from/to json`:

  nu --stdin --commands 'ls | to nuon' | nu --stdin --commands 'from nuon | where size > 100MiB'
I also discovered an existing discussion[1] related to this topic which includes a link[2] to a "helper to call nushell nuon/json/yaml commands from bash/fish/zsh" and a comment[3] that the current nushell dev focus is "on getting the experience inside nushell right and [we] probably won't be able to dedicate design time to get the interface of native Nu commands with an outside POSIX shell right and stable.".

[0] https://gitlab.com/RancidBacon/notes_public/-/blob/main/note...

[1] "Expose some commands to external world #6554": https://github.com/nushell/nushell/issues/6554

[2] https://github.com/cruel-intentions/devshell-files/blob/mast...

[3] https://github.com/nushell/nushell/issues/6554#issuecomment-...


> And nushell does have some nifty tools such as `explore` with `:try` to interactively build a processing pipeline.

> But this information doesn't seem to be documented anywhere in the "book" or other introductory material. It only seems to be documented in the help message of the `nu` binary--which I almost didn't even get as far downloading today.

https://www.nushell.sh/book/explore.html#try-command


Yes, `explore` & `:try` are documented[0] but by "this information" I was referring to the `nu` binary's `--commands`, `--stdin` etc.

[0] When writing my original comment I had been going to link to the page you linked but decided it would just clutter an already long comment no one would read anyway. :)


> Turns out it's as easy as:

They should have led with that.

I ignored nushell only because of the limitation of using a non-bash shell with no way to sequence commands using ;, && and ||.


Nice, that's a great way to begin using it and testing the waters.


Nushell is part of the ongoing Visual Studio-ification of software engineering.

TypeScript and Powershell are preferences, not standards. If you want to use them, great, but it's wrong to say that these have evolved into de facto ways of doing things.

Also- that everything is text in Bash is one of its great strengths, just as Powershell's reliance on Objects was generally understood to be a weakness.

Also- I don't get the need for IDE support? Surely the point of a shell language is that it is designed to be used for a terminal?

As others have said here- this feels like a better case for nushell as a scripting language than nushell as a shell.


Could you expand on the "Visual Studio-ification" of software engineering, and why that would be a bad thing?


"Visual Studio-ification": Any technology that requires Visual Studio to be productive. The classic example would be TypeScript, but you can clearly see nushell's overtures to VS in the linked article.

A de facto requirement on VS to make a webpage is bad enough, a de facto requirement on VS to write a shell script is even worse.


Hover/go-to-definition seems more like a nice-to-have than a requirement. Either way, neovim support can be found here: https://github.com/LhKipp/nvim-nu


Do you mean VS Code or actual Visual Studio? As someone who has unfortunately used both.


It has nice error messages so you definitely don't need an IDE to edit scripts.


IDE support is required if you’re building up complex scripts/custom functions.


>Nushell is trying to answer the question: "what if we asked more of our shells?"

well apparently one thing you should not really ask is "can you activate a python virtual environment?". I tried out nushell for the first time a few months ago and that was the first thing I ran into. It is apparently still causing issues after four years[1], which to me doesn't really inspire confidence that this is a practical tool for people who want to get regular dev work done

[1] https://github.com/nushell/nushell/issues/852


I think nushell makes a pretty good case against itself.

Take ls: like the article says, GNU ls has tons of options to configure how its output is displayed. But that's because ls is a user interface for displaying lists of files. I'm not sure how the proponents of all these new shells keep missing this point. If you just want a list of files in a POSIX shell, you use a glob expression.

But whatever, nushell has its own builtin ls that is for getting lists of files. And then you pipe it into other commands to configure the output. Great. Can I get the output to be like 'ls -C'? Nope. Can I turn off colors? No. Can I add the indicator character like 'ls -F'? Not a chance.

Well, at least I don't have to pass '-l' just to see the file mode... actually I do, otherwise it would be too slow.

Many of the options that clutter GNU ls are generic options that apply to all tables, and it would be nice to abstract them. But many aren't; many are specific to what ls is doing, and it's hard for me to see how this could all be generalized in a way that applies both to ls and to grep.

For now nushell sidesteps the problem by not having its own reimplementation of grep; unlike ls, if you type grep you get real grep, and you can't pipe the output into anything meaningful.

The best I could do to replicate grep inside nushell was this:

     cat haystack | split row -r '\n' | enumerate | where item =~ needle
Aside from the shocking verbosity, I can't see any way to replicate the --color option of grep, or -A and -B.

The cop-out solution to this is to have builtins return objects (as in OOP objects, with methods), but then everything needs to be written in the same language.


Along the lines of what another user said, you can actually get this a lot shorter. given variables $haystack and $needle, this actually becomes

    $haystack | lines | where $it =~ $needle 
which, albeit maybe not as concise as the grep solution, to me feels good enough, especially when you have so many other tools to handle text (all of the str subcommands, splitting by row/column and then working on tables). also, admittedly, if you want to drop down to "normal" ls, you are one character away: use a caret and you can drop down to the usual bash/sh/whatever.


> If you just want a list of files in a POSIX shell you use a glob expression.

I think this sums up the problem. If you want a list of files for display you use ls, and if you want it for other purposes you use a glob; it seems much more elegant to have one way of getting files, and a separate way to format them for display if that's what you want.


That's already the case, just not the default. "ls -ld a*b file1.*" will use the common way of getting files -- the glob expression; and "ls" will only be used to format the file list nicely.

Except it turns out people don't like to type extra characters, so the most common case, "ls -d *", is just "ls".

Yes, shells (and command-line interfaces) have shortcuts to simplify typing. It's a feature, not a bug. I want to be able to type "ls -t" instead of "ls --sort=time", and any shell replacement which does not offer the former option is doomed to fail.
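To make the glob-for-data point concrete, a small sketch (the filenames here are made up):

```shell
# When filenames are data, iterate a glob instead of parsing ls output;
# this stays correct even for names containing spaces.
dir=$(mktemp -d)
cd "$dir"
touch 'a.txt' 'b with space.txt'
for f in *.txt; do
  printf 'found: %s\n' "$f"
done
```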


I think most people choose a shell not based on its elegance, but whether it allows them to get things done.


If you want to use grep in Nushell, you can use grep and pipe the output to `lines` to split the raw output by line.


Nushell is interesting. I tried using it as a scripting language (fish is better for interactive usage[1]), but I didn't get very far because it's constantly evolving and scripts that work today might not work next week. I can't commit scripts to a codebase if future users can't run them. So I end up using fish for scripting too.

[1] Not only does fish have the best out of the box experience (although that's reproducible with plugins in other shells), it's also old and common enough that tools usually support it alongside bash and zsh. Nushell is too niche and always changing for tools to support it properly.


>fish is also old and common enough that tools usually support it

for many years, more than a decade at least, this was not the case. it was a long and hard road for early fish users to get to that point.

and i'd make the claim that todays openness towards alternative shells that break compatibility with traditional syntax is in large part thanks to the groundwork done by fish.

that is to say that if nushell can present a compelling argument with its improvements, then there is hope yet. but for that to happen focus needs to be on interactive usage that is at least as good or better than fish and not just focus on better scripting.

there is a place for better scripting. gluing together multiple tools where input, output and arguments to commands matter is something normal programming languages don't do well.


I agree with you overall. Still, the reality today is that Nushell has a long way to go in terms of support.

I think it's hard for Nushell to do something for interactive usage that would displace fish for me, but I'm always open to exploration.

The thing that resonates the most with me is the need for a better scripting language. A "proper" language, that worries about types, not just stringly typing everything. I think Nushell is making good progress on that, and the main reason I don't use it today is because it moves and breaks too fast for my taste.


Am I the only one who uses a shell primarily as an interactive interface, not as a programming language? Why do all shells seem to focus on the latter aspect?


They don’t. Murex (my shell, https://GitHub.com/lmorg/murex ) is all about the interactive environment.

Elvish is too.


There is an overlap. I often write for-loops in bash interactively for example. The ideal shell works well for both, but bridging this gap is a challenge.


Because the biggest problems with shells are when people think "these commands work for me; I'll just save them in a file".


The case for Nushell is weak because shells are not an ideal environment, but rather a lowest common denominator by today's standards that has stuck and hasn't disappeared. What we need is a better interactive environment interface. It baffles me that something like Pharo's interactive environment hasn't been copied all over the place. I understand it's largely because it's a self contained environment that it is able to afford the functionality it has, but there are lessons in there worth learning from.


>The case for Nushell is weak because shells are not an ideal environment

Not sure what "an ideal environment" even means, but shells have tons of very nice features. The problem is they also have a lot of legacy bs and tons of footguns.


> Not sure what "an ideal environment" even means, but shells have tons of very nice features.

If you haven't, I would encourage you to try Pharo. It's a Smalltalk dialect but you program through what feels like an operating system. The richness in interactivity and feedback is unparalleled. This is what I would consider an "ideal environment." Shells are useful tools, they wouldn't have the ubiquity they do if that weren't true. That being said, there's definitely more interesting and powerful paradigms out there for interacting with an operating system at a lower level.


My years doing hardware development and driver programming in college taught me about the evils and endless frustration of invisible, un-resettable state. "Why did my driver work the first time but exactly the same code fails now? Which of hundreds of hardware registers did I forget to reset _this time_?"

Same thoughts appear when I look at Smalltalks... OK, I can edit any function or variable, including a system one, and the changes would be saved automatically in an opaque persistent "image". So if you mess up a system function, your whole system is unusable and you have to restore from a snapshot; and even something as simple as a program which continuously pops up message boxes can make the system unusable (and again, require restoring from snapshots). It's like Windows 98 but with no reboot option!


I don't see how this would be any different from messing around with a Linux system and slinging off `sudo this` and `sudo that` here and there. Though you do pose a very good point. I've always maintained that version control is not something that should be just for managing software projects. I think it should be a core system feature and potentially a feature of programming languages as well. If backwards compatibility is needed, you should be able to import a module from a particular revision so you can actually delete old code without the nasty ramifications from doing so.


The key is the filesystem, which gives you a lot of help with disaster recovery. We are so used to it that we take it for granted, but those features are real and are not present in all environments:

- The filesystem is distinct from memory; you need an explicit action to write to it. This allows programs (or entire systems) to crash and be restarted, in which case all volatile state will be rolled back to a (hopefully) good one. This even helps with long-term problems -- for example, one of my computers right now has a broken VPN that will likely fix itself upon restart. Still, I am hesitant to restart, and I am safely using that very machine to write code, because I am pretty sure that fixing my VPN problems will not damage my code files.

- If you're working on "foo", most people will have a directory like ~/project/foo, and all related files will be in that location, and this location has no system files. As long as you keep using common practices, each of your projects is easy to separate from other projects or from system files.

- You can copy the files any number of times, and because of the directories, the copies will not interfere with the original. If I copy ~/project/foo to ~/project/foo.old, there is no chance that my project in ~/project/foo will suddenly pick up a SomeClass from ~/project/foo.old.

- Even if you completely killed your system so that it no longer boots, you can still take the filesystem, mount it on a working system and get your user files off. This is as simple as booting from a rescue USB stick, or removing the disk from one PC and putting it into another. And seeing the folder structure lets you easily say "this is ~/projects on the external drive, this is ~/projects on this machine, and this is how I copy files from one place to another". (There are exceptions to this; nothing will save you from "rm -rf /" or "dd of=/dev/sda", but most system damage is not that blunt)

Many of those seem like obvious things that every computer has, but they are not - image-based systems, and their approximations (like cellphones, especially older ones like Symbian), are missing them.


Thank you for pointing out Pharo! My mind is blown, and I can’t sleep!


Nushell is a great improvement in terms of shell design, and it really does feel like a modern, better equivalent of the shell.

However, I tried using it twice as my main shell and had to give up both times. The first time, I experienced a handful of bugs. The most annoying one was that I couldn't copy-paste passwords into the shell because Nu kept inserting control characters. Then I kept having trouble with other programs that rely on the shell - for example, "conda init" does not work with nushell, nor does Julia's shell mode work with nu. So practically speaking, I had to continually boot up zsh inside Nu.

Then, there are the constant breaking changes. It feels like every month there is some breaking change, and every few months I need to dig into my nushell setup scripts to unbreak half my shell tools because some syntax changed.

Yeah, no thanks. Hopefully someone else will do the annoying work of test running Nu for 5 more years until it's stable and integrates well, but for me, I went back to zsh.


Do you remember what issues you had with Julia's shell mode when using nu? A (very) cursory look at the Julia code doesn't reveal any obvious problems, but there's probably some hidden assumptions (possibly around POSIXness) that's causing troubles.


I guess we all appreciate the traditional Unix shell as an incredibly stable tool over the last 50 years (!). But let's face it: it's hard to justify that a kid born in this millennium should learn & use this antiquated & obscure software.

My fear is that the success of those YAML-based Ansible or GitHub Actions tools is the result of people being afraid or tired of writing shell scripts.

But let's face it, those are abominations, because they lock you into a platform and you end up drowning in way too many YAML files.

So yes, I believe we need an ambitious update to the Unix shell, and the Nushell team did a pretty good job. If we don't, we will have to deal with many of those "automation" platforms & their poor configuration "languages".


This post implies the decision is between nushell, MS Powershell or something based on shells from the 70s. But there are a plethora of other shells like nushell, that have been around for longer than nushell.

It’s actually a pretty crowded market space.

So why are people still using bash? In part because it’s a default (defaults are hard to change) and in part because it’s largely “good enough”.

It’s the same reason JavaScript is everywhere. First because it was the default in browsers and then because it was good enough that people didn’t want the effort of having to learn nor use another language.


Like what? I’d love a stable nushell alternative.


Check out my entry, marcel: https://marceltheshell.org.

E.g., find the newest vlc instance and kill it (a command that an acquaintance needs frequently, for some reason):

    ps | select (p: p.name == 'vlc') | sort (p: p.create_time) | tail 1 | (p: p.signal(9))
- ps produces a stream of Process objects.

- select binds each Process to p, and then selects those whose name is vlc. (Anything in parens is a Python function. You can include or omit "lambda".)

- sort by create time (using a Python function to specify the sort criterion).

- tail 1 as in Linux, to get the newest qualifying Process.

- Run the function p.signal(9), which kills the Process.


Stable is a problem because a lot of these shells don’t offer any guarantees for breaking changes.

My own shell, https://github.com/lmorg/murex is committed to backwards compatibility, but even here there are occasional breaking changes. Though I do push back on such changes as much as possible, to the extent that most of my scripts from 5 years ago still run unmodified.

In fact I’ve started using some of those older scripts as part of the automated test suites to check against accidental breaking changes.


I daily drive nushell - combined with wezterm it's great for a completely cross-platform setup. There are definitely usability and stability issues though, to the point where I wouldn't yet recommend it to anyone who isn't ready for some pain. I've had lost data issues from buggy 'mv' for example, basic things like '&&' or '||' not working, etc. I avoid ever updating because I need to spend an hour or so fixing scripts. But the trade-off is that you get really nice error messages and working with structured data can be a breath of fresh air. Once it hits 1.0 I think it'll be an easy choice over existing traditional shells.

It does make me wonder though - at what point does it make more sense to just add some shell-like commands and functionality to a typed language's REPL? I played around with this idea using F# and FSI. The composition (|> instead of |) maps pretty directly and you get the benefit of types but in a more sane language (sorry nushell). You'd just need to add a 'cd' to mutate pwd state, pull in FAKE core and alias the most common shell operations.


Apparently this is a thing in haskell already, so F# isn't too much of a stretch: https://hackage.haskell.org/package/turtle



I love using nushell, and I hope it becomes a standard going forward. I do sometimes feel slightly annoyed that I have to type out a bit more in nushell, as opposed to simply tacking on another flag, but I think JT makes a really great case for doing just that. In terms of adoption, JT asks:

1. Is Nushell compelling enough for a single person to adopt it?

2. Would adopting Nushell broadly as a community move the needle?

I'm 100% on board with nushell in my personal computing. But I'm still worried about number 2. I think something like Rust was able to succeed because enough people really felt like there was a need for something to fill that space. Do enough people feel like nushell fills a necessary gap? I hope so.


Nushell is definitely interesting, and I'm glad it exists, but I'm not using it because of at least a couple of major reasons:

- It reimplements all commands. You're not actually using GNU coreutils, but Nushell reimplementations of ls, rm, etc. See [1].

I understand why it's done, but this has many drawbacks:

1. All existing shell examples and documentation spanning decades are useless. Nushell commands try to replicate existing UIs, but they invariably must be incompatible to support all features and deprecate unnecessary ones, so the only valid source of documentation is Nushell itself.

2. Nushell-specific bugs[2,3]. This goes without saying, and will improve as the project matures, but experiencing bugs with core commands is jarring. I've never had that with GNU tools.

3. Unsupported commands won't have any of the benefits, and you'd still have to parse their output in the traditional way, or use a Nushell helper. How often this happens will depend on your use case, but you'll always depend on Nushell for integrating that command in your workflow.

- Related to point 3, Nushell is building an entirely separate monolithic ecosystem, which goes against the open ideology of Unix. This is similar to BusyBox, where you always have to be aware you're not using GNU tools, but some slightly incompatible equivalents.

More importantly, this approach doesn't scale. There will always be new commands that target POSIX shells, but not many commands will be written specifically for Nushell.

I appreciate what projects like Nushell and Murex are trying to address, but having a saner scripting language and passing structured data in pipelines is not worth the drawbacks for me.

For one, Bash scripting is not so bad if you set some sane defaults and use ShellCheck. Sure, it has its quirks, but all languages do. Even so, the same golden rule applies: use a "real" programming language if your problem exceeds a certain level of complexity. This is relative and will depend on your discomfort threshold, but using the right tool for the job is always a good practice. No matter how good the shell language is, I would hesitate to write and maintain a complex project in it.
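To illustrate the "sane defaults" point: this is a minimal sketch of the preamble commonly recommended for Bash scripts (the `greet` function is just a hypothetical example to hang it on):

```shell
#!/usr/bin/env bash
# Common "sane defaults": abort on errors, on unset variables,
# and on failures anywhere in a pipeline.
set -euo pipefail
# Avoid accidental word-splitting on spaces.
IFS=$'\n\t'

greet() {
    local name="$1"
    printf 'hello %s\n' "$name"
}

greet "world"
```

Running the script through ShellCheck on top of this catches most of the remaining classic quoting and globbing mistakes.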

And for general QoL improvements with interactive use, Zsh is a fine shell, while still being POSIX compatible.

[1]: https://github.com/nushell/nushell/blob/main/crates/nu-comma...

[2]: https://github.com/nushell/nushell/issues/5027

[3]: https://github.com/nushell/nushell/issues/9310


> It reimplements all commands. You're not actually using GNU coreutils, but Nushell reimplementations of ls, rm, etc. See [1].

Yeah, ls being called ls despite not being coreutils ls is both a pro and a con. If it were called lf, for "list files", I think it would actually be a little easier to adapt to. Because I usually use ls with flags that don't exist in nushell, which obviously fails. It feels strange to borrow the name without being compatible.


If you want to invoke an external program rather than a built in, it is as easy as prepending a ^.


`^` is a pretty inconvenient character to type every time you need such common commands.


Sure, the idea is that that would be the exception, not the norm. It is an escape hatch not a standard workflow.
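For example (the flags passed to the external command are whatever your system's ls supports; only the `^` prefix is the Nu part):

```nushell
ls        # Nu's built-in ls: returns a structured table
^ls -la   # the external ls binary, with its usual flags and plain-text output
```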


You may be interested in https://www.oilshell.org/. It has many of the same end goals as nushell with more of an upgrade path from bash planned. You can start with full compatibility, and then turn on shell opts as you get comfortable to enable new behavior and disable bad old practices.


Thanks. I'm familiar with Oil, and appreciate its gradual approach to compatibility, but I'm not the target user for it.

> It's designed for programmers who know Python, JavaScript, or Ruby, but avoid shell.

I know Python and JS, but I don't avoid shell. :)

I differ with the Oil authors in that I don't think that my shell should be a programming language. The warts in shell scripting are not insurmountable, and for the simple use cases that it's meant to be used for, it's perfectly adequate. For anything beyond a comfortable level of complexity, I reach for a proper programming language. This approach has served me well over the years, and I don't experience any of the compatibility drawbacks of using an alternative shell, or maintenance drawbacks of using a niche shell-oriented programming language.


> Nushell is building an entirely separate monolithic ecosystem, which goes against the open ideology of Unix

I wouldn't entirely agree with that (but I am biased, I work on Nu). One of the nice things about Nu is that it's trivial to bring in data from elsewhere; if other tools speak JSON, XML, TOML, CSV, etc. it's easy to bring that into a Nu pipeline.
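A sketch of what that looks like (the file name `data.json` and its fields are hypothetical; `open` infers the format from the extension):

```nushell
# load structured data and query it like a table
open data.json | where age > 30 | select name age | sort-by age

# output from external tools can be parsed into the same pipeline
^curl -s https://example.com/api.json | from json
```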


> I appreciate what projects like Nushell and Murex are trying to address, but having a saner scripting language and passing structured data in pipelines is not worth the drawbacks for me.

Murex doesn’t change any of the coreutils. With murex you can use the command line as dumb byte streams with the standard GNU / BSD utils if you want…and in fact I do just this myself. What Murex offers is an abstraction on top that allows you to add type hints to those byte streams so you can do fancier stuff with them if you want. Think of it like what TypeScript does for JavaScript. You have access to all the same POSIX utilities, plus additional type-aware controls on top, but it’s completely optional whether you use those type-aware controls (personally, I think once you do, you wouldn’t go back, but I respect people’s personal preferences).


Ah, thanks for clarifying. That's actually a better approach, IMO.


> The truth is, in 2023 if someone asked us to design a system, we wouldn't design POSIX. If, in 2023, someone asked us to design a shell language, we wouldn't design bash/zsh. This matters.

Indeed, clinging to the awful design choices of the past prolongs misery for all future generations.

> [bash] Pros: it's everywhere. Learn once, run anywhere

not on native Windows, unlike, say, Python-based shells such as xonsh, or even pwsh


Maybe a hot take, but copilot CLI has obviated the need for this kind of thing for me personally. Most of my shell commands are incredibly simple, and now with copilot CLI, I have an 80% head start on anything more complex. In fact sometimes I will just type "!! <sloppy command>" and it will autocorrect to basically what I was going for without any further prompting.


The case for any other shell: ls -R and find exist and work. Nushell doesn't cover that. I like the idea, but the execution has such glaring holes that I had to go back to fish 1 min after installing nu. (There is a glob ** that doesn't have any columns, and I don't see a way to compose it with ls; ls ** doesn't work.)


I tried it briefly and the fact that find means grep and so on will be a tough nut for me to crack, learning-wise. I understand that if you start from a clean slate, you want to make the experience internally consistent. But how do you as a user manage, when your muscle memory is used to something else?


>But how do you as a user manage, when your muscle memory is used to something else?

Like with anything else new: if you find it worth it, you develop new muscle memory.


Just don't see it. I'd need at least a 10x improvement to switch, and that means fundamental changes to interacting with processes, as I'm pretty surgical with bash, find, awk, sed. A shell isn't 10x by being a union of all the best tools; it needs to redefine them. And 'where' doesn't come close.


I usually just use the Julia REPL as my “shell” (my terminal emulator is configured to open it directly).


Do you use something like Shelly [https://github.com/umlet/Shelly.jl] ?

Also, do you not often need to pipe one command into another? Afaik the Julia REPL's shell mode doesn't allow pipes, redirects, etc. Do you straight up do `run(pipeline(...` for those cases?
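For reference, composing external commands from plain Julia does work, just more verbosely than in a shell (a sketch; the commands themselves are arbitrary examples):

```julia
# roughly `ls | grep toml` in a POSIX shell
run(pipeline(`ls`, `grep toml`))

# capture the result as a String instead of printing it
out = read(pipeline(`ls`, `grep toml`), String)
```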


I’m a minor contributor to Nushell. It’s a great community working together, and I really enjoy developing a shell in Rust; it’s much easier than the old C/C++ shells. It also has a native integration with polars (dataframes) that is very powerful.


The single greatest thing someone could do to break the stranglehold of ancient shells is to make a language that compiles to both posix and their new shell syntax. Make it so people can write once and run on both old and new shells.


I can’t live without job control


> Nushell is designed to be a language

Last time I checked it didn't have functions. Has that changed?


It does, see https://www.nushell.sh/book/custom_commands.html

    def greet [name] {
      $"hello ($name)"
    }
They were added in 0.25 in January 2021 (https://www.nushell.sh/blog/2021-01-05-nushell_0_25.html#cus...)


Isn't this a repost?



