Hacker News

Sure, but the reason it works with any tool is because it's generic and simple. If each tool had to implement anything more sophisticated, like a JSON parser and serializer, it would be a nightmare to maintain. Projects like Nushell essentially need to handle every type of output from any command, and every type of input to any command, which is an absurd amount of work, and just not scalable. Subtle changes in this strict contract mean that pipelines will break, or just not be well supported[1].

If programs simply input and output unstructured data, it's up to the user to (de)structure this data in any way they need. The loose coupling is a feature, not a bug.
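For example, a pipeline like this leaves all the structuring to the consumer. The producer knows nothing about the consumer; the user decides what the columns mean (the data here is made up for illustration):

```shell
# The producer emits plain text; the consumer decides how to slice it.
# `printf` stands in for any command's column-oriented output.
printf 'alice 42\nbob 7\ncarol 99\n' \
  | awk '$2 > 10 { print $1 }' \
  | sort
# → alice
#   carol
```

Swap `awk` for `cut`, `sed`, or anything else that reads bytes, and nothing upstream or downstream has to change.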

[1]: You can see this in Nushell's issue tracker[2]. I'm not judging the number of issues, as any healthy OSS project will have many issues, but some of these are critical bugs related to command handling and interop. I'm not blaming the Nushell team either, my hat's off to them, but just pointing out that the nature of the project will inevitably lead to a never-ending stream of these types of issues.

[2]: https://github.com/nushell/nushell/issues




> Projects like Nushell essentially need to handle every type of output from any command, and every type of input to any command, which is an absurd amount of work, and just not scalable.

I think you're misunderstanding how Nushell works. They don't parse outputs, or generate inputs, from/for standard Unix commands. Instead, they implement their own commands with the same names as standard commands, and generate/consume structured data by default, using the same data structures everywhere. There is only a single implementation of those data structures. That's very easy to maintain.

So running `ls` from Nushell does not shell out to the `ls` program on your system and then try to make sense of its output. It runs a Nushell-internal command that is tailored to the kind of pipelines that Nushell is built around. They already have hundreds of such commands implemented and working, and that approach absolutely does scale. Whatever issues may remain, it already works much more reliably than the default Unix tools.

Saying that unstructured text streams are a universal interface is like saying that atoms are a universal construction kit – it's technically correct, but pretty useless in practice.


You're right, I misunderstood the way it worked. But I'm not sure that approach is better. They either need to maintain full compatibility with existing tools, or users need to learn the idiosyncrasies of Nushell tools. And commands not reimplemented by Nushell wouldn't work, or they would need some kind of generic wrapper, which would have the drawbacks I mentioned.

But, hey, this obviously has users who prefer it, so if this works for you, that's great. Personally, I'll stick to the standard GNU and POSIX tools. I do concede that this is partly due to the robustness of this ecosystem and my familiarity with it, which is hard to abandon.

> Saying that unstructured text streams are a universal interface is like saying that atoms are a universal construction kit – it's technically correct, but pretty useless in practice.

My point is that offloading the decision of how tools are integrated, beyond raw byte streams, to the user is the most flexible and future-proof approach, with the least overhead for individual tools. Doing anything more sophisticated, while potentially easier for the user, would require maintenance of the glue layer by each tool's developer, or by a central maintainer à la Nushell. This loose coupling is a good thing.


> They either need to maintain full compatibility with existing tools, or users need to learn the idiosyncrasies of Nushell tools.

The existing tools aren't fully compatible with each other either. There are significant differences between GNU and BSD tools, for example, and yet more differences with BusyBox and others. The idea of "standard" tools is unfortunately an illusion, so not much is lost there.

But more importantly, most of the many options the traditional tools offer are related to output selection and formatting. In Nushell, those problems are solved in a unified way by piping to builtin commands that work with structured data. So instead of learning twenty different cryptic flags for `ls`, you just learn three or four postprocessing commands, and use them for `ls` and everything else.
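The same principle shows up even in a traditional shell: a generic postprocessor can replace a tool-specific flag. A rough sketch (the `awk 'NR > 1'` just drops the `total` line that `ls -l` prints first; output depends on your directory):

```shell
# Instead of memorizing `ls -S` (sort by size), learn `sort` once
# and apply it to any column-oriented output.
ls -l | awk 'NR > 1' | sort -k5,5n   # column 5 of `ls -l` is the size
```

Nushell takes this further by making the columns structured and named rather than positional, so the postprocessing commands don't depend on fragile column numbers.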


Both GNU and BSD grep input and output a stream of bytes.

Without ever having tried it, I know that one random day, when I for whatever reason have a wish to take the output from BSD grep and send it over TCP through netcat, to be collected by zsh's built-in TCP support and fed into GNU grep, it will work. No piece along the way made any jerk assumptions or required any jerk tight coupling, and the BSD and GNU tools were completely compatible.
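A sketch of the same idea, with a FIFO standing in for the netcat/TCP leg (any channel that moves bytes would do; the log lines are made up):

```shell
# First grep writes into an arbitrary byte transport; a second grep
# consumes from the other end. Neither knows or cares what's in between.
dir=$(mktemp -d)
mkfifo "$dir/chan"
printf 'err: disk full\nok: boot\nerr: net down\n' \
  | grep 'err' > "$dir/chan" &    # producer: first grep, writing end
grep 'disk' < "$dir/chan"         # consumer: second grep, reading end
# → err: disk full
```

Replace the FIFO with `nc`/`ztcp` on two machines and nothing about either grep changes.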

That is more valuable, and a greater convenience, than any number of poorly conceived ideas to make it more convenient.

This is all 50 years after Unix pipes were invented, and in an environment the inventors did not even try to predict and handle. Instead they handled infinity by not trying to predict anything. They just made useful low-level tools which you assemble however you may turn out to need to, and the tools make as few assumptions as possible, since assumptions all eventually break. The hammer doesn't only work with one kind of nail.


> The existing tools aren't fully compatible with each other either.

Right, but those incompatibilities, as well as the way commands interoperate, are left to the user to resolve. No monolithic tool could realistically make that easier, unless they reimplement everything from scratch, as Nushell has done. But then you have to work with an entirely different and isolated ecosystem, and you depend on a single project to maintain all your workflows for you. Again, the ability for loosely coupled tools to work together is one of the strengths of Unix.

We clearly have a difference of opinion here, so let's agree to disagree. :)


Murex (https://GitHub.com/lmorg/murex) doesn’t replace coreutils with builtins but manages interop with commands just fine.

Most output is relatively easy to parse, sometimes you need to annotate the pipe with what format to expect but that’s easy enough to do. And Murex does come with builtins that cover some of the more common coreutils use cases for instances when you want greater assurances of the quality of the data - but those are named differently to their coreutil counterparts to avoid confusion.


Murex is pretty neat, thanks for sharing.

Still, you must have issues parsing all variations of output, depending on the flags passed to the source command and its version. How do you parse the output of ls or ps without knowing the column headers, delimiters, or which version of the command was run (GNU, BSD, BusyBox, etc.)? Piping data into commands must also require a wrapper of some sort.

Not knocking the project, it does look interesting, especially the saner scripting language. But its usefulness seems limited to the commands and workflows it supports.


Basically the same way you'd parse a CSV, except whitespace-delimited. You assume the headings are the first row. You can use named headings or numbered headings (like AWK), so you have options depending on the input and whether it contains headings.
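A rough sketch of that heading-row approach in plain AWK (the `ps`-like sample is made up, and this is not Murex's actual parser, just the technique described):

```shell
# First row defines the column names; later rows are addressed by name.
printf 'PID TTY TIME CMD\n101 pts0 0:00 sh\n202 pts1 0:03 vim\n' \
  | awk 'NR == 1 { for (i = 1; i <= NF; i++) col[$i] = i; next }
         { print $col["PID"], $col["CMD"] }'
# → 101 sh
#   202 vim
```

Because the columns are looked up by heading rather than position, the same snippet tolerates GNU/BSD variants reordering or adding columns, as long as the headings survive.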

The current implementation does break a little if a record contains a space as part of a field (e.g. 'command parameter parameter' in ps), but I'm working on some code that would look at column alignment as well as separators, etc.; basically reading the output like a human might, but without going to the extreme of machine learning. (I'm already doing this to parse man pages and --help output as part of automatic autocompletions, so I know the theory works; I just haven't yet applied it to more generalised command output.)


That's the first time I'm hearing about this project (which I take it you are the creator of?). Very interesting!

How would you say Murex compares to Nushell? The syntax seems vaguely similar. Are there any fundamental differences?


Yes, I'm the author :)

Murex predates most of the alt shells; it was created to scratch a personal itch, and it's only relatively recently that I've been promoting it. What I wanted to create was a shell that had typed pipes but still worked 100% with traditional POSIX abstractions. It's still just standard POSIX pipes underneath, with type information sent out-of-band. This basically means you can have a richer set of functionality from anything that understands Murex, while still falling back to plain old byte streams for anything that doesn't.
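As an illustration of the out-of-band idea (a sketch of the concept only, not Murex's actual mechanism), a type tag can travel on a separate channel while the payload stays a plain byte stream on stdout:

```shell
# Hypothetical sketch: payload on fd 1, type annotation on fd 3.
tag=$(mktemp)                # side channel; a pipe on fd 3 would also work
producer() {
  echo 'json' >&3            # out-of-band type annotation
  echo '{"user": "bob"}'     # in-band payload, ordinary stdout
}
{ producer 3>"$tag"; } | cat # a plain consumer sees only bytes
# → {"user": "bob"}
cat "$tag"                   # a type-aware consumer also reads the tag
# → json
```

A type-unaware tool in the middle of the pipeline never notices the annotation, which is the fallback behaviour described above.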

I've also taken inspiration from IDEs with regards to the interactive UX. You'll get syntax highlighting, dynamic autocompletions based on man pages (I'm shortly going to push an enhancement in that area as well), smarter hints (like tooltips), inline spell checking, and all sorts.

There's also been some focus on making the shell more robust, such as a built-in unit test framework, watches (for debugging), etc.

There will still be plenty of rough edges (as is the case with all shells, to be honest), but it's a vast improvement over Bash, in my biased opinion. So much so that it's been my primary shell for more than 5 years.



