
In defense of Unix - ingve
http://leancrew.com/all-this/2016/03/in-defense-of-unix/
======
IshKebab
The complexity of the `find` command is the least of Unix's problems. How
about defending these?

1\. Unnecessary and confusing directory structure. `/etc`? Why not `/config`?
`/usr` instead of `/system`, `/var` instead of ... well who knows. The maximum
directory name length is no longer 3 characters.

2\. Programs are mushed together and scattered through the filesystem rather
than stored in separate locations. This basically means applications are
install-only. Yeah, package managers _try_ to keep track of everything, but
that is just hacking around the problem, and most developers don't want to
spend hours creating 5 different distro packages.

3\. Not strictly Unix, but the mess of glibc with respect to ABI
compatibility, static linking, etc. is ridiculous. Musl fixes most of this
fortunately.

4\. Emphasis on text-based configuration files. This is _often_ ok, but it
does make it hard to integrate with GUI tools, hence the lack of them.

5\. Emphasis on shell scripts. Fortunately this is starting to change, but
doing everything with shell scripts is terribly bug-prone and fragile.

6\. X11. 'nuff said about that. When is Wayland going to be ready again?

7\. General bugginess. I know stuff works 90% of the time, but that 10% is
infuriating. Windows is a lot more reliable than Linux at having things "just
work" these days.

~~~
MereInterest
Regarding #5, I quite enjoy text-based configuration files, and can't stand
systems that force me to use a GUI to change settings. If I have a text-based
config file, I know that it will play nicely with git. If there are many
related settings, users can change them all quickly with their preferred text
editor.

~~~
Ao7bei3s
Agreed, text based config files are good and not the problem. (Though binary
formats != GUI tools only.)

I think the real problem is config files either in hard-to-parse-correctly
custom ad-hoc formats or even "config files" written in a scripting language
(-> impossible to parse).

All config files should use the same standard format. I'd say "like YAML", but
I'm not aware of a widely-used format with standard features like file
includes or data types beyond "int" and "string" (e.g. for time intervals;
these really shouldn't be "just a string... with a custom format").

~~~
Houshalter
That works fine when config files are simple, straightforward text data. But
config files can grow increasingly complex over time, and eventually become
complex Turing-complete languages of their own.

I think it would be better to just start with a Turing-complete language. I
think they should use Lua. It has very simple, self-explanatory general data
structures, and it's very lightweight and sandboxable.

The only issue is combining config files with other programs. You don't want
to strip the comments or formatting when you modify it with another program.
Also wish there was a way to specify metadata, like what the possible values
for a variable are allowed to be. Or descriptions and help info. With that you
could easily convert config files into GUIs.

~~~
Ao7bei3s
> I think it would be better to just start with a Turing complete language. I
> think they should use Lua. .......... when you modify it with another
> program ...

No! Turing complete config files are even worse than ad-hoc config files.

If your config format is Turing-complete, you _can't_ correctly modify
config files automatically. (You wouldn't even know how long evaluating one
will take, or whether it terminates at all.)

If you need more logic, put it in your program or have a plugin system or
write a program that generates the config file, but don't put it in the config
file.

~~~
Houshalter
I don't see anything wrong with adding the option to do scripting. No one is
making you use it. But when you need it, there's no alternative.

Many projects start out with just simple config files. But then they realize
they need to do logic, and hack that into it. Then they realize they need more
complex logic, and hack even more stuff in. And it's just a mess. It would
have been much cleaner if they just started out with a scripting language.

Whether you should be putting logic in the config file is a different issue,
but as long as people do it or have a need to do it, it's much better than
crazy ad hoc solutions.

See these discussions on the issue:
[https://stackoverflow.com/questions/648246/at-what-point-
doe...](https://stackoverflow.com/questions/648246/at-what-point-does-a-
config-file-become-a-programming-language/)
[https://medium.com/@MrJamesFisher/configuration-files-
suck-6...](https://medium.com/@MrJamesFisher/configuration-files-
suck-6daa9812f601#.f36iap1yb)

~~~
Ao7bei3s
I do understand your point.

But as soon as scripting is supported, it's impossible to write tools that
process the config files and always work, especially with untrusted config
files. You can't have both.

So the question is: can code be separated from config (like I've proposed
above) (the code can still be inlined in the config file, as long as the
"root" of the config file is declarative and the boundaries are well-defined)?

If no, which is more important: parseability or flexibility? It's a tradeoff.

~~~
telotortium
What you could do is go the other way around. The program's canonical
configuration format is pure data in a well-defined format (XML, JSON,
Protocol Buffers, etc.). However, the top-level user-facing configuration is a
script, written in a well-known (and ideally easily sandboxed) scripting
language, whose output is the configuration data. The script can still load
pure data files, which can be automatically analyzed and transformed, and with
enough discipline most of the rapidly changing parts of your configuration
will live in these pure data files. Even without this discipline, the final
output of the configuration script is pure data that can easily be stored
separately for tests, diffs, analyses, etc.
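
A minimal sketch of that pattern in shell, with made-up names (`config.json`, `listen_port`, `workers` are all hypothetical): the user-facing "configuration" is a script, and its output is plain data that any tool can parse.

```shell
#!/bin/sh
# The user edits (or generates) this script; the program only ever
# reads the pure-data config.json it emits.
cat > config.json <<EOF
{
  "listen_port": 8080,
  "node_name": "$(hostname)",
  "workers": $(getconf _NPROCESSORS_ONLN 2>/dev/null || echo 4)
}
EOF
cat config.json
```

The emitted file can then be diffed, validated, and stored for tests, exactly as described above.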

------
txutxu
The post linked from this article says:

    
    
        Great. It is spread everywhere this kind of complication for everyday tasks. Want to install something, need to type:
        apt-get install something
        
        Since we only use apt-get for installing stuff, why not?
        apt-get something
    

I will give a response I didn't see in any of the comments there, in the
original post, in this post, or here:

Because installing like this could fail miserably for any package named
after a subcommand.

    
    
        apt-get remove     # is installing package remove?
                           # or is failing the remove subcmd without args
    

You can work around that with `apt-get install remove` in that case, but the
error on the first try is counter-intuitive.

Edit: fix my last example

~~~
yrro
For some reason, “Since we only use apt-get for installing stuff, why not?
apt-get something” really pushes my buttons.

Perhaps I'm reading too much of my own biases into my interpretation, but this
sounds like it's written by a developer who has only ever used apt to blindly
install a list of packages in a Dockerfile, rather than someone who has any
system administration experience.

And that's fine, except that maybe they should have taken two seconds to read
the apt-get manual, and realise that apt doesn't just _install_ packages, but
it also, shockingly, allows for them to be removed and upgraded too...

Now I've gotten that off my chest, perhaps a more favourable interpretation
would be that they are trying to say that they would prefer separate 'apt-
get', 'apt-remove', 'apt-search' commands, in which case they'd have a point.
Fortunately there is now a new 'apt' command that can perform the operations
that users most commonly invoke via the apt-get and apt-cache commands.

~~~
nunobrito
Nuno Brito, original post author.

Sorry to disappoint, but I have been professionally administering Linux
machines since 2004, started as an end-user in 1998, and I work with whatever
machines are available. My apologies if the blog post reads as an attack on
apt-get; that is not the case.

It is just an example. I'm not saying that we can/should change apt-get; apt
was already made for that purpose. The example is only meant to raise
attention among the authors of upcoming command-line tools: think about the
most frequent use-case scenarios and make them as straightforward as humanly
possible.

~~~
cyphar
Debian isn't the only system on the block, and apt isn't the only package
manager. OpenSUSE has zypper (which IMO has a much better interface and
supports patches to packages). Arch has pacman (which has fewer features, but
is great for normal use on your local machine; I wouldn't recommend it for
administering a server -- and not just because it's rolling release). apt has
a very janky interface overall; there are better alternatives IMO.

------
zenlot
After opening the link I expected to see an article from the 90s;
unfortunately it's from 2016... I can't believe people are still going into
such debates.

~~~
Ao7bei3s
Not all of us were around (as admins) in the 90s.

If it was bad then, and it still (after 20 years) hasn't improved, then what
do you expect?

------
knz42
The article could have been enhanced by highlighting that some shells (e.g.
zsh) provide expansion patterns that recurse into subdirectories. e.g.

    
    
       ls **/*.txt

~~~
nothrabannosir
or in bash, with the globstar option set:

    
    
        shopt -s globstar # e.g. in your .bashrc
    
        ls **/*.txt

~~~
mnarayan01
It's probably worth noting that the option is new in 4.0 -- relevant as OSX
users are likely on 3.0.
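
If you're unsure which camp you're in, bash exposes its version programmatically; a quick check (a sketch):

```shell
#!/usr/bin/env bash
# globstar (the ** recursive wildcard) needs bash >= 4.0
if (( BASH_VERSINFO[0] >= 4 )); then
    shopt -s globstar
    echo "globstar available"
else
    echo "bash too old for globstar: $BASH_VERSION"
fi
```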

~~~
paulddraper
3.0 is ancient, but 4.0 is GPLv3....

~~~
ahoka
This thread perfectly sums up what's wrong with Unix.

~~~
avar
Apple not wanting to distribute software that they're perfectly able to
distribute for paranoia/anti-GPLv3 reasons is a problem with Unix?

------
Vieira
Ask your parents or non-geek friends: Given the task of finding the files with
the name ending with .txt which of the following two commands would you
choose?

    
    
      find -name "*.txt"
    
      dir *.txt /s

~~~
vacri
And if we're going to be using the commandline, I'd much rather the unixy

    
    
        find PATH -name FOO
    

than the powershelly

    
    
        Get-ChildItem -Path PATH -Filter FOO -Recurse
    

I mean "dir FOO /s" is simple and all, but powershell was created because cmd
was deficient in many areas.

The blogpost referenced in the article is also stacking the deck a bit, as
some of the 'complex' commands are normal commands, but with the verbosity
turned up - the rsync command has _three_ flags for increasing verbosity...

~~~
ygra

        ls -r PATH FOO
    

if you want things to be short. Since parameters can often be given
positionally instead of named _and_ you can shorten parameters as long as they
remain unambiguous _and_ there are aliases – it seems like you're stacking the
deck a bit as well.

PowerShell has over cmd (and WSH):

\- Consistency in command handling, naming and discoverability

\- A robust scripting language

\- A modern shell

\- An embeddable scripting environment (most GUI configuration stuff on
Windows Server these days is PowerShell in the background; PowerShell is also
the NuGet console in VS)

\- Extensible

\- The core of the language is built up from mostly orthogonal commands which
work completely the same regardless of the context

\- Interoperable with native commands, .NET, COM, WMI, arbitrary item
hierarchies (file systems, registry, etc. – comes back to consistency and the
point above)

\- SSH-like capabilities built-in. Running a command or a series of commands
over hundreds of machines is no harder than doing it locally.

The (perceived) verbosity can usually be tamed quite a bit with aliases and
shortening parameters (or using them positionally), which is what you'd do
most of the time when working with the shell once familiar with it. I guess
you're not yet familiar with PowerShell or never used it, and that's okay.
Because the long commands are in many, many cases self-describing to the point
that you don't have to guess at all what they mean or do. This also helps with
learning or communicating with others.

~~~
vacri
> _I guess you 're not yet familiar with PowerShell or never used it_

This is correct - my example wasn't intentionally complex, but the result of
googling and looking at the top answers (cmd was deficient, and I was
wondering how you'd do the same thing in the 'proper' Windows shell). I'm glad
I got the responses I did - I didn't mean to deride powershell, but to show
that if you want more power, you end up with more complex commands - and at
the cli, I'd rather be typing the unix example than the powershell one.

In any case, powershell _should_ be better than the other shells we're talking
about - it's from 2006, and the others are considerably older.

------
raldu
I cannot see how this article is defending UNIX by talking about only a single
utility. It could be a better "defense" if it mentioned, for example, the
power of compositionally combining various commands through pipes, each of
which does one thing well. Inputs and outputs, remember?

The post the author is responding to is uninformed, and reads rather like a
rant. The DOS vs UNIX comparison does not work. I can't take the `apt-get`
example seriously, because you could easily set up something like
`alias ai='sudo apt-get install'` to achieve the `ai something` magic with
something as simple as an alias, which DOS does not even provide.
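
The pipe-composition point is easy to make concrete; here's the classic word-frequency pipeline, where each stage does one small thing (the input file name is made up and created here for the demo):

```shell
# Each stage does one small thing; the pipeline composes them.
printf 'The cat sat on the mat and the cat slept\n' > input.txt

tr -cs 'A-Za-z' '\n' < input.txt |   # split into one word per line
  tr 'A-Z' 'a-z' |                   # normalize case
  sort | uniq -c |                   # count each distinct word
  sort -rn | head                    # most frequent first
```

No single tool in that chain knows anything about word frequencies; the composition is where the power comes from.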

~~~
bpye
Powershell also has pipes, and works with objects rather than with text. It
can be very powerful and much easier than having to use awk, sed, etc.

------
exprx
The mentioned blog post is horribly ignorant, and lacks almost any valid
points.

find, in particular, very much has a fine UI, and I dare you to process, and
not just list, the files with cmd.

~~~
ygra

        for /r "C:\some path" %F in (*.exe) do process "%F"
    

forfiles also exists, which works similar to find regarding passing the list
of files to another command.

~~~
rat87
ls -Recurse -Include "*exe" $PATH | % { something $_}

~~~
ygra
Use -Filter instead of -Include unless you need fancy wildcards. It's much
faster because it gets passed to the FileSystem provider and filtering is
applied at that level already.

------
zokier
In my opinion ls is one of the more broken bits in Unix. But besides that, an
arguably more unixy way to solve this (even if no proper UNIX supports it out
of the box) is a recursive wildcard, i.e.

    
    
        ls **/*.txt

~~~
twic
I'd say this is less unixy, because it relies on the shell to walk the file
tree, rather than delegating to a utility whose job is to do just that.

No comment on whether it's actually better or worse, mind!

~~~
OJFord
I was going to make a similar argument, but then I lost faith in what ls' job
is, if we want to walk this really pedantic path.

After all, `echo **/*` is going to give you an ugly dump of space-separated
files; while `ls **/*` will give you a pretty list with colours, ownership,
permissions, date modified, is directory/exec/symlink, etc. with their
respective flags or according to your alias.

I don't think it's cheating to argue that `ls`'s job is to format the files
given, or within a given directory.

Edit: Although possibly it is, because `man ls` tells us "list directory
contents".

------
AsyncAwait
@7 I see this point repeated often without further clarification: what exactly
is more buggy?

I can tell you from personal experience that with a reasonably modern Linux
distro, my laptop works out of the box without any problems and I've not
experienced any significant system-level bugs in a long time.

Meanwhile in Windows, my sound doesn't work at all when I wake the laptop
from sleep, Windows Update tries to override the GPU driver with an older
version, and every time I have > 10 Chrome tabs open, the whole system locks
up regularly. Not to mention that the Windows registry is still a complete
mess and trying to COMPLETELY remove a piece of software is an impossible
task.

I am not saying Linux is perfect and yes, Windows works better in the games
department, but I am not sure I'll call Windows less buggy.

------
joosters
I can never remember find's strange command line arguments, so I end up
writing the easy replacement:

    
    
      find . |grep \\.txt
    

The backslashes are not very intuitive, but if you miss them out entirely,
you'll still likely get good enough results.
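
For the cases where precision does matter, escaping the dot and anchoring the pattern keeps names like `btxtc` from matching; a quick demonstration on throwaway files:

```shell
# Escape the dot and anchor with $ so only real .txt extensions match,
# not names that merely contain "txt" somewhere.
mkdir -p demo
touch demo/notes.txt demo/txtish demo/report.txt
find demo | grep '\.txt$'
```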

~~~
BozeWolf
Haha! Yes! +1 for this. I do exactly the same thing. Or even just grep txt.
The find command is just something I can't get into my muscle memory (despite
using Linux for more than 15 years).

------
Gratsby

        function dir(){
          #make the windows guy happy 
          find ./ -type f -name "$1"
        }
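
One detail worth noting with a function like this: quote the pattern at the call site, or the shell will expand it against the current directory before find ever sees it. A self-contained sketch:

```shell
# Same idea as the function above; the quoted "*.txt" at the call site
# makes find (not the shell) do the expansion, so it recurses properly.
dir() {
    find ./ -type f -name "$1"
}

mkdir -p sub
touch top.txt sub/deep.txt
dir "*.txt"
```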

~~~
yrro
But... there's already a dir command...

[https://www.gnu.org/software/coreutils/manual/html_node/dir-...](https://www.gnu.org/software/coreutils/manual/html_node/dir-
invocation.html)

~~~
Gratsby
hrmph. I never knew about it.

~~~
yrro
Not trying to be cheeky, but I have genuinely found it very illuminating to
read the _proper_ documentation for all the GNU software that I use. Not the
man pages, but the documentation available 'online' via info, and 'offline' on
the web for those who are info-phobic. :)

When I find my mind wandering and I am tempted to waste time on reddit/Hacker
News, I try to discover a new feature, or a new aspect of one with which I
thought I was already familiar.

The info program (hated it when I first used it, now I prefer to use it to
access the documentation of stuff that I have locally installed):
[https://www.gnu.org/software/texinfo/manual/info-
stnd/info-s...](https://www.gnu.org/software/texinfo/manual/info-stnd/info-
stnd.html)

Readline User Manual (that thing you use when you press Ctrl-R to search your
Bash history):
[https://tiswww.case.edu/php/chet/readline/rluserman.html](https://tiswww.case.edu/php/chet/readline/rluserman.html)

Findutils (the find and locate commands):
[https://www.gnu.org/software/findutils/manual/html_node/find...](https://www.gnu.org/software/findutils/manual/html_node/find_html/index.html)

Coreutils (ls, rm, sort, tail, tee, the good stuff):
[https://www.gnu.org/software/coreutils/manual/coreutils.html](https://www.gnu.org/software/coreutils/manual/coreutils.html)

glibc (an incredible manual that is very readable):
[https://www.gnu.org/software/libc/manual/html_node/index.htm...](https://www.gnu.org/software/libc/manual/html_node/index.html)

Bash (actually has a good man page, but it's nowhere near as in depth as the
reference manual, nor does it explain features with language designed for
those who don't know that they already exist):
[https://www.gnu.org/software/bash/manual/html_node/index.htm...](https://www.gnu.org/software/bash/manual/html_node/index.html)

Groff (have you written a man page for the last program you wrote?):
[https://www.gnu.org/software/groff/manual/html_node/man.html...](https://www.gnu.org/software/groff/manual/html_node/man.html#man)

Gcc:
[https://gcc.gnu.org/onlinedocs/gcc/](https://gcc.gnu.org/onlinedocs/gcc/)

GNU Make (bet you didn't know you could extend it with Guile and native
code...):
[https://www.gnu.org/software/make/manual/html_node/index.htm...](https://www.gnu.org/software/make/manual/html_node/index.html)

Automake (for many years the autotools were something deeply mysterious to me
that I thought I'd never understand, then I read the manual for them and now I
can't get enough of them, and the shortcomings of other build systems cause
serious pain):
[https://www.gnu.org/software/automake/manual/html_node/index...](https://www.gnu.org/software/automake/manual/html_node/index.html)

Autoconf: [https://www.gnu.org/savannah-
checkouts/gnu/autoconf/manual/a...](https://www.gnu.org/savannah-
checkouts/gnu/autoconf/manual/autoconf-2.69/html_node/index.html)

libtool:
[https://www.gnu.org/software/libtool/manual/html_node/index....](https://www.gnu.org/software/libtool/manual/html_node/index.html)

The C Preprocessor (has its own manual, who knew!):
[https://gcc.gnu.org/onlinedocs/cpp/](https://gcc.gnu.org/onlinedocs/cpp/)

~~~
anthk
Use pinfo instead of info.

------
OJFord

        > This is not available in the version of bash that comes
        > with OS X
    

Almost nobody that doesn't comment "what's Terminal?" on an article/etc.
should be using "the version of bash that comes with OS X".

------
jkot
I don't think it's fair to mention DOS in 2016. There is PowerShell, VBS
scripting, etc...

------
FDominicus
1) Could be a running gag. If you know it, you know it; if you don't, why not
learn it?

2) Mushed together: /bin, /sbin, /usr/bin, /usr/sbin, /usr/local. Well yes,
that may be hard to get if your executables are scattered around as in
Windows, in every other directory. What a "big" difference.

3) Yes, DLL hell never ever happened to Windows users - never.

4) Oh yes, it's better to have one registry where nobody knows which key is
for what. And if the registry is broken, the whole system does not even run
any more - yes, that sounds as if it would be much better... And no, there is
no graphical frontend for anything in webmin, of course.

5) Shell scripts are programs, and you can use them for scripting. What
problem do you have with that?

6) AFAICT it runs here without trouble, and updates/upgrades are just an
apt-get upgrade away.

7) The IT backbones are servers, and most servers run under Linux. That
should give you a hint.

Your arguments are none. They are just your opinion, which is not backed by
any knowledge. So welcome to the land of good-doers.

Reliability is a word that Windows users have only learned in the last few
Windows incarnations. Long-running servers are usual with Unices; that's
hardly the case for any Windows.

And nearly all malware works on Windows. But hey, who needs reliability if
it's all that nice and colourful.

------
someoneretarded
"Since we only use apt-get for installing stuff, why not?

apt-get something instead of apt-get _install_ something"

wat? How dumb do people get?

~~~
nunobrito
If you read the blog carefully, you'll notice that the topic is usability for
upcoming command-line tools.

A hypothetical `apt something` follows the pattern found in curl, ping, unzip,
... Correcting badly typed user input is something git already serves as an
example of. So, labeling one of the parties in a discussion about intuitive
command-line switches as dumb, as if those switches were immutable, should be
done with more substance.

btw, when reading your reply this was the first thing coming to my mind:
[http://dilbert.com/strip/2007-11-16](http://dilbert.com/strip/2007-11-16)

Have fun. :-)

------
skocznymroczny
Is Bash really better at wildcard expansion than cmd? I mean, sometimes you
don't want to do wildcard expansion in the shell. `copy *.txt *.bak` would be
much harder to write in Bash, I suppose.

~~~
dllthomas
For the record, I would probably write this:

    
    
        for i in *.txt; do
            cp "$i" "${i%.txt}.bak"
        done
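
A habit that pairs well with loops like this: prefix the command with echo first, to preview exactly what would run before doing it for real (a dry-run sketch on throwaway files):

```shell
# Dry run: echo prints each command instead of executing it.
mkdir -p demo && cd demo
touch a.txt b.txt
for i in *.txt; do
    echo cp "$i" "${i%.txt}.bak"
done
```

Once the printed commands look right, drop the echo and run it again.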

------
gtf21
People are still having windoze vs. unix fights? In this day and age?

------
catnaroek
There are good things about the Unix shell, but the fact that the shell, not
the program being called, expands wildcards, ain't one of them.

~~~
yrro
I like consistent and predictable rules for how wildcard arguments are
expanded. There's no way that thousands of programs would all get it right if
it were up to the developers themselves!

That said, it would perhaps have been nice if glob expansion had, in some
other timeline, been performed by a separate command, so that «echo *» would
print out a literal * character, and «glob echo *» would print out the result
of the expansion.

glob could then accept flags to modify the rules for expansion, such as
enabling the common `**` shortcut for recursive expansion, rather than the
user having to modify the behaviour of wildcard expansions by setting global
variables.

~~~
alkonaut
This. I can't understand why it's handled at the shell level rather than at
either the OS level (some Api to expand) or as a separate system tool?

What happens if I make a shell that does expansion slightly differently to the
existing shells? That can't work well. So there is already a set of rules for
expansion, and all shells must implement them exactly? That sounds only
slightly better than the apps all trying to do the same thing.

~~~
yrro
If programs had to use an API to expand paths then developers would screw it
up. Just look at the clusterfuck that is the command prompt on Windows!

In theory this applies to shell developers as well; however POSIX specifies
how expansions should work, and there are far fewer shells than there are
programs that those shells launch; and shells that don't conform to POSIX are
less likely to find adoption because they will break user expectation.

~~~
alkonaut
Yes, I know shells are fewer than programs, but shouldn't the OS provide an
implementation if something is specified by POSIX? I can't see the downside of
at least several shells using the same OS-provided function for it.

~~~
yrro
GNU/Linux kinda does.

[https://www.gnu.org/software/libc/manual/html_node/Pattern-M...](https://www.gnu.org/software/libc/manual/html_node/Pattern-
Matching.html#Pattern-Matching)

Don't know if they are commonly used by shells however.

------
DiabloD3
A lot of what this article talks about is why systems like msys2 exist: to
bring a *nixy command line to Windows.

~~~
bitwize
Only useful for compiling software that depends on Unix toolchains.

Modern Windows has PowerShell, which is leagues ahead of bash in terms of
functionality and power.

~~~
DiabloD3
Not really. I use msys2 every day, and I've never used it to compile things.

And PowerShell is kind of useless since I can't use it on any other computer I
interact with on a daily basis.

I use a github-synced environment, that includes things such as my zsh and vim
configs.

------
iLemming
and in the last example he gives, I think you don't have to wrap things in
quotation marks if the alias is done like this:

    
    
       alias lsr='find . -name $1'

~~~
pdkl95
As jstimpfle mentioned, that doesn't work, which is why the argument is left
off the alias. In general, you can do that in a function instead of an alias:

    
    
        lsr() {
            find . -name "$1"
        }
    

Wrapping variable expansion with double quotes is a good habit, so spaces are
handled properly.

Also, if you are using a modern-ish bash, you almost always want to use "$@"
when there could be multiple arguments. The double-quoted @ special variable
is guaranteed to always expand as _multiple_ args, but with spaces handled
correctly:

    
    
        foo() {
            bar --quux=42 "$@"
        }
    
        foo "a b c" "Spaces in my filename.txt"

------
thisisdumbski
Seriously? I can't think of anything more pointless.

~~~
shocks
Trying to improve our command line interfaces is not pointless.

------
AndyMcConachie
The equivalent of "dir *.txt /s" in UNIX is "ls -R|grep txt$".

~~~
nunobrito
I find that syntax "good enough" for my own usage. YMMV, considering it
outputs not just files with txt as the extension, but also files with txt
anywhere in the name.

Someone else mentioned a workaround: adding an escaped dot in order to be
fully equivalent to the dir syntax.

