
Helping my students overcome command-line bullshittery (2014) - tdurden
http://www.pgbovine.net/command-line-bullshittery.htm
======
bijection
Old discussion:
[https://news.ycombinator.com/item?id=8438129](https://news.ycombinator.com/item?id=8438129)

------
Animats
At least standard UNIX command-line bullshittery doesn't change much. What
drives me nuts is new build-system command-line bullshittery. There are lots
of build systems, all different. Go and Rust both have their own build systems
and their own directory layouts.

A few months back, the Mozilla Foundation insisted that add-on developers
switch from their old build system (useful for nothing else) to their new,
roughly equivalent build system. The new system by default generates the new
package in the same directory as the package source files, then, on the next
build, includes a zipped copy of the old package inside the new one. (The
documentation says it won't do that. It lies.)

This week, I'm using ROS, the Robot Operating System, which is a large
collection of robotics packages hammered into working together using a clunky
interprocess communication system. It has its very own set of file system
commands ("rosls", "roscd", etc.) which work on its own interpretation of the
file system, depending on various environment variables. It has its very own
build system, Catkin, with its own command line bullshit ("catkin_make", etc.)
and lots of configuration files, some of which are in XML, some of which are
in YAML, and some of which are in their own special format. Many configuration
files contain pathnames, and there are function calls which can be embedded in
some strings to call functions which use environment variables to search for
things.

ROS2 is coming, with a shiny new build system. I can hardly wait.

~~~
ramidarigaz
Oh god I can't stand ROS. I had to use it for a school project and it drove me
insane. The project requirements had us using it on a Raspberry Pi, and the
startup times were intolerable, for no apparent reason.

------
etrain
I'm a researcher in the same basic field, and I too am pretty good at this
"bullshittery."

However, Philip trivializes the "10,000 hours" spent learning this stuff as
merely learning how to cope with this interface. This is so far off it hurts.
Those countless hours spent messing with free software is how I learned to
_use_ other people's work, compile it, read it, fix bugs in it, and _learn how
other people think and write software_. This is an incredibly important
educational experience and there's sometimes no substitute for time and
patience.

~~~
tptacek
Think about it this way: in 1995, you'd have spent those 10,000 hours reading
other people's C code, reading the same dumb hash table or linked list
implemented 400 different ways, the same goofy 4-5 line loops implementing
"split" with strtok() over and over. But you don't anymore, because most
research isn't done using C.

In 1995 you'd have said "learning to understand what a strtok string split
routine looks like is learning how other people think and write software". And
you'd have been right. But you would also have been describing bullshittery
that we are all glad not to have to deal with anymore.

That's the overall feeling I get from this piece: there is more bullshittery
to eliminate than the 400 different versions of "xmalloc" that we had to deal
with in 1995.

Also: because understanding how to iterate over the tokens of a colon-
delimited string with strtok() isn't actually intrinsic to much computer
science research, it's very important that we don't judge aptitude or
capability by how conversant people are with strtok, or whatever it is strcspn
does.

~~~
p4wnc6
I actually disagree with this. I came to computing from the reverse direction
(I did theoretical math in grad school and then steadily needed to become
better at programming as I needed it in industrial jobs). At first I thought a
lot of this stuff deserved the 'bullshittery' label, and that with convenience
tools like MATLAB most of it was obsolete, taking time away from the
supposedly more fundamental "real work" of some domain science.

But as I have gone down rabbit holes with Haskell, C, LLVM, and some other
tools, I have discovered that seeing those 400 different implementations of
basic data structures has been extremely helpful. Reading and re-reading
decades-old tutorials on finer points of gdb and memory layout and books on
what NUMA-aware memory architectures are like has been extremely educational
and important. I would now regard MATLAB, and even the idea of wanting a
prototyping platform like MATLAB, as bullshittery.

I used to think that progress, so to speak, in computing was supposed to mean
less and less need to be proficient with low-level tools, and a constant push
to abstract away the low level and use languages or tools that offer the same
things in a high level, black box manner.

But I totally don't think this any more. Now I want to be a power user of all
the low level things and I wish that the surface area of tech and science
wasn't growing so fast that it forces me to be a wimpy abstraction user in a
lot of domains that I don't have time for.

Anyway, I'm just adding my perspective: the attitude that seemingly
cumbersome low-level stuff, once instructional but now abstracted away, is a
mere nuisance is wrong. The world overall doesn't have nearly enough competent
technicians and engineers who really know (intuition in the fingertips) that
low level stuff, and we're allowing the superficial pleasantries of limiting
abstractions to raise up generations of programmers who not only don't know
this stuff at all, but also aren't even curious about how it works. They can
do jobs when the black box abstractions are working, but as soon as something
breaks, they don't know how to fix it.

Maybe one way to say it is that we're breeding programmers who have shifted
their curiosity away from how fundamental atomic parts aggregate to make
functioning tools and towards how to permute a series of ready-made black box
abstractions for some social-level end goal (e.g. how do I permute Heroku and
Postgres and Spark ML to make a recommender system) -- but it's not actually
clear to me that it's better in a utilitarian sense that we're supposedly
"freed" from our low level chains to think about these one-level-up problems.

~~~
nostrademons
I'm curious - what's your motivation for being interested in tech? Have you
given some thought about what drives you?

I've found, personally, that my motivation has changed fairly dramatically
over the years, and with it, my perspective on high-level vs. low-level
programming. When I was in high school and college, my motivation was that I
wanted to _feel competent_ - I wanted a skill that was all my own, that I
could master without anyone telling me I had to. Immediately after college,
through my first job & startup, my motivation was to _get rich_, or at least
develop a financial position better than my peers. While I was at Google, my
motivation was to _earn the respect of my peers_, and now that I'm out on my
own trying out startup ideas, my motivation is to _test my intuitions about
the software industry_.

When I wanted to feel competent or earn the respect of my peers, I've usually
gravitated toward low-level technologies and concepts, things that were hard
conceptually or close to the machine but didn't have much practical
application. Haskell, lambda calculus, compiler design, blazingly fast code,
the bits & bytes of network protocols. When I wanted to get rich or test out
my ideas, I've gravitated toward very high-abstraction, productivity-
enhancing, flavor-of-the-month technologies. Javascript (in 2007, before it
became hot), frontend dev, Polymer, React, native mobile development, Parse &
Firebase, etc. I currently care very little about tech, because I've learned
that in the grand scheme of things it will usually not make-or-break a
business idea.

When I read your post, I'm curious why you use terms like "wimpy abstraction
user" or "superficial pleasantries" or "generations of programmers who not
only don't know this stuff". That seems to imply that you don't respect these
programmers. Why? Frontend devs frequently make more than backend devs, and
have many more job opportunities to choose from. The people who make the most
(founding CEOs) often have the wimpiest coding skills of all. What makes for a
status hierarchy where the closer to the machine you are, the higher the
status? Not saying it's wrong - I've certainly felt like this at times too.
But given that I'm now in my "study reality and poke it in certain ways to see
why it behaves like it does" phase, I'm curious.

FWIW, paradoxically I've found that each time I've moved from a core
motivation, it's because I've found that I can never achieve what was
motivating me. So my "feel competent" phase ended when I started digging
deeper into all the subfields of computer science and realized I could never
master it all. My "get rich" phase ended when I realized that I already had
enough and no matter how much money I earned, there would always be something
else I might want, if I let myself. My "respect of my peers" phase ended when
I realized that there will always be some people who liked you and some people
who hated you and that which side they fell on depended more on how you treat
_them_ than on anything you accomplish yourself, so really all I needed was
the respect of myself. Maybe the "test my intuitions" phase will end when I
decide that no matter how much information I have or what I experience, there
will always be an alternate explanation, but I suspect that will trigger an
existential crisis that I'm not quite ready for. There's probably some
brilliant Nietzschean revelation in there somewhere, but I'll settle for some
observations on status & respect in programming communities. ;-)

~~~
tmptmp
>The people who make the most (founding CEOs) often have the wimpiest coding
skills of all. What makes for a status hierarchy where the closer to the
machine you are, the higher the status?

Continuing your analogy a bit further, we can say that many politicians
(dictators) make much, much more than most founding CEOs, and they have the
wimpiest of wimpiest coding skills of all! I mean, we must not confuse the
ability to earn more with the ability to tackle technical problems. Regarding
the status hierarchy, we can see that Nikola Tesla was certainly much closer
to the machine than most people, and surely he holds a far higher position in
the status hierarchy than most run-of-the-mill founding CEOs with the wimpiest
coding skills.

~~~
gozo
You might want to read up on Tesla ;)

------
gtrubetskoy
He makes a good point. I once tried teaching my two teen-age kids Python. I
was surprised to discover that before we could get into any Python I had to
explain all kinds of things I took for granted (or, perhaps I don't remember
how I learned them) - what a terminal is (this was OS X), what the command
line is, why it's called a "shell" and what the kernel is, what the current
directory is, what text files are, how to edit them, why there are vi, vim,
and emacs, etc., etc. All in all it's many, many hours of explanation of the
environment which you need to understand before you can do any programming. We
spent a couple of weeks on this stuff, got to some basic Python and then the
summer was over :(

It's especially curious that one of these kids took a class in high school
called "Computer Science" and they were using Java as the learning language.
Somehow they managed to write programs that performed cool animations without
knowing what the command line is.

~~~
distrill
I'm in my fourth year of a computer science degree, and there are students
graduating with me in a few months who have never written code outside of
eclipse.

They've even had us 'deploy' a website with filezilla. It obviously works well
enough for the size of projects we were dealing with, but I can't help but be
a little put off by how unfamiliar/uncomfortable most people (in most of my
classes) are with the command line.

~~~
gtrubetskoy
Very curious - what is the name of the school you're attending?

------
32bitkid
While I completely agree with the gist and point of the article, the word
"bullshittery" is a bit over-the-top.

I grew up in the suburbs of Detroit, and in middle school and high school
there was an abundance of shop classes: wood shop, metal shop, electronics,
auto shop. In first-year wood shop, we started out with a little coping saw
and a file and made golf tee holders and bird houses out of soft pine. After a
while, our projects got bigger and we moved on to tools like the planer and
jigsaw. After we demonstrated proficiency with those, we moved on to more
complex/dangerous tools like the lathe, table saw, and drill presses. Stuff
that could easily take a finger off if you weren't careful.

Throwing someone inexperienced at a lathe or a bandsaw is simply
_irresponsible_. I think that one needs to learn the basics, to grasp the
mechanics, with simpler/less dangerous tools. However, once proficiency is
gained there, a new class of tool becomes available to the skilled
craftsperson/professional.

Yes, it's important to learn how to use a coping saw, but that doesn't make a
table saw "bullshit". It's just a different class of tool, one that could
seriously hurt you if you don't know what you are doing.

~~~
dexiwz
This analogy falls flat when you compare the levels of Unix 'bullshittery' to
a shop. In a shop you have a limited number of already working machines that
are readily visible. In Unix, you not only have to know how to use the
machines, but how to install them, how to repair them, how to select similar,
yet slightly different ones, how to copy your machines and make them run the
same way in a different location. And on top of that to a novice all of the
machines are hidden, and you have to read manuals/tutorials/forums to even
know that they exist.

Unix tools can sometimes be horribly obtuse to use when first starting out.
As you get used to them, you may discover the elegance of simple programs with
pipeable output, but their beauty is not readily obvious. Going from Windows
installers to apt-get to git clone/make/setup is a long process. Expecting
students to already know all this eats up time.

~~~
32bitkid
Of course, the analogy falls down, like all analogies, at some point.
However, there is a whole realm of industrial fabrication beyond what is
available in a simple/hobby shop.

To put it a different way: there are tools that I want to use when I am making
one of something. Then there are different tools when I am making 10 of
something. Again at 100, 1,000, and 1,000,000.

I've been a teacher, and I think there is an inherent conflict: as a teacher,
I want my students to learn _how_ to do something, but I also want them to do
things "right". Those two goals are often at odds. Sometimes the best way to
learn is to do it "wrong"; preferably in a safe environment where doing it
wrong won't kill/maim you. I think the injustice is trying to teach students
how to use the "for a million" tools (which they will _need_ later) when
they're really struggling with the idea of making one.

------
ekidd
If you want a mostly nice, mostly well-documented development toolchain, you
can either:

1. Install some proprietary, GUI-based development tools and color inside the
lines,

2. Figure out what toolchain you want your students to use, and package it up
using apt or docker or whatever (and tell your students to get a VM if they're
on a different OS), or

3. Give your students remote logins on shared computers.

Any of these will allow you to keep the "bullshittery" to a tolerable minimum,
and get on with the subject of the course.
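Option 2 above might look something like this hypothetical Dockerfile for a course toolchain (the package list is purely illustrative, not from any specific course):

```dockerfile
# Hypothetical course toolchain image; students run everything inside it,
# so nobody has to fight their own machine's setup.
FROM ubuntu:22.04
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential git python3 python3-pip \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /work
CMD ["bash"]
```

Students would build it once and run something like `docker run -it -v "$PWD":/work course-image`, and everyone sees the exact same tools.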

But in the long run, unfortunately, if you want to be a programmer, you've got
to learn to manage your own tools. This might mean being able to figure out
command-line "bullshittery", or it may mean knowing how to unit test ASP.NET
REST controllers. Even in the days of MacOS 8, back when there _was_ no
command line, there were still plenty of unholy, mind-eating horrors hiding in
the dev stack. (The Component Manager, for instance.) And let's not even talk
about the complexities of the web stack.

You can choose your flavor of "bullshittery", but if you want to be halfway
good at this job, you've eventually got to learn how to deal with whatever
tools you choose.

~~~
analog31
The Python crowd has worked out what I think is a decent compromise, which is
to create some packaged installers that are good enough for most of us. I use
WinPython. I don't know what it would take to do something like that for other
languages.

------
mathgenius
Yes, and algebra is the "command-line bullshittery" of doing physics.

I struggled for many years over this (my username is an in-joke with myself).
There are basically two days a month that I can actually comprehend algebra;
the rest of the time those little symbols merge and splurge themselves across
the page until I'm dizzy.

The beginnings of a breakthrough happened when I started writing a _program_
(algorithm) for every bit of algebra I couldn't understand. Now I am starting
to see that the equals sign is just not for me. I think in _processes_ , not
static equalities. And indeed, there is a whole branch of mathematics with
this philosophy: category theory.

------
tormeh
I like the same author's "The Two Cultures of Computing"[0]. It's somewhat in
the same vein, and also really important.

0: [http://www.pgbovine.net/two-cultures-of-
computing.htm](http://www.pgbovine.net/two-cultures-of-computing.htm)

~~~
opnitro
I've often seen people who wanted to learn programming become so bogged down
in the tools around programming. It led me to think that it might be better to
teach the fundamentals with pen/paper/whiteboard, to get the "this is why
programming is important/powerful" message across. Has anyone tried this or
had any success with it?

~~~
Gibbon1
I have a small anecdote about this.

In the last couple of years I've had a few people ask me "how can I write a
program with a GUI?" And I give the advice, "Download the free version of
Microsoft Visual Studio for C#, select New Project > Windows Forms," and then
start mucking around.

All of them were convinced by others to use Qt and C++ instead. All of them
gave up after a week or two of no progress.

~~~
kibibu
I got my start at "serious programming" using Visual Basic 6. There is a lot
to be said for early successes.

------
JackDanger
Having read the article I half-agree with the author. Yes, learning arcane and
inconsistent APIs is not the work the students signed up for and studying
command-line tools doesn't materially help them in the near-term. However, the
things that make the command-line so difficult also make it easy to package
tools together underneath simpler APIs. I suspect the students are struggling
with having to use tools that aren't supported by their department or industry
community. Otherwise they'd have a shared script that they all maintain that
turns a bare PC/Mac/*nix box into one that's capable of doing great work.

This happens with every single engineering team. Those who understand the
tools wrap them up into something solid so new folks have a great experience
on their first day.

~~~
lallysingh
And hopefully it makes the students less likely to half-ass the build and
command line steps in the apps that they write later.

------
ocschwar
I really find this off-putting.

The same things he labels "bullshittery" are the very things that meant that,
when I got into CS, I didn't have to hold a soldering iron, enter the CHS
triplet from a hard drive's label into the BIOS, or write code in assembler.

Other people's work, their "bullshittery" relieved me of a lot of tedious work
that they DID do when it was their turn.

What makes their stuff "bullshittery" is that they did not write their code or
even design it to carry out my wishes. They did it to carry out their designs.
And what they wrote did not DWTheyWanted perfectly, because they were human.
But they gave at least some thought to what others would do with their work.
And that calls for a bit more respect.

I am right now learning Android app development. Are Intents and Activities
bullshit?

~~~
dexiwz
The 'bullshit' he is referring to is not the software itself, but its arcane
interface. Think about the first time you tried to install a new version of
Python, only to have both versions end up in your PATH somehow. Or ran into
permission issues and someone told you to run 'chmod 777'. As a novice, hours
are easily lost trying to find the perfect commands to do something, and it
often ends up being only one or two lines, something an expert would know
immediately.
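For what it's worth, a few standard commands make the "two Pythons on my PATH" mystery inspectable. This is a generic sketch (shown for python3, but the same works for any command), not specific to any one setup:

```shell
# Which interpreters can the shell see, and in what order?
type -a python3 || true        # every python3 on PATH; the first match wins

# Which one actually runs when you type "python3"?
command -v python3 || true

# The search order itself: PATH entries, scanned top to bottom.
echo "$PATH" | tr ':' '\n'
```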

His job as a mentor is to empower his students to do research, not to be
sysadmins. Selecting a student based on a pre-existing ability to use a tool
drastically limits the pool.

~~~
ocschwar
If you want to do research into something interesting and worthwhile, you
can't expect to just have a fully fleshed out programming environment with no
rough edges to deal with. To use an analogy, you're walking behind the guys
who cut the trail for you with machetes. You don't get a blacktop highway to
where you're going.

~~~
dexiwz
Turns out a machete is much easier to use than the machine that is Unix. A
machine that will mow down the entire forest in 30 seconds, or blow itself up,
depending on what you tell it. Oh, and that machine has 10^14 levers, knobs,
and dials, most of which are labeled only with single letters or vague
snippets.

------
soyiuz
I strongly disagree with the author's position. I think it betrays a bias in
favor of computer science as opposed to software engineering. The
stereotypical computer scientist operates in an abstracted world, where the
physical limitations of computing only stand in the way of perfection. The
stereotypical software engineer delights in maintaining real-world systems:
the imperfect messiness of the world, its onslaught on all things man-made,
presents a beautiful challenge in itself.

I'll give you an example. A colleague of mine was stuck debugging a fairly
complicated piece of Python Pandas data sciency code. In trying to help him, I
realized he does not understand the difference between absolute and relative
paths. In fact, I don't think he understood what a file is.
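The distinction that tripped up that colleague is easy to demonstrate in a hypothetical shell session (the paths here are made up for illustration):

```shell
# A relative path is resolved against the current working directory;
# an absolute path is resolved from the filesystem root, so it works
# from anywhere. All paths below are hypothetical.
mkdir -p /tmp/pathdemo/data
echo "hello" > /tmp/pathdemo/data/input.csv

cd /tmp/pathdemo
cat data/input.csv                 # relative: works only from /tmp/pathdemo
cat /tmp/pathdemo/data/input.csv   # absolute: works from any directory

cd /
cat data/input.csv 2>/dev/null || echo "relative path no longer resolves"
```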

Now, you can say such details are bullshit. Who needs to know about paths?
They should just work. I want to do data science, not poke around some arcane
legacy operating system standards. I am an artist and a mathematician, dang
it! Yet the greatest generation of computer scientists and software engineers
always did both. People like Donald Knuth and Brian Kernighan made their own
"bullshit" which continues to run our machines. For me, to encounter that
history is to understand the delicate compromise between theory and practice.
To become a great architect, one cannot bypass gaining intimate knowledge of
building materials.

We are lucky that our systems contain the trace of their history. We should
study it closely, instead of calling it bullshit. Without that history, our
students and our colleagues are bound to play with prefabricated toys, while
the true polymaths write their own compilers and operating systems.

------
blahedo
If you read this, definitely follow up by reading the (more recent) response:
[https://medium.com/@eytanadar/on-the-value-of-command-
line-b...](https://medium.com/@eytanadar/on-the-value-of-command-line-
bullshittery-94dc19ec8c61)

~~~
analog31
This follow up article is definitely worth reading. But just to put things in
perspective, I got my degree in physics, and I overcame bullshitteries in
programming, plumbing, machining, electronics, mechanics, bike repair, and so
forth.

I think that part of the value, if not the major part, is that at the PhD
level, you're given a problem that is ill defined, that your advisor doesn't
know how to solve, and whose solution doesn't necessarily come entirely from
within your field. Though I was a physics student, solving my thesis problem
required developing skills beyond my advisor's own expertise, in areas such as
electronics and programming. I had to invent a solution that he had not
imagined. It helped a lot that those things were also my hobbies.

 _Which brings us to this reality. One day, and I pretty much guarantee it
will happen, your student will download something from the Web and they will
get stuck._

You don't know stuck until it's actually on fire. ;-)

~~~
digi_owl
> You don't know stuck until it's actually on fire. ;-)

Reads like a line from the "Things I Won't Work With" section of In the
Pipeline.

------
wfo
Honestly, it comes down to whether or not you want to understand how
something really works. If you are among the 99% of users, then someone has
pre-anticipated everything you will ever do with a computer, and so there are
nice, pretty GUI tools for whatever you want to do, tools that someone worked
very hard, and was possibly paid a great deal of money, to create in order to
make your life easier.

If you are going to be building things or programming or creating new things
you have to understand how things actually work. I'm sorry if that means you
need to know that python on OSX is different from python on Windows but you
do.

If you are doing research by definition your needs can never be anticipated so
you will never have pretty GUI tools to satisfy your needs. So I know the
command line sucks for most people but if you want to do real work you're
going to have to learn how to use it since it drops all of the leaky
abstractions that prevent you from doing what you need to do when you're doing
something novel.

~~~
kragen
I think you have it backwards. Someone has pre-anticipated 99% of everything
you will ever do with a computer, no matter who you are. And if you don't take
the time to learn the things that let you do these things in one minute flat
each:

    
    
        comm -13 <(./foo | sort) <(sed 's/ //' < bar | sort ) > foos-not-in-bar
        awk 'BEGIN {while (getline < "/home/ubuntu/foo/bar" > 0) {sym[$1]=1}} ($2 in sym) {print $0}' sizes | sort -rn
        join -v2 <(cat ../foo ../bar | sort) <(sort ../baz) > quux
        zgrep -n " XYZ," /foo/bar.out.gz  | awk '{print $0} END {print NR}'
        ./configure && make
    

...then you are going to spend an hour writing each one of them in Java or C++
or some nonsense like that and then debugging it. With Python you'll get it
down to fifteen minutes each, maybe.

Now, I'm not going to claim that the command line is the _only_ way to do
these things (except for the last one). SQL or Excel or probably REXX would
work too. But take advantage of the massive body of software that's already
out there that can solve the 99% of your problems you have in common with
everybody else, so you can spend your time mostly on the 1% that is your
research.

~~~
wfo
No, nobody, say, anticipated that I would try to upgrade Ubuntu while using
Spotify; otherwise it wouldn't have broken the app. Nobody anticipated that my
friend would upgrade to Windows 10 in whatever machine state he did, otherwise
it would not have deleted and reset his user profile and all his permissions.
Both these issues were unexpected and required handling at a deeper
(command-line) level than 99% of uses, not because the users were trying to do
something weird, but because the systems are so complex that even normal use
can lead to wildly unexpected situations.

You can use the shortcuts and GUIs, and they are excellent, and of course I
use them as everyone does. But if they are all you learn, the moment anything
unexpected happens (which happens a lot when you are using or building
experimental software) you are dead in the water, because you haven't
developed a mental model of how your system actually works.

~~~
kragen
Hmm, I guess I see what you mean. I don't think that's mostly the kind of
thing pgbovine was talking about, though. He was talking about the hurdles of
getting to "hello, world", or whatever the equivalent is in terms of modifying
some random piece of research software they've downloaded. Those hurdles are
what cost his students productivity when they don't know how to navigate them,
and how he teaches them to leap over them.

Now, it's _also_ true that when things break at one level of abstraction, you
have to debug them at the next level of abstraction down, which means you have
to understand that level of abstraction. But I don't think he's talking about
that, because (among other things) that rarely has to do with the command line
as such.

------
kohanz
I actually love the term he used. I started my programming in the Windows
world and wasn't exposed to POSIX command line stuff until later. I'm
comfortable with it nowadays, yet I know exactly what he's talking about with
the 'wall'. To me, the 'bullshittery' would mean the inconsistency of it all.
There's no standard for command-line tools, so they all tend to be slightly
different. The end result is having to learn and memorize a ton of (what are
in the end useless) variations of syntax.

Please note I'm not saying anything is better in the Windows world; far from
it. I'm more talking about the transition from GUI tools, where feeling your
way around in the dark is much less daunting to the uninitiated, to the
command line.

~~~
chubot
Yeah but the shell actually provides a mechanism that allows you to FORGET all
the bullshit -- functions.

I look up commands on the Internet, and then put them into single line shell
script functions with a name. And then I write a comment about how it works,
and check it into a git repo.

Then I never have to remember the exact invocation. I just type my nice
function name, which is consistently named, and describes in my own words what
the command does.

It's odd to me to see my coworkers and others very slowly coming up with a
long command line, getting it wrong, very slowly correcting it, etc. I don't
have time for that. I just grep my shell scripts directory for something
close, copy and paste, edit it in my editor, and then run it.

It's very quick and puts no strain on the mind. Actually, I find it enjoyable
to put my own names on arcane command-line syntax.
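A minimal sketch of that workflow might look like this in a sourced shell file. The function name and the wrapped command are made up for illustration, not from the original comment:

```shell
# Looked up once on the internet, wrapped in a named function, committed
# to a git repo of snippets, and never memorized again.

# dirsize: total size of a directory, human readable.
# Usage: dirsize [DIR]   (defaults to the current directory)
dirsize() {
    du -sh "${1:-.}" 2>/dev/null
}

dirsize /tmp
```

From then on you grep the snippets repo for the descriptive name instead of reconstructing the `du` invocation from memory.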

~~~
copperx
I find it depressing that after decades of working in the command line, this
never occurred to me.

------
ams6110
If students can't deal with the bullshittery on the command line, how do they
hope to deal with the even deeper bullshittery found in the libraries of most
programming frameworks?

~~~
tptacek
He's not suggesting that they can't. He's suggesting that doing so is an
obstacle to doing research (it clearly is) and that the obstacle isn't
intrinsic to the research process itself.

~~~
orionblastar
They should add a class to take before his class called Intro to POSIX 101 on
how to use a Unix-type system and the command line options.

My 16 year old son taught himself how to program in order to mod his own video
games. I wanted to teach him, but he figured it out from Youtube Videos and
Google searches. He had to learn how command-line switches worked, etc., in
order to use free and open source programming tools.

I learned Unix at a university in 1986 as part of a computer science class. I
got into Linux in 1995 using Slackware on a 486DX clone. I remember a time
before the GUI, when everything was command-line based in DOS and we wrote
batch files to do things. Nobody taught me how to do that; I sort of learned
it from trial and error. In a lot of my DOS programming classes I had to
use command line switches to compile my programs and edit text files with a
Word Processor or the edit.com in DOS 5.0 or something. Things sure have
changed now with IDEs and source code editors that highlight syntax in color.
I learned HTML using Windows Notepad to make HTML text files. Things would
have been easier if there was a class on this stuff that was required before
taking the classes that needed it.

I worked in a computer lab in 1990-1994 at a college and taught the command
line options to students just learning how to program.

When I learned Visual BASIC, everything was GUI based and it even generated
forms I could paint controls on. It changed the way programming worked for me.

But free and open source tools are still command-line based, and someone has
to train the students how to use them. I sort of wish I still worked in a
computer lab training students, but those days are gone.

------
mikekchar
I think the author is running afoul of a very common problem (though he seems
to be slightly more aware of it than most): "Ease of Use" and "Ease of
Learning" are two different, and often incompatible, goals. He praises the
command-line utilities as offering productivity improvements of an order of
magnitude or more, but then castigates them for not having changed in half a
century.

His description of the two cultures of computing is also interesting because he
clearly understands that computer programming is literally the use of language
to instruct the computer. He feels the complexity is incidental because it is
not directly related to the problem that he wants to solve, but he completely
ignores the necessity of learning the language.

The command line is a general purpose instrument with an intentionally broad
interface. I think it is disingenuous to assume that one could build purpose-
built tools that do what you want before you even know what it is you want.
The complexity is specifically to afford the flexibility to do whatever you
might want to do in the future. It hasn't changed (much) for 50 years because
_nobody has found a better way to do it_.

Feel free to improve the state of the art ;-)

~~~
a_bonobo
> nobody has found a better way to do it.

People have found a better way to do it! It's just that research software
_ignores_ the better way because it's mostly "proof of concept", once the
paper is out, who cares who uses it, that's not my department.

 _The incentive is to publish, not to create usable software_. In fact,
spending the time to make your software usable is wasted - no funding
committee, no tenure committee, no supervisor cares.

Here are some of my "command line bullshittery" problems I run into in
bioinformatics:

- There's no Makefile, just .cc files; you have to figure out how it was
compiled. 90% of the time a simple "gcc" won't cut it.

- There is a Makefile, but the flags are all outdated and there's no
./configure to get flags for your system.

- There is a Makefile, but it depends on old versions and doesn't say so. The
new versions don't work. (samtools v1.0 changed the API and there's a ton of
software (BioDB::Sam) out there that assumes a pre-v1.0 API, so you have to
override the system-wide installation; great "fun" if you're on a cluster
where each node has its own environment and you have no root.)

- The software depends on other research software that doesn't exist anymore
(google-fu to find a forgotten dusty tar.gz somewhere).

- It's a binary, but it gives you empty output if you have a typo in your
flags. (Just last week with Blast2Go: if you use "annot" instead of "-annot"
it won't complain, you just get empty output.)

- It depends on environment variables but doesn't tell you so; it just
crashes when it tries to run a dependency program using
$PROGRAM_ROOT/bin/program. (Even better: MAKER doesn't crash, you just have
one error line in about a million lines of STDOUT, and chances are you don't
realize that it had problems in the first place.)

- The program depends on loading data from disk using MySQL, but MySQL wasn't
compiled with local-infile=1.
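The environment-variable failure mode at least has a cheap defense: check the variables up front and fail loudly. A hedged sketch -- the `require` helper and the /opt/program path are invented for illustration, only $PROGRAM_ROOT comes from the comment above:

```shell
#!/bin/sh
# Sketch: validate required environment variables before running anything,
# instead of the silent one-error-line-in-a-million failure described above.
# `require` and /opt/program are made up for illustration.
require() {
    for v in "$@"; do
        eval "val=\${$v:-}"
        if [ -z "$val" ]; then
            echo "error: \$$v is not set" >&2
            return 1
        fi
    done
}
PROGRAM_ROOT=/opt/program   # demo value; normally inherited from the caller
require PROGRAM_ROOT && echo "ok: would exec $PROGRAM_ROOT/bin/program"
```

A wrapper like this costs a few lines and turns an hour of log-spelunking into an immediate, readable error.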

I'm sure I've wasted about half of my PhD's time finding out about highly
program-specific problems that don't appear anywhere else. It has made me a
somewhat proficient debugger but that is lifetime that I could have spent on
analyzing actual results & writing papers, things that are actually helpful in
my career.

~~~
mikekchar
Hmm... We don't seem to be connecting here. When I said "there is no better
way to do it", I meant that interacting with tools through the command line
(or shell scripts) is the best way we have of doing these kinds of tasks. I
didn't say so explicitly, but my intention was to include all of the usual
tools that one would use (grep, awk, make, whatever).

Though it was several decades ago, I also suffered greatly from having to deal
with poorly built academic projects. It's not even confined to academia. A
friend of mine often tells a tale of a startup that spent 3 years building
software only to discover that they had built such a mess that they _couldn't
deploy_ (never having done it in 3 years... :-P). The company ran out of money
before they solved the issue.

I may be wrong, but I don't think that's what the original article was
complaining about. I think he was frustrated that while the tools were very
good, they were hard to teach to his students. He wishes there were easier-to-
learn systems for doing this work.

Personally, I don't think it will ever happen because the problem space is
quite complicated. You can make a tool easier to learn by restricting what it
can do, but eventually you butt up against that ceiling, making it extremely
difficult to use. That's why we are all still using these same tools decades
later (although I'm quite happy to ditch make in favour of more modern
equivalents [1]).

I can't tell you the number of times I've wandered into a new project and
spent every spare second writing README files, fixing the build systems,
removing dead code, etc. I've even worked on projects where there wasn't a
single person who knew how to build the entire project -- all they could do
was compile individual files and link them to pre-built libraries. They
thought I was crazy for even wanting to know how to build it. We do our
industry a grave disservice by not teaching this stuff somewhere in our
education system.

[1] Says the guy who just wrote a new make file a couple of weeks ago
<whistling>

~~~
a_bonobo
I appreciate your post!

I do agree that awk, grep, etc. are extremely good tools, and you rarely run
into real annoyances with them (actual scientific software is a completely
different bag that suffers from its own problems). Still, there's lots of room
for improvement even in the good tools [1], and I feel like it's easy to stop
thinking about the problem space and shut out others who don't immediately
grok what's going on and why you've now run "apt-get update" instead of
"apt-get upgrade". Many posts in this thread show that attitude.

I do think at some point we'll get easier systems - git has now overtaken cvs
in popularity, so these tools aren't set in stone. Neither is easy for
beginners to understand; git is just more powerful. I see people write tiny
Python scripts instead of using bash tools to manipulate folders and files,
and I think that is a case of taking up a tool because it's easier to use. Are
there other tools that are slowly fading out? I feel like awk is on the way
out, but that may be my echo chamber.

>We do our industry a grave disservice by not teaching this stuff somewhere in
our education system.

I fully agree - I think that some hands-on programming, understanding and
refactoring experience is much more important than knowing how to write the
code to keep a b-tree balanced. The first is something you have to do every
few weeks - the latter is something you do only once or twice, if ever (even
though it's a cool application of recursive thought). The first will make you
a more valuable member of the workforce.

[1] One example: when I teach people git I struggle to show the man-pages
because nothing in there makes sense until you've truly understood how git
works under the hood

------
cdnsteve
Docker is looking better and better... I recently decided to use Python 3.5
and Flask for a project. I didn't have time to learn them and Docker so I used
a regular OS X setup. I thought the setup wasn't that bad until I had to share
my setup with other team members... It took me 25 minutes to write up a wiki
page on just the local environment setup. Pyenv, homebrew, pyenv-virtualenv
switcher thing, setting python version, setting environment variables... After
reviewing the instructions and going through this with two team members, I am
realizing more and more that this is a major barrier to jumping in and out of
different technologies easily for teams.

We take for granted our environment setups, until you need to have someone
else jump into what you're doing. After I get this shipped I'm going to circle
back and get it up and running inside Docker. If the promise holds true, one
or two commands and you should be up and running for any environment... we'll
see.

~~~
nostrademons
IIRC Docker on a Mac requires a virtual machine, right? If you're going to use
a virtual machine for ordinary development, why not ship around a VM image
that has everything pre-installed and ready to go?

(This is actually a very sane way of doing development...for much of 2002-2008
I did my development inside a Linux virtual machine running on VMWare
Workstation, because my normal laptop was Windows. Macs have spoiled us; since
they are POSIX-compatible, we've gotten used to running our dev environment
directly on the box. But if you're having trouble configuring everything, it
may be worth re-examining the VM solution, because you get much better
isolation and many fewer configuration bugs than Docker.)

------
dougdonohoe
I prefer to live in both of these worlds.

For example, when necessary, I'll do git stuff on the command line, but I far
prefer to do it in IntelliJ - with side-by-side merge conflict resolution, it
is much better than anything on the command line. Seriously - blame, history,
branching, diffs, etc. are far easier in a graphical UI.

On the other hand, sometimes you need to drop down into the command line when
(a) s@&t hits the fan or (b) you need to do something unique or complex that
the UI doesn't handle. It definitely is worthwhile knowing (or being able to
_learn_ ) sed, awk, tar, find, scp, rsync, ssh, dd, df, grep, egrep, bash, mv,
cd, ls, and [fill in your favorite] *nix commands.

<tangent>In fact, I don't really understand how Windows developers do anything
without installing cygwin.</tangent>

Net-net, I think a good developer/engineer has the ability to read and grok a
man page, but the wisdom to use a simpler UI or IDE when it gets the job done.

------
benzesandbetter
There are plenty of installers and tools (e.g. brew, apt, yum) available to
reduce this "command-line bullshittery" of which you speak. Of course, finding
and using them may require some "search engine bullshittery" ;)

Being able to navigate this "command-line bullshittery" is one of the basic
filters for people in the software development and CS research fields.
Personally, I've hired people with CS degrees who could barely do basic
operations in the shell. I'm often the person on a software team, who is
coaching the less-skilled members on getting their environments and tools set
up.

Rather than reduce these friction points, I think we need to encourage a
culture where people learn to solve their own problems, or at least read the
docs and try a bit more before throwing up their hands and asking for someone
else to solve their problem.

------
MBlume
I think this is poorly written. I don't object to him calling it "command-line
bullshittery" in his title, but doing it over and over again in the article is
excessive. We get it. You don't like it. You aren't going to convince us not
to like it by naming it with the same obscenity 500 times.

------
stevebmark
This is a valid point. Learning the command line has a time and a place, but
it's almost entirely separate from doing computer science work at a "higher"
level. Draining cognitive power to learn an unrelated technology is at best a
necessary evil. It's compounded by the fact that the command line is full of
tools that are inherently hard to use, because they're designed to be terse,
not user friendly. Can you tell me what all the flags of `tar -xvzf` do? Do
you think you would ever discover that combination on your own from reading
the man pages? I certainly wouldn't have, and I'm comfortable with the command
line. `man tar` is a nightmare. (Also, bash is a lesson in language design
failure.)

Of course the command line makes many things easier. Carving up history with
git is much faster on the command line, once you really grasp it. Most build
processes require some command-line fu these days. Etc. I still see his point
that it's a hindrance to early learning.

However, I don't know the ideal solution. Making a full OSX install wizard is
obviously more work than writing a shell script. Maybe the best middle ground
is Homebrew packages. `brew install ___` is usually only one command and
usually works without problems.

~~~
balls187
> Can you tell me what all the flags of `tar -xvzf` do?

Yes.

x: extract

v: verbose

z: zip (or unzip) archive

f: file name to follow.

> Do you think you would ever discover that combination on your own from
> reading the man pages?

Yes. Turns out, I wanted to tar up my nodejs directory, but omit the
`node_modules` folder.
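For what it's worth, that task is a one-liner with GNU tar's `--exclude` option. A sketch using the self-documenting long options (-x/--extract, -v/--verbose, -z/--gzip, -f/--file); all paths here are throwaway examples:

```shell
#!/bin/sh
# Sketch: archive a Node project while omitting node_modules.
# Long options are used for readability; GNU tar is assumed,
# and the project layout below is a made-up example.
mkdir -p /tmp/proj/src /tmp/proj/node_modules
echo 'app'        > /tmp/proj/src/index.js
echo 'dependency' > /tmp/proj/node_modules/left-pad.js
tar --create --gzip --file=/tmp/proj.tar.gz \
    --exclude='node_modules' -C /tmp proj
tar --list --file=/tmp/proj.tar.gz   # node_modules is absent from the listing
```

The long spellings are easier to discover in `man tar` than the fused single-letter cluster, even if nobody types them day to day.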

> Of course the command line makes many things easier.

Exactly. I'm not a high-level computer scientist, but I imagine being able to
easily deal with files and data would help things out quite a bit. So while it
can be painful to learn bash, it's something that pays off heaps in the long
run. My opinion.

~~~
loonattic
`z' means to use gzip compression when extracting or creating a tarball. `J'
means xz, `j' bzip2.

For example, to create an xz-compressed tar file:

tar cJvf archive.tar.xz some_directory/

Generally, you can omit those when uncompressing, at least with GNU tar.

OpenBSD's tar is less friendly - it doesn't seem to try to autodetect the
compression format. Last time I checked, it didn't support xz either. One had
to install the package and then do something such as

tar cf - directory/ | xz > something.txz

where the - means that the tarball is written to standard output. /dev/stdout
would work as well.

------
epynonymous
if i could sum things up, basically the author views the actual
research/prototyping as the largest value the students can provide as opposed
to becoming skilled systems programmers, i think it makes a lot of sense. a
lot of counter-arguments seem to be in favor of putting those students through
their paces in a bootcamp style of fashion where they struggle and learn about
the intricacies and nuances of command line tools and development environments
before they actually start writing programs, also a fair argument. however, i
see the two arguments as orthogonal, his goal is to produce research
scientists that discover new algorithms and new areas of research, not
software engineers that work necessarily at twitter or facebook, etc. for the
author, i suggest using some tools for building reproducible development
environments like vagrant (mac os x). for things like command line option
permutations and drudgery, perhaps creating some wrappers, scripts, or tools
to automate some of these steps would be worth the investment and potentially
a one time effort, it seems that copy/pasting scripts seemed to help. often
times things are created out of necessity, frameworks or higher order tools
evolve out of these things. at the end of the day, developing software should
evolve, just like languages, where you have higher order languages that start
to remove the lower level details and complexities so that you can focus on
the bigger picture (e.g. java, springsource, jquery, cloud foundry, etc).

for the record, i went through all the bullshittery at one point in my career,
and don't regret any of the countless, mind-numbing hours spent trying to
figure things out, but i don't work in the field of direct research.

------
nickysielicki
I'm a command line chauvinist and I'm not ever going to apologize for it.

> [Incidental complexity] arises simply because modern research software
> development is a messy jumble of open-source tools tied together by the duct
> tape of command-line scripts.

That's ironic, I think the output of modern CS research is this way _because_
students don't learn this "bullshittery". Good luck getting your interesting
research to compile elsewhere with that makefile that you copied off of
stackoverflow and edited with hardcoded values until it built.

This past summer I worked for about a week and a half with a local startup
trying to exist in fitness-watch IoT. They wanted to flash the boards with
some proprietary tool provided in the board devkit. I wanted to flash the
board using flashrom. They were content using a cloud compiler that had a
costly license and hooked in the bootloader magically using a closed source
library. I wanted to get us using arm-none-eabi- and a proper build system.

They wanted to press a green button in their IDE. They wanted their familiar
starting point of

    
    
        int main(int argc, char* argv[]) {  }
    
    

And that's the real hidden tax of not learning this "bullshittery" -- a lack
of an ability or urge to learn or do more. Oh, we need to do OTA flashing?
Well, we don't understand the bootloader. Or the flashing process.

Good. luck. with. that.

And I guess that's fine if your goal is just to make theoretical software for
a paper. But if you're trying to make code that isn't meant to be disposed of,
you need to know what's going on underneath-- you need to be able to explore
things from first principles.

------
zobzu
"nohup tar -jxvf giant.tar.bz2 2> cmd.errs &”

Someone doesn't know how things work and is complaining about it. The above
command is just programs strung together in the hope that it looks
complicated, but it's also not actually installing anything (it unbzips an
archive, detaches from the controlling TTY, redirects errors to a file, and
runs in the background, while verbosely writing out what it does - nobody
knows why).

I don't find most programs' CLIs all that bad - in fact, it's more consistent
than the GUI stuff, with some exceptions (generally coded by people like the
author, and many Java-based programs, mind you, go figure).

Sure, I'd like to say "hey, make everything work" and have the computer do it
for me. But until then, things like "yum install blahsoftware" are REALLY fast
and easy (heck, it beats the phone app stores by such magnitudes that I can't
even). It's 3 words. Done.

Heck, you want to install a full software suite, configured, with services
auto-deployed, etc.? Today you can do this too in 3 to 6 fucking words. That's
amazing if you ask me.

------
delish
I consider "programming" to be a religious journey:

* "You're on your own path": you get errors that the person next to you doesn't.

* "To learn is to invent; you must come to realizations by yourself": different solutions on stackexchange from 2012, '13, and '14 don't work for you.

* I don't have a cliche-quip for this, but if religion is how you find your complex relationship with the world, programming is _how_ you find your complex relationship with computation. I emphasize "how" because it's a process and a practice.

* It's mostly bullshit. Because we don't have physics-for-everything, we have religion. Because we don't have math-for-everything[0], we have programming. I'm speaking loosely and rhetorically.

* Related to the above, religion is a bunch of workarounds, like programming. This ecumenical council said such, but that ecumenical council revised it.

[0] In this instance, I'll define "programming" to be "mostly failed attempts
at precise description." Math-for-everything would be "very precise
descriptions."

------
andrewchambers
Rob Pike already pointed out the growing complexity of our systems -
[http://harmful.cat-v.org/cat-v/unix_prog_design.pdf](http://harmful.cat-v.org/cat-v/unix_prog_design.pdf)

This is all incidental complexity caused by people adding dependencies and
options without caring about the cognitive overload they introduce.

------
thoughtexpt
"It's simply an obstacle to overcome before one can get real work done."

Since we're expressing our personal opinions, that's exactly how I would
describe a GUI.

There's nothing exciting to me about the command line EXCEPT that it allows
one to avoid the "bullshittery" that finds its way into almost every GUI.

Same is true for UNIX in general. Not exciting EXCEPT to the extent it lets me
escape the layers upon layers of abstraction, the complexity, the hassles, the
unreliability and the general unrobustness of graphical operating systems.

Those are big exceptions. So yeah, I do like the command line and barebones
UNIX-like operating systems.

I simply cannot get "real work" done without a UNIX-like OS and a command
line.

The work I perform makes graphical operating systems and programs hang or
crash, or is nearly impossible to accomplish with them without acquiring a
repetitive stress injury.

I did not create this state of affairs. I have simply adapted to it.

------
fnordfnordfnord
Right or wrong, I usually start the year off by having the students install
the OS on their lab workstations. It is a hassle, but I think it helps. Some
people need more help than others, some hit the ground running having had
prior experience. They'll install Python libs and various types of software
packages (from the OS repo, Java, WINE, github, etc.). I
try to cover the most common cases. They install Windows in a VM x2 (XP and
Win7), and then install some flaky old Windows software that only likes to run
in XP (PLC software), then they'll make the hardware work through the VM.
Nearly everything we do could be done from the CL or from a GUI, but
disseminating the instructions is accomplished most quickly using the command
line.

------
pbohun
You can't teach someone to read or write without learning the alphabet.

The command-line is the alphabet. You may have flowery notions of the
wonderful poetry you're going to write, but if you can't spell a word the real
bullshit is to think you're an awesome poet in the first place.

------
throwaway37182
I liked the article, but I was bothered throughout by the word "bullshittery".
Why add the "ery"? "Bullshit" is already a mass noun, and the standard slang
use of it fits fine everywhere he uses his unnecessarily longer term.

------
noufalibrahim
There's some merit in what the professor is saying. A lot of the minutiae get
in the way when you want to get something done. Who wants to wrestle with some
obscure option to some kernel module to get one's graphics drivers working
before starting work on a serious graphics project?

However, having grown up in an era when GNU/Linux wasn't friendly at _all_, I
think I learnt a lot of stuff that's valuable to me even today. Much of it was
concrete factoids which I no longer need or use, but the valuable parts were
some kind of "meta" skills (which Eytan[1] discusses much more eloquently in
his rebuttal) which serve me very well even though I no longer wrangle with
the nitty-gritty of getting obscure software working on my computer. Some
examples I can think of are:

a. I can decipher man pages and other unfriendly but mostly technically
complete documentation.

b. I can speak the language of the programmer fairly well, which makes my work
much smoother than if I had to speak through a UX interpreter.

c. Given some kind of task, I have the confidence that once I learn the
domain-specific stuff, the computer and software can't really cripple me and
prevent me from getting work done.

d. In case of an emergency, I know where the metaphorical life jackets are and
what to do. I also have the confidence to get and use them.

e. I'm conversant with the ideas of pipes and other such tools that allow me
to automate away annoying tasks which would otherwise have bothered me.

f. Even when resources are scarce (e.g. over a slow net connection on some
remote machine), I know that I can get work done without the fancy
interfaces.
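Point (e) deserves a concrete example: the classic word-frequency pipeline, the kind of chore that would be tedious by hand. The input file here is a made-up example:

```shell
#!/bin/sh
# Sketch of "pipes automate annoying tasks": count the most common
# words in a file. The input is a made-up one-line example.
printf 'to be or not to be\n' > /tmp/words.txt
# split into one word per line, then count, then rank:
tr -cs 'A-Za-z' '\n' < /tmp/words.txt | sort | uniq -c | sort -rn | head -3
```

Each stage does one small thing, and the pipe composes them - which is exactly the "meta" skill the list above is describing.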

This makes me generally more efficient, and I don't think I'd trade all those
years of dealing with these arcane tools for anything else. I mentor students
these days, and a rule of thumb I use to evaluate them is to see if they've
managed to install a "non-mainstream" GNU/Linux distro on their machines. If
they've managed, it's almost surely a sign that they're good learners.

This is not an argument for making command line interfaces hard or keeping
them that way. It's just that there's a school of thought that'll sacrifice
ease of use for power and there's some value in that even for people not
directly working on the tools themselves.

Footnotes: [1]: [https://medium.com/@eytanadar/on-the-value-of-command-
line-b...](https://medium.com/@eytanadar/on-the-value-of-command-line-
bullshittery-94dc19ec8c61#.4qt0dd22r)

------
vezzy-fnord
That we still revolve around static, line-based interpreters from the VT100
era (over more intelligent CLIs or hybrid UIs like Oberon) is indeed rather
quaint, and the proliferation of underdesigned programming tools also, but I
strongly disagree with the author's claim that all these Unix programs are
"[bullshit] not intellectually interesting in any way". Perhaps I'm one of the
few people who actually likes to read the source code to the nodes in their
/var/log/packages, but potentially getting involved with userspace system
programming is not to be dismissed.

------
fenomas
When you have a problem, and software exists that can solve it, the general
case is that the software fixes a _class_ of problems, and your specific
problem lies within that class. The "bullshittery" here follows inevitably,
simply because:

(A) you need to tell the software what specific problem you want solved

(B) nobody has invented a One True Way of doing (A) that's perfect enough to
make everyone abandon all the other ways of doing it.

Maybe I'm missing something, but I don't see anything more complex than that
going on here.

------
int_handler
I think it is ineffective to focus only on command line "bullshittery" and not
discuss API bullshittery. I think the problem of learning commands pales in
comparison to the problem of learning poorly-designed APIs. One extreme
example of this is the Windows Event Tracing API [1], but there are many more
along the same vein.

[1]
[http://mollyrocket.com/casey/stream_0029.html](http://mollyrocket.com/casey/stream_0029.html)

~~~
TeMPOraL
The two problems are sort of equivalent when you realize that the shell is
just a REPL to your OS, and commands you type are, generally, API calls.
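A small illustration of that equivalence: each command below is, underneath, a thin wrapper around one or two system calls (the syscall names in the comments are the rough mapping, not an exact trace):

```shell
#!/bin/sh
# The shell as a REPL for the OS: every command wraps system calls.
mkdir -p /tmp/repl_demo          # ~ mkdir(2)
echo 'hello' > /tmp/repl_demo/f  # ~ open(2) + write(2)
cat /tmp/repl_demo/f             # ~ open(2) + read(2)
rm -r /tmp/repl_demo             # ~ unlink(2) + rmdir(2)
```

Typing `mkdir` at a prompt and calling `mkdir()` from C are the same API request made through two different front ends.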

------
lowlevel
I have trouble taking this complaint seriously. If the platform's interface
paradigm is such a hindrance, just shield them from it as much as possible by
providing pre-configured environments, or use a different platform. There is
tremendous value in knowing how the command line works, how the systems work,
and how they are configured, whether you are talking about Unix, Windows,
AS/400, or whatever. If you skip this, you are likely to be eaten by a grue.

------
vsync
He works in a university. Maybe he can tell me how to get a chemistry degree
without any test-tube bullshittery, or a biology degree without any microscope
bullshittery.

------
ww520
Even after dealing with command-line commands for many years, I don't remember
most of the gory details of them. That's why I keep a note file saving all the
common and interesting command-line commands I encounter. This has enabled me
to "forget" (offload) most of this bullshittery and concentrate on the real
problem at hand.

The note file is just an Emacs org file with different sections for different
topics.

------
tolmasky
This was one of the big reasons we made
[https://tonicdev.com](https://tonicdev.com) . We found so many stackoverflow
questions were just about installing packages or asking how to get node
running on their computer. We wanted a way to be able to try just about
anything in the environment without starting with the most demoralizing part
-- configuration and installation.

------
ryandrake
Why not just insist that "Basic command line bullshittery 101" be a
prerequisite class for whatever course he's teaching? Done.

I wasn't allowed to take all my senior-level programming electives before I
learned how not to blow my leg off with C++. Same should be true for whatever
research he teaches--you need to learn your tools before you do your research.

------
hullsean
Time & patience. That's for sure.

Maybe Neal Stephenson is right, that Linux tells the Gilgamesh story of hacker
culture:

[http://www.iheavy.com/2015/10/28/linux-command-line-
stephens...](http://www.iheavy.com/2015/10/28/linux-command-line-stephenson-
hacker-history-open-source/)

------
blazespin
He's 100% right. It's all "bullshittery". Unfortunately, that's a rather
trivial observation. Kinda like saying 2 + 2 = 4. The next part is the hard
part - which to fix first?

I think what he really wants is a Siri-like AI that just does it all for him.
Of course, at that point, I suspect Skynet will have taken over.

------
anigbrowl
Can't upvote this enough, and I urge people also to read his other article on
this topic, as well as the rebuttal linked within.

I got into computers in the early 1980s when built-in BASIC interpreters were
considered Terribly Clever and graphics were something you got by
loading/constructing an alternate character set to fit on the text grid. Like
many others I stumbled onwards through Vax terminals and onto PCs, where I was
lucky enough to obtain a copy of AT&T System V/386, heh heh. So that's a
fairly decent 35 years of experience with command-line interfaces...

...and about 15-20 years of dismay at people continuing to reinvent the damn
things. Look, you know the only reason for commands like _mv_ and _cp_ was
that at one time we had 300-baud modems hooked up to dumb terminals and there
were good reasons to worry about minimizing character count? It doesn't mean
that the resulting obscurantism was a good thing in itself. It's not. There
are _no good reasons_ to make people go through typing everything and to
perpetuate ASCII graphics fetishism and so on. If anything, it's become an
unspoken set of rules to keep people away from our nerd treehouse. Actually
typing out all that shit is a huge waste of time.

Writing shell scripts and makefiles by hand is insane; we built computers
precisely to save us from this sort of pointless tedium. I find it really
depressing that UI innovation on Linux has basically ground to a halt and that
KDE, Gnome and so on have basically not evolved at all over the last 15 years.
Back around 1999 I was running Gnome and E.16 (or was it 15?) and I had a
hugely customizable UI that was way more powerful and flexible than anything
on Mac/Win (admittedly this was less true for applications, but still).
Nowadays when I boot into Linux it mostly feels old and tired.

As I pointed out, I've been using CLIs for a good long time. I got my first
copy of Linux from comp.os.minix because there were no distributions as such
at the time. It's not like I can't figure this stuff out. But I'm sick of it!
There's a reason most consumer software is built around GUIs - they are
better. In a well-designed GUI, you can find everything you need within the
user interface, instead of having to accumulate a heap of arcane CLI knowledge
(starting with the question 'what do I type?!').

It's true that the CLI offers greater flexibility if you know exactly what you
want to do. It's also true that for physical items, you can cast or 3D print
something that is absolutely perfectly shaped if you know what you're doing.
But if you have kids and you want them to discover the joys of engineering,
you should probably buy them a big box of Lego instead of a set of carving
tools or a CAD workstation.

~~~
Estragon
> Back around 1999 I was running Gnome and E.16 (or was it 15?) and I had a
> hugely customizable UI

Why don't you keep using it? I'm still using sawfish.

~~~
anigbrowl
Oh I still like E. What I mean is, I'm disappointed that UI experimentation
and development on Linux seems to have stalled. I'm having trouble thinking of
the last time I saw something that surprised/impressed me there, maybe Compiz?

------
pmiller2
Am I the only one disappointed that he didn't name any of these tools that
allow you to be 10-100x more productive than without them?

------
JamesBarney
Sometimes I think progress is defined as the reduction in the number of things
we need to understand to accomplish any given task.

------
rsync
Frankly, I appreciate a good bottleneck that selects against people that can't
grok basic unix admin and system maintenance.

How many hours (thousands) and dollars (tens of thousands) have I wasted on
"programmers" that don't even have a home server running in their closet...

That's an interview question at rsync.net, btw. What computer(s) do you have
(running) in your closet ? What computer is connected to your television ?

~~~
Shank
I think that's a poor interview question. You're bound to only get people who
have a disposable income to throw a server in their closet, and certainly a
computer attached to their TV. Maybe it'd be a better interview question to
have them work on a project for a week, which requires using the command line,
rather than asking this somewhat limited question set.

I don't mean to tear into you -- I see the value in what you ask because I too
get frustrated by a lack of command line experience. That being said, asking
about a specific thing that may not apply to a wide range of people is just
silly filtering. Focus on what you would do with a server in your closet,
rather than "what server do you have in your closet?"

~~~
rsync
Well, yes ... it's a leading question where we go into the real issue at hand:

Are you a "lifer" ? Would you do this work for free, like I would (and did,
for years) ? Can you, like me, not imagine doing anything else with your time
?

That's who I want to work with.

~~~
madeofpalk
I love my work and couldn't imagine doing anything else, but 'love' doesn't
pay the rent. Fuck anyone who expects me to give away my time.

------
hspak
You can call the command line bullshit, but at the end of the day it has the
least abstraction, meaning more visibility into what's actually happening.
Visibility is always nice, especially when something breaks.

Say there was this nice GUI that built and set up lots of projects for you.
Whose responsibility is it to make sure that everything it supports stays
supported?

~~~
analog31
Invariably, unless it's a major software product with a lavish budget, the GUI
based setup program is less intelligible than the command line, and is
accompanied by instructions consisting of page after page of screen shots,
"with circles and arrows, and a paragraph on the back of each one," to quote
the balladeer. And when the next version of the OS comes out, or a new version
changes the installation slightly, the instructions become invalid but are too
costly to update.

After years of that, "sudo apt-get install" was like a breath of fresh air.

~~~
jqm
You can get anything you want with Make and Make Install.

------
geggam
In other words... we want to do difficult things without learning.

------
oldmanjay
To help those who are reflexively clicking "add comment" in anger, you really
should read the article first. You're near certain to argue against a strawman
otherwise.

------
hackaflocka
Command line bullshittery is the greatest source of job security to coders. If
it all became too easy to do (not saying that's possible), it would make a lot
of us less employable. This situation is an inversion of what Upton Sinclair
said: "It is difficult to get a man to understand something, when his salary
depends upon his not understanding it!"

------
frozenport
>>I can find, so it's profoundly stupid to disproportionately filter out
entire demographics based on bogus criteria such as prior familiarity with
incantations like “nohup tar -jxvf giant.tar.bz2 2> cmd.errs &”

Is this the best you can do?
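
For readers who have never met that incantation, here is a rough decoding as a
runnable sketch. Only the command itself (and the names giant.tar.bz2 and
cmd.errs) comes from the quote above; the throwaway demo archive is invented
so the command has something to extract.

```shell
# Build a small stand-in archive (invented contents, for illustration only).
mkdir -p demo && echo "hello" > demo/file.txt
tar -cjf giant.tar.bz2 demo   # -c create, -j bzip2-compress, -f write to this file
rm -r demo

# The quoted incantation: -j bzip2, -x extract, -v list files as they go,
# -f read from this archive; nohup keeps the job alive if the terminal closes,
# "2> cmd.errs" redirects error output to a file, and the trailing "&"
# runs the whole thing in the background.
nohup tar -jxvf giant.tar.bz2 2> cmd.errs &
wait   # block until the background extraction finishes

cat demo/file.txt
```

In other words: "unpack this big compressed archive in the background, log any
errors, and don't die if I log out" -- three reasonable ideas stacked into one
opaque line.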

------
wh-uws
Obligatory xkcd link

[https://xkcd.com/1168/](https://xkcd.com/1168/)

------
piratebroadcast
MIT and Stanford degrees and the guy can't make his personal website go the
full width of my 13 inch monitor.

------
cheez
The most recent command-line bullshittery is JavaScript and all the node/npm
bullshit.

Kind of.

It's pretty nice once you get used to it.

------
DiabloD3
I don't get the point of this. The command line is one of the easiest things
in *nix.

If they can't figure that out, how will they ever learn how to figure out real
world software engineering?

