
Helping my students overcome command-line bullshittery - luu
http://pgbovine.net/command-line-bullshittery.htm
======
jaderobbins1
I feel like any field of research has a base set of knowledge and skill sets
required to do high level research. One could say that biology research is
filled with "Microscope Bullshittery" or Paleontology is filled with "Fossil
Digging Bullshittery".

A base skill of doing computer science research is programming.

Can you program without computer science? Absolutely. Can you do computer
science without programming? I would say no. Being able to look at and
understand how to use a language or library is just something required to get
to the last tier of computer science knowledge. I think every researcher would
love to get rid of their bullshittery, and often they have lab technicians or
interns do it for them, but in the end they all had to pay their dues and have
to know it in order to mentor and help those below them.

~~~
saidajigumi
The author specifically calls out that he's not talking about programming, per
se. He's talking about the skill set of wrestling useful free software
packages to one's own aims:

 _So perhaps what is more important to a researcher than programming ability
is adeptness at dealing with command-line bullshittery, since that enables one
to become 10x or even 100x more productive than peers by finding, installing,
configuring, customizing, and remixing the appropriate pieces of free
software._

I'm torn about this article. Clearly this researcher, in his role as mentor,
has identified a skill gap that's hindering his students. And it's perhaps
even a problem that the software community can ease the pain of. But many of
the things he lists in passing get down to fundamental tools of software work:
version control, package management, data manipulation, etc. Yes, the usage of
these things on the command line tends to be "arcane", but that's because each
is encoding its own problem domain. And if you're going to be working in
software in any non-ivory-tower capacity, _you'd better know this stuff._

I've dealt with this kind of problem numerous times before in various contexts
with _workflow tooling_: that is, a single (usually command-line) tool that
neatly encapsulates the most common development use cases to reduce learning
curves, cycle time, and errors. These can be _phenomenally_ successful if done
well, but if the context doesn't define a workflow (e.g. student A's vs.
student B's research ideas) then there's no easy way to encapsulate the user's
problems.

~~~
mcguire
He also goes on to say,

" _Throughout this entire ordeal where I 'm uttering ridiculous epithets like
'git pipe fork pipe stdout pipe stderr apt-get revert rollback pipe pipe grep
pipe to less make install redirect rm rm ls ls -l ls tar -zxvf rm rm rm ssh mv
ssh curl wget pip,'_"

In other words, "ridiculous epithets" seems to be equivalent to telling the
machine to do something. Have you got a way to get git to control your source
without actually invoking git?

Workflow tooling can indeed be incredibly useful, but the context isn't the
only requirement for success. If something underpinning that tooling changes
or breaks, someone is going to have to understand what happened.

The people who regard _that_ understanding as "ridiculous" are the worst
people to work with and to my mind are the primary reasons that this
"profession" gets little respect.

~~~
loup-vaillant
> _Have you got a way to get git to control your source without actually
> invoking git?_

No, but git _does_ involve ridiculous epithets. No quotes, because I'm dead
serious. As an interface, the Git command line is laughable, and doesn't
deserve a passing grade. Yes, it's the only one we've got. Yes, many
interfaces are even worse. Still, that's no excuse. We can do better.
Hopefully someone will:
[http://tonsky.me/blog/reinventing-git-interface/](http://tonsky.me/blog/reinventing-git-interface/)

\---

Let's take a simpler example:

      $ tar -xzf foo.tar.gz

So, you have to learn the name "tar", the option "-x" for extract, the option
"-z" for gzip, and the option "-f" for file (by the way, the "f" must come
last, or it won't work). _What the fuck is this retarded interface?_

First, why do I have to tell tar to extract the thing, since it's obviously a
compressed archive? Why do I have to tell tar that it's in gzip format? It can
decompress it, surely it can check the compression format? And why, _why_ ,
WHY do I have to tell it I'm processing a _file_? It _KNOWS_ it's a freaking
file!!!

Surely there must be an alternative, like… like…

      $ decompress foo.tar.gz

I personally don't know of such an alternative, and don't use one, because I
was retarded enough to learn the basic 'tar' incantations by heart. Now that I
know them, I can't summon the courage to use a decent interface instead.

But you see my point. Even for a simple tool such as tar, the common use case
involves a cryptic incantation that shouldn't be needed in the first place.
I'm sure many other UNIX tools are like that, I just didn't think about
critiquing their interfaces thoroughly. Yet.

The core principles of the command line are very good. The level of relatively
easy automation it provides is nothing short of amazing. This technology from
the '70s is arguably more capable than most graphical interfaces in current
use. But it does have its fair share of incidental complexity and useless
quirks. We can do better. Let's not forget that.

~~~
profsnuggles
Let's take your tar example. The -z and -x options are flags: they specify
binary on/off options. You can specify all the flags separately on the command
invocation, like so:

      $ tar -x -z -f foo.tar.gz

However, typing -flag -flag2 -flag3 is too many keystrokes, so as a
convenience you can combine all the flags in one cluster: -xzf. The -f isn't a
flag, though; it takes an argument, which in this case is the filename
foo.tar.gz. The argument is required and comes directly after the flag, which
is why the f has to come last. Order doesn't matter for the x and z, because
they don't take arguments; they are just flags. You can see why this matters
if you add in another option like -C, which also takes an argument; you would
end up with:

      $ tar -xzfC foo.tar.gz directory_to_change_to

Which argument goes to which flag? Maybe the first flag gets the first
argument? Then your argument order changes if you type in the flags backwards.

I don't know about your z flag; GNU tar doesn't need it. The x flag is needed
because tar can do things other than extract, like listing the contents of the
archive with the -t flag, or creating a new archive with -c.
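
Those modes and flag groupings are easy to try out end to end; `demo.txt` and
the archive name below are made up for the demonstration:

```shell
# Work in a scratch directory with a throwaway file.
cd "$(mktemp -d)"
echo "hello" > demo.txt

# -c creates an archive, -z gzips it, -f names the output file.
tar -czf demo.tar.gz demo.txt

# These three are equivalent: grouped flags, separate flags, long options.
tar -tzf demo.tar.gz                    # each prints: demo.txt
tar -t -z -f demo.tar.gz
tar --list --gzip --file=demo.tar.gz

# -x extracts; -C (which takes its own argument) picks the target directory.
mkdir out
tar -xzf demo.tar.gz -C out
cat out/demo.txt                        # prints: hello
```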

Finally, why is the -f option required? My first assumption was that maybe
it's because you need to specify the output file when you are creating an
archive. I took a look in the manpage, and the reason is a lot more
interesting.

      Use archive file or device ARCHIVE.  If this option is not given, tar will
      first examine the environment variable `TAPE'. If it is set, its value will
      be used as the archive name. Otherwise, tar will assume the compiled-in
      default.

I knew that tar's name comes from the phrase "tape archive," but I hadn't put
two and two together. Of course you need to specify that you are writing the
archive to a file, because tar was created to back up data to tape! If you
think about it, tar is actually doing the "right thing". Considering why it
was written, tar has a sane default: write the data to the tape drive.

Maybe you already understand all this and I'm reading too much into your
simple example. It feels to me, though, that when people have issues with
something like the unix command line, it's because they just wanted to get
something done and memorized an incantation to do it. There isn't anything
wrong with that, of course, but a tool like tar is SO much more powerful than
just decompressing files. Once you start to dig into it, there is an internal
consistency and logic to it.

~~~
loup-vaillant
> _Maybe you already understand all this_

Yes I do. Every single item. I just feel for the hapless student who is
required to send a compressed archive of his work to the teacher, and is using
tar for the first time.

There's only one little exception: I didn't know GNU tar doesn't require the
'-z' flag (which, by the way, means 'this is a gzip archive') when extracting
a tar.gz archive. Anyway, I bet my hat that the '-z' _is_ required if you
compress something and send the result to standard output: there will be no
'.gz' hint to help the tool magically understand you want it compressed. If
you omit it, tar will likely not compress anything.
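
That bet is straightforward to check by looking at the first bytes tar writes
to standard output (using `-f -` to mean stdout):

```shell
cd "$(mktemp -d)"
echo "hello" > demo.txt

# With -z the stream starts with the gzip magic bytes 1f 8b...
tar -czf - demo.txt | head -c 2 | od -An -tx1

# ...without -z it's a plain tar stream, whose header begins with the
# member file name -- there is no '.gz' extension for tar to guess from.
tar -cf - demo.txt | head -c 8          # prints: demo.txt
```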

The '-f' option is the most aggravating. Nobody uses tapes any more. Tar _was_
doing the right thing, but no longer. -f should be dropped, or rendered
optional, or replaced by '-o' for consistency with compilers… As it is, it's
just ugly.

> _It feels to me though that when people have issues with something like the
> unix command line it's because they just wanted to get something done and
> memorized an incantation to do it. There isn't anything wrong with that of
> course […]_

Actually there is. The users want to do something (usually a very common case
such as compressing or decompressing an archive), then they _have_ to memorise
an arcane incantation. Yes, tar can do much more. Yes, the command line is
powerful and flexible and more. This is Good. (Seriously, I miss my command
line whenever I have to touch Windows.) On the other hand, some common cases
are just not well handled, and it is full of idiosyncrasies that have
_nothing_ to do with the aforementioned benefits.

When the user wants to decompress files, it should not be more complicated
than 'decompress archive.tar.gz'. Though thanks to an uncle comment, I now
know of the 'unp' tool, which works just like that: 'unp archive.tar.gz', and
you're done. (And the man page is refreshingly short.)
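
A minimal sketch of such a wrapper is just a dispatch on the file extension
(the function name and format list here are illustrative; `unp` itself covers
far more):

```shell
# decompress: unpack an archive by inspecting its extension.
decompress() {
  case "$1" in
    *.tar.gz|*.tgz)   tar -xzf "$1" ;;
    *.tar.bz2|*.tbz2) tar -xjf "$1" ;;
    *.tar)            tar -xf "$1" ;;
    *.zip)            unzip -q "$1" ;;
    *)                echo "decompress: unknown format: $1" >&2; return 1 ;;
  esac
}

# Round-trip a gzipped tarball to show it working.
cd "$(mktemp -d)"
echo "hello" > f.txt
tar -czf f.tar.gz f.txt
rm f.txt
decompress f.tar.gz
cat f.txt                               # prints: hello
```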

------
tw04
I find it sad that, in an academic setting no less, a PROFESSOR is telling his
students it's BULLSHIT to know how the systems that are vital to their work
actually function. He sounds like the guy who goes to Jiffy Lube because he
doesn't know what an oil filter is, and thinks that it's "bullshit" that
anyone does. I just want to turn the key and go! That's great, but as an
academic you should also want to know what the fuck happens when you turn the
key. And encouraging your students to devalue knowing how things work under
the covers... I think he needs to find a new line of work.

~~~
akumpf
I understand the perspective you're taking, but I politely disagree. I think
you're missing the bigger point here.

Let's say someone is interested in baking a cake. There is a LOT you can
learn, spanning general baking techniques, chemistry, design, art, tasting,
etc. But if your immediate response is "we need flour, so go plant some wheat
and wait a few months," they would likely lose interest.

Teaching people to plant and harvest wheat is awesome, but for most people it
probably shouldn't be the first thing you're met with when you're trying to
learn how to bake a cake.

~~~
Kalium
If you're studying "Baking Science", which covers everything from the
beginning to the end, starting with "Let's grind some flour" is a good idea.
People studying "Baking Science" need to understand the whole process, rather
than believing everything starts and ends with pre-packaged recipes and
machines that do everything for them.

That's what's going on here. A "Baking Science" curriculum that didn't impart
people with a knowledge of where flour comes from and how it's made would be a
joke.

~~~
drostie
There may be space in the whole curriculum for that, but it's probably not
essential _here_ and _now_. The whole pull of Bret Victor's presentations is
that they show us what it would look like to program if our tools were as
modern as Word.

The baking-metaphor problem is that you have students who are supposed to come
in and investigate how the arrangements of toppings on a pizza affect both
nebulous qualities (like deliciousness and heterogeneity) and rigorously
measurable ones (like moisture and elasticity) of the pizza crust. However,
when they come to your kitchen, most professors put them in a totally new room
which contains millstones and grain and milk ready to be turned into fresh
mozzarella, with nothing labeled. There are reasons for this -- real pizza
aficionados make very different choices about how they want to compose their
sauces and which cheeses they want on the finished result and even what
leavening agent causes the dough to rise, so the framework for pizza-baking is
as general-purpose as possible. But those reasons make things difficult for
the newcomer.

The professor is just saying, "when we start, I walk everyone through the
process of finding the flours over here, the additives over there, and using
the bread-machine to mix them and knead them. I then show them where to find
the canned sauces and the pre-grated cheeses, so that they can start with
minimal knowledge baking up some pizzas for science. Our concerns are very
high-level and I want them to be fussing with baking times and topping
arrangements, but so many of my students seem to be stuck on trying to turn
milk into mozzarella."

~~~
Kalium
The problem he overlooks is that we do not _want_ our tools to be "as modern
as Word". We want modern tools, but not modern as Word defines it.

There are excellent solutions for his problem. Distributing a pre-configured
VM is a good one. Instead, he wants students to have the experience of
bootstrapping, but he also wants it to be painless and magical.

But instead of looking at his actual problem, he's wound up railing against
all the critical freedom that makes the field something other than a glorified
exercise in painting by numbers.

EDIT: For the record, turning milk into mozzarella is actually really easy and
quite suitable for a novice. I've done it. Takes about an hour, end to end.

~~~
dllthomas
My understanding is that this is not so much for a single course (where a VM
would be a good solution) as for general student research. Prepackaged works
_less_ well there (though it may still be made to work).

~~~
Kalium
When you set out to blaze a new trail, you should not expect to find a nice
paved road with bus service.

If he decides to ignore all the tools (puppet, chef, scripts, etc.) designed
to make all of this easier, that's his fault.

~~~
TeMPOraL
> _If he decides to ignore all the tools (puppet, chef, scripts, etc.)
> designed to make all of this easier, that's his fault._

You're kidding here, right?

I find Puppet and Chef super confusing and not worth the effort to learn at my
job right now, _and I'm a fucking programmer by hobby and profession_. This is
_exactly_ the kind of bullshit people doing science should never have to deal
with.

~~~
Kalium
Puppet, at least, is pretty straightforward. You are describing what you want
your system to look like. Puppet takes that description and makes your system
look like that.
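
For what it's worth, a Puppet manifest really is just such a description. A
minimal, hypothetical fragment (the package and file chosen purely for
illustration) looks like this:

```puppet
# Desired state: git installed, and a login banner with known contents.
# `puppet apply` compares this description to the live system and fixes drift.
package { 'git':
  ensure => installed,
}

file { '/etc/motd':
  ensure  => file,
  content => "Managed by Puppet\n",
}
```

Whether that mental model is worth the setup cost for a lone researcher is, of
course, exactly what's in dispute downthread.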

People doing science who want to use computers should expect to have to learn
a thing or two about using them. As in more than using Word if they want to do
complex, custom, not-done-before tasks. Much like people who want to do novel
things in chemistry should expect to learn more than how to make black powder.

This guy is upset that novel things haven't already been thought of and
planned for by the people who make shiny GUIs. This is a farcical position. If
it's really that novel, of course nobody's written a GUI for it.

Point is, tools to address his problem already exist. He dismisses them,
because they don't do it in an arbitrarily flexible and powerful way while
still being infinitely iTunes-y.

~~~
TeMPOraL
> _People doing science who want to use computers should expect to have to
> learn a thing or two about using them._

A thing. Or two. Not half a year's worth of full-stack dev education.

He is not complaining about having to learn things. He's complaining about
having to learn _irrelevant_ things. Infrastructure. He wants to make a soup,
and he's being asked to run his own plumbing to get water, and to drill for
his own gas for heating. And people here are saying he should stop
complaining, because nobody is making him build his own drill - it's already
provided via a Puppet script in a Git repository.

He doesn't dismiss tools because they aren't iTunes-y. He dismisses them,
because to use those tools he has to learn more tools, for which he has to
learn even more tools, and all that effort is throwaway, because the next time
he will need to learn _different_ toolchains (or should I say, tooltrees with
stupidly high branching factors).

> _Puppet, at least, is pretty straightforward. You are describing what you
> want your system to look like. Puppet takes that description and makes your
> system look like that._

It makes sense for a team of web developers doing high-scalability
applications. It is bullshit for a researcher who just wants to crunch some
numbers with a bit of Python code.

~~~
Kalium
There's the problem. He doesn't understand what the proper bounds of relevance
are. He can't see how a given task is relevant, so it's bullshit. That's more
a comment on the limits of his thought processes than anything else.

He wants to do novel things. This means going places where not everything is
preconfigured for his pleasure. It also means he needs to know how to use his
tools, because when he runs off the edge of what point-and-drool does for him
he will need them.

He asks for a world where point-and-drool covers everything. All I can say is
that what he asks is impossible for what he wants.

------
robotresearcher
Folks, this prof isn't saying the command line isn't worth knowing. He's
saying that learning it costs time which would be better spent on actual
research. Becoming the three-millionth sed expert does not expand CS.

The command line is the best available technology for many tasks. We're going
to need these skills for the foreseeable future, and maybe forever. CS PhD
students are trainees, and it's OK if they invest in learning it. But for the
prof, it's completely reasonable and rational to want to minimize the total
time his group spends on this stuff, for productivity's sake.

I'm a CS prof and I spend way too much time wrestling with flaky build
scripts, incorrect docs, overly-specific dependencies, etc, because the
original author did not put in the extra effort to make the code usable by a
third party. This apparent economy costs a huge amount of other people's time.

Bearing this stuff in mind, a prof can actively try to minimize this overhead
by (i) decent design, (ii) a bit of spit and polish before releasing code, and
(iii) maintaining lab environments that already work (VM/container images,
etc.). Let's train the students to care about this stuff.

edit: To clarify, I'm not saying that the author of some open source code owes
me anything in terms of usability or my time. But if they want to maximize
their impact, a little effort in that direction can pay off. Like writing a
paper that is easy to follow rather than an ugly brain-dump.

~~~
Roboprog
(iii) sounds on the mark, and perhaps what the guy really wants to say is that
there is a market to help profs do that prebuilt setup???

~~~
Fuzzwah
There's a market, but no money.

------
kylebgorman
I am a jr. CS professor working in an applied field (computational
linguistics). I have found myself repeatedly giving the same impromptu
lectures to RAs and graduate students concerning various UNIX tools. This is a
clear inefficiency (I could have planned the lectures ahead of time, with nice
illustrative examples, and given them just once). The scarier thing (for me)
is that I occasionally observe students (as well as many open-source
contributors) doing something in an enormously complex way because they have
not yet _discovered_ tools like `tee` or `cut` (it is hard to know just how
much time is wasted in this fashion).
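
For readers who haven't run into those two: `cut` selects delimited fields,
and `tee` copies the stream to a file without breaking the pipeline. A tiny
made-up example:

```shell
cd "$(mktemp -d)"
printf 'alice,en\nbob,fr\ncarol,en\n' > speakers.csv

# Keep field 2 of each comma-separated line, save a copy of the
# intermediate stream to langs.txt, and pass it on to sort.
cut -d, -f2 speakers.csv | tee langs.txt | sort -u    # prints: en, then fr
```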

Consequently, a colleague and I created a course entitled Research
Programming, which is taken by all our masters students, as well as some RAs
and Ph.D. students. At first, some of my other colleagues were skeptical; they
felt that students should already know this material (true in a normative
sense, perhaps, but contrary to fact). However, the "economy" argument appears
to have won them over and the class is going well thus far.

Personally, I don't agree with Guo's assertion that command-line skullduggery
is intellectually unstimulating. But it's unfortunate when it gets between you
and scientific (or business) progress, and as an educator I hope to minimize
that.

~~~
drostie
That's awesome.

The only thing I think would make it even better would be if we could make it
an interactive course online, since so many of these details about shells,
commands, etc. are things which happen in a computer context anyway. If
several of the major institutions could collaborate on a common
research-programming tutorial and release it openly, it would be a huge win
for more fields than just computational linguistics.

I've seen some related links about putting shells in the browser, so it's more
than just academic speculation; I think one could put the site together with
under a thousand hours of programmer time at this point. Hosting costs might
be a concern, but application development is probably not too difficult.

~~~
kylebgorman
Good idea, I'd love to enable online learning for this type of material. In
the meantime, I'm putting handouts for the first unit ("UNIX") online, and the
second unit ("Python") will use iPython notebooks. There may be Coursera
(etc.) offerings that cover similar material, but I haven't looked into it
yet.

------
lucb1e
In a linked article about why the "command-line" is "bullshittery" in the
first place he compares Python to Excel and declares Excel superior because it
can make graphs much more easily. His students apparently ask why use Python
when you can use Excel.

First off, Python is not a command line. Secondly, what the fuck does he think
Excel's source code is written in? MS Publisher? They're just different
things.

The author does not seem to understand why command line tools still work the
way they worked in the sixties (as he loves to point out), which is because it
just works. He mentions git, the version management system that is mostly used
from the command line, but forgets to mention that although git could surely
have (and has) a GUI, it would prevent you from doing many things. I often
pipe git output to other tools, something you just cannot do with a GUI.
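
As one generic example (a throwaway repo stands in for a real project),
counting commits per author is a single pipeline:

```shell
# A scratch repository so the pipeline has some history to read.
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com
git config user.name Demo
echo a > f.txt && git add f.txt && git commit -qm first
echo b >> f.txt && git commit -qam second

# git emits plain text, so sort and uniq compose with it directly.
git log --format='%an' | sort | uniq -c | sort -rn
```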

Oh, and he uses nano for command line text editing. I suppose that also
explains a thing or two, when working with Vim or Emacs is so much better than
any other text editor.

~~~
Kalium
In fact, git has multiple GUIs. Often a user will use multiple in their git
workflow, because git is too complex to fit neatly into one UI.

~~~
aet
Is it possible to make Git as easy as Dropbox?

~~~
Kalium
Yes, provided you're willing to give up all the complex ideas that make git
more useful than Dropbox for some things.

~~~
aet
I'm imagining a scenario in which I have a team and I have 10% git experts,
90% not -- can I create a system in which they are basically "drop boxing" and
the merging etc. gets handled automatically or decided by an expert? -- Maybe
ultimately everyone should just learn git, since it isn't that complicated.

~~~
Kalium
I've been in such a scenario. Making it simple and drool-proof for some people
while everyone else works around the consequences of that is not as lovely as
it sounds. It winds up being much, much easier to just teach everyone else
enough of the basics to use git.

It's almost certainly less difficult to use than whatever ERP the company
uses.

------
rlpb
He mentions "bullshit" 19 times in the article. It's a shame he isn't able to
describe why it is bullshit (apart from that he has to use it) or how we could
make it any better.

The command line is an incredibly powerful tool, which allows us to _very
quickly_ combine components together in many more ways than any GUI developer
could anticipate. This is why he's having to use a command line in the first
place: no GUI developer has addressed his very specific use case, and nobody
has figured out a better way of doing these things in the general case than
the command line.

It would be great if it were easy to learn. But short of any real workable
suggestions, nothing is going to change.

In general, I understand the word "bullshit" to mean something that has been
constructed to obstruct somebody in a way that isn't actually necessary. I
don't see how that applies here. The command line is _the only known way_ for
him to achieve his goals. By definition, then, it cannot be bullshit.

------
aroberge
As someone who has spent a fair bit of his life teaching, I find it
"fascinating" to see the reactions of the users here, most of whom appear to
have at least some experience with programming, and none of whom (from what I
can see) have any significant teaching experience to draw upon.

Teaching beginners can be extremely challenging. A majority of beginning
students, no matter what field is being taught to them, will simply not master
the material and will easily get "tripped" by something that advanced users
will find trivial.

Most of what P. Guo wrote in his various articles, including this one,
strongly resonates with me, and jibes with my own experience as a teacher.
His Online Python Tutor tool [1] is a fantastic teaching (and learning!) aid
and shows his understanding of what is needed to help beginners learn.

[1] [http://pythontutor.com/](http://pythontutor.com/)

~~~
juanre
I completely agree. The only beginners I have taught were my sons ---I wrote
what I thought were the basics for them in a book, The Hackers Ways [1]--- but
P. Guo's articles very much resonate with me as well. Helping them see that
the tools were the means to an end, something they could master and that would
make them much more efficient, was a big part of the effort.

[1] [http://juanreyero.com/hacker-ways/index.html](http://juanreyero.com/hacker-ways/index.html)

------
trarman
I have the opposite problem. I hate GUI bullshittery. With the command-line I
can string together multiple tools with options to work the way I need them
to.

If you throw in a tool with nothing but a fancy GUI, I can't simply pipe data
through it. I have to find buttons to click, search for hidden menus, resize
windows, etc. What a waste of time! Give me a man page any day!

He says Excel is superior because it makes charts easier? Guy is in the wrong
field. Learn some Octave or MATLAB.

~~~
jackmaney
Agreed. I find any of R, Python (with scikit-learn, pandas, etc), or even SAS
far more palatable than SPSS or Excel.

------
kjetil
With love to all the UNIX apologists in this thread: you're missing the point.

Why isn't software easier to install and configure? Why are we still using
relics from the 70s as the base of our systems? Why are we so focused on
mastering the tools of the past instead of creating new tools for the future?

I'm a long time Linux and BSD user. I sit in a shell for hours every day. And
I still can't believe why we put up with this crap.

Shell scripts and one-liners. Ls and cat and grep and uniq and sed and awk and
tar and loads of other low-level tools that you haphazardly string together.
For the love of all that is holy, just use a real programming language!

With the powerful computers of today, we still interface to other computers
through a text-based bottleneck that's basically from the stone age. What's up
with that?

OK, this turned out to be a little rant. Don't take it the wrong way; I love
UNIX, but I also hate all the cruft that just seems to accumulate and never
gets cleaned up. I think we could do a lot better.

~~~
super_mario
The shell is a programming language. You are interactively programming, and in
a more efficient way, too: you are telling the computer to do things for you,
instead of clicking on pictures or speaking in a more verbose language.

So I don't really get your objection. Even if you programmed what you wanted
the computer to do in some other "real" language, you would be doing exactly
the same thing, only in different syntax. And source code would still be an
even bigger bottleneck :D. Try writing a Python script to wget a file and
extract it to the local file system. I think you will find it's more time
consuming :D.
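
For comparison, here is roughly what that script looks like (a sketch using
only the Python standard library; the function name is made up). The shell
equivalent is a single line: `wget URL && tar -xzf file.tar.gz`.

```python
import tarfile
import urllib.request


def fetch_and_extract(url, dest="."):
    """Download a .tar.gz archive from `url` and unpack it into `dest`."""
    # urlretrieve saves the download to a temporary file and returns its path.
    archive_path, _ = urllib.request.urlretrieve(url)
    with tarfile.open(archive_path, mode="r:gz") as archive:
        archive.extractall(path=dest)
        return archive.getnames()
```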

~~~
dllthomas
A linguistics professor I was speaking with once put it (paraphrasing from
memory): "With a shell, you have all the expressibility of a language. With
pointy-clicky, you're reduced to pointing at things and grunting."

------
ghshephard
Every domain - whether it be physics, chemistry, electrical systems, biology,
or yes, even computing systems - has an initial training window in which the
practitioner requires some training on the basic elements of the toolset.
Initially, it's painful and annoying, as you are literally starting from the
ground, and you don't even have the vocabulary to understand the instructions
you are being given. But, eventually you develop a vocabulary, and learn the
rudiments of the tools - even if they are completely peripheral to your actual
domain of research.

I have a friend who wanted to be a cook - she was interested in food. When she
went to chef school for a couple of years and was learning how to cook meat,
they spent two or three days just talking about knives. Apparently knives are
very important to chefs. So they taught her about the metals, the edges, the
sharpening, the care and maintenance - the entire science of knives. She went
in with no expectation that she was going to have to learn any of this stuff -
but before she could start cooking, she had to learn how to use her tools, and
jigs, and Moran edges, and the Scandinavian grind, and what the hell does this
have to do with cooking again?

Same with working in a field that requires you use software systems. The
command-line is your toolset. Master it, and you can do anything. Don't master
it, and you are like a chef who is using whatever knives happen to be in the
kitchen, and in whatever condition they happen to be in.

But, if he thinks that we have "command-line bullshittery" today, when 95% of
the time I can just type "brew install foo" or "apt-get install bar" and have
a pretty damn good chance that 95% of my dependencies are fulfilled, and if
not, a quick google of the error message will usually give me a stack-
exchange/server-fault/stack-overflow solution, I can't imagine what he thought
working with these systems was like in the 1990s...

~~~
pyre
Maybe he's pining for a development environment that looks like this:
[http://fc05.deviantart.net/images/i/2003/2/5/0/Operation_Swo...](http://fc05.deviantart.net/images/i/2003/2/5/0/Operation_Swordfish.jpg)

Then his students will know that it's on the bleeding-edge of _The Future_.

------
showerst
At this point, isn't it more effective to have a virtual machine image with
everything set up? Virtualbox runs on tons of platforms, and I imagine you
could build a 'CS' box with most of the tools installed and just update it a
few times a year.

Granted, that keeps them from learning how to set it up on their own, but his
stated goal is to cut out that BS.

I think this would be even more valuable in undergrad; I remember my first few
college CS classes spending the whole lab helping people install Java on
various flavors of Windows.

~~~
pmontra
A course I took on Coursera years ago did exactly that. Not that it would have
been a big problem for me to set up the software we needed (I was on the very
same Linux distro anyway), but it saved me time too, and I didn't have to read
all the usual bullshittery (funny word) in the forums.

So Dear Professor, you had to deal with the command line to setup your own
machine, do it in a VM and let your students download it.

For extra safety pin the versions of the software your students will be
working on (for example Octave) if you're afraid that an automatically updated
new version will be incompatible with the assignments and all the other stuff
you prepared for your class.

~~~
lbearl
The bad thing here is that it sounds like these are graduate students, not
undergrads. Grad students should understand how to wrangle the command line
(if not, how did they get through undergrad?).

~~~
uiri
The first lab of my second programming course in University involved making
sure that every student knew how to compile and run Java programs on the
command line. The reason the Prof gave for this exercise was, to paraphrase,
that he was tired of dealing with grad students who have no command line
knowledge.

It is certainly possible to get through a computer science undergrad at (to
borrow Spolsky's term) a Java School and have absolutely no knowledge about
Unix nor the command line. "Modern" tools like Eclipse and IntelliJ hide all
of that stuff away.
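That first-lab exercise is only a few commands once written out. A sketch,
with `Hello.java` as a stand-in file name:

```shell
# Compile and run a Java program from the shell, no IDE involved.
cat > Hello.java <<'EOF'
public class Hello {
    public static void main(String[] args) {
        System.out.println("hello from the command line");
    }
}
EOF
javac Hello.java   # compiles the source to Hello.class
java Hello         # runs the class on the JVM and prints the greeting
```

This is exactly the step that Eclipse and IntelliJ perform invisibly on every
"Run" click.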

~~~
lbearl
That is a good point, I majored in Computer Engineering, so my experience is a
little bit different (kind of hard to cross compile to various embedded
platforms without using a command line).

------
lazyant
"There is a huge disconnect between the elegant high-level ideas discussed on
the whiteboard (while presumably sipping cappuccinos) and the grimy, grungy,
terrible command-line bullshittery required to set up a computing environment
suitable for implementing those ideas in code."

Welcome to real life outside academia, where you need to step down from the
ivory tower and actually get your hands dirty. Incidentally, automatic
configuration and deployment have never been as easy as they are now, with all
the recent tools.

------
bkeroack
Perhaps he should try to explain the rich history of this "command-line
bullshittery". It was the original OOP, before anyone articulated it.
Encapsulating functionality into small, easily understandable programs that
could be chained together was an amazing breakthrough in software engineering
at the time. If you look at old Unix sources, the source code for many
utilities was only a few hundred lines at most. Essentially the size of what
we would consider a "class" today.

Whenever I'm getting frustrated with this stuff I try to step back and realize
that we're all standing on the shoulders of giants. It would not be possible
to do all this grand whiteboarding were it not for all the work done before us
to make it possible. We'd still be grappling with how to wire together
transistors and magnetic core memory.

------
Kalium
Clearly, all of the people who make useful and powerful tools should spend
less time on useful features and more time on shiny Apple-style UIs. Because
who wants useful features when you can make software cater to people who don't
want to learn?

This author's problem is not with bullshit. It's that he doesn't believe he
should have to understand his tools at all. He wants them to magically work
because _he_ knows what he wants them to do.

Do you want to do novel, interesting things with your tools? Don't expect
someone else to have paved the way for you first.

------
putzdown
So here's my theory. Whenever we have the experience of a field of useful
knowledge being "bullshittery," it's because we ourselves learned it in an
ad-hoc and unsystematic way. There's no question that the C language, say, is
complex and full of arcana, both syntactic and in its library. But most people
go about learning it with the expectation that this is so. We're often
accompanied by friends and guides. Learning a language, we usually have the
sense that there's a fairly clear beginning, middle, and (just maybe) end to
our quest--or at least a saturation point, where our mastery is sufficient to
do useful work.

The (UNIX?) command line feels like bullshittery not, I suggest, because it's
intrinsically more arcane and arbitrary than many languages and standard
libraries, but because of our procedure of learning. People tend to discover
it. We tend to learn each part (command, option) in response to a specific
need. And this both compartmentalizes our knowledge (we know what "ls -la"
does but not what "ls -a" does) and makes us quick forgetters (on the idea
that what we learn for a specific purpose we tend not to retain for long).

Therefore, don't blame the command line. Blame the educational process. And if
you want to experience the command line without frustration--not as
"bullshittery" but as simply a tool you understand and have mastered--the
solution is to do what the author seems to allude to, which is to take a more
systematic and deliberate approach to learning the thing.

------
tdicola
I've been on both sides of the fence, I remember being a frustrated student
who couldn't get libraries, tools, etc. to work, and now I have some years of
experience and know the ropes of working in a Linux environment. My best
advice for people struggling is that if something feels difficult and painful,
you're probably doing it wrong and should look for an easier way.

I remember I would bang my head against a wall trying to get Linux/Unix
software to compile in Windows using cygwin, mingw, etc. and it was just a
complete nightmare. However running a Linux virtual machine on Windows is drop
dead easy--you can have one up in seconds with something like Vagrant. Much,
much easier to get stuff working in a real Linux environment.

Once you're in a modern Linux environment if you're compiling software
yourself, again you're probably doing it wrong. Check out your package manager
and there's probably a compiled version already available. Is it too old? Look
for someone else who's compiled it, or an experimental package source and try
that next. Still can't find it? Ok download the source and give it a shot. If
you have to do anything more than a typical './configure && make && sudo make
install' step back and see if maybe you're doing something wrong.
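That decision ladder can be written down as a function. A sketch that merely
reports which route applies for a given command name (the package-manager
branch just checks for `apt-get` or `brew` as examples, and the actual install
commands are left as comments):

```shell
# install_route: report which of tdicola's three routes applies.
install_route() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "already-installed"   # nothing to do
  elif command -v apt-get >/dev/null 2>&1 || command -v brew >/dev/null 2>&1; then
    echo "package-manager"     # e.g. sudo apt-get install "$1"
  else
    echo "build-from-source"   # last resort: ./configure && make && sudo make install
  fi
}

install_route sh   # a shell is always present, so this reports "already-installed"
```

If you find yourself in the third branch doing anything fancier than the
classic three-step dance, that is the "step back" moment the comment
describes.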

------
Roboprog
Cry me a river.

I've yet to meet the person who was a programming god in trendy-language-of-
the-day, but couldn't use a command line (perhaps with a few weeks' practice
to change shells / operating systems). Maybe it's more to do with off-shoring,
but whenever I see "programmers" who can only use a GUI, I know I'm in for
trouble, and said "programmer" won't actually be good at issuing orders as
lines of text for the computer to follow.

Yes, please give me cave paintings and tally marks! Don't make me learn stuff
like functions, sets, pipelines or language!

A more valid criticism would have been "some programs/utilities/packages are
more complicated than strictly necessary", with examples. Others have pointed
out that it should be possible to make a "distribution" of a tool suite
(perhaps as simple as a script containing the individual setup commands),
rather than making every student run every step by themselves.

Having an "install" clicky thing from a 3rd party that has no options and just
works is great. As long as that 3rd party knows all your needs (really knows,
not the Apple mind control illusion) and has control of the environment in
which you are dropping the package.

~~~
Roboprog
Wow, this came out kind of bitter. I guess maybe due to too much having to
deal with a Windows culture of grotesque ignorance the last few years.

~~~
zwerdlds
You're right though. Even MS is starting to realize the lack of power in their
GUI-centric designers. I will say that, in trying to catch up, theyre way out
of practice in buildin powerful/customizable command like tools (see SQLMetal
and t4 for starters)

------
pessimizer
Learning how to deal with the unix command line is awful, and unix is awful,
but it's also the best thing we have right now, and using it is a skill which
will pay off again and again over decades.

Helping people to learn it is definitely torture, but it's honorable work that
contributes to society in a big way.

I have no understanding of the point of view of the angry comments on this
thread, although I suspect that they're based on the headline. The author
clearly thinks that the command line is the greatest thing since sliced bread.
From the article:

"What is wonderful about doing applied computer science research in the modern
era is that there are thousands of pieces of free software and other computer-
based tools that researchers can leverage to create their research software.
With the right set of tools, one can be 10x or even 100x more productive than
peers who don't know how to set up those tools."

"[...]perhaps what is more important to a researcher than programming ability
is adeptness at dealing with command-line bullshittery, since that enables one
to become 10x or even 100x more productive than peers by finding, installing,
configuring, customizing, and remixing the appropriate pieces of free
software."

"As an advisor, I've found that one of the highest-leverage activities that I
do with my students is guiding them through the intricacies of command-line
bullshittery. There is simply no substitute for sitting down with them one-on-
one on their laptop and walking them through all of the arcane commands to
type, what they each mean, and how to interpret the bullshit output that's
barfed out to the drab terminal."

------
akumpf
This is like an alternative form of "Yak Shaving." :)

Instead of going super far down a path, following each step along the way, and
somehow ending up with a yak in your hands, you're immediately asked to find a
yak and you rightfully ask, "what does this have to do with yaks?"

Yes, the command-line is powerful, but it is definitely a huge hurdle for most
as they're really just trying to do something else. Well stated PG.

------
zwerdlds
Sounds like a docker/vagrant use case.

I do disagree with the thesis though. If you're building on a tool, you ought
to learn how to use it properly. Otherwise what happens when something breaks?
Placing "sensible defaults" works until you need to change the tool's
functionality, at which point, if you don't know how to use the tool, you will
be once again lost.

It's a whack-a-mole method vs the dynamite approach IMHO.

------
bb0wn
There is no 'overcoming' of command-line bullshittery -- it's a barrier to
entry, and computer professionals should be glad that it's non-trivial to
become productive with your environment -- the time you put into becoming
fluent in POSIX and all the requisite tools pays off 10x. It makes you that
much more valuable to have put in the time to learn an interface that is
powerful.

It's hard because there is a lot you can do with it, both in research and in
the workplace. I am really biased against software people who are not good
with their tools. I don't want to work with people that are counting on their
fingers and toes trying to get stuff done -- I want to work with people who
embrace the tools and take pride in using them well. Is playing the scales
keyboard bullshittery, or the fundamental skill required of being able to play
beautiful music?

~~~
robotresearcher
It's one thing to be proud of building an accomplished skillset. But it's
another to be glad that other people don't have access to that power.

~~~
bb0wn
I'm not glad that others don't have it -- I'm just glad that it's non-trivial
because it means that learning it is an accomplishment worthy of taking pride
in. As far as it being a barrier to entry is concerned, I think that having to
learn something that is not-so-simple to become a professional separates the
wheat from the chaff in the same way that someone like a professional exam or
certification would. I have worked with newbs in the past who don't know how
to use their tools correctly, and they're usually a net negative re:
productivity on a project.

------
sehugg
After his students finish their papers or projects or whatever, most of them
will go and stick their code into a .tgz file on a website, unpackaged,
unmaintained and forgotten about until years later when another researcher
needs to refer to it, who will then grumble and complain about command-line
bullshittery...

------
bjornlouser
When did command-line builds of free software that enable one to "... be 10x
or even 100x more productive than peers who don't know how to set up those
tools" become too tedious and esoteric for computer science students to master
without hand-holding?

------
naunga
I wonder if he has a colleague in the English department who wrote an article
complaining about the amount of "notebook bullshittery" that an English
student has to go through to start writing an essay.

Calling out the pains that students go through having to go to a store and
pick out a notebook? And the woes and confusion that they experience when
faced with the vast and baffling array of colors, paper sizes, page counts,
line widths, and so on.

I found the woes outlined in this article to be dubious. This is simply a
"first world problem".

On the other hand, I did get a warm-fuzzy feeling regarding my job security,
since I am more than comfortable at the command line and further: learning to
use tools I have never encountered before.

~~~
recursive
There is a lot less bullshittery surrounding the acquisition of notebooks and
pens than there is around setting up certain software. A ~year ago I decided
to set up a rails development environment, having no experience with it or
ruby. I assembled a list of repeatable instructions as I went. It took me
about two days, and had over 50 steps. If you are experienced with the domain,
you could probably do it in 5 minutes, but there is not a good reason why it
should require so much specialized knowledge just to get it set up.

------
gear54rus
From my (not-so-big) experience in IT, the sub-fields of this field are really
tightly networked together (e.g. if you understand piping in bash, you can
then learn about stdin and stdout concepts in your future C program, or the
other way around). Therefore I really can't see how one could try to excel in
one area and not touch the others.
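The bash-to-C connection mentioned above fits in one line: the `|` in a
pipeline is exactly the stdin/stdout plumbing a C program sees as file
descriptors 0 and 1.

```shell
# Each program reads the previous one's stdout as its stdin.
printf 'banana\napple\n' | sort | head -n 1   # prints "apple"
```

Understanding either side (the shell operator or the C file descriptors)
makes the other almost free to learn.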

While you are learning about the area of your choice, you inevitably
understand _ideas_ that hold true for many other areas (provided you possess
the perception to see them).

Expanding on these ideas, in turn, may help you understand something new about
your area of choice again. Not willing to do so feels like a wasted
opportunity.

------
jordanpg
Isn't this just blossoming into the vi/emacs/IDE debate couched in different
language?

Can you -- in principle -- accomplish goal X with tool Y? Yes, almost always.
Some people will be far, far more efficient with the CLI and likewise some
people will do better with GUIs.

There are some tasks that are only amenable to CLI work, like kernel
development. And there are some that are only amenable to enormous IDEs like
mobile app development.

The author is saying that for the scope of his problems, the CLI just gets in
the way. This isn't always true. He is careful not to make any sweeping
generalizations.

~~~
Roboprog
Tangent - but why does mobile app development require a huge IDE??? Aside from
how bad the APIs are.

What if mobile app APIs had a simple interface to add layouts to views, and
widgets with their callbacks (in one call) to the layouts? And what if all the
widgets and layouts had so much of their behaviour standardized in the base
classes that you almost never needed to look up, auto-complete or otherwise
remember 100 distinct classes' use cases???

~~~
jordanpg
I'm most familiar with Android, but I would say it's because there is too much
infrastructure under the hood.

Google seems interested in abstracting away almost everything about the
framework and build process into the Gradle plugin in Android Studio, to
create the illusion of what you're describing. But behind the scenes, all of
the machinery of IntelliJ is making that happen.

Related question I asked a while back on SO:
[http://stackoverflow.com/questions/24151396/how-can-i-
view-a...](http://stackoverflow.com/questions/24151396/how-can-i-view-android-
build-gradle-tasks-from-android-studio)

------
niels_olson
Coming to programming via biomedical research (I'm the undisputed grand master
of microscope bullshittery in my pathology department), I have to say, the
wall of command-line bullshittery is _much_ steeper than that of anything
else. I can count on one hand the people from whom I have had the opportunity
to learn _anything about the command line_ in the last 10 years. And I can
count the number of hours of that learning I got, too.

I submit a more appropriate analogy might be mechanical engineering. You want
to be a mechanical engineer and learn to build bridges. You'll start in middle
school shop class and learn about sand paper and how to draw isometric
projections. By the time you graduate high school, you can spot a 1/4" x 20
thread pitch (aka "quarter-twenty") from 10 feet. You'll learn the
relationship between shims and wire gauges, drill sizes, flutes, the geometry
of volutes and involutes, the merits of various steels, the dangers of
overhead welding, the properties of concretes, how to patch carbon fiber
defects, the grades and types of ball bearings, needle bearings, babbitt
bearings, and their lubricants, crush plates, etc, etc. Along the way you'll
have to learn the difference between Y and Delta three-phase power, how to
analyze an op-amp, and how to model how the center of gravity of a car with
shocks changes when it goes around a corner.

The difference is that we have shop programs in middle schools and high
schools that cultivate that culture. There is precious little in the way of
UNIX culture cultivation before college.

------
cheald
It sounds like maybe he should invest some research time into improving
software interfaces.

I get the frustration with getting mired in infrastructure before you can
actually start producing useful things. I experience that regularly. But you
know what? It's going to be there for your whole career, and it's often a
useful driver which inspires people to improve the things that annoy them. I
usually learn very useful things in the process which improve my ability to
comprehend the entire system.

I think there's a balance to be found here - it's unreasonable to expect
students to spend their entire semester in an advanced CS research class
learning how to use rudimentary tools, but maybe it's also unreasonable to put
people into advanced CS research classes before they've learned how to use
their field's rudimentary tools. The student should know how to do it, and the
professor should attempt to ensure that _they don't have to_.

On a different topic: My anecdotal experience is that academically produced or
oriented software is nearly always among the worst offenders of this - it is
overly complex, poorly documented, and not well supported. This leads me to
wonder if there are attitudes in academia that such things are "bullshit" that
get in the way of "real work", which lead to this problem in the first place.

------
ntrepid8
Behold a man standing on the shoulders of giants and complaining that the
giants are not tall enough.

------
chubot
It's true that the "command line" takes a surprisingly long time to learn.
I've definitely surpassed the 10,000 hour mark, and I'm STILL learning things.
(People say that about Vim alone :) )

That's probably because it all superficially looks the same (sequences of
characters), but there's a lot of depth and functionality spanning many
different concepts -- e.g. compilers (cross-compilers too), data compression,
networking, storage and file systems, hardware devices, crypto, parallelism,
data formats for every problem domain, several important little languages,
several syntaxes for the same language (regexes, etc.).

But I think you basically have to learn it to become computer literate, and
all computer science researchers should be computer literate.

Otherwise you can't "fend for yourself". If you are relying on your professor
to set shit up for you, that's just lame and you will not be useful in many
other contexts.

What's the alternative? I think it's "mouse bullshittery", which is even
worse. For example, compiling software on Unix can be hard. But I used to use
Windows, and it's 10x harder there. I remember the last experience I really
had with Windows was installing Visual Studio to compile Python, and I hit
some trivial but showstopping bug in the CDs, and gave up in disgust. The
workaround was to copy some DLL somewhere into your \system32 folder, and I'm
like "really?"

I don't really use Macs for development either, because it's annoying to set
up compilers. (It's 10x less annoying than Windows, but 10x more annoying than
Linux).

------
msamwald
I think the article describes a sentiment that far more people dealing with
computer science / programming share than is usually perceived. I think that
is mostly because of the (seemingly warranted) fear of being called out as
unprofessional by other people in the field.

I'll come out of the closet: I also wonder why Bash is still so important in
2014. I think it simply is unintuitive and cumbersome for all tasks that you
don't routinely do, and only a small minority of people that are using Bash
are doing it in such a fluid way that the potential benefits of the command
line can be reaped. For the rest of us it mainly feels like
[http://xkcd.com/1168/](http://xkcd.com/1168/)

In summary, I think that: 1) Bash is overripe for replacement by something
else, perhaps a more user-friendly and interactive shell that is built from
the ground up and is not a fruit salad of ad-hoc solutions created a few
decades ago. 2) Using Bash and praising its benefits is not popular only
because of its positive practical aspects. It also seems to have become
culturally ingrained in the geek community in an irrational way, like an
initiation rite that separates insiders from outsiders.

------
soyiuz
I don't like the reasoning in the author's argument at all. He is begging the
question: it is "bullshittery" because it is bullshit. By assuming the command
line is not an intellectually rich environment, he creates an intellectually
impoverished and frustrating experience for his students.

The command line is far from that. It is an incredibly standardized
environment, elegant compared with "black box", GUI-driven, cluttered,
all-in-one solutions (looking at you, Stata) for programming or doing data
science. It encourages code that "does one thing well" and that produces
input/output streams in an unencumbered, plain-text format.

These ideas may not be interesting for the development of new search
algorithms, but they are incredibly interesting for the development of a new
search engine, for example, or a new word processor (as interfaces between
human and machine). Using the command line well develops into the daily
discipline of economy, simplicity, and common sense. If the command line is
bullshit, what does he think is not?

------
msluyter
+1 for introducing me to the term "bullshittery."

My undergrad was in music and when I went for a master's in CS, I knew like,
nothing, and suffered under a lot of the command line bullshittery he refers
to. And most profs at the master's level were unwilling/unable to help with
that sort of thing. One of the most useful classes I took was a (totally non-
degree fulfilling) undergrad course in Unix system administration.

Nonetheless, I believe that mastering the command line and general Unix
environment is still rather essential. Things like basic vim usage, or
readline shortcuts
([http://www.bigsmoke.us/readline/shortcuts](http://www.bigsmoke.us/readline/shortcuts)).
I still see experienced folks cursoring around on the command line a character
at a time, which sort of makes me cringe.

It might be interesting to see some sort of catch-all "systems skills" CS
course that covers a lot of this sort of thing.

------
wffurr
Clearly a lot of HN readers have Unix command-line Stockholm syndrome.

It's a sad fact that having facility with the Unix command line enables one to
be more productive, but that's because it gives you access to a huge array of
useful tools, not due to any innate qualities of the interface.

It's a clear case of worse-is-better.

~~~
lukev
I really want to agree with you - it's easy to look at the command line and
point out how it's suboptimal.

Unfortunately, there's a dearth of real, better alternatives. There's a lot of
ideas, and a lot of toy projects, but there's nothing that's actually viable
to which we can migrate.

Except, maybe the Microsoft dev stack. But I still prefer the *nix suite.

------
fiatjaf
It is strange that nobody commenting in this thread, or in any other thread on
HN, seems to feel the way the author's students feel about the command line.

I would also expect that most people commenting on HN, whatever their degree
of knowledge, find the command line and all the things it can do - all the
tools they know how to use or know they can learn - very interesting and
valuable.

These are the characteristics of people who really love computers and, by
consequence, whatever that means, "computer science". The students referred to
by the author are not as truly interested in computers as the HN visitors are,
and they should not be at a university doing "research" with the final
objective of producing "publications".

For such a mismatch of interests and people to happen there must be a very
misleading set of incentives driving the academia today.

------
neilellis
As someone who lives, eats, and breathes the Bash shell, I can say quite
confidently that I can do most things I need using just this 'bullshittery'.
Ignorance in my book is not a virtue.

What I fail to understand is exactly what was so difficult about writing and
compiling software. This process can be as easy or as difficult as you wish.

Do they not have IDEs? Surely that would help. I use IntelliJ as an IDE and it
pretty much does everything for me, including finding dependencies.

Maybe I'm missing something, but if I can build complex Java programs without
reaching for the command line surely this can be done in a scripting language
without any complexity.

Also I would suggest looking at the dockerized development environments out
there. (e.g. [http://awvessel.github.io/](http://awvessel.github.io/))

Maybe I need to re-read this and see what I missed.

------
chrisprobert
I also enjoyed his post on the "Two Cultures of Computing":
[http://pgbovine.net/two-cultures-of-computing.htm](http://pgbovine.net/two-
cultures-of-computing.htm)

Definitely something worth thinking about as we build tools for other
developers/engineers to use.

------
zachh
This is exactly the problem we're trying to solve with
[http://bowery.io](http://bowery.io). There's so much amazing OSS that's hard
to set up, making it hard to focus on the actual development/coding/research
you want to do.

~~~
williamstein
Similar with [https://cloud.sagemath.com](https://cloud.sagemath.com), except
in the domain of computational mathematics. People like you and I deal with a
lot of the BS, so it's already set up for other people.

~~~
buzzkillr2
Using SageMath now in a class and it's tops. Thank you for your work.

------
fidotron
I think it's got to be time to revisit some of the ideas of the Smalltalk or
Lisp machines, principally that of one language all the way down from app to
lowest level.

One of the biggest pieces of brain damage the UNIXes have inflicted on the
world is the division between command environment and programming language. It
may have once served a purpose, but in modern systems we have a level of
madness that is the equivalent of expecting a literature student to master all
Indo-European languages before being able to work in French.

The growing importance of services over the network and containers gives some
cause for optimism, since if we can agree about the interfaces between
services (yeah, big if) then each container can hopefully become a steadily
simpler world unto itself.

~~~
pyre
Yes. We should definitely require everyone to use the One True Way! Because
that has always worked so well in the past!

[P.S. You also fail to explain why not just use Microsoft for the One True Way
rather than a LISP machine. Microsoft is all about making their tools work
together with other Microsoft tools so that everyone should do things the
Microsoft Way!]

------
porqupine
Despite the amount of criticism on here (and while I agree the article is a
bit over the top), I am strongly partial to this sentiment nevertheless.

I think part of the problem is that amongst coders (a much greater population
than that of people doing real research in computer science), fluency in a
number of arcane syntaxes is perceived as something 'cool'; after all, it's a
standard that's been ingrained for years and years.

But the amount of, say, command-line-tool- or library-specific syntax that
must be re-learned for a new tool (which you might use once a month, or a
library you might use for one function) is very often a giant time sink. It's
not cool. It's not intellectually interesting to memorize syntax.

------
hcinfo
Two things that I've found that help a lot with the problem the author
describes: Cloud hosting and YouTube.

We recently created some research software[1] that was somewhat kludgy to set
up, and asked a high school student to give it a try. He got stuck on some
early steps -- like working with the DVCS.

So we created a YouTube video[2] that walks through provisioning a new host on
Digital Ocean, and each step after that. The video got him "un-stuck", and as
a fallback he could always abandon what he was doing and simply replicate the
entire video step-by-step.

1\. [https://sharps-ds2.atlassian.net](https://sharps-ds2.atlassian.net)

2\. [http://youtu.be/p54CCUEhfig](http://youtu.be/p54CCUEhfig)

------
efaref
A computer science professor complaining about command-line "bullshittery" is
like an astrophysicist complaining about telescope "bullshittery". C'mon, it's
the 21st century, we should just be able to take a rocket to other planets!

------
mnarayan01
From (quickly) reading the article, it seems like "drudgery" would be a much
more apropos word choice. Though given that he doesn't appear to describe how
to do so, maybe not.

Where I agree with him is that there can be a tendency to make command line
usage into something of a fetish; computer science should not be a command
line competition. On the other hand, making effective use of command line
tools will eventually help almost anyone doing computer scienciey things
immensely.

In summary (and possibly what the article was trying to say), students
shouldn't feel embarrassed or intimidated by a complex series of command line
invocations; eventually they may know what they all mean, but they're just a
tool.

------
glathull
Personally, I am sick and tired of all this hammer bullshittery. I just want
to build houses. Now I have to hold this nail in one hand and hit it with the
heavy thing??!! WTF is this all about? What if I hit my thumb? It's bullshit
is what it is. I mean, come on. This is stone age technology. You want me to
start with a hammer and nails? I want the damn nail gun now! I don't care if I
don't know what to do with it. I'm an entitled brat.

More seriously, every real program is going to interact with a file system in
some way at some point. You should learn how to use that before you nail gun
yourself in the thumb. Unfortunately that means learning some command line
"bullshittery."

~~~
recursive
I think you missed the analogy. Hitting a nail with a hammer is directly
contributing toward the task the builder is trying to accomplish. It is
intrinsic, not incidental.

------
russelluresti
On one hand, I get where he's coming from. You'll get more people interested
in something when you give them direct access to all the cool stuff first and
none of the setup. This is how Khan Academy structured their Programming
courses. If people don't have immediate roadblocks thrown in their way,
they'll want to stick around longer.

On the other hand, there is a common mindset in computer science and
programming of "I don't understand this, it must be bullshit." This article
comes fairly close to fitting this mindset, and may actually exist squarely
within those parameters.

Ultimately, everything you want to do is going to come with prerequisites. If
you want to go out in public, you must first learn to dress yourself. If you
want to cook something, you must learn how to properly heat your oven, stove,
grill, or whatever. You can't just say "I'm making chicken - the appropriate
cookware should already be set up for me and heated to whatever is necessary
to make chicken." There are so many different configurations you can use to
create software or cook chicken that you can't just say "this is the one setup
that will take care of all your chicken-cooking needs."

There's always been a strange fear of the command-line interface. Maybe
because it looks old we think it's automatically shit. "This is the way people
did stuff in the 70s - how have we not improved on this yet?" Just because a
method has been around for a long time doesn't automatically mean it's
outdated. In fact, if a method HAS survived that long, it usually means it's
got something going for it.

Maybe, in the future, someone will come up with software that's better to use
to get you started on something new. However, the person who creates that
software is going to have to have a fundamental understanding of how those
command-line tools actually work. If you want to improve something, you need
to understand it and not just dismiss it as "bullshit". And if you're a mentor
in the CS field, maybe just complaining and expecting someone ELSE to fix the
CS field-specific issues isn't the approach you should be taking.

------
dllthomas
Porting my comment from yesterday's posting of this
([https://news.ycombinator.com/item?id=8432955](https://news.ycombinator.com/item?id=8432955)):

 _" I strive to remove incidental complexity for my students, so that they can
focus on the intrinsic complexity of their research."_

A worthy goal. I think, though, that calling out the fact that it's command
line serves to distract more than illuminate - at least the reader and
possibly the author. There are doubtless unnecessary steps between idea and
code. However, shoving those in a graphical wizard doesn't make it any better.

------
dkhenry
Posts like this are why 90% of students coming out of these universities can't
pass simple programming assessments.

------
juanre
I've often had the impression that mastery of command line tools is seen by
some as a rite of passage. Looking at many of the comments elicited by this
article certainly makes me think so.

I don't think he is trying to belittle command line tools: he clearly sees
them as a pre-requisite for his line of work that is not fulfilled by many of
his students. His use of "bullshit" may be unfortunate, but I think he's only
trying to ---correctly--- point out to his students that the tools are just
that: tools that should not be intimidating and that will help them be much
more efficient.

------
bostik
This sounds like a problem that should be easily solved with a simple
metapackage, installable either via apt-get or zypper.

Add the university repo, install metapackage and have dependency resolution
take care of the rest.
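As a sketch of what that could look like on a Debian-style system (the repo
URL and the metapackage name here are invented for illustration):

```shell
# Hypothetical one-time setup for a lab machine; names are made up.
sudo add-apt-repository 'deb http://pkgs.example.edu/ubuntu stable main'
sudo apt-get update
sudo apt-get install lab-research-env   # metapackage: pulls in git, compilers, libraries
```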

~~~
tdicola
Yep that's what I was thinking. If he's annoyed that students are wasting too
much time setting up software, they should hire some folks to build a nice
repository of packages and maintain them.

------
dmschulman
"I've spent the past decade mostly leading my own research projects. This
meant that I did the majority of the command-line bullshittery and programming
to produce the results that led to publications, especially ones where I was
the first author. In short, I've gotten very, very, very good at command-line
bullshittery."

You mean you learned a new skill and got better at it? Your example suggests
that there's only one way to improve your skills when it comes to working with
"command-line bullshittery"

------
ribs
Why doesn't he set up a config management server, like puppet or chef or
ansible, and have students get set up that way? Or provision some VMs? Or
provide some Docker containers?

I have tweeted him thusly.

~~~
mcguire
Likely because student A's research requires tools X, Y, and Z, while student
B's research requires X, L, and M.

------
super_mario
The command-line interface is also the most powerful and at times the most
usable interface (in a lot of cases you simply cannot do certain things in a
GUI, and a UI that prevents you from doing something is by definition less
usable).

Given the choice between a GUI and the command line to achieve a task, I would
always choose the command-line way. The command line takes a while to
understand and use effectively, so yes, there is an initial cost to reach a
certain proficiency, but then a new world of possibilities opens up and you
become freakishly effective and powerful in expressing yourself and telling
the computer what to do, so much so that resorting to a GUI (clicking on
pictures) becomes hopelessly frustrating most of the time.

There is a hierarchy of skills needed to be effective in the CLI:

1\. Learn to touch type at decent speed

2\. Learn an advanced text editor (really this comes down to Vim/Emacs; either
one of them will serve you for the rest of your life).

3\. Learn general purpose tools to do basic filesystem things (manipulate
files, traverse directories etc).

4\. Learn general purpose tools to transform structured text to arbitrary
"shape". Things like sed, awk are considered advanced here, but simpler things
like cut, tr, colrm, paste etc are often powerful enough and sufficient to
achieve the task.

5\. Learn to search for information. Learn the find command, learn grep
(ack/ag), learn regular expressions (amplifying the power of your sed/awk
scripts).

6\. Learn more specialized tools for specific problems

7\. Learn a scripting language of your choice and stick with it. Python, Perl
or Ruby are great choices but some are better suited for certain tasks or
areas of specialization.

Note that any UNIX distro shipped in the last 10 years has all of these tools
pre-installed. All you need is to spend some time practicing. Some things,
like touch typing and an advanced editor, may take longer because of the
muscle memory involved, but the basics of the simpler tools can be learned in
a few minutes to hours.

And of course, while doing any of this you will be interacting with your
shell, so learn its basics and some advanced features.

After all this, you are truly a master of your data. Finding, extracting,
slicing, and presenting data become trivial chores.
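As a tiny illustration of steps 4 and 5 above, here is a pipeline built only
from pre-installed tools (the sample records are made up): extract the first
colon-delimited field, then count how often each value occurs:

```shell
# Count occurrences of the first field in colon-delimited records.
printf 'alice:x:1001\nbob:x:1002\nalice:x:1003\n' \
  | cut -d: -f1 \
  | sort \
  | uniq -c \
  | sort -rn
```

Each stage does one small job; the shell's pipe glues them into something no
single GUI dialog anticipates.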

------
0xdeadbeefbabe
> It takes a tremendous amount of command-line bullshittery to install, set
> up, and configure all of this wonderful free software.

Then it isn't wonderful (or even free, in some ways). Has the author
considered what Kay might have meant when he called Linux a budget of bad
ideas? I thought academia had more freedom to reject pragmatic stuff when it
doesn't suit its needs.

The author might be able to help his students best by reviving an interest in
systems research.

------
agilebyte
The people developing command-line tools become proficient quickly by writing
and improving on existing CLI apps. If they were to spend that time writing
GUIs, we wouldn't have all these wonderful CLI apps to begin with.

If enough people are complaining about a CLI app, then maybe there is a niche
for a company to explore and write a GUI, no? Until then, the students need to
invest some of their time (rather than money) to use (usually) open-source
software.

------
mbish
This seems to me akin to saying that: because learning how to use a tool is
never research (the tool is already built) it is therefore bullshit and a
waste of time. I like the command line so I'm biased in this direction but you
need to know your tools in order to do work. What if a mechanic didn't want to
deal with all that "how-to-use-a-wrench" bullshit and just wanted to fix
things!

------
nocman
I'm a little confused about what the author is really trying to say. In this
article, it sounds to me like he is saying that all of this interaction with
the command line is arcane, stupid and just needs to go away.

But then I read the other article he wrote which he referenced in this article
("The Two Cultures of Computing" \-- [http://pgbovine.net/two-cultures-of-
computing.htm](http://pgbovine.net/two-cultures-of-computing.htm)). In that
article he discusses how there are two groups of people -- the "User" group
that just wants to do everything with a GUI interface, and the "Programmer"
group that likes the intricacies of the command line. There he discusses the
difficulty he had with explaining to students the justification for learning
the command line tools (which included the fact that they provided more
flexibility than the GUI tools alone could achieve).

He concludes the "Two Cultures" article by encouraging folks in the
"Programmer" group to bridge the gap between the two cultures:

"You need to gently introduce students to why these tools will eventually make
them more productive in the long run, even though there is a steep learning
curve at the outset"

Although I dislike the notion that the command line is "bad" or "arcane" just
because it has been around for a long time (which the "Two Cultures" article
seemed to harp on a lot), I do agree that bridging the cultures is an
excellent idea.

And then I go back to the article referenced in this post, which seems to be
saying that all this interaction with the command line is just boring, stupid,
and has no value. So again, I'm not sure what the author is ultimately trying
to say.

Personally, I appreciate both. I love a good GUI that makes common tasks fast
as lightning to rip through. But I will always prefer to _also_ have the
ability to get down to a lower level, where I can make the tools do even
greater things that the author of the GUI did not think of or did not have the
time or motivation to implement.

Also, I think it is a bad idea to discourage students ( _especially Computer
Science students_ ) from learning the nuts and bolts of how computers work. We
have enough of a shortage of people with that kind of knowledge already. IMHO
we need as many people with those kinds of skills as we can get.

~~~
danielweber
His thesis isn't "the command line is bullshit."

It's "all that bullshit you have to do (on the command line) is a waste of
time and mental energy."

------
hvs
While I can sympathize, does anyone else feel it is odd that we are talking
about a _Computer Science_ professor complaining about how basic unix tools
work? Sure, they aren't perfect, but I think it is going a bit far to say that
CS researchers shouldn't have to understand how the other 99.999% of
developers in the world use computers. This feels a bit ivory tower to me.

~~~
mcguire
Consider that a computer science professor beginning a research program with
students is roughly equivalent to a software development manager, just
recently promoted from a development role.

------
fla
Dear Professor,

Computing might help solve your problem. [1]

[1][https://www.virtualbox.org/](https://www.virtualbox.org/)

------
avz
I've always been surprised when my friends mention how hard it is to install
FOSS packages from scratch. I mean what can be more natural than

    
    
        cd directory
        ./configure
        make
        make install
    

it's almost like real life:

    
    
        cd kitchen
        ./configure microwave
        make pizza
    

Isn't it? ;-)
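To be fair, the real-life recipe usually needs a couple of extra steps. A
sketch, with the tarball name invented for illustration (`--prefix` keeps the
install under your home directory, so no root is needed):

```shell
# Typical from-source build of an autotools package (hypothetical tarball).
tar -zxvf some-package-1.0.tar.gz
cd some-package-1.0
./configure --prefix="$HOME/local"   # check dependencies, write Makefiles
make                                 # compile
make install                         # copy the results under ~/local
```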

~~~
LordIllidan
Until you're missing an ingredient and need to go all the way to the
supermarket. Multiple times.

With a binary, you're just reheating a frozen pizza.

------
vicaya
He should help his students understand how to acquire basic skills in a
field, which is a prerequisite for doing any research.

It starts with a web search these days.

There is simply no excuse when the knowledge of these basic
skills/bullshittery is openly available.

You know, teach a guy how to fish...

------
minimax
_git pipe fork pipe stdout pipe stderr apt-get revert rollback pipe pipe grep
pipe to less make install redirect rm rm ls ls -l ls tar -zxvf rm rm rm ssh mv
ssh curl wget pip_

Anyone care to speculate why he is using curl for one thing and wget for
something else?

~~~
jackmaney
Not to mention that even if one replaces "pipe" with "|", the resulting string
would return a syntax error on the shell? When the hell would you pipe the
command "git" to anything (not "git log", not "git diff", but "git")?
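Indeed, what gets piped is the _output of a subcommand_, never the bare
command name. For example (the `git log` line assumes you are inside a
repository; the `printf` line shows the same pattern with data anyone can
run):

```shell
# Pipe the output of a subcommand, e.g. count commits mentioning "fix":
#   git log --oneline | grep -c fix
# The same pattern with a universally available data source:
printf 'fix: bug\nfeat: thing\nfix: other\n' | grep -c fix
```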

------
mverwijs
I honestly don't see the problem. Don't want your students to learn the
innards of computers running Linux? Configure a Vagrant instance and have your
students vagrant up. It could have been done in the time he took to write that
blog post.
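For anyone unfamiliar, the student-side workflow is roughly this (the box
name is just an example; in practice the advisor would ship a pre-written
Vagrantfile):

```shell
# Minimal Vagrant workflow.
vagrant init ubuntu/trusty64   # or: clone a repo that already has a Vagrantfile
vagrant up                     # boot and provision the VM
vagrant ssh                    # land in a ready-made environment
```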

------
mike_ivanov
Some of his students went into debt to receive this sort of "education". How
sad.

------
jjindev
I think this is a "stages of moral development(1)" thing. In post-conventional
levels of development, all tools would be valued for their strengths. We
wouldn't denigrate a tool for belonging to a class of tools.

1 - with apologies to Kohlberg

------
mrbill
This results in people like a friend of mine, who has a Masters degree in
computer science... yet I had to walk her through swapping out the power
supply in her PC.

~~~
CocaKoala
What should having a master's degree in computer science indicate about
somebody's ability to swap out a power supply? I'm curious.

~~~
mrbill
I would think that someone who would go so far as to get a master's degree in
a topic would also be curious enough to know exactly how some of the tools
used in that field plug together and work? Am I wrong here?

~~~
CocaKoala
Maybe you're right, maybe you're wrong. But the power supply and the case are
probably the two least interesting parts of the computer to somebody with a
Master's degree in CS; it may well be the case that she knows a lot about the
cache in her CPU and the various sockets on her motherboard, and she just
doesn't care about the PSU because as long as it supplies enough power, it's
boring.

Also, let's be clear here. At heart, building your own computer is like
playing with Lego, except that each piece is very expensive and you're
terrified of breaking it. Plus there's probably a specific order you need to
follow to make sure everything fits properly, but you get no instructions
about what that order is, because it depends entirely on what pieces you
bought. Also, if you don't already know what you're doing, it's entirely
possible to buy pieces that just plain don't fit together and you won't
realize it until you're elbow deep in the case searching for a socket that
just plain doesn't exist.

As somebody who has gotten a master's degree in computer science and also
built his own computer, I definitely feel like the master's degree was far
more interesting of an endeavor and if somebody said "I'm really interested in
computers, but not in putting them together; it seems like just putting tab A
into slot A but all of the tabs and slots are poorly labeled", I'd nod my head
and say "Yeah, that's basically it. I enjoyed building my own, but it's not
for everybody".

------
mcguire
I'm going to quote the majority of an entire section of Guo's post because it
(a) demonstrates what I believe to be the core of his problem, and (b) is
incredibly offensive to me as a computing professional:

" _Here is a common productivity bottleneck faced by students working on
applied computer science research:_

" _1\. Advisor and student discuss high-level research ideas by doodling on
the whiteboard. Awesomeness ensues._

" _2\. Student leaves advisor 's office feeling pumped and knowing exactly
what they need to do to implement those ideas in code._

" _3\. Student tries to get started on programming but immediately gets stuck
since they don't know how to handle all of the command-line bullshittery
required to set up their coding environment with the proper libraries, tools,
and frameworks._

" _Many students get discouraged and turned off from research when they hit
the wall in step 3._

" _There is a huge disconnect between the elegant high-level ideas discussed
on the whiteboard (while presumably sipping cappuccinos) and the grimy,
grungy, terrible command-line bullshittery required to set up a computing
environment suitable for implementing those ideas in code. This gulf of
execution is tremendously frustrating for highly-capable and motivated
students who just didn't happen to spend 10,000 hours of their youth
wrestling with nasty command-line interfaces._"

If I were to try to write a summary of this complaint, I believe I would have
to boil it down to, "Useful work is indeed often difficult."

I'm certain the HN community is familiar with the concept that ideas are a
dime a dozen, that execution is indeed the important part. I'm also certain a
significant chunk of the readership has also spent "10,000 hours of their
youth wrestling with nasty command-line interfaces", by which I mean "has
learned the fundamentals of how computers work" and "has spent some time
understanding how to use their tools."

So along comes Philip Guo, who has, in his career, developed the ability to
convert whiteboard doodles to working prototypes (I'll withhold my rant about
academic quality code for today). Now, however, he's in the position of every
developer promoted to management: he has whiteboard doodles that he would like
his loyal minions to convert to functioning code and his minions are having
difficulties. And, like almost every other recent promotee, he blames the
tools involved. Because, after all, "this bullshit is not intellectually
interesting in any way", at least to him; he's only interested in the
whiteboard awesomeness, after all.

Programming is difficult. Guo knows this. The tools could be better. (Are
there significantly better tools out there than "command-line bullshit"? Not
that I'm aware of. Maybe the Plan 9 stuff, Acme, etc., but that comes with its
own stack of bullshit and a reduced environment to boot.) But is _actually
learning to program_ really incidental complexity? Is it really "bullshit"?

~~~
pessimizer
It isn't offensive to me in any way as a computing professional. It's exactly
what happened to me in college.

"Programming is difficult. Guo knows this. The tools could be better. (Are
there significantly better tools out there than 'command-line bullshit'? Not
that I'm aware of. Maybe the Plan 9 stuff, Acme, etc., but that comes with its
own stack of bullshit and a reduced environment to boot.) But is actually
learning to program really incidental complexity? Is it really 'bullshit'?"

Command-line bullshit has little to nothing to do with programming.

------
aeze
The article actually irritated me a little bit.

------
fiatjaf
So this guy thinks that the people who build the tools that enable people to
"be 10x or even 100x more productive" are bullshitters, while he, the almighty
professor who produces a lot of useless "publications" and gets taxpayer money
for it, is the nice and important guy in the field of "applied computer
science".

------
TeMPOraL
I feel him. I used to feel exactly the same way.

This is how you feel when you get overwhelmed by stuff you don't understand,
that is _completely irrelevant to your goal_ , but which you are forced to do
anyway.

I used to think version control was bullshit back in the days when I was first
learning to program. I wanted to make games. In particular, I looked for other
games made by hobbyists in order to download them, play a bit, and then look
at the source. It _really pissed me off_ when the only way someone shared
their work was through a CVS repository. I wanted a simple .zip download, not
some stupid weird Unixy thing (I was running Windows) that was complex,
required downloading some weird console software (ah, the PATH issues), and
was completely irrelevant to my goal of getting an up-to-date version of the
software. Oh, and they didn't store binaries in CVS, so here's another hour
wasted trying to figure out how to compile that project (fixing dependencies
took time).

When NASA released some of their software as CVS repos with no direct
download, I got so pissed I vented on a blog
([http://temporal.pr0.pl/devblog/2007/06/25/systemy-
kontroli-w...](http://temporal.pr0.pl/devblog/2007/06/25/systemy-kontroli-
wersji-gdzie-przebiega-granica/), pl_PL only).

At some point I learned SVN though, and then Mercurial and Git, but the main
thing that made me stop being annoyed by version control was switching to
Linux for primary programming work. Version control feels natural there.
GitHub for Windows application also helped.

Another, more recent example. For some weird reasons I ended up having to
learn how the old RMI in Java worked. Oh, such a pain in the ass it is. I
wasted five to ten hours making it work because of bullshit like XML
configurations, setting up servers from the (Windows again) command line with
magic invocations, weird security issues, and dropping manifests in random
places. _All I wanted was to have program A call a function in program B_.
The things I did to get there were completely irrelevant, but took 90% of the
time.

Current example - don't ask me to do anything in client-side MVC frameworks.
The whole ecosystem around Angular, Ember and Node.js _feels like_ utter
bullshit to me. Again, tons of magical invocations, hundreds of megabytes
downloaded, hundreds of thousands of files in the project directory, all that
to achieve what used to take a single _alert("Hello world!")_.

I know, it's probably useful somehow to something. But I wanted to make a
simple app, you told me to use Ember, and I wasted half a week and a DVD worth
of disk space and I still don't have that app.

So yes, I feel him. It's not that the software or command line sucks. It's
that 90% of things he has to do are completely irrelevant to his research, and
yet he can't proceed without doing them.

What can we do to solve it? _Hide the bullshit_. Hide the infrastructure. Just
running a program should never require touching version control. Hello world
in a web framework should not require setting up test frameworks, downloading
half of the Internet and selling your soul to Satan. Make basic use cases
simple. Or at least, write tutorials that explain the reasons behind magic
invocations and that are targeted to beginners, not people with 10 years of
experience in the domain.

~~~
Kalium
Bear in mind that his use case is that people want to do novel things with
their machines. You can simplify, automate, and GUI away things you can
anticipate. When someone needs to go beyond that, you can try to give them a
helpful framework, but that's it. Quite quickly they're going to be off
blazing a trail, armed only with their wits and some basic tools.

But here's the catch - when blazing a trail, you can't expect to find a nicely
paved road with regular bus service.

