
I find it sad that, in an academic setting no less, a PROFESSOR is telling his students it's BULLSHIT to know how the systems that are vital to their work actually function. He sounds like the guy who goes to Jiffy Lube because he doesn't know what an oil filter is, and thinks that it's "bullshit" that anyone does. I just want to turn the key and go! That's great, but as an academic you should also want to know what the fuck happens when you turn the key. And encouraging your students to devalue knowing how things work under the covers... I think he needs to find a new line of work.



I understand the perspective you're taking, but I politely disagree. I think you're missing the bigger point here.

Let's say someone is interested in baking a cake. There is a LOT you can learn, spanning general baking techniques, chemistry, design, art, tasting, etc. But if your immediate response is "we need flour, so go plant some wheat and wait a few months," they would likely lose interest.

Teaching people to plant and harvest wheat is awesome, but for most people it probably shouldn't be the first thing you're met with when you are trying to learn how to bake a cake.


If you're studying "Baking Science", which covers everything from the beginning to the end, starting with "Let's grind some flour" is a good idea. People studying "Baking Science" need to understand the whole process, rather than believing everything starts and ends with pre-packaged recipes and machines that do everything for them.

That's what's going on here. A "Baking Science" curriculum that didn't impart people with a knowledge of where flour comes from and how it's made would be a joke.


There may be space in the whole curriculum for that, but it's probably not essential here and now. The whole pull of Bret Victor's presentations is that they show us what it would look like to program if our tools were as modern as Word.

The baking-metaphor problem is that you have students who are supposed to come in and investigate how the arrangements of toppings on a pizza affect both nebulous qualities (like deliciousness and heterogeneity) and rigorously measurable ones (like moisture and elasticity) of the pizza crust. However, when they come to the kitchen, most professors put them in a totally new room which contains millstones and grain and milk ready to be turned into fresh mozzarella, with nothing labeled. There are reasons for this -- real pizza aficionados make very different choices about how they want to compose their sauces, which cheeses they want on the finished result, and even what leavening agent causes the dough to rise, so the framework for pizza-baking is as general-purpose as possible. But those reasons make things difficult for the newcomer.

The professor is just saying, "when we start, I walk everyone through the process of finding the flours over here, the additives over there, and using the bread-machine to mix them and knead them. I then show them where to find the canned sauces and the pre-grated cheeses, so that they can start with minimal knowledge baking up some pizzas for science. Our concerns are very high-level and I want them to be fussing with baking times and topping arrangements, but so many of my students seem to be stuck on trying to turn milk into mozzarella."


The problem he overlooks is that we do not want our tools to be "as modern as Word". We want modern tools, but not modern as Word defines it.

There are excellent solutions for his problem. Distributing a pre-configured VM is a good one. Instead, he wants students to have the experience of bootstrapping, but he also wants it to be painless and magical.

But instead of looking at his actual problem, he's wound up railing against all the critical freedom that makes the field something other than a glorified exercise in painting by numbers.

EDIT: For the record, turning milk into mozzarella is actually really easy and quite suitable for a novice. I've done it. Takes about an hour, end to end.


My understanding is that this is not so much for a single course (where a VM would be a good solution) as for general student research. Prepackaged environments work less well there (though they may still be made to work).


When you set out to blaze a new trail, you should not expect to find a nice paved road with bus service.

If he decides to ignore all the tools (puppet, chef, scripts, etc.) designed to make all of this easier, that's his fault.


> If he decides to ignore all the tools (puppet, chef, scripts, etc.) designed to make all of this easier, that's his fault.

You're kidding here, right?

I find Puppet and Chef super confusing and not worth the effort to learn at my job right now, and I'm a fucking programmer by hobby and profession. This is exactly the kind of bullshit people doing science should never have to deal with.


Puppet, at least, is pretty straightforward. You are describing what you want your system to look like. Puppet takes that description and makes your system look like that.
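
For the skeptics, a minimal sketch of what that looks like (it assumes Puppet is installed, and the package name is just an example):

  # a declarative one-liner: state what must be true, let Puppet make it so
  puppet apply -e "package { 'python3': ensure => installed }"
  # the same kind of description scales up when kept in a manifest file:
  puppet apply site.pp

You never tell it how to install anything; you only describe what the finished system should look like.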

People doing science who want to use computers should expect to have to learn a thing or two about using them. As in more than using Word if they want to do complex, custom, not-done-before tasks. Much like people who want to do novel things in chemistry should expect to learn more than how to make black powder.

This guy is upset that novel things haven't already been thought of and planned for by the people who make shiny GUIs. This is a farcical position. If it's really that novel, of course nobody's written a GUI for it.

Point is, tools to address his problem already exist. He dismisses them, because they don't do it in an arbitrarily flexible and powerful way while still being infinitely iTunes-y.


> People doing science who want to use computers should expect to have to learn a thing or two about using them.

A thing. Or two. Not half a year's worth of full-stack dev education.

He is not complaining about having to learn things. He's complaining about having to learn irrelevant things. Infrastructure. He wants to make a soup, and he's being asked to run his own plumbing to get water, and to drill his own gas well for heating. And people here are saying he should stop complaining, because nobody is making him build his own drill - it's already provided via a Puppet script in a Git repository.

He doesn't dismiss tools because they ain't iTunes-y. He dismisses them because to use those tools he has to learn more tools, for which he has to learn even more tools, and all that effort is throwaway, because the next time he will need to learn different toolchains (or should I say, tooltrees with stupidly high branching factors).

> Puppet, at least, is pretty straightforward. You are describing what you want your system to look like. Puppet takes that description and makes your system look like that.

It makes sense for a team of web developers doing high-scalability applications. It is bullshit for a researcher who just wants to crunch some numbers with a bit of Python code.


There's the problem. He doesn't understand what the proper bounds of relevance are. He can't see how a given task is relevant, so it's bullshit. That's more a comment on the limits of his thought processes than anything else.

He wants to do novel things. This means going places where not everything is preconfigured for his pleasure. It also means he needs to know how to use his tools, because when he runs off the edge of what point-and-drool does for him he will need them.

He asks for a world where point-and-drool covers everything. All I can say is that what he asks is impossible for what he wants.


The analogue in CS would be to give the students a pre-configured environment if you want to deep dive into a specific topic before teaching how to set things up. If the students are using a known OS, all you need to distribute is a shell script. Otherwise give 'em a Vagrantfile or something.
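
For example, something like this (a sketch only -- it assumes a Debian-ish OS, and the package list is whatever the course actually needs):

  #!/bin/sh
  # setup.sh -- one-shot environment setup for students
  set -e                                          # stop at the first failure
  sudo apt-get update
  sudo apt-get install -y python3 python3-pip git
  pip3 install --user numpy pandas matplotlib

Hand that out on day one and nobody burns a week fighting their machine.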

That being said, you'll want your students to understand their tools sooner rather than later.


To be fair, as of 2014 there are on the order of 10^2 main disciplines, 10^4 main research areas, 10^6 different things to learn, and 10^7 researchers building stuff to teach; expecting a GUI button for that myriad of functions, in all their diverse ways, would be naive. Maybe primary-school education tools come close to the simplicity of use you/he describe.


Speaking as someone who has changed his own oil more than once: if I needed to know how to do that in order to operate my car, that would indeed be bullshittery. I benefited in no way whatsoever from changing my own oil. It was, as it turned out, simply a very-low-ROI recreational activity.

The point here isn't that the command line is bullshit. Knowing how an engine lubrication system works isn't bullshit... for a mechanic. But it is a waste of time for a cab driver.


So considering the OP is an "Assistant Professor of Computer Science, University of Rochester", please refine your analogy: does his target audience fall closer to the cab driver or the mechanic?

Edit: quoting the OP's use cases below:

>> "Write a piece of prototype computer software that demonstrates the feasibility of a novel idea."

>> "Write a piece of prototype computer software that collects, processes, and analyzes data to produce novel insights about some topic."


Facility with the Unix command line is not in fact all that valuable for fundamental CS research.


The "killer app" for most CS researchers is indeed Microsoft Word.


Nonsense. The standard writing tool in mathematics and computer science is LaTeX, which has plenty of bullshittery of its own. Most researchers will generally be suspicious of a Word-produced paper, since such papers usually come from crackpots.


I was actually quoting Philip Greenspun there[1].

[1] http://books.google.com/books?id=Gb6vumaGwJAC&pg=PA47&lpg=PA...


The standard writing tool in mathematics and computer science is the pencil. Or pen, if you are feeling cocky.

http://www.cs.utexas.edu/users/EWD/


My thoughts exactly.

"But what about the magic of version control, GitHub, pull requests, forking clones, cloning forks, diffing forks, forking diffs, etc., etc., etc.? None of that matters for someone who works with binary file formats (i.e., user culture) rather than plain-text formats (i.e., programmer culture). There was such a disconnect between the two cultures that it was hard for me to come up with responses to these sorts of “why use X rather than Y” questions without sounding either incomprehensible or patronizing."

1. Is he seriously implying that version control is just some arcane thing that Unix programmers do, and that it's bullshit that students have to learn it?

2. If you can't answer students' questions about why these CLI tools are needed, that does not mean there are no good reasons--it means you don't know, because you don't understand them yourself.
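
On the plain-text vs. binary point in that quote, the "why use X rather than Y" answer is easier shown than told (file names are hypothetical, and the exact wording of the output varies by git version):

  git diff paper.tex     # line-by-line changes a human can review
  git diff figure.docx   # prints something like "Binary files ... differ"

Most of version control's "magic" falls out of keeping your work in diffable text.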

"It's comically absurd that the most sophisticated computer programmers nowadays are still using an interface that's remained unchanged since before humans landed on the moon. What the duck?!?"

"Students are starting to grow suspicious: These instructors are supposed to be expert programmers, but their tools look pretty primitive to me."

This is a problem of perception, both on the students' part and on the professor's, not a problem of reality. Do you think there's a reason that programmers still use these interfaces decades later? I'll give you a hint: it's not because they don't like change.
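
To spell the hint out: the interface survives because small tools compose into one-off solutions nobody pre-built a GUI for. A throwaway sketch (the file name is made up):

  # the ten most frequent words in a draft, in one line of plumbing
  tr -cs '[:alpha:]' '\n' < draft.txt | tr '[:upper:]' '[:lower:]' \
    | sort | uniq -c | sort -rn | head -10

That pipeline predates most of today's students, and it still beats writing a bespoke app for the job.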


1. He is implying the students believe it. From their point of view, the CLI does look arcane, primitive, and useless.

2. It's more complicated than that. Yes, if you understand something, you can explain it. No, you can't automatically explain it in 5 minutes. Sometimes the knowledge gap is so great that you need months to explain it all. In 5 minutes, you can only make claims your students will feel entitled not to believe if they feel like it. If your claims don't match their experience, and you have not demonstrated trustworthiness, they simply won't believe you and will tune out. http://lesswrong.com/lw/kg/expecting_short_inferential_dista...


1. No, I think he's implying that the user interface his students are required to use is bullshit.


You need to say hello to the canonical example: Dijkstra, after whom arrogance in computer science is measured in nano-Dijkstras, to steal a line from Alan Kay.

Dijkstra famously observed that computer science is no more the study of computers than astronomy is the study of telescopes.


That explains why NASA (and other agencies) always seem to have so many issues with unit conversions. It's all just Measurement Bullshittery that they shouldn't have to deal with! /s


If only that were a good comparison. Measurement is so much better than, say, autoconf.


His point was that it is a bullshit tax that you must pay.

The Jiffy Lube analogy is wrong. It is more like buying car insurance, registration, and inspection, and filling the car with gas. Those things are "bullshit" that have nothing to do with what I am trying to accomplish with a car, which is going places faster than walking and carrying more stuff. I must do them, but they have little to do with what I want to accomplish.


While the original article didn't particularly garner my sympathy, I get concerned when people start complaining that others don't know how systems "actually function".

Imagine that, the next time you sat down to write some code, you were asked for the value of Planck's constant and to solve a couple of Feynman diagrams for holes in a semiconductor before you could write a single line. After all, a programmer can't just live with the high-level abstractions of machine code and Kirchhoff's rules - she should know how the systems she works on actually function!

Everyone uses some abstraction when they think about writing code. High-quality abstractions (e.g. machine code) should be encouraged, and things which violate those abstractions (e.g. bombarding memory with radiation) should be avoided. Low-quality abstractions (e.g. monads are burritos) should be avoided too. Being charitable, the professor seems concerned that poor-quality abstractions are dominating and students are being forced to drop to abstraction levels that aren't relevant to their studies.


Be fair. Not every class is a "How to configure a VM and toolchain" class.


The very first class in the CS curriculum I took was "Intro to UNIX". Every other class in the curriculum assumed that as background knowledge. It's completely reasonable to expect a configured environment.


Shouldn't computer skills courses be prerequisites for programming courses? He's right that setting up the needed software shouldn't be an impediment to completing work in a programming course, but that's because the university should be preparing the students with those skills before they get to his course.

If you're going to need to configure a VM and toolchain for many of your subsequent courses, it makes sense to start off with a "configure a VM and toolchain" course.


Y'know, even at MIT 20 years ago, there wasn't an assumption that people knew how to use computers, even for the computer science classes. Each class had an evening lecture on getting through all the "bullshit" that you needed to get started and actually learn the interesting things, instead of the peculiarities of the latest flash-in-the-pan toolchain that didn't exist 5 years ago.


Difference between Computer Science class and Computer Programming class?


As a professional programmer, I think that prof offers excellent advice. There's a difference between programming-in-the-abstract and the mis-evolved "ls -l ls tar -zxvf rm rm rm ssh mv" written by people in real-world dysfunctional settings.

There are alternatives to bash command lines for interactively exploring/navigating running systems. Take REPLs, SQL prompts, etc.

Profs don't do their students any favors if they let poor design pass without comment. Unix has a history, which you can pick up from early papers by those involved, and users can critique it the way they critique whatever website they use nowadays.


> Take REPLs, SQL prompts, etc.

That's still "command-line bullshittery" according to him, though. Because it's an 'arcane interface' that "hasn't changed since the 60's". When his student wonders why he is using a text-based interface, he doesn't have an answer. Since he doesn't know, obviously there is no good reason, so It's All Bullshit(tm) is the answer.


There is indeed a difference between programming-in-the-abstract and Guo's "bullshit"; it is exactly the difference between whiteboard doodles and running code: programming.



