"When you don’t create things, you become defined by your tastes rather than ability. Your tastes only narrow and exclude people. so create."
As for the programming-is-a-science-or-art debate: it is, of course, wrong to make such a dichotomy. People who draw a thick boundary between these two, I think, appreciate neither art nor science. Programming is a science, it's true, but it is also a thing of beauty, much like a game of Go or a beautiful mathematical theorem. Any scientific endeavour you pursue passionately is worthy of being called an art, I think (and vice versa). Let us not make the mistake that John Keating makes when he says:
"We don't read and write poetry because it's cute. We read and write poetry because we are members of the human race. And the human race is filled with passion. And medicine, law, business, engineering, these are noble pursuits and necessary to sustain life. But poetry, beauty, romance, love, these are what we stay alive for. To quote from Whitman, "O me! O life!... of the questions of these recurring; of the endless trains of the faithless... of cities filled with the foolish; what good amid these, O me, O life?" Answer. That you are here - that life exists, and identity; that the powerful play goes on and you may contribute a verse. That the powerful play goes on and you may contribute a verse. What will your verse be?"
How so? How do "you become defined by your tastes rather than ability"? And why do "[y]our tastes only narrow and exclude people"?
This sounds like one of those pop-psych quotes that have little intrinsic content beyond ill-defined assertions into which people read what they want to believe.
I know many people who are not "creators" by most definitions, yet I don't see them as defined by their tastes, but by their everyday words and actions. And when they do express their tastes I don't see that as narrow or excluding people. Their tastes often allow me to see associations among things, and value in places I hadn't expected.
If you want to then turn it around and claim, well, these people, through their words, actions, and expressions of taste are therefore in some way creating something, then I'm lost as to what "create" is supposed to mean, other than perhaps something too amazingly broad to bother with.
(I've also known creative people who are also narrow and exclusive and petty, so these qualities seem to be pretty orthogonal.)
Let me paraphrase what the quote says to me: there are three types of people: the source (the creator), the sink (the end consumer), and the filter (the critic). Now, when I say that, I mean people for whom these characteristics are dominant; of course, we all have a mixture of all three. Also, we may assume different roles in different fields. And some gifted individuals can play more than one role at once; John Updike comes to mind, as a top-notch writer and both a literary and an art critic.
The non-creator types (the end consumer and the critic) are defined by their tastes, expressed in everything from voting with their wallets to the upvote buttons on HN and the links in their blogs. We are the link creators, the judges, the filters of content. But where does all this content, code, the good stuff come from; who makes it? The makers are the people who take the quantum jump to creation. Just as there is a huge difference between a product that is free and one that is priced (even at 10 cents), there is a gap between the consumer and the creator.
Oops, this has already become a long comment, so I don't have a chance to touch on what the "excluding people" part says to me. Let me finish by pointing out a wonderful book called The Mirror and the Lamp, whose title refers to the roles of the critic and the creator. I cannot think of a better analogy. Currently I'm all mirror, and I want to change that! People like _why are a huge inspiration.
It's like a mantra, and for me it alternates between being inspiring and depressing.
Perhaps I disagree with you. I think programming is a craft, but that's different from art. In any case, though every time we write some code we're creating something, I don't consider the code I write for my day job to be "meaningful" creation (if that were the case, every act of labour would be creation and the quote would be meaningless). That's why the quote is depressing sometimes.
And the Renaissance artists had to make a living somehow; much of the art of theirs that we know today was made to order for Church, State, or rich merchants.
We may consider "pure art" to be "form without function" and "pure utility" to be "function without form". But fields like industrial design, architecture, and engineering (e.g., bridges) have room for both: aesthetically pleasing form that merges well with function.
If programming is art, it's the form-plus-function sort of art. And just as with bridge building, the form part is optional: if the program works and the bridge does not collapse, they are doing their job.
I don't consider the code for my day job to be art either. But I still think art can arise under such conditions, generally because someone with the power to do so goes beyond the call of duty and decides to make their bridge or code not only functional but beautiful as well.
There are plenty of constraints that can stop considerations of form dead in their tracks; like whether the client is willing to pay extra or wait longer for a more aesthetically pleasing result, or how much control you have over the final result - there are limits to how creative you can be if you are a subcontractor for a bridge pillar, or if you are designing an ad that needs to fit within the company's design guidelines and an existing ad series.
Still, I think art can arise in the most unexpected places, it's mostly a matter of spotting the opportunities and deciding that it is worth the bother.
Sometimes function is optional as well!
Like you say, art vs. science is a false dichotomy. Programming seems like both because it is about design -- the intentional and intense, the passionate blending of art and science.
I'm prepared to agree to disagree on this, but I'm not prepared to agree that Wikipedia has the last word on the subject. Sorry.
I could just as well be melodramatic like that kid in American Beauty and go on about how a plastic bag being blown about by the wind is the most beautiful thing ever. I could suggest the same about the way some skillful janitor moves his mop about the floor. Just because I get emotional about it doesn't change the definition of art.
Traditionally, the term art was used to refer to any skill or mastery. This conception changed during the Romantic period, when art came to be seen as "a special faculty of the human mind to be classified with religion and science".
You evidently subscribe to this Romantic definition of art, but claim that engineering is neither art nor science. So that makes it... religion?
Evolutionary psychology 101.
Fundamentally I think my objection is to elevating a slightly odd-ball programmer to messianic status based on a mildly creepy fandom that manifests primarily as people vastly over-stating the contributions of this one guy in HN comments.
The problem is that most of our industry is not ready to accept this news. There has never been an economic process as important as IT in which creativity and art have had the impact they are having in programming. Check the software and the sites you enjoy every day: are they made by great scientists? The big majority are not.
But at the same time, the reverse is true: if you have good scientists, you can create any complex program or system without problems. Look at all the PhDs at Google; the result is a wonderful piece of technology, from the search engine itself to the cluster and the other infrastructure they run. But that is indeed an engineering problem, so it is a perfect fit for them.
But not everything in our field is like that. This is why, for instance, a big company like Google is not having a big impact on programming itself. If you look at reality, most of their products for the masses are not working well, Ruby on Rails was not produced there, and so forth. It's not that I have something against Google, but it is the perfect example of a "company of scientists".
I think PG said in one of his essays that CS is confusing because it's got a lot of people under its umbrella: some do mostly math, some are closer to artists, and some are somewhere else (I can't remember his exact distinctions). So the field has a lot of people saying "this is what it means," all of whom are at least somewhat correct.
What is CS or programming or whatever you want to call this thing that involves making computers do stuff? On some level, the answer drifts towards "whatever you want it to be." But the flame wars are less satisfying that way.
Accordingly, programming is not _currently_ a science. However, I think that is a much better position to aspire to than self-aggrandizement and glorifying a skill that is really no different from any other skill that any suitably motivated 12-year-old can acquire with a couple of library books and an internet connection.
With a science of programming, we can aspire to a position where programmers can at least claim to be engineers, and at least have good odds of most programming projects resulting in software of reasonable quality. Face it, there's good art and bad art, and it's subjective which is which. Programs have objective measures we can apply, depending on what the code is used for - developer time, performance, correctness... and a million others. We can argue about which measures are the right ones to use, but at least these measures exist. We can (with current technology) empirically evaluate a piece of code in many different ways. Actually starting to apply this kind of empirical evaluation is the first step towards programming being a science.
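A minimal sketch (in Python, purely illustrative and not from the original comment) of the kind of objective, empirical evaluation described above: timing two equivalent implementations of the same task, and checking that both are correct.

```python
import timeit

# Two equivalent ways to sum the first 1000 squares. "Developer time,
# performance, correctness" are all measurable; here we measure one of
# them (performance) empirically.
listcomp = "sum([i * i for i in range(1000)])"
genexpr = "sum(i * i for i in range(1000))"

t_list = timeit.timeit(listcomp, number=2000)
t_gen = timeit.timeit(genexpr, number=2000)

# Correctness is an objective measure too: both must agree.
assert eval(listcomp) == eval(genexpr)

print(f"list comprehension: {t_list:.3f}s, generator: {t_gen:.3f}s")
```

Which measure matters (speed, memory, readability) is still up for debate, but each individual measurement is reproducible rather than a matter of taste.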
[Edit: OK, so the parent to this was heavily edited after I posted this, but I can't be bothered rewriting this accordingly, so please excuse any incongruence between this reply and the post it is replying to]
The vision you seem to be espousing is that someone tells the programmer exactly what to do, and the programmer does it exactly to that specification. Wiring a house and balancing a budget are solved problems. So you seem to be saying that programming is also a solved problem.
Which is so out of touch with reality I feel that, surely, I must be misunderstanding you. A computer program can do pretty much anything. Depending how you feel about the equivalence of Turing machines and human brains, you can argue that a computer program can potentially do anything a person can do.
With the geometric increase of computing capacity available to us, how is it even conceivable that "programming" is equivalent to balancing a budget or wiring a house? You seem to suffer from an incredible lack of imagination.
"However, I think that is a much better position to aspire to than self-aggrandizement and glorifying a skill that is really no different from any other skill that any suitably motivated 12-year-old can acquire with a couple of library books and an internet connection."
That is the same as saying a 12 year old can write or play a musical instrument. Sure, they can spell words and make sentences, or know the correct fingering for certain notes. But the potential for improvement in those endeavors is also unbounded. The same is true of programming, in my opinion.
This is by far the most common, most frequently followed process for developing software. There is more "wiggle room" at each step, but it's the same exact process, whether it's one guy doing it and dreaming up the requirements or a team of thousands of developers.
For instance, the people who write papers on new data structures, computer vision, and other state-of-the-art developments: they're not working with 'requirements' other than 'it would be nice if you found a way to solve X; come back in three years' (well, not quite like that, but close).
That's basic research and a definite amount of artistry is involved at that level.
But after the paper is written up, it has solidified and it becomes applied science, and for the person calling the API that embodies the concept it has become 'mere' bricklaying.
I completely disagree.
The art for the person calling the API is often found in "building something people want." Think of the people who frequent this website. Many of them are writing lines of code to make something people want. Surely there is some art to that. There needs to be an idea, building something to show customers, and iteration on that idea taking customer response into account. The customers have input, but the programmer decides what to write.
Just as wealthy people once commissioned portraits and told the artist what to draw, programmers can be employed someplace where the question of what to program is decided by someone else. But there is nothing inherent in art prohibiting painters from deciding what to paint, or in programming prohibiting programmers deciding what to program.
Putting together a large piece of software, or optimizing a particular path in a large codebase, or putting together a creative solution to a class of problems, requires a much higher level of thinking. You need to bring together far more concepts at different levels of the abstraction stack, and have enough of them in your mind to arrange them without them interacting poorly at one level when you're coordinating them at a different level.
Programming is more like building a whole new kind of house, without a plan, or altering an existing house, without it all falling down around your ears.
As to subjectivity: you're wrong that software quality can be measured objectively, because of two simple facts: software isn't static (it must be modified over time) and humans are limited in their capacity for complexity. The abstractions which yield most readily to the changes that matter, such that they mesh well with the people doing the modifications, are subjectively good: their worth depends on the observer.
Sure, putting together a large piece of software is hard, but then again, putting together the electrical system for a nuclear plant or something on that order of complexity is just as hard, if not harder (no 'undo' switch on that one).
There has to be a way to see that programming is really not that hard 'in principle'. It is hard in practice, and this goes for a large number of other activities as well.
To argue on a programming forum that programming in principle is a simple thing is probably an unpopular point of view, but really, we're not half-gods or somehow special, we're just bricklayers and watchmakers. And if you've never built a brick wall I challenge you to go and do it and afterwards we'll talk about how easy it was 'in principle'.
The problem comes from the complex interactions between those elements. You can learn how the body works in high school biology, but that doesn't prepare you to operate on it. Similarly, you can learn how software is constructed ("make sure the right lines of code are used, follow each other, and are debugged properly") from Hacker News, but that doesn't prepare you to actually put those lines of code together.
One of the unfortunate effects of Internet forums is that it seems to have shifted attention from the nitty-gritty details of how to do something to the high-level overview of how to do something. So we have a bunch of programming forums where we talk about programming, but never actually program anything. It's not just programming (just look at reddit.com/r/economics), but programming seems to have been one of the most affected disciplines. But I'm afraid that this is giving a skewed perception of the field to newcomers, such that they look at the code they see on blogs or Hacker News and think that's all there is to programming. The #1 reason people fail Google interviews is because they don't have enough depth: they can recite code snippets they found on the web and maybe even put them together, but they can't analyze the performance of their programs or suggest how they might adapt them if requirements change.
Someone able to produce only by tying together endless snippets of googled code is the professional equivalent of the script kiddie.
But for the most part I think that a very large amount of it is very comparable to bricklaying. Especially in the part where most people in the IT world make their money, such as software for the financial and the business world.
Rarely does it elevate itself to the level of watchmaking and even rarer do you find art. It does happen though, but not often.
I'm sure that programming is conceptually different from bricklaying in the sense that we are, sometimes, solving puzzles. But everybody who got into it for that reason knows that the only time you really need that skill is when you've messed up and are facing a very tough bug.
Once you get better at programming it is more like knitting or weaving than puzzling.
Of course there are different styles of programming, and there are only so many kinds of brick. But programming for the most part is hard because we have chosen to make it hard. If bricklayers had to create each and every brick from clay and bits of stone then you'd be much closer to what programmers do for a living than what bricklayers do.
Bricklaying has been 'industrialized', and one day programming will be too. It will probably still be marginally harder than bricklaying, even after that has happened. But if you can explain what you want done and in which order then you are essentially programming.
People often say they would like to learn how to program but they are not smart enough for it. And plenty of programmers just love that because it makes them appear special.
But I think anybody can program, the difference is not some binary switch in your head 'programmer/non-programmer'. The difference is in how complex an arrangement you can create, and that's a continuum.
Some programmers are better at this than others and can solve more complex problems.
This prediction has been made for decades now. I suppose if people keep predicting this indefinitely, someday it will become true.
In the meantime, new programming tasks are invented faster than the old tasks are industrialized.
"Once you get better at programming it is more like knitting or weaving than puzzling."
This is up to you. Was building Google, Amazon, Linux, Firefox, iPhone OS, etc. etc. etc. more like knitting or weaving than puzzling? Maybe what you are saying is that most programmers aren't good enough to build those kinds of products, or would just rather stick to their knitting and weaving. That I would agree with.
"Some programmers are better at this than others and can solve more complex problems."
This is trivially true of any cognitive task.
'One day' is on purpose very vague; I haven't a clue what timescale we're talking about. But I'm fairly sure of what the sign will be that we've reached it: it's when we remove the term 'programming' from the language and use 'teaching' instead.
> This is up to you. Was building Google, Amazon, Linux, Firefox, iPhone OS, etc. etc. etc. more like knitting or weaving than puzzling? Maybe what you are saying is that most programmers aren't good enough to build those kinds of products, or would just rather stick to their knitting and weaving. That I would agree with.
Most of those were huge projects, but if you break them up into little pieces, some of those pieces were artwork and most of them were bricks. The architecture of each of those was likely artwork.
> This is trivially true of any cognitive task.
Yes, but many programmers seem to think there is something 'magical' about the ability to program, as if there is some kind of essential element that differentiates those that can program from those that can not.
And I think that is not the case.
You should read this, and maybe the linked academic paper, too: http://www.codinghorror.com/blog/2006/07/separating-programm...
Maybe they won't be the next Donald Knuth, but they'll be able to hold down a job and make a living. I guarantee it.
You ship 'em, I'll teach 'em; no matter how much time it takes, I'll do it. The only prerequisites are a willingness to learn and a willingness to spend the time, as well as an IQ of 80 or more.
Anybody can do it, not equally fast, not equally good, some may hate it and some may love it.
That test is so biased it's not even funny (I've linked the script below). It assumes that people will clue in to the different meaning of the '=' sign in programming (assignment) as opposed to what it means in the rest of math (equality). I note that some very well seasoned programmers have been caught messing up '==' and '='.
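A tiny illustration (Python, my own example rather than anything from the test script) of the semantic gap being described: in programming, '=' rebinds a name rather than asserting equality, which is exactly the kind of question such aptitude tests pose in many variations.

```python
# In math, "a = 10 and then a = 20" is a contradiction; in most
# programming languages it is simply two assignments executed in order.
a = 10
b = a   # b receives a's current value (10)
a = 20  # rebinding a afterwards does not change b

print(a, b)    # 20 10
print(a == b)  # False: '==' tests equality, '=' assigns
```

Someone reading '=' as mathematical equality will answer "b is 20", which the test scores as a lack of aptitude rather than a perfectly reasonable prior.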
I always thought that the '=' sign for assignment was a huge mistake and that '<=' would have been far better.
But it's a little late for that now. What galls me even more about that test is that a warm-up with a bunch of assignment samples followed by 'print' statements would have given the really interested parties a shot at figuring out how it works and a chance to test their assumptions.
The problem with programming is an educational one, not some kind of genetic pre-disposition to being able to program.
Just as there are people that are better and not-so-good at arithmetic there are people that are better and not-so-good at programming.
It's a skill thing, and if you work at it hard enough you can get better. The ten-thousand-hour rule definitely applies. If that's not the case, then how come there are a thousand and one books that will teach you how to become a (better) programmer? It is teachable, it is a skill, there is no magic component, in spite of what this so-called study says.
Here is the test script:
Have a read and see what you think; try to pretend that you have no programming knowledge and that all you know is arithmetic. It basically tests a single question in different forms over and over again.
In my opinion, a much better way to grade people to find out who will be the 'good' programmers is to find who is good at solving puzzles.
There are many people for whom this is not true when it comes to programming, and this is why they will never become programmers (good or otherwise).
Let's face it, with programming, we're inventing entirely new domains with their own special rules that we create. It can be an artistic process or a scientific process. As an artistic process, it's very hard to judge the outcome from the start. As a scientific process, the result is much easier to judge prior to completion. Until the processes and methods are understood to a sufficient level, programming will be very much like an art. Once they're understood, it's a science: apply the rules and out pops the software.
I have several relatives that are plumbers, electricians, carpenters and architects. Once there's a common set of tools, language (as in, for communication between people) and understanding, programming is no different than any other craft.
The 'art' part in programming is exactly that part where it is not yet a science.
This is not a good aspiration, I think. Engineers can be replaced without huge effort, while you can't easily replace artists.
Also, this vision will lead to more bad programmers coming out of universities, as the process of learning programming is totally different depending on whether you think it's a science or an art. If it's a science, run CS courses. If it's an art, put students into craftsmen's shops to learn how to code.
Yes it is, it is just that in information processing, the problem domains are often ill-specified and ill-behaved. The system analyst's job is to pull the pin on a wish list grenade, fling it at the engineers, and run away before they realize what has been done to them. The engineers are then forced to pile crap together, reverse-engineer a problem definition, and see if anybody wants to pay for solving it.
Where is the magical divide between "art" and "science" in the field? Do the adherents of either side even know what they're defending?
If I know what a finite state machine is and how to apply it, does that make me "scientific"?
If I hack together a web application in Scheme, instead of some Mathemagical, Dijkstra-approved language, does that make me an "artist"?
Why can't people just accept that you need some of both to be a good programmer?
As soon as something is idiomatic I think it ceases to be art, but that does not diminish the artistry of the person or people that first came up with that.
Dijkstra would not have liked this.
Was it Dijkstra that advocated writing everything in assembly, or am I thinking of someone else? I know he was big on formally verifying the correctness of programs, but I thought I remember him saying that for implementation purposes, he advocated assembly. Or was that more of a "if you can't write it in assembly, you don't understand all the parts" deal? It seems he hated FORTRAN, APL, BASIC, and PL/1, and only later did he like LISP.
I wonder what he would have thought of Haskell. Standardization started in the 80's, so he would have been alive when it was being ironed out. A search in the EWD archives didn't reveal anything though.
It appears that he liked functional languages...
>The clarity and economy of expression that the language of functional programming permits is often very impressive, and, but for human inertia, functional programming can be expected to have a brilliant future, the more so because today's computers admit quite efficient implementations of functional programming languages.
>In particular, evaluating a functional program with pen and paper is a pain in the neck!
So he spends the rest of the paper talking about the predicate calculus and relation calculus.
We need to revive the word "artisan" which has connotations both of technical ability and artistic sensibility. Look at it like that, and programmers are no different from thousands of generations of craftsmen from here to the pyramids. It's just that the tools of our trade are usually far more complex than theirs. (Although I still have no idea how to build a pyramid.)
I think this is why some people like to compare programming to painting or writing or whatever their other life's endeavour is. I don't think there is a direct correlation between programming and those other things, but you start using the same parts of the brain for parts of both activities, so it feels as though it's roughly the same.
So here's my comparison between programming and writing: every word must be relevant, there is no redundancy, no unnecessary repetition. Good code and good writing must be tight, clean and comprehensible. I mentioned this to a co-worker once and he said "I'd never thought about it like that. I think it's more to do with mathematics." Which it also is.
Preferably a book without any surprising twists. Or homicides. But extra chunky bacon.
Wikipedia defines art as "the product of deliberately arranging elements in a way to affect the senses or emotions", and science as "the systematic enterprise of gathering knowledge about the world and organizing and condensing that knowledge into testable laws and theories." Why do these two things have to be mutually exclusive? Through creation and exploring, such as in _why's projects, we are further "gathering knowledge" about the limits computers can be used. And at the same time, we are creating art as these programs have had a clear impact on at least the Ruby community.
And, bringing up a more "sciencey" example, if the fact that my handheld calculator can solve complicated algebra and calculus equations in less than a second doesn't "affect [your] senses or emotions", then you need a reality check on just how far technology has come in such a short time.
I personally believe that programming is one of the rawest forms of creation imaginable, and therefore must be of some artistic worth. Simply calling it science and moving on does a disservice to those who slaved for so many hours working on their piece of computer science history.
I've never understood _why's fan base; he seems to be no more than somebody who popped up, wrote a few low-quality libraries for an obscure language, and then vanished.
An HTML parser requires fast and efficient code to successfully and quickly read the document. Furthermore, the programmer needs to take into account common variations in HTML code as well as handling nested tags and attributes. And taking it a step further, the HTML parser is used to output the trillions of websites that are currently out there in a visible form for a majority of the developed world. In this way, the designers have added something great and unique to the world which many people enjoy and use every day. This is art.
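A minimal sketch of the nesting problem mentioned above, using Python's standard `html.parser` (my choice of library for illustration, not something the comment specifies): tracking depth as tags open and close to recover the document's structure.

```python
from html.parser import HTMLParser

# A toy parser that turns nested tags into an indented outline,
# the core bookkeeping any real HTML parser must get right.
class OutlineParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0
        self.outline = []

    def handle_starttag(self, tag, attrs):
        # attrs carries the tag's attributes, e.g. [('class', 'a')]
        self.outline.append("  " * self.depth + tag)
        self.depth += 1

    def handle_endtag(self, tag):
        self.depth -= 1

p = OutlineParser()
p.feed("<div class='a'><ul><li>one</li><li>two</li></ul></div>")
print("\n".join(p.outline))
```

Real-world parsers go much further, of course: recovering from unclosed tags, mismatched nesting, and the other "common variations" the comment mentions is where most of the engineering (and, arguably, the artistry) lives.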
Then again, I am one of those weird people who likes modern art, so I could be a vocal minority.
"Programming with libxml2 is like the thrilling embrace of an exotic stranger." Mark Pilgrim
That sort of excitement is not just an engineering one.
And, of course, some artists choose not to even write HTML parsers, but to do things more strange, more beautiful with their code. And that is where things really change, and really get exciting.
Maybe this comes down to the difference between something constructed purely for artistic purposes and something constructed in an artistic way to fulfill a broader purpose. Or maybe it's just a matter of personal taste, the way some people will spend hours arguing about whether a soup can is modern art or just pretentious trash. Personally, I'd say that if people are moved and/or inspired by it, then you can make the case that it is art.
"programming is rather thankless. you see your works become replaced by superior works in a year. unable to run at all in a few more."
"if you program and want any longevity to your work, make a game. all else recycles, but people rewrite architectures to keep games alive."
"an ascending homage to fish bones. culminating in a delicate canopy of mouse furs."
... Okay, maybe not that last one. Anyway. He was obviously contemplating all of the stuff we've created around software, and was pretty bummed out by it.
Not that anyone will know _exactly_ why he disappeared, but still.
Anyway, first, to re-iterate: nobody really knows shoes, err, why _why disappeared. This is just conjecture.
Basically, those tweets are all talking about the social constructs we've put up around programming. There are large fads, things come and go, old projects are abandoned, new projects and forks of old ones spring up. The quote about games seems to be really about hpricot; from some reports, _why was kind of upset about Nokogiri, and everyone's shift to it. If people didn't like hpricot, why not contribute, rather than make the same project over again?
The first one is something I'd also expect to hear out of someone who's getting burned out. _why did a _lot_ of things. And, since _why was an artist, he put a chunk of himself in every project he did. It's unmistakable; all of his endeavors undoubtedly have his signature attached to them. And when you put yourself out there like that, and people reject it, it's hard. It's easy to get frustrated when you invest yourself in something, and other people simply reject it out of hand. There's a dead comment at the bottom of this thread that talks about people 'giving him shit' on the shoes mailing list, which I wasn't subscribed to at the time, but with anyone as prolific as _why, I can't imagine there wasn't a fair body of naysayers. Just look at this thread, and the people that want to remove all artistry and craftsmanship from programming. Hence what starkfirst was talking about.
The naysayers won. _why got burned out. He decided his sun had set, so he burned his guitar.
np, no hurry :)
> Anyway, first, to re-iterate: nobody really knows shoes, err, why _why disappeared. This is just conjecture.
Ok. I figured as much, which is one of the reasons I asked. I thought that someone might be able to finally provide some hard info on this, but it seems not.
> Basically, those tweets are all talking about the social constructs we've put up around programming. There are large fads, things come and go, old projects are abandoned, new projects and forks of old ones spring up. The quote about games seems to be really about hpricot; from some reports, _why was kind of upset about Nokogiri, and everyone's shift to it. If people didn't like hpricot, why not contribute, rather than make the same project over again?
Because they can. That's the downside of giving stuff out: you lose control. If you want control then you can go corporate, but as soon as you release stuff into the wild, if it is at a level where plenty of others could 'fork' it, they probably will (or they'll start some me-too project).
This is one of the bigger downsides of open source: there is a lot of fragmentation, and not all of it is good.
> The first one is something I'd also expect to hear out of someone who's getting burned out. _why did a _lot_ of things.
Yes, I noticed that. But then again, _why is definitely not unique in that. In the Ruby scene he is, but outside of it there have been more people in that vein over the years. I've known one personally here in NL, but it was before the 'dawn of the web', so none (or at least almost none) of it is documented.
Burnout is typically a symptom of taking on more than you can deliver and then continuing to throw more of yourself at it until there is nothing left to give. I've had it, and it took me years to recover. Not quite 'didn't touch a keyboard', but very close.
> And, since _why was an artist, he put a chunk of himself in every project he did.
Everybody does that. Really, people put a piece of themselves into their work all the time. Any creative profession has this, whether it is a designer, a programmer or a person who restores old vehicles. Work created becomes like a child.
> And when you put yourself out there like that, and people reject it, it's hard.
That's the downside of putting your work out like that.
What bothers me a bit about the _why saga is that the ending of it left a pretty bitter taste in my mouth. It's fine if you no longer want to play, but active destruction of what you've created points to mental problems a little deeper than just a burnout.
> There's a dead comment at the bottom of this thread that talks about people 'giving him shit' on the shoes mailing list, which I wasn't subscribed to at the time, but with anyone as prolific as _why, I can't imagine there wasn't a fair body of naysayers.
Happens to all of us. I have my share of those and I'm a public 'nobody'.
> Just look at this thread, and the people that want to remove all artistry and craftsmanship from programming.
Not all. I'll be writing a long piece about that any day now, and in part it was prompted by this thread. But there is a lot more to it than 'all code is art' or 'no code is art'.
> The naysayers won. _why got burned out. He decided his sun had set, so he burned his guitar.
Any fool can destroy, but the naysayers did not burn out _why; he did that for the most part to himself. By taking on more than he could sustainably deliver he set himself up for a fall, and by not keeping enough distance from the trolls he allowed them to get under his skin. If you are in the public eye by accident that's one thing, but if you put yourself there by choice you really have to have a thick skin.
Look at all the flak that Linus, RMS and Guido van Rossum get, none of it is deserved and they just keep on giving.
One of my first exposures to contributing open source was an extremely negative one. (I wrote a clone of zmodem and got crucified publicly by none other than the great Paul Vixie himself for 'copyright violation', when in fact all that happened was that a few lines of an old include file made it into the spec for the new code, and so eventually into the distribution. He neglected to mention that this was perfectly ok in a non-spec'd protocol and that Chuck Forsberg, the original author of zmodem, was his buddy.)
After that I vowed to never release code again.
I can't imagine what would have happened if I had gone through with my idea of an open source micro kernel based operating system released to the public. Probably I would have burned out a lot worse than I did on the webcam project.
Naysayers are like trolls. Ignore them, don't feed them.
As an educator (and that's how I primarily see _why) he had an exemplary role, and I think he failed in showing the people he was teaching how a mature person bows out when they realize it is no longer worth it.
It's rough being out there, but you're absolutely right. You have to grow a thick skin. But that doesn't mean that everyone is able to do so.
Oh, and you were going to write a µkernel? That's awesome, I help out two of my friends with an exokernel. It's been two years of hard work, and like you conjectured, we've had a lot of naysayers. But it's coming along really nicely, I'd say...
It's lots of work getting something like that built, and built properly. I can vividly recall the day it first booted; then, a few months later, when it became self-hosted, it was really amazing.
I've dumped some source for you here: http://ww.com/task.cc http://ww.com/task.h . It's the kernel itself and the main header file for the process structure. It's 'hard real time', something I wished Linux would get on to in the standard releases, switched on by default.
Maybe there are some ideas in there that are useful to your friends. If you guys get something working let me know please, I'm always interested in stuff like that.
It's funny, when I look through that old code, how simple it all looks; blood, sweat and tears to get it right, though. That was definitely pre-burnout code :) Since then I've done only much simpler stuff, with the occasional venture into something a bit more ambitious.
greetings & thanks for the exchange btw.
But I don't think that fundamentally you can separate the act of expressing programs from the act of coming up with interesting things for them to do. The materials of the medium constrain what is possible or easy to express, and it's quite hard to produce anything good if you try to artificially separate them into levels of specification vs. implementation.
Basically, you have defined programming as a form of dictation. That definition makes your arguments true. But it does not follow the common usage of the term, especially at a site like this.
You are right to be cynical but this time I think you missed the mark.
The author gives numerous examples of work "Why" produced. Their merit cannot be taken away. Just as important is the effect "Why" had on the Ruby community and other programmers who I imagine are just as or even bigger cynics. There's John Resig, Zed Shaw and Fábio Akita. I've read their blogs, looked at some of their code. They don't appear sycophantic in their ideas or writing. But I don't know any of these people.
But I do know @DrNic ~ http://www.flickr.com/photos/bootload/tags/drnic I met him last year at a talk he did on "Scriptable tools" ~ http://www.flickr.com/photos/bootload/3409666921/ Nic is the most cynical bugger I've met - talking and interrupting the talk after his, ripping into a greeny with a largely sympathetic audience. Questions and comments both insightful and rudely cynical at the same time. I didn't sense the audience resenting this. It pushed the speaker to clarify his ideas.
So I artfully disagree. "Why" had a tangible positive effect in the core Ruby developer community who by nature are as skeptical as you can be - and then some.
Sarcasm aside, _why's influence was felt beyond the Ruby community.
Honest question from a python programmer who never heard about him before the "he disappeared" craze a few months back.
_why was drawn to Ruby because of its own artfulness and playfulness; you would be hard-pressed to come at programming from the same perspective in Python, not because Python isn't a nice language (it is) but because the diktat "there is only one way to do it" necessarily discourages experimental styles of coding, though not of API or visual design. If your medium is code, then it helps to work with a malleable programming language, and Ruby is nothing if not that. Scheme is malleable, but I'd not call it playful. Perl comes closer, but most languages, which are written by Serious People with Important Goals, do not.
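To make "malleable" concrete, here's a minimal Ruby sketch of open classes, one of the features behind the experimental styles described above (the `chunky?` method is invented for illustration, not from any real library):

```ruby
# Open classes: any class, including core ones like String, can be
# reopened and extended at runtime -- a feature _why's libraries
# played with freely.
class String
  # Hypothetical method, purely for illustration.
  def chunky?
    downcase.include?("bacon")
  end
end

puts "Chunky bacon!".chunky?  # prints true
puts "tofu".chunky?           # prints false
```

Whether this kind of freedom is a blessing or a hazard is exactly the sort of thing the two language cultures disagree about.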
Anything that inspires people to stretch their skills and try new ideas out helps grow teams and developers, though preferably not when working with production code.
As for his strange charisma, I would say _why is to programming as Ramanujan was to mathematics. There were western mathematicians who were so fascinated by Ramanujan, that even when he was wrong, they suggested that perhaps he was "right" in some higher plane of reasoning.
There's this cute paper some guy once wrote, it's called "The Art of the Interpreter ...", geez I guess he'll have to change it to "The science of the Interpreter ..." - good thing you fixed that for him.
As the article says, what you can take from _why is to bring a sense of aesthetic to _any_ thing you approach, be it a boring powerpoint presentation or a blog or whatever. Didn't your Mommy ever tell you, "If you ain't got something nice to say, maybe you should bite that tongue of yours".
_why has done absolutely nothing for me either, I cannot stand Ruby, and its community, I don't understand why all the reverence for this single man. Yes, I know of _why's work, and yes seeing other people use his old nickname makes me feel sad since they won't ever really appreciate the ideas, and projects he put out into the world, but why are we still writing about him?
There are many other people who have contributed greatly to the advancement of computer science as a whole, whose work reached beyond a single programming language and changed the field as it was once known. Those are the people we should be writing about, not someone who, like a coward, decided to up and quit.
_why reminds me of my younger brother, and even partly of myself back when I was a kid: if Monopoly or some other board game was not going my way, the board would go flying, and so would all the pieces, and the game would no longer exist. _why removing all of his content, not passing it along to the next group of people, not even leaving a backup in place, thereby causing others to scramble to find the latest up-to-date versions on local disks, is very much like that game of Monopoly where I am losing, and it is a coward's move.
What I'm getting here from _why's detractors is that your animosity towards _why is bound up with your animosity towards Ruby. I have never understood language wars. Each language brings something different to the party. I learn different languages just for the fun of it. We can be competitive and cheerlead our own corner but it's tedious when people start spewing bile.
I don't think you can project your childhood emotions onto _why's pulling of his online presence. We can only guess at why he did what he did. His disappearance, and his removal of the code and so forth that he was in control of, does pose interesting questions in this age where source code is shared and diaries are public. If it's important enough, people will have copies and keep hold of the bits they like. I can see how it would annoy somebody to feel that something of worth could be removed from the world so easily, but maybe we'll have to get used to that. You could argue that _why had a social responsibility not to do what he did, but surely we must allow him some personal autonomy as well.
For my part I'll try to understand your perspective but could you also try to appreciate mine? Thanks!
I like _why too, but by telling gdp to shut up because he's not a "fan," you're actually making it much easier for him to make his point.
> _why's influence isn't only in code, but in thinking. And if you look around, for example on github, you will see a lot of people hacking away for fun, trying out new stuff and just doing something they love to do. Did _why start this trend? He didn't, but he was a big advocate of it.
Yes, that's right, this person appears to be seriously suggesting that _why could have been responsible for people hacking for fun (in the "I'm not saying that...", right after saying it kinda way).
> I'd argue his unique way of thinking has also pushed the boundaries of programming as a science in the way that calculus changed people's ways of thinking about math and physics.
As influential on programming as calculus was on maths and physics? Really? _why is on equal footing with Newton and Leibniz now?
> As for his strange charisma, I would say _why is to programming as Ramanujan was to mathematics.
This one suggests that _why is to programming as a Fellow of the Royal Society was to mathematics! Awesome!
> I always thought he was one of the most creative people currently working in any medium,
Yep. That one pretty much speaks for itself.
I mean, the _title_ of the article is "A Tale Of A Post-Modern Genius".
Need I go on? I believe my use of the word "sycophantic" was justified and I stand by it. Genuine praise for someone's achievements is completely justified and should be given freely. This is not genuine praise. It appears to be a pissing contest between different people trying to find the most absurd hyperbole with which to describe the effect that _why had on them, the Ruby community, the programming world and the entire universe and all matter contained within it.
Most of it is really over the top, and you have to wonder how the non-ruby programmers get by, without access to _why's genius or his contribution to thinking in general.
_why was just some guy; he had a nice run at it, decided that it wasn't for him in the long term, and proceeded to try to hit the 'undo' button. In a way it is good that most of the stuff got saved; if not, we'd have an even larger problem, which is that the scope of his actual contributions would have become the stuff of legend. At least this keeps things a little bit grounded in reality.
Credit where credit is due, but let's keep some perspective here: _why was an ok guy while it lasted, but he certainly was no Newton, Leibniz or Ramanujan, and he did not put the 'fun' in programming except for the relatively small number of people who learned about programming through him. For that last part we should be grateful, and we should do more of it, and we should NOT destroy what we give to others to use as a base.
Especially for someone who wrote open source libraries, his actions were totally unconscionable, and they negate a lot of the positive feelings I would otherwise have had toward him.
INTJs : programming as science, maybe a little art
INTPs : programming as equal parts science and art
The right-brain folks cannot handle the little details directly; they rely on subconscious abstractions. For right-brain folks, the engineering aspect of coding is perhaps necessary to keep track of the little details. On the other hand, left-brain folks have a knack for certain kinds of structure (abstract logic, for example) that right-brain folks do not have naturally.
Incidentally, I also consider engineering to be art, only with more constraints.
Funny, that's exactly the kind of thing I'd expect _why to say if he were to take up a new pseudonym and participate in discussions on Hacker News. I'm sure he'd just be trying to convince us that he wasn't actually all that great.
The move to 'quit while ahead' is genius: his legacy will be treasured forever, and the myth of his genius will only expand with time.
That doesn't mean that his contribution wasn't great, I'm sure the Ruby world would look different without him, but for every _Why there are 10 others, maybe not as good at marketing themselves but at least as influential.
We really have come to the crossroads of popular culture and programming if we get programmers with 'cult' status comparable to minor movie stars. But I'd rather see some credit go to those who are less capable self-promoters.
Watch for _Why's spectacular comeback in a couple of years.
Mark my words.
And that in fact made _Why a stronger brand.
Have a look at the band 'The Cure' and Netochka Nezvanova.
Both of them were amazing marketeers, and _Why worked very much in that spirit.
In the beginning I thought it might be the same group.
nobody wants to challenge me on this?
Maybe rather than saying 'wow' and laughing, you should consider why the things that you say cause others to want to exclude you from the conversation, rather than reply.
Non-attachment -- exactly how? If so, why delete your work? That shows his attachment to it. IIRC, there was some discontent over one of his libraries being overtaken by another (Nokogiri?). There were benchmarks that showed his lib to be slower; then he improved his lib; this went on. He did seem a bit annoyed, if I recall.
"People get way too caught up in their work. I like to think that he was able to keep the products of his online persona separate from the rest of his life, treating them as completely distinct entities - the perfect, clear-cut, division between personal life, work, and play."
"Seeing the complete deletion of his online persona doesn't terribly surprise me. Back in 2007 _why closed his main blog (RedHanded). That event truly shocked me, but it helped me to better understand him as a person. The blog, even though he had put years of work into it and people strongly identified him with it, was immaterial. It didn't feel like the right place to talk anymore so he moved on to another place, abandoning the old site."
Does anyone have any idea what he's been doing since?
Basically, the Ruby community decided that if his wish was to be gone, we'd respect that wish. Nothing has been heard since his disappearance.
* The _why that we knew would not be averse to setting up a puzzle of this sort.
* We simply want to make sure that he didn't kill himself or anything like that.
You make it sound as if there were a single monolithic Ruby community where some sort of consensus gets reached on things. That's completely imaginary.
Seems more like some folks, who happen to be vocal in certain circles, decided not to try to dig shit up. But that's a reflection of those people, not _The_ Ruby Community.
You live a sheltered life.
A few million creative artists would probably disagree with you, some violently so.
Yes, it was nice to have him around, no, he's not that special. Have a look at Alan Kay for an example of someone who is special.
There are a lot of creative people, for sure. Discussing whether X is more creative than Y is meaningless. Ultimately it's about how X or Y's work resonates with you.
>Yes, it was nice to have him around, no, he's not that special. Have a look at Alan Kay for an example of someone who is special.
"In my opinion".
Personally, I've seen few things in any field that struck me as comparable in originality to The Poignant Guide, for example. I would never have imagined a language tutorial that is essentially a cartoon, and a pretty brilliant one at that, accompanied by a soundtrack - but also a working online console and eventually a whole new programming language.
Looking at his entire body of work, it's impressive that this was all done by one person, but it's incredible that this person actually didn't make any attempt to reap financial rewards from it, in an age where everyone with a few blog posts behind him is telling you "you should follow me on Twitter".
Have a look at the early TRS-80 manuals; they were exactly that, no soundtrack though, in 'dead tree' format for want of a bitmapped display on that particular computer.
That does not diminish what _why did but it is not quite as original as you may think.
> Looking at his entire body of work, it's impressive this was all done by one person,
> but it's incredible that this person actually didn't make any attempts to reap financial rewards from it,
Agreed, but he's hardly alone in that, the open source world is full of people that have given an enormous chunk of their lives without getting much or even anything in return.
> in the age where everyone with a few blog posts behind him is telling you "you should follow me on Twitter".
And on that we also agree.
In fact the whole 'get rich quick' mentality can take a running jump as far as I'm concerned.
Certainly a lot of people donate their time to open source projects. Many of them do it while being paid by a company or a university to work on that code. Many others later use it as a base to start companies or consulting careers.
There's nothing wrong with that, of course, but _why did stand out, especially among the Ruby community in the mid '00s, as someone who could easily make $200/hr consulting, and chose not to do so. I don't know many (of course there are some) who willingly choose that path.
Otherwise we pretty much agree :)
The Radio Shack/DC comics stuff is definitely not what I refer to.
And we don't know that _why didn't do $200/hr consulting jobs; he may have done the _why thing on the side.
I just think that there are plenty of people that have done a lot more for programming in general than _Why (or however you spell it 'properly'), and that to exaggerate like this is not helping at all.
Plenty of people are busy today doing very interesting stuff; let's go and talk about that instead of endlessly poring over the ashes of one briefly very productive programmer, artist if you wish.
He chose to limit his contribution; I have heard more about _why and how fantastic his works are since he quit than I ever did before.
Too much fandom, which in turn actually does _why a disservice because it turns off those that would otherwise still be open minded.
edit: it is precisely because of the similarity in goal that I mentioned Alan Kay, by the way, and I think Alan Kay has gone about it in a lot more structured fashion than _why; he's also still at it, instead of taking his toys home because people won't play his way.
The very least he could have done would have been to transfer the maintenance to others in an orderly way instead of this wanton act of vandalism, and without a second of warning to boot.
_why was a very good influence during the first period and he may have approached computing the way Alan Kay has tirelessly advocated for his whole career, but he wiped out a lot of that credit at the end.
I'm really excited!
Good stuff comes out of organization; great stuff comes out of inspiration, which only comes from playing with ideas.
There's a great Feynman quote from just after he got out of the Manhattan Project, where he describes his decision not to think about "Serious Problems" and instead just concentrate on playing with interesting ideas.
Which is, not coincidentally, why _why's detractors tend to feel the need to throw in a comment about how much they dislike Ruby.
You (and everyone else) are creating your own world. Have fun with what you create.
Or they're a form of art. Out of one side of your mouth you say people should do things besides programming for fun, then when they do you call it self-important.
update: there, fixed it. thanks :)
It's just a moniker, a handle. As soon as you start to assign great weight to the spelling and the special look ("It has to have an _") of a nickname then you're definitely in branding territory.
Just like "The Artist Formerly Known as Prince" is a brand, so is "_why the lucky stiff".
Some people care about similar variations in brands.
So far, so good. What's not clear (to me, at any rate) is how you get from these to "Therefore, _why's name is a brand". That would be reasonable if no one cared about details for any reason other than branding, but that's obviously false.
Compare: "... then you're definitely in programming-language territory." (Almost all implementations of almost all programming languages care very much indeed about the difference between "Why" and "_why".)
Or: "... then you're definitely in religious territory." (Some Orthodox Jews care very much about the difference between "G-d" and "God" even though the former is just a lightly modified version of the latter.)
Or: "... then you're definitely in music territory." (In a musical score, adding or removing dots, little lines, funny characters, etc., can completely change the meaning.)
Or: "... then you're definitely in vanity territory." (Some people just care a lot about appearances.)
So ... why "branding", specifically?
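The programming-language analogy above is literal in Ruby itself, as a small sketch shows: the interpreter treats "Why" and "_why" completely differently, because identifiers beginning with an uppercase letter are constants.

```ruby
# In Ruby, the spelling of a name determines its category:
Why = "a constant"     # capitalized: Ruby requires this to be a constant
_why = "just a local"  # leading underscore: an ordinary local variable

puts Why    # prints "a constant"
puts _why   # prints "just a local"
```

Reassigning `Why` would draw an "already initialized constant" warning; reassigning `_why` is silent. So in at least one interpreter's eyes, the underscore really does change what the name is.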
One can imagine how much energy is wasted revising commits and over-massaging code just to make the author look smarter.
I'd love it if the culture were more accepting of ugly code... It has its purpose too.
But, I do find that code written in sort of industry-standard style often is hugely over-architected, with a lot of boilerplate and cruft, and not all that easy to modify if I want to adapt it to my personal ends; i.e. my use-case is a single programmer wanting to put in 5-10 hours to make a small to moderate change. You end up in a maze of Factories and Interfaces trying to figure out where anything is actually done. The idiosyncratic code often just does away with all that architecture, which can sometimes be easier to deal with.
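A toy illustration of the contrast drawn above (all class and method names here are hypothetical): the layered, "industry-standard" shape versus the direct version a single programmer might prefer for a small change.

```ruby
# Layered style: the actual work hides behind a factory and an
# extra class, so a reader must traverse indirection to find it.
class CsvReport
  def render(rows)
    rows.map { |r| r.join(",") }.join("\n")
  end
end

class ReportFactory
  def self.build
    CsvReport.new
  end
end

report = ReportFactory.build.render([[1, 2], [3, 4]])

# Idiosyncratic style: just do the thing where you need it.
direct = [[1, 2], [3, 4]].map { |r| r.join(",") }.join("\n")

puts report == direct  # both yield "1,2\n3,4"
```

The layered form earns its keep when there are many report types to swap in; for a one-off change, the direct form is simply less to read.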
Why did this enigmatic guy disappear from the nets?
Although, if he now distrusts the Ruby community too badly, he might be waiting for a new language espousing "programmer enjoyment" to sprout up...
I think the only lesson he's trying to teach us is to take things less seriously, to accept transience and impermanence, to realise that nearly all of us are just making pictures in the sand. Almost everything we do will one day disappear without trace. We might as well get used to the idea.
Humans understand new things by comparison to what they already know, so fresh souls are often misunderstood and misclassified.
I'll add to that, though: often, when people die (and _why did, in a sense, die), people view everything they did through rose-colored lenses. I'm with the majority of HNers in this thread so far, still in awe of _why; however, let's not forget the tough lessons his disappearance taught us.
While I greatly appreciated everything _why did for the community and think he truly was an amazing person, it's clear in my mind that the way he left the community, without notice that he would take down his sites and code, was irresponsible. He had the right to do what he did, but that doesn't make it ok.
I'm not angry at _why, more sad. I assume, given his eccentricities, that there was no malicious intent behind it, which is why I find it merely tragic. However, his leaving was a powerful reminder of the responsibilities open source software authors have. If you make code free, you should make sure it stays available within reason, and if you use free software, you should take _why's disappearance as an object lesson.
...but that was exactly what he was from the beginning; I don't believe he ever considered himself to be an "open source software author." He said himself (in the article, even) that he was only ever messing around—it was our fault for depending so heavily on what he intended to be either "art" or "toys", but never "work."
His ethic (really an aesthetic, as he was an aesthete) worked well for those who were with him from the start, but it was entirely incompatible with the premise of "software engineering." The only people sore from his loss were those on the outside looking in.
And why hide your real name in the first place? Considering that he gave presentations and made appearances, it was only a matter of time.
"Fischer Random was designed to remove the importance of opening book memorization. Fischer complained in a 2006 phoned-in call with a television interviewer that talented celebrity players from long ago, if brought back from the dead to play today, would no longer be competitive, because of the progress in memorization of opening books. 'Some kid of fourteen today, or even younger, could get an opening advantage against Capablanca', he said, merely because of opening-book memorization, which Fischer disdained. 'Now chess is completely dead. It is all just memorization and prearrangement. It's a terrible game now. Very uncreative.'"
When you don’t create things, you become defined by your tastes rather than ability. Your tastes only narrow and exclude people. so create.
— Why the Lucky Stiff