“I hate almost all software” — Ryan Dahl (plus.google.com)
397 points by robinhouston on Sept 30, 2011 | 292 comments

I think this rant is striking such a chord because it exploits a well-known, if not widely acknowledged, predisposition among developers: the breathtaking quickness with which we assume that the other guy is an idiot. Not that we don't understand the problem, or that this solution addresses things we aren't aware of, but that the other guy is stupid and shouldn't be allowed anywhere near a compiler. I can see Dahl doesn't like complexity, but has it ever occurred to him that software is complex because it solves complex problems?

The real problem is that tools and libraries are incredibly unforgiving. Not only do you have to understand 10-15 different technologies to make your project work, but you need to understand them all extremely well which is a huge barrier to entry.

I don't know a whole lot about rails, so this is conjecture, but I imagine this is why ruby on rails is so popular: you don't need to know very much to get it going.

Making things simple takes a lot of work. Things are complex because the problems are complicated. That's why you get paid the big bucks.

"Make everything as simple as possible, but not simpler" - Einstein.


>> There will come a point where the accumulated complexity of our existing systems is greater than the complexity of creating a new one. When that happens all of this shit will be trashed.

Amen! (Include in the complexity above the complexities of real life: project schedules, time-to-market, ..., ultimately economics.)

Ryan claims that the systems are still complex, which suggests that the accumulated complexity (including project schedules) has NOT exceeded the complexities of creating a new one in general.

Having said this, what is Ryan really saying that tells us something we did not know before?

A tacit assumption in the whole argument is that people are intelligent enough to judge complexity and make rational decisions, and that they would be able to find a simple solution when creating a new system; when in fact, even with all the understanding gained from experience, the new solution will still be very complex (just simpler than the existing one). This is to an extent analogous to the rational market hypothesis, and that I doubt to be true.

Next Ryan may propose a new system written from scratch to satisfy his not-overly-complex goal, only to find that the new software runs on top of existing hardware, which is immensely complex. Oh, then he thinks about developing the hardware again too, only to find that hardware development is immensely complex (EDA tools, for example). Oh, then he thinks about developing those again too. He now concludes that the accumulated complexity hasn't yet become too high after all.

After taking all of that into account, and if that is not complex by itself, find something at an intermediate level (say, a programming language) that has less complexity at that level (going deeper would increase complexity) and build something on top of it. But isn't this what all of us already do?

But this is what we have, and I am Candide, so I believe that this is the best of all possible worlds.

What does the author of this essay want? A mind-reading machine? I RTFM, so I don't complain. On my Arch Linux netbook there is plenty of room for creativity, and amazement, and fun.

I love what I have. Besides that, it's free, and I can hack it.

Pain is inevitable. Suffering is optional.

I can't relate to this at all. I hardly ever run into the problems he describes.

I usually just start Visual Studio, create a project, import some NuGet packages and off I go.

(No, that wasn't a "my framework/platform is better"-rant - JVM IDEs + Maven can do essentially the same thing)

Point is, I don't even know what autoconf is, and I like to keep it that way.

Only a tiny fraction of us have to make Node/Java/.NET/Ruby+Rails+Rack+etc. All the rest can just go and solve problems. These tools really do abstract away the accumulated platform complexity. They add a little of their own (like $NODE_PATH), but that's at the platform level too, the level I don't care about anyway. I have npm, you know.

You're not doing system programming.

On a platform-related side note: if you're not using UNIX then it's going to be pretty hard to run into the described issues. When you go with a proprietary developer platform like Visual Studio you are outsourcing a lot of little headaches to Microsoft whose job it becomes to make your life easier. The problem is if Microsoft decides to drop their clue off in the weeds somewhere you have to either start writing pleading letters to Ballmer or make a huge jump to some other platform. In UNIX you suffer a steady string of little headaches, but you have the open source ecosystem to back you up, so with enough persistence there is almost nothing that you can't solve.

Thanks at least for responding. The other thumb-downers were in too much of a hurry to, apparently.

I know I'm not doing system programming. My point is that only a very small minority of programmers is doing system programming. And if you're not part of that minority, software isn't at all as bad as Dahl describes. In fact, we got a great many enablers for free, without many corresponding headaches, over the last 10 years or so. There are a lot of really decent high-level tools.

Most of the Java ecosystem is open source, btw, and its high-level tools are comparably easy to Visual Studio / .NET's. Eclipse and NetBeans are really decent and open source, and so are Maven and all the tooling for the cool JVM languages (i.e. not Java). I don't think my point applies only to proprietary platforms like .NET.

In fact, Node is quickly becoming an equally cool platform: a big load of open source libraries, a robust and dependable core engine, a thriving community, excellent package management, etc. Nothing there to hate.

I daresay Ryan Dahl hates software for us so we don't have to.

Okay. I didn't downvote you BTW.

Isn't he dismissing culture with his rant? Culture as the currently common way to do and describe things?

He seems to imply that there could be an easier, more straightforward way to describe things in some more common language. And yet he doesn't give any evidence of how the current ways are overly complex.

Of course, there is broken or outdated software, and some things were crap from the start. Of course, there are always concrete things to improve but you won't get anywhere by dismissing all of it and starting anew.

For me, understanding the current state as part of our culture and our humanity, and improving gradually on it, has guided me well in the past.

One thing that would help me out would be if new documentation could point out which parts of a system are needlessly complicated and which parts are needed for the system to function. If a book or document could just say "This part of the tool is waaaay too complicated because the original developers envisioned this evolving differently. These are the good parts that you should spend time learning". I think Douglas Crockford called this "subsetting". A big challenge when approaching a new technology is deciding which parts to invest in.

Thank you for pointing out the elephant.

I don't think he hates software... he seems to hate the unnecessarily steep learning curve he faces when he sets out to do something simple with an existing platform like Unix.

I think this argument is perfectly sound. As a software developer, there are times when I wanted to do something simple using a particular framework and was faced with a steep learning curve to achieve it. Note that I was not trying to use the fanciest features of the framework, but the simplest of them.

I'm not sure why people expect software beyond Solitaire, Pidgin or Thunderbird to be any easier to configure or maintain than, say, changing the spark plugs in your car.

Or harder yet, diagnosing the issue as the spark plugs to begin with. Cars have been around for a hundred years and they can still present a challenge to even highly skilled and trained mechanics.

Manage your expectations when you try to do something you aren't an expert in and you won't be disappointed.

Agreed, software is often way too complicated, especially Linux in some parts.

Ultimately we are developing the software because of one simple thing.

The user.

[mild rant]

Usability discussions like this invariably fill me with rage because of how oblivious and dismissive some of the comments are.

They might as well say: No Wireless. Less Space Than A Nomad. Lame.

You'd think people would know better after 10 years & $350B in market cap!

And yet nontrivial things can be built in 48 hours.

The results of modern software development speak for themselves. One of the biggest things I learned from reading a few chapters of The Mythical Man–Month was what software development used to be like.

What does he have against Boost? If anything, Boost helps eliminate deps. Without it, one would need to use multiple libs from different places to achieve common things like shared pointers. Not to mention most of it is just header files.

I'm shocked that about 90% of comments are bashing Ry. If it's not obvious to you that writing software sucks, it's because you haven't swallowed the red pill. You are not enlightened. What he says disturbs you and discomforts you because you've built your life on the blue pill notion that writing software is supposed to look something like this -- that memorizing the minutiae is justified and even somehow noble.

You're tempted to criticize because someone told you growing up that you're a unique little snowflake and your opinion is worthwhile whether it's qualified or not. This is Ry. And his sentiment is echoed by the greats, like Alan Kay and others. Listen for a second (and you can't listen if you're already babbling your unintelligible knee-jerk response).

Thanks, Morpheus Durden

You're welcome. Actually, the snowflake thing was Bill Hicks before it was Tyler Durden. And, OK, I was a bit hard, but I can't stand the Oprah Effect: here is an important, deep topic, now let's ask some unqualified, uninformed people for their opinion. I browsed the comments and saw only a litany of shallow, and frankly stupid, replies. Since when did HN become Reddit? Which begs the question: is it time for me to go?

Obviously joking (and I must admit I don't know who Bill Hicks is).

Anyway, I see your point.

No worries. Cheers.

When discussing this with a colleague, I was reminded of: http://neugierig.org/content/unix/

It's only fair that we, developers, start suffering a little from the poor state of system software usability after inflicting so much pain on our users.

I can imagine the hoops he has to jump through to get things working correctly under both Windows and *nix environments.

I think part of the problem is that many of these issues are left to "distributions" and "repositories".

Some guy releases a library or application; then it gets packaged one way into .debs, another way into .rpms, another into MacPorts. Maybe the author does this work; more likely, distribution maintainers do it.

Or in the world of a specific programming language, there is a similar story with a language specific packaging system. Maybe it gets packaged as a gem, or a jar, or an egg, or a module, or maybe the new node package manager.

Often, installing a package involves spreading its contents around. Some of it goes in /var, some goes in /opt, some goes in /etc. Who knows where else?

Many of the reasons for the unix directory layout don't apply for most people today. How many people even know what those directories' purposes are? How many have actually read the Filesystem Hierarchy Standard document?

Typically, those directories were established so that sysadmins could save storage space by sharing files between sets of machines (the word "share" seems to have about a dozen different meanings in these discussions). So you slice it one way so that machines with different architectures can "share" the contents of /usr/share, and you slice it another way so that things that change can be backed up more often, so they get thrown together in /var (and then you can mount /usr read-only!)

Most of these considerations are not worth the effort for most people. I think they are outdated. We don't generally have these directories mounted on separate partitions. We just back up the whole damn hard drive when we need a backup.

Here's an idea: a package should be a directory tree that stays together. Each programming language should not have its own separate packaging system. A package should be known by the URL where the author publishes it. That author should also declare his/her package's dependencies by referring to other packages by their URLs. Then you don't need dependency resolution systems that function as islands unto themselves (one for Debian, another for Node, etc.).

Software is published on the web, in things like git or mercurial or subversion repositories. These have conventions for tagging each version. The conventions are gaining adoption (see semver.org for example) but not fast enough.

Some middle layers just add friction to the process: distributing tarfiles, distributing packages to places like rubygems or cpan or npmjs.org. Developers usually want the source straight from the source anyway -- users might as well use a setup that very closely mirrors developers'.

If you want to add a package to your system, the only piece of information you should need is the URL of the project's main repository, with an identifier for the exact release you need. That's a naming system shared by the entire web. If there are issues, that information can go from the user directly to the author, with no layers in between.
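A minimal sketch of how such a URL-keyed manifest could work; the URLs, tags, and function name here are all hypothetical, and real `git` options are used only to illustrate the idea of fetching exactly one tagged release:

```python
# Hypothetical manifest: each dependency is named by the URL of its
# repository, plus a release tag in semver style. The URL *is* the name,
# so no central registry (Debian, rubygems, npm, ...) is needed.
MANIFEST = {
    "https://example.com/alice/libfoo.git": "v1.2.0",
    "https://example.com/bob/libbar.git": "v0.9.3",
}

def fetch_commands(manifest):
    """Turn a manifest into the git commands that would vendor each dep.

    A shallow clone of exactly the tagged release mirrors what the
    developer published, with no packaging middle layer in between.
    """
    return [
        f"git clone --depth 1 --branch {tag} {url}"
        for url, tag in sorted(manifest.items())
    ]
```

The point of the sketch is that dependency resolution collapses into "follow the URLs", since every package's manifest names its own dependencies the same way.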

Apple has a great install/uninstall process for many applications: you move them to the applications folder, or you drag them out of the applications folder into the trash. We need to strive for this level of simplicity. Deployed software should have the same structure as the package distributed by the developer, in almost all cases.

Have you heard of GoboLinux[1]? It revamps the directory structure for exactly this reason.


Yup; the key is adoption :)

My approach right now is to manage all my software within my home directory in a way not unlike what GoboLinux is doing. The home directory gets mounted on different machines with different operating systems. So the aim is to gradually work out a software packaging strategy that works well across all the existing OSes.

Similar to Homebrew or GNU Stow, actually. But Homebrew is Mac-specific, and weirdly tied to a GitHub repo.

"The only thing that matters in software is the experience of the user."


This pisses me off a bit, actually. Basically, he's ripping on every developer who ever wrote any code. Ok, I'm sure there are a few he'd be happy with, but his comments do sound all-encompassing.

And then he somehow tries to make it "better" by ripping on himself, too, saying he's a part of the problem. Um, no, being self-deprecating in the same way that you're insulting everyone else does not magically make it ok for you to insult everyone else.

I've been using Linux (and a couple UNIXes on and off) for a little over 10 years. So I can get around a UNIX-like system pretty well. A lot of things are easy, and a lot of things aren't. Saying that it's somehow someone's fault is ridiculous. Claiming that all software developers are collectively lazy or don't care about user experience just doesn't hold up.

The funny thing is that he works in a position that naturally involves some difficult stuff. Let's say my favorite language to write software in is called XYZ. Say it's super easy, intuitive, concise, performant, and the method for compiling/deploying/distributing the end result of your hard work is trivial. In all ways, this system is just beautiful to work with.

Great, but I'll bet you the guy who wrote all the development tools and runtime for XYZ had to do a lot of difficult work to make that possible. Dahl is building a runtime for web applications. Unless he's writing it in some high-level language, it's not going to be easy. Supporting every platform he wants to support isn't going to be easy. User interfaces should be as simple as they can be, but often that requires a lot of complexity under the hood.

Go down even farther. Let's think about our basic building blocks. Transistors. High and low, ones and zeroes. It's a very simple interface. You construct logical operations by using NAND, NOR, NOT, etc. gates, which are built from transistors. Also simple. But the next step for our modern computer is... well... the microprocessor. And while it's made up of these incredibly simple building blocks, the combination of them is extraordinarily complex. So the interface into that mess is also not the most friendly thing to work with: a machine instruction set. So we build things on top of that to make it successively easier: assembly language, C, Ruby.

And the tools that come along with this are only as good as the technologies they're built on. Tradeoffs must be made to be portable. Yes, all this is a huge mess that "we" have collectively invented over the past 30-50 years or so, but it's simply not possible to go back to the 1970s, know exactly where we're going to be in the 2010s, and design the perfect system, even with foreknowledge. The current state of computing is a product of the evolution of our technology. Often that means doing the best you can today, and hoping for something better tomorrow.

As far as Unix is concerned, I would say a huge part of the complexity of those systems comes from insisting that dependencies be installed, and shared, system-wide. This approach comes from a time when disk space was very expensive; hence the need for those super-complex make/configure/install/apt stacks, LIBRARY_PATH, etc.

IMO you could simplify things a lot with a distro that only shared, say, the kernel/module/libc layer, plus a package management system. Beyond that, each package would manage its own dependencies and install them under its own root directory, so you have only the package maintainer to blame if something is missing. This would give an application much more control over how to configure itself. It would also have the added benefit of super-simple uninstall: just delete the app's directory, like on OS X.

Yes, OS X got this very right. It's really nice that when some big library gets upgraded, I don't have to worry that half the applications on my system have to be simultaneously upgraded.

Having just wasted a day trying to install ImageMagick (with no success), I have to agree. I just want to freaking resize images; I don't give a DAMN about installing prerequisites or dynamic linking. I just want a simple API that I can call to resize an image given a width and height.
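For a sense of how small the wished-for API surface is, here is a toy nearest-neighbour resize over a list-of-lists "image" (no real image I/O or filtering; the function is purely illustrative, not ImageMagick's API):

```python
def resize(pixels, width, height):
    """Resize a 2-D grid of pixel values to width x height.

    Nearest-neighbour sampling: each output pixel copies the source
    pixel whose coordinates scale to it. The entire 'API' is a grid
    plus the target width and height.
    """
    src_h, src_w = len(pixels), len(pixels[0])
    return [
        [pixels[y * src_h // height][x * src_w // width] for x in range(width)]
        for y in range(height)
    ]
```

Real resizing needs decoding, filtering, and encoding underneath, which is exactly the complexity the commenter wants hidden behind a call this simple.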

GNU is Not Unix

Somebody needs a hug...

This may be the right place to say it: PHP... i love you!

troll, perhaps? gave me a laugh

The post is about how a programmer thinks there are a lot of things that are more complicated than they should be, and it even uses web development as an example. PHP is a programming language with a lot of native functions for doing things on the web that would be a lot harder otherwise...

So, what is the trolling here? The lack of serious tone in the comment?

Nice drunk post.

Complex BAD. Simple GOOD. Move on with your lives.
