“I hate almost all software” — Ryan Dahl (plus.google.com)
397 points by robinhouston on Sept 30, 2011 | 292 comments



I've been tempted to write rants like this before. Ryan's point seems particularly centered around Unix, which makes sense. My experience of trying to get stuff done in Unix has taught me that it is a really powerful, extremely well-thought-out way to waste my fucking time.

All of it. Down the fucking toilet, and for stuff I don't give a shit about. Every single goddamn thing I try to accomplish while setting up a server involves a minimum of 1 hour of googling and tinkering. Installing PHP? Did you install it the right way? Did you install the special packages that make it secure and not mastodon slow? You want to create a daemon process? Hope you found the right guide! Setting up a mail server? Kill yourself.

For some people, this is not the case. They have spent multiple decades breathing in the Unix environment, and are quite good at guessing how the other guy probably designed his system. And they don't mind spending the majority of their productive hours tinkering with this stuff. But I don't have time. I don't care. I don't have time to read your 20-page manual/treatise on a utility that doesn't explain how to actually use the thing until page 17. I don't want to figure out why your project doesn't build on my machine because I'm missing some library that you need even though I have it installed but some bash variable isn't set and blah blah blah blah.

The problem with Unix is that it doesn't have a concept of a user. It was not designed that way. It was designed for the people who programmed it. Other pieces were designed for the people who programmed them. If you are using a piece that you built, then you are a user. Otherwise you are a troublesome interloper, and the system is simply waiting in a corner, wishing you would go away.

And yet...we put up with it. Because there isn't a better option. Because it's our job. Because we'd rather just bull through and get things done than spend an infinite amount of time fixing something that isn't fixable. Life sucks, but NodeJS is pretty cool.


"Because it's our job." Well probably not. I don't spend hours on fixing my own car. I leave that job to the ones who know about cars. Install a new server: ask someone who knows, give him your specs and he will set up the rig.

That's the funny thing about the internet. Everybody can get the knowledge they need, but that doesn't mean they can apply that knowledge and foresee all the consequences.


Long live the PaaS mechanics!


Car-Mechanic-as-a-Service.

Have your car fixed, pay by the hour, brilliant idea!


Exactly. As a friend of mine once said, "being in shape and being able to get into shape aren't the same thing".


He mentions things in that rant that I have no idea about, but I can get plenty of stuff done on Debian (as an example) - coding command line apps, creating services, deploying webapps...

I mean, it sounds like he's doing some pretty fiddly stuff - really getting in there and hacking. I don't see how that could ever be simple, and it certainly isn't anything 'end user' facing.

End users check their email online and work in spreadsheets occasionally. They don't develop server-side js frameworks. I guess the argument is that if things were simpler, then maybe they could do those things? But I don't buy it.

(I get the frustration, when you're held up for 45 minutes googling because some library is missing or a string isn't formatted just so, but that sort of thing only happens when you're literally hacking things up. Which isn't end user behavior, and I can't see how it could ever be a whole lot simpler. Maybe I'm just short-sighted.)


End users do NOT only "check their email online and work in spreadsheets occasionally". They use complicated workflow systems, data analysis environments, resource-hungry media editing applications, and very complex yet almost undocumented scientific instruments. And the mountain of domain knowledge they have is no smaller than that of a Unix systems programmer, so they just don't have room for the latter.


I don't think you understood the post correctly. Your comment is the same point that he is making, that end users never see any of the stuff he needs to fiddle around with while developing. Why does the developer need to learn all of that stuff when it makes no difference to the end user?

He's saying that the development side could be simplified as long as the end result (what the user sees) stays the same, since the user doesn't care how the product was developed.

The dev side is horrendously complicated, which is why he says he hates almost all software.


Here's an example: I use Snort, and wanted to set up Snorby because BASE is ancient and creaky and doesn't work well. It's written in Ruby, so it should be easy, right? Just get a package with the correct version of Ruby, then gem install until I have the prerequisites.

Nope. I gave up trying to install it months ago, but it required many external programs at versions too recent to be included in distro repositories, and which as far as I could tell were mutually incompatible. Obviously, people have gotten it to work, because it's a pretty popular front-end, but I never did.


I agree with robomc, but it appears that most people are mixing up 'not buying' with 'not understanding'.

Sure, nodejs would be easy to pick up if it had no dependencies and if its binaries were contained in a folder - i.e. portable - but assuming nodejs could be of any interest to the end user is a bit of an exaggeration IMHO. Therefore I don't buy it either.


Unix is just an example. Doing something non-trivial in Windows is usually even more complicated and involves a good amount of black magic. The post is about how tool complexity vastly exceeds the complexity of the problems those tools solve.

    Because we'd rather just bull through and get things done than spend an infinite
    amount of time fixing something that isn't fixable.
This is a lie. Most developers don't even try to make things simple, and then say it's an unfixable issue to give themselves an excuse. I see this happen pretty much every day. Yes, simple is hard. Yes, designing simple systems requires doing more work, and sometimes re-doing your old work, but it does not require an "infinite amount of time". It's perfectly doable, and there is a return on investment in the long run, unless you're solving fake problems in the first place.


On a practical note, I've seen the install situation improve in the last 5 years thanks to two ubuntu features:

- the ease of apt-get install (inherited from debian)

- ubuntu LTS releases

I now use availability of apt-get instructions for an LTS as one of the measures of the maturity of a software package.
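For example, an install that used to mean an afternoon of compiling is now typically just this (nginx here is only an illustrative package name; substitute whatever you're installing):

    $ sudo apt-get update
    $ sudo apt-get install nginx   # pulls in the right dependencies automatically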


So.. things that have been in Debian for years.


NO - this is about the availability of DOCS to set up 3rd party software.

10 years ago, Debian may have had apt-get and stable, but a lot of software setup docs were for building from source, leading to possible compile issues and version conflicts for libraries.

Now, many setup docs are written as apt-get of binary packages for the last Ubuntu LTS.


Have you looked at http://www.turnkeylinux.org/


Thus Heroku.


Exactly, and Heroku isn't simple.

It's just simple for you, because it places you in an actual end user role. (Which you pay for).

Heroku is no doubt complicated as hell for the people who developed it. So if the argument is that developing node.js should have been like deploying to a paid managed hosting environment with fairly tight requirements, then ok. But that's a weird thing to assert.


Isn't the entire point of the op about making software simple for the end user? Obviously Heroku is complex underneath, but they've created a beautiful abstraction that makes an extremely complicated stack very simple to interact with. It doesn't solve every system administration problem in the world, but they're doing their part to make software more pleasant to work with.


Yeah, I see what you're saying. I guess my angle is that the original Heroku (before they had cash and time to add more support) had a tight, limited, costly scope - it concealed the complexity of deploying a rack app, and charged $50-100 a month for that concealment.

And I'd say that if all you want to do is deploy a rack app, you can do that relatively simply in Debian too, for free. (I'd do it with rvm and perhaps build nginx from source and such, but you could do 90% of it straight from apt-get, rubygems and editing ~three config files).
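A rough sketch of that apt-get route, just to make it concrete (the package and gem names are from memory of Debian/Ubuntu of the era, so treat them as illustrative):

    $ sudo apt-get install nginx ruby rubygems
    $ sudo gem install rack thin   # or unicorn/passenger
    # then edit ~three files: the nginx site config to proxy to the app
    # server, the app's config.ru, and an init script to start it on boot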

The sorts of things that are really tricky, and have you sweating over the sorts of things the original post is complaining about, are just hard original work (and free, and you don't have to wait years for them to support your pet language or framework or whatever).

It's an acceptable trade-off, an issue internal to the concerns of developers - not end users, and not a reason to hate on unix, is all I'm saying. Not at all hating on the service Heroku provides.


I looked at the Heroku sign-up page, and got intimidated as heck by all the terminology they use. I can't imagine what the setup and admin process is like... it could be simple, who knows, but their pre-sales page confuses me...


For anyone who isn't familiar with git, I can see how that could be confusing. Other than that, the "How It Works" page (which is what I assume you mean by signup page) is mostly just explaining the secret sauce, and doesn't really matter that much to the end user (developers). All that matters is gem install heroku, heroku create, git push heroku master (etc).
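Spelled out, the whole deploy loop at the time was roughly:

    $ gem install heroku        # the CLI shipped as a gem back then
    $ heroku create             # provisions an app and adds a 'heroku' git remote
    $ git push heroku master    # deploys whatever is committed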


And now you know why I use a Mac. :)

Unix is there, if I want it and thank the gods there are package managers.


Imagine the amount of hacks and abstractions that go into making a GUI work on top of UNIX. The complication is compounded; it is easier for the end user (for some tasks, not all), but the complexity he is talking about isn't solved by fancy widgets.

It depends what kind of user you are. Few users require fussing around with D-Bus etc.


Unix is very easy if you take time to learn it. Most things behave the same way.


Demonstrably untrue based on two and a half decades of experience. At least chasing lib dependencies for 18 hours during the config/make/make install process is mostly a thing of the past.


Do you have any broad advice on easing the config/make/make install process?


apt-get install


or yum install, where yum automatically resolves the dependencies for you. No more make && make install.


Nothing to do.


  Most things behave the same way.
Ha ha. You jest.

  bbot@bbot:~/foo$ ls
  bar
  bbot@bbot:~/foo$ ls bar
  text.txt
  bbot@bbot:~/foo$ mv bar baz
  bbot@bbot:~/foo$ ls
  baz
  bbot@bbot:~/foo$ cp baz bar
  cp: omitting directory `baz'
  bbot@bbot:~/foo$ ls
  baz
There are hundreds of things like this. dd's hilarious syntax. The spotty usage of --help. Inconsistent behavior when you invoke a command with no arguments. The dazzling array of contradictory option flags for ls. Everything about vi. Etc etc etc.


He said "Unix", not "Linux".


I'm confused. Which of these don't apply?


You just proved the author's point.


Free Your Technical Aesthetic from the 1970s: http://prog21.dadgum.com/74.html


"And yet...we put up with it. Because there isn't a better option." vs. "this is needless" (from the article) doesn't really gel. If it's actually needless, then you know a better way of doing it - so publish it!

"The problem with Unix is that it doesn't have a concept of a user." Nope. It doesn't have a concept of a naive user. Table saws also don't have a concept of a naive user, but people don't bitch that you can't use a table saw without learning how first.


> Nope. It doesn't have a concept of a naive user.

I forgot who originally said it, but your comment reminded me of this quote: "UNIX is user friendly, it's just choosy about who its friends are."

And occasionally even your best friends get on your nerves, I find.


That's one brilliant thought.

When someone inexperienced tries to use the chainsaw, he may cut his hand off. Nobody blames the tools - it's obvious that if a green guy is hurt, it's because of his own foolishness.

Sadly, in IT it's the opposite. People are bitching about the tools, paradigms, and philosophies, without really doing their homework. Hey. Once upon a time it took a lifetime to master specific crafts. Let's be decent and maybe a bit humbler.

I have a theory why it is so, by the way. In the conventional crafts everything is physical, touchable, solid. In IT everything is abstract and prone to easy judgement and mindless relativism. Please, let's bring craft back to hacking!


Challenge accepted!

- Here's how needless Unix 'users' are:

Every fresh server install, I have to make up a meaningless string called the 'login' of an 'admin user' who belongs to a 'group' called 'admin'. Once upon a time, I could use the well known admin login 'root'. Now that's not allowed. I have to make up a name, remember this name when I connect, and then remember to prefix every command with sudo.

- Here's a better way of doing it:

Give me a server distro where I don't need a 'login'.

Meanwhile, why is apache pretending to be a 'user' called 'nobody'/'http' and not using some 'capabilities' or some shit like that?!!!!


> Every fresh server install, I have to make up a meaningless string called the 'login'

If you are only doing the occasional install then this really shouldn't be a great hardship. If you are installing many servers you should have this part automated. And it shouldn't be meaningless either. I think you are doing it wrong.

> Once upon a time, I could use the well known admin login 'root'.

You still can. root login can always be reenabled if you want it that badly. There is also "sudo su" (unless explicitly disabled by your admins) to avoid repeated invocations of sudo while you are performing a long admin task.

> - Here's a better way of doing it:

> Give me a server distro where I don't need a 'login'.

No, no, and thrice no. Far too many newbies will leave it in that state and get hacked to buggery in short order. Even if it is only for local console logins, I'd consider it a bad idea.

No matter how inconvenient it is, a server install should never default to an insecure state, and allowing access without authentication is such a state.

Live CDs often do this, but they are not production systems.

> Meanwhile, why is apache pretending to be a 'user'

Well, that much is a valid point. There has been work in this area, but none of it has made its way into the default setups of most unix-a-like systems.


> There is also "sudo su" (unless explicitly disabled by your admins) to avoid repeated invocations of sudo while you are performing a long admin task.

You can also do 'sudo -s', which keeps you in your normal user's shell. It's pretty slick.
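For completeness, the variants look like this (exact behaviour can vary with sudo's configuration):

    $ sudo -s    # root shell, but keeps your own shell and most of your environment
    $ sudo -i    # simulates root's login shell, much like 'sudo su -'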


You learn something new every day. Thanks for the tip, I'll give that a try at some point.


Here's a simple thought experiment:

    Imagine a distro that changed the terms 'login/password' to 'password1/password2'
No commands you type would change, but you'd wonder why the password is in 2 parts. That's how redundant the 'user' is!


There's a security problem with that:

  $ chpasswd maybeUniqueString
  Error: maybeUniqueString already in use.

  $ su maybeUniqueString
  LOGIN OK, PROCEED TO EXPLOIT ME
So username and password are not totally redundant.


Well, as the old saying goes, those who do not know Unix are doomed to reinvent it.... poorly. I am rarely surprised anymore by the bad ideas I see people come up with.


Sounds like you've misunderstood - it's not that either password1 or password2 alone will get you in!

I didn't change any behavior - just the UI strings. So you need both!

i.e:

    $ su maybeUniqueString1
    Enter Password2:


Isn't that exactly the amount of stuff they would already have to guess?


No, you can have many users, and this way you try to guess everyone's password at the same time. Also, sadly, passwords tend to be repetitive, so now someone can accidentally guess someone else's password.


Cute, except that password1 has to be unique. Hope you like confusing some of your users.

You'd be better off getting rid of the login altogether and using a GUID. People still share computers, you know.


Uniqueness is not an issue - see my original point - we only create one admin user on servers!

Imagine a team of 3 people running a SaaS webapp on 3 web servers & 1 db server. I guarantee no one will waste their time creating 3 'users' on each server i.e. 3x4 = 12 'users' on that cluster.


> I guarantee no one will waste their time creating 3 'users' on each server i.e. 3x4 = 12 'users' on that cluster.

It sounds — and I don't mean to be rude — like you have not been involved in a "real" production environment.

Modern Unix environments are automatically managed with modular configuration systems such as Puppet or Chef. Sysadmins have little or no need to log into servers to configure them; they just hook the server into Puppet (for example), and Puppet will do everything required to mould the server into its correct state: Create users, install public keys, install packages, write config files etc.

Puppet in particular is so simple that you would want it even if you were managing a single box. Why? Because if that single box dies/is wiped/whatever, you just get a new box and point it at Puppet, and it will become identical (minus whatever data you lost) to the old one. Or need to buy more hardware? Just point the new box at the Puppet server, and you have two, or three, or ten identically configured boxes.

So yes, in a sense you're right; sysadmins won't waste their time creating a bunch of users, because they will let the configuration management system do it. :-)
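To make the "just point it at Puppet" step concrete, bootstrapping a fresh box is roughly the following (hostname hypothetical, and the CLI has shifted between Puppet versions):

    $ sudo apt-get install puppet
    $ sudo puppet agent --test --server puppet.example.com   # fetch and apply this node's catalog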


> Puppet in particular is so simple that you would want it even if you were managing a single box.

You know... that's so blindingly obvious, that it had never even occurred to me. I'm in the middle of a home "IT refresh" right now and I'm trying to update the obscene amounts of documentation needed on how to configure every little thing.

Your comment just gave me a "the code is the documentation" kind of moment; realized I'd much rather have all that documentation checked into a Git repo somewhere as automateable configs. Thanks!


That's exactly the point. Glad to help. :-)


It's really nice having separate users in production for a server. That way, you log sudo usage and know who issued each admin command. But in a larger system, you don't worry about setting them up on each server; instead, you rely on LDAP.

Unix is the LISP of server operating systems. It's a multiplier. In return, it demands much more from the operator. This is not ideal for a desktop system. It's amazing when you have an admin who knows his shit.


LDAP is a sensible suggestion.

Why can't we still kill the local login, i.e. directly map LDAP user -> permissions instead of LDAP user -> local 'user' -> permissions?


> Why can't we still kill the local login

If LDAP ever goes down, you might want to retain the ability to log in to your box.

I ran into a problem similar to this on a recent DR exercise.

Active Directory (an LDAPish service) was down. Was going to be down for a while. If I could get into my three Windows hosts I could re-jigger the service account for $APP from the AD user to a local account, start things up.

I couldn't login to the servers: my .admin account was in AD. No one had any idea what the local administrator account could be. We were just .. stuck .. until AD came up.

I could have booted the system with a rescue disk (Linux) and edited the SAM to change the password. Didn't happen then for complicated reasons. And one shouldn't have to resort to heroic methods to get local access back.

And can you imagine doing that for hundreds of hosts?


> And can you imagine doing that for hundreds of hosts?

This is why you:

- Always provide redundant, network-local LDAP servers so that LDAP doesn't go down.

- Wire up remotely accessible serial consoles that provide emergency-level local root access.

You can attach a modem to the serial console systems, or a hardline (which is what I did at a previous job) between your data center and offices.

We had a fixed 'role' account for the serial console systems, but it existed only for the purpose of emergency access, could only be accessed from specific local networks (we divided different classes of employees into different VLANs), and the knowledge of the password could be constrained to those that needed rare "server fell over" access.


The serial consoles won't work if the parent poster removes all local accounts and goes 'LDAP only'.

Unless I've misunderstood something about that. It happens.

We do the redundant Active Directory thing. It didn't help during the DR exercise when the AD guy did something foolish (don't remember what) and the AD / DNS host went down for a few hours.

Single host because the DR was limited in scope.

I was fine with my Solaris hosts - had local root access via serial and SSH. I was simply locked out of my Windows hosts, and could not reconfigure those services to work without AD.


> Unless I've misunderstood something about that. It happens.

You just maintain a local/serial-only root account for that eventuality.

[Edit] And make sure internet-facing production services don't rely on administrative LDAP.


divtxt is proposing that exactly those things be removed.


Yes, and that's stupid, and I'm explaining how we made accounts work fine for multiple users (in production, across 100+ servers).


Today, it could probably be done. I'll have to think about the implications.


This is why we have cfengine (or similar). Because yes, we do have NxM accounts (double and triple figures respectively), and we can all passwordless-ssh to any box we have to and have our dotfiles set up just the way we like them.

But no, we don't "waste our time" creating these accounts. We have tools to do this for us. Revolutionary, I know.


> I guarantee no one will waste their time creating 3 'users' on each server i.e. 3x4 = 12 'users' on that cluster.

You're just wrong. Plenty of people will do this. If one of those three people leaves the company, you can disable that account without having to change the password for the other two. If you insist everyone use sudo, you get logs of all the commands run via sudo, and that includes who ran it.

This is all really useful. You don't understand why it's useful, but lots of people do understand it.
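For anyone who hasn't looked: each sudo invocation lands in syslog with the invoking user and the exact command, along these lines (host and user names hypothetical):

    Oct  1 12:00:01 web4 sudo: jack : TTY=pts/0 ; PWD=/home/jack ; USER=root ; COMMAND=/usr/bin/apachectl restart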


To join the rest of the people: We create separate accounts on all of our production and staging servers for every developer.

We switched to this after I had to stay about 5 hours late one night to switch all of our passwords on all our servers because someone quit.

We don't, however, do it manually. We have tools set up to do it for us (Chef, in our case).


Wasting their time creating 3 users? Just sync /etc/passwd and authorized_keys between them. Zero time wasted.
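That can be as crude as the following (hostname hypothetical; in practice you'd sync /etc/shadow and /etc/group too, and watch out for box-specific system accounts):

    $ rsync -a /etc/passwd web2:/etc/passwd
    $ rsync -a ~/.ssh/authorized_keys web2:~/.ssh/authorized_keys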


Replying to multiple comments:

1) Automation: Yes to automation, but if I'm arguing that a task is needless, automating it doesn't change that.

2) Authentication: Yes to authorized_keys, auditing, LDAP, etc. I'm killing the local 'login' - not trying to kill security.


You've already received a few answers here, but one that seems to have been missed:

Use something other than ubuntu. Although there may be others out there, I'm unaware of any other distro that disables root. Complaining about disabling root is an ubuntu-specific complaint - it doesn't apply to linux in general, let alone unix.

Also, if you don't like using passwords, ssh-copy-id is your friend.

As for apache, I don't play with it much so I can't comment there. It certainly scares me :)


Nobody forces you to use a login, you can use the root account with a blank password on your server if you like, it will do exactly what you want...


You don't even have to do that. You can replace init with whatever you want and you'll never see a login prompt.


Aside from security, I think what this indicates (and maybe this is fundamental to computers?) is that Unix doesn't understand that sometimes users will lie. Some of the time when I say `rm -rf` (or run a process which contains that command somewhere) I want to delete the indicated directory, but some of the time I actually don't. I'm lying.

Unix knows who I am, and it knows what I want to do, but it has no way of knowing how much.

The way we get around this is by inventing an imaginary person called "root" who always actually wants what they say they want. On the other end, the imaginary person "nobody" almost never actually wants to do anything. This is obviously a half-solution, and it shouldn't be surprising that it causes weird workflow problems.


Here's an example showing how a human user does not require a local unix user on each server:

Jack starts Apache on one of the web servers:

    $ ssh jack@web4.example.com
    Password: secret123
    [_x_@web4] $ sudo /usr/bin/apachectl start # or similar
    [_x_@web4] $ logout
Now, there are 2 possible values for '_x_':

A) 'jack' - because there's a unix user 'jack' (what we have today)

B) 'sysadmin' - because there's no unix user 'jack' - only an entry in /etc/sshpasswd

B is the same as A as long as you update auditing to trace the Apache start to the jack/secret123 combo.

Sidenote: wow this thread blew up!


Actually, in one of Neal Stephenson's articles/booklets, he compares Unix to a very large drill (working the metaphor in some comic depth). When said drill is told to turn, it turns, consequences be damned. ("In The Beginning Was The Command Line")


You don't know the tools, and don't want to take the time to learn them, and yet you have a difficult time using them.

Shocking.


It's not really because you don't care, etc, it's just because you're not good enough. Some people are knowledgeable, others are doomed to be Ruby programmers forever, but that's life.


This is just raw pessimism, you could rant like this about anything.

I hate all cars, especially my own. I hate that heavy, dangerous, gas-guzzling Honda Civic with an over-sensitive brake pedal and enormous, completely pointless blind spots over both shoulders. I hate filling it up with gas, which is expensive, smelly, and bad for the environment. I hate the dishes that I have to wash every day after I use them. I hate my Aeron chair that I sit in all day long. I hate peeling grapefruit. I hate the sound of my central air conditioning fan powering up. I hate how I'm either sore from working out or depressed from not working out.

There's nothing wrong with a rant now and again but let's recognize it for what it is.

Life is pain, Highness. Anyone who says differently is selling something.


Software really is sort of a special case though. Most of the problems you mentioned are at least partially caused by the constraints and resources of our physical environment. However, the complexity of software is almost entirely generated by human ineptitude. The one exception might be complexity caused by necessary optimization for hardware limitations, which would in fact explain some of the problems cited in the blog post.


Most of the problems I mention aren't objective. Sometimes I like washing dishes, it's relaxing. I certainly like eating from dishes. Some people like cars and like driving. It certainly saves people time. Peeling grapefruit is very satisfying and makes the room smell nice. Central air is so much nicer than setting up a fan by my window and hoping that blowing the 75 degree air from outside will cool down the 85 degree ambient temperature inside.

Blaming "human ineptitude" is pessimistic. Sure, the fact that humans can't all manipulate computational machines directly and require layers of abstraction to effectively model problems can, technically, be called ineptitude, but really-- why be so down about it? That's the way things are and there's a lot of good that comes from software if you think about it for more than 30 seconds.


> However, the complexity of software is almost entirely generated by human ineptitude.

Human ineptitude is a part of our physical environment. We're just animals. Clever ones, but not perfect.


Unix (for example) was most certainly designed around the constraints of hardware at the time.

Furthermore, there is a physical limit to how much software you can write (and have it work). If you can get something that "mostly works" by building on top of yesterday's cruft, then you do it, since the alternative is starting over from scratch and not being able to finish.


Time is a constraint and mental effort is a resource of our physical environment.


> mental effort is a resource of our physical environment.

One of the more intriguing comments I've seen on HN. Care to elucidate?


I'm ignoring the argument here, but I don't think any modern(ish) car actually has any blind spots if you set up your mirrors correctly.


Chevy HHR. No matter how much I adjusted the mirrors on that beast there are spots along the sides, and directly behind, that you just aren't going to see.


You clearly haven't driven a Koenigsegg then.


You could have picked about a hundred "regular" cars to illustrate your point better and not one of the fastest and most expensive in the world.

For example, my brand new Hyundai Sonata has pretty shitty rear visibility due to its 'sleek' styling and therefore smallish rear window. I could cite many more.


And then there's all this: http://en.wikipedia.org/wiki/Driver_visibility

It's more than the mirrors, and unless you have transparent pillars on top of the car (giving up the structural integrity of the cabin) it's going to have blind spots.


Also, most side-view mirror blind spots are caused by improper mirror positioning. If you can see the side of your car, or see the same object in both the rear view and side view mirror, you've positioned them wrong.

You should actually have them a lot farther out such that visibility in your side mirror coincides with losing rear-view visibility. That position is a lot farther out than most people think and is tricky to do the first few times.


But isn't it neat how shit still works?

Never fails to amaze me what users will do with a software tool.

I've seen experienced devs and support staff run a C program written to parse some weird data against another data set in the vain hope that it would parse the new data set into something usable.

I've seen MBAs who could barely tell you what a variable is write visual basic macros in Excel to do hardcore data management.

Game devs who almost seemed to frickin' think in OpenGL.

It is a big ball of mud (turtles all the way down, eh?), but on a good day, I listen to a hacker talk about finally getting that little piece of code beat into submission and it's very satisfying just to see that gleam in their eye.


Shit still works in spite of the software stack, not because of it.

Just how much reliance we put on autoconf really makes me shudder.


This is hopelessly naive. The reason that the whole stack of a solution isn't in proportion to the problem it solves is that we have more than one problem, and the only way to scale our manpower to all these problems is to share some of the common bits in the solutions.

This sharing creates new abstraction boundaries, increases the number of concepts and moving parts, and there are lots of compromises involved in reusing a common part compared with crafting something small and simple specific to the task at hand. But if you didn't do this, you'd have lots of duplication of similar, but not quite identical work, like a pre-industrial society; a massively inefficient use of human labour.


You're absolutely right of course, but I don't think this is hopelessly naive, especially coming from such a respected developer. I see it more as a nice, bite-sized rant that lets off some steam, makes us laugh at ourselves, and has a nice bite of truth to it. I see it as cathartic rather than ignorant.


I still think it's immature. It takes time before you accept these kinds of things, the larger part of the world outside your control, that you only become properly aware of as you get older; fighting against it is like fighting against the tide. Fixing the systemic inefficiencies can only be done incrementally, but this rant literally suggests flushing the whole thing down the toilet at some point, and that's just childish.

You can't pause the world while you rebuild everything; it would take far too long to get to something better than what you're trying to replace. You can only repair one or two things at a time, and hopefully leave the world better for it; but the mindset espoused in the rant is more likely to result in a half-baked start on something new, but abandoned when the scope of the whole problem is fully perceived.


What his rant literally suggested was to "flush boost and glib and autoconf down the toilet and never think of them again" when "the accumulated complexity of our existing systems is greater than the complexity of creating a new one." It is hardly childish to imagine that such a scenario might occur, and you have not argued against his thesis as he stated it.


I directly disagree that a complex system will be replaced by creating a new complex system to replace it. I do not think that will happen, because I don't think the world works that way. What happens is something slightly simpler is created to solve a simpler problem, and gradually accretes more and more functionality until it gradually replaces something, in a kind of process of innovator's dilemma; or alternatively (and IMO more likely), one or two pieces in the complex whole are individually replaced by (perhaps) one thing which is simpler. But there's never a moment of high drama where we suddenly realize what a pile of crap we have and switch forthwith.

Just about everybody knows that all our software is imperfect crap on top of imperfect crap, from top to bottom. Everybody, when met with a new codebase above a certain size, thinks they could do better if they started over and did it "properly this time". Everybody can look at a simple thing like a submit form in a web browser, and sigh at the inefficiencies in the whole stack of getting what they type at the keyboard onto the wire in TCP frames, the massive amount of work and edifices of enormous complexity putting together the tooling and build systems and source control and global coordination of teams and the whole lot of it, soup to nuts, into a working system to do the most trivial of work.

But this is not a new or interesting realization by any means. It's not hard to point almost anywhere in the system and think up better ways of doing it. Pointing it out without some prescription for fixing it is idle; and suggesting that it will be fixed by wholesale replacement by another complex system is, IMO, fantasy.


"While some see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world are the ones who do." http://en.wikipedia.org/wiki/Think_Different


I don't know, there may come a time. If we were to replace the current system, then yes, that would be impossible/a waste of time. But what if an architecture came about built on AI? Maybe quantum computing? DNA-based computers? Eventually there will be new hardware platforms that force the very change you are dismissing, even if it is 25-50 years from now, which is a blink of an eye in the grand scheme of things.


I'm a bit confused. The architecture is just abstracted away. Why does a programmer care if it's optical or DNA or whatever? See, for a less extreme example, the tools that people use to develop for a single x86 vs CUDA vs massive clusters vs huge FPGA clusters vs ARM vs etc etc. I honestly think that revolutionary architecture will just lead to some new libraries and dev tools which get kludged onto existing dev systems. But I welcome more information because I know I could be horribly wrong.


I think you're on the right track, software tries to layer itself as much as possible. New architectures and capabilities will only make the concepts and tools exposed by the glue between the layers change.

Unless the new thing isn't Turing-complete and can't be implemented with a Turing-complete system, it will be abstracted away at first just so we have an environment to start building with, and can start using it without reinventing every single wheel we have.


What does Turing-completeness have to do with it?


Turing-completeness means that the path of least resistance is to create a compatibility layer.

Without radically changing the paradigms on such a fundamental level a start-over just wouldn't happen.


So the solution is to simplify the boundaries between the parts. Perhaps, if the behavior of the parts were fully encapsulated, then they would be crazy easy to use. Just tell the part to "do whatever it is you do" and leave it at that. No compromises.

I don't think Ryan Dahl is at all naive for wanting something like this. I also think something like this is totally possible.


Hear hear! As a software developer, my trade is a ghetto awash with all manner of amateur-hour charlatans and language silos that are tantamount to pistol-whipped lock-in (I'm looking at you node.js). If you take a step back, the entire ecosystem of 'software development' is a chattering tower of Babel, all sound and fury, signifying nothing.

Programming languages, their frameworks, their libraries, their petty concerns are a mere vanity folly, riddled with re-invention, abstraction arcana, and deus-ex-machina hoopla. We have lost our way, straying so far from the path of the UNIX philosophy such that I must now 'whole-stack' an application instead of using the pipe character. A pox on the whole damned lot of it!

Some days I just despair of all the time I've wasted bustling and jostling, crushed by the sweaty masses in the ghetto. But if I'm honest with myself, I must confess I love it too. I love my programming languages, my libraries, the eight different ways I know to full-text search, to regex, to parse, to lock, to async. I love the smell and heat of the coal-face, the futility of it all. Stockholm Syndrome indeed!


You had me until "straying so far from the path of the UNIX philosophy". UNIX is a huge part of the problem IMO. Worse is better. Systems like Linux are the enemy of progress because while they suck horribly, they work much better than some alternatives (not to mention any naMeS) and at least as good as others so why spend time actually doing things right?

There is so much more that could be done with operating systems in any direction you want to go. I'm thankful that doing things on my iPad doesn't involve messing with command lines. But for when I want hackability I'd rather have what a Lisp machine could have become than a silly way to do functional programming in shell.


Maybe the real problem isn't that UNIX/Linux suck, but rather that these are extremely hard problems and there are no easy solutions.


But UNIX/Linux does suck and few are going to work on these extremely hard problems because UNIX/Linux are good enough.


I'd be interested in Dahl's (or your) opinion of Alan Kay's STEPS project in this context.

"For example, essentially all of the standard personal computing graphics can be created from scratch in the Nile language in a little more than 300 lines of code. Nile itself can be made in little over 100 lines of code in the OMeta metalanguage, and optimized to run acceptably in real-time (also in OMeta) in another 700 lines. OMeta can be made in itself and optimized in about 100 lines of code."

http://www.vpri.org/pdf/tr2010004_steps10.pdf

and, btw: https://github.com/tristanls/ometa-js-node


I am also interested in what others think of this project; I don't understand why it isn't more popular.


http://vpri.org/videos/yahiko_mem_video.html (21:40 to 24:15)

Most people won't change their mind about anything, unless everyone else already did. Therefore, (Kay concludes at 24:00), truly new ideas take at least 30 years to become popular.

STEPS is too young. At this pace, wait for at least 20 years.


In order to make some project popular, you need to show it's powerful in doing REAL work. Ruby on Rails did it for Ruby. Paul Graham's success and writing helped Lisp.

Is there some similar proof for OMeta?


This bit isn't the most impressive. Self-compiling languages are worthless until they implement something else. And it happens that IS (OMeta + JavaScript + Nile) does implement more than itself. On top of it, they implemented:

- TCP-IP in 200 lines. Current C implementation use 10Kloc (50 times more).

- Most of Cairo's functionality in 500 lines. And it's fast enough. Cairo, on the other hand, weighs about 40Kloc (again, about 80 times more code).

And that's for functionality they couldn't scrap altogether or merge with similar capabilities. For instance, you don't want to send emails, or publish a web page, or print a PDF, or, goodness forbid, a Word document. You just want to handle a fucking document. Send it, publish it, whatever; this is all glorified text (you do need the glorification, though).

The bottom line is, Alan Kay and his team rule.


I think his rant is both brilliant and incredibly naive and confused. Let me explain.

He's right about the fight against unnecessary complexity. He's right that ultimately the end user's experience is king. But he's objecting to a lot of the complexity that lies behind that UX facade. Because that's exactly what that UX is: a facade. It's an abstraction. And one that sometimes leaks. The iPhone is loved because of its UX. But inside, behind the screen, it's not a box of mostly empty air and perhaps a little magical fairy who blows kisses and VOILA! the UX is delivered. It doesn't work like that. There are moving parts, both physical and virtual, a lot of them, that must be complex because they have real world constraints they MUST satisfy which your own mental model or messy subconscious human desires don't have to satisfy.

The little girl wants a pony and SHE WANTS IT RIGHT NOW, DADDY! But her father lives closer to reality. He can't just wave a magic wand and give her a pony. It takes time. It takes money. You have to find a pony. Get it. Where do you keep it? Who feeds it? Shelters it? Can we afford it? Or are we just going to let it starve after the little girl gets bored playing with it? These are all the niggling little details that lie around the edges and behind the scenes when trying to satisfy this little girl's desire for a pony immediately.

It is good to satisfy and deliver a desired experience. It is dumb and naive to think it only takes the wave of a magic wand or the press of a button. Yes, we can provide a button you can press to make that pony appear. We can. That's just straightforward engineering and entrepreneurship. But there's going to be a lot of complexity and ugly moving parts, some with sharp edges, or unpleasant chemical properties, or esoteric technical jargon, under the hood, to make that button press deliver.


It would be ironic if the "just solve the fucking problem, damn the details" attitude espoused in this post is the reason everything is so fucking complicated.

(I honestly am not trying to imply that that is the case; I'm just musing.)


I've got the impression that the attitude is more like "carefully mind the details, so your users don't have to and they can just solve their fucking problem". If you program tools for other programmers, your users are also programmers, but don't assume they have your same background and are willing to mess with the same problems as you.

Designing and programming a tool that abstracts the details away from users is not more difficult, but it's very tedious. Just giving out meaningful and accurate error messages has a huge effect.


While I sympathize with the general frustration, this sort of rant gets us nowhere. It's sad to see such a brilliant mind lost in rage.

Systems programming has always been the code that most people won't tackle because the problems are ugly (thus the label systems programming). I really dislike autotools, but I am not really up to resolving that problem, so I'll leave it to those who are. Pretty simple conclusion. When people with the guts to go in and replace these tools come around, I try to support them, but bashing others doesn't magically make that happen.

As for the claim that people who build on top of these systems are making problems worse: you could say the same thing about the users of that software. There should be no hate for the act of construction. Destructive negativity is just a waste of time unless you want to lead people somewhere to construct again, and this post doesn't do much but hate. I'd favor suggestion over damnation. Don't hate people for building; encourage them to build something better!


Well, as Bakunin said, the passion for destruction is also a creative passion.

This isn't really rage and hate. You're taking the words too literally. It's the frustration of being able to feel clearly that there ought to be a simpler way, that there is a simpler way, while at the same time being caught in a sticky spider web and unable to do much about it.

You know what I bet is driving this? The realization that Node.js itself has turned out way too complicated. It ought to be a nice library to provide non-blocking I/O and networking APIs to V8 apps. Now it's becoming Rails at one end and an operating system at the other.

(I don't mean to pick on Node. It's valuable and I use it. My point is that we are all the sorcerer's apprentice, and runaway complexity will always be the default unless ruthlessly counteracted. It wasn't counteracted in Node's case, and since Ryan is a true hacker I imagine that he has the taste to know it. Indeed he says as much in the OP.)


It's not Ryan's passion I'm critical about. It's the fact that he's blasting anyone who doesn't fit in his view of right as "you don't understand how fucked the whole thing is." That's quite an ego to assume anyone who doesn't agree must be ignorant.


Increasingly, the key is to maintain orthogonality towards your problem solving (like an eagle) within the decaying confines of a semi-bloated (often mostly educational, in terms of what not to do) ecosystem.

Which means rewriting crufty pieces of your stack when certain thresholds occur. 'There will come a point where the accumulated complexity of our existing systems is greater than the complexity of creating a new one' is something that happens in motion, iteratively, and which you do when you have time, at all levels of the evolution that we call development.

Anyway, that's my two cents. Nice others are on the same wavelength, I think.


People don't think of software ecosystems as quasi-organic things.

They should be regarded the same way as a long relationship with a cranky friend, maintaining an aging specimen tree, or a historic house.

Do you cut the branch off, or just prune it back a little?


Ever go through somebody's code, see some weird construct, go "this person is an idiot!," rewrite it, and find some edge case bug that the original code was written to handle? The original author had many of the same ambitions as you, and you relearned all the same lessons she did -- the hard way.

Recognizing and curtailing this impulse leads you toward enlightenment.


If the original programmer fixed some edge case and didn't bother to flag this in the code by means of comments, then they are worse than an idiot: they are incompetent. As would be the second programmer if they neglected any such comments which were there.


>If the original programmer fixed some edge case and didn't bother to flag this in the code by means of comments, then they are worse than an idiot: they are incompetent

If every reason for every fix based on an unanticipated logic path was commented, there would be 10x more comments than code.


'On the chosen day, the young and inexperienced programmer realizes that what he has constructed is simply a different collection of rubbish, mud and offal than that used by the previous tower.

7. Codethulu looks on, and says: "Now you have become one of us."'

http://codethulu.org


I don't get it. If he really hates the situation so much, why did he choose a language that makes it notoriously difficult to write quality software, and a concurrency style that is notoriously difficult to reason about?


You assume that he likes what he created.

In this rant he didn't say Node.js was the solution, or better than any of these crappy abstractions.

Don't assume that. He might not say he doesn't like node.js, but it doesn't mean he is happy with it.


I don't assume that he likes what he created. My point is: if he thinks things are this way, why didn't he try to create a framework that did things simpler? The research on concurrency has plenty of options that are easier to read and reason about than callback-based code.


Good point, he never even mentions node.js. ;)

But I do feel that node.js is Ryan's attempt to enlighten me and hopefully others. He doesn't try to hide everything like ports and the underlying C code. It's all there, and best of all in a language that is familiar (at least for web developers).


Interestingly, part of the reason Node.js used JS was that JS didn't have a ready-made standard library so it made a clean break from a lot of the styles of the past. Node was an opportunity to force a community to rethink a lot of things.


In what way did it accomplish this? The style Node has chosen for modeling concurrency is decades old. Twisted has been using it since the late 90's. So JS doesn't have a standard library? Who cares? You still have to write one for your event-loop, which is what you had to do in Twisted. I don't see any rethinking going on, I see steps backwards.


>So JS doesn't have a standard library? Who cares?

If you want people to get the performance benefits from using non-blocking libraries, you care.


Why? You have to write a new standard library no matter what language you choose. JS not having a stdlib doesn't save you work.


If you have established blocking standard libraries people are less likely to write/contribute to non-blocking ones.


Your claim that JavaScript is "notoriously difficult to write quality software in" is unsubstantiated. Which language in your opinion makes it easy to write quality software in?


People only like JavaScript because they know it. Yes, it's more productive than Java because it's (barely) functional, but I personally believe the renewed love for JavaScript is mostly Stockholm Syndrome.


I wonder if JavaScript is mainly loved for its frameworks and libraries? Without jQuery, node.js, etc., how many people would be singing its praise?


I believe since Java 6 (possibly 5), the JDK has come bundled with a minimal installation of Rhino which runs JS on the JVM. How many people actively use that? It's not hard to get going - 5 or fewer lines of code to start running a JS file - and it will run everywhere with JRE 6+.

But there's no jQuery, etc. It's a somewhat nicer way to work with Java since you aren't forced into I-don't-care-about-it-exception-catching hell and Map<Map<Map<...>>> madness, but compared to Jython or Clojure it doesn't match up. You can get a headless jQuery working with Rhino, though it's not as simple as it should be.
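There's also the jrunscript tool bundled with the JDK, which fronts that same Rhino engine with no embedding code at all, if I'm remembering its flags correctly:

    $ jrunscript -f hello.js    # evaluate a JS file on the JVM's built-in Rhino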


Rhino is really slow when compared to V8, like Ruby slow.


Rhino is slow, but it's freaking awesome if you need to write an algorithm that can run in the browser and on the server. Much much much nicer than having to maintain two implementations in two different languages.


Part of JS's popularity is definitely politics... the fact it's in the browser gives it a huge shot in the arm.

It's a decent language, though. Here's a comparison of JS to Ruby for some of the stuff people generally love Ruby for. JS comes out looking pretty decent.

http://fingernailsinoatmeal.com/post/292301859/metaprogrammi...


To quote Douglas Crockford on JavaScript, "Lisp in C's Clothing"


Before I learned a Lisp, I agreed with that statement. Now, I think nothing could be further from the truth.


Indeed. Lisp is a simple language that can be completely defined in half a page. A naive implementation takes less than a thousand lines of code. The language is homoiconic, i.e., there's no difference between programs and data -- everything is a list. This enables programmers to perform complex transformations on code using macros. Lisp compilers such as SBCL generate code that is almost as fast as C++. Oh, and the original Lisp as described by McCarthy is strictly rooted in mathematics. If you limit yourself to writing only a subset of Common Lisp/Scheme -- one that emphasizes immutability and recursive functions on lists of symbols -- it is possible to use formal methods to prove the correctness of your code.

None of that is true for JS. I don't think Crockford thought his remark through.


As a lisp (Scheme and Clojure) user who nonetheless takes issue with this statement:

1. Modern (JS) runtimes have JITs that can generate code almost as fast as C++.

2. You can write purely functional code in JS.

3. JS arguably has fewer features than R5RS Scheme, which itself is hard to fit on half a page.


What do you mean by point 2? Unless you don't plan on doing any I/O, you can't write purely functional code in JS. Not that you can in Lisp either...


I meant that there is nothing about JS that prohibits writing pure functions and reaping all the associated benefits (formal reasoning, etc.)


If it ain't got macros, then it ain't a Lisp. Javascript is definitely no Lisp, but I really like that it allows function composition, which is somehow reminiscent of a Lisp. f(g(h(x))); instead of (f (g (h x)))


Um, pretty much every C-derived language supports function composition, and uses that syntax for it. Am I missing something?


You're right. I meant to say passing first-class functions as arguments, rather than pointers to functions as you would in C or an object method as you would in Java.


I agree. I think this "JS is Lisp" nonsense is a pretty little platitude that many developers use simply because they don't know any better.


Crockford goes on to clarify that Javascript is really a combination of Java syntax, Self inheritance, and Scheme-like lexical scoping.


By the definition given in the article (just solve the user's problem, don't build a tower of abstractions): Forth.


JS is known for being a language with a lot of "WTFs" in it. Yes, you can write bad code in any language and you can even write great code in JS. But to quote Fogus: "My problem with Javascript has always been that expertise is defined more in terms of understanding its faults rather than its features"


Rich Hickey really nailed the definitions of "complex" and "easy" and "simple" so well in his Strange Loop talk this year. Too bad there's only notes available right now: http://blog.markwshead.com/1069/simple-made-easy-rich-hickey...


Fantastic. Thanks for that. Any idea if the whole talk is going to make it online soon? I'd be very interested in watching it.


They shot videos of all of the talks. They took their sweet time getting them up last year though.


This sounds great indeed. Would love to watch the video of this talk.


It's nice to hear someone well-respected say this, as I've been saying this for years and yet I get frowns from senior managers and programmers.

I don't like magic in programming, yet nowadays there seems to be a movement (especially in Ruby, with the [over]use of method_missing) that encourages it.


I love magic!

Every level of abstraction above binary code, from assembly, to C, to Ruby, to Rails DSL's--each works by creating magic incantations that let you run larger functionality with a new shorter series of magic words.

Are you really against magic, or is it that you are against black magic (which I would classify as leaky abstractions)?

http://www.joelonsoftware.com/articles/LeakyAbstractions.htm...


Sometimes the new magic incantation is longer than the old incantations it was based on, though. And slower. And harder to understand. And buggier. And doesn't expose important functionality, so you invent roundabout ways to access it. And when you put several abstractions on top of each other, you will eventually get all these problems combined.

For example, try to draw a single black pixel on the screen... using JavaScript, or better yet, some language that compiles to JavaScript. How long would that take you? How many lines of code? How fast does it run? How much memory is used by all subsystems combined? How many system calls are involved? In assembly language that would be one instruction. And don't tell me that drawing single pixels is unimportant. I can show you any number of hackers who would create extremely cool webapps if drawing a pixel took one assembly instruction.


I don't know what it's like for other Javascript-based environments, but in mine it looks like this:

    Mars.load("olympusmons.js");

    var app = new Mars.UIApplication().init(1280, 720);
    app.setBackgroundColour(0xFFFFFFFF);
    app.startRenderThread();

    var scene = new Mars.UIScene().init(app);
    app.addScene(scene);

    var surface = new Mars.UISurface().init(app);
    surface.setSize(300, 200);
    surface.setColour(0xFFFFFFFF);

    var texture = surface.getTexture();
    texture.setBackgroundColour(0xFF0000FF);
    texture.setPixel(150, 100, 0x000000FF);


Do not confuse the language with the environment in which it runs. The difficulty with pixel-poking isn't with JavaScript, it's with the environment. Give it another environment that allows the language direct access to the window (rather than to a DOM node nestled deep within the bowels of a parse tree which then needs to be rendered by a completely separate engine) and the difficulty would vanish. JavaScript may most typically run in a browser context, but the browser context is not JavaScript.
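For example, given an environment that hands JS a raw drawing surface -- say, a page with a <canvas> element (this assumes an element with id "c", purely for illustration) -- the pixel takes two lines:

    var ctx = document.getElementById("c").getContext("2d");
    ctx.fillStyle = "#000";        // black
    ctx.fillRect(150, 100, 1, 1);  // one 1x1 "pixel" at (150, 100)

The verbosity lives in the surrounding machinery, not in the language.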


On the other hand, there are things that you want to make difficult, like stealing credit card information.


It's hard to disagree with you when you use that definition of 'magic'. This is an example of my definition...

Yesterday I was looking at a Chartbeat gem that accesses the Chartbeat REST API [1]. The entire class is 40 lines of code, however it's coded so weirdly that you'd have to read the source in order to use it. Every API call was a method_missing call, so instead of doing (in irb)

    puts Chartbeat.new(:apikey => 'a', :host => 'b').public_methods
you'd have to do

    puts Chartbeat::METHODS
    puts Chartbeat::DASHAPI_METHODS
But I only know that because I had to look at the source code. In addition, there's no way to specify custom exceptions to the user; you have to rely on the rest_client gem's exceptions.

The code does look magical, and kudos to the developer who wrote it for the ingenious use of method_missing, but IMHO it's a bit too magical for my tastes. I like to look at a library's documentation and instantly know what methods I'm allowed to call and what exceptions/results I'm going to get back.

[1] https://github.com/ashaw/chartbeat/blob/master/lib/chartbeat...
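For anyone who hasn't seen this style of magic, here's a rough JS analog of method_missing using an ES6 Proxy. Everything here (the client shape, the httpGet stub, the URL) is hypothetical, just to show the trick:

    // A made-up stand-in for a real HTTP request.
    function httpGet(url, params, key) { return url; }

    // Every property access is trapped and turned into an API call,
    // so no method actually exists until the moment you invoke it --
    // which is exactly why introspection can't list the methods.
    function makeClient(apikey, host) {
      return new Proxy({}, {
        get: function (target, name) {
          return function (params) {
            return httpGet("http://" + host + "/" + name, params, apikey);
          };
        }
      });
    }

    var cb = makeClient("a", "b");
    cb.anything({ foo: 1 }); // "works" whether or not the endpoint exists

Forty lines of cleverness saved, an afternoon of source-diving spent.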


Honestly, this is why I cringe whenever I hear code described as "clever." I bring up "clever" here because "magic" seems to be used in this discussion in a manner synonymous with "clever." [1] Developers tend to be intelligent people who like to stretch their intelligence, but in terms of code robustness, the boring, normal solution is almost always a better choice than the clever approach. Clever, to me, has almost become synonymous with unusable, fragile, and over-complicated.

[1] Edit: Added explanation since, on a second read, it seems like I'm digressing from the topic.


"Technology's greatest contribution is to permit people to be incompetent at a larger and larger range of things. Only by embracing such incompetence is the human race able to progress." http://www.theodoregray.com/BrainRot/


Something something Plan 9 something something.


Something something stewing in their own juices for too long something something.


The people who created Plan 9 (and Unix) these days are working on something that has a better chance of blowing a hole through the huge mess that is modern software: Go ( http://golang.org )

I might be naive, but Go is the only thing that has given me some hope for the future of software development in recent times. It means there is a chance that maybe some day I will be able to write software in an environment, and with tools, that are not byzantine, hideously insane piles of gratuitously complex crud.

Hell, with Go you completely bypass even libc (but unlike Plan 9, you can still take advantage of the hardware and software support of existing operating systems/kernels, which sadly are an unavoidable mess -- one of the things that made Plan 9 impossible to adopt in practice).


Don't worry, Go will accrue its own layers of gratuitously complex crud with time; indeed, it already is.


If anything Go has become simpler and has removed crud since it was introduced.

And that is precisely what gives me hope, Go is going against the trend of almost every other language.


This is what happens when developers want to do sysadmin work. Come on guys, we sysadmins spent as much time learning our job as you spent learning to code. If we tried to code, we would be lost and pissed off. That's why we don't do it.

The OS is not wrong; what is wrong is you imagining that every system should be as simple as "right click / start". If you want that, take the Heroku/<your PaaS here> route and you'll be happy. But the day you have 5000 customers connecting at the same second and your environment collapses because you don't have the flexibility to tune it, don't come crying.


Seriously. This whole thread smells of butthurt software developers who just found out that Linux is hard because it doesn't have a Mac GUI on top of it.


I am struggling to think of a single piece of software that I interact with in my day-to-day life that brings me pleasure. I suppose Emacs comes closest, but it's a hideous pile of hacks and YHWH help you if you want to get into the internals to start paying back the massive amount of technical debt.

tsort. There we go. I don't hate tsort. pbcopy and pbpaste.


Grep. Grep is about the single best fucking thing ever written by the hands of man. Shakespeare can suck it, I'm telling you, grep is IT.

Edited to add: Git. Git is also a thing of beauty. Who knew revision control could be made to not suck? Sure, SVN was a welcome relief from the unrelenting stone faced hell that was CVS, but that's damning by faint praise.


Git is too complex to learn. The internal stuff is beautiful (if you take the time to grasp it). But the UI is just horrible, in my opinion.

I love git, and use it for all of my projects. But my love for git might also be because I have spent so much time on learning it.


Grep? Grep is the redheaded bastard sibling of the actual best tool ever written, which is sed.


I love sed too, but you should mention QED (1965!) here, which is the actual proto-mother of almost all text-related goodies. Let me quote a nice Wikipedia paragraph, which covers it.

http://en.wikipedia.org/wiki/Ed_(text_editor)

ed went on to influence ex, which in turn spawned vi. The non-interactive Unix command grep was inspired by a common special use of qed and later ed, where the command g/re/p means globally search for the regular expression re and print the lines containing it. The Unix stream editor, sed, implemented many of the scripting features of qed that were not supported by ed on Unix; sed, in turn, influenced the design of the programming language AWK, which in turn inspired aspects of Perl.

And I love AWK too.


> Grep. Grep is about the single best fucking thing ever written by the hands of man.

I like and use grep. But, as a programmer, I like ack[1] much better than grep.

[1] http://betterthangrep.com/


I prefer bzr to git.


Git is the quintessential example of requiring the users to get inside the author's head if they want to avoid disaster, and thus falls into the "steaming pile of crap" category. The things "git merge" will do to your tree are not defined by what a user might reasonably want or expect but by what's easy and convenient for the git authors, and there is simply no way to understand "git reset" without thinking about the internals of how git tracks changes. Simple commands do bizarre things that are almost never needed or desired, while simple and common tasks require non-obvious (and poorly explained) options. Git is a powerful and useful tool, but it is also the poster child for what is wrong with modern software development.


Yep, I agree. I've used git for 5 months, and I only know for sure what three commands do: git commit -a, git push, and git add.

Ask me what git merge does, or how to branch, or even how to delete a file in a repository (w/o deleting it from your local), or how to revert back to another version, or how to even check out, and I'll say I dunno. I read the documentation, but am still confused. I sometimes feel like I'm dumb because I feel everyone loves git and everyone gets it. Glad to hear that I'm not alone.
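For the record, and to illustrate just how non-obvious they are, the incantations for a couple of those (file names and the commit id are hypothetical):

    git checkout -b topic            # create a branch and switch to it
    git rm --cached notes.txt        # delete a file from the repo, keep it on disk
    git checkout abc1234 -- app.js   # restore one file as it was in commit abc1234

All real commands, none of which you could plausibly guess from the command names.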


The iPhone is a miracle the size of a deck of cards. A lot of the standard apps are a joy to use. (The annoyances arise more from the business side than the nature of the software or the device itself.)

I'm not an iPad user but it makes a lot of people pretty happy.

Chrome is usually a pleasure to use. Firefox used to be that way, and they are starting to regain my trust again. I feel good about using Firebug as well.

Skype has some irritations, but for the most part it's still marvelous to open up a video chat to fucking Zanzibar whenever I feel like it.

vim is not always easy, but wow is it rock solid.

Upgrading Debian works very well for me at least, thanks to the miracle of apt-get. Debian itself, well, it's not winning usability awards, but still....


A big problem for me is that every time I drop into the shell, I'm in the shell. And I hate the shell like I hate nothing short of, I don't know, Nazi zombie robots. I've been using Unix or a derivative for more than twenty years and I have never lost my loathing for the shell.

And don't even get me started about C the language. It makes me want to go find Ritchie and punch him in the junk.

Why is rsync not a binary linked to a useful librsync? Why STILL TO THIS DAY a grep/egrep difference? And why did Apple integrate Interface Builder -- their best piece of software, IMO -- into Xcode?

Let's not even get into Google.


I am struggling to think of a single piece of software that I interact with in my day-to-day life that brings me pleasure

Games, a game delivery system like Steam, or perhaps VLC for playing my favorite music -- but even then, it's the music that makes me happy, not the program playing it...

On the other hand, there is a lot of software that I actually hate.


I special case games because I play them on consoles and therefore have compartmentalized them as appliance functions, and not the massively complex pieces of software that they are.

The list of software that I actively despise, however, is finite, but unbounded.


I assume you mean in terms of the interaction with the software itself (rather than say, pleasure from social media being the conversations rather than the software) and there's one type of software that's specifically about enjoying the interaction: games. I guess you're not a gamer?


Your web browser?


On one hand, I agree with him. The software ecosystems we work in have a whole lot of needless and incidental complexity. I could go on and on about the insanely and overly complicated things that developers -- especially ones like Ryan Dahl -- have to deal with all the time.

On the other hand, it's arrogant for one to think that he or she could do it that much better than the next guy. Writing efficient, maintainable, and "simple" software requires adding layers of indirection and complexity. You have to use your best judgment to ask whether the new layer you're adding will make things ultimately cleaner and simpler for future generations of programmers, or will hang like a millstone around their necks for years to come.

Let's try a little thought experiment: go back a few decades to the early 80s. Propose to build node.js as a tool to make it much easier for developers to write real-time network applications. You'll need to design a prototype-based dynamic language, itself an extremely difficult (and dare I say complicated) task. The implementation will need a garbage-collector, a delicate, complicated, and cumbersome piece of code to write. To make it acceptably fast, you'll need to write a JIT, which traces and profiles running code, then hot-swaps out JITted routines without missing a beat. You'll need to write a library which abstracts away event-based IO, like the "libev" node.js uses. That will require kernel support.
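(And the payoff for all that machinery is that, in 2011, a complete event-driven server is five lines -- more or less the canonical hello-world from the node docs:

    var http = require('http');
    http.createServer(function (req, res) {
      res.writeHead(200, {'Content-Type': 'text/plain'});
      res.end('Hello World\n');
    }).listen(8124);

Every layer of that 80s-impossible stack is load-bearing underneath it.)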

Frankly, even forgetting about relative CPU power at the time, I think you'd be laughed out of the room for proposing this. All of these things, for production systems, were extremely speculative, "complicated" things at the time they were introduced. People can't predict the future, and they obviously have difficulty predicting what tools will become useful and simple, and which will become crufty tarpits of painful dependencies and incidental complexity. No one in 1988 could say "a dynamic, prototype-based, garbage-collected language paired with a simple event model will allow developers to create simple real-time network applications easily in 2011". Many of them probably had high hopes that C++ templates would deliver on the same vision by then. But, instead, we have Boost.

Further, it's extremely arrogant of Dahl to create a dichotomy between those who "just don't yet understand how utterly fucked the whole thing is" and those, like him, with the self-proclaimed power of clear vision to see what will help us tame and conquer this complexity. Who knows, maybe in 15 years we'll be saddled with tons of crufty, callback-soup, unreliable node.js applications we'll all have to maintain. I don't think James Gosling envisioned the mess that "enterprise Java" became when he designed the initial simple language. Most developers do many of the things he cites, like adding "unnecessary" hierarchies to their project, because they believe it will help them in conquering complexity, and leave something simple and powerful for others to use down the line.


Erlang came out in 1986, and was used in production systems shortly thereafter. The world really is just catching up to the state of the art of the early 80s.

Ryan is right. Most of the software we use is crap. That's because Worse is Better.


Ryan doesn't claim he can tame it. The post comes off as self-deprecating and not arrogant to me:

"Node is fucked too. I am also one of these people adding needless complexity. ... The entire system is broken - all the languages and all the operating systems. It will all need to be replaced."

I am not a node.js user.


In fairness, the internals of Node get complicated due to cross-platform support. The fact that every OS does it differently is both a blessing (because we can learn from the mistakes of others) and a curse.


FYI, in 1988, Self (a dynamic, prototype-based, garbage-collected language that had a JIT) existed. I'm not sure how it handled concurrency, though.


Not well, unfortunately. Neither does it handle errors well. It doesn't have proper closures, either. I love it, but readily admit it's a giant research demo.


Lisp came out in 1958. Smalltalk came out in 1980.

Oh, hell, just watch the Mother of all demos (1968): http://www.youtube.com/watch?v=JfIgzSoTMOs

It's not arrogant of him to think like this. It's more like Steve Jobs, circa 2006 thinking phones sucked and deciding to do something about it. Or maybe it's like Steve Ballmer thinking he could take over the phone market. I think it's too early to say for sure, but the early signs are promising.


Modern Lisp didn't come out until much later, but still, Common Lisp was out in 1984, and that's still very early, especially considering that it was a response to the already widespread use of languages with Lisp-like features.


Arrogance? Come on, it's a rant. Cut him some slack.

Other than that, I agree with pretty much everything you said. It's easy to forget how reliant we are on things like industrial-strength GC, multithreading, and JIT, and how young those things really are.


I don't see where he says he knows what the solution is.


To some degree, I can agree with Ryan here that a lot of software these days is unnecessarily complex. However, I also think that his view is biased because he works on a project that is responsible for a great deal of abstraction.

The average Javascript developer using node.js DOES NOT have to "deal with DBus and /usr/lib and Boost and ioctls and SMF and signals and volatile variables and prototypal inheritance and _C99_FEATURES_ and dpkg and autoconf", because Ryan and other node.js devs already have thought about it for them, and introduced a helpful and practical layer of abstraction on top of all this complexity.

As a result of having to think about it all day, every day, however, it's understandable that Ryan would despise this kind of stuff. On the other hand, as a web developer who uses the result of his hard work, I am not affected by it at all, so the complexity of my work is substantially reduced.


The complexity of your work (as, I assume, a web developer) is substantially reduced to only HTML, CSS (and its various implementations in various browsers), Javascript, the modeling language you use, the framework you use, the database, SQL, ... Should I go on? It's all shit. Not just what he deals with.


The problem is that the simple and quick solution has to be hacked around when the problem set changes and that's how we get all these hacks.


A thousand times no. It's what we do when the problem set changes that causes trouble. We only ever respond one way: by agglutinating more code with the old. Compound this a few times and you have irreversible complexity. It's what would happen in a neighborhood if you called garbage "construction material" and only ever piled it up all around you.

The solution is to delete nearly as much code as we write. Put differently, the solution is small programs, ruthlessly pursued. The reason we don't do this is that it's totally absent from (nearly all) software culture -- absent as in blank stares and why-you-talk-crazy-talk if you bring it up -- and by far the number one determinant of what people do is what other people do.

There are a few points of light. Chuck Moore and Arthur Whitney come to mind. From everything I've heard, their programs are small enough not to have these problems. And in case anyone is wondering "If this is so much better how come we don't all work that way?" - the answer to that conundrum hit me the other day: sample size. Statistically speaking, it's never been tried.


Thanks for the Arthur Whitney reference, I had not previously been aware of his work although I fiddled about with APL to teach maths years ago and have always liked the 'executable notation' idea.

http://queue.acm.org/detail.cfm?id=1531242

    BC Is that advice you would give to practitioners: to throw out more?
    AW Yes, but in business it's hard to do that.
    BC Especially when it's working!
    AW But I love throwing it all out. 
It strikes me (as a non-programmer) that Moore and Whitney are working in well defined problem spaces. 'The Art of the Soluble' by Peter Medawar springs to mind (about scientific method).


I'm glad you looked that up. More people need to know about Whitney. I wish he would open-source his work so we could learn from it.

But why do you say they are working in well-defined problem spaces? No more well-defined than most, I would have thought. Certainly Moore was a pioneer of iterative development and evolving a program (and notation) in the direction of the problem. That's why he invented Forth in the first place.

Edit: Oh, having looked up the Medawar reference I realize you probably mean "well-defined problem space" in the way a mathematician would: a problem space narrow enough to be effectively studied but rich enough to produce meaningful results. Certainly most software projects do not start out in such a space. On the other hand, we don't try to learn enough about our problems to find such spaces. We merely add code. One might almost say we excrete it.


Sorry, yes, I'm not a coder. Whitney is dealing with financial data sets of impressively huge sizes, but he (to my limited knowledge) clearly understands the structure of the data and a range of queries at a deep level. Moore looks as if he is devising the hardware to run the code!


Too true. It sometimes feels like programming is about trying to glue a bunch of hacks together creating yet another hack. Then someone standardises said conglomeration of hacks. Repeat.


My interpretation of what Ryan is saying: Programming languages, libraries, and linux distributions are more complex than they should be. When you use them in your products, you contribute to the problem. When you're thinking about them, you're wasting your time because your users don't care about your tools. One day we'll decide it's easier to throw them all out and start over.

Overall, I don't agree with this.

Complexity arises because what we want to do is complicated. I don't think there's a way around that. Sometimes too much cruft builds up in an area, but that leads to redesigns of specific components. For example, client-side configuration of LDAP and Kerberos has been unreasonably complex for a long time. That didn't lead to people ditching them; it led to https://fedorahosted.org/sssd/. It's likely that one day we will decide it's best to replace LDAP, just as was done with NIS. However, it won't mean we have to throw out all of Linux.

The "users don't care" argument doesn't appeal to me. I don't care what tools the architects used when they designed my apartment building, but if learning some complex math and geeking out over slide rules enabled them do it, I'm all for it. Being told there's something wrong with me because I've changed the settings in my text editor is insulting.


I really wouldn't know about the specific Unix related points, but his frustration is easily recognized in my current work on MS enterprise applications.

The actual solution is a desktop client and a web application used for simple CRUD purposes, each with around 10 screens / pages.

We have a huge suite of tests. We have a large amount of different layers. Gigantic amounts of interfaces inheriting from interfaces, and being passed around as parameters. Partial classes, with implementation spread out all around the application. Everything grandly designed according to design patterns, and every piece of code positioned in the smallest possible unit. Everything in the front end is a user control.

In theory this gives us extreme extensibility, flexibility and code reuse. From an academic standpoint, it's well designed according to best practice.

In reality, it is completely and utterly obfuscating the actual code that gets things done. Adding another db field to the UI requires modification of the data-access layer, the business object layer, changes to 2-3 different types of interfaces, additional code in a type conversion class, initialization code in the front-end, additional display logic in a user control, extra custom validation logic etc etc.

I really feel for the author, and I can unfortunately confirm that it's often the same shit no matter what software you are dealing with.


He had me until the last paragraph... "if you spend time configuring your window manager or editor..."

If you don't take the time to configure your editor properly I do not want to collaborate with you on anything. Ever.


If there was a 'proper' way to configure your editor, then that proper way would be the default.


If there was a 'proper' way to configure your editor, then that proper way would be the default.

There's a proper way to tailor a suit, but no one expects an off-the-rack suit to fit everyone.

Some people want computers and software to be just like household appliances, but seem oblivious to the rants people have about appliances (stupid settings, no way to config, lowest-common-denominator, one-size-fits-none, yadda yadda yadda).

Yeah, I get pissed at computers and software at times, but other times I'm awestruck at just what amazing things we can do.

Overall, we're winning.


"proper" in this context is how you configure it, not how it is configured.

(That seems like it should have a "confucius say" or something prefixing it).


The implication is 'proper for you'. One size fits all ... doesn't. But everyone should shape his tools to fit himself.


This has all the signatures of a bad day barfed out as incoherent rage on a keyboard. I've been there, and I can say with the authority of experience:

Ryan Dahl will regret this post for years to come.


That's because you have simply no idea what he's talking about.

Alan Kay once said, "Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves."


I'm pretty sure Egyptian pyramids are made of fairly substantial chunks of stone, probably weren't built by slaves and as they are nearly 5000 years old they surely must have rather a lot of structural integrity.

NB I've been to the center of Khafra's pyramid at Giza and I'm pretty sure that I wouldn't have done this if it hadn't exhibited rather a lot of structural integrity.


So... you're defending that type of architecture? Reading these other comments here, you're not alone it seems. No wonder we're in the mess we're in. Why bother looking for constructs like the arch? You want a big building? Just pile rocks in a heap.

PS. yes, yes, yes, there is irony in Dr. Kay's quote in that the Giza pyramids are of almost magical construction. He meant it metaphorically.


Wait... what? Dude, the invention of the arch was a big deal, and it came later. Forgive the Egyptians for their inability to do calculus; they were still thinking of shapes in terms of knotted rope.

The great irony of this conversation and its genesis is that it comes from the author of a server-side programming platform whose good points include "No new techniques required, you already know javascript", and that touts a decades-old cooperative-multitasking approach as "simple" when the end result for the programmer is anything but.


I'm not blaming the Egyptians; I just don't want to be them.

I think we should be looking for the arch, and not be happy with piling rocks in a heap.


"piling rocks in a heap"

The Egyptian pyramids are quite incredible bits of engineering - hardly "rocks in a heap".


"So... you're defending that type of architecture?"

No I was nitpicking about some factual inaccuracies about Egyptian pyramids - I wasn't making any comment whatsoever about software.


> That's because you have simply no idea what he's talking about.

I suspect I do. Funny you bring up a Kay quote I've heard that he regrets, but anyways...

The same day this came onto hacker news, Jamis Buck also posted an unintentional counterpoint to Ryan's rant: http://www.jamisbuck.org/presentations/rubyconf2011/index.ht...


Apologies; I'm sure you do. I have a lot of rage at our unwillingness to look at the mess we've made, the extraneous complexity and unmanageability of our systems, and those who seem to defend it.

90 years ago, I think Jamis could have written a similar deck about why cranking a Model-T while pumping the throttle was just the hard work necessary to enjoy driving.

Edit: Funny you bring up a Kay quote I've heard that he regrets

If he regrets such an important and honest quote, he's off my Christmas list! :-)


"90 years ago, I think Jamis could have written a similar deck about why cranking a Model-T while pumping the throttle was just the hard work necessary to enjoy driving."

I feel like these points are contradictory. If it is "all about the user experience" and you're advocating using more sophisticated and cognitively intensive, but conceptually cleaner and more repeatable, processes like the arch... then shouldn't algorithms be important?

It's all about the user experience, right? While our job may be complicated and involve a lot of math; at the end of the day we're to present an approachable front and clean interface to the intent of this complexity.

Or are we just raging because software is hard? I have no sympathy for people who aren't constantly improving themselves. Writing software will probably never be easy; and nothing Ryan has said (or that Node.js does) changes that.


There is a "software crisis". There is no "hardware crisis" and, in fact, has the opposite of "Moore's Law". Both are difficult, right? But we have conquered the latter and we have made few inroads into the latter.

Large systems are almost impossible to create and maintain. Imagine that we could build dog houses and, if we're careful, houses, but nothing bigger that wasn't under constant threat of falling apart? Imagine news headlines of "Chartres Cathedral collapsed again today." And then imagine a response of, "Well, that's because it's hard to build."

What Dr. Kay said, and I'm sure he only backed off of it because it's tiring making these arguments over and over again to people who look on with open mouths, is that there is the equivalent of the arch waiting for us. You, Dave, may not believe it. You may say anyone who complains about houses falling in on themselves just doesn't know it's hard work. I'm saying it's hard work to make them out of toothpicks and dental floss.

The reason I get annoyed in my comments (and had to apologize) is that I've spent 15 years working on a solution and have had nothing but resistance from those who can't see past the state of the art.

You have no sympathy for people who aren't constantly improving themselves. I have no sympathy for an industry that isn't.


So what is your solution?


You're kidding, right? I find his post a bit amusing, and it's far from unprofessional. It strikes a lot of programmer nerves, and that's better than being quiet and not asking: why can't it be this way?


I really don't understand how "configuring a text editor" implies that "you don't understand that the only thing that matters in software is the experience of the user". Good user experiences can only be written if you haven't customized your text editor to be more efficient?


I interpret it this way:

As programmers, it is easy for us to get wrapped up in the act of programming, and to forget about the point of programming: to solve problems as quickly, cheaply, robustly, and maintainably as possible.

Software development is all about tradeoffs, and some amount of environment configuration is undoubtedly a good thing. Just like some abstractions are good, some design patterns are good, etc.

But you have to be honest with yourself about whether the investment you are making learning and building additional complexity is really paying dividends, or if it's just fun to play with.

An alternate interpretation is that nobody should need to configure an editor. Editors should work already. It's 2011. If you like doing this and don't see that you should be spending your time more productively, you are part of the problem.


I hate how concrete just doesn't dry instantaneously, and also how you have to mix the right proportions of each thing to get the stuff working.


Good analogy; it reminded me of teen angst. The LiveJournal version of this rant would have been: "I hate how the world is so complicated, with relationships and stuff and politics and law and technology, why don't people in this time just behave simple. Animals have done so for millions of years, and the only thing that matters is that things are simple and understandable to me. I don't care about history and mathematics why, ohhh why doesn't the world simply do what I want !!!"

Software needs to be complicated because the tasks that it performs are complicated. The only way human programmers can deal with this is abstraction on abstraction on abstraction. This will only become worse as software handles more "real world" things, such as formerly hardware tasks.

Unnecessary software complexity is added due to maintenance, where the maintainers added extra complexity because they weren't able to integrate the changes into the current design and/or didn't understand it well enough. Probably a better job could be done with better tools/documentation in this case. Not by ranting at developers though.

His rant about UNIX is crazy. Any full-featured operating system necessarily is complex under the hood. If something randomly doesn't work in his favorite OS he also has to spend hours googling, diving into obscure settings managers, etc.


But we can build big buildings. Large projects tend to fall in on themselves.


This reads too much like a rant for my liking.

I get why complexity is disliked(/feared?) by some people, but unless you've got a better workable solution that you're ready for me to try out, your rant is just noise to me.

I've often found myself frustrated by the complexity of a piece of software, but that doesn't make me think we should throw the entire program out. How about we make it easier to use instead?


He's saying we've become the proverbial frog in the boiling water. The inability to abstract well has made any medium-to-large software project a Rube Goldberg contraption. You may not mind it -- but you should. Because it's silly. Because it's stupid. And recognizing that fact is the first step towards our recovery.


I do notice the growing complexity in the systems around me, I just wish more people would present solutions instead of just gawking at problems.

Pointing your finger and making noise will draw attention to the issue, but isn't likely to fix it on its own.


Things Ryan Dahl hates:

dbus

/usr/lib

Boost

ioctls

SMF

signals

volatile variables

prototypal inheritance

C99

dpkg

autoconf

LD_LIBRARY_PATH

/usr

zombie processes

bash tab completion

dynamic linking

static linking

glib

the details of programming languages

formatting source code

configuring window managers

configuring editors

unicode

directory hierarchies


As he should. These things are all stupid. They're monstrously stupid minutiae that you shouldn't have to know about unless the abstraction somehow leaks.

What he's saying is, "Where are better abstraction mechanisms?" And that's a tremendously important question (whether you get it or not).


So let's take the first example on the list, dbus.

http://www.freedesktop.org/wiki/Software/dbus

Should no programmers have to know about inter-process communication? Is dbus a bad IPC mechanism? Is IPC itself a flawed concept?

Those are interesting questions, because we can ask why and look for alternatives. Ryan's post and your response, not as interesting.


Application messaging is a great idea. Desktop buses are useful.

DBus as an implementation of messaging largely sucks. It uses an ad hoc protocol, there's little security, the C library implementation of it is a big mess, the socket interferes with remote X, introspection doesn't really work, and it uses far too much XML.


It may be a good question, but asking it and then ignoring it by not throwing out ideas for discussion is almost as useless as not asking it at all. e.g. regarding volatile: http://www.bluebytesoftware.com/blog/2010/12/04/SayonaraVola...


I really agree with this.

There are better abstraction mechanisms.


He lost me at tab completion


Don't forget node.js -- "Node is fucked too. I am also one of these people adding needless complexity."


Holy shit, it isn't just me! I've been muttering for years to coworkers, colleagues and random acquaintances that elegance cannot be obtained by adding an additional layer of complexity, yet modern developers seem absolutely enamored of the kind of vile unnecessary complexity that comes with layered abstractions.


I couldn't agree more. In software engineering, I have seen people read books on design patterns and use it EVERYWHERE possible.

Even the simplest and most straightforward development work is matched to a pattern and implemented as a design pattern.

I think there should be a rule that says if your feature or a particular problem that you are trying to solve does not exceed XYZ lines of code, then it should never be implemented as a design pattern.

Yet to determine XYZ. I would guess XYZ = 200?


tl;dr Get off my lawn!

Right. Because current attitudes are that if you aren't developing everything purely object-oriented, with a design pattern or five, using NoSQL for your data store, you're a fucking imbecile. I shit you not, I've seen a coworker spend half a day adding 800+ lines of get() and set() methods to a 150-line email script. The truly bizarre part is, it's not like he's stupid, or fresh out of school. The guy's a certifiable genius with six or seven years of industry experience under his belt.
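To make the complaint concrete, the shape of that boilerplate looks something like this (a caricature in JS, not his actual code):

    // 800+ lines of this...
    function Email() { this._subject = ""; }
    Email.prototype.getSubject = function () { return this._subject; };
    Email.prototype.setSubject = function (s) { this._subject = s; };
    // ...when a plain property would have done:
    var email = { subject: "hi" };

No abstraction boundary is being protected; it's pattern for pattern's sake.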

This kind of cargo cult bullshit is, in my opinion, the single largest recurring (recursive?) problem in our industry. Unfortunately this isn't a new problem. Every generation of programmers finds some set of development concepts to enshrine as the gospel.


Unix software was created by developers, mainly for developers. They honestly don't care if it's hard to set up. I'm not justifying it, i'm just explaining it.

Operating Systems make all of this much simpler by setting it up for you. Try to use only what they give you and you will go a lot farther with less effort. Buck the system and you're in for a world of hurt.

There's a better way to go about it. It's called: PAY FOR YOUR SOFTWARE. Then you might get support too. You want it for free, you bet your ass it's going to be painful to use.

By the way, I don't know who this Ryan Dahl guy is, but it strikes me as very naive to consider that grokking the entire inner workings of an operating system - from the development tools used to make it to the execution and use of its applications - should somehow be simple for anyone. I wonder if he'd bitch that the kernel is hard to modify without affecting another component, or that different versions of software may not have been written to be completely backwards compatible with one another?

This is the real world. This shit is complicated because it evolved that way. It's almost infinitely flexible and powerful and gives you everything you need to do what you have to do - and you complain that it's complex? Grow up.


Is it because there are so many programmers?

Doesn't he hate that a 50,000 LOC VM linked to C++ libraries is more popular than an 8,000(?) LOC language that solves the same problems and more?

It doesn't matter for most end users, but it sucks to be the one who has to deal with V8's GC, lack of continuations, design-by-committee language, etc. But there are more bodies in his corner, dealing with that complexity.


I think I disagree with what he's saying here. When someone takes pride in their craft and craftsmanship, that care and thoughtfulness tends to bubble up to the surface for users.

You can see the difference between a chair made by hand by a carpenter who wanted to make the perfect chair and one made by a carpenter who wanted to get it over with.

Now on the other hand, you have two extreme ends:

1) The 'architect' who creates 4 layers of class hierarchies and factory-factories

2) And the guy who doesn't indent his code and types all of it in notepad.exe

I guess the key is to take pride and put thoughtfulness into what you do without losing sight of the fact that there's an end-user at the end who needs to use your work.


Call it a noob mistake... but I recently wrote my own MVC framework on top of Express... and ended up never writing the app I originally intended to use it to write.

I think a few months ago, this wouldn't have made sense to me, but now I totally get what he's saying.

On my new app, I'm still using express to do most of the connect-ey stuff, but I've definitely decided that most MVC-ey frameworks are a premature optimization (for me). I'd rather just start with node + express, add in whatever DB I need (Redis / Mongo preferably) and build small and progressively.

My lesson learned... would love to hear other opinions.


noob mistake

There is very little value in trying to draw generalisations from past experiences when it comes to deciding whether to use a framework, and if so, which framework to use.

Unless you're reinventing the wheel, each project will confront you with a unique set of challenges. More often than not, with a framework, these problems are awkwardly solved with code written by other people to solve other people's problems.

In my opinion all of this has less to do with what's trying to be achieved than it does with what you personally want to take away from the experience. If your motivation is to make money quickly to feed your family then you would be silly not to jump on something like Rails and ride on the shoulders of giants. On the other hand, if you want to become the most proficient programmer you can become then this path will probably lead you astray.

A quick look around the internet reveals both tiny projects that fail, as well as enormous projects that succeed, on full-stack frameworks such as Rails.

Needless to say, if you're hacking on node you are of the latter category: the developer striving to broaden her horizons by exposing herself to the new and unknown. This won't help you learn to work in a team on a large project. It will not teach you to control complexity. In fact, it will probably lead you to believe you're learning all these things when you're in fact becoming comfortable with the complete opposite (working alone, hacking in anything, anywhere you feel like).

Generalisations really piss me off.

I wholeheartedly encourage anyone who is open-sourcing node.js code to continue doing so. Even if it is Yet Another MVC framework. Just take note of what ry is getting at here by keeping it lean and mean.

