This is a great rant. Nice emotional content, lots of technical details, the author qualifies his credentials, etc. Easily one of the better articles I've read in the past few weeks.
One of the things he mentions is the pain of setup -- something I've painfully watched develop over the years. Used to be you could go from a dead stop to programming something useful in about 10 seconds. Now, as he points out, it's not unusual to spend weeks digging around through vendor requirements, obscure dialects, rumor and configuration nightmares simply trying to get started. It's crazy. We've complicated the plumbing to the point nobody knows how to work the damn shower any more.
When I was a kid I loved computers and I saved my paper route money and bought Borland C++ and tried to learn C++ from the manual. I failed utterly and I had no one to teach me. Now kids can learn easier languages that do cooler things on cooler hardware and they have the Internet to learn from.
Great things are possible and computing has made great leaps forward. Enjoy it!
As challenging and frustrating as it can sometimes be for me to get the application stack working, it's nothing compared to the aggravation and sheer misery I'm eliminating from my users' workflow.
One group sits close enough to me that I can actually hear the reduction in swearing each time I replace a shitty old tool they have to use with something that was actually designed with their needs in mind. That's a pretty nice feeling.
I'm just not buying into this "oh, life is so hard to set stuff up nowadays" routine...
Rather, the point is that it is easier and more pragmatic to rush ahead with an imperfect technology stack and achieve great things than it is to spend the time and effort to reach "a better place".
I love how far technology has come, I love writing iOS software and appreciate the resources at my fingertips via Google. However I have a similar range of experience as the article's author, and I completely agree that while great things are being done with technology as it is today, it's very sad that we have limited ourselves to only reach where we are now rather than where we may have been.
I'm with you.
Go build a new computer architecture, Zachary Morris. It seems you have the knowledge and the support. Put it on kickstarter and I'll give you money. Godspeed.
The problem here is that you can argue this either way, so let me clarify. When we do systems analysis, we first focus on "happy path" scenarios. You want something, there's no exceptions, you push a big red button and it shows up. Life is good.
For those kinds of behaviors, say provisioning an entire linux stack on AWS, things are rocking. Want to code? Type in a couple apt-gets and you are on your way. There is a huge section of happy path scenarios where things are truly much better than before -- easily enough that many startups can totally stay on the happy path and do awesomely. (And that's exactly what they should do.)
But, alas, because of the dozens of layers of abstraction, and dozens more API, tooling, and hardware configurations, if you are doing anything technically difficult you can easily leave the happy path and get lost in the woods. It's very easy to get into a situation where you are using a combination of a dozen things in such a way as to be almost unique. Yes, each thing is awesome and easy to use, but the scenario you find yourself in is not. It's not called the "sad path"; a better name would be the "terrible path of shame and destruction." Ever worked inside a problem where you're dealing with multiple edge cases in several layers of a complex system with parts that are inscrutable? Ouch. I've plowed through complex and poorly-written C++ frameworks line-by-line, and it's not as bad as that.
The author spun out one such scenario for writing iPad apps, but you could find a hundred of these cases easily. Boards are full of guys asking questions about combining A, B, C, D, E, F, and G into a situation where they are having problems, and responses are usually along the lines of "Hey, I know A, C, F, and G, and here's what worked." Sometimes that's helpful. Sometimes not.
I'm fairly new to linux, although I'm an old .NET/Win32/C++/COM hound. About a year ago, I thought about writing a mono app in F# to do functional reactive programming for the web as a side project. Basically Node. After poking around for a few days, I finally gave up. As far as I could tell, I couldn't find an entry point into a mono app such that I could load the app and hold it in memory, then re-enter it from Apache. (It probably could be done directly from the command line, but I didn't want to give up Apache.) I wanted the speed associated with re-entrancy.
Now perhaps somebody will reply and tell me the magic formula to make all of that happen. If so, awesome. But it was just too much bullshit. Finding stuff on Apache? Easy. Setting up a linux box? No problem. Loading up mono? Piece of cake. Getting F# up? A bit of a hassle, but I worked it out. Tying it all together in that unique combination in a configuration not commonly used? A very painful thing. I'm not saying it was impossible. I freely admit being a wuss and giving up. But this kind of frustration is all too common these days.
Another example. I help teams a lot. Many teams I help are starting out on their project in a greenfield environment -- it's a new project and everybody is starting fresh. It's not unusual for them to spend a week just getting their environment set up. Now here's the deal: yes, you and I could cherry pick tools and such so that we could set them up in an hour or two, but many times the organizations they are part of have already made these choices for them. So no, it's not git. And no, it's not a plain-vanilla IDE. And so on. I really feel for these guys.
So sure, you can show all kinds of examples where things are a lot easier now than before, and I completely agree. But many folks do not live in that world.
I would venture that anyone who works in a large corporation is not in this world. Which is a loss; large established corporations have some of the most interesting problem spaces...
Of course, this assumes you have plenty of time to spend on such projects. ;-)
That thing he's talking about with non-sequential placement of bytes on the drum is well before that.
4K? Luxury! I started with 1K and that included the screen memory!
This is probably the single most irritating thing to me as a developer. And, the most frustrating part is that this aspect often turns a simple project or a curious dive into a new technology into a knock-down, drag-out brawl with configuration files and dependencies that in the end produces frustration instead of working code.
- Need a webserver and database setup? Easy, a few clicks on Heroku and I'm good to go.
- Need to learn how to program something that you can then deploy to Heroku? Easy, just follow these 13 steps and you're on your way:
WTF is so painful about this?
That's a different path entirely, though not necessarily a bad one. The full stack isn't under your control on Heroku.
Oh, so you have the ability, time and motivation to rewrite your firmware, operating system, network stack, database and web server and implement it on a CPU you designed yourself?
No? Then the full stack isn't under your control either. Get over it.
That's a different path entirely
You know what? It's not a different path at all. The convenience of Heroku is no different in principle to using closed source hardware.
Look, I'm a freedom-zero type of person. I run Ubuntu - not OSX - and I admin my own servers. BUT the chains of convenience of apt-get are much closer to the chains of convenience of something like Heroku than many would like to admit.
To get more control one should move to AWS. But wait, I don't control the network card on the server with AWS. Hmmm... I know! I'll get a colo. Now I can purchase and build my own machine and have control over the network card. But wait, I don't control the backup power system. Hmmm.... I know! I'll rent a building, get an internet backbone piped into it, buy some generators, and control the backup power system. But wait, I don't control the...
I guess what I'm getting at is: why is this an entirely different path? Why isn't using a service such as Heroku "the path" until you hit its limitations and need to spend a bit more time going to the next step in the process?
One could easily make the statement of "blah isn't under your control on blah" for anything. Where does it stop?
No, software installation today is an absolute breeze compared to what it used to be, no matter which platform you're on. On Windows, DLL hell is largely a solved problem. You can just double-click the InstallShield EXE and take a coffee break. On Linux, you can invoke the package manager for your system and have an even more seamless experience. Heck, even programming tools offer their own package management systems now. Want to install Django? pip install django. Want to install Rails? gem install rails. It's definitely easier than unzipping files, setting configuration settings, finding (and, in some cases, installing) dependencies, and then crossing your fingers and typing make.
You need not use all of your capabilities all the time. My A/C can go to 50 degrees. Maybe there will come a day when I need that. Until then, I'll keep it above 70.
Look, here's a story: We just had our national holiday in Germany and therefore a long weekend. So I decided to code an update for an iPhone app I have. The app lets the user create funny pictures, so I thought it would be cool to have an online gallery of user-created pictures, where people could vote on the best ones and have a weekly top list. Now this is far from trivial: suddenly I need online storage, a database for users and votes, and a web service to handle all of that. So I looked around, found Node.js, MongoDB, Heroku and S3, signed up, started reading, learning and coding. Two days later, the first version is done and working. How much did I know about this before, and how much does this stack cost me? Almost nothing.
So where's my point with this? First, this kind of story would have been impossible just a few years ago. When you realize how it's now possible for a small developer to reach potentially tens of thousands of users, without the need for a big budget or control of delivery channels; when you realize how many powerful resources are now right at the fingertips of the average developer, and that you just need to sit down, read and learn to be able to implement even the wildest ideas - then I can't help but think these times are great!
Coding is still hard and, despite advances in technology, it may not have gotten much easier. But many things that used to be outright impossible can now, with the right commitment, be achieved from the comfort of your own four walls.
"The printer was the first drum printer that I had ever seen. It would print 1500 lines per minute alphanumeric, and 1800 lines per minute when only numeric values were being printed. It cost $243,450. Its reliability was somewhat suspect. I walked through the room that it was kept in every day for a year, and the only time that I ever saw it working was at a trade show in Los Angeles. The main reason that I went to the show was that I heard that the printer was there and working. I suspect that the printer was a strong contributor to the demise of Toni Schumans' career with Burroughs. Doug Bolitho was giving a plant tour to a group of potential customers one day and he somehow had the printer printing something. Toni walked into the room and loudly exclaimed "My God it's working". She left Burroughs shortly after the incident."
It's an excerpt from an autobiography. The article as a whole originally came up because the author worked for a summer with Donald Knuth - as in, Donald Knuth of the Art of Computer Programming, the quintessential tome of accurate, elegant, academic, truth-on-a-whiteboard computer science. Presumably Knuth sometimes walked by the non-operative printer some days in the morning too. His job at the company was to write a compiler, and reportedly it was a very good one.
Success and failure in the state-of-the-art have always coexisted. I sit here - as an iPhone programmer, who has recently had to deal with annoying provisioning issues, LLVM and GCC compilation problems, and advertising networks - and I can look through the window of my office to see our printer, which is located behind the water cooler, available over a wireless network, and can accurately print a requested piece of paper the first time I send a print command from my laptop. I think it cost $200 from the office store - in 2011 dollars, before accounting for inflation.
My office has a printer, yes, it's available over the wireless network. However, it cost... somewhere in the neighborhood of $10,000. I'm not entirely sure, we have a five year lease on the thing, and it costs something like $500/month to run. It's an okay printer, as long as you run Windows.
There is a postscript module for the printer, but it costs around $1000, so Mac/Linux machines are out of luck, and our vendor hasn't actually said when we can get such a postscript module installed. (And I get people in my office at least once a week asking how they can print from their MacBook.)
From a hardware perspective, we've come a long way. From a software perspective, it's a wonder that we're still using proprietary nonsense protocols for printing and scanning. And my organization is stuck with a 5 year lease. But even if we weren't stuck with a 5-year lease, it's a $10,000 printer that is incompatible with OS X.
That said, I think this article is over the top and mostly wrong. But it is a great jumping-off point to talk about the limitations of the jumble of incompatible technologies we find ourselves working with, and how we can make them better.
The world has largely established that http is the way to query devices and control them. Device drivers are the spawn of the devil to me and completely unnecessary (thank you Microsoft). They may very well be the pinnacle of what I'm complaining about.
Printers could have a free wireless web interface where you upload any file type from tiff to doc and it "just works." I realize it's more complicated than that because of colorspaces and half toning and blah blah blah. But it shouldn't be.
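The "just works" device he's imagining is easy to sketch: a box that accepts documents over plain HTTP POST, no driver required. This is a toy built on Python's standard http.server, not any real printer's firmware; the `/print` endpoint and the "queued" reply are made up for illustration.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class PrintHandler(BaseHTTPRequestHandler):
    """Sketch of a driverless, HTTP-controlled printer:
    POST a document, the device figures out the rest."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        document = self.rfile.read(length)
        # A real device would sniff the file type and render it here.
        body = f"queued {len(document)} bytes".encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Run the "printer" on an ephemeral port and submit a job to it.
server = HTTPServer(("127.0.0.1", 0), PrintHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
reply = urllib.request.urlopen(
    f"http://127.0.0.1:{port}/print", data=b"Hello, paper")
print(reply.read().decode())  # queued 12 bytes
```

The whole "protocol" is one POST; a web form in the device's UI would give you the upload-any-file-type interface for free. The hard part, as conceded above, is the rendering behind it, not the transport.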
And I shouldn't need any special software to save images from my scanner or take a snapshot with my webcam. I actually wrote a command line tool on the Mac to tell Quicktime to save a snapshot from the webcam as a file. That is pathetic and makes me want to hit myself over the head with a sledgehammer.
I don't think I've said anything earth shattering here but wake me up when any of this happens.
We sort of had this before Windows. When you bought a printer you had to make sure it had an Epson FX mode to support WordStar, Diablo 630 to print your invoices, and IBM Proprinter to print your mainframe reports. It was a bloody mess.
The Windows printer driver model isn't great, and MS appear to recognise this, but things were a lot harder when every application did its own thing.
As for printers, PS was proposed as a standard so that printers "just worked". It cost money to license, not everyone jumped on board, and so it never established itself as the dominant standard. PDF is based on PS and people could certainly standardize around it. The problems are the same as with PS, however.
Your snapshot example: I assure you that you are not the first person to think of this feature. Apple does distribute Photo Booth with OS X, after all. Is your complaint then that they didn't cater to an incredibly small minority with a command line tool to do this? I don't see this as a reasonable complaint. It's certainly not something to hit yourself over the head with.
None of this has anything to do with computers or the state of the art. Your complaints seem to be that society is not catering to your specific needs quite enough, or that people are not working together quite enough. While I agree with you on the latter point, it's not worth getting so worked up over. It's always been this way and likely always will be.
Also the only decent-performance chips doing it were from Adobe, and they weren't cheap. At the time, this would have pretty much cut the low-end printer market to the ground. The high-end printer market, of course, pretty much all did PostScript if you asked them to.
Computing will improve. Computing will always improve. I think rants like this are helpful to point out where we definitely can improve, today, to bring on the future - such as making the iPhone dev and release process easier ;-)
A chair, clothing and even houses haven't changed much over the years either. Most designs only change bits at a time, slowly morphing into unrecognisable things.
Today that same repair would be a couple of days in the shop, requiring tens of thousands of dollars of specialty tools. Of course those springs don't fail as often as they did back then (a combination of improved materials science and engineering) but when they do the fix is out of reach, even for a trained mechanic without access to a shop with all the required specialty tools.
Have a read:
The good old days of grease and spittle and they didn't even have duct tape yet!
That link is about replacing the entire valve (valves were a wear part, much like tires today, only with much shorter service life).
Major failures in a car's first 5 years fell by a third between 2005 and 2010.
So, I'd expect a lot less than 30 years ago.
Roughly: American Arithmometer Company (1886) → The Burroughs Corporation (1904 rename) → (1986 merger) Unisys.
While tools like functional programming may indeed deliver on the promise of a 60% code reduction, they have a correspondingly higher barrier to learning. Evolving algorithms? Automatic programming? These problems become theoretically intractable so quickly it's not even funny (I currently do research on a very related problem). He wants compilers to just insert the semicolon for him? I'm glad mechanical engineers of the world don't have the same attitude about nails and screws!
Most of his complaints in truth have nothing to do with computer science at all. They have everything to do with sloppy engineering. There are all sorts of obvious reasons why computer engineering is sloppy. A few examples:
1) Developers on open source projects usually are not paid. It's not surprising that documentation is weak.
2) Reliability and turn-key operation are expensive to develop and nobody wants to pay. I'm sure the author of the article doesn't either.
3) Bugs have lower impact. Screw up a building foundation and you might end up in jail. A clunky install process? Probably all that will happen is a scolding and a new issue in the tracker.
4) Things change so fast that standards can't keep up. The same goes for most other engineering frameworks that would solve many of the problems Morris complains about.
We've made and continue to make huge progress in the field of computer science. Computers have and continue to replace people in jobs all over the world. Morris should be happy they haven't replaced his job yet. Not working may sound nice, but having an income is also nice. That has nothing to do with computers.
Computers have made our lives easier. If I went back 10 years and told my younger self what I can do today with just my mobile phone, I doubt my younger self would even believe me.
The problem is not that progress is bad. It's that progress is moving too fast for engineering to keep up with. The state of the art is constantly changing.
Most code is badly written C++ that could have been written in, say, Ocaml, Haskell, or Lisp. This is easily a 10 fold reduction. And the guys at http://vpri.org/ are doing at least five times better than that (50 fold reduction compared to well written C).
On your point 3, Bugs have lower accountability. Meaning, lower impact for those who commit them. If they have lower impact as well, that's because they forced us to distrust software.
By the LOC count of currently maintained projects, most code is definitely badly written, and in a low-level or otherwise unsuitable language such as C++ (C++ is fast, but most of the time we shouldn't even care.) Most code I see (beware selection bias) is indeed badly written C++ (I even authored some of it). Now, if you count by project times end-user popularity, then you have a completely different story. It is a poor proxy for programmer's work quantity, however.
Regarding the 10 fold reduction, just know that I experienced first hand 5 fold reductions when doing my best with C++ and Ocaml respectively. A 10 fold reduction is not unrealistic at all when we talk about badly written C++. For instance…
…I currently write a plugin for a 2-million-line program. The exposed interface alone weighs in at 80,000 lines. Many classes have hundreds of methods, and the inheritance hierarchy is generally 5 levels deep, sometimes up to 8 or 9. And it actively deals with no less than 6 different types for handling strings (std::string, char*, old Pascal strings of 3 different types, and some String type that inherits from std::string). Oh, and it's slow. C++ didn't even help here. We are past the point where we can talk about technical debt: it's a Big Ball of Mud.
For such a program, I expect a 20 fold reduction. And I bet there are other Cthulhu abominations out there that make this program look like a little diamond.
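To give a contrived sense of where those fold reductions come from (sketched here in Python rather than Ocaml, purely for brevity, and not from the codebase in question): a task that sprawls across helper classes, hand-rolled loops, and string-type conversions in low-level C++ collapses when the language carries the data structures for you.

```python
from collections import Counter

def word_frequencies(text: str) -> Counter:
    """Tokenize, normalize, and tally word frequencies.

    One string type, one dictionary type, no memory management:
    the kind of code that balloons into hundreds of lines of
    badly written C++ fits in a single expression here.
    """
    return Counter(word.lower().strip(".,!?") for word in text.split())

freq = word_frequencies("the quick brown fox jumps over the lazy dog The end.")
print(freq["the"])  # 3
```

The point isn't that Python is magic; it's that when the language supplies the vocabulary the problem is stated in, the code approaches the spec, which is exactly the 10-to-20-fold gap being described.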
Yes! I guess we could learn a lot from other engineering disciplines.
 Sketchpad (http://www.youtube.com/watch?v=USyoT_Ha_bA)
 The Mother of All Demos (http://www.youtube.com/watch?v=JfIgzSoTMOs)
 Hypercard (http://www.youtube.com/watch?v=g-qth3mrbwc)
 Ward's Wiki (http://c2.com/cgi/wiki)
Then ask: what are we doing that should get added to this list?
Today's computing is built mostly on tools that were originally cheap hacks intended to be replaced. If you have not watched the videos above, for the sake of our future, please do.
And what about git's content addressing?
Progress comes in fits and starts, and not where we expect it.
And long before those, plan9's venti.
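Content addressing, the idea shared by venti and git, fits in a few lines: an object's name is the hash of its bytes, so identical content stores once and corruption is self-detecting. A toy sketch (git actually hashes a `"blob <size>\0"` header plus the content, which this omits):

```python
import hashlib

class ContentStore:
    """Toy content-addressed store: objects are keyed by the
    SHA-1 of their bytes, in the spirit of git blobs and venti."""

    def __init__(self):
        self._objects = {}

    def put(self, data: bytes) -> str:
        key = hashlib.sha1(data).hexdigest()
        self._objects[key] = data  # identical content dedupes itself
        return key

    def get(self, key: str) -> bytes:
        data = self._objects[key]
        # Integrity checking comes for free: the name IS the hash.
        assert hashlib.sha1(data).hexdigest() == key
        return data

store = ContentStore()
k1 = store.put(b"hello world")
k2 = store.put(b"hello world")
print(k1 == k2)  # True: same bytes, same name, stored once
```

Everything else (trees, commits, snapshots) is built by storing objects whose contents are the hashes of other objects.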
The original rant really does make a good point about how accessible things once were. Gosh even being 24 I remember the days of screwing around with logo and qbasic and hacking serial ports with zterm. It's not that everything 'just worked' then (It didn't) but that if it didn't work you could still get something useful out of the machine. The layers of abstraction from the hardware have grown so much now that nothing makes sense.
Take a look at their work, most notably the last STEPS progress report. They can use 50 times less code than well written, useful C code (like TCP). Compare with redundant, useless, badly written C++.
The secret is quite simple: you can write your own programming languages, and good code looks like a formal spec (to the point of being one, ideally). If you can't write good code, then write the language that will let you. How? Start here: http://www.tinlizzie.org/ometa/
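To make "write your own language" concrete, here is a minimal sketch in Python (nothing to do with OMeta's actual machinery, just the underlying idea): a recursive-descent evaluator for a little arithmetic language fits in a few dozen lines, and that cheapness is the leverage a purpose-built notation buys.

```python
import operator
import re

# Grammar: expr   := term (('+'|'-') term)*
#          term   := factor (('*'|'/') factor)*
#          factor := NUMBER | '(' expr ')'
TOKEN = re.compile(r"\s*(\d+|[()+\-*/])")

def tokenize(src):
    tokens, pos = [], 0
    while pos < len(src):
        m = TOKEN.match(src, pos)
        if not m:
            raise SyntaxError(f"bad input at position {pos}")
        tokens.append(m.group(1))
        pos = m.end()
    return tokens

class Parser:
    OPS = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}

    def __init__(self, tokens):
        self.tokens, self.i = tokens, 0

    def peek(self):
        return self.tokens[self.i] if self.i < len(self.tokens) else None

    def next(self):
        tok = self.peek()
        self.i += 1
        return tok

    def expr(self):
        val = self.term()
        while self.peek() in ("+", "-"):
            val = self.OPS[self.next()](val, self.term())
        return val

    def term(self):
        val = self.factor()
        while self.peek() in ("*", "/"):
            val = self.OPS[self.next()](val, self.factor())
        return val

    def factor(self):
        tok = self.next()
        if tok == "(":
            val = self.expr()
            self.next()  # consume ')'
            return val
        return int(tok)

def evaluate(src):
    """Evaluate source text in the little language directly off the parse."""
    return Parser(tokenize(src)).expr()

print(evaluate("2 + 3 * (4 - 1)"))  # 11
```

The grammar comment at the top and the three parsing methods line up one-to-one; that is the "good code looks like a formal spec" property in miniature.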
Who better to do version two than the people involved in version one?
(I agree with you about it being kind of hard to follow what they're doing, though.)
And people just complain that life isn't easy enough. Psh.
The reason? Almost zero deployment complexity. You can take Joe Web Designer off the street, give him SFTP (or, more likely, FTP) credentials, and have him modifying your site in minutes.
Modify a file, click refresh.
I love PHP, but I'm to programming what a shade-tree mechanic is to auto repair: I just want a large enough hammer to make something work; I don't care how pretty the end result is.
Software and technology in general is like a gigantic thriving petri dish. The natural world is terribly inefficient, too (I need how many sperm to germinate one lousy egg?!)
The simple fact is that I'm able to read your rant, think about it and reply whilst sitting in a cafe on a mobile phone. That is awesome progress.
The problems we deal with as engineers are the reason we get paid to do it. Compare this to the practice of music: the art is in the beauty of the song. The engineering is in getting 5 stoners on stage simultaneously and managing to get paid by the ephemeral promoter at the end of it. I'd rather wrangle with a broken FreeBSD port dependency in the comfort of my office listening to music of my choosing (streamed over the internet, and made by people using computers) than starve because of crop failure.
In every life we have some trouble, but when you worry you make it double.
This sounds vaguely like one of the many "We must have a Do-What-I-Mean language" posts I've been reading since at least the 80's.
Yes, many things in the "state of the art" could certainly be better. But the belief that that is only so because the powers that be want to make money is rather misguided. Building better tools and developing better techniques is hard work.
(I'm not even going to comment on the fact that the author in the same article demands a more formal basis of our craft and at the same time thinks PHP is the best language ever. I was slightly amused)
I was going to start this reply with: where is there a forum/IRC/blog to work on this?
And then I realized those might not be the right tools at all to try to clean up technical debt.
Since we aren't machines, sometimes we express ourselves in long rambling thoughts. That's okay, and when it's not, we have editors.
I really disagree with a lot of his solutions, and I think his argument is muddled and unclear, and I hate the idea of moving toward languages like PHP that cover for programming errors (that often mask logical errors you want the compiler to check) but I see where he's coming from.
My complaint is that a lot of the measures he proposes (self-modifying code? seriously?) are going to make the problem a lot worse.
Across the industry as a whole, 80 to 90 percent of software engineer time is spent cleaning up crap (sometimes one's own, but this is at least educational) that wasn't done right in the first place and, yeah, it's frustrating. You need to have an unusual degree of autonomy (effectively your own boss or at least CTO) to be able to create such "oases".
Self-modifying assembly code, in the bad old days, was not uncommon (as a performance optimization). It is, however, utterly unmaintainable.
I would recommend the book Types and Programming Languages. Also, you should spend a year in a strongly and statically typed language. Start with Ocaml or Haskell because they are "purer", then move to Scala if you want. This will give you some ideas on what infrastructural choices make feasible the development of software that isn't complete garbage.
I'm afraid I don't follow; what do you mean by this bit here?
I've been saying a lot of the same things for years. I'm tired of it now; I'm starting to give up, because it's obvious there isn't a programmer out there that gives a damn. You can try telling them that there's something wrong with software -- something fundamental, something in the process itself that has become horribly broken -- but they'd rather tell you why you're wrong rather than really listen to why you're frustrated.
Then if, as a user (or programmer) you complain about something, they say, "So build your own version." What, we're supposed to re-build the world? OK, fine. So you start building your own version of something. Then the response is, "Why are you reinventing the wheel? That's already been done."
But, what the fuck do I know? I spent a couple of hours in a meeting today with a client explaining why upgrading -- er, pardon, "migrating" -- from Joomla 1.5 to 1.7 wasn't going to happen without a lot of money involved. Then I went to another client and we fixed a broken Windows network stack and a handful of other stuff. Then I came home and unboxed the new laptop that I just bought because my old one could no longer run Firefox anymore -- even though it ran it just damn well fine enough a few years ago. Other than the new laptop, this is pretty much my day -- all day, every day. Well, me and the other guy in my shop.
And nobody else seems to think there's anything strange about this stuff. So Samba's documentation is a mess and Samba 4 never got around to implementing allowed_users, which is absolutely necessary in a Windows AD environment? Pfaah, big deal, so what? So Western Digital's backup software interferes with Outlook in funny ways? Psh. Who cares, who's that going to affect? So everybody's decided to abandon sane software versioning altogether and make the support end of things even more nuts? Hah! Get with the program you support idiots, you're supposed to want to spend all day upgrading software and dealing with the inevitable fallout.
Besides, users that don't like upgrades are just morons, they just don't know what they like. As soon as they get used to the new version, they'll like it better, you'll see.
All of this is stuff I've personally heard, or seen in places like HN ... and not just a few times here and there.
Software used to be fun. I remember when it was, when it seemed like most things just worked, even though they didn't look pretty. I remember when it seemed like I could just open something up and start hacking on it without having to hold tens of thousands of lines of code in my head, spread over dozens of files. I remember when looking at someone else's code could actually teach me something, instead of making me want to cry.
I really don't like this industry much anymore. I guess that makes me a bad hacker or something.
I remember spending hours moving around jumpers to set DMA and IRQ. Only to find out that the only free IRQ was #9 and bloody Packard Bells sometimes didn't have a frikkin' 9 because that batch of boards was $2 cheaper that month...
I remember when it wasn't pretty and it took ridiculous efforts to make it work even a little. Now I mostly jam a USB thing in the side and it goes. There has been much improvement.
I'm actually a little psyched by it all. This just happens to be a really hard problem that arrived at humanity's doorstep before we were really evolved enough to make much of it. It might mean we're ahead of the game, and it almost certainly means that the best is still yet to come.
w.r.t. OP, should lines of code be restricted to manageable amounts by some kind of international regulatory fiat, or can they just abstract the problem away into utterly comprehensive libraries of functions, or do coders just need to man up and take it?
There was a time when there were no humans who knew how to drive a car.
This fits the OP's notion that no one really understands the crap underneath it all.
This is the 'little death.' Allow me to relate an anecdote.
I grew up loving computers, loving everything about them, how they were built, how they were programmed, how they did what they did. I started out just as the 'personal computer' revolution was getting started, it was glorious, Altair, IMSAI, SOL20, Heathkit. Lots of folks with their own take on what the PC should be. I have spent hours and hours and hours writing my own BIOS code, hacking ZCPR3, making an emacs clone work on CP/M, falling in love with the Amiga and suffering the incurable disease of incompetent management. By the time the late 90's rolled around I was starting to burn out. A lot of stupid things which didn't have to be that way, Microsoft always trying to make their version of something just a bit incompatible and only buildable with their tools. Etc. I was writing some code on a windows box and hating it. I yearned for a simple 'make foo'.
Then I met a 'kid' who was building stuff on Windows and he had the same wonder I had when I was that age, except he didn't complain about visual studio crap because he had been introduced to computers with this as the way to do it. Where I saw re-implementations of the wheel, done poorly, he didn't see anything, they were just the tools you had to use to get to the end point.
I realized with a start that I had lost my sense of 'wonder.' That childlike state where you ignore the fact that something is uncomfortable or irritating because you have so much amazement over the thing itself. And the truth is that if you use crappy tools for a while your muscle memory will figure out how to minimize the irritation. I looked around and saw that people I knew, people who were bright lights of leadership back in the day, were now stuck in an endless cycle of curmudgeonly rant because they too had lost their sense of wonder. I decided to start picking my battles more carefully. (which you can do in a hobby, not so much at work)
I found an editor that I could use everywhere (Visual Slick Edit, now VIM) so that I could have the same editing experience on all my platforms. I spent some time understanding the build systems (make, gcc, java, python, etc.) to get to a point where not only could I create my own environment, I could keep it running across platforms, and I began to develop my own set of APIs which I could link through into the underlying platform. The goal was to reduce the friction between getting stuff done and the tools for getting it done. I recognized the reward comes in the running of the code and getting it to work as I wanted.
Then I can mostly ignore the crappy stuff. I can joke about how putting a character on screen went from monitoring the transmitter buffer empty (TBE) flag on the serial port and then writing the ASCII character when that flag was 'true', to something which spends thousands of cycles checking to see if this time I want my characters to go right to left, or which code points or font I should use to display them, or how they should be alpha blended into the background of what is on a screen somewhere. And when I come across something that is horribly, horribly broken, like using webcams in Linux, I try to develop a durable API for talking to video which isn't cumbersome to use and doesn't have feature stubs I'm unlikely to use. I try to stay amazed that I can capture digitally on a piece of $20 hardware what used to cost thousands, and in so doing keep my sense of wonder about what is possible.
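That old polled-console path is simple enough to sketch in a few lines. Here the status register is simulated so the sketch runs anywhere; real code would read a memory-mapped UART register, and the flag bit and drain timing below are invented for illustration:

```python
TBE = 0x01  # transmitter buffer empty flag (assumed bit position)

class FakeUART:
    """Simulated serial port: the transmit buffer 'drains' after a few polls."""
    def __init__(self):
        self.status = 0
        self.polls = 0
        self.sent = []

    def read_status(self):
        self.polls += 1
        if self.polls >= 3:      # pretend the hardware finished shifting bits out
            self.status |= TBE
        return self.status

    def write_data(self, byte):
        self.status &= ~TBE      # buffer is busy again right after a write
        self.sent.append(byte)

def putc_serial(uart, ch):
    # Busy-wait on the transmitter-buffer-empty flag, then write the byte.
    while not (uart.read_status() & TBE):
        pass
    uart.write_data(ord(ch))

uart = FakeUART()
for ch in "hi":
    putc_serial(uart, ch)
print(bytes(uart.sent).decode())  # prints "hi"
```

The whole protocol is one flag and one loop, which is the point of the joke: compare that to what a modern text stack does per character.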
I started out a little after you did, with BASIC on a Commodore Vic-20 and Commodore 64, then Logo, then HyperTalk, and so on. I used to have fun decompiling programs and poking them with MacsBug to make them dance for me, and I used to have fun writing my own toy operating system and generally just screwing around. And, for the most part, my tools were simple and reliable.
So that's what I compare everything now against, and it all seems less reliable and more complicated. MacsBug was a thing of wonder and beauty compared to the "debuggers" I have to deal with most often now -- Firebug and GDBp. And I've started working recently on building my own tools, which is sort of fun again, so maybe I'm sort of headed on the right track.
But anyway, thanks for describing it like you did.
I still learn all I can, but my focus has shifted a little. I'm learning to program properly, to make new tools instead of understanding existing ones. I'm learning C and python and haskell and lisp (and looking for more!) and loving every minute of it.
I find it a little bit disheartening, however, the lack of curiosity that my peers display, even my fellow CS majors. To me, the computer is a vastly complex system, just waiting to be explored, and thanks to open source software, I can! But a lot of my peers don't see things that way. I would hazard a guess that part of the reason is that the environment presented to them is not particularly interesting, particularly in a Windows or Mac environment, where a lot of the details are abstracted as far away as possible. For me the spark of curiosity was ignited when I had a relic of a machine that I needed to make run faster. I tried to squeeze the most out of that computer, and it taught me enough to whet my appetite. I feel like our youth, my peers, have been and are being shortchanged by technology today. I'm not sure what there is to be done about it, but I know that it isn't right.
Or, for a slightly less doomsday scenario, just a relatively simple problem affecting one person, but they're still trusting you to figure it out, and they're expecting you to do it sooner than later.
There's a point at which the fun really starts to go out of that. I've always been a bit of an adrenaline junkie, I've always worked well under pressure, but nowadays my favorite thing is quiet time in the sunshine digging about in the garden. (Oh god, that makes me sound old.)
Maybe this will never happen to you. I hope not! But I've been thinking a lot about what Chuck said, and about my frustrations with technology, and I think that maybe this has a lot to do with it.
"Just about everybody knows that all our software is imperfect crap on top of imperfect crap, from top to bottom. Everybody, when met with a new codebase above a certain size, thinks they could do better if they started over and did it "properly this time". Everybody can look at a simple thing like a submit form in a web browser, and sigh at the inefficiencies in the whole stack of getting what they type at the keyboard onto the wire in TCP frames, the massive amount of work and edifices of enormous complexity putting together the tooling and build systems and source control and global coordination of teams and the whole lot of it, soup to nuts, into a working system to do the most trivial of work.
"But this is not a new or interesting realization by any means. It's not hard to point almost anywhere in the system and think up better ways of doing it. Pointing it out without some prescription for fixing it is idle; and suggesting that it will be fixed by wholesale replacement by another complex system is, IMO, fantasy."
(Things probably only seemed like they worked in the old days because you hadn't realized what was going on in the sausage factory; or it was an almost useless piece of hardware whose software did almost nothing, owing to its simplicity.)
And so you start using your programming skills to travel the world while working, see some amazing places, but discover to your dismay that things aren't any better no matter where you go.
But you continue writing your own stuff, releasing open source programs here and there, and then somewhere along the way you move into freelancing, which removes the protective layer between you and unreasonable customers.
Tiring of that, you move into iPhone apps and discover that you suck at marketing.
And then one day you find yourself living in San Francisco, founding a startup with a bunch of awesome partners, working insane hours, and having a BLAST trying to solve hard problems.
There is much joy and wonder in this world; you just need to look harder.
But then I think of other actual jobs that exist nowadays, and I conclude being a software dev might be the least bad.
Software is eating the world, after all.
"Everything is amazing and nobody is happy" - Louis CK
This is why you get a PC without admin rights, set to autoupdate, with our default image, our security policies, and our unpopular defaults (IE9 in my case) but with an option for chrome for those who request it.
I have no idea how home users get by. I hear about their problems and it sounds like a nightmare of poorly written defaults (your average home laptop is a minefield of OEM junk and a malware magnet). I suspect this is why Apple is doing so well. They abstract away the complexity and actually make their products compatible with their OS and other offerings! Toss in the fact that OS X isn't as targeted by malware, and I can see the allure. On the corporate side, Linux is the same deal. We are all comfortable with the command line tools we learned in high school/college/whenever, and they haven't changed. It's like an old friend. Sure, it may not generally "just work," but it's easier to tinker with and fix than a closed system.
I remember when you would buy a new mouse and sometimes it just wouldn't work and then you actually had to take it back to the store because you had something slightly different about your computer and there wasn't a released driver yet.
I remember when Blue Screen of Death was more than just a joke, and it was an actual thing that happened on a fairly regular basis.
You know how everyone thinks IE6 is teh devil? I remember when it came out and was awesome and it actually killed off the competitors for good with all its sweet features.
That's the world I grew up in, and the current state of things is MUCH better.
Here's me telling you what's wrong instead of listening: what you're missing here is that we're doing vastly more than we were back then. If you want a pixelized bunch of text that does a simplistic task or two, I'm sure you could do it now in even fewer lines, and more readably, than you could back then.
But no one wants that from us anymore. You can't make a living doing that. You, yourself were complaining about Samba not implementing a specific feature. I bet their code is already too big to meet your "software is fun" metric.
Of course, all IDEs have their design flaws (like any other piece of software), but the benefits are compelling.
If code completion is speeding you up, it means you're writing code with a lot of redundancy in it. But the slow part of programming isn't typing the code in in the first place; it's reading it later. If the stuff you're putting in your source code is redundant boilerplate that code completion can put in there for you, then it's going to make it hard to read the code later, when you need to find the part that you actually typed, because it has a bug in it, or because you need to change it.
If fast lookups (I assume you mean meta-point, quickly jumping to the definition of an identifier from a mention of the identifier?) are speeding you up, well, maybe you're getting into a codebase you don't understand very well, because you're just starting to work on it. (Although these days grep is pretty fast.) Or maybe your codebase is an overcomplicated nightmare with too many layers of indirection, costing you comprehensibility even despite the wonders of meta-point.
If folding speeds you up a lot, maybe your code is way too deeply nested, or you have a lot of irrelevant stuff mixed in with the stuff you're trying to read.
If integrated documentation speeds you up a lot, maybe you're programming to an API that's too big to memorize, which means you're going to make errors when you call it and then not be able to see them when you read the code.
Syntax highlighting is nice, but it matters most when your code is poorly laid out and unclear, and then it doesn't make your code easy to understand.
If you're spending so much time in your debugger that it matters a lot to you whether the debugger is an "integrated graphical debugger" or not, you're wasting a lot of time. Debuggers are indispensable when you're exploring a codebase you don't understand at all. (Although, in that case, the best one is probably IDA Pro, not part of an IDE.) But on code you write yourself, use unit tests and code reviews to minimize the number of bugs you have to debug, and in many cases, logging is a much more efficient way to track down bugs than a debugger, because you can see a whole run at once, instead of a single point in the execution. As an extra-special bonus, logging lets you track down bugs from production that you can't figure out how to reproduce on your own box.
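As a minimal sketch of the logging-over-debugger point (the function and log format here are invented for illustration), one debug line per step gives you the whole run at once, and the same lines work on a production box you can't attach a debugger to:

```python
import logging

logging.basicConfig(level=logging.DEBUG, format="%(levelname)s %(message)s")
log = logging.getLogger(__name__)

def running_average(values):
    """Hypothetical function instrumented with per-step logging."""
    total = 0.0
    for i, v in enumerate(values, start=1):
        total += v
        # One line per iteration: afterwards you can read the entire
        # history of the run, instead of inspecting one breakpoint at a time.
        log.debug("step %d: value=%s total=%s avg=%s", i, v, total, total / i)
    return total / len(values)

result = running_average([2, 4, 6])
print(result)  # prints 4.0
```

Grepping that output for the step where a value first goes wrong is often faster than single-stepping to it.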
If IDE GUI tools are saving you time, any time at all, your GUI library sucks and you should use a better one. Also, you're probably making shitty GUIs, the clicky equivalent of the IRS 1040-EZ, joyless bureaucratic crap, because GUI tools don't help you when you're trying to construct things like kseg or Inkscape or the GIMP. HTML and CSS, or for that matter Tk, allow you to produce those soul-destroying forms with less effort than some crappy drag-and-drop IDE. That's why HTML has replaced Visual Basic in modern use.
As for project-based search, I keep each project in a directory of its own, and then I can use grep -r or ack to search the project.
As for integrated source control, I probably spend about five minutes per hour interacting with my source control system, except when it completely breaks (in which case IDE integration is generally pretty useless).
If your integrated build system is saving you time, then you have way too much complexity in your build scripts, which are a part of your code that don't add any functionality to your product.
In short, the lovely features you're describing are real productivity boosts — but each only exists to compensate for an even bigger productivity killer, and then they only rescue you partway.
Basically I think it's stupid to do stuff manually that could be automated. Compilation is a good example: by using a compiler, I automate endless hours of fiddling with register assignments. But I think that eliminating work is even better than automating it.
I've reported thousands of bugs. Maybe 30-50% of them used to be fixed. Nowadays I'd be pushing it to say 5%.
I've gone to report GNOME or Ubuntu bugs, only to find that the problem already has a bug report from 5 years ago. The bugs are still open and unfixed, with a poor sap adding a "me too" comment every few months.
Before there was a feeling of progress: report a bug, see it fixed. Now it's report a bug and silence.
You can pay someone to implement this, as well as any other missing AD features, then open source it so no-one has to implement it again.
The original devs that did much of the reverse-engineering work on Samba have since moved on to much more rewarding work. That work will essentially never be done again, nor be improved upon until financial incentives are introduced. Much of the work on Samba since has been bug fixes and pushing your food around on your plate.
We are not your servants, we are people. Give value, get value. Open source is not your custom-fit panacea.
>"So build your own version."..."Why are you reinventing the wheel? That's already been done."
Probably two different groups of people; the latter group is likely harried contributors to a project you're asking for help from, who don't have the benefit of the context for the work you're doing. It's a common thing on mailing lists and in IRC.
>So Western Digital's backup software interferes with Outlook in funny ways?
We're dipping into two different pools of shitware for examples of bad software now.
>So everybody's decided to abandon sane software versioning altogether and make the support end of things even more nuts?
It's gotten more standard in some respects, thanks to SemVer. I can't speak for companies that have decided to treat it like a high score, much like the Linux distributions of the late 90s and early 00s.
>Software used to be fun.
Still fun for me, after ~15 years of coding, 5 of them professionally. Sounds like you're just grumpy and unwilling to invest in swatting any of the gnats flying in your face.
That or you should take up buddhism, seriously. I'm an atheist but there's some benefit to learning when you should and should not care about things.
> ...as well as any other missing AD features...
This isn't exactly a feature, it's a core part of AD permissions. Samba 4 was developed for the purpose of taking on server roles in an AD environment.
> We are not your servants, we are people. Give value, get value. Open source is not your custom-fit panacea.
OK, I hear that a lot. And it's a fair criticism. Now, here's the other half of it: support people are not your janitors. Quit expecting us to spend hours digging through arcane documentation, followed by further hours troubleshooting things that you left half-finished, and then turn around and tell us to write it our own damn selves. Because, seriously, there just aren't enough hours in the day. I'd love to contribute more to open source, but first I have to get enough revenue in my business to support that, and before I can do that, I have to figure out how to fix my clients' technical issues without raping their pocketbooks. The harder my job gets, the less likely I am to contribute.
For example, I might be writing a Joomla migration tool right now to fix the stupid 1.5->1.7 issues, and I'd be happy to release it and even support it for as long as people need it, but first I have to figure out what the hell is wrong with the wireless drivers in Linux on the new laptop...
> We're dipping into two different pools of shitware for examples of bad software now.
Yeah, that was the point: examples of bad software cross all disciplines, all companies, all environments. If it was just one company that consistently produced crap software, it would be easy to say that there's probably something broken at that company. But when there are so many companies, and so many freelancers, and so many open source developers producing crap software -- there's probably some issue with software development itself.
> That or you should take up buddhism, seriously.
Eh, I appreciate that, really, but I don't want to stop caring. I want it to be better.
Unless I'm missing something, if software were perfect then there wouldn't be a need for support people?
I'm not under any illusion that it would work that way for everyone. But I can't think of a single support-level person (whether consultant, phone, on-site, or otherwise) who's been doing it for more than a few years and is really happy with it. It can be exceptionally frustrating to be the go-between for users and flaky systems. So, at the very least, it would eliminate a necessary evil that makes a lot of people unhappy.
It might also result in fewer jobs to go around. But I don't think so.
It would result in more wealth and efficiency, not more jobs. Capitalism says nothing about jobs, only wealth.
We don't really (in western society) have a mechanism for transferring wealth beyond trade, labor, and government fiat. In the absence of busywork created by government fiat, we're going to have a hard time, socially speaking, midwifing an increasingly efficient world where redundant jobs get replaced with technology and processes.
This will continue the trend of increasing wealth stratification, as whoever has control over the means of production will be subject to the will of others less and less, and be able to keep more of their profits.
Incidentally, this means it'll be fantastic to be a programmer, and terrible to be a laborer.
If a novel solution isn't found, the best many could hope for is a service job or medieval-style patronage of arts as production and maintenance of product pipelines requires fewer humans.
Just because it generates reports promptly without crashing every time doesn't mean it will know which reports to generate, which data to pull them from, or how to include your new office statistics.
Just because your phone synchronises email when abroad without messing up doesn't mean it knows which email addresses to forward to which people.
Support people would then be primarily involved in using technology to help businesses do things, instead of primarily using technology to work around other technology's problems.
I am amazed once again at some people's ability to take free stuff and complain that it isn't making them money fast enough.
Just take a few seconds to think about the value we all get out of the deep and broad Free Software stacks. Finding bugs and fixing docs is a part of that process. You're not entitled to any of it, but you're welcome to participate and partake of the fruits.
Not necessarily. I think sometimes the existence of something, even if only barely usable, blocks the creation of something else that might be better.
You have to weigh the value of immediacy against permanently enshrined qualities and perform your own cost-benefit analysis.
This is called critical thinking, they introduce this around age 12 in most western societies. You don't have to pick a religion and stick with it, you can just make a judgment call on a case-by-case basis.
Works great for me, fyi.
That complaint applies equally to commercial software.
CEO arrives in a foreign airport, calls us up because his Blackberry now shows email headers instead of email, including email in his inbox which previously had all the content. The carrier tells me everything is setup fine for roaming and he needs to reset his Blackberry by taking the battery and SIM out, and then booting it with no SIM, then putting it back together properly. And if that doesn't work, he needs a new phone. And we couldn't talk him through it because that's his phone, and we couldn't email instructions, and anyway it was evening in an airport and he was in no mood for it.
This is a commercial device, praised for its business email handling, dealing with a well established, decades old protocol, from a large international carrier. Talk about a problem which just shouldn't happen, with a nonsensical solution.
And it's just one anecdotal example of every day workaround-finding. Wobbly software Jenga towers everywhere, and reboot-into-a-known-state is rule number 1.
Corpses write no code. shrugs
> If it was just one company that consistently produced crap software, it would be easy to say that there's probably something broken at that company.
Anything below the top, say, 2-5% of software is guaranteed to be shit because all the programmers below the top 2-5% are shit. There is no grand movement or methodology to be had here.
It's a people problem.
>For example, I might be writing a Joomla migration tool right now to fix the stupid 1.5->1.7 issues, and I'd be happy to release it and even support it for as long as people need it, but first I have to figure out what the hell is wrong with the wireless drivers in Linux on the new laptop...
This is why I use Linux for workstations where it seems more comfortable, and my Mac for my mobile machine as it's more tolerant of network/display disruption. I'm not trying to troll or be a Mac fanboy here, I prefer working in Linux as I am simply more productive and it's my production OS. But, a laptop that is on the move plays to the strengths of OS X sufficiently that I am travelling right now and using my Mac instead of my Linux laptop.
You're going to have to accept that if you use the wrong software for the problem, you're going to keep getting poked in the eye. You're using bad hammers and complaining about how bad hammers are.
There's a distinction to be made, an important one.
>Now, here's the other half of it: support people are not your janitors.
Open source projects don't have "support people", they have contributors and devs that volunteer their time. I worked for a MOTU and volunteered in the #ubuntu channel on FreeNode for years. I've done thousands of man-hours of support. I know exactly how bad software is, and how bad the situation is.
I'm on your side here, but until you start solving the problems one at a time, nothing changes.
My apartment gets cleaned one trash bag at a time.
>The harder my job gets, the less likely I am to contribute.
Something's gotta give. Stuff like Joomla registers as shitware in the circles I go in.
Complaining about things like Joomla, PHP, Drupal, Outlook, etc. doesn't really register with me.
Might as well buy a Kia and complain about how terrible the state of automobiles is. The author had much stronger points than you have. You work with some terrible stuff, period.
Disagree. I don't think 95-98% of programmers are idiots. However, a lot of programmers (even some very smart ones) are terrible architects and have no sense of the big picture. Moreover, a lot of projects start out well but turn to shit through neglect and departure from the original architecture.
95-98% of software architecture is deplorable, but that's not because our industry has 20+ idiots for every decently smart person. It's a lot more subtle than that. A big part of the problem is that most of professional programming is so disconnected and often alienating that a lot of programmers never learn architecture and its importance; the only way to really learn it is to support your own creation and experience first-hand the consequences of your original decisions. A lot of programmers never have that experience.
In sum, I think the general shittiness of the software industry and of architecture has a lot more to do with the fact that most programmers never have the experiences that will make them any good than it does with a lack of talent. It takes 10,000 hours of deliberate practice to become good at something, but most of what most programmers do for work is not "deliberate practice"; it's repetitive drudge work that they often don't have the creative control to automate.
No, it takes that long to master something, it takes much less time to simply become good.
I didn't say that.
> It takes 10,000 hours of deliberate practice to become good at something,
People need to stop rehashing this. 10,000 hours of deliberate practice is likely, but not certain, to make you an excellent programmer. I know plenty of programmers in their 40s and older who have that much time in or more and who are, frankly, garbage.
Investing time is necessary, but insufficient, and there is no one grand unified number that defines all professions for what is necessary to become excellent for every individual.
I know a 60-something whose code output terrifies me and he has a great deal more than 10,000 hours invested in programming.
OTOH, one of the programmers I most deeply respect is a 50-year old woman.
It's hit or miss. In my experience, passion counts more than anything.
>but that's not because our industry has 20+ idiots for every decently smart person.
Work at a major insurance/something-not-directly-related-to-software company. That's an optimistic ratio.
Stop pretending the 10,000 hours thing is a "real thing" or somehow fact.
It's not. Fucking stop it. It's the fantasy of a bad writer who makes up shit based on pure anecdote.
And to use your own bullshit against you, he never said it was deliberate practice in the book, he explicitly used the example of the Beatles, whose "10,000 hours" was them jamming in public and fiddling around privately, not hammering at chord progressions.
The 10,000 hours meme is bullshit. Stop propagating it.
If you take it literally, then yes, it is nonsense; it is not that until the 9,999th hour you are going to be bad at something and then suddenly, boom, magic.
What it means - and what I've found to be very true - is that to get better at something you need to put in time and you need to practice your trade.
Nobody is born a 'great programmer'; sure, there are some differences in talent, but I've seen guys go from bad to mediocre to good to excellent just by plying their trade and learning their lessons. Some of the kids I taught a decade ago who were struggling with basic concepts now run circles around me. That's proof enough to me that there is some truth in the 10,000 hour rule.
You can disagree with the writer all you want but in practice he does seem to have a point.
And for all your aggressive use of language you so far do not seem to have one. If you want to show that something is not true you have to provide counterexamples, not simply jump up and down using foul language telling people to stop.
Sure. And I could spend 10,000 hours playing basketball and I'd never become good at it. I don't have the genes. The point about "10,000 hours" is not that anyone can become good. It's that this is the amount of time that it takes for a person with sufficient talent (which is uncommon but not outstandingly rare, as it might seem) to become great at something.
Also, 10,000 hours of inadequate or badly-structured practice is useless. Otherwise, five years of work would be enough, and for most people, it's not. Most of the things that software developers do for money don't make them better programmers and therefore don't count.
I agree. Passion, creativity, and courage are all important. It takes all three to figure out how to divert 10,000 hours away from what you're "supposed to do" and toward what will actually teach you something.
The point of Outliers, the Gladwell book, is to dismiss the idea that talent exists at all. "A person with sufficient talent", as you say, is a person who started onto decent amounts of practise at a young age, such that by the time the world notices them at age 8, 10 or 14, they're already surprisingly good and can make good use of professional coaching.
It's not a fact, it's a number he pulled out of his ass.
He never made it about deliberate practice, he used the Beatles jamming and fucking around for 10,000 hours as an example.
Stop talking about 10,000 hours IT'S BULLSHIT. You might as well replace-string it with "SIX-SIGMA CERTIFICATION", it means nothing!
He didn't say they were "jamming and fucking around", or anything of the sort. He also didn't say "chord progressions is the only practise that counts", to reply to your other comment.
He also also didn't say "10,000 hours is a fact, 9,999 hours won't do", it's an anecdote fitting rule of thumb to tell a story.
But if you can find plenty of people who are world class at what they do, and haven't done anything close to 10,000 hours of doing it, have at it.
Lots of us did coding as a hobby for 10 years before getting out into the professional world. With 5 years under your belt, you're still a junior programmer and the world's your oyster. Let us know how much fun it is 15-20 years or more from now. (And when you do have to care about things more important than programming: family, health, aging parents, job security, etc., etc.)
For all I complain about the state of technology, it's remarkable that I am using it to communicate with some of the most amazing people in the world. You just can't make this stuff up.
I think that maybe it's time to stop taking the pain and do something about it. The only part I've never been able to figure out is how to earn enough income and maintain the independence required to work on the really big problems. Heck, maybe that IS the problem.
This morning I was going to get up and fix another computer to sell on eBay because I am living month to month since quitting my job in November to make the leap into living the dream. I've called in almost every favor and in all honesty am within a few weeks of applying for jobs. Maybe I should have blogged about the process earlier!
I feel like if I can't hack it, then there must be others out there just like me. I just didn't expect this post to resonate quite like it did. I wonder how many of us feel like that friend we all know whose band hasn't quite made it yet so they couch surf. The waste of talent is staggering, and all around us.
If you are interested in this issue, I set up a google doc here:
I'm totally open to any suggestions and you can also contact me on twitter at @zackarymorris or any of the sites in my profile.
There you go. I'm sure you'll find you're not alone.
Best of luck and stay positive :)
What do you think the problem (or problems) is exactly?
I'll give you my current high-level pet peeve for free:
While programming, it is so incredibly easy to miss something, to not be entirely thorough.
I think a lot of problems originate from the fact that, even for an experienced programmer, it is hard to get a feel for what a function actually does, including all the corner cases. Especially if someone else wrote it, or you wrote it a long time ago. Yes, you can read the program text, and encounter a lot of logical expressions, evaluations, etc. Fitting it all into your head can be difficult and takes quite some time. Code as a text format is quite limited and slow to parse deeply in your brain.
I'm sure it is possible for a tool (some kind of uber-IDE) to help with this, for the compiler to 'explore' what happens inside the function (as that's what a compiler does). But what the compiler can't do is show it to you in a useful way.
Some vague ideas around this:
- test cases: a lot of the work in writing test cases is simply mechanical. You want to evaluate all the code paths, verify assertions and function input/output. There are initiatives like (Microsoft) Pex that attempt this. If you have auto-generated unit test cases that explore the corner cases, you get more of a feel for what a program does and what might still be wrong.
- code verification: computers are incredibly fast, and can understand the code we type instantly. With preconditions/postconditions and assertions, it could be possible to find bugs and overlooked cases in the background while we type, by random/structured "brute force" fuzzing (let the monkey loose) or by modeling analytically. Imagine the hours of debugging saved.
- graphical languages: pictures sometimes say more than words. Would also remove the "mental overhead" of worrying about specific superficial program syntax. Alternative representations of source code. Some kind of (intuitive) graphical visualization of software would be very useful. Both for building programs and looking at programs from a new perspective. This could also include intermediates between fully graphical and fully text, or simply looking at the program in another syntax.
- literate programming: why did this never go anywhere in the mainstream? The perspective/train of thought of the developer would be another input to the person maintaining the code. Code comments are part of the solution, but they're completely separate from the program. To be more useful, there needs to be a deeper way to associate/interleave the train of thought, and even discussions between developers, with the code.
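The "brute force" verification idea can be sketched in a few lines of Python. This is only a toy illustration, assuming a hypothetical `clamp` function whose preconditions and postconditions the fuzzer checks with random inputs:

```python
import random

def clamp(x, lo, hi):
    """Clamp x into the interval [lo, hi]."""
    return max(lo, min(x, hi))

def fuzz_clamp(trials=10_000):
    """Let the monkey loose: random inputs, then check the postconditions."""
    for _ in range(trials):
        lo = random.uniform(-1e6, 1e6)
        hi = random.uniform(lo, 1e6)      # precondition: lo <= hi
        x = random.uniform(-2e6, 2e6)
        y = clamp(x, lo, hi)
        assert lo <= y <= hi              # postcondition: result is in range
        assert y == x or y in (lo, hi)    # result is x itself or a boundary
    return trials

fuzz_clamp()
```

A real tool would run something like this continuously in the background and report the first failing input, rather than making the programmer write each case by hand.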
Anyway, there are probably lots of issues with these ideas, but I do feel that better tools of the trade could help. Things don't have to be completely automated, but the computer could certainly help more in filling in the blanks. It seems that with all the jobs 'we' automate, there is little focus on automating part of the job of a software developer (on the other hand, the state of the art does advance; high-level languages like Python and PHP have already helped a lot in making us more productive. Oh man, imagine if we still had to write everything in C...)
I disagree. Automation is happening every day. Your error is assuming that will lead to "people not having to work." It won't, and the reason for that is social and political, and has nothing to do with CS.
I'd like to believe that a deep rethinking of computer systems in languages that aren't C-based and incorporate the academic OS research done in the last 30 years would produce some fantastic innovation. But that requires something like a Bell Labs willing to allow years of hacking for potentially 0 return.
I think sometimes as developers, we need to cut ourselves a little slack.
Sure we need to continue to move the state of the art forward. But sometimes we do get stuff done even in spite of our industry's imperfections. And some people find software pretty useful even with two different versions of PNG image loading code.
For example, the old tractor had a couple of problems that would crop up when it was cold. Nothing serious, but definitely annoying. The new tractor exhibits the exact same problems in an identical fashion. Thirteen solid years of engineering and the only thing that really changed was some improvements to the user experience.
And that sounds a lot like the software industry – a new interface on top of the same old technology. With that, I imagine you are right that virtually all industries have the same problem.
Across the software industry, we lose an incredible amount of time to the maintenance of bad code. The average professional programmer writes about 250 lines of new code per month. Most large software companies have zombie legacy systems that have ceased to grow but still keep one or more full-time developers busy with nothing but maintenance.
So, yes, there is a problem. The way we are doing things, as an industry, is terrible. On the upside, this means that there's a lot of profit potential in improving engineering practices.
Instead of, say, adding a new feature to a large project, let's lay a new subway line in New York. Better make sure that it works with the current signaling system and can accommodate every train that's on the lines now. Oh, and before you start digging, you better check that you're not cutting across existing power, water, steam and gas lines. Not to mention other tunnels, building foundations, mole people colonies, etc.
And on the rare occasion the ideological programmers are in charge, you can't get them to agree.
Frustrating but that's the way the world works and I'm not sure where complaining gets you.
But I really miss toggling instructions and data directly into memory on the PDP-11's control panel. Yeah, the state of the art is terrible indeed.
I tried talking a client through deploying his stuff to the App Store. It took about half an hour to figure out which weird setting (overridden by his target settings) he had set differently.
Does stuff really need to be this difficult?
it's almost absurd.
1. If x86 hardware is so terrible (and I have heard that the architecture really is bad many times), how come we don't have competing chips out there which are many, many times more efficient? I know ARM outperforms on the low-power front, but not in terms of perf to my knowledge. Do such chips exist? And if not, why not if this is true? Even with x86 backwards compatibility concerns, you could bring out a theoretical amazingly powerful chip and just port compilers to it to leverage it and gain some (possibly niche) market hold that way.
2. I think he is underestimating the vast complexity of computing, and deeply underplaying the techniques which have evolved to get us where we are. Yes, I have again heard that the Von Neumann architecture has shortcomings, but look at what computing has enabled us to do - it has changed the world + enabled amazing things which are evolving at breakneck speed. Again - if there really is an ideal alternative approach, why isn't it being pursued? I don't buy the conspiracy theory crap about vested interests holding it back. Again, you could do well in a niche if the alternatives really are that much better.
3. I think it is likely that as things evolve previous ideas/approaches will be overturned and old approaches thrown away, like any pursuit. As @deweller says, this is true of any field. Ask any professional and they will tell you how their industry is fucked in 100 different ways. It is frustrating to hear software engineers talk about how terrible immature + hackerish + crap it all is while assuming that things have some sort of crystalline beauty in other industries. I did Civil Engineering at (a good) university, and it turns out that most consultancies use linear models to model everything, then apply an arbitrary safety factor, resulting in hopelessly inefficient over-engineered structures all over the place + turning the job into something of an uncreative administrative role in some respects (with no disrespect intended for all the civil engineers out there).
4. It is my theory that any attempt to interface with the real world and actually do useful real things results in an enormous amount of uncertainty and chaos in which you still have to make decisions however imperfect and however negatively others will judge it going forward. I think that's true because the world is chaotic and complicated and imperfect, and to influence it means to be touched by that too. That doesn't excuse poor engineering or not caring or stupid decisions, etc. but it is a factor everywhere I think.
5. So change it :-) there are efforts afoot everywhere to struggle to improve things, and from the sounds of it the guy has perhaps not tried enough things. I feel Go is a great step in the right direction for a systems language, removing a lot of the crap you don't want to think about, as does C# (though it does potentially tie you to MS crap).
6. Programming is difficult, solving real problems is difficult, abstractions are nice but leak, and sometimes it's quite painful to have to put up with all the crap other people hacked together to make things work. But the value is in the end product - I don't care that my beautiful macbook air runs on a faulted architecture and some perhaps imperfect software, the beautiful ease of use is what matters to me. We mustn't lose sight of these things, though again this is not an excuse for crap code. There is definitely a balance to be struck.
7. Zack Morris is clearly a highly accomplished + competent hacker given what he's done and the obvious passion he writes with (to care enough to be disappointed + upset by these things is indicative), which I think has a bearing - the deeper your knowledge, the better your skill, the more faults you notice by dint of your intelligence + competence. There is a definite curse at play here.
Anyway, just my 2p :)
A quick answer to @singular's questions:
1) Existing code - this trumps writing everything from scratch.
2) I don't agree; I believe the computation is straightforward. My belief is that what you perceive as 'progress' is mostly just 'go really really fast.' At the Vintage Computer Festival I showed a Microsoft engineer an RDBMS being installed on VMS while four people were playing games and exploring VMS on four terminals connected to the machine; then I fired up and ran the test code to show the software had installed and was usable. It did not impress him that I didn't reboot the system once, nor did the other four people using the system notice that I had installed a new capability available to everyone using the OS. Those are not design goals of a 'personal' OS like Windows/DOS/NT, although they could be. The stuff you learn in CS classes about architecture and layers and models and invariants can make for very robust systems.
3. My experience is that programmers program. Which is to say that they feel more productive when they produce 10,000 lines of code than when they delete 500 lines of code and re-organize another 500 lines. Unlike more 'physical goods' types of industries it is easier to push that stuff out into production. So more of it ends up in production.
4. Not sure where this was going.
5. This is something I like to believe in too: it's just code, so write new code. Hey, Linus did it, right? The challenge is that it will take 4-5 years for one person to get to the point where they can do 'useful' stuff. That is a lonely time. I wrote the moral equivalent of ChromeOS (although using eCos as the underlying task scheduler and some of the original Java VM as the UI implementation). Fun to write, but not something picked up by anyone. You get tired.
6. I'd take a look at eCos here, one of the cool things about that project was a tool (ecosconfig) which helped keep leaks from developing.
In the 'hard' world (say Civil Engineering) there are liability laws that provide a corrective force. In software it is so easy to just get something put together that kinda works that, unless you are more interested in the writing than the result, you may find you're spending less time on structure and more on results.
2. Sure, I guess what I'm getting at is that we've done amazing things with what we've got, I'm by no means suggesting we shouldn't take a broader view and replace crap, or at least work towards it where market entrenchment makes things difficult. The point is, again, that if there exists such a plausible alternative to the Von Neumann architecture, then why aren't there machines out there taking advantage? Again you could probably fill a niche this way. I suppose, in answer to my own question here, that you would be fighting a losing battle against the rest of the hardware out there being reliant on V-N but still, I'd have assumed that something would exist :)
3. Yeah. But it's hard + often the harder path to do things right in any industry. Such is life, not that that excuses anything.
4. A sort of philosophical point. Feel free to ignore :-).
5. There is stuff out there that already exists too though. Go, OCaml, Haskell, F# are all really interesting languages which in their own ways tackle a lot of the accidental complexity problems out there. Plan 9 + inferno are very interesting OSes, though they are probably a little too niche to be all that useful in the real world. But yeah, understandable, fighting the tide is difficult.
6. Cool will take a look.
Yeah - one of the things that attracts me to software engineering is the relative freedom you get to be fully creative in solving a problem. However that cuts both ways it seems.
Suppose I invented a new chip that was awesome for gaming, spreadsheets, word processing, databases and power consumption.
Who would build PCs with it?
What OS would it run if someone built it?
Who would buy that?
It's not merely a "huge, massively important thing". It's the only thing.
No doubt they made some mistakes, but it wasn't for lack of trying.
(And having debugged code on an ia64 I'm quite happy with the status quo!)
As we move into an era of 'I don't see why I should get a new machine' growth minimization, there is a window for folks like ARM to get in with 'all-day computing.' But it will take someone extraordinary to make that happen. Look at the state of Linux on ARM to understand the power of a legacy architecture.
That is, until you do them in parallel...
You're absolutely right: if it could be done much better, there'd be an example in the market to prove it. Yet Intel is walking all over the market, and has been for the last 8 years or so.
X86 is amazing in spite of legacy and poor architectural choices.
Would you also say that there probably couldn't exist a significantly better OS than Mac, Windows, or Linux, or else we'd know about it? I admit "better" can be hard to define; what would make a 10x better car, or IP protocol, for example? It strains the imagination, because what makes these things "good" is complicated. But ask anyone who was around during the early proliferation of computer architectures and operating systems, and they will tell you that what ended up winning out is markedly inferior to other things that could have been. Paths get chosen, decisions get made, billion-dollar fabs get built. The market doesn't pick the best technology.
It's like web standards -- accept them, but only defend them to a point. We may be stuck with CSS, but that doesn't mean it's any good or that it succeeded on its merits.
As to the rest of it, I think you can look at microwaves for a perfect example of terrible software in widespread use. You only need to select a cook time, maybe a power level, or set the clock. Yet most microwaves have such a terrible interface that plenty of guests end up embarrassed, having to ask how to get an unfamiliar one to work. And as long as it takes more effort to send a microwave back than to figure out its strange design, there is little reason to build a better one.
the core dilemma of computer science is this: conceptually simple systems are built on staggeringly complex abstractions; conceptually complex systems are built on simple abstractions. which is to say the more work your system does for you, the harder it was to build.
there are no stacks which are pure from head to toe. I guarantee you, old LISP Machine developers from Symbolics probably had a hard time designing their stuff as well.
And I wouldn't say PPC lost. IBM has competitive CPUs in the market; they're just not in consumer devices. But they're just that: "competitive". They aren't much better (actually, pretty much nothing is better than Sandy Bridge right now).
To me TFA is about a reassessment of fundamental assumptions, and it's about exploration. It doesn't suggest concrete solutions because nobody knows what they are, but it does suggest that our efforts to better the art have been short-sighted. Right now the next Intel chip or ARM chip is just another target for compilation, just another language or library to fight with instead of solving real problems: old problems, not just the latest new/interesting/imagined ones.
(FWIW, this particular example doesn't excite me too much - If the future is DWIM, it almost certainly has to be done first in software, even if it is eventually supported by specialised hardware.)
Also, x86 isn't as bad as it is made out to be either, even if most of that is just how well our compilers can optimise x86 code.
What about IBM's Cell (the one used in the PlayStation 3)? Or GPU-related technologies, like NVIDIA CUDA?
No Silver Bullet: Essence and Accidents of Software Engineering
for a better idea why the state of art is actually much less terrible than it appears to idealists.
"I believe the hard part of building software to be the specification, design, and testing of this conceptual construct, not the labor of representing it and testing the fidelity of the representation. We still make syntax errors, to be sure; but they are fuzz compared with the conceptual errors in most systems.
If this is true, building software will always be hard. There is inherently no silver bullet."
E.g. think about why Google doesn't just buy Oracle servers instead of developing all their software infrastructure -- they wouldn't be able to provide the service they do, and they wouldn't earn anything. Whenever you decide that you "don't care", you didn't really remove complexity; you just decided to accept the way the system behaves by default, with all the limitations that come from that.
Idealists think that a lot of the details can be "abstracted away" just because they'd rather not think about them, but they are still there and they make the difference.
"The good tools like functional programming and provably correct algorithms are either too esoteric or too expensive for the mainstream"
I was really with him there.
Then when he said:
"That’s a reason why one of my favorite languages is php. It just works. Screw up your variable typing? Who cares. Forget a symbol somewhere? Heck you still get some output and can see your error."
Is the contradiction there not obvious?
Then to act like the entire software stack is broken is silly to me. Sure, there are inefficiencies. In any large, complicated system you can expect there to be room for improvement, and any time you introduce layers of abstraction, inefficiency is bound to happen. But guess what? Not everybody has time to write code in assembly. The fact of the matter is, the software ecosystem has become so complex that people are forced to specialize in more and more specific subsets of it.
And all this jazz about "Oh boo hoo... software hasn't solved world hunger and doesn't wipe my ass for me when I use the bathroom". Geez man. Calm down. These problems you're talking about, like figuring out AI, are incredibly hard. It's going to take a long, long time for them to be tackled fully. Software and computers have only been around for the blink of an eye in human existence.
"if your grandparents can’t use it out of the box to do something real... then it fails"
The problem is not the difficulty of software, or its aesthetic decline. The problem is that the most important things for human happiness -- such as autonomy, integrity, feeling of connection to a world larger than us and love for other human beings -- are mostly ignored or eroded by technology rather than improved by it. This is not something new to computer technology, but it does seem to be focused and hastened by it.
When you're a kid, it's easy to be enthralled by the wonder of the machine. I certainly was. As an adult, you don't have that anymore. You need to feel like you're working for something worthwhile. All of that complexity would be worth managing if we understood it as part of a struggle for something of magnitude. The feeling that it's all crap comes, as much as anything, from this lack of ends.
The momentum means we can never stop to make sure the software is as carefully written as it needs to be, and we can never stop to think of the big picture (all the other software we need to interoperate with).
And because of capitalism's inherent competitive nature, there is no concerted movement towards a unified goal in computing; competition leads to fragmentation, not unification. So we have a proliferation of operating systems, redundant software reinventing wheels, etc.
Open source is interesting in that regard. It's sort of the solution to a lot of the mess -- in theory, by making the source public, nobody should ever need to solve a problem more than once, and a specific piece of problem-solving code should evolve over time into the perfect solution -- but it's screwed by the competitive nature of people. (People don't just compete among themselves with their egos. They also compete against the status quo; what I like to call the "not invented by me" syndrome which drives people to create new stuff even though what they really ought to have done is to improve the old stuff, thus expanding the pile of legacy software even further.)
Why do we have both Python and Ruby? They are incredibly similar to each other. They are so similar it's silly. Sure, one's got whitespace-sensitive syntax, the other has runtime-extensible OO. But those are superficial differences, and nobody can objectively say that Python is better than Ruby, or vice versa. Do we really need both? And yet Matz and Guido are never going to join forces to work together on a common goal to create a single, superior technology.
Open source ought to work less like capitalism. People need to work together into creating the best, safest, most robust software imaginable.
This is true of any engineering discipline throughout history (read Henry Petroski for many examples). One example was the technology of building iron railroad bridges in the UK in the 1800s. As the railway network grew, bridges were needed, and the UK did not have enough forests left, so bridges were constructed from cast iron. However, at the time the science of metal fatigue was not there, so many of these bridges failed, killing people in the process.
Clearly the state of the art of iron bridge building was in a horrible state, yet bridges were built because people needed them.
Eventually the science caught up, and bridges don't fail as much anymore.
Having competition made them both stronger, no doubt.
People tend to find their local maximum. Competition forces projects out of local maximums.
There's much to enjoy, agree with, and debate in this essay. But this is the most important line.
I mean, I worked with Logo in 6th grade. So I'm really not sure what else you would teach that is different.
The state of software is sad.
I liked that one "Heck, my Mac Plus in 1987 with HyperCard was more approachable than anything today." which is very true, too.
In 1987 _lots_ of people had never sat down before a computer in their lives. I could spend an hour with someone and have them doing useful things with Hypercard. It was extremely approachable. And I don't just mean doing word processing or drawing pictures, though they were doing that too. They were creating software that solved real problems from their real life.
I started out by giving out floppy disks for them to save their "stacks" on, paid for out of my own pocket. We finally had to start selling them at the circulation desk.
The biggest conceptual problems, then as now, were in teaching people how to deal with files in a file system, on multiple volumes/disks, and in making people think in terms of versions of documents (actually applications in this case) and in backing them up.
Data loss from exposing the disks to magnetic fields was also a major problem, at least we seem to have conquered that problem...
The tragedy will be if we are so blinded by the past that we do nothing more than project the status quo onto it. If we can get back to our roots and remember what computing is fundamentally about, extending the capability of humanity, I think we can expect to see some amazing things over the next decade.
I still use computers; heck, the whole developed world uses computers now. But that's despite the way software/hardware have developed; a side effect of Moore's Law making things faster and cheaper.
Unix is the big thing in 2011? Really? Nooo.... well shit!
Funny that the OP mentions Hypercard. When it came out a number of very smart people said, "Hey, that's kind of neat!"
And it was. It was little better than a toy at first, but I saw street people sit down with it and _an hour later_ they were using it to solve problems, and were delighted. And let me tell you, in 1987 part of that hour was used to explain how to use a mouse, drag selections, double click, etc.
Natural intelligence is the product of a billion years of crappy evolution, and thinking that the application of some clear, clean mathematical concepts will be able to recreate it is very naive.
I have a nascent dream of a language that could elegantly replace the workflow definitions with pure code that looked like Lisp or JSON. I'd like to work on a program like that.
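One minimal sketch of that dream, in Python rather than Lisp: the workflow itself is plain data (a list of named steps), and a tiny interpreter walks it. All names here are hypothetical examples, not any real workflow engine:

```python
# A workflow as pure data: each step is a (name, function) pair.
workflow = [
    ("parse",    lambda doc: doc.strip().split()),   # text -> words
    ("count",    lambda words: len(words)),           # words -> number
    ("describe", lambda n: f"{n} words"),             # number -> summary
]

def run(workflow, value):
    """Feed each step's output into the next step, like a small pipeline."""
    for name, step in workflow:
        value = step(value)
    return value

result = run(workflow, "  the state of the art  ")  # → "5 words"
```

Because the definition is just a list, it could equally well be serialized as JSON or written as an s-expression; the interpreter stays the same either way.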
I find myself agreeing with the author on many points. The evolutionary progress in computing seems to have stalled a long time ago. The fact that we are still hand-coding loops and such things means that it is hard to move up into a higher level of consciousness, if you will. The paradigm shift we need is one where the programmer is able to think and work in problem space rather than being pushed down into verifying loop counts and semicolons every five minutes.
Back in college I experienced a mind-opening moment when a physics professor insisted that I enroll in a class he was teaching. The class was for a language called APL. I won't go into details here. Look it up if interested. That class and that language changed my view of computing and how computing could work forever. I was taking FORTRAN and C classes at the same time. The contrast between the languages was almost beyond description. While we were mired in do loops and other language-mechanics in FORTRAN and C we were actually solving real problems with APL in very short order...even writing a game or two, database applications and doing some scientific computing. Programmer productivity and the ability to express and solve a problem simply could not be compared. APL felt like it was a century ahead of anything else.
APL lets you focus on the problem space. In a certain way it is like playing the piano, you don't think about frequencies and durations, you think about expression of ideas.
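For readers who never met APL, the contrast is roughly the one below, with Python standing in for both styles (the APL idiom in the comment is the classic expression for the mean; the rest is just an illustrative sketch):

```python
data = [3.0, 5.0, 2.0, 8.0]

# FORTRAN/C style: loop mechanics in the foreground.
total = 0.0
for i in range(len(data)):
    total = total + data[i]
mean_loop = total / len(data)

# Expression style, closer in spirit to APL's  (+/x) ÷ ⍴x :
mean_expr = sum(data) / len(data)
```

Both compute the same number, but only the second reads as a statement about the problem ("the sum divided by the count") rather than about loop counters and indices.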
As people focused on languages like C++ (which were easier to grasp and use with the equipment available in those days), APL never became mainstream and, to some degree, did not evolve into what it could have become. Ironically, the machines we all have on our desks today provide an incredible environment for a language like APL in terms of available resources and speed.
I am not saying that APL is the end-all. What I am saying is that my path through this craft was altered in a non-trivial manner by being exposed to a very different set of ideas. I find myself longing for feeling that way about the tools I have to use today, particularly when hitting the pavement with languages such as Objective-C and VHDL, which, despite their many supporters are far, very, very far, from providing the kind of evolution and progress we so desperately need in computer science.
So until we hit a major technical wall, why bother? Why optimize prematurely until we're bumping up against atoms?
For the longest time I've felt guilty of this, until I started seeing my coworkers committing code that I would later have to correct. Our company is such a mess right now.
Life isn't supposed to be beautiful or elegant or simple or perfect.
If an entrepreneur can make money while improving the parts of life (no matter how small) that others benefit from, what can possibly be the issue?
It is exactly the reason why I say that Computer Science is still in the Stone Age.
"I can't get things to work! It's the fault of The World - it couldn't possibly be me! I've had a game in the App Store!"
As a rule, the worse the programmer, the more convoluted his solutions to problems. Maybe instead of writing huge rants on his blog, he should re-evaluate the way he does things so the world doesn't seem so horrible. Or get a new job.
Well put, Sir.
code something simple (= terse) that follows common sense and just works, and it's an island of consistent speed and reliability in a sea of crap and bloatware.
we need more djb-like coders, who do not mimic thousands of other coding monkeys.
McIlroy said the hero is he who writes negative code. he's right. the world needs less code, not more.
if you can't handle that, then you're just contributing to keeping the "state of the art" terrible. but then people just keep paying for this crap so that's why it won't disappear. the idiots are rewarded for their "productivity" in producing saleable crapware.
In my opinion, the industry as a whole has gotten obsessed with new hotness over properly finishing what was started. I don't know whether that's always been true.