Hacker News
Software Engineering: Dead? (codinghorror.com)
66 points by jp_sc on July 19, 2009 | hide | past | favorite | 32 comments



"Software Engineering is Dead" is obviously an overly sensational title, but let's look for the deeper truths.

First, a little background. I have built a career developing business applications as a contractor, often in very large organizations. I am almost always very successful, and I'm often regarded as a hero for doing what most people here would call "just doing my job".

Why is it so easy to be a hero in the enterprise with software that would seem just ordinary in a forum like HN? Is it because enterprise programmers aren't as good as us? Is it because their managers are all PHBs? Or because they don't know how to run projects?

I'd say no to all of the above. Enterprises are filled with lots of excellent people doing great work.

I think that the real problem is that the Systems Development Life Cycle (SDLC) that so many depend on so much never really did work. Why?

Every phase depends upon Phase I, Analysis, being rigorously done. This rarely happens, for two reasons: users often don't know what they want, and most systems analysts don't know how to extract it even if they did.

So almost everything done after Phase I is built upon a foundation of sand. It's either wrong or sinking fast. And what do most people do? Everything except fixing the problem: more resources, more project management, freezing specs (which aren't right in the first place), more rigorous deadlines, etc.

But rarely does anyone attack the core problem with the SDLC: defining the expected result.

So what should we really do? Develop something, anything, quickly, cheaply, and get it out to the right users. They will instantly give you feedback. What's right, what's wrong, what's stupid, all the cool stuff that no one thought of.

No one can just sit down and write a Functional Specification for a large business application. And even if they could, you don't want them spending time on it. Better to get the right people together and find out what they need. Usually, none of them individually knows what the result should be, but all together, any decent developer should be able to extract enough data to write version 1.0 of something.

It's a lot easier to judge something that exists than define something that doesn't.

The larger the organization, the more difficult it is to change its ways.

Software engineering isn't dead. It's just that the process of depending upon blueprints before you get started never worked in the first place.


Quoting Paul Graham:

"...'Systematic' is the last word I'd use to describe the way good programmers write software. Code is not something they assemble painstakingly after careful planning, like the pyramids. It's something they plunge into, working fast and constantly changing their minds, like a charcoal sketch.

In software, paradoxical as it sounds, good craftsmanship means working fast. If you work slowly and meticulously, you merely end up with a very fine implementation of your initial, mistaken idea. Working slowly and meticulously is premature optimization. Better to get a prototype done fast, and see what new ideas it gives you. ..."

from http://www.paulgraham.com/usa.html


That might often be true. But I've worked for A-level software firms in the Bay Area and B-level corporate shops all over. The level of programming talent is unquestionably better in the A companies. And there's lots less friction when trying to get things done.


Well stated.

Or to put it another way: software engineering is not dead, it just happens much faster than it is commonly practiced.


"Develop something, anything, quickly, cheaply, and get it out to the right users. They will instantly give you feedback.... It's just that the process of depending upon blueprints before you get started never worked in the first place."

Funny, you make it sound like software engineering is dead.


It sounds to me like a tightly prescribed strategy of gathering relevant user information. Sounds like short iterations in market research.

It doesn't sound like it makes any mention of software engineering.


All the other kinds of capital-E Engineers sit around making blueprints all day. Unless your definition of "software engineering" doesn't rely on the noun "engineering", the fact that making blueprints "never worked" means that Software Engineering is Dead.


"All the other kinds of capital-E Engineers sit around making blueprints all day."

[citation needed]

This debate comes up periodically, and in some forums we are lucky enough to have engineers from other disciplines pop up, who comment to the effect that they do rather more experimentation and software-like development than computer programmers often realize. The "big" projects that we associate with blueprints are rather small by percentage of projects done, and even they require a certain amount of flexibility to handle changes discovered during the course of the build ("oh, crap, this bedrock is actually sand!").

Mind you, the previous paragraph is also [citation needed], but my real point is and remains to call for actual experience and knowledge and not just regurgitations of stereotypes.

(I know a number of engineers in my circle of family and friends, and none of them actually works in anything like the "Waterfall" model, purely from blueprints. The closest is the one who is in charge of designing racecars, and they still do experimental designs in computers; before computers, they did experimental builds of cars.)


I think engineering in other fields works with blueprints because there are so many clear-cut expectations/requirements. No matter how fancy the design, the overall purpose of a building is to shield one area (the "inside") from another (the "outside"). Doors and windows open and close, and optionally lock. Heating and cooling must be scaled to the size of the area covered, etc.

Designing a building is much more akin to configuring a server (which anyone here is perfectly comfortable doing by blueprint) than developing an application.

The level of detail usually required in a software-blueprint would amount to including the placement of pencils on every desk in the building-blueprint.


Chemical engineers? I mean, if you define Engineering == blueprints, that's fine, but it's an awfully narrow definition.


Waterfall development, i.e. "depending upon blueprints before you get started" was an anti-pattern when it was first described, and it's finally dying now. You're describing iterative development.

Everyone's getting onto Scrum and agile now, even the sceptics, who can see that it has the benefits you describe.


I came to the same conclusion in my first job out of college, working in IS/IT at a large automotive firm.

Now, I'm not so sure. I think there is a large craft component in the implementation of software, but there is also an engineering component in the selection of algorithms and data structures, as well as in testing and benchmarking. I think what I'm saying is that the construction of software is a craft, but the design of software (should) still have a large engineering component.


Software is Arts and Crafts.

http://www-cs-faculty.stanford.edu/~knuth/taocp.html

(Note that he titled it "Art" and not "Science" or "mechanism".)

We're also beginning to see the formation of organizations similar to Guilds in software.

Soon we will have our own guild of Cathedral-building Templars using esoteric knowledge :)

No sign of a Model T on the horizon. (Remember that in a software assembly line, what rolls down is not a 'copy' but another 'program'.)


I'm familiar with TAOCP. :-)

I think it depends on what part of the software development process you're looking at. Designing new algorithms and data structures... probably art (in the way that developing mathematical proofs is artful). Choosing from a set of well-understood algorithms and data structures... probably engineering. Coding up said choice into something that users and other programmers will use (in the case of libraries)... I'd put it in the craft category.


I have a feeling that, at some point in the future, all the "engineering" could actually be done by the compiler. You tell it "I want to store X, Y, and Z"; it selects the best data structure for you, and tailors the algorithms to be optimized for just the kind of data you'll be working with. The designers would still exist and be responsible for feeding the compiler new data structures to select from, but most "programmers" would simply become business analysts, feeding the domain knowledge required to the compiler and watching as it evolves a program that best handles those conditions. The "craft" aspect would reduce to defining Interface contracts for your libraries that the compiler would have to adhere to when munging its algorithms (though it could always just tack the "right stuff" transforms on top of its quirky works-by-coincidence internal versions.)
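To make that speculation concrete, here is a hypothetical Python sketch of the idea: a container that monitors its own usage and switches its internal representation, standing in for the "compiler that selects the best data structure". Every name here is invented for illustration; this is a toy, not a real compiler feature.

```python
class AdaptiveStore:
    """Hypothetical self-tuning container: starts as a list (cheap appends),
    switches to a set once membership tests dominate the workload."""

    def __init__(self):
        self._items = []   # start with the cheapest representation
        self._lookups = 0
        self._appends = 0

    def add(self, item):
        self._appends += 1
        if isinstance(self._items, set):
            self._items.add(item)
        else:
            self._items.append(item)

    def __contains__(self, item):
        self._lookups += 1
        # The "compiler" decision, made at runtime: once lookups clearly
        # dominate appends, pay once to convert to a hash-based structure.
        if isinstance(self._items, list) and self._lookups > 2 * self._appends:
            self._items = set(self._items)
        return item in self._items


store = AdaptiveStore()
for n in range(1000):
    store.add(n)

# A lookup-heavy phase triggers the switch from list to set.
hits = sum(1 for n in range(3000) if n in store)
print(hits, type(store._items).__name__)  # 1000 set
```

The programmer only said "store these things and let me test membership"; the choice of representation happened behind the interface, which is the division of labor the comment above imagines.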


Or, things might go a step further, with everything running on virtual machines that monitor the behavior of the program (and the structure of incoming data) to tailor things appropriately.


I'd prefer the software on the plane I'm flying on to be engineered. Arts and crafts are fine for your new facebook app, but please engineer the stuff that keeps me alive.


Oh, rest assured it is. Software engineering is alive and well in fields that demand up-front reliability -- avionics, medical equipment, reactor control systems, and so on.

In university, I did a co-op job with a civil avionics shop, where I learned all about DO-178B, the standard by which avionics software is certified. DO-178B Level A-certified software is top-quality and highly reliable. I feel safe flying on planes after working in that industry. It's a rigorous process, and software that emerges from it basically never fails.

But it's an expensive and slow process to develop software this way, and it's prone to failure for any number of reasons. Requirements and technology change, yet must be known up-front. Capital can run dry.

It's only worthwhile developing software this way when absolutely necessary, when half-working software is dangerous and could kill someone.


Exactly. I think Jeff is forgetting, or ignoring, that there are different realms of software development. I think too often he generalizes the 'web app' experience to cover the whole industry. I know from my work as an embedded developer that the fast iteration and feedback loop doesn't work too well with physical products that have no easy upgrade path.

That being said, I do think he has a good point with desktop and webapps where you can quickly get and act on feedback from the customers.


From the article: "What DeMarco seems to be saying -- and, at least, what I am definitely saying -- is that control is ultimately illusory on software development projects. If you want to move your project forward, the only reliable way to do that is to cultivate a deep sense of software craftsmanship and professionalism around it."

He isn't talking about fast iterations. He is talking about taking responsibility for what you are delivering, instead of using procedures as a crutch. I know I would rather fly in a plane that had its software written by developers who cared about their work than by "engineers" who don't think about their work because they expect the code review and a couple of QA cycles to catch all the bugs.


Good point. There needs to be a balance of both, I just think Jeff leans too far toward the 'soft' side of things sometimes.


I agree completely; however, this is where Engineering should start and stop. If the heartbeat of the software isn't tied inextricably to the heartbeat of real people, don't call the people that work on it Engineers, and don't expect them to do any Engineering. On the other hand, start giving the people that write the plane firmware those neat iron rings[1].

[1] http://en.wikipedia.org/wiki/Iron_Ring (It's a Canadian thing.)



[1] http://en.wikipedia.org/wiki/Iron_Ring (It's a Canadian thing.)

That's one of the coolest things I've heard in quite some time. Perhaps software engineers in the US should wear some visible reminder of "Microsoft Bob"?


The lack of mathematical rigour is what kills it. The software industry is the one where we do not learn from past mistakes or successes. We rarely measure important things, and when we do benchmark, it's to focus on stupid things like the overall performance of specific language implementations or how much RAM Firefox is using, instead of figuring out how to eliminate whole classes of problems.

We can be artists and creative people just as mathematicians and physicists are.

I'm reading about John Nash and holy shit, Princeton gathered some of the smartest people together and told them not to worry about grades, just worry about research. Nash himself would just wander around the halls and just think. The papers written may seem formal but the way the ideas were generated was highly informal, with random meetings and discussions in hallways and common rooms and walking about lost in thought.

What struck me as most interesting and applicable to software development is that Von Neumann and the others at RAND were applying somewhat pure/abstract mathematics to specific problems in other fields. So why don't we take a page from them and apply some more rigour to measurement? And why don't we learn some management skills and develop useful metrics for software development, so that we can stop letting the marketing people and the snake-oil consultants monopolize our field and turn it into a wasteland of overly expensive unfinished projects?


What would those metrics be, though? Pretty much every metric anyone has ever come up with to "measure" software development has been worthless, and it doesn't look like that trend is going to be broken soon. It's like trying to apply metrics to writing a novel. After a certain point, if something doesn't work you have to accept that maybe it's just a bad idea, and I think we're well past that point with software development and metrics.

About the only useful metrics I know of are all end-state sorts of things: bug counts and regression rates, number of satisfied users versus dissatisfied users, number of successful implementation projects versus failed implementations, and whether you can generate real revenue or business value. As soon as you start trying to look at things like number of lines of code, "function points" implemented, coupling metrics, test coverage metrics, etc., it tends to start distorting your development efforts, and the harder you push on formalisms and hard metrics, the worse it gets. Pretty much any metric in software is really a proxy for some unmeasurable underlying thing like "quality" or "rate of progress," and those proxies are always easily gamed. It's like standardized tests and education: the more you push on "accountability" and standardized test scores, the more teachers teach strictly to the test, and the less the kids actually learn. Similarly, one of the best ways to destroy a software project is to guide the development based on metrics.
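As an aside, those end-state metrics are at least cheap to compute once a release is out the door. A minimal Python sketch, with all field names invented for illustration:

```python
def release_metrics(bugs, users):
    """Compute end-state quality metrics for one release.

    bugs:  list of dicts like {"regression": bool}
    users: list of dicts like {"satisfied": bool}
    """
    total_bugs = len(bugs)
    regressions = sum(1 for b in bugs if b["regression"])
    satisfied = sum(1 for u in users if u["satisfied"])
    return {
        "bug_count": total_bugs,
        # Share of reported bugs that broke previously working behavior.
        "regression_rate": regressions / total_bugs if total_bugs else 0.0,
        # Share of surveyed users who report being satisfied.
        "satisfaction": satisfied / len(users) if users else 0.0,
    }


m = release_metrics(
    bugs=[{"regression": True}, {"regression": False}, {"regression": False}],
    users=[{"satisfied": True}, {"satisfied": True}, {"satisfied": False}],
)
print(m["bug_count"], round(m["regression_rate"], 2))  # 3 0.33
```

The point of the surrounding comment still stands: these numbers describe outcomes after the fact, and the trouble starts when similar-looking numbers are used to steer the work itself.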

I think you get much better results if you approach software as a design process, which is inherently unpredictable and non-repeatable, instead of as a manufacturing process where predictability and repeatability are the desired goals. As much as the word "agile" has come to be almost meaningless these days, that's really the original insight behind it: on a fundamental level, software development is an unpredictable, non-repeatable design process, and so the best you can do is to expect that to happen and build in ways to deal with that, rather than instead trying to exert more control.


It's like trying to apply metrics to writing a novel.

Oh, that's not hard to do.

The only one that matters in the marketplace for software is the same one that matters to most novel publishers - does it make money. That's a simple metric to measure and it's one that corporations instinctively understand, both in software and in novel publishing. Monetarily successful software publishers, just like novel publishers, know very well what the quality sensitivity of their customer is, and produce accordingly.

The metric (and the audience's quality sensitivity) is obviously quite different when you are talking about a niche product, be it aircraft control software or a mathematics textbook.


Much of programming is working with complexity, not engineering.

If you know from beginning to end how to do something, then you can engineer. You can form processes to govern the engineering, you can compute some things to be just right, you can make some nice tradeoffs and eventually you're finished. You could build a bridge like that. You could build a GUI like that. Or the 10th website of similar functionality. Or the 100th half-assed C-library replacement because your project can't or doesn't want to depend on the standard C lib on all platforms.

Most importantly, much of what could be engineered often gets automated. Programming always happens on the edge of unmanageable complexity, and anything less than that gets quickly automated. That leaves very little to be engineered. Basically, you engineer something only if you insist on not being lazy and doing it by hand.

Throw in a "minor" scalability requirement, or a guaranteed response latency of less than X milliseconds, or some "easy" dynamic "addition", or some "basic" interoperability mechanism with another program, and suddenly your engineerable bridge must be designed to support a million-tonne UFO the size of Manhattan that keeps glowing at temperatures of several thousand kelvin.

To conclude: software engineering does happen, but only after the software has been written enough times that we actually have experience in doing it, and often only for a short while, until the engineering part can be automated and programmers are released to work on something more complex.

Software is so tricky that you don't want to redo it too many times unless you really, really have to.


I like the original article by DeMarco which Atwood discusses. DeMarco makes a good point that normal engineering methodology doesn't work with software. Anyone who has tried to code up something based upon an 80-page spec document written by a junior analyst who knew neither the business nor how to code might agree. Writing code is more like writing an engineering plan than it is trying to follow one. DeMarco writes about the dangers of using the wrong metrics but he doesn't offer much of a counter-proposal - something like "fail early and fail often?"


Dead??! Is it born yet?

I think we may just be witnessing a very long and difficult conception.


I've seen a lot of teams, and a lot of projects either fail or succeed.

It boils down to three things: people are the most important, over-generalizations about the development process are anathema, and always adapt.

The over-generalization part is especially rampant. Bloggers or authors want to make some sweeping statement about what software engineering is. Team members that worked on one successful project want to repeat that exact experience on another project. Project Managers who measured one thing on a good project become convinced that measuring that thing was part of what made the project successful.

Yes, there is engineering at work in software development. We load test, we plan scalability, we use set theory to model data and relationships. Debugging is about hypothesis formation and testing. There's all kinds of hard empirical science going on.

At the same time, the end result, the goal, is a nebulous thing that appears different ways in everybody's head. Large parts of this process give us the freedom to creatively construct things.

People who forget the engineering part fail projects due to bad design. People who forget the creative part fail projects due to poor user acceptance.

But highly-trained and motivated adaptive teams do just fine no matter what the environment.

The ultimate truth is: there is no ultimate truth.


For myself, I agree, but I was never big on engineering approaches. I'm very suspicious of agile methods, too. The way I see it: in mathematics, for example, the first proof of a difficult theorem is often very long and complex. Over time more mathematicians might look at the proof and find ways to make it simpler. Eventually you might have a 5-line version of a proof that used to be 30 pages long.

I don't see how any agile method or engineering approach could help much in getting the 5-line version from the beginning. There always seems to be a kind of "crunch" step involved in solving the problem. Once the problem is solved, the solution can be cleaned up. But when first solving the problem, you explore maybe hundreds of possible paths in your mind, and combine lots and lots of aspects. You are stuck in the jungle and try to find a path to the summit. Once you are on the summit, you can look down the mountain and see the easier routes you missed on your way up, because in the jungle you can't see very far, and you have to fight for every meter of progress.

I find the same applies to my own code - which is always crappy, anyway. But refactoring usually goes quite fast once the first working version has come into existence.

I'd expect having to focus on getting some agile method right would just distract me from solving the problem.



