
I can't remember if Larman includes this in the paper, but at a talk he said that he tracked down the guy who standardized waterfall for the DoD (in Boston IIRC) and when they met for lunch, the first thing he said to Larman was "I'm so sorry".



Well, at least he regretted the damage he did.


In retrospect it was inevitable. The patterns of thinking that we take for granted around iteration and prototyping were still too alien. You have to get it before you can do it, and the types of programmers who were inclined to get it weren't in decision-making positions.

Another great story that Larman told was that he tracked down some of the programmers on famous waterfall projects that had succeeded, and found out that what they had really done was write code first, secretly, and then write up the requirements and design docs based on what they had learned. In other words they did things in the 'wrong' order but published them in the 'right' order.


"The patterns of thinking that we take for granted around iteration and prototyping were still too alien. You have to get it before you can do it, and the types of programmers who were inclined to get it weren't in decision-making positions."

Maybe I was too harsh on them. It does seem likely. Further, it probably came directly out of the nature of programming on expensive, shared, and often batched machines. Here's a great example of what programming looked like in the 60's (skip to 5:27):

https://vimeo.com/132192250

It wouldn't have been much better a bit after 1970, when many of the same concerns and processes still existed. I still think one has to ignore most of Royce's paper to not pick up on the iteration paradigm. But I could easily see someone in that mentality glossing over it, spotting a diagram of their current flow, giving it a name, and pushing it from there.

Finally read the paper you gave me. It was really neat to see that iterative development kept getting independently invented from the 60's onward. Its arrival in the mainstream was inevitable given its inherent advantages; the majority just couldn't contain it within limited groups forever.

"and found out that what they had really done was write code first, secretly, and then write up the requirements and design docs based on what they had learned. In other words they did things in the 'wrong' order but published them in the 'right' order."

It's funny you say that: history is repeating itself in the high-assurance field. Safety-critical design is like waterfall or spiral on steroids, with specs, constraints on the implementation, rigorous testing, analysis... you name it. To outsiders, it seems they're using a waterfall-like refinement strategy to build these things. Insiders trying to get Agile methods in there have countered with an unusual supporting argument: successful projects already use an iterative process combining top-down and ground-up work that results in a linked, waterfall-like chain of deliverables. The actual work is often done differently, though, so why not allow that kind of development in the first place?

With your comment, I've now heard that twice. Another quick example was Mills' people not being able to run their own code in Cleanroom. Sometimes that wasn't necessary, but it has many benefits. So, of course, they often ran their own code during the prototyping phase to get an idea of how the final submission should be done. We'll all be better off when management lets its processes reflect the realities of development. At least it's already happening in some firms and spreading. :)



