Hacker News

Yes I did mean that. We used noweb.

Let me probe my memory...

Requirements were developed first by the customer and our requirements specialists, working together. That was a big priority. Requirements were broken down into individual elements and examined for contradictions and ambiguities; each had to be discrete, testable, and so on. This required domain knowledge as well as thoroughness and attention to detail. As one would expect when writing software, it turns out that if you don't know what it's meant to do, or you misunderstand what it's meant to do, the odds of building the right thing are very low. This is something I see very commonly in many software companies; it's very common to start building things without knowing what they're meant to do. Sometimes it's impossible to know at the start what they're meant to do, but I do wonder if we could push a little harder up front about demanding to know what it's meant to do. Agile seems to be a way to control that risk: a way of building software without knowing at the start what it's meant to do, but getting there by frequent course correction.

Once written down and formally agreed by the customer, each little requirement was given a unique identifier. That unique identifier was carried through the design and implementation and tests. The literate programming techniques used, and the format of the design documents and test documents, typically meant that a sidebar on each page showed the exact requirements that a given section was implementing; an index would show all such places, so if I wanted to know how we were going to meet requirement JAD-431, I could look up all such pages and satisfy myself that the design met it, that the implementation met it, that the tests covered it. Reviews would involve someone having a copy of the requirements any given design or implementation was meant to meet, and ensuring that every one of them was indeed met.
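The essence of that index is simple enough that you could sketch it in a few lines today. Here's a rough approximation, assuming plain-text artifacts and requirement IDs of the JAD-431 form (the regex, file layout, and function name are my invention, not the actual tooling, which was noweb-based):

```python
import re
from collections import defaultdict
from pathlib import Path

# Matches requirement IDs of the illustrative JAD-431 form.
REQ_ID = re.compile(r"\b[A-Z]{3}-\d+\b")

def build_index(root):
    """Map each requirement ID to the set of artifact files that cite it."""
    index = defaultdict(set)
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        text = path.read_text(errors="ignore")
        for req in REQ_ID.findall(text):
            index[req].add(str(path))
    return index
```

Given such an index, answering "show me everywhere JAD-431 is addressed" is a single lookup across design, implementation, and test documents, which is exactly what made reviews and change-impact analysis tractable.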

I remember printing out test documents, and executing them (including the build; each test document specified the versions of source code to be used and the tools used to build it, and often the exact build steps), and on each page where I signed and dated it to confirm that I had conducted the test and it had passed, there was a listing of the requirements being tested (or partly tested). That sure gives someone pause; when I signed my own name in ink to give my personal guarantee that the test passed and was all in order, I sure didn't accept any ambiguities or shoddiness. Those signed, dated test documents would get a double-check by QA and if that was the final test, sealed in an envelope with a signature over the seal, and securely stored against the customer performing a random inspection or against anyone ever needing to come back and verify that for a given version, at a given time, the tests did pass.
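From memory, the front matter of such a test document looked roughly like the following. This is a reconstruction purely for flavour; the exact layout, field names, and identifiers are illustrative, not the real format:

```
TEST DOCUMENT (illustrative reconstruction)
  Software under test : source at a named, fixed version/tag
  Build tools         : compiler and linker, exact versions listed
  Build steps         : often spelled out step by step
  Requirements tested : e.g. JAD-431 (repeated in a sidebar per page)

  Step N  Action ......  Expected result ......  Pass/Fail: ____
  Conducted by: ____________  Date: ________  Signature: ________
```

The signature line on each page is the part that, as described above, concentrated the mind.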

If a requirement ever changed (which did happen, and we sure did charge for it!), it was thus easy to identify everything else that would need to change. The effect of the change would ripple through the design and implementation and tests, linked by the affected requirement unique identifiers; each piece thence being examined, updated and confirmed as once again meeting the requirements, with confidence that nothing had been overlooked.




While I've really enjoyed your posts in this thread, your anecdotes do highlight something very important: quality software is expensive. Also, most clients, be they internal to the org or external, don't really know what they want and certainly don't have the wherewithal to supply detailed, testable requirements up front.

Where I think most shops touting "Agile" miss the mark is the customer collaboration part. Extracting the detailed requirements you've described would take a lot of upfront time and effort that I've found most "Agile" shops don't have the stomach for. Where the rubber usually meets the road is in "Grooming" sessions. These sessions are typically rush jobs to move Stories to "Ready for Dev", but in my experience there's rarely adequate information to proceed to development after a typical Grooming session.


> most clients... don't really know what they want and certainly don't have the wherewithal to supply detailed, testable requirements up front.

Very much so. To do this well, one requires something that is in very short supply: high-quality, competent customers.


Nice, it seems like a simple process done rigorously. No highfalutin buzzwords (compared to Agile/Scrum :-), but a systematic approach using commonsense, well-known processes. In essence: a) utmost focus on requirements gathering; b) use literate programming techniques to weave design and implementation, and possibly testing/QA; c) use a "primary key" throughout to trace everything back to requirements.



