The non-negotiable part is that there should be tests, regardless of when they were written. Those tests will be the record of truth for future development. They are proof that the proper specification was implemented, regardless of methodology.
I really disagree with this. Aside from the fact that tests can't prove the absence of bugs, it's very easy to write a test suite that avoids the unhandled edge cases; most people do so without even thinking about it. Your test suite is the tool you have built to search for bugs, and it should be written with that adversarial mindset rather than as a certification that you did your job.
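To make the mindset difference concrete, here's a toy sketch (my example, not from the article) in Python using the hypothesis library; the function and test names are made up for illustration:

    # The same function tested two ways. The "certification" style only exercises
    # the inputs the author already had in mind; the adversarial, property-based
    # style (via hypothesis) goes hunting for the inputs the author forgot.
    import unittest
    from hypothesis import given, strategies as st

    def middle_element(items):
        """Return the middle element of a non-empty list (naive implementation)."""
        return items[len(items) // 2]

    class CertificationStyle(unittest.TestCase):
        def test_happy_path(self):
            # "Proves" the author did the job, for one convenient input.
            self.assertEqual(middle_element([1, 2, 3]), 2)

    class AdversarialStyle(unittest.TestCase):
        @given(st.lists(st.integers()))
        def test_middle_is_a_member(self, items):
            # Generated inputs quickly hit the empty-list edge case (IndexError)
            # that the happy-path suite quietly avoids.
            self.assertIn(middle_element(items), items)

    if __name__ == "__main__":
        unittest.main()

The second test fails, and that's the point: it was written to find the bug, not to confirm the happy path.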
Deploying 100% bug-free code is the Holy Grail of software development.
Is this really true? Judging by their decisions, most people's Holy Grail is more like something reasonably good and shipped on time. 100% bug-free code is already possible, but few are willing to pay for it.
Is anyone willing to pay for it? Is there any non-trivial code that's been deployed that has been bug free?
I mean, supposedly NASA paid $1,000 per line of code and still got 0.11 defects per 1,000 lines. By that metric, Windows (call it 40 million lines of code) would cost $40 billion to develop and would still ship with about 4,400 defects.
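Back-of-envelope version of those numbers (the 40-million-line figure for Windows is my assumption; it's what makes the quoted figures line up):

    cost_per_line = 1_000          # dollars per line, the quoted NASA figure
    defects_per_kloc = 0.11        # defects per 1,000 lines, also quoted above
    windows_loc = 40_000_000       # assumed size of the Windows code base

    cost = cost_per_line * windows_loc                  # 40,000,000,000 dollars
    defects = defects_per_kloc * windows_loc / 1_000    # 4,400 defects

    print(f"cost: ${cost:,}  remaining defects: {defects:,.0f}")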
Yes, as DanWaterworth has pointed out, people have written large projects with entirely bug-free code (in the sense that the code verifiably implements a formal specification). The way these projects are developed is very different from the process described in this article -- a lot of time is spent writing formally verifiable proofs of correctness.
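For a taste of what "proof of correctness" means here, a toy Lean sketch (mine, and nothing like the scale of a real verified project): instead of testing a function, you state a property and the checker accepts the code only if the proof goes through.

    -- A trivial function and a machine-checked proof that it meets a tiny "spec".
    def double (n : Nat) : Nat := n + n

    -- Spec: doubling distributes over addition.
    theorem double_add (a b : Nat) : double (a + b) = double a + double b := by
      simp only [double]
      omega

Real projects do this for full functional specifications of entire systems, which is where all that extra time goes.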
I'm definitely not advocating throwing out agile and embracing waterfall. I like the ability to iterate quickly. I just think we seem to be taking "move fast and break things" a little too seriously in terms of breaking all the things. Everything in moderation.
Sure, what I meant is that, in my experience, most companies will choose cheap-fast over good-fast.
Flinging crap out fast is sometimes (unfortunately) seen as a valid business model: the crap is only expensive if you have to maintain it, and that's not your problem if all you're trying to do is "exit".
On the other end of the spectrum (like where I work now, for example), we put in a lot of time and effort to make sure our code is not crap - but that is an expensive decision in terms of both money and time.
This. I think there's a big space between cheap-fast (crap) and good-expensive (or just "good"). In my experience, we seem to be way more toward the crap side of things. I think we can make a correction without going too far the other way. You can make sacrifices for speed that don't always mean crap in the end. I'm willing to sacrifice perfect architecture in order to get something out the door, for instance.