More seriously, the point of the article, from a skim, appears to be that "writing too many unit tests is a waste of time". Well, duh. Save 15 minutes of your life and pass on this one, methinks.
In retrospect I wish I'd used a different title and skipped the first few paragraphs. I'd intended them as a bit of humorous indirection, making it sound like the post was going to be against automated testing when it is really very much in favour of it, but unfortunately people have taken it the wrong way.
People didn't "take it the wrong way." You did not clearly express yourself. If you want to talk about the value of functional and integration tests, then talk about that.
I'd appreciate it if anyone interested in the post would skip the first few paragraphs and start at "Benefits of automated functional tests" since that was what I was intending to convey.
Packaging really matters where blog posts are concerned... the hook needs to be crafted well for each article, or people won't bite.
I'd take scott's advice and just repost... many people will probably read it then, since the subject is relevant.
Commence downmodding in 3, 2, 1...
But back on topic. There is some research supporting unit testing (McConnell writes about it in Code Complete), but it doesn't find even half of the existing bugs, so it should be supplemented by other testing methods.
The main, big, very nice advantage of unit testing is that any bug it _does_ find is easy to locate and fix.
Generally, people like having their attention taken up, provided you follow their advice. :-)
I continue to assert that most unit tests are the same type of drudge work that good programmers try to avoid in other scenarios (data entry, text editing, email templates, deploy scripts, etc).
"I think sometimes we take for granted how fortunate we are as developers to be able to make such a direct impact on problems instead of being wholly dependent on others."
If you're too lazy to write good docs, please, at least leave me some tests.
Personally, I have seriously used DBC, TDD, and automated scenario testing, plus hybrids, in that order of learning.
My conclusions about the sweet-spot (and this may vary depending on your area):
1. Automated functional testing in the large, and DBC in the small is a winning combo.
2. Special techniques help for graphics-intensive programming: I get my scenario tests to show what's happening on the screen, and I can slow 'em down and step through 'em.
3. I never got a lot out of dependency injection / mocks (like the author) -- not sure whether I just don't get 'em or my application area is wrong.
4. You need special techniques for intensive algorithmic code where there's a combinatorial explosion of cases and line coverage won't do; e.g. randomly generated test cases + compute-intensive sanity checks.
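A minimal sketch of that last point, assuming the unit under test is a hypothetical insertion_sort (a stand-in for whatever algorithmic code you actually have): generate random inputs, then verify cheap-to-state properties rather than enumerating cases by hand.

```python
import random
from collections import Counter

def insertion_sort(xs):
    # Hypothetical unit under test -- stand-in for your algorithmic code.
    out = []
    for x in xs:
        i = len(out)
        while i > 0 and out[i - 1] > x:
            i -= 1
        out.insert(i, x)
    return out

def test_random_cases(trials=200):
    for _ in range(trials):
        xs = [random.randint(-100, 100) for _ in range(random.randint(0, 30))]
        out = insertion_sort(xs)
        # Sanity checks: output is ordered, and is a permutation of the input.
        assert all(a <= b for a, b in zip(out, out[1:])), (xs, out)
        assert Counter(out) == Counter(xs), (xs, out)

test_random_cases()
```

The point is that the sanity checks state *properties* of a correct answer, so they cover the combinatorial case explosion that hand-written examples can't.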
Bottom-line: Automated scenario tests exercise the code; DBC pin-points illegal states.
DBC = Design by Contract = pre-conditions + post-conditions + invariants. In practice, understanding the method (which takes work) and writing explicit pre-conditions gives ~70-80% of the benefit. Doing invariants well requires language support.
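In a language without built-in DBC support, plain assertions get you most of the way. A rough sketch with hypothetical `require`/`ensure` decorators (the `withdraw` example is invented for illustration):

```python
def require(check, message="precondition violated"):
    """Hypothetical pre-condition decorator -- a lightweight DBC sketch."""
    def deco(fn):
        def wrapped(*args, **kwargs):
            assert check(*args, **kwargs), message
            return fn(*args, **kwargs)
        return wrapped
    return deco

def ensure(check, message="postcondition violated"):
    """Hypothetical post-condition decorator: checks the return value."""
    def deco(fn):
        def wrapped(*args, **kwargs):
            result = fn(*args, **kwargs)
            assert check(result), message
            return result
        return wrapped
    return deco

@require(lambda balance, amount: amount > 0, "amount must be positive")
@ensure(lambda new_balance: new_balance >= 0, "balance may not go negative")
def withdraw(balance, amount):
    return balance - amount
```

So `withdraw(100, 30)` returns 70, while `withdraw(100, 200)` fails the post-condition at the exact call that produced the illegal state -- which is the "pin-points illegal states" benefit. (Note that Python's `assert` is stripped under `python -O`, so real DBC libraries use explicit checks.)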
Whereas the automated functional testing advocated by the OP makes a tremendous amount of sense to me intuitively.
Note that the most important programmer commodity is time. Time spent on a unit test means less time to write the program. What ought to tip the scale is that later changes are easier to get right, so in the long run writing unit tests should save time.
But personally, I think that there are other practices which are more effective: Static typing, assertions and automated test harnesses for the whole program. I'll choose to spend my time on these rather than unit testing.
(Also, maybe it's just me, but a book that starts from, "Help! I just inherited a C++ project with ten years of rot and half the comments are in Norwegian! How do I even start adding tests to this without breaking it further?" seems a bit more practical than one that starts by applying unit tests to example code designed for convenient testing. It's seldom that easy.)
Don't just read it: Work your way through it. You need to follow the recipe to develop the discipline/knack.
Fairly quick and reasonably easy.
Python's built-in "unittest" module is an example of that. While it's not perfect, you can quickly form good habits by using it.
And it is not limited to Python. For instance, I've written unittest-compatible classes that basically run other programs as test cases. Either way, you're forced to think about things like "how do I automatically detect that this failed?" and "is the purpose of this test clear?".
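A minimal sketch of that idea, using only the standard library. The "other program" here is just a Python one-liner so the example stays self-contained; in practice it would be your real executable:

```python
import subprocess
import sys
import unittest

class ExternalProgramTest(unittest.TestCase):
    """A unittest case that shells out to another program.

    Answers "how do I automatically detect that this failed?" by
    checking the exit code and the captured output.
    """

    def test_program_succeeds(self):
        proc = subprocess.run(
            [sys.executable, "-c", "print('ok')"],
            capture_output=True,
            text=True,
        )
        self.assertEqual(proc.returncode, 0)
        self.assertEqual(proc.stdout.strip(), "ok")
```

Run it with `python -m unittest <file>`; a non-zero exit code or unexpected output from the child program shows up as an ordinary test failure.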
Doing is the best way to learn most things in the software world.
Generally speaking, if someone's angry about something, they probably don't understand it. Otherwise... why not ignore it till it goes away?
Sometimes I do wonder if I rely on them too much, but they aren't a waste of time.
One thing that works extremely well is to have a test environment. In other words, spend the time to replicate your entire production flow 100% (but in isolation), so that you can "deploy" changes and see if anything blows up. This not only catches stupid deployment bugs, but it also allows you to avoid some unit testing by simply relying on customers themselves: have them log into the isolated environment and do exactly what they would have done in production, and you'll know if it works.
That said, yeah, "unit testing is a waste of time" is link-bait-ish.