
Reasons to Avoid Test Driven Development - LiveTheDream
http://www.softwareandi.com/2011/12/10-reasons-to-avoid-test-driven.html
======
CJefferson
This article is just irritating sarcasm.

There are perfectly good reasons to avoid test-driven development. Like being
part of an existing large project, where there are already too many
interconnected components. Or working in complex AI algorithms, where separate
testing of components is extremely painful, and lots of whole system testing
is necessary anyway.

~~~
lucisferre
Agreed this was completely pointless. Bottom line is not every line of code
needs testing, and TDD alone does not have an impact on code quality. There is
nothing I hate more than bad code covered by even worse test code.

Here is a talk for anyone who is wondering why TDD doesn't make sense as a
default mode: <http://www.youtube.com/watch?v=LeVvj4HENOQ>

Bottom line is that writing TDD style unit tests won't make crap code better
any more than removing them makes good code worse. TDD is a tool, nothing
more, nothing less.

~~~
tzaman
Not entirely true: If you write tests, chances are you're educating yourself
on how to write them. As a consequence, code gets improved - because it's
almost impossible to write _any_ kind of tests if the code is crappy. But then
again, maybe that's just me :)

~~~
dasil003
I think it _is_ entirely true:

> _TDD is a tool, nothing more, nothing less_

Essentially TDD is just one more form of testing and/or specification. The
reason it's so good is because of the properties of executable code: namely
that it's fast, precise and repeatable. Human-based QA processes also have
their advantages, and if you had unlimited human QA resources then the value
of automated testing would be significantly (though not entirely) diminished.

But in the end, testability is ancillary; it's a second-order concern to the
actual operation of the software. The ability to write a good test is
secondary to the ability (and wisdom!) to choose the right architecture. In
other words, a simple and correct program using some advanced techniques in,
let's say Haskell, would be preferable to a kludgy Frankenstein beast of a
program written in Ruby but with great test coverage over the areas of the
problem domain that the programmer was aware of.

Obviously that's a straw man; automated testing is an extremely powerful
technique, one which requires practice to master, and TDD shines in its
ability to lift the scales from the eyes of the beginning programmer and see
programming from a new perspective. However TDD, BDD et al are no silver
bullet and no different from learning functional programming, gdb, printlining,
modular programming, meta programming, dynamic programming, regular
expressions, or any other algorithm or technique. All just tools.

------
SoftwareMaven
I was disappointed by this article. There is no such thing as a "perfect"
methodology, so I tend not to trust people who tell me their methodology is
perfect. I have a lot more respect for people who recognize the tradeoffs that
any methodology forces and then find the right places to use them and the
right places _not_ to use them.

And this is not to say anything at all about TDD. There are a lot of good uses
for it. It just isn't the One True Paradigm of software development. But then
again, neither were dynamic languages, object-oriented programming, extreme
programming, nor any of the other paradigms that were supposed to cure the
problems of software development; each went on to add value when applied with
a well-thought-out plan for each unique project.

------
zorked
I find it telling that proponents of TDD so often resort to name-calling and
sarcasm to promote their views instead of, like, just writing great software.

------
tyre
These are reasons for testing, not TDD. There is no discernible difference
between two shipped products with identical test coverage where one team used
TDD and the other just wrote regression tests.

The worst part about TDD is the people trying to shove it down your throat.

~~~
geebee
I think you've identified one of the weakest parts of the essay. This is about
ignoring testing, not TDD.

One way I've convinced people to write unit tests is to remind them that they
are already testing. People often write a little main to check the output of a
program. This is a test. People often click on a link to validate that some
text is present. This is a test. Keep them! Instead of throwing away your
standalone program, put it in the unit test folder. Instead of throwing away
that point-and-click check, automate it and put it into the integration tests
folder. This alone will get you close to 100% coverage. Now, when you refactor
your code and it no longer works the way you verified it would a while ago,
you'll get a little red bar or a "FAIL" message. Usually this is because you
wanted the behavior to change, so update the test. Occasionally, this is
because you introduced unwanted side effects in your code, which is exactly
why you want these tests.
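
To make that concrete, here's a minimal sketch in Python (the function and its
behavior are invented for illustration; the point is just the promotion of an
ad-hoc check into a test):

```python
# A throwaway "little main" check, promoted into the test folder instead
# of being deleted. parse_price is a made-up function standing in for
# whatever the ad-hoc main() was poking at.

def parse_price(text):
    """Toy function under test: '$12.50' -> 1250 cents."""
    return int(round(float(text.lstrip("$")) * 100))

def test_parse_price():
    # These assertions used to be print statements in a standalone main();
    # as a test, they fail loudly when a refactor changes the behavior.
    assert parse_price("$12.50") == 1250
    assert parse_price("$0.99") == 99

test_parse_price()
```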

TDD is substantially different from what I just described. If the author wants
to defend it, that's fine, but let's make sure we understand it's not the only
way to get high levels of code coverage.

------
colomon
The article doesn't seem to recognize there is a distinction between TDD,
having unit tests, and having automated tests. Personally, I've used all three
approaches, and think each has its place. As nearly as I can tell, anyone who
says one approach fits every programming project must not have a very broad
experience with different sorts of projects.

------
kevinw
I'm a bit confused why this article was written in a satirical (sarcastic?)
style. It raises a lot of good points, but it would have been much more
straightforward if it hadn't been written sarcastically. All of the points in
the article could have been written from the position that the author ACTUALLY
advocates, which would have made it more readable without sacrificing any
content.

~~~
jakejake
It seems to me the author is such a proponent of TDD that the idea of not
using it is laughable.

The humor is so dry that it almost seems like a serious article. I got a few
laughs though.

~~~
benjaminwootton
I was about to ask if it was satire. Guess it was wasted on me!

------
ssi1111
I am a big fan of TDD. I read this post to see if I can get any reasonable
arguments against TDD, so I can be better prepared next time I am trying to
sell TDD. #10 - 'no clients' and #8 - 'short project' kinda make sense to not
practice TDD, but these also mean that there is no real product - you build
something and you throw it away after a few days. So there is no need for TDD,
testing or anything really. All the other points in the post don't make any
sense :)... is it just me?

------
netmute
The article lists 10 reasons which can basically be boiled down to:

1) Unrealistic assumptions ('perfect' developers, 'perfect' architecture etc.)

2) You simply don't care about quality.

~~~
jonny_eh
I think it was a dry satirical humour piece.

~~~
Aco-
somehow I doubt that

~~~
recursive
Get your sarcasm detector tuned up.

------
ludflu
I'm a strong proponent of TDD. In my experience it's more productive and
results in better software. But...

I've been in this position before: We have a large existing codebase with no
unit tests to speak of, and the business is not willing to fund efforts to
refactor or add unit tests. It is _very_ hard to add unit tests after the
fact. Not impossible, but it's a lot of work, and business folks really don't
care.

~~~
manojlds
TDD is test-driven development. You are talking as though TDD is the same as
adding unit tests.

~~~
mpweiher
I think he is describing the (common) situation where you want to do TDD for
new code, but are dealing with an already existing code-base that doesn't have
test coverage and just isn't amenable to testing the way it currently is.

So you can't really do TDD for your new code without refactoring the existing
code-base. But refactoring the existing code-base without good test coverage
is nasty.

------
nullc
Some useful arguments but the sarcasm isn't needed…

And it belittles some of the more complicated arguments, such as: if you
develop to the test, the test will always pass, but that doesn't mean your
software is any good. Keeping development and testing loosely coupled can make
the tests more powerful proxies for overall quality.

------
programminggeek
TDD is awesome, if you have a codebase or a tech stack that makes it nice. TDD
will quickly expose bad design through an unpleasant testing process. For
example, slow tests, untestable code, or tests that you "always have to
rewrite/maintain" are problems that only show up when you try to TDD your
code that isn't designed to be testable.

TDD doesn't work well if you don't design your code to be testable because you
very quickly become frustrated by the problems TDD uncovers.

1\. Slow Tests - If your tests are taking a long time, it's probably because
your code isn't written to be tested easily.

For example, if your code is hitting the database a lot and it starts to take
minutes or hours to run your test suite, your code has a problem. For example,
if your entities/models are coupled to an ORM like ActiveRecord, testing those
is going to suck because they probably shouldn't be coupled to an ORM. Many
people try to fix this by writing frameworks that make testing their code
easier instead of writing better code that isn't tied to an ORM.

Tests should be fast. Most of the time they don't need to hit the DB. Slow
tests make TDD suck. Avoid slow tests.
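
As a rough sketch of that idea (all names invented, Python used for
illustration), a plain model plus an in-memory fake repository lets the suite
run without touching a database:

```python
# The pricing logic lives in a plain object, and the persistence layer is
# hidden behind a repository, so tests never do a DB round-trip.

class Order:
    def __init__(self, items):
        self.items = items  # list of (name, price_in_cents) tuples

    def total(self):
        # Pure computation: no ORM, no connection, nothing slow.
        return sum(cents for _, cents in self.items)

class FakeOrderRepo:
    """In-memory stand-in for the ORM layer; keeps tests fast."""
    def __init__(self):
        self._orders = {}

    def save(self, order_id, order):
        self._orders[order_id] = order

    def get(self, order_id):
        return self._orders[order_id]

repo = FakeOrderRepo()
repo.save(1, Order([("book", 1500), ("pen", 200)]))
assert repo.get(1).total() == 1700  # no database needed
```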

2\. Untestable Code - I've run into this a lot and it sucks, especially in
languages that lend themselves easily to untestable code (In my experience,
that's PHP).

Until you try to test your code, you don't realize how untestable your code
is. A lot of things that seem like reasonable design decisions are so
untestable that you end up doing incredibly lazy integration tests, because to
fix things you have to rewrite whole portions of the system.

I've been there and it sucks. The choice seems to lie between rewriting the
whole system on the same platform to be testable one chunk at a time, or to
rewrite the system in something that lends itself better to testing.

In either case, the answer is to take the time to learn how to write testable
code. To start learning techniques like Dependency Injection. To practice
writing portions of your system using TDD to see what works and what doesn't.
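
A minimal constructor-injection sketch (hypothetical names, Python for
illustration): the dependency is passed in rather than constructed inside, so
a test can substitute a recording fake for the real implementation.

```python
# Dependency injection in its simplest form: SignupService receives its
# mailer instead of building one, so tests swap in RecordingMailer.

class SmtpMailer:
    """The 'real' dependency; would talk to an actual SMTP server."""
    def send(self, to, body):
        raise NotImplementedError("network call, not wanted in unit tests")

class RecordingMailer:
    """Test double: records messages instead of sending them."""
    def __init__(self):
        self.sent = []

    def send(self, to, body):
        self.sent.append((to, body))

class SignupService:
    def __init__(self, mailer):
        self.mailer = mailer  # injected dependency

    def register(self, email):
        self.mailer.send(email, "Welcome!")

mailer = RecordingMailer()
SignupService(mailer).register("a@example.com")
assert mailer.sent == [("a@example.com", "Welcome!")]
```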

3\. Maintaining Tests Sucks - Actually, if you don't want to maintain a test
suite, don't write one.

At its core, your tests are a specification of how your system should work at
a given time. When they start breaking it means your system has likely
changed, so either it should be fixed to match the specification, or the
specification should be changed to match the appropriate system behavior.

At a fundamental level, tests aren't something you write once, they are
something you constantly rewrite to keep them in line with what your system
should be doing. Thus, when they are broken, you know that either your specs
need to change or something in the code isn't working right.

Test maintenance is a price you pay for wanting higher quality software. It's
the same kind of cost you pay in designing a building with blueprints. If
people build a building without following the blueprints, it's not going to be
the building that was supposed to be built. Tests are a living blueprint of
your system. If you aren't willing to treat it that way, TDD isn't going to
help you.

TDD isn't for everyone because not everyone is willing to make those kinds of
investments in code quality and maintainability. Also, TDD doesn't make your
code better if you aren't willing to let it make your code better.

~~~
__david__
> TDD isn't for everyone because not everyone is willing to make those kinds
> of investments in code quality and maintainability.

One thing I've noticed is that code written to be testable tends to suffer
from over-abstraction. Sadly, writing nice, compact, easy to read code is
somewhat at odds with TDD in my experience.

I personally don't like that I have to make sacrifices to code simplicity just
to make it testable. But perhaps I just suck at TDD. :-)

~~~
programminggeek
I totally understand what you mean.

I've gone to great lengths to determine what kind of architecture is needed to
truly support TDD, and what I've come to realize is that over-abstraction
isn't really the problem; it's that testable code has to be pluggable, which
is mostly at odds with how systems are usually designed.

For example, standard MVC apps tend to tie models to an ORM in some way,
shape, or form. Think Rails and ActiveRecord and you'll understand what I
mean. This makes writing a simple CRUD app fast, but testing it sucks because
you have two things tied together: your actual data models and your database
layer. Funny that nobody would ever do that with the filesystem, but it's
super common with the database.

To make your models clean and easy to test, you need to pull out the ORM bit
into something more pluggable.

The same principle applies to writing testable business logic. A lot of the
time people don't know where to put it, so it ends up in the View, Controller,
or Model, all of which are usually the wrong place. Testing that code becomes
not awesome, because you then have to write a test against what the view
outputs, what the controller does, or what the model is doing (usually by
touching the database).

The answer I've found is to pull your business logic into a separate container
that can be tested easily apart from any MVC structure.
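
For example (names invented for illustration), a plain business-logic object
with no framework imports can be tested directly, without booting an MVC stack
or a database:

```python
# Pure business logic pulled out of the View/Controller/Model triangle.
# No ORM, no HTTP, no templates: just a rule you can test in isolation.

class DiscountPolicy:
    """Hypothetical rule: 5% off for customers loyal for 2+ years."""
    def discounted_total(self, subtotal_cents, loyalty_years):
        rate = 0.05 if loyalty_years >= 2 else 0.0
        return int(subtotal_cents * (1 - rate))

policy = DiscountPolicy()
assert policy.discounted_total(1000, 3) == 950   # loyal customer
assert policy.discounted_total(1000, 0) == 1000  # no discount
```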

None of this means you need to over-abstract anything. It just means you have
to figure out the right abstractions and structure for the right layers. None
of this is by any means obvious at first.

P.S. I'm working on an architecture framework that simplifies a lot of this to
make doing TDD much easier.

------
vegas
He left out "You are Donald Knuth"

~~~
mpweiher
Donald "beware of bugs in the above code; I have only proved it correct, not
tried it" Knuth?

------
mcs
so silly. brogrammer turned maddox.

