
The Start-Up Trap - filpen
http://blog.8thlight.com/uncle-bob/2013/03/05/TheStartUpTrap.html
======
MattRogish
(tl;dr: it depends)

It's a hallmark of "experienced", non-dogmatic product people (UI/UX/Dev)
that they can use their intuition to know which happy paths they need to
test, which interfaces are unlikely to change (e.g. a user is
probabilistically always going to be a member of an organization, so unit
tests around that are likely not waste), and what level of quality to build
into a given feature relative to the probability that the feature is going
to exist in perpetuity.

You can concretize this by calling it "spike driven development" if you want
(that's what we do) but the point isn't that TDD is faster or slower but that
high coverage might be inappropriate (TDD isn't binary - degree of test
coverage goes from 0-100%) at different phases of the build/learn/refine
cycle.

For example, if we're building a speculative feature that we think has value
but hasn't been proven yet, we want to spend the least amount of effort
(code/time/money/pick some) possible to prove that it's viable.

Small bugs aren't really that interesting (provided the thing "works") and so
we're explicitly willing to live with degraded quality to prove the feature is
worth further investment. We're also explicitly not building a "large" feature
(so it's very likely to get coded in a day or two) so the surface area for
major showstopper bugs is minimized.

Often the feature will be thrown away or majorly refactored into something
that is completely different.

In this case, full-bore 9x% coverage TDD is probably waste, since the feature
only existed for a short period of time, the interface changed dramatically,
or the idea otherwise proved invalid. High test coverage makes _major_
interface refactors really expensive, and you really don't need that level of
quality for speculative features.

After you prove this feature is a "thing" you want to keep around (and you've
nailed down the interface), then it's a perfect time to re-write it with a
much higher degree of test coverage.

~~~
pifflesnort
> _After you prove this feature is a "thing" you want to keep around (and
> you've nailed down the interface), then it's a perfect time to re-write it
> with a much higher degree of test coverage._

I find that tests help me write things that are complete and "nailed down" the
very first time.

~~~
sanderjd
You must have read that with a different definition of "interface" than I did.
Under my reading of "user interface", I don't see any way that a developer
writing tests has anything to do with how nailed down it is. You need user
feedback for that.

------
ardit33
Wait, since when did TDD become a best practice?

I know this author is advocating it, but he has 0 (zero) solid data or
evidence to back this assertion. He is just trying to sell another religion.
What I have noticed is that TDD is being peddled heavily by RoR contracting
shops that just care about billable hours and aren't invested in whether the
startup or company is going to make it in the long run. I often see that it
is usually the youngish and easily guided engineers who fall into the trap of
eating it up. TDD is something that makes younger and less experienced
developers feel good (more LOCs) and gives them a false sense that the code
is correct, while there is solid data that even 100% unit-test coverage finds
only 20% of defects, at best.

I've shipped many world-class products, and none of the processes that built
those products bore any resemblance to TDD. So, to all the pragmatists out
there: feel free to ignore the advice in the article. I have yet to hear of a
super successful startup that used TDD.

Having Tested Code (which is a good goal) != TDD. TDD, to me, is putting the
cart before the horse. For anybody who has created large systems from
scratch, TDD is a major slowdown on the refining process. Your job is to SHIP
a working product. Testing is only a small part of it, and its positives
should be weighed against its negatives and side effects on the shipping
timeline.

Having worked for over 10 years in the industry, I have yet to meet any great
programmer who preferred TDD. I hate to attack the author, but from his bio
he seems a bit like a process peddler. I don't see that he has worked on any
successful startups, or anything that had a good exit. So take his claims
with a grain of salt. <http://www.8thlight.com/our-team/robert-martin>

He is trying to sell a religion, so buyer beware.

~~~
megaduck
I work at one of those "contracting shops" (Pivotal Labs), and I think you're
misunderstanding the reason we push TDD so damn hard: It's not about speed.
It's about predictability.

Good TDD forces you into building loosely-coupled code that's easy to refactor
and change. When new business requirements come in, the effort for
implementing them isn't increased because of your previous code getting in the
way. There's an overhead to writing the tests, but it's a fixed continuous
cost. As a result, you're always in a position where you can ship a new
feature in a predictable (and reasonably fast) fashion.

Not doing TDD often leads to tightly-coupled brittle software, which can be
very fast to implement but also difficult to change down the road. It
certainly doesn't have to, but in reality that's what happens 90% of the time.
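
To make "loosely-coupled" concrete, here is a rough sketch (TypeScript, using
Node's built-in test runner; the checkout/gateway names are made up, not code
from any particular startup). Because the logic is written against a small
interface, a test can drive it with a fake, and swapping payment providers
later doesn't touch it:

    // Hypothetical example of the decoupling that test-first tends to force.
    import { test } from "node:test";
    import assert from "node:assert/strict";

    interface PaymentGateway {
      charge(cents: number): Promise<{ ok: boolean }>;
    }

    // The checkout logic only knows about the interface, not a concrete provider.
    export async function checkout(
      cartCents: number,
      gateway: PaymentGateway
    ): Promise<string> {
      if (cartCents <= 0) return "nothing to charge";
      const result = await gateway.charge(cartCents);
      return result.ok ? "paid" : "payment failed";
    }

    // The test drives checkout() through a fake gateway: no network, no provider.
    test("checkout reports failure when the charge is declined", async () => {
      const declining: PaymentGateway = { charge: async () => ({ ok: false }) };
      assert.equal(await checkout(1000, declining), "payment failed");
    });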

A few successful startups that use TDD spring immediately to mind (Taskrabbit
and Modcloth, for instance). However, the real question you should be asking
is, "Which startups died because they got buried
question you should be asking is, "Which startups died because they got buried
under the weight of their own codebase?" That's a long and very depressing
list. In many of those situations, some TDD might have helped.

In short: TDD won't make your startup healthy, but it helps ward off some of
the more common fatal diseases.

~~~
choxi
> "Which startups died because they got buried under the weight of their own
> codebase?" That's a long and very depressing list. In many of those
> situations, some TDD might have helped.

Could you give some examples? PG doesn't include that in his list:
<http://www.paulgraham.com/startupmistakes.html>, and in my personal
experience I've never met a startup that failed because they had an
unmanageable codebase.

~~~
unclebobmartin
Quite a few actually. I can name two off the top. You've probably never heard
of them (why would you, they failed). Saber, and Lucid. Oh, there are
certainly others. What kills them is a code base that's too tangled to manage.
Estimates grow by an order of magnitude, costs skyrocket, defects mount,
customers flee, investors cut their losses.

------
readme
Great, more TDD evangelism.

I would love to see an empirical study proving the claims made about TDD in
this article and numerous others.

It seems to me TDD is a huge waste of time when prototyping a minimum viable
product. You want me to spend 2 hours writing tests for a feature that
someone's going to tell me to rip out 15 minutes later? No thanks.

It's really easy to tell others to use TDD, and even admonish them for not
using it. But unless you are the one in their shoes, you will not know the
whole of their reality.

In a perfect world I'd write tests for everything. In the real world, I write
tests for almost nothing. Most of the work I do is on stuff that might be gone
tomorrow.

~~~
shadowfiend
“I would love to see an empirical study proving the claims made about TDD in
this article and numerous others.”

This point is so important that I think it should have been the only one you
made. Tests look good on paper. And they are great intuitively. We can make
great arguments for them. But last time I asked for a clear study indicating
that TDD led to better results than a non-TDD development, everyone seemed to
come up blank.

What it sounds like to me is religion. Doesn't mean I won't test. And it
certainly doesn't mean I'll eschew testing on a team that does testing. But it
still smells suspiciously like religion, and that's very worrisome to me.

~~~
dasil003
TDD teaches you how to write tests at all costs. Once you know how to test in
every conceivable scenario, then you can discard TDD and replace it with the
wisdom of which tests are useful and an understanding of the real ROI of
writing this test or that test. If you haven't forced yourself to get good at
writing tests at some point, then you will fail to write a valuable test
simply because the cost of writing it seemed higher than the return, even
though that was only because of your lack of skill in that department.

------
jmduke
I like to talk about TDD using a finance analogy.

One of the first concepts you learn in introductory finance is that, in a
mature market (i.e. the stock market -- basically something with a bunch of
people and a bunch of information) you have to be compensated for _risk_.
This is why riskier assets -- like tech stocks or penny stocks -- carry
incredible amounts of risk but will generally give you higher returns in the
aggregate than something like pork commodities or a CD. When you choose to
purchase a tech stock -- or any stock at all -- you're saying "okay, I
recognize that this is riskier, but I think I'm being fairly compensated for
the risk, too."

Choosing to eschew TDD is like purchasing stocks. By definition, going through
TDD is going to be a safe route, but it's rare that TDD (at least in my and my
friends'/peers' experiences) is actually going to make you get from Point A to
Point B any faster. TDD isn't, by default, a superior or inferior approach to
anything: it's a tradeoff -- do you want risk or do you want return?

Sometimes, you want to minimize risk, and that's probably smart. Sometimes,
you just want to produce an MVP -- and that's okay, too.

~~~
tmoertel
> By definition, going through TDD is going to be a safe route, ...

Why is TDD safe? Most TDD advocates seem to be blind to the fact that testing
is a terrible way to prove many important properties about software systems,
security properties for example. If you're betting your ever-so-scarce
programming resources on TDD, you're probably paying too much, getting a lower
return than you could be getting, and leaving some serious holes in your
software. As I wrote in [1]:

 _If all you know about getting your code right is TDD, you’ll never bet on
types or proofs or constructive correctness because you don’t know how to
place those bets. But those bets are often dirt cheap and pay in spades. If
you’re not betting on them at least some of the time, whatever you are betting
on probably costs more and pays less. You could be doing better._

So I don't think that TDD is a "safe" bet. I think it's an expensive bet that
has relatively poor payoffs.

[1] http://blog.moertel.com/posts/2012-04-15-test-like-youre-betting-for-your-life.html

~~~
unclebobmartin
The safety of TDD comes from having a test suite that you trust. Given that
suite, you can safely refactor the code. If you can refactor safely, you can
improve the design safely. If you can improve the design, you can stop the
inevitable slowdown that comes from making a mess.

What is the risk of _not_ doing TDD? The risk is that slowdown. We've all
experienced it. What is the cost of TDD? You'd like to say that it takes time;
but since the risk is a slowdown, the net is positive no matter how you look
at it.

That's the irony of all these complaints. They assume, and sometimes they
simply state, that TDD slows you down. And yet, the primary effect of TDD is
to speed you up, and speed you up a lot.

Some folks suggest that it's a short-term slowdown for a long-term speedup.
But in my experience the short-term is measured in minutes. Yes, it might take
you a few extra minutes to write that test first; but by the end of the day
you've refactored and cleaned the code so much that you've gone much faster
_that day_ than you would have without TDD.

~~~
tmoertel
The benefits that you attribute to TDD are _not_ exclusive to TDD. They are
the benefits of having well-tested code, and TDD is only one of the ways to
get there.

The problem with the TDD way of getting there, however, is that it's
expensive: It makes programmers see their code through the pinhole of one
failing test at a time, blinding them to larger concerns, _which are
important._ As a result, a lot of avoidably crappy code gets written at first
and then must be reworked later, when its flaws are finally allowed to come
into view.

If you're a new programmer who hasn't learned how to reason about larger units
of logic and the relationships between them, maybe that pinhole restriction
proves helpful. But for more seasoned programmers, it's constraining and
wasteful.

Write well-tested code. But not using TDD.

~~~
unclebobmartin
The idea that TDD involves some kind of blind faith that the tests will
generate grand designs and beautiful code is both silly and wrong. You are
right about that. Good design and good code require skill, thought, and
knowledge irrespective of whether you are using TDD. So as I practice the
discipline, I am thinking all the time about larger scale issues, and I am
_not_ being blinded to those concerns.

However, the act of writing tests first has a powerful benefit: the code you
write _must_ be testable. It is hard to overstate this benefit. It forces a
level of decoupling that most programmers, even very experienced programmers,
would not otherwise engage in.

It also has a psychological impact on the programmer. If every line of
production code you write is in response to a failing test, you will _trust_
your test suite. And when you trust your test suite, you can make fearless
changes to the code on a whim. You can _clean_ it and improve the design
without trepidation.

Gaining these benefits without writing tests first is possible, but much less
reliable. And yet the cost of writing the tests first is no greater than
writing the tests second.

~~~
tmoertel
> However, the act of writing tests first has a powerful benefit: the code you
> write _must_ be testable.

No, the act of writing well-tested code _at all_ has that benefit. Whether you
write the tests before or after the code, one at a time or in module-sized
groups, writing code that's hard to test has immediate and obvious penalties
when you test it (e.g., tedious rework), and you'll quickly learn to avoid
those penalties. So just having the discipline to write well-tested code _at
all_ forces you to write code that's not only testable but easily testable.
This benefit is not unique to TDD.

> It also has a psychological impact on the programmer. If every line of
> production code you write is in response to a failing test, you will _trust_
> your test suite.

It's not enough to trust that your tests actually test your code. You also
need to trust that your tests express your desired semantics. And that's
harder to do when the semantics is not designed in whatever form and grouping
is most natural to its representation but rather is extruded, one test at a
time, through the pinhole view that TDD imposes upon programmers.

> And yet the cost of writing the tests first is no greater than writing the
> tests second.

What you seem to be overlooking is that TDD not only forces you to write tests
first but also in tiny baby-steps that cause programmers to focus only on
satisfying one test at a time. As a result, the initial code that is written
satisfies only a small portion of the system's overall semantics (the portion
that's been expressed as tests so far), and a lot of that code ends up having
to be reworked when later tests finally uncover other requirements that affect
it. This leads to rework that would have been avoidable had the programmers
not been blinded to those requirements earlier on.

The problem with TDD isn't so much that it's test first but that it promotes a
pinhole view of subjects that are not narrow.

------
tptacek
I thought this was going to be an article about overwork, or about charging
for what you build as a way of proving your market, or about how employee
stock options are often a bad bet. Instead, it's about TDD. What a
disappointment.

~~~
orangethirty
Which is ironic, because the real startup trap is the one where people build
things but are told not to charge for them (or to charge very little for
them).

------
thetrumanshow
TDD is like those best-practices books that everyone pretends to read, but
hardly anyone ever really reads them (I'm looking at you, Code Complete). The
folks that do read these books know the dirty secret that no one else really
reads them either, but they feel that endorsing the books somehow gives them
credibility.

TDD has its place, as most mainstream methodologies do. But let's just admit
that the people who use this methodology are in the minority. The rest of us
are working on smallish projects that are struggling to be worthy of the
time budget they've been granted, and we're more worried about shipping than
about how not having TDD in place will slow us down in phase III. Assuming
phase III ever happens. We're using all the best practices we can, but TDD
doesn't rise above the bar most of the time.

~~~
j_baker
You're strawmanning.

Uncle Bob isn't saying "If you aren't using TDD at the startup phase, you
suck!" What he's saying is "Just because you're in the startup phase and think
you're invincible doesn't mean you should throw best practices out the
window." There are good excuses for not using TDD, and I tend to agree that
"We're in a startup phase. We don't have time for this crap." _isn't_ one of
them.

~~~
thetrumanshow
Had I opened with "Interesting article, but tangentially..." then we'd still
disagree, but maybe we could have avoided the straw-man ding. :)

~~~
j_baker
You're right. Then we'd have a passive-aggressive ding. The only thing you
would have to add to make it perfect would be "...but I'd _never_ claim the
author is trying to do that."

~~~
derefr
I wouldn't even call it passive-aggressive; it's soapboxing, plain and simple.

> This article is about X. But I don't care what the article says about X. _I_
> have something to say about X, and by God, I'm going to _say it._

But I don't know that this is a particularly _problematic_ thing to be
happening in a threaded comment system, since people _can_ get sidetracked by
"soapbox issues" in a subthread while letting the "parent conversation"
continue around them. It's just kind of confusing for people who treat this
place like a linear-chronological message-board that discusses one topic at a
time.

~~~
thetrumanshow
I concur.

There are n-ways to join in a conversation, but the more direct experience you
have, the better you will be able to field certain types of discussion. So,
experts and expert debaters will dive right in and attack the topic head-on.
More timid souls will wait in the wings and hope that someone will say
something that they can directly respond to with confidence.

Sometimes you have to be slightly shameless if you want to join in a
conversation. Stating an opinion (soap-boxing) does feel a bit like cheating
the system. But I would argue it's slightly better than standing on the
sidelines.

~~~
j_baker
Honestly, if your opinions were at least novel, I probably wouldn't have said
anything. I've heard the same old debate many times before and frankly, it
gets old. Especially when people turn tangentially related topics into
debates about TDD. Can someone mention TDD in a blog post without dredging up
all these old topics?

------
spartango
My feeling is that the point of this article isn't strictly to promote TDD or
any single engineering tool that startups tend to eschew. Instead, the author
seeks to make the broader point that cutting corners in engineering is not a
good idea _even in a startup_. While it can make startup engineers feel like
they are moving quickly relative to the competition, the engineering debt
incurred is non-negligible.

I strongly agree with this point, and don't buy the blanket rationale that
speed trumps everything in a startup. Yes, speed is quite important when you
are bringing something new to market; it lets you get your foot in the door
and rapidly find the right fit for your product. But if you cut corners, then
your product will suffer, and you'll end up treating your customers poorly as
a result.

There's a fine balance to all these things, of course. We all have to find
the right balance between keeping engineering standards high and avoiding
heavy-handed structure.

~~~
freshhawk
Boy, did I have to scroll down far to find someone who read the article.

I was hoping for tips about how to find that balance when other team members
have bought the blanket rationale that speed trumps everything, and then
complain about how much time we spend fixing bugs and fighting fires instead
of working on the next thing.

Instead, a bunch of reading-comprehension-challenged commenters are bitching
about TDD again.

------
petenixey
Man, I've been back and forth on tests from one extreme to the other. I have
to say that I do agree with this but with one caveat - Test Driven Development
is a bit miserable.

Test "Driven" Development is a real invitation to write too many tests. A
small, good set of tests gives you freedom to work fast, to refactor and to
have multiple developers pushing simultaneously. Not having them really isn't
sustainable past a certain point but equally to many tests will drag you to
the ground.

Tests are overhead, they cost time to write, maintain and run and TDD tends to
drag you into the deep end. Avoiding them altogether ends up being a false
economy (if only in that it makes you too nervous to push often).

To those who fear the slippery slope, a nice self-annealing approach to get
into testing is to only write tests for something that has failed. That way
you waste no time writing tests for things that are actually pretty robust but
equally you avoid addressing (and fearing) the same issue twice.
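
Concretely, that looks something like the sketch below (TypeScript with
Node's built-in test runner; the slugify bug is invented for illustration).
The test exists only because that exact input once failed, so it costs
nothing up front and keeps the same issue from coming back:

    import { test } from "node:test";
    import assert from "node:assert/strict";

    // Suppose slugify() once produced an empty slug for punctuation-only titles
    // and broke article URLs. The fix and its regression test land together.
    export function slugify(title: string): string {
      const slug = title
        .toLowerCase()
        .trim()
        .replace(/[^a-z0-9]+/g, "-")
        .replace(/^-+|-+$/g, "");
      return slug.length > 0 ? slug : "untitled"; // the actual fix
    }

    // Written only because this specific kind of input failed once.
    test("slugify falls back for empty or punctuation-only titles", () => {
      assert.equal(slugify(""), "untitled");
      assert.equal(slugify("!!!"), "untitled");
      assert.equal(slugify("Hello, World!"), "hello-world");
    });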

~~~
agilord
I'd like to emphasize this approach: "a nice self-annealing approach to get
into testing is to only write tests for something that has failed."

It is great for old-timer enterprises without any tests, and it balances the
time pressures of startups. However, you need to write those tests not only
for production failures, but also when something fails in your dev
environment.

------
gfodor
Over the years, I've grown less and less likely to write lots of tests, after
being a very large test zealot during the peak of the TDD wave. There are a
few reasons.

First, yes, TDD slows you down. The reason this matters is because a lot of
our time as developers is spent exploring ideas, and it's pretty well
understood that the faster you can get feedback on your ideas the easier it is
to creatively develop them. In fact, the final result of your creative process
can be completely _different_ depending on the feedback cycle you have. From
the micro to the macro perspective, religious TDD introduces a
constant-factor slowdown for small projects that, yes, ends up being a net win in the
long run if you live with the code for long, but is a net loss in the short
run and also can _prevent_ you from finding a solution to a problem since your
creativity is stifled by the slow speed of idea development.

Second, when building web applications, building tests turns out to be fairly
overrated. First, people will tolerate most bugs. (Write good tests around the
parts they won't.) Second, if you have a lot of traffic, bugs will be surfaced
nearly instantly in logs and very quickly via support tickets/forum posts.
(What if there are bugs that don't make it into the logs and don't affect
people? If a tree falls in a forest...) Much more important than mean time
between failures is mean time to recovery. I'd rather have ten bugs that I fix
within 5 minutes of them affecting someone than one bug that festers for a
month. Not only because this is healthier for the code base in terms of
building robustness, but also because human behavior is such that many fast
bug fixes make everyone feel good but few lingering bugs make everyone
miserable. People want to feel like you care, and are much more likely to feel
cared for when you fix their problems quickly, and are not often interested in
just how few bugs you ship if the one that you do has been affecting them for
a month.

This isn't theoretical nonsense, it's a very real phenomenon where you use
tests and manual testing to get up to a basic working prototype and then just
throw it over the fence to flush out the bugs. It's the only way to do it
anyway. (This only really works if you have traffic and can deploy changes
quickly. To paraphrase a colleague, production traffic is the blood of
continuous deployment.) Obsession over deploying bug-free software (an
oxymoron anyway) is usually coming from people who haven't gotten over the
fact that we don't need to ship software once a month or once a year but can
do it every 10 minutes in certain domains. Instead of focusing on not shipping
bugs, focus on shipping faster.

~~~
randomdata
I, on the other hand, have grown to love tests more and more.

Not for reasons of finding bugs – though that is often a nice side effect –
but because once I have decent test coverage I don't need to look at the
application anymore. Being able to run the tests in the background to verify
my work is sane, while I move on to the next feature in parallel, I find, is
considerably faster than the code/build/review cycle you find yourself in
without a decent test suite.

My own performance, being a limited commodity, is the most important factor
and I find tests help me increase output, not slow it down as you suggest.
They are certainly not a panacea though. As always, use the right tool for the
job.

~~~
saraid216
I personally can't rationalize skipping the "review" part of the cycle. If it
faces the customer, it's my responsibility and the luxury of _not_ looking at
it seems like trading away effectiveness for efficiency.

~~~
randomdata
I see your point. I also take on design roles, so I naturally do that extra
double take on the work during the design phases. Though I generally find the
tests really do a great job of covering what they should.

If you are working with larger teams with designers and developers, I suppose
it may not work out as well.

------
DaniFong
Imagine test driven development for visual art. How would one write a test for
visual art, beyond viewing and comprehending the partially completed work? If
one could write a test in advance, one would have made a significant advance:
distilled to a verbal form a description of what good art is.

Instead, many artists -- or software authors, or chefs, find a way to
repeatedly sample and appraise their work as it develops over time.

I think this is far more important than TDD, because the truly important
problems of software engineering are not in making software that can achieve
simple correctness, but in making something that people want.

~~~
mattvanhorn
Actually, that is a great metaphor. But you should know, TDD _is_ the way I
repeatedly sample and appraise my work as I go. Multiple times per hour, in
fact.

Red -> Green is the sampling and Green -> Refactor is the appraising.
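
For anyone who hasn't seen the loop up close, one pass looks roughly like
this (a TypeScript sketch with Node's built-in test runner; formatCents and
the expected strings are made up): the test is written first and fails (red),
the simplest code makes it pass (green), and the now-trusted test makes the
cleanup step safe (refactor):

    import { test } from "node:test";
    import assert from "node:assert/strict";

    // RED: write the expectation first and watch it fail.
    test("formats cents as dollars", () => {
      assert.equal(formatCents(1999), "$19.99");
      assert.equal(formatCents(5), "$0.05");
    });

    // GREEN: the simplest code that makes the test pass.
    // REFACTOR: with the test green, renaming this or swapping in
    // Intl.NumberFormat later is a low-risk change -- the suite is the appraisal.
    export function formatCents(cents: number): string {
      const dollars = Math.floor(cents / 100);
      const remainder = (cents % 100).toString().padStart(2, "0");
      return `$${dollars}.${remainder}`;
    }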

------
kgo
"Of course one of the disciplines I'm talking about is TDD. Anybody who thinks
they can go faster by not writing tests is smoking some pretty serious shit."

Why do so many TDD people think that people who don't use TDD don't write
tests?

~~~
theoj
Also, why do so many TDD people think that no automated tests = no testing at
all?

------
icedchai
TDD is often a waste of time. I worked at some place that had taken the TDD
philosophy to the extreme: every getter and setter had a test.

On the opposite side of the spectrum, I worked on software that processed
millions of dollars in daily transactions and there were maybe 5 "tests" in
the whole system. This was about 500,000 lines of C and C++ code in the early
2000's.

My personal philosophy is to write unit tests where I think it's important,
not test everything.

~~~
mattvanhorn
I won't write tests for getters and setters, but then again, I try not to
write getters and setters very much in the first place.

------
vellum
> _Of course one of the disciplines I'm talking about is TDD. Anybody who
> thinks they can go faster by not writing tests is smoking some pretty
> serious shit._

That's some serious strawman. In between 0 tests and TDD, there's a lot of
room to maneuver.

~~~
j_baker
Is it a strawman? Because I _have_ heard plenty of developers espouse this
attitude.

~~~
vellum
Some people don't write tests. But Bob was only comparing TDD and 0 tests. A
lot of people sprinkle a few unit tests for sanity, instead of letting tests
drive development.

There’s a wide spectrum of successful coding practices that get products
shipped. Bob’s article only compares the outliers.

------
Xcelerate
TDD is but one method of writing code; one that I've never found particularly
useful. There are plenty of other things you can do to perform your job well.

------
JoeAltmaier
Skipping testing amounts to risk. The winners in a brutal race take every risk
to win.

So yes, it's foolish to skip testing. And yes, the race goes to the foolish.

~~~
unclebobmartin
The winners in a brutal race have learned how to avoid as much risk as
possible. The prize does not go to the biggest risker. The prize goes to those
who do everything they can to minimize the risk of going slow.

What slows a programmer down? Debugging. Messy code. Fear of change. How do
you minimize those risks? A comprehensive suite of tests. How do you get that
suite? TDD.

~~~
JoeAltmaier
And you come in 2nd or 3rd in the race, and lose.

~~~
unclebobmartin
That is certainly your fear. It is a fear I don't share. Firstly, I think
you'll get there faster if you work well. Secondly, as MySpace showed, being
first isn't really all it's cracked up to be.

~~~
JoeAltmaier
I'm not making myself clear. You might beat me, personally, once. But you won't
beat that one lucky guy that didn't do the testing, got his code to work the
1st time, and got to market before either one of us. That's how the game's
played. Conservative, careful, correct coders don't win that game.

And if your business plan is to be 2nd to market, well, good luck with that.

------
barrkel
"as if any bug is acceptable!"

This is the point at which I stopped reading.

Yes, some bugs are acceptable. The point of writing software is not to create
bug-free code; it's to create value. The marginal returns on eliminating the
last bug are much lower than implementing new functionality.

As for the gist of the post, from my experience, TDD is OK; but it is unit
tests that are essential. They are far more important in dynamic languages,
because you otherwise have very little feedback when you make mistakes as
simple as a typo. They're also important in static languages, but fairly
large programs can successfully be written without anything near the volume
of testing needed with a dynamic language.

------
jasim
It is easy to hate TDD.

In a Rails project, the Rails boot-up time makes TDD a painful experience if
you are not using tools like Zeus or Spork. Even in the presence of such
tools, you need a powerful machine to not hate the slowness of the whole thing
and worse still, break your flow.

My recommendation for someone who hates TDD for the wrong reason, aka:
breaking the flow, would be to get a fast machine, fix the code to make TDD as
painless as possible and to use the right tools.

But once you start using TDD as a tool to organize your thoughts and model
your domain, you might end up becoming too dependent on it and find it hard
to work any other way. This is anecdotal experience.

Also, learning how to wield TDD properly takes a lot of time, error and
practice. Good things don't come easy. There are obviously places where TDD
isn't a good fit - a spike, a requirement that is known in advance to change
soon, and exploratory programming are all candidates. However, good practice
dictates that you refactor your code once a spike calcifies into production
code. At this point, TDD becomes just unit-testing.

Most of the arguments against TDD in this thread seem to be against unit
testing in general. But we know unit tests are important. Doing it before the
fact increases the value of the unit tests manifold and also ensures that you
do have coverage (though that is not a primary objective).

------
gearoidoc
Whilst not exclusive to TDD, I think DHH said it best:

"The road to programming hell is paved with “best practices” applied too
early."

~~~
freshhawk
And the road to failure is ignoring best practices because "I'm special" and
"I'll do it later because I'll magically have time then". It's a balancing act
that pithy aphorisms don't help with.

There's a reason people use the metaphor "Design Debt". Debt is a tool with
tradeoffs, financial or design, use appropriately.

~~~
gearoidoc
But what if the debt outweighs the benefits?

What if you can't measure the debt?

~~~
freshhawk
If the debt outweighs the benefits you don't take it. I have a project right
now that I "know" will be fragile and a maintenance headache later. So even
though it's urgent, I'm writing the appropriate tests and doing all those
best-practice things that you know will pay off 10x later but often skip
because you don't want to take the time now.

You could measure how much debt your team tends to take on by measuring how
much time is spent on new features and how much on the kind of bugs and
refactoring that pays off design debt. This still misses what I think is the
most important part of design debt: how much longer it takes to implement new
ideas because you are paying the "the pieces this depends on were rushed and
don't work/integrate/extend well" tax (I guess I should call this "interest
payments" so as not to mix metaphors).

You can't measure debt as you take it on though, the best you can do is
estimate how much work it will be to fix it later (and that's only if it needs
to be fixed later, maybe you get lucky and your hacked up code just stays good
enough). We all know more than enough about the pitfalls of estimating
software projects and this adds in more uncertainty about future need.

This is all part of the craft side of software development, experience helps,
but it's not something easily measured. Too many people take that as an excuse
to just do the quick and easy thing and say "move fast and break things!" or
fall back on over designing and never get any work done. HN talks about the
latter more often and pretty much ignores the former. I find this strange,
I've read plenty of accounts of failure where the reasons boiled down to "we
got to a point where we couldn't adapt our codebase to changes needed to face
a new competitor, change in the landscape, business model pivot, etc. because
it was too crufty". Enough design debt means some smaller and more agile
competitor will eat your lunch.

Well that's definitely a long enough answer to two simple short questions.

~~~
gearoidoc
A good response :)

I think we just sit on different sides of the fence on whether to apply best
practices early and suffer the initial time cost upfront, versus getting a
rough-around-the-edges product in place and refactoring later.

Do you have links to articles about companies whose business failed due to
lack of technical agility? I was thinking about this recently and don't
believe I've ever read about such a case, but it's highly likely I'm looking
in the wrong places :)

~~~
freshhawk
I look at it as a spectrum rather than a binary choice. I find, for myself,
that in the last few years I've gotten screwed too often by spending weeks
(amortized over a few months to a year) fixing bugs and patching up hacks that
I could have avoided with a few hours of upfront cost.

So I try to move my default a little more to the best practice side. I also
don't see this epidemic of overdesign in our community, although it exists. I
see people hit a design debt wall and have greatly reduced velocity more
often. Sometimes laziness reinforces a desire to move quickly and the wrong
tradeoffs are chosen. Sometimes the upfront cost is higher because you have to
train people in those best practices, or at least teach them the new tools.

I also have noticed that different fields of programming have different sweet
spots so it's not surprising that people don't agree on the one true way
either.

------
moocow01
Largely agree with everything here, but in my opinion there is often a very
strong current that forces you to do it the fast, stupid way. If you don't
have any of these problems, you work in a unique place...

- Agile - Agile IMHO largely screws you in making sound technical decisions.
It's not necessarily because agile is flawed - it's usually because
business/management uses agile as an excuse to randomly take a hard left turn
every other week, making it much harder to make long-term architectural
choices that are beneficial.

- No agreed-upon standards or unification amongst software professionals -
The accountant in this situation has a set of standards and expectations that
allow/force him or her to do things in a moderately set way. This allows the
accountant to usually fend off management from pressuring things to be done
in a shoddy, just-get-it-done way. On the other hand, it's much more
difficult for software professionals to say "we as a group do not condone
writing shitty software" (because ironically a large number of us do ...
"move fast, break things" has done more good than bad from my perspective)

- Ageism - Also known as "experience doesn't really matter." Some will say
that's because technology changes so much - but it actually doesn't. Just
because you've been grappling with software problems and design patterns in
Java doesn't mean that anything really changes when you switch over to
Python. Same shit, same problems, different words - but in all honesty we
seem to be pretty bad at building on the experience of our elders, because
they are over the hill at 35... what could they possibly teach us?

For me it has at times been frustrating, because coming out of school I
really enjoyed the fun of designing systems and code that are sustainable,
performant, etc., but there typically seems to be more reward in startups for
just throwing quality out the window. Just my personal experience.

------
lubujackson
In my mind, TDD is a defense against complexity. Complexity is always the
enemy. If you can simplify your code through structure or clean design you can
minimize or remove testing.

The moment you can't hold the whole thing in your head with ease is the moment
you should have done TDD a while ago.

------
kyleashipley
Although I've been a hardcore TDD advocate from time to time, I find myself
writing fewer and fewer tests in the early stages these days.

In the very earliest days of a startup, you can easily hold the entirety of
the problem space in your head. Refactoring is simple, tests or not. I can
destroy and rewrite a feature from scratch in hours or days. A non-trivial
number of the features I'm writing will not exist _in any form_ a week or two
later. This is not a problem with process or planning; this is the nature of a
startup.

I think that TDD is immensely valuable as a team and product grows, but
claiming that your startup will fail because you aren't applying engineering
"best practices" from day 1 is counter to most successful startups I've worked
with.

------
matt2000
I disagree with this almost entirely.

Startups _are_ different:

1. What you are doing starts at 0 value.
2. You can (usually) break it and it's ok.
3. You need to change stuff a LOT to figure out something that someone wants
   to use; rewriting tests makes that slower.

Every test case you write has a cost: It verifies that something of unknown
value works. What if the value of that code is 0? Well, you just doubled down
on something useless.

There are no absolutes in software development, but the way we did it was to
have uncomfortably few test cases when something was new, change the product
for a while until someone actually wanted to use it, then add a bunch of
tests until we knew it worked from there on out.

And I hesitate to add: Also we work in Java so a bunch of testing comes for
free, so there's that.

------
rcconf
The way I do my TDD is by writing the unit tests to specifically test the
functionality of the code.

For example, if I'm writing a module that does validation for a specific type
(for example, in JS), I tend to test the functionality while writing the
tests. I can't speak for others, but you have to test your code eventually,
and it's either going to be in the classes that are using the code, in the
REPL, or in your unit test/integration framework. I usually pick the test
framework first because it serves two benefits:

1) I can prove my code works. (I must prove this to myself, or to the people
using the code at some point.)

2) I can reliably change the code later.

Complexity is also mentioned in a couple of comments here. TDD helps simplify
complexity by making you write code that is testable. It also makes you think
about your API as you write out the different ways the module/code can be
used.
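
Something like this hypothetical validation module (sketched in TypeScript
with Node's built-in test runner; all names are illustrative) is what I have
in mind -- the tests prove the module works and make its rules safe to change
later:

    import { test } from "node:test";
    import assert from "node:assert/strict";

    export interface SignupForm {
      email: string;
      password: string;
    }

    // Returns human-readable problems; an empty list means the form is valid.
    export function validateSignup(form: SignupForm): string[] {
      const errors: string[] = [];
      if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(form.email)) {
        errors.push("email is not valid");
      }
      if (form.password.length < 8) {
        errors.push("password must be at least 8 characters");
      }
      return errors;
    }

    // (1) proves the code works; (2) makes later changes to the rules reliable.
    test("accepts a well-formed signup", () => {
      assert.deepEqual(
        validateSignup({ email: "a@example.com", password: "longenough" }),
        []
      );
    });

    test("reports each problem separately", () => {
      const errors = validateSignup({ email: "not-an-email", password: "short" });
      assert.equal(errors.length, 2);
    });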

Further, I think there are multiple issues with tests, and depending on the
type of test you're doing, different problems can arise. You have your unit
tests, and then you have your integration tests.

Unit tests are relatively easy to write (if your code is split into
individual chunks that do a specific task) and they should always be written
regardless of the time you have.

I think the major time issues arise with integration tests. When you're
testing the functionality between complicated modules that require databases,
I/O writes, and network communication, it is sometimes hard to write the
tests, and it may not even be worth it.

The majority of people who disregard TDD usually disregard a specific
sub-problem of TDD. I think TDD has its benefits depending on what type of
tests you're doing, and how easily the problems can be solved with those
types of tests. You can always choose to write unit tests and disregard
integration tests, etc.

I think there are a lot more issues to expand upon, like the language you're
using, the platform you're using and how often your code is required to
change, but TDD has large benefits, and code rot is very real and TDD can help
mitigate it.

------
Mahn
I'm not sure the author has any idea what it is like to bootstrap a web
product and watch the money go down the drain day after day, with zero income
to compensate, and with the certainty that you won't make it for another 6
months if the product is not out and making revenue. I'll deal with the
technical debt later, thanks. I'm pretty sure Facebook and Google didn't do
anything remotely close to TDD when they first shipped, and yet they survived.

------
unclebobmartin
To those of you who asked for "links" to studies that prove that TDD works;
google around, you'll find plenty. Some are positive, some are negative --
what else is new. Now, please show me the studies that show that _your_
discipline works.

To those of you who consider TDD a religion; you are being silly. TDD is a
discipline, like double-entry bookkeeping, or sterile procedure. Like those
disciplines it has aspects that are dogmatic; but that aim at a purpose. That
purpose is to help you write software fast and well.

To those of you who think I'm a process peddler. You're right; I am. I make a
very good living teaching people how to do TDD. I have been doing so for over
ten years now. I hope to continue for some years to come. My goal is to help
raise our industry to a higher level of professional behavior; because at the
moment our behavior is pretty poor.

To those of you who wonder whether I've ever worked at a real start-up. I've
worked at several over the years. My first was in 1976; and it was very
successful. Another was in 1989, and it didn't do so well. I've recently
founded cleancoders.com, a startup in video training for software developers.
The website is done entirely with TDD. And this doesn't count the rather large
number of startups I have consulted for in the last 10 years. So I've got a
_lot_ of experience with startups.

Folks, I am 60 years old. I got my first job as a programmer when I was 18. I
wrote my first program when I was 12 (on a plastic 3-bit computer). I started
using TDD in 1999, after I'd been programming for thirty years. I continue to
write code using TDD today. I've seen the startup trap. I've lived the startup
trap. I strongly advise you to avoid that trap.

------
stickbranch
I want to vomit when I see titles like Master Craftsman.

~~~
unclebobmartin
Please use the air-sickness bag located in the seat back in front of you.

------
johnrob
So the key to meeting your short term goals is... blow them off and focus on
the long term goals. Doesn't make sense to me.

------
zwieback
Hacking up a quick demo the non-TDD way definitely has its upsides, but if
you don't force yourself to review and rewrite your early code, you would
have been better off doing the right thing from the start.

Just preaching TDD is not helpful; we need a coding process that allows clean
separation of early code from production code.

------
lifeisstillgood
Know the rules before you break the rules. The accountant analogy is dead on -
that and the best quote of the day:

    
    
      You'd fire his ass! You'd fire it so fast that the rest of 
      his worthless carcass would be left outside the door 
      wondering where his ass went!
    

Brilliant.

------
pardner
Statistically speaking, a web startup can safely not invest in TDD. For
precisely the same reason that a daily Russian Roulette player can safely not
invest in health insurance.

But for the minority who miraculously dodge the bullet, the extra investment
will eventually become a lifesaver.

------
spullara
I find that making sure the foundation of an application is solid (I'm an API
first kind of person) lets you get away with a lot less testing in the upper
layers. This lets you move faster and experiment far more quickly than if you
were built on quicksand.

------
digitalWestie
I see this as an argument for testing, but not necessarily TDD. Why have the
two become so intertwined?

------
PasserBy2
"Oh, I know you are a warrior-god."

Well, at least someone noticed!

------
alexrson
A useful reminder

------
michaelochurch
This "trap" is Deadline Culture and it doesn't pertain only to code quality,
but to other matters like product direction and personnel decisions, which are
made quickly and often badly. It's sloppy and destructive. Companies with
Deadline Culture often make it easy for a total asshole to gain power just by
citing existential risks that either do not exist or are of extremely minor
importance. Deadline Culture companies are _obsessed_ with competitors, even
though it's their own internal corrosion that does them in.

Sometimes, having deadlines is unavoidable. They might be set externally and
exist for regulatory reasons. Deadline Culture is when the company starts
encouraging the formation of unreasonable deadlines that triumph over
professional integrity and intellectual honesty.

VC-istan seems to encourage it with the whole "if we get bored with you, we'll
fund your competitors" arrangement.

Deadline Culture is, however, great for the true fascist. Fascists love (real
or imaginary) existential threats, especially vague ones they can continually
adapt to their needs, but that come with time pressure.

