
Is the development of automated tests a waste of time for a startup? - richcollins

======
mdakin
Think of unit-testing as a tool and not a dogma.

Proper unit-testing roughly doubles the amount of time it takes me to write a
"unit" of code. This is a high cost so I do not have 100% unit-test coverage.

Over time I have developed an intuition about whether I NEED a unit-test for a
particular unit. When I think I need tests I write them.

Typically I'm correct about whether the units need tests but when I make a
mistake (as evidenced by bugs in the code I didn't unit-test) I go back and
write the tests.

This "middle way" works well for me.

~~~
felipe
Research has shown [1] that unit testing actually cuts development time by
20%-30%, because in the long run you will spend less time debugging code or
fixing code broken by other changes.

[1] - I don't have time to look up the specific link right now, so you'll need
to trust me on this one! :) But I'm sure one can Google it. I remember reading
it in an IEEE magazine years ago.

~~~
richcollins
In what context does it cut development time?

~~~
cwilbur
If you only count the initial development time as "development," writing tests
will increase your development time. If you take a longer view, and consider
maintenance and later development, a good test suite makes it easier to find
bugs and makes it possible to add features and improve code with confidence
that you aren't breaking anything that relies on the old behavior.

------
tx
Without automated testing, dynamic languages _are not_ very productive,
because they do not catch type mismatch errors at compile time (there is no
compile step). With statically typed languages you get a lot of tiny issues
taken care of once you hit the magic "compile" button and the compiler does
some basic sanity checks.
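A toy sketch of what that means in Python (the function and its contract are
invented for illustration): the mismatch below would be a compile error in a
static language, but here only a test catches it before production does.

```python
def total_cents(prices):
    """Intended contract: prices is a list of floats (dollars); returns int cents."""
    return int(round(sum(prices) * 100))

def test_total_cents():
    assert total_cents([1.50, 2.25]) == 375
    try:
        total_cents("1.50")        # a string, not a list of floats
        assert False, "expected a TypeError"
    except TypeError:
        pass                        # the runtime, not a compiler, caught it

test_total_cents()
```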

Without solid unit tests backing you, activities like removing "obsolete"
methods or renaming classes/modules turn into a nightmare with duck-typed
languages.

~~~
richcollins
We didn't write tests for our last startup and we never had too many problems
with bugs.

We do write tests for this one and we don't have too many problems with bugs.

Tests take almost as much time to develop as the features do. I just wonder if
it is worth the time and if other people with startups write automated tests.

~~~
tx
No difference? That's pretty impressive. I wish I were able to look at your
process. Either your engineers are extremely smart and anal about their
practices or your domain is very simplistic (ta-da lists).

At my previous startup we had unit tests run on the entire system
automatically after every check-in (with CruiseControl), and whenever someone
broke a test suite, everybody got an email with the name of the offender and
the diffs of his code against the trunk.

We also had a "beer rule" about violators getting drinks for everyone, to
make the process a bit more exciting.

~~~
richcollins
Facebook doesn't have an automated test suite. I wouldn't call that a
simplistic app.

~~~
immad
really?! I would be shocked if they don't have at least automated testing on
their most critical systems. What's the source of that info?

~~~
richcollins
we asked Mark Zuckerberg at Startup School

------
felipe
Another huge side advantage of automated tests is that they become living
documentation of how to use your code (unit tests as "sample code").
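For instance (slugify and its behavior here are made up), a unit test can
read like sample code for the API it covers:

```python
import re

def slugify(title):
    """Lowercase a title and join words with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def test_slugify_documents_the_api():
    # A newcomer can read these assertions to learn the expected behavior
    # without digging through the implementation.
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  Already--slugged  ") == "already-slugged"

test_slugify_documents_the_api()
```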

------
dhouston
it depends, and in my experience can be reduced to a cost/benefit analysis. a
sprinkle of different things has worked for me -- some things to think about:

what's the impact of a typical bug in your app: an error 500 page (annoying)
or a BSOD or corrupted data (much worse) or a missile hitting a house? these
will merit different levels of testing rigor.

if something is breaking, how do you even know? how much time will have
elapsed? how long will it take you to find the cause, and is suitable
debugging information available for your investigation? how quickly can you
come up with a fix, and how expensive is it to deploy it? this is, in my
experience, a _much_ more effective place to optimize if you're a web app,
especially if you're a small team.

for the non-core pieces of a web app, i'd rather release twice as fast and
have logging and monitoring infrastructure in place to find and fix problems
quickly (e.g. exceptions logged and SMSed to our cell phones) than spend 2x
to write unit tests (and maintain them later). a nice property is that
you'll tend to find bugs roughly in order of decreasing impact, or at least
prevalence. however, this approach won't do much against silent failures or
bugs that are both rare and high impact.

there are lots of other effective tools: automated smoke tests (by which i
mean end-to-end tests that quickly exercise large pieces of the app) deliver
a lot of bang for the buck, and root out unintended consequences of seemingly
local changes that even unit tests sometimes miss. i also find that generous
use of the assert keyword surfaces failures fast, nearer to the cause.
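a rough sketch of such a smoke test (the urls and the fetch stub are made up;
a real one would use an actual http client against a running instance):

```python
def fetch(path):
    # Stand-in for a real HTTP client; returns (status_code, body).
    pages = {"/": (200, "<html>home</html>"),
             "/login": (200, "<html>login</html>"),
             "/signup": (200, "<html>signup</html>")}
    return pages.get(path, (404, ""))

def smoke_test():
    # Quickly hit the big pieces of the app and fail loudly.
    for path in ["/", "/login", "/signup"]:
        status, body = fetch(path)
        # assert finds failures fast and near the cause
        assert status == 200, "smoke test failed for %s" % path
        assert "<html>" in body

smoke_test()
```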

------
tx
I find it really counterproductive whenever people bring the following
arguments to the table (during discussions):

"I don't use it" or "Facebook does not use it". So? What does that prove? Who
taught you, gentlemen, to use such useless arguments in a debate? It sounds
like "I do not believe unit tests are useful _because_ I do not use them".
Your logic is reversed.

BTW, Einstein did not unit test either.

------
tx
Let's not forget about the quality of unit tests. I've seen people write
completely idiotic tests that really do not protect them from anything (like
calling methods and making sure they do not throw exceptions). Such "unit
testing" is your time well wasted indeed. Another failure mode is going too
low or too high, i.e. being unable to find a proper balance between unit and
integration tests.
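A sketch of the difference (apply_discount is made up for illustration):

```python
def apply_discount(price, percent):
    return price * (1 - percent / 100.0)

def useless_test():
    # The "idiotic" kind: only checks that the call doesn't blow up.
    apply_discount(100, 10)

def useful_test():
    # Pins down actual behavior, including edge cases.
    assert abs(apply_discount(100, 10) - 90.0) < 1e-9
    assert apply_discount(100, 0) == 100.0
    assert apply_discount(0, 50) == 0.0

useless_test()
useful_test()
```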

IMO proper unit testing is a skill that takes some time to acquire. Currently
I am thinking about good interview questions for our candidates that will
help us find out how well people understand the _reasons and goals_ of good UT.

Moreover, there are some areas of development that are notoriously hard to
unit-test (with good results): Win32 UI is a good example. Even in those
cases it's possible to write a little, but highly effective, UT code (for
instance, if you have a unit-test-friendly design in place, like the mediator
pattern, where you can "drive" your UI by sending commands to it).
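A rough sketch of the mediator idea (all names are illustrative): the UI
handlers delegate to a mediator object, so tests "click" by calling the
mediator directly instead of driving real Win32 widgets.

```python
class SignupMediator:
    def __init__(self):
        self.saved = None
        self.error = None

    def submit(self, username, password):
        # The same logic the real button handler would invoke.
        if not username:
            self.error = "username required"
        else:
            self.saved = (username, password)

def test_submit_via_commands():
    m = SignupMediator()
    m.submit("", "secret")            # simulate clicking Submit with no name
    assert m.error == "username required"
    m.submit("alice", "secret")
    assert m.saved == ("alice", "secret")

test_submit_via_commands()
```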

------
spejic
I actually completely disagree with sillydude. Even though automated tests may
make the team a bit slower while developing, the suite sure as hell makes
the code the team produces more reliable. So, while the speed of development
is very important, so is the quality of the product you get out there, even
when it's something simple. The more small, simple bugs you can get out of
the way before launch, the better. So automated tests are, at least in my
opinion, high priority.

~~~
richcollins
Is there evidence that time spent writing tests improves the quality of the
code over time spent reading the code and trying to make sure it is simple and
straightforward?

~~~
cwilbur
There's no evidence that reading the code and trying to make sure it is simple
and straightforward improves the quality of the code, for that matter.

The quality of the code is a combination of the quality of the programmers and
the quality of the processes they use to produce the code. If you have really
good programmers, they can use a shoddy process (write, compile, ship) and
still produce reasonable quality code; if you have average programmers with a
good process (write test, write code, compile, fix warnings, compile, run test
suite, fix errors, compile, run test suite, ship), you can produce good code.

What I've found is that whether a test will be useful or not depends on the
nature of the code. If I'm writing logic code or building a domain-specific
language, an automated test suite means the difference between getting it
right quickly and taking a long time to get it hopefully probably mostly
right.

Readability is only one aspect of code quality, and code that does something
complex and convoluted will never be simple and straightforward. For that
matter, complex, convoluted, and correct is preferable to simple,
straightforward, and wrong, and that's the sort of thing that a test suite
will tell you.

~~~
richcollins
I meant reading it as you (re)write it.

~~~
cwilbur
Even then! I've seen programmers read the code, get it wrong, and rewrite it
in a way that introduces bugs. Hell, I've _been_ that programmer. It's clear,
it's readable, it's concise, and it's wrong.

Readability and correctness are orthogonal concepts.

------
ricky_clarkson
The larger your units the more important unit testing is. Now replace larger
with smaller and more with less.

------
nmeans
If you have humans writing code for you, tests are an imperative. Without
tests, one seemingly small code error can accidentally wipe out a smattering
of user-created data. Sure, that's the worst case, but it could also mean
lights out for your startup. You have to decide if it's worth the risk.

~~~
richcollins
Shouldn't you be backing this kind of data up? You can't guarantee that tests
will catch this anyway.

~~~
nmeans
If you've got good coverage, you'll have a great chance that they will. You've
got to actively monitor your coverage though.

As far as backups go, a backup is only as good as its age. If a startup is
struggling to decide whether they've got the resources for writing tests, they
probably don't have the resources for maintaining a real-time database mirror
either. You can argue that you should always back up before you roll out a
code update, but the real world's not always like that.

~~~
mdakin
If one is using Postgres as the DB it is not really necessary to set up a
real-time database mirror to have a very robust error recovery strategy. (I
can't speak about MySQL but there is probably a similar feature.)

All you need to do is configure the Point In Time Recovery feature, write a
little backup script (mine is 165 lines of Python), and then call it from
cron.

With PITR as long as you have the last backup and the archived WAL files made
since the last backup you can recover to the SECOND before your program puked
all over the DB.
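The essentials, as a sketch (paths and version details here are examples;
check the PostgreSQL docs for your release before trusting any of it):

```
# postgresql.conf: ship each completed WAL segment to the archive
archive_command = 'cp %p /mnt/backup/wal/%f'

# nightly base backup, roughly:
#   psql -c "SELECT pg_start_backup('nightly');"
#   rsync -a /var/lib/postgresql/data/ /mnt/backup/base/
#   psql -c "SELECT pg_stop_backup();"

# to recover: restore the base backup, then point recovery.conf at the
# archived WAL, optionally stopping just before the bad transaction:
#   restore_command = 'cp /mnt/backup/wal/%f %p'
#   recovery_target_time = '2007-11-01 12:34:56'
```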

------
richcollins
I always go back and forth on this one. It always feels great when the tests
catch an introduced bug, but is it really worth the time you could be spending
on testing out new features with users?

I wonder if the solution is to be careful to write very simple, well designed
code.

~~~
nostrademons
I've gone back and forth on this issue too. I don't really have a firm opinion
on the subject, so all I can do is share my experiences.

At inAsphere.com (2000 teen content dot-com), we had no tests and didn't
really miss them. Our software was dead-simple though: a bunch of 3rd-party
forum and link directory stuff. We got killed for other reasons (i.e. there
wasn't really a market and we kinda sucked content-wise).

At Traxit (2000-2001 venture backed startup I worked at, selling remote access
software to get through firewalls) we had no unit tests, and it sank the
company. We got into a QA death spiral, where every bug that QA found made the
engineers introduce 5 more, and the product quality suffered so much that we
cancelled our launch. Cancelling the launch is the _absolute worst thing_ you
can do if you have venture funding, because the VCs will swoop in, throw out
most of the management team, and try to pick up the pieces. Of course they'll
fail, so the company folded about 6 months after I left.

At FictionAlley (2002-2005 nonprofit Harry Potter fanfiction archive), I
started with comprehensive unit tests, determined not to repeat the mistakes
at Traxit. I then got frustrated with how I was spending as much time writing
test code as production code, and abandoned them. It eventually launched, a
little buggy, and the users were reasonably happy despite some annoyance at
the bugs. However, the lack of up-to-date tests has made it very difficult for
new developers to come up to speed and change things, which means that
imperfections in the architecture have kinda fossilized.

At my previous project at my current employer (2006 IDE plugin), I released
without any unit tests. I then spent 2-3 weeks _immediately_ after release
writing a fairly comprehensive test suite for it. Overall it's been pretty
successful: product quality is fairly high, and the tests have caught several
insidious bugs since. I find they're most useful as a sanity check to show me
when I'm about to do something stupid. Often I'll do something that looks
like a trivial refactoring and then roll it back when I find out it's made a
whole bunch of tests fail.

At my current project at my employer (2007 financial webapp), we have no unit
tests. We are getting along okay without them, but that could be because we
haven't really met reality yet. When my employer has released previous
products (again, without any unit tests or QA), we've had to deal with some
pretty pissed-off customers because of the product quality or lack thereof.

At my startup (2007, flash games website), I've had fairly comprehensive unit
tests from the beginning. I haven't been running them lately, and they're a
bit out of date, again because it takes as much time to maintain the tests as
to write the code. While I was running them, they were useful in the same way
as above: they keep me from doing stupid things and making silly mistakes. I
plan to go back and make them pass 100% before or soon after launch.

If I had to summarize, I'd recommend that you skip them until after launch,
and then _any_ time you don't have an imminent feature request, spend that
time firming up the test suite. That way you don't delay your release too
much, they'll still be there for you when you need them, and you don't start
writing them until the feature set has somewhat stabilized. They're a big
investment, but they do pay you back.

~~~
palish
For some reason, I really like the philosophy of "Don't write unit tests until
right before you launch, then maintain them". Good observation.

~~~
irrelative
I second that. Once you've launched, your product is usually fairly stable (in
terms of domain, big features, etc.), and the tests are worth it for
confidence in smaller changes/maintenance. Before then, your code base changes
so often that updating the tests starts dragging down your productivity.

------
dazzawazza
I tend to write unit tests for core functions/modules as I develop them. It
helps me create an API and think about its use/documentation. I also write
tests whenever I come across a non-trivial bug: write a test that reproduces
the bug, fix the bug, and prove the fix by making the test pass. I haven't
gone the whole hog and adopted test-driven development. I think all types of
testing matter most once deployed; you need to prevent your customers from
'testing' your app :-) Of course, retrofitting tests is often a waste of
time, as your perspective isn't so clear by then.
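In miniature, that loop looks like this (word_count and its bug are invented
for illustration): first a failing test that reproduces the bug, then the
fix, and the now-passing test proves it and guards against regression.

```python
def word_count(text):
    # Buggy first version used text.split(" "), which miscounted runs of
    # spaces: word_count("a  b") came back as 3. The fix below uses the
    # default split(), which collapses whitespace runs.
    return len(text.split())

def test_regression_double_space():
    # Written to reproduce the bug before fixing it.
    assert word_count("a  b") == 2

test_regression_double_space()
```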

There is always a balancing act between perfect textbook engineering and just
making some money!

------
sillydude
It also depends on the size of your team and the quality of engineers and
code. Having tests is usually a plus, but they don't always have to be a high
priority.

------
wlievens
My unit tests are mostly related to the core logic. The only UI testing I do
is automated XHTML validation of pages generated for a test set.

------
jcwentz
Writing tests is just part of programming, whether it's for a startup or any
other project.

~~~
richcollins
That is a pretty conformist thing to say

