
Ask HN: How would you estimate the cost of testing as part of a dev project? - chunkyslink
I'm a software engineer working mainly in the browser. I work for a digital marketing company. As part of an ongoing improvement of our process and dev practices I've been asked to 'Estimate and document the cost of testing as part of a development project'.

I can, and do, write tests for my code, but I want to make a decent effort at the above question.

There are some smart people in here. How would you go about this?
======
MeteorMarc
Interesting question. Maybe you should educate your supervisors about the
value of testing. Obviously, code without tests has little value if that code
needs to be extended and maintained. Also, writing tests helps you define
better APIs, so it is not always clear which hours should be counted as
development hours and which as testing hours. Maybe the question should have
been: "what are optimal procedures regarding the writing and running of
tests?" You could also put too much effort into testing, or have suboptimal
roles in the devops team.

------
davismwfl
A lot of this depends on the type of coding you are doing and the needed test
coverage. For example, testing a browser app that does CRUD for say a lead
generation tool isn't as critical as an embedded application running a medical
device.

That said, you mentioned mainly browser. One way to estimate the cost of
testing (regardless of domain) is to decide what your code coverage needs to
be, figure out for your language and tooling how many lines of test code that
implies relative to the code being tested, then calculate a cost based on how
long it takes to create those tests. It won't be perfect, and I'd not
highlight the lines-of-code methodology to management, but it is an effective
way to understand the problem. E.g. if I wrote a 2000-line module of moderate
complexity and had to write 3000 lines of test code to get to 100% coverage,
then my testing will likely cost about the same as, or slightly more than,
the development. A lot depends on tooling, where code generation might help
reduce the workload.
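The LOC-ratio approach above can be sketched as a small calculation. This is a minimal illustration, not a benchmark: the productivity and rate figures are assumptions you'd replace with your own team's numbers.

```python
def estimate_testing_cost(prod_loc, test_to_prod_ratio, loc_per_hour, hourly_rate):
    """Rough testing-cost estimate from a test-to-production LOC ratio.

    prod_loc           -- lines of production code in the module
    test_to_prod_ratio -- expected test lines per production line
                          (e.g. 1.5 if 3000 test lines cover 2000 lines)
    loc_per_hour       -- assumed test lines written per developer-hour
    hourly_rate        -- assumed loaded cost of a developer-hour
    """
    test_loc = prod_loc * test_to_prod_ratio
    hours = test_loc / loc_per_hour
    return {"test_loc": test_loc, "hours": hours, "cost": hours * hourly_rate}

# The 2000-line module from the example, with assumed figures of
# 25 test LOC/hour and $100/hour:
est = estimate_testing_cost(2000, 1.5, 25, 100)
# → 3000 test lines, 120 hours, $12,000
```

The point isn't precision; it's that once you pick a coverage target, the cost of reaching it becomes a number you can defend rather than a shrug.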

For example, I work on medical devices and associated products (including
browser-based). The general rule of thumb we are finding is that the cost of
testing devices is roughly 1.5x the cost of the initial code creation. Said
another way, if it took us 1 month to write the initial functionality, we
expect it will take us 2.5 months to write that functionality with proper
test coverage and test it all. Move to the web, node/javascript in our case,
and the tooling is pretty slick and will create a lot of the boilerplate for
you, so the testing there is more like a 0.6-0.8 to 1 ratio. Meaning for
every month's worth of feature creation I generally estimate 2.5-3 weeks of
test code, so we would book 1.75 months for development and test creation for
the same 1 month of features.
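The ratio arithmetic above is simple enough to write down directly. A minimal sketch, using the two ratios from this comment (1.5 for devices, ~0.75 for web) as illustrative inputs:

```python
def schedule_with_tests(feature_months, test_ratio):
    """Total booked schedule when test effort scales with feature effort.

    test_ratio is the test-to-development time ratio: e.g. 1.5 for
    the device case, ~0.75 for the web case described in the comment.
    """
    return feature_months * (1 + test_ratio)

# 1 month of features at the web ratio books 1.75 months total;
# at the device ratio it books 2.5 months.
web_total = schedule_with_tests(1, 0.75)     # → 1.75
device_total = schedule_with_tests(1, 1.5)   # → 2.5
```

The useful habit is booking the combined figure up front, so testing time is never treated as optional slack to be cut later.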

Mind you, this can all vary greatly depending on the team, requirements and
needs, but it should at least help you figure it out. When asked for a quick
"guess", I always ballpark testing at 100% of the cost of feature development
for most projects. As an example, I was recently involved in rewriting some
APIs for our solution; we spent as much time writing tests and testing as we
did creating the initial features.

You will almost always hear pushback from people who don't understand that
testing costs real time and money. But the key is that if testing is set up
and done during feature development, the cost is incremental to the schedule
and provides robust regression analysis. As you add features, the test suite
gets more robust, and deploying a feature takes less time because the tests
help validate the quality of the code.

