
Ask HN: Do you follow TDD at work? - samrohn
I am learning about TDD and one of the TDD principles I come across again and again is `Test behaviors and not implementation details`. This seems very abstract to me. The behaviors are often very broad in practical use cases. How do I go about modelling the expected behaviors as tests? Most tutorial examples revolve around toy problems, which are difficult to map to real-world scenarios. Can anyone share a few examples from their experience of how to go about writing TDD tests in real-world situations?
======
satishgupta
First and foremost, I don't treat anything as gospel. TDD, like
everything else, might not make sense in all scenarios.

I often use TDD for microservices. It forces me to define the REST service
interface from the perspective of consumers, and only then jump into
implementing it. So I write tests that call the service for various use cases
and assert on the returned status code, payload, and any invariants. All tests
fail initially.
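
A minimal sketch of what one such consumer-first test might look like (the endpoint, payload shape, and `create_user` stub are all hypothetical, not from any real service):

```python
# Consumer-perspective test for a hypothetical "create user" endpoint.
# In real TDD, create_user would not exist yet and both tests would fail
# (red); the stub below is the minimal implementation written afterwards.

def create_user(payload):
    # Stand-in for an HTTP call; returns (status_code, response_body).
    if "email" not in payload:
        return 400, {"error": "email required"}
    return 201, {"id": 1, "email": payload["email"]}

def test_create_user_returns_201_with_id():
    status, body = create_user({"email": "a@example.com"})
    assert status == 201
    assert "id" in body                      # invariant: every user gets an id
    assert body["email"] == "a@example.com"  # payload echoes the input email

def test_missing_email_is_rejected():
    status, _body = create_user({})
    assert status == 400                     # invalid input -> client error
```

The tests only know the request and response shapes, so the implementation behind them can change freely.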

It is indeed cumbersome and requires discipline, but it becomes easier with
practice. I find it particularly useful for microservices. Earlier I would
implement, then test, and while doing both I would keep figuring out a better
API and keep changing it. With TDD, I get clarity upfront about what
functionality I want to implement, so counterintuitively TDD makes me faster.

I use it mostly wherever interfaces are involved. For example, if my service
uses a DB, unless the use cases are too simple, I write tests for that
API.

For parts that don't have an (external) interface, and that I want to stay
fluid and evolve, I write tests only after implementing. But the service-level
(TDD) tests provide some safety net.

I can see that this approach could be extended to all levels, but I am not
sure about the cost vs. benefit tradeoffs. I think if I do TDD only for macro
things (process/service/subsystem interfaces) and some leaf library-type
modules, the tradeoffs are favourable. And if you have TDD for those, the
returns of doing it for the rest diminish.

Disclaimer: I haven't done a detailed study; I am just relaying my own
experience, so this is only one data point.

------
speedgoose
In the real world, most people don't do TDD at work.

In my opinion it's a good tool in some cases, for simple algorithms or when
you know exactly what you want. But most of the time it's too cumbersome or
simply not practical. So you should not force yourself to do TDD for the sake
of it. That's also why you only find toy examples: those are the cases where
TDD is a good idea.

~~~
samrohn
Interesting perspective. I was watching this talk and thought there should be
a way to do this
[https://www.youtube.com/watch?v=EZ05e7EMOLM](https://www.youtube.com/watch?v=EZ05e7EMOLM)

------
mytailorisrich
How do you know what to code if you are not clear about the code's expected
behaviour?

This is one of the principles of TDD: be clear about what your code should do
(i.e. its behaviour) before you start coding.

~~~
samrohn
I know the exact expected behavior before I start coding. But I am having
trouble translating that expected behavior into my TDD test cases. For
example, an expected behavior could be: the functionality I am building should
periodically look into a database, compute some metrics, and then generate a
report. This is a broad piece of expected behavior, unlike a unit test where
you test a small piece of functionality. As per TDD principles, it is advised
to write tests at your interface and avoid any dependency on your
implementation details, as these may change in the future. I am having a hard
time understanding how to do this for the use case I just mentioned.

~~~
mytailorisrich
You can do TDD at any level. You can do TDD for unit tests.

From what you're describing, your issue might not be TDD itself but defining
and implementing the right tests for your system.

I would say, look at your functions. How can you test them? Look at your
interfaces. How can you test them?

TDD is just doing that before you code.
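
Applied to the report example upthread, one hedged sketch (all names hypothetical) is to test the behavior "given this data, this report comes out" at the function's interface, and keep the scheduling and database access behind injectable dependencies rather than inside the test:

```python
# The test only specifies the observable behavior: rows in, summary out.
# How rows are fetched (SQL, ORM, cache) and when the job runs (cron,
# scheduler) are implementation details kept out of the assertion.

def generate_report(fetch_rows):
    rows = fetch_rows()
    total = sum(r["amount"] for r in rows)
    return {"count": len(rows), "total": total}

def test_report_summarises_rows():
    fake_fetch = lambda: [{"amount": 10}, {"amount": 5}]
    report = generate_report(fake_fetch)
    assert report == {"count": 2, "total": 15}  # behavior, not implementation
```

The periodic-trigger part can then be tested separately (does the scheduler call the job?), so neither test breaks when the other side's internals change.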

