I've noticed that TDD forces me to think about the code in a better way, before actually coding it, than just going ahead and coding it and then figuring out how to test it after.
This is by no means an easy practice to adopt, btw, especially after years of not doing so.
I actually think TDD should be taught in schools and never even considered a separate aspect of coding; TDD should be integral to coding, period. If it hadn't been introduced as a separate practice in the first place, it would just be called "coding."
> that you only need to spend less than 10-30% of your time debugging your code before it isn't worth it
That is a fair point for today's work, but it isn't factoring in future costs. You're not just reducing TODAY'S bugs by 90% with that increase in overall coding time; you're also vastly reducing the technical debt of the code in the future.
You're also writing a permanent, provable spec for the code. What happens 5 years after you write your untested but debugged code and have to go back to it to add or fix something? How in the hell will you remember the whole mental model and know if you are in danger of breaking something? The answer is, you will not. And you will tread more useless water (time and effort) debugging the bugfixes or feature-adds or refactorings.
Speaking of refactorings, they are almost impossible to do without huge risk unless you have well-written unit tests against the interface to the code being refactored.
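To illustrate the point about testing against the interface rather than the implementation, here is a minimal sketch (hypothetical names, not from the thread): the test only exercises inputs and outputs, so the body of the function can be refactored freely while the test stays green.

```python
def total_price(items):
    """Return the sum of quantity * unit_price over (quantity, unit_price) pairs."""
    # Implementation v1: explicit loop. A refactor to
    # sum(q * p for q, p in items) must leave the test below passing.
    total = 0.0
    for quantity, unit_price in items:
        total += quantity * unit_price
    return total

def test_total_price():
    # The test pins down the interface's behavior only;
    # it never peeks at how total_price computes the result.
    assert total_price([]) == 0.0
    assert total_price([(2, 1.5), (1, 3.0)]) == 6.0

test_total_price()
```

Because nothing in the test depends on the loop, swapping the implementation is a low-risk change: if the refactor breaks the contract, the test fails; if it doesn't, you know immediately.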
In short, if you do not write tests, you are placing a bet against the future of your code. Do you really want to do that? Do you consider your work THAT "throwaway"?
That said, tests are no panacea... and I am NOT trying to sell them as one (which would be wrong). You might assert the wrong things, or miss testing negative cases (a common one is not testing that the right runtime error is thrown when inputs to the code do not conform). There are cases on record of well-tested code that has passed all tests (like satellite code) and still fails in the real world because of human error (both the code AND the test were wrong).
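The "negative case" mentioned above can be made concrete with a small sketch (hypothetical function, not from the thread): alongside the happy path, assert that the *right* runtime error is raised when input does not conform.

```python
def parse_age(value):
    """Parse a non-negative integer age from a string."""
    age = int(value)  # raises ValueError for non-numeric input
    if age < 0:
        raise ValueError(f"age must be non-negative, got {age}")
    return age

def test_parse_age():
    # Positive case: well-formed input round-trips.
    assert parse_age("42") == 42

    # Negative cases: each malformed input must raise ValueError.
    # A suite that only covers the positive case would miss a bug
    # where bad input silently produces a wrong value instead.
    for bad in ["abc", "-3"]:
        try:
            parse_age(bad)
        except ValueError:
            pass  # expected
        else:
            raise AssertionError(f"expected ValueError for {bad!r}")

test_parse_age()
```

Note that the test pins the error *type*, not just "something goes wrong": if a refactor changed the failure mode to a silent default or a different exception, the test would catch it.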
"I've noticed that TDD forces me to think about the code in a better way, before actually coding it, than just going ahead and coding it"
IMO, that is the crux of the matter: Thinking-Driven-Design is the way to go. The idea that you _need_tests_ to do the up-front thinking is, again IMO, bogus, and writing tests without thinking doesn't help much, as you seem to agree, given your remark about missing test cases.
Some people use paper or a whiteboard to make themselves think. Others go for a walk. Yet others can sometimes do it sitting right at their monitor while doing slightly mindless things, such as deleting no-longer-important mail or setting up a new project.
Also: good tooling makes many kinds of refactorings extremely low-risk. Strongly typed languages that are designed with refactoring in mind help tons there.