

Turn Off Your Step-Thru Debugger - techdog
http://asserttrue.blogspot.com/2009/04/turn-off-your-step-thru-debugger.html

======
tumult
I don't know. Debuggers, used correctly, are powerful tools. I don't require
debuggers to write code, but I don't think I would be making the best use of
my time if I simply stopped using them, or even avoided using them if
possible, as the author suggests.

 _A certain part of your brain shuts off, because you expect the debugger to
help you find the bug. But in reality, you wrote the bug, and you should be
able to find it._

This might have been more true at one point in time, but I think it's less and
less true as time progresses. A large number of bugs I find are not in my own
code – they're in libraries I'm linking to or including, in other people's
code. Or the documentation is wrong, or some subtle behavior in an API changes
between versions. Trying to figure this stuff out without gdb? Just thinking
about it makes me want to pull my hair out.

Sometimes you end up with really goofy stuff that would be impossible to
figure out without the copious use of breakpoints. For example, recently I was
working on a lower-level multithreaded event loop for an app that had very
tight timing requirements. It was for the Mach kernel (OS X), and on my
machine I was getting a really terrible Heisenbug. Event timing would be fine
whenever gdb or dtrace was attached, but would start to jitter and error when
it was run without a debugger – no deadlocks, though. One of the system calls
I was using to return the absolute time (after you do some math) relies on the
CPU core to derive its value.

It turns out that the clocks in each core in a computer are likely very
slightly off from one another. If you think about it, this makes sense, but it
wasn't documented. Since my computer had four cores, when I attached a
debugger, the host OS would notice the increased CPU usage and shuffle the
threads in the running processes around. Usually this meant all of the threads
in my app ended up being executed just from one core: ta-da, timing problems
gone. Take the debugger off, and the threads were more likely to be spread
across several cores, and the timing deltas would get messed up.
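
(For context: the timing call isn't named above, but on OS X this sort of thing is usually done with mach_absolute_time(), and the "some math" is the timebase conversion. A minimal sketch, assuming that API:)

```c
#include <mach/mach_time.h>
#include <stdint.h>
#include <stdio.h>

/* mach_absolute_time() counts in CPU-dependent tick units, so the
 * numer/denom scaling below is the "some math" mentioned above. */
static uint64_t now_ns(void)
{
    static mach_timebase_info_data_t tb;   /* cached scale factor */
    if (tb.denom == 0)
        mach_timebase_info(&tb);
    return mach_absolute_time() * tb.numer / tb.denom;
}

int main(void)
{
    uint64_t start = now_ns();
    /* ... event loop work ... */
    printf("elapsed: %llu ns\n", (unsigned long long)(now_ns() - start));
    return 0;
}
```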

Anyway, I guess if the author's advice could be more generally summarized as
"don't do stupid things with software" then I wholeheartedly agree :)

 _Make the machine tell you where the bug is._

Valgrind, dog. Valgrind.
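
(To make that concrete for anyone who hasn't used it: Valgrind is the "make the machine tell you" tool for memory bugs. A made-up example, not from the article, of the kind of defect it pinpoints with no breakpoints at all:)

```c
/* buggy.c - a heap overrun that often runs "fine" until it doesn't */
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *buf = malloc(8);
    strcpy(buf, "valgrind");   /* 9 bytes incl. '\0' into an 8-byte block */
    free(buf);
    return 0;
}

/* $ cc -g buggy.c -o buggy
 * $ valgrind ./buggy
 * valgrind flags the strcpy as an invalid write just past the block,
 * with the allocation stack trace - no stepping required. */
```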

------
psranga
When you're first working on a large chunk of unfamiliar code, stepping in gdb
is an AMAZING way to learn how the code works.

If this guy is advocating adding prints to code, then his edit-compile-debug
cycle must be very fast (i.e., smallish programs). On my large-scale project,
it's much faster for me to restart the program with new breakpoints than to
recompile and run again.

The latest gdb has checkpoints. That is a significant advantage over printf-
based methods.
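
(Roughly how that looks - a generic session sketch with a made-up program name; checkpoints are only supported for native debugging on some platforms, notably GNU/Linux:)

```
$ gdb ./server
(gdb) break handle_request
(gdb) run
(gdb) checkpoint                # snapshot the whole process state here
(gdb) continue                  # run past the suspect code...
(gdb) info checkpoints          # list saved snapshots
(gdb) restart 1                 # ...and rewind to the snapshot instead of re-running
```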

Of course, _reasoning_ about programs is a very, very effective tactic for
solving bugs. If you write your code so that you can reason about it, you
reduce the need for both printfs and debuggers. As somebody pointed out above,
this usually means writing short methods.

~~~
larrywright
I'd agree with this (in spite of arguing against debuggers above). I was
referring to my own development. In the case where you're handed someone
else's spaghetti code and given the directive "fix this!", the fastest way to
get up to speed on how things work is with a step-through debugger.

That said, if you have the time I still think the best solution is to develop
unit and functional tests (assuming none exist), and to learn the code base
that way.

------
larrywright
This will be heresy to some, but I've found it to be true. I don't ever use
debuggers, and haven't for years. I examine my code, and use logging to check
my assumptions. If I _think_ I know what the problem is, I will write a unit
test to prove my assertion.

Doing it this way really makes you think about (and understand) what your code
is doing. One of the side-effects of this is that it encourages me to write
very small methods. Having small methods makes it easier for me to wrap my
brain around what my code is doing.
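
(A sketch of what that looks like in plain C, with made-up names - the point is that the suspicion becomes an executable, repeatable check rather than a breakpoint session:)

```c
#include <assert.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical function under suspicion: parse a TCP port, -1 on error. */
static int parse_port(const char *s)
{
    if (s == NULL || *s == '\0')
        return -1;
    char *end;
    long v = strtol(s, &end, 10);
    if (*end != '\0' || v < 1 || v > 65535)
        return -1;
    return (int)v;
}

/* The suspicion encoded as a test: it fails until the bug is fixed,
 * then stays around as a regression test. */
static void test_parse_port_rejects_empty_string(void)
{
    assert(parse_port("") == -1);
    assert(parse_port("8080") == 8080);   /* sanity check of the happy path */
}

int main(void)
{
    test_parse_port_rejects_empty_string();
    puts("ok");
    return 0;
}
```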

I read years ago that Linus Torvalds also doesn't use a debugger, relying on
debug statements. That was the catalyst for me beginning to do this myself.

~~~
markessien
Are you a faster programmer compared to when you used a debugger?

~~~
larrywright
I'm not sure if I can say faster, but I certainly think I'm better.

~~~
markessien
How do you judge that you are better? What criteria are you using? Also,
everyone gets better with time - do you have some kind of control that shows
you wouldn't have been just as good by now if you'd kept using a debugger?

~~~
larrywright
It's completely subjective.

I don't develop for a living at the moment - I'm a program manager these days.
At my day job, I do one-off scripts, plus the occasional prototype. I do,
however, work on a number of side projects, mostly in Ruby, where I try out the
latest practices: TDD, BDD, etc.

I base my statement on my own perception of my abilities, and I certainly
don't attribute it fully to my not using a debugger anymore. I've learned a
lot, and gained experience. That experience enables me to solve problems more
quickly, as it's much more likely that I've seen an error (or something like
it) before. I've had more exposure to good code, both internally and in many
of the open source projects that I've looked at. Seeing how other people
solve problems also makes you a better programmer. I've also seen a lot more
bad code, and thus know which things to avoid.

My opinion on debuggers is also subjective. I just know that when I used one
in the past, it tended to cause problems (though I didn't realize it at the
time). You set a breakpoint, step through some things, find a problem and fix
it. But this doesn't generally lead to an understanding of what caused the
problem (simple logic errors aside), and in fact can insulate you from larger
design issues that need to be addressed. You fix a bug here, modify a line of
code there, without taking a step back and looking at what the code is doing,
and whether or not it needs to be refactored.

This isn't to say that you can't do those things with a debugger. My point is
that debuggers seem to enable certain bad habits, while skipping the debugger
and relying on analysis, unit tests, and debug statements seems to encourage
some good practices.

All I can say is give it a try - next time you want to reach for the debugger,
write a unit test to recreate your problem, and use some debug statements to
try and solve it. If it works for you, great. If not, go back to the debugger.
YMMV.

------
greyman
But what if there are hundreds of developers working on a project, and I need
to discover if the bug is caused in my own module or somewhere else? It seems
to me that the article was written from the perspective of a pet-project
programmer who mostly deals with his own code only.

~~~
jcromartie
Code that you didn't write is basically just a black box. This is even more
true if it is undocumented or written in a way that makes it hard to glean
information from the code.

There is no way I could feasibly sit down and understand the entire flow of
their code (especially when it is poorly designed to begin with). I can
manage without a debugger when it's my own code, or when the abstractions in
use provide great error handling... but when it's just a giant pile of
somebody else's barely-structured cowboy code?

Debugger, please!

At the moment I'm working on a massive system composed of about _4 years'
worth_ of code (written by someone who will readily confess that they are not
a programmer) with no step-through debugger. There's a lot of printf going on
and at times it is an _extremely_ painful process.

------
nradov
This is idiotic advice. The greatest benefit of using a debugger is to get a
holistic view of what is really going on in your program right now, which may
not match your preconceived notions of what you think it's doing. There's no
way you can make the machine give you that kind of insight. I make it a
practice to step through all of my code, even when it works correctly and I've
written unit tests for everything. By watching the live view of control flow
and variables I often find defects that hadn't yet shown up as visible bugs,
or realize that there is a better way to do something.

In my experience, the programmers who don't like or use interactive debuggers
have never learned to use a really good one.

------
ori_b
I use a debugger mainly as a way of inspecting values without printf
everywhere.

I put a break at a place where I suspect something is going wrong, and inspect
the state of the program; I could simulate the same thing with printf, but
that takes recompiling and trial-and-error.
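
(The same workflow in gdb terms, with placeholder file and variable names:)

```
$ gdb ./myprog
(gdb) break parser.c:142        # stop where something looks suspicious
(gdb) run
(gdb) print header->length      # inspect state - no printf, no recompile
(gdb) print *header             # dump the whole struct
(gdb) continue
```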

A debugger saves me a great deal of time. Especially with DDD's way of
stepping through data structures.

~~~
tezza
Agreed. A lot of the article seems just like a restatement of KISS.

* Using a Debugger is like a first pass sweep of where to put the juiciest printfs. It cuts down the problem space.

* If you're doing OOP and different components are calling back in at different points, a breakpoint can be quite illuminating. printf will clog your short term memory with trivialities.

* A Debugger has deep structure support built in, no coding a recursive node walk printout for each datatype.
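
(For that last point, this is the sort of throwaway dump routine a debugger spares you from writing - a generic sketch, one datatype's worth:)

```c
#include <stdio.h>

struct node {
    int value;
    struct node *left, *right;
};

/* Hand-rolled recursive printout for one specific datatype. */
static void dump_tree(const struct node *n, int depth)
{
    if (n == NULL)
        return;
    fprintf(stderr, "%*s%d\n", depth * 2, "", n->value);  /* indent by depth */
    dump_tree(n->left, depth + 1);
    dump_tree(n->right, depth + 1);
}

int main(void)
{
    struct node leaf1 = {1, NULL, NULL}, leaf2 = {3, NULL, NULL};
    struct node root  = {2, &leaf1, &leaf2};
    dump_tree(&root, 0);   /* in gdb, "print root" would have done the job */
    return 0;
}
```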

------
jerf
Debuggers strike me as having two distinct use cases, which I think is backed
up by the discussion here up to this point: There is the trivial case where
you are stepping through code to find a relatively simple problem, and there
is the case of the horrid, horrid complicated bug in a complex system with
lots of things going on at once. (In many cases, this complexity is
fundamental; "remove as much as you can" is a good idea but you're still often
left with a complex core.)

The first case _can_ be a crutch if you allow it to substitute for
understanding, but it can also enable understanding, particularly code you
didn't write. (Nothing beats that; not looking at the theoretical design, not
reading the docs, nothing.)

The second case is a different thing entirely. Maybe some people are fortunate
enough to be working on simple enough problems that there are simple
solutions, but despite my best efforts there are certain places that just
can't be simplified, and when they have bugs, oh my.

So, what do I do? I use a debugger about once a month. But when I do, it saves
me multiple hours per use. If I worked in a more complicated domain, the
balance could tip even more heavily toward the debugger. (I would characterize
my work as of modest complexity in the grand scheme of things.)

------
pj
This philosophy goes back to the days of punch cards. If programmers relied on
the punch card reader to inform them of a bug, they'd be delayed by at least a
day, and perhaps more, because someone else might be using the punch card
reader by the time they had fixed their bugs.

The reason ditching the debugger makes programmers better is that they work
through what the code will do before they write it. They know it will be good
code before the debugger tells them, and before they have to examine a
variable to see what it contains - something they should have known before
they wrote the code.

These programmer articles are like investing by listening to financial
analysts critique stocks. You don't know how good the analyst is at their job.
You don't know what vested interest they have in you believing them.

For the most part, if the readers of these half-witted blogs (not necessarily
this one about the debugger, but in general, and definitely the "popular" ones
that get posted here all the time) spent as much time imagining problems and
solving them, or even solving real-world problems, they'd be much better
coders...

------
jcl
Charles Petzold has similar things to say about IDE autocompletion
(<http://www.charlespetzold.com/etc/DoesVisualStudioRotTheMind.html>).

The article also reminds me of Brian Kernighan's remarks regarding debugging
(<http://en.wikiquote.org/wiki/Brian_Kernighan>):

"The most effective debugging tool is still careful thought, coupled with
judiciously placed print statements."

"Everyone knows that debugging is twice as hard as writing a program in the
first place. So if you're as clever as you can be when you write it, how will
you ever debug it?"

~~~
nradov
That Brian Kernighan quote is from 30 years ago. We have better tools now.

------
flashgordon
To say debuggers will make you dumber (sorry, "brain shuts off") is like saying
using high-level languages will make you dumber than using low-level ones.
Let's all go back to assembly!

------
biohacker42
There's a kernel of truth here expressed as an extreme.

As I've gotten better at programming, I've changed how I use the debugger.

I used to set a break point and dive right in.

Now I stop and think a lot, then I might even modify the code a bit, make it
fail _better_.

Make it fail in a more descriptive way. Then I do a bit of debugging, then go
back to thinking and perhaps modifying the code a bit more, repeat.

------
mrbgty
Depending on which file I modify, building can take 4 hours.

Do you think this is a good scenario for using print statements?

I often still find myself adding print statements and rebuilding due to the
complex nature of debugging this particular (multi-process) system. For
obvious reasons, rebuilding is very inefficient. Interested in any other
suggestions or references you might have.

------
hboon
I usually don't use a debugger when programming; I mostly use print
statements, except when doing Smalltalk. Rather than saying that not using a
debugger is better, maybe it is that most debuggers are just not that good. Or
it could be that most of us are just not that good at using debuggers.

~~~
stcredzero
Few debuggers are as good as the VisualWorks Smalltalk debugger.

I've also seen people get into loops when not using the debugger. I think the
rule is to 1) understand as much as you can, and 2) when you get stuck, switch.

Also, monkeypatching is one of the most awesome debugging tools! Your code
fails only for that one Profile object? Then monkeypatch it with a special
debugging version of a method.

Debugging goes hand in hand with dynamic languages. It's like
Smalltalk/Ruby/Python are just giamongous debuggers.

~~~
hboon
VisualWorks isn't the only Smalltalk (with a good debugger). If you are on
Windows, try Dolphin. Squeak/Pharo is pretty decent once you get used to it.

------
javanix
I use GDB for C code, if only to get the nice backtraces, but other than that
I find it's easier to just think things through and put a few debug statements
in.

Setting a DEBUG constant is usually pretty handy - I surround any debug output
with an 'if (DEBUG){}' - it makes it easy to turn on/off the extra
verboseness.
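
(Roughly the pattern described, as a sketch - when DEBUG is a compile-time constant of 0, the compiler will typically drop the dead branch entirely, so the release build pays nothing for it:)

```c
#include <stdio.h>

#define DEBUG 1   /* flip to 0 (or pass -DDEBUG=0) to silence the output */

int main(void)
{
    int total = 0;
    for (int i = 1; i <= 5; i++) {
        total += i;
        if (DEBUG) {
            fprintf(stderr, "i=%d total=%d\n", i, total);  /* extra verboseness */
        }
    }
    printf("total=%d\n", total);
    return 0;
}
```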

------
msie
I was really angry after reading this article. For the many reasons you will
find in the other comments, debuggers are useful tools. It seems very arrogant
to say that you don't need a debugger. Maybe the author's mentor was trying to
train him to do better design or to desk-check his code.

