
I do not use a debugger - another
http://lemire.me/blog/2016/06/21/i-do-not-use-a-debugger/
======
corysama
There are two situations in which I repeatedly see people tout that they don't
use a debugger.

1) Situations where debuggers are completely unable to help, such as
debugging distributed messaging.

2) Situations where someone has been refining a specific piece of software for
so long that they have every aspect of the code worn deep into their memory.

1 is pretty common in web development and web developers talk a lot, so you
hear a lot of people talking about it. 2 is something you only hear about from
highly respected developers working on highly respected projects. That gives
the impression that it's something respectable people do.

Unfortunately, 2 is not a situation that most people are in. In my experience,
most people work on projects that involve too many people changing too much
stuff too fast to burn anything into memory. These projects probably won't
survive more than a few years. And, if the resumes I see are any indication,
most people won't stick around at any company - let alone project - for more
than a few years.

For the majority of people, powerful debuggers can be tremendously valuable.
The two biggest issues blocking that value are 1) many development environments
have pitiful-if-any debugging support and 2) a lot of developers react to
their crappy debugger situation by assuming debuggers in general are crap and
that developers should be macho enough not to need them.

~~~
ctvo
The first statement is not true. Web developers use debuggers for front and
back-end work just as much as anyone else. It's probably why Chrome and its
developer tools are so popular among that group.

~~~
vardump
Those tools that help with distributed messaging are logging or sugar-coated
logging. Logging can be a part of a debugger, but it's not the core reason
debuggers are used.

Usually "debugger" refers to running a program under control until some
specific position is reached (breakpoints, stepping) or until some condition
on program state becomes true. When the condition is met, the debugger
suspends execution.

Debuggers also offer the ability to inspect program state.
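
A minimal sketch of that breakpoint-until-condition workflow, using Python's
standard pdb (the function and data are invented for illustration): with the
commented lines enabled, execution suspends only when the suspicious state
appears, and you can inspect `total`, `i`, and `v` at the prompt.

```python
def running_total(values):
    total = 0
    for i, v in enumerate(values):
        # Uncomment to suspend execution only when state looks bad,
        # then inspect locals at the (Pdb) prompt:
        # if v < 0:
        #     breakpoint()  # drops into pdb right here
        total += v
    return total

print(running_total([3, 1, 4, 1, 5]))  # → 14
```

The same effect is available without editing the code via a conditional
breakpoint, e.g. `break script.py:5, v < 0` at the pdb prompt.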

The Chrome developer tools can do request logging: HTTP requests, timelines,
the JavaScript console, etc. They also contain debugging tools for JavaScript,
but those are rather unlikely to help with debugging distributed messaging.

~~~
ctvo
We must be talking about two different audiences then, because web development
in my experience has little to do with distributed messaging.

Most web developers work in JS for the front-end -- they use Chrome to debug
client behavior. Most also write the back-end layer that communicates with
databases and other data sources and handles business logic. They use
debuggers for their language to follow program flow and inspect values.

Once their layer needs to communicate with other systems or do anything with
distributed messaging, their responsibility ends and the need to debug the
issues of the other systems becomes someone else's problem.

------
zbuf
Hmm, this headline is brought to you by the "too cool for school" crowd.

The citations in the article are a little flawed as support for the argument.
Torvalds "doesn't use a debugger", except the actual quote contains "I don't
like debuggers. Never have, probably never will. I use gdb all the time"

Guido van Rossum "uses print statements for 90% of his debugging".

Sounds like people who /are/ using debuggers to me.

In fact, using one for the remaining 10% of the bugs sounds about right. I'd
be highly sceptical of any programmer who hasn't picked up a debugger in the
last year, and using it to step through some code is just one of its many
features.

But this article could easily be headlined "people don't use debuggers all of
the time" and it wouldn't be nearly as interesting.

------
Arzh
> If you dive into the bug, you tend to fix the local issue in the code, but
> if you think about the bug first, how the bug came to be, you often find and
> correct a higher-level problem in the code that will improve the design and
> prevent further bugs. -- Rob Pike

I'm usually not allowed to fix bugs in that manner, since a false sense of
risk goes along with it. That being said, when I work in games a debugger is
essential: stepping through the code and watching the variables change is
necessary, since most of the code is a simulation and you can't just /think/
the problem away.

------
dang
I wrote an Emacs mode that interfaced to V8's debugger so I could single-step
through the Lisp (Parenscript) programs we were writing that compiled to JS.
In the end I only used it a handful of times over three years. That was a lot
of work for something one doesn't end up using! (Though it was still a good
learning exercise.) Writing your own debugger and then never using it turns
out to be pretty strong evidence that you don't need one.

------
cbanek
While I used to like debuggers, recently I've found them to be a giant pain.

First, process boundaries. If you're using a complicated stack, it's likely
that you'll need to set up multiple debuggers for different processes and
switch between them.

Second, languages. If you're using different languages, or a combination of
languages, debuggers can be very painful. Each usually has a settings overhead
per project for finding source code files, etc.

Some bugs just don't reproduce in debuggers; anything with multithreading or
timing is instantly troublesome.

Also, in production, you rarely have the option of hooking up a debugger to
a live service. By not relying on the debugger up to this point, you hopefully
have a robust logging system that can be configured to give you the detail you
want at the time. The messages will hopefully be informed by previous
debugging sessions, and help get you to your solution more quickly.
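
As a sketch of that kind of configurable logging, here's Python's standard
logging module (the logger name and messages are made up for illustration);
flipping the level from INFO to DEBUG turns up the detail without touching the
code paths.

```python
import logging

logging.basicConfig(level=logging.INFO)   # switch to logging.DEBUG for more detail
log = logging.getLogger("orders")         # hypothetical service name

def process(order_id, amount):
    log.debug("raw input: order_id=%s amount=%r", order_id, amount)  # hidden at INFO
    if amount <= 0:
        log.warning("rejecting order %s: non-positive amount %r", order_id, amount)
        return False
    log.info("processed order %s", order_id)
    return True

process("A-17", 25.0)   # logs at INFO
process("A-18", -3.0)   # logs a WARNING
```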

------
selectnull
> If you dive into the bug, you tend to fix the local issue in the code, but
> if you think about the bug first, how the bug came to be, you often find and
> correct a higher-level problem in the code that will improve the design and
> prevent further bugs. -- Rob Pike

Sometimes, the bug is local. And in those times, a debugger is an excellent
tool.

------
jcrben
Most applications branch based upon their state. You can't just statically
analyze them without knowing that state. OK, so you can add print statements
to get a small peek into the state. But if you have a moderately complex
application with multiple branches, you'll need to keep adding these to even
know where to point your next print statement. That's a rather tedious
feedback loop.

Functional programming does make your code more statically understandable, but
it's not going to remove branching. So I guess you memorize all the inputs? Do
you write them down? That might be easy if you have a simple program under
your own control, but not when business requirements impose a wide array of
inputs, and some of those inputs are different from what you expect (hence,
the bug).

It's also enjoyable to drop into a REPL (read-eval-print loop) at some
interesting point. At that point you can try all sorts of things with no
friction. Minimize your feedback loop.
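
In Python, for instance, the standard ways to drop into such a REPL at an
interesting point are `breakpoint()` (pdb) or `code.interact` (a plain
interpreter); the reconciliation function below is invented for illustration.

```python
import code

def reconcile(expected, actual):
    # Collect keys whose values disagree between the two dicts.
    diff = {k: (expected.get(k), actual.get(k))
            for k in expected.keys() | actual.keys()
            if expected.get(k) != actual.get(k)}
    if diff:
        # Uncomment to drop into an interactive prompt with local state
        # available, so you can poke at `diff`, `expected`, `actual`:
        # code.interact(local=locals())   # or just: breakpoint()
        pass
    return diff

print(reconcile({"a": 1, "b": 2}, {"a": 1, "b": 3}))  # → {'b': (2, 3)}
```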

It's funny how Lisp was famous for its REPL, and yet now we're in the 21st
century and some people think they're too cool to use a REPL.

As for me, I'm enjoying Redux-style time-traveling debuggers... in the future,
rather than the tedious print-then-rerun loop or slow stepping, we may just
query the state of a program over time like a database.

------
draw_down
I don't really think using a debugger is incompatible with the advice here,
but, whatever. Knock yourself out if you don't wanna use one.

------
nickpsecurity
The links actually had the counterpoints I was going to make: hardware-level
issues like device interactions and memory corruption. You're not going to
figure those out just looking at source code. You need to see how the live
value got inappropriately transformed by something that had no business
transforming it. That said, the small counterpoints support the consensus in
the links that debuggers could be relegated to a rare tool you break out
rather than an essential part of development. I'm personally undecided on the
issue.

Meanwhile, I encourage Design-by-Contract interface checks, strong typing, and
static analysis over TDD. You can use them in conjunction, but you knock out
more bugs with the former. Plus they encode the assumptions in a simple way.
Even better, what you knock out applies to all cases within the spec instead
of just the specific ones you thought to test.

~~~
vardump
> hardware-level issues like device interactions and memory corruption. You're
> not going to figure those out just looking at source code.

Well, that's right, but you can't step through something like a DMA transfer
setup or an IRQ routine either. By the time you've done the first single step,
the hardware state is already bad and everything has timed out.

Distributed system interactions are also nearly impossible targets for
productive use of a debugger.

In both cases, mostly I just do very verbose logging.

------
voycey
In every language I learnt in my CS degree, the debugger was integral. One of
the first things you are taught in CS is how to use truth tables to determine
the expected outcome; debuggers are just an extension of the whole "I expect
this to be x value" idea. In my development now I see it as just as integral,
and it takes me a fraction of the time to track down a bug that it does the
developers on my team who don't use one.

Obviously, each to their own - but when I see one of my team putting print
statements after every line and constantly going back to re-run the code, it
drives me mad - 90% of the time you can sort it out in one pass with a
debugger.

------
peterkshultz
When I was first learning to program, my teacher refused to allow us to use
the debugger. He believed it corrupted our ability to truly understand the
code that we wrote.

Although I now use a debugger, I can't help but feel improper for having
abandoned his rule. While a debugger provides for an easier programming
experience, there is something to be said for personally combing through every
line of code to find a bug.

~~~
Terr_
I dunno, I mean, I'm sure you could learn to build internal combustion engines
through trial and error, assembling them, letting them run, listening,
stopping them, and opening them up again...

... But wouldn't it be even better to learn by assembling one with transparent
materials and a magic device that could "pause" the whole engine at an
arbitrary point in time?

While it's true that it will enable some students to simply "fiddle until it
works" without understanding, I think that's a small price to pay for the
benefit it brings to the genuinely-curious students.

~~~
banku_brougham
Here is the crux, for me.

Understanding the thermodynamics of gases and the mechanical properties of
steel is what helps you build a combustion engine. I think this is the point
of the article: that stepping through code doesn't help you understand why
things happen.

When you know that pressure is proportional to temperature and to the inverse
of the volume, and that steel yields at 36K psi, you understand why the piston
shaft keeps breaking. Those transparent materials and magic devices let you
watch the engine block crack in slow motion -- do they help you learn why that
happened?

~~~
xg15
You need to know those properties. But they are usually a strong
simplification of reality. They will help you not completely botch your
engine design. But the see-through engine could also show you that your
particular pipe design leads to pressure above the threshold at that
particular point - and can give you clues to investigate further.

Which I think is why the closest equivalent to magic see-through engines -
simulations - are actually used in engine design.

------
barefootcoder
This is very similar to a POV that I've espoused for quite some time, though I
phrase it somewhat differently.

A symbolic debugger is good for verifying your premises, but the actual
troubleshooting should be done by thinking about the problem, thinking about
the code, forming a hypothesis about the potential causes of failure, then
confirming (using a debugger as necessary). Too many young and inexperienced
developers fall into the trap of running straight to setting a breakpoint and
watching what happens, which doesn't scale to more difficult problems such as
concurrency and IPC.

I find myself using a debugger when:

1) it's a trivially 100% reproducible problem

2) it's a completely new codebase that I'm unfamiliar with

3) it's a language that I'm unfamiliar with

Otherwise the only reason I use it (or print/log statements) is, as stated, to
verify the premises upon which I am basing my reasoning about the likely
causes.

------
xg15
The main arguments of the author seem to be "I don't use a debugger", "Some
people I respect don't use a debugger", "sometimes you can't use a debugger"
and "we should make tools that are somehow better than debuggers". I don't
find any of those very convincing.

------
mixedCase
A debugger is a very convenient tool when dealing with spaghetti code.

Which also has the side effect of allowing undisciplined/inexperienced
programmers to more easily churn out spaghetti code.

I very seldom use one, almost always when I know it's gonna save me _a lot_ of
time. And I would certainly never recommend that a beginner programmer learn
how to use a debugger until they get very deep in the shit. Before that point,
though, their own human resourcefulness and a little knowledge of best
practices will set them on the right path faster than a debugger will.

------
6mirrors
I don't care

------
mehh
Is this really still a debate?

If using a tool helps, use the damn tool.

Each project is different, there isn't a golden rule for this shit arghhh.

You might not need to use a debugger often, but the number of times I have
seen a young dev too cool (or too lazy to learn/configure) for a debugger
waste time poking about making random guesses, when a simple breakpoint would
have told them the problem in a couple of seconds... I have literally seen
them waste hours inventing all sorts of improbable reasons why it doesn't
work.

I don't use a debugger very often, but I have been programming a long time
and have hit my head on so many bugs that I can make very good predictions
about where an issue might be. That takes time; trying to emulate very
experienced people when you're not is foolhardy.

~~~
vardump
You're right, but I might not want to work with you if that comment reflects
your attitude.

