
How vital are debuggers really, though? Shouldn't proper design and testing mitigate the necessity of using a debugger? They're more akin to a crutch, are they not?

EDIT: Adding this: ( http://www.artima.com/weblogs/viewpost.jsp?thread=23476 )




In the same way an MRI is a crutch for doctors who don't know the human body well enough.


Citation needed. I've been working on a debugger for the blockchain for the past 6 weeks. The comparison to doctors falls flat, at best.


Do you mean debuggers are so much more advanced that the comparison breaks down (my opinion; tomography is the closest thing to a medical debugger IMO), or vice versa?


The necessity of having tools like MRI available to doctors can't be overstated. Yet comparing doctors' use of tools such as sectioning to the complexities of distributed cloud debugging on the blockchain can only yield unsatisfactory results.


That's a terrible comparison. The body already lies open before you in the form of code. A debugger should be the measure of last resort, for when you can't explain why something fails or you have strange side effects. In my experience, every programmer who relied on the debugger had difficulty understanding his or her code in the first place.


I don't know how your code can tell you the silicon is wrong (for example). Are you suggesting the first step of debugging a problem is to understand your code, then every bit of code running under it, then all the code under that ad infinitum, then the hardware designs under all that... and not using an available debugger?

I can't help but feel this is an opinion held only by someone who has only ever worked on extremely high-level (and basic) systems while disregarding all underlying abstractions, or by someone who has only worked with very low-level systems so shallow that my aforementioned facetious approach to debugging is actually feasible. Limiting yourself to either extreme doesn't exactly give one the most balanced view of how problems can be solved in software.


I don't think you understood what I said. Debugging is perfectly fine when you can't explain what's going on even though you understand the code. "The silicon is wrong" happens so rarely that it makes for nice, memorable trench stories. Either that, or you work at a really low level, which is not the default.


In addition to what the other comment says:

"you can't explain what's going on even though you understand the code"

If someone's working on code, what's wrong with using the debugger to understand it better? There are all sorts of behaviors defined underneath the code people write that can be much simpler to understand by watching the system in action than by reading the code, especially when abstractions keep you from seeing what's going on underneath.

There are a lot of 1+1=3 type situations that arise from abstractions people don't have source access to, or the resources to analyze at a given moment.


It's perfectly fine to use the debugger in such situations. Please see my other response on this topic:

https://news.ycombinator.com/item?id=13367946


And yet somebody has to debug the numerous silicon bugs before the chip makes it to market, and somebody has to find the few that make it into the product in the wild. It might be that very few people really need a feature but that it's very important for the product that that feature exist.


> The body already lies open before you in the form of code.

Uh huh, because you've read the source code for your program, including all the bits someone else wrote, plus the database code, and the code for your desktop manager, and the OS code for good measure. And since it's all "open" (like, say, DNA), it's clear how it runs.


There's a real practical difference between the operational semantics of the machine and the layers of abstraction a programmer understands and uses. Sometimes the only way to verify all models are confluent is with a debugger.


Yeah, I know, which I acknowledged. My problem is with people who write 50 lines of code and then open up the debugger to see what it does (as in "rely on the debugger").


That... depends on what level you're working at. And how obfuscated those lines are, etc., ad nauseam.


That is an... odd statement. The debugger is what enables me to understand what exactly is going on. No one, not even you, can fully grasp a modern language and framework in their entirety, so the debugger becomes a necessity.

As a counterclaim: in my experience, programmers who don't use debuggers are the ones who have difficulty understanding things. They just throw code at the problem until something sticks.


I hardly ever use debuggers. But I do understand stuff. Last week I found a small bug in the reference implementation of Argon2i (found by implementing my own). Last year, I designed and implemented a small programming language (for work, with deadlines, money and all). I did the VM in C++, and the bytecode compiler in OCaml (I tried C++ first; it was too cumbersome for recursive data structures).

To understand what's wrong with my program, I need a strong understanding of the data it manipulates. Show me the data, and I can probably track down the bug. The sequence of operations shown by step-by-step debugging is important, but never helped me as much. I have invariants in mind, and I can detect invariant violations by looking at the data, not at the operations between them.

In practice, it means I use Valgrind first to weed out most undefined behaviours, then printf(). Yep, printf().

Take my VM, for example. It had many bugs, many of them hard to track: off-by-one errors were not detectable by Valgrind, for instance, because my VM heap was a giant std::vector<Word>. The GC was wrong, the stack management was wrong, the primitives were wrong… I made errors pretty much everywhere. What saved me was a little printf()-based visualization tool. I could now see every block in my heap, and if anything went wrong there, I would detect it very quickly. This became my new Valgrind.
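
For a concrete flavour, here is a minimal sketch of that kind of tool, not the actual one: the block layout, header encoding, and names below are made up for illustration. It walks a std::vector<Word> heap and prints one line per block, so a mangled header or a block that runs past the end of the heap is immediately visible.

    #include <cstdio>
    #include <cstdint>
    #include <vector>

    using Word = uint64_t;

    // Hypothetical block layout: a header word encodes the payload size (in
    // words) and a type tag, followed by `size` payload words.
    void dump_heap(const std::vector<Word>& heap) {
        for (size_t i = 0; i < heap.size(); ) {
            Word header = heap[i];
            size_t size = static_cast<size_t>(header >> 8);       // upper bits: payload size
            unsigned tag = static_cast<unsigned>(header & 0xff);  // lower bits: type tag
            std::printf("%6zu | tag=%02x size=%4zu |", i, tag, size);
            for (size_t j = 0; j < size && j < 4; ++j)            // show the first few payload words
                std::printf(" %016llx", (unsigned long long)heap[i + 1 + j]);
            std::printf(size > 4 ? " ...\n" : "\n");
            if (size == 0 || i + 1 + size > heap.size()) {        // a corrupted header is obvious here
                std::printf("!! suspicious block at %zu\n", i);
                break;
            }
            i += 1 + size;
        }
    }

Dumped after every collection (or every N allocations), output like that makes an off-by-one block size jump out immediately, which is exactly the class of error a vector-backed heap hides from Valgrind.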

I have no idea how gdb would have helped me there. Didn't need it anyway.


Your comment only makes sense if you equivocate on the definition of debugger. Debugging by printf is not the same thing as not using a debugger.


There are two ways to put it: I can say "I don't need a debugger, I have printf", or I can say "I don't need gdb, printf is my debugger".

Either is fine. Most conversations about debuggers, however, tend to equate the debugger with something like gdb or an IDE's debugging interface. People will often say the absence of such a debugger is a deal breaker for them. It feels like they forgot about printf.


I also distinguish between people who use a debugger to find problems in code and people who use the debugger to write code in the first place. The latter are the ones I criticised in my original comment. I expected to receive many downvotes on that comment, because most mediocre programmers I know rely heavily on the debugger for programming.


I don't 'rely' on a debugger per se, but it's an important tool in my arsenal.

When I know that memory corruption is occurring (I work in embedded C++), a debugger with watchpoints is about the only way to track down who has that stray pointer.
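
For illustration, here's a minimal sketch of the kind of situation meant here; the struct, the names, and the gdb session are hypothetical, not taken from any real project:

    #include <cstdint>
    #include <cstdio>

    // A stray pointer scribbles over a field it doesn't own.
    struct RxBuffer {
        uint32_t length;
        uint8_t  payload[64];
    };

    static RxBuffer g_rx;                    // the victim: length gets corrupted "somehow"
    static uint32_t* stray = &g_rx.length;   // imagine this was computed wrongly somewhere far away

    int main() {
        g_rx.length = 42;
        *stray = 0xDEADBEEF;                 // the culprit, buried somewhere in a large codebase
        std::printf("length = 0x%08X\n", (unsigned)g_rx.length);
    }

    // Finding the writer with a hardware watchpoint (gdb, target halted over JTAG):
    //   (gdb) watch g_rx.length    <- claims one of the CPU's debug comparators
    //   (gdb) continue             <- stops on the exact write, backtrace pointing at the offender

No code instrumentation, no guessing: execution halts at the instruction doing the stray write.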

Sure, a debugger isn't an excuse for not thinking about what you're doing, but banning them outright is a ridiculous idea.


I never said to ban them. I think my wording was really badly chosen. By "rely" I mean people (and I've seen this on numerous occasions) who write code without a plan and then use the debugger to adjust it until it works, instead of thinking about the code first and then using the debugger to fix misconceptions about their own or foreign code.


If you don't have a debugger, you're going to end up debugging with print statements. Something I've done in the past on devices where JTAG was too much of a pain.

Of course, it requires you have deterministic logging. If you don't have that, it should be your #1 priority.
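
A minimal sketch of what that can look like in C++ (the macro name and format are made up): every line carries a sequence number and a source location, and nothing nondeterministic like timestamps or pointer values, so a good run and a bad run diff cleanly.

    #include <cstdio>
    #include <cstdint>

    // Hypothetical trace macro: each line is tagged with a monotonically
    // increasing sequence number plus file:line, and nothing that could vary
    // between runs. Relies on the common ", ##__VA_ARGS__" extension.
    static uint64_t g_trace_seq = 0;
    #define TRACE(fmt, ...) \
        std::printf("[%llu] %s:%d " fmt "\n", \
                    (unsigned long long)g_trace_seq++, __FILE__, __LINE__, ##__VA_ARGS__)

    int main() {
        TRACE("boot");
        for (int i = 0; i < 3; ++i)
            TRACE("handling packet %d", i);
        TRACE("done");
    }

On a target without printf, the same idea works over whatever byte pipe is available (a UART, or a ring buffer read out later).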


> How vital are debuggers really, though? Shouldn't proper design and testing mitigate the necessity of using a debugger? They're more akin to a crutch, are they not?

Shouldn't a debugger mitigate the necessity of overly paranoid design and testing? Testing hardware is much more expensive than testing software - to do so properly, you first have to build the damn thing. Hardware 'revisions' (builds) are more commonly measured in single digits, not the 6+ digit monsters of your local CI setup.

Another answer: In the exact same sense that bug databases, assertions, unit tests, static analysis, 'safe' programming languages, code reviews, etc. are all mitigations and crutches for the human condition.

A third answer: Crutches are useful medical devices.

A fourth answer: I routinely interact with software and hardware that hasn't undergone proper design and testing. This is a bit of a mouthful, so I generally shorten this to "software" and "hardware", respectively. Why yes, I do debug and workaround 3rd party issues - for which no amount of "proper design" or "testing" of my own stuff will help - regularly enough to need a debugger.

> EDIT: Adding this: ( http://www.artima.com/weblogs/viewpost.jsp?thread=23476 )

To quote from that:

> Once you have exhausted every other avenue of diagnosis, and have given very careful thought to just rewriting the offending code, then you may need a debugger.

Yes, even that admits you'll sometimes need a debugger. Not "want" - need. Even for software.

EDIT: Various typos.


I've never understood the dogmatic opposition that some developers have toward using debuggers. When I write new code I always step through it in an interactive debugger, even if it passes all the tests. When you actually watch the control flow and state changes live in front of you, you can spot defects and inefficiencies that the tests missed. It's just one more step in the quality process, alongside unit tests, code reviews, static analysis, acceptance tests, etc.


A JTAG debugger isn't the same as just debugging a local program on your system. Normally it is required for initial hardware bringup, where you need physical access to the CPU.

Probably not wise to ship one enabled on production gear; I used to work on things that required some soldering to get access.


And what about debugging issues with data that I don't have access to at compile time?




