
I don't think you understood what I said. Debugging is perfectly fine when you can't explain what's going on even though you understand the code. "The silicon is wrong" happens so rarely that it makes for nice, memorable trench stories. Either that, or you work at a really low level, which is not the default.



In addition to what the other comment says:

"you can't explain what's going on even though you understand the code"

If someone's working on code, what's wrong with using the debugger to understand that code better? There's all sorts of behavior defined underneath the code people write that can be much simpler to understand by analyzing the system in action than by reading source, especially when abstractions hide what's going on underneath your code.

There are a lot of 1+1=3 type situations that arise from abstractions whose source people don't have access to, or don't have the resources to analyze at a given moment.
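
For a tiny illustration of the kind of "1+1=3" surprise I mean (my own minimal sketch in Python, not anything from the parent comment): the decimal-looking arithmetic is IEEE 754 binary floating point underneath, and poking at the live values in a debugger shows the mismatch immediately, with no access to the float implementation's source needed:

    # Minimal sketch: "decimal" arithmetic is IEEE 754 floats underneath.
    a = 0.1 + 0.2
    print(a == 0.3)     # False -- the abstraction leaks
    print(f"{a:.20f}")  # 0.30000000000000004441

    # Inspecting `a` live in pdb tells the same story without reading
    # any source:
    #   import pdb; pdb.set_trace()
    #   (Pdb) p a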


It's perfectly fine to use the debugger in such situations. Please see my other response on this topic:

https://news.ycombinator.com/item?id=13367946


And yet somebody has to debug the numerous silicon bugs before the chip makes it to market, and somebody has to find the few that make it into the product in the wild. It might be that very few people really need a feature, but it can still be very important for the product that the feature exists.



