
I disagree. If we make changes to the programming model that allow for better tool integration, we all win. In fact, tooling really is a part of the language, and you should co-design your IDE and language together. Treat any third-party plugin as a language extension. Programming models will always limit the effectiveness of features like edit-and-continue and IntelliTrace, so let's fix that!

I don't think omniscient, log-the-world debugging is practical, but if we just had some form of deterministic replay, we could take snapshots, or even cheaper memoizations, at certain points in the program. Going backwards is then simply a matter of restoring the last checkpoint and running forward to the point where you want to stop.
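To make that concrete, here's a rough sketch of how checkpoint-plus-replay could fake reverse stepping. None of this is any real debugger's API; names like Snapshot, onStep, and stepBackTo are made up for illustration:

    import java.util.ArrayList;
    import java.util.List;

    class ReplayDebugger {
        // Opaque copy of program state captured at a given execution step.
        record Snapshot(long step, byte[] state) {}

        private final List<Snapshot> checkpoints = new ArrayList<>();
        private static final long INTERVAL = 10_000;  // steps between checkpoints

        // Called by the (deterministic) execution engine on every step;
        // step 0 is always checkpointed, so the list is never empty.
        void onStep(long step, byte[] currentState) {
            if (step % INTERVAL == 0) {
                checkpoints.add(new Snapshot(step, currentState.clone()));
            }
        }

        // "Going backwards": restore the newest checkpoint at or before the
        // target, then re-execute deterministically up to the target step.
        byte[] stepBackTo(long targetStep) {
            Snapshot base = checkpoints.get(0);
            for (Snapshot s : checkpoints) {
                if (s.step() <= targetStep) {
                    base = s;
                }
            }
            return replayForwardFrom(base, targetStep);
        }

        private byte[] replayForwardFrom(Snapshot from, long targetStep) {
            // With deterministic replay (same inputs, same scheduling),
            // running forward from 'from.step()' always reproduces the state
            // at 'targetStep'; the actual re-execution is elided here.
            return from.state();
        }
    }

The only real knob is the checkpoint interval: denser checkpoints mean less replay work per backward step but more memory.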

I agree that the key is UX, and as Gilad said in the blog comments, the problems aren't necessarily technological (once we know what the right experience is, we'll build the technology!).

Disclaimer: I work for MSFT, though I study this problem in research (MSR/MSRA).




Better debugging is one of those things that helps one group of developers solve problems they couldn't otherwise solve. Unfortunately, it also allows another group of developers to make a bigger mess before collapsing under their technical debt.

I have definitely noticed a correlation between developer debugging skills and clean code practices. Obviously, this doesn't mean one causes the other, but I would love it if there were research looking into this.

It does make me wonder if our time would be better spent finding ways to improve consistency and professionalism in our practices, rather than on yet another new tool to get you out of the mud.


You've basically described the primary difference between schools of programming thought. Should programming be the act of taking well-defined, well-understood requirements, a nice clean environment (and dependencies), and careful thought, and producing a program that is correct by construction and has little need for debugging? Or is programming a messy affair of poorly defined requirements, crazy environments, and exploration, where debugging consumes more effort than the actual coding?

I believe more in the latter school, though of course, they are both extremes and I'm a moderate. The mud is unavoidable and we might as well build winches to pull ourselves out rather than spend time trying to carefully avoid it.


I haven't really seen such a clean divide. There is definitely a group of developers who prefer building over everything else. This group loves tooling because it enhances their ability to keep adding new things. I also suspect that this group is a product of their environment: they are exceptional at debugging because debugging is their primary way of working. A problem is something to be understood and added to, not changed; change ruins their understanding.

But even this definition feels too confined. Some developers I've worked with must have started out this way and then learned 'engineering' practices, and they were all exceptional to work with. I wish more of this group could get to that level. Hence my interest in putting breadcrumbs on the path to learning, rather than offering yet another pill to lose weight fast.


There is definitely a big difference between an inexperienced developer and one who is experienced and follows rigorous organizational practices. But even the latter probably debug a lot; you just don't see it from the outside.

Debugging is as close to experimental science as we get in computer science. It is the act of understanding a complex system, and even if we built that system entirely by ourselves (unlikely), it eventually "gets away from us" and takes on a life of its own. It is impossible to understand everything, and debugging is a great way of allowing us to forget details and uncover them later.


Have you all looked into pluggable types at all? It'd be nice to be able to enforce a convention like: any class annotated with type x must prefix all methods that call into y with z, and get warnings/errors at build time. Forgive my ignorance if this is already doable; I work in OS X.
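Roughly, I'm imagining something like the sketch below (Java, since that's where pluggable type checkers like the Checker Framework seem to live). The @RequiresPrefix annotation and the prefix-only rule are purely illustrative, and actually detecting "calls into y" would need method-body analysis that a plain annotation processor doesn't give you:

    package example;

    import java.util.Set;
    import javax.annotation.processing.AbstractProcessor;
    import javax.annotation.processing.RoundEnvironment;
    import javax.annotation.processing.SupportedAnnotationTypes;
    import javax.annotation.processing.SupportedSourceVersion;
    import javax.lang.model.SourceVersion;
    import javax.lang.model.element.Element;
    import javax.lang.model.element.ElementKind;
    import javax.lang.model.element.TypeElement;
    import javax.tools.Diagnostic;

    // Hypothetical marker: "methods in this class must start with the given prefix".
    @interface RequiresPrefix { String value(); }

    @SupportedAnnotationTypes("example.RequiresPrefix")
    @SupportedSourceVersion(SourceVersion.RELEASE_17)
    public class PrefixConventionProcessor extends AbstractProcessor {
        @Override
        public boolean process(Set<? extends TypeElement> annotations,
                               RoundEnvironment roundEnv) {
            for (Element cls : roundEnv.getElementsAnnotatedWith(RequiresPrefix.class)) {
                String prefix = cls.getAnnotation(RequiresPrefix.class).value();
                for (Element member : cls.getEnclosedElements()) {
                    if (member.getKind() == ElementKind.METHOD
                            && !member.getSimpleName().toString().startsWith(prefix)) {
                        processingEnv.getMessager().printMessage(
                            Diagnostic.Kind.WARNING,
                            "method should be prefixed with '" + prefix + "'",
                            member);
                    }
                }
            }
            return false;  // don't claim the annotation; let other tools see it too
        }
    }

The nice part is that violations show up as compiler warnings in the build, not as comments in code review.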


IMO it is not the programming model that needs to be changed, but the toolchain created around the programming model, which needs to be built with a larger horizon of ideas and ideals in mind.

So far most of the tools have been created in a very strict, single-purpose way. While that's a good engineering approach, it also leads to limitations that are hard to overcome. I remember the early days of Eclipse, when they had to build their own Java compiler just to pull out enough data for the IDE. And that's the simplest example. Or think of the days before JMX.

Obviously this is not a simple problem to solve. I think the only way we could fundamentally change things is to completely rethink the way programming models and environments are conceived.



