Teach Debugging (2014) (danluu.com)
64 points by tim_sw 5 months ago | hide | past | favorite | 34 comments



>> We had been warned in orientation that half of us wouldn't survive the year. In class, we were warned again that half of us were doomed to fail, and that ECE 352 was the weed-out class that would be responsible for much of the damage.

What an absolutely shitty attitude towards teaching. "We're going to let half of you fail". But that's not the failure of the students: you're their bloody teacher! You're the one who's failing to teach them!

Gah.

(I've read to the end of the article, I know he's complaining about the same thing, I just had to get that off my chest, thanks).


Indeed, and the self-serving reply given to the author when he tried to do something about a similar situation ("some people just can't hack it in engineering") suggests a vicious cycle in which the survivors of this process might well believe that this attitude has been vindicated by the process they themselves have been through. It is not hard to imagine this leading into the casual acceptance of the idea that people of the "wrong" gender, ethnicity or social status are inherently among those who just can't hack it, and it is suggestive of some of the techniques used by cults in indoctrinating new members.


Dan's comment about gatekeeping in STEM and dismissing the problem as "some people just can't hack it" rings very true to me.

I remember tutoring undergrad math back in the day; it was a walk-in thing where students would tutor other students and the only qualification required to be a tutor was that you passed the class yourself with a 3.5 or whatever. I recall distinctly one evening helping a group of sophomores with their intro linear algebra homework. It was early in the semester, the deadline for dropping classes was a week or two away, and they just weren't getting it. More than likely they were going to just wash out.

As we were plugging away through an example exercise in the textbook, I said something like "so this is a vector and this is a matrix, so to multiply them you need to do this" and one of them said in exasperation, "well how'd you even know that's what they were??" and it finally clicked that no one had explained to them basic notational conventions. They didn't know that the bold and italic variables in the textbook actually meant something. This was compounded by professors using different notations on the blackboard during lectures - typically lowercase letters with overbars for vectors and poorly-rendered double-struck capital letters for matrices.

It took 5 minutes of listening to understand what the real problem was, and 5 minutes to fix it, but really it just took empathy. I don't know if those students ended up dropping the class - statistically they very well could have. But if they did I hope it was because they found something else to be passionate and curious about, and not because they were bullied out of it.
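For the curious, the operation we were working through (a matrix times a vector) is trivial once the notation clicks; here's a throwaway Python sketch (my numbers, not the textbook's):

```python
# A is a matrix (a list of rows), x is a vector -- the distinction the
# bold/italic typography in the textbook was trying to convey.
A = [[1, 2, 3],
     [4, 5, 6]]
x = [1, 0, 2]

# Matrix-vector product: each entry of the result is a row of A
# dotted with x.
y = [sum(a * b for a, b in zip(row, x)) for row in A]
print(y)  # [7, 16]
```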


I think the industry has gotten a lot better, even at the educational level. The "you should already know this" attitude is no longer widely accepted IMO.

I feel a lot of people are overly judgmental and unwilling to provide education. It's still seen as important not to be found out as one of the dumb ones by making the unforgivable error of saying something stupid.

I'm shocked how many engineers, especially good ones (i.e. the ones you would benefit most from hearing from) simply refuse to inform people of their mistakes and correct their misunderstandings. So not only are they judgmental, they're unwilling to do anything about it.

I understand some of the reasons this is hard, and I'm sympathetic. There's time crunch, there are communication problems, and people have had negative experiences providing constructive feedback. People are damned proud of the way they operate and don't want to hear anything that would imply their way is not optimal, even if it would help them.

At the same time I think some people want this. They want to be judged superior, by virtue of others being deemed inferior, and want that status quo to continue.


> no one had explained to them basic notational conventions. They didn't know that the bold and italic variables in the textbook actually meant something. This was compounded by professors using different notations on the blackboard during lectures - typically lowercase letters with overbars for vectors and poorly-rendered double-struck capital letters for matrices.

That's a deeper issue than debugging or troubleshooting what went wrong.


I'm an undergrad TA for an introductory programming course, and I believe debugging is one of the most important skills that is not taught in the actual syllabus. Almost every student runs into roadblocks at some point. When I work with them, I try to walk through a process of debugging. Even for me, it took a number of years to develop a rigorous concept of what debugging entails, and it often still takes me some time before I decide to attack an issue systematically, even when beginning the debugging process formally would be a much more efficient use of my time.


Fully agreed. Debugging skills are not that hard to teach, while of course practice makes perfect.

Debugging is not limited to EE/CS courses of course. The sound on the TV doesn't work? But you still get a picture? So the HDMI cable is carrying the video correctly. Sound goes through the same cable, so it's not a cable issue. Do you hear sound if you switch to another input? Then the speaker is working fine. And so on. Divide and conquer.

Debugging (figuring out problems) relies on a handful of basic principles:

- having some understanding of how the system works

- looking for differences. A works, B doesn't. What are the differences between A and B?

- tracing issues through components (the data we care about makes it to the database, but the UI doesn't show it)

That's about it. Definitely teachable.
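To make the "divide and conquer" point concrete, here's a toy sketch (mine, with made-up commit numbers) of bisecting for the first bad state, the same idea git bisect automates:

```python
def first_bad(states, is_bad):
    """Binary-search for the first 'bad' state, assuming states go
    good...good bad...bad (the invariant git bisect relies on)."""
    lo, hi = 0, len(states) - 1   # invariant: states[hi] is bad
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(states[mid]):
            hi = mid              # first bad state is at mid or earlier
        else:
            lo = mid + 1          # first bad state is after mid
    return states[lo]

# Toy usage: commits 0-9, bug introduced at commit 6.
commits = list(range(10))
print(first_bad(commits, lambda c: c >= 6))  # 6
```

Each test rules out half the remaining candidates, which is exactly the "A works, B doesn't, what's different?" loop run to completion.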


I'm going to absolutely agree with you on that.

A number of years ago, I wrote http://the-whiteboard.github.io/coding/debugging/2016/03/26/...

> I’ll admit to being a bit hazy about the exact homework problems and lectures in intro to CS all those decades ago. I’ll even admit to it being in Pascal (the other choices were C or Fortran 77). I suspect the first homework problem was a “get familiar with writing in the IDE” and the second assignment was your typical “basic control structures.”

> If I could go back in time, I know what that third assignment would be. An intro to the debugger.

I mention it in the post, and I also really like the set of essays for How To Be A Programmer ( https://github.com/braydie/HowToBeAProgrammer ). I don't think it's a coincidence that the first skill listed is Learn To Debug - https://github.com/braydie/HowToBeAProgrammer/blob/master/en...

> Debugging is the cornerstone of being a programmer. The first meaning of the verb "debug" is to remove errors, but the meaning that really matters is to see into the execution of a program by examining it. A programmer that cannot debug effectively is blind.
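To sketch what that third assignment could look like (a hypothetical exercise of my own, using Python's built-in pdb rather than whatever the course used):

```python
# A deliberately buggy function for students to step through.
def average(xs):
    total = 0
    for x in xs:
        total += x
    return total / (len(xs) - 1)   # bug: off-by-one in the denominator

# Un-commenting the next line drops into pdb (Python 3.7+); from there
# students can use n (next), s (step), and p total (print) to watch
# the wrong denominator being applied.
# breakpoint()
print(average([2, 4, 6]))  # prints 6.0, not the expected 4.0
```

The point isn't the bug itself but making students watch the program's actual state diverge from their mental model of it.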


It's controversial, but I'm not certain it's possible to teach someone how to debug.

Some folks seem to naturally "get it"; others will try every possible permutation or start frantically copy-pasting code; and others seem to just get stuck, basically awaiting instructions.


I was gonna say something similar, but more pretentious. Debugging can't be taught, it must be learned. People can go from the "awaiting instructions person" to getting things done, but they need to have spent the hours in the trenches. It's kinda like saying teach me to be a great live musician. You have to practice, jam with people, and eventually do lots of gigs. You can't teach that in a classroom.


Debugging skill is generally useful, even outside of programming. You need it to fix a car that won't start, or to fix a toilet that won't flush, or to figure out why your bread recipe isn't rising well in the oven, or to understand why you fall off a boulder problem at the same spot on every attempt. However, it would be hard to teach debugging independently of another subject, as a general skill. Debugging relies on working knowledge of a system; you need to know a bit about how something works to figure out why it doesn't work. Moreover, debugging requires self-direction and creativity. Debugging is a skill that can be learned, but that cannot really be taught.


Debugging skills are generally useful, even outside of programming, because they are general problem-solving skills, applied to a problem with a program.

I doubt that they cannot be taught, but if that is so, then it would seem that programming cannot be taught either, as debugging is a necessary part of programming. While debugging may require self-direction and creativity, it can hardly need more than that needed to come up with a working program, starting from nothing more than a goal.

The more interesting question is how it is possible to teach someone programming (or digital circuit design) without instilling an ability to debug. If classes are churning out programmers who cannot debug, can their graduates really be called programmers?


I think it can be taught, but it takes a curious character.

Classes are already churning out plenty of non-programmers. Computer science has become famous for being a "sure job". If that is your motivation for entering university, you probably never had the necessary curiosity.


I took the Carpentries instructor training recently, and this is one of the reasons they insist that all of their courses are delivered as live coding with no slides at all.

Instructors are encouraged to pause after every mistake, and then systematically resolve that mistake out loud - review the error messages and walk through the resolution.

https://carpentries.github.io/instructor-training/14-live/in...



>This dynamic isn't unique to ECE 352, or even Wisconsin – I saw the same thing when I TA'ed EE 202, a second year class on signals and systems at Purdue. The problems were FFTs and Laplace transforms instead of dividers and Boolean, but the avoidance of teaching fundamental skills was the same. It was clear, from the questions students asked me in office hours, that those who were underperforming weren't struggling with the fundamental concepts in the class, but with algebra: the problems were caused by not having an intuitive understanding of, for example, the difference between f(x+a) and f(x)+a.

>When I suggested to the professor that he spend half an hour reviewing algebra for those students who never had the material covered cogently in high school, I was told in no uncertain terms that it would be a waste of time because some people just can't hack it in engineering. I was told that I wouldn't be so naive once the semester was done, because some people just can't hack it in engineering. I was told that helping students with remedial material was doing them no favors; they wouldn't be able to handle advanced courses anyway because some students just can't hack it in engineering. I was told that Purdue has a loose admissions policy and that I should expect a high failure rate, because some students just can't hack it in engineering.

I think that fundamentally the problem is in mathematics.

Our notation is from the 18th century at best, yet because it is hard, people think it's meaningful.

Standard maths notation makes sense for polynomials: a_0 x^n + a_1 x^(n-1) + ... + a_n = k is the only type of general expression one can write without the need to use parens. Everything else is a kludge added on top of that notation to fix one problem with it, but only in one specific field. Which leaves you with dozens of DSLs for specific areas of maths/engineering/physics/etc.

I'm a big fan of lispified typed lambda calculus. There are no exceptions, and any specific notation is defined in terms of rewrite rules. The fact that types are required also makes it clear what you're talking about.

Unfortunately I'm in a minority of one whenever I've talked to working mathematicians, even though I can tear through papers and proofs an order of magnitude faster than when I try and use standard notation.
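To give a flavor of what I mean (a toy Python sketch of my own, standing in for a real lispified system): expressions are plain trees, and "notation" is just rewrite rules over them.

```python
# Expressions are nested tuples: ('+', a, b), ('*', a, b), etc.

def rewrite(expr, rule):
    """Apply a rewrite rule bottom-up over an expression tree."""
    if isinstance(expr, tuple):
        expr = tuple(rewrite(e, rule) for e in expr)
    return rule(expr)

# One rule: x * 1 -> x (and 1 * x -> x).
def mul_identity(expr):
    if isinstance(expr, tuple) and expr[0] == '*':
        _, a, b = expr
        if b == 1:
            return a
        if a == 1:
            return b
    return expr

e = ('+', ('*', 'x', 1), ('*', 1, ('+', 'y', 1)))
print(rewrite(e, mul_identity))  # ('+', 'x', ('+', 'y', 1))
```

There are no special cases: every simplification, every notational convention, is just another rule applied uniformly to the tree.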


It's absolutely a huge part of the problem. Mathematical notation is horrible, it's imprecise, it's inconsistent - it's a holdover from a legacy time.

But anytime it's brought up, mathematicians get upset. The issue will never be addressed, but it's a disaster.


I think it will when we finally have a language that is useful for all of: jotting down expressions, manipulating them by hand, and being consistent enough that automated theorem provers/proof helpers can use as their internal representation.

The issue right now is that the impressionistic maths notation works well for humans and there is no computer language that:

1) has good notation

2) is useful to mathematicians out of the box

Mathematica, Sage, Axiom, etc. all have internal representations that are essentially the system I'm talking about, but the user-facing language is a mess in all cases. It's not a simple problem and I don't even know what the solution looks like.

It will have a lispy notation (tree serialization), and use rewrite rules (generalized macros) and types (some type of type inference with explicit typing annotations), but other than that I feel like someone trying to invent Algol in 1947.


Couldn't you say that they technically applied the Feynman method recursively?

The original problem was the design of the divider and how to implement it. After looking at some half-working solutions, the problem changed to being able to find the cause of the broken half, where they came up with the "mechanical technique" for debugging.

I'm not familiar with the method, but the second step, "Think real hard", is quite broad which allows the technique to be applied to anything as long as there is a problem and solution (steps 1 and 3).

1. How do I design a divider?

2. Thinking

3. Solution

and

1. How do I determine what's wrong with my design?

2. Thinking

3. Mechanical technique


The “weed out” class that the author describes in his story is the same type of class that eventually led me to drop out of college/my CS program. The teacher described the class the same way. What's interesting here, though, is that I ended up learning how to really program by debugging and reverse engineering software while in a customer support role. I eventually ended up being moved to the development side of things after I filed enough defects in our code base.


I highly recommend the free Software Debugging course on Udacity[1]. It focuses on a systematic approach to debugging, and using/building tools to automate your debugging process. I had already been writing code professionally for several years when I worked through the course, and I still learned a lot.

[1] https://www.udacity.com/course/software-debugging--cs259
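For a flavor of what the course covers: if memory serves, it's taught by Andreas Zeller around systematic, automatable techniques such as delta debugging, which shrinks a failing input to a smaller one that still fails. A much-simplified sketch of that idea (my own code, not the course's):

```python
def ddmin(failing_input, fails):
    """Shrink a failing input to a smaller one that still fails,
    by repeatedly trying to remove chunks (simplified ddmin)."""
    n = 2
    while len(failing_input) >= 2:
        chunk = max(len(failing_input) // n, 1)
        reduced = False
        for i in range(0, len(failing_input), chunk):
            candidate = failing_input[:i] + failing_input[i + chunk:]
            if candidate and fails(candidate):
                failing_input = candidate   # smaller input still fails
                n = max(n - 1, 2)
                reduced = True
                break
        if not reduced:
            if n >= len(failing_input):
                break                        # can't split any finer
            n = min(n * 2, len(failing_input))
    return failing_input

# Toy failure: any input containing both '<' and '>' "crashes the parser".
print(ddmin('foo <b> bar', lambda s: '<' in s and '>' in s))  # '<>'
```

The payoff is that the machine does the tedious "remove half, re-run the test" loop you'd otherwise grind through by hand.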


This is real talk. I train software developers these days, and this was exactly my first observation when I started. It's always perplexed me that there's no plan to teach debugging.

It's such an important skill that it boggles the mind. Anyone have any thoughts on why this seems to be glossed over or not even included in most school curricula?


Anyone have any recommendations for material which teaches debugging in a systematic and formal way?


I teach debugging in all my courses. The best systematic approach is to give broken code to students, both in activities and on tests. Students are exposed to common errors for each and every topic. I collect examples of mistakes from previous years and from Stack Exchange.

I also demonstrate how to debug. It is a combination of making accidental mistakes during live coding and introducing intentional mistakes that I show how to identify and fix.
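To illustrate the kind of exercise I mean (a made-up example, not from my actual course materials): give students something like this and ask them to find and fix the bug.

```python
# Exercise: this function is supposed to return the largest element
# of a non-empty list, but it is broken. Find and fix the bug.
def largest(xs):
    best = 0                # bug: wrong for all-negative lists;
    for x in xs:            # should be best = xs[0]
        if x > best:
            best = x
    return best

print(largest([3, 1, 2]))     # 3 (looks fine...)
print(largest([-5, -2, -9]))  # 0 (wrong: should be -2)
```

The instructive part is that the first test passes, so students have to come up with an input that exposes the bad initial value.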


After reading this post and the link there on the ‘Feynman Method’, I’ve decided to read Polya’s ‘How to Solve it’ [1]. I say this as someone who did his BSc in pure mathematics, MSc in CS and has worked as a programmer for a number of years. I meant to read it back in university, but I instead decided to keep applying the ‘Feynman method’, which I find for me is less efficient and more error prone due to my comparatively lower intelligence. The principles in the book shouldn’t be too hard to apply to both coding and debugging code.

They say you can’t teach an old dog new tricks, and you can’t increase IQ, but I’ve recently burned a lot of actually quite simple coding interviews due to things like a missed edge case, and I wonder whether this book can help me discipline myself to work in a more systematic fashion. Not only for Leetcode/Hackerrank-style questions, but for work as well.

[1] https://en.wikipedia.org/wiki/How_to_Solve_It?wprov=sfti1


On the topic of getting burned on edge cases I myself am in the same boat. I’m very much a speedy, excited programmer. I want to build the thing and see if it works afterwards. This can make me a bit sloppy especially in a time boxed coding problem.

What I’ve done to improve this to pretty good effect is follow this rough formula on problems:

- ask questions

- write out assumptions

- come up with a basic solution (pseudocode)

- see if i can come up with edge cases or converse examples that break my solution/assumptions

- code while stopping occasionally to repeat the previous step

- walk through the code manually with some examples

After doing this a bunch I’ve actually internalized a lot of the edge cases or errors I might have missed before. So I think I am slowly teaching myself to be more precise for these types of questions. It’s slow going but decently effective.
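As a made-up illustration of the "write down edge cases" step (the problem and names are mine): I turn the suspected edge cases into assertions before trusting the solution.

```python
def longest_run(xs):
    """Length of the longest run of equal adjacent elements."""
    if not xs:                      # edge case: empty input
        return 0
    best = run = 1
    for prev, cur in zip(xs, xs[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

# Edge cases written out up front, before trusting the solution:
assert longest_run([]) == 0            # empty
assert longest_run([7]) == 1           # single element
assert longest_run([1, 1, 1]) == 3     # all equal
assert longest_run([1, 2, 2, 3]) == 2  # run in the middle
print("all edge cases pass")
```

In a timed interview the assertions double as the "walk through the code manually" step: each one is an example you can trace by hand.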


It’s hard to balance how much you should write on paper in place of just coding. Coding in itself allows you to explore the problem, but induces its own cognitive overhead, which might distract you from important insights about the direction you’re taking.


I replied prematurely. It definitely looks like we’re in the same boat regarding programming style. I wonder whether I can apply this with measurable results. It would be nice if it were easier to compare your completion time with others on Leetcode and Hackerrank so that the improvement could be measured.


EDIT: Some context about myself so as to not misrepresent myself as some sort of expert. I’ve been prepping while working full time for 6 months. I was really quite bad at doing the problems initially and now can pretty much knock out most problems without too much trouble. Planning to start interviewing in a month or two.

IMO comparing against yourself over time would be more productive. I say that because it would definitely stress me out personally if I knew I was in the 10th percentile or whatever. And then that negative attitude could snowball into stopping practice.

If you do want an objective measure for FAANG I think Facebook recommends being able to complete 2 Medium level LC problems in 35 minutes. But again, it’s real hard to mimic the real interview environment of explaining yourself, being able to ask questions, etc. I did find mock interviews helpful early on to refine my process before just grinding problems.


Can always talk to yourself if you’re on your own. Maybe get a rubber duck so it’s less weird.


John Regehr has a post with some book recommendations from 2013 [0]. Notably one of the books corresponds to an older Udacity course [1].

There's also this book that was recently published called Effective Debugging: 66 Specific Ways to Debug Software and Systems [2].

[0] https://blog.regehr.org/archives/849

[1] https://www.udacity.com/course/software-debugging--cs259

[2] https://www.amazon.com/Effective-Debugging-Specific-Software...


Debugging by David Agans is great. It’s my #1 book recommendation for software engineers. https://amazon.com/dp/0814474578

The lessons from the book are especially helpful when I feel stuck in debugging; I’ll think through the guidelines and they get me unstuck almost every time. For example, yesterday I started feeling stuck trying to figure out why my tests were failing, and I realized I was failing to follow the “stop thinking and look” guideline. We tend to theorize way too much about what may be happening when we should simply look to see what is actually happening.


I didn't see this before I wrote my other comment (https://news.ycombinator.com/item?id=25337491), but the free Software Debugging course on Udacity is great: https://www.udacity.com/course/software-debugging--cs259


I teach undergrad CS, and recently wrote this to explain how to debug: http://justinnhli.oxycreates.org/debugging/

It's still a work in progress (I would like to add some worked examples), so any suggestions for improvements are welcome.



