What an absolutely shitty attitude towards teaching. "We're going to let half of you fail". But that's not the failure of the students: you're their bloody teacher! You're the one who's failing to teach them!
(I've read to the end of the article, I know he's complaining about the same thing, I just had to get that off my chest thanks).
I remember tutoring undergrad math back in the day; it was a walk-in thing where students would tutor other students and the only qualification required to be a tutor was that you passed the class yourself with a 3.5 or whatever. I recall distinctly one evening helping a group of sophomores with their intro linear algebra homework. It was early in the semester, the deadline for dropping classes was a week or two away, and they just weren't getting it. More than likely they were going to just wash out.
As we were plugging away through an example exercise in the textbook, I said something like "so this is a vector and this is a matrix, so to multiply them you need to do this" and one of them said in exasperation, "well how'd you even know that's what they were??" and it finally clicked that no one had explained to them basic notational conventions. They didn't know that the bold and italic variables in the textbook actually meant something. This was compounded by professors using different notations on the blackboard during lectures - typically lowercase letters with overbars for vectors and poorly-rendered double-struck capital letters for matrices.
It took 5 minutes of listening to understand what the real problem was, and 5 minutes to fix it, but really it just took empathy. I don't know if those students ended up dropping the class - statistically they very well could have. But if they did I hope it was because they found something else to be passionate and curious about, and not because they were bullied out of it.
I feel a lot of people are overly judgmental and unwilling to educate. At the same time, it's considered important not to be found out as one of the dumb ones by making the unforgivable error of saying something stupid.
I'm shocked how many engineers, especially good ones (ie the ones you would benefit from hearing from the most), simply refuse to point out people's mistakes and correct their misunderstandings. So not only are they judgmental, they're unwilling to do anything about it.
I understand some of the reasons this is hard, and I'm sympathetic. There's time crunch, there are communication problems, and many people have had negative experiences providing constructive feedback. People are damned proud of the way they operate and don't want to hear anything that implies their way is not optimal, even if it would help them.
At the same time I think some people want this. They want to be judged superior, by virtue of others being deemed inferior, and want that status quo to continue.
That's a deeper issue than debugging or troubleshooting what went wrong.
Debugging is not limited to EE/CS courses of course. The sound on the TV doesn't work? But you still get a picture? So the HDMI cable is carrying the video correctly. Sound goes through the same cable, so it's not a cable issue. Do you hear sound if you switch to another input? Then the speaker is working fine. And so on. Divide and conquer.
Debugging (figuring out problems) relies on a handful of basic principles:
- having some understanding of how the system works
- looking for differences. A works, B doesn't. What are the differences between A and B?
- tracing issues through components (the data we care about makes it to the database, but the UI doesn't show it)
That's about it. Definitely teachable.
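The "A works, B doesn't" principle is exactly what `git bisect` automates. A minimal sketch of the same idea in Python (the version list and `is_good` predicate are hypothetical stand-ins for whatever you're narrowing down):

```python
def bisect_first_bad(versions, is_good):
    """Binary-search for the first "bad" item in a sequence that goes
    good, good, ..., bad, bad -- the idea behind `git bisect`.
    Assumes versions[0] is good and versions[-1] is bad."""
    lo, hi = 0, len(versions) - 1
    while lo + 1 < hi:
        mid = (lo + hi) // 2
        if is_good(versions[mid]):
            lo = mid   # the regression was introduced after mid
        else:
            hi = mid   # the regression is at or before mid
    return versions[hi]

# Toy example: versions 0-9, with a regression slipping in at version 6.
versions = list(range(10))
print(bisect_first_bad(versions, lambda v: v < 6))  # → 6
```

Each probe halves the search space, which is why the technique scales to long histories and big systems alike.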
A number of years ago, I wrote http://the-whiteboard.github.io/coding/debugging/2016/03/26/...
> I’ll admit to being a bit hazy about the exact homework problems and lectures in intro to CS all those decades ago. I’ll even admit to it being in Pascal (the other choices were C or Fortran 77). I suspect the first homework problem was a “get familiar with writing in the IDE” and the second assignment was your typical “basic control structures.”
> If I could go back in time, I know what that third assignment would be. An intro to the debugger.
I mention it in the post, and I also really like the set of essays for How To Be A Programmer ( https://github.com/braydie/HowToBeAProgrammer ). I don't think it's a coincidence that the first skill listed is Learn To Debug - https://github.com/braydie/HowToBeAProgrammer/blob/master/en...
> Debugging is the cornerstone of being a programmer. The first meaning of the verb "debug" is to remove errors, but the meaning that really matters is to see into the execution of a program by examining it. A programmer that cannot debug effectively is blind.
Some folks seem to naturally "get it", others will try every possible permutation or start frantically copy-pasting code, and others seem to just get stuck, basically awaiting instructions.
I doubt that they cannot be taught, but if that were so, then it would seem that programming cannot be taught either, as debugging is a necessary part of programming. While debugging may require self-direction and creativity, it can hardly require more of either than coming up with a working program does, starting from nothing more than a goal.
The more interesting question is how is it possible to teach someone programming (or digital circuit design) without instilling an ability to debug? If classes are churning out programmers who cannot debug, can their graduates really be called programmers?
Classes are already churning out plenty of non-programmers. Computer science has become famous for being a "sure job". If that is your motivation for entering university, you probably never had the necessary curiosity.
Instructors are encouraged to pause after every mistake, and then systematically resolve that mistake out loud - review the error messages and walk through the resolution.
2015 (a bit): https://news.ycombinator.com/item?id=10631273
Discussed at the time: https://news.ycombinator.com/item?id=7215870
>When I suggested to the professor that he spend half an hour reviewing algebra for those students who never had the material covered cogently in high school, I was told in no uncertain terms that it would be a waste of time because some people just can't hack it in engineering. I was told that I wouldn't be so naive once the semester was done, because some people just can't hack it in engineering. I was told that helping students with remedial material was doing them no favors; they wouldn't be able to handle advanced courses anyway because some students just can't hack it in engineering. I was told that Purdue has a loose admissions policy and that I should expect a high failure rate, because some students just can't hack it in engineering.
I think that fundamentally the problem is in mathematics.
Our notation is from the 18th century at best, yet because it is hard people think it's meaningful.
Standard maths notation makes sense for polynomials: a_0 x^n+a_1 x^(n-1)+ ... + a_n = k is the only type of general expression one can write without the need to use parens. Everything else is a kludge added on top of that notation to fix one problem with it, but only in one specific field. Which leaves you with dozens of dsls to specific types of areas of maths/engineering/physics/etc
I'm a big fan of lispified typed lambda calculus. There are no exceptions, and any specific notation is defined in terms of rewrite rules. The fact that types are required also makes it clear what you're talking about.
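A rough sketch of the idea in Python, with nested tuples standing in for s-expressions (my own illustration, not the full typed system described above; the operator set is minimal):

```python
import operator

# Every node is a uniform (op, arg, arg, ...) tuple, so there is one
# general notation with no precedence rules or per-field kludges.
OPS = {'+': operator.add, '*': operator.mul, '^': operator.pow}

def evaluate(expr, env):
    """Evaluate a nested-tuple s-expression against variable bindings."""
    if isinstance(expr, tuple):
        op, *args = expr
        vals = [evaluate(a, env) for a in args]
        result = vals[0]
        for v in vals[1:]:          # fold variadic ops left-to-right
            result = OPS[op](result, v)
        return result
    if isinstance(expr, str):
        return env[expr]            # variable lookup
    return expr                     # literal number

# a*x^2 + b*x + c written uniformly:
poly = ('+', ('*', 'a', ('^', 'x', 2)), ('*', 'b', 'x'), 'c')
print(evaluate(poly, {'a': 1, 'b': 2, 'c': 3, 'x': 10}))  # → 123
```

The point of the uniformity is that rewrite rules (macros) can then transform any expression the same way, instead of each notation needing its own parser.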
Unfortunately I'm in a minority of one whenever I've talked to working mathematicians, even though I can tear through papers and proofs an order of magnitude faster than when I try and use standard notation.
But anytime it's brought up, mathematicians get upset. The issue will never be addressed, but it's a disaster.
The issue right now is that the impressionistic maths notation works well for humans and there is no computer language that:
1). has good notation
2). is useful to mathematicians out of the box
Mathematica, sage, axiom, etc all have internal representations that are essentially the system I'm talking about but the user facing language is a mess in all cases. It's not a simple problem and I don't even know what the solution looks like.
It will have a lispy notation (tree serialization), and use rewrite rules (generalized macros) and types (some type of type inference with explicit typing annotations), but other than that I feel like someone trying to invent Algol in 1947.
The original problem was the design of the divider and how to implement it. After looking at some half-working solutions, the problem changed to being able to find the cause of the broken half, where they came up with the "mechanical technique" for debugging.
I'm not familiar with the method, but the second step, "Think real hard", is quite broad which allows the technique to be applied to anything as long as there is a problem and solution (steps 1 and 3).
1. How do I design a divider?
2. How do I determine what's wrong with my design?
3. Mechanical technique
It's such an important skill that it boggles the mind. Anyone have any thoughts on why this seems to be glossed over or not even included in most school curricula?
I also demonstrate how to debug. It is a combination of making accidental mistakes during live coding and introducing intentional mistakes, which I then show how to identify and fix.
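For instance, one intentional mistake worth introducing is an off-by-one in a loop bound, then reading the traceback aloud before fixing it (a hypothetical demo; the actual mistakes vary by lesson):

```python
def average(xs):
    total = 0
    for i in range(len(xs) + 1):   # intentional off-by-one for the demo
        total += xs[i]             # raises IndexError on the last iteration
    return total / len(xs)

try:
    average([1, 2, 3])
except IndexError as e:
    # Step 1: read the actual error message instead of guessing.
    print(e)                       # → list index out of range
    # Step 2: inspect the failing state: i reached len(xs), one past the end.

def average_fixed(xs):
    return sum(xs) / len(xs)       # corrected (and simplified) version

print(average_fixed([1, 2, 3]))    # → 2.0
```

Walking through the error message and the loop variable in front of students models the process rather than just the fix.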
They say you can’t teach an old dog new tricks, and you can’t increase IQ, but I’ve recently bombed a lot of actually quite simple coding interviews due to things like a missed edge case, and I wonder whether I can train myself to work in a more disciplined and systematic fashion with this book. Not only for Leetcode/Hackerrank style questions, but for work as well.
What I’ve done to improve this to pretty good effect is follow this rough formula on problems:
- ask questions
- write out assumptions
- come up with a basic solution (pseudocode)
- see if i can come up with edge cases or converse examples that break my solution/assumptions
- code while stopping occasionally to repeat the previous step
- walk through the code manually with some examples
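As a concrete illustration of that formula (my own toy example): annotating a small binary-search practice problem with those steps:

```python
# Problem (hypothetical practice question): given a sorted list, return
# the index of a target, or -1 if absent.
#
# Assumptions written out up front: list is sorted ascending, may be
# empty, may contain duplicates (any matching index is acceptable).
# Edge cases that could break my pseudocode: [], a single element,
# target smaller/larger than everything.

def search(xs, target):
    lo, hi = 0, len(xs) - 1
    while lo <= hi:                 # <= handles the single-element case
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1                       # also covers the empty-list edge case

# Final step: walk through the code manually with the edge cases.
print(search([], 5))        # → -1
print(search([5], 5))       # → 0
print(search([1, 3, 5], 0)) # → -1
```

Writing the assumptions and edge cases as comments before coding is what keeps the later "walk through manually" step honest.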
After doing this a bunch I’ve actually internalized a lot of the edge cases or errors I might have missed before. So I think I am slowly teaching myself to be more precise for these types of questions. It’s slow going but decently effective.
IMO comparing against yourself over time would be more productive. I say that because it would definitely stress me out personally if I knew I was in the 10th percentile or whatever. And then that negative attitude could snowball into stopping practice.
If you do want an objective measure for FAANG I think Facebook recommends being able to complete 2 Medium level LC problems in 35 minutes. But again, it’s real hard to mimic the real interview environment of explaining yourself, being able to ask questions, etc. I did find mock interviews helpful early on to refine my process before just grinding problems.
There's also this book that was recently published called Effective Debugging: 66 Specific Ways to Debug Software and Systems.
The lessons from the book are especially helpful when I feel stuck in debugging; I’ll think through the guidelines and they get me unstuck almost every time. For example, yesterday I started feeling stuck trying to figure out why my tests were failing, and I realized I was failing to follow the “stop thinking and look” guideline. We tend to theorize way too much about what may be happening when we should simply look to see what is actually happening.
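A tiny made-up example of what "stop thinking and look" saves you from: the theory sounds plausible, but one `repr()` shows the actual state (the zero-width character here is an invented culprit for illustration):

```python
def normalize(name):
    return name.strip().lower()

# Theory: "the lookup must be failing because of capitalization."
# Look: print the actual value and find the real culprit.
raw = "Alice\u200b"               # a zero-width space pasted in from a web page
value = normalize(raw)
print(value == "alice")           # → False: the theory was wrong
print(repr(value))                # → 'alice\u200b' -- there it is
```

The `repr()` (rather than a plain print, which renders the zero-width character invisibly) is the "look" step: it shows what the string actually contains instead of what we assume it contains.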
It's still a work in progress (I would like to add some worked examples), so any suggestions for improvements are welcome.