Why Don't Schools Teach Debugging? (danluu.com)
154 points by danso on Feb 11, 2014 | 127 comments



I did an electrical engineering course. Especially with hardware ("welcome to tweak week"), there was a real sink or swim approach. I ended up feeling like it had to do with testability. It's easy to grade a test if you just look for right answers, and mark everything else off. It's easy to grade labs if you just look for the circuit tracking the input frequency. Grading either based on some assessment of how well the student understands what's actually happening is much, much harder.

My older brother and I went through the same classes and labs together. He brought to the table 15 years of experience playing around with electronics, so when we took labs, he had a clear sense of how to debug hardware.

He would wave his hand over some part of the circuit, and make gnomic pronouncements like, "When the input goes over about 5 volts, this part goes apeshit and starts firing back at it." Then he'd swap out a cap or a resistor, and everything would start working. I just went along for the ride.

I had decent math skills, so I did all right in the program. But I never learned that intuitive, heuristic, holistic way of grokking circuits like my brother did. Nor could I ever see that the school had any way to reward people like him, who "got" things at a deep level.

My sense was always that his approach didn't test well. If you test people for calculating the right voltage, you end up rewarding people who "plug & chug" quickly, even if they don't know what the voltage means, or why it would make some cluster of components go apeshit when it goes over about five volts.


> It's easy to grade a test if you just look for right answers, and mark everything else off. It's easy to grade labs if you just look for the circuit tracking the input frequency. Grading either based on some assessment of how well the student understands what's actually happening is much, much harder.

The professor I had for Networks and Linear Systems had an interesting approach to that. He developed every test problem such that the input values were small, whole numbers. If you did the problem right, your answers would also be small whole numbers. If you went off into the weeds, you would get really weird answers. That was both a hint to the student that they were barking up the wrong tree and a guide as to where the solution broke down.


EE is all about intuition, really. If you don't have the intuition to approximate everything (I don't), you just sink; there's no way to do everything exactly, the way you often can in CS. Intuition matters even more than how good you are at math, because the whole point of the skill is figuring out when you don't have to do a lot of math in the first place.


CS is also heavily reliant on intuition:

- Picking the right architecture for a complex system out of a huge space of potential solutions (which have varying costs, performance, reliability, maintainability and complexity) requires a lot of intuition, developed over years of practice. There's no deterministic method that you can use to crank out the "right answer".

- At a lower level, the same applies to choosing algorithms: Will an algorithm that's much simpler to implement but provides reasonable average performance over most of the cases I need to handle be better than a complex algorithm that will take much longer to implement? Are there heuristics that can help me solve a problem that's computationally intractable? The answer frequently comes from intuition.

- When debugging, being able to quickly narrow down the possible causes of bugs (is it a memory leak? a race condition? a compiler bug?) requires a lot of intuition. Yes, it's possible to debug by brute force, but people with an intuition for it can do it much more efficiently.


In addition, usually you are balancing those searches for optimal solutions against business deadlines, incomplete specs, and the need to make unknown changes in the future.


Does anyone have advice for learning this kind of "intuitive, heuristic, holistic way of grokking circuits"?


What helped for me:

* Play with circuits all day

* Learn to grok math. No, really. Learn to understand the physical meaning behind what you write. Time integral of some signal? Oh, yes! It's energy: the ability of that thing to do work. Fourier series => this signal written as a sum of sine signals, which you can analyze separately (with all the implications that has for linear systems; see the one-line version after this list). That kind of stuff.

* Learn to look at circuits in terms of transformation of magnitude, phase, frequency, shape and nature of signals, not in terms of connections, adapting this to various use cases in circuits. There's no general answer for this; an inductor is sometimes there to "insist on keeping the current to a constant value", sometimes it's there to exhibit a greater impedance to high-frequency signals, therefore allowing -- along with the capacitor over there -- only signals of a certain narrow frequency band to pass.

* Learn circuits in terms of idioms. Try to look at schematics and understand that "this part" is a filter, "that part" is a preamplifier, "that part" is a current mirror used for biasing and so on.
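To make the Fourier bullet concrete, here is the one-line version in standard notation (nothing course-specific, just the usual definition):

    x(t) = \frac{a_0}{2} + \sum_{n=1}^{\infty} \left( a_n \cos(n \omega_0 t) + b_n \sin(n \omega_0 t) \right)

For a linear system you can push each sinusoid through separately and sum the outputs, which is the whole reason the decomposition is useful.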


The Art of Electronics has some nice ideas.

1) Don't use a calculator. It's easy to plug numbers in without really understanding what they mean, and to end up believing a number that is several orders of magnitude off. And after the first week of converting between micro and nano and pico, you'll have a good grasp of what those numbers mean (see the worked conversion after this list).

2) Draw circuits by hand. You'll be working on small circuits, and drawing them is a bit like the "learn whatever the hard way" approach of typing stuff in.

3) For analogue DC circuits it's about manipulating current, voltage, and resistance. For AC circuits it's power and impedance. For digital it's 1s and 0s and timing.

3a) In context, components serve a role: decoupling, current limiting, amplifying, etc. You learn that a certain configuration of a few components means an amplifier, and so you know that a voltage across those pins means a bigger voltage across those other pins.
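To give point 1 a concrete worked example (values picked arbitrarily):

    100\,\text{nF} = 0.1\,\mu\text{F} = 10^{5}\,\text{pF}
    \tau = RC = 10\,\text{k}\Omega \times 100\,\text{nF} = 10^{4}\,\Omega \times 10^{-7}\,\text{F} = 10^{-3}\,\text{s} = 1\,\text{ms}

After a week of doing these by hand, "about a millisecond" jumps out at you before you've finished writing the numbers down.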

Years ago the advice was to grab some second-hand equipment and some new components and a breadboard and make circuits. The equipment would be a power supply, an oscilloscope, a signal generator, maybe a multimeter. The components would be a few resistors and capacitors and transistors, with some 555s and 741s (with a few 4xxx series thrown in). Nowadays there is almost certainly a virtual set of this stuff that you can run on a computer. Perhaps some HN reader is involved in it?

The Art of Electronics (and the Student Lab book) used to be a solid introduction - hard work but good. I'm not sure if there's anything better for general use, and I guess there's a lot of dated content. Microcontrollers are one example where you'd want additional books.

Tldr: do it.


Buy the Radio Shack 150-in-1 electronics lab. Complete all 150 projects.

In between, play with Lego Technic.

Buy all the parts and a breadboard, do 10 projects using solder to make the result permanent.


Does Radio Shack still make it? It does not look like it does.

SparkFun should offer something similar then :)



> 15 years of experience playing around with electronics


Pretty much this. It's a variant of the "10,000 hours rule" http://en.wikipedia.org/wiki/Outliers_%28book%29

I'm like the OP's brother in that when I started EE, I had already been tinkering with circuits for about 7 years. Obviously, in labs it was a huge help, but it also gave the math and all the abstract concepts learned in class some relationship to the real world.

Having the background means you can glance at a circuit and immediately know that a certain resistor is sized too small, or hearing the high-pitched whine of an inductor means that may be the problem with your regulator. In short: build lots of different stuff, have it blow up in your face, and learn something from the experience.


https://www.sparkfun.com/ - get some toys and start playing. Hands on approach.


Reading application notes can be helpful. They hold plenty of knowledge, and in many cases the material is presented in a less formal, more intuitive way.


It's more like they don't teach general diagnosing/understanding why things don't work. I consider the skills needed to debug a computer program, to diagnose a hardware problem, to fix a car, and to fix commercial coffee brewers to be almost identical (and I have done all of these things, a lot). In fixing a computer, you have a mental checklist of all of the things that could contribute to it (failing hard drive, bad termination, failing memory, loose cable). In fixing an engine, you think of what the problem is (rough idle), what can cause it (plugged fuel injectors, vacuum leak, exhaust leak), and slowly determine which one is the culprit (probably not the fuel system, it goes just fine at 75 down the freeway. Oh, that vacuum line is disintegrating. Good job, Toyota).

They all follow a similar pattern of identifying what the problem is, what physical manifestations can actually cause that problem, and then eliminating them one by one. This involves looking at the whole picture to narrow it down. This is not something I learned in school; I just figured it out growing up. I've found that friends who don't necessarily know anything about computers, but are mechanically inclined, think similarly to how I do. I do have an innate desire to understand how everything works, and this definitely helps me understand what can physically cause the problem. That said, I did just take an Intro to Math Thought course for my math degree, and while it didn't really teach problem solving as I described above, it dramatically improved the way I thought about proving things rigorously.


I never knew this was a problem in EE, but I know for a fact it's a huge issue in CS.

They don't want to teach any "tools" because CS professors find teaching people how to be good developers beneath them. They honest-to-god think CS is an actual hard science on par with math and physics. You can sum up their attitude with: "It's not an associate's degree, goddamnit!" (even though it should be)

Another source of reluctance is that by the time we graduate, it will be somewhat outdated. I remember we learned Subversion and Apache Ant. Everyone now uses Git, and I'm not even sure anyone uses Ant.

95% of students used std::cout for debugging and 95% of students never set up an IDE.

If I had spent a couple of weekends properly learning to use Visual Studio, all of my programming assignments for those 4 long years would have been 10x easier.


CS prof. here.

We don't want to teach "tools" (e.g. IDEs) because they are always changing. There is no point in knowing how to set up an IDE when the IDE will have a different GUI before the student graduates. We teach "unix tools" a bit more because they have not changed much over the last few decades.

My goal is to teach "concepts" (which is hard) and use tools as examples of these concepts. Dealing with the specifics of a particular IDE or tool is pointless. We are trying to give students general skills that will be useful for their whole lives, not skills that the industry needs this year.

That said, I do my best to teach debugging (mainly using gdb and valgrind). The real issue with debugging is that it relies on a lot of experience, which students do not have, by definition.
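To make that concrete, a minimal sketch of the kind of lab exercise I have in mind (hypothetical and simplified, not an actual assignment):

    // bug.cpp -- find the off-by-one
    // Build:    g++ -g -O0 bug.cpp -o bug
    // Memory:   valgrind ./bug   (flags the invalid write, with a stack trace)
    // Stepping: gdb ./bug, then: break main / run / next / print i
    #include <iostream>
    #include <vector>

    int main() {
        std::vector<int> v(10);
        for (std::size_t i = 0; i <= v.size(); ++i) // bug: <= goes one past the end
            v[i] = static_cast<int>(i);             // invalid write when i == 10
        std::cout << v.back() << "\n";
        return 0;
    }

The program is small, but the workflow (reproduce, let the tool point at the line, inspect state) is the transferable part.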


I learned absolutely nothing from having to use horrid unix tools like gdb in college. The focus on unix tools, in my opinion, seriously degrades the ability of students to learn and unfortunately cements very very bad usability paradigms in their minds.

I learned a ton by debugging things on my Mac using Turbo Pascal, Think C, and even, when absolutely necessary, MacsBug. Those are all obsolete today, but it was well worth learning the skills regardless of the tool.

That said, we were also taught very little about debugging in class. I learned it all on my own (with help from friends). Most of my fellow graduates seriously didn't know how to set a breakpoint and step through code at the end of their college days.


I'm so glad you replied!

I've heard versions of what you've said hundreds of times and honestly, you academics are missing the point!

Yes, the particular tool you learn will become obsolete, but debugging in Eclipse, gdb, Visual Studio, etc. is all basically the same. The knowledge you get is transferable, just like they have a head start if you teach them C++ and they end up in a Java shop.

Teach them tools that are going to be obsolete! Stop worrying about that.

You have no idea how much time your students are wasting hitting their heads against walls that shouldn't exist trying to debug their crappy code instead of actually working on understanding the underlying concepts. I'd say it's a 9 to 10 ratio of brain-numbing debugging to actual concept implementation.

I really strongly encourage you to set up your students with IDEs. Have a few seminars on how to set up a Visual Studio environment, and how to debug a large codebase. It abstracts a lot of the fluff that stands between them and the underlying concepts. The command line and its tools are great in the right context, but you're making the barrier to entry a lot larger than it needs to be. You can transition to command line tools in more advanced courses, but don't hamper their learning from the get-go.


> I really strongly encourage you to set up your students with IDEs. Have a few seminars on how to set up a Visual Studio environment, and how to debug a large codebase.

You can find lots of information about that using a search engine of your choice. Any student should be able to use Google. So there's no reason to waste teaching resources on that.


That is a bit of a weird thing to say. Lots and lots of things can be found using a search engine, but it doesn't mean they shouldn't be taught.


For really big programs you can't use valgrind; valgrind causes a significant performance hit, and the system often starts to choke under the memory load. Also, GDB really alters the timing of things, so it is bad for timing/multithreading bugs.

Mostly what helps is working through the logs and traces, following along in the code as you pass through them. Use a needle and a steady hand ;-)

I guess that's what they should practice in school as part of learning the trade. On the other hand, it's a trade-off: school needs to teach the general principles of programming, and with regard to debugging it is hard to distill universally applicable principles; it all depends on what you are doing.

Here is my take:

- unit testing is cool; it lets you catch problems in isolation, so it would be useful to teach it as a discipline (see the sketch after this list)

- for most GUI programs and application logic the debugger is really helpful

- traces and log analysis, as mentioned earlier, as the last resort
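As a minimal sketch of the unit-testing point (hypothetical function and cases, plain asserts standing in for a real test framework):

    #include <cassert>

    // Function under test (hypothetical): clamp x into [lo, hi].
    int clamp(int x, int lo, int hi) {
        if (x < lo) return lo;
        if (x > hi) return hi;
        return x;
    }

    int main() {
        // Each assert checks one behavior in isolation; a failure
        // points straight at the broken case.
        assert(clamp(5, 0, 10) == 5);    // inside the range
        assert(clamp(-3, 0, 10) == 0);   // below the range
        assert(clamp(42, 0, 10) == 10);  // above the range
        return 0;
    }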

Also, they should teach how to formulate the problem, analyse its source, and weigh ways of fixing a defect; here you have to understand the trade-offs: when to choose a local fix, when a local fix is no longer appropriate, and how to quickly test a fix so that it does not break the system in any way.

If I remember correctly, some of that is taught in software engineering courses.


Valgrind is fine for teaching students. I would probably use something faster and more sophisticated (and probably more expensive) in a professional setting but that doesn't mean there's nothing to learn from using it.

Adding logging also changes timing. Those aren't just NOPs you're throwing in. Also, changing timing isn't always a bad thing. Sometimes you can get a bug from a race condition to happen more often with the debugger.


Valgrind can't be fast: it has to track/color each memory location and check each pointer reference, and its memory overhead for larger footprints can be considerable.

Once upon a time it was really slow, but then they added just-in-time compilation.

Logging/tracing, on the other hand, can be selectively enabled.


The tools for debugging have changed, yes. But the high-level process is the same whether it is fixing a NAND gate, figuring out the cause of a segfault, or finding the location of a leak in a water pipe.

There is a problem. Many of the interns and recent CS grads I have worked with or hired have struggled with debugging. On a day to day basis, debugging is one of the most important skills. I would argue it is also an important skill in research settings, not just industry.

I don't know the answer, but maybe somehow integrating debugging into every CS class over the four years would help. Because you are right, it also requires a lot of practice/experience to get good at it so it is unlikely that just adding a class would solve the problem.

That is what the Purdue psychology program (I have a minor) does in its undergraduate program. The first 2-4 weeks of every class is spent on research methods often specific to the particular topic of the class. After taking the 4-5 classes to complete the minor, how to reduce bias in surveys, usability tests, etc was beaten into my brain.


Thanks for the comment. I agree that tools go obsolete quickly. But it occurs to me that debugging is both a set of tools but also a knack.

I'm curious what you think about how to teach the knack.


How to teach the knack:

Write lots of code that uses the most sketchy language features available. The chance is high that soon you'll get code that doesn't work. Now go through each line (source code or assembly) using a debugger of your choice and get it to work again.

Do this until there is nothing that will scare you anymore.
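A hypothetical example of the kind of "sketchy" code worth stepping through (a classic C++ lifetime bug, illustrative only):

    #include <iostream>

    int* make_counter() {
        int count = 0;   // local with automatic storage
        return &count;   // sketchy: returns a pointer to a dead stack slot
    }

    int main() {
        int* p = make_counter();
        std::cout << *p << "\n";  // undefined behavior: step through this and
        return 0;                 // watch *p change as later calls reuse the stack
    }

Watching the value rot in a debugger teaches you more about stack frames than any lecture.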


Also, do lots of maintenance programming. Finding bugs in code someone else wrote is generally much more difficult (and in my opinion valuable in learning sense), because you aren't familiar with it.


We aren't asking for vocational training, but for teachers that teach how to think, how to design, how to run experiments, and so on. So many schools (even the 'top' ones) turn out people that cannot do this, and I find it appalling. These things are foundational, they are not vocational. Unfortunately that does in fact mean having to learn some tools which will become obsolete. So what? I did EE labs, and while resistors are the same none of the digital components really are. I learned how to design NPN junctions, which are now largely replaced with CMOS, but so what? It was the experience that mattered. You can't hand wave that away ('you' is global, not you mandor), teach the math of junction biases, and expect that anyone will be able to do anything with it once they graduate. But so many CS programs try to do exactly that. I walked out of undergrad knowing how to do recurrence relations, prove the complexity of graph algorithms, and so on, but with almost no clue about how to actually design an algorithm, how to structure and design software, and so on. It took grad school and some good teaching professors to change that.

I've rewritten this three times or so - it is hard to address without sounding like I'm attacking you, which I am not. But I am truly dismayed about the skill sets of people graduating from CS programs. I'm not asking for Java vocational training, but for a recognition that 99% of the graduates will be asked to function as "engineers", not "scientists". The people that come out and that function well seem to do so despite the schooling, not because of it.

Incidentally, the best EE teacher, or any teacher, that I ever had, had worked in industry for many years, decided he wanted to teach, and went into teaching. His classes were practical and pragmatic. Oh, you were failing if you didn't master the math and theory behind the material, but he taught you how to design, how to think, how to manipulate all of this book learned stuff to make real things that worked. We had to cost out our projects, write design reports, and so on. Extremely hard courses, but absolutely fantastic, because of, not despite, the focus on what might be called 'pointless' things (who cares what the cost of a transistor was in 1986, after all?!)

Unfortunately, you cannot gain experience, and develop the process, without using tools. It needs to be part of the education. Just look at the post by the poor person you are responding to. Think of how much better his entire education would have been if in some freshman level lab he had been taught some of these basics. I shudder to think how much he probably spent for that education vs what he got.


> They don't want to teach any "tools" because CS professors find teaching people how to be good developers beneath them. They honest-to-god think CS is an actual hard science on par with math and physics.

Despite what you (and a lot of developers that I've talked to) seem to think, CS is a hard science. However, CS and software development are not the same thing. Most CS programs are taught by professors who have research degrees in Computer Science, not by people who have ever been (or probably ever claimed to be) professional software developers, so there is a fundamental disconnect between what you seem to think you should be learning and what is actually being taught.

The professional software developer is a fairly new occupation in the grand scheme of things. The people who are most qualified to teach it are people who have actually worked as developers in the industry, but that is difficult because 1) most universities are reluctant to hire (and students can be reluctant to take classes from) adjunct/non-PhD faculty, and 2) it is hard for schools to offer enough to lure many of them away from industry anyway.


"CS is a hard science"

Hahaha. As someone who did a degree in physics, computer "science" is a joke when it comes to math. They took one of the easiest fields of mathematics and rolled it into a half-assed degree. Sure, you learn some logic and some graph theory (yeah, let's try to avoid numbers as much as possible...), but it's not at a very high level. I seriously doubt CS professors are making significant contributions to mathematics.

"there is a fundamental disconnect between what you seem to think you should be learning and what is actually being taught"

Yeah. That was the point of what I wrote. There is also a disconnect between what academia wants to teach, and the tools that the students need to actually learn the concepts.

It's like teaching a literature class without actually teaching people how to write. Code is our primary tool for expressing algorithms and concepts. If you hamper people's ability to write code, you are shooting yourself in the foot.


Why do you say CS takes one of the easiest fields in mathematics? People have been studying CS since long before we had computers and there are still famous open problems. If the math is so easy perhaps you'd like to give a proof for P=NP (or P!=NP) and claim your million dollars? Do you really think CS is all discrete math and graph theory? I don't go around claiming physics is easy because Newtonian mechanics are simple. I'd also argue that math only gets interesting once you take the numbers away. If all you're doing is number crunching you should probably just write some code instead. Computers are good at that sort of thing.

In my experience the real disconnect with CS degrees is expectations about what is and what should be taught. A lot of students want a vocational program. Universities end up teaching a hodgepodge of topics such as how to write code in a particular language, program design/software engineering, algorithms and complexity, discipline specific specialties (e.g. AI, OSes), computability, etc.

When I was in university I was taught how to debug my code with ddd and gdb. Was this a waste of my time because I no longer use them? Of course not. Everything I learned with them carries over to the debuggers I use everyday. My degree also covered several languages that I've never used outside of classes. The skills I learned learning them help me any time I want to pick up a new language and start coding.

Code is not a primary tool for expressing algorithms and concepts. Code is how I tell the computer what to do. If code was so good for communicating concepts and algorithms to other people I wouldn't need to write comments. I also haven't seen any algorithm books that include real code. The closest you'll get there is MIX. Good luck programming a real application in that.


Why the bitterness against CS?

As jarvic says,

> However, CS and software development are not the same thing.

A lot of schools offer a separate degree in Software engineering as opposed to CS (or at least separate tracks). So the distinction is pretty clear. Where I studied, there are separate degrees for CS and IT. You seem to be fudging the two. Do you expect physics majors to be mechanical engineers automatically?

CS degrees teach you algorithms, not how to write code in Java. Doesn't matter what IDE/language you're using, bubble sort is going to be slower than quicksort for large lists.

Yes, unfortunately most people who do CS end up becoming IT drones. That's the nature of the industry, not the CS program's fault. As a matter of interest, what do most physics majors become? I'm sure there aren't enough professorships available.

Or in the words of E.W.Dijkstra: “Computer science is no more about computers than astronomy is about telescopes.”


Sorry if I came off as bitter. I'm a couple of years out of college, and honestly I felt that the CS department was pretty self-entitled and misguided.

"CS degrees teach you algorithms"

Yeah, it does. And it doesn't feel like it's really enough. There isn't a feeling by the end that you've learned only a small fraction of the algorithms out there, and I guess it felt like there wasn't all that much science.

I went to a school with a decent department (I think top 20 in the US) but I never felt like they were actually doing any real, interesting, groundbreaking research.

In contrast, the Physics department is doing all sorts of stuff. Quantum computing, CERN related stuff, building new astronomical sensors, string theory stuff etc. etc. They've got their own problems, but at least it's something you can point to.

Physics majors are sorta screwed. Physics is sorta the opposite of CS. When you're done you feel like you've only learned a tiny bit and that there is so much more to understand. If you're not trying to go to grad school, the department doesn't really care about you. You end up having no marketable skills, and your selling point is "I-went-through-an-insanely-difficult-degree-and-I'm-good-at-problem-solving". Most people can sorta code in Python, and can sorta do some circuit stuff. It's pretty bad.

People end up teaching, or working in unrelated jobs. Or going to grad school in related disciplines. One of my buddies ended up picking grapes with illegal immigrants.


Disclaimer: I'm a CS PhD student that did pure math in undergrad, so I probably have a different perspective on these things than someone who did CS only.

CS, at a graduate level, is a much more varied field than most people, even CS undergrads, realize. In fact, a lot of the main research groups in my department (graphics, vision, image analysis, robotics, VR, visualization) are things most CS undergrads don't even get exposed to.

In math or physics, a lot of the undergrad courses are kind of basic, entry-level things that give you a feel for some of the different fields of research without going too far in depth. This isn't really the case with CS, because CS departments have to cater to several different audiences who tend to fall into one degree. They have to come up with a curriculum that balances programming/development concepts for people who want to be developers, networking/administration stuff for people who want to be IT (although this is more and more being moved into a separate program), and theory for people who actually want to be computer scientists. Even then, this theory is mostly relegated to computation theory, which is only a small part of what research computer scientists do.

Personally, I'm in medical image analysis, which is a field which is a collision between a ton of different areas, including physics, statistics, operations research[1], numerical analysis, image processing, differential geometry, and medicine. I don't know that I would consider any of those fields to be the "easiest branch of mathematics" (what does that even mean, anyway?)

This post has kind of gotten off the rails a little bit, so I'll end with this: I think, if you just look at a CS undergrad program and try to project that to what actual CS research is like, you end up getting a much narrower and incomplete picture than if you do the same thing to a math or physics department, partly because CS research tends to be much more interdisciplinary than others (I have many more collaborators in many more areas than some friends of mine in math/physics/biology do).

--

[1] http://en.wikipedia.org/wiki/Operations_research


The purpose of a CS course is not necessarily to teach people to become developers. Indeed it would be interesting to know what % of CS grads even work as programmers post graduation, probably less than HN would imply. Based on the people I keep in touch with from my graduation class, I would put it somewhere near 30%.


That's a common thing I didn't understand about my fellow CS majors. Many of them did not want to become developers and even took pride in the fact that they made it until the final exams without ever having to hand in a single programming assignment they did themselves.

Yes, these people have amazing careers in management and consulting ahead. Is this good for the industry? I doubt it.

For me, the purpose of a CS course is to make people reasonable at programming with an advanced knowledge about the remainder of the field. It's easier to teach a programmer formal CS methods than to teach a formal methods guy to program.


Funny, I always thought it the other way around. I struggle whenever I try to teach myself any formal/theoretical stuff because without the aid of a teacher checking my work I'm never sure if I really understand or not.

I never had a problem learning programming languages on my own by following examples though.


I think you'll find that in CS, there's often if not always a way to implement or approximate the theory in code. To me, that's the more interesting part, but it requires self-learning because there are very few teachers who can blend theory into practical terms. Like the original EE intuition examples, perhaps it's something you appreciate after 10 years in the field.


Ouch. Does that imply that finishing in the top 80% of your class puts you somewhere near the bottom 33% of developers?


You are assuming that the best CS majors must obviously want to be commercial developers.

People go to grad school in other disciplines (sciences, stats, etc.) which can take advantage of computation, for one thing.


Wouldn't that depend on the correlation between class grades and developer skill? I have no idea about that.


I'm curious - what do the remaining 70% do?


Various things; a common aspiration was to try and fast-track to a management position at a consultancy or blue chip where things are more about spreadsheets than code. Others do technical work, but not strictly development, like Q&A or technical support.

Some went into academia and some did other things altogether, I know at least one person who became a tree surgeon and another who is a session musician.


Tech support and a CS degree?!? It might just be me, but I can't really see a person with a bachelor's or master's in CS working in tech support.


There are more jobs available, especially in places without software companies. Or people take those jobs temporarily while they look for other jobs, get promoted and end up following a different career track.


If Q&A is QA then it should be considered development, in a specialized domain.


Except you just click things on a GUI there.


I'm fond of developer-QA folks that automate most of their pointing and clicking. There is, absolutely, a skill in finding bugs by exploratory testing, but someone who can combine that with automating previous tests (thus building up a regression suite) in a way that's maintainable is a real keeper.


I think you mean "except when..." As others said, QA automation is a serious endeavor, and an important one.


Unless you're writing test automation.


The debugger should be a concept to be taught, just like the compiler, linker, interpreter, assembler and so on. There should be at least a few lab hours where students are given existing programs and need to go for a bug hunt. This should include strace and valgrind, two essential tools that will probably not be outdated for quite a while.

Now, where I disagree is the IDE part. I have nothing against IDEs per se, and I think everyone should be able to choose the tools they can work with best once they start a professional career (hence why I like the concept behind cmake, btw). But I think it's a bad idea to start off with this as a student. An IDE comes along as an inseparable entity, and it's hard to distinguish the essential components behind it if that's the only thing you practice developing with. Students should learn what those components do from scratch, and that's easier when you can see them as Lego bricks in front of you. IMO it's also a more satisfying learning experience than having MS, the Eclipse Foundation or anyone else holding your hand while you take your baby steps.


I see this as completely backwards. Tools like cmake are often worse for beginners because they have no idea how to understand them, and it involves learning another new language in addition to whatever programming language they're learning. Looking at something like Xcode or Visual Studio or Eclipse, where they can see "these are the source files that comprise my project," is much easier.

I do agree that learning the difference between the editor, compiler, the linker, and various other pieces is important, but I think there are better ways to do it than trying to understand the unix build tools right off the bat.

FWIW, my first exposure to all of this was in high school where we were taught Turbo Pascal on DOS. It was essentially a text-based IDE, but the teacher still taught us about the editor, the compiler, the linker, etc. and it made perfect sense. When I went to use the unix tools in college, it was quite confusing.


I think it's important to hit the ground running using the best tools available.

"Students should learn what those components do from scratch"

Yeah, they will eventually. When they take a compilers class, they'll understand how the compiler works. It's okay to have that be "magic" in the mean time. They should be focusing on learning about OO principles, and data structures, and algorithms and not futzing with command line tools.


> They honest to god think CS is an actually hard science on par with math and physics.

Theoretical computer science is a branch of mathematics. Don't dismiss the whole field just because your university chooses to underrepresent the theoretical side.


Plenty of people still use Ant. It's the path of least resistance where I work, since we have a lot of tooling and preexisting code that use it.


On that note plenty of people still use svn for the same reasons.


As an update: Gradle, which is easier to use than Maven, is gradually replacing Ant and has its sights on Make next. See Android, for example.


It also gives a huge advantage to students who've been coding for a few years, especially if they've been involved in open source projects. I used source control tools for my programming classes at uni (not a CS major) and it made me 50% more productive, easily.


We didn't even learn what source control is, let alone a tool like Subversion. We used it because some of us were familiar with it, but tools in general weren't taught.


I was taught version control. In fact, university made very sure to emphasise proper source control in our second year group project, even to the extent of making one person in each group the "code librarian", responsible for making sure the code of the various members fitted together, was properly controlled, and safely backed up (which was me).

And of course the nightmare situation occurred. The project manager (read: no !*&^£ clue) "accidentally" deleted the entire source control repository. Reconstructed from backups from three hours previously, plus changes from people's working copies, within half an hour.


Print statements for debugging is not necessarily a bad thing: http://www.scott-a-s.com/traces-vs-snapshots/


Agreed! In the same vein, I recently wrote about both debugging vs. logging [1] and session-based logging [2].

[1] http://henrikwarne.com/2014/01/01/finding-bugs-debugger-vers...

[2] http://henrikwarne.com/2014/01/21/session-based-logging/


Neat, we hit on many of the same points.


Yep!


It's certainly not exhaustive, but Andreas Zeller's Udacity course[1] has helped me bring a more systematic, methodical approach to my debugging sessions. In addition, his focus on using tools to assist our debugging effort has encouraged me to automate a lot of what I would previously have done manually. I'll admit my first instinct is still to throw in a couple of printf statements and see what they say, but for harder problems I now have a richer set of tools to work with.

[1] https://www.udacity.com/course/cs259



Those course videos have a lite-ASMR-inducing quality to them, if you are into that sort of thing.


> I’m no great teacher, but I was able to get all but one of the office hour regulars up to speed over the course of the semester.

The first part of the sentence is in direct contradiction to the second part. If you did that, it means you had patience, were willing to explain, re-explain, and find alternative ways to explain. And you cared.


So much truth here. Especially the fact that you cared.

As someone who had his share of poor TAs and was a TA, caring is what sets apart the good teachers from the poor ones. Taking the time to sit and figure out the fundamental misunderstanding and then a good way around it is essential. It may look obvious to you, but the other person is obviously lost. The way to solve that isn't some special power or teaching acumen, but taking the time to figure out the core problem.

Aptly, it often involves debugging the person! Figure out where the chain of thought broke down!


Yes, very much. It requires a deep understanding of the material, empathy, and abstract thinking.

You need a deep understanding of the material, obviously. You need to have a model of the correct thing in your head. The empathy bit comes in from first caring, but also being able to quiz the student's understanding so you can start to understand what the (incorrect) model is in their head.

Then you need to figure out how to build a bridge between their model, and the correct one. And you may need to build multiple bridges if the first one doesn't take.


The author mentions the answer in the first paragraph:

    ECE 352 was the weedout class that would be responsible 
    for much of the damage.
Weed out courses are common in undergraduate education. In some disciplines the logic is plausible because the resources available for upper division undergraduates are limited and vast numbers of Freshmen choose the track. The first year biology sequence is a good example.

The problem is that weed out classes substitute the easier task of deciding who is prepared for the harder task of preparing students. The author's success in the class was primarily due to the skills and knowledge they brought to the first lecture: relative success in the class was not a result of what was taught but of what was not taught. The "difficult" final still only required a few hours of coding, not the habit of coding every day for several hours.

In my field, architecture, structures is the notorious weed out class at the undergraduate level, and truss design is the rock which dumps all but an arbitrary few into the sea. Architects don't design trusses in contemporary practice. Hell, outside of engineers employed by fabricators, structural engineers working on buildings rarely design trusses (mostly they specify their structural requirements).

But my structures course was different. I studied architecture in grad school. Grad schools weed out during admissions, and afterward their goal is largely to prove they made the correct decision. Having been in the industry before schooling, I could see that my structures course taught the important elements for practising and skipped what was difficult for the sake of difficulty.

Interestingly, participating in Coursera classes, I have seen that the more accomplished teachers tend to focus more on what is important and less on how hard the class should be. When the instructor wrote the book, maybe they know the ideas are hard enough on their own.


> The problem is that weed out classes substitute the easier task of deciding who is prepared for the harder task of preparing students.

I don't think this is the intent, at least in some disciplines. For example, I minored in math and went through the weedout class. It wasn't intentionally difficult (for some it wasn't even difficult). It weeded people out because it was the first class where you actually learn what math is - proofs, mostly. The people who thought it was going to just be algebra/calculus backed out.

I think maybe an important point is that almost no one fails the course - it was structured and taught such that anyone who cared could get through it. People backed out because it wasn't what they thought it was, not because it was too hard.


Not all schools do that, however. Some have weedout classes with 30% or higher failure rates (I'm looking at you, GA Tech!). This wasn't just a wakeup call with a C, "Oh, maybe CS isn't for me." It was more, "Here, we threw this course together for the 3rd CS course, we'll poorly introduce terminology and use TAs (some were awesome) who are only one semester ahead of you and terrible teachers. Your next 3 courses, that depend on this, will be much easier and better taught. Good luck!" And then 30% or more would fail.


The difference is that in the math class, what sent people away was not a lack of success but a lack of interest. The people who remained at least found the idea of performing proofs better than a sharp stick in the eye. In the author's class, a person could be interested and passionate about coding and not succeed, while a person who felt that a few hours solving a homework was a lot of time spent coding could.


Right, I get that. My point was that there is a legitimate way to weed people out of a major, and that lumping all weedout classes together as evil isn't really fair.


If every professor feels compelled to thwart the admissions policies by failing an inordinate number of students, either the university should re-think its admissions policies, or the professors should find other work.


I read Agans's Debugging: The 9 Indispensable Rules for Finding Even the Most Elusive Software and Hardware Problems back when I was in high school and still pretty new to programming. I learned a lot from it, and still recommend it.

When I was in college the administrators were redesigning the CS curriculum (and are probably still fiddling with it). One of the newer classes they had come up with was a mandatory lab course on debugging.

Sure enough, they had everyone read Agans and emphasized thinking of debugging as an application of the Scientific Method. There were also constructed debugging exercises to solve too, but given how hard it is to deliberately craft good bugs that are nevertheless still predictably solvable by beginners within the time frame of a lab session, I think most of the value that people got out of the course was from reading the book.


I second this recommendation. It's the best description I've found of how to systematically approach debugging. I read it after I already had lots of industry experience, but still found it useful because it provides a "formal" description for what is usually an ad-hoc process. It's a useful way to think about things.


It's not debugging as much as what us old timers call troubleshooting. At the university I had an insane advantage over those folks that had never seen a Simpson multimeter nor soldered a component nor seen a schematic. I had spent my teenage years building radios and computers followed by formal training in the military. And the Navy taught us how to troubleshoot large systems down to the component level. So college was mostly easy and extra beer money was earned fixing other students electronic devices and tutoring.


The short answer is because they're supposed to figure it out themselves.

I've been on the teaching staff of CIS 120 and CIS 121 at Penn for five semesters now (the second and third classes in the intro computer science series). Both have units on debugging, both in Eclipse and independently of an IDE. We teach students proper testing fundamentals, how to use the debugger, and how bottom-up design can make debugging much simpler.

However, it's up to the students to actually use these tools and techniques to figure out why they're so much more effective than print statements. (After all, could you imagine taking French while never having to speak or listen? Piano while never touching an instrument? Same principle here.) Some students will learn to use these tools quickly, but others can still logic through these courses.

However, everyone eventually learns. Take CIS 380, our operating systems class. The final project involves building a simulation OS with kernel, scheduler, file system, shell, etc. Pretty much all system calls aside from read, write, malloc, and free are written from scratch. Even the hotshots that didn't need to learn how to debug eventually learn to use gdb, because you simply can't logic your way through several thousand lines of C, even if you did write them all yourself.

Any computer science or engineering curriculum that doesn't teach debugging fundamentals does students a disservice. However, we can't directly force students to practice these skills. It's up to them to try applying these techniques.


I agree. Though writing my assignments for a C course was made so much easier thanks to Xcode and the ability to quickly view what's in memory at a breakpoint. Sadly, the course never mentioned debuggers -- likely as a way to weed out students. I suspect this has changed since, but it's up to the prof, so who knows. Part of it is that few profs understand the practical reasons to favour some tools over others -- if you don't write the code day in and day out, you won't know which tools to recommend when. I think maybe there's room for a "practice of programming" course on methodology, tools and open source licensing.


This is by no means isolated to engineering or mathematics either. Nearly all students, especially in higher education, will have gaps in their understanding of topics they are expected to just know. The classroom simply isn't able to personalise learning to each student in the way that's needed. The only thing that really can help here is the clever application of technology.

This is part of the problem that I'm working to solve with my startup Ellumia. Some others are making great strides in areas like language learning, but there's still lots to be tackled. We're building a mobile platform that blends bite-sized content with a highly personalised, social experience – something that would be immensely helpful to the students mentioned in this post. If you're interested in innovative learning methods, please check us out: http://www.ellumia.com


Um, what you're doing might be super-duper valuable, but "bite-sized personalized social innovative learning", frankly, sounds more like a game of buzzword bingo than like a product description.


That's a valid criticism.

Specifically, we offer courses that are far shorter and more targeted than a "traditional" offering (or one that you'd find on a MOOC). Content is broken down to the concept level and available in multiple presentations. This facilitates delivering content that is most suited to a learner's personal style, as well as spaced repetition, whereby a student is exposed to the same information multiple times over a period of time, in an altered format or context. As I mentioned, this is on mobile, a space in higher learning that's largely ignored aside from educational games.

I hope that is a little less buzzword-heavy, though I suspect it may still not be as BS-free as it could be. Learning what messages work best for different audiences is all part of the fun with a startup…


Late-90s UW Engineering grad here. First mistake: as a non-EE, I thought I would enjoy this course as an elective. Second mistake: not attending lecture. Third mistake: waiting till the last minute to do the final project. Bad times. It really rang true to me. As a developer today, looking back, the 'hardest' part of my CS and ECE courses there was that they completely focused on the theory, and not the practice. Knowing how to debug my code would maybe have made the difference. At the time, I was told in the lab by a TA that we were being left to 'sink or swim' so as to weed out the scrubs. This, combined with requiring us to sign a 'no cooperation or sharing' pledge, made the experience a failing one. It took me nearly 5 years after school to rekindle the interest.


Well, I guess students are supposed to figure out the things they are not taught for themselves, maybe with a little bit of google-fu, but I suspect the number of things the professors require them to figure out takes more time than they have before the next course.

I regard universities as the place you go when you already know most of what you need and just have to get the papers to prove it, and maybe get better in the areas you missed.

But if you go into university not knowing most of what they are going to teach you, you are going to have a hard time; there just aren't enough hours in a day.

That's how I see it anyway, because every time I learn something I usually end up researching more than I need for that specific assignment and spending an entire day on just one thing, maybe even more.


Engineering programs are designed to weed people out and give the ones who stay the ability to teach themselves new things. Debugging is an example of one of those new things. If you go to school and have your hand held through the entire process you won't be very well equipped to function in the real world where it truly can be sink or swim.

My Chemical Engineering program was designed around the approach of "give the students homework with no instruction and review the problem sets after they are due". This was tough to handle and weeded a lot of people out every semester yet was a great introduction to working in the real world where a lot of times you have to figure out complex situations with little to no help from anyone else.


This is an excellent essay. I remember sitting in my parents’ basement in high school with a fully assembled electronics board that I’d soldered together from a schematic, and no idea what to do about the fact that it didn’t work. Had I understood that learning how to build involved learning how to debug, I’d probably have built more things and fewer programs (for better or worse).

I must have known this about software and math, in order to be doing what I was doing in those areas then, but I hadn't figured out how to generalize it to {electronics, mechanical objects, project plans, social interactions, habits and motivational systems, and creative writing} — and only partially to writing essays.


While it has been a number of years, I took CS and EE classes at Michigan State University, Tri-State University, MIT, and Notre Dame.

These are all very different schools, but all of them taught debugging. At Tri-State one of the classes was even dedicated to debugging.

I think the lack of teaching these skills may be because more and more schools are degree mills, and more and more the teachers have only ever worked in a classroom, so they don't know how to do things outside of the text.

But I have a hard time believing that MIT has dropped debugging, or that Tri-State which is an Engineering school has either.


I find the professor's attitude quite puzzling. You often learn best when you have made a mistake, and then figure it out ("expectation failure"). Why not take advantage of that? I think bugs are quite useful as "teachers", as I discussed in "4 Reasons Why Bugs Are Good For You" http://henrikwarne.com/2012/10/21/4-reasons-why-bugs-are-goo...


henrik_w, the professor's attitude is not puzzling once you consider this... the teacher in question intentionally wants to make the students struggle, because it validates their own achievements.

Then you will see that it's not about the teacher doing the most to help the student succeed, it's all about the teacher's ego. Of course such an attitude is not one worth promoting, but everything in the article suggests this is what we're seeing.


The ability to debug is really a function of general domain knowledge, "skill" at accessing documentation and/or examples, and a good understanding of experimentation. It's difficult to teach, as different coders approach problems in different ways. First, I'll explain what I think makes one good at debugging:

1. Domain knowledge, which I'll consider code that the programmer has either written herself or has otherwise become intimately familiar with, cannot really be taught. But you can at least prepare a bit so you don't get lost in the woods. I think it's good practise to understand the basic file structure of what you're working on and, generally, what is in the files you might be working with. More detailed knowledge of code will come through being able to trace data through files. This is not trivial in larger code bases; again, I recommend using notes or whiteboards. This knowledge is necessarily lower when you first encounter significant code that is not your own, and people who can leverage their other skills tend to do better before acquiring knowledge of the surrounding code.

2. I say "skill" at finding documentation in quotes because sometimes, good documentation does not exist. But the only way to increase your knowledge of what's happening in your programs is to start figuring out what the components you are using actually do. Generally, newer programmers tend to find that the code they are writing doesn't operate the way they hope because they have made a mistake in arguments or syntax. Examples really help alleviate this. I'm not certain this can be taught very well except for pointing students towards documentation, but good debuggers tend to understand the benefit innately before too long.

3. If you're having problems with code that you think should be working, you need to be tracing data. At each step, you should know what your variables are and what you expect them to be. This is not trivial, and it can be very difficult to determine just where something is going wrong. But this is also the heart of debugging: being able to figure out where your signals are getting turned around.
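A minimal sketch of what "know what your variables are and what you expect them to be" can look like in code (hypothetical names throughout):

    #include <cassert>
    #include <iostream>

    // Hypothetical pipeline step: scale a raw reading into [0, 1].
    double normalize(double raw, double scale) {
        assert(scale > 0.0 && "expected a positive scale here");
        double result = raw / scale;
        // Trace what the variables are...
        std::cerr << "normalize: raw=" << raw << " scale=" << scale
                  << " result=" << result << "\n";
        // ...and assert what you expect them to be.
        assert(result >= 0.0 && result <= 1.0 && "expected a value in [0, 1]");
        return result;
    }

    int main() { normalize(3.0, 10.0); return 0; }

The first trace or assert that disagrees with your expectation tells you where the signal got turned around.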

When you have nothing else, you can just start testing components for correctness and model a full cycle of your algorithms. I don't think this can be taught, because many bugs are not like one another. Over time, you might grow to recognise code that is more likely to produce bugs, but I don't think there is a teachable process for finding bugs. Sometimes, things just be broke. But knowing the manner in which things fail, you can find solutions that bypass the sorts of code that cause errors. I'm not saying this is easy, but I'm saying that it's something that you kind of need to learn for yourself.


In the EE/CE program I went to, the first class you were required to take was "Intro to Digital Systems". It consisted of a little bit of classroom basics (K-maps, binary/hex/octal math, gate logic, etc.) mixed with a lab that took a breadboard and applied the knowledge you learned in the classroom to make blinky lights/sounds. Not very impressive, but in the course of making the light blink you learn rules for debugging systems that are applicable everywhere (change only one thing, test after every change, think about what each change means in terms of expected output given an input, etc.). If you know how to do basic electronics going in, you have an easy A and a bunch of free time. If you are learning it all for the first time, you probably spend 10-15 hours a week on a 3-hour lab until it clicks.

The class had something like a 90% pass rate, so I don't think this is something that can't be taught with a mixture of normal classroom time plus TA and professor help in labs.


I wish school taught reverse engineering. Or perhaps I should say breaking things apart. You often need to reuse software written by someone outside your team and soon realize the software either has bugs you have to fix yourself or needs features you must add.

I've never seen a class on reverse engineering.


Troubleshooting, as important as it is to engineering and computer science, is a general skill that should be taught to kids starting in pre-school.

I'd argue that you can reduce all non-creative disciplines (science, business, plumbing, etc.) to a specific area of troubleshooting. And for the creative and athletic disciplines, troubleshooting will be needed to get the best output from your tools, be they chisels, oboes, your voice, your muscles, paints, or anything.


It's a top secret recruiting method. Only the top 1% learn to debug code or hardware on their own. Since it's a secret, it's still not gamed in the interview process by the me-too crowd. Please downvote this story; you will make recruiting much harder.


The best way to find the source of a problem can be summed up in two words: binary search. On the other hand, having witnessed humans physically search for things in ordered collections, it seems we're instinctively programmed to perform a linear search.
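
A sketch of the idea in C++ (the names and the toy predicate are mine, purely for illustration): test the midpoint of the region that could contain the fault, and halve the search space each time.

    #include <iostream>

    // Find the first index in [lo, hi) where `broken` returns true,
    // assuming everything before the fault tests good and everything
    // after it tests bad (e.g. index = commit number, or how far
    // through a pipeline you run before checking the output).
    template <typename Pred>
    int first_bad(int lo, int hi, Pred broken) {
        while (lo < hi) {
            int mid = lo + (hi - lo) / 2;
            if (broken(mid)) hi = mid;   // fault is at mid or earlier
            else lo = mid + 1;           // fault is after mid
        }
        return lo;
    }

    int main() {
        // Toy stand-in: pretend everything from index 42 on is broken.
        std::cout << first_bad(0, 100, [](int i) { return i >= 42; }) << "\n";
    }

This is also what git bisect automates over a commit history.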


In my experience, good logging is the first step to good debugging. Having good logs lets you find a lot of problems without even debugging "properly" (with a debugger, say). http://henrikwarne.com/2014/01/01/finding-bugs-debugger-vers...
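
A minimal sketch of what "good" means here (my illustration, not from the linked post): a useful log line records when, where, and the values involved, so you can reconstruct what happened after the fact.

    #include <chrono>
    #include <iostream>
    #include <string>

    // Log a timestamp, a location, and the relevant values --
    // not just "something went wrong".
    void log_line(const std::string& where, const std::string& detail) {
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
            std::chrono::system_clock::now().time_since_epoch()).count();
        std::cerr << ms << " [" << where << "] " << detail << "\n";
    }

    int main() {
        log_line("retry_handler", "attempt=3 backoff_ms=800 host=db-1");
    }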


Good logs have been especially useful while integrating new motion controls in our wafer handling robot. We have a vendor whose product sometimes does not respond correctly to motion commands. Before we logged each movement command, debugging a weird movement error meant repeatedly moving the robot until the error happened again. Now I just pull the last N commands out of the log file, reproduce the issue, and send a script to the vendor so they can fix it.
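
The shape of it, in a C++ sketch (the class and names are my invention, assuming a bounded history is enough): record every command as it is issued, keep only the most recent N, and dump them when a failure needs to be reported.

    #include <cstddef>
    #include <deque>
    #include <fstream>
    #include <string>

    // Keep the last N motion commands so a bug report can show the
    // vendor exactly what their controller was told to do.
    class CommandLog {
        std::deque<std::string> recent_;
        std::size_t max_;
    public:
        explicit CommandLog(std::size_t max_entries) : max_(max_entries) {}

        void record(const std::string& cmd) {
            recent_.push_back(cmd);
            if (recent_.size() > max_) recent_.pop_front();
        }

        // Dump the recent history when something goes wrong.
        void dump(const std::string& path) const {
            std::ofstream out(path);
            for (const auto& cmd : recent_) out << cmd << "\n";
        }
    };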


Somewhat true. Though intuition about the nature of the problem may give you a much better place to start looking than right in the middle.

If there's one particular module that is often at fault during a particular scenario, you can often start there rather than at the midpoint.


I really enjoyed this post, having attended the same school and taken some of the same classes. I'm actually currently in my final months of a mechanical engineering B.S. at Madison, and I think the problem goes beyond a "hazing" of freshmen.

For me, my ECE 352 was Introduction to Dynamic Systems. For anyone here without a physics background, it's basically an introduction to "more real" mathematical modeling of physical systems, beyond what you would learn in your first few semesters of college physics. Think "applied differential equations."

This class naturally involved some fairly high-level math compared to what we had been using before, and I think the professors honestly didn't know (or had forgotten) how to transition into it. Suddenly the class was using things that students might have done one or two homework assignments on in differential calculus a year ago, but there was never any "here are the overarching concepts used in this class and how they fit into the two years of calculus you have already taken." The class was just dumped into the Laplace-transform deep end and left to memorize steps to solve problems.

I think the problem was that the professor had honestly forgotten what it was like to not have an intuitive grasp of what a Laplace transform did or how to linearize a dynamic system. I've found myself falling into almost the exact same trap when teaching people object-oriented programming, for example. To an experienced programmer, an object is the most natural thing in the world: you can pass them around, perform operations on them, and they just "work" for you. It's so simple for you that it's hard to remember it may be a completely foreign concept to someone brand new.

The thing is, it's comparatively easy to just teach someone the steps to solve a specific problem when the alternative is teaching them to think in such a way that they could solve it on their own. It's the same with debugging, it's the same with math, and with programming. I spent the rest of that semester watching the professor spend every single lecture doing one or two difficult examples, and nothing else. Teaching someone so that they can gain a deep understanding of a given subject is really hard, and I think it's the rare school and professor that can do it effectively.

I ended up doing well in that class, but it was really by rote memorization of every single mathematical "scenario" that I thought likely to be on the exams. It wasn't until a year or so later that dynamic systems finally clicked into place, and it was because a professor in an unrelated class happened to spend 45 minutes on a good intuitive explanation of what they meant.
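
For what it's worth, the missing overarching idea fits in two lines (my summary, not that professor's actual explanation): the Laplace transform turns calculus into algebra.

    \mathcal{L}\{f(t)\} = F(s) = \int_0^\infty f(t)\, e^{-st}\, dt,
    \qquad \mathcal{L}\{f'(t)\} = s\,F(s) - f(0)

Differentiation in time becomes multiplication by s, so a linear differential equation in t becomes a polynomial equation in s that you can solve and then transform back.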

I think teaching real systematic debugging is similar in that it's something that requires a real mental investment on the part of the teacher. That's not to say it's impossible, because it certainly isn't. It just requires someone to make the investment and explain the why as well as the how.


The 6.002 midterm at MIT in '91 had a problem trivial to solve with a Laplace transform... actually, it had a couple. Of course, Laplace transforms would be taught in the second half. The few students who already knew the transform, or somehow knew to read ahead, did well. I spent 20 minutes on one problem solving the differential equations from scratch without the damn transform. I felt sorry for those who weren't taking it as freshmen (all classes your first year at MIT were pass/fail).

That guy was a useless jerk (as a teacher). I did have professors who were useful jerks: difficult homework every single class, with spot-on directed feedback by the very next class. Unfortunately, teaching well at a research university is just a mark of being eccentric; it seems to be negatively correlated with getting tenure. The "useful jerk" already had tenure. He was mean, but an awesome teacher.


> I've found myself falling into almost the exact same trap when teaching people object oriented programming for example.

At least in my experience, a huge issue in teaching OOP is that the introductory examples are not compelling examples of class design (DiagonalMatrix and IdentityMatrix are both Matrices! Dogs and Cats are both Animals!). Students don't really understand the pros and cons of OOP from those sorts of examples.

Perhaps due to this, I would attempt to do OOP by creating taxonomies that modeled the ones in the real world. It wasn't until well into my professional career that I realized this is an anti-pattern in most cases.

Though OOP best practices have matured since I was in school (interfaces over base classes are encouraged, deep hierarchies and reuse through inheritance are discouraged, functional programming is being integrated), so perhaps things are better now than they used to be.
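
To make that concrete, a toy C++ sketch of the flatter style (illustrative, not anyone's real design): a small abstract interface implemented directly, instead of an Animal -> Pet -> Dog taxonomy.

    #include <iostream>
    #include <memory>
    #include <vector>

    // One narrow interface; implementations don't inherit from each other.
    struct Drawable {
        virtual ~Drawable() = default;
        virtual void draw() const = 0;
    };

    struct Circle : Drawable {
        void draw() const override { std::cout << "circle\n"; }
    };

    struct Label : Drawable {
        void draw() const override { std::cout << "label\n"; }
    };

    int main() {
        std::vector<std::unique_ptr<Drawable>> scene;
        scene.push_back(std::make_unique<Circle>());
        scene.push_back(std::make_unique<Label>());
        for (const auto& d : scene) d->draw();
    }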


Debugging isn't a science that can be taught. In this particular case, everything was made up of the same things at the fundamental level, so it was an easy rabbit hole to follow, i.e. since one of these inputs is wrong, one of these other inputs must be wrong, and so on all the way down.

Debugging a more complex system depends on the system in question. Sure, you can teach people to set breakpoints and run the debugger, but that's about it. Learning how to observe what is going on in any particular system is something you learn by banging your head against that system for a long time.


>> I’ve even seen people talented enough to breeze through the entire degree without ever running into a problem too big to intuitively understand; those people have a very bad time when they run into a 10 million line codebase in the real world.

Reminds me of the classic Eric S. Raymond email to Linus Torvalds, where he says essentially this. http://www.vanadac.com/~dajhorn/novelties/ESR%20-%20Curse%20...


Okay, a real-life example...

On mobile operating systems, the application lifecycle and the OS application sandbox collide to give you a nice Jabberwocky of debug output; take Android, for example.

The way I dealt with it was to use some example Android apps to learn what the normal OS shows in the debug output as the OS components start up and then hand off to the application.

Now I can always look at logcat, see where things are hanging, and know where to look in my code based on which OS components I see reporting in.


During my education as an electrical technician ("Lehre") in Switzerland, we had a lot of exercises to learn debugging. One was that we got a black box with some pins and some maximum ratings, and we had to find out what circuit was inside. Or we got a circuit with a defined function and had to find out whether it worked as expected. And so on. This was also a big part of our exams.

Then at university, the education shifted to a very theoretical approach.


Back when I was a CS student at JHU, there were some very strange divides in how some professors viewed "programming vs. CS" (divides I only got a taste of as an undergrad; the real opinions didn't come out until I was talking to the professors after my tenure as a student). What I would call useful "life skills" (debugging, good code style, VCS, various development workflows), they called "a waste of time getting in the way of their study of PURE CS(tm)". Certain professors foster some of these skills; certain professors pretend none of them exist at all. So it's a grab bag of what skills you'll end up graduating with, aside from those you take the classes for (which, credit where it's due, and with the giant qualifier of "within higher-end CS", was never terrible).


The more important issue, at least in my case, is that schools don't teach programming at all, only programming languages. Programming is what you learn on your own while you write code; teachers don't help in that process.


Yeah, there are two fundamental things that are not taught well in many (not all) computer science programs.

1. Debugging as the author explains.

2. How to properly use version control and work with others using version control on the same codebase. Hopefully Github for education will help with this one.


The author is talking about engineering classes, not CS. That being said, I would completely agree with you, having been through a CS degree myself. Version control, and working in a group with it, would have been very beneficial in school. Although when I was in school that would have meant svn, not git, which I have grown to love but felt the pain of having to learn the hard way.


Who the heck went through a CS degree without using version control? We used it from day 1 until graduation, and you didn't get credit if you didn't use version control, even on the projects you did yourself. How does one do group projects in school without version control?

Git didn't exist when I was in school either.


If by 'version control' you mean 'delete and resubmit zip files into Moodle' then sure... my courses use version control...

>How does one do group projects in school without version control?

Dropbox. If you're lucky. Otherwise, attaching files through gmail.

FWIW I wish Git were part of my curriculum. I also wish the Java curriculum included Android. You gotta take what they give I guess.


I'm old. Dropbox did NOT exist when I was in school. Neither did Gmail...

> If by 'version control' you mean 'delete and resubmit zip files into Moodle' then sure... my courses use version control...

No, I meant real version control. When I started, CVS was standard.


If it helps, the last time I was in a CS class before this current iteration (more or less to improve my career options) we were playing around with DOSSHELL and Turtle Graphics.


That is great! I wish that was more common.


RPI covered using version control and debugging. www.rpi.edu


Other skills not taught in a CS degree:

1. Version control
2. Unit testing


I learned gdb in our systems/assembly course.


Because it isn't fun or exciting.


What do you mean? It is both fun and exciting. It's like solving little mini-mysteries. Hmm, how come this variable is null here? That shouldn't be possible! But it is, so how did that happen? Mystery!

You can also learn from it. Often a bug is caused by some misunderstanding, so figuring that out teaches you something. Here's an example where I learned several things finding and fixing a bug: http://henrikwarne.com/2014/01/27/a-bug-a-trace-a-test-a-twi...


It does teach you a lot. I spent ages wondering why my derived class was blowing up on destruction in C++ - the parent destructor was not virtual and I was deleting the object through a pointer to the base type. Working this out left a big burn mark in my brain not to do that again. Debugging really is good fun and helps with learning.
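
For anyone who hasn't hit this one, a minimal reproduction (illustrative names, not the poster's code):

    #include <iostream>

    struct Base {
        // Bug: non-virtual destructor. Deleting a Derived through a
        // Base* is undefined behaviour, and ~Derived() never runs.
        // Fix: declare it `virtual ~Base() { ... }`.
        ~Base() { std::cout << "~Base\n"; }
    };

    struct Derived : Base {
        ~Derived() { std::cout << "~Derived\n"; }  // skipped below
    };

    int main() {
        Base* p = new Derived();
        delete p;  // typically prints only "~Base"
    }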


Perhaps I'm sick in the mind, but I really like debugging, especially the hard ones.

Recently our stack fell over when we did a production release. Basically, web worker threads were spawning more worker threads, and this crashed our AOP proxy (Castle.Core), which uses some global state and was spinlocking for some reason. We ended up with everything whacked at 100% across the cluster. Two hours into the problem, it turned out to be something completely different: a for loop on another thread that didn't terminate and took up one core per node, causing CPU contention. The spinlocks just amplified the problem.

One change to the loop continuation clause and wham -- fixed.
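
Not the actual loop, but a classic shape of that bug, sketched in C++: a continuation clause that can never become false, so the thread spins on one core forever.

    // `i` is unsigned, so `i >= 0` is always true: when i hits 0,
    // --i wraps around to UINT_MAX and the loop never exits.
    void spin_forever() {
        for (unsigned i = 10; i >= 0; --i) { /* ... work ... */ }
    }

    // One change to the continuation clause and it terminates
    // (counts i = 10 down to 0, then stops):
    void terminates() {
        for (unsigned i = 11; i-- > 0;) { /* ... work ... */ }
    }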

The journey is exciting and you learn so much on the way. It's great.


One of the best experiences of my engineering career was being on top of a 10-story sensor tower at midnight in a Seattle drizzle trying to get some custom sensor probes attached to a radar to work properly because we were chasing a very elusive but potentially catastrophic failure mode we didn't understand.

Yeah, I'm a little... different.


Problem solving isn't fun or exciting? Sounds like you are doing it wrong.



