Hacker News
Falsehoods CS Students (Still) Believe Upon Graduating (netmeister.org)
53 points by lolptdr on Nov 11, 2019 | 39 comments

Bah, sprinkling printf statements is a quality debugging technique. Sometimes everything really is a nail.

It seems to me that there are two schools of thought: one school that uses debuggers, the other that uses printf statements and their equivalents. Neither school is absolutely wrong.

I was once told a quote (source unknown, so any butchering is mine), something like: "When I was a young programmer, I relied on print statements to debug. When I learned more, I used a debugger, and when I learned even more, I used print statements."

I always took that to mean dogma is no substitute for results, so I use whichever works better for the problem at hand. (Or at least I try to. Often this means noticing when one approach isn't working well and being willing to switch, which means I'd been making the worse choice for a while.)

There's a third school that does both, depending on the circumstances.

Exactly! I’ve dealt with multiple real-time embedded platforms that have a log output channel that is waaay more convenient to use via printfs than hooking up a whole debug apparatus.

Though I've had the super rare occasion where the bug actually IS the code behind the print function, usually in experimental PL/OS stuff. There's a sort of turtles all the way down feeling the first time it happens.

Gotten bit by that before in Ruby, which attempted to pretty print objects by recursively enumerating their properties and printing them. One mutually recursive data structure later...

Well gdb is pretty hard to use. I learned how to step through programs recently but I'm still not confident using it. The fact that printing variables is still the quickest and easiest way to gain insight into a running program's state speaks volumes.

To use gdb, I have to recompile all code with -g, run gdb, set up breakpoints and variable printing, run the program in gdb and then slowly step through it.

Alternatively, I can just write this somewhere:

  fprintf(stderr, "i = %d\n", i);

Agreed — if you're strategic about it, you cut in half the space in which the bug could be hiding each time you add a printf and re-execute.

There are times I use the debugger and times I just print to the console. It really, really depends on the situation. Not to mention that setting the logging level to DEBUG is basically the same as a bunch of printfs, and it's often the best way to debug issues in production.

To say printf is not a "quality debugging technique" is just false.

You ever seen those bugs where adding a printf makes the bug disappear?

Heh. Sounds like race conditions. That or your code is possessed, which I've had happen too.

My personal favorite is programs that seem non-deterministic. Different errors/results for different executions.

Have you ever seen those bugs where attaching a debugger made the bug disappear?

Great list, two nits.

"Open Source means it has fewer bugs and is more secure."

Maybe not in the absolute sense of that expression, but I find it true when you only consider the higher-quality open source projects, say the top 10k. The code that most professional software developers produce, at big and small companies alike, leaves me in awe that their final products work as well as they do.

"Being able to program is the most important aspect of being a good software engineer."

Ehh, I would say this is true. It's certainly not the only aspect, but if you can't code, you just end up being someone who plays office politics all day trying to hide that fact.

> I find it true when you only consider the higher quality open source projects, e.g. say the top 10k open source projects.

How are you defining "top" here? The best quality open source projects are not the most widely used projects -- if anything, I find that the correlation tends to be negative.

The biggest falsehood I believed is that companies care about writing quality code, and that you will get to apply on the job the rigor you learn at uni. On the whole, they don't really care, with some exceptions.

You wouldn't know it from interviews, which is why the falsehood persists.

This is a great list. The arrogance of the recent grad is pretty astounding (in my experience, myself included). I wish more time were spent addressing some of these misconceptions at university.

Great list. I would add: ‘Coding and software development are the same thing’

> Command-line tools should print colorized output.

Just add a "colorless" command-line argument if you need to parse things without ANSI color codes. Better yet, detect when the program is outputting to a pipe and disable colorized output, e.g. like Git: https://unix.stackexchange.com/a/19320

> They will use lots of math in their career.

The validity of this is highly domain-specific.

> 'git' and 'GitHub' are synonymous.

Aside from GitHub-a-likes (e.g. GitLab, BitBucket), what does this mean? I'm assuming that it's that you can use Git by its original use pattern (i.e. without a "single-source-of-truth-plus-issue-tracker-as-a-service" system and more as a "true" DVCS) but very few projects seem to really use it this way. They're important projects, sure (e.g. Linux) but they are few in number.

> Sprinkling printf statements is an efficient debugging technique.

This is...opinionated. As useful as GDB can be in a pinch I have often preferred to just output things to console. `printf` specifically is a bad example, as C doesn't have reflective abilities (out of the box, anyway), making debug-by-manual-print harder.

> Compiler warnings can be ignored, as they'd be errors otherwise.

I smell a Golang programmer...

> Using lambda in Python is a good idea because it shows others you have a CS degree and understand the "Lambda Calculus".

Kinda rolling my eyes at this one. Yes one can "be annoying" with FP-like concepts but lambdas can be very useful in a pinch.

> Object-oriented programming is the best and most common programming paradigm.

> Using a custom written Vector class makes your program object-oriented.

I feel I should add my personal falsehood: "State objects with methods means your program is object-oriented."

CS Students think Steve Jobs was successful because he was a jerk?

Unfortunately, I know people in my class who really believe that people are more productive when they're afraid of getting fired or of missing deadlines :/

That's just MBA canon.

A lot of people who are interested in management (including CS people wanting to make a ton of money) have this idea that 'survival of the fittest' means everything killing everything else all the time.

The line of thinking is that the best way to be successful in the business world is to abuse and destroy everyone possible to show how tough and competitive you are. If you've ever been pushed to compete with your co-workers instead of a competing company, someone higher up is probably thinking that way.

> 6. CS professors know how to program.

> 7. CS professors know how to use a computer / mobile device or how the internet works.

I'm not entirely sure how exposure to CS professors could give anyone that impression.

Reminds me of how a CS professor once proudly showed me that he had managed to send an email with an attachment.

Now if he could show you how to send an email with PGP, that would be impressive.

For him, sending an email with an attachment was very impressive.

Alright, this one eludes me:

> Object-oriented programming is the best and most common programming paradigm.

Sigh... "10. Compiler warnings can be ignored, as they'd be errors otherwise."

If commenting code is bad, then I don't want to be good...

The article talks about 'lots of comments'. The occasional comment can be helpful but many things are just as clearly expressed in code as in comments. E.g., if you write a numerical algorithm you should probably either explain it or add a reference. But if you start to see things like

    DatabaseConnection conn; // Connection to the database
things have gotten very badly out of hand.

One more: "If you're coding on Windows, you're doing it wrong."

Reads more like author's pet peeves than anything supported by data.

> Sprinkling printf statements is an efficient debugging technique.


Wrong thread?

Good catch; we've moved it from https://news.ycombinator.com/item?id=21481931. Thanks!

Here is another one: Python is great for web development.

eeeeh, that sounds more like tribalism; there's plenty of good webapps written in Python... and Ruby, and Go, and PHP, and Perl, and any number of other languages, many of which might be unexpected but work well for the people using them.

That is correct, but some people just tend to adopt Python religiously. Those who understand that the best tool for the task at hand is the best tool for the task at hand are fine.
