I always took that to mean dogma was no substitute for results, so I use whichever works better for the problem at hand. (Or at least I try to. In practice that means noticing when one approach isn't working well and being willing to switch, which implies I was using the worse choice for a while.)
To use gdb, I have to recompile everything with -g, start gdb, set up breakpoints and variable printing, run the program under gdb, and then slowly step through it.
Alternatively, I can just write this somewhere:
fprintf(stderr, "i = %d\n", i);
To say printf is not a "quality debugging technique" is just false.
My personal favorite is programs that seem non-deterministic: different errors or results on different executions.
"Open Source means it has fewer bugs and is more secure."
Maybe not in the absolute meaning of that expression, but I find it true when you only consider the higher quality open source projects, e.g. say the top 10k open source projects. The code that most professional software developers produce, at big and small companies alike, leaves me in awe that their final products work as well as they do.
"Being able to program is the most important aspect of being a good software engineer."
Ehh, I would say this is true. Certainly not the only aspect but if you can't code then you just end up being someone playing office politics all day trying to hide that fact.
How are you defining "top" here? The best quality open source projects are not the most widely used projects -- if anything, I find that the correlation tends to be negative.
Just add a "colorless" command line argument if you need to parse the output without ANSI color codes. Better yet, detect when the program is writing to a pipe and disable colorized output automatically, as Git does: https://unix.stackexchange.com/a/19320
> They will use lots of math in their career.
The validity of this is highly domain-specific.
> 'git' and 'GitHub' are synonymous.
Aside from GitHub-a-likes (e.g. GitLab, Bitbucket), what does this mean? I assume it means you can use Git in its original mode, i.e. as a "true" DVCS rather than through a "single-source-of-truth-plus-issue-tracker-as-a-service" system, but very few projects seem to really use it that way. The ones that do are important (e.g. Linux), but they are few in number.
> Sprinkling printf statements is an efficient debugging technique.
This is...opinionated. As useful as GDB can be in a pinch, I have often preferred to just print things to the console. `printf` specifically is a bad example, though: C has no reflection out of the box, which makes debug-by-manual-print more tedious than in most languages.
> Compiler warnings can be ignored, as they'd be errors otherwise.
I smell a Golang programmer...
> Using lambda in Python is a good idea because it shows others you have a CS degree and understand the "Lambda Calculus".
Kinda rolling my eyes at this one. Yes one can "be annoying" with FP-like concepts but lambdas can be very useful in a pinch.
>Object-oriented programming is the best and most common programming paradigm.
>Using a custom written Vector class makes your program object-oriented.
I feel I should add my personal falsehood: "State objects with methods means your program is object-oriented."
The line of thinking is that the best way to be successful in the business world is to abuse and destroy everyone possible to show how tough and competitive you are. If you've ever been pushed to compete with your co-workers instead of a competing company, someone higher up is probably thinking that way.
> 7. CS professors know how to use a computer / mobile device or how the internet works.
I'm not entirely sure how exposure to CS professors could give anyone that impression.
Reminds me of the CS professor who proudly showed me how he had managed to send an email with an attachment.
DatabaseConnection conn; // Connection to the database