Three of the Hundred Falsehoods CS Students Believe (uni.edu)
248 points by PretzelFisch 30 days ago | 160 comments



I'm mostly amused by this:

> 6. CS professors know how to program.

Back in grad school, we had an Algorithms professor who was my advisor & wrote several important papers. We also had a PL professor who mostly taught Lisp/C/C++ etc. One day the PL professor fell ill, so the Algo prof was sent in as a substitute. He tried to divert the class's attention by going into algorithm details instead of focusing on programming language theory. But homework was due the next week, so students asked him about some C nuance. The prof said, "I don't know, but I'll let you know by next class." Then he hurried back to his room & tried to read up on C, but obviously couldn't make much headway. In those days we had Usenet & that was it. So the prof goes on Usenet & clicks on comp.lang.c. In those days I used to fancy myself a C hacker & I hung around on newsgroups trying to help out newbies. So I see this frustrated question from a newbie, "HOW TO OPEN FILE FROM C ???", and I click on it so I can help this poor miserable bastard. Turns out it is from my professor! The footer had his entire credentials: PhD, list of papers published, teaching career, etc. So everybody on comp.lang.c was mocking this idiot who had a PhD in CS but didn't know how to open a fucking file with C. I was so embarrassed I rushed into his office & demanded to know how he could make a fool of himself like that. "You could have just asked me about fopen, why did you post in public, that too with your real name??!" He was like, "I've never used all this newsgroup stuff, I barely know how to telnet. This was my first post. I thought they would help me, but they are just heaping abuse."


> I was so embarrassed I rushed into his office & demanded how he could make a fool of himself like that.

Why is it shameful to ask for help when you are doing something for the first time? Why such a strong emotional reaction, to the point of berating him, from someone who is not even affected?

Algorithms are a branch of math - it is about proofs and structures and big O. It is about figuring out how to achieve the right big O. Especially in the beginning, it was not necessary to know C in order to study or teach them. Nowadays, they are typically not taught in C either.

Edit: fixed typos


> Algorithms are a branch of math

Well, when viewed mathematically, sure.

> It is about figuring out how to achieve the right big O.

That's hardly the whole story. I'd say the engineering work 'counts' as algorithm work, as much as the computer science.

If you're looking to publish in a top journal and prove how clever you are, an impressive time complexity is fantastic, even if the algorithm is completely impractical for real applications. Case in point at [0].

If you're looking at real-world performance, you don't especially care about time complexity, you care about, well, the real-world performance of your implementation. Maybe you'll improve time complexity too [1] (I suppose that counts as crossing from engineering to science), or maybe you'll just put in the engineering work without making breakthroughs in theoretical terms [2].

[0] https://en.wikipedia.org/wiki/Coppersmith%E2%80%93Winograd_a...

[1] https://en.wikipedia.org/wiki/Timsort

[2] https://news.ycombinator.com/item?id=21459839
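To make the theory-vs-practice point concrete, here is a toy illustration (Python; my own sketch, not taken from the linked articles): for small inputs, an O(n^2) insertion sort can beat an O(n log n) merge sort on constant factors alone, which is exactly why Timsort falls back to insertion sort on short runs.

```python
import random
import timeit

def insertion_sort(a):
    # O(n^2) worst case, but tiny constants and cache-friendly.
    a = a[:]
    for i in range(1, len(a)):
        x, j = a[i], i - 1
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a

def merge_sort(a):
    # O(n log n), but pays for recursion and list allocation.
    if len(a) <= 1:
        return a[:]
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

small = [random.random() for _ in range(16)]
t_ins = timeit.timeit(lambda: insertion_sort(small), number=20000)
t_mrg = timeit.timeit(lambda: merge_sort(small), number=20000)
print(f"insertion: {t_ins:.3f}s  merge: {t_mrg:.3f}s")
```

Timings vary by machine, but on 16-element inputs the insertion sort typically wins despite the worse asymptotics.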


Algorithm courses in CS do not teach those real-world performance considerations. They are about the math part and pseudocode. And I think they should not teach that. Real-world performance is heavily dependent on the technology being used and changes over time; an algorithms course is about basic, platform-neutral algorithms.

Those real-world considerations should be taught, but in courses that are focused on coding itself, by people who specialize in that area, are good at it, and can explain it in depth.

A single prof does not need to be good at everything, and every course does not need to teach everything.


> I think they should not teach that.

I agree that the practical side of things is quite a different specialism - computer architecture and compilers are different topics than the 'pure' study of algorithms. Today's emphasis on concurrency and parallelism does impact the theory, though; the worlds aren't entirely isolated.

I suppose my point is a semantic one: I consider the practical aspects to be a part of the study of algorithms.

> The real world performance issues area is heavily dependent on technology being used and changes over time, algorithm course is about basic platform neutral algorithms.

Well, the Coppersmith-Winograd algorithm will never be practical.

> algorithm course is about basic platform neutral algorithms

There's no such thing. That's why I mentioned Coppersmith-Winograd.

Looking only at the complexity properties of the Coppersmith-Winograd algorithm, you're unlikely to reach the (correct) conclusion that it's entirely useless for any practical application.

> Those real time considerations should be taught, but in more courses that are focused on coding itself and by people who are focused on that area, are good at it and can explain it in depth.

Agree, except to nitpick that it's necessary to develop a solid understanding of computer architecture, which may or may not fall under 'coding'. (That sort of expertise is beyond the knowledge of most coders.) Again, that's just semantics.

> A single prof does not need to be good at everything and every course does not need to teach everything.

Agree completely, but it's important not to oversell the theory.


> There's no such thing. That's why I mentioned Coppersmith-Winograd.

Ok, but that is a purely theoretical nitpick that has nothing whatsoever to do with what we are talking about. Algorithm courses don't teach Coppersmith-Winograd unless you are in some very special course designed for those few people who want exactly that out of interest and curiosity.

> Agree completely, but it's important not to oversell the theory.

The thread is about whether it was shocking for an algo prof not to know C++ a few decades ago. It is not, regardless of how useful C++ and architecture are.


Okay, this is a great story. But look at the prof's actions: he covered for a colleague, admitted he didn't know the answer, and had the humility to ask when he realized he wasn't going to be able to learn C overnight. It might have been easier on him to use a pseudonym, but the guy sounds like he was doing life right.


In contrast, there are plenty of CS professors who are really quite exceptional hackers. The falsehood should be "All CS professors are really good programmers".


Dan J Bernstein, for one.


Expecting a CS prof to know how to program is like expecting a pure mathematician to know how to use Mathematica/Matlab. It’s absurd and doesn’t follow at all.


This is more like an anatomy professor who's never touched a cadaver. Far less excusable than a theoretical mathematician not knowing Matlab - because Matlab is not the only way to apply math (the mat in Matlab is for matrix not math)


The guy has a PhD in algorithms. He'd be writing proofs, not writing code. The same goes for programming languages: you're proving things, like the pumping lemma. Computer Science is not about producing coders for industry. You can learn how to code from a boot camp or Coursera, but it has very little to do with Computer Science.


No this is more like an anatomy professor who doesn't know how to remove an appendix without asking a surgeon.


And C isn't the only way to apply computer science.


Indeed, Dijkstra barely ever even used a computer. I can't say I agree with trashing a Prof with deep expertise in algorithms for not knowing C while doing a favour by substituting for a colleague.


I know how to build a C++ compiler, but I'm not very good at C++ code.


Is anyone good at C++ though? Didn't Bjarne only rate himself as 7/10 at C++ coding?


I don't know. I'm not even able to recognize good (modern) C++ code when I see it.

I figure it's like being the race car designer. He can design a fast car, but he can't drive it fast. And the driver can't design it.


It's like expecting a pure mathematician to know if 57 is prime. https://wikipedia.org/wiki/Alexander_Grothendieck


It's more like expecting an astronomer to know how to use a telescope :)


This sounds so made up, but it's too specific for it to be. Thanks for sharing, a nice chuckle before the start of a new work week.


Comp.lang.c wasn't exactly known as a kind space. Thinking about it, the Lisp group had a bit of an unfriendly rep too. The Objective-C group had one damn fool who advocated that NeXT's wasn't actually a true Objective-C. I do miss those groups though.


> the lisp group had a bit of unfriendly rep

And Erik Naggum might have had something to do with it. Perhaps unfairly, as he seemed to hate stupidity rather than ignorance, towards which he was, AFAICS, helpful.

I always imagined a 60-year-old sergeant-major type frothing at the mouth - heck yes he could flame! Dead at 44. A loss.

"Erik Naggum several times stated that stupidity, or rather the lacking willingness of individuals to acquire knowledge about a subject, argument or read other people's arguments with an open mind, was more or less a criminal offense. He was known for his polemic aggression towards what he considered to be ignorant individuals. Much of what he wrote was so full of sarcasm and irony that it could be difficult to understand what he truly believed in and what were general exaggerations made just to make a point. "

https://en.wikipedia.org/wiki/Erik_Naggum


I don't know if there's a place less friendly than #perl on freenode

I'm pretty sure 90% of the questions I ever asked there were answered with "CPAN is a thing" or "Why would you even want to do that?". Rarely would anyone answer my actual question.


Use irc.perl.org #perl-help instead. See <http://www.irc.perl.org/rules.html#Community%20Policies>.


##c is easily the most pedantic chat forum I've ever visited. It's also the only place I've seen someone use a diaeresis in chat.


Computer Science is to programming as Theoretical Physics is to Homebuilding. A lot of physics professors probably wouldn't know which end of a hammer to grip to drive a screw.

The problem here is probably students expecting to attend a trade school and discovering they're in academia instead.


> which end of a hammer to grip to drive a screw

I must confess that I don't know which end of a hammer to grip to drive a screw.


I've always blamed Microsoft Outlook for the top posting nightmare that has become email standard.

I just use the stock Apple Mail client and always bottom post (and snip as appropriate), and I get so many non-tech people who approach me in the hall and ask "how do you do your emails the way you do? can you show me?"

My narrative has always been that Gmail simply followed the trend that Microsoft's tools set in motion with their monopoly^H^H^H^H^H^Hy^H^Hpopularity amongst business users.

Of course, I'm not even sure how many modern/younger readers will understand why "^H". I'm getting too old for this.


I like top posting. My experience has been that snippet-and-reply sometimes leads to an argumentative and confrontational style of discussion, perhaps more than top posting.

My theory is that snippet and reply may lead to taking quotes out of context and replying with an uncharitable interpretation of the snippet.

So while top posting may be less logical and efficient, I believe it may be more conducive to a friendly discussion.

Of course I could be completely mistaken about this, and it was just a coincidence that the communities I was involved in where top posting was heavily criticized had cranky people in them who would have found something to get upset about regardless of posting style.

Edit: I don't mean to imply that you are one of those cranky people just because you like bottom posting or snippet and reply! :-) Just observing something I noticed in online forums many years ago.


> My theory is that snippet and reply may lead to taking quotes out of context and replying with an uncharitable interpretation of the snippet.

Whenever you start to see multiple quote snippets appear in an HN (or any forum) comment thread, it's basically guaranteed to have died and splintered into a bunch of uncharitable nit picks.

It encourages https://en.wikipedia.org/wiki/Gish_gallop.


Maybe this is just my innate pedantry talking, but I think the opposite pathology is common too: some people are great at writing convincing but illogical/weakly-supported arguments that only fail when broken down and looked at in a granular, nitpicky way. Strong norms against that sort of response might make discussion threads less annoying on average, but they are also a boon to sophists, charismatic charlatans and sincere but overconfident bullshitters.


A nice thing about bottom-posting, and even more so inline-posting, is that it is a weak certificate that the person replying has actually read the whole message.


>Whenever you start to see multiple quote snippets appear in an HN (or any forum) comment thread, it's basically guaranteed to have died and splintered into a bunch of uncharitable nit picks

Citation needed.

>It encourages https://en.wikipedia.org/wiki/Gish_gallop

[Insert smarmy HN-style dismissal here]

</sarcasm>


You could argue though, that the post that gets a reply like that was itself Gish galloping. Somebody just took the time to try to address all the points.


As an aside, Gish galloping is eerily similar to amplification attacks.


Broad generalization: if there are three questions in an email, top posters will answer one, maybe two if the planets align. Someone who takes the time to inline quotes is more likely to recognize that three answers are required.


That does fit my personal experience, which has been that when comparing old-school open-source communities where people know how to do proper quoting on mailing lists vs. internal e-mail communication at an Outlook-centric company that does a lot of remote work by virtue of being geographically spread over the world, the quality of technical discussion is much higher in the open-source communities. A large part of that is indeed that sub-points and/or nuance of earlier e-mails is simply dropped on the floor in the Outlook culture.

I suspect that it's not just about taking time to inline quotes, but also simply about seeing what you're replying to directly next to your answer, as opposed to having it be separated by at least a large message header and likely by much more.

Another sad fact is that people re-invent inline quoting in Outlook culture all the time, badly: they manually copy relevant parts into their top-posted response, or they use some sort of color coding to write their own response inline.


I find it makes things unnatural to read.

> I like top posting.


If you've already been following a conversation, it should make it easier. If you haven't been following the convo, you should probably start at the first message instead of jumping in the middle.

I find top posting with inline, contextual comments when necessary, most useful.


So? ;-)

> I find it makes things unnatural to read.

>> I like top posting.


> My theory is that snippet and reply may lead to taking quotes out of context and replying with an uncharitable interpretation of the snippet.

My theory is that we're all so used to internet debates unfolding in snippet-and-reply-like formats that we associate snippet-and-reply with argumentative discussion by default.


I have an interesting story to tell closely related to this. I used to work as a teaching assistant for an undergraduate computer networking course at UBC. As part of the course, we would usually ask the students to write a simple POP email server, in order to practice their socket programming skills. This was an assignment designed by a professor in the early 2000s, when being able to set up an email client was considered a basic computer skill.

Fast forward to 2019, and I can say that roughly 80% of students in the class didn't know what POP, IMAP or SMTP were. Before being able to get started on the assignment, we would have to provide them with additional material explaining how emails are delivered and what an email client does to retrieve them. The majority of students just grew up using the Gmail app or whatever self-configuring email clients they had on their phone/laptop, so they had no idea of what we were asking them to build.
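For anyone curious what that assignment boils down to, here is a minimal sketch in Python of the command/response shape such a server has to speak. The addresses, message, and throwaway in-process server are all invented for illustration; this handles only a handful of commands and is nowhere near a spec-complete POP3 implementation.

```python
import socket
import threading

# One canned message, CRLF line endings as the protocol requires.
MESSAGE = b"From: prof@example.edu\r\nSubject: hi\r\n\r\nhello\r\n"

def toy_pop3_server(listener):
    # Accept one client and answer a few POP3-style commands.
    conn, _ = listener.accept()
    with conn:
        reader = conn.makefile("rb")
        conn.sendall(b"+OK toy POP3 ready\r\n")
        for raw in reader:
            cmd = raw.strip().split(b" ", 1)[0].upper()
            if cmd in (b"USER", b"PASS"):
                conn.sendall(b"+OK\r\n")
            elif cmd == b"STAT":
                conn.sendall(b"+OK 1 %d\r\n" % len(MESSAGE))
            elif cmd == b"RETR":
                # Message body is terminated by a line with a single dot.
                conn.sendall(b"+OK message follows\r\n" + MESSAGE + b".\r\n")
            elif cmd == b"QUIT":
                conn.sendall(b"+OK bye\r\n")
                break
            else:
                conn.sendall(b"-ERR unknown command\r\n")

listener = socket.create_server(("127.0.0.1", 0))
port = listener.getsockname()[1]
threading.Thread(target=toy_pop3_server, args=(listener,), daemon=True).start()

# The client side: a plain socket conversation, one command per line.
with socket.create_connection(("127.0.0.1", port)) as s:
    reader = s.makefile("rb")
    def send(cmd):
        s.sendall(cmd + b"\r\n")
        return reader.readline().rstrip().decode()
    print(reader.readline().rstrip().decode())  # greeting
    print(send(b"USER student"))
    print(send(b"PASS hunter2"))
    print(send(b"STAT"))
    print(send(b"RETR 1"))
    lines = []
    for raw in reader:          # read the body until the "." terminator
        if raw.rstrip() == b".":
            break
        lines.append(raw)
    print("retrieved %d body lines" % len(lines))
    print(send(b"QUIT"))
```

The point of the exercise is exactly what's visible here: the student has to understand that email retrieval is just lines of text over a socket.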


I think this is more related to the democratization of programming than it is to Gmail in particular. CS as a major used to be limited to people who were already self-taught and had a solid understanding of how computers and networks worked. It has become not just acceptable, but common, for CS majors not to have grown up programming. I think that's overall a good thing.


Oh yes, it surely is, and I didn’t mean to hit at Gmail or any particular player. Additionally, I think it also speaks a lot about how industry did a very good job in constructing easy-to-use abstractions on top of tech that used to be hard to configure for a first-time user back in the days.


I used to work as a teaching assistant for an undergraduate computer networking course at UBC. As part of the course, we would usually ask the students to write a simple line editor, in order to practice their IO programming skills. This was an assignment designed by a professor in the early 1980s, when being able to work ed was considered a basic computer skill.

Fast forward to 1999, and I can say that roughly 80% of students in the class didn't know what ed was. Before being able to get started on the assignment, we would have to provide them with additional material explaining how to use a line editor and why you need one. The majority of students just grew up using Emacs, VI, Notepad or whatever visual editor they had on their computer, so they had no idea of what we were asking them to build.


Email is still widely used by billions of people whereas line editors never were.


Email is still widely used, but POP is not. Text editors are still widely used but ed is not.


Maybe, but SMTP and IMAP are still widely used. It seems reasonable for a computer science student to implement these protocols.


You'd expect a CS student to at least know what SMTP and IMAP are, even if they don't know how they work.

Just as you'd expect a CS student to know what Emacs is, even if they've never used it.

Relying on gmail horrifies me, because it's a classic political enclosure pattern.


Top posting is the proper place for new content in an email chain to go. It means the new message is prioritized over the history, with the option to have the history of the email chain available if you really want/need it.

MS Office was designed to fit business processes, and it does that well.

It's not a surprise that Apple's mail client falls over attempting to support business functions.


Top posting is completely worthless. All it does is include the message you replied to, which any sane email client can show immediately without having to copy it all.


I’m not defending top posting here. I’ve noticed a couple of practices in companies (not sure about other environments). One is doing a Reply All, adding a reply body and copying some new people into that thread after several back and forth messages have already been exchanged. Those new people (can) now have the full context of what they’ve received without the person composing the reply having to add context. The second is when people send mails to several others and miss copying some. Then one of the receivers or the sender themselves would do a Reply All and add a “+<person forgotten before>” (or sometimes even use ++ and the name).


Say someone gets added to an email thread some time after it got started—is there an easier way to get them all of the emails in the thread than having all of the emails as part of the one they were sent?


RFC 2046 multipart/digest

A standardized method of sending a set of email messages as messages, rather than copied into body text, predates Microsoft Outlook.
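A minimal sketch of what that looks like with Python's standard email package (addresses and subjects invented for illustration): MIMEMessage wraps each earlier mail as a message/rfc822 part inside a multipart/digest container, instead of pasting its text into the body.

```python
from email.message import EmailMessage
from email.mime.message import MIMEMessage
from email.mime.multipart import MIMEMultipart

def make_message(frm, to, subject, body):
    # Build a plain, standalone email.
    m = EmailMessage()
    m["From"], m["To"], m["Subject"] = frm, to, subject
    m.set_content(body)
    return m

# Two earlier messages in the thread.
history = [
    make_message("alice@example.org", "bob@example.org", "Plan", "Draft attached."),
    make_message("bob@example.org", "alice@example.org", "Re: Plan", "Looks good."),
]

# RFC 2046 digest: each prior mail rides along as a message/rfc822 part,
# headers and all, rather than as quoted body text.
digest = MIMEMultipart("digest")
digest["From"] = "alice@example.org"
digest["To"] = "carol@example.org"
digest["Subject"] = "Plan discussion so far"
for msg in history:
    digest.attach(MIMEMessage(msg))

print(digest.get_content_type())            # multipart/digest
print(len(digest.get_payload()), "messages")
```

Because the parts are full messages, a capable client can thread, quote, and reply to each one individually.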


It's funny that this is a suggestion because top-posting was a reaction to the unfriendliness of keeping message history in MIME attachments rather than the body of the message.

MIME may be the technically superior solution...But like Betamax and MiniDiscs, it lost out to the easier-to-use solution.


Message history wouldn't use that; it uses headers (metadata) like 'In-Reply-To:' and 'References:', which are in… well, I was going to say RFC 822, but it turns out they go back to RFC 724 from 1977.

A MIME digest would be used for the case that you want to provide someone a bunch of messages they don't already have. In that case, this brings up another reason this is superior to including the thing as text: it preserves all the threading and other metadata.
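For illustration, here is roughly how a client sets those threading headers when composing a reply, using Python's standard email package (addresses and subjects invented):

```python
from email.message import EmailMessage
from email.utils import make_msgid

# The message being replied to.
original = EmailMessage()
original["Message-ID"] = make_msgid(domain="example.org")
original["Subject"] = "Plan"
original.set_content("Draft attached.")

# A reply carries the parent's Message-ID in In-Reply-To, and appends it
# to the References chain (oldest first), so threading survives even if
# the quoted text is edited or dropped entirely.
reply = EmailMessage()
reply["Subject"] = "Re: " + original["Subject"]
reply["In-Reply-To"] = original["Message-ID"]
reply["References"] = (
    original.get("References", "") + " " + original["Message-ID"]
).strip()
reply.set_content("Looks good.")

print(reply["In-Reply-To"] == original["Message-ID"])
```

A threading mail client reconstructs the conversation tree entirely from these headers, never from the body text.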


Oh yes of course! Actually, multipart/mixed really seems like it should be the way that all clients do things, and that quoting at the bottom is an abuse of the quote button.


Sure, do something between an article a wiki and a forum.

Insert your cursor in their message, press enter and respond to every part worth responding to - inline. (you may want to sign your entry)

Remove all parts of the previous message(s) that do not directly build towards the current conversation.

It should look so organized that the new participant immediately adopts the format. If they fail, do it for them. (preserve the sane part of their signature)

Doing it like this really feels like you are talking with serious people who would never waste your time. That said, you can now ask how their weekend was. That part of the conversation is simply removed later on.


Top posting is extremely valuable to people who need to weigh in on parts of a discussion without participating in the entire thing from the beginning, like lawyers, accountants, engineers, and doctors, who may get added on late into the discussion but may need the message history so they can make informed statements/decisions.

Also, the suggested alternative, MIME attachments, simply doesn't work well in most mail readers. It's barely usable in Gmail and not at all usable on iPhone or Android. (Ironically, it works quite well in Outlook.)


Not if you are added to a thread.


> MS Office was designed to fit business processes

Microsoft Outlook made sense once you (or at least I) realized it was designed to send memos, not email.


It doesn’t “fall over.” It gives you the option to choose.


> Of course, I'm not even sure how many modern/younger readers will understand why "^H". I'm getting too old for this.

For all the other modern readers: http://answers.google.com/answers/threadview/id/386870.html


Excerpt from Hacker Writing Style (http://www.catb.org/~esr/jargon/html/writing-style.html):

There is also an accepted convention for ‘writing under erasure’; the text

    Be nice to this fool^H^H^H^Hgentleman, he's visiting from corporate HQ.
reads roughly as “Be nice to this fool, er, gentleman...”, with irony emphasized. The digraph ^H is often used as a print representation for a backspace, and was actually very visible on old-style printing terminals. As the text was being composed the characters would be echoed and printed immediately, and when a correction was made the backspace keystrokes would be echoed with the string ‘^H’. Of course, the final composed text would have no trace of the backspace characters (or the original erroneous text).

Accidental writing under erasure occurs when using the Unix talk program to chat interactively to another user. On a PC-style keyboard most users instinctively press the backspace key to delete mistakes, but this may not achieve the desired effect, and merely displays a ^H symbol. The user typically presses backspace a few times before their brain realises the problem — especially likely if the user is a touch-typist — and since each character is transmitted as soon as it is typed, Freudian slips and other inadvertent admissions are (barring network delays) clearly visible for the other user to see.

Deliberate use of ^H for writing under erasure parallels (and may have been influenced by) the ironic use of ‘slashouts’ in science-fiction fanzines.
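The mechanics are easy to demo (Python, using the jargon-file sentence above): ^H is just caret notation for the ASCII backspace character, 0x08, which a printing terminal echoed literally rather than erasing anything.

```python
# What the user actually typed: "fool", four backspaces, then "gentleman".
typed = "Be nice to this fool\b\b\b\bgentleman, he's visiting from corporate HQ."
assert "\b" == chr(0x08)  # ^H is ASCII BS (backspace)

# An old printing terminal couldn't un-print ink, so it echoed each
# backspace keystroke as the two visible characters "^H":
echoed = typed.replace("\b", "^H")
print(echoed)
```

A modern terminal emulator, by contrast, just moves the cursor left, so the "erased" text gets overwritten and the joke disappears.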


Can you explain why top-posting is bad/why people care about what IMV is a really irrelevant distinction?


I’ve seen this quip before on why top posting is bad.

> A: Because it messes up the order in which people normally read text.

> Q: Why is top-posting such a bad thing?

In longer threads, top posting makes it cumbersome to understand which points a responder is replying to. Most top posters (a generalization) tend to avoid point-to-point quotes and answers to establish context. Top posting is easy for a lazy writer, and transfers cognitive load to the readers.


The quip is stupid, and I don't understand why it's so popular. In reality, top posting looks like this:

--

Message 1:

Q: Why is top-posting such a bad thing?

Message 2:

A: Because it messes up the order in which people normally read text.

> Q: Why is top-posting such a bad thing?

--

My point is that you almost never read a single message in isolation, so bottom-posting makes you re-read everything you've just read all over again. If you happened to be reading an individual e-mail from the middle of a chain, bottom-posting would be slightly more convenient (and inline-quoting would lose information). Top posting gives you reverse chronological order, prioritizing what is important, and is friendly for discussions with changing number of recipients.


Top posting is fine for really shallow conversations.

As soon as you attempt to have more complex conversations, say about subtle technical issues, it falls flat on its face and inline quoting becomes far preferable.

In a way, this is a culture issue similar to the whole maker schedule vs. manager schedule discussion. A lot of management types engage primarily in shallow conversations[0], and so they may think relying on Outlook is fine, while being totally unaware of the damage it does to the engineering part of the organization (and possibly to themselves).

[0] In part this is inherent to a manager's role. For example, in many cases a manager's email may simply be the message "yes, dear direct report, you have formal permission to proceed with your plan". There's just inherently less deep thinking in a management role, when compared to engineering roles. Of course, deep thinking should occur in management as well in plenty of places, and the best managers do recognize that. But unless they come from an engineering culture, they're probably not even aware what they're missing with top posting.


I was never a corporate manager, but my impression is that "deep thinking" in management happens in Word documents.

Personally, I am a fan of inline-quoting, but only when I know I'm talking 1:1 or with a fixed group of people. If new people are expected to be included down the line - as is frequent with conversations that seek approval or feedback in a company - then I default to top-post to preserve the history of the conversation.

I don't know how a deep engineering conversation looks when written down, because I'm yet to see any in a work setting; whenever a problem approaches any interesting complexity, someone can't handle the complexity and calls for video or IRL meeting.

If there's a culture problem, I think the "maker" group is much smaller than the group of engineers. I personally blame webmail (GMail and others), which by virtue of popularity essentially set the rules for how e-mail is supposed to work, and more importantly than defaulting to top-posting, popular webmail clients don't handle tree structure. Inline-quoting makes sense when your discussion forms a tree, and not a stream of messages.


top posting is absolutely fine for the type of low content email generally used by 90% of businesses.

always bottom posting is nearly as awful as always top posting.

email is for the reader, not the writer. write to your reader’s expectations and to maximize their understanding, not to appease your own sense of what is “correct”


I just wish email clients would pick a direction and stick with it. The default on an iPhone is to have the most recent messages at the top. But if you open a thread of messages the most recent within the thread is at the bottom. But inside the individual message, the most recent text is at the top.


If it makes you feel better, I'm in my early 30s, and switched careers to software dev just last year, and I definitely understand "^H." Although I understood it because a veteran guy passed on the story to me a while ago. So I guess the moral is, keep that oral history alive!


get off my lawn^W^W^W^W

I'm getting old, aren't I.


Oh look at you and your fancy vt100

the lot of us are stuck on vt52s


lol, I'm not old enough to have ever actually encountered ^W. I learned about it from the /. era folks who made fun of my 5-digit uid


Try it at your bash prompt someday.


I will top post unless I know the recipients are familiar with quoting conventions. I have run into far too many people who become confused otherwise.


[flagged]


Personal and generational attacks ("ok boomer" manages to be a toxic blend of both) are not allowed on HN. Please review https://news.ycombinator.com/newsguidelines.html and stick to the rules when posting here.

Edit: it looks like you've posted a lot of good comments but also quite a few that break the site guidelines. Could you please take the spirit of the site—thoughtful and curious conversation—more to heart? We'd be grateful.


I checked the original list [1] and it's a fun read; however the following is absolutely true in my experience:

> 30. Software with version numbers ending in '.0' are buggy and you should wait until the next release.

[1] https://www.netmeister.org/blog/cs-falsehoods.html


You can probably replace the 0 with any digit, and it will remain true.


I just run super alpha quality software all the time so that the “.0” release seems like a breath of fresh air in comparison.


Depends on the project, but it's absolutely true for some.


yeah, I'd be curious to see what a distribution of "time until first confirmed bug / next bugfix release" looks like. I strongly suspect the .0 versions don't last long in the vast majority of cases.

It may not be a True Fact, but it's not used that way. It's a reliable heuristic, nothing more.


For OS X the .3 release seems to be the sweet spot.


> 38. Employers care about which courses they took.

Dark side: Employers care about your technical interviewing ability and your GPA. Technical interviewing focuses on a subset of skills around certain algorithms and data structures. The courses most relevant to interviewing will spend at most a couple of weeks on the topics you need to know well.

Bright side: There's so much to know in CS that regardless of what courses you take, you'll likely have to (or want to) learn more on the job. So, in college, you want to focus more on learning-to-learn technical material. Also, certain CS courses contain so much outdated/irrelevant info that you can more efficiently learn all you need to know on the job or through your own projects.


Yeah, take challenging classes that open your mind, but fully appreciate that the job will have you use only a subset of what you learn. For example, I don't work in video games, but I took some computer graphics classes and it was amazing to learn how rendering and ray tracing work. Same for hardware. Even though I'm doing web dev, I've had hobby hardware projects I've worked on and enjoy tinkering with circuits every so often.

What employers do care about is that you worked on something challenging, you're able to overcome adversity to ship something, and you're eager to learn more.


Diverse skills can be super important. Even if you're talking about computer graphics in a really curmudgeonly field.

I'm in GIS (geospatial information systems). Turns out knowing how to define a camera using a matrix and intersecting a ray with a mesh of triangles is pretty useful. And it turns out that if you don't know how to do these things and just sit down and start cowboy coding, the solution you're gonna come up with is going to need to be rewritten by the person they hire when you leave.

There are a lot of things I run into in my job where I genuinely don't know what the right algorithm is, compared to the number of times I've said "I recognize this thing" and brush up on the literature and implement some variation on it as applicable to my particular use case. I'm kinda curious as to how much of the stuff I've written has someone come across it later and been like, "wtf is this shit, you just use <so and so's (probably Dijkstra's)> algorithm".


For me the most useful skill has been to recognise when I'm doing something that's O(n^2) or worse, especially if it involves anything besides just math (e.g. database queries, file reads, web requests, etc.). If something is O(n^2) or worse, it's a good sign I should probably google around for a better algorithm.

Thankfully Python has a rich set of built-in data structures: using a dict (or a hash map in other languages) turns a lot of O(n) problems into O(1) problems, and the bisect module turns O(n) lookups into O(log n) ones (after an up-front O(n log n) sort). I'm always keeping my eye out for more of these big-O reducers.
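A toy sketch of both tricks (made-up data, stdlib only):

```python
import bisect

# Repeated membership tests: scanning a list is O(n) per query;
# building a set once makes each query O(1) on average.
haystack = list(range(100_000))
as_set = set(haystack)            # one-time O(n) build
assert 99_999 in as_set           # ~O(1), vs O(n) for `99_999 in haystack`

# Sorted data: the bisect module gives O(log n) lookups.
names = sorted(["alice", "bob", "carol", "dave"])
i = bisect.bisect_left(names, "carol")
found = i < len(names) and names[i] == "carol"
assert found
```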

That said, in production software it hardly ever matters these days as long as you're not doing anything exceedingly daft. Fast enough is fast enough.


Maybe FAANG cares about GPA, but I haven't had it listed since I was applying for internships.

Not a single person has ever asked for it.


I don't even think they care. I got an internship in one of the FAANG companies and didn't even have my GPA on my resume.

I also remember the recruiter telling people you should only include your GPA if it was super high, as an extra on your resume. So I would think your interviewing skills matter the most.


It's been a while, but when I was entry level the only employer I encountered who cared about GPA was Google. Have times changed?


> 8. You should always add lots of comments[0] to your code.

If I don't do this, when I come back to the code an hour later I have no idea what it's supposed to do or why I wrote it. I'll also usually add an explanation, like "3 is a bad number of args because..."

> 44. Funny names are funny, you can always change them later.

This one drives me crazy, I hate when people use something like "x" or "habberdasher" instead of something functional like "newspapers"

[0] https://twitter.com/jschauma/status/1044581775983276034


> If I don't do this, when I come back to the code an hour later I have no idea what it's supposed to do or why I wrote it.

Many people write comments which are often nothing more than a really obvious transliteration of the code to English "foobar += 3; // Add 3 to the foobar".

These comments add little to nothing and often become worse than useless when they fall out of date and you end up with a cryptic "foobar += 7; // Add 3 to the foobar"

I find it helpful to imagine a codebase guru, someone who not only wrote the thing but has had years of experience maintaining and teaching to others... and the comment writes down what they would say if you asked them about it.

The guru wouldn't say 'adds three to foobar', they'd tell you about how the number used to be the obvious value of two but then the database choked when the boss was trying to store their golf scores in it, because foobar controls the whatzanits and there can be a race that requires it to have room for one extra. The guru would tell you _why_, and that's what a better comment will do.

Students of programming are often just writing what they're told to write. They don't necessarily have a strong grasp on the why, especially not a deep contextual grasp.


I once saw

    using namespace std; // using namespace standard


Worst comment I have seen was in a systems paper at uni (assignments in a MIPS-like assembly language):

    add $1, $2, $3    # add some stuff.


> If I don't do this, when I come back to the code an hour later I have no idea what it's supposed to do or why I wrote it.

I think the point that the tweet is trying to make is that if you comment, you should explain why you are doing something instead of saying what you are doing. The latter is already written there, in code.


This is one of a few of my programming pet peeves. So many people I work with add comments which are completely just re-stating the code that follows. Ruby is particularly bad for this, at least visually.

  # authenticate user
  user.authenticate!


> If I don't do this, when I come back to the code an hour later

Good coding means two things, in general:

* No comment is more understandable than code that needs no comment.

The code should explain _what_ it is doing as close as possible in words such that any programmer, including you, can read the code and understand what it is doing.

Bad: sum = a.length + b.length

Good: totalNumberOfGoodProducts = exceptionalProducts.length + acceptableProducts.length

* Comments should be the exception for _why_ something odd is happening.

Bad: if(r == 42) return; // Nothing happened.

Good: if(ioReturnCode == 42) return; // Ignore code 42, an undocumented value that seems to mean "no operation happened." We'll retry again later in the code.


The better you get at writing code, the more the code documents itself, and the less comment writing you will need to do.

Think of writing a comment as a failure of your ability to write good code - we all fail from time to time, but we aspire to do better over time.


>Sprinkling printf statements is an efficient debugging technique.

I'm a senior engineer for one of the big tech companies. For some issues, printing (or writing to your logger) is certainly what you'll try again and again, and it can absolutely get you a long way or down to the root cause. It's not the best strategy (depending on what you're dealing with) but it can often work.

Where it’s probably less fruitful most of the time is if you’re debugging some kind of distributed system that has just that one edge case which occurs < 1% of the time in production against real traffic, and it’s entirely your own damn fault for trying to reinvent distributed locking or something akin to your own algorithm that “handles” eventual strong consistency.


I find it depends on the language, too. Some make debugging a little harder, but are so easy to mess with (REPL or similar), that printing is sufficient and easier 95%+ of the time.

Some languages invite sufficient complexity that you need to actually step through most code written in them...and they also offer the facilities to do so fairly easily.

So I'd say this really depends on the language.


I like top posting+keeping the reply text. That way, if I get CC’d on a conversation in the middle, I have all the context I need instead of a cherry-picked selection of what someone in the reply chain thought was relevant.


This has always been my default: including the entire reply chain for the same general reason you've stated, and top-posting my reply so it doesn't accidentally get read by someone later as part of someone else's previous message (possibly after formatting conversions that could conceivably screw up the displayed hierarchy). It just seems like the method with the least potential for things to go wrong.


> 24. Productive coders write lots of code.

I wish I could convince my employer of the low truth-value of that statement.

(Instead, I make sure to have a reasonable number of easy large-code-volume tasks sitting in a backlog so that I never fall behind)


On the other hand there are many times where I wish I could convince other programmers of that statement. Lots of repetitive boring code is usually better than convoluted meta-frameworks that solve the same task. I just spent my morning digging through 8 layers of OO abstraction hell that turns CSV files into SQL, because someone thought the easy way was too repetitive and they could cut the LoC count.


We engineers sometimes love beauty too much.

Sometimes, we build things that are technically beautiful because we appreciate them, even though they aren't actually the right solutions to the problems at hand.


You call it beauty, but to me a pile of unnecessary abstractions (and the associated boilerplate) is just ugly. It can be fun to design and build, kinda like Rube Goldberg machines, but I wouldn't decorate my home with such things as art.

Simple, small, short.. now that's where I see beauty.


From the original 100

> 54. 'main' takes two arguments, argc and argv.

Is this a myth because you are supposed to think of them as one argument?

Or because of non-standard extension that includes envp?

Or because of some other nitpick?

I feel I have a pretty solid understanding of how processes are started for C and I think the statement is true.


Also:

> Sprinkling printf is an efficient way to debug.

I mostly work on threaded code, and print statements are my go-to for tracking down weird timing issues. Things behave very differently with a debugger attached, and adding good logs makes a ton of sense long term anyway, so yeah, print statements are absolutely a great approach to debugging.
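For what it's worth, a minimal sketch of what "good logs" can mean for threaded code: tag every line with the thread name so interleaved output stays readable. (All names here are made up; a real setup would use a StreamHandler or FileHandler instead of the in-memory list.)

```python
import logging
import threading

# Collect formatted records in memory so we can inspect them;
# logging handlers are thread-safe, so concurrent emits are fine.
records = []

class ListHandler(logging.Handler):
    def emit(self, record):
        records.append(self.format(record))

logger = logging.getLogger("debug-sketch")
logger.setLevel(logging.DEBUG)
logger.propagate = False
handler = ListHandler()
handler.setFormatter(logging.Formatter("%(asctime)s %(threadName)s %(message)s"))
logger.addHandler(handler)

def worker(n):
    logger.debug("worker %d start", n)
    logger.debug("worker %d done", n)

threads = [threading.Thread(target=worker, args=(i,), name=f"worker-{i}")
           for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Two records per worker; ordering is nondeterministic, but every
# line carries its thread name and timestamp.
print(len(records))  # 6
```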

A lot of them are true, but several aren't. I guess that's the thing about lists...


> > Sprinkling printf is an efficient way to debug.

> I mostly work on threaded code, and print statements are my go to for tracking down weird timing issues.

I have found printf to subtly alter timing in threaded programs. Either masking issues or introducing them.

Anecdotally, I remember some programs being reported as working only with a printf present and bombing when removed.


Yeah, but the likelihood of a printf throwing the timing off is generally lower than a breakpoint.


Sprinkling printf is very different from "good logs". Moreover, logging is not primarily for interactive debugging, but rather for monitoring. They're complementary practices.


I've found sprinkling print statements to be an easy and quick alternative to setting up logging.


A lot of the myths have built-in assumptions and I think this one is a minefield: c: (variations by spec, version); language: (variations on args to 'main', if there even is a 'main'); platform: (variations on whatever runtime implementation of 'main' you're working with). Many of these reduce to the regulatory escape hatch in finance that 'past performance does not imply future returns'. Your experience with 'main' doesn't necessarily map directly to whatever 'main' you're working with right now.


Two answers.

Some embedded envs I've worked in don't have argc/argv/envp.

In the Unix envs I've worked in it also takes envp. I think I've used that 3 times in the roughly 25 years I mostly wrote C.


> Some embedded envs I've worked in don't have argc/argv/envp.

I would think you are generally using _start rather than main in embedded systems. But in embedded systems portability is generally out the window, so it makes sense that some _start implementations call main without arguments.

> In the Unix envs I've worked in it also takes envp. I think I've used that 3 times in the roughly 25 years I mostly wrote C.

I have never tried to use envp and had it fail, but for maximally portable code you should use getenv(), or on POSIX you can use the environ variable.



The student might not understand that they can name the arguments whatever they like.


I'm not sure what the original author meant, but main could have no arguments. Or, in a freestanding context, there may not even be a main.


> 71. The ad-driven profit model is a necessary but reasonable trade-off to make the world a better place.

I was glad to read that amongst the rest.


There are people still complaining about top posting? Wow. All these CS grads who invented putting the high-signal new data first clearly need to get off your lawn.

This article makes me feel very old.



As far as classes go, if I’m interviewing someone fresh out of college (or current students looking for internships), it does make a difference what classes they took. That’s somewhat specific to what I do professionally though.

Pretty much anything I'd hire them for would involve some degree of systems programming and/or graphics. So, having one or more OS classes and/or graphics classes is significant. Of course if they are knowledgeable about those things by learning on their own, that's great too, but that's even less common.

Once someone’s no longer fresh out of college, education specifics become a lot less important, and the focus shifts to their career history. But for people without a prior history, there’s little else to go on. For example, if a particular internship or junior position involves graphics, it’ll make a big difference if the interviewee has some sort of introductory foundation of knowledge and won’t be starting from scratch. That could be a critical difference between two otherwise equal candidates.

TL;DR: if you are a student with an interest in a specialty that you might want to pursue professionally, and there's a course dedicated to that specialty, by all means take that course. Whether it's graphics, or AI, or operating systems, or whatever. It could definitely help you get that first job or internship in that field. It sounds painfully obvious, but it must not be, based on my personal experiences.


I'd say it's less what class you take and more what skills you have.

If you say you can use X programming language or Y framework, and you pass an interview you've got the job. It doesn't matter if X and Y were learned in a class or doing hobby projects. At the end of the day it's the same skill.

Most interviews for new grads are algorithms interviews, which require some level of skill from an algorithms class, which at most universities is a third-year class, which is why companies want people with a BS and not an AS.


> 8. Sprinkling printf statements is an efficient debugging technique.

Ouch. This is my bread and butter.


I've known younger programmers to think the speed difference between ++i and i++ is enormous and that using the wrong one is the sign of an incompetent programmer. But they don't necessarily know why, or agree on which one is the good one.


I am clearly undereducated on some of these

> 21. 'Privacy' and 'Confidentiality' are synonymous.

> 50. The error message 'No space left on device.' means you are out of disk space.

> 54. 'main' takes two arguments, argc and argv.

> 62. Dropbox is a suitable backup solution.

> 69. Zuck is a genius.

> 76. HBO's "Silicon Valley" is satire, not a documentary.

> 77. Jokes about recursion are funny jokes about recursion.

> 94. There is a "real world".


Well, I can answer a few of them. But honestly, these are kind of obscure, and I'm not sure you need to know them all.

> 50. The error message 'No space left on device.' means you are out of disk space.

On Linux systems, (and perhaps most Unix-likes? IDK) some fraction of the file system is typically reserved for root, typically to allow core functions to continue and to give some breathing room to recover the device. This is the -m flag to mke2fs, "Specify the percentage of the filesystem blocks reserved for the super-user." which defaults to 5%.

> 54. 'main' takes two arguments, argc and argv.

It can also take no arguments. I believe it can also take a third argument, char *envp[], which contains the environment variables, but I don't know what the standards say about this. (I believe it's a thing some implementations, particularly Unix-likes, do, but not a required-by-C thing. There's also the entry point in Windows, but that has a completely different name.)

Most of the rest seem like subjective opinion on the state of the world.

> 94. There is a "real world".

"You think that's air you're breathing?" and the recent https://xkcd.com/2221/ ; also https://qntm.org/responsibility ; or at least, that's how I'm choosing to interpret #94.


> 50. The error message 'No space left on device.' means you are out of disk space.

>On Linux systems, (and perhaps most Unix-likes? IDK) some fraction of the file system is typically reserved for root, typically to allow core functions to continue and to give some breathing room to recover the device. This is the -m flag to mke2fs, "Specify the percentage of the filesystem blocks reserved for the super-user." which defaults to 5%.

I've also had this happen when I ran out of inodes (the on-disk structures storing file metadata) - you can exhaust the allocated number of inodes when you have millions or billions of files while still having hundreds of 'normal' gigs free. `df -i` will tell you.
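If you'd rather check from code than remember `df -i`, `os.statvfs` exposes both counters on POSIX systems. A rough sketch (either counter hitting zero free produces ENOSPC, "No space left on device"; some filesystems report zero total inodes, hence the guard):

```python
import os

st = os.statvfs("/")

# Block (space) usage vs inode (file-slot) usage.
free_space_pct = 100 * st.f_bavail / st.f_blocks
free_inode_pct = 100 * st.f_favail / st.f_files if st.f_files else 100.0

print(f"free space:  {free_space_pct:.1f}%")
print(f"free inodes: {free_inode_pct:.1f}%")
```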


>> 50. The error message 'No space left on device.' means you are out of disk space.

It can also mean that you're out of inodes. Or, if you're working with a non-disk-based FS, it can mean that you've run out of some other resource.


Or out of quota.


> Sure, they could go on and get a PhD, but first they'll make some money; they can always come back later.

Yeah, you can totally trust yourself to not take on a bunch of financial obligations that would effectively preclude the possibility of taking on a PhD. Honest.


I was hoping that refreshing the page would highlight a different three. :)

Blaming Gmail for top posting seems odd. I can't disagree, and I like quoted messages. That said, Exchange and phones seem just as culpable.


The original list has a few I'd argue with:

> 95. Brooks's Law has exceptions

However, the Wikipedia article linked has a subsection of exactly that: exceptions.


27. Real Programmers(TM) use neon-green on black terminals.

This caused a guffaw... because I have a large screen which has a full-screen emacs session... using green on black :-)


Real Programmers (TM) use emacs -nw


I always ask recent graduates about interesting courses they took. Not so much to meet some kind of criteria, but to start a conversation about something in CS that interests them. Helps to see how passionate they are about their field.


34. They know how to validate an IP address.

I am surprised by this one. What are the pitfalls there?


Maybe it means a non-reserved IP? 127.0.0.1 is often used as localhost, for example, but all of 127.0.0.0–127.255.255.255 is reserved. There are several reserved blocks like this. If you include reserved addresses, every 32-bit pattern is a valid IPv4 address, at least in theory. I hear network equipment in the real world is entirely another beast. I remember some ISPs did not work with Cloudflare's 1.1.1.1 DNS, presumably because they were using the address for something else.


IPv4 has a lot of reserved and special-purpose ranges. Hell, I don't think most people could even recite the three major private-use blocks. Multicast and link-local addresses have more complications.

IPv6 is similar, though a little better consolidated. While there is one range of addresses reserved specifically for organization subnets at your disposal (fc00::/7), a lot of people assume just about anything not currently assigned for the internet (presently 2000::/3) will be safe forever.
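A sketch of leaning on the stdlib instead of hand-rolling range checks: Python's `ipaddress` module already knows most of these special-purpose ranges.

```python
import ipaddress

# Every 32-bit pattern parses as *some* IPv4 address; whether it is
# usable as, say, a public host address is a separate question.
checks = {
    "127.53.0.1":  ipaddress.ip_address("127.53.0.1").is_loopback,    # all of 127/8
    "10.1.2.3":    ipaddress.ip_address("10.1.2.3").is_private,       # RFC 1918
    "169.254.0.5": ipaddress.ip_address("169.254.0.5").is_link_local,
    "224.0.0.1":   ipaddress.ip_address("224.0.0.1").is_multicast,
}
print(checks)  # every value is True
```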


I understand your point, but I think the context is more of "They can't even handle a fizzbuzz level problem" and not come up with something quite naive (i.e. something like naive RegEx matching), rather than "they can't implement a professional IPv4 validator that takes into account all the intricate special cases with subnets and reserved address blocks."


Yes, but that's still orders of magnitude easier than parsing email addresses or HTML...


My personal favorite is assuming that /31 IP subnets (255.255.255.254) are invalid (RFC 3021). But also, to quote Wikipedia:

> IPv4 addresses may be represented in any notation expressing a 32-bit integer value. They are most often written in dot-decimal notation, which consists of four octets of the address expressed individually in decimal numbers and separated by periods.

> For example, the quad-dotted IP address 192.0.2.235 represents the 32-bit decimal number 3221226219, which in hexadecimal format is 0xC00002EB. This may also be expressed in dotted hex format as 0xC0.0x00.0x02.0xEB, or with octal byte values as 0300.0000.0002.0353.
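Python's stdlib `ipaddress` module happens to illustrate both points: it round-trips the 32-bit integer form from the quote, but rejects the dotted-hex and short text forms that (e.g. glibc's) inet_aton() accepts.

```python
import ipaddress

# The Wikipedia example: 192.0.2.235 == 3221226219 == 0xC00002EB.
a = ipaddress.ip_address("192.0.2.235")
assert int(a) == 3221226219 == 0xC00002EB
assert ipaddress.ip_address(3221226219) == a   # raw integer form accepted

# But the dotted-hex and short forms are rejected here, even though
# inet_aton() would take them -- one validator's "valid" is another's
# ValueError.
rejected = []
for text in ["0xC0.0x00.0x02.0xEB", "1.1"]:
    try:
        ipaddress.ip_address(text)
    except ValueError:
        rejected.append(text)
print(rejected)  # both forms rejected
```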


I think that is because not every IPv4 address has the format x.x.x.x. There are short forms (e.g. 1.1, which gets expanded to 1.0.0.1, though I'm not sure if that's in the original RFC) and decimal forms. And then there's IPv6, which again has its own rules.


Huh, I thought omissions/abbreviations of addresses was introduced with IPv6.


The gap will be even larger; more low-code UI and FaaS are going to make hiring a lot harder.

I think we don't spend enough time on socket level programming and concurrency.

I don't know how many Java devs are still writing blocking IO code..


On the bigger 100 list, this one I kinda still believe. Mostly with games.

"Software with version numbers ending in '.0' are buggy and you should wait until the next release."


27. Real Programmers(TM) use neon-green on black terminals.

This is not a falsehood. The rest of the world just hasn't caught on.

Sent through lynx


> "who can learn knew things as needed"

Not worth mentioning but just in case you wanted to fix it and have missed it.


101. You should always use React.


Do CS students learn React?


No, but many younger people nowadays seem to think it's a necessity, even for a simple static page.


I just ran tests for a new project. JS is used for testing. It generates an HTML test report which requires... react!


Like jQuery back in the day.


[flagged]


Could you please not post unsubstantive comments to Hacker News?


101. Always use React.


Oh my God… "Java is a reasonable choice" and "liberally sprinkling your code with print statements is an effective debugging technique"...

Oh the pain... my everyday pain...


From the original list:

  66. Spreadsheets and powerpoint are for business majors.

Spreadsheets are for business majors. Jupyter Notebooks are for programmers and pretty much anyone in STEM. Forcing a new programmer to use a spreadsheet as if it were a fundamental tool, the Lisp of business, is nonsense.


Ummmmmm... not sure if this is hyperbole or not, darn Poe's law.

I've seen a civil engineer push the limits of Excel for complex stress-strain models, a finance comptroller quickly build out elegant reporting models, and an environmental engineer build out a complex watershed model; but sure, spreadsheets and Excel are only for business majors.



