Back in grad school, we had an algorithms professor who was my advisor & wrote several important papers. We also had a PL professor who mostly taught Lisp/C/C++ etc. One day the PL professor fell ill, so the algorithms prof was sent in as a substitute. He tried to divert the class's attention by going into algorithmic detail instead of focusing on programming language theory. But homework was due the next week, so students asked him about some C nuance. The prof said, "I don't know, but I'll let you know by next class." Then he hurried back to his room & tried to read up on C, but obviously couldn't make much headway. In those days we had Usenet & that was it. So the prof goes on Usenet & clicks on comp.lang.c. In those days I used to fancy myself a C hacker & I hung around on newsgroups trying to help out newbies. So I see this frustrated question from a newbie, "HOW TO OPEN FILE FROM C ???", and I click on it so I can help this poor miserable bastard. Turns out it is from my professor! The footer had his entire credentials: PhD, list of papers published, teaching career, etc. So everybody on comp.lang.c was mocking this idiot who had a PhD in CS but didn't know how to open a fucking file with C. I was so embarrassed I rushed into his office & demanded to know how he could make a fool of himself like that. "You could have just asked me about fopen, why did you post in public, that too with your real name??!" He was like, I've never used all this newsgroup stuff, I barely know how to telnet. This was my first post, I thought they would help me, but they are just heaping abuse.
Why is it shameful to ask for help when you are doing something for the first time? Why such a strong emotional reaction to that, leading to berating from someone who is not even affected?
Algorithms are a branch of math - it is about proofs and structures and big O. It is about figuring out how to achieve the right big O. Especially in the beginning, it was not necessary to know C in order to study or teach them. Nowadays, they are typically not taught in C either.
Edit: fixed typos
Well, when viewed mathematically, sure.
> It is about figuring out how to achieve the right big O.
That's hardly the whole story. I'd say the engineering work 'counts' as algorithm work, as much as the computer science.
If you're looking to publish in a top journal and prove how clever you are, an impressive time complexity is fantastic, even if the algorithm is completely impractical for real applications. Case in point at .
If you're looking at real-world performance, you don't especially care about time complexity, you care about, well, the real-world performance of your implementation. Maybe you'll improve time complexity too (I suppose that counts as crossing from engineering to science), or maybe you'll just put in the engineering work without making breakthroughs in theoretical terms.
Those real-world performance considerations should be taught, but in courses that are focused on coding itself, by people who focus on that area, are good at it, and can explain it in depth.
A single prof does not need to be good at everything, and every course does not need to teach everything.
I agree that the practical side of things is quite a different specialism - computer architecture and compilers are different topics than the 'pure' study of algorithms. Today's emphasis on concurrency and parallelism does impact the theory, though; the worlds aren't entirely isolated.
I suppose my point is a semantic one: I consider the practical aspects to be a part of the study of algorithms.
> The real world performance issues area is heavily dependent on technology being used and changes over time, algorithm course is about basic platform neutral algorithms.
Well, the Coppersmith-Winograd algorithm will never be practical.
> algorithm course is about basic platform neutral algorithms
There's no such thing. That's why I mentioned Coppersmith-Winograd.
Looking only at the complexity properties of the Coppersmith-Winograd algorithm, you're unlikely to reach the (correct) conclusion that it's entirely useless for any practical application.
> Those real-world performance considerations should be taught, but in courses that are focused on coding itself, by people who focus on that area, are good at it, and can explain it in depth.
Agree, except to nitpick that it's necessary to develop a solid understanding of computer architecture, which may or may not fall under 'coding'. (That sort of expertise is beyond the knowledge of most coders.) Again, that's just semantics.
> A single prof does not need to be good at everything, and every course does not need to teach everything.
Agree completely, but it's important not to oversell the theory.
Ok, but that is a purely theoretical nitpick that has nothing whatsoever to do with what we are talking about. Algorithm courses don't teach Coppersmith-Winograd unless you are in some very special course designed for the few people who want exactly that out of interest and curiosity.
> Agree completely, but it's important not to oversell the theory.
The thread is about whether it was shocking for an algorithms prof not to know C++ a few dozen years ago. It is not, regardless of how useful C++ and architecture are.
I figure it's like being the race car designer. He can design a fast car, but he can't drive it fast. And the driver can't design it.
And Erik Naggum might have had something to do with it. Perhaps unfairly, as he seemed to hate stupidity rather than ignorance, towards which he was, AFAICS, helpful.
I always imagined a 60-year-old sergeant-major type frothing at the mouth; heck yes, he could flame! Dead at 44. A loss.
"Erik Naggum several times stated that stupidity, or rather the lacking willingness of individuals to acquire knowledge about a subject, argument or read other people's arguments with an open mind, was more or less a criminal offense. He was known for his polemic aggression towards what he considered to be ignorant individuals. Much of what he wrote was so full of sarcasm and irony that it could be difficult to understand what he truly believed in and what were general exaggerations made just to make a point. "
I'm pretty sure 90% of the questions I ever asked there were answered with "CPAN is a thing" or "Why would you even want to do that?". Rarely would anyone answer my actual question.
The problem here is probably students expecting to attend a trade school and discovering they're in academia instead.
I must confess that I don't know which end of a hammer to grip to drive a screw.
I just use the stock Apple mail client and always bottom post (and snip as appropriate), and I get so many non-tech people who approach me in the hall and ask, "how do you do your emails the way you do? can you show me?"
My narrative has always been that Gmail simply followed the trend that Microsoft's tools set in motion with their monopoly^H^H^H^H^H^Hy^H^Hpopularity amongst business users.
Of course, I'm not even sure how many modern/younger readers will understand why "^H". I'm getting too old for this.
My theory is that snippet and reply may lead to taking quotes out of context and replying with an uncharitable interpretation of the snippet.
So while top posting may be less logical and efficient, I believe it may be more conducive to a friendly discussion.
Of course I could be completely mistaken about this, and it was just a coincidence that the communities I was involved in where top posting was heavily criticized had cranky people in them who would have found something to get upset about regardless of posting style.
Edit: I don't mean to imply that you are one of those cranky people just because you like bottom posting or snippet and reply! :-) Just observing something I noticed in online forums many years ago.
Whenever you start to see multiple quote snippets appear in an HN (or any forum) comment thread, it's basically guaranteed to have died and splintered into a bunch of uncharitable nit picks.
It encourages https://en.wikipedia.org/wiki/Gish_gallop.
>It encourages https://en.wikipedia.org/wiki/Gish_gallop
[Insert smarmy HN-style dismissal here]
I suspect that it's not just about taking time to inline quotes, but also simply about seeing what you're replying to directly next to your answer, as opposed to having it be separated by at least a large message header and likely by much more.
Another sad fact is that people re-invent inline quoting in Outlook culture all the time, badly: they manually copy relevant parts into their top-posted response, or they use some sort of color coding to write their own response inline.
> I like top posting.
I find top posting, with inline contextual comments when necessary, most useful.
> I find it makes things unnatural to read.
>> I like top posting.
My theory is that we're all so used to internet debates unfolding in snippet-and-reply-like formats that we associate snippet-and-reply with argumentative discussion by default.
Fast forward to 2019, and I can say that roughly 80% of students in the class didn't know what POP, IMAP or SMTP were. Before being able to get started on the assignment, we would have to provide them with additional material explaining how emails are delivered and what an email client does to retrieve them. The majority of students just grew up using the Gmail app or whatever self-configuring email clients they had on their phone/laptop, so they had no idea of what we were asking them to build.
Fast forward to 1999, and I can say that roughly 80% of students in the class didn't know what ed was. Before being able to get started on the assignment, we would have to provide them with additional material explaining how to use a line editor and why you need one. The majority of students just grew up using Emacs, VI, Notepad or whatever visual editor they had on their computer, so they had no idea of what we were asking them to build.
Just as you'd expect a CS student to know what Emacs is, even if they've never used it.
Relying on gmail horrifies me, because it's a classic political enclosure pattern.
MS Office was designed to fit business processes, and it does that well.
It's not a surprise that Apple's mail client falls over attempting to support business functions.
A standardized method of sending a set of email messages as messages, rather than copied into body text, predates Microsoft Outlook.
MIME may be the technically superior solution... but like Betamax and MiniDiscs, it lost out to the easier-to-use solution.
A MIME digest would be used for the case that you want to provide someone a bunch of messages they don't already have. In that case, this brings up another reason this is superior to including the thing as text: it preserves all the threading and other metadata.
Insert your cursor in their message, press enter and respond to every part worth responding to - inline. (you may want to sign your entry)
Remove all parts of the previous message(s) that do not directly build towards the current conversation.
It should look so organized that the new participant immediately adopts the format. If they fail, do it for them. (preserve the sane part of their signature)
Doing it like this really feels like you are talking with serious people who would never waste your time. That said, you can now ask how their weekend was. That part of the conversation is simply removed later on.
Also, the suggested alternative, MIME attachments, simply doesn't work well in most mail readers. It's barely usable in Gmail and not at all usable on iPhone or Android. (Ironically, it works quite well in Outlook.)
Microsoft Outlook made sense once you (or at least I) realized it was designed to send memos, not email.
For all the other modern readers: http://answers.google.com/answers/threadview/id/386870.html
There is also an accepted convention for ‘writing under erasure’; the text:
Be nice to this fool^H^H^H^Hgentleman, he's visiting from corporate HQ.
Accidental writing under erasure occurs when using the Unix talk program to chat interactively to another user. On a PC-style keyboard most users instinctively press the backspace key to delete mistakes, but this may not achieve the desired effect, and merely displays a ^H symbol. The user typically presses backspace a few times before their brain realises the problem — especially likely if the user is a touch-typist — and since each character is transmitted as soon as it is typed, Freudian slips and other inadvertent admissions are (barring network delays) clearly visible for the other user to see.
Deliberate use of ^H for writing under erasure parallels (and may have been influenced by) the ironic use of ‘slashouts’ in science-fiction fanzines.
> A: Because it messes up the order in which people normally read text.
> Q: Why is top-posting such a bad thing?
In longer threads, top posting makes it cumbersome to understand which points a responder is replying to. Most top posters (a generalization) tend to skip the point-by-point quotes and answers that establish context. Top posting is easy for a lazy writer, and transfers cognitive load to the readers.
Q: Why is top-posting such a bad thing?
A: Because it messes up the order in which people normally read text.
My point is that you almost never read a single message in isolation, so bottom-posting makes you re-read everything you've just read all over again. If you happened to be reading an individual e-mail from the middle of a chain, bottom-posting would be slightly more convenient (and inline-quoting would lose information). Top posting gives you reverse chronological order, prioritizing what is important, and is friendly for discussions with a changing number of recipients.
As soon as you attempt to have more complex conversations, say about subtle technical issues, it falls flat on its face and inline quoting becomes far preferable.
In a way, this is a culture issue similar to the whole maker schedule vs. manager schedule discussion. A lot of management types engage primarily in shallow conversations, and so they may think relying on Outlook is fine, while being totally unaware of the damage it does to the engineering part of the organization (and possibly to themselves).
 In part this is inherent to a manager's role. For example, in many cases a manager's email may simply be the message "yes, dear direct report, you have formal permission to proceed with your plan". There's just inherently less deep thinking in a management role, when compared to engineering roles. Of course, deep thinking should occur in management as well in plenty of places, and the best managers do recognize that. But unless they come from an engineering culture, they're probably not even aware what they're missing with top posting.
Personally, I am a fan of inline-quoting, but only when I know I'm talking 1:1 or with a fixed group of people. If new people are expected to be included down the line - as is frequent with conversations that seek approval or feedback in a company - then I default to top-post to preserve the history of the conversation.
I don't know how a deep engineering conversation looks when written down, because I've yet to see any in a work setting; whenever a problem approaches any interesting complexity, someone can't handle the complexity and calls for a video or IRL meeting.
If there's a culture problem, I think the "maker" group is much smaller than the group of engineers. I personally blame webmail (GMail and others), which by virtue of popularity essentially set the rules for how e-mail is supposed to work, and more importantly than defaulting to top-posting, popular webmail clients don't handle tree structure. Inline-quoting makes sense when your discussion forms a tree, and not a stream of messages.
always bottom posting is nearly as awful as always top posting.
email is for the reader, not the writer. write to your reader’s expectations and to maximize their understanding, not to appease your own sense of what is “correct”
I'm getting old, aren't I.
the lot of us are stuck on vt52s
Edit: it looks like you've posted a lot of good comments but also quite a few that break the site guidelines. Could you please take the spirit of the site—thoughtful and curious conversation—more to heart? We'd be grateful.
> 30. Software with version numbers ending in '.0' are buggy and you should wait until the next release.
It may not be a True Fact, but it's not used that way. It's a reliable heuristic, nothing more.
Dark side: Employers care about your technical interviewing ability and your GPA. Technical interviewing focuses on a subset of skills around certain algorithms and data structures. The courses most relevant to interviewing will spend at most a couple of weeks on the topics you need to know well.
Bright side: There's so much to know in CS that regardless of what courses you take, you'll likely have to (or want to) learn more on the job. So, in college, you want to focus more on learning-to-learn technical material. Also, certain CS courses contain so much outdated/irrelevant info that you can more efficiently learn all you need to know on the job or through your own projects.
What employers do care about is that you worked on something challenging, you're able to overcome adversity to ship something, and you're eager to learn more.
I'm in GIS (geographic information systems). Turns out knowing how to define a camera using a matrix and intersect a ray with a mesh of triangles is pretty useful. And it turns out that if you don't know how to do these things and just sit down and start cowboy coding, the solution you come up with is going to need to be rewritten by the person they hire when you leave.
There are a lot of things I run into in my job where I genuinely don't know what the right algorithm is, compared to the number of times I've said "I recognize this thing", brushed up on the literature, and implemented some variation on it as applicable to my particular use case. I'm kinda curious how much of the stuff I've written has someone come across later and been like, "wtf is this shit, you just use <so and so's (probably Dijkstra's)> algorithm".
Thankfully Python has a rich set of built-in data structures; using a dict (or a hash map in other languages) turns a lot of O(n) problems into O(1) problems. The bisect module turns O(n) problems into O(log n) problems. I'm always keeping my eye out for more of these big-O reducers.
That said, in production software it hardly ever matters these days as long as you're not doing anything exceedingly daft. Fast enough is fast enough.
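Not from the parent comment, just a minimal sketch of both tricks as I understand them (the names and values are made up for illustration, and bisect only helps if the list is already sorted):

    import bisect

    # dict / hash map: O(1) average lookups instead of scanning a list (O(n))
    ages = {"alice": 34, "bob": 27, "carol": 41}
    print("bob" in ages, ages.get("bob"))

    # bisect: O(log n) binary search, but only on an already-sorted list
    scores = sorted([3, 7, 7, 12, 19, 42])
    i = bisect.bisect_left(scores, 12)
    print(i, i < len(scores) and scores[i] == 12)   # 3 True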
Not a single person has ever asked for it.
I also remember the recruiter telling people you should only include your GPA if it was super high, as an extra on your resume. So I would think your interviewing skills matter the most.
If I don't do this, when I come back to the code an hour later I have no idea what it's supposed to do or why I wrote it. I'll also usually add an explanation, like "3 is a bad number of args because..."
> 44. Funny names are funny, you can always change them later.
This one drives me crazy, I hate when people use something like "x" or "habberdasher" instead of something functional like "newspapers"
Many people write comments which are often nothing more than a really obvious transliteration of the code to English "foobar += 3; // Add 3 to the foobar".
These comments add little to nothing and often become worse than useless when they fall out of date and you end up with a cryptic "foobar += 7; // Add 3 to the foobar"
I find it helpful to imagine a codebase guru, someone who not only wrote the thing but has had years of experience maintaining it and teaching it to others... and a good comment writes down what they would say if you asked them about it.
The guru wouldn't say 'adds three to foobar', they'd tell you about how the number used to be the obvious value of two but then the database choked when the boss was trying to store their golf scores in it, because foobar controls the whatzanits and there can be a race that requires it to have room for one extra. The guru would tell you _why_, and that's what a better comment will do.
Students of programming are often just writing what they're told to write. They don't necessarily have a strong grasp on the why, especially not a deep contextual grasp.
using namespace std; // using namespace standard
add $1, $2, $3 #add some stuff.
I think the point that the tweet is trying to make is that if you comment, you should explain why you are doing something instead of saying what you are doing. The latter is already written there, in code.
# authenticate user
Good coding means two things, in general:
* There is no more understandable comment than no comment needed.
The code should explain _what_ it is doing, as closely as possible in words, such that any programmer, including you, can read the code and understand what it is doing.
Bad: sum = a.length + b.length
Good: totalNumberOfGoodProducts = exceptionalProducts.length + acceptableProducts.length
* Comments should be the exception for _why_ something odd is happening.
Bad: if(r == 42) return; // Nothing happened.
Good: if(ioReturnCode == 42) return; // Ignoring code 42, an undocumented value and seems to be a "no operation happened." We'll retry again later in the code.
Think of writing a comment as a failure of your ability to write good code - we all fail from time to time, but we aspire to do better over time.
I’m a senior engineer at one of the big tech companies. For some issues, printing (or writing to your logger) is certainly what you’ll try again and again, and it can absolutely get you a long way, or all the way down to the root cause. It’s not the best strategy (depending on what you’re dealing with), but it can often work.
Where it’s probably less fruitful most of the time is if you’re debugging some kind of distributed system that has just that one edge case which occurs < 1% of the time in production against real traffic, and it’s entirely your own damn fault for trying to reinvent distributed locking or something akin to your own algorithm that “handles” eventual strong consistency.
Some languages invite sufficient complexity that you need to actually step through most code written in them...and they also offer the facilities to do so fairly easily.
So I'd say this really depends on the language.
I wish I could convince my employer of the low truth-value of that statement.
(Instead, I make sure to have a reasonable number of easy large-code-volume tasks sitting in a backlog so that I never fall behind)
Sometimes, we build things that are technically beautiful because we appreciate them, even though they aren't actually the right solutions to the problems at hand.
Simple, small, short.. now that's where I see beauty.
> 54. 'main' takes two arguments, argc and argv.
Is this a myth because you are supposed to think of them as one argument?
Or because of non-standard extension that includes envp?
Or because of some other nitpick?
I feel I have a pretty solid understanding of how processes are started for C and I think the statement is true.
> Sprinkling printf is an efficient way to debug.
I mostly work on threaded code, and print statements are my go to for tracking down weird timing issues. Things behave much more differently when attaching a debugger, and adding good logs makes a ton of sense long term anyway, so yeah, print statements are absolutely a great approach to debugging.
A lot of them are true, but several aren't. I guess that's the thing about lists...
> I mostly work on threaded code, and print statements are my go to for tracking down weird timing issues.
I have found printf to subtly alter timing in threaded programs. Either masking issues or introducing them.
Anecdotally, I remember some programs being reported as working only with a printf present and bombing when removed.
In some embedded envs I've worked in, there isn't an argc/argv/envp at all.
In the Unix envs I've worked in, it also takes envp. I think I've used that 3 times in the roughly 25 years I've mostly written C.
I would think you are generally using _start rather than main in embedded systems. But in embedded systems portability is generally out the window, so it makes sense that some _start implementations call main without arguments.
> In the Unix envs I've worked in, it also takes envp. I think I've used that 3 times in the roughly 25 years I've mostly written C.
I have never tried to use envp and had it fail, but for maximally portable code you should use getenv(), or, if on POSIX, the environ variable.
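For anyone who hasn't seen it, a rough sketch of both forms being compared here; the three-argument main is a common but non-standard extension, so don't count on it everywhere:

    #include <stdio.h>
    #include <stdlib.h>

    /* Non-standard but widely supported on Unix-likes: a third parameter
       holding the environment, a NULL-terminated array of "NAME=value" strings. */
    int main(int argc, char *argv[], char *envp[])
    {
        for (char **e = envp; *e != NULL; e++)
            puts(*e);

        /* The portable way: getenv() is standard C. */
        const char *home = getenv("HOME");
        printf("HOME=%s\n", home ? home : "(unset)");
        return 0;
    }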
I was glad to read that amongst the rest.
This article makes me feel very old.
Pretty much anything I’d hire them for would involve some degree of systems programming and/or graphics. So, having one or more OS and/or graphics classes is significant. Of course, if they’re knowledgeable about those things from learning on their own, that’s great too, but that’s even less common.
Once someone’s no longer fresh out of college, education specifics become a lot less important, and the focus shifts to their career history. But for people without a prior history, there’s little else to go on. For example, if a particular internship or junior position involves graphics, it’ll make a big difference if the interviewee has some sort of introductory foundation of knowledge and won’t be starting from scratch. That could be a critical difference between two otherwise equal candidates.
TL;DR: if you are a student with an interest in a specialty that you might want to pursue professionally, and there’s a course dedicated to that specialty, by all means take that course. Whether it’s graphics, or AI, or operating systems, or whatever. It could definitely help you get that first job or internship in that field. It sounds painfully obvious, but it must not be, based on my personal experiences.
If you say you can use X programming language or Y framework, and you pass an interview you've got the job. It doesn't matter if X and Y were learned in a class or doing hobby projects. At the end of the day it's the same skill.
Most interviews for new grads are algorithms interviews, which require some level of skill from an algorithms class, which at most universities is a third-year class, which is why companies want people with a BS and not an AS.
Ouch. This is my bread and butter.
> 21. 'Privacy' and 'Confidentiality' are synonymous.
> 50. The error message 'No space left on device.' means you are out of disk space.
> 62. Dropbox is a suitable backup solution.
> 69. Zuck is a genius.
> 76. HBO's "Silicon Valley" is satire, not a documentary.
> 77. Jokes about recursion are funny jokes about recursion.
> 94. There is a "real world".
On Linux systems, (and perhaps most Unix-likes? IDK) some fraction of the file system is typically reserved for root, typically to allow core functions to continue and to give some breathing room to recover the device. This is the -m flag to mke2fs, "Specify the percentage of the filesystem blocks reserved for the super-user." which defaults to 5%.
It can also take no arguments. I believe it can also take a third argument, char **envp, which contains the environment variables, but I don't know what the standards say about this. (I believe it's a thing some implementations, particularly Unix-likes, do, but not a required-by-C thing. There's also the entry point in Windows, but that has a completely different name.)
Most of the rest seem like subjective opinion on the state of the world.
"You think that's air you're breathing?" and the recent https://xkcd.com/2221/ ; also https://qntm.org/responsibility ; or at least, that's how I'm choosing to interpret #94.
>On Linux systems, (and perhaps most Unix-likes? IDK) some fraction of the file system is typically reserved for root, typically to allow core functions to continue and to give some breathing room to recover the device. This is the -m flag to mke2fs, "Specify the percentage of the filesystem blocks reserved for the super-user." which defaults to 5%.
I've also had this happen when I ran out of inodes (the structures that store file metadata) - you can run out of the allocated number of inodes when you have millions or billions of files while still having hundreds of 'normal' gigs free. `df -i` will tell you.
It can also mean that you're out of inodes. Or, if you're working with a non-disk-based FS, it can mean that you've run out of some other resource.
Yeah, you can totally trust yourself to not take on a bunch of financial obligations that would effectively preclude the possibility of taking on a PhD. Honest.
Blaming gmail for top posting seems odd. I can't disagree. And I like quoted messages. That said, Exchange and phones seem just as culpable.
This caused a guffaw... because I have a large screen which has a full-screen emacs session... using green on black :-)
I am surprised by this one. What are the pitfalls there?
IPv6 is similar, though a little better consolidated. While there is one range of addresses reserved specifically for organization subnets at your disposal (fc00::/7), a lot of people assume just about anything not currently assigned for the internet (presently 2000::/3) will be safe forever.
> IPv4 addresses may be represented in any notation expressing a 32-bit integer value. They are most often written in dot-decimal notation, which consists of four octets of the address expressed individually in decimal numbers and separated by periods.
> For example, the quad-dotted IP address 192.0.2.235 represents the 32-bit decimal number 3221226219, which in hexadecimal format is 0xC00002EB. This may also be expressed in dotted hex format as 0xC0.0x00.0x02.0xEB, or with octal byte values as 0300.0000.0002.0353.
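Just to illustrate (my own snippet, not part of the quote): a quick Python check of the conversions in that example, using the same 192.0.2.235 address:

    import socket

    packed = socket.inet_aton("192.0.2.235")          # four raw bytes, network order
    as_int = int.from_bytes(packed, "big")

    print(as_int)                                      # 3221226219
    print(f"0x{as_int:08X}")                           # 0xC00002EB
    print(".".join(f"0x{b:02X}" for b in packed))      # 0xC0.0x00.0x02.0xEB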
I think we don't spend enough time on socket level programming and concurrency.
I don't know how many Java devs are still writing blocking IO code..
"Software with version numbers ending in '.0' are buggy and you should wait until the next release."
This is not a falsehood. The rest of the world just hasn't caught on.
Sent through lynx
Not worth mentioning but just in case you wanted to fix it and have missed it.
Oh the pain... my everyday pain...
> 66. Spreadsheets and powerpoint are for business majors.
I've seen a civil engineer push the limits of Excel for complex stress-strain models, a finance comptroller quickly build out elegant reporting models, and an environmental engineer build out a complex watershed model; but sure, spreadsheets and Excel are only for business majors.