Number 2: No One Believes Anyone Can Actually Code
It's surprising how many people interviewing for lead technical roles cannot code, or turn in exceptionally sloppy work. Incompetence is more commonplace than the author believes, even at the highest levels.
Seriously. I've sat on the hiring side. I've seen impressive resumes. I've had reasonable discussions with people. And then I give them a very, very trivial coding exercise (a take-home; they're free to Google, do it on their own computer, in their own IDE, in their professed preferred language), in a time frame that, while constrained, is still plenty...and the result is -terrible-.
I can try and come up with reasons why that might be the case, why a simple OO modeling problem + a couple of trivial algorithmic problems (like, take an ascii string, return me a dict mapping characters to character counts) would trip someone up...but that isn't sufficient justification for me to want to continue to an onsite.
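That character-count exercise really is a few lines; a sketch in Python (function name is my own, not anything from the thread):

```python
def char_counts(s):
    # Map each character of an ASCII string to its occurrence count.
    counts = {}
    for ch in s:
        counts[ch] = counts.get(ch, 0) + 1
    return counts
```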
And that's the people who get past the verbal screen. Plenty drop out at that point. "I see you spent two years as a Senior DBA. Can you tell me what the acronym ACID stands for?" "No, I'm not familiar with that" "Okay. Well, the 'C' stands for consistency. Can you tell me what one means when we talk about data being consistent in the database?" "It means when you write something it stays written (or some other made up twaddle)" "I see."
That said, I agree that coding tests that are completely unrelated to the job, and are geared toward college grads rather than long term developers (i.e., "Implement a (data structure)" or "Remember/discover an algorithm that was a PhD thesis 30 years ago to solve a contrived problem in a theoretically optimal way" rather than "Solve a real problem of a kind similar to what we'd expect you to do here") are dumb.
I once got knocked out of the running by a whiteboard-coding interview question that went something like "How would you find all triples from a list of a million integers, where the first two numbers add up to the third?" I said, "Hmmm that sounds like an O(N^3) problem." Interviewer: "Can you think of any way to do it in smaller big-O?" Me: "Not off the top of my head, no."
For some reason that company insisted on only doing interviews at 7:00 AM, and my brain doesn't come fully online until after 9:00 anyway. I felt annoyed later when the answer came to me that afternoon, too late: put the list in a hashtable, add all pairs of numbers for O(N^2) complexity, and check if each answer occurs in the hashtable.
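A sketch of that afternoon answer in Python (assuming the first two numbers come from distinct list positions; the question as remembered leaves that open):

```python
def find_triples(nums):
    # O(N^2): hash the values, then test the sum of every pair for
    # membership, instead of checking all O(N^3) triples directly.
    seen = set(nums)
    return [(a, b, a + b)
            for i, a in enumerate(nums)
            for b in nums[i + 1:]
            if a + b in seen]
```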
Much later after that, I realized that all the other red flags I'd picked up on while interviewing there added up to an impression that their corporate culture was seriously f'ed up, and that I probably dodged a bullet by getting passed over quickly in the process.
They were probably looking for the answer of, "Lets talk through it and figure out a better answer". Coders are often under a mistaken impression that interviewers care about your answer. They don't. They care about how you approach it, how you think through it, what you do when challenged, etc.
So from their perspective, they asked you to try harder on a problem, and you just said, "No." And they dropped you for it.
FWIW, I had similar questions in some interviews where we hashed through similar problems, and together collaborated down to a "No" answer... but we spent 15 minutes exploring the problem and seeing how well a couple coders can work through it before saying no. And I got the offer despite an otherwise rusty performance.
> Coders are often under a mistaken impression that interviewers care about your answer. They don't. They care about how you approach it, how you think through it, what you do when challenged, etc.
This isn't always true. Some interviews are really about solving the problems as fast as possible. And lots of interviewers are looking for exactly the answer they have on hand, and will think you're doing it wrong if you come up with a different, but equally valid (or better!) solution.
In which case, I don't think that's a place I'd like to work. Most often the solution an engineer comes up with is _not_ ideal and will be improved either because they come up with a better one later before they commit it or because they go through code review and someone else sees a better way. If you don't leave room for people to improve their solutions you'll just end up with the first, crappy one that comes to mind.
> They care about how you approach it, how you think through it, what you do when challenged, etc.
I was asked to sketch the proof for the irrationality of sqrt(2) in a quant programming interview. I kind of froze, and explained that though I had learned it, I could not recall.
He prompted me with "Well, what does it mean to be irrational?", and with that little hint I did the rest. Though ultimately I did not end up working there, it was very satisfying to have answered it.
A friend of mine didn't understand how he could have done better than me in a "write a quick sort" interview (we were still students) when I knew the algo and he never heard of it (and obviously struggled more).
Well, in one case (me), all the information the interviewer had was "this guy happens to know how to implement a quick sort by heart. K.".
In the other case (my friend), he got "this guy is able to understand and implement an algo he never saw before in a reasonable time", which is much more impressive
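For reference, the algorithm in question can be sketched in a few lines (a simple non-in-place version, in Python rather than whatever we wrote as students):

```python
def quicksort(xs):
    # Pick a pivot, partition the rest around it, recurse on each side.
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))
```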
Errr, in the sense of the set of numbers that completes the rationals w.r.t. the standard metric? I guess you have to show it's not rational, so you probably did a proof of upper/lower converging bounds always having nonzero difference for 1/n increments or something? I would definitely not get this in a pressure interview situation.
> We now show that the equation (1) p^2=2 is not satisfied by any rational p. If there were such a p, we could write p=m/n where m and n are integers that are not both even. Let us assume this is done. Then (1) implies (2) m^2=2n^2. This shows m^2 is even. Hence m is even (if m were odd, m^2 would be odd), and so m^2 is divisible by 4. It follows that the right side of (2) is divisible by 4, so that n^2 is even, which implies that n is even.
> The assumption that (1) holds leads to the conclusion that both m and n are even, contrary to our choice of m and n. Hence (1) is impossible for rational p.
Assume sqrt(2) is rational and has the reduced form x/y. Thus we are assuming that x and y are integers and that gcd(x, y) is 1. You square it and get x^2/y^2, which is somehow simultaneously a ratio of coprime integers and also reducible to 2/1...
I think here they just meant, show that no rational number is the square root of 2. Not rational = not expressible as a ratio. You assume that sqrt(2) = p/q for some integers p and q, then derive a contradiction.
> Coders are often under a mistaken impression that interviewers care about your answer. They don't.
Huge asterisk that as long as you arrive to the correct solution with minimal help. I've had plenty of algo/ds interviews and it seems like needing help to see the trick in the question pretty much means you're out.
> Coders are often under a mistaken impression that interviewers care about your answer. They don't. They care about how you approach it, how you think through it, what you do when challenged, etc.
In many of the places that do claim this, even if you solve the problem "correctly" you still get more brownie points than someone who got a less efficient answer or didn't get an answer. Meaning that they /do/ actually care. I do get that some places are smarter with regards to this than others but it's depressingly common in my experience. "You didn't get the O(n) solution that has weird edge cases and was published as a paper in the 70s? Too bad, because some other person did, because they remember the solution from some programming interview book!"
Then I'd argue they need to make that more clear. The typical interview setting does not suggest it's an "exploratory" environment, it usually feels more like taking an exam.
At my last company, we would sit a candidate down at a pairing station. We'd offer them about 5 problems to choose from, and together we'd pair program on the problem for 2 hours. Using the internet, the IDE, anything they want was totally fair game. No system is perfect, but this was by far the best one I've ever experienced.
Had the same thing happen to me. Simple problem: find out if two strings are anagrams of each other.
My immediate solution: Sort the two strings and then compare them:
(defn anagrams? [x y] (= (sort x) (sort y)))
or some such. I was fortunate in that they didn't make me implement the sort (because it's been a long time for me) :)
"Ok, what's the efficiency of that solution?"
"Well, assuming the library sort functions I'm using are sane, I'll take a guess and say O(n log n)"
"Can you come up with a more efficient solution?"
Off the top of my head, on the spot, I couldn't. Later, during the plane ride home, the obvious occurred to me: Don't sort the strings, just scan through each string once and build a hash table, keeping track of the number of occurrences of each letter in each string. Then compare the occurrences for each string. Not as elegant to express in code, but faster.
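That plane-ride solution, sketched in Python for variety (collections.Counter is exactly the "scan once and count occurrences" hash table):

```python
from collections import Counter

def anagrams(x, y):
    # One O(n) pass per string to build character-count tables, then compare.
    return Counter(x) == Counter(y)
```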
As it turns out, they later declined to hire me, giving me an excuse that made it clear they weren't really serious about hiring anyone for the position (as is the case at least half the time, it seems).
And thus ended my latest round of failed interviews. I cancelled the last one I had scheduled (probably another fake) and decided to take a break for a while (I mean, I do have a job right now, so it isn't urgent).
You don't need to sort the strings. Create a vector indexed by ASCII code, incrementing the count for each character of the first string and decrementing it for each character of the second, and keep a running count of characters: if any count goes negative, exit false; otherwise, if the final character count is zero, each string is an anagram of the other. That's (n + m + 128) operations, where n and m are the lengths of the two strings and 128 is the cost of creating the vector.
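A sketch of that single-pass vector approach in Python (assuming plain ASCII input, as the comment does):

```python
def anagrams_ascii(s, t):
    # Counts indexed by ASCII code: increment for the first string,
    # decrement for the second; any negative count means "not an anagram".
    if len(s) != len(t):
        return False
    counts = [0] * 128
    for ch in s:
        counts[ord(ch)] += 1
    for ch in t:
        counts[ord(ch)] -= 1
        if counts[ord(ch)] < 0:
            return False
    return True
```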
On a tangent, one way could be to assign a numeric code to each letter of the alphabet and add up the codes of the characters in each string. If the sums match, they are anagrams.
To get decent anagram lengths and complexity, implement the numbers as a dict of repetitions of primes, and implement the multiplication by summing the repetitions. ;-)
In which case you can just compare the dicts without performing the multiplication (which happens to be the costliest part for arbitrary-precision integers).
You would have to make sure the sum of any combination of all characters was unique. For example, if the code was the character number, a=1, b=2, etc, both "abc" and "bbb" would have the same sum.
So I think something silly like:
character_code = len(string)*len(alphabet)^character_index
should work.
Or you could do top 100 questions on leetcode or hackerrank and you would have solved the question in a minute. It's kinda sad that you could remember the top 100 solutions and clear interviews in almost all big tech companies.
I recently interviewed at a big tech company (phone interview). I spent quite some time practicing on leetcode (completed at least 150 problems). During the interview, it took me a few minutes of thinking before completing the assignments with what I think was the expected solution. We discussed the complexity and a few possible variations. The interviewer sounded satisfied and I really had the feeling that I had nailed these assignments.
I wasn't rejected but I've been asked to retake a phone interview. Apparently, the interviewer had very good feelings about me, but found that I was "out of practice" as far as coding goes.
I wonder if my age is an issue (40+) or if they have a really high level of expectation. Are most other candidates super fast at solving these kinds of problems? I'm a bit disappointed because I'm not sure that I've got much room for improvement at this stage.
I really wonder how age is taken into account. If it were really an issue, they could have rejected my application from the start and not bother with an interview.
Maybe the interviewer was biased and led to think I was "out of practice" because he knew I was on the older side. But I'll give them the benefit of the doubt. I believe it's a competitive position and I simply wasn't in the right percentile to qualify.
Now, would my former self of 20 years ago be more competitive in this kind of interview? Slightly faster perhaps, but I don't feel I'm at a disadvantage compared to younger candidates.
As someone that does lots of coding phone interviews I can say that yes, time is a factor. But it's relative, ie I'm comparing you to the time it took other candidates to solve the same problem. After all, we have to evaluate you, over the phone, in the course of less than an hour. If 2 candidates arrive to the optimal solution, the one that did it much faster is the better candidate.
It sounds to me like you did pretty well so don't give up.
> the one that did it much faster is the better candidate.
From my perspective that is only about 3% of what I expect from a software engineer. Writing readable, easy to maintain code that is well tested, collaborating with product and other developers for how a feature should work, influencing technical designs, giving good feedback are things I value much higher than how quickly someone can write a snippet of code that works.
> If 2 candidates arrive to the optimal solution, the one that did it much faster is the better candidate.
This is _a_ metric but I wouldn't bet too much on it. I know lots of people that come to optimal solutions to "algorithm type" problems _much_ faster than I do. I'm pretty slow at that type of stuff. But... it's just such a small part of what makes someone a good programmer. Building out a medium to large application requires balancing a lot of trade-offs and figuring out how to keep things simple. I know seasoned, quality algorithm solvers who honestly just repeatedly churn out garbage applications. And they can tank the productivity of an entire team of developers in their wake. I don't know how you test for that, but I can promise phone screening for problem solution time isn't it.
It's kinda sad you think professionals should memorize useless tricks that don't generalize to the profession in order to be considered for a job.
The skill takes a long time to practice, and quickly evaporates once you stop doing it. Yet none of our work is anything like that. A more real-world situation is reading documentation or SO for the function concat_ws that will turn an array column into a concatenated string.
However, the system that exists, and one that you seem to think is OK, is one that discriminates against people on many levels. It's a skill you do not get good at by working, so you must practice it in your free time. Now you've just discriminated against men and women with families, people from less fortunate backgrounds, and others who otherwise do not have the time in the day to dedicate to a skill that is only useful in coding interviews.
As for a comment above: I was a DBA for 10 years and can do pretty complicated queries off the top of my head, know how to optimize my indexes, have worked with both structured and unstructured data in the multi-TB range, etc., etc., but I have no recollection of what ACID stands for. I don't really care. Sure, you may hire someone who memorizes useless crap, but then they have no idea why the IO has gone through the roof when inserting IDs out of order on a clustered index.
I didn't take the parent's post the same way. The situation is more that if you do leetcode or hackerrank then you pass the interview. So once again the interview process has failed; instead of identifying good candidates you identify candidates that do hackerrank in their free time.
Thanks for the clarification. I read it in an "it's sad they can't do it, what's wrong with them?" kind of way, but what you said is probably what he/she meant.
Exactly, I as well have been doing technical work for my whole life, and sitting here right now, I could more easily explain to you how to optimize the planner cache, and what hints are required to get a given result than I could explain to you which joins do what.
I just look up joins when I need them, and it's straightforward, but ask me in an interview and I sound like an idiot.
I don't read anything about OP's post as saying they think people should memorize or that this practice is okay, as evidenced by their final sentence that it's sad that you could do so and pass many technical interviews.
I would say that if you know the answers to 100 top coding questions (and understand them, a good interviewer will poke you to determine that) then you are a pretty good coder and you should be hired. Seems working as intended :)
For many tech companies where they get a lot more candidate applications than they have positions for, the goal of the interview process isn't to avoid losing good candidates, it's to not hire bad candidates.
Like I said, a good interviewer ensures you understand the solutions of those problems (the problem might even be formulated in a different form than the exact form you learned) not just typing out text verbatim. If you do understand the solutions of top 100 coding questions, I think you have a good start. Part of solving the problem in an interview will also be being able to code, so it's also testing your coding ability in a given programming language.
True that software engineering is more than that, but those other things largely depend on in-house company processes, tools and policies so you would learn them afterwards.
I'm as critical of how we interview in this industry as anybody, but I've never found this particular criticism compelling or charitable. It's implicit that you know the correct first answer is to seek prior art. Making that explicit is fine but a bit pedantic. The follow-up question is: "ok great, now say that you can't find any satisfying prior art for this on Google, how would you reason through your own solution?".
They aren't looking for people who don't know to start by doing some research, they're looking for people who won't be completely stuck when that research doesn't turn up the answer.
Maybe, but I once had an interview in which I was asked how to find how similar two text strings were (for search). I answered that I would use one of the algorithms from Apache Commons Text or within Lucene, which implement one or several of the appropriate distance algorithms. He told me he had written his own. I asked why he would do that when these algorithms were written by people who did intense research, and the implementations in these libraries have been tested by more eyes than his could ever be. He said "what if I want it to run in Ruby?"... I was the one being interviewed, but this is a guy I would not hire. He was wasting time for his own amusement. Never assume people know to do the research on existing solutions. Many developers would rather work on their own toy solutions while avoiding the actual unsolved problems in their domain.
Sure, your mileage may vary and there are definitely bad apples, but my claim is that the vast majority of interviewers aren't looking for people who like reinventing wheels, but are rather trying to suss out the extent to which you can reason through a problem. So I think that's the charitable assumption to make.
Are you serious? For any algorithmic question, if googling does not turn up a great answer, you would need to be a genius if you could come up with one on the spot.
We had a CS class at UNH, a state university, where we learned how to express the complexity of an algorithm and then how to refactor algorithms to be less complex, when possible. Although we do not do this type of work for most of our jobs, it is not a stretch to think that half of people with a CS degree might have at one point known how to do this on the spot.
My job often entails reasoning through a solution to a specific problem which is dissimilar enough from the general formulation of some problem that there isn't an obvious way to Google for an exact answer. It may take a genius to devise the best algorithm to solve some general problem, but it does not take a genius to reason through a decent solution to a specific problem using general knowledge and experience.
Except they don't just want a solution, they want a solution with certain performance characteristics. And they want it to be the solution with the best performance.
Which means what they're really saying is not "we want problem solvers". What they're really saying is "the bare minimum for entry level work here is being able to, in 30 minutes and on the spot, out-perform top-tier theoretical CS researchers".
Which in turn really boils down to "be a recent CS graduate who memorized this algorithm in advance so as to 'derive' it later on command".
That's reasonable, but at the same time, unless it's a top-secret clearance job, don't threaten me with doing web development on an air-gapped machine. We both know it's absurd, just don't do it.
> ok great, now say that you can't find any satisfying prior art for this on Google, how would you reason through your own solution?
So then I try and give them a O(n^3) solution (or something).
Of course this isn't accepted as there's a better O(n^2) solution, which can always be found by googling, and I fail the interview.
The "tortoise and the hare" problem of finding a loop in a singly linked list in O(n) time and O(1) memory is the perfect example. It's easy if you know the answer (or can google) but basically impossible if you don't.
What did the interviewer do when I said I couldn't do better than a hashmap? He laughed and said "not many can".
I’m not familiar with this one but have been thinking about it for a few minutes and think I have an answer (granted I think it will depend on language and OS whether it would work)
The gist of it is: walk the list and after each step, multiply the previous node’s pointer by negative one.
If the current node’s pointer is negative you’ve found a loop (because you colored it by inverting the pointer in a previous step).
Clean up by starting at the beginning and multiplying all pointers by negative one until you reach the last one you inverted. Then return your answer (true).
If you reach a null pointer there is no loop because you’ve gotten to the end of the list. Clean up by going through again and multiplying all the nodes by negative one. Then return your answer (false).
Your worst case running time is 2n (if there is no loop) which reduces to O(n) and you use O(1) memory because you’re coloring the list by using the already existing pointers in the list.
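For comparison, the textbook answer being alluded to is Floyd's cycle detection (the two-pointer trick), which needs no pointer mutation at all; a Python sketch:

```python
class Node:
    def __init__(self, val, nxt=None):
        self.val, self.next = val, nxt

def has_cycle(head):
    # Floyd's "tortoise and hare": slow advances one node per step, fast two.
    # In a loop, fast eventually laps slow; otherwise fast falls off the end.
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```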
That actually sounds like a very reasonable scenario in which to not do so well.
The problem that I have with those questions is that they always assume it's easy to answer after you've seen the solution. But if you're working through them for the first time, it's a terrible environment. The minute they stop prompting you, you, as the candidate, are on a timeline to finish (and to finish well).
The answer to that item would be: (with scala)
> datasource.sliding(3, 1).filter { case Seq(a, b, c) => a + b == c }
I mean, if you knock it out of the park and know the internals of window, then asking the bar-raiser question is appropriate as a curiosity.
There are other problems as well: those are the ones where they have an expected answer, an expected response, and an expected follow-up. I've seen that with tree problems. That's where they would low-key ask for the recursive version and then get you to say "stack overflow exception".
My personal opinion on what a senior dev interview should be is: open with a fizzbuzz-level 'bozo filter' coding question, then step away from the whiteboard and spend the rest of the interview speaking to each other like normal, functional adults.
I actually implemented fizzbuzz in Scala recently. It's a lot easier to do in a language that has good pattern matching; doing it in C++/Java is pretty annoying.
>For some reason that company insisted on only doing interviews at 7:00 AM, and my brain doesn't come fully online until after 9:00 anyway.
Reminds me of my final round at Amazon, which they always do in Seattle. I woke before 4 AM West Coast time, having flown out the night before from the East, and 12 hours later was still coding on a whiteboard. After writing (to my surprise) a correct merge sort, they asked me to write a program to do basic math with, IIRC, binary numbers. I was like, um. I'm out.
This doesn't change the crux of your story--which is that Senior DBAs who aren't familiar with ACID are probably not good hires--but that said, "consistency" is actually a somewhat ill-defined guarantee [1].
And if you throw in distributed databases, there is yet another understanding of "consistency" (vis-a-vis the CAP theorem), which really means "linearizability".
Sorry, didn't mean to be super-pedantic but consistency as a concept may seem simple to the uninitiated but is easy to get tripped up on, especially for folks who are aware there is a deeper layer of understanding associated with the concept, but can't articulate it right off the bat.
All this is to say filtering people truly is a hard thing. The extremes are easy to tell apart, but people who hover around the average are much less differentiated.
It filters out certain groups of people, but you will also lose out on good hires that don't focus on your given question. ACID has had almost no impact on my career. Evaluating dubious ACID claims from certain DB vendors is the only time I can remember it coming up meaningfully.
Yeah, I would have been okay with pretty much anything close, and would have given massive bonus points if they could give me a couple of relevant definitions. I wasn't looking for textbook, just for a reasonable discussion.
And I have similar anecdotes related to candidates who claimed years of distributed systems architecture experience, and were yet unfamiliar with CAP (and when explained, could not tell me even at a high level whether, in the event of a partition, their system was CP or AP, let alone the specifics of how that actually exhibited itself).
"That guy didn't have 20 years of experience with db2... He had 2 years of experience, 10 times."
One of my mentors said that about an interview he conducted a while back. I found a lot of truth in that line. I run my tech interviews by starting very green and letting the candidate dictate how fast I ramp it up. I've gotten pushback from managers before that you can't start with basic questions, but I've equally gotten positive feedback from candidates (even ones who failed the interview). In the end, programmers want to see an algorithm that allows them to judge an interviewee. I don't believe one exists.
This is a brilliant approach because everyone has more context, meaning answers can be more pointed and exact. When the questions start small and build incrementally, the next topic of conversation becomes much more natural. The candidate knows that they can use a technique because it's already been discussed, and the interviewer has more control over the flow because there are fewer tangential topics.
I don't claim to have solved the problem of interviews, but I do think I and the candidate get more out of it when we leave it a bit more free-form. I'll cover architecture of our system, API design stuff, debugging, testing, security, problems we're currently having, etc. I'm looking for passion so if the candidate starts riffing on any area, I'll let the conversation flow into that area and go as deep as we can. I usually play dumb as we talk, asking the candidate to explain why and ask what does that mean if they throw an acronym or design pattern into the conversation. A plus of that Socratic method is that if we get into an area that I don't know much about, the candidate doesn't know the difference :)
I did DB work forever without ever hearing of ACID. I first heard of it from Slashdot or Hacker News, and it only ever came up not through DB work itself, but through analysis of DB tools.
Once you start working on a particular database, and are working on all of the specific details of that db, how often does ACID come into things, really? It just doesn't.
Hence the followup question about consistency. I purposely gave a useless and very inaccurate answer as an example (and one I am paraphrasing from an actual candidate).
If you legitimately did the work without picking up -any- of the terminology, -and- respond confidently to questions you should know you don't know the answer to (rather than admitting ignorance and asking for follow-up questions, or declaring assumptions, i.e., "Well, I would assume it has to do with the behavior that if a constraint is violated or something similar in a transaction, the transaction will be rolled back instead of committing"), I don't want you. Because it means you both did not have formal training in, AND didn't seek out information or knowledge in the domain you were expected to be -senior level- in.
And while atomicity and durability are just characteristics of a relational DB (and so adminning one doesn't really require you to understand them), isolation and consistency both have definite relevance, because the isolation level is configurable, and that affects the consistency guarantees of the system. I expect someone saying they were a -senior level DBA- to be able to talk intelligently about those behaviors, and yes, to understand the words, because just reading the docs would have introduced the words.
The responding confidently about things you don't know is obviously a huge red flag for anyone regardless of position.
Otherwise, I think we simply disagree in some respect, although I think it's over emphasized due to the topic. I think both perspectives have a level of truth to them, and have their flaws. It really depends on the candidate, and what they really know. Perhaps my particular circumstances are sufficiently odd enough that they aren't useful in a broader context. Who knows.
I’ll be honest, I don’t know a lot about databases, but Atomicity comes up a fair bit in the web apps I write, as does Consistency in terms of designing data models that can’t go wrong, or trying to encode domain structure into data models to use the Consistency guarantees that Postgres gives us. Isolation is pretty important in general, and easy to reason about from a web app perspective, with requests being isolated from each other at the database level; I rely on that all the time. And Durability doesn’t come up a load, but that’s because I haven’t had to do any sort of disaster recovery.
I don’t think it’s essential to know this, but I’d be surprised if a web dev interviewed with us and didn’t know at least the basic version I’ve given above.
I've done tons of disaster recovery, and no one ever called it ACID. It's never come up. The only time the type of discussion specific to the terms of ACID comes up is with other database administrators.
Just my experience, perhaps I live in another world.
> The only time the type of discussion specific to the terms of ACID comes up is with other database administrators
I think this is partially what I'm referring to. I use these terms in conversation with others when designing systems because they are useful in describing very specific properties of databases.
It's possibly OK (very strange but OK) to not know what ACID stands for. It's def. not ok to not know about Atomicity, Consistency, Isolation, Durability and more importantly why you need them if you are a DBA.
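For what it's worth, the "A" is easy to demonstrate concretely; a minimal sketch with Python's sqlite3 (in-memory database, made-up accounts table), where a failure mid-transaction rolls the whole thing back:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('a', 100), ('b', 0)")
conn.commit()

try:
    # Atomicity: using the connection as a context manager commits on
    # success and rolls back on exception, so the debit below never
    # sticks without the matching credit.
    with conn:
        conn.execute("UPDATE accounts SET balance = balance - 100 WHERE name = 'a'")
        raise RuntimeError("crash before crediting 'b'")
except RuntimeError:
    pass

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
```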
It's weird, I've heard plenty of stories about this kind of thing but when I sat on the hiring side and interviewed for intermediate roles (couldn't even afford senior) I didn't come across anybody who was stumped and simply couldn't code.
There were people who were bad at it (possibly some because they were under pressure), but nobody who couldn't do it at all.
I've given coding interviews for 20 years, and I'd estimate 9/10 candidates have been competent whiteboard coders. I don't have any explanation for the extreme discrepancy between my experience, and the "hardly anyone can fizzbuzz" folks. Of course this colors how I treat people. I'm much more inclined to interview seniors conversationally and judge competence by the way they talk about work they have done. I have a hard time imagining people investing the time to become conversationally fluent in a domain, as some kind of long con.
Beyond fizzbuzz, whiteboard coding is rough. Better to put someone in front of an actual computer. Space is constrained, hard to make corrections, some people (such as myself) have crappy handwriting...
It's such a weird hiring market. 1. There are a lot of imposters out there applying for programming jobs who can't program, and 2. There are a lot of very talented programmers out there who are being rejected by overly picky companies. Both can be true, and I'd argue that both are true. I don't know what the solution is. Current interviewing methods don't seem to be solving the problem.
I'd suggest a widely-accepted professional certification could help a lot, like doctors and lawyers have with the medical board exam or the bar exam. Easier said than done, but what we have today, where the candidate pool is overflowing with impostors with great resumes, is not working.
That might solve half of the problem. But then we need certifications for employers - something that says, "Yes, I have a real job opening; I'm not just wasting your time." It needs to have real penalties if an employer violates it, too.
I'm not sure this would work, since employers search not only for general software developers but sometimes also for developers with domain-specific knowledge. It would be very time-consuming to develop a certification for every domain.
Hey, I said it was needed. I never said it would work, or that it was workable. But if you're going to certify workers, so that they don't waste the employer's time with interviews that are going nowhere, well, we need something to do the same in the other direction...
> I'd suggest a widely-accepted professional certification could help a lot, like doctors and lawyers have with the medical board exam or the bar exam.
"i passed the bar exam!" and nothing else doesn't get you a legal job you want.
and your medical boards, if i recall correctly, just mean you're qualified to go be a slave, er, uh, resident, some place for a few years.
Maybe because they are plentiful? I'm not familiar with infosec but if the bar for getting certified is "I can take a class and pay $XX to get this certificate" then of course they're worthless.
One problem with viewing the certification thing as a solution is people can then question - how do you know if you got the person who graduated at the top of their class versus someone who barely graduated? How are lawyers/doctors vetted beyond their certifications?
Make the bar for passing high enough such that even a person who scores the lowest passing score has demonstrated that he/she can at least code. A standard exam doesn't solve all problems with hiring, but it could at least help solve the very first "can this person even code at all?" screen that weeds out the total phonies.
Oftentimes this is the "CS degree from a good school" stick. But we don't have an easy way to check those claims, nor to evaluate what constitutes a 'good school'.
Maybe just luck? I interviewed someone for a senior position last week and that candidate didn't know what an array was. That was probably the worst case I've seen but by no means the first time I've seen someone struggle with the very basics.
I remember interviewing a candidate that didn't even know what a binary tree was (the context was big-O complexity of algorithms). That interview was really bad. It left the candidate in tears (which was not intended; I think it was their realization that they weren't going to get the job), and me annoyed at the waste of my time (it should have been caught in the phone screen instead of on site).
Was it necessary to know off the top of your head, without Googling, what a binary tree is for the position the candidate applied for?
I've been a developer for 12 years and I've never had the use for that. Computer science is a VERY large field and being good at everything is impossible.
Asking the right questions at interviews is crucial for finding the right people.
If a candidate for a programming job knows binary trees but doesn't listen to other people's views, I would say he's worth less to us than a listener who easily learns new concepts but is currently not familiar with binary trees.
BTW, leaving a candidate in tears is not professional recruiting. Please let someone with more people skills accompany you to your interviews; you might leave people with scars that take years to heal.
I came across it when studying computer science 20 years ago, yes. Heard about it after that but never directly needed the knowledge.
I'm sure it's used under the hood in a lot of code I write and have written but so is XOR, manual memory management and a bunch of other lower level implementations that I don't need to spend time on when developing on a higher abstraction level.
Not sure why you would expect all programmers to know about binary trees specifically.
Binary trees are the simplest kind of nontrivial tree, and trees are used extensively in programming. Most computer problems are solved using trees of one kind or another.
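For anyone who's forgotten, a binary search tree really is only a few lines; a minimal Python sketch (my own illustration, not from any particular interview or library):

```python
# Minimal binary search tree: insert and in-order traversal.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    value: int
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def insert(root: Optional[Node], value: int) -> Node:
    """Insert value into the BST rooted at root; O(log n) when balanced."""
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = insert(root.left, value)
    else:
        root.right = insert(root.right, value)
    return root

def inorder(root: Optional[Node]) -> list:
    """In-order traversal of a BST yields its values in sorted order."""
    if root is None:
        return []
    return inorder(root.left) + [root.value] + inorder(root.right)
```

The sorted-output property of the in-order traversal is the whole point of the structure, and it's the kind of thing a big-O conversation would naturally touch on.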
If you've never needed to know it, what is its value for finding a good candidate? A lot of these questions are basically testing whether you did a CS degree. If you're self taught, you might be just as good a coder, but have never had reason to learn what a binary tree is or how to implement quicksort, or whatever similar puzzle.
It's a nice way to filter out people who can code and basically do the job, but didn't have the financial resources to get a CS education, and weren't lucky enough when researching to see that this is a common interview question. Although at least they'll know for next time.
Why on earth did you think that was a good, relevant question to ask? Did you ask about how to implement merge sort? I learned about it in college in the early 90s. Haven't implemented it directly since.
I'll bet, even with this feedback, you'll continue to ask that question, just because.
Most programmers never need to interact with binary trees directly, and I would hazard a guess that if I asked ~10 of my compatriots about big-O notation, maybe 2 of them would understand what I was talking about.
Do you really think that text-book question about ACID determines whether the applicant is a good DBA? I know something about databases but I couldn't remember all the letters.
Hence the follow-up (which I'd ask regardless, as we work our way to more and more hands-on stuff). It's okay if someone has never heard of the acronym before. And an answer of "Uh, I remember the C is consistency...and the D is durability. But otherwise no" is as good, to me, as remembering the entire thing; it shows you've run across it before, that you've at least read up or received some sort of training with DBs.
I mean, look at this thread; would it be better to start with "Here are some example tables, please write the SQL that will return me (etc)" and turn it into a coding exercise? Because I honestly don't care that much about that from anyone espousing a senior level of knowledge.
I understand that's a problem, but if I have an entire github of highly starred and heavily developed projects with reasonable commit histories... don't make me do your damned homework or implement a toy BST. I have better things to do with my personal time than toy problems because you can't be bothered to open my github.
The worst part is that usually these toy problems are justified with "but you can post this to your Github for others to see!"
I'm not going to speak to the homework type of problem (because honestly I think you should be paid for that sort of thing), but I always ask people programming questions in on-site interviews, not strictly because I want them to prove they can program (although that's one useful side-effect), but because I want to observe the candidate's problem-solving process. I can't deduce anything like that from your Github or a homework problem.
And that is entirely fine. I'm all for group white boarding exercises, or pair programming. I think there's a lot of value to be had in making sure the other person can communicate in a technical setting effectively, and explain why they make choices.
Likewise, if someone wants me to walk them through a project I have with explanations of what choices I made and why, I'm happy to do that. But please don't waste my time asking me to implement DFS, or write another twitter API client.
If the GH code is there, works, and is up to coding standards, why do you care about how the sausage was made? I'd look to see if the code is well-designed, organized, original, tested, etc. It's generally easy to see if the author of code knows how to decompose a problem into pieces. So..saying "we're looking for thought-process not the actual code" feels a bit disingenuous tbh.
Now: If you care about being able to collaborate on a tough problem with somebody, work on a problem neither of you has seen before together :)
Alas, I really do care about the thought process. If a person arrives at a great decision for bad reasons, I may not want to hire them. More importantly, though, I'm interested in when candidates pick not-optimal choices for good reasons. So much of software development is about tradeoffs. Seeing and hearing how they tackle those tells me much more about how they'll do than the specific code in question.
As an aside, the reason not to work on novel-to-you problems with job-seekers is that it's very hard to fairly compare candidates after. I strongly prefer pairing on the same problem with all the candidates in a batch. Otherwise it's hard to tell if a bad result is due to a bad problem or an unacceptable candidate.
> Alas, I really do care about the thought process.
Sure - but you're not likely to see the "real" thought process you're hiring for by asking a stranger a riddle you already know the answer to in a high-pressure situation.
Real problem-solving and thinking is done in a huge number of ways that sometimes isn't conducive to strangers, pressure, or whiteboarding.
> I'm interested in when candidates pick not-optimal choices for good reasons
Right - which is why starting with something they've written on GH (which is something they've thought about and are probably passionate about) is a great jumping-off point for such a convo.
> As an aside, the reason not to work on novel-to-you problems with job-seekers is that it's very hard to fairly compare candidates after.
I see your good-intention here, but it's nearly impossible to compare two candidates without all kinds of biases (implicit and explicit) coming into play.
> I strongly prefer pairing on the same problem with all the candidates in a batch. Otherwise it's hard to tell if a bad result is due to a bad problem or an unacceptable candidate.
I think this may be another way of trying to compare candidates to each other (which is perilous).
Good candidates can do well with bad questions, and bad candidates can rarely do better than okay with good questions. There are lots of good/proven questions you haven't solved - ask your colleagues.
If someone has a good GitHub profile I won't ask the candidate to do much or any coding exercises, because it's much better to just discuss the code they already wrote. I'll put it up on the screen and get them to talk me through it. I'll ask about anything that looks unusual, but also about anything that looks particularly elegant. Even if they wrote it a while ago they should still be able to talk about it.
> If the GH code is there, works, and is up to coding standards, why do you care about how the sausage was made?
Perhaps because, like many developers on modern teams which eschew the older role distinction, the incumbent in the position being hired for will need to act in the role of a classic systems analyst, defining specific requirements given a fuzzy business problem, as well as the role of a grunt coder.
Also, the presence, functionality, and quality of code on GH does not establish its provenance.
> position being hired for will need to act in the role of a classic system analyst
Sure but that's a different skill from solving DS/algos, so don't conflate the two. Some people know their DS/algos super well but need to solve them solo, some people need to google around a bit for inspiration or take a walk if they get stuck.
Instead, separate it out. Give a specific "system analyst"-style question that's distinct from coding. A question you don't already know the answer to or that could be taken in a thousand different directions that you've definitely never thought of before.
I want to know how resourceful the candidate is, how quickly they make mental leaps and whether they'll play nicely with others in the team under the context of the work we're trying to get done. If the choices are a) find out first-hand by (gasp) asking for a demonstration of skill while I watch or b) try to decipher these things by shaking a tuning rod at their GH repository--then I guess we'll have to agree to disagree :)
Sure - my point is that asking them to solve a problem on their own that you already know the answer to isn't a realistic environment of "playing nicely" either and is only going to show you how they deal under pressure, not how well they collaborate with somebody who genuinely wants to find the answer and can build off of.
If you need proof they can "actually code" and use DS/algos etc, use their GH if it exists. And if you want to see that they can work on a team to define and deliver a messy problem with other people, do that with them on a messy problem you've never solved before. (It could eventually be a DS/algo problem; my point is it's a group effort and neither of you knows the solution; you're looking explicitly for how the interaction goes, not if "the candidate" got the "right" answer.)
So I'm supposed to spend my time researching your GH instead of just letting you spend 5 minutes proving it? There isn't enough time in a day to research every candidate's code they wrote on their own time (and personally I'd rather not be judged by mine). And, if people lie on their resumes already, what makes me trust their GH? It's like saying here's an essay I wrote, you don't need to talk to me in person, I'll just stay home and you can decide whether you want to hire me.
Yes. It's a lot of work on both sides to interview and be interviewed.
If you don't take a candidate's prior art into consideration, you're wasting everyone's time. You're also eliminating candidates who may excel in ways that aren't solving riddle-problems out-loud in front of new people when their livelihood is on the line.
Start with the GH and if you have doubts fall back to portions of old model.
> So I'm supposed to spend my time researching your GH instead of just letting you spend 5 minutes proving it?
I've never seen a worthwhile programming q take only 5 mins to answer. And the candidate is supposed to waste an hour of her time proving to you what you could see in 5 minutes on GH?
> There isn't enough time in a day to research every candidate's code they wrote on their own time
Yet there is enough time to spend with whiteboarding problems that prove the same things the candidate has already proved on their GH?
> (and personally I'd rather not be judged by mine).
Sure - only use the candidate's GH if they prominently put it on their profile and/or own an "intended to be used/seen" public repo.
> And, if people lie on their resumes already, what makes me trust their GH?
Not a replacement for conversation! It's a way of indicating they can code without solving an arbitrary riddle in a high-pressure situation. If you're not convinced they wrote or understand the code you're looking at, ask them about it or ask them how they might change it slightly.
This is fair, it's just not realistic for a lot of places. In the places where I've worked and interviewed people, this was not my main responsibility. Typically I'd get a resume that morning and have to prioritize looking at it along with whatever else I had going on that day.
On the one hand, I agree entirely. An interview should not make people jump through hoops if they can answer their questions via existing material.
On the other hand, fakers have caught on to the "GitHub is my resume" thing. I have had applicants who have basically fraudulent Github projects, or who have taken group projects on which they did basically nothing and claimed them as their own. So a Github page now requires an expert to evaluate it.
Surely getting them to talk you through the code would solve that problem.
If its is their own code it shouldn't be a problem.
If it isn't their code and they can still talk you through it, even better - they have managed to understand code that someone else wrote which is probably an even more valuable skill.
You realize that your third sentence contradicts your first, right?
And no, having somebody who's an energetic fraud on the team is poisonous. Having somebody who's so good at it that they're hard to catch is worse, not better.
If people are genuinely a fraud they won't be able to talk about the code in a sensible manner.
I didn't write the monstrosity that I am working on just now, but I can explain how it works and I know where to look to fix bugs. I don't know why some crappy design decisions were made. Does that make me a worse programmer than if I had written it myself? It is harder to understand someone else's code than stuff you have written yourself. There are far more jobs working on existing code bases than on brand-new projects.
(My guess is that you have never worked with anyone who has claimed ownership of someone else's code, have you?)
> but if I have an entire github of highly starred and heavily developed projects with reasonable commit histories... don't make me do your do damned homework or implement a toy BST.
Unfortunately, the current common argument from the hiring side is that GitHub profiles are not sufficient proof of skill since code can be copied (not fair) and GitHub projects bias toward people with extra free time (reasonable, but not enough to discount GitHubs completely IMO)
Right. Don't "require" a GH, but use it if it's there!
Yes, some of the code may not be original, but unless the repo is a fork you can pretty easily see if the code is organized in a coherent way that shows the committer knew how to decompose a problem well. If they solved /every/ piece of the problem themselves, it may be a sign of a different problem (and also copy/pasting without citing source is also telling!)
I feel like this is a myth, what do you estimate the figure is? I think under 10%, maybe under 5%. I have significant experience interviewing senior, junior, and mid-range candidates. 99% of my candidates can code, as in iterate over collections, write case statements, and call functions. I've only had one junior candidate who couldn't code at all. Sloppiness is rampant, but sloppy code that gets things done is what my company wants (not me personally).
We set our in-house recruiter up with a coderpad question that screens candidates with a simple question:
"Write a function that counts the number of vowels in a string"
Candidates are allowed to run it multiple times and just have to produce a correct result within 10 minutes. It's not a trick question -- the test case in place makes sure you pay attention to case.
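For reference, a passing answer is only a few lines in Python (this exact implementation is my own sketch, not the company's reference solution); note the lowercasing to handle the case-sensitivity trap:

```python
def count_vowels(s: str) -> int:
    """Count the vowels in s, treating upper- and lowercase alike."""
    return sum(1 for ch in s.lower() if ch in "aeiou")
```

If a candidate with the language of their choice and a runnable editor can't produce something equivalent in ten minutes, that tells you a lot.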
I applied for an analytics position that unexpectedly had me take a Python coding test like this (I know a bit of PHP and Java but no Python) and I was able to google everything I needed to pass the test in the time limit. Apparently I got one of the higher scores too. Got the job. It has not required me to write a single line of Python, lol.
Googling stuff to copy is one of the most important skills for programmers. I was appalled when a middle aged senior programmer at my first internship told me this, thinking he was lazy and unethical... (people are paying you after all!)... how naive I was!
The Count and Contains methods are not on string, they're both extension methods on IEnumerable<T>, which string gets for free by implementing IEnumerable<char>.
That's surprising. I'll have to give that a go later this evening, as I've been thinking about going back to development from a non-coding consultant role.
Took me less than 10 seconds to come up with an approach that would work. Oh, I just thought of a more efficient one, but that would use a regex :-)
What's your process like after that screening measure? Because to me, that's honestly trivial, and a lot nicer than dealing with an extended (8+ hours) homework problem as an introduction to a company's hiring process.
A ~4 hour on-site (or google hangout) technical interview and discussion. The first 30-45 minutes is usually us selling you on the company, followed by 2-3 hours of technical stuff. We do throw a few more coding questions at you (ones that are technically trivial, but made more difficult with interview jitters etc), but go beyond that and ask how would you design an API, how would you think about the technical requirements about business problem etc. After that, Q&A, and more selling you on the company.
That sounds like a nice process. I don't know anything about your company, but I'm thankful that you keep it sane and reasonable for prospective developers.
It's absolutely not a myth. It does depend on how you're getting your candidates, but assuming you cast a net a bit wider than "only people you know personally", then you run into these cases.
Our standard fizzbuzz question was something like: print the multiplication table. This is a loop inside a loop. I think every programmer should be able to do this given 10-20 minutes. Around 30%-40% couldn't. Like, literally didn't know how to do this in languages they supposedly work in. I'm sorry, I make lots of allowances for stress, I calm candidates, I tell them syntax doesn't matter, do it in pseudo-code for all I care. But if after 10 minutes you can't print a multiplication table on the screen, I'm going to pass.
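For concreteness, the loop-inside-a-loop looks like this in Python (any language or pseudo-code would do; this is just one sketch):

```python
def multiplication_table(n: int = 10) -> str:
    """Build an n-by-n multiplication table as text, one row per line."""
    lines = []
    for row in range(1, n + 1):
        # Each cell is row * col, padded to 4 characters so columns align.
        cells = [f"{row * col:4d}" for col in range(1, n + 1)]
        lines.append("".join(cells))
    return "\n".join(lines)

print(multiplication_table(10))
```

That's the entire exercise: two nested loops and some output.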
I feel like sometimes it's out of the applicant's hands if they come by way of a recruiter / head hunter. Sometimes non-junior devs who aren't quite at senior developer status yet have no control over recruiters mistaking their pay at previous jobs, and the pay they could make next time, for the level of job they should be applying for / able to handle.
I've run into this issue in different specific ways earlier in my career, and no matter what I'd say about skill level, lack of knowledge of required technologies, or being willing to take lower pay in order to justify aiming for another role, it didn't matter in the face of the possibility that the recruiter could obtain a larger % for themselves off of that senior / higher-than-my-range job's salary.
Maybe this isn't as applicable for senior dev role applicants as it is for some lower level dev roles but at the point an applicant is actually physically there in front of the interviewers I'm sure one of the last things they are going to want to admit is that they wished they were interviewing for a lower paying / easier job even if it's 110% in the favor of both parties.
I'm a senior engineer and have worked with recruiters many times before, and I've pretty much come to the conclusion that they're a big waste of time unless you want to do the 6-month contracting thing to avoid resume gaps or you just like moving around the country and renting rooms.
I've tried working with them before, have gone on job interviews through them, and then generally found that the jobs that hire through recruiters pay peanuts, and I end up finding a permanent job paying much more and taking that instead. I think if a company only works through 3rd-party recruiters, that means the company has no idea how to hire people, and doesn't want to pay much either (because they're paying the recruiter a huge commission).
It's not in their favor to find you an ideal, direct hire / non-contract gig. There is no chance for future money from a client if they find you your perfect fit / long term career type job.
Not to mention, most of the ones I worked with early in my career not only had no idea about any of the technologies they were checking my fluency in, but they also acted like they DID know everything about programming, and beyond that they tried to flaunt this and treat me like I was trying to overstate my skills to sneak my way into a job out of my pay range.
Such a frustrating ecosystem in order to find a career job.
In ten years I can count on 1 hand how many times I've gotten a reply from a non-recruiter job I applied for (applied on my own without a staffing agency submitting / representing me).
It's really crazy how that works.
Another thing I dealt with was people who hired me saying (TO MY FACE):
In this situation I'm making $35.00/hr as a Regular Developer (non-junior / non-senior dev, just a middle of the road contract developer):
CEO of company in front of everyone: "scoggs we are paying way more for you than we are for our senior developers so we are expecting top notch work from you."
Me (used to it by now, unfazed, but I hate this situation): "Sir, with all respect, I'm only getting less than half of what you pay my staffing agency for my contract."
CEO: "well they don't really do anything / didn't really do anything but introduce us and get you a phone and in person interview with us."
Me: "But you guys didn't have to pull programmers, HR, and design / copy-writing people off of their normal work to create and submit interview material, review resumes, vet candidates, plan and schedule phone interviews, set aside time for phone interviews, review phone interview candidates with the hiring team, check candidates' references, plan and schedule in-person interviews, conduct in-person interviews and meet with the hiring team to pick whom to hire, make offers to the people you want, or hire them. All you had to do was tell the staffing agency what skills you were looking for and what kind of company you were. I'm not an employee of your company. I don't have insurance, I don't get paid for sick days, I don't get paid for holidays, and I have no job security."
Of course I didn't say all of this, I said something along the lines of "They take care of the majority of the process" but in my head I feel like I'm always having to live up to the expectations of the people paying the total bill. The same way they think the staffing agency doesn't do any real work and isn't worth the cost / price of doing business -- most of these people feel the same way about developers. They don't understand what we do, they just expect us to solve anything and everything to do with computers / programming / technology and this is ALWAYS the way it is regardless if the final product / project outcome is directly tied to the company suddenly turning a profit based on the success of this project / programming endeavor.
These companies want to underpay contract developers, pile stress on their shoulders, guilt them for the situation they have little to no control over, and then stick them with the majority of the blame if things don't go 200% well (because they are expecting work that's 2x as good as the amount of money I'm being paid).
I've found it impossible to truly and personally (mentally) live up to the majority of expectations laid upon my shoulders by non-technical CEO's / Presidents / Bosses of companies I've worked for.
This is the majority of what I deal with right outside of NYC in Northern New Jersey. There is the rare company that truly respects the programmers they hire (usually companies where programmers have become CEOs / Presidents / people in powerful positions within the company who wield influence).
I love the money I make in this market but I truly hate the way everything makes me feel even when the situation is like this yet I am able to deliver above and beyond expectations. I hate working for people and in situations where it feels like I'm being told I'm robbing the company when meanwhile the recruiter is robbing us both yet they have us pitted against one another -- and beyond that the staffing agency has a 2 year lock on me being able to accept a job from that company without them "buying out" my contract.
The cost of buying out a contract? A university wanted to do that once early in my career (I wish I was still working there frankly, 10 years later), but the staffing agency wanted 2x the amount the university was paying the staffing agency! I was making $60,000/yr at that point so on top of the University paying me somewhere in the realm of $60,000/yr they would have also had to pay the staffing agency ~$240,000 just for the right to hire me. That means they would have had to shell out $300,000 total including my salary and the buyout.
Nothing about that seems right and / or legal. Under no circumstances would I ever be worth that much to the staffing agency and honestly with the amount of work they actually do I feel like it's damn near criminal.
No State University that pays employees with citizen's tax money is ever going to be able to justify spending that type of money on 1 employee. It's just a fantasy of epic proportions.
I lived in northern NJ for a couple of years. You need to get out of that place; it's a terrible place for software engineers. The cost of living is ridiculous but the pay for engineers is mediocre at best. Move to Silicon Valley: the cost of living is only slightly more, but the pay is far higher, and the weather's a lot better too. And $35/hr 1099 is ridiculously low in that area, those are poverty wages.
Anyway, you make it sound like you can't get a job there without a recruiter. There's still plenty of companies hiring directly, I even interviewed at a few when I was there such as Alcatel at Bell Labs and some wifi company in Manhattan. My current gig is with a large company (in the DC area) that hired me directly. Stop wasting time with recruiters and just look for companies that do their own recruiting; there's no shortage out there that I've seen. I've had 8 jobs now that were not contracts and not through recruiters: 3 large corporations, 3 small companies, 1 mid-size, and one a state university research division. Am I apparently special that I'm able to find these jobs?
I think it depends on how you're filtering the people who get to your interviews. If it's purely based on a resume screen, you can get a lot of people who can't code.
Personally, my first-line screen is a short answer (3-5 sentences) to a programming-related question. [1] For me, about 90% of applicants who pass that can also code. I like that better than a resume screen, as plenty of people who haven't officially been programmers are decent coders.
> 99% of my candidates can code, as in iterate over collections, write case statements, and call functions.
Honestly, that's way too trivial to call "can code". I usually ask something less trivial, yet still extremely easy like "a function to check if a string is a palindrome" (I explain what a palindrome is) or "check if a substring exists in a given string".
You wouldn't believe how many people "with 10 years experience" just lose all motor function in their hands and mouth, even without any time pressure.
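For reference, both of those questions are a few lines in Python (one possible answer among many, not the only acceptable one):

```python
def is_palindrome(s: str) -> bool:
    """True if s reads the same forwards and backwards."""
    return s == s[::-1]

def contains(haystack: str, needle: str) -> bool:
    """True if needle occurs in haystack (what Python's `in` already does,
    spelled out as the naive sliding-window check)."""
    return any(haystack[i:i + len(needle)] == needle
               for i in range(len(haystack) - len(needle) + 1))
```

Anyone with real experience should be able to produce something equivalent, even if their version iterates with an explicit loop instead.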
Problem is: It's hard to judge the difference between someone who doesn't know what they are doing and someone who does but loses it under the extreme pressure of an interview wherein their entire future with the company is being determined by their performance on a 30 minute exercise.
Throughout my career, I've been in professional situations where I was under a lot of pressure: Having to explain failures to executives, near-impossibly tight deadlines, production outages. At least I haven't had to testify in front of Congress yet. Absolutely NOTHING I've encountered in my professional day-to-day has the extreme level of pressure of a typical silicon valley job interview.
I understand and truly sympathize with the stress the candidate is under, but someone who claims to be a "senior developer with 10 years' experience" should be able to write that palindrome function in 30 whole minutes under any kind of stress.
Because if I hire that person, they'll be in charge of critical production systems and will face much harder problems in much more stressful situations and if they can't handle the palindrome question, I have very little confidence they can handle the actual job.
People are strange. I can totally picture someone who is an experienced senior, a hero under the pressure of a critical production outage, yet whose brain goes into a crash loop when sitting across from someone who has the power to deny them their dream job.
Not claiming to be that hero, but I have personally had those moments where I came out of the interview, and as the disorienting fog of pressure lifted during my drive home I said to myself, "WTF happened to my brain in there? I know that stuff cold!"
EDIT: I think I read this one from a commenter here, but it's so true: We're interviewing people to be music composers, but judging them by their ability to be performance artists.
Fair enough. I certainly don't claim to know the best interview method ever; simply that, in my experience, starting with trivial coding questions has a higher benefit/risk ratio than other approaches.
I don't agree. They should be able to write that as a solution to resolve an issue. It should be very well understood why they're writing it. Additionally, if you want to keep going under that justification: you should also accept quick alternatives that aren't the naive approach.
While I agree in the abstract... honestly, it's not that hard to write at least pseudocode for problems like that, regardless of the "stress" of an interview.
Is it reasonable to ask things like that? Maybe, maybe not.
But as stressful as an interview is, it's still a basic problem. I can describe it to you in English in 30 seconds; coding it (in C/C++) would take me another 5-8 minutes and would probably be pretty damn inefficient, but it's still a simple problem.
It's way more expensive to hire a bad fit than to pass on a good fit. Also, I think I'd prefer the person who doesn't crack under pressure. Flopping an interview is more likely an indicator of lack of preparation than nerves, although nerves are such an easy scapegoat.
Not so sure, given the amount of sloppy code I have been maintaining for the last couple of jobs. The people appear able to do "clever" stuff, but are unable to keep things simple or follow best practices.
How often do you need to write palindrome functions for your job?
Checking for a substring in a string is a one-liner in languages like Python.
I was at a final interview with a FAANG. One question asked was to load a CSV. I used pandas; it's a one-liner. You could see the wretched face of the interviewer, since he was expecting a
with open() as f:
do_some_shit
and kept pushing me to write this on the board. I looked at him and said "Why? Why write your own method in a vanilla language? Give me a good reason."
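For context, the two answers in play look something like this. This is a sketch with hypothetical sample data; the pandas call is shown in a comment since it's a third-party import, and the stdlib `csv` module stands in for a fully hand-rolled loop:

```python
import csv
import io

# Hypothetical stand-in for the interview's CSV file.
data = "name,score\nalice,10\nbob,7\n"

# The pandas one-liner would be:  df = pd.read_csv("file.csv")
# The "vanilla" version the interviewer wanted, via the stdlib csv module:
with io.StringIO(data) as f:
    rows = list(csv.DictReader(f))

print(rows)  # [{'name': 'alice', 'score': '10'}, {'name': 'bob', 'score': '7'}]
```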
So what is the interviewer supposed to do to find out if you can code? They want to give you a simple, straightforward task to see how you code. Of course, every simple, straightforward task is going to have an existing tool to do the job, but they aren't looking for a solution to parsing CSV, they are looking to see you code.
They can't ask you a complicated problem, because they don't have the time (nor do you) to solve a real problem that requires coding.
You are totally missing the point of what they are trying to ask you if you insist on using pandas for parsing CSV. Understanding the point of what you are being asked to do is critical to being a professional software developer.
I think that is the point. If your business relies on importing CSVs that cannot be imported with pandas, then explain that, explain the edge cases where pandas has failed, and I will understand. If your business relies on having the best fucking palindrome product, then that becomes super relevant.
They can definitely ask a complicated problem. On-sites are 4 hours long these days.
You sound like somebody who would derail all of the technical discussions and pat himself on the back for it because he was "technically correct" while still missing the point altogether.
I hear the opposite. They are stressing a technical opinion here, but for the sake of rationality and efficiency. These are usually not the same people that are overly concerned with "correctness" when communicating with others.
But it is only 'rational and efficient' if you are being obtuse... they aren't asking you to code a CSV parser because they actually need a CSV parser, they are asking because they want to see you code. The 'rational' thing to do is satisfy THAT requirement, which is the real one, not try to bypass the purpose by insisting on using a CSV library.
The ability to understand the underlying need behind a request is important to being a good employee. If a candidate fails completely to understand the purpose of a question during an interview, and in fact continues to argue against you as you explain it, they aren't a good candidate at all.
> The ability to understand the underlying need behind a request is important to being a good employee. If a candidate fails completely to understand the purpose of a question during an interview, and in fact continues to argue against you as you explain it, they aren't a good candidate at all.
I agree with everything you've said, but that still makes it a bad and equally obtuse question. If they don't actually need a CSV parser, they don't need to know that you can parse CSVs, either (if you can - which is the other point - you might be a pretty good programmer if you can quickly parse a CSV, but there are tons of excellent programmers who can't).
I recently couldn't use a built-in CSV parser and had to roll my own, because one of our clients couldn't be arsed to send us consistently formatted CSV files, so we had to handle a bunch of edge cases to still parse the damn things correctly: sometimes pipe-delimited, other times comma-delimited; sometimes fields surrounded by quotes, other times no quotes except one column (which has "LASTNAME, FIRSTNAME" in it); yet other times quotes in the data itself; fields usually have no commas, but sometimes one field will have one in the middle; typos in header names; not including the end-of-file checksum that others include; etc. With the built-in CSV reader, you had to specify whether fields had surrounding quotes for the entire document; it didn't detect it on its own, for example.
And that was when I found out that parsing something as stupidly simple as a CSV file can actually be pretty complicated.
I'll argue that a one-liner IS code. That's what I put into production systems.
I understood the "purpose" of the question and I self-selected myself out of a role I would've been bored, micromanaged at, and probably not challenged at.
The question in the interview isn't about the business, it is about showing your ability to design an algorithm. It is an exercise, not a purpose-driven task.
Let's be honest: most commercial programming jobs don't. And the person with a thorough working knowledge of the standard libraries is in a better position to do it anyway.
They want to give you a simple, straightforward task to see how you code.
Overgeneralized at best, LOLworthy at worst. Does this include palindrome questions for Ruby that require the use of linked lists? Because I had that from a CTO of a couple-hundred person company not two months ago.
> How often do you need to write palindrome functions for your job?
That's like answering "how often do you need math?" to the question "what is 2+3?". It is an utterly, completely, stunningly trivial question that even a CS101 student on their first semester should be able to answer.
> used pandas, it's a 1 liner. You could see the wretched face of the interviewer since he was expecting a with open() as f: do_some_shit
I would be perfectly fine with the pandas answer, but my point is "CSV parsing" wouldn't be my first question. I'd start with something much easier, and if you managed that, I'd gladly accept pandas as a valid answer and then ask you to elaborate further (what does this pandas function do, how would you implement it yourself, etc.).
I don't consider palindromes equivalent to math. Unless you mean how to index lists? That would be a valid question, since there are different ways in Python.
Your refusal exposed you as someone with a hubris that's hard to work with. Who doubles down and refuses to budge.
That you couldn't yield on such a trivial, manufactured matter would make me wonder how you respond in a team environment on real matters in the face of adversity.
CSV is really fucking hard to parse. There's tons of edge cases. "I'd just use a well-known, well-tested library" is a very valid answer.
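One of those edge cases, sketched: a naive split breaks on quoted fields containing commas, which a quote-aware parser handles.

```python
import csv
import io

line = 'id,"Doe, John",42\n'  # quoted field with an embedded comma

naive = line.strip().split(",")               # wrong: splits inside the quotes
proper = next(csv.reader(io.StringIO(line)))  # quote-aware parsing

print(len(naive), len(proper))  # 4 3
print(proper)                   # ['id', 'Doe, John', '42']
```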
One of my go-to questions is "sort this array", and if the candidate types `Arrays.sort(input)` they get bonus points, because it shows they have useful knowledge of the language they'll be writing in.
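In Python terms, the analogous library-first answer would be (a trivial sketch):

```python
data = [3, 1, 2]

# Library-first answer: sorted() returns a new sorted list,
# while data.sort() sorts in place. Both are O(n log n).
print(sorted(data))  # [1, 2, 3]
```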
Here's how their scenario would actually play out:
> I took my sunglasses off, looked him dead in the eye,
> and said "Why? Why write your own method in
> a vanilla language? Give me a good reason."
Interviewer: "To see how you might approach it."
It's a synthetic interview question. Why does it suddenly need to be a production-ready CSV parser?
We don't have enough information to say whether these are bad interview questions, and it's really not the point.
Maybe the interviewer just wanted to see if you'd use good fd hygiene (`with open('file.txt') as f:`) and that you can stub out some toy parser code and speak of it intelligently. And that you wouldn't throw a fit when asked to write some code that a library like `npm install fizzbuzz` can already solve.
That's a bit unfair. I agree that Arrays.sort is the best answer, but array sorting is such a classic interview puzzle that most candidates will assume that's not what you want. So instead of the bonus answer, they'll slog away trying to remember the algorithms they learnt at uni years ago (or on Codecademy last month).
When I see someone doing something wrong or poorly I speak up and I speak out. Loading csv using vanilla python is in that boat. Present a good reason why it should be done in vanilla. Otherwise, I won't maintain your code as-is and will refactor it.
At least you were given a programming problem. I would have played along a little bit and at least discussed how the layers work. Instead of pandas, use the csv module so he could see a loop iterating over header and data rows as tuples. If he pushed further, outline how you might code the csv reader yourself if it didn't exist. Move the discussion into the difficulties of correctly handling a file format like csv and real-world issues like malformed inputs, validation, and error-handling.
In my case about a decade ago, I became obstinate when given one of those silly math-trivia-puzzle problems like how many ping pong balls will fill an elevator or how long is a string. This was for a relatively senior track FAANG interview, where my background at the time was in HPC, distributed systems, and provisioning systems that were precursors to today's IaaS bread and butter. In my case, the interviewer was adamant that I derive some figures that depended on basic geometry and knowing the diameter of the earth and ratios of landmass to ocean surface. But, he wanted me to first SWAG these figures, after I told him I wasn't a geospatial geek and didn't memorize such facts.
I decided to be difficult. If I were somehow faced with such a task, I would first consult reference materials and even see if I could find the actual answer he wanted, since it sounded only one step removed from what you could find in the CIA World Fact Book or similar. Only if that failed, would I dig up source facts and try to derive an answer. I would focus on getting the answer, not on entertaining myself with a Martin Gardner puzzle at my employer's expense and risk. I would never have to make seat of the pants estimates and act on them rather than doing due diligence. I'd either have done homework, or if there was really no time (like some system availability crisis, if we pretend I was an ops person), I'd choose a pre-planned contingency action to make time.
They didn't go forward with an offer, and I felt a bit of relief at that.
I got this CSV question at some nameless large company in the bay area.
I asked "Do you want me to do the simple thing, and use a library, or write code to do this?"
They preferred the latter. So I did.
As happy as it might make you to be snarky about the often silly questions you are asked, people are, if they are smart about it, looking to see your thought processes.
I recently got a question that I didn't know how to answer, so I started thinking through it aloud. I think they liked it, because they engaged with me during the process. Redirected my thought processes.
Again, good companies will do that. Look for every question, no matter how silly, as a way to show how you deal with situations, even ones you may not like. These are the moments for you to shine.
And after you get home, tell your SO/friends/etc. how wacky they are.
That seems like a fairly arrogant response on your part; it's obvious that an interview is meant to gauge your technical capabilities and knowledge. Even if something is trivialized, the ability to solve certain types of problems can translate to other problem domains where you have to use the same core skills. Parsing a file does not seem like an otherworldly type of skill (it could even be an unstructured or loosely structured text file, for example).
I think to some degree, a suspension of disbelief is reasonable for an interview, and it's not worthwhile to get tunnel vision on a particular problem asked. It's ok to point out that something is trivial with a particular widely used library or whatever, but if it is asked to implement something via first principles or whatever, it is a perfectly reasonable expectation to just do it for interview purposes as a simple mental exercise.
I respect your view but disagree. I use pandas everyday for my job. I use spark. I use keras. I use scikit. I use all these APIs everyday for my job. I can't think of the last time I loaded a file using vanilla python. If it's a requirement for the job (which it wasn't, they showed me pandas code later) then I completely understand and would not be qualified nor would I want the job.
There are a few things going on here. First, there's frustration from the interviewer, who is obviously looking for the cookie-cutter answer to move things along. Second, there's a culture misalignment, because they are looking for cookie cutters to do the job, and I am not that. Third, they, along with OP, are obviously not a palindrome company nor a load-everything-into-lists company, yet this is what they're testing. So again, culture misalignment.
At the end of the day, you are right: I selected myself out because I would not be happy at a place like that, one looking for cookie-cutter employees.
Why write your own method? Because it is sometimes better to write 10 lines of your own code rather than pull an entire library for some trivial task.
Creating a dependency on a library is not benign. Even if the library is easily available and mature, that doesn't mean everyone has it, or that it will never change. Remember how "left-pad" broke npm?
Having that smug attitude towards the recruiter, assuming he is an experienced developer himself, should raise a red flag. Yes, sometimes projects have constraints (like "no pandas") that may seem stupid, and sometimes they are. But it is important to first understand the context.
So instead of asking "give me a good reason?" to the recruiter, give the reasons yourself. Invent a scenario where you actually need to rewrite that csv parser, bonus points if it is related to the company's business, and finally, write the damn code the recruiter wants you to write.
I don't agree with this reasoning. Perhaps the details are important. The task required loading, filtering, and aggregating a CSV. Sure, if all you do is "load" a CSV, perhaps you don't need to import a huge library. But if you want to do anything with that data, I'm sure the library becomes very useful :)
Ah. Did I not mention that these CSV files I wanted you to parse... they're generated by an old mainframe system whose source code we can't modify. It does have a few idiosyncrasies: for fields like addresses, which can contain commas, it uses a special escape sequence where the field consists of just three asterisks, and the value for that field is the content of the next line. Oh, and it uses bare CR characters for line breaks. In EBCDIC.
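Just for fun, a sketch of what a parser for that invented format might look like. Everything here follows the (made-up) quirks described above, and cp500 is an assumption; it's one common EBCDIC variant shipped with Python's codecs:

```python
def parse_mainframe_csv(raw: bytes) -> list[list[str]]:
    """Parse the hypothetical mainframe format: EBCDIC bytes (cp500 assumed),
    CR-only line breaks, and a field of '***' meaning the real value
    is the entire next line."""
    lines = raw.decode("cp500").split("\r")
    rows, i = [], 0
    while i < len(lines):
        fields = lines[i].split(",")
        for j, field in enumerate(fields):
            if field == "***":  # escape sequence: value lives on the next line
                i += 1
                fields[j] = lines[i]
        rows.append(fields)
        i += 1
    return rows

sample = "id,***\rDoe, John\r42,ok".encode("cp500")
print(parse_mainframe_csv(sample))  # [['id', 'Doe, John'], ['42', 'ok']]
```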
So presumably the mainframe has code that can parse these non-standard files? Just take that code and build, on the mainframe, a tool that converts their non-standard CSV to something more usable.
And the post-interview notes said bad performance on the isPalindrome() question, since I didn't write out my own functions (the interviewer didn't mention that I should).
Personally, if you gave me this answer I would accept it, and then ask you how those functions work and how you would implement them. Still trivial, but it would tell me whether you are familiar with the basics or just memorized some stdlib functions.
To be fair, if you wanted a performant implementation, this is probably the more expensive one. The expected response is generally to iterate over the string and build a count map of how many times each character appears, then check that at most one value in the count map is odd.
The interviewer probably should have probed you about performance, though, if their intention was to judge you on that (both are O(n) time complexity, but split, reverse, and join are generally more expensive).
Err I always thought that the expected performant response is to have two indexes 0 and length-1 and bring them to the middle checking that the values are equal :)
I'm very confused by this answer. Have I missed a joke? Wouldn't the expected response be to iterate over half the string comparing the characters to the other half (in reverse order)? The criteria you provide seems to suggest aabbc is a palindrome.
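For what it's worth, the two checks under discussion answer different questions, which may be the source of the confusion. A sketch of both:

```python
from collections import Counter

def is_palindrome(s: str) -> bool:
    # Two-pointer check: is the string itself a palindrome?
    i, j = 0, len(s) - 1
    while i < j:
        if s[i] != s[j]:
            return False
        i, j = i + 1, j - 1
    return True

def can_permute_to_palindrome(s: str) -> bool:
    # Count-map check: can SOME rearrangement of s be a palindrome?
    # At most one character may appear an odd number of times.
    return sum(c % 2 for c in Counter(s).values()) <= 1

print(is_palindrome("aabbc"))              # False
print(can_permute_to_palindrome("aabbc"))  # True (e.g. "abcba")
```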
The function to check for a palindrome will be composed of those building blocks; that's my meaning. Using those basic skills and composing them to solve a problem like a palindrome check is coding in the small to me.
Coding in the large is project management and could be tested by a take home, but I'd rather tease it out by asking in person questions about organization, prioritization, and technology choice.
Maybe it's just me, but needing an entire IDE or a debugger to write a function that checks if a string is the same backwards seems entirely too much to me. It's a pen&paper question really.
It's not just you. I use Visual Studio when I'm occasionally looking at C#, IntelliJ on the odd occasion that I'm doing Java, and otherwise it's emacs and I can't remember the last time I used a debugger for Go, Elixir, C, C++, JS, Python, Ruby, etc. Basically, unless the language's standard library is massive and over-abstracted, no IDE is necessary. There's no way I'll ever remember org.apache.some.deep.package.HttpClientBuilderFactory and its 6 constructor arguments, but "import requests; requests.get(...)" is pretty easy to write without an IDE.
Well, this may be a location- and tech-dependent problem, but hiring specifically for .NET positions in Jacksonville, FL, I would say maybe 10% of the hundreds of people I've interviewed can code. Recent graduates, 10 years of experience, everything in between. It makes no difference.
It's not a myth. I see it when recruiting for my teams. My sample size is too small to give a proportion, but it's enough that not having tests would be a total waste of my and their time since I'd have to fire them on day 1.
I don't mean this the way it sounds, but if you can't tell if someone is an imposter by talking shop with them in your chosen profession, you probably shouldn't be doing the interviewing.
You're sometimes right, but sometimes really wrong.
I had a candidate for a 3d graphics job, who showed me incredibly impressive things he'd programmed like very realistic water simulations, etc. He talked us through them, was incredibly articulate and clearly knew his stuff.
He completely bombed the coding exercise. It was something like: given 2 hours alone at a computer, with a threejs scene already setup, make it display a few cubes at different heights, colors, etc. Full access to internet, google, etc.
He couldn't get one cube to show up. That's maybe a two line piece of code when given the entire environment already set up, and you can find it by googling Threejs tutorial and copying like the first example.
Your interpretation does theoretically fit with the facts. But so do a few other (in my mind more plausible) stories, like:
1. He knows how to do extremely specialized things in an existing environment, but can't do things outside of that environment.
2. He was presenting work that he had only a partial hand in creating.
3. He is amazing at e.g. C++, but cannot learn even rudimentary new things in Javascript.
...
Some of these stories make him a terrible programmer. Some of them make him an OK programmer for other positions, but weren't relevant for us. None of the stories make him a particularly great programmer.
"Perhaps he really was a very talented developer but your coding exercise was too stressful for him"
Yes, it's possible. A lot of things are possible. But I'm sorry - if someone can't copy-paste a 3 line program from the internet and get it to work, in their own field, for 2 hours, then... I don't know how I could ever design a test on which they won't fail. Including the actual job.
>You're sometimes right, but sometimes really wrong.
Ya.
As far as graphics, the only time I was ever fooled was by a front-end guy in 2005. He brought really nice-looking color printouts of his front ends; beautiful stuff. I was really wowed by it, so much so that I didn't ask him basic programming problems or follow-up questions about his prior work, and that was my fault. I allowed myself to be fooled.
His biggest issue was he was lazy. He had grown accustomed to government contracts where you do a lot of prototyping, but nothing ever went live (at least the contracts he had). We had a very aggressive 3 year schedule to build a sophisticated product and he just couldn't or wouldn't keep up. I had to double my workload and he ended up only doing about 5% when he should have done a third and he required a lot of prodding and hand holding. We still managed to ship on time though, but it was really hard on the other 2 people on the team for 3 years.
I think unless you're prideful enough to believe you can identify someone's level of productivity from a short conversation, it's probably better to see the proof. Again, why would I settle for a conversation when I can ask for first-hand proof? It's like a judge dismissing concrete evidence and just basing their verdict on the circumstantial. Sure, it's likely to be right most of the time, but what about when it isn't?
>I think unless you're prideful enough to believe you can identify someone's level of productivity
None of the techniques discussed in this article are even attempting to judge productivity, they are attempts to judge coding ability. It's good that you bring up productivity though; that's what we are really after, isn't it?
>from a short conversation
I'd argue 30 minutes to an hour is not a short conversation. Do you think an imposter could fool you about tech (assuming you are an active programmer) for 30 minutes to an hour?
>it's probably better to see the proof
You're not proving anything except the fact that the candidate has memorized a merge sort algorithm (or whatever trivia you are testing for). What you really want to know is, can they ship?
I think you should at least consider the idea that these tests and homework assignments (past fizz buzz) are not only a poor indicator of ability, they actively repel any candidate good enough to be in even reasonable demand. I mean, you have a lot of people, not only on this thread but on many other threads, saying the same thing.
Time for a little reflection: what does your company offer that even a mediocre developer can't get in 20 other places in your city? I mean, if I could get "market rates," a "fine cubicle," and "standard PTO" in 20 places, tell me again why I would bother with this rigamarole?
I have no idea how having someone come on site for 6+ hours, wasting a dozen people's time, became the norm, as opposed to talking to someone for 1 or 2 hours and truly speaking with them to understand what they know.
You're completely right: if you can't talk shop and tell when someone is bullshitting you, you shouldn't be the one interviewing.
How would this even fly in other industries? Are you telling me that someone can bluff their way discussing the intricacies of surgery to other surgeons without looking like a moron? Or discussing particle physics with a researcher and stand on equal footing? I honestly want to hear a conversation where a liar with no programming skills is able to convince a Senior level Software Engineer that they are on equal skill sets.
The best interview experience I had, where I eventually worked for the company, was having an hour conversation with the Engineering Manager who I would be working under then being ask basic programming Qs based on our conversation we had. Everyone in the company was interviewed the same way. IMO the resulting team was very professional and skilled on various levels from various backgrounds. I actually felt like a human at that job and not a person hired because of x thing.
I'll never forget an interview where they asked me a question about finding anagrams. At that moment in my life, I had never heard of an anagram or known what one was. The interviewer then explaining it to me made me feel very humiliated. I completely bombed every question because I had the audacity not to know about this thing the interviewers knew, and therefore should have known as well. I immediately understood how cultural biases can affect candidates getting jobs. I can only imagine how much worse it is for people not from traditional backgrounds in this industry.
IMO if you're testing for algorithm memorization you're doing it wrong. I prefer to give someone a really simple exercise and then challenge them with added complexity to see how they adjust (since this is what most real work looks like anyway).
You must have a very efficient filtering process. I'd say probably about half of all CS graduates and a greater percentage of people in "coder" positions will struggle to code at an extremely basic level.
Yeah. In all honesty, I would prefer _more_ of our interviews be straightforward coding problems that are reflective of the job the interviewee is going to be doing. Not whiteboarding. Not pseudocoding. Not puzzle problems. Just: write some code, let's see how well you google when you get stumped, let's see how clean your code is, let's see how quickly you implement it relative to other people, let's see how well you document it, let's see how well you collaborate with the interviewer.
This was the one that really stood out to me. MOOCs, bootcamps etc are wildly popular now and vary massively in quality. It's very easy for candidates to look good on paper but not actually know the technical side as well as they should. More importantly, software engineering is one of the few jobs where you can get a (somewhat) quantitative measure on prospective performance so it's natural to take advantage of that.
I'm not saying that All Coding Tests Are Good, but they do provide a modicum of certainty in a very uncertain environment.
At a former job, I had a set of simple, if contrived, examples that I asked. Programs were short, less than 20 lines; candidates were instructed that the source would compile unless indicated otherwise, in which case they were given the entire compiler output. I would ask fairly simple questions that would expose a lot about the candidate's knowledge. Such as:
In C++, what does this print? And if they answered correctly, I'd ask them to explain how and why. This question wasn't a deal breaker, but it gave immediate insight into their depth of knowledge of the language.
That is because over the last few years it became extremely prevalent in the industry to make everything about "teams": projects are the responsibility of teams, failures are attributed to teams, etc. It used to be that code won arguments. Now it is not ruffling feathers that wins arguments. If I'm not hiring the entire team, how do I know who on that team can actually code, based on the result?
What does it all mean? It means exactly what it meant in a high school/college group assignments - one or two people carried the group project but everyone took credit.
I have seen hundreds of resumes where people claimed that they have done XYZ where through some side channels I know they just happened to be on that team but their contribution was limited to tweaking spelling.
I see coding tests as one of the two ways to evaluate that someone can code[1]. I trust it.
[1] The other way is that I know this person and I'm already aware what they can do.