Show HN: interviewing.io is out of beta and open to engineers of all levels (interviewing.io)
171 points by leeny 38 days ago | 120 comments

I really wish there were more work done on changing the terrible interview culture in software, instead of doubling down on bad practices with low signal-to-noise ratios.

It's clear that whiteboarding and impromptu coding have little to nothing to do with real-life coding ability (let alone with generating real business value), while actual signals (e.g. GitHub projects, OSS contributions, Stack Overflow, take-homes, etc.) are ignored by interviewers. I've interviewed dozens of engineers, and it's always been a miserable experience because my hiring manager (or other senior engineers on my team) insisted on these kinds of "grilling" methods, which were never good signals to begin with.

It's a shame how our industry is the only highly-paid professional industry where this kind of sophomoric "intellectual hazing" is not only accepted, but also encouraged. I mean, hell, I have like three books on my bookshelf not about how to write good code, or scalable code, or performant code, but merely about how to pass interviews. Yuck.

Here’s a counterpoint: I don’t want to be forced to do OSS as part of my resume. I already write software over 40 hours a week for my job. And I won’t do a take home that takes longer than a regular interview unless the place has really good pay (I’ve never heard of a take home place paying top of industry).

Meanwhile I’m very good at competitive programming, I actually find it enjoyable because of the challenge, and I don’t need to practice it much even when interviewing.

Also to be perfectly blunt I think this just ties into people generally being uncomfortable with tests. All the examples you gave have the benefit of allowing unlimited time. What’s wrong with a test that challenges your knowledge of data structures and algorithms? I actually find that in general the very smart people I know are good at them and most of the people who “don’t test well” aren’t. And even then they can just practice them. They might even learn something. Sorry if that sounds mean, I don’t mean it to be, just blunt.

>What’s wrong with a test that challenges your knowledge of data structures and algorithms?

It's not a good signal for who will provide good business value through software engineering.

>I actually find that in general the very smart people I know are good at them and most of the people who “don’t test well” aren’t.

Is "smart" just directly 1:1 with "provides good business value" for you? I'm not so sure it's that simple...

>And even then they can just practice them. They might even learn something. Sorry if that sounds mean, I don’t mean it to be, just blunt.

It "sounds mean" because you are being selfish about your personal situation and then saying "it's not really that bad, come on," to people in different situations. It has nothing to do with you being "blunt."

>It's not a good signal for who will provide good business value through software engineering

Entirely depends on the org. I’ve done DS+A work professionally. I do think it tests fundamental understanding of programming considering DS+A underlies everything we do. If you’re hiring generalist software engineers (or general knowledge is a requirement) and not specialists in specific stacks it makes complete sense to me. I also think it’s harder to fake/game than almost any other programming related test.

> "smart"

Necessary but not sufficient. You can drive business value without knowing software engineering at all. Driving business value through software engineering requires being a good problem solver.

Seeing someone unable to do a slight modification of breadth first search given 30-60m to sketch it out does, in my opinion, give me a general sense of their software engineering ability. I am bluntly saying I think it’s an accurate way to assess skill regardless of all the hacker news complaints of “why can’t I get a $300k/year job without knowing what a trie is”
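For context on the baseline being referenced: a plain breadth-first search is only a few lines, and the "slight modification" kind of question starts from something like this sketch (the adjacency-dict graph representation here is just an illustrative assumption):

```python
from collections import deque

def bfs(graph, start):
    """Visit nodes of `graph` (an adjacency dict) in breadth-first order."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order
```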

> why can’t I get a $300k/year job without knowing what a trie is

You're grossly misrepresenting my complaint.

I've made contributions to several high-profile OSS projects (including some small HTTP core contributions to Golang), have published one book as a co-author (and another as an editor; both with Apress), have several projects with 100+ stars on GitHub, and am in the top 0.9% of Stack Overflow contributors. That's in addition to a few interviews given to legitimate outlets, like Vice and PCGamer, about past startups (all of which have involved hardware or software).

My pedigree is verifiable. And yet, if I want to switch jobs (which I inevitably will), I'll be subject to algorithmic questions that I can guarantee I couldn't answer without studying. For example, I genuinely don't remember how red-black tree or splay tree balancing works. But I'm small fry compared to people like Max Howell[1], who wrote and maintains brew and yet couldn't get a job at Google because of an algorithmic question he flunked.

Fighting for better interviewing practices in software engineering is one of the few hills I'm willing to die on.

[1] https://twitter.com/mxcl/status/608682016205344768?lang=en

I agree that knowing the details of red-black tree rebalancing is not fair game. I wouldn't bother implementing one unless I had access to a paper written by an expert like Sedgewick. Which leads into my gripe with programming interviews.

My biggest problem with this interviewing culture is that most people don't seem to know or appreciate the complexity of the questions they ask. There have been a number of times when my interviewer completely lacked any knowledge of the background theory/history of the question they asked me.

Afterwards I tend to find the question in Knuth or an Algo book.

That being said, I do think that knowing red-black trees, like AVL trees, fundamentally rely on tree rotations to maintain invariants that ensure height balance IS fair game, IF you require formal algo knowledge from your candidates. With that knowledge, a well-read interviewer can theoretically guide someone towards a partial implementation.

And from that knowledge, if someone were to introduce the concept of a splay tree, I could guess that since it's a self-balancing binary search tree, it likely relies on tree rotations as well. You would need to bounce that off the interviewer, obviously.

But if you're not familiar with the concept of tree rotations, and/or data structure invariants, you probably won't succeed no matter how much coaching your interviewer provides, due to pressure on your working memory.
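To make the rotation idea concrete, here's a minimal sketch of a left rotation on an assumed node shape (not any particular textbook's formulation). It's the local restructuring that self-balancing trees repeat to restore their height invariants while preserving BST ordering:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def rotate_left(x):
    """Left rotation: x's right child y becomes the subtree root.
    BST ordering is preserved because y's left subtree (all keys
    between x and y) moves over to become x's right subtree."""
    y = x.right
    x.right = y.left
    y.left = x
    return y  # new subtree root
```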

The fact that you can't remember how red-black or splay tree balancing works is a GOOD sign in a software engineer. It means you're not going to try to re-engineer existing solutions.

I would be very surprised if you were asked about red-black trees or splay balancing at the two large tech companies I’ve worked for. That would be considered too niche/specific to be a good interview question unless you were applying for a position which needed that particular knowledge.

On the other hand asking to “invert” a binary tree is totally fair game. I attribute Max’s frustration partially to his interviewer not adequately explaining what “invert” means in that context though that would also partially give the answer since it’s IMO a very easy problem (though not a good one since it’s confusing).
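For anyone who hasn't seen it: "inverting" in that context is usually taken to mean mirroring the tree by swapping children at every node, which is why it's easy once the terminology is clear. A rough sketch, with a hypothetical minimal node class:

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def invert(node):
    """Mirror a binary tree in place by recursively swapping
    each node's left and right children."""
    if node is not None:
        node.left, node.right = invert(node.right), invert(node.left)
    return node
```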

> to be perfectly blunt [...] the very smart people I know are good at them

> and most of the people who “don’t test well” aren’t

It's frustrating to read this same response in every interviewing thread. If I may be equally blunt: it almost always comes from someone showing a distorted understanding of the process, or implying human beings are fleshy robots which are easily deterministically QA'd by automated tests.

For weeding out the worst performers, testing works for lots of people and it works at scale. That's why it's used. That doesn't mean there aren't false negatives, or even that there aren't a lot; it just delivers good enough results for cheap enough that other local maxima aren't worth exploring.

But the industry wants it both ways. They want to cry about a talent shortage where apparently there's no one in the output of this local maximum of process they've chosen, and also claim this choice of process is fine and maybe people should git gud or maybe they're dumb, and also actively refuse to reexamine their decision in spite of its obvious failure.

The very fact of their complaint is evidence the process isn't good enough. Unless they're simply lying, its results are costing them in growth or time-to-market or actual money. If they really feel this pain, they should be finding other options—but they just don't want to.

Where else would this behavior fly?

> ...do a slight modification of breadth first search given 30-60m to sketch it out...

> ...“why can’t I get a $300k/year job without knowing what a trie is”...

I've interviewed at FAANG and elsewhere, pass and fail, and this is simply not what happens. People who defend the process often try to frame it like this and it's just... at best, indifference to the truth. In my experience—and many others', and of everyone who gets on LeetCode with a sigh because they know this—you better not be tinkering with BFS at all. Your session might be scheduled for an hour, maybe closer to 45 minutes, and the interviewer will expect you to correctly solve at least two questions, probably three, and still have time for "any questions for me?" at the end.

Counting introductions and small talk, and allowing you 10 minutes at the end to get one question in about the company, that leaves you (optimistically) with 15 minutes per question—including them explaining it, your clarifying questions, writing code on the board, erasing mistakes, drawing awkward arrows where you oops-forgot a line, and all the rest.

Need a hint? Cool, sure, I guess. Another? Uhh. (You're dumb.)

You don't in any sense have even 30 minutes for your BFS if you expect to pass, and your sketches won't count. That easy trie thing or search-with-a-twist, in actual practice if not in design, is meant for you to solve as fast as you can write it on the board. Otherwise you won't have time for questions 2 and 3, the ones they're really counting. Get ready for a crapshoot of well-chosen problems, pointless memorize-the-answer gotchas, and random applications of DP or topics from computer science papers they "would never ask because that's silly" but really just did.

If you're ok with that, I understand. But please don't say it isn't this because you like it.

> Is "smart" just directly 1:1 with "provides good business value" for you?

Not necessarily... but I've seen a concerted effort in the past few decades to separate business value generation from software development anyway: the sort of people who are good at business value do business value stuff and they work with the software developers to get the computers to participate in it.

> Meanwhile I’m very good at competitive programming

Not to burst your bubble, but you aren't. You are very good when you get it right. You aren't when you don't. But you're leaving out that part, because nobody likes to focus on the negatives. Consider this: at some point in your life, you didn't know competitive programming. Were you 100% unhireable at that point? Most likely, you would have done perfectly fine if you'd been hired without all this ceremony. That's what people are frustrated with.

People dismiss college credentials as elitist gatekeeping, but when you create entire artificial domains out of whole cloth, like "competitive programming", that's gatekeeping on steroids. Because the whole domain is artificial, I can stuff anything into it. Its surface area is practically infinite. For sure there's some patch where you'd underperform, and I reckon that patch is quite large. You just haven't found it yet.

> Not to burst your bubble, but you aren't. You are very good when you get it right. You aren't when you don't get it right. But then you are leaving out that part because nobody likes to focus on the negatives

I'm sorry, what? "No one is right 100% of the time so there's no such thing as being good at anything"?

> People dismiss college credentials as elitist gatekeeping, When you create entire artificial domains out of whole cloth, like "competitive programming", that's gatekeeping on steroids.

No it isn't. Anyone can pick up a book or go to leetcode and learn to solve interview problems. I know multiple people who did it in their early teens.

Or, you know, these practices exist to limit hires to specific demographics in the first place.

Counter to your counterpoint. I don't do competitive programming. I already write software 40 hours a week for my job. I don't want to spend months learning how to hack interviews when I know I can do a good job and I have done so in the past based on my resume and references.

Yeah, but there's got to be some give in the name of getting signal. Resumes are an incredibly noisy signal. I think references are pretty noisy too: a bad reference is very bad, a better-than-average reference is really good, but the average reference, which most people give, is very low signal.

> Also to be perfectly blunt I think this just ties into people generally being uncomfortable with tests. All the examples you gave have the benefit of allowing unlimited time. What’s wrong with a test that challenges your knowledge of data structures and algorithms? I actually find that in general the very smart people I know are good at them and most of the people who “don’t test well” aren’t. And even then they can just practice them. They might even learn something. Sorry if that sounds mean, I don’t mean it to be, just blunt.

Not everyone sees these trivia games as a valuable use of our time. Some people have hobbies, some people have side projects that are of use.

Leetcode problems don't test you for experience. They just demonstrate your ability to minimally code up a trivia problem. They don't show good coding practices, documentation, or testing.

> What’s wrong with a test that challenges your knowledge of data structures and algorithms?

Maybe a standardized one, that you take one time, as part of a professional accreditation, even?

> I actually find that in general the very smart people I know are good at them and most of the people who “don’t test well” aren’t.

Man I need to get the hell away from this field

The thing is, CP is not everyone's cup of tea, and it's also a separate discipline/subject with its own trivia knowledge and tricks. CP uses computer science the same way that, e.g., physics or biology use math.

> And even then they can just practice them. They might even learn something.

And you can "just" contribute to OSS, can you not? You might even learn something.

You seem to be advocating for the interview style that fits your personal strengths and preferences while arguing that others should not do the same.

Personally, I test just fine and I also contribute to OSS. Perhaps I should argue for that to be the standard, since it would benefit me personally.

It's a legal mechanism for:

1. Ageism: not many seniors are desperate enough, or have the free time, for competitive programming prep.

2. Making switching jobs harder: for every next job, one has to prepare again, because nobody uses competitive programming material in real work, so you forget it.

> Ageism

I agree with that, as the knowledge they're testing mostly reflects how recently you were in school. Outside of school you don't write these things. You rely on frameworks and libraries. Why? They're mature and have been verified by others. Besides, if you don't know that software has to be well verified, you are VERY inexperienced in the field.

GitHub projects, OSS contributions, and Stack Overflow questions are imperfect measurements and can be inaccessible to a large number of developers. GitHub is banned in some countries, many OSS projects are English-only, and Stack Overflow moderation needs no introduction, to name a few examples. Imagine if GitHub became a ledger where you're judged by the shitty code you wrote when you were 20. Remove older repos? You'll be scrutinized for having a "gap in your resume".

The system is imperfect but is there actually a better alternative that fits the economic and cultural needs of our society?

FWIW I'm on your side, interviewing gives me the worst anxiety in the world even though my career has progressed to the point where I can consistently get an on-site interview.

Yes, there is.

Extract a piece of work from the project the new hire will be working on. Set up the scaffolding and any boilerplate so that they only have to implement one new feature (or fix one bug). Give it to another employee, or solve it yourself, while recording how long it takes. If someone familiar with the project took 4 hours, simplify. If someone unfamiliar with the project took 1-2 hours to solve it, send it out to candidates.

If you can't do any of the above, you might have process issues that you need to solve prior to getting a new hire in the first place.

Oh, and if the take-home exercise is expected to take 4+ hours, pay the candidates to do the exercise.

How many hires have you made using this process? It is interesting in that the cost scales per person significantly. If I'm putting that much effort into the candidate, I think I've already determined I want him.

I've definitely used a candidate's original work on Github to evaluate their skill but we still spoke about career objectives and culture.

Honestly, I have little doubt that guy would have aced the technical interview anyway. It just saved us both time to not do it.

I've hired a handful of people with this process, across two organizations. First time around, I identified and created the take-home exercise in one working day. Half of it was actually setting things up, the second half was solving it myself and tweaking it such that it takes the amount of time I was aiming for. Second time around it took a half day total.

I'm not saying you shouldn't look at a Github profile or any code samples the candidate sends over. However, those metrics aren't good aptitude indicators. I feel the same way about algorithm-style coding tests that a lot of companies follow. Sure, let me find all the anagrams in a set of words only to land a job writing REST APIs...
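(For what it's worth, the anagram exercise being lampooned reduces to bucketing words by their sorted letters, which is part of why it feels disconnected from day-to-day REST API work. A possible sketch:)

```python
from collections import defaultdict

def group_anagrams(words):
    """Group words that are anagrams of each other.
    Two words are anagrams iff their sorted letters match."""
    groups = defaultdict(list)
    for word in words:
        groups[tuple(sorted(word))].append(word)
    # keep only buckets with an actual anagram pair
    return [g for g in groups.values() if len(g) > 1]
```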

I don't see how the cost scales per person significantly. You create one take-home exercise per role you're hiring for. You send the same take-home exercise to any candidates that apply for the role. Worst-case scenario, it takes too much time to compile this take-home exercise, in which case you've hopefully spent the time to smooth out your processes which leads to an easier onboarding. Best-case scenario, you've compiled a take-home exercise in a reasonable amount of time, which verifies that onboarding will be smooth for the new hire.

To your point, the take-home is for candidates in the 2nd or 3rd round of interviewing where you've verified their experience, you've verified their character/soft skills, and need to verify their aptitude/hard skills.

It's very interesting that you haven't gotten any signal from GitHub. My friends shared that feeling, but I've tried it on two people and they were both hits. So at least I know it has specificity.

Ah I see, you do one for the role, not per person. I misunderstood. That makes sense. What sort of roles were you doing that for? Generally, I aim to keep things short which is what I worry about with exercises. The tendency to perfectionism is higher in that than in real work. But if it's effective, so be it. I know I was hired that way once out of uni the better part of a decade ago! :D

The vast majority of candidates I've spoken to commit their code to private repositories. Same goes for me.

Roles were for fullstack and frontend. You're right in that sometimes time escapes us when coding. That's why I set a time limit to the exercise and ask them to submit whatever they coded in that time frame. Doesn't have to be perfect, nor does it have to be complete. The point is to have code to talk through specific to the role you're hiring for, and ideally specific to the project itself.

I don't like these because they're not hard and can't really show skill: people aren't impressed unless you overengineer, and I don't have the time or desire for that. I'd prefer algo questions, as they're at least some entertainment. Although this is probably the best way to hire juniors.

What does skill have to do with difficulty?

Big red flag if you think the only way to impress someone of your coding skills is if you overengineer a solution. If some candidate submitted an overengineered solution, I wouldn't hire them.

I agree on needing time to solve the take-home. That's the biggest con with this approach.

If you don't have the desire to complete the take-home exercise, then you probably don't want to work there in the first place.

Moreover, imagine for just one second that github became the defacto way to interview candidates.

Now every candidate starts hacking their github full of all kinds of code, or even paying people to, or copying work to create a perfectly manicured "facebooked" version of what their work looks like (probably with fake timestamps too).

Suddenly that signal becomes entirely worthless too. I think tech has one of the best interviewing processes of all industries.

> Now every candidate starts hacking their github full of all kinds of code, or even paying people to, or copying work to create a perfectly manicured "facebooked" version of what their work looks like (probably with fake timestamps too).

I'm of the opinion that going over someone's GitHub code (asking about design decisions, language alternatives, testing, algorithm choices) during an interview is probably orders of magnitude more valuable than having them implement a linked list. But to each their own.
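To underline the contrast, the linked-list exercise mentioned is rote enough to fit in a dozen lines, which is the point about its signal value. A hypothetical minimal version, just to show the scale of the task:

```python
class ListNode:
    def __init__(self, val, nxt=None):
        self.val = val
        self.next = nxt

def from_list(values):
    """Build a singly linked list from an iterable, returning the head."""
    head = None
    for v in reversed(values):
        head = ListNode(v, head)
    return head

def to_list(head):
    """Collect a singly linked list's values into a Python list."""
    out = []
    while head is not None:
        out.append(head.val)
        head = head.next
    return out
```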

I think you're comparing an idealized version of a GitHub-based interview to a, at best, mediocre version of a whiteboard coding interview.

Not everyone likes to work on code in their free time, though, and code from work is often not public. How do you suggest people like that get jobs then?

Shit 40 hours is already 35% of my waking hours in a week, including weekends. Factor in that a lot of places extend hours by 30-60min to account for lunch ("9-5" my ass) and then commute time for most folks, and tons of people are already giving half their waking hours to their job.

How much more time am I supposed to spend clickety-clacking code into a computer while ruining my health? Zero. Zero should be the answer. Because 40 hours is already a hell of a lot.

> Github projects, OSS contributions, and StackOverflow questions, are imperfect measurements and can be inaccessible to a large number of developers.

I fundamentally reject this premise and your arguments are orthogonal (GitHub being banned in like Syria and most OSS projects being in English is just flimsy hand-waving).

How do you intend to deal with developers that have little to no Github activity?

Take-homes, other OSS contributions, StackOverflow activity... GitHub isn't the only code outlet in the world. If I'd hire a writer, I'd like to see writing samples. If I hire a coder, I want to see coding samples.

Excuse me sir, those writing samples are irrelevant. Could easily be plagiarized. I prefer to test my writers on arcane grammar in the languages they supposedly speak in high pressure situations.

"Well these novels all look fine, Mr. Stephen King, but you see there are just so many fakers out there and even though you can talk very well about the process of writing a book in a way that only someone who either actually did know how to write or who was so wonderfully skilled at faking it that they'd be able to make more money applying that skill directly to a job that called for it (have you met our CEO, incidentally?), but oh my, yes, there are so many fakers, yes I'm sure there are a lot, it must be that our bad hires have been fakers who tricked us and not that environment, process, team, and management quality significantly affect people's performance in various ways and that we're badly deficient in some of those areas—anyway, point is, next we're going to quiz you on obscure citation format details across three different citation styles, have you sentence-diagram and label all the parts of speech of every word in the first chapter of Moby Dick (should only take you 5 minutes or we'll start obviously betraying that we think you're an idiot), and then write out for us the subjunctive mood conjugation of a few irregular French verbs."

"Uh, will I be doing much of that on the job?"

"Oh god no. Where do you think you are? This is Buzzfeed. You'll be hammering together barely-coherent prose that's just shy of stolen and publishing it as fast as possible."

^ your average programming interview with whiteboard grilling.

Having had to deal with this process for a long time, I've come to accept it. Fundamentally it's a scaling problem: the big tech megacorps can't do careful, tailored screening of the personalized signals you mention for all the interviewees they get. Couple that with a cargo-cult problem, with everyone else, from unicorns on down to stealth garage startups, imitating the megacorps.

One optimization I'd suggest, though, is to standardize the process. Algorithmic/data-structure theory-oriented problems are unrepresentative of "actual work" and are annoying, but what's more annoying is that the process is 1) opaque, and 2) repetitive. So you end up with candidates grinding through 80+ Leetcode problems, both to master the skills necessary to ace them and to cover as many commonly asked problems as possible in a bid to game the system by preempting their interviewers' questions. Goodhart's law abounds. This basically makes the process an interview version of standardized testing. But despite the standard prep process and a standard bank of questions, it can all easily fall apart on subjective interviewer opinion, and then all the preparation is for naught. Candidates end up having to run the interview gauntlet at each company they apply to.

So standardize it. Put out clear, industry-standard rubrics for how one's performance is graded. Don't say "we only want to know how you think" and then ding candidates for not getting the right solution. You obviously care about the right answer as much as the thought process- don't be disingenuous about it. Maybe even standardize the level of difficulty of problems. And most of all:

Make this process a one-time thing.

Or rather, a once-every-five-years thing. Outsource the DS/A interviewing section to a third-party testing organization, like what Triplebyte is attempting to do, and have the actual interview be personalized to the company. Ask domain-specific questions, relevant-experience questions, system design, and culture-fit questions during the company interview. And let the DS/A portion be akin to the licensure tests that other engineering disciplines and STEM professions already have.

Software engineers may hate credentialism, but if it's a credential that you only need to take once or twice in a decade, then it's already far superior to the current system that forces candidates to undergo this each time they change jobs.

The only way I’ve gotten around this is to pursue opportunities at smaller companies run by people with a similar mindset. Tech, like most other industries, can be a bit incestuous when it comes to competitive thinking because so-and-so company does it and therefore we must do it too. Smaller companies tend to have to compete more in the labor market and therefore don’t rely as heavily on these tactics.

And while it is sophomoric and ineffective, at least you’re not explicitly judged by where you went to school, etc. like you see in finance. It’s not like other high-paying industries are more meritocratic than tech. I still think tech is by far the most meritocratic industry to be in, despite the interviewing culture.

The only way we change the interviewing culture across the industry is by no longer making the FAANGs of the world the standard to aspire to, because it’s them pushing this misguided thinking.

> GitHub projects, OSS contributions, stack overflow

All of these are great signals IF present, but IMO their absence shouldn't count against you.

I have a decent track record in the software industry (FAANG and other), but I don't tend to work beyond work (the results of which are mostly private). Likewise, I've worked with amazing developers who prefer to focus on family/hiking/baking/etc. in their free time over OSS/private projects.

It's great if you have the passion to go above and beyond in your free time, but it shouldn't be expected.

> It's a shame how our industry is the only highly-paid professional industry where this kind of sophomoric "intellectual hazing" is not only accepted, but also encouraged.

I agree for the most part, but in fairness, most other high paying fields rely solely on credentials that can often be pay-to-play (at least in the US)

Exactly, there is still a lot of corrupt credential signaling in the tech industry, but tech is one of the best due to leetcode interviews.

> It's a shame how our industry is the only highly-paid professional industry where this kind of sophomoric "intellectual hazing" is not only accepted, but also encouraged. I mean, hell, I have like three books on my bookshelf not about how to write good code, or scalable code, or performant code, but merely about how to pass interviews. Yuck.

I don't think it's hazing so much as it is a test, and we're definitely not the only profession that has tests. I'm sure medical professionals and lawyers might have books on their shelves on how to pass the MCAT, LSAT, or Bar exam.

If anything, we've maybe not matured as a field enough to have an actual standardized test for this kind of stuff, which would maybe make things better. But that's doubling-down on the "test-y" nature of coding interviews.

As mentioned elsewhere, I think the problem is that it's a decentralized, unstandardized approach, lacking in a transparent rubric, and more inconveniently it has to be taken at every single company rather than one and done, unlike those other license exams.


What evidence have we seen that Leetcode translates to people that write clean, sensible code?

It feels like Leetcode is more of an aptitude test, but there’s tons of people that do great on the SATs but don’t necessarily excel in college.

From my experience, the number one thing a team should be assessing for is sensibility. Give someone the shittiest piece of code you’ve encountered at work and have that person write their true feedback about it. If it’s mostly what your team would have said, there, you found someone with the right sensibilities.

Then hit them with a few easy/medium Leetcodes as a fizzbuzz, and feel good about making a good hire.

> a few easy/medium Leetcodes

I thought you were talking about removing Leetcode as a requirement....

I can’t change the world all by myself. It’s a suggestion to weigh the Leetcode less.

The "grilling" methods are not without benefits: they're great at showing how much the candidate is willing to do to work at a given company.

Perhaps some workplaces rely on that, consciously or not.

This site is amazing. I had a real problem with nerves during interviews. I did about half a dozen practice interviews on the site over 2 or 3 months, while brushing up on algorithms and data structures in my spare time. It was great; if you have issues with interviews, I recommend the site.

One bit of advice though, don't just jump into a mock, make sure you prep by studying first for a few weeks.

>One bit of advice though, don't just jump into a mock, make sure you prep by studying first for a few weeks

Once I realized the interviews were done by real people, I figured that would be the case. I was looking around to see if that prep was something that could be done on their site. It looks like it isn't, but I found the following in their FAQ [0].

>If you need a place to start, we heartily recommend Interview Cake [1]. There, you can work algorithmic problems at your own pace and get nice hints as you go.

I wanted to post it here for anyone else wondering the same.

[0] https://interviewing.io/faq [1] I don't want to put the link here but the link from their FAQ does appear to be a referral link, or at least some kind of coupon code. I have absolutely no affiliation with either site but if you want to support this site then I'd recommend checking the FAQ and getting to Interview Cake through there.

I agree that the site is amazing. It is especially useful since you can take a day off and schedule multiple phone screens in a day. This enables you to get to a point to be able to line up the onsite interviews close to each other, which is what you want to do to get multiple overlapping offers.

The only disadvantage is that the site did not have enough mid-to-senior positions and had very little focus on positions for engineering manager and above.

Founder here. This is so nice to hear. Thank you for sharing it.

I don't think practicing interviewing should be encouraged even though I know this has been done for years and years. If you want to end up working on something you actually love in a place you like, you should just learn what is interesting to you. Your "dream" job at a FAANG might be more unpleasant and boring than you've been expecting.

For one job I wanted I knew a lot of the engineers that worked there went to a specific developer meetup group. So I signed up to give an hour presentation on something. The day I went to the interview the lead interviewer asked - hey are you the same ... that's doing a presentation tonight at ...?

I said - oh yeah you heard about that. Yeah I'm doing that later if you can make it.

Got the job.

I'm starting to agree with that. And the older I get, the less I feel like I want to study for a couple of weeks for an interview process that has a high probability of not resembling the work I'd be doing. The other day an Amazon recruiter called me up and even said that that's the case.

But each their own, I'm sure people are happy going through that process and then also enjoy working there.

I feel the same. This really irks me. Obviously there are areas of building software where algorithms are a big deal. I was recently approached by a Facebook recruiter (for the Android team) and by Amazon (can't recall the exact engineering position). The Facebook recruiter even said they would be asking some medium and a few hard questions, literally ranked the same way as on all the sites people grind. I'm debating whether I want to do it or not. Working at those companies can be beneficial for my resume, but the interview process bothers me. The fact that you can solve a puzzle does not mean you are a good software engineer. There is just so much more to development: architecture, OOP, good teamwork, communication, etc. But maybe it is easier to teach those things than algorithms, and maybe this system works for the big companies. Plus, I think they are more likely to hire people who are obedient, since people who simply won't put up with that hiring process might be more of a 'speak up' type.

They're just using them as "how badly do you want it" + IQ filters. The interviews end up testing for some combination of the two: wants it very badly + good but not amazing IQ, or wants it somewhat + very high IQ, that sort of thing. The fact that the work one does 40 hours a week cannot prepare one for these interviews is by design.

Conversely, interview questions often have very little to do with the content of the day to day work. So while studying the things you like may help you ace the day to day work, you could still struggle on the interview if you don’t prep for it specifically.

This is confusing to me, because supposedly there’s a shortage of “good engineers”. That may be the case or not, but if so, why are these additional bottlenecks being added when these interviews universally seem to be perceived as irrelevant to the daily work of engineering? What would make more sense to me is if the focus moved toward architecture design instead of leetcode.

“What stack would you propose for this product?” Instead of, “Take these two arrays of strings and output a new array where no string has a letter found in another blah blah blah...”

There isn't a shortage of generic developers, but there are shortages of specialized developers. One shortage example would be an SRE who knows Azure, Kubernetes, Jenkins, and Jira, very well. Another example would be a front-end developer who knows iOS development so well that they can make up for the pit-falls of React Native while also writing half decent typescript code.

It's often difficult to interview specialized developers because the specialist often understands (or pretends to understand) the specialty better than the interviewers themselves.

These hiring opportunities represent a large number (maybe even a majority) of roles at small, medium, and large companies.

FAANG and maybe a few other companies which have custom tool chains from top to bottom are a little bit different, because Google probably doesn't care if you know Jenkins: you'll have to be trained on their homegrown alternative anyway.

The demise of training in this industry is pretty lamentable. While companies can't be expected to turn generic developers into incredibly specialized developers, they probably could do more to help the former move closer to the latter.

I think I read something a while back from the point-of-view of a Google recruiter or someone that described their reasoning for this. The reality is that most companies are okay with the possibility of rejecting a good engineer who has all the potential to be a good fit for them. There's really no shortage of candidates who want to work for them, so they are okay with false negatives as long as engineers are still lining up to interview for them. What they aren't okay with, however, is the possibility of accidentally accepting a bad engineer: a problem which is quite costly to remedy.

The current state of algorithm-heavy interview process is really just a hamfisted solution to the problem stated above. They are trying to minimize false-positives while accepting that it increases the rate of false-negatives, and their proxy for implementing this threshold is algorithmic knowledge. It's not fun for anyone involved, but I guess I have to assume it works.

Personally, I've done interviews where I ask a few toy algo questions of increasing difficulty, in a language of their choice. The first question is trivial and screens out candidates who can't code at all. The more difficult questions are more to see if they can communicate clearly about code, and performance / complexity issues, and how they respond to questions and feedback on their code. I don't care that much whether they can solve questions 2 and 3 perfectly, but they should be able to come up with something.

If they pass that I would devise a longer interview suited to their CV. I expect a senior to be able to answer architectural questions and / or talk intelligently about architectural decisions they've made on past projects. If they claim experience in a job-relevant domain I would ask domain-specific questions, if not, the expectation would be that they would ramp up, and I would stick to more general questions.

So the point is to avoid wasting time preparing and doing longer interviews for candidates who can't even calculate the average value of an array (or worse, hiring them), and get a bit of an idea of their ability to think and communicate about code.

I'm honestly not very confident this is a good approach but that's why I did it.

There is no shortage of good engineers applying to these places.

There is a cheap engineer shortage.

Kafka is jealous.

“Your "dream" job at a FAANG might be more unpleasant and boring than you've been expecting.”

The money can compensate for a lot of unpleasantness. I think finding a “dream job” is an illusion for most people. The question is usually “bearable” or “unbearable”.

The practice isn’t just about how to give correct answers. You’ll never be able to practice enough to have prepared answers to every possible question.

The practice is about getting comfortable with the process of being interviewed. Practice is a great way to become more comfortable and confident before the real thing.

Get the anxieties out before you sit down with a real job interview so you can focus on performing well in the interview.

I think of interviewing like I think of other "tests" like the SATs. They're a thing you've got to study for. Just because they don't mirror what your day to day at a job will be like doesn't automatically mean they're of no value.

I think some coding interview questions are bad, some are better, but I don't think the idea that you must take what amounts to a test to get a job is inherently bad.

Do you think you will hold the same opinion 5-10 years down the road?

Uh, I dunno? Is the implication that I'll have less patience for taking tests each time I change a job when it's the Nth time?

Maybe? At 5-6 years out of school, I personally don't mind tests. Sometimes I actually really like tests (e.g., I've started watching youtube vids of Putnam problems, like doing daily problems on brilliant). I can imagine time to study becoming more scarce if I had a family, but I was under the impression that changing jobs was something that's gonna take a healthy amount of time regardless.

To be clear: I think you can have bad tests, and I imagine I'd find it frustrating if I was boxed out of a lot of jobs I thought I'd do really well at because the unrelated test keeps me out. But I don't think needing to study inherently makes a test bad.

I don't think there's anything wrong with practicing interviewing. It can be a good exercise to do some self-reflection and, depending on the type of interview being practiced, can sometimes help prepare for real-world situations/challenges in a 'safe' environment.

Where I do think you're spot on is that you shouldn't focus all of your time practicing one specific type of interview or practicing for interviews at one specific company. FAANG is a good example. I've seen far too many people get enraptured with the "dream" of working at a FAANG, devoting all of their time to practicing FAANG-style interviews and making their whole life about it, and then either not getting the job, or getting it only to have their illusions quickly shattered when they realize FAANG isn't heaven and that they wasted all that time preparing for and fantasizing about something that isn't real. It's Paris Syndrome [0] but for jobs.

0: https://en.wikipedia.org/wiki/Paris_syndrome

Unfortunately, this is the only way to get a job today at almost any company, not just FAANG. No one cares about experience, interests, or projects, only about this randomized IQ test that can be cracked by practice and rote memorization. I think this imperfect filter got widely adopted because of the oversupply of people sending resumes.

Absolutely not true, at least in my experience. Could it be a Bay area problem? I'm in Vancouver BC and interviewing right now and it's definitely different here.

Fellow Canadian, living in California but worked in Canada beforehand. There are still plenty of companies that don't ask leetcode-style questions. However, none of them pay even close to what the FAANG companies pay, in my experience. You can find offers for similar base salaries (there are startups in LA paying 180k base for some roles), but the TC is still nothing compared to the 300k+ packages you're getting from the big boys.

It may have changed since I lived in Canada, but total comp in Toronto at the time was HALF what it was for similar positions in California. Almost all my great engineer friends who still live in my home town in Canada work remote for US companies for that reason.

For sure salaries don't compare but that wasn't the question. If you want to maximize salary then you probably want to work for FAANG and go through the interview process.

Same experience here. I'd actually say interviewing was a very interesting, participative learning process. Lots of companies handled it as a chat between peers, though there is usually a coding challenge upfront (understandable).

The beauty of the peer chat was that it showed the company culture. Lots of interesting people, and it makes you highly interested in the company and the job.

Not all experiences were great, but I didn't have to go through painful whiteboarding sessions.

this is the case in major tech hubs on US coasts with high salaries/competition

I think your first statement sadly doesn't connect with the rest. These days the place you like will most likely pose a similar interview process as the ones at FAANG, even if the material might be slightly less algorithms-heavy.

Practicing interviewing always seemed to me like studying the day before the exam: if you're lucky and the things you studied actually helped you to pass the exam, you would be lying to yourself because a couple of days/weeks later you won't remember anything of what you just studied. You didn't deserve to pass the exam, but, hey, nobody knows.

Similarly, if you practice an interview and you're lucky enough to be asked about the topics you just practiced, and they hire you, what kind of impression are you going to give your future teammates when, some weeks or months later, they ask you something related to that topic and you don't remember or know it? You just Google it? Well, you can google "how to render a template in Golang", that's fine, but you should not have to google "what's the difference between a compiled and an interpreted language", because there are some things you should know before starting to work as a professional software developer.

I know that "the tech interview process is broken", but we, as part of that process, should aim to improve it, not to trick the system for our own and personal benefit.

> I know that "the tech interview process is broken", but we, as part of that process, should aim to improve it, not to trick the system for our own and personal benefit.

It's not broken for FAANG, because what they're looking for is a willingness to spend time studying hard for their interviews, over material that is certainly not all useful in one's day-to-day job (if it were, it wouldn't measure one's willingness to spend lots of time studying), to whatever degree your raw natural intelligence/ability requires, in order to pass their tests.

They're looking for (what they consider to be) a sufficient amount of wanting-it + intelligence (IQ, more or less), in whatever proportion is necessary to get the candidate through the gauntlet. That's what the interviews filter for, and it's what they want them to filter for.

(yes they're also favoring recent grads and people who have lots of time on their hands—notably those two categories overlap strongly—but I'm pretty sure those are secondary goals, if they're goals at all)

I get the way you think, but it's irrelevant.

I have 60 minutes to convince you I can code. In my day-to-day work, I can look at another example in the codebase. I can look up the stupidest stuff on Stack Overflow, even if it's embarrassing. I can walk around the block if it's just not clicking.

An interview is not the same thing as work. It's a performance of code. I am practicing talking about how I think about a problem. I am practicing smiling while coding. I am practicing having a conversation in the middle of an initial rush of panic because of the pressure of an interview.

i think what you're saying is that we should aim to improve the tech interview process by not practicing for interviews, and focusing instead on developing the skills that we actually use on a day-to-day basis. that we should be able to rely on that experience to be successful in interviews, and that we shouldn't play along with an interview that expects us to jump through hoops to demonstrate skills that aren't necessarily relevant to the job.

but, honestly, whenever i interview for jobs that i know that i want, i'm not willing to give up on them over a few weeks of hoop-jumping.

You should of course study well in advance, but you should probably revise the day before the exam too.

I've used interviewing.io a lot and it has been immensely helpful. Not only did I get a lot better at interviewing, I also got a referral from the practice interviewer which eventually led me to my current job at a FAANG.

This is awesome! I've really enjoyed watching the interviews they publish on Youtube: https://www.youtube.com/channel/UCNc-Wa_ZNBAGzFkYbAHw9eg

I used this website in my last round of interviewing. The interviewers were hit or miss but I definitely got a lot out of it.

If you had any interview experiences you weren't happy with, please email me directly, and we'll make it right. We definitely want to do better than "hit or miss". aline@interviewing.io

What counts? I'm reading that this is resume-blind:

>Fast-tracking means that you bypass resume screens, scheduling emails, and recruiter calls, and go straight to the technical interview (which, by the way, is still anonymous [1]) at companies of your choice. Because we use interview data, not resumes, our candidates end up getting hired consistently by companies like Facebook...

but I definitely remember all or most of them asking about my resume, at least at the onsite.

(FWIW, interviewed two years ago, 8/9 for passing technical screens on interviewing.io, 0/8 for onsite.)

I find it confusing that the software development industry both rejects any sort of standard certification bar, yet there is a fast growing industry built around helping job seekers pass a standard technical interview loop.

Given that this overlaps with the HN community, how does everyone feel about the growth of these types of companies?

Selling shovels to gold miners

The issues being brought up are very similar to the things talked about in PG's "The Lesson to Unlearn" [1], i.e. "hackable" tests in university.

IMO, I don't think there is anything wrong with expecting candidates to be knowledgeable about basic algorithms and data structures for a SWE role -- instead, the issues are with the evaluation process. Namely, it's become hackable using these preparation services (resulting in false positives), it doesn't capture or indicate a candidate's real-world knowledge/experience/abilities, and the typical questions and whiteboard approach are likely outdated when there are probably better alternatives.

[1] http://www.paulgraham.com/lesson.html

Goodhart's Law, writ large!

I understand there is some value in using a technical screen to see if a candidate has the viable expertise and skills required but how do we find that if everyone is doubling down on generic competitive-programming style hiring sites?

After crunching through many leetcode problems I see the problems they give have really poor APIs and many posted solutions people share have memory leaks and other problems that you would absolutely want to avoid on a production code base.

Competitive programming is fun for sure but where is the data that it has a high correlation to job success? It seems to me like a completely different skill set. Writing fast, sloppy code in order to solve a puzzle is fun but it's quite another when your library brings down a production server because it leaks memory like a sieve.

leetcode is the StackOverflow for interviews. If all you do is copy paste the problem, you end up with a bad interview process. If instead you use the problem as a base, understand what signals it is trying to test for, and change it to implement your specific needs, you have a much better shot of gauging the right skills.

Of course, that requires putting time into designing an interview pipeline full of things like rubrics, bias training, and sometimes realizing that leetcode questions don’t evaluate for the skills you need. Most companies aren’t willing to invest time into this and prefer to just copy <big co X> which is why you hear so many bad stories with what sounds like the exact same interviewers.

> After crunching through many leetcode problems I see the problems they give have really poor APIs

This. The real-world best solution to many leetcode problems is to implement an iterator over their broken interface so that <algorithm> can solve it.

Because of how they constrain you, what you end up doing is just sloppily reimplementing portions of the standard library. That's something that you should never want to do in the real world. It completely selects for the wrong thing.

How does this compare to similar sites like Pramp.com, Gainlo.co, and PracticeCodingInterview.com?

I used interviewing.io and pramp really heavily in my most recent job search. I found that interviewing.io interviews were of a much higher quality in general vs pramp which was pretty hit and miss, some interviewers were well prepared and helpful and some clearly checked out after their half of the interview. But I ended up using pramp a lot more since they had unlimited interviews and used interviewing.io as benchmarks to see how close I was to where I wanted to be. Recommend them both!

I have used both interviewing.io and pramp. I think interviewing.io is of better quality. But I do think there is value for pramp since it is a mutual interview and we can see how others are answering questions. It helped me a great deal in improving system design interviews.

PracticeCodingInterview seems to be extremely expensive. Who pays almost $1,000 for 5 interviews? The way I see it, these practice interviews exist so that I get used to dealing with strangers and rid myself of nerves. Why anyone would pay so much for some practice is beyond me, even if it's "guaranteed" to be someone at a FAANG interviewing you. Anybody who's a stranger is more than enough for me, and then LC on the side if need be.

Pramp is a "mutual interview". You interview someone and they interview you within the same session. This is a one-sided interview.

With regards to interviews, there is no free lunch. Interviewees don't want to do at least one of the following:

1. Data structures and algorithm questions - they don't want to study for months

2. Take home exercises - they don't want to spend hours unpaid without knowing whether they got the job

3. GitHub / Stack Overflow / Open source contributions - they don't want to code outside of work

One last option, paid interviews consisting of the actual work, may be untenable for employers, who would have to pay every single interviewee.

Well, there is the suggestion of the paid lunch: short-term contract-to-hire arrangements that last a week or two. Of course, this only helps new grads and the unemployed.

Yeah, even then, if you already have a job, you can't quit your current one to work full time for this other company without even knowing whether you'll get the actual job.

As a former interviewer at interviewing.io, the CEO of this company is committing fraud. She has done the following:

- Been consistently late in paying out interviewers. Sometimes over 60 days late.

- I am owed over $50k for several months earlier this year. She has stopped responding to emails and calls. I am hiring a lawyer to get my money back.

This platform is a guise for funneling people through to subpar jobs.

Haven't tried it yet. FYI, blog.interviewing.io is running on HTTP (port 80) as well.

* in North America and the UK

As a student, I prefer the interview culture in software because at the very least there is an objective component in the evaluation.

Ok I am confused - I am a network engineer, but I could not find anything network related. Is it for software engineers only?

Visited the site, read somewhere "anonymous interview practice", clicked on "give it a try", and a signup/login screen opened. Made me doubt whether it is really anonymous!

Watched a lot of your youtube videos, really great help :)

> Virgin Websafe has blocked this site

God Bless David Cameron


Congrats Aline!

this is great, can't wait to try it

I've been in this industry for a few years and I still don't get it. It seems like people in this industry constantly do things against their own interests to me.

In terms of interviews:

A lot of these practices are designed and driven by other developers, not PMs or business people. And yet most devs won't work for FAANG, and will likely have a much lower salary. And yet the ask from these dev interviewers is something like this:

- "You don't know language A and/or framework B that are very similar to language X and framework Y that you know very well? Sorry we can't hire you. We expect good devs to spend day and night reading up on the latest fashionable library, framework and language and have no life."

- "Can you spend several hours every day for the next few months studying leetcode? I hope you don't have performance anxiety, try Valium sometime. No, we won't expect you to solve arbitrary puzzles in front of a semi-hostile interviewer grilling you as your day job, it's just for kicks really."

- "Can you spend 4 hours doing this coding assignment as part of the interview?"

  "We want you to do your best and include x,y,z and the kitchen sink and not just solve the problem (add 4-8+ hours)."

  "Let's review your work -- you displayed exceptional skills and a deep understanding of our core competency. However, you didn't include our favorite coding style that you didn't happen to guess we would use --that you totally would've just adopted from day one during onboarding-- so we can't hire you."

  "We didn't see X in your code. No we can't schedule a 15 minute interview for you to explain your 4-14 hour coding project, why you did what you did, and that X is actually in there on L56 -- we're busy and decided to review your code only for 5 minutes after a heavy lunch and already passed on you for someone else. Sorry. We will keep your resume on file to wipe our *sses with."


  (interviewer disappears like a ghost after dropping a 20+ hour coding project for you to do and never follows up)

- "Can you build a full-stack app for us from scratch as the coding project? Oh no, you won't be building a full-stack app from scratch when you start with us, but we want someone who can do it in 5 minutes for the exactly 0-1 times we'll ever need it."

- "You haven't learned fancy new library/framework/language X yet? Are you a dinosaur?" ...4 months later...

  "Fancy new library/framework/language X sucks, it turned out to be a fad. It turned out it doesn't do anything better than dinosaur library/framework/language D and is actually really buggy or complicated. What you _really_ need to know is fancy new library/framework/language Y, that's where it's at. We only want people who keep up with the industry."
Me now doing the mental math of how I completely fcked myself over going into this industry

The math:

Average expected time doing the job after leetcode, coding project interviews, open source, stack overflow, learning fancy library/framework/language/whatever:

60+ hours/week.

No you can't reduce that to a normal 40 hours/week unless you want to be a generic Java/Javascript code monkey, sir/madam.

Average salary: Not even near enough to justify that, considering the salaries of other engineering and STEM fields. Unless you count FAANG, which you probably won't be working for.

What did I get myself into?

