It's clear that whiteboarding and impromptu coding have little-to-nothing to do with real-life coding capability (let alone with generating real business value), while actual signals (e.g.: GitHub projects, OSS contributions, StackOverflow, take-homes, etc.) are ignored by interviewers. I've interviewed dozens of engineers and it's always been a miserable experience because my hiring manager (or other senior engineers on my team) always insisted on using these kinds of "grilling" methods which were never good signals to begin with.
It's a shame how our industry is the only highly-paid professional industry where this kind of sophomoric "intellectual hazing" is not only accepted, but also encouraged. I mean, hell, I have like three books on my bookshelf not about how to write good code, or scalable code, or performant code, but merely about how to pass interviews. Yuck.
Meanwhile I’m very good at competitive programming, I actually find it enjoyable because of the challenge, and I don’t need to practice it much even when interviewing.
Also to be perfectly blunt I think this just ties into people generally being uncomfortable with tests. All the examples you gave have the benefit of allowing unlimited time. What’s wrong with a test that challenges your knowledge of data structures and algorithms? I actually find that in general the very smart people I know are good at them and most of the people who “don’t test well” aren’t. And even then they can just practice them. They might even learn something. Sorry if that sounds mean, I don’t mean it to be, just blunt.
It's not a good signal for who will provide good business value through software engineering.
>I actually find that in general the very smart people I know are good at them and most of the people who “don’t test well” aren’t.
Is "smart" just directly 1:1 with "provides good business value" for you? I'm not so sure it's that simple...
>And even then they can just practice them. They might even learn something. Sorry if that sounds mean, I don’t mean it to be, just blunt.
It "sounds mean" because you are being selfish about your personal situation and then saying "it's not really that bad, come on," to people in different situations. It has nothing to do with you being "blunt."
Entirely depends on the org. I’ve done DS+A work professionally. I do think it tests fundamental understanding of programming considering DS+A underlies everything we do. If you’re hiring generalist software engineers (or general knowledge is a requirement) and not specialists in specific stacks it makes complete sense to me. I also think it’s harder to fake/game than almost any other programming related test.
Necessary but not sufficient. You can drive business value without knowing software engineering at all; driving business value through software engineering requires being a good problem solver.
Seeing someone unable to do a slight modification of breadth first search given 30-60m to sketch it out does, in my opinion, give me a general sense of their software engineering ability. I am bluntly saying I think it’s an accurate way to assess skill regardless of all the hacker news complaints of “why can’t I get a $300k/year job without knowing what a trie is”
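To be concrete about the baseline being described (this is a sketch in Python, not any particular company's actual question): vanilla breadth-first search over an adjacency list is only a few lines, and the "slight modification" builds on exactly this skeleton:

```python
from collections import deque

def bfs_order(graph, start):
    """Return vertices reachable from `start` in breadth-first order.

    `graph` maps each vertex to an iterable of its neighbors.
    """
    seen = {start}
    order = []
    queue = deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph.get(v, ()):
            if w not in seen:
                seen.add(w)      # mark before enqueueing to avoid duplicates
                queue.append(w)
    return order
```

A candidate who can write this cold can usually absorb whatever twist the interviewer layers on top; a candidate who can't is going to struggle regardless of coaching.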
You're grossly misrepresenting my complaint.
I've made contributions to several high-profile OSS projects (including some small HTTP core contributions to Golang), have published one book as a co-author (another as an editor; both with Apress), have several projects with 100+ stars on GitHub, and am in the top 0.9% of Stack Overflow contributors. This is in addition to having a few interviews out there given to legitimate outlets, like Vice and PCGamer, about past startups (all of which have involved hardware or software).
My pedigree is verifiable. And yet, if I want to switch jobs (which I will inevitably have to do), I will be subject to being asked about some algorithmic questions that I am willing to guarantee I couldn't answer without studying. For example, I genuinely don't remember how red-black tree or splay tree balancing works. But I'm small fries compared to people like Max Howell (who wrote and manages brew) and yet couldn't get a job at Google because of some algorithmic question he flunked.
Fighting for better interviewing practices in software engineering is one of the few hills I'm willing to die on.
My biggest problem with this culture of interviewing is that most people don't seem to know/appreciate the complexity of the questions they are asking. There have been a number of times when my interviewer completely lacked any knowledge of the background theory/history of the question they asked me.
Afterwards I tend to find the question in Knuth or an Algo book.
That being said, I do think it IS fair game to expect candidates to know that red-black trees, like AVL trees, fundamentally rely on tree rotations to maintain the invariants that keep the tree height-balanced, IF you require formal algo knowledge from your candidates. With that knowledge, a well-read interviewer can theoretically guide someone towards a partial implementation.
And from that knowledge, if someone were to introduce the concept of a splay tree, I could guess that since it's a self-balancing binary search tree, it likely relies on tree rotations as well. You would need to bounce that off the interviewer, obviously.
But if you're not familiar with the concept of tree rotations, and/or data structure invariants, you probably won't succeed no matter how much coaching your interviewer provides, due to pressure on your working memory.
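For reference, the rotation primitive itself is tiny; here's a hedged sketch (my own field names, not taken from any textbook) of a left rotation on a bare BST node:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def rotate_left(x):
    """Rotate the subtree rooted at `x` left; return the new subtree root.

    x's right child y becomes the root and x adopts y's left subtree.
    In-order traversal is preserved, which is the invariant red-black
    and splay trees build their rebalancing on.
    """
    y = x.right
    x.right = y.left
    y.left = x
    return y
```

Knowing this one primitive (plus its mirror-image right rotation) is the "tree rotations" concept I mean; the rest of red-black balancing is case analysis on top of it.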
On the other hand asking to “invert” a binary tree is totally fair game. I attribute Max’s frustration partially to his interviewer not adequately explaining what “invert” means in that context though that would also partially give the answer since it’s IMO a very easy problem (though not a good one since it’s confusing).
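For what it's worth, under the usual reading of "invert" (mirror the tree), the entire solution is a few lines, which is why I call it a very easy problem once the term is explained:

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def invert(root):
    """Mirror the tree: swap left/right children at every node."""
    if root is not None:
        root.left, root.right = invert(root.right), invert(root.left)
    return root
```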
> and most of the people who “don’t test well” aren’t
It's frustrating to read this same response in every interviewing thread. If I may be equally blunt: it almost always comes from someone showing a distorted understanding of the process, or implying human beings are fleshy robots which are easily deterministically QA'd by automated tests.
For weeding out the worst performers, testing works for lots of people and it works at scale. That's why it's used. That doesn't mean there aren't false negatives, or even that there aren't a lot; it just delivers good enough results for cheap enough that other local maxima aren't worth exploring.
But the industry wants it both ways. They want to cry about a talent shortage where apparently there's no one in the output of this local maximum of process they've chosen, and also claim this choice of process is fine and maybe people should git gud or maybe they're dumb, and also actively refuse to reexamine their decision in spite of its obvious failure.
The very fact of their complaint is evidence the process isn't good enough. Unless they're simply lying, its results are costing them in growth or time-to-market or actual money. If they really feel this pain, they should be finding other options—but they just don't want to.
Where else would this behavior fly?
> ...do a slight modification of breadth first search given 30-60m to sketch it out...
> ...“why can’t I get a $300k/year job without knowing what a trie is”...
I've interviewed at FAANG and elsewhere, pass and fail, and this is simply not what happens. People who defend the process often try to frame it like this and it's just... at best, indifference to the truth. In my experience—and many others', and of everyone who gets on LeetCode with a sigh because they know this—you better not be tinkering with BFS at all. Your session might be scheduled for an hour, maybe closer to 45 minutes, and the interviewer will expect you to correctly solve at least two questions, probably three, and still have time for "any questions for me?" at the end.
Counting introductions and small talk, and allowing you 10 minutes at the end to get one question in about the company, that leaves you (optimistically) with 15 minutes per question—including them explaining it, your clarifying questions, writing code on the board, erasing mistakes, drawing awkward arrows where you oops-forgot a line, and all the rest.
Need a hint? Cool, sure, I guess. Another? Uhh. (You're dumb.)
You don't in any sense have even 30 minutes for your BFS if you expect to pass, and your sketches won't count. That easy trie thing or search-with-a-twist, in actual practice if not in design, is meant for you to solve as fast as you can write it on the board. Otherwise you won't have time for questions 2 and 3, the ones that really count. Get ready for a crapshoot of well-chosen problems, pointless memorize-the-answer gotchas, and random applications of DP or topics from computer science papers they "would never ask because that's silly" but really just did.
If you're ok with that, I understand. But please don't say it isn't this because you like it.
Not necessarily... but I've seen a concerted effort in the past few decades to separate business value generation from software development anyway: the sort of people who are good at business value do business value stuff and they work with the software developers to get the computers to participate in it.
Not to burst your bubble, but you aren't. You are very good when you get it right. You aren't when you don't get it right. But then you are leaving out that part, because nobody likes to focus on the negatives. Consider this: at some point in your life, you didn't know competitive programming. Were you 100% unhireable at that point? Most likely, you would have done perfectly fine if you were hired without all this ceremony. That's what people are frustrated with.
People dismiss college credentials as elitist gatekeeping, but when you create entire artificial domains out of whole cloth, like "competitive programming", that's gatekeeping on steroids. Because that whole domain is artificial, I can stuff anything into it; its surface area is practically infinite. For sure there's some patch where you'd underperform, and I reckon the area of that patch is quite large. You just haven't found it yet.
I'm sorry, what? "No one is right 100% of the time so there's no such thing as being good at anything"?
No it isn't. Anyone can pick up a book or go to leetcode and learn to solve interview problems. I know multiple people who did it in their early teens.
Not everyone sees these trivia games as a valuable use of our time. Some people have hobbies, some people have side projects that are of use.
The leetcode problems don't test you for experience. They just demonstrate your ability to minimally code up a trivia problem. They don't show good coding practices, docs, or testing.
Maybe a standardized one, that you take one time, as part of a professional accreditation, even?
Man I need to get the hell away from this field
And you can "just" contribute to OSS, can you not? You might even learn something.
You seem to be advocating for the interview style that fits your personal strengths and preferences while arguing that others should not do the same.
Personally, I test just fine and I also contribute to OSS. Perhaps I should argue for that to be the standard, since it would benefit me personally.
1. Ageism - not many seniors are desperate enough, or have the free time, for competitive programming prep.
2. Making switching jobs harder - for every next job, one has to prepare again, because nobody uses competitive programming stuff during real work, so you forget.
I agree, in that the knowledge they're questioning mostly reflects how recently you were in school. Outside of school you don't write these things; you rely on frameworks and libraries. Why? They're mature and have been verified by others. And if you don't know that software has to be well verified, you are VERY inexperienced in the field.
The system is imperfect but is there actually a better alternative that fits the economic and cultural needs of our society?
FWIW I'm on your side, interviewing gives me the worst anxiety in the world even though my career has progressed to the point where I can consistently get an on-site interview.
Extract a piece of work from the project the new hire will be working on. Set up scaffolding and any boilerplate so that they only have to implement one new feature (or fix one bug). Give it to another employee or yourself and solve it while recording how long it took. If someone that's familiar with the project took 4 hours, simplify. If someone that's unfamiliar with the project took 1-2 hours to solve, send it out to candidates.
If you can't do any of the above, you might have process issues that you need to solve prior to getting a new hire in the first place.
Oh, and if the take-home exercise is expected to take 4+ hours, pay the candidates to do the exercise.
I've definitely used a candidate's original work on Github to evaluate their skill but we still spoke about career objectives and culture.
Honestly, I have no doubt that guy would have aced the technical interview anyway. It just saved us both time to not do it.
I'm not saying you shouldn't look at a Github profile or any code samples the candidate sends over. However, those metrics aren't good aptitude indicators. I feel the same way about algorithm-style coding tests that a lot of companies follow. Sure, let me find all the anagrams in a set of words only to land a job writing REST APIs...
I don't see how the cost scales per person significantly. You create one take-home exercise per role you're hiring for. You send the same take-home exercise to any candidates that apply for the role. Worst-case scenario, it takes too much time to compile this take-home exercise, in which case you've hopefully spent the time to smooth out your processes which leads to an easier onboarding. Best-case scenario, you've compiled a take-home exercise in a reasonable amount of time, which verifies that onboarding will be smooth for the new hire.
To your point, the take-home is for candidates in the 2nd or 3rd round of interviewing where you've verified their experience, you've verified their character/soft skills, and need to verify their aptitude/hard skills.
Ah I see, you do one for the role, not per person. I misunderstood. That makes sense. What sort of roles were you doing that for? Generally, I aim to keep things short which is what I worry about with exercises. The tendency to perfectionism is higher in that than in real work. But if it's effective, so be it. I know I was hired that way once out of uni the better part of a decade ago! :D
Roles were for fullstack and frontend. You're right in that sometimes time escapes us when coding. That's why I set a time limit to the exercise and ask them to submit whatever they coded in that time frame. Doesn't have to be perfect, nor does it have to be complete. The point is to have code to talk through specific to the role you're hiring for, and ideally specific to the project itself.
Big red flag if you think the only way to impress someone with your coding skills is to overengineer a solution. If a candidate submitted an overengineered solution, I wouldn't hire them.
I agree on needing time to solve the take-home. That's the biggest con with this approach.
If you don't have the desire to complete the take-home exercise, then you probably don't want to work there in the first place.
Now every candidate starts hacking their github full of all kinds of code, or even paying people to, or copying work to create a perfectly manicured "facebooked" version of what their work looks like (probably with fake timestamps too).
Suddenly that signal becomes entirely worthless too. I think tech has one of the best interviewing processes of all industries.
I'm of the opinion that going over someone's GitHub code (asking about design decisions, language alternatives, testing, algorithm choices) during an interview is probably orders of magnitude more valuable than having them implement a linked list. But to each their own.
How much more time am I supposed to spend clickety-clacking code into a computer while ruining my health? Zero. Zero should be the answer. Because 40 hours is already a hell of a lot.
I fundamentally reject this premise and your arguments are orthogonal (GitHub being banned in like Syria and most OSS projects being in English is just flimsy hand-waving).
"Uh, will I be doing much of that on the job?"
"Oh god no. Where do you think you are? This is Buzzfeed. You'll be hammering together barely-coherent prose that's just shy of stolen and publishing it as fast as possible."
^ your average programming interview with whiteboard grilling.
One optimization I'd suggest though is to standardize the process. Algorithmic/data structure theory-oriented problems are unrepresentative of "actual work" and are annoying, but what's more annoying is how the process is 1) opaque, and 2) repetitive. So you end up with candidates grinding through 80+ Leetcode problems, both to master the skills necessary to ace these problems and to cover as many commonly-asked problems as possible in a bid to game the system by preempting their interviewers' questions. Goodhart's law abounds. This basically makes the process into an interview version of standardized testing. But despite the standard prep process and a standard bank of questions, it can all easily fall apart because of subjective interviewer opinion, and so all the preparation is for naught. Candidates end up having to do the interview gauntlet for each company they apply at.
So standardize it. Put out clear, industry-standard rubrics for how one's performance is graded. Don't say "we only want to know how you think" and then ding candidates for not getting the right solution. You obviously care about the right answer as much as the thought process- don't be disingenuous about it. Maybe even standardize the level of difficulty of problems. And most of all:
Make this process a one-time thing.
Or rather, a once-every-five-years thing. Outsource the DS/A interviewing section to a third-party testing organization, like what Triplebyte is attempting to do, and have the actual interview be personalized to the company. Ask domain-specific questions, relevant experience questions, system design, and culture fit questions during the company interview. And leave the DS/A portion to be akin to the licensure tests that other engineering disciplines and STEM professions already have.
Software engineers may hate credentialism, but if it's a credential that you only need to take once or twice in a decade, then it's already far superior to the current system that forces candidates to undergo this each time they change jobs.
And while it is sophomoric and ineffective, at least you’re not explicitly judged by where you went to school, etc. like you see in finance. It’s not like other high-paying industries are more meritocratic than tech. I still think tech is by far the most meritocratic industry to be in, despite the interviewing culture.
The only way we change the interviewing culture across the industry is by no longer making the FAANGs of the world the standard to aspire to, because it’s them pushing this misguided thinking.
All of these are great signals IF present, but IMO not beyond.
I have a decent track record in the software industry (FAANG and other), but I don't tend to work beyond work (the results of which are mostly private). Likewise, I've worked with amazing developers who prefer to focus on family/hiking/baking/etc. in their free time over OSS/private projects.
It's great if you have the passion to go above and beyond in your free time, but it shouldn't be expected.
I agree for the most part, but in fairness, most other high paying fields rely solely on credentials that can often be pay-to-play (at least in the US)
I don't think it's hazing so much as it is a test, and we're definitely not the only profession that has tests. I'm sure medical professionals and lawyers might have books on their shelves on how to pass the MCAT, LSAT, or Bar exam.
If anything, we've maybe not matured as a field enough to have an actual standardized test for this kind of stuff, which would maybe make things better. But that's doubling-down on the "test-y" nature of coding interviews.
It feels like Leetcode is more of an aptitude test, but there’s tons of people that do great on the SATs but don’t necessarily excel in college.
From my experience, the number one thing a team should be assessing for is sensibility. Give someone the shittiest piece of code you’ve encountered at work and have that person write their true feedback about it. If it’s mostly what your team would have said, there, you found someone with the right sensibilities.
Then hit them with a few easy/medium Leetcodes as a fizzbuzz, and feel good about making a good hire.
Perhaps some workplaces rely on that, consciously or not.
One bit of advice though, don't just jump into a mock, make sure you prep by studying first for a few weeks.
Once I realized the interviews were done by real people, I figured that would be the case. I was looking around to see if that was something that could be done on their site. It looks like it isn't, but I found the following in their FAQ.
>If you need a place to start, we heartily recommend Interview Cake. There, you can work algorithmic problems at your own pace and get nice hints as you go.
I wanted to post it here for anyone else wondering the same.
I don't want to put the link here, but the link from their FAQ does appear to be a referral link, or at least some kind of coupon code. I have absolutely no affiliation with either site, but if you want to support this site then I'd recommend checking the FAQ and getting to Interview Cake through there.
The only disadvantage is that the site did not have enough mid to senior positions and very little focus on positions for engineering manager and above.
I said - oh yeah you heard about that. Yeah I'm doing that later if you can make it.
Got the job.
But each their own, I'm sure people are happy going through that process and then also enjoy working there.
“What stack would you propose for this product?” Instead of, “Take these two arrays of strings and output a new array where no string has a letter found in another blah blah blah...”
It's often difficult to interview specialized developers, because the specialist often understands (or pretends to understand) the specialty better than the interviewers themselves.
These hiring opportunities represent a large number (maybe even a majority) of roles at small, medium, and large, companies.
FAANG and maybe a few other companies which have custom tool chains from top to bottom are a little different: Google probably doesn't care whether you know Jenkins, because you'll have to be trained on their homegrown alternative anyway.
The current state of algorithm-heavy interview process is really just a hamfisted solution to the problem stated above. They are trying to minimize false-positives while accepting that it increases the rate of false-negatives, and their proxy for implementing this threshold is algorithmic knowledge. It's not fun for anyone involved, but I guess I have to assume it works.
If they pass that, I would devise a longer interview suited to their CV. I expect a senior to be able to answer architectural questions and/or talk intelligently about architectural decisions they've made on past projects. If they claim experience in a job-relevant domain, I would ask domain-specific questions; if not, the expectation would be that they would ramp up, and I would stick to more general questions.
So the point is to avoid wasting time preparing and doing longer interviews for candidates who can't even calculate the average value of an array (or worse, hiring them), and get a bit of an idea of their ability to think and communicate about code.
I'm honestly not very confident this is a good approach but that's why I did it.
There is a cheap engineer shortage.
The money can compensate for a lot of unpleasantness. I think finding a “dream job” is an illusion for most people. The question is usually “bearable” or “unbearable”.
The practice is about getting comfortable with the process of being interviewed. Practice is a great way to become more comfortable and confident before the real thing.
Get the anxieties out before you sit down with a real job interview so you can focus on performing well in the interview.
I think some coding interview questions are bad, some are better, but I don't think the idea that you must take what amounts to a test to get a job is inherently bad.
Maybe? At 5-6 years out of school, I personally don't mind tests. Sometimes I actually really like tests (e.g., I've started watching youtube vids of Putnam problems, like doing daily problems on brilliant). I can imagine time to study becoming more scarce if I had a family, but I was under the impression that changing jobs was something that's gonna take a healthy amount of time regardless.
To be clear: I think you can have bad tests, and I imagine I'd find it frustrating if I was boxed out of a lot of jobs I thought I'd do really well at because the unrelated test keeps me out. But I don't think needing to study inherently makes a test bad.
Where I do think you're spot on is that you shouldn't focus all of your time practicing one specific type of interview or practicing for interviews at one specific company. FAANG is a good example. I've seen far too many people get enraptured with the "dream" of working at a FAANG, devoting all of their time to practicing FAANG-style interviews and making their whole life about it, and then either not getting the job, or getting it and quickly having the illusion shattered: FAANG isn't heaven, and they wasted all that time preparing for and fantasizing about something that isn't real. It's Paris Syndrome, but for jobs.
It may have changed since I've been living in Canada, but total comp in Toronto at the time was HALF what it was for similar positions in California. Almost all my great engineer friends who still live in my home town in Canada work remote for USA companies for that reason.
The beauty of the peer chat was that it shows the company culture. Lot of interesting people. And makes you highly interested in the company and the job.
Not all experiences were great, but I didn't have to go through painful whiteboarding sessions
Similarly, if you practice an interview and you're lucky enough to be asked about the topics you just practiced, and they hire you, what kind of impression are you going to give your future teammates when, some weeks/months later, they ask you something related to that topic but you don't remember/know it? You just Google it? Well, you can google "how to render a template in Golang", that's fine, but you should not have to google "what's the difference between a compiled and an interpreted language", because there are some things you should know before starting to work as a professional software developer.
I know that "the tech interview process is broken", but we, as part of that process, should aim to improve it, not trick the system for our own personal benefit.
It's not broken for FAANG, because what they're looking for is a willingness to spend time studying hard for their interviews, over material that is certainly not all useful in one's day-to-day job (if it all were, the process wouldn't measure willingness to study), to whatever degree your raw natural intelligence/ability requires in order to pass their tests.
They're looking for (what they consider to be) a sufficient amount of wanting-it + intelligence (IQ, more or less), in whatever proportion is necessary to get the candidate through the gauntlet. That's what the interviews filter for, and it's what they want them to filter for.
(yes they're also favoring recent grads and people who have lots of time on their hands—notably those two categories overlap strongly—but I'm pretty sure those are secondary goals, if they're goals at all)
I have 60 minutes to convince you I can code. In my day-to-day work, I can look at another example in the codebase. I can look up the stupidest stuff on Stack Overflow, even if it's embarrassing. I can walk around the block if it's just not clicking.
An interview is not the same thing as work. It's a performance of code. I am practicing talking about how I think about a problem. I am practicing smiling while coding. I am practicing having a conversation in the middle of an initial rush of panic because of the pressure of an interview.
but, honestly, whenever i interview for jobs that i know that i want, i'm not willing to give up on them over a few weeks of hoop-jumping.
>Fast-tracking means that you bypass resume screens, scheduling emails, and recruiter calls, and go straight to the technical interview (which, by the way, is still anonymous) at companies of your choice. Because we use interview data, not resumes, our candidates end up getting hired consistently by companies like Facebook...
but I definitely remember all or most of them asking about my resume, at least at the onsite.
(FWIW, interviewed two years ago, 8/9 for passing technical screens on interviewing.io, 0/8 for onsite.)
Given that this overlaps with the HN community, how does everyone feel about the growth of these types of companies?
IMO, I don't think there is anything wrong with expecting candidates to be knowledgeable about basic algorithms/data structures for a SWE role -- instead, the issues are with the evaluation process. Namely, it's become hackable using these preparation services (resulting in false positives), it doesn't capture or indicate a candidate's real-world knowledge/experience/abilities, and the typical questions and whiteboard approach are likely outdated when there are probably better alternatives.
After crunching through many leetcode problems, I see that the problems they give have really poor APIs, and many posted solutions people share have memory leaks and other problems you would absolutely want to avoid in a production code base.
Competitive programming is fun for sure but where is the data that it has a high correlation to job success? It seems to me like a completely different skill set. Writing fast, sloppy code in order to solve a puzzle is fun but it's quite another when your library brings down a production server because it leaks memory like a sieve.
Of course, that requires putting time into designing an interview pipeline full of things like rubrics, bias training, and sometimes realizing that leetcode questions don’t evaluate for the skills you need. Most companies aren’t willing to invest time into this and prefer to just copy <big co X> which is why you hear so many bad stories with what sounds like the exact same interviewers.
This. The real-world best solution to many leetcode problems is to implement an iterator over their broken interface so that <algorithm> can solve it.
Because of how they constrain you, what you end up doing is just sloppily reimplementing portions of the standard library. That's something that you should never want to do in the real world. It completely selects for the wrong thing.
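To make the point concrete (a hedged illustration in Python rather than C++): a typical "merge k sorted lists" exercise makes you hand-roll machinery the standard library already ships, which is exactly the skill you'd never want exercised in production:

```python
import heapq

# What the exercise wants you to hand-roll: a k-way merge via an
# explicit heap of (value, list-index, element-index) triples.
def merge_sorted_by_hand(lists):
    heap = [(xs[0], i, 0) for i, xs in enumerate(lists) if xs]
    heapq.heapify(heap)
    out = []
    while heap:
        val, i, j = heapq.heappop(heap)
        out.append(val)
        if j + 1 < len(lists[i]):
            heapq.heappush(heap, (lists[i][j + 1], i, j + 1))
    return out

# What you'd actually write in a real code base.
def merge_sorted(lists):
    return list(heapq.merge(*lists))
```

Both produce the same output; only one of them is something a colleague should ever have to review.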
1. Data structures and algorithm questions - they don't want to study for months
2. Take home exercises - they don't want to spend hours unpaid without knowing whether they got the job
3. GitHub / Stack Overflow / Open source contributions - they don't want to code outside of work
One last one, paid interviews about the work itself, may be untenable for employers to pay every single interviewee.
- Been consistently late in paying out interviewers. Sometimes over 60 days late.
- I am owed over $50k from several months earlier this year. She has stopped responding to emails and calls. I am hiring a lawyer to get my money back.
This platform is a guise for funneling people through to subpar jobs.
God Bless David Cameron
In terms of interviews:
A lot of these practices are designed and driven by other developers, not PMs or business people. And yet most devs won't work for FAANG, and will likely have a much lower salary. And yet the ask from these dev interviewers is something like this:
- "You don't know language A and/or framework B that are very similar to language X and framework Y that you know very well? Sorry we can't hire you. We expect good devs to spend day and night reading up on the latest fashionable library, framework and language and have no life."
- "Can you spend several hours every day for the next few months studying leetcode? I hope you don't have performance anxiety, try Valium sometime. No, we won't expect you to solve arbitrary puzzles in front of a semi-hostile interviewer grilling you as your day job, it's just for kicks really."
- "Can you spend 4 hours doing this coding assignment as part of the interview?"
"We want you to do your best and include x,y,z and the kitchen sink and not just solve the problem (add 4-8+ hours)."
"Let's review your work -- you displayed exceptional skills and a deep understanding of our core competency. However, you didn't include our favorite coding style that you didn't happen to guess we would use --that you totally would've just adopted from day one during onboarding-- so we can't hire you."
"We didn't see X in your code. No we can't schedule a 15 minute interview for you to explain your 4-14 hour coding project, why you did what you did, and that X is actually in there on L56 -- we're busy and decided to review your code only for 5 minutes after a heavy lunch and already passed on you for someone else. Sorry. We will keep your resume on file to wipe our *sses with."
(interviewer disappears like a ghost after dropping a 20+ hour coding project for you to do and never follows up after)
- "You haven't learned fancy new library/framework/language X yet? Are you a dinosaur?"
...4 months later...
"Fancy new library/framework/language X sucks, it turned out to be a fad. It turned out it doesn't do anything better than dinosaur library/framework/language D and is actually really buggy or complicated. What you _really_ need to know is fancy new library/framework/language Y, that's where it's at. We only want people who keep up with the industry."
Average expected time doing the job after leetcode, coding project interviews, open source, stack overflow, learning fancy library/framework/language/whatever:
Average salary: Not even near enough to justify that, considering the salaries of other engineering and STEM fields. Unless you count FAANG, which you probably won't be working for.
What did I get myself into?