Hacker News
Stop Interviewing with Leet Code (fev.al)
554 points by charles_f 67 days ago | 655 comments

All of these supposed "flaws" of leetcode are by design. Big companies want people who are smart enough to do the work, but obedient enough to put up with all the bullshit that comes with working at a big company. What person better matches that than someone who's able and willing to study for and pass a tech version of the SAT? Every anti-leetcode article I read is some version of "leetcode is bad because it measures the wrong things." No, we all know it measures those things, and those are exactly the things the measurers want to measure.

You might ask, so why do startups do leetcode too? I heard startups are supposed to be, uh, innovating, developing new technology, and working on hard, meaningful problems? Shouldn't they want brilliant, super effective people, instead of smart-enough, obedient workers? Apparently not. Apparently they want the same workers bigcos want. The implication of this is left as an exercise to the reader.

The idea that companies are using leetcode tests for rational reasons goes against everything I've seen, experienced, or read about. What I have seen, working at startups for more than 20 years, is irrational, self-destructive behavior, over and over and over again. I've worked with many entrepreneurs who have a few million dollars in the bank and a great idea, but who self-destruct for two big reasons: ego and fear. I wrote about this in How To Destroy A Tech Startup In Three Easy Steps, where I talk about two cases that I saw with my own eyes. This kind of self-destruction is much more common than the kind of clever, rational behavior that you impute to these companies. And I think many other people are also witnessing this kind of irrational, self-destructive behavior; consider the reviews posted on the book's page, where quite a few people chimed in to say the book matched their own experience:


I think it's quite possible that both you and the parent comment are correct.

Based on my experience, you are most certainly correct that the companies using leetcode aren't following any line of reasoning, and certainly not one as sophisticated as what the parent is implying. Most decisions made by "leadership" are irrational and self-destructive as you describe there (your book looks very interesting btw).

However, what the parent is claiming about the effects of the leetcode interview process is certainly true. I also think the parent is correct that these, perhaps unintended, consequences of the leetcode interview, in attracting a docile though technically competent work force, are beneficial to large organizations.

I also think the parent's second comment, about these consequences not being beneficial to startups, ultimately aligns with your view: startups do great harm to themselves by mindlessly aping the behavior of big-name tech cos, mostly out of ego and fear.

> However what the parent is claiming, as far as the effects of the leetcode interview process is certainly true

And once you hit LeetCode employee critical mass your culture becomes LeetCode employees. Management may have started it, but employees amplify and make it ubiquitous (especially engineers promoted to management/hire/fire).

> in attracting a docile though technically competent work force

Bingo. IMO this hits the nail absolutely on the head. $BigCo wants docile people who will show up and jump when they're told to jump without asking questions or making a fuss. And they'll throw a big bag of money at you to do so.

Obviously this is an incredibly narrow/limiting employment experience. But different strokes for different folks I guess

Would love to talk about the dating app world. It's so funny how many founders come into it thinking their "revolutionary matching algorithm" or whatever is going to make the greatest dating app ever. Until you realize that the problem is human nature, and no one wants to be told who they should match with. People want a name, an age, and a face; that's it. And anything that gets in the way of that will lead to your app being ignored.

Yup. That's why Tinder won over OKCupid.

People just want to meet attractive people, compatibility be damned.

I'm glad I met my wife in 2010, before online dating became an utter cesspit.

And a location/distance, that seems to be pretty important most of the time, from what I can remember, but otherwise yeah.

We had one of those people as a client. He thought his "special sauce" of Myers-Briggs, some neuroscience research test bullshit we had to build into the system and have people take, and some bullshit matching algorithm based on astrological sign would be the next big thing. Of course it wasn't, but my boss was happy to take the guy's money to build it anyway.

The client actually told us that it would have ten million users within three weeks of release, with basically no money spent on marketing. It did not. Nowhere close.

And a height.

And a wallet (sometimes)

And a boat.

And a pink inflatable flamingo in a pool, and a cocktail

A somewhat relevant aside: ego and fear are also described in Eastern philosophy as the two biggest obstacles on your way. Ego is the illusion of separateness, the great seed of all human evils, which takes a monumental effort to get rid of. Next comes the fear of taking the next step ("taking on more responsibility," as we would say here), because that new responsibility feels so overwhelming. Those who fail to make the leap of faith (or "to challenge themselves," in corpspeak) fall, and have to climb up again.

Thanks for recommending "How to Destroy a Tech Startup in 3 Easy Steps" - loved it!

edit: sorry, just realised you are the author!! (lack of sleep - not work related ))). Great reading, really enjoyed it. "Sital" gonna be used as a nickname )) Thanks mate! :praise

Every unicorny startup I've interviewed w/ that had a standard big-tech interview loop (1 LC style screener + final w/ 2-3 rounds of LC + 1 system design) was chock full of ex-big tech engineers and managers, replete with stories about wanting a faster paced / dynamic environment.

So yes, that absolutely is who they are recruiting. It probably comes down to nothing more than believing big tech companies have the very best in the industry (i.e., like being in an Ivy League school) and standardizing the hiring bar with other companies they're actively in competition with. More than obedience, it's probably the aspect that they're smart and determined to succeed.

> More than obedience, it probably is the aspect that they're smart and determined to succeed.

Yup that's what it's about!

The problem with trying to test how good you are at the job is that you simply can't do it in a 1 hour (or even a 1 day) interview. You can however assess how smart somebody is, which is definitely correlated to job performance. It's also connected to growth potential - even if it were somehow possible to accurately measure how good a programmer a candidate is at the moment they are applying for the job, companies would still want to try to predict how good they are likely to be over the next few years.

> You can however assess how smart somebody is

Are you saying the Leetcode-style interview assesses how smart someone is?

Not OP, and I don't think it directly indicates how smart you are, but it does show you are at or above some smartness threshold (i.e. smart enough to get the answer).

If you don't get the answer, you might still be above that smartness threshold, but didn't get it for some other reason (didn't have time to study, didn't sleep enough the night before, interview anxiety, etc.).

Yes, being able to understand and apply algorithms and data structures requires some intelligence.

I know we try to stay away from the “we’re smarter than you” vibe here because it’s gross, but pretending leetcode style questions don’t require some degree of complex thought is absurd.

Personally, I need to practice leetcode style questions pretty regularly in order to perform reliably without reference materials. If more than a couple of months go by, the details of particular algorithms begin to fade.

I think it's unreasonable to expect people to spend their off time practicing these sorts of problems, in the same way we think it's unreasonable to require everyone interviewing to be working on personal development projects over the weekend. I suspect most working people simply do not have the time.

You can memorize a lot of them and forget everything in a year.

If enough people are doing it, higher-IQ folks will bubble up in calibration.

Which at the end of the day still doesn't test that you can do the job.

OK, but it does, though, all other things being equal. And you'd better believe they do have other types of interviews, not just LC.

Ok, tell me what other types of interviews do they do other than system design, leetcode, and behaviorial for engineering?

Lol, yeah, apart from sanitation, medicine, education, wine, and public order, what have the Romans ever done for us? In your view that's not enough to make an informed eng hire?

Lol. So you can't give me any concrete examples. Got it.

Nothing tells me more that someone is full of it than their simply being dismissive of others. I asked for an example to support your argument; instead you chose to be defensive and give me attitude.

Exactly. If you have a candidate who can design good systems, solve complex algorithmic problems, and play nice with others, then I’d say that’s a pretty strong candidate.

> You can however assess how smart somebody is

Why not just give candidates a standard IQ test then? They are probably more reliable than Leetcode...

IQ does not measure determination or work ethic at all; in fact, many with high IQ end up being nobody. High IQ makes learning relatively easy, so many get used to that and just let the 'grit' go.

leetcode is the SAT for coding, not perfect, but at least it's close to fair play.

> IQ does not tell determination/hard-working-attitude

this is what everyone said a college degree was for. Or experience. Neither of which matter when a senior dev with 10 years of experience and two kids still has to find an hour or two a day for three months in a row to grind out textbook algorithms to fake problems. Just to switch their fucking job. Thanks to cargo-cult insanity, god help those stuck in miserable jobs that just want out.

To be fair, most are real problems, just phrased in a contrived way.

Yes, they're rarely encountered in most day-to-day jobs. But that doesn't make them fake...

I mean, if the actual solution in a work environment is "use an existing solution from people who already did it," but you have to solve it from rote memory, then it's a fake problem.

The actual solution for encountering a differential equation at work is Wolfram Alpha, not getting out pen and paper to apply heuristic knowledge of solving differential equations.

> leetcode is the SAT for coding, not perfect, but at least it's close to fair play.

It would be amazing if leetcode were like the SAT, and you could just get one good score and then never think about it again.

Anything like that would make it much lower-friction to switch between FAANGs (and friends), though, which I suspect is a big part of why they've settled on doing things this way.

That’s a fascinating idea. Get companies to accept some form of standardized testing and have it be transferable. That would greatly increase the motivation to study for it, I would presume.

I, for one, will probably never subject myself to a FAANG-type interview, but absolutely would study for and take a similar standardized test if it unlocked the same kind of opportunities, and I didn't have to re-take it with every interview.

This is a startup idea, in fact. Wait, someone is doing that: https://codesignal.com is one I just learned about yesterday, but it's not 'standardized' the way the SAT is.

In fact I believe software should have some qualification tests, e.g. general coding, database, cloud computing, etc. Like CPA for accountants. Each test should be valid for a few years in each category.

The hard part's not creating some kind of certification, it's getting desirable employers to accept it as a replacement for the most-painful parts of their interview processes. I suspect a lot of top companies don't want to make it easier to jump between them.

The "most painful" part of any interview process is just the part that you happen to be the worst at. Personally, I don't find leetcode to be painful at all and would be much more enthusiastic about something that eliminates any part of the interview where you have to talk about yourself.

What's the problem with talking about yourself? You can just memorize a few sentences since the questions are always the same.

Yeah, witness Triplebyte's lack of success in getting companies to use them as a substitute for anything more than the initial phone technical screen.

Sure it's a great idea and hiring culture would be so much better if it happened, but I think the parent is correct that companies want friction at that point. None of them want to make it easier for talent to jump ship and transfer.

you could even call it certification!

Certification is generally binary, though - you're either certified or you're not. This might be a good thing though, as it establishes a decent floor for technical competency and can save everyone from having to deal with at least some of the technical interview nonsense.

What some of the people in these comments seem to want, though, is some sort of standardized score/ranking system, which I suspect might lead to even more nightmarish outcomes than the current leetcode-y interview processes. (e.g. Employers setting absurdly high cutoff scores, choosing one applicant over another simply because they scored a couple points higher, applicants grinding unimaginable amounts of unpaid hours to bump up their scores a bit, etc.)

I've done first party certification and it was always a joke -- they need you to pass so you can get your employer to use their services. Are there trusted third party certification services that are willing to fail half or more of their customers?

true, the keyword is 'standardized' at national level, even 'global' level.

You're going to hear complaints about bias in selecting the types of problems.

For the SAT or GRE, my understanding is that they have a huge pool of problems that are picked and assembled somehow before each test. I hear of no one complaining about the SAT's or GRE's selection of problems, so it might be less of a concern.

I can't tell if you are being facetious or not. I hope you are being facetious. I really do.

As many other comments have indicated right here on Hacker News, professional licenses are commonplace for occupations such as plumbers, electricians, doctors, lawyers, and even many types of engineers.

I suppose that central governments (such as the US federal government) should offer licenses for myriad types of software engineers, hardware engineers, software architects, hardware architects, and so on.

Wouldn't it help companies if they could choose to interview only candidates who were licensed as, say, a level three penetration tester (intermediate penetration tester) or a level five database architect (expert database architect)?

The whole "Let's reinvent the wheel mentality" surrounding software, The Internets, and hardware simultaneously bemuses and frightens me.

"The eye never has enough of seeing, nor the ear its fill of hearing. What has been will be again, what has been done will be done again; there is nothing new under the sun." King Solomon, Ecclesiastes.


"I only wish that wisdom were the kind of thing that flowed ... from the vessel that was full to the one that was empty." Plato, Symposium

Why don't we simply allow unqualified, blind, inebriated people to drive automobiles on public roads at whatever speed they would like? Why don't we let a guy who watched a bunch of YouTube videos call himself a brain surgeon, and perform brain surgery on people who don't even need brain surgery in the first place? Hey, wait, i've gotta gureaat idear: y botherr haviing aany ruuules at al! Sheesh.

Without rules, men simply return to a state of nature where life is nasty, brutish, and short (Hobbes).

I just found this on Google...

> Origin of "Life is Nasty, Brutish, and Short": This expression comes from the author Thomas Hobbes, in his work Leviathan, from the year 1651. He believed that without a central government there would be no culture, no society, and it would seem as if all men were at war with one another.

The “Wild West” mentality of folks who seem to believe that rugged individualists (not federal agencies such as the US Department of Defense) built Silicon Valley is perched atop the same type of popular, yet nonsensical, mythology as Horatio Alger's famous character, who was actually named Ragged Dick (really, that was the character's name; I am not being facetious) and who metaphorically pulled himself up by his bootstraps to rise from street urchin to wealthy industrialist CEO.

Imagine a military, any military, anywhere, anytime in human history, that didn't have ranks, titles, and, gasp... tests which members had to pass to move up the ranks. How well do you suppose a military without ranks and without tests would fare in combat? Obviously, such a military would be in a state of hopeless disarray.

Licenses are not a necessary evil; they are a good and proper way to identify and reward qualified professionals, while simultaneously enabling "the rest of us" to know, for example, who's a mere private, whom we can walk past without batting an eye, and who's a colonel, for whom we must stop and salute.

Imagine a hiring manager saying, "Hey, this kid never went to college, but he's a freshly minted level one software engineer. I say we bring him in for an interview."

Why should every company need to create their own initial screening tests? Imagine a trucking company that needs to hire a truck driver with a particular type of commercial driver's license (CDL). In the employment advertisements they post, such companies almost invariably include verbiage such as, "Class A CDL required" or "Must have a Class B CDL." See? The candidate must have already passed an initial screening test, by acquiring a particular type of license, prior to being granted an interview.

This is obviously a huge benefit to candidates and companies alike because it saves both sides a lot, and I mean a lot, a lot, a lot... of time!

These days it is comically inane that tech candidates are normally expected to take an endless stream of initial screening tests to prove their mettle to each and every prospective employer (unless they were referred, famous, or for some other reason exempted from the requirement).

Imagine a CPA (certified public accountant) being required to pass a basic auditing test before he was granted an interview for a new job. Why would a company ask a CPA to take such a test? If a candidate is a CPA, then (unless, for example, he cheated on his CPA test, or suffered some sort of memory loss) he has already proven that he has a substantial amount of knowledge about auditing.

Yes, of course companies should administer their own tests. But professional licenses can, and do, enable both candidates and companies to avoid the sort of initial screening test which companies commonly require of software engineering candidates. Initial screening tests such as LeetCode are currently a response by hiring companies that are typically deluged with a sea of unlicensed (and almost entirely unqualified) candidates, all of whom claim to be qualified.

Nice rant. It’s full of logical holes, of course, as there’s no common definition of tech roles and their titles, let alone a common understanding of the included tasks.

There are React Boot Camp folks who can whip up a frontend in no time who are arguably more competent at this task than a PhD in Comp Sci, and there are PhDs in Comp Sci who never wrote a lick of code in their life.

Technology in software IS Political, and 30 years of development experience shows me various parallel worlds that are arguably better than our current one.

Someone who grew up coding and loves hacking as close to the bare metal as possible is worth more than a dozen of your certified software engineers.

Let me know when the world standardizes on a Comp “Sci” curriculum.

Actual computational science has very little to do with computers and software development, which is a thesis statement I will be happy to support if pressed. For now, I end my rant

> but at least it's close to fair play

How is it fair? It tremendously favors young graduates and people with lots of time to prepare. Would it seem sensible to you that every time a doctor applied for a job, even with 20 years of experience, he'd need to compete with recent medical school graduates on some first-year medical school exam?

Because people are under the impression that it's illegal. That and I think a lot of people will make its use be about race, so people avoid it.

I was just entertaining OP; I don't think IQ correlates that well with SW engineering job performance. It's only one of many, many factors (albeit probably the easiest thing to test for; testing curiosity, emotional intelligence, motivation, work ethic, and resilience is much, much harder).

But if we're saying Leetcode is simply testing for IQ and not really for software engineering ability, why not just test for IQ? IQ tests are designed so that beyond a certain threshold they're pretty hard to improve at; they actually test some innate ability. Leetcode tests how well you are prepared for Leetcode, and perhaps how well you do under stress. It correlates only slightly with intelligence and even less with programming ability.

I think the fact that leetcode needs to be studied incorporates work ethic/motivation/resilience alongside IQ. Also, at some stage in the interview process you have to interact with people which will expose your social skills/emotional intelligence to some degree.

Bad optics

And algorithmic questions are a straightforward way to get a scalable, structured interview round. It's difficult to entirely replace algorithmic rounds with questions that are both scalable and structured.

Leetcode doesn't measure "smart and determined to succeed". It measures "has enough extra time and energy to devote to practicing pointless brainteasers for weeks."

In other words, whatever it's intended to do, one of its primary functions in practice is to screen out people who are bright, driven...and poor, working long hours and trying to keep themselves and/or their families going.

Weeks? There are stories on Blind and the LC forums of people taking upwards of a year with pretty hefty real-life constraints. And why wouldn't they, given the opportunity to essentially double comp over standard industry jobs? That's a life changing opportunity if you're in a somewhat unfortunate life situation.

That it's a bit of a sacrifice is the point. If you're naturally smart enough this stuff comes quickly, great. If not, you're going to have to work for it - that requires discipline most people don't have.

Better put, smart and/or determined to succeed.

Did you, like, read my second sentence at all?

Whether you realize it or not, you're advocating for keeping people who aren't already mid-to-upper-middle-class (among others) out of these kinds of tech companies. Anything that's designed to make you work extra hard, outside of work, to learn separate skills just to pass interviews is guaranteed to make it disproportionately harder for people who are, for whatever reason, unable in practice to devote many hours of their free time to fairly complex technical studying.

No matter how "disciplined" they are, people already working 80 hours/week just to put food on the table don't have the luxury to be doing that. No matter how "smart and/or determined to succeed" they are, people raising 2 kids by themselves would be irresponsible to be doing that.

Now, maybe you think that kind of person should be denied an opportunity to join the super-1337 hackaz' club that is FAANG or whatever. Personally, I think that kind of classist gatekeeping is disgusting.

> designed to make you work hard extra, outside of work to learn separate skills just to pass interviews

Ah, you're talking about those programming jobs that never require thinking about algorithms?

While I certainly wouldn't presume to claim that no programming jobs are remotely related to Leetcode—I'd be perfectly willing to believe that you, personally, have to engage mentally with "classic" algorithms on a daily basis—from everything I've read about them, I doubt very much that more than a minuscule fraction of programmers and other tech workers do anything on a day-to-day, or even month-to-month, basis that would be close enough to Leetcode that they could be considered similar skills.

I've been working as a programmer—largely PHP, Java, Objective-C, and Swift—for over 20 years now, and never, in my work, have I been asked to invert a binary tree, reverse a singly-linked list, or any similarly contrived scenario.

There's a huge difference between "thinking about algorithms" in the most general sense of that phrase (which means basically any programmer who does any of their own design work) and in the sense of the particular "algorithms" that we learned in CS classes and are featured in Leetcode problems.

In no circumstances did I ever make algorithmic decisions in an informational vacuum and then cowboy code my way forward, or rather, one seeks to avoid the mistakes of one’s youth.

Search engine: read material about the heart of the problem from domain experts, try different approaches and then decide.

Leet Code exercises are a blight on our industry.

If you want a shibboleth just say so.

Took me about 5 years to get even modestly proficient with medium-level leetcode questions. I'm a slow learner.

I never did a CS degree, so much of the learning was decently useful, but it also took me years.

I only took 1 CS class in college- Cybernetics with Huffman. I failed! So clearly, that was a strong predictor for my future 12-year career at Google (where I failed to do quicksort, or virtual ants problems). Nowadays I take a lot of extra time to come up with questions that aren't in leetcode, are easier than leetcode, and give far more signal than any leetcode would, for anything except a 10X Staff Engineer.

Can you share what kind of things these are that give far more signal?

To be honest? Every programmer needs to be able to pass FizzBuzz. Beyond that, I add some more basic work around byte representations of data (I have 4 symbols... how many bits do I need to encode a symbol?). I will typically also include one major bug in a piece of code and ask the candidate to identify it (with hints).
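For what it's worth, that symbols-to-bits question reduces to ceil(log2(n)); a throwaway Python sketch (mine, not the interviewer's):

```python
import math

def bits_needed(num_symbols: int) -> int:
    """Minimum number of bits needed to encode one of
    `num_symbols` distinct symbols."""
    return max(1, math.ceil(math.log2(num_symbols)))

print(bits_needed(4))  # 4 symbols (e.g. A, C, G, T) -> 2 bits
```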

I once interviewed a guy- a CTO at a biotech- and he wouldn't answer the question "I have a million DNA sequences and want to count the number of occurrences of each sequence" (could be hash table, could be any number of other solutions; he just balked and exited the interview).

Only if the person can get all that right would I even consider asking something more complicated.

> I once interviewed a guy- a CTO at a biotech- and he wouldn't answer the question "I have a million DNA sequences and want to count the number of occurrences of each sequence" (could be hash table, could be any number of other solutions; he just balked and exited the interview).

Maybe he's just been around the block a sufficient number of times to have gotten tired of the standard SV "here's a hoop, jump through it, now do it again, and again and again" ritual that it likes to call "interviewing" for some reason.

Seriously - your questions are fine for junior or mid-level roles, but beyond that, you should have more important things to talk about. And both of you should be able to tell whether the other is a bullshitter and/or a lightweight within about 30 minutes at most (and often far quicker than that) of a normal, focused technical conversation -- without having to specifically grill the other person, or otherwise put on the pretension that you're the one whose bona fides are established beyond question, and they're the one who needs to jump through a sequence of hoops to prove that they are worthy of your time and attention.

Just cut the crap, get down to brass tacks, and talk what needs to be done as if they're a peer. That's all you have to do.

I agree with asking simpler stuff first. But what about the more complicated stuff? My problem is that when we talk about questions having more "signal," how are we determining that those questions are giving us more signal?

My question is fairly open and allows me to continue asking significantly more complicated questions. So far, nobody has (for example) asked enough clarifying questions to determine that a counting bloom filter could potentially solve the problem; most people just store ASCII string keys in a hash table, which wastes tons of memory and requires a lookup for every string.

So usually I end up after 45 minutes finding that the candidate has more or less tapped themselves out at "make a hash table of string keys, use it to store counts" (OK for a very junior programmer) or "make a perfect minimal hash" (I help them get there if they don't know what those are) or "use a probabilistic counting filter". This is about all I need to make a determination.
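The baseline "hash table of string keys, use it to store counts" answer is a one-liner in Python (my sketch; the follow-ups above are about replacing the raw ASCII keys with something more compact):

```python
from collections import Counter

def count_sequences(sequences):
    """Baseline: store raw string keys in a hash table of counts."""
    return Counter(sequences)

counts = count_sequences(["ACGT", "ACGT", "TTGA"])
print(counts)  # Counter({'ACGT': 2, 'TTGA': 1})
```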

The problem with this question is that outside the junior people who won't understand the hints, many good programmers would pass or fail just based on whether they have seen a similar problem or know what a Bloom filter is. CS is so broad that you may have been working for many years and never come across one topic that the interviewer deems standard.

I don't fail people who don't suggest a bloom filter. What I meant was: knowledge of probabilistic data structures and knowing this is a situation where they could be used is a sign of a more experienced programmer (likely one who's worked in clickstream processing).

If all you do in my interview is know that a hash table or other associative data structure is good for maintaining counts; that using a full ASCII byte to store symbols from { A G T C } is wasteful; that you can compress the letters to 2 bits each and pack multiple 2-bit values into bytes; and that you can write a function that codes and decodes such data, I'm happy and you get a pass. I'm even happy to sit there and help people through nearly all the steps of the codec. Not trying to trick anybody or select for obscure CS knowledge.

To me the real question is, how much hinting is reasonable for the coding and decoding function? Many programmers (including senior ones) struggle to implement:

x <<= 2       # left shift the int to make room for the next item
x |= pattern  # or the new 2-bit item into the accumulator

Since I never use bit munging in my day job (it's 90% python data science) should I really ding somebody for not knowing about left shift or or-assignment?

What I really don't get is why people immediately jump to trying to implement Huffman coding, RLE, or lookbacks as a solution to reduce the size of the DNA string. I still wonder whether the way I present the question gives everybody a fair chance to shine.
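For reference, one runnable version of the 2-bit codec being described above; this is my own sketch (the sentinel high bit, which keeps leading 'A' = 00 bases from being dropped, is my assumption, not part of the original question):

```python
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
DECODE = {v: k for k, v in CODE.items()}

def encode(seq: str) -> int:
    x = 1  # sentinel bit so leading 'A's (00) aren't lost
    for base in seq:
        x <<= 2          # left shift to make room for the next item
        x |= CODE[base]  # or the new 2-bit item into the accumulator
    return x

def decode(x: int) -> str:
    bases = []
    while x > 1:  # stop when only the sentinel bit remains
        bases.append(DECODE[x & 0b11])
        x >>= 2
    return "".join(reversed(bases))

assert decode(encode("GATTACA")) == "GATTACA"
```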

I am 100% certain that your questions are not giving everybody a fair chance to shine. You have to start with the area of expertise they have already worked in and not the area that you happen to have experience in or be interested in. I could (for example) ask “simple” dynamic optimisation questions that are “easy” for me and I know that 95% of great experienced developers would fail. So obviously I don’t do that. I want to hire smart developers who can learn. Not developers who happens to have the same experience/interest that I do.

I hope you realised that your questions are extremely narrow and in no way filter for competent developers. Imagine yourself going to an interview and being asked similar narrow specific questions but in a different area you have no previous experience in. You would probably fail. For example: how would you represent a lambda closure when compiling a functional language to machine code? It is a super easy question for me but I know most people would fail. So I don’t ask questions like that when interviewing.

This sounds like a perfect question for a company that's hiring engineers to focus on optimizing DNA storage. If that's your goal, you should just put it in the hiring criteria; you'll get more candidates who know those types of algorithms. If your goal is to find out whether the candidate knows about bloom filters, a better question might be "What do you know about bloom filters?" You'd definitely get an answer in less than 45 minutes.

I think "key/value counts" is absolutely the correct starting answer. Anything after that is optimization. In the real world, optimization comes at length through research, learning, grokking, reading, benchmarking, sleeping, mulling things over. The only way to shortcut that is knowing about those optimizations in advance, so you might as well just ask about those algorithms specifically.

All that being said, I think it's an enlightening question. Thank you for sharing it! You've educated me. I've been coding a loooooooong time and didn't know about bloom counting filters specifically.

But have you had a chance to look back and see if those questions worked out? Like, are they a good predictor for how the person actually performed in the job?

I've only been directly involved in hiring five people, and it worked out well for all of them.

cool. Thank you!

Bloom filters have collisions. How would that work with counting sequences?

Because (and this is absolutely a clarifying question I would love to get), approximate counts are acceptable.
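To illustrate why approximate counts can be acceptable: a counting filter keeps several hashed counters per key and reports the minimum, so collisions can only over-count, never under-count. A toy sketch (my own illustration, not anyone's production code):

```python
import hashlib

class CountingFilter:
    """Toy counting Bloom filter: k hashed counters per key; the estimate
    is the minimum counter, so collisions over-count but never under-count."""
    def __init__(self, size=1024, k=3):
        self.counters = [0] * size
        self.size, self.k = size, k

    def _slots(self, key):
        # k independent slots derived from salted hashes of the key
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{key}".encode()).hexdigest()
            yield int(h, 16) % self.size

    def add(self, key):
        for s in self._slots(key):
            self.counters[s] += 1

    def count(self, key):
        return min(self.counters[s] for s in self._slots(key))

cf = CountingFilter()
for seq in ["GATTACA", "GATTACA", "ACGT"]:
    cf.add(seq)
assert cf.count("GATTACA") >= 2  # may over-count on collisions, never under
```

The memory is fixed regardless of how many distinct sequences stream past, which is the whole point versus an exact hash-table count.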


Exactly. I honestly think no one is more entitled than software developers. Not even that chick from Mean Girls. Hell, in which other similarly paying field can you double your compensation in three or four months? Six months if you are slow or have a demanding life. Lawyers? Hahahaha. Doctors? Hahahaha. At least with LC there is an end goal: those 75 questions. Or, well, I think 150 questions. I hate LC personally but am also able to recognize that it is a silver platter handed to me. The article above mentions all the negatives. Ya bro, who didn't know them?

At least with those professions you have one test you pass, and maybe you retake it once every few years to renew your credentials.

With Leetcode, you have to retake the test multiple times every few years when you change jobs for each company you apply to.

Leetcode would be a lot more tolerable if it was administered more like the bar exam or like medical exams. Whatever happened to DRY?

yep. And you typically do change jobs every few years just to get a raise or promotion.

Which means... Always. Be. Leetcoding. Do the bare minimum at your job and then do LC the rest of the time. Because your job is just your job, but your career is LC. These companies don't yet realize they are optimizing for mercenaries that have no loyalty to the code nor the company.

On a slight tangent, some of the people over on Blind would sell their own mother for a tiny bump in TC. That mentality used to be limited to Wall St. or maybe Big 4 accounting firms, etc. But now it's the whole tech industry. I remember having a software job was a lot more fun, back around 2007ish. That culture is so foreign to me now.

It's what happens when an industry is awash in dumb money. Startups are seen as either half-baked schemes trying to get a cut of the VC funding pie, or cheap R&D labs for the big tech companies in search of inevitable acquisitions. The big tech companies, who originally made their money delivering goods or services of real value, squander that goodwill through oligopolistic rent-seeking, engaging in shady anti-competitive or anti-consumer behavior, and endlessly building derivative, short-lived products that users don't actually want in hopes of inflating their moats. In a milieu such as this, engineers become mercenaries because "making the world a better place" has become a tired old cliche, both false and naive. Just a lot of cynicism all around.

I'm quite certain that making people dread interviewing is part of the point. Else why keep making people who've already passed multiple times do it again? If the first FAANG started routinely letting people who've already passed a couple of similar interviews at peer companies skip the leetcode hazing, the other companies would have to follow, and suddenly software developer pay would rise even faster than it already does, which none of them want.

Even having to pass such a test on a very aggressive set schedule—say, every 5 years—but only at those set times, would be better—for developers. Not for the companies hiring them.

Being able to quickly come up with an efficient algorithm to a complex problem is not a "pointless brainteaser".

Being able to quickly come up with efficient algorithms for domain-relevant problems is not a pointless brainteaser.

Being able to quickly invert a binary tree and that kind of nonsense belongs in library code (at best). It's not something that the average developer should ever need to worry about the actual implementation of, let alone have to implement from scratch in under 30 minutes.

Yes, I agree they're relying on post-hoc reasoning to justify their own place in the industry. All of the grade inflation stories about Ivy League and similar schools and personal experience in the industry with these types don't give me a lot of confidence in the present value of those school names if the dynamics of VC funding change a bit.

I wonder which came first? Did they hire those people because they implemented leetcode style interviews, or did they implement those interviews because they originally hired or were founded by ex-bigco folks that simply did what they knew?

Perhaps there's another collectively shared work experience that also created leetcode-style interviews: dealing with someone who has a degree and can speak well about what it takes to code, but who cannot code to save their life. I have known more than a dozen such "engineers". That's the reason you make candidates code during an interview. I think the emphasis on an optimal O(N) solution comes from a set of engineers who don't believe it should be "that easy", and it doesn't really produce more of a signal.

My coding question, for example,

You have a music player that should be playing a 900 song list randomly. You notice that you keep hearing a song being repeated during your drive and you are curious if its truly random. You also keep hitting skip in the hopes a particular song comes up. Write a piece of code that simulates this.

You should tell me three things: 1) the number of songs played before a repeat occurs, 2) how many songs needed to be played before you played them all, and 3) after you succeed in playing them all, how many times the most common song has been played.

You can use libraries if you know them, and you can use the built in sort method in the language, I will google it for you if you don't remember the syntax.

But.. why? Does that reflect anything close to the job they would be doing? Is handicapping someone in every way (You will google it for them), putting them on the spot in an already tense situation and expecting them to code while you watch the way that literally any software job works?

This is the insanity to me, "Here, do this contrived task that doesn't represent anything you will be doing... to prove that you can do the job"

I once had a whiteboard interview for a senior engineer position where they demanded that I write syntactically correct Python, indentation and all, on the whiteboard. I was trying to talk about code at a high level with them; meanwhile, they were deducting points because I assigned a dictionary key directly vs. using the dictionary's method. It turned me off to the company as a whole, and the entire interview went downhill from there.

> But.. why? Does that reflect anything close to the job they would be doing?

Yes? Yes. Figuring out how to make a computer solve problems is very much the job of a software developer. They will only encounter harder and less well defined tasks in their actual job. If they can’t do this and you hire them, that is like hiring an opera singer who is mute, or a baker who is deadly allergic to flour.

> putting them on the spot in an already tense situation and expecting them to code while you watch

There are mitigating factors one can apply. We make sure our hiring managers let the candidates know that there will be a coding challenge. We ask the candidates if they prefer to chat while they work through the task or prefer to be left alone, and we accommodate what they choose. We let them know that whichever style they prefer, it won’t change anything.

> meanwhile they are deducting points because I assigned a dictionary key directly vs. using the dictonary's method

That sounds very unpleasant. Sorry to hear that. Interviews are a two-way street: you are interviewed and at the same time you are interviewing them. I think you were right in judging them, and you dodged a bullet there.

By the sound of it you are a talented and capable developer. It might be that you can’t imagine it, but there are people who apply for developer jobs, have a really good ability to talk about the job, and seemingly have the right experience, yet somehow can’t program even super simple tasks. Even after you give them every accommodation imaginable to humankind. If you haven’t seen this yet you won’t believe it. If you have seen it you want a filter against this particular kind of candidate.

I’m not saying that this filter always goes well. Every filter ever invented has had both false positives and false negatives. We might lose a brilliant developer because some quirk of the task throws them. It is sad. We are trying to minimise the chances of this, but it certainly happens.

I can certainly understand that there are unqualified people who apply. What I'm disputing is your assertion that there is a correlation between doing these contrived problems under unrealistic conditions and future job performance. It sounds like your goal is just to filter out the absolute worst of the worst, and I'm sure it's effective at that, but I believe you might be filtering out more of the top end than you realize.

> It sounds like your goal is just to filter out the absolute worst of the worst,


> I'm disputing is your assertion that there is a correlation between doing the contrived problems under unrealistic conditions and future job performance.

You are projecting something here. You are saying, without any supporting evidence, that the problems are contrived. They are not. They are really the core of what my coworkers and I do day in and day out.

You are also saying that the conditions are unrealistic. What makes you think that?

If you and your coworkers are solving the problem you posted "day in and day out" just hire me and you can get rid of them :)

You are right, though: I have no evidence they are contrived other than your example problem. That the working conditions are unrealistic, though, is indisputable. Do you typically relay your Google searches for syntax through your boss? Do you often work under extreme time pressure on problems you have never seen before? Perhaps you do, but I can tell you with certainty that it's not the norm in the industry to work that way.

Not the guy you responded to, but is there a good way to avoid time pressure? A limit has to be set on the time spent per candidate, from the company's point of view.

> It sounds like your goal is just to filter out the absolute worst of the worst

Indeed, this is a holy grail of the applicant funnel. If a tech-for-interviewing company comes up with a better way to efficiently remove the worst 50-ish% of applicants, they’ll have no worries about their future. (The overall applicant pool is significantly less skilled than the pool of people who have been, or soon will be, hired.)

There is a class of engineer who have little performance anxiety by nature. They simply don’t care and are normally blunt types who love algorithms.

And they are, frankly, pretty bad at writing code because they have little empathy for the reader and greatly inflated sense of self worth. They’re the kind to use complicated C++ features or algorithms for little reason. Essentially, smart idiots.

They also believe in silly things like LC being a fair and rational way to evaluate candidates and don’t see the bias at all. “Eugh she’s an ugly woman, I think I’ll give her the LC hard and little help.”

There is another class who realizes how stupid LC is, but are happy to play the game to quickly accumulate power and prestige. They usually have psychopathic tendencies and aren’t great coworkers.

LC is great at hiring these types. Feel free to stick to it if you enjoy having them as coworkers.

This fear of hiring inept people kind of surprises me. Can't they just let people go who interview very well but are poor performers on the job? If they don't care to follow up on the performance of new hires, I have to wonder if it really matters one way or the other.

I'm outside any major metro area; when we've needed to hire, there have only been a handful of people responding. When people come in for an interview, it's been pretty easy to tell if they really can't code at all without making them actually code. I haven't found this to be challenging.

I suspect the issue is interviewing a large number of people in a short period of time. If you don't have 45 minutes or so to spend on each person, using automated leetcode style problems probably starts to seem like a pretty attractive way to weed people out.

Lastly, usually one or two people from my team would take part in interviews. If there aren't any developers available at interview time (i.e., it's a manager and someone from HR), I can start to see how people without any real coding experience can make it through the process. Again, this is probably a place where automated testing looks like a reasonable solution.

> Can't they just let people go who interview very well but are poor performers on the job?

Eventually, yes. But it takes time for them to start, time and money to onboard them, evaluate how they’re ramping up, then if not acceptable, to follow whatever performance management process is indicated by the company and local law, then transition whatever work they were doing. This could be several months and tens of thousands of dollars just to get back to a worse state than when you walked into interview that candidate.

Interviewing even slightly better than last year can pay large dividends.

Here you go

  # You have a music player that should be playing a 900 song list randomly.
  # You notice that you keep hearing a song being repeated during your drive and
  # you are curious if its truly random.
  from collections import Counter
  import random

  r           = random.Random()
  songs       = range(1, 901)   # the 900 song list
  unplayed    = set(songs)
  # You also keep hitting skip in the hopes a particular song comes up.
  # Write a piece of code that simulates this.  You should tell me three things;
  # 1) the number of songs played before a repeat occurs
  # 2) how many songs needed to be played before you played them all
  # 3) after you succeed in playing them all, how many times has the most common song been played.
  counter     = Counter()
  firstrepeat = None
  totalplays  = 0
  while unplayed:
      song = r.choice(songs)
      unplayed.discard(song)
      if firstrepeat is None and song in counter:
          firstrepeat = len(counter)   # every play so far was distinct
      counter[song] += 1
      totalplays += 1

  print(f"Number of songs played before a repeat:             {firstrepeat}")
  print(f"Total number of songs played:                       {totalplays}")
  print(f"Most common song:                                   {counter.most_common()[0][0]}")
  print(f"How many times the most common song was played:     {counter.most_common()[0][1]}")
Sample run

  $ python3 foo.py

  Number of songs played before a repeat:             50
  Total number of songs played:                       7263
  Most common song:                                   375
  How many times the most common song was played:     17
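For what it's worth, those numbers are in the right ballpark for the closed-form expectations: the birthday problem for the first repeat and the coupon collector problem for full coverage. A sanity-check sketch of mine, not part of the original answer:

```python
import math

n = 900
# Birthday problem: expected plays before the first repeat is roughly sqrt(pi*n/2)
birthday = math.sqrt(math.pi * n / 2)
# Coupon collector: expected plays to hear every song at least once is n * H_n
coupon = n * sum(1.0 / k for k in range(1, n + 1))

print(f"Expected plays before first repeat: ~{birthday:.0f}")  # roughly 38
print(f"Expected plays to hear all songs:   ~{coupon:.0f}")    # roughly 6642
```

A single run landing at 50 and 7263 is well within normal variance for both distributions.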

I'm wondering why would the music player implement sampling with replacement and not without replacement? Sampling without replacement would shuffle the list and then play the tracks in shuffled order, no repeats until all songs are played once.
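A sketch of what sampling without replacement looks like (my own illustration; `shuffled_playback` is a hypothetical name, not a real player API):

```python
import random

def shuffled_playback(songs):
    """Return a play order with no repeats: shuffle once, then play through."""
    order = list(songs)
    random.shuffle(order)  # Fisher-Yates shuffle, in place
    return order

order = shuffled_playback(range(1, 901))
assert sorted(order) == list(range(1, 901))  # every song exactly once
```

With this design the "first repeat" can only happen at play 901 at the earliest, which is why hearing a repeat mid-drive suggests sampling with replacement.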

This is what I was thinking as I was driving in and heard the same song. It was an iPhone, and it was my impression that this is how the iPhone worked. Now, my iPhone did something else weird: it randomly dropped songs from being downloaded, so songs 'disappeared'.

One other thing, I just got a new car, and this was the first time I was using 'next' using bluetooth. So, as I was driving in, I was in my head trying to figure out whether the phone now only had 100 songs downloaded, or whether this bluetooth 'next' function on shuffle was picking a random song from the list, rather than 'next' on a shuffled list.

That would have been perfect. That's what I would have wanted checked in. I would have quickly turned to telling you how awesome the company was.

Nice. I would add the most common song's play count as a percentage of the median, and how many standard deviations it is from the median.

> I will google it for you if you don't remember the syntax.

You might as well just stand behind them and breathe down their neck for added effect.

Seriously, that's way too claustrophobic, not to mention a huge practical annoyance: the added latency of having to ask you to google stuff for them and then receive the result somehow, instead of just letting them do it themselves, when all they want to do is get past this trite exercise and start having a real conversation.

I notice quite a few early stage startups that recruit w/ large initial funding rounds (ie. a $20-40M series A) almost invariably have a note about how their founders are ex-FAANG. I can't help but believe it helps a ton w/ funding, professional networks, team building, etc.

Again, no different than having an Ivy League education. There are big advantages to scope/reach/opportunities in the industry.

Signaling is definitely a thing!

The predecessors to leetcode questions were being asked, for example, in PhD dissertation defenses, as well as questions highly specific to the company in question (sort 2MB of data in 1MB of RAM). Many of the problems are basically late-undergrad or early-graduate CS student problems.

Do PhD defenses actually involve asking random tech questions instead of what's in the dissertation? Seems like if they're going to fail that, you already have bigger problems.

My PhD was in biophysics and the questions ranged widely outside my dissertation. (In my case, you actually do a defense to proceed to work on your PhD, then write your dissertation and give a final talk; there's no way, unless your advisor rejects your dissertation, that your PhD wouldn't be awarded. Other programs rear-load the process and have a real "defense" at the end, which is crazy if you think about it.)

At one point in my thesis defense, I derived several equations I hadn't seen before, on the fly, such as "What is the time-resolved fluorescence of a fluorophore in 4-dimensional space?", and finally came to understand ergodicity (https://en.wikipedia.org/wiki/Ergodicity).

My defense wasn't about determining if I was an expert and qualified to write a dissertation in my field (my questioners already knew that), but to determine if I was a well-rounded general intelligence capable of out-of-task prediction.

This sounds like the qualifying examination.

In my phd experience in the USA we had oral qualifying examinations which involved whiteboard derivations. The thesis defense after writing was really mostly focused on probing the results in the thesis.

> More than obedience, it probably is the aspect that they're smart and determined to succeed.

Depending on your definition of success, I'd say being determined to succeed could well be the same thing as obedience.

I don't know man. I think you're going against Hanlon's razor with this belief. LeetCode type problems may do the things you're saying, but I'm not sure that's the intent.

Personally, I believe the reason companies ask LeetCode-type questions is because everyone else is asking them and no one can agree on a better way. Large companies want a hiring process that scales and is relatively uniform, and LeetCode questions meet that criteria. Is there a better way? Probably, but figuring out what that is takes time, effort, and investment, and most companies don't want to make that investment for something they may get wrong.

Basically, most everyone knows LeetCode is shit, but as long as everyone else is doing it, there's little incentive to change. If everyone is doing the same stupid thing, at least your company isn't falling behind. If you decide to do something different, there's a real possibility you could spend a bunch of time and effort only to make things worse.

It'd be a fun social experiment if one of the big tech companies replaced the leetcode-style rounds with something arbitrary. Let's say: The "jumping jacks" round. You have to turn your webcam on and do 100 jumping jacks in 60 seconds. Only then will you potentially advance to the next round.

The person watching on the other end can evaluate how far over 100 you got, whether or not your form matches best practice, and assess how you were breathing in case... you know... they hire you and then there's a business need for you to do 500 jumping jacks in five minutes.

They might not even end up hiring anyone different than they otherwise would.

I know it’s a jokey example but it has insight.

First, it’s probably illegal in the USA due to ADA unless you could show that it tested something physically related to the job. Example: a big company is required to make accommodation for a, say, qualified analyst who is blind, but can reject a blind candidate for a job that required driving.

But after 45+ years of hiring programmers, the world hasn’t yet figured out which factors are germane and which are not.

I'm pretty sure that most of the benefit of FAANG-type interviews is narrowing the candidate pool to almost exclusively people who'd be OK to hire. IOW they could just hire randomly from the pool with no leetcoding at all, and likely do about as well.

The trouble is, they actually can't do that because then the pool would change, quickly.

Google actually kept statistics on how much they liked candidates and how well they did after hiring. This has the obvious problem of not tracking those they turned down, or those who turned them down, but they published a paper on the results and IIRC basically found no correlation at all.

But I think like you say, what they did do was create a filter that only very patient and technically qualified candidates could get through. Beyond passing the filter, their very standardized process did not derive any useful information about candidates. But I feel pretty certain if they made an offer to every candidate and not just those on the pass side of the filter line, they would have seen some differences.

My guess is that by the time the self-selection happens (people who think they're ready to apply and have a good enough shot to make it worth all the time the interviews take) and they get to an in-person round, having perhaps passed a less harsh phone-screen leetcode question, most further leetcoding isn't doing much aside from keeping the self-selection effect strong.

The other parts of the interview might be doing something per se useful, I suppose.

Pretty awesome pace if you can do a 100 in a minute.


Looks like a solid $300k/yr+ software engineer to me. Can you send me his LinkedIn?

You should see the 10x engineer's jumping jacks.

I like that idea. It performs pretty much the same basic function that LC advocates say LC does -- "Okay, maybe it doesn't really measure useful skills, at least not in proportion to how they're actually used, but hey, it at least measures their determination to get the job and their willingness to put up with utter nonsense, which is of course hugely important to us" -- and is much quicker. And it at least gets you out of your chair, with your circulatory system refreshed and oxygenated.

Have you heard of Amazon's "LP" interviews? It seems to be a little bit like that. The challenge there is to tell a number of well-rehearsed stories.

I'm going through job interviews now (not with Amazon, but others). The whole thing feels like well-rehearsed regurgitation. Well, and a smidge of acting, obviously, because you don't want it to come off as well-rehearsed. But they know exactly what they want to hear and you're competing with other candidates who see the same writing on the wall and are doing their own rehearsals.

I thought it was just the leetcode, but it's the behavioral questions and system design exercises, too.

Eh, I'm coming across as cynical. I'll be fine--I just wish we (as an industry) had a better system.

No, it’s totally right. Going in cold is basically impossible. You have to be ready to regurgitate leetcode answers, “approved” system designs, and STAR stories about your career that show various BS story arcs. And of course pretend like you’re not just regurgitating, for… reasons. It’s all a bizarre cargo-cult acting session at this point. Big companies even send you a study guide these days, ffs.

Amazon sent me a study guide... I told the recruiter I'm going to pass.

Unless I don't have a job, I'm not grinding leetcode. I also dislike people who refer to themselves as ex-FAANG. I really don't care.

The LP style of interview optimizes even more for storytellers and a fair number of BS purveyors. The entire process is far too impersonal to drive any real and effective assessment.

I get the sense that, because of Amazon's roots as a bookseller, they still like to optimize for people who are a bit more well-spoken and well-read, compared to other FAANGs. At the executive level, PowerPoint is banned and all meetings start by reading a well-composed multipage memo.

Being able to tell a great story might not be highly valued at most orgs, but I think it's part of Amazon's culture and something they desire in their skilled positions.

This is ridiculous. You yourself admitted that this conspiracy theory doesn't explain startups. The answer is simple: leetcode interviews are stupid easy to give. There is no "nefarious conspiracy to keep workers obedient." It is simply the path of least effort.

I think leetcode-style interviews being easy for the interviewer / company is the most rational explanation for why they're so pervasive. It's easy for the company as an automated screener and as an on-site because there is a clear pass / fail rubric that can filter people. If you have a bunch of people conducting interviews who don't want to do it (Google and FB interviewers, cough cough), then this style of interview is perfect for an interviewer who just wants to get it over with.

I've noticed that Apple only uses simple leetcode-y questions for its phone screens, but for their on-sites they ask very domain-specific questions. It is interesting because if you're actually good at the domain you're interviewing for, they are very easy; if you're not, they can be intractable. It's clear that each team puts a lot of thought into the interview process, and I imagine Apple teams get a very high SNR, at least compared to Google / FB.

You could use the ancient Microsoft logic questions. They are equivalently easy. Leetcode is sticky for a reason, even if it's not the intended reason.

this post is way too optimistic. You're implying the people deciding that leetcode tests should be used are competent enough to know they suck but still use them for the soft factors you mention. I would bet essentially all of my money that instead they're simply incompetent and actually think these tests are good at testing ability.

that being said, everybody talks like leetcode tests for a week straight are the norm. I've recently been through the whole thing with several big tech companies and some smaller companies, and all of them only used them for the phone screen.

in fact, with Microsoft, for various reasons they skipped my phone screen and I went straight to the onsite, and there was no leetcoding involved at all. Still some whiteboard coding, but about highly specialized problems relevant to my field; none of this algorithm-puzzle bullshit.

I can boil this incompetence you describe down even further: employers and their employees don’t want to spend time making their own interview challenges.

It’s actually pretty hard and time consuming to come up with a mock scenario and evaluate it, especially when you also have your regular work to do.

Leetcode is seen as good enough and all the effort is on the interviewee. The current employees don’t see the pain and suckage. They already have a job and it’s not their problem.

> It’s actually pretty hard and time consuming to come up with a mock scenario and evaluate it

That's a hard problem for a startup, not for FAANGs. They could write hundreds of those and keep switching them quite easily.

Some FAANGS do this. Amazon has/had their own code evaluation system along with a live proctored exercise.

I think most companies are looking for the path of least resistance.

Maybe that’s where the problem is. Interviewing skills should be part of the job responsibility. There could be a committee that sets and evaluates the process and the questions. However, this would hurt the ego of a typical engineer so much. No one likes to be told what questions they can ask. Everyone likes to believe they are a good interviewer; it’s just that they “hate interviewing”.

>>I would bet essentially all of my money that instead they're simply incompetent and actually think these tests are good at testing ability.

Probably the same people who think JIRA is a great project management tool :>) - when in fact it (JIRA/leetcode) gets used a lot because it gets used a lot by other people, and for no other good reason.

Agreed. From my experience this is largely a result of asking developers who aren't used to interviewing to suddenly conduct an interview, often with little guidance from either their manager or the hiring team. With nothing else to go on, they rely on their own experience of how they were interviewed, with leetcode, and so they continue the cycle.

The problem with this theory is that I have met a ton of engineers who have a lot of their identity wrapped up in how well they did on leetcode interviews [mostly by memorizing enough of them to pattern-match] and heap scorn on candidates who don't do well on them, as if they were subhuman.

The ego and status jockeying appeal of leetcode is very, very high and a lot of engineers just eat it up.

I think discussions of this topic which ignore the fact that there is a substantial population in our industry who fall into this trap are basically flawed.

and those will be the same people perpetuating this flawed interview system, because interviewers tend to look for themselves in the interviewee

I think there is this:

> Big companies want people who are smart enough to do the work, but obedient enough to put up with all the bullshit that comes with working at a big company.

But then you also have the companies who see other companies do it, so they cargo-cult. Those are companies you definitely want to avoid: their entire stack exists because of cargo-culting, not because of an engineering need.

I think you also have a few of the Bro Culture companies who do this as a hazing ritual.

A key clue about some of their reasons for doing it is that they still make you go through it even if you're, say, applying at Google, passed theirs 8 years ago, passed Facebook's 5 years ago, and passed Netflix's 2 years ago.

I think part of the purpose is to cool competition between them.

I agree. On the startup side, the only "brilliant" player is the founder(s), and they just need people to pump out code, do marketing, etc... non-stop, because "they believe in the mission". I don't like the leetcode interviews either, but what can we do? I have had the other types of interviews (take-home projects, trivia questions, code reviews, pair programming) and all of them suck as well. At least with leetcode you know if you "got it right"; there is less subjectivity in the process. Also, assuming someone is less smart because they do leetcode interviews is a bad take. All companies (not only startups) need obedient workers, and the interview method has very little to do with that. If you are a true rebel, would you work for somebody else?

This reminds me of the interview process at Thinking Machines, where the interview was working onsite for a week and then getting voted on.

Or Pixar, which required you to excel in at least one thing in your life

Re: Thinking Machines. If you are looking at moving on from your current company and are evaluating 6 companies, do you have to bank up 6 weeks of PTO to 'try to work' at the various companies for a week each?

I excel at training dogs; would Pixar like to hire me?

They might (but having that one skill alone may not suffice).

It seems like a huge inefficiency in the economy that thousands of people are studying and practicing for an entrance exam that has no value in the real work that the companies actually do.

But overall that may be a good thing for society. If Google fired all their leet coders and replaced them with real engineers, they would not need as many. And the biggest problem may be that there is a limited number of real engineers in the world. Companies have to figure out a way to make do with the people that are available.

> replaced them with real engineers

nice gatekeeping. Shame on them for working to get a job they want.

After seeing all the unreliable and buggy software out there, I think more gatekeeping in IT would probably be beneficial.

You wouldn't let someone who hasn't been to medical school do surgery on you.

Blaming systemic problems on individuals is not going to help. Society is essentially a shared-responsibility model, of course, but that means even a highly skilled, technically competent, and qualified engineer can regularly produce garbage work inside an inadequately structured system. Having to hire nothing but the best to achieve decent or viable output implies, IMO, that the processes or systems are lacking, more than it says anything about whether people are capable of delivering the desired feature set.

As an adult it's your responsibility to produce sensible quality despite social factors. You cannot expect society to hold your hand.

I disagree.

Culture is important at companies. Without it quality engineering can’t/won’t happen. People will rise to the bar set by their leadership unfortunately.

Example: The engineering of the Boeing 737 Max

Source: - https://www.netflix.com/title/81272421

Page 4, Part 1.1 from https://www.vdi.de/fileadmin/pages/mein_vdi/redakteure/publi... :

> Ingenieurinnen und Ingenieure sind alleine oder – bei arbeitsteiliger Zusammenarbeit – mitverantwortlich für die Folgen ihrer beruflichen Arbeit sowie für die sorgfältige Wahrnehmung ihrer spezifischen Pflichten, die ihnen aufgrund ihrer Kompetenz und ihres Sachverstandes zukommen

Translated:

> Engineers are, alone or – in the case of collaborative work – jointly, responsible for the consequences of their professional work, as well as for the careful discharge of the specific duties that fall to them by virtue of their competence and expertise

Carrying this responsibility is an (indirect) requirement for being legally able to call yourself an engineer here.

Edit: a Netflix movie is a form of entertainment and not a valid source.

I understand this, but there's a huge gulf of nuance between expecting hand-holding and keeping your head down while bad management expects great output and results despite the problems their own individual failures create.

It's more about professionalism and stricter standards of engineering and less about being "good despite bad context".

You can hire the best artisans for your pot making company and make some of the best pots. However, sooner or later you will encounter problems like disparate outcomes of quality (due to different people having different standards of style, crafting and discipline), unpredictability and reliability.

An engineer role is to standardize processes and turn things like quality into a predictable outcome. It's not just about being good and not being bad.

Unless the allegedly consolidated decided role of “software engineer level 123” is just a fiction recruiters believe in.

What you are describing doesn’t match my lived experience in this industry

IMHO IT should be less geeky and "leet" and more professional.

The quality of a product shouldn't rest solely on having faith that you hired good programmers by chance.

The rigor of an intro level algorithms course at a serious school far exceeds what is expected in a FAANG Leetcode interview.

Sure, that's true. I haven't taken one in over 25 years. I don't remember enough of the fine details off the top of my head to solve these well. But I can count on one hand the number of times it's held me back in real life.

Instead in my day to day I'm able to recognize when a problem needs something beyond the braindead solution and a sense of roughly where the solution lies, and am able to then go do a bit of research on the topic.

Solving leetcode-hard-style problems in ~45 minutes off the top of one's head is just not a problem many developers face in their jobs. I agree that these tests convey some information, but whether that person will be a good developer is not part of that signal.

I generally agree, but in the case of algorithms courses you generally know what you're expected to know for tests, and for course work you have ample access to course materials.

Getting a random question from an infinite pool and completing it with fewer resources under a short time crunch is a different game.

The low-level/assembly course I took had tests that felt like puzzles, and those tests were a ton of fun to take. Can't say I've had the same feeling for leetcode puzzles.

What if my surgeon went to med school and was top of class, but had an attending who insisted on using the wrong tools? Or there was another surgeon on the team who created the "architecture" for a brain surgery, but for some reason in order to get to the brain they went through the chest. There can be many reasons for unreliable and buggy software that are completely unrelated to the developer.

An actually relevant test metric might help.

You are just apologizing for fraternity hazing

Good engineers learned their craft long before any school got ahold of them. The really good ones skipped the handholding.

Good engineers are professional engineers. A good engineer does a good job on what's expected of them.

A good engineer does not have to be a leet hacker, in the same way a good architect does not have to be a geeky concrete fanboy.

This is where you get attitudes like: "If you haven't been programming since you were six, you're rubbish and we don't want you."

I will take a programmer from age 6 over a dozen alleged “software engineers” performing “best practices” theater.

The former can deliver good software with fewer bugs; the latter might be able to, but are more likely to bikeshed endlessly while being hamstrung by “best” practices that turn out to be inapplicable in a given domain, or even damaging.

Do wake me up when software engineering becomes a standardized practice that isn’t a mere cargo cult.

I swear, if I read one more comment about doctors and surgery and licensing I’m going to have to diverge on a rant exposing the multiple flaws in popular SE best practices.

Is it gatekeeping? They hire so many junior devs, do you not think it’d be possible to strip the team down by hiring far more capable engineers? Would Google crumble if they only hired senior devs like Netflix? I’m convinced they hire so many purely because they have the money and managers want bigger teams to go up the managerial track.

It stops their competitors from hiring them and it means they won't start new adtech companies.

I want the airplane I am about to board to be flown by a real pilot.

Do you want the plane designed by a brilliant engineer or 100 people who are really good at memorizing engineering questions?

That makes for a snazzy answer but we should probably consider how planes are actually engineered.

Brilliant engineers don’t make safe planes. It’s actually teams with a culture of excellence and quality.

Source: - https://www.netflix.com/title/81272421

This comparison isn't totally valid. Behavioral and system design rounds become much more important in the typical Silicon Valley interview process as you get more senior.

Oh good! We finally moved past the doctor metaphor.

There is no standardized methodology in SE as of 2022 that has anything resembling actual science as its basis, or we would certainly have thrown out LC interview hazing by now.

Your comparison is inaccurate. You might as well rail against non-conservatory musicians who taught themselves to play.


I'm not saying that they should be ashamed for being leet coders. I'm just pointing out the inefficiency of the system. It isn't unique to software. Historically, a lot of jobs have required irrelevant accomplishments or certifications before employment.

In your opinion, what's the difference between a leetcoder and a real engineer?

In my mind, a real engineer really shines in the non technical aspect of things, like coordination, communication, prioritization, and getting hard questions answered. But that's just me, I'm curious what everyone else's experiences are.

I agree with all of those things. A real engineer has soft skills and the ability to look beyond the immediate problem to see what is at the heart of the issue. They can see the effects of a solution and see the problems that might arise from it. They see connections.

Leetcoders just find the most efficient solution for the problem at hand. They don't see connections to other problems, whether current or potential. They're a scalpel when you really need a massage.

I've seen leetcoders throw away less-efficient, safer solutions just to implement something more efficient and clever. And when things fail, they're nowhere to be found. Too busy with whatever current Story they're on.

You don't want to be a real engineer in business though. Real engineers do all of the hard work and thinking while getting paid less than management.

I've done management. It's mind-numbing and soul-sucking. I'll take the pay cut to have the stimulation and challenge of being an engineer.

One has time and patience to practice for leet code interviews, while the other is too busy for it.

All of that non-technical stuff won't help you when the bridge you signed off on collapses because you had no idea what the blueprint was actually saying.

I think a bunch of this "engineers don't actually need to know stuff" comes from a generation that never had to deal with things that might kill people if it fails.

A lone idiot can certainly foul things up, but look at how catastrophes usually unfold.

The right information is usually somewhere in the organization, but it isn’t shared or acted upon appropriately. The Challenger report is pretty clear that it was “an accident rooted in history”. They had a decade of data about O-ring performance in the cold, but it was ignored/overruled in the decision to launch. It certainly wasn’t the case that The New Guy just grabbed some rubber from the wrong shelf and blew up a space shuttle. The Mars Climate Orbiter was lost because of a metric vs imperial mixup, but it wasn’t one dev who capriciously decided to work in inches; there were whole teams that weren’t synced up.

Engineers certainly need to know something, but I’d bet the farm that a team of B+ engineers with good coordination can run circles around “rockstars” that won’t work together.

In fact, I'd go so far as to say that if a lone idiot can wreak havoc, it's usually a coordination problem further up.

To me a real engineer is good at engineering, what you described sounds like a good HR manager.

I don't think HR is usually on that level.

For example, suppose your team generates and shares some kind of data. You want to change the wire format. You could just go ahead and do it, letting the chips fall where they may, but I think that's bad engineering/engineering management.

It'd be better to coordinate with the folks consuming the data. Can you make their migration easier? While you're making a breaking change, are there others that would make sense to roll in? Are there times when it'd be better/worse to roll out the new version? Being proactive about this sort of stuff seems like good engineering to me.

All of those things are people management, and I agree people management is an important skill to have if you want to work in a big company in particular, but it isn't related to engineering. A guy who is completely non-verbal but makes a solar panel that is 5% more efficient is a great engineer; perhaps not liked by those around him, but still a great engineer.

Terry Davis was a great engineer, and he had the worst communication skills you could imagine.

HR does not do that. They don't prioritize nor coordinate - unless you count getting people to company dinner as coordination. They do communicate, but rarely about product or how things are done or should be done.

And I don't mean that as criticism; I mean that as "it is not their job".

Yeah it sounds like 1x cope.

Same. The biggest point I haven't seen in the comments so far is that leetcode exercises are really fast to evaluate, which accelerates the pipeline for filtering and hiring. Even a recruiter can "evaluate" them: just look at the code, more or less, and check whether the automatic tests pass. That is not the case for real engineering exercises. A real engineering problem would mostly be about architecture, which is long and hard to evaluate. Or about how to break down a rotten program and apply SOLID and similar principles to redesign it. Same thing: long and hard to evaluate.

Are those properties exclusive? Someone has to write std::sort

The problem is not LC. The problem is companies not tailoring the interview to the position.

I don't know about that, the best engineers I have met were introverts. (They preferred reading to "communicating.")

I mean, I'm an introvert and I prefer working with code, not people, but I've found my own productivity has skyrocketed now that I'm making an effort to reach out more and raise questions about bigger issues than just the immediate technical issue I'm facing.

I've worked with many people much smarter than me. And I've watched smart, very technical engineers silo themselves off. And then I've watched smart, technical engineers who make an effort to communicate and lead initiatives to improve the codebase, make arguments to management for tackling tech debt, and speak up during technical grooming to propose better solutions. And while the siloed-off introvert might be able to write better code faster, the engineer who is looking at the whole picture makes an entire team move faster.

How many companies have written, and still do write, their very own IAM solution? Duplicated work and inefficiency are at the core of modern IT production.

The Purpose of a System Is What It Does.

And just because, in the case of companies, that purpose is often unspoken doesn't mean it doesn't apply. And that's why we get leetcode and inane interview processes.


I think you're overthinking it. I think very simple questions are appropriate just because you want to see if the person knows how to code. Not knows how to balance a binary tree, but literally knows how to code. You'd be surprised how many programmers I've interviewed that literally couldn't get anything to compile. To be fair, you see the same in the finance world where someone lists Excel and data analysis as skill sets and then doesn't know how to do a vlookup.

People write all sorts of stuff on their resume. They took a C course in college 10 years ago and write C as a skill. They could be programming managers and not know what a pull request is.

I think it's definitely gone too far, but basic coding proficiency is essential IMO.

Funny that we've been discussing a version of this every single week for the last decade, yet leetcode has gotten more prevalent, not less.

The Leetcode system is incredibly self-perpetuating for some reason.

Some startups are founded by people with limited technical experience who, as a crutch, just cargo-cult technology trends from bigcos. IME you're better off going full "cowboy" than trying to crib bigco practices.

I hadn't thought about the bigco tech interview process like this. But now that you put it out there, it makes perfect sense :) I tried whiteboard interviews about 10-12 years ago with big companies and failed miserably. Then I decided never to take a whiteboard coding interview again, and I haven't. After working for startups, and a few acquisitions later, I ended up at a Silicon Valley big company without doing a whiteboard coding interview. I'm not sure where I was going with the story; maybe to show that there are alternatives if you don't want to do whiteboard coding.

I think it's more along the lines of "well, this is how I got in and if it recognized my genius it must be pretty good".

Along with "if we all had to go through this shit, so should you".

IME its biggest defenders are people who were a little bit extra proud to have gotten a job at Google, and I reckon it's those types as much as upper management keeping it going.

Like frat hazing rituals, the fact that it intrinsically makes no sense is sort of beside the point. Like frat hazing rituals, it'll have to be rooted out like a weed to actually go away, because it's well and truly baked in.

>startups are supposed to be, uh, innovating, developing new technology, and working on hard, meaningful problems?

Maybe 1% of startups do this. The rest are shitting out the software equivalent of Juicero.

I don't think a lot of thought is put into it. It's inertia and 'what the other successful big companies do'. There is also a subset that enjoys leetcode interviews and complains if you don't do it. Everyone does it because big daddy successful Google started it, and thus it self-perpetuates. 20 years ago it was bullshit Microsoft-style lateral-thinking questions that were that generation's leetcode.

On top of that, the only people who can really get rid of leetcode are pretty much senior staff or principal engineers and high-level HR people. It also doesn't get you promoted or paid more in most companies to go on a long project to change the interview process. Thus it doesn't get changed.

I estimate that leetcode will probably go away in many big companies 5-15 years from now, as today's senior engineers become the next senior management and founders who actually make these decisions. Many startups today explicitly do not have leetcode interview loops, while 10 years ago many did. When that generation of startups has its Facebook that defines their decade, and thus defines the next common interview fad, leetcode will start going away.

It also allows you to have any engineer interview pretty much any other engineer, which makes interviews overall easier to run.

> why do startups do leetcode too?

A point that I didn't see made yet: the truth is that lots (though of course not all) of startup companies just have no idea what they are doing.

I had to deal with the complexities that imagined possible future scalability issues introduced into our stack when the whole customer base could be served by a raspberry pi, and it happened in multiple companies.

And it is not just infrastructure, but also code. I have to follow the twisted clean architecture rules and write tons of boilerplate code (that could make sense on the backend for a complex system) for a mobile app that is basically just a skin over a graphql API (basically no business logic on the client).

The same happens when hiring: we need top talent like FAANG companies do (but we can't pay for it), because we are awesome, where in reality anyone with a sceptical, product-focused mindset and a "get things done" attitude would do.

In my opinion this happens because we don't want to admit that our startups will likely not need the scalability for years, we won't need to follow fancy architecture patterns because our app is simple, and we could do well with developers who don't have top computer science skills.

I have a saying that sums this up:

"How you hire is whom you hire."

> obedient enough to put up with bullshit

I have seen an over-reliance on leetcode-style questions at startups, actually. Maybe more fairly, at "we want to be Google" style companies, which is every startup. In fact, some large companies have a more relaxed way of interviewing.

"companies want people who are smart enough to do the work, but obedient enough to put up with all the bullshit"

Which is why I love leetcode interviews: I get insight into how a company, especially tech management, operates. During the dog-and-pony-show interview I will put up with it, but it definitely casts the company in a poor light. I tell companies I am interviewing with that I will not prepare with leetcode (some will suggest it). I'd rather spend my time reading higher-level concepts. You know, the stuff that actually helps in modern software development.

I left out the word "big" in my quote. Companies of all sizes do leetcode interviews and want heads down and STFU type of developers.

> No, we all know it measures those things, and those are exactly the things the measurers want to measure.

I wouldn't be so sure everyone realizes it doesn't measure anything relevant to actual job performance, although it should be obvious (since nobody whiteboards algorithm puzzles all day in an actual job).

In all of these threads there will be many people saying they only want to hire the top X% of developers (as if it were a strict linear scale, as opposed to hundreds of intertwined skills), and they justify leetcoding as the way to do that, in the belief that a top leetcoder is somehow a top software engineer, instead of realizing these are unrelated skillsets.

Exactly. It's all about signaling [1]:

"... However, employers do refer to the educational background to differentiate between high-quality workers and low-quality workers (Kasika, 2015). Consequently, individuals with higher-ability use their educational background to gain the "education signals" that allow them to move into high level and high wage positions"

In this case, the "educational background" is shown by solving leetcode-style algos.

[1] https://www.researchgate.net/publication/299509496_Signaling...

> Big companies want people who are smart enough to do the work, but obedient enough to put up with all the bullshit that comes with working at a big company. What person better matches that than someone who's able and willing to study for and pass a tech version of the SAT?

Leetcode rather measures whether you are into brutal cramming. Let's put it this way: the kind of people who fit this description are, in my opinion, often "a little bit special", i.e. not the kind of employee that, in my experience, both startups and big companies prefer.

startups need even more obedient ones

those who will overwork, be on call 24/7, do 10 men's jobs at once, and tolerate abuse from managers, because you want your stock options to vest

Getting paid literally 1/2 or less of what you make at FAANG...

That's why they hire only 10xers, so they CAN do the work of 10 men!

doing 10 men's jobs at once

10 what now?

Ten male humans.

They are trying to point out that women also code instead of going with the colloquial understanding that it would be ten people’s jobs.

'Men' does not necessarily mean 'males.'

I think comparing it to the Bar exam for Law (and other field's equivalents) is a more apt comparison than the SAT.

> You might ask, so why do startups do leetcode too?

Not all big companies do, not all startups do and not all companies of sizes in between do. It's also a little bit up to the candidates to just not put up with the bs and stop the interview process or not even start it in the first place.

I'd agree with this. If they want people to crank out the product they have in mind... test for the ability to crank out code.

And in that case the leetcode interview may actually be very valid, and not even evil!

> What person better matches that than someone who's able and willing to study for and pass a tech version of the SAT?

Is it actually? "Leetcode" is pretty much covered by 6.006 [0] or equivalent. You shouldn't have to "grind" or study for hours if you passed the class (your problem sets should more than cover what you'll do on the blackboard).

Anyone applying for a serious engineering job should have at least done one algorithm class. This is what leetcode filters for.

[0] https://ocw.mit.edu/courses/6-006-introduction-to-algorithms...

I don’t think companies are doing it consciously, but I do think they use eighty hours of interviews and coding challenges to select for compliant, fungible cogs.

>an exercise left to the reader

So it turns out that startups also want a culture of meaningless and arbitrary brainteasers instead of building cool shit that works?

Someone who builds a truly novel technology solution involving hundreds of hours of effort gets filtered out by an interview involving contrived scenarios. You may have built the next generation X, but given an array of strings and a fixed width, can you format the text such that each line has exactly maxWidth characters and is fully justified, in the next 30 minutes? Maybe you should have cultivated that skillset instead, because around here we value parlor tricks more than real-world accomplishments.
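For the curious, the problem being alluded to is LeetCode's "Text Justification". A rough greedy sketch in Python (function name mine, and assuming no word is longer than max_width) shows it is mostly fiddly bookkeeping rather than deep insight:

```python
def full_justify(words, max_width):
    """Greedy line packing: put as many words as fit on each line,
    then distribute the leftover spaces left-to-right between words."""
    lines, line, length = [], [], 0
    for w in words:
        # len(line) accounts for the minimum one space between words
        if length + len(w) + len(line) > max_width:
            spaces = max_width - length
            if len(line) == 1:
                lines.append(line[0] + " " * spaces)
            else:
                # earlier gaps get one extra space when spaces don't divide evenly
                gap, extra = divmod(spaces, len(line) - 1)
                out = ""
                for i, word in enumerate(line[:-1]):
                    out += word + " " * (gap + (1 if i < extra else 0))
                lines.append(out + line[-1])
            line, length = [], 0
        line.append(w)
        length += len(w)
    # the last line is left-justified and padded on the right
    last = " ".join(line)
    lines.append(last + " " * (max_width - len(last)))
    return lines
```

Greedy packing plus left-to-right distribution of the leftover spaces is the whole trick; the difficulty under interview conditions is the edge cases (single-word lines, the left-justified last line), not the algorithm.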

> Someone who builds a truly novel technology solution involving hundreds of hours of effort

This person should already have enough of a reputation to get a job at many companies, if their work is public enough.

What do you suggest for the 99%+ other candidates?

I have over a decade of experience, including driving big technical change at one organisation, and was filtered out by a timed leetcode test. I put together a repository of leetcode practice, as I had a feeling that I would have bad luck on the day. They didn't look at this.

The internal recruiter said it kept happening for seniors and people with a lot of experience, but his hands were tied, as the leetcode process was deemed important by the CTO.

Actually, the best thing happened to you: you evaded working under this CTO.

What was the question?

Is that relevant?

>it kept happening for seniors and people with a lot of experience

Why would this be? Old brains not being as "flexible" to think up novel solutions?

If you think it’s about flexibility or brilliance or intuition or experience, you couldn’t be more wrong. Leetcode and the like are purely about practice, and recent practice at that: all questions follow similar patterns, and if you have them fresh in memory you can write a solution in 5 minutes. But working one out by brainstorming from scratch can take more time than allotted.

So are top leet coders better programmers? Not really; a lot of them are college students who have time to practice and are in similar competitive circles. I’ve interviewed many and couldn’t hire even one.

You think LeetCode problems require novel solutions? It's the exact opposite: they're great for new grads who just spent 4+ years cramming for exams.

Bingo. I was asked recently to find the maximum subarray of an array, in a live coding exercise. In TypeScript. So, JS, but with type annotations.

The interviewer themself said "this is probably more aimed at someone who just graduated."

It was for a senior data engineering role, where the odds of me implementing classic dynamic programming problems on the regular are slim to none.

They made me an offer anyway, but I just wonder what value they found in that. Oh, he knows about Big O? He's heard of memoisation?

They were big on FP, apparently. But not Scala, there was too much FP in that for them, hence the TypeScript.

Mind you my first ever rejection was for a Python role back in 2011 when Python was still very niche in my country, and while I had a portfolio of, imo, pretty decent Python code, they weren't interested because I didn't have a degree, and people without degrees write unstructured code.

Which is a very long way of saying, every interview process ultimately devolves into people hiring people like them.
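"Maximum subarray" is the textbook Kadane's algorithm: a one-line recurrence that is trivial if you have seen it recently and genuinely tricky to re-derive under a timer. A sketch in Python (rather than the TypeScript the interview used):

```python
def max_subarray(nums):
    """Kadane's algorithm: the best subarray ending at each index either
    extends the previous best or starts fresh at that element.
    O(n) time, O(1) space; assumes nums is non-empty."""
    best = cur = nums[0]
    for x in nums[1:]:
        cur = max(x, cur + x)   # extend the running subarray or restart
        best = max(best, cur)
    return best
```

For example, max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4]) returns 6 (the subarray [4, -1, 2, 1]).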

I'd hire you on the basis that you know that it's memoization and not memorization.

I was once asked "how do you sort a list of integers?" - the question was so ill defined that I thought it was a joke (in a database?, how long is the list?, in a flat file?, memory only?, on an embedded system?, a dataset in spark? ...).

Thinking, OK, this must be an ice-breaking joke, I responded, "I don't know, how _do you_ sort a list of integers?". In a condescending tone they responded, "are you even a programmer?". At the time, I had been in the game for more than 10 years.

I am really not sure how one could successfully build several companies and have shipped several products without knowing "how to sort a list of integers".
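For what it's worth, the ambiguity really is the point: the "right" way to sort a list of integers depends entirely on context. A small sketch (names mine) contrasting two equally legitimate answers, the general-purpose comparison sort versus counting sort when the values are non-negative with a small known maximum:

```python
def sort_ints(nums, upper_bound=None):
    """Two defensible answers to "how do you sort a list of integers?":
    - general case: the built-in O(n log n) comparison sort;
    - non-negative ints with a small known max: counting sort, O(n + k)."""
    if upper_bound is not None:
        counts = [0] * (upper_bound + 1)
        for n in nums:
            counts[n] += 1
        return [v for v, c in enumerate(counts) for _ in range(c)]
    return sorted(nums)
```

Neither is "the" answer; a database, a flat file, or an embedded system would each change it again, which is exactly why the question deserved a clarifying question back.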

Trying to find a good place to grow your career is difficult on many levels, but if you find leetcode questions silly, you might be too advanced for entry level jobs. ...and you probably don't want to work there.

In that way it also acts as a legal form of ageism.

Or, perhaps more accurately, a layer of plausible deniability to cover illegal age discrimination.

People straight out of college will spend a month doing 100+ Leetcode problems targeted at the companies they're aiming for. Leetcode tracks problems used at interviews, so there's a very good chance they'll see a problem identical or similar to one they've solved already, and it's just pattern recognition.

After the 3rd or 4th time doing this, practicing the same problems just to pass interviews isn't very appealing, especially if you've saved money and can do something more interesting.

You could try to "wing it" and derive it on the spot, but you'll be outcompeted by someone doing a lookup of a solution + alternate solutions from cache.

Nah, it’s just that LC problems take practice to get good at. Similar to weight lifting you wouldn’t expect to bench 250 lbs your first week, but after training and making progress you’ll get there (or close).

If you’ve never done LC (or any competitive programming) you’ll struggle. Practice more and you’ll recognize the dozen or so patterns.

Also LC is not “novel,” maybe at the time when the algorithm was first devised but not when you have 20 minutes to solve one.

The tests aren't testing for competency at the job, and after a decade of experience writing software you have long ago realized that party tricks and cute algorithms are a fairly rare part of the job (generalizing here of course), so you stop thinking about them as much and get out of practice. When they do show up, you certainly don't have to do them in 10 minutes, and I think everyone would rather you didn't anyway, so that you write a robust solution rather than a clever one.

Students are often better at leetcode because school has been drilling this shit into them for the past three years, but it will probably be the last time they see such a compelling algorithmic challenge until their next leetcode exam.

>write a robust solution rather than a clever one

But why must they be mutually exclusive? Are you implying all "robust" solutions, whatever that means, are dumb? Surely you put some thought into making it "robust"?

In this kind of usage, "clever" and "dumb" would better be called "tricky" and "obvious" respectively. People usually describe very tricky solutions as "clever", and much more obvious solutions, jokingly, as "dumb".

For example, storing some flag in the high bits of a pointer field of a struct is a "clever" solution, whereas having a separate bool field is a "dumb" solution. In most cases, the "dumb" solution is much more robust over time (less likely to cause bugs as the code changes and is modified by various people). Of course, the "clever" solution is necessary in some situations (very constrained environment, critical infrastructure such as an object header used for every type etc), but should often be avoided if possible.
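To make that concrete, here's the same trade-off transliterated into Python integers (the original example is about C-style pointers; all names here are made up for illustration):

```python
# "Clever": smuggle the flag into the low bit of an aligned, pointer-like
# integer. Saves a field, but every reader must know the trick, and it
# silently breaks if the alignment invariant ever changes.
def tag(addr, flag):
    assert addr % 2 == 0  # relies on addresses being at least 2-byte aligned
    return addr | int(flag)

def untag(tagged):
    return tagged & ~1, bool(tagged & 1)

# "Dumb": just store the flag separately. One more field, zero surprises.
class Node:
    def __init__(self, addr, flag):
        self.addr = addr
        self.flag = flag
```

The "dumb" version costs a few bytes per object; the "clever" one costs every future maintainer a moment of confusion.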

What's important is that, framed this way, more experienced people will often prefer the "dumber" solutions, as experience tends to show that long-term maintainability trumps many small efficiency wins. So using "clever" and "dumb" in this way is not at all intended to put down the engineer writing the more robust version.

When it comes to everyday software solutions you are usually aiming for obvious and clear, and often 'clever' is obtuse and opaque but definitely not always.

I think there might just be some vernacular nuance here though, maybe we can call it smart and robust, versus clever and opaque. Some problems are just difficult though, and if you get to work on that kind of problem regularly then that is pretty lucky.

Novel solutions aren't as helpful as you think. Pragmatic, simple, vanilla solutions are reliable.

You won't create your own linked list library, you'll use one from the standard library.

General runtime analysis can be helpful - but production, real-world benchmarks trump all theoretical performance values.

Code changes - how do I make a change to a production system in a million line code base that has good test coverage and when deployed, won't bring the entire system down. That's an exercise in the coding interviews that is completely ignored but most useful in the day-to-day professional setting.

> General runtime analysis can be helpful - but production, real-world benchmarks trump all theoretical performance values

This is what I always found funny. Most software development work these days is related to web apps. Optimizing that nested loop won't do anything if you have to wait 300ms for some shitty API to answer anyway. It literally doesn't matter; no one cares.

Related meme I saw on reddit some time ago where a senior developer says 'haha nested for loop go brrrrr'.


I've forgotten most of the algorithm stuff I learned at university that I haven't used in my job. I know how dynamic programming works and I can recognise such a problem, but don't ask me to implement it in an hour. I know how graphs work and what the various algorithms are, but why would I implement any of them when I can just import networkx?

More likely a lot of us don't feel we have to cram for problems suitable to screen entry-level candidates at most.

Personally I'm perfectly happy to be filtered out by such tests and refuse to practice for them, as companies that use them for senior level positions are companies I really don't want to work at.

> This person should already have enough of a reputation to get a job at many companies, if their work is public enough.

But then why do people with that kind of reputation still (at certain companies) have to jump through these hoops?


> What do you suggest for the 99%+ other candidates?

What about, instead of forcing a months-long decision process upon the candidates and the company, bringing them into the company after a short interview (maybe 2 hrs) and making sure they can afford housing, food and everything else they need? If you like their work, they stay employed. If, say after one month, you do not like what you see, you can easily let them go. Of course you tell them upfront what the deal is.

We could call it, I don't know, maybe trial or probationary period.

That's a big overhead for both the candidate and the company. Only an unemployed candidate could do that, and even then they'd have to stop interviewing at other places to dedicate the month. No thanks.

True. A probationary period / uncertain employment is:

1. Untenable for employees - if I have a mortgage or family or plans or obligations, let alone a current job, taking this kind of risk is unacceptable

2. Untenable for companies - that's way too much investment.

Companies do have probation periods formally but they are exceeeeedingly rarely invoked, for above reasons.

Might I introduce you to Europe?

Works here. My org recently ditched a bad hire with it.

I think we'd need to share more detail (and fwiw, I'm half from europe and half from Canada :).

Certainly companies have probation periods. And on paper, that reality and what's proposed in previous post are similar.

But I think there's a massive real world difference between "Default stay hired" and "Default not stay hired".

Probation, as it has currently been implemented in most companies I've worked in, exists, is formal, can and has been used, but is an exception. It's used when there's a massive, unanticipated, egregious problem in performance.

What is sometimes proposed in these threads is effectively replacing long/multiple interviews, with a probation period. While such probation period may look similar or same on paper, I think it's a completely different approach: "We're sure of you (though possibly wrong) so we're hiring you" vs "We're not sure of you so let's hire you and see!". I for one would have only touched the latter with a 100ft pole maybe once in my life. Certainly, I imagine anybody with current job and monthly obligations, would be quite wary in taking a "we don't know so let's try it!" approach to hiring. No, let's figure it out first please :)

I've only done the latter in the form of being brought on as a contractor at (high) contractor rates but with the understanding they'd prefer to have me join full time, at a time when I was already doing contracting and had other clients in parallel covering parts of my costs. In that situation I was not taking on any more risk than I had already chosen (and planned for) by contracting, so it was fine.

It's the only kind of context in which I'd ever consider the "we don't know so let's try it" approach.

> But then why do people with that kind of reputation still (at certain companies) have to jump through these hoops?

Your article points out that in this example: "I'm not allowed to check in code, no... I just haven't done it. I've so far found no need to.".

> If, say after one month, you do not like what you see, you can easily let them go. Of course you tell them upfront what the deal is.

> We could call it, I don't know, maybe trial or probationary period.

You make it sound like it's a better solution for candidates, but it's way worse for many of them and it has been explained by other commenters already.

How many companies actually do this? At which scale?

Some companies increased their difficulty to hire by having aggressive PIP objectives. Likewise, having a "real" probation period where you fire, say, 10%+ of employees is not gonna make you competitive when candidates compare their offers.

My expectation with such an arrangement is that if you do decide to hire me, since I am now a known quantity you won't have an excuse to pay anything other than top of the market rates. "You said you hire only the best right, and you've seen me work, you want to hire me, looks like the best make $X."

> But then why do people with that kind of reputation still (at certain companies) have to jump through these hoops?

They don't. Your link is not about the interview.

It would be fair if the company had to pay you 5-11 months of salary if they decide no after the evaluation period. That would leave ample time to find another job. Also, in many jurisdictions this kind of arrangement isn't legal, for good reason, as the company has way more power than the individual worker.

We could make sure that a person only has to do that trial once in their career and then every other company should accept that they've done it because they've proven they've done it. We could call it an Apprenticeship or an Engineer-In-Training stage. (Where have I heard those before?~)

>This person should already have enough of a reputation to get a job at many companies, if their work is public enough.

You can't just hire someone based on their reputation at a company of any maturity. That's a legal and HR nightmare. There has to be a process with a semblance of objectivity, and that process has to demonstrably apply to everyone equally, always.

Which in practice means you put out a fake job posting where the qualifications uncannily mirror this person's resume to a tee, and you hand them the job formally after a week.

Yes, but the forms must be observed. You can't just skip the interview, or give people you like a softball interview compared to other candidates.

I've been in situations where I was already somewhat working and onboarding while the faux job ad was up for the two-week (or however long) mandatory posting period. I tallied the hours separately and got paid back after I was hired.

I can't at all imagine why this would be the case. Why would a company have any kind of liability for hiring biases such as reputation (except for systematically refusing candidates from protected groups, of course)?

Hiring based on reputation is the same as hiring based on resume. And it's extremely common in almost any company. Why would it be a legal or HR nightmare?

> What do you suggest for the 99%+ other candidates?

To apply to the other 90% of tech companies. 10% of all tech companies out there are FAANG or FAANG-like; 90% are normal tech companies (they'll care about your education and CV, and the interviews are usually just a chat. No IQ tests)

I think it would be exceptionally hard to build an individual reputation in the tech community in general, props to those who have, but building a reputation in a specific industry and local community is much more achievable for everyone.

In smaller industry niches this can be true for companies more than people. At this point in my career, the fact that I worked at Company X is evidence enough that I can do the job Company Y wants me for, since it's a tight industry they essentially know of the work I was doing, even though it wasn't a groundbreaking novel technology of my own.

Wasn't there famously the story of the guy who wrote a package manager (brew?) that became massively popular and was widely used at Google, and yet Google rejected him for a job because he couldn't invert a binary tree or something?

Don't know all the details so I could be missing something crucial, but if not it'd seem that reputation isn't enough

Yup. Max Howell, creator of Homebrew. A bit more information here https://www.quora.com/Whats-the-logic-behind-Google-rejectin... (link to Quora, uggh, but that's where he wrote it). HN discussion of that: https://news.ycombinator.com/item?id=15981338. Howell's earlier tweet after getting turned down by Google: https://nitter.net/mxcl/status/608682016205344768?lang=en. "Google: 90% of our engineers use the software you wrote (Homebrew), but you can’t invert a binary tree on a whiteboard so fuck off."
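For anyone who has never seen it, the infamous question itself is only a few lines once you know that "inverting" just means mirroring the tree (a sketch; interviewers vary on whether they want recursion or an explicit stack):

```python
class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

# Mirror the tree: swap every node's children, recursively.
def invert(node):
    if node is not None:
        node.left, node.right = invert(node.right), invert(node.left)
    return node
```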

I don’t necessarily think leetcode should be the only litmus test for a candidate but Max is an outlier. Not every candidate getting rejected by a leetcode question is also capable of building out homebrew on their free time.

I had a great track record that was out in the open, actually displaying vast knowledge of algorithms and data structures and exactly the stuff that's being asked in those interviews. FAANG interviewers did not care one bit about it. They actually consider it bias to look at a person's prior work.

Plenty of great developers with a prominent public profile appear to be under-employed.

They 100% can get a job, and a decent one. I know because I fall into this category, BUT the FAANG salaries are 2 to 2.5 times what I make at this point, which pushes me down the study-leetcode path.

Guy who wrote apple homebrew got filtered and rejected by Google for not reversing a binary tree.

Apple homebrew is used by tens of millions of people daily.

Nit: homebrew is not from Apple and is not even limited to Apple platforms.

This reminds me of the creator of "brew", who got declined by Google because he couldn't write a reverse binary tree search in 1h.

Don't mind that half of Google is using his work for free.

That wasn’t actually true. He later said that he wasn’t asked to invert a binary tree.

Also, nobody at google is using homebrew for work.

Anyone using a Mac for tech related jobs has a high probability of using brew. That includes a lot of googlers

Google doesn't allow any code on laptops. Everything is done via web IDEs or ssh into a linux machine. So nobody is using brew to obtain dependencies.

You are factually wrong. Not only that, you're conflating several different processes and team rules.

Many googlers use brew to install applications on their laptops. This is not against policy. Other googlers work with code stored directly on their laptop. There may even be developers who are obtaining deps (for their own builds) from brew.

The problem with the brew author is that he had every opportunity to make himself look hirable at Google but instead chose to write an incorrect screed and publish it on the internet.

You need a full Santa exemption with business reason to use brew. The average person working on some server that deploys to borg does indeed use their Macbook as a thin client. Who's obtaining deps for their builds from brew? If you're building stuff on Mac, it's via bazel and all your deps are in source control.

I never said anybody is obtaining deps for builds- that's all UncleMeat.

At the time I used brew (5 years ago) it didn't require a Santa exception with business justification (and my justification would have been "I need this for my work"). Fortunately this wasn't really a problem for me anyway, as I don't even look at Mac machines as anything other than a thin client.

It is true that this was the problem with the brew author. Homebrew being part of the typical Google workflow is entirely independent of the situation.

But, as usual for internet discussions, it is fun to rathole on side conversations.

The best part is, I was defending the use of homebrew (which I absolutely hate) and local development (which is far inferior, IMHO, to blaze/forge/citc/piper). I had really hoped releasing abseil/bazel would help but sadly, it was done too little, too late.

How are those tools better than, for example, plain old mvn or gradle?

That seems unlikely, given that they’ve implemented an Xcode project generator for iOS apps built with their build system: https://tulsi.bazel.build/

No one’s running Xcode on the web or on a Linux machine.

I’m sure there are a few open source developers who use brew or Xcode directly, but most Mac builds happen in a distributed build system wired up to use remote macs. Yes, via Xcode. Not on local machines.

The mac build system is wild. It is still all done remotely with a farm of macs running xcode. You still don't build code for running on macs on your local machine.

You actually do; what Tulsi generates is an Xcode project that has shell-script build steps that call out to bazel. Bazel underneath the covers will end up calling the clang that comes with Xcode.

Do you work for Google? I'm just doubting you a little. Like the guy from project zero is finding bugs in Windows via ssh to a Linux machine? Definitely going with Doubt here.

There are a few outliers, but yeah. Per policy, no code is allowed on laptops. And because Google's build tooling is very centralized basically everybody works on the same kind of machine.

The folks developing Chrome for Windows or iOS apps might have different workflows, but even then they aren't going to be using brew because of Google's third party code policies.

They are correct. Stuff like project zero is an extreme outlier. I've been using a MBP for 5 years and have never used homebrew on it.

Cheers for clarifying :)

I'll admit that surprises me greatly, I can't see why it's considered more efficient, but hey, Google.

The first program you build in Noogler training takes more compute and IO to build than the Linux kernel. The distributed build system laughs at such a trivial program and barely breaks a sweat at programs 10x that size.

A poor little laptop would break down and cry.

Google has a giant monorepo. It is too big for git. (Virtually) everything is built from source. Building a binary that just runs InitGoogle() is going to crush a laptop.

I believe that there are also a bunch of IP reasons for this policy, but from a practical perspective doing everything with citc and blaze is really the only option.

Can you share the order of magnitude we're talking about here?

Probably efficient because their codebase would outstrip local disk space, and it also allows for comprehensive access controls.

...worked in another place which didn't allow local code. It was the most horrible dev experience I had. Cannot fathom why google went that way too.

Google lives and dies by its distributed build system. Most devs don't even exactly have code on their local workstation either; it comes via a remotely mounted file system.

But 99.9% of the builds happen remotely. So local vs remote code just isn’t that relevant.

It is also true that Google spends millions and millions on its dev environment every year, so this isn’t your average “no code on laptops” situation.

Is the code being in a VM mutually exclusive with having homebrew installed?

Why would I install homebrew on a work laptop if I cannot build or run code on my work laptop? Why would I install a dependency management system if company policy is that all third party code is checked into the repo as source and built using blaze?

Because you can also install applications via homebrew. I use it for a few things, including installing bat, delta and other Rust coreutil replacements that I prefer. I absolutely do not use homebrew for project dependencies.

When I worked for google, several of my projects were developed locally on laptops. My intern (who was developing tensorflow robotics computer vision stuff) used homebrew to install tools. Not everybody at Google used blaze.

what in the world does that have to do with whether he was qualified to work for google? the average developer at google is definitely a much much less proficient programmer than the creator of homebrew.

None of them own a Mac?

Google doesn't allow any code on laptops. Everything is done via web IDEs or ssh into a linux machine. So nobody is using brew to obtain dependencies.

You shouldn't use Brew to obtain dependencies. This is how you end up with people complaining about a brew upgrade replacing the version of Postgres their project depends on.

You probably shouldn't be using dpkg or rpm for that, either, unless your CI and deployment targets are running the exact same version of Linux that you are, and even then—there are usually cleaner and more cross-platform/distro ways to do it, especially if you need to easily be able to build or run older versions of your own software (say, for debugging, for git-bisecting, whatever). I continue to wonder how TF people have been using typical Linux package managers, that they end up footgunning themselves with brew. "Incorrectly", I suspect is the answer, more often than not.

Where it excels is installing the tools that you use, that aren't dependencies of projects, but things you use to do your work.

Get your hammer from Brew. Get your lumber from... uh, the proverbial lumber yard, I suppose. Docker, environment-isolated language-specific package managers, vendored-in libs, that kind of thing.

I don't install project deps with Brew (it's a bad idea, but, again, so is doing that with dpkg or rpm or whatever directly on your local OS, a lot of the time) but I do install: wget, emacs, vscode, any non-Safari browsers I want, various xvm-type programs (nvm, pyenv, that stuff), spectacle, macdown, Slack, irssi, and so on.

That's fine, but almost nobody is running tools on their MBP. So for this sort of thing you'd be using the package manager distributed with gLinux. And Google is also a really weird island where tons of tools are custom. You can't use some open source tool for git bisecting because Google doesn't use git. You can't use some open source tool for debugging because borg is a weird custom mess and attaching debuggers requires specialized support.

Google uses git. I used to sit next to Junio Hamano, the primary developer of git, and lots of teams that used my team's services were using git. Lots and lots of teams. There was even an extension to use git with google3, which was really nice, but was replaced with a system that used hg instead.

I was very imprecise. Git is used both for OSS stuff as well as some other stuff. But the norm is development in google3 and even if you've got a layer of git commands on top of that, the actual source and change management is being done by citc/piper.

True, although I predated citc and piper and we definitely built apps locally on machines with source code (from perforce). I was strongly advocating that more people switch to abseil and build their code like open source in the cloud (the vast majority of compiled code doesn't have interesting secrets that could be used to game ranking, or make money from ads).

OK, that makes sense, thanks for the explanation.

I don't do leetcodes because I am not really into programming puzzles (or parlor tricks :D ), but I roughly tried to do what you described. It took ~45 minutes, deciding in the middle not to worry about the algorithm. Guess I won't get the job ... shrugs.

You won't get the job not because you "didn't worry about the algorithm" but because you didn't ask any questions about the problem; you just went straight to the implementation. In FAANG interviews that would be a red flag.

This is a myth. It doesn't matter if you ask "questions" but implement an n^2 solution. Unless you implement the optimal solution, usually using a top-down DP or some array trick, you aren't moving forward in the interview process.
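For readers who haven't seen the jargon: "top-down DP" just means recursion plus memoization. A minimal sketch, using coin change as a stock example chosen here for illustration (not a problem from any particular interview):

```python
from functools import lru_cache

# Top-down DP: recurse on subproblems, cache the results.
# Fewest coins summing to `amount`, or -1 if impossible.
def min_coins(coins, amount):
    @lru_cache(maxsize=None)
    def best(rest):
        if rest == 0:
            return 0
        picks = [best(rest - c) for c in coins if c <= rest]
        picks = [p for p in picks if p != -1]
        return 1 + min(picks) if picks else -1
    return best(amount)
```

The "array trick" variants fill the same table bottom-up instead of recursing.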

In the FAANG interviews I've done you're never allowed to ask questions...? Maybe for clarification of the problem space, but not about the algorithm or the promise of a particular solution.

If I'm the interviewer and you don't ask questions I'm going to rate you very low on your communications skills. You may have the best algorithm in your head and write the most elegant code, but working at a company requires you to communicate your ideas and plans and code and everything else to your team. And no, communicating only at the end (code review time) is not enough. This is not a school assignment that you silently write and then turn in for a grade.

The best interviews I've experienced, both as an interviewer and an interviewee, are the ones that feel like two team members collaborating to narrow down requirements and solve a problem.

> about the algorithm or the promise of a particular solution

It's not about "the" algorithm or "a" solution. It's about you the candidate being able to propose multiple solutions, perhaps with space-time tradeoffs, to provide a recommendation based on your judgement, and to ask the interviewer what they think of your proposal.

I mean, that's great, and I feel the same way. Yet every time—since the introduction of leetcode questions, anyway—as the interviewee I've been asked not to ask questions about the algorithm or the solution, just clarification of the problem space. FWIW I have been employed at several of the FAANGs or whatever they're called now and have also been on the hiring side, where I certainly did not discourage asking questions of any sort.

Facebook interviews were very much interactive with my interviewer probing me on O(n) type questions and me refining it down to be more efficient. I was certainly allowed to answer questions, typically about scale. My code which was on a whiteboard certainly wouldn't compile and a good amount of the discussion was making sure that the interviewer could follow it and that he was satisfied with each of the steps.

My first round I passed with a less than optimally efficient solution, but he was satisfied every step of the way during my work.

While I was lukewarm to the prospect of working for Facebook, the interview process was very positive and reflected very well.

On a personal level, self interest would have me like the leetcode style problems because I can get most of them right on the first try during a timed interview, without studying. If I were pursuing a job at a FAANG, I might actually study them and I'm sure it would go well for the testing portion of the interview.

However, when I interview this is not what I'm looking for. I'm typically looking for someone who knows the particular language that I'm hiring for. My questions run from the very simple to as deep as they can go on either language or implementation details. From the most junior to the most senior, they get the same starting questions and I expect the senior people to go deeper and explain why they choose something over something else. I'm also testing their ability to explain it to me (not just get it right) as that is part of their job working with juniors.

I really don't even care if they have the names of things right, and don't really count it against them if they get the names of two things backwards. For example, in Go you'll use slices over arrays a huge percentage of the time. Some people get the names backwards, but can identify which one they actually use and know that it can change size; they're correct in usage while misnaming things. I inform them of the name, encourage them a bit, and move on.

I've never liked the "look at this code, what's wrong with it" approach. There are too many contexts that I have to jump into at the same time. There is often an expectation that I find a specific problem with it. I'm lacking the usual tools like an IDE or compiler. What level am I looking at in the code? Does it compile? Are there off by one errors? Cache invalidation? Spelling errors? Logic errors? Business errors?

This guy has missing tests on code that needs to be refactored in order to make those tests. Maybe he has it figured out just right, but the "jump into my code" interviews I've been in on all seemed like they had secret gotchas that the interviewer expected specific answers about.

In short, I haven't seen a proper, repeatable process for interviewing for software development.

Well, good to know if that were ever an opportunity for me. I'd probably have to invent questions, because it's hard to ask your way out of being dumbfounded on the spot.

The interview for my current job had something simpler: something like generating random permutations, then what's the algorithmic complexity of this random algorithm. (It was years ago, I forget the details of it.) I just talked through the solution. That was nicer than having to come up with questions. :)
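If the problem was what it sounds like, the textbook answer is Fisher-Yates, which is O(n): one uniform swap per element (this is a guess at the question, since the details above are forgotten):

```python
import random

# Fisher-Yates: walk from the end, swapping each slot with a
# uniformly chosen slot at or before it. O(n), unbiased.
def shuffled(xs):
    xs = list(xs)
    for i in range(len(xs) - 1, 0, -1):
        j = random.randint(0, i)
        xs[i], xs[j] = xs[j], xs[i]
    return xs
```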

So even if you deliver exactly what they were expecting they are upset that you didn't need to ask any questions to get there?

Yes, because getting to the right answer is not the point of the interview. Apart from anything else, getting to the right answer may mean you memorised it and are incapable of doing anything else. Always show your working and thought process. Asking questions and showing that you understand tradeoffs and that users have different requirements is a good way to do that.

That's what I've heard, yes. Crazy, eh?

> given an array of strings and a fixed width, can you format the text such that each line has exactly maxWidth characters and is fully justified

So many questions. What character encoding is the string? Which human language is it? Should I honor non-breaking spaces and other similar codepoints? Is the string full of 'simple' characters or 'complex' ones? Graphemes? Emojis? What are the min and max limits on width? How long (or short) can each string be, and how large could the array be? On what system am I running, and does the array fit in memory or is it paged off a disk? Does the font we're using support all of the graphemes present in the string?
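If you wave all of those questions away and assume plain ASCII, space-separated words, and no word longer than the width (which is roughly what the LeetCode version assumes), the greedy solution is something like:

```python
# Greedy full justification: pack as many words per line as fit,
# then distribute the leftover spaces across the gaps.
def full_justify(words, max_width):
    lines, line, length = [], [], 0
    for w in words:
        # +len(line) accounts for one space between each pair of words
        if length + len(w) + len(line) > max_width:
            lines.append(justify(line, length, max_width))
            line, length = [], 0
        line.append(w)
        length += len(w)
    # last line: left-justified, padded on the right
    lines.append(" ".join(line).ljust(max_width))
    return lines

def justify(line, length, max_width):
    if len(line) == 1:
        return line[0].ljust(max_width)
    spaces = max_width - length
    gaps = len(line) - 1
    q, r = divmod(spaces, gaps)
    # earlier gaps get the extra space when it doesn't divide evenly
    return "".join(
        w + " " * (q + (i < r)) for i, w in enumerate(line[:-1])
    ) + line[-1]
```

Getting this right under time pressure is exactly the kind of fiddly off-by-one exercise the thread is complaining about.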

Amusingly, I had a variation of this problem as part of an Amazon L8 IC role interview, framed as a Prefix Tree. Solved the problem, didn't get the role. :(

Sadly enough, I'm literally scoping out a new feature at the startup I work at that involves rationally splitting text into lines with a max length, and buffers with a max line count, so we can interface with a legacy system from the late '80s/early '90s.

No joke: Isn't this the famous Donald Knuth text layout algorithm? I heard it is still used today -- it is that good.

I've done that in interviews: "oh hey, the most efficient known algorithm is this classic named X". That tends not to win interviews either, but in the real world knowing the name of the best algorithm (or even just that a best algorithm exists and a general idea of what you'd google to find it) is more useful than knowing the details of it. If I need to reimplement a well known algorithm I can often reimplement it from the Wikipedia description, that's trivial and boring make-work (and will always be trivial and boring make-work). But I need to know which well known algorithm sometimes and that's a far more useful practical skill.

Asking contrived algorithmic puzzles might prove nothing, but the examples in TFA are actually pretty basic programming:

“I have an array with positive numbers, find the n^th largest”

yeah, but in the real world when this kind of problem needs to be solved, people will most probably sort (O(N log N)), use a priority queue (O(N log K)), or even go with something like O(N*K); almost no one will go with the O(N) algo. And because usually N and K are rather small and the code isn't called too often, time complexity can be ignored. Still, in the interview any solution slower than O(N) will be called inefficient. And in the real world they will know N and K and what kind of problem they're solving; it won't be hidden in a mist of abstraction with the assumption that "the candidate should ask".
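A sketch of those pragmatic versions in Python, using the standard library (sort, then a bounded heap via heapq):

```python
import heapq

nums = [5, 1, 9, 3, 7, 9, 2]
k = 3

# O(N log N): sort everything, then index in.
by_sort = sorted(nums, reverse=True)[k - 1]

# O(N log K): heapq.nlargest keeps only a K-sized heap internally.
by_heap = heapq.nlargest(k, nums)[-1]
```

Either one is a couple of lines, obviously correct, and plenty fast for small N and K.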

I mean, in the real world you'd probably use a library method. If I were an interviewer I would not expect the candidate to know about median-of-medians (O(n) worst case). I wouldn't even expect them to know a priori about quickselect (O(n) average). But I don't think it's unreasonable that, given a few hints, a candidate could understand and implement quickselect in 30 mins. Most people already know about quicksort, and quickselect is not very different. You can even give them the partition and select_pivot functions at the start and then, if there's time, have them fill those in. In the rare situation they haven't even heard of quicksort, you can write the shell of the algorithm for them and have them adapt it to quickselect.
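A quickselect sketch along those lines; this is the partition-into-lists version, which is easier to derive in 30 minutes than the in-place one (average O(n), randomized pivot):

```python
import random

# k-th largest, 1-based: partition around a random pivot and recurse
# into whichever partition must contain the answer.
def kth_largest(nums, k):
    pivot = random.choice(nums)
    above = [x for x in nums if x > pivot]
    if k <= len(above):
        return kth_largest(above, k)
    equal = sum(1 for x in nums if x == pivot)
    if k <= len(above) + equal:
        return pivot
    below = [x for x in nums if x < pivot]
    return kth_largest(below, k - len(above) - equal)
```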

Even then, all thats probably a bonus - a priority queue implementation, or many other possible solutions are probably good enough for me.

Moreover, with the micro-optimized SIMD quicksort algos that perennially crop up on this website, I would be willing to bet that "sort and take first N" is objectively faster than my crappy Python implementation, even if mine is linear time.

Not only is it basic, it's essentially std::nth_element from the C++ standard library (which even gained parallel overloads in C++17).


FOMO is an atrocious hiring practice. The chances in your scenario (that you somehow find a person who can create something novel, that they stick around to actually produce the novel thing, and that this novel thing then "makes it") are nonexistent.
