Now that I am 51, I feel annoyed that all of these stories of interviews involve asking questions about algorithms that rarely come up in real coding, and if they do you should NOT be rolling your own code, you should use the established, battle-tested solution that is out there on the internet if you spend 60 seconds looking for it. Much more important is to have the experience of how code complexity accumulates, and how to mitigate that. I cannot spend hours and hours studying up on these algorithms, there are much more important things (real coding-related things) which I need to learn about, to the extent I have time to do that. What should I choose, pytorch or keras? React or Vue? Go? Kubernetes? Spark? All much more important questions than how to do a breadth-first search.
So, was I wrong then? Or am I wrong now? Perhaps both.
My frustration isn't exactly like yours (my time is probably a lot less valuable). I feel that the questions are all geared at puzzle solvers.
If you're a puzzle solver, you love answers. You love digging into the details. You love finding out the basic components of a system. I think these are the people who excel in academics.
Almost every other intern I met once I got to the bay area was a puzzle solver. They were also competitive and had amazing grades. And these people LOVED algo questions. Over lunch, they could go through half a dozen questions.
I slipped through the cracks, I guess. I'm bored by puzzles, have awful grades (and in a worse program), and am not so competitive. What I do enjoy, however, is building up. I like modeling and being able to think at greater and greater levels of abstraction to solve problems that aren't puzzles, but instead open-ended questions.
Anyway, I think both of these types of people have a place in software, and only one is given attention.
In their eyes, they have given you a problem involving a tree from the graph theory and you are insisting on an "out of the box" solution involving woodpeckers.
Let me elaborate.
Usually a hypothetical situation is just a way to talk about an abstract problem.
It's easier to visualize and to talk about an egg falling and breaking (or not) than abstract measurements.
Now what you call 'thinking outside the box' is just attacking the irrelevant details of an imaginary situation which is there purely for convenience and can be replaced with another hypothetical situation corresponding to the same abstract problem.
It's actually supposed to be mutual judgement. But I agree: be respectful if you decide to drop a remark about the relevance of the question. There are enough reasons to treat people with respect even if you can afford to turn them down.
Agreed, I was being sarcastic.
The people in Creative Lab were impressed with the wide range of knowledge I had; the SRE guys who rejected me got stuck into me for mixing up some VMware terminology.
In my opinion, you will do well if you focus on building your own business sooner or later.
It goes something like: "Hey, new Engineer, I need you to work on experimental feature X". - Then the engineer builds system Y to provide feature X, probably poorly architected and not scalable, not expecting it to go anywhere. Some months later, app containing feature X gets distributed to thousands of users, or goes viral and hits millions of users.
Congratulations, like it or not, you have now become the lead architect of the now-widely-deployed "experimental" system Y, and you are going to either crash and burn or become a damn good software architect by maintaining it out of sheer necessity. Later, you find yourself putting "Lead Software Designer" on your resume, having earned that role and title.
A lot of successful software was originally designed "by mistake" and grows far, far greater than its original purpose. And in contrast, you get the well-designed software that was properly architected, but never goes anywhere because it never saw the light of a successful deployment at scale.
Or are you saying that you start off at O-1 and not a higher paygrade? If that's the case, the OP was pointing out that you start off outranking enlisted service members, not that you start out above O-1.
Of course in really large companies I suppose having these as a separate role is a necessary evil. But I still think there should be much more collaboration and interaction between people in these different roles than I generally see.
I have found that the data structures that you need are often simple trees and hashes. Database design is probably more important. I often shake my head at students doing hours of dynamic programming to crack interviews. In the past year, how many problems in your company have you solved through dynamic programming? I suspect, not many.
1) Gathering requirements.
2) Turning those requirements into data structures.
I think a lot of companies will separate those into different roles, but I strongly believe that's a costly mistake in most situations. There's a feedback loop. Requirements inform your data structures, but your data structures help you contextualize (not pollute) your requirements. If you're working with an existing system, you also want to understand current structures in order to guide requirements. By guide, I don't mean to ignore pain, but if you're building a system to support multiple clients / users, then the real trick is to figure out how to do that in a cost-effective and elegant way.
When these roles are separated, and the requirements are gathered without knowledge of the structures, I believe you're setting yourself up for failure in the long term (spaghetti code / everyone gets something different).
Getting these to gel is the puzzle I'm personally passionate about. I don't mind what I would call the math side of programming / code puzzles, but it simply doesn't give me the endorphin rush that building a system that elegantly ties things together does. The best is when you're working towards that or have achieved that and inspiration for further possibilities, improvements, efficiencies start popping up.
I'm not sure how you test for this as part of the hiring process, at least not efficiently. I do think we get better at this with experience, but I doubt it's exclusive to seasoned developers.
It's not the same thing as some piddly twenty-minute problem -- it's harder, requires some experience, and calls on the same kind of aptitude. Consider it a rule that people who can't handle data structures problems are going to create systems architecture problems.
In my experience, algorithm interviews aren't even very good at measuring aptitude for algorithms. Recent study and aptitude for the format have too much influence. Asking candidates about architecture has been much better at predicting their aptitude for architecture.
If you can rigorously ask architecture questions, and don't let people get away with hand-waving, that sounds fine. I've been asked good architecture questions, and I've been asked bad architecture questions badly, too, so... it depends on the question, and how you ask it.
As other commenters have noted, choosing an architecture for a system may have more to do with vision, common sense, and even imagination. And the web is such a great example of this, maybe the classic example. As you probably know, it's possible to write a webserver without having a clue about fancy data structures or algorithms. It's the imaginative composition of pre-existing features that creates the architecture.
Another example would be the extensibility of Emacs through elisp. Consider the notion of hooks which essentially allow utility functions to be customized by the user. It's a purely architectural concept.
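Outside of elisp, the same idea is easy to sketch. Here's a hypothetical hook registry in Python, in the spirit of Emacs's `add-hook`/`run-hooks` (all names here are mine, not from any real library):

```python
# A minimal hook mechanism: the application decides WHERE hooks fire,
# users decide WHAT happens there, without touching application code.

hooks = {}

def add_hook(name, fn):
    """Register fn to run whenever the named hook fires."""
    hooks.setdefault(name, []).append(fn)

def run_hooks(name, *args):
    """Fire every function registered under name, in order."""
    for fn in hooks.get(name, []):
        fn(*args)

# Application author places the extension points...
def save_buffer(text):
    run_hooks("before-save", text)
    # ... actually write the file out here ...
    run_hooks("after-save", text)

# ...and users customize behavior purely by registering callbacks.
log = []
add_hook("before-save", lambda text: log.append(f"saving {len(text)} chars"))
save_buffer("hello")
print(log)  # → ['saving 5 chars']
```

Note there's nothing algorithmic about this: the value is entirely in the architectural decision of where the extension points live.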
Or the Unix philosophy: small utilities that read and write text, linked by pipes. Really no algorithmic complexity to it at all.
It feels problematic to me that you are saying to someone that they "need to stay far, far away" from some aspect of computing. It's a kind of gatekeeping. Sure, an outfit like NASA needs to make sure that they don't have amateurs writing their systems software. But in the industry more broadly, people have to be given permission to pursue whatever interests them. It's important, for diversity and the invention of new applications, that "differently able" coders aren't told, by intimidating stereotypical computer science types, that they don't have the aptitude to architect a useful system, or that it's "a rule" that people with their skills will just create problems.
Vision and a perspective that differs from the norm are a lot rarer and potentially more valuable than the ability to apply software engineering principles. Architecture, in the broadest sense, is precisely the area where these "outside the box" contributions are likely to be valuable. Consider the invention of the Wiki. It came out of the pattern language world, it was a hugely beneficial innovation, and it has almost zero data structures/algorithms complexity behind it.
I'd even argue that the examples of pitfalls that you list are more clerical matters of systems programming than cases where architectural design principles are lacking.
(And even the professionals who can reliably grind out working code for large systems can get the architecture spectacularly wrong, as the case of the X window system: http://web.archive.org/web/20170920104011/https://minnie.tuh... )
I really needed to hear this thanks.
Web servers and clients, for example, are a case where I've seen bad architecting cause months of damage. An Emacs function hook or advice system, likewise, requires care in terms of when the hook is called, and what the rules for their scoping are. It's a bad software architect that thinks they can just slap callbacks on something.
I'm no expert, but I know a logistics company that has the 30-workers, 1-puzzle-guy setup. He maybe writes 20 lines of code a week, but it's on stuff other people can't figure out. However, if he were the only type there, literally nothing would ever get done.
I've been out of college more than 10 years now, do a mix of hardware and software (so am not coding all day everyday) and I can hack together algorithms like BFS if I spend a couple of minutes thinking about it.
I get that there are many awful interviewers out there and that hiring is pretty broken, but BFS seems like a pretty reasonable interview question, so long as the interviewer talks through what it actually needs to do and isn't only looking for a memorized answer. The question is well contained, doesn't require the candidate to know any specific API or framework, and can be implemented in pretty much any language. Probably the biggest downside to it is it's such a common question that many candidates study it ahead of time which makes it a worse filter.
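For reference, the whole exercise really is well contained; a sketch of BFS over an adjacency-list graph (the dict-of-lists graph shape is my assumption, since interviewers vary):

```python
from collections import deque

def bfs(graph, start):
    """Return vertices reachable from start, in breadth-first order.

    graph: dict mapping each vertex to an iterable of neighbors.
    The FIFO queue is the entire trick: it guarantees each distance
    layer is fully visited before the next one begins.
    """
    seen = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return order

g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(bfs(g, "a"))  # → ['a', 'b', 'c', 'd'] — layer by layer
```

Swap the deque for a stack and the same dozen lines become DFS, which is part of why the question is a reasonable conversation starter.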
Rather than memorizing a thousand algorithms, I generally focus on being able to solve problems simply, quickly, and elegantly, which will help in actually doing most jobs.
I've heard people actually defend these useless interviews on correlation grounds, like people who are docile, follow the herd, and "prepare" are good hard-working workers who also tend to have done well in school or look polished in other aspects of life. That's what these interviews are really looking for. So, if you're not a well known expert, suck it up, be a servile cog and follow the script.
The trick seems to be knowing when the game is worth playing. There are certainly situations where competition is too fierce and requires too much preparation, so it's not worth doing if you don't really enjoy the game.
I'm not sure getting hired at a large tech firm is quite that competitive, though? They do hire lots of people all the time.
Of course, I could probably come up with a way to do it by myself. I could probably even come up with an efficient way if I spent an afternoon/day on it.
But why do that when I can google it, and get 3 blog posts and 2 stack overflow answers detailing the different options and the trade offs between them, most likely even with an implementation I can base mine off of?
That's why these interview situations are stupid. They're like school where copying is cheating, whereas in real-world situations copying is a great way of doing something.
A lot of resources out there are wrong, inaccurate, or not reasonable for your particular context, and it requires a reasonable amount of algorithmic intelligence to be able to sniff out what's appropriate.
If I had a dollar for every high-upvoted SO post that misstates a problem or doesn't offer proper caveats... but I can make that judgment because I've already thought about a related problem in the past.
It's like asking why one should learn how to write properly when Grammarly and spell checkers exist, or why mental arithmetic is useful when we have calculators—at some point those tools will be inappropriate or unavailable. Search tools are force multipliers, not replacements for personal knowledge and intuition.
Right, and having that algorithmic intelligence is key part of me being a good developer. But testing how someone implements an algorithm in time-scarce circumstances is not a good test of that kind of intelligence because it is likely to favour people who have been exposed to that particular algorithm before (even if they only rote-learnt information about it - as many interviewees seem to do in practice) over people who have the capability to think deeply and reason correctly about it, but have not previously been exposed to that problem.
BFS is not something I've thought about much in the last 5 years, and quite frankly I couldn't care less about it. I got stuck when I knew I needed 'something' to finish implementing BFS, but couldn't remember what, and the Google interviewer offered no help. What I couldn't remember was the queue.
THIS is the problem. The (unreasonably) timed nature of the exercise means that the only people who will do well are the people who prep heavily for this specific skill, in much the same way that people about to take the LSAT are likely to do better on it than actual fucking lawyers with proven experience.
Because which law school they went to is on their resumé and the average LSAT scores for each school are trivially available, tracking very, very closely with school prestige.
A lot of great data scientists don't even know how to use Python, let alone implement a tree and a tree traversal. There is so much more interesting and pertinent stuff I spend my cycles on learning / keeping in my head than undergrad CS BS.
But, if you explain what it is in words, and have them whiteboard how to turn those words into pseudocode or some such, then it could be appropriate, sure.
Who could really solve these problems, error free, on the spot with no prior exposure to underlying concepts?!?
That doesn't mean all "implement X algorithm questions" are categorically bad.
But put them in a room and say "Make us a 2 layer red velvet cake" with no access to a recipe, and you're not going to get a red velvet cake. It's an easy recipe that most people easily recognize with just a glance, and anyone could make it if they have a recipe on-hand or make red velvet cakes with abnormally high regularity. But most people, even if they're great chefs otherwise, will fail.
Recognizing that a concept exists and implementing a relatively complex concept are different things.
To know how to make a novel cake (not a recipe you've memorized), you need to have a good understanding of how the different ingredients interact. You probably even have a good sense of why they're there.
Those seem like exactly the qualities you'd want to look for in a cake maker. Sure, they'll probably be going from recipes in their day to day job. But they'll be able to tweak them intelligently too.
But most restaurants are looking for someone who knows how to cook and use the tools in the kitchen. A chef who doesn't specialize in cakes will still probably be fine if you give them 5 minutes to look up a recipe and make one. In the same way, the overwhelming majority of tech job openings are in need of programmers with some degree of specialization and the ability to know how to look up what they don't know and the experience/knowledge to know how to put that info to use.
Companies are testing employees on how to make cakes when their job will be all about making burgers to order.
I'm dubious of this claim. Google/Alphabet has a profit per employee (PPE) of ~$150,000. Activision Blizzard has a PPE of ~$33,000, EA ~$95,000, and Ubisoft ~$15,000. I think the only major game studios that could meet your claim are Epic or maybe Valve. Unfortunately, those numbers aren't public. So I'm going to assume you work on Steam, since Epic's new platform is likely too recent to have lots of users, and I also don't think it was competently designed.
(the other thing is that this metric isn't particularly meaningful. Google invests a huge amount in unprofitable things with the goal of profits 5 or 10 years down the line. Google devotes thousands of engineers and billions of dollars to things it knows aren't profitable. That doesn't make Google more or less competent, it just implies different priorities than one that's entirely driven by immediate returns).
>Based on my experience (currently responsible for a huge chunk of a major gaming platform, with hundreds of millions of users), inflexibility, emotional immaturity, inability to think outside of the box, inability to listen and admit failures, lack of empathy, and not understanding the basics of sales (to sell your own ideas) are major things that contribute to bad performance reviews.
All of these are important, but they only become important once you have the baseline level of competence. If you have two equally competent people, then yes indeed the softer skills like flexibility, creativity, humility, and communication skills start to matter, and eventually they matter a lot. But those are only good things if you assume a baseline level of competence.
Consider two options. One is that you screen solely for competence. You get only competent engineers along a wide range of interpersonal/soft skills. Some can grow into leadership roles, some can't (or at least it takes longer). But even the most convincing, best "salesperson" has great engineering talent. Then the alternative: that great salesperson is a bad engineer. The ideas they promote aren't always good ones, sometimes they're bad. That's not just "not good", its actively harmful.
>Our strongest engineers will not pass google interview without preparation (me included) and it is telling.
What does it tell? You seem to be suggesting that Google is optimizing badly, and that's possible. But that is by no means the only conclusion. Here's an alternative: "The most profitable things are not always the most technically challenging". Every once in a while you'll see people on HN commenting about how they wired up a shopify store that provides some significant passive income. That's not challenging. It would take an average engineer a week to do, if that. It's still profitable. But that doesn't make you qualified to work at Google (or Valve).
But, as with a concept like 0, making the cognitive leap to understanding that there should be a way to represent the lack of something is counterintuitive (indeed, consider Tony Hoare's billion dollar mistake. The "right" way to represent the absence of an object is a hard problem). However, once you've been exposed to it, it's not particularly challenging to rederive the necessary pieces.
So yes, complaining about BFS is, imo, akin to complaining about being asked what the result of the arithmetic operation `35 - 12 - 23` is.
I'm not asked to implement BFS every day, but the concept of a tree traversal is incredibly common.
Even more explicitly, a lot of my recent work has involved automated refactoring tooling, so there's lots of work with abstract syntax trees and control flow graphs, which are trees or graphs, and therefore need to be traversed.
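Python's own `ast` module makes the point concrete: working with a syntax tree is just a tree traversal. A toy example of my own (not from any particular refactoring tool) that collects every function definition in a source file:

```python
import ast

# Any compiler-ish tool ends up doing walks like this: parse source
# into a tree, then visit every node looking for what you care about.
source = """
def f(x):
    return x + 1

def g(y):
    return f(y) * 2
"""

tree = ast.parse(source)

# ast.walk is itself a breadth-first traversal over the AST,
# yielding every descendant node.
func_names = [node.name for node in ast.walk(tree)
              if isinstance(node, ast.FunctionDef)]
print(func_names)  # → ['f', 'g']
```

A real refactoring pass would use `ast.NodeTransformer` to rewrite nodes rather than just collect them, but the traversal underneath is the same.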
Yes, I can probably implement a sufficiently fast maze-solving algorithm, depending on the exact environmental constraints (for example, given the problem as stated, I'd assume something like A* using a precomputed Manhattan-distance heuristic would be fast for most mazes of reasonable size; I promise I didn't google that).
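A sketch of what I take that to mean — A* on a grid maze with a Manhattan-distance heuristic (my reconstruction, not the commenter's code; grid layout and move rules are assumptions):

```python
import heapq

def astar(grid, start, goal):
    """Shortest path length on a grid of 0 (open) / 1 (wall) cells,
    or None if the goal is unreachable. Manhattan distance is an
    admissible heuristic because moves are 4-directional with cost 1."""
    def h(cell):
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    frontier = [(h(start), 0, start)]  # (f = g + h, g, cell)
    best = {start: 0}                  # cheapest known cost to each cell
    while frontier:
        f, g, (r, c) = heapq.heappop(frontier)
        if (r, c) == goal:
            return g
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ng
                    heapq.heappush(frontier, (ng + h((nr, nc)), ng, (nr, nc)))
    return None

maze = [[0, 0, 1],
        [1, 0, 1],
        [1, 0, 0]]
print(astar(maze, (0, 0), (2, 2)))  # → 4 moves
```

With the heuristic set to zero this degenerates into plain Dijkstra/BFS, which is the point: the concepts are all next door to each other.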
But I wouldn't expect someone to answer that in an interview. Hell, I don't ask dynamic programming questions because I think they're too obscure and tricky (in the riddle way). My most common interview question involves a problem I've faced multiple times at work, has multiple valid solutions, and which I had to implement myself in various flavors before eventually coming across a relatively obscure standard library implementation.
Although before I was able to identify that implementation, I had to solve the problem 2 or 3 times, see the different flavors, and identify the common themes and the theoretical underpinnings.
I don't expect a candidate to do that without help, and most don't. But it sure sounds like you think that I'm somehow a bad engineer for asking a question that tests on the job skills I've needed, and the ability to break a problem down into component parts and implement them cleanly.
But oh no, it uses an algorithm. It's a terrible question, and I should just ask a question about software engineering that doesn't use any algorithms. Because obviously people will always be able to identify the standard library implementation (no candidate I've interviewed ever has, in fact). So color me dubious of this whole argument about whiteboarding questions being a bad test.
Consider that BFS was first published in 1945, while the Turing machine was published almost 10 years prior. Is BFS an inherently more complex concept than a Turing machine? That seems like a strange argument to make, but I'm certainly interested.
> was I wrong then? Or am I wrong now? Perhaps both.
Neither, I think. Algorithm interviews are bad for 90% of the positions out there, and experience matters much more than trivia, so you were right then.
And today, everyone is trying to hire "senior" (and above), so there's no concept of "will learn, lacks experience" - and the algorithm questions do badly at identifying THAT, so you're right today too.
It turns out that interviewing is hard, and we don't know what works well. Most of us think "we" are the ones that can obviously see what's wrong and everyone else is interviewing poorly. Most of us are wrong.
But even besides that, we need to focus more on getting newer coders that DON'T have experience. We'll spend years looking for people that have experience...longer than it would take to grow that experience. We want seniors because we want people that will grow without being mentored, because we have far more work than people...but we skip one of the ways to reduce that workload.
I'd much rather have an interview that taught something new, and see how the person processed it. That's not a good interview either, but at least it is trying to accomplish the correct goal.
And when we do interview for senior and above, experience will count a lot more than building a frickin' linked list.
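(For the record, the frickin' linked list in question is about this much code — a throwaway sketch, nobody's production data structure:)

```python
class Node:
    """One cell of a singly linked list."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def from_list(values):
    """Build a singly linked list from a Python list; return the head."""
    head = None
    for value in reversed(values):   # prepend each element in turn
        head = Node(value, head)
    return head

def to_list(head):
    """Walk the linked list back into a Python list."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

print(to_list(from_list([1, 2, 3])))  # → [1, 2, 3]
```

Which is exactly why it tells you so little about a senior candidate: it measures recall of a freshman exercise, not judgment.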
Cowen’s Second Law: There is a literature on everything. Hiring is no different. Why do so many of us that say “of course you shouldn’t reimplement timsort, stand on the shoulder of giants” either make up an interview process based on gut feeling or cargo cult one based on what we heard google does?
So no matter if you've read the literature and implemented a process, there is something wrong with it.
That's why these threads persist. Every process has something wrong with it depending on who you are. Funnily enough, everyone eventually manages to sort themselves into the right category.
The system isn't broken any more than armed conflict is broken as a way of settling disputes. It works, it's just a little terrifying and awful.
Personally, I'd rather not work somewhere that hired based on drilling fringe information - what will the job be like? Maybe I'll feel differently in 10 or 20 years.
I would hope so, because I do all those things and, after over a decade writing software, find the ability to solve puzzles almost completely useless for most coding I've ever done. But reality doesn't need to make sense. So, if you want to work for Google or Facebook, learn puzzles anyway :)
That may not be necessarily a bad thing, depending on the position of course. But projects are not the be all end all, at least not in my company, and where I interviewed.
1. The first interview must not be a technical assessment — identify shared interests, look for collaboration opportunities. You need to understand if the person in front of you will be someone you would work with for the next 10 years, their current technical skills have little to do with this.
2. At the end of the first interview, right before the logistics questions, assess where they sit on the junior-senior axis — either with direct questions or by pushing them toward the topic. Once you've done that, ask an evil, surgically difficult question — the correctness of the answer is not what you're looking for: pay attention to how they present their (eventual) missing knowledge, or how they address the answer by giving context. A senior developer doesn't know everything, but must know enough to manage gaps in their knowledge.
3. Search for young and talented people — search for the fire in their eyes. You want someone who, eventually, will reach goals on their own.
That said, I do like the interviewing approach we're replying to, other than that "y" word... :-)
Did you know that it is perfectly legal to discriminate against young people?
Yeah sucks to be them, I guess. Apparently nobody cares about discrimination against those people.
Old people are among the richest people in the world. But I guess when things are made fair, and neither young or old people are discriminated against, that appears like "discrimination" against those who are now being treated fairly.
Even though I'm sure there is a lot of variance between interviewers, I can't help but think that the abusive style of whiteboard grilling must "work" in that it results in able workforce, and that might be because of the reason I suggested.
I know this is off-topic, but you can't go wrong with Keras. "Deep Learning with Python" by Francois Chollet (the creator of Keras) is a Kernighan-level book. A book like this comes around once a decade (or two). If you decide to go for it though, make sure you don't follow its instructions about the amazon cloud deep learning VM's; you'll end up paying $10 per day, even if you stop the VM.
What would you recommend as the alternative for someone without nvidia gpu? Google colab?
Some analogies make sense, some probably don't map well.
Google prefers to hire Devs with strong CS foundations because they assume these Devs can contribute in many areas.
If they were only to hire someone based off something that they've done before and excel only in that area, they'd be like other companies who only hire selectively based on specific skill-set (e.g.: Java Dev, or 3D/game devs).
Me, I tell them in the first phone screen, "I'm a Systems/DevOps guy, I don't do algorithms and whatnot so I'm just letting you know in advance in case you ask me to whiteboard something like that." So far I was only spurned once out of half a dozen screens employing this statement.
If you want to screen liars, there are a million ways to do it. Truth is, folks, most interviewers don't read resumes in depth, and they're not clever enough to interview for honesty.
But I might try to do some algorithm type interview, but wouldn’t feel bad if I bombed it. In fact, the only algorithm type interview I have done in the last 10 years, they made me an offer but I declined it for another job that asked more architectural questions.
The way I see it, if I could write 65C02 and x86 assembly language programs in the 80s, I think I can handle the needs of your software as a service CRUD app.
Usually, I'm just looking for one thing the candidate does well. There's really not much room for lying and if they claim to have some crucial skill and then I discover after the fact that they don't, that's grounds for dismissal.
Everybody here seems to assume they are trying to filter for the good candidates...but what they are actually trying to do with these kind of interviews is filter out the bad ones!
Because the damage when hiring a bad candidate is much higher than the damage from missing out on a good one!
I get that this is the conventional wisdom in a subset of this industry, but in my experience this "truth" does not have as much objective support as the people who keep repeating it think it does.
In companies where opportunity cost and hidden losses are nobody's responsibility, this sort of mediocrity is well accepted.
The answers to both sets of questions are very situational, yet some people treat them as if they have universal answers.
Less than the cost of being perpetually afraid of being fired.
If an engineer on my team got fired, no matter how much they deserved it, I'd probably start looking for a new job or team immediately.
The fear of being let go isn't something I'd ever want to deal with.
Of course it sucks for the individual...but such is life
The top companies with these leetcode tests probably don't care that good people are being rejected, or are avoiding them, because of the amount of preparation required. Middle-sized companies and startups doing leetcode tests are missing good people, and probably can't afford the same number of false negatives as someone like Google with an endless supply of candidates.
I would much rather spend a weekend doing a take home test and showing how I actually work than being asked to write irrelevant code on a whiteboard. I think leetcode style interviewing is driven by time constraints of the interviewer and some more effort and thought needs to be put in on how to choose people based on the specifics of the actual role they are hiring for.
I think the counterargument might be that at our age, you should have a network of people who will hire you just based on your stellar performance on previous jobs, if you're that experienced and sophisticated.
If you somehow don't, well, maybe you need to brush up on your tree balancing skills...
This is exceedingly rare... so you wrote a HashMap implementation in Java?? None of the custom maps discussed [here](http://java-performance.info/hashmap-overview-jdk-fastutil-g...) were enough? It must be a really rare problem you're working on if there's no library that provides an acceptable solution for it in Java.
> I would rather work with someone who knows how neural networks really work rather than who knows pytorch or keras
I would rather work with someone who knows general programming (which is not the same as algorithms - there's so much more to it!) and software design... but if working on a product that uses a framework heavily, good knowledge of that framework can be the difference between getting things done in minutes VS weeks (of painful and time-consuming battling with the framework to do thing the way you think they should be done, rather than how it's done within the framework).
More mundanely, when I was starting out at a small shop (I'm around your age) my coworkers hadn't mastered undergrad algorithms, and it made a significant difference in what we could do.
It's unfortunate that attempting to test for this seems to have encouraged too much memorizing of known tricks relative to better mastery of principles.
But how do you approach them? What prompted my comment was all the times people comment online about "memorizing algorithms" to prepare for interviews -- learning about algorithms will include a lot of them sticking in memory, yes, but approaching it as cramming a textbook into memory the way students cram for a final exam seems... disappointing.
That's so right. But that is experience, so I think when you know most of the candidates don't have it, you end up looking for other stuff to test, and then your interview process gets infected, and eventually, when you are interviewing for positions that should require experience, you get a lot of the same bullshit questions.
There are plenty of stories here of novice programmers doing some passion project, and 6-12 months after learning to program they are busting out some fairly complex algorithms that, to someone who took the long route of getting a CS degree, can look somewhat amazing, and frankly, humbling. But I think what's going on is that's the easy stuff to get a hold of, because all it takes is some judicious googling, or more likely, asking some professionals for help, and you're pointed in the right direction for the tool that solves your need, so you use it. Getting amazing progress on a new project isn't really hard.
Getting sustained progress, or being able to fix your problem a year or two in once you realize a major architectural change is required, and not being bogged down for months and losing interest, those are amazing, and those take experience or constant mentoring to achieve usually.
Or maybe I'm just projecting what I currently value.
This then becomes representative of your experience in spite of that fact that you are never likely to be confronted with that kind of problem with that kind of timeframe. Whatever's on your CV, and whatever you can say about what you've learned over the years, becomes totally irrelevant in the face of that.
I think it's offensive and I don't like how the industry has standardised on basically assuming everyone's a bullshitter. What's worse is that you'll have to repeat this over and over again for any company you interview with.
To that extent I'd much rather see the formation of a professional body that offers a trustworthy credential, potentially one that you need to renew every couple of years to ensure you're still current. That comes with challenges of its own but I'd rather have something that is both verifiable and can be reused.
Don't think this is true. At most tech interviews, the algorithms portion is more of a technical bar. It's usually the CV, years of relevant experience, and performance on the architecture/system design interview that determines what leveling you are.
>To that extent I'd much rather see the formation of a professional body that offers a trustworthy credential
Every company values different things from their candidates, so I don't think a formal interview process will ever go away. That said, there are companies like Triplebyte that try and standardize the interview process - they usually don't ask algorithm questions either. It's not quite a certification, but they'll fast-forward you through the interview process at companies that use them.
I have often seen weak developers promoted to senior because they are good with tools, trends, and configurations only to struggle in the face of an original problem. You can't simply hope to download your way out of every development problem and hope to be taken seriously as a senior. That reality is something many people are unwilling to accept at great emotional peril.
In the UK I've interviewed at dozens of companies and never once been asked to code a breadth first search or something like that. Nor have I ever heard of a developer being asked to solve such things in interviews over here.
Also, a company hiring a mid-level developer (say, around 3 years' experience) doesn't really have that much of a pool to choose from. Most good developers are already in jobs and pickings are slim; it's a seller's market.
Again, in my experience, it's always seemed like the interview was more of a personal test than a technical one. Will the person be easy to work with? Will they be communicative, or will they bottle up their issues and resent the team for it? Are they willing to learn? Will they be able to handle occasional pressure and stress? I think it's generally accepted that it's extremely difficult to accurately gauge a person's technical abilities in an interview. Also, employees always start on a probationary contract in which they can be fired with 1-2 weeks' notice; this usually lasts between 3-6 months, so if it turns out the employee was blagging it in the interview, it will become apparent pretty quickly and they will likely not make it past the probation period.
Not all of them are bad though; I had an interview that had me implement a basic data structure in accordance with the specifications given. It didn’t require memorization, practice, or any “Leetcode grinding”; only fundamentals in communication and programming. The time I had was about an hour, so this is definitely a plausible substitute.
> Much more important is to have the experience of how code complexity accumulates, and how to mitigate that
That's part of the reason. The average developer knows how to pick a popular library and throw code at it until it works. When entire teams do that you end up with endless complexity.
That's why Google and Amazon are known for [re]inventing their own solutions.
Are "real code-related things" really much more important if it only takes hours and hours to learn something that might get you an offer at one of these companies?
Perhaps those hours and hours are actually far more valuable 'financially' speaking than a lifetime of studying code that is valuable 'technically' speaking. I don't doubt that some of the big name companies use rarely-used algorithms with battle-tested solutions on purpose JUST to see if you prepared for battle. They look for non-technical smarts because they have no shortage of technically prodigious applicants.
At Google, with over 400 applicants for every position, they have no shortage of technically-competent engineers. They look for people who think way outside the box and try new things, sometimes resulting in industry defining inventiveness. It's ~10 times harder to get hired at Google than get into Harvard.
You basically "build" a linkedin profile that comes up when recruiters search for technically hot things, and then they will come knocking
The so-called 'algorithms' asked in interviews are really an alias for imperative style, the way 'imperative' is the counterpart of 'functional programming style', or 'procedural programming' of 'object-oriented programming'.
They are too arcane to read, hard to modify, and error-prone. Most of them require mutable data structures and intermediate snapshots for some special optimization purpose, which makes everything much harder to reuse.
That style mostly requires you to instruct every piece of code and every step of execution, instead of just describing what you need and letting the code find a way to fulfill it autonomously, in a declarative manner like Haskell or Prolog.
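To make that contrast concrete in Python terms (a toy example of my own, not from the comment above):

```python
# Imperative: instruct every step, mutating state as you go.
def evens_squared_imperative(xs):
    result = []
    for x in xs:
        if x % 2 == 0:
            result.append(x * x)
    return result

# Declarative-leaning: describe what you want; no visible mutation.
def evens_squared_declarative(xs):
    return [x * x for x in xs if x % 2 == 0]

print(evens_squared_declarative([1, 2, 3, 4]))  # [4, 16]
```

Both do the same thing; the second just says *what*, not *how*, which is the maintainability argument being made here.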
And 99% of software companies don't really need their employees to have a deep grasp of algorithms, while most of them do need their code to be somewhat more maintainable. Leetcode-oriented interviews are just really bad for that.
Personally, I believe business sense and culture stuff are very important. As for skill part, a lot of concepts could be more useful than algorithms like understanding of domain and relation modeling, immutable data structures, lazy evaluation, reactive design etc.
Granted that you only need a vague idea about the algorithm. But going from a vague idea to working code is also a valuable skill.
Wow, that’s amazing! Congratulations to the author because this demonstrates they have genuine talent.
In contrast, I’ve been a programmer for 10+ years, and I cannot pass the technical interviews in the companies mentioned above. At first, I thought the reason behind my failures was a lack of formal education in Computer Science, so I started reading more books. Then I thought, maybe it’s the fact that I spend more of my “productive” hours in my job just doing lumberjack web development, so I started participating in competitive programming (LeetCode, Code Golf, HackerRank, Code Wars, among many others).
Finally, I realized my brain needs more time than the average programmer’s to find patterns in these types of problems.
I gave up on my goal to land a job in one of these big corporations.
However, I don’t feel bad about giving up, in fact, thanks to all these books and competitive coding exercises, I was able to find two of the most exciting jobs I ever thought I would have, for 4+ years I worked in the software security industry doing Malware Research and building infrastructure tools for other security researchers. Most recently, I entered the game industry, and finally, I can use my algorithms and data structures for non-trivial projects.
Interestingly, I’ve recently been getting more messages from recruiters who want me to work for some of these companies. I politely decline the invitations because I know I cannot pass the technical interviews, but I promptly refer some of my colleagues, because I can see my younger self reflected in them, and I want them to have the experience that I couldn’t: working for one of these companies. Even if they stay only for a few months, as many people burn out, having the company’s name on their resume will grant them dozens of new opportunities.
I often think about what Google/Facebook would be like if they hired some experienced engineers who may not be able to whiteboard a BFS tree or tell you Dijkstra's algorithm, but have proven business track records of getting projects done, on budget, and on time. Real, pragmatic, get-it-done types of engineers. (That's not to say whiteboard-expert engineers can't also be this way; it's just that whiteboard interviews don't select for this in particular. Technical expertise comes first.)
There was an excellent comment on another thread yesterday (that I can't find) that basically said something along the lines of "If I'm asked about BFS trees in an interview I'm going to tell them I'm just going to google a library that can handle it - I've got more important work to get done"
> I cannot help but think that these big tech companies (FAANG, et. al) are missing out on diversifying and increasing their engineering expertise by passing over developers like you.
I think this is certainly true.
> I often think about what Google/Facebook would be like if they hired some experienced engineers who may not be able to whiteboard a BFS tree or tell you Dijkstra's algorithm, but have proven business track records of getting projects done, on budget, and on time.
Well... how do we find these people? By looking at their resumes where they claim this? By contacting references who will attest to it? By trusting the intuition of subjective evaluators of the candidates?
Practically speaking, FAANG companies do hire such individuals, they just do it through acqui-hires. If a person works at a company that is good enough to be worth acquiring, then we have a good signal that they are effective employees even absent a direct evaluation of their technical abilities.
there's a huge pool of employees that are in companies which aren't potential acquisition targets.
> Well... how do we find these people? By looking at their resumes where they claim this? By contacting references who will attest to it?
internal references? if you've got a couple of internal folks who are doing good work, and they all worked with and vouch for old bob, maybe that's better than anything you're going to find out from <8 hours of whiteboard scribblings?
> By trusting the intuition of subjective evaluators of the candidates?
even the faintest whiff of implication that FAANG interviews might not be subjective is hilarious.
At each place it was people from other teams completely unrelated to that team who interviewed (or would interview) me and eventually turned into a decision panel where everything about me would be considered by these people who really knew nothing of my character/skills aside from the resume and white boarding.
Between that and the amount of times I heard "Stanford" tossed around in a way that put down other schools (while not having a degree at all myself) I decided to give up on ever working at any of these places without being an acquihire. It just seems like a far fetched pipe-dream and I'd never check the required boxes that they expect for someone to sit in the same building with them. And honestly, none of that sat well with me.
It was an interesting time and I got to finally experience SV and realized it's likely not a place for someone like me.
First of all it sounds like your interview experience was unpleasant so if this was at Google I apologize.
Secondly, I can totally relate to feeling like I don't belong having also come from a nontraditional background (No college degree).
> At each place it was people from other teams completely unrelated to that team who interviewed (or would interview) me and eventually turned into a decision panel where everything about me would be considered by these people who really knew nothing of my character/skills aside from the resume and white boarding.
This sounds similar to what we do at Google; people who know you actually aren't allowed to interview you because they will be biased. We try to make the interview as objective as possible. Note that information from anyone who knows you or referred you will be shown to the hiring committee though so it's not as though that feedback is not used.
> Between that and the amount of times I heard "Stanford" tossed around in a way that put down other schools (while not having a degree at all myself) I decided to give up on ever working at any of these places without being an acquihire. It just seems like a far fetched pipe-dream and I'd never check the required boxes that they expect for someone to sit in the same building with them. And honestly, none of that sat well with me.
I understand why you feel this way but that's definitely not true! I don't think you necessarily should want to work in FAANG but I strongly think you should believe you are capable.
It's weird that it's that skill, and only that skill, that gets evaluated. I did another interview at a different one where there was at least a design interview (though I flubbed it and wound up being incoherent).
It really feels like they aren't even trying to evaluate anything other than whiteboard coding. Accepting that it's not a great signal and yet fully investing in it.
It's such a weird skill and it's so easy to perform badly; I'd be kind of shocked if the test/retest validity wasn't very low.
Also... I mentioned that working on teams with other women was important to me... but every technical onsite I've had has been given by a man. They've pitched teams led by women, and my HR/recruiting contacts have been nearly all women. But for the interview itself? All men.
It's absolutely very low, and they're okay with that. One of the things recruiters at these companies will tell you if you get rejected is some variant of, "Don't worry, you can always try next year." These companies fully understand that they're rejecting good engineers. They don't care, because, historically, the number of engineers applying has been so high that they could reject 75% of the good engineers and still have enough to fill their headcount.
We'll see how that attitude towards interviewing changes when high Bay Area/Seattle housing prices make it more difficult for them to recruit.
At big companies this is hard to scale because you have to protect against nepotism at the top layer. This forces policies protecting against the bad outcome.
At small companies this does happen.
Get a sense of the projects they have worked on if those skills relate to the position. Make a decision based on that.
Random coding tests, whiteboarding, buzzword dropping are only helpful to a point.
However, OP was talking about internal references. This would be a situation where I work at Company A, but I know you and trust your work. Therefore, I refer you to Company A, big up you a little and suggest that they pursue hiring you.
The difference is that if you suck, it reflects poorly on me (and quite likely kills my dreams of upward mobility). Providers of normal references don't have the same motive to be truthful!
I’ve never heard of any other company doing this, but I haven’t been looking especially carefully.
From FAANG, to start ups, to established traditional engineering companies.
Just curious, because again, I’ve only ever heard this from friends that got bought by Google. I’m not saying others don’t, just haven’t heard of it from folks getting acquired at other places.
- If they decline, you usually get a very generous severance package
- While sometimes technical, there is usually significant evidence of your abilities in the acquired company's records.
- In my experience it's usually a cultural test more than any other.
- Interviewing is part of the evaluation. If the hiring company thinks everyone is grossly incompetent they can reevaluate the acquisition or call it off entirely.
While not perfect I wonder if a person's projects are a good signal for this. There certainly are those who have developed a significant open source project but who get rejected by the FAANG companies, the author of Homebrew being a recent infamous example.
As an interviewer, I can definitely say that GitHub has become a bigger part of helping to evaluate people.
Many people can’t work on open source, either due to a lack of time or corporate policy.
Not to diminish his accomplishment... he could be a good candidate for PM, but his tool is more of a dev tool (so I suppose it's niche-ish?)
> I wrote a simple package manager. Anyone could write one. And in fact mine is pretty bad. It doesn't do dependency management properly. It doesn’t handle edge case behavior well. It isn’t well tested.
But ultimately, should Google have hired me? Yes, absolutely yes. I am often a dick, I am often difficult, I often don’t know computer science, but. BUT. I make really good things, maybe they aren't perfect, but people really like them. Surely, surely Google could have used that.
Having worked with people who are dicks but make really good things...I wouldn't want him on my team either. There are plenty of people who make really good things and aren't dicks.
It’s always possible that Google felt he wasn’t what they needed. And keep in mind that Howell ended up being hired by Apple and working there for a while.
I mean... Yes?
This system, backed by "trusted recommenders" is exactly what academia uses, and they seem perfectly capable of discovering Higgs Bosons and whatnot.
This is inherently biased and proven to be the worst way to hire people.
I would argue that pair programming or take home projects do a far better job than the standard Whiteboard Algo interviews. In fact I don't think it's even close but SV engineers have been captured by this Whiteboard Interview Stockholm Syndrome/Hazing and continue to perpetuate the insane idea that there is no better way.
How does one not have time for pair programming? It's the exact same time commitment, from both the interviewer and interviewee, as a whiteboarding session. It's just a different format for the interview.
I don't think it's the case that people don't like interviews. I think you're making a strawman. I'm not going to go and say that you're part of the cycle of SV engineers being hazed, reproducing it and exhibiting a sunk cost fallacy/bias towards the way you've done it, but it's possible. I don't think this is the ideal way to interview.
What's more, I've been seeing actual improvement on the state of the art in this area. Platforms like Karat et al (not sure if Karat is the best in this area but I did have a good experience with a company that used them and interviewed me well) are actually incentivized to minimize false positives and negatives rather than just one. You might be surprised at how much better a good technical interviewer can be than an average strong engineer.
And their education, previous employment, etc.
No one makes a neuro-surgeon perform a surgery before hiring them.
True, but I also know that a neurosurgeon graduated from an accredited school, did an internship at an accredited hospital, and then passed a standardized licensing exam.
If we had that for engineers we wouldn't need to test their skills either.
When I was doing hiring, I would see resumes of people with 10+ years of experience who couldn't write a simple loop in their favorite language. If you've been coding for 10 years, you should be able to write a simple loop. That's the problem we're solving for here.
What you are asking for is a surgeon retaking the boards every time they change hospitals, or a licensed professional engineer retaking the PE exam every time they change jobs.
The algorithms asked in interviews are rarely implemented in a job. All they prove is how much you studied and the author of the article proves that.
What our industry is missing, IMHO, is required certification and training. Doctors and nurses have yearly training and related exams to keep their certification. That's why a hospital can hire staff based on a license, not because of the items you mentioned. But our industry would never go that route, which is a longer rant.
Also, if people who can't write for loops are passing your phone screen... you might want to update your phone screen.
Yes, just as you retake your driving exams every five years, you should retake your engineering exams (unless you are an accredited engineering instructor, in which case the license becomes permanent) every ten years.
Uhh... I haven't taken a driving test in 17 years (and two states ago, to boot).
While I agree with this, the FAANG interviews take it too far. Simple fizzbuzz is enough to weed out these completely incompetent engineers. If they pass that, accomplishments and experience are going to be a much better indicator of engineering ability imo.
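For reference, the FizzBuzz bar being talked about is roughly this (one of many acceptable variants, sketched in Python):

```python
def fizzbuzz(n):
    """Classic screening question: for 1..n, multiples of 3 become "Fizz",
    multiples of 5 become "Buzz", multiples of both become "FizzBuzz",
    and everything else stays a number."""
    out = []
    for i in range(1, n + 1):
        word = ("Fizz" * (i % 3 == 0)) + ("Buzz" * (i % 5 == 0))
        out.append(word or str(i))
    return out

print(fizzbuzz(15)[-1])  # FizzBuzz
```

Anyone who has actually been coding for a while writes this in a couple of minutes, which is exactly why it works as a floor rather than a bar.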
That's the problem some interviews solve for. FAANG interviews take it to another level.
The bar is being raised all the time as much as it is painful and hard for me to see/experience this :(.
We have that for "real" engineers. Doesn't stop many of them from being atrocious at their job. Doesn't stop doctors either, for that matter.
...and meets ongoing continuing education requirements.
> If we had that for engineers we wouldn't need to test their skills either.
We do have that for professional engineers.
We don't have it for people working in software whose job titles have become (but largely weren't for similar roles a few decades ago) "engineer".
Which are a joke. Attending conferences and weekend workshops take care of this.
Sure, professional licensing provides fairly weak guarantees, but it's more than exist in software, hence FizzBuzz.
I think we will, eventually. Other fields of engineering already do, of course. Software is new and not that many people have been killed by bad software (unlike bad buildings), but I think we'll get there.
Unfortunately, this seems to be poorly correlated with whether people can actually code.
There is also an experimental virtual surgery simulator for objective evaluation - https://www.ncbi.nlm.nih.gov/pubmed/26153114
Google allows you to write code on a Chromebook instead of a whiteboard (don't know if it's for everyone or some people). Is that pair programming? Or do you mean actually working with your preferred IDE/text editor, terminal, browser to look stuff up?
I think they weed out a few, but completely ignore practical development skills, work ethic, soft skills, design and architecture skills, etc.
Of course maybe this explains why most of the big tech companies have seemed pretty stagnant for the last decade, largely failing with products and decisions that have poor execution and market fit outside of the products that made them big in the first place.
Has anyone actually studied the best way to hire devs? Like a real, independent study that measured and compared results across, perhaps, a wide array of metrics?
I don't know how you would ever strive to do such a thing when people can't even agree how to measure productivity.
> Of course maybe this explains why most of the big tech companies have seemed pretty stagnant for the last decade, largely failing with products and decisions that have poor execution and market fit outside of the products that made them big in the first place.
In most large companies these things have nothing to do with programmers; the decisions are made by management and products are designed by PMs. Hell, you can't even necessarily blame buggy products on programmers: I've been on projects where everyone realizes things suck but if you don't get funding or agreement to work on infrastructure projects what can you do?
This is the whole point of the non-coding portion of the interview, and while it’s not possible to get a full picture of this in the short amount of time allotted, it is generally enough to throw out the obvious ones.
> maybe this explains why most of the big tech companies have seemed pretty stagnant for the last decade, largely failing with products and decisions that have poor execution and market fit outside of the products that made them big in the first place
I think this is unrelated to hiring and more a result of corporate policy.
Tech interviews are not solely whiteboarding. Even at FAANGs there are system design and hiring manager interviews designed to ask about those sorts of things.
Also, what evidence is there that these high-false-negative practices are actually reducing false positives by a significant degree? In these kind of threads I read the same arguments and assertions, with the same lack of evidence, as I hear from people trying to defend airport security theater. I should start calling these practices "interview theater".
In my experience (hired 10 engineers), yes.
It's technically easy to fire people, but it's emotionally draining on the team and on the manager in particular. Every time you fire someone, the team's morale takes a hit, and if you were firing people often, folks would begin to feel insecure and you'd struggle to keep good engineers.
Smart, good people want to work with other smart, good people. If you're constantly bringing in bad hires (even if you later fire them) then your good engineers will get fed up and leave.
The net value of a bad hire can easily be negative, if you consider the high onboarding costs to get a new engineer ramped up and trained. (Of course, the very best engineers basically sit down and start coding as well as your existing good engineers, but these true outliers are by definition uncommon). Not to mention the cost of the bugs and rework that bad hires produce.
If you use a recruiter, you're typically paying a large fee that you can get refunded after N months (details vary, 3 seems common), so you'd have to be making your assessment in the first N months; that's a lot better than trying to make an opinion on the first day, so if all of the above wasn't true, in the abstract, it would be great to be able to hire aggressively and fire likewise. However, good engineers balk at the concept of a "trial hire"; understandably, as personally I'd refuse to work somewhere that didn't have conviction that they wanted to keep me.
I imagine the relative significance of these points varies between companies and stages; all of these points are from early-stage (<25 person company) startup life; YMMV.
> (something you won’t get from a whiteboarding challenge).
I agree with this point, though I think it's orthogonal to the question of false positive vs. false negative cost optimization.
I know Google does a lot of metrics on hiring. Have they correlated their practice with IQ anywhere? Guessing nothing public as that would trigger a firestorm.
At the end of the day it’s just the cognitive elite trying to hire the cognitive elite.
Conformance too (due to the prep time)
It is instead the opposite. These companies are testing the quality of "who has practiced the most for these types of questions".
It has nothing to do with intelligence. It is instead almost directly correlated with how much time you have spent practicing interview questions.
Those people generally also do well on those whiteboard questions.
I think this happens less than people think. I've been at two big tech companies and interviewed people while at both (and obviously was interviewed myself).
I think what trips people up is that many questions have solutions given by "named" CS algorithms like the ones you listed, but also have simpler solutions that are fine too. And honestly, candidates who invoke named algorithms (and maybe eventually regurgitate the textbook version) are often not the good ones, to me at least. Most problems like this have simplifications that good candidates take advantage of to do something custom, and much simpler, than the "named" algorithms.
Basically, if I'm interviewing you and you regurgitate a famous algorithm... well I'm not going to ding you if you do it correctly, but in my experience, I'm much more likely to give a good review to someone that methodically works out a simple solution instead. Often the people that successfully dig up a named algorithm and apply it can't talk about it very well.
So, I suspect that a lot of the people that complain about not knowing a famous algorithm in an interview simply failed to work out a simpler solution to a simpler problem and didn't realize it.
I'm not talking about brute force solutions. For many problems, there are things in between <famous algorithm> and brute force. Some problems offer simplifications over more general problems where <famous algorithm> is strictly worse than a simpler, more customized solution (same efficiency, but simpler to write and understand).
Can you give a few concrete examples of how a deep understanding of BFS that can't be googled and read in 10 minutes at the time you need it can help you ship profitable software products faster than your competitors?
Indirection is great in this trade, a lot of what I "know" is just an index key/search strategy to go look it up again. But a lot of what I "know" I know directly. According to this old paper, for many fields including software you need to directly know about 70k±20k random things in the field to be in the running for expert status. Proficiency is also related to how many things you know directly. Thus you can't do an index lookup for everything unless you have no constraints on time and/or don't care about improving your level of expertise.
For me, it always goes like this:
* Am I iterating over a list multiple times?
* Am I iterating over a list and comparing it to some elements in another list?
Turn one of them into a map and continue. It's pretty simple.
I usually find DFS/BFS applicable when I'm dealing with data that forms relations like "children" or "neighbors" or "connected". Sometimes that's explicit because the data is already in a tree or graph structure, sometimes it's not. And often rather than looking for a particular element, I want to iterate over all the data connected by the relationship to do something with after as a list/set/map of the data or some of its properties.
A concrete example from a side project (it's more fun than Real Work examples, and people have given other examples in the thread anyway), I was writing a client for the board game Go. If you're not familiar, the data is just a 2D grid (easy to interpret as a graph), black/white stones get placed on coordinates like (3,4) marking intersecting lines. A single stone forms a connected group with another stone if both are the same color and separated horizontally or vertically by one. Groups have a count of "liberties" representing the number of unoccupied points the group can expand to if a stone of its color is played there -- a single stone by itself in the middle of the board will thus have 4 liberties, if you connect a stone then that group now has 6. If you cause an opponent group's liberties to go to 0 by surrounding it, the whole group dies, and its stones are removed from the board with those points becoming unoccupied again. Suicides (causing your own group to hit 0) are ok only if they capture the opponent first, otherwise are invalid moves.
Handling the game rules is easy with DFS/BFS. All you need to start with is a function like get_neighbors(position) which on a grid is trivial, being the 2-4 surrounding points depending on whether position is an edge/corner. Then you can use DFS/BFS (doesn't matter) and several lines of code later you've got a function like get_group_member_positions(start_position) that gives you every member stone's position no matter which one you query first. Now you can make a function count_liberties(group_positions) -- which can be solved as another depth-first traversal problem over the neighbors of each member of the group that are empty and haven't been counted already.
Even less code this time, though; have you ever had to write 4 lines of code like "for el in list, for child in el.children(), if child blah, do something"? That's a DFS, just not a general one, since you know how many layers there are (or how many you care about), and it assumes children aren't shared between els.
From all that you can then have a function like get_stones_captured_by_move(move) that tells you what (if any) stones need to be removed if move were to go through: for each neighbor of the move, if any are the opponent's color, count the liberties for that neighbor's group and if it's 1 then those will die when the move is played (takes the last liberty). If it's all one big enemy group around the move you'll have to account for duplicate positions to remove, but those are minor optimization details. Similarly the fact that everything is recalculated all the time, that could be optimized with more storage.
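A rough sketch of how those functions might look, assuming positions are (x, y) tuples and the board is a dict from occupied position to color (the names and representation here are my own, not from the actual project):

```python
from collections import deque

BOARD_SIZE = 9  # assumed small board for illustration

def get_neighbors(pos):
    """The 2-4 points orthogonally adjacent to pos, clipped to the board."""
    x, y = pos
    candidates = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
    return [(cx, cy) for cx, cy in candidates
            if 0 <= cx < BOARD_SIZE and 0 <= cy < BOARD_SIZE]

def get_group_member_positions(board, start):
    """BFS over same-colored neighbors: returns the whole connected group."""
    color = board[start]
    group = {start}
    queue = deque([start])
    while queue:
        pos = queue.popleft()
        for n in get_neighbors(pos):
            if n not in group and board.get(n) == color:
                group.add(n)
                queue.append(n)
    return group

def count_liberties(board, group_positions):
    """Distinct empty points adjacent to any stone in the group."""
    liberties = set()
    for pos in group_positions:
        for n in get_neighbors(pos):
            if n not in board:
                liberties.add(n)
    return len(liberties)

def get_stones_captured_by_move(board, move, color):
    """Enemy stones that would be removed if `color` played at `move`."""
    board = dict(board)  # work on a copy with the move played
    board[move] = color
    captured = set()
    for n in get_neighbors(move):
        if board.get(n) not in (None, color) and n not in captured:
            group = get_group_member_positions(board, n)
            if count_liberties(board, group) == 0:
                captured |= group
    return captured
```

The traversal itself really is "several lines": a visited set, a queue, and a loop over neighbors. Everything game-specific lives in get_neighbors and the color check.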
BFS/DFS are just basic building blocks here.
You will be asked to find the biggest file in a folder structure, the largest module in a dependency tree, or the person in a data structure who has no department and thus caused a null pointer exception.
You also won't google it, because it is simple and a normal programmer doesn't need Google for this.
I think the HN crowd is heavily inclined toward web apps and front end. I can't tell you how important fundamental CS knowledge is for backend. You can't use a library or API to learn how caches work or when paging happens. Knowing these things makes you a well-rounded engineer ready to tackle different problems. I can trust you to write a mobile app or work on some part of a self-driving car because you have the building blocks to do so. Most of the comments in such threads show no interest in interesting and diverse jobs because "why bother, when will I use this". To any new grad: yes, the interviews could be better, but please spend time on these things at school. They are an investment in your career's stock.
And I don't have this kind of fundamental, "what do you mean 'google it' are you a complete and total fucking moron!?" attitude about anything in software. I routinely browse through google results (mostly SO/Medium) about very basic concepts just to read the words again and re-warm those caches in my brain.
These questions come up all the time. You just need to be proficient enough with algorithms to identify them.
I would go even further and say that you could easily have installed something like https://oracle.github.io/opengrok/ in three commands for your organization and extracted a lot more value while resolving the issue.
10 million lines of code fits in RAM pretty easily. You could use the worst algorithm in the world and still complete that whole task with just a few seconds of compute time.
I think that's the point. You don't really need to know about BFS in most cases, because in most cases you can solve the problem with any old search in just a few seconds.
For BFS? Almost never, no. Libraries are generally useful for reference implementations of data structures and algorithms that work on collections; I have yet to find a library that works well for graph problems, or for when you need to subtly tweak the data structure (for example, a binary tree that counts the number of elements on its left and right). The issue with libraries is that they are built for the general use case, and a lot of the time in the real world it is non-trivial to transform your problem into the format that the library expects, so you end up having to do this yourself.
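For instance, the left/right-counting tree mentioned above can be sketched by augmenting a plain BST node with a subtree size; this is a minimal illustration only (hypothetical names, no balancing), but it's exactly the kind of tweak no off-the-shelf library gives you:

```python
class Node:
    """BST node that tracks its subtree size, so we can count
    how many elements sit to the left/right of any key."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.size = 1  # nodes in the subtree rooted here

def size(node):
    return node.size if node else 0

def insert(node, key):
    if node is None:
        return Node(key)
    if key < node.key:
        node.left = insert(node.left, key)
    else:
        node.right = insert(node.right, key)
    node.size = 1 + size(node.left) + size(node.right)
    return node

def count_less_than(node, key):
    """How many stored keys are strictly less than `key` -- O(height)."""
    if node is None:
        return 0
    if key <= node.key:
        return count_less_than(node.left, key)
    return 1 + size(node.left) + count_less_than(node.right, key)
```

The point is that the size bookkeeping has to live inside insert itself, which is why "just use a library BST" doesn't get you there.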
It's not a question of speed. It's the fact that tree traversals are the best way to analyze parsed text. Although, it was quite resource intensive and we ended up distributing the workload among multiple computers so we could scan all pages at once. Luckily this was easy because pure functions are trivially parallelizable.
I thought you were talking about actually using BFS on a data structure, in which case I'd use a library to do it so I don't have to reimplement all the loops and because there are modern libraries that would take care of the parallelization (like this one)
On the topic of my example not constituting "knowing BFS": the BFS I shared is just a slight modification of what most students are taught, and is close to the second method for level-order traversal of a tree on GeeksforGeeks. If you don't like the BFS usage because it isn't on an explicit tree (although nested HTML templates most definitely form a tree), the result of parsing an AspxFile in my gist returns an explicit tree data structure, and the helper method at the bottom does a standard preorder traversal on that linked structure. That feels like "actually using DFS on a data structure".
 https://github.com/PositiveTechnologies/AspxParser/blob/mast... (AspxParseResult has a reference to the root AspxNode, and btw I didn't write this parsing library)
I wouldn't hire an electrical engineer who complained that he shouldn't have to know Ohm's Law, either.
...And how are you keeping track of that and other such minutiae? What data structures are you using, and why? And what about if X happened, which would invalidate your approach?
Also, it's not about "knowing" it. It's about implementing it in such a way as to please the interviewers.
It's pop quiz nonsense.
As for pleasing the interviewers, of course it is about pleasing the interviewers. They're making the "buy" decision on the talents you're "selling". Of course the seller needs to please the buyer to close the sale. I don't know how it could work any other way.
This is a good question, IMO, because it tests “I need to do this, which data structure would work well for it”, which does come up often in real life.
Sorry, past midnight here after a busy day at work :-P
I know. These people are expecting to be paid way above the average salary - I'd expect them to know the fundamentals, or (more importantly) be able to figure it out.
"BFS?? I'd use a library for that" - well, thanks for that... instead of hiring you I'll just download some libraries instead.
DHH's tweet https://twitter.com/dhh/status/1085987159406927872
1. Creator of top software project can't pass whiteboard interviews.
2. I can't pass whiteboard interviews.
Therefore, they are stupid for not hiring me.
It strokes everyone's inner narcissist--why spend the hours on leetcode, they are stupid anyway.
My above comment was strictly related to the idea that DHH shouldn't be considered for a job over another candidate who is an unknown quantity simply because DHH apparently is not good at algorithm problems in interviews.
I know that sounds rude... but one seems better suited to be a leader while the others are better engineers, would you agree?
I'd totally fanboy and hire Jeff if it came down to just one, though.
I also don't get why people like DHH are assumed to be a "miss" for not being hired by some of these companies...
DHH is a unique individual with his own way of thinking. He wouldn't be DHH if he worked for someone else (or under someone).
What DHH does could absolutely be incorporated by Google. Angular => RoR, GCE => SaaS, conference speakers => conference speakers.
Anyway the point is that even at a company like Google there's plenty of room for people that haven't memorized algorithms and data structures but companies that only interview on these sorts of things are missing out on them because they seemingly don't care or haven't figured out how to interview someone substantially different from what they normally hire.
I find the skillset required to build Basecamp differs greatly from the one required to build the AWS/GCE portfolio (over a hundred different services that can be combined to deliver solutions for a wide range of companies...)
I don't think Google was looking for someone to build RoR or Angular. Those were merely side projects that came out once every few years. No offense, but some of the key components of RoR were implementations of Martin Fowler's Enterprise Application Architecture patterns.
Such a situation would show me that he either does not care about getting the job, or is so arrogant that he automatically expected people to throw job offers at him.
That shows that this person has horrible personality problems and that I would not want to work with him, no matter how much he has accomplished in his life.
I think the phrase "jack of all trades, master of none" is overused and doesn't hold a lot of truth. It's very possible to have a lot of experience and skill in many different areas. Maybe there's even a positive correlation between being a very good programmer that can pass whiteboard interviews, and being a pragmatic, get-it-done type of engineer.
There is still so much stupid money out there that it doesn't pay to do this. Literally immaterial, and it's more cost-effective to hire for a narrow-but-consistent set of requirements.
I bet you're quite good at something (perhaps multiple things), but they don't value it. Their loss. I'm glad you found something that works for you.
Why? It's not so clear to me that it's not working right now. These companies seem to be doing completely fine.
If hiring programmers who aren't good at programming puzzles is a competitive edge, I would expect to see other companies hiring these programmers and succeeding.
It seems more likely to me that there is (at most) a very marginal loss from hiring this way. It's not the fairest way to hire, it's not the most equitable way to hire but the evidence doesn't suggest to me that it's not working for these companies.
You're begging the question. Other companies do hire people who are excluded by a certain class of employer, and that's exactly why some get outraged by the types of quizzes that are used. The difference between the standards creates cognitive dissonance over one's ability and status. It's effectively a caste system, and it involves persistent blind spots about discrimination, whether towards protected classes or not.
I can't pass an Amazon or Google phone screen - both of them invited me to try, and they then made me feel very out of place. I'm not a software engineer per se and I didn't get my CS degree from a top school. Yet I have worked for a company that Google outsources important work to, proving I can do useful things that Google needs done and maybe isn't even able to do in house.
I don't go around raging against Google hiring who they want, partly because I'm probably better off not working there anyway. But I understand where people are coming from when they do get angry, even if it is the sign of an imperfect character.
From an employers point of view, they just want to grab the employees that are suitable for their goals and jettison the others. But from a potential employee's perspective, it seems like if I'm not good for one job at a company, I should be good for another. Rejection seems unreasonably total. If, by analogy, you deal in second hand cars, it's a reasonable business model to buy virtually any car so long as the price is right and you know who to sell it to. But not all businesses are run like that of course. It aggravates people that there is a narrow vision for who is useful.
Very good analogy. And true. My dad used to own a used car dealership back in the day. He bought pretty much anything he could get for a good price that was in decent shape. If he only bought cars that were in perfect shape, he'd have pretty much no cars on his lot.
And I've worked for companies whose bosses are so damn picky about who they hire that they would go without hiring a single person even after conducting hundreds of interviews themselves over six months, as the rest of us were struggling to keep up with the workload.
It got so bad one time the higher ups eventually had a recruiter just pick a few people and said "Here, these are your new employees. Get them up to speed." and he just had to accept it.
In the meantime, colleagues are leaving the gig, making my stress level go up.
We do see other companies hiring differently, and being rewarded in the market, in part, for it. Most of the time it's a marginal reward. Sometimes, though, we get a new FAANG, staffed by people who wouldn't have made it through FAANG's wringer.
It's the difference between fracking the shale for every precious drop because that's all you've got where you are, vs. being able to just sink a short pipe in the ground and stick the sweet crude that gushes out into barrels.
Um.. have you looked at their stock? Investors would probably want them to continue the coding puzzles if they knew about that.
That's a great way of putting it. Hard to see any other explanation.
Well, maybe not even the "average" programmer, but just the elite of the elite Google-tier programmers ; ) I feel the same way. When I was young, I thought that absolute mastery was within my grasp but after 25 years, tons of self-study, a couple of college degrees and yet still a lot of surprising rejections, I have to concede that while I may actually be very good at what I do (I'm among the most respected and most sought after everywhere I've ever worked), there are still people out there WAY better than I am, and that's OK.
You're better off following your own path and making your own things happen.
You have no idea how diverse people are in terms of talent at big companies. This is a myth peddled here at HN that FAANG employees write CRUD apps and are one dimensional. The best perk of my job is the random lunch scheduler where I've met a former Olympian, someone who is on the Rust core API team and a professional ballerina + software person.
The same was the case when I worked at Mozilla..
But I've always felt the interview process at these places extremely arbitrary.
I'm the same way. I'm really good at identifying solutions to a problem. I'm really good at identifying the costs/benefits/issues with each of those solutions. However, I take a while to get there. I may figure out one (suboptimal) solution quickly, but it takes me a while to pick out a variety of them and identify the one I think is right for the current situation.
In school I would usually (ok, often) get the highest score, but took the longest to finish.
Maybe we are just not good enough.. And maybe it's sour grapes for having been rejected by them, but honestly I don't feel I would be a good fit for Google or FB ideologically. I value user privacy and freedom too much. In my eyes they are both evil-corps.
> Finally, I realized my brain needs more time than the average programmer
I also came to this conclusion. I accept it and focus on my strengths as a team lead instead.
Your experience allowed you to move toward doing work that excites you. Even though you didn't meet your goal, you still had great success.
Just progress an inch at a time.
I mean yes, some engineers are going to be better at interviews than others but we gotta go by some signals. We had to flunk a few because it was "we see the depth of knowledge the person describes, but we have to go by the signals we saw, otherwise we're just randomly picking people".
Point is it is tough. However it does say something that our interviews are just a bunch of quizzes and studying for a month will make a huge difference. When hiring the goal is to find out if the person is good, not if they crammed, but cramming does work quite well.
I'm not arguing that it shouldn't be the way it is, and I'm also not arguing against it.
Claiming "Finally, I realized my brain needs more time than the average programmer to find patterns in this type of problems." feels a little bit like you've given up and that in itself strikes me as the worst sign.
I haven’t applied lately. Working in one of these big companies was a dream of my younger self. Nowadays, I prefer companies where I can work at my own pace. Moreover, while the salaries at these companies are exaggeratedly high, especially in San Francisco, I feel that my current compensation package is good enough to live a kind and peaceful life.
Those companies sure want you to believe what you're claiming because it means they get the first bite of the cherry but the reality is very different (from my experience anyway).
I strongly disagree, and I'm not sure why you would say that. I've heard that working at Google is a golden stamp on your resume, and you will be able to easily get a job at any other company. You'll probably even be able to skip the coding challenges in any future interviews.
Is this speculative or does this actually happen?
As a team leader I'm involved in hiring and I certainly wouldn't treat one applicant any different from another regardless of their previous postings nor experience (except when I'm headhunting someone I've previously worked with). My own experience interviewing has taught me it's easy for people to put stuff down on CVs (eg they might have legitimately worked at Google but through an acquisition rather than hired by; or not even with the team they suggest they have). So I would consider short-cutting part of the interview process in the way you described to be grossly negligent.
I don't dispute that your CV is more likely to get short listed however you can certainly still sell yourself without having FAANG on there.
My general point is that while having FAANG on your CV will undoubtedly look good, a good engineer shouldn't have any problems getting awesome jobs with or without it. So is the prestige attached to FAANG really worth anything in the real world, or is it disproportionately hyped?
Maybe this is just one of those differences between how people are hired in SV (where I haven't worked) and London (where I do work)?
I think it would be pretty rude and unnecessary if you asked a high-level Google employee to do a whiteboard algorithm puzzle, or a take-home assignment where they have to build a little todo list app. Especially if they are a Distinguished Engineer or a Google Fellow. I don't know where the cutoff is exactly, but at a certain point no-one should ask you to do any more coding puzzles. You'll still go through interviews, but they shouldn't need to test your basic programming skills.
To be honest, judging from your last post I suspect our opinions aren't that far apart. :)
In Silicon Valley, everyone knows that it is fake prestige, and the fact that you worked for a FAANG doesn't really mean anything. I know both excellent and mediocre candidates coming out of FAANGs. It literally carries no information besides the fact that you are probably more attracted to prestige.
A few years ago I was contacted by a Google recruiter, but I realized that it was going to be a very junior SRE role, so I wasn't interested in that. I could have put Google on my resume, but that probably wouldn't have been very impressive.
It still makes sense to work at one of these places that pay twice as much as everyone else.
It is simple supply and demand. The supposed prestige of FAANGs also means that more people want to get in, and unless they really want you, they would be stupid to overpay you.
Maybe things are different in London than they are in SV, but I've never had any issues. Quite the opposite, in fact; I've been turning awesome jobs down.
Anecdotally, I've really not seen the 2x trend you described, and I have spent a fair amount of time negotiating people's wages. Plus, engineers who are really concerned about maximizing their income will generally turn to contracting rather than permanent work anyway (at least that is the case in London, where contracting is commonplace).
Suffice to say, I'm pretty unconvinced that FAANG pay people twice what any other business would. Maybe if you're looking for jobs outside of tech hubs (like smaller towns away from expensive cities) but other tech firms in London (and SV I'd guess?) would have to pay competitive wages else risk not attracting any talent.
I also want to clarify that I didn't mean they always pay 2x over the competition, I meant that it's not unusual for them to. Also when I say FAANG - I kind of included similar large companies (Microsoft, LinkedIn, Salesforce, etc). There's also a lot of large companies that come close, maybe 80-90% of FAANG pay.
>Plus engineers who are really concerned about maximizing their income will generally turn to contracting rather than permanent work anyway
I think it's pretty difficult to get steady contract work that pays more than what FAANG does. I charged $175/hr when I used to contract and it was decently steady, but that only comes out to around the same as I would be making at a FAANG, minus all the perks/benefits. Also, when you get to higher seniority levels, your pay at FAANGs starts to reach into the $600-700k range, meaning you'd need a contract rate of $300+/hr to compete.
Usually when companies need that high a level of technical architecture/leadership, they'd normally hire someone full-time rather than use a contractor, so it would probably be difficult to charge that rate while getting steady work.
Ahhh, I was including those sorts of companies (barring Microsoft) as part of my counterexample. If the discussion was "any large / reputable organisation" then my comments would have differed a little.
In any case, it sounds like London is very different from SV. It's very common here for senior engineers to contract, and that's precisely because the difference in pay between permanent and contracting is night and day (the polar opposite of what you were describing in SV).
Nobody aces those interviews in the way you probably think; by just magically knowing every detail.