It's amazing how often "Hey, thanks for reaching out, I'm interested. Can you tell me more about the role?" results in the conversation ending right away. Probably over 50% of the recruiters who contact me never reply after that very polite and neutral response.
Many who do keep the conversation going have not read my profile or resume carefully, so I'll give them a summary of the types of work I'm actually interested in, which is never what they contact me for, and politely decline to move forward with the (usually way too junior) role they are looking to fill. That will almost always end the engagement.
Sure, it's a lot of noise, but filtering is very cheap: the time it takes to reply back. My actual success rate with recruiters is probably pretty average. Of the eight or so jobs I've had in my ~20 year career, about three came through recruiters: two via in-house staff and one via an external recruiter.
Edit: heck with it, I've got nothing to hide. Enjoy:
So folks don't abuse my poor auto-responder, here's what would happen if you hit the email in that resume:
2) The numbers in that autoresponder caused my jaw to hit the floor. I thought I was well compensated but apparently there is a lot of room for me to grow!
AND work in IT. As a housing architect I received zero offers from recruiters despite having relevant experience. On the other hand, I have been contacted several times by IT recruiters once I listed those (irrelevant) IT side projects.
In summary: it's not you, it's IT.
I normally get nearly zero recruiter emails, but late last year I changed my location preferences on Indeed and/or CareerBuilder (I don't remember which) from my hometown to Washington DC, and suddenly I was deluged with 20+ recruiter emails per month, with more arriving right after I updated something on my profile.
Strangely, 1/2 of the emails are for locations far away from DC. I might try changing my preferences to San Jose and see how many more I get.
My "profile" is pretty much just my resume. Experience seems to be another important factor.
80% of it is for jobs within the surrounding city, 10% for within the state, and 10% out of state. That said, most of these jobs you could find without recruiters as well, but sometimes contacts from internal recruiters (and, if you're lucky, a developer/dev manager) are useful.
Admittedly, I'm not in the market for a developer position, and I deliberately down-play my development experience in my profile, which probably reduces my contact count substantially. I should conduct an experiment wherein I stuff my resume and LinkedIn profile with programming languages and framework keywords for a month and record whether it has an effect on recruitment volume. I suspect it would.
Just follow all the guidelines that LinkedIn gives you, so that your profile is an "all-star" and make sure you have a bunch of connections (500+).
An email a day is fairly normal at least for SF engineers.
The more important nugget is that his passive method surfaces about two attractive opportunities a year that have a fairly high probability of leading to offers if he is interested. This enables him to:
1. Accurately appraise his worth to companies,
2. Quickly scale to a much higher number of interesting opportunities if he becomes dissatisfied with his current employer/role (or they become dissatisfied with him), by actively enlisting the 14 worthwhile recruiters per year who already value his conduct (even if only 2 per year have opportunities with an appropriate fit),
3. Identify hiring trends in his field.
I for one think it is a brilliant strategy, and I'll probably adopt it myself!
I'm not kidding. Dear HN reader, please steal this idea!
To me, getting this done on interesting domains seems to be the hard part. For people with their own domain, getting a separate server to deal with email for that is some hassle. You can't really do this on a generic domain either, because that looks a lot less professional. Signing people up for Gmail accounts might work, but that's probably against Google's terms of service. I'd guess the same for other webmail services that are at least somewhat professionally acceptable.
Best way I see is to give people with their own domain as easy a time as possible to set up DNS correctly. Getting through DKIM and SPF reliably seems like a minefield though.
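For people who do control their own DNS, the moving parts are roughly these — hypothetical zone-file entries for a placeholder domain; the selector, key material, and include host all come from your actual mail provider:

```
; example.com is a placeholder; values below depend entirely on your provider.
example.com.                       TXT "v=spf1 include:_spf.mailhost.example ~all"
selector1._domainkey.example.com.  TXT "v=DKIM1; k=rsa; p=MIGfMA0...publickey..."
_dmarc.example.com.                TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc@example.com"
```

SPF declares which hosts may send mail for the domain, DKIM publishes the public key receivers use to verify message signatures, and DMARC tells receivers what to do when both checks fail — getting all three consistent is exactly the minefield part.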
Okay maybe I took it too far.
You didn't take it too far. Someone should build the first online recruiter-focused video game, where the prize at the end of each level is the contact details for a more-and-more suitably qualified candidate.
Related anecdote: There was a technology recruiter who was constantly calling me at work, despite the fact that my work number is not listed anywhere* and my LinkedIn profile has clear instructions not to do that.
I decided to look him up on LinkedIn. Guess what his previous job was? Debt collector.
* I figure the recruiters get around that by calling the front desk and asking to be transferred to me.
The best recruiters can not only make efficient matches, but they also connect the dots to reach out to matches in the pool of passive candidates they talked to when a new role opens up.
Even better than that is when I'm able to actually push back with companies and make an impact on the hiring process to get someone hired.
With that foundation I'm more effective as I see more career trajectory data points when I talk to candidates that specialize (data, security, devops, ml) or ladder up (vp, manager, cto). Then I can provide even more value to new candidates with the career counseling approach.
I work with about 70 companies (only in NYC), from pre-launch to almost-IPO, and a new role opens up to us every week. It's helpful to find a recruiter who knows you and the market well enough to curate jobs you want and tell you about opportunities that might not even be listed or on your radar.
They have their own recruiting team that you are going to be dealing with anyway, and aren't playing the numbers game or shuffling their candidates between many different companies trying to get their (usually very large) hire bonus. External recruiters just throw as many people as they can at as many companies as they can till they get a bite.
Fees for senior employees can be up to a FULL YEAR of the hired employee's salary. As that employee you don't pay it, but it's not something that benefits you either. On top of that, the companies often have clawbacks: if the employee leaves within a certain time, the recruiter has to give back their fee.
The reason they don't respond after the first email is that they need someone who is going to be very proactive and actually motivated to get hired. Otherwise their candidate catapult might hit the target, but the candidate is less likely to stick.
Not true. Companies hire recruiters to find employees. Not all companies (especially startups) have their own internal recruiting department.
The flip side is true, though: as a company trying to hire people, every recruiter email that says "I have a perfect candidate for you" is junk. They just invent those people or send resumes of people who aren't even on the job market.
A founder should personally be handling recruiting until the company is big enough to have their own internal recruiting department.
This is false. My last two roles were both initiated by 3rd-party recruiters.
If you don't have any way for companies to target you themselves, then 3rd-party recruiters might be more effective than you targeting companies directly yourself.
However, if you are in the boat where you are getting many recruiter emails a week, my advice is sound. They are all junk, and the only way to get any signal to noise is to look for ones directly from the company doing the hiring.
Tumbleweeds every time.
If the email shows any effort whatsoever, mentioning a project I worked on, mentioning my current job, pointing out the role in question would fit my skills and it actually does, basically anything at all that suggests it's not just a form email sent to hundreds of people, then I will reply.
However, there's a third type of recruiting email that shows the person on the other side really is directly targeting you. It usually comes from the person who would be your manager, or at least someone who has a direct stake in the company (say the CTO or CEO at a small company). These emails I take seriously and appreciate. One of these led me to my current job.
Oh and I don't appreciate recruiting emails that have tracking links in them. Usually I will politely respond that I don't appreciate the tracking.
Tracking links are not always == shotgun approach
(I don't redirect all recruiters, though, so there's still a filter)
The best time to interview is when you're very happy with your current job.
Zero pressure, zero commitment and potentially huge upside in terms of pay and title increase.
"Hey, you look like a great candidate for $AWESOME_JOB"
"Great, let's talk"
"Oh, you're not really what they're looking for, can I interest you in $GARBAGE_JOB or $BASEMENT_AT_YELLING_CORP"
But by then I've showed interest, so I start getting calls. Not worth it.
I see you're working as $ROLE in $COMPANY_A. How would you like the exact same $ROLE in $COMPANY_B? Or worse: how would you like $ROLE-1 in $COMPANY_B?
Sorry but it's going to take at least $ROLE+1 to get me to uproot my life and go through that interview gauntlet again.
Of course, as I said in another post, the calculus changes entirely if you're unemployed and need "something, anything".
My company has 'Linux' in its name; apparently recruiters think that means I'm a sysadmin (I'm a dev).
To paraphrase you, there's always that remote chance that one of the Nigerian Princes could actually need your help.
I used to do a similar thing to you, but it's too much work now given the levels of job spam ("Oil pipeline engineer" roles, simply because my CV has the word "engineer" in it, prefixed with "Software"... lazy recruiter, that's bad!). Basically, if they can't make the effort, why should I? I guess the answer is, "Because there's always that remote chance that one of them could be able to set me up with a dream job."
Think about it like the real estate market (in normal markets, not Silicon Valley). Some houses sell before they even hit the market. A few also sell after the realtor does a few private showings. The rest are the "dog properties" that go on the MLS and need heavy marketing to sell.
That's... a strange way to look at the housing market. The norm is to list your property and then see what bids you get. Putting your house on the market is not some weird trick to pass a crappy house on to a bunch of rubes, or am I misunderstanding you?
Like, how do you reliably find a willing buyer pre-listing, and how do you know that's a good price (other than just blindly trusting your agent), if you don't even bother listing it? Listing is as much about price discovery as it is about finding more buyers.
The hotter the market, the more realtors know buyers willing to pay a lot for the first home that meets all their needs. Hence more homes are sold before listed.
I think that's locale specific. Except in the $15 million+ bracket, auctions (here in Melbourne at least) tend to get the best money for the seller.
The "dog properties" don't get listed on MLS at all; there is a cost to listing on MLS and the dog properties generally don't generate enough interest to justify the expense (and usually don't even attract a realtor willing to invest the effort).
If the quality that companies were getting from regular applications was better than the quality they get from their recruiters why would companies pay to have recruiters?
If there's an awesome job available out there, with way above average pay, benefits, great opportunities for advancement, good work/life balance, etc., you'll fill it with a top talent. You're not going to need a recruiter. Word will get around even to people who are not actively looking, trust me. Similarly, if you have an "average pay for an average worker" kind of job, you'll find that average worker.
When you have a ho-hum average job but you want top talent, then you're going to need that recruiter because the job needs to be actively sold.
I invite anyone who works at a company that pays 3X average salaries or is well known for being an unbelievably great place to work to reply and tell me they have trouble hiring.
They could definitely fill their need for new software engineers naturally, but presumably have data that the quality of candidates they get by bothering a substantial fraction of the world's software engineers on ~a yearly basis gets them better applicants and engineers.
Why do you think so? Just because you receive X resumes doesn't mean they're all qualified to work there.
Above-average people are usually treated well in their jobs. If you want to hire them, you need to actively approach them and lure them into applying. Even then, they will not bother to brush up on algorithm questions for whiteboarding.
Below-average people are looking for jobs because they are on a PIP, or they have a toxic relationship with management, or they will never get promoted in their current role and need to change jobs.
Really talented engineers already have jobs and may not be actively looking to move, but if the right opportunity for the right company presents itself then they might consider it.
Due to my current situation, I tend to only deal with recruiters who are local to where I live; unless the job opening allows for telecommuting, or it is a "too good to pass up" situation (I have yet to see one) - I will generally pass it up.
Instead, I currently only work with a couple of local recruiters. I have told both what I expect for interviewing (I prefer a practical interview with tests - not a whiteboard pressure interview), and I keep both informed what interviews the other has sent me on, so they aren't both submitting me to the same position opening.
Going beyond 2-3 recruiters in such a situation can and will lead to a tracking nightmare, trying to keep all of them in sync and not submitting you to the same opening - either at the same time, or worse, after you have already been interviewed once and weren't successful.
I do, however, try to review the contacts I get from recruiters, and if I feel they might be useful in the future, I tell them so and keep in contact with them (even if it is just a LinkedIn or email contact) - and let them know I may be interested in the future. That'll usually be enough for them to keep me in their DB for future potential offers.
It's a little less flattering when they can't correctly pronounce my name or the state I live in.
But mostly I want to get these emails because I want to see what these companies are expecting from their candidates, the technology they are using, and the salary range. DevOps / SRE / Production Engineers are in high demand.
Note that many recruiting emails come from agencies, so they are likely hiring consultants, and the paycheck doesn't come from the actual client company.
See, I see that time as very expensive.
Once I applied for a job that had a really obscure job requirement which I met. The job was already filled but I spoke to the recruiter, who was really nice and ultimately found me a different job in another skill set about a month later.
The other was the traditional thing where a recruiter reached out to me. I actually didn't like the recruiter, but I liked the job and ended up pursuing it anyway.
As an engineer you should have a clear picture of the kinds of companies that you want to work for. As you get older, your selection criteria should improve and become more detailed.
The only ones I tend to ignore are the recruiters who are bringing positions that are far out of my geographic areas and are not remote.
Let's presume this is out of preference and not ability. It's a pretty basic concept. If your preference stops you from learning something as basic as this as a programmer, then it doesn't seem likely that you will be motivated to keep up with even more abstruse concepts.
Nearly every programmer nowadays knows that naive string concatenation is inefficient, and so they should use a stream or something like that. I'd rather hire someone who knows exactly why it's O(n^2) and why adding to the end of an array that doubles when it expands is O(n) amortized. Why? Because a different but analogous situation might well come up in a programming job, and the person who likes to think about such things is more likely to spot the potential problem and avoid it altogether! The fact that the op would actually feature the above sentences as a large text excerpt sets off the "Dunning-Kruger" alarm for me.
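To make the point concrete, here's a minimal sketch of the two approaches mentioned above (illustrative code, not anyone's production implementation): the naive loop re-copies the whole string on every append, while accumulating parts in a list — whose backing array doubles as it grows, making each append amortized O(1) — and joining once does O(n) total work.

```python
def naive_concat(n):
    # Strings are immutable, so each += copies everything written so far:
    # 1 + 2 + ... + n character copies, i.e. O(n^2) work overall.
    s = ""
    for _ in range(n):
        s += "x"
    return s

def buffered_concat(n):
    # list.append is amortized O(1) because the backing array doubles
    # when full; the final join copies each character once: O(n) overall.
    parts = []
    for _ in range(n):
        parts.append("x")
    return "".join(parts)

assert naive_concat(1000) == buffered_concat(1000)
```

(Caveat: CPython happens to optimize some `+=` loops in place when the string has a single reference, so measure on your actual runtime before trusting the asymptotics.)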
That said, the op still has a good point. There is considerable organizational disconnect being displayed here. Those big companies would do well to have developers or a puzzle website do the initial filtering, rather than waste people's time by alternatively telling them they're supposedly wonderful, then supposedly horrible.
It's also not realistic, given the wheat-to-chaff ratio out there, for a manager to interview all the candidates directly without an elaborate screening process, although 'get interviewed by some generic programmer who has a cheat sheet for the Hard Questions you are working through on the fly' seems like an anti-pattern. I have talked to all of our group's hires as a near-final step.
There's a valid point in there somewhere, though. The recruiters that Google uses locally, for example, seem clueless enough to not care that their approach is quite offensive to someone who is not junior and/or desperate for a job. For several years, every few months, I got a come-on from our local Google office that seemed to be implying that working for Google was So Neat that I would probably want to drop 2-3 levels down the SE -> Senior SE -> Staff SE -> Senior Staff SE -> Principal Engineer chain, AND I would only have to carry a pager 1 week out of 4 as a Site Reliability Engineer. Well, hey, that sounds like a real step up from running a team doing something I really like at my current employer and getting to dream up SIMD tricks and algorithms all day! I'll get right on that. My very own pager!
It's also not realistic, given the wheat-to-chaff ratio
out there, for a manager to interview all the candidates
directly without an elaborate screening process
I'm a hiring manager. If I say someone should spend 8 hours attending an in-person interview before I'll spend 5 minutes reviewing their resume/github, I'm saying my time is a hundred times as valuable as theirs. I can understand someone taking umbrage at that - especially someone who had better skills than me. And I want to hire people with better skills than me.
8 hours vs 5 minutes is a bit extreme, for a start.
Secondly, the time a hiring manager needs to spend reviewing resumes includes unsuccessful candidates as well - so it's not 5 minutes, it's 5 minutes * |all_the_vaguely_plausible_candidates|. There might be 20 or 30 resumes in the pile for some of these jobs. So the manager might be 100-150 minutes in before getting to that 1 person who needs an "8 hour interview" (or whatever that is; I suppose places that fly you somewhere might be burning 48 hours or more - I know people in Australia who have been flown to the US for job interviews!). If you live in town it might be more like 2-4 hours. So the ratio isn't quite as extreme as you make it out to be for the manager, even if it seems unfair to count time spent reviewing all the other candidates' resumes/githubs.
As you say at Intel: SE -> Senior SE -> Staff SE -> Senior Staff SE -> Principal Engineer
IBM has: Engineer -> Staff -> Advisory -> Senior -> Senior Member of Technical Staff -> Distinguished Engineer.
I'm sure other companies have different ranks too.
Unless you dig into the technical ladder of each company, it's hard to say that Staff SE is the same everywhere.
Asking as a Googler who occasionally gets to dream up a neat trick here or there (and leads a team of pretty clever people, too!).
The main thing, I guess, is that a recruiter would have to be asking a question more like yours and less like "hey, how do you feel about an exciting job in the new field of SRE where you only have to carry a pager 1 week in 4" to make me feel like it was even a respectful approach. They had even grasped that I was at Intel as a result of a fairly successful exit, but... hey... maybe some people want that pager.
As it stands, it was pretty much a "Hey $NAME, we like think it's good that you did a graduate degree at $SCHOOL. Congratulations on your $RECENT_CAREER_MILESTONE. If you ever tire of your $MEANINGLESSLY_SENIOR_SOUNDING_JOB at $MASSIVELY_INFERIOR_COMPANY_TO_GOOGLE you should join our generic hiring process".
(not hating on SRE, mind you - but no-one sane looking at my LinkedIn profile or anything else would think I have that skill set or would be willing to retrain from the bottom to develop it)
For example, in your string concatenation example, the naive solution is good enough except in situations where large numbers of strings are to be concatenated in a given run, or the strings are huge, etc., relative to the compute environment. Does it really matter if one uses an O(n^2) solution when a few dozen (or even a few hundred) such operations are applied on "small enough" strings in a given execution on modern computer hardware? No, it just doesn't.
Select candidates who are keenly interested in and knowledgeable about algorithms for positions where it's important (i.e. part of the regular course of development), not because of that slight chance there is one edge case where one adjustment to a more efficient algorithm might possibly be useful at some indeterminate point in the future.
Well put! At the end of the day, it's way more important to get your product out to customers/users rather than holding a microscope over every line of code you write and endlessly obsessing over big O numbers. It's fine if you're in academia, but it can be a death knell if you're working in a crowded domain. You can always go back and improve code later; customers/users are not going to come back later.
I'm all for striking the right balance with perf concerns, but in my experience that's not "just deal with it later". Giving no thought to performance is as bad as micro optimizing at the beginning.
It's not even an obsession. Those pieces of knowledge could be fully mastered in about a half hour. It's first principles knowledge, like chemistry or thermodynamics. I can tell in a minute that something like "Solar Freakin Roadways" is a scam, whereas so many people gave that scam literally millions of dollars. Knowledge is literally power, in a very concrete way that expresses itself directly as dollars!
Instead of having some basic first principles knowledge to spot such things ahead of time, benefiting from the collective experience of your field, you'd rather just labor in ignorance and run into everything for the first time and get out the profiler? Imagine scaling that up to the size of the Amazon or Google workforce?
I absolutely don't disagree with optimizing when it's obvious (two for loops instead of one, etc.) All I'm saying is that, when you have thousands of lines of code (millions?) and even more complicated/intricate connections in your head of different modules, all the while when trying to reach a deadline, things aren't as obvious. You have to make compromises, it's inevitable.
Amazon and Google did not scale up magically in one go by following established guidelines. I am very certain they had growing pains, and I know that they had "hacks" while figuring out the right solution for the problem at hand - as attested by a former Googler who works at my current workplace.
What we disagree with, is the expansion of one's knowledge of what constitutes "obvious." Many people would say that the scam nature of "Solar Freakin Roadways" was far from obvious. Other people would smack their foreheads and ask why some people didn't pay attention in middle school and high school physics!
My conjecture is that a big fraction of programs could get a significant performance boost, noticeable by end users, if they were developed by good developers who have some knowledge of algorithms. You don't need to be an expert, but knowing the very basics of time complexity and fundamental data structures is necessary for every programmer, IMHO.
In your highlighting example, instead of working with someone who will optimize, I'd much rather work with the guy who will realize we can just debounce the dialog, write a one line comment about the performance issue, and then move on to the next thing.
Maybe I'm biased though. I've worked with the performance guy; he was brilliant and the code was clever and highly performant, but he was also slow and didn't write clear, maintainable code. We didn't ship in time and I had to find a new job when the project got scrapped. Damned if it wasn't fast though.
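For what it's worth, the debounce move mentioned above is nothing exotic — here's a minimal hypothetical sketch (names and timings invented) that collapses a burst of refresh requests into a single trailing call:

```python
import threading

class Debounce:
    """Run fn only after `delay` seconds have passed with no new calls;
    earlier calls queued during the burst are cancelled."""

    def __init__(self, delay, fn):
        self.delay = delay
        self.fn = fn
        self._timer = None
        self._lock = threading.Lock()

    def __call__(self, *args, **kwargs):
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()  # drop the previously queued call
            self._timer = threading.Timer(self.delay, self.fn, args, kwargs)
            self._timer.start()
```

Wrapping a dialog's refresh in something like this turns N rapid-fire updates into one repaint — a one-line comment about the performance issue and move on, exactly as described.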
As long as you are aware of the issue with naive string concatenation, you can use a string buffer or a mutable string – it is trivial to solve in a few more lines of code. What's more difficult is handling deletion in the preceding text. If this happens often, you will need a rope or skip list, which is challenging to implement from scratch. In the latter case, I would leave a comment without implementing the optimal solution and let the profiler decide later.
Sounds more like a project management issue to me, than a problem with that coder.
I'd rather see basic project management skills than basic knowledge of algorithms. Unfortunately, the former is much more rare.
From my experience: very frequently.
So it's useful at these corporations for all engineers to have this understanding, I think.
The biggest impact to our performance has been switching one JSON serialization library for another, no algorithmic knowledge needed, just basic benchmarking skills.
Knowing when you can just slap in the naive string concatenation and move on is also a useful result of the right skills. If you had better profiling tools/skills, I posit that you wouldn't have had to do the useless optimization.
True. And people who clutter their code by using some fancy "StringStream" class when they could've just used a concatenation are also part of the problem!
Saying that programmers don't need a basic knowledge of algorithms is like saying drivers don't need to know how to back up a trailer or do a 3 point turn. Over 90% of the time, you don't need it. But when you do, you really do!
Knowing the typical time and space complexity for an algorithm (without even understanding one bit about the implementation details) in a given category might be considered too trivial to be even basic, but I submit it's about the most "advanced" piece of knowledge for what a typical engineer really must know to write good, high quality and performant enough code.
Most can get by using a given language's library implementation of an "obvious" data structure and naive algorithms around it. For example, in Python, the "obvious" structure to map keys to values is a dict, and iteration over the keys or item tuples is a typical access pattern. An engineer doesn't need to know whether dict insert or lookup is amortized O(1) or any other value, and he doesn't need to know that the space required is O(n), etc. In almost every use case he's likely to encounter, the dict is fine.
I think we're both saying that it's generally a good level of 1st Principles knowledge for programmers to have.
It is about your base knowledge (which should include algorithms) and knowing those algorithms when it matters.
If you have the choice of two identical software engineers and one knows 1 algorithm more than the other, he/she wins.
This same exact argument could be used to require every interview candidate to know assembly.
>and the person who likes to think about such things is more likely to spot the potential problem and avoid it altogether!
I find that developers who constantly get caught up in the performance of individual algorithms will waste incredible amounts of time optimizing them and will miss major optimizations that come from seeing the system as a whole.
For example, I would rather take the developer who points out that a particular program can make use of the fact that most of the code doesn't need sorted widgets rather than the one who spent the whole time optimizing the widget sorting algorithm.
Yes, and that argument could be used to require every interview candidate to know quantum physics.
At some point you draw a line, and Google/Amazon/Microsoft/etc. have very clear pictures of where their respective lines should be. It seems to work well for them.
Think about it this way. 1st principles knowledge, like algorithms, isn't at all useful 90% of the time. But 1% of the time, not knowing it causes very outsized penalties in efficiency or debugging costs. Now take the frequency of those penalties and multiply it by all of the developer-hours at Google/Amazon/Microsoft.
That is why Google/Amazon/Microsoft do that!
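To make the "multiply by all the developer-hours" point concrete, here's a back-of-envelope sketch — every number below is an invented illustrative assumption, not data from any of those companies:

```python
# All inputs are made-up assumptions for illustration only.
engineers = 30_000             # developers at a hypothetical large company
incidents_per_eng_year = 0.5   # how often a first-principles blind spot bites
hours_lost_per_incident = 40   # debugging + remediation for one outsized failure
hourly_cost = 100              # fully loaded cost in dollars per hour

annual_cost = int(engineers * incidents_per_eng_year
                  * hours_lost_per_incident * hourly_cost)
print(f"${annual_cost:,} per year")  # -> $60,000,000 per year
```

Even if the per-engineer frequency is tiny, the aggregate at that scale can justify filtering for this knowledge in hiring — that's the argument, whether or not one buys the premises.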
There is no proof that they are actually preventing bad engineers with this process or that they couldn't get much better engineers and fewer flops by fixing it.
People want to work there in spite of these irritating hiring processes. I frequently travel via plane but that doesn't mean I think the TSA process is good.
Have you ever done the standard Comp Sci compiler implementation class? Do you write C++ and use the C++ standard library? If you answered yes to both questions, then you should know from first principles how just about everything in the standard library is implemented, and can use those tools with complete knowledge of when they can fail or must be used differently.
Those developers are merely exhibiting yet another kind of ignorance.
It's the developer who has a clue about algorithms who is more likely to make the valuable insight. The needless optimizer is just doing some Cargo Cult algorithms analysis. (Probably motivated by signalling geekiness, not producing useful results.)
People create value with software when they use code to solve problems.
For some people, that means tackling a gnarly complex problem and by a combination of wits, experience and education, come up with an efficient and elegant solution. Typical example: someone working in a specialized role in back-end (i.e. not user-facing) systems in a large company (small companies can't usually afford specialized roles). Concrete example: V8 JIT engineers.
For other people, it means looking up from the keyboard, figuring out the company strategy and product fit, and seeing what can be most efficiently (in terms of effort) put together to improve or align both. Creating a proof of concept that gets buy-in for a more in-depth solution.
The second category almost always delivers much more value than the first. And there's a spectrum in between, a spectrum of people who are all very good at what they do, all very good at creating value; the spectrum extends from the guts of the machine all the way out to the interface with the customer.
I've worked in some very different industries in my career, ranging from compiler engineer to web app developer. Compiler engineers leave their fingerprints on far more code, and the job is intellectually stimulating. But realistically, most of the reward of the job comes from the fun of the job itself. As a web app developer, I apply my knowledge of compiler techniques to things like SQL generation from filter predicates represented as ASTs, for efficient display of the user's data. But I could only deliver that because I saw the possibility of connecting the back end we had with an Excel-like experience we could give the user, and I had to take it on.
I've seen other people take a bunch of open source components - not knowing in depth how they worked - and put them together to create surprisingly credible solutions very cheaply, the kinds of solutions that win 7-figure SaaS deals. You don't get there with your knowledge of binary trees or compiler principles.
But having some basic algorithms knowledge might make the difference between delivering such a solution that runs reliably and quickly enough and not.
The ability to implement the sorting algorithm in the C++ standard lib is completely orthogonal to having this insight.
An understanding of complexity analysis is really all it takes to have that insight. Memorizing a bunch of data-structure algorithms has almost no bearing on this ability.
Weak example! Exactly when would you want to use a shared_ptr, and when would you absolutely not want to use one? Why? And what pieces of 1st principles knowledge would let you simply know that in about a minute?
Note that in this thread and others I've been advocating for First Principles knowledge. If you have that knowledge, then you'd probably know several of the most basic algorithms and their analysis from having learned that. I'm not advocating that people just be able to spit out a memorized text!
You're not thinking big enough. My usual extension of the argument is to suggest that part of the interview process should be to give the candidate a bucket of sand and some ore, and make them smelt everything and build their own CPU from scratch.
Because, y'know, it's important to cover fundamentals!
(I'd give people extra credit for coming up with creative and interesting variants based on Buffon's needle, rather than building the full infrastructure to do lithography).
...because knowing assembly is actually useful to a high level developer? If it were, then yes, I'd say they should know assembly too. But I know assembly; I've written entire published games in assembly language. And yet I don't believe knowing it is actively useful any more. Knowing the basic concepts like how strings, integers, and floating point values are stored and compared, yes. You typically learn that as you're learning algorithms and data structures. But you can learn those concepts using C; actually knowing assembly language fluently is overkill today.
So there's no slippery slope argument to be made here.
> I find that developers who constantly get caught up in the performance of individual algorithms
Straw-man.  Developers who understand how to optimize can, at the same time, make intelligent decisions on when to optimize. Developers who are overly focused on minutia that isn't important are solving the wrong problem, certainly. But part of the skill of optimization is knowing when to do it.
The problem is that if you don't know how to optimize algorithms, if you don't understand big-O notation and its implications, then you won't really get how to optimize at either the individual algorithm level or at the system level. Because the concepts are the same across all the levels of complexity.
Yes, some developers can get obsessed with optimizing the wrong things. That's why experienced developers will profile before spending a lot of time optimizing.
But you have to be at least passingly familiar with algorithms to even know that it's a likely problem.
And you know what? I don't always obsess over "the most optimal" algorithm for every problem. Sometimes the more optimal algorithm for large N will require more overhead for the small N that we're dealing with, and the brute force algorithm will not only be "just fine," it will be faster and require less work AND less memory overhead. And sometimes N is just always going to be too small to worry about.
I've sometimes just used a quick-and-dirty algorithm only to discover that its behavior was far worse than I had guessed (something that should be instant is taking seconds), but then because I do understand algorithms it takes me 5 minutes to rewrite for better time complexity, and the performance glitch vanishes.
And that's who Google is trying to hire, at least in general. It's not like I don't understand object oriented design as well.
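As a concrete (hypothetical) example of the kind of 5-minute rewrite described above: a quadratic membership check replaced by a linear one, with no change to the output:

```python
# Hypothetical example: find the incoming items we haven't seen before.

def new_items_slow(current, incoming):
    # O(len(incoming) * len(current)): each "in" on a list is a linear scan.
    return [x for x in incoming if x not in current]

def new_items_fast(current, incoming):
    # Same result; hashing makes each membership check O(1) on average,
    # so the whole pass is linear.
    seen = set(current)
    return [x for x in incoming if x not in seen]

assert new_items_slow([1, 2, 3], [2, 3, 4]) == new_items_fast([1, 2, 3], [2, 3, 4]) == [4]
```

The one-line change is invisible if you've never thought about what list membership costs, and obvious if you have.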
To expand on what you are saying: if you're doing "high level development" on a business app in C++11, then knowing assembly and having been through a bog-standard undergraduate CS compiler course will actually help you use a whole bunch of things in the C++ standard library, because you can easily guess how such things would be implemented from first principles. Otherwise, things like smart pointers are just "magic" that must be used in a bunch of disconnected, abstruse ways. Having the first principles knowledge lets you easily know when to use a shared_ptr, when to use a reference, and when to use neither -- all from first principles, no arbitrary memorization required!
Hell, having that kind of knowledge even benefited someone working in Smalltalk back in the day.
Agreed, and my understanding of assembly does help in those ways. I actually feel like compiler design was one of a very few classes in college that really, really taught me something.
A senior developer, though, probably should understand what's happening at the low level. And most teams should have a senior developer to keep the team from making rookie mistakes. There are plenty of programming jobs that can be done with less skill or deep knowledge. Like the ones that are discussed in the Wired article on coding being the next "blue collar" job. 
I think that the lead or "surgeon" (to quote The Mythical Man Month) in charge of any app or service development project should be well trained. Anyone who brings in a junior developer (anyone without a CS degree or equivalent, or less than 5 years of professional experience) to lead a project is asking for trouble.
But I think that average developers don't need the full training any more than a nurse needs to have a full medical degree. There are certainly things that I don't want to have to do, and while I might be able to do them better in some way than a more junior developer, they just don't matter enough.
It is a Catch-22, though: Just having a CS degree and five years of experience doesn't make a developer competent. I'd love to see some kind of certification to help separate the wheat from the chaff, so that non-experts could distinguish a top developer from a mid-tier developer.
But none of the certifications I'm aware of do anything aside from test that you've memorized the right buzzwords associated with a particular technology, and as such having such a cert is almost useless.
So yes, I'm arguing for certifications despite the fact that I view certifications as useless. If we had good certifications, maybe we could help establish a better way to distinguish developer skills. Would love to see that, but I admit it's selfishly motivated: I'm really good at programming and technical tests, so under any such regimen I'd likely end up with the highest ratings, except where domain-specific knowledge was required.
A nurse needs to have enough 1st Principles knowledge plus specific training to keep from making egregious mistakes. In fact, where X is a profession, an X needs to have enough 1st Principles knowledge plus specific training to keep from making egregious mistakes.
The fact that our "field" keeps producing X without that minimum level means something is broken. It would be like the nursing field producing nurses who didn't know how to spot a Tension Pneumothorax, because most nurses don't have to deal with that 99% of the time.
> So yes, I'm arguing for certifications despite the fact that I view certifications as useless. If we had good certifications, maybe we could help establish a better way to distinguish developer skills.
The fact that you entertain this thought is an indication that something is not quite right with our field.
Agreed. But try as I might, I can't come up with a solution that I'd actually support the implementation of.
If you have any ideas about how we can get from where we are to where we should be, I'd be interested in hearing them. Having the IEEE license developers at varying levels of skill sounds like a start, and requiring Internet-visible software to be written by developers with certifications sounds nice...until you think about open source projects, which thrive in large part on free effort. And until you think about the fact that software traverses borders fluidly. And what about entrepreneurship? Should we tell people they can't write apps unless they have the certification? Or that they need to pay an expert to audit their apps?
And the fact that big companies will do whatever they can to cut costs, including hiring developers in countries without certification requirements if you don't motivate them otherwise. (Maybe a DMCA-style safe-harbor, where ONLY if they use a development team with sufficient certifications can they be proof against lawsuits for defects; otherwise people can sue them for exploits that result from their software?)
I have lots of ideas, as you can see above. I just don't like any of them. There probably needs to be a constellation of rules in place, set by a professional organization and ensconced as legal requirements... But half of the rules I imagine wanting I also, under other circumstances, would despise.
It's about as relevant as implementing quicksort.
>My guess? They've accidentally used an O(n^3) algorithm where they delete one character at a time and copy the entire message, re-concatenating it every time.
People who profile and fix bottlenecks are much more valuable than people who think they have implemented the most efficient algorithm for every operation in their program.
...did you actually read my comment? Because I went on to say that I didn't think it was relevant.
That said, understanding how quicksort works is important, not because you'll be called on to implement it, but because there are key strategies used in its implementation.
And yes, I can implement quicksort without looking up the algorithm. It's really not that hard.
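For what it's worth, the from-memory version is short. This is a sketch of the strategy (partition around a pivot, recurse on each side), not the in-place variant a standard library would actually ship:

```python
def quicksort(xs):
    """Textbook quicksort: partition around a pivot, recurse on each side.
    Not in-place; meant to show the key strategy, not to replace sort()."""
    if len(xs) <= 1:
        return xs
    pivot = xs[len(xs) // 2]
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

assert quicksort([3, 1, 4, 1, 5, 9, 2, 6]) == [1, 1, 2, 3, 4, 5, 6, 9]
```

The partitioning idea is the part that transfers: it shows up again in selection algorithms, in splitting work for parallelism, and elsewhere.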
Sorry, but no, wrong answer. Partial credit for suggesting profiling. Let me take this apart:
If they're calling a function that triggers an event handler for every character deletion when they're deleting a block of text, then they're Doing It Wrong. Yes, profiling is important. But calling delete one character at a time for a block of text is plain lazy.
> Memorize every algorithm used by a system
Umm....this is a text editor we're talking about. There are very few algorithms, and it shouldn't be a case of "memorize them" as much as "observe them." If there are more than 3-4 algorithms, then it should also include "draw a picture of how they interact," if your abilities at modeling them in your head are insufficient.
But someone who is strong at algorithms should be able to just see what's wrong with a text editor that deletes characters one at a time. It should be obvious that it's likely to be a problem, and they shouldn't do it to begin with.
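A hypothetical sketch of why per-character deletion is pathological: each single-character delete copies the rest of the buffer, so deleting an n-character block costs O(n^2), while a single slice is O(n):

```python
# Sketch of the pathology: deleting a block one character at a time
# copies the remaining text on every step (O(n^2) overall); a single
# slice does the whole delete in one O(n) copy.

def delete_block_per_char(text, start, length):
    for _ in range(length):
        text = text[:start] + text[start + 1:]  # full copy each iteration
    return text

def delete_block_once(text, start, length):
    return text[:start] + text[start + length:]  # one copy total

s = "hello, world"
assert delete_block_per_char(s, 5, 7) == delete_block_once(s, 5, 7) == "hello"
```

Both produce the same text; only the cost differs, and the difference only shows up once the document gets large.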
> People who profile and fix bottlenecks are much more valuable than people who think they have implemented the most efficient algorithm for every operation in their program.
Both are necessary, and as I've said elsewhere, "developers that are anal about optimizing everything" is a straw man; good developers know when optimization is important, and good developers can write more optimal code without making it harder to read. Just knowing how to optimize doesn't automatically make you optimize every single routine, nor does it mean everything is more complex. Half the time when I optimize something the code ends up cleaner and easier to read, in fact.
And it's not always about bottlenecks. Sometimes it really is about the algorithm, and if you don't know your algorithms, you won't be able to recognize this. As said above, it's a prime example of the Dunning-Kruger effect: If you don't know algorithms, you don't even know what you don't know. Claiming you don't need to know algorithms just reinforces the point. Sorry.
In another comment on this post I described an optimization that I did that had no easy-to-fix bottleneck that was slowing things down, and that required a high level algorithmic change to fix.  In about an hour I sped it up by a factor of about a thousand. Ultimately I was doing the same things, but I was able to improve the algorithmic efficiency.
And someone strictly looking for bottlenecks can look at that code forever and not see the algorithmic change needed, because the code itself was written to be pretty optimal; it was a high level change to the algorithm that made it so much faster.
So. Much. Fail.
The Amazon Alexa/Echo app is similarly poorly programmed. It takes like 10 seconds just to boot up, which you need to do if you want to look at your shopping list -- and if the app gets pushed to the background, you need to boot it up again. Sigh.
And I'm pretty sure that the person who doesn't want to learn binary-tree traversal will never be able to point that out. That's the point you missed in the post you replied to.
Your ability to efficiently teach people stuff that you claim is important is a chance to hold yourself accountable for how well you actually know something AND whether it is actually important at all. If an interviewer taught me something I didn't know, and then quizzed me on it, I'd be impressed as heck, and I'd wanna work with that person (if they were not an asshole).
1. Pick a simple but relatively obscure data structure, something they are unlikely to have crammed the night before. I always start by asking the candidate if they are familiar with it; the answer is almost universally "no". I then pull out a Wikipedia printout and a notepad, and spend 10 minutes explaining it to them, with diagrams. Once they are sure they have understood the basic operations, I ask them to implement one of the basic operations we just walked through. The amount of code here should be ~10 lines; keep adding simplifying assumptions until it is.
Maybe not the healthiest attitude ...
> Make the interview more about working together and capacity to learn, and less about computer science trivia night at the pub with no beer.
Understanding b-trees is pretty useful. If you work with relational databases, most questions concerning query performance require some understanding of b-trees.
I wouldn't ask a candidate to implement b-trees (or a sorting algorithm, or red-black trees, etc.), but asking about their basic properties seems reasonable. Or better yet, asking about the performance of a realistic SQL query which uses a b-tree index.
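As a sketch of that kind of question (the table and column names here are made up): SQLite keeps its indexes as b-trees, and EXPLAIN QUERY PLAN will tell you whether a query can use one:

```python
# Illustration with a hypothetical table: an indexed equality lookup is a
# b-tree search, while a suffix LIKE match forces a full-table scan.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
con.execute("CREATE INDEX idx_users_email ON users(email)")

def plan(sql):
    # The last column of each EXPLAIN QUERY PLAN row is the detail string.
    return " ".join(row[-1] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

# O(log n): b-tree search on the email index.
print(plan("SELECT id FROM users WHERE email = 'a@b.c'"))
# O(n): no index can help with a leading wildcard.
print(plan("SELECT id FROM users WHERE email LIKE '%@b.c'"))
```

A candidate who understands b-tree ordering can predict both plans before running them, which is exactly the kind of reasoning the question probes.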
No one is arguing that there isn't any value in knowing CS. Rather, the argument basically is that for the vast majority of developers, studying algorithms is a net loss because it's time that could be better spent learning more valuable skills.
If you're the person whose job is creating Redis, then yeah, you should probably know something about algorithms and data structures. For most other people, having a cocktail party level familiarity with that stuff and a good understanding of the phrase "tools not rules" is probably good enough.
But the op is basically implying that you don't even need the "cocktail party level familiarity." Knowing just enough first principles chemistry to know how CO2 and CO are produced can save your life. It's one thing to just know the rule you shouldn't run your car in a closed garage. It's another thing to know the general conditions where you might be producing CO, so you also know not to run your laser cutter in a poorly ventilated room.
First principles knowledge is a set of tools for reasoning about the world, your program, your software libraries, etc. Programmers thinking they don't need even the most basic algorithms knowledge is basically the anti-intellectual stance of eschewing first principles knowledge.
I've been through a CS education track, including algorithms and complexity, and my day job has been software development for the 5+ years since I graduated college. In practice I have been able to understand complexity tradeoffs when selecting approaches and building implementations, to recognize shortcomings in the same done by others, and to improve upon what was built. The difficulty I have encountered has never been in the implementation of an algorithm. I have even taught the first principles of complexity to others well enough that I have seen them make informed design decisions.
I will openly admit that not once, during all of that time, have I ever been able to whiteboard an optimally efficient (or nearly so) algorithm implementation _and be confident it was such_. Mind you, I'd love to be able to, and I'm definitely not arguing that understanding complexity isn't useful knowledge. However, I would contend that being able to bang out an optimal b-tree implementation from memory on demand isn't first principles knowledge, but merely _trivia_ ability.
I would further argue that it's not even first principles knowledge that is most important, but rather a combination of critical thinking, enough self-awareness to notice when you've exceeded your current knowledge/experience, and sufficient humility to fix that lack rather than plod along with blinders on.
Sorry that this became a bit rambling, it's just a topic that's always intrigued me in how divisive it can be.
a) What the heck is an optimal B-Tree? (Don't you have to tune the parameters?) b) I would also contend that it's trivia, and merely a way of getting at the ability to apply First Principles knowledge.
So you're not disagreeing with the value of 1st principles knowledge. Then you cite an even deeper kind of knowledge. Where exactly are we disagreeing here?
Usually I don't bother engaging in arguments with these people, because I know that the standard algorithms interviews will weed them out so that I never have to deal with them. But I've started to notice a trend of more of these people finding their way into the "big 4" type companies and ending up on large-scale/infrastructure projects, where they end up making very critical mistakes; see https://news.ycombinator.com/item?id=13700452
> If she would have started her email with "We're looking for an algorithm expert," we would never have gotten any further and would not have wasted our time. Clearly, I'm not an expert in algorithms. There is no point in giving me binary-tree-traversing questions; I don't know those answers and will never be interested in learning them.
We should avoid the notion that naive string concatenation is a problem that impedes our ability to deliver software; except for niche applications, this is not the case.
The attitude that really basic Computer Science concepts like algorithms and algorithmic complexity are irrelevant is exactly why software projects are so frequently FUBAR.
The Dunning-Kruger reference is spot on. The original author doesn't even know what they don't know, or why they should want to learn it. Being proud of ignorance of a cornerstone topic isn't something to be celebrated.
Being generally messy and not considering the overall design hurts way more than lacking basic CS.
I have seen plenty of performance issues related to missing CS knowledge, but a lot of the time those are masked by messiness.
The Quora app, though: If I'm writing a reasonably long answer and I delete a paragraph, it can take more than 10 seconds to complete. There's some profound inability to understand algorithms in there somewhere, I can guarantee it.
There was just a headline on Hacker News a few days ago about a "reject!" function, if I remember correctly, that ended up with an n^2 algorithm for several major Ruby releases because someone fixing a bug didn't understand the consequence of their change. I don't have the link, though.
EDIT to fix the name of the Ruby function and to add the link.  Thanks to user nighthawk454 who dug it up!
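For illustration, here's the same pattern in Python (this is not the actual Ruby code): removing matches one at a time shifts the tail of the list on every removal, so filtering n items degrades to O(n^2), while rebuilding the list once stays O(n):

```python
# Same shape of bug as the Ruby reject! regression, sketched in Python.

def reject_quadratic(xs, pred):
    for x in xs[:]:          # iterate over a copy so removal is safe
        if pred(x):
            xs.remove(x)     # each remove shifts the tail: O(n) per call
    return xs

def reject_linear(xs, pred):
    xs[:] = [x for x in xs if not pred(x)]  # one pass, updated in place
    return xs

evens = lambda x: x % 2 == 0
assert reject_quadratic(list(range(10)), evens) == \
       reject_linear(list(range(10)), evens) == [1, 3, 5, 7, 9]
```

The two versions are observably identical on small inputs, which is exactly how this kind of regression ships.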
I'm not against this type of training; I'm very familiar with ACM ICPC, TopCoder, etc., and have competed in them. But what I've noticed is that there seems to be a divide between the devs who merely brute-force as many questions as possible to build their own bank of solutions, and the ones who take a more structured approach, learning problem-solving paradigms and algorithms/data structures in a way that reinforces why they're used in the situations they are, and how they can be adapted and composed to solve larger problems.
If you want horror stories... I've worked with a dev who didn't understand the concept of passing by value vs. by reference. This was a person being paid 6 figures who had been writing code for cloud infrastructure with a fundamental misunderstanding of the underlying memory model. I've also worked with devs who didn't even know that an abstract data type like a map could be implemented with either a hash table or a self-balancing search tree, and who therefore had no idea of when you should/can/can't use one or the other. This type of thing wouldn't really bother me if I were talking to a front end dev, since I imagine most of what they do is solve design problems, but it's very worrying when these are people working on the lowest layer of a public cloud infrastructure...
I wonder what the history of their mobile app is, though? It feels like it's a hybrid app, and it feels like it's developed iOS-first, Android-as-afterthought, so it may be the ugly stepchild of the team that gets no attention. "Does it build on Android? Ship it!"
The worst of the problems seem to be when I'm also using the SwiftKey keyboard, so they may not see them internally. Somewhere along the way they're doing something pathological, though, because SwiftKey works like lightning in other apps, and only in the Quora app can get bogged down and end up so slow that I have to disable it and use the standard keyboard just to type anything at all. There's also a bug where I sometimes can't hit enter after a link I've pasted, and again I have to switch to another keyboard.
SwiftKey may be reacting to notifications that the text has changed, for instance, and Quora is sending 200 change notifications, one for each character changed, and then SwiftKey is querying the entire source document again each time... Or some such. Don't know.
It's not just SwiftKey related issues, though. Whatever they were using for rich text editing has been broken in different creative ways every few months, where edits wouldn't take, or what you're actually seeing would vary from what gets posted, or you can't edit the link text, or you can't GET OUT of the link text, so all the new text you type ends up part of the link, or ... It's been broken in so many ways that I've lost count. A recent bug I encountered was when I was pulling the text down to get to the top of what I'd written, I scrolled it down once too often, and that triggered a "refresh" ("drag down from the top" to refresh) which lost all of my text. Sigh.
The Quora app is really in need of a solid team experienced in app development, robust data handling, and proper UI behaviors. If it has competitive programmers working on it, they're solving the wrong problems.
You must mean "some software projects" and not "frequently." And who's proud of ignorance? Ignorance of what? And, what's more, why should it be considered a negative if someone isn't concerned about, or even is proud of, ignorance of certain things?
* Over budget in dollars or time or both
* Insecure (sometimes profoundly so)
* Having major UI issues
* Missing major obvious features
* Hard to extend
* Full of bugs, both subtle and obvious
Pick at least five of the above. Keep in mind that most software projects are internal to large companies, or are the software behind APIs, or are otherwise niche products, though there are certainly plenty of mobile apps that hit 4-7 of those points.
Who is proud of ignorance? Original author:
> There is no point in giving me binary-tree-traversing questions; I don't know those answers and will never be interested in learning them.
He proudly states that he will never be interested in learning these topics. These topics that are profoundly fundamental to computer science.
Why should it be considered harmful? Because anyone who is writing software should always be learning, and should never dismiss, out of hand, interest in learning core computer science concepts. They should be actively seeking such knowledge if they don't have it already. It's totally Dunning-Kruger to think that you don't need to know these things. His crack about learning "object oriented design" instead made me laugh: As if knowing OOD means that you don't need to know algorithms. To the contrary, if you don't understand the fundamentals, you can create an OOD architecture that can sink a project.
It's like the people who brag of being bad at math -- only worse, because this is the equivalent of mathematicians being proud of their lack of algebra knowledge.
There is something I've mutated to my own liking called the 80/20 rule. In software, you will spend 20% of your time getting the application to 80% of its maximum potential performance. You can then spend the remaining 80% of your time gaining the extra 20%. If that is cost-effective for your company, then by all means do it. If it isn't, 80% is just fine and you've cut your development costs by 4/5ths.
For me, being able to "rote" an algorithm on some whiteboard falls into the last 20%, maybe. Collection.Sort, whatever it uses, is good enough. Hell, you get about 78% of that 80% by using indexes properly on your RDBMS.
I would say it's necessary but not sufficient. There is no perfect interview strategy. But for projects that are entirely developed by people who can't traverse a binary tree, I'd say the odds of failure are very, very high.
Sure, a strong developer can hit that 80% of performance quickly (in probably less than 20% of the time), but a weak developer who doesn't know how to optimize probably won't even make 5% of "maximum potential performance."
I had to work with a tool once that had a "process" function that would take 2-3 minutes to do its work. It was used by level designers, and it was seriously impeding their workflow. The tool was developed by a game developer with something like 10 years of professional development experience (he was my manager at the time), and he thought it was as fast as it could go -- that he'd reached that maximum potential performance, give or take a few percentage points not worth the effort to chase. He didn't want to spend another 80% of his time trying to optimize it for a minor improvement either, so he left it alone.
I looked at what it was doing, figured out how it was inefficient, and in less than an hour rewrote a couple of key parts, adding a new algorithm to speed things up. Bang, it went from minutes to 200ms. No, I'm not exaggerating. Yes, I had a CS background, and no, the experienced developer didn't.
If you end up with accidental n^3 algorithms baked into your architecture, 5% performance in development can be 0.1% performance in the real world, or worse as your N gets large enough. And yes, that's even if you index your data correctly in your database.
And that's when your site falls down the moment you have any load, and you end up trying to patch things without understanding what you're doing wrong. In my example above I improved the speed by nearly a factor of a thousand. In an hour. That can easily mean the difference between holding up under load and falling over entirely, or the difference between being profitable (running 4 servers that can handle all of your traffic) and hemorrhaging money (running 4000 servers and the infrastructure to scale dynamically).
Which is why you need a strong developer to run the project to begin with. Maybe you're that strong developer; I'm not really speaking to you in particular. But I know a lot of developers who just don't have the right background to be put in charge of any projects and expect that they'll succeed.
In the 2000s and prior, it was common knowledge that the majority of software projects failed. Even if they succeeded on paper and shipped, they weren't actually used. By some estimates, it was something like 75% of software projects.
If you look at the contents of "ecosystems" like Steam and the various app stores, you'll see much the same. Most of the software out there is a de facto failure, and much of it is due to the ignorance of the programmers, resulting in substandard programs.
This is highly relevant because one not uncommon problem with highly-tuned algorithms is the greater cost of writing that faster code and then having to change it when you realize the design needs to be different, especially if pride + sink cost leads to trying to duct tape the desired feature on top of the wrong foundation for awhile until it's obvious that the entire design is flawed. That failure mode is especially common in large waterfall projects where nobody wants to deal with the overhead of another round.
> This is highly relevant because one not uncommon problem with highly-tuned algorithms is the greater cost of writing that faster code and then having to change it when you realize the design needs to be different
Can you give me a specific example of this? I'd say, give me a specific example, and most likely, I'll give you a reason why the software architecture of that example is stupid.
"Poor performance" isn't the only negative result of using an insufficiently experienced developer, or a developer who doesn't have a full grounding in algorithms and data structures.
Someone without a full CS background (and without the ability to remember much of that background) is likely to know a few patterns and apply them all like a hammer to a screw. This leads to profoundly terrible designs, not only when you take performance into account, but finding the best model for the actual business problem, handling changes, permuting data in necessary ways, and other issues.
> That era was dominated by waterfall development which is notoriously prone to failure due to the slow feedback loop.
Extreme programming was the first formalized "agile" approach, and the very first agile project to utilize it was a failure.  A big problem was performance, in fact:
> "The plan was to roll out the system to different payroll 'populations' in stages, but C3 never managed to make another release despite two more years' development. The C3 system only paid 10,000 people. Performance was something of a problem; during development it looked like it would take 1000 hours to run the payroll, but profiling activities reduced this to around 40 hours; another month's effort reduced this to 18 hours and by the time the system was launched the figure was 12 hours. During the first year of production the performance was improved to 9 hours."
Nine hours to run payroll for 10,000 people. We're not talking about computers in the '70s with magnetic tapes. This was 1999 on minicomputers and/or mainframes. If that wasn't a key algorithmic and/or architectural problem, then I would be amazed.
When you're designing a system using agile, it often ends up with an ad hoc architecture. Anything complicated really needs BOTH agile and waterfall approaches to succeed. You need to have a good sense of the architecture and data flow to begin with, and you need to be able to change individual approaches or even the architecture in an agile manner as you come across new requirements that you didn't know up front.
> This is highly relevant because one not uncommon problem with highly-tuned algorithms is the greater cost of writing that faster code and then having to change it when you realize the design needs to be different
I'm going to say [citation needed] for this claim.
I gave an example in another thread of having improved the speed of a game development level compiler tool by a factor of about 1000, with no major architectural changes, and it took me about an hour.
At the same time I performed a minor refactor that made the code easier to read.
Bad design == Bad design. That's it. Good design can include an optimized algorithm. A good design tends to be easy to extend or modify.
Very rarely it makes sense to highly hand-tune an inner loop. The core LuaJIT interpreter is written in hand-tuned assembly for x86, x64, ARM, and maybe other targets. It's about 3x faster than the C interpreter in PUC Lua, without any JIT acceleration. That is a place where it makes sense to hand-tune.
> pride + [sunk] cost[s] leads to trying to duct tape the desired feature on top of the wrong foundation
That's another orthogonal error. It makes me sad sometimes to throw away code that's no longer useful, but I'll ditch a thousand lines of code if it makes sense in a project.
Every line of code I delete is a line of code I no longer need to maintain. I'm confident enough in writing new code that I don't feel any worry about deleting old code and writing new. This is as it should be. Sad for the wasted effort, yes. But I know that the new code will be better.
And I happen to know for a fact that naive string concatenation was a big part of the performance problem! (Bringing it back to my original comment.)
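For anyone who hasn't been bitten by it, here's the textbook shape of that problem (a sketch, not the actual tool's code; note that CPython sometimes optimizes `+=` in place, so the conceptual cost matters more than a quick timing test):

```python
def build_naive(pieces):
    # Each += on an immutable string copies the whole accumulator,
    # so n pieces cost O(n^2) character copies in the general case.
    out = ""
    for p in pieces:
        out += p
    return out

def build_linear(pieces):
    # The standard fix: accumulate the parts, then copy everything once.
    return "".join(pieces)
```

Same output, wildly different scaling once the pieces number in the tens of thousands.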
A project can die a thousand deaths before performance becomes its death knell.
I really don't think that every programmer at Amazon or Google faces algorithm problems every day; that's probably localized to the projects that deal with them, not the majority of the programmer population.
Expecting every programmer to know how to solve a given problem on a whiteboard is like expecting every lifeguard to know how to do a water escape (because it involves swimming AFTER you solve the problem of having your hands tied).
And most of a lifeguard's job is sitting around blowing a whistle at troublemakers. Does that mean you'd be fine with a lifeguard who only has those two actions in his skillset? Also, as someone who has held a number of jobs that involve your 1 through 4 above: knowing basic algorithms is beneficial to the job. You don't need that knowledge often, but when you do, it has outsized benefits. Also, you wouldn't know that, unless you had that knowledge while doing that job. If you didn't have that knowledge, you wouldn't know how it would have benefited you.
The fact that "in-depth knowledge" is applied to "traversing binary trees" makes me sigh and feel uneasy.
It's more about testing the range of skills that will actually be applied. To toss in another incomplete analogy: If you're interviewing someone to build you a deck in your backyard don't spend 90% of the interview time asking him how he'd chop down trees for wood and manufacture the nails. You probably also don't need to ask him his opinion on the local building height restrictions or if he knows how to install an attic fan. You want to know that he can build a deck.
I think a lot of the above analogy mismatch and "not strictly necessary"-ness is because an application slowdown is much less severe than a drowning death. But if you account for that difference, it is very apt.
If you're interviewing someone to build you a deck in your backyard don't spend 90% of the interview time asking him how he'd chop down trees for wood and manufacture the nails. You probably also don't need to ask him his opinion on the local building height restrictions or if he knows how to install an attic fan. You want to know that he can build a deck.
But "how he'd chop down trees for wood and manufacture the nails" isn't apt at all! As a programmer, understanding algorithms is much more like 1st principles knowledge behind understanding building materials. Your contractor needs to know the structural properties of different kinds of wood, the properties of the soil in your backyard, and some basic chemistry/engineering/physics. Otherwise, she might put the wrong materials together and invite galvanic corrosion, or use the wrong materials in the wrong context and invite premature wood rot. You'd want a carpenter to understand wood harvesting, milling, and treatment in how it effects the properties of wood. Could you get by most of the time, just knowing how to saw boards and drive nails with power equipment? Sure. But someone with the right knowledge is going to avoid the unlikely but very costly pitfall and save her customer thousands of dollars otherwise.
You want to know that he can build a deck.
On time and under budget. She's more likely to do that if she's armed beforehand with certain experience and 1st principles knowledge. All things being equal, the contractors who built decks on time and under budget had good trades instruction or otherwise had the right experience and/or 1st principles knowledge. Also, if you look at the longevity and long term structural integrity of deck projects, you will likely find such a correlation.
(Also, it would be a really good idea if your contractor had a knowledge of local building regulations concerning the deck.)
I sense a contradiction between the 1st clause and the last clause.
Expecting every programmer to know how to solve a given problem on a whiteboard is like expecting every lifeguard to know how to do a water escape
Expecting every programmer to know the basics of recursion and about 10 basic algorithms and the reasons behind their time/space complexity analyses is like expecting every lifeguard to know what a drowning person looks like (nothing like what they show in movies and TV shows), to be able to tread water without using their arms, and to know how to tow a panicked, struggling swimmer without getting kicked in the stomach.
A programmer who can't explain the three bits of knowledge in my comment above is like a lifeguard who simply knows how to swim and doesn't know what a drowning person looks like. Over 90% of the time, all you have to do is sit in the tower chair and blow your whistle at troublemakers, but in the occasional instance, you will have to know something special, with a higher cost than usual if you don't know.
The failed landing of Asiana Airlines at SF and the Air France Airbus that dropped into the ocean are two good examples.
Most programmers are metaphorically airline pilots, doing a routine job that is more about following the processes than elite skills, but some people wish that they were doing something more complex than they are.
Programmers who think they can only get by with the "routine" skill level are like airline pilots who think it's just fine for them to do the checklists, run the autopilot, and know nothing else. Who would you rather have piloting your plane?
You and I live in different worlds, or have vastly different definitions of 'programmer.'
"How do we satisfy unlimited wants with limited resources?" - economics to the rescue again.
Everyone would love to hire Linus Torvalds for that matter, but there are plenty of programming jobs where you don't really need to, or don't want to, or can't spend that kind of money. There are plenty of useful, productive positions for people who aren't up on their O notation. The important thing is to be clear what kind of person you're looking for.
Torvalds is very lucky to have found his niche because he is unemployable - the first sweary rant at a teammate and he'd be shown the door. Well except maybe at Uber.
Now that's what I call naive! Or maybe hopeful. Just saying you're giving an awful lot of credit here...
I did spend a while as an expensive "consultant" where I was often hailed as a genius just because I applied that tidbit of knowledge and sped up the application by some large fraction. However, I've also noted that there are some mainstream programming communities where people know this about string concatenation and spread this knowledge around.
Believing any concept is so basic that everyone should know it seems self-defeating: if everyone really knows it, then the odds of finding someone who knows it are high, and the odds of gaining value from screening for it are small.
I could just as easily say "we should be asking every programmer to handle a six-box grid layout" on a white board.
It's a skill set that's relevant to almost all programming jobs. Quick: give me an example of how time/space complexity can be relevant to layout manager code. (Can't do it? Dunning-Kruger just reared its ugly head again.)
You don't necessarily need to write your own layout manager. But to properly shop for one for some demanding use cases, you may well benefit from just cocktail-party-level first-principles knowledge, so you can pick the right library. Or do you just have faith that Apple or [Big Company] knows what it's doing, and assume that everything can handle anything you throw at it? That works over 90% of the time. It's the exceptional case where you need the 1st principles, and such cases usually come with outsized penalties if you don't know.
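As one hypothetical illustration of how complexity sneaks into layout code (the `measure` function and tree shape here are made up, though Android's nested-`RelativeLayout` "double taxation" is a real-world analogue of the same effect):

```python
calls = {"n": 0}

def measure(node):
    calls["n"] += 1
    for child in node:
        measure(child)  # first pass: ask the child its preferred size
        measure(child)  # second pass: assign its final size
    # (actual sizing logic omitted; only the call count matters here)

# A chain of 10 nested containers around one empty leaf.
tree = []
for _ in range(10):
    tree = [tree]

measure(tree)
# Each level doubles the measurement work below it, so a chain of
# depth d costs 2^(d+1) - 1 measure calls.
print(calls["n"])  # 2047 for depth 10
```

Nobody wrote an "algorithm" here; the exponential blowup falls out of an innocent-looking two-pass design, which is exactly the kind of thing complexity literacy lets you spot.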
@stcredzero, I've seen your name pop up time and again on this thread. I don't think you realize how rude you're being.
You clearly think that Data Structures 101 is important knowledge, to the point of saying that most software project failures are caused by lack of that knowledge. Others in this thread take a more moderate position.
Can you accept that this is a difference of opinion, not a sign of incompetence, and stop being so rude about it?
I am being what I need to be. I find it remarkable that some people are receiving this information and prioritizing protecting their ego, rather than investigating what motivates such sentiments. Perhaps that difference should be exploited in interviews?
Let me clarify: I think that Data Structures 101 is important knowledge, to the point of saying that a significant number of software project failures are caused by lack of that knowledge.
Others in this thread take a more moderate position.
They either don't have the knowledge and don't appreciate what they don't know. Classic Dunning-Kruger. Or, they have the knowledge but haven't been in the right kind of projects where someone not knowing really bit them hard. ("It never happened to me, so it must not be that big a deal.")
Find me someone who has the knowledge and who has been "bit hard." You can find them in these threads. Read what they say carefully. It's much more than a matter of opinionated debate.
Not to be overly nit-picky here, and technically O(n) is correct as well (it's also the un-amortized worst case of insertion), but you probably meant to say that the amortized time is O(1), or Θ(1) to be even more precise :)
Or, you should demonstrate the self awareness to realize that you're just playing with the ambiguity of English in a hastily written comment. Exercise: in what precise context would time/space complexity be O(n), and in what precise context would it be O(1)? (Though the emoticon is probably an indicator that you already knew all this.)
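For anyone following along, here's a toy model of why the amortized figure is O(1) under the usual capacity-doubling scheme (this `DynArray` is illustrative, not any real runtime's implementation):

```python
class DynArray:
    """Toy dynamic array with capacity doubling; counts element copies."""
    def __init__(self):
        self.cap = 1
        self.items = []
        self.copies = 0  # elements moved while regrowing the backing store

    def append(self, x):
        if len(self.items) == self.cap:
            self.cap *= 2                 # the O(n) worst-case append
            self.copies += len(self.items)
        self.items.append(x)

arr = DynArray()
n = 1000
for i in range(n):
    arr.append(i)

# Doublings happen at sizes 1, 2, 4, ..., 512, so total copies are
# 1 + 2 + ... + 512 = 1023 < 2n: amortized O(1) per append.
assert arr.copies < 2 * n
```

Any single append can be O(n), but the total work over n appends is bounded by 2n element moves, which is the whole point of the amortized analysis.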
Actually, there are lots of skills and competencies that are valuable on engineering teams other than the ability to have all basic algorithms on call in your head at all times.
Where did I ever say such a preposterous thing? Please provide a quote.
And again, if you understand algorithms at a basic level from First Principles, then you don't have to memorize every algorithm like baseball card trivia. The important thing is having the background knowledge. If you don't think such knowledge is foundational background, then you probably don't know enough to know that. Hence: Dunning-Kruger.
There are some "free" optimizations I'd expect a lot of seasoned programmers to be familiar with, like as you mention the string concatenation issue, but those are easy to catch during code review. Similarly for certain optimizations that are even 'freer' in the sense that modern compilers will take the easier-to-read (if naively slower) version and produce the fast version, I expect to not have to tell people during code review that the 'slower' version isn't slow because the compiler does the right thing.
These require experience and time spent carefully designing the system. A profiler shows you what is slow in your current implementation, not how to convert it to a more elegant one.
If you look at the current AI, machine learning, and deep learning fields it's an absolute must that you have some experience with algorithms.
If I ever have to write it, it's a pretty simple thing to look up. This piece of algorithm trivia has no relevance for software engineering jobs in 2017.
There is of course one reason to learn it: It keeps coming up in interviews.
I'd love to hire a programmer that knows this is actually O(n^3), and not O(n^2).
(Well, it is if you are naively concatenating N strings, each of length N characters).
(Had a manager who insisted it was exponential. Wisely did not argue with him).
(I await attempts to recruit me based on this comment).
Hint: Don't ever work for or hire anyone who would exploit the ambiguity of the English language in a hastily written comment to gain the false appearance of geeky superiority. (Exercise: how would you interpret the situation referred to in the comment to wind up with O(n^2), and how would you interpret it to wind up with O(n^3)?)
People like to boast about how important it is to know these kinds of complexities, but in reality they just use mental heuristics to come up with them (which is how we got n^2 and exp(n) from my manager).
For string concatenation, is it important they know how to do it in a linear fashion? Or is it important that they can actually calculate the complexity? Or do you want an in-between where it is OK that they do not really know the complexity, but can tell you that it is definitely super-linear?
I've seen people be fussy that candidates should be able to explain why it is n^3 (or n^2 or whatever) in a somewhat rigorous fashion. I say "somewhat" because when I then turn to them and ask them to derive the exact expression (not just in big-Oh notation), they will usually fail. Then they will jump through hoops trying to explain why it is important to know it in big-Oh notation but not all that important to be able to derive the exact formula.
(There is, of course, a camp that does not believe much in big-Oh and actually prefers to know the leading constant, etc. My example is not contrived.)
I find people will nitpick to their level of granularity, and it is always easy to come up with some justification for their level of preference. But to others, it is just nitpicking.
For me, it is important to know that this is super-linear, whereas one can do it linearly. I don't care if the candidate knows it is n^3 or n^2. That is also why I did not point out my manager's error at saying it is exp(n).
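For what it's worth, the exact count is not hard to derive under the fully naive model (each concatenation allocates a fresh buffer and copies both operands; `copies_naive_concat` is just an illustrative counter, and the two readings of "n" below are exactly where the n^2-versus-n^3 disagreement comes from):

```python
def copies_naive_concat(num_strings, length):
    """Characters copied by repeated `out = out + s`, assuming every
    concatenation copies both operands into a new buffer."""
    total, out_len = 0, 0
    for _ in range(num_strings):
        total += out_len + length  # copy the accumulator, then the piece
        out_len += length
    return total

# n strings of constant length c:   c * n(n+1)/2    -> O(n^2)
# n strings each of length n (c=n): n^2 * (n+1)/2   -> O(n^3)
assert copies_naive_concat(4, 1) == 10   # 1 * 4*5/2
assert copies_naive_concat(3, 3) == 18   # 9 * 4/2
```

So both camps can be "right" depending on what n counts, which is a decent argument for caring more about "super-linear, and avoidably so" than about the exact exponent.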
I have seen the worst apps written by "Very Smart People" who obviously had never built an Android application before. It doesn't matter how smart you are, the first time you do something it will suck. I have had catastrophic failures caused by premature optimization, because locking a tool into a fancy algorithm before you know where the actual bottleneck lies is a recipe for disaster. I have seen so many problems caused because people couldn't take feedback or didn't ask for help, because they were so wrapped up in being The Kind Of Person Who Knows The Answers.
Frankly, passing algorithm questions is a great way to signal that I probably don't want to have to deal with your code.
Personally, I love working with people coming out of the good agencies because building 15 or 30 applications from scratch in an environment with strong mentorship and rapid feedback is, in my experience, more likely to produce a good programmer than all the smarts in the world.
That's a very bold (even arrogant, sorry) statement about 57,000+ (googled it) employees.
So 57,000 are worse devs than you? This is what I hear from your statement; correct me if I am wrong.
Many extremely talented people want to work for Google. They will take the tests, even if they don't agree with the hiring system at all.
Maybe they want to work at Google despite its incapability (according to you) in the hiring and testing process.
> it is whether I want to work with the code of people who can pass that interview
Have you found any correlation between poor code and being able to pass Google's interview?
(edit: new lines)
> So 57,000 are worse devs than you? This is what I hear from your statement; correct me if I am wrong.
The idea that you can be objectively better or not at programming, and that it's decidable through one or two simple factors, is kinda wrong and goes with the OP's point.
It's more on the order of 20k developers, not 57k.
>Have you found any correlation between poor code and being able to pass Google's interview?
cough cough Angular 2
There is an enormous difference between poor code and a library you don't like. I may not be a huge fan of Angular 2, but it's certainly not poorly written code.
Probably not. There's plenty of better workers than me who I wouldn't want to work with.
I am also specifically not passing some objective value judgments here. They may very well be hiring the best developers for what they are doing: they just aren't capable of hiring the type of developers I most enjoy working with. This is a subjective preference, not an objective one.
There are absolutely talented developers who don't value working with the kinds of developers I want to work with and they won't care that Google can't hire them. I have enjoyed working with Google developers in other contexts, but many of them wasted a bunch of time and energy jumping through Google hoops because they decided it was worth it. It is my opinion that they would be better developers if they had spent that time on something useful instead.
Google makes it easy for people I don't care about working with to get hired, and hard for people I proactively want to work with to get hired. Nothing about that is about my skill at all.
He never said they are worse.
>Many extremely talented people want to work for Google.
"Most people do/think XYZ."
Do I have to explain why that's not a good argument?
>They will take the tests, even if they don't agree with the hiring system at all.
Other people don't mind shitty treatment, so what?
This is why I asked for clarification. The OP said
'Google is incapable of hiring the engineers I want to work with', which implies that the engineers he is looking to work with are NOT there: Google is incapable of hiring them, and since they are top-quality devs, someone else (more capable) did.
> Do I have to explain why that's not a good argument?
Yes, please explain.
Because you only took the first sentence of my argument. The second sentence was that those "many people" are going to do what it takes, even if it means taking 'stupid' tests, enduring smug interviewers, and putting up with some other large-corp bureaucracy (which to me personally is more concerning, but is not specific to Google; it applies to any large corp).
And why? My guess is that it's probably so cool, curious, and challenging to work on something like Google Maps/Balloons/Self-driving cars that they will be willing to make a lot of sacrifices and tradeoffs.
THAT was my argument.
> Other people don't mind shitty treatment, so what?
Please explain; I didn't understand your point.
"Many people want to go to Heaven, and they are going to what it takes, even if it means answering stupid questions, pretending to feel bad about trivial things, and some other large org bureaucracy."
My guess is that it's probably so cool, curious, satisfying to be in Heaven that they will be willing to make a lot of sacrifices and trade-offs.
"Many people voted for Trump", my guess is that it's probably cool, curious, challenging to finally tackle real issues that they will be willing to make a lot of sacrifices and trade-offs.
That's stupid, right? But why?
>My guess is that it's probably so cool, curious, and challenging to work on something like Google Maps/Balloons/Self-driving cars that they will be willing to make a lot of sacrifices and tradeoffs.
Those people might believe that it's cool, curious, whatever. But then you should end up with "many people think X about Y", not "Y has a property X (because many people think it does)".
>This is why i asked for clarifications. the OP said 'Google is incapable of hiring the engineers I want to work with', which implies that the engineers he is looking to work with are NOT there. Google is incapable of hiring them, and since they are top quality devs, someone else (more capable) did.
He doesn't want to work with them because they are unpleasant to work with (or be around in general), not because there is an issue with their programming capabilities.
That's an interesting assumption to make. I know many programmers who are excellent at what they do and have also outright refused offers from Google, Amazon, and Facebook; it's usually on the principle that such large, bureaucratic organizations are quite defective insofar as they are adept at suffocating those who are happiest with maximal flexibility and responsibility.
There's a group who jump from contract to contract, startup to startup, because the thrill of new challenges and little to no barriers from organizational structure is worth the uncertainty in compensation.
It also helps that a few aren't willing to relocate to the USA.
I know quite a few of them. All of them, without exception, are incredibly brilliant engineers. You might think this is just anecdotal evidence; that's correct. But also consider that Google engineers are responsible for many top-quality, world-class software projects. How could that be possible without really good engineers?
Yes, the interview process is tedious and can be frustrating. But denying that they have some of the best software engineers in the world is just silly.
A: Build great content.
> Personally, I love working with people coming out of the good agencies
What does that mean? What agencies? What makes these people good to work with?
I'm seriously flabbergasted by most of the points you make.
Can you work in a team? Can you communicate? Do you dress in a presentable manner? Can you explain technical ideas simply? Do you know anything beyond cutting code?
It's simply not good enough to be some lone-ranger ubercoder anymore. Coders are a dime a dozen.
I'd actually rather take someone with a degree in statistics or physics, let them learn to code on the job, and be able to apply them to an order of magnitude more problems than your typical mathematically illiterate self-styled genius who can write a bit of code.
Programming is the easy part, stop deluding yourself into thinking what you do is difficult. It isn't.
Wow that's some serious sour grapes.
(This is a serious, non-ironic question: I don't know what you mean, but it sounds interesting.)
Edit: I guess you mean recruitment agencies? How does one find the "good ones," other than through bitter experience and/or luck?
Any idea how one finds the "good" agencies? Is it a networking thing?
It's the robustness principle.
I have to work with their APIs often (as well as Amazon's, Facebook's, and a couple of other big players'; they're all equally awful).
Their documentation will tell you a type is an integer and will never be null. They will do this for 3 months, and then change it to send a string that can be nullable or empty string, without telling you or updating API versions. It goes beyond robustness. It's the "BigCorp will actively lie to you in their documentation" principle.
Personally, "Be conservative in what you do, be liberal in what you accept from others" is a bad policy in the modern world. It should be "Be conservative in what you do, be strict as if you don't trust them in what you accept from others".
>If a property is optional or has an empty or null value, consider dropping the property from the JSON, unless there's a strong semantic reason for its existence.
This style guide entry suggests a course of action in the event that one of these keys currently has a null value.
The style guide isn't prohibiting nullable keys.
> I have had catastrophic failures caused by premature optimization, because locking a tool into a fancy algorithm before you know where the actual bottleneck is is a recipe for disaster.
EDIT: Or, are you saying that it was from the decisions made before Google acquired Android?