Why I Don’t Talk to Google Recruiters (yegor256.com)
892 points by kesor on Feb 21, 2017 | 652 comments



Unlike most of HN (it seems), I like hearing from recruiters, because despite the very low signal-to-noise ratio, there's always that remote chance that one of them could be able to set me up with a "dream job". It's zero cost to me to politely reply to a recruiter and ask for more info, and I try to at least respond to everyone. What I've found is that they must have a lot of candidates they're juggling because falling out of the funnel is surprisingly easy!

It's amazing how often "Hey, thanks for reaching out, I'm interested. Can you tell me more about the role?" ends the conversation right away. Probably over 50% of the recruiters who contact me never reply after that very polite and neutral response.

Many who do keep the conversation going have not read my profile or resume carefully, so I'll give them a summary of the types of work I'm actually interested in, which is never what they contact me for, and politely decline to move forward with the (usually way too junior) role they are looking to fill. That will almost always end the engagement.

Sure, it's a lot of noise, but filtering is very cheap: the time it takes to reply. My actual success rate with recruiters is probably pretty average. Of the eight or so jobs I've had in my ~20 year career, about three were obtained through recruiters: twice through in-house staff, once through an external recruiter.


I take it one step further: the email address on my resume is a black hole. Its only purpose is to feed an autoresponder that kicks back a warm, generic email thanking the recruiter for their time, acknowledging they have a difficult job, and laying out my requirements for any position: what I am and am not interested in doing, my salary/hourly/per-project requirements, my location requirements (100% remote), etc. At the bottom, there's another email alias along the lines of 'yesireallydidreadthisgiganticemail2017@mydomain.com' that goes straight to me. I ask recruiters not to email that address unless they've read the whole thing and their position matches my requirements. I get 200+ emails to the catch-all autoresponder a month, and maybe 1 (qualified!) reply to the 'it's really me' alias every six months. About once a month, I get an email to the 'it's really me' alias from a recruiter expressing joy, amusement, and thanks for spelling out what I'm looking for so early in the process. All in all, it's a far more pleasant way to go about passively searching for The Next Big Role(tm).
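For the curious, the reply-building half of a setup like this is only a few lines. Here's a minimal sketch using Python's stdlib; the addresses, alias, and requirements text are all hypothetical stand-ins, and a real deployment would wire this into a mail server or filtering rule rather than run it standalone:

```python
from email.message import EmailMessage

# Hypothetical addresses: the black-hole address on the resume,
# and the pass-through alias buried at the bottom of the auto-reply.
BLACKHOLE = "resume@mydomain.example"
REAL_ALIAS = "yesireallydidreadthisgiganticemail2017@mydomain.example"

# Placeholder requirements text; the real one would spell out rates,
# location constraints (100% remote), and so on.
REQUIREMENTS = """\
Thanks for reaching out -- recruiting is a tough job.
I am only interested in 100% remote roles at a senior level.
Rates and other requirements follow.
"""


def build_auto_reply(incoming: EmailMessage) -> EmailMessage:
    """Build the canned reply, pointing qualified recruiters at the real alias."""
    reply = EmailMessage()
    reply["From"] = BLACKHOLE
    reply["To"] = incoming["From"]
    reply["Subject"] = "Re: " + (incoming["Subject"] or "(no subject)")
    reply.set_content(
        REQUIREMENTS
        + f"\nIf your role matches ALL of the above, email me at {REAL_ALIAS}.\n"
    )
    return reply
```

The point of the design is that the filtering cost is pushed entirely onto the sender: only recruiters who read to the bottom ever learn the address that reaches a human.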


Have you ever actually got a job that you accepted through this method? Also, what wizardry did you use to get 200+ recruiter E-mails a month? Do you just have a highly optimized resume out there on all the job sites or something?


Yes, I've gotten my last 3 non-consulting jobs through this method. Before I switched to FT consulting, pretty much all of my jobs were through recruiters. The 'wizardry' is basically having a well put together resume (I've had several people/orgs/etc take a look at it over the years and give me optimization tips), and I maintain profiles at Dice, Monster, and Indeed. That's... about it. I avoid LinkedIn like the plague because I detest the company and its practices, but I'm sure folks who are less picky than me could do something similar on LI and EASILY get 200+ contacts a month.


I would be interested in seeing what your resume looks like, at least on a generic anonymized level.


No problem, what's a good email for you?

Edit: heck with it, I've got nothing to hide. Enjoy: https://www.fuzzy-logic.org/file/Lee_Whalen_Resume.pdf

So folks don't abuse my poor auto-responder, here's what would happen if you hit the email in that resume: http://bit.ly/2lxsly3


1) I'm impressed by your technique, and I intend to copy it.

2) The numbers in that autoresponder caused my jaw to hit the floor. I thought I was well compensated but apparently there is a lot of room for me to grow!


Thank you very much, I appreciate it!


I get between 30 and 50 a month, and all I really did was set up a LinkedIn profile.


> all i really did was setup a linkedin profile

AND work in IT. As a housing architect I received zero offers from recruiters despite having relevant experience. On the other hand, I have been contacted several times by IT recruiters once I listed my (irrelevant) IT side projects there.

In short: it's not you, it's IT.


Incredible--would you mind sending me your LI profile (privately of course, email in my HN profile)? I'd love to see an example of a profile that generates that much interest, even if it's mostly low signal.


I think location is very important with these sorts of profiles. I'm on LinkedIn, Indeed, and CareerBuilder.

I normally get nearly zero recruiter emails, but late last year I changed my location preferences on Indeed and/or CareerBuilder (I don't remember which) from my hometown to Washington DC, and suddenly I was deluged with 20+ recruiter emails per month; more right after I update something on my profile.

Strangely, 1/2 of the emails are for locations far away from DC. I might try changing my preferences to San Jose and see how many more I get.

My "profile" is pretty much just my resume. Experience seems to be another important factor.


Maybe I am mistaken, but I thought that much recruiter interest on LinkedIn is pretty typical. I live in the Midwestern United States and what was previously mentioned is also true here, from my observations. Most of it comes down to having the right keywords and tags, I believe. Having .NET/Java/mobile in your profile nets a lot of messages where I am. Words like Scala/Python/Node/Ruby/etc. get you a bit more.

80% of it is for jobs within the surrounding city, 10% for within the state and 10% out of state. That said, most of these jobs you could also find without the recruiters as well, but sometimes ones from internal recruiters (and if you're lucky a developer/dev manager) are useful.


Interesting. I suppose if I were to ever get 20+ recruiter contacts per month I would retract my previous comment about replying to each of them being cheap time-wise.

Admittedly, I'm not in the market for a developer position, and I deliberately down-play my development experience in my profile, which probably reduces my contact count substantially. I should conduct an experiment wherein I stuff my resume and LinkedIn profile with programming languages and framework keywords for a month and record whether it has an effect on recruitment volume. I suspect it would.


Another interesting bit I've noticed is coworkers getting some of the same recruiter spam from the same recruiter. Seems some of them just blast everyone working for a particular company and hope to get a reply.


It's honestly not that incredible.

Just follow all the guidelines that LinkedIn gives you, so that your profile is an "all-star" and make sure you have a bunch of connections (500+).

An email a day is fairly normal at least for SF engineers.


Can you send me the link to your profile? I'd like to increase the interest in mine.


I think you're right to thank them for their difficult job. I seriously got bored just reading the description of your setup. What percent get through to the second email? Is it like 10%? 50%?


(1 per month + 1/6 per month) / (200 per month) ≈ 0.58%.
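A quick sanity check of that back-of-the-envelope figure, using the numbers from the grandparent comment (one "joy" email per month plus one qualified reply every six months, out of 200+ emails to the catch-all):

```python
# Replies reaching the real alias, expressed per month.
joy_per_month = 1.0          # "about once a month" a recruiter reads the whole thing
qualified_per_month = 1.0 / 6.0  # "maybe 1 (qualified!) reply ... every six months"
catchall_per_month = 200.0   # "200+ emails to the catch-all autoresponder a month"

pass_through_rate = (joy_per_month + qualified_per_month) / catchall_per_month
print(f"{pass_through_rate:.2%}")  # 0.58%
```

So well under 1% of recruiter emails survive the filter, which matches the figure above.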

The more important nugget is that he is contacted about two attractive opportunities a year via his passive method, each with a fairly high probability of leading to an offer if he is interested. This enables him to:

1. Accurately appraise his worth to companies.

2. Quickly scale to a much higher number of interesting opportunities through the ~14 worthwhile recruiters per year who already value his conduct (even if only 2 per year have opportunities with appropriate fit), by actively enlisting their aid if he becomes dissatisfied with his current employer/role (or they become dissatisfied with him).

3. Identify hiring trends in his field.

I for one think it is a brilliant strategy, and I'll probably adopt it myself!


You did get bored, didn't you. He detailed the numbers in his post.


They said 200 emails/month to the general address and one every six months to the second email, so that's well under 1%.


I'd pay at least a few dollars per month / tens of dollars per year for this service.

I'm not kidding. Dear HN reader, please steal this idea!


Dude, you can do it yourself with a Gmail throwaway. 'hireme.myname@gmail.com' or similar, set your vacation auto-responder appropriately, and have 'realdeal.myname@gmail.com' auto-forward to your real address.


I know how to set up 2 email accounts and forward one to another email. "this service" would presumably be more than allocating 2 email accounts.


If everyone does this, then recruiters will simply start to spam the yesireallydidreadthisgiganticemail2017@mydomain.com emails without reading the autogenerated "profile".


What about 1% of people doing this? Or even 0.5%? That is easily enough to live off, and enough to slip under the radar.

To me, getting this done on interesting domains seems to be the hard part. For people with their own domain, setting up a separate server to handle email for it is some hassle. You can't really do this on a generic domain either, because that looks a lot less professional. Signing people up for Gmail accounts might work, but that's probably against Google's EULA. I'd guess the same for other webmail services that are at least somewhat professionally acceptable.

The best way I see is to make it as easy as possible for people with their own domain to set up DNS correctly. Getting through DKIM and SPF reliably seems like a minefield, though.
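For reference, the DNS side usually comes down to three TXT records. A sketch, where example.com, the "mail" selector, and the reporting address are all placeholders, and the DKIM public key comes from whatever signs your outgoing mail:

```
; SPF: which servers may send mail for the domain
example.com.                  IN TXT "v=spf1 mx -all"

; DKIM: public key for signatures made with selector "mail"
mail._domainkey.example.com.  IN TXT "v=DKIM1; k=rsa; p=<base64 public key>"

; DMARC: what receivers should do when SPF/DKIM checks fail
_dmarc.example.com.           IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc@example.com"
```

The minefield part is less the records themselves than getting every sending path (server, webmail, any forwarding) aligned with them so legitimate auto-replies don't land in spam.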


Then we make them go through an animated slideshow and do a quiz to get to an email address.


And when that gets rigged by some wicked OCR, then a Super Mario simulation where the princess is the realdeal@email.com, and every 10 coins or every level would get them an additional resume-info nugget to consume.

Okay maybe I took it too far.


Turns out Bowser was just trying to hire a competent plumber.


But at least a few people would be employed making this system.


Got to keep that arms race going!

You didn't take it too far; someone should build the first online recruiter-focused video game where the prize at the end of each level is the contact details for a more-and-more suitably qualified candidate.


Can I buy more coins directly in the simulation?


And we have AI that can play Super Mario now.


Thank you for your contribution


My partner works at a recruitment company. Here's the thing: most recruiters are not very good at what they do and are only chasing placements. To get the most out of recruiters you need to be more proactive, and find a recruiter or two who really understand your skill set and career ambitions. Build a relationship with them, and don't waste time with the other 90% of batch-mailed crapshoots.


> Here's the thing: most recruiters are not very good at what they do and are only chasing placements.

Related anecdote: There was a technology recruiter who was constantly calling me at work, despite my work number not being listed anywhere* and my LinkedIn profile having clear instructions not to do that.

I decided to look him up on LinkedIn. Guess what his previous job was? Debt collector.

* I figure the recruiters get around that by calling the front desk and asking to be transferred to me.


Hahahaha, debt collector. We all gotta eat somehow. The tech industry casts a wide, frothy net when business is booming!


This is true. I'm an engineer that started recruiting six months ago part-time and was able to bill six figures pretty quickly. I realized that colleagues in recruiting had a really tough time making efficient matches.

The best recruiters can not only make efficient matches, but they also connect the dots to reach out to matches in the pool of passive candidates they talked to when a new role opens up.

Even better than that is when I'm able to actually push back with companies and make an impact on the hiring process to get someone hired.


How'd you get started doing recruiting? I've been talking with recruiters lately and I find myself doing enough tech explaining to recruiters that I've wondered if I could consult for them. Interested to hear about your experience.


Feel free to e-mail (in profile). Started by trying to build tools for recruiters and interviewing recruiters for customer feedback. Found a good fit with a recruiter who placed my whole NY team before our startup got acquired and he offered me a consultant gig. Really enjoying it so far. Great way to monetize a mix of engineering career consulting + staying on top of startup trends.


Interesting! Do you feel like as you get further and further from having done the actual work (that you're helping recruit for) that you'll still be as effective?


I'm still coding professionally (js eng) and stay on top of trends! But, I don't think coding in the trenches will make me more effective. I've had a few eng. jobs, most of my friends are engineers, and I'm really passionate about job trends, job satisfaction, and employee retention.

With that foundation I'm more effective as I see more career trajectory data points when I talk to candidates that specialize (data, security, devops, ml) or ladder up (vp, manager, cto). Then I can provide even more value to new candidates with the career counseling approach.


I don't doubt that this is the way to find a good recruiter. However, if I am going to put in the time to vet multiple recruiters in order to find one worth working with, why would I not just spend that time finding a job?


It's hard to get an accurate pulse of the market if you spend only a month every few years searching through AngelList/Indeed/etc.

I work with about 70 companies (only in NYC), from pre-launch to almost-IPO, and a new role opens up to us every week. It's helpful to find a recruiter who knows you and the market well enough to curate jobs you want and tell you about opportunities that might not even be listed or on your radar.


Yup, we need Meta-recruiters: recruiters that can find the right recruiter for you!


But how to keep out the spammy Meta-recruiters? We'll need Meta-Meta recruiters!


Quis recruitiet ipsos recrutes?


Ah yes, good ol' Juvenal. The Meta Poet himself.


Head hunting is very profitable. If you get a recruiter email about a company from someone who isn't at that company, it's _always_ junk. 100% of the time, it is a waste of time. Ignore those, and look for the ones that are from the actual company.

The company has its own recruiting team that you are going to be dealing with anyway, and internal recruiters aren't playing the numbers game or shuffling their candidates between many different companies trying to get their (usually very large) hire bonus. External recruiters just throw as many people as they can at as many companies as they can till they get a bite.

For senior employees, the fee can be up to a FULL YEAR of the hired employee's salary. As that employee you don't pay it, but it's not something that benefits you either. On top of that, the contracts often have clawbacks: if the employee leaves within a certain time, the recruiter has to give back their fee.

The reason they don't respond after the first email is that they need someone who is going to be very proactive and actually motivated to get hired. Otherwise their candidate catapult might hit the target but be less likely to stick.


>If you get a recruiter email for a company that is not that company, it's _always_ junk

Not true. Companies hire recruiters to find employees. Not all companies (especially startups) have their own internal recruiting department.

The flip side is true, though: as a company trying to hire people, every recruiter email that says "I have a perfect candidate for you" is junk. They just invent those people or send resumes of people who aren't even on the job market.


Recruiting (and hiring) is a core competency for every company. Especially for startups.

A founder should personally be handling recruiting until the company is big enough to have their own internal recruiting department.


Massive difference between sourcing candidates and interviewing/making hiring decisions. If a founder is doing the former, she's probably wasting a massive amount of time, and there is a huge opportunity cost there. Recruiters have their place.


" If you get a recruiter email for a company that is not that company, it's _always_ junk. 100% of the time, it is a waste of time."

This is false. My last two roles were both initiated by 3rd-party recruiters.


Yes, it is possible to get hired through 3rd-party recruiters, but it is still not to your benefit. If they never got anyone hired, they wouldn't make any money and wouldn't even try.

If you don't have any way for companies to target you themselves, then 3rd-party recruiters might be more effective than you targeting companies directly yourself.

However, if you are in the boat where you are getting many recruiter emails a week, my advice is sound. They are all junk, and the only way to get any signal to noise is to look for ones directly from the company doing the hiring.


100% true. I almost always reply to recruiters with my rate and ask if they can afford it "since most companies that pay recruiting fees cannot pay full rates."

Tumbleweeds every time.


I make it a point to reply to recruiters. But there are many different types of recruiting email. I completely ignore (and sometimes even filter straight to the trash) blatantly shotgunned form emails.

If the email shows any effort whatsoever, mentioning a project I worked on, mentioning my current job, pointing out the role in question would fit my skills and it actually does, basically anything at all that suggests it's not just a form email sent to hundreds of people, then I will reply.

However, there's a third type of recruiting email that shows the person on the other side really is directly targeting you. It usually comes from the person who would be your manager, or at least someone who has a direct stake in the company (say the CTO or CEO at a small company). These emails I take seriously and appreciate. One of these led me to my current job.

Oh and I don't appreciate recruiting emails that have tracking links in them. Usually I will politely respond that I don't appreciate the tracking.


A CEO of a small company once sent me what might have been a really interesting opportunity and I was going to reply but then I noticed the tracking links and realized it was just a very-well-crafted form letter. I stopped considering it then.


Tracking pixels on emails aren't always an indication of a form letter. Marketing folks use lots of services that add tracking to all their outgoing email. (Whether that's OK or not for you is a different matter, I'm just pointing out that it could have still been an individual email.)


Keep in mind - the CEO in question could be writing personal e-mails out from an ATS / Sourcing tool that helps keep track of communications.

Tracking links are not always == shotgun approach


Tracking links are always extremely bad form, though. I'd be interested in hearing from the other side why such behaviour would be considered acceptable.


I think of acceptable behaviour in terms of a blacklist rather than a whitelist. I don't have a problem with tracking links because I don't see a reason to have a problem with tracking links.


By tracking links do you mean custom-generated links to a certain page that reply back to some software showing that you clicked the link?


Yes, exactly. Usually they are in the form of http://sometrackingwebsite.com/someid?nextUrl=https://youtub....


A colleague of mine gave me some excellent advice here. She redirects recruiters. I am very happy with my current job, so I'm not generally interested in emails from recruiters. However, I do have many friends who are at jobs doing things below their potential. It's always good to see if the job fits anyone's profile, even vaguely, and redirect the recruiter to them.

(I don't redirect all recruiters, though, so there's still a filter)


> I am very happy with my current job

The best time to interview is when you're very happy with your current job.

Zero pressure, zero commitment and potentially huge upside in terms of pay and title increase.


As a young person with zero other commitments to worry about (kids, loans, etc), I'm really enjoying what I work on (and get paid pretty well) and that's enough for me for now. This could change in the future. I'll keep your advice in mind in case this situation does change :)


My conversations have usually gone like this:

"Hey, you look like a great candidate for $AWESOME_JOB"

"Great, let's talk"

"Oh, you're not really what they're looking for, can I interest you in $GARBAGE_JOB or $BASEMENT_AT_YELLING_CORP"

But by then I've showed interest, so I start getting calls. Not worth it.


Seen that, but more often than not it's:

I see you're working as $ROLE in $COMPANY_A. How would you like the exact same $ROLE in $COMPANY_B? Or worse: how would you like $ROLE-1 in $COMPANY_B?

Sorry but it's going to take at least $ROLE+1 to get me to uproot my life and go through that interview gauntlet again.

Of course, as I said in another post, the calculus changes entirely if you're unemployed and need "something, anything".


Or, even better, would you like to be hired for $UNRELATED_ROLE.

My company has 'Linux' in its name; apparently recruiters think that means I'm a sysadmin (I'm a dev).


At a location 3000 miles away.


I have the same experience - it's easy to turn down if you are not interested, or if what they propose is a bad fit. I also think it is good to remember how extremely lucky we are to work in a field with such high demand. I have several friends in technical fields (but who are not developers) looking for new jobs, and they have to work really hard to find anything. As developers, we can just sit back and wait for employers to come looking for us.


Now consider this approach with spam.

To paraphrase you, there's always that remote chance that one of the Nigerian Princes could actually need your help.

I used to do a similar thing to you, but it's too much work now, and the levels of job spam are too high ("Oil pipeline engineer" roles, simply because my CV has the word "engineer" in it, prefixed with "Software"... lazy recruiter, that's bad!). Basically, if they can't make the effort, why should I? I guess the answer is, "Because there's always that remote chance that one of them could be able to set me up with a 'dream job'".


Recruiters and Nigerian scams aren't comparable. People legitimately get jobs through recruiters. I got my dream job through a recruiter who found me. Nobody gets the Nigerian Prince's money.


My logic is that if something needs to be actively sold (pushed via recruiters), it's most likely average at best. I'm betting that the best jobs don't go through recruiters (or maybe they do if that's the company-wide policy, but the hiring manager already has a candidate in mind when they post the ad).


You've hit on a truth here. Most of the jobs available through recruiters are what I like to call "dog jobs". The ones that aren't filled internally and don't instantly get a line of top candidates because they are so good or pay so well--the ones that NEED someone to sell them. Those are the jobs that are available on job boards and that recruiters are trying to fill--not the awesome ones.

Think about it like the real estate market (in normal markets, not Silicon Valley). Some houses sell before they even hit the market. A few also sell after the realtor does a few private showings. The rest are the "dog properties" that go on the MLS and need heavy marketing to sell.


> The rest are the "dog properties" that go on the MLS and need heavy marketing to sell.

That's... a strange way to look at the housing market. The norm is to list your property and then see what bids you get. Putting your house on the market is not some weird trick to pass a crappy house on to a bunch of rubes, or am I misunderstanding you?

Like, how do you reliably find a willing buyer pre-listing, and how do you know that's a good price (other than just blindly trusting your agent), if you don't even bother listing it? Listing is as much about price discovery as it is about finding more buyers.


The point is that the best homes (top 10%) rarely need to go on the market.

The hotter the market, the more realtors know buyers willing to pay a lot for the first home that meets all their needs. Hence more homes are sold before listed.


> The point is that the best homes (top 10%) rarely need to go on the market.

I think that's locale specific. Except in the 15 million+ bracket, auctions (here in Melbourne at least) tend to get the best money for the seller.


The housing market in the Bay Area is very different from the housing market literally everywhere else in the US (except in some ways Manhattan). Pre-listing sales are very rare outside of SFBA and Manhattan, and the vast majority of domestic residential sales (>70%) are of MLS-listed houses/units, even in major metros like Chicago and LA.

The "dog properties" don't get listed on MLS at all; there is a cost to listing on MLS and the dog properties generally don't generate enough interest to justify the expense (and usually don't even attract a realtor willing to invest the effort).


This assumes that the Venn diagram of "top candidates" and "people actively looking for jobs" overlaps significantly. The best candidate for a job may not be looking for a job so companies pay recruiters to find them.

If the quality that companies were getting from regular applications were better than the quality they get from their recruiters, why would companies pay for recruiters?


I think we're agreeing with each other. Recruiters are needed if the job does not fill itself naturally, and a job will tend to get the candidate flow it deserves.

If there's an awesome job available out there, with way above average pay, benefits, great opportunities for advancement, good work/life balance, etc., you'll fill it with a top talent. You're not going to need a recruiter. Word will get around even to people who are not actively looking, trust me. Similarly, if you have an "average pay for an average worker" kind of job, you'll find that average worker.

When you have a ho-hum average job but you want top talent, then you're going to need that recruiter because the job needs to be actively sold.

I invite anyone who works at a company that pays 3X average salaries or is well known for being an unbelievably great place to work to reply and tell me they have trouble hiring.


Google and Facebook both pay reasonably above-market and have high employee satisfaction, and receive more resumes in a month than they could possibly hire in the next 10 years. They also both have ginormous internal recruiting departments.

They could definitely fill their need for new software engineers naturally, but presumably have data that the quality of candidates they get by bothering a substantial fraction of the world's software engineers on ~a yearly basis gets them better applicants and engineers.


> They could definitely fill their need for new software engineers naturally

Why do you think so? Just because you receive X resumes doesn't mean they're all qualified to work there.


Jobs do not fill themselves magically. Most of us are too busy with work, side projects, or family to track whether your amazing company has a hot new opening.

Above-average people are usually treated well in their jobs. If you want to hire them, you need to actively approach them and lure them into applying. Even then, they will not bother to brush up on algorithm questions for whiteboarding.

Below-average people are looking for jobs because they are on a PIP, or they have a toxic relationship with management, or they will never get promoted in their current role and need to change jobs.


I don't think it is really trying to sell the position, it's trying to find quality engineers who aren't looking.

Really talented engineers already have jobs and may not be actively looking to move, but if the right opportunity for the right company presents itself then they might consider it.


Wait! There's actually money!? :)


I've been responding with "Sorry I'm pretty happy where I'm at right now, but can I keep your email address and let you know when I do start looking for something new". My goal is to have a giant mailing list of recruiters when I do start looking for work again.


I've found that working with multiple recruiters can be a nightmare if they are all working the same geographic market.

Due to my current situation, I tend to only deal with recruiters who are local to where I live; unless the job opening allows for telecommuting, or it is a "too good to pass up" situation (I have yet to see one) - I will generally pass it up.

Instead, I currently only work with a couple of local recruiters. I have told both what I expect for interviewing (I prefer a practical interview with tests - not a whiteboard pressure interview), and I keep both informed what interviews the other has sent me on, so they aren't both submitting me to the same position opening.

Going beyond 2-3 recruiters in such a situation can and will lead to a tracking nightmare: keeping all of them in sync and not being submitted to the same opening, either at the same time or, worse, after you have already interviewed once and weren't successful.

I do, however, try to review the contacts I get from recruiters, and if I feel they might be useful in the future, I tell them so and keep in contact with them (even if it is just a LinkedIn or email contact), letting them know I'm interested in hearing from them in the future. That's usually enough for them to keep me in their DB for future potential offers.


Most of the recruiters I've talked to work for one and only one company.


In my experience, recruiting is a fairly cut-throat industry with a very high turnover rate of staff, so once you come to use your list you may find that a fair proportion bounce (or are redirected to someone else). It can't hurt though!


I find these calls quite flattering. I haven't been on the market or done any computer consulting since 2013, but got a call from a recruiter just yesterday.

It's a little less flattering when they can't correctly pronounce my name or the state I live in.


I like getting recruiter emails because it means I'm at least somewhat marketable, even though some of the recruitment attempts become anecdotes I can tell at a dinner party with friends... (you know, one of those totally weird recruitments).

But mostly I want them because I want to see what these companies expect from their candidates, what technology they are using, and the salary range. DevOps / SRE / Production Engineers are in high demand.


Do the typical recruitment emails you see specify salary? The ones I get tend not to.


Certainly some do. They may say "Cloud Engineer 150k!" when they mean up to 150k. Or they will tell you $80,000 to $120,000.

Note that many job recruitments come from agencies, so they are likely hiring consultants, and the paycheck doesn't come from the actual client company.


> filtering is very cheap: the time it takes to reply back.

See, I see that time as very expensive.


I think OP's reply 'I am happy to interview but only with the hiring manager I will be directly reporting to' is quite polite.


I tend to agree with you and try to at least be cordial with recruiters in my region. I have gotten 2 jobs from recruiters, both being external recruiters.

Once I applied for a job that had a really obscure job requirement which I met. The job was already filled but I spoke to the recruiter, who was really nice and ultimately found me a different job in another skill set about a month later.

The other was the traditional thing where a recruiter reached out to me. I actually didn't like the recruiter, but I liked the job and ended up pursuing it anyway.


I always give recruiters a fair go with their pitches (I'm between contracts now and today I had maybe 10 recruiters pitch me positions) - Most of them are pretty good at their job and they come up with decent stuff. You just have to ask the right questions to probe them and understand if the opportunity is actually good for you before you take it to the next phase.

As an engineer you should have a clear picture of the kinds of companies that you want to work for. As you get older, your selection criteria should improve and become more detailed.


Agreed, this makes perfect sense. I'll even go further and let them put me up in a hotel / buy my plane ticket even if I'm not particularly interested. Going to interviews is a lot of fun.


Many years ago I sent out resumes looking for work as an MCSE. Never mentioned Novell. Guess what? A recruiter emailed me looking for a Novell engineer... He fell out of my funnel real quick.


I respond to most if not all of them.

The only ones I tend to ignore are the recruiters who are bringing positions that are far out of my geographic areas and are not remote.


It's definitely not "zero cost." It may be a low cost, but as with any choice, there is always an opportunity cost.


There is no point in giving me binary-tree-traversing questions; I don't know those answers and will never be interested in learning them.

Let's presume this is out of preference and not ability. It's a pretty basic concept. If your preference stops you from learning something as basic as this as a programmer, then it doesn't seem likely that you will be motivated to keep up with even more abstruse concepts.

Nearly every programmer nowadays knows that naive string concatenation is inefficient, and so they should use a stream or something like that. I'd rather hire someone who knows exactly why it's O(n^2) and why adding to the end of an array that doubles when it expands is O(n) amortized. Why? Because a different but analogous situation might well come up in a programming job, and the person who likes to think about such things is more likely to spot the potential problem and avoid it altogether! The fact that the op would actually feature the above sentences as a large text excerpt sets off the "Dunning-Kruger" alarm for me.
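To make the comparison concrete, here's a hypothetical sketch in Python (used purely for illustration; CPython actually special-cases some `str +=` operations, so treat this as the general principle rather than a claim about any specific runtime):

```python
def concat_naive(parts):
    """Quadratic: each concatenation copies the whole accumulated string,
    so the total work for n parts is roughly 1 + 2 + ... + n = O(n^2)."""
    s = ""
    for p in parts:
        s = s + p  # copies the entire accumulated string every time
    return s

def concat_buffered(parts):
    """Linear: list.append is O(1) amortized because the underlying array
    doubles when it expands; the final join is a single O(n) pass."""
    buf = []
    for p in parts:
        buf.append(p)
    return "".join(buf)
```

The doubling-array point is the same one: appending n items costs at most about 2n copies in total, so each append is O(1) amortized even though an individual append occasionally triggers a full O(n) copy.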

That said, the op still has a good point. There is considerable organizational disconnect being displayed here. Those big companies would do well to have developers or a puzzle website do the initial filtering, rather than waste people's time by alternately telling them they're supposedly wonderful, then supposedly horrible.


I agree. Personally, I'm a bit horrified at the idea that someone who wants to be an "expert at object oriented design" would turn up their noses at "binary-tree-traversing questions". It sounds like a formula for aspiring Architecture Astronauts.

It's also not realistic, given the wheat-to-chaff ratio out there, for a manager to interview all the candidates directly without an elaborate screening process, although 'get interviewed by some generic programmer who has a cheat sheet for the Hard Questions you are working through on the fly' seems like an anti-pattern. I have talked to all of our group's hires as a near-final step.

There's a valid point in there somewhere, though. The recruiters that Google uses locally, for example, seem clueless enough to not care that their approach is quite offensive to someone who is not junior and/or desperate for a job. For several years, every few months, I got a come-on from our local Google office that seemed to be implying that working for Google was So Neat that I would probably want to drop 2-3 levels down the SE -> Senior SE -> Staff SE -> Senior Staff SE -> Principal Engineer chain, AND I would only have to carry a pager 1 week out of 4 as a Site Reliability Engineer. Well, hey, that sounds like a real step up from running a team doing something I really like at my current employer and getting to dream up SIMD tricks and algorithms all day! I'll get right on that. My very own pager!


Well said. The problem is, many recruiters actually wouldn't understand that sarcasm. And that's the problem the OP is talking about. A lot of recruiters are so excited to be working at a big tech firm that they don't understand why anyone else would be any less excited.


  It's also not realistic, given the wheat-to-chaff ratio
  out there, for a manager to interview all the candidates
  directly without an elaborate screening process
Sure, but that screening process should be completed before you invite a candidate for an on-site interview.

I'm a hiring manager. If I say someone should spend 8 hours attending an in-person interview before I'll spend 5 minutes reviewing their resume/github, I'm saying my time is a hundred times as valuable as theirs. I can understand someone taking umbrage at that - especially someone who had better skills than me. And I want to hire people with better skills than me.


I broadly agree with you, but let me nitpick a bit, because nothing suits HN like a pedantic quest for truth.

8 hours vs 5 minutes is a bit extreme, for a start.

Secondly, the time a hiring manager needs to spend reviewing resumes includes unsuccessful candidates as well - so it's not 5 minutes, it's 5 minutes * |all_the_vaguely_plausible_candidates|. There might be 20 or 30 resumes in the pile for some of these jobs. So the manager might be 100-150 minutes in before getting to that one person who needs an "8 hour interview" (or whatever it is; I suppose places that fly you somewhere might be burning 48 hours or more - I know people in Australia who have been flown to the US for job interviews!). If you live in town it might be more like 2-4 hours. So the ratio isn't quite as extreme as you make it out to be for the manager, even if it seems unfair to count time spent reviewing all the other candidates' resumes/githubs.


There's also the issue of developer titles not being standard.

As you say at Intel: SE -> Senior SE -> Staff SE -> Senior Staff SE -> Principal Engineer

IBM has: Engineer -> Staff -> Advisory -> Senior -> Senior Member of Technical Staff -> Distinguished Engineer.

I'm sure other companies have different ranks too.

Unless you dig into the technical ladder of each company, it's hard to say that Staff SE is the same everywhere.


Fair enough, and there's a legit argument that even identical titles at two given companies don't necessarily mean an automatic equivalence. That being said, when the scales are roughly equivalent, someone approaching you with a job offer 2-3 grades below where you are now is just embarrassing - you know, it's typical to offer a promotion to get someone to jump ship.


This sort of invites the question of what you WOULD be interested in working on (and what size team you've been leading) if you were to consider leaving.

Asking as a Googler who occasionally gets to dream up a neat trick here or there (and leads a team of pretty clever people, too!).


I'm happy where I am, honestly, and am not fishing for someone to flatter me, and am certainly not going to go into the answers to that question on a public forum. They approached me, I was minding my own business.

The main thing, I guess, is that a recruiter would have to be asking a question more like yours and less like "hey, how do you feel about an exciting job in the new field of SRE where you only have to carry a pager 1 week in 4" to make it feel like even a respectful approach. They had even grasped that I was at Intel as a result of a fairly successful exit, but... hey... maybe some people want that pager.

As it stands, it was pretty much a "Hey $NAME, we like think it's good that you did a graduate degree at $SCHOOL. Congratulations on your $RECENT_CAREER_MILESTONE. If you ever tire of your $MEANINGLESSLY_SENIOR_SOUNDING_JOB at $MASSIVELY_INFERIOR_COMPANY_TO_GOOGLE you should join our generic hiring process".

(not hating on SRE, mind you - but no-one sane looking at my LinkedIn profile or anything else would think I have that skill set or would be willing to retrain from the bottom to develop it)


Conjecture: the majority (by far, probably on the order of 75%-80% or more) of programming and engineering problems to be solved in a typical company or typical application will not see significant differences in performance by selecting a naive implementation.

For example, in your string concatenation example, the naive solution is good enough except in situations where large numbers of strings are to be concatenated in a given run, or the strings are huge, etc., relative to the compute environment. Does it really matter if one uses an O(n^2) solution when a few dozen (or even a few hundred) such operations are applied on "small enough" strings in a given execution on modern computer hardware? No, it just doesn't.
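That intuition is cheap to check. A hypothetical micro-benchmark with the stdlib `timeit` module (the exact timings will vary by machine; the point is that both finish in microseconds at this scale):

```python
import timeit

def concat_naive(parts):
    """O(n^2) in total, but with a few dozen small strings, n is tiny."""
    s = ""
    for p in parts:
        s = s + p
    return s

parts = ["word"] * 50  # "a few dozen small strings"

# At this input size the asymptotic difference is invisible in practice.
t_naive = timeit.timeit(lambda: concat_naive(parts), number=1000)
t_join = timeit.timeit(lambda: "".join(parts), number=1000)
print(f"naive: {t_naive:.4f}s  join: {t_join:.4f}s  (1000 runs each)")
```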

Select candidates who are keenly interested in and knowledgeable about algorithms for positions where it's important (i.e. part of the regular course of development), not because of that slight chance there is one edge case where one adjustment to a more efficient algorithm might possibly be useful at some indeterminate point in the future.


> Select candidates who are keenly interested in and knowledgeable about algorithms for positions where it's important (i.e. part of the regular course of development), not because of that slight chance there is one edge case where one adjustment to a more efficient algorithm might possibly be useful at some indeterminate point in the future.

Well put! At the end of the day, it's way more important to get your product out to customers/users rather than holding a microscope over every line of code you write and endlessly obsessing over big O numbers. It's fine if you're in academia, but it can be a death knell if you're working in a crowded domain. You can always go back and improve code later; customers/users are not going to come back later.


Customers won't necessarily come back after you rewrite your product to deal with all the perf problems your naive implementations caused.

I'm all for striking the right balance with perf concerns, but in my experience that's not "just deal with it later". Giving no thought to performance is as bad as micro optimizing at the beginning.


True. It's a balance, it's hard, and there is no silver bullet. I believe in learning from my mistakes and moving on rather than having a rabid obsession with big O. If something is obvious or caught by code review, fix it then and there; otherwise move on.


I believe I should learn from my mistakes and move on rather than have rabid obsession over big O.

It's not even an obsession. Those pieces of knowledge could be fully mastered in about a half hour. It's first principles knowledge, like chemistry or thermodynamics. I can tell in a minute that something like "Solar Freakin Roadways" is a scam, whereas so many people gave that scam literally millions of dollars. Knowledge is literally power, in a very concrete way that expresses itself directly as dollars!

Instead of having some basic first principles knowledge to spot such things ahead of time, benefiting from the collective experience of your field, you'd rather just labor in ignorance and run into everything for the first time and get out the profiler? Imagine scaling that up to the size of the Amazon or Google workforce?


We agree to disagree.

I absolutely don't disagree with optimizing when it's obvious (two for loops instead of one, etc.) All I'm saying is that, when you have thousands of lines of code (millions?) and even more complicated/intricate connections in your head of different modules, all the while when trying to reach a deadline, things aren't as obvious. You have to make compromises, it's inevitable.

Amazon or Google did not scale up magically in one go by following established guidelines. I am very certain they had growing pains and I know that they had "hacks"[0] while figuring out the right solution for the problem at hand.

[0] As attested by a former Googler who works at my current workplace


I absolutely don't disagree with optimizing when it's obvious (two for loops instead of one, etc.)

What we disagree with, is the expansion of one's knowledge of what constitutes "obvious." Many people would say that the scam nature of "Solar Freakin Roadways" was far from obvious. Other people would smack their foreheads and ask why some people didn't pay attention in middle school and high school physics!


Details don't matter until they show up. Take string concatenation again. Suppose you want to highlight particular words while a user is actively typing in a dialog. If you naively concatenate on every character, the user may see responsiveness issues, or you at least waste CPU cycles/battery unnecessarily. Small cases like this add up in a big project.

My conjecture is that a big fraction of programs could get a performance boost noticeable to end users if they were developed by good developers with some knowledge of algorithms. You don't need to be an expert, but knowing the very basics of time complexity and the fundamental data structures is necessary for every programmer IMHO.


I tend to run in to more critical issues than performance, like shipping the product/feature at all, meeting deadlines, writing maintainable code, and writing documentation.

In your highlighting example, instead of working with someone who will optimize, I'd much rather work with the guy who will realize we can just debounce the dialog, write a one line comment about the performance issue, and then move on to the next thing.
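A debounce really is that simple: reset a timer on every call, and only the last call in a burst actually runs. A hypothetical sketch using the stdlib `threading.Timer` (the `rehighlight` function is just a made-up stand-in for the expensive work):

```python
import threading

def debounce(wait_seconds):
    """Decorator: delay fn until wait_seconds pass with no new calls,
    so only the final call in a rapid burst actually executes."""
    def decorator(fn):
        timer = None
        lock = threading.Lock()
        def wrapped(*args, **kwargs):
            nonlocal timer
            with lock:
                if timer is not None:
                    timer.cancel()  # a newer keystroke resets the clock
                timer = threading.Timer(wait_seconds, fn, args, kwargs)
                timer.start()
        return wrapped
    return decorator

@debounce(0.1)
def rehighlight(text):
    # Hypothetical expensive operation we only want to run once per burst.
    print("re-highlighting", len(text), "chars")
```

Every keystroke calls `rehighlight`, but the actual work only happens once typing pauses for 100 ms.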

Maybe I'm biased though. I've worked with the performance guy; he was brilliant and the code was clever and highly performant, but he was also slow and didn't write clear, maintainable code. We didn't ship in time and I had to find a new job when the project got scrapped. Damned if it wasn't fast though.


> In your highlighting example, instead of working with someone who will optimize, I'd much rather work with the guy who will realize we can just debounce the dialog, write a one line comment about the performance issue, and then move on to the next thing.

As long as you are aware of the issue with naive string concatenation, you can use a string buffer or a mutable string - it's trivial to solve in a few more lines of code. What's more difficult is handling deletions in earlier text. If that happens often, you'll need a rope or a skip list, which is challenging to implement from scratch. In that case, I would leave a comment without implementing the optimal solution and let the profiler decide later.


We didn't ship in time and I had to find a new job when the project got scrapped. Damned if it wasn't fast though.

Sounds more like a project management issue to me, than a problem with that coder.


Three man team and he was the lead. Yes, it was a project management issue.

I'd rather see basic project management skills than basic knowledge of algorithms. Unfortunately, the former is much more rare.


Sure. But at the scale of e.g. Google, Amazon, etc. how often do you think they are performing these lower level computations?

From my experience: very frequently.

So it's useful at these corporations for all engineers to have this understanding, I think.


From my admittedly limited experience at Google, given the size of the input data even a "small" project deals with, those little algorithmic inefficiencies really add up quickly. You have to do a lot of optimizing and use clever data structures to make something work at all.


I haven't worked at any of these companies so I wouldn't know how often they're performing these computations. My conjecture is broad, of course, and a statistical suggestion at that: it seems unobjectionable to claim that for some companies the "typical application" lives in the 20%-25% "outlier" region most of the time.


Where I work, our core rendering algorithm, which runs several times on every page load, used naïve string concatenation. When I initially wrote it I knew string concatenation was inefficient but never got back to it, and for 99.9% of cases it didn't matter! We finally hit that 0.1% case, and after optimizing the algorithm, it didn't make a dent in our overall performance numbers at all.

The biggest impact to our performance has been switching one JSON serialization library for another, no algorithmic knowledge needed, just basic benchmarking skills.


Your one anecdote obviously proves everything once and for all. The biggest recent performance gains in my personal project had to do with switching to WebRTC and switching libraries.

Knowing when you can just slap in the naive string concatenation and move on is also a useful result of the right skills. If you had better profiling tools/skills, I posit that you wouldn't have had to do the useless optimization.


What made you reimplement the rendering algorithm and/or switch JSON libraries (eg were these changes backed by data/measurements)?


Conjecture: the majority (by far, probably on the order of 75%-80% or more) of programming and engineering problems to be solved in a typical company or typical application will not see significant differences in performance by selecting a naive implementation.

True. And people who clutter their code by using some fancy "StringStream" class when they could've just used a concatenation are also part of the problem!

Saying that programmers don't need a basic knowledge of algorithms is like saying drivers don't need to know how to back up a trailer or do a 3 point turn. Over 90% of the time, you don't need it. But when you do, you really do!


Well, sure but one person's "basic" is another's "esoteric," and "when you do" is something I think is relatively infrequent. I cared deeply about this sort of thing when I was writing code for HPC modeling of physical systems, but in very specialized and narrow context. Since I've left academia I have rarely encountered a problem where understanding implementation details behind complexity was really necessary.

Knowing the typical time and space complexity for an algorithm (without even understanding one bit about the implementation details) in a given category might be considered too trivial to be even basic, but I submit it's about the most "advanced" piece of knowledge for what a typical engineer really must know to write good, high quality and performant enough code.

Most can get by using a given language's library implementation of an "obvious" data structure and naive algorithms around it. For example, in Python, the "obvious" structure to map keys to values is a dict, and iteration over the keys or item tuples is a typical access pattern. An engineer doesn't need to know whether dict insert or lookup is amortized O(1) or some other value, and he doesn't need to know the space required is O(n), etc. In almost every case he's likely to put it to, the dict is fine.
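A trivial Python illustration of that "obvious structure, naive access pattern" point:

```python
def word_counts(words):
    """Count occurrences: dict insert/lookup is O(1) on average, so the
    whole pass is O(n) -- whether or not the author knows why."""
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return counts

for word, n in word_counts(["to", "be", "or", "not", "to", "be"]).items():
    print(word, n)
```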


Knowing the typical time and space complexity for an algorithm (without even understanding one bit about the implementation details) in a given category might be considered too trivial to be even basic, but I submit it's about the most "advanced" piece of knowledge for what a typical engineer really must know to write good, high quality and performant enough code.

I think we're both saying that it's generally a good level of 1st Principles knowledge for programmers to have.


"Good" and "necessary" are different things, though. I consider it "good" to know as much as possible about things I'm interested in. I don't consider it "necessary."


It's not about using this knowledge all the time.

It is about your base knowledge (which should include algorithms) and reaching for those algorithms when it matters.

If you have the choice of two identical software engineers and one knows 1 algorithm more than the other, he/she wins.


>Let's presume this is out of preference and not ability. It's a pretty basic concept. If your preference stops you from learning something as basic as this as a programmer, then it doesn't seem likely that you will be motivated to keep up with even more abstruse concepts.

This same exact argument could be used to require every interview candidate to know assembly.

>and the person who likes to think about such things is more likely to spot the potential problem and avoid it altogether!

I find that developers who constantly get caught up in the performance of individual algorithms will waste incredible amounts of time optimizing them and will miss major optimizations that come from seeing the system as a whole.

For example, I would rather take the developer who points out that a particular program can make use of the fact that most of the code doesn't need sorted widgets rather than the one who spent the whole time optimizing the widget sorting algorithm.


> This same exact argument could be used to require every interview candidate to know assembly.

Yes, and that argument could be used to require every interview candidate to know quantum physics.

At some point you draw a line, and Google/Amazon/Microsoft/etc. have very clear pictures of where their respective lines should be. It seems to work well for them.


At some point you draw a line, and Google/Amazon/Microsoft/etc. have very clear pictures of where their respective lines should be. It seems to work well for them.

Think about it this way. 1st principles knowledge, like algorithms, isn't at all useful 90% of the time. But 1% of the time, not knowing it will cause very out-sized penalties in efficiency or debugging costs. Now take the frequency of the occasional 1st principles usefulness penalties and multiply it by all of the developer-hours at Google/Amazon/Microsoft.

That is why Google/Amazon/Microsoft do that!


>It seems to work well for them.

There is no proof that they are actually preventing bad engineers with this process or that they couldn't get much better engineers and fewer flops by fixing it.

People want to work there in spite of these irritating hiring processes. I frequently travel via plane but that doesn't mean I think the TSA process is good.


This same exact argument could be used to require every interview candidate to know assembly.

Have you ever done the standard Comp Sci compiler implementation class? Do you write C++ and use the C++ standard library? If you answered yes to both questions, then you should know from first principles how just about everything in the standard library is implemented, and can use those tools with complete knowledge of when they can fail or must be used differently.

I find that developers who constantly get caught up in the performance of individual algorithms will waste incredible amounts of time optimizing them and will miss major optimizations that come from seeing the system as a whole.

Those developers are merely exhibiting yet another kind of ignorance.

For example, I would rather take the developer who points out that a particular program can make use of the fact that most of the code doesn't need sorted widgets rather than the one who spent the whole time optimizing the widget sorting algorithm.

It's the developer who has a clue about algorithms who is more likely to make the valuable insight. The needless optimizer is just doing some Cargo Cult algorithms analysis. (Probably motivated by signalling geekiness, not producing useful results.)


I'd love to agree with you - it would be good to my ego - but I just don't think you're right.

People create value with software when they use code to solve problems.

For some people, that means tackling a gnarly complex problem and by a combination of wits, experience and education, come up with an efficient and elegant solution. Typical example: someone working in a specialized role in back-end (i.e. not user-facing) systems in a large company (small companies can't usually afford specialized roles). Concrete example: V8 JIT engineers.

For other people, it means looking up from the keyboard, figuring out the company strategy and product fit, and seeing what can be most efficiently (in terms of effort) put together to improve or align both. Creating a proof of concept that gets buy-in for a more in-depth solution.

The second category almost always delivers much more value than the first. And there's a spectrum in between, a spectrum of people who are all very good at what they do, all very good at creating value; the spectrum extends from the guts of the machine all the way out to the interface with the customer.

I've worked in some very different industries in my career, ranging from compiler engineer to web app developer. Compiler engineers leave their fingerprints on far more code, and the job is intellectually stimulating. But realistically, most of the reward of the job comes from the fun of the job itself. As a web app developer, I apply my knowledge of compiler techniques to things like SQL generation from filter predicates represented as ASTs, for efficient display of the user's data. But I could only deliver that because I saw the possibility of connecting the back end we had with an Excel-like experience we could give the user; and I had to take it on.

I've seen other people take a bunch of open source components - not knowing in depth how they worked - and put them together to create surprisingly credible solutions very cheaply, the kinds of solutions that win 7-figure SaaS deals. You don't get there with your knowledge of binary trees or compiler principles.


I've seen other people take a bunch of open source components - not knowing in depth how they worked - and put them together to create surprisingly credible solutions very cheaply, the kinds of solutions that win 7-figure SaaS deals. You don't get there with your knowledge of binary trees or compiler principles.

But having some basic algorithms knowledge might make the difference between delivering such a solution that runs reliably and quickly enough and not.


>It's the developer who has a clue about algorithms who is more likely to make the valuable insight.

The ability to implement the sorting algorithm in the C++ standard lib is completely orthogonal to people who have this insight.

People who have an understanding of complexity analysis is really all it takes to have that insight. Memorizing a bunch of datastructure algorithms has almost no bearing on this ability.


The ability to implement the sorting algorithm in the C++ standard lib is completely orthogonal to people who have this insight.

Weak example! Exactly when would you want to use a shared_ptr, and when would you absolutely not want to use one? Why? And what pieces of 1st principles knowledge would let you simply know that in about a minute?

People who have an understanding of complexity analysis is really all it takes to have that insight. Memorizing a bunch of datastructure algorithms has almost no bearing on this ability.

Note that in this thread and others I've been advocating for First Principles knowledge. If you have that knowledge, then you'd probably know several of the most basic algorithms and their analysis from having learned that. I'm not advocating that people just be able to spit out a memorized text!

http://v.cx/2010/04/feynman-brazil-education


Has modern CS education been too little first principles knowledge, too much memorization or something else that detracts from understanding the fundamentals? Maybe that's the problem.


This same exact argument could be used to require every interview candidate to know assembly.

You're not thinking big enough. My usual extension of the argument is to suggest that part of the interview process should be to give the candidate a bucket of sand and some ore, and make them smelt everything and build their own CPU from scratch.

Because, y'know, it's important to cover fundamentals!


That sounds like a really fun interview. "Given a bucket of sand, build a machine that computes pi to the nth decimal".

(I'd give people extra credit for coming up with creative and interesting variants based on Buffon's needle, rather than building the full infrastructure to do lithography).


> This same exact argument could be used to require every interview candidate to know assembly.

...because knowing assembly is actually useful to a high level developer? If it were, then yes, I'd say they should know assembly too. But I know assembly; I've written entire published games in assembly language. And yet I don't believe knowing it is actively useful any more. Knowing the basic concepts like how strings, integers, and floating point values are stored and compared, yes. You typically learn that as you're learning algorithms and data structures. But you can learn those concepts using C; actually knowing assembly language fluently is overkill today.

So there's no slippery slope argument to be made here.

> I find that developers who constantly get caught up in the performance of individual algorithms

Straw-man. [1] Developers who understand how to optimize can, at the same time, make intelligent decisions on when to optimize. Developers who are overly focused on minutia that isn't important are solving the wrong problem, certainly. But part of the skill of optimization is knowing when to do it.

The problem is that if you don't know how to optimize algorithms, if you don't understand big-O notation and its implications, then you won't really get how to optimize at either the individual algorithm level or at the system level. Because the concepts are the same across all the levels of complexity.

Yes, some developers can get obsessed with optimizing the wrong things. That's why experienced developers will profile before spending a lot of time optimizing.

But put a bunch of developers who ignore big-O together and you'll end up with code like the Quora app: If I delete a paragraph in the app it can take 10 seconds to finish deleting it. Sometimes the app will update once or twice with parts of the paragraph deleted. I'm not writing a book in the app; the N can't be more than a thousand or so for the entire answer. Even JavaScript can iterate over a thousand characters in milliseconds.

My guess? They've accidentally used an O(n^3) algorithm where they delete one character at a time and copy the entire message, re-concatenating it every time. Experienced developers wouldn't even consider writing that code to begin with, instead using something like ropes when dealing with text that's being actively edited, because that's what you do with text in an editor. [2] There's even already a JavaScript implementation they could have used off the shelf [3] (I'm assuming that the app is hybrid and running in JavaScript; if it's actually native then, well, it's quite an achievement for it to have such poor performance).

But you have to be at least passingly familiar with algorithms to even know that it's a likely problem.

And you know what? I don't always obsess over "the most optimal" algorithm for every problem. Sometimes the more optimal algorithm for large N will require more overhead for the small N that we're dealing with, and the brute force algorithm will not only be "just fine," it will be faster and require less work AND less memory overhead. And sometimes N is just always going to be too small to worry about.

I've sometimes just used a quick-and-dirty algorithm only to discover that its behavior was far worse than I had guessed (something that should be instant is taking seconds), but then because I do understand algorithms it takes me 5 minutes to rewrite for better time complexity, and the performance glitch vanishes.
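A hypothetical example of that kind of 5-minute rewrite in Python: moving a membership test from a list to a set turns an accidentally quadratic loop into a linear one, without touching the surrounding logic at all:

```python
def dedupe_slow(items):
    """Quick-and-dirty: 'x not in seen' scans a list, O(n) per item,
    so the whole loop is O(n^2)."""
    seen, out = [], []
    for x in items:
        if x not in seen:
            seen.append(x)
            out.append(x)
    return out

def dedupe_fast(items):
    """Same logic, but 'x not in seen' is an O(1) average hash lookup,
    so the whole loop is O(n)."""
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

Both return identical results; only the time complexity changes, which is exactly the kind of fix that's quick if you recognize the pattern and mystifying if you don't.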

And that's who Google is trying to hire, at least in general. It's not like I don't understand object oriented design as well.

[1] https://en.wikipedia.org/wiki/Straw_man

[2] https://en.wikipedia.org/wiki/Rope_(data_structure)

[3] https://github.com/component/rope


...because knowing assembly is actually useful to a high level developer? If it were, then yes, I'd say they should know assembly too. For what it's worth, I do know assembly; I've written entire published games in assembly language.

To expand on what you are saying: if you're doing "high level development" on a business app in C++11, then knowing assembly and having been through a bog-standard undergraduate CS compiler course will actually help you use a whole bunch of things in the C++ standard library, because you can easily guess how they would be implemented from first principles. Otherwise, things like smart pointers are just "magic" that has to be used in a bunch of disconnected, abstruse ways. Having the first principles knowledge lets you easily know when to use another shared_ptr, when to use a reference, and when not to use one -- all from first principles, no arbitrary memorization required!

Hell, having that kind of knowledge even benefited someone working in Smalltalk back in the day.


> If you're doing "high level development" on a business app in C++ 11 then knowing assembly and having been through a bog-standard undergraduate CS Compiler Development course will actually help you use a whole bunch of things in the C++ standard library

Agreed, and my understanding of assembly does help in those ways. I actually feel like compiler design was one of a very few classes in college that really, really taught me something.

But if I'm hiring a Python or JavaScript developer, I'm not going to require they know assembly. If only because that would restrict the hiring pool so much that I'd likely never find an employee.

A senior developer, though, probably should understand what's happening at the low level. And most teams should have a senior developer to keep the team from making rookie mistakes. There are plenty of programming jobs that can be done with less skill or deep knowledge. Like the ones that are discussed in the Wired article on coding being the next "blue collar" job. [1]

[1] https://www.wired.com/2017/02/programming-is-the-new-blue-co...


Paraphrase: you have to compromise due to market realities. Isn't that an indictment of the poor state of training in our field?


Yes and no.

I think that the lead or "surgeon" (to quote The Mythical Man Month) in charge of any app or service development project should be well trained. Anyone who brings in a junior developer (anyone without a CS degree or equivalent, or less than 5 years of professional experience) to lead a project is asking for trouble.

But I think that average developers don't need the full training any more than a nurse needs to have a full medical degree. There are certainly things that I don't want to have to do, and while I might be able to do them better in some way than a more junior developer, they just don't matter enough.

It is a Catch-22, though: Just having a CS degree and five years of experience doesn't make a developer competent. I'd love to see some kind of certification to help separate the wheat from the chaff, so that non-experts could distinguish a top developer from a mid-tier developer.

But none of the certifications I'm aware of do anything aside from test that you've memorized the right buzzwords associated with a particular technology, and as such having such a cert is almost useless.

So yes, I'm arguing for certifications despite the fact that I view certifications as useless. If we had good certifications, maybe we could help establish a better way to distinguish developer skills. Would love to see that, but I admit it's selfishly motivated: I'm really good at programming and technical tests, so under any such regimen I'd likely end up with the highest ratings, except where domain-specific knowledge was required.


> But I think that average developers don't need the full training any more than a nurse needs to have a full medical degree.

A nurse needs to have enough 1st Principles knowledge plus specific training to keep from making egregious mistakes. In fact, where X is a profession, an X needs to have enough 1st Principles knowledge plus specific training to keep from making egregious mistakes.

The fact that our "field" keeps producing X without that minimum level means something is broken. It would be like the nursing field producing nurses who didn't know how to spot a Tension Pneumothorax, because most nurses don't have to deal with that 99% of the time.

> So yes, I'm arguing for certifications despite the fact that I view certifications as useless. If we had good certifications, maybe we could help establish a better way to distinguish developer skills.

The fact that you entertain this thought is an indication that something is not quite right with our field.


>The fact that you entertain this thought is an indication that something is not quite right with our field.

Agreed. But try as I might, I can't come up with a solution that I'd actually support the implementation of.

If you have any ideas about how we can get from where we are to where we should be, I'd be interested in hearing them. Having IEEE license developers at varying levels of skill sounds like a start, and requiring Internet-visible software to be written by developers with certifications sounds nice...until you think about open source projects, which thrive in large part from free effort. And until you think about the fact that software traverses borders fluidly. And what about entrepreneurship? Should we tell people they can't write apps unless they have the certification? Or that they need to pay an expert to audit their apps?

And then there's the fact that big companies will do whatever they can to cut costs, including hiring developers in countries without certification requirements, if you don't motivate them otherwise. (Maybe a DMCA-style safe harbor, where ONLY if they use a development team with sufficient certifications are they shielded from lawsuits for defects; otherwise people can sue them for exploits that result from their software?)

I have lots of ideas, as you can see above. I just don't like any of them. There probably needs to be a constellation of rules in place, set by a professional organization and ensconced as legal requirements... But half of the rules I imagine wanting I also, under other circumstances, would despise.


>...because knowing assembly is actually useful to a high level developer?

It's about as relevant as implementing quicksort.

>My guess? They've accidentally used an O(n^3) algorithm where they delete one character at a time and copy the entire message, re-concatenating it every time.

Or it's actually a javascript event handler that is doing something else expensive per char delete. It's more important to know to profile rather than try to memorize every algorithm used by a system and guess which one is causing the problem. Once you identify where the bottleneck is, you can search for "efficient algorithms to do X".

People who profile and fix bottlenecks are much more valuable than people who think they have implemented the most efficient algorithm for every operation in their program.
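And getting that first profile is nearly free in most ecosystems. A Python sketch using the standard library's cProfile against a deliberately quadratic toy function (the function is made up for illustration; the point is that the hotspot shows up by name without any guessing):

```python
import cProfile
import io
import pstats

def slow_concat(n):
    # Deliberately quadratic: repeated immutable-string concatenation.
    s = ""
    for i in range(n):
        s = s + str(i)
    return s

profiler = cProfile.Profile()
profiler.enable()
slow_concat(10_000)
profiler.disable()

out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
report = out.getvalue()
assert "slow_concat" in report  # the bottleneck is named in the output
```

Once the report points at the guilty function, *then* you go search for the better algorithm.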


> It's about as relevant as implementing quicksort.

...did you actually read my comment? Because I went on to say that I didn't think it was relevant.

That said, understanding how quicksort works is important, not because you'll be called on to implement it, but because there are key strategies used in its implementation.

And yes, I can implement quicksort without looking up the algorithm. It's really not that hard.
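Something like this, from first principles (Lomuto partition for simplicity; a production sort would add pivot randomization and an insertion-sort cutoff, so treat this as an illustration, not a library):

```python
def quicksort(a, lo=0, hi=None):
    # In-place quicksort with a Lomuto partition.
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        # Move everything <= pivot to the front of the range.
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]  # pivot lands in its final position
    quicksort(a, lo, i - 1)
    quicksort(a, i + 1, hi)
    return a

assert quicksort([5, 5, 1, 4, 2, 3]) == [1, 2, 3, 4, 5, 5]
```

The key strategies -- partitioning around a pivot, divide and conquer -- are the part worth internalizing, not the exact lines.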

> Or it's actually a javascript event handler that is doing something else expensive per char delete. It's more important to know to profile rather than try to memorize every algorithm used by a system and guess which one is causing the problem. Once you identify where the bottleneck is, you can search for "efficient algorithms to do X".

Sorry, but no, wrong answer. Partial credit for suggesting profiling. Let me take this apart:

> Or it's actually a javascript event handler that is doing something else expensive per char delete.

If they're calling a function that triggers an event handler for every character deletion when they're deleting a block of text, then they're Doing It Wrong. Yes, profiling is important. But calling delete one character at a time for a block of text is plain lazy.

> Memorize every algorithm used by a system

Umm....this is a text editor we're talking about. There are very few algorithms, and it shouldn't be a case of "memorize them" as much as "observe them." If there are more than 3-4 algorithms, then it should also include "draw a picture of how they interact," if your abilities at modeling them in your head are insufficient.

But someone who is strong at algorithms should be able to just see what's wrong with a text editor that deletes characters one at a time. It should be obvious that it's likely to be a problem, and they shouldn't do it to begin with.

> People who profile and fix bottlenecks are much more valuable than people who think they have implemented the most efficient algorithm for every operation in their program.

Both are necessary, and as I've said elsewhere, "developers that are anal about optimizing everything" is a straw man; good developers know when optimization is important, and good developers can write more optimal code without making it harder to read. Just knowing how to optimize doesn't automatically make you optimize every single routine, nor does it mean everything is more complex. Half the time when I optimize something the code ends up cleaner and easier to read, in fact.

And it's not always about bottlenecks. Sometimes it really is about the algorithm, and if you don't know your algorithms, you won't be able to recognize this. As said above, it's a prime example of the Dunning-Kruger effect: If you don't know algorithms, you don't even know what you don't know. Claiming you don't need to know algorithms just reinforces the point. Sorry.

In another comment on this post I described an optimization that I did that had no easy-to-fix bottleneck that was slowing things down, and that required a high level algorithmic change to fix. [1] In about an hour I sped it up by a factor of about a thousand. Ultimately I was doing the same things, but I was able to improve the algorithmic efficiency.

And someone strictly looking for bottlenecks can look at that code forever and not see the algorithmic change needed, because the code itself was written to be pretty optimal; it was a high level change to the algorithm that made it so much faster.

[1] https://news.ycombinator.com/item?id=13701931


This is a great post and I completely agree. It is amusing, however, to note that Quora is known in the competitive programming community for having developers quite strong at algorithms; I guess sometimes that doesn't make it through to the software.


I'd put money on the fact that those competitive programming Quora developers aren't working on the Android app. Or even on the Android/Mobile API back-end, which seems to fail frequently -- and lose information you've entered into the app, which doesn't do the obvious things like queuing up requests to be pushed later when they initially fail.

So. Much. Fail.

The Amazon Alexa/Echo app is similarly poorly programmed. Takes like 10 seconds just to boot up, which you need to do if you want to look at your shopping list -- and if the app gets pushed to the background, you need to boot up the app again. Sigh.


> For example, I would rather take the developer who points out that a particular program can make use of the fact that most of the code doesn't need sorted widgets rather than the one who spent the whole time optimizing the widget sorting algorithm.

And I'm pretty sure that the person who doesn't want to learn binary-tree traversing will never be able to point that out. That's the point you missed of the post you replied to.


Instead of requiring someone to know a b-tree, how about just teaching it to them in an interview, and then walking through an exercise to see if they get it? That'd be more impressive to me. If someone claims to know what a b-tree is, test them on a more advanced concept that builds on using a b-tree that they are unlikely to know. Then teach them that and see if they get it. Make the interview more about working together and capacity to learn, and less about computer science trivia night at the pub with no beer.

Your ability to efficiently teach people stuff that you claim is important is a chance to hold yourself accountable for how well you actually know something AND whether it is actually important at all. If an interviewer taught me something I didn't know, and then quizzed me on it, I'd be impressed as heck, and I'd wanna work with that person (if they were not an asshole).


This has worked quite well for me. I tend to ask one of two types of algorithm questions:

1. Pick a simple but relatively obscure data structure, something they are unlikely to have crammed the night before. I always start by asking the candidate if they are familiar with it; the answer is almost universally "no". I then pull out a wikipedia printout and a notepad, and spend 10 minutes explaining it to them, with diagrams. Once they are sure they have understood the basic operations, I ask them to implement one of the basic operations we just walked through. The amount of code here should be ~10 lines; keep adding simplifying assumptions until it is.

2. Pick a fundamental data structure from a high-level language runtime, something you would use without thinking: python lists, javascript objects, that kind of thing. I emphasise that familiarity with said language is not required, and list some real-world properties and performance characteristics (access speed, iteration, ordering, mutation), as well as common and less-common use cases. I then ask the candidate how they think it is implemented under the hood (i.e., if you wanted these semantics in a language that didn't natively provide them, how would you do it?) I often don't have a full answer myself, so we brainstorm the requirements and implementations together; no code, just open-ended discussion. The best candidates have often emailed me afterwards, having looked up an actual open-source implementation.
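For question 2, the kind of answer you brainstorm toward for "python lists" is a growable buffer. A toy sketch of what a candidate might whiteboard (heavily simplified -- no slicing, deletion, or shrinking; CPython's real implementation differs in details):

```python
class DynamicArray:
    """A python-list-like structure: a fixed-size buffer that
    doubles when full, giving amortized O(1) appends."""

    def __init__(self):
        self._capacity = 4
        self._size = 0
        self._buffer = [None] * self._capacity

    def append(self, value):
        if self._size == self._capacity:
            self._grow()
        self._buffer[self._size] = value
        self._size += 1

    def _grow(self):
        # Doubling keeps total copy work linear across n appends.
        self._capacity *= 2
        new_buffer = [None] * self._capacity
        new_buffer[:self._size] = self._buffer[:self._size]
        self._buffer = new_buffer

    def __getitem__(self, i):
        if not 0 <= i < self._size:
            raise IndexError(i)
        return self._buffer[i]

    def __len__(self):
        return self._size

arr = DynamicArray()
for i in range(100):
    arr.append(i)
assert len(arr) == 100 and arr[99] == 99
```

The interesting discussion is the doubling policy: why growing by a constant amount makes n appends quadratic, while doubling keeps them linear overall.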


Without knowing how well it works, I like this approach. It's thoughtful and engaging. Not perfect, and maybe that's a good thing.


I really like this approach; can I interview with you? :P


I've had interviewers try to school me on something that I wasn't super well versed on, and my internal thought process was "If I wanted to learn the finer points of xsd, I'd go dig in on my own and learn it. Right now I'm not that interested in you teaching it to me."

Maybe not the healthiest attitude ...


Definitely a good idea to decouple rote memorization from the interview process.


I do like your suggestion, but

> Make the interview more about working together and capacity to learn, and less about computer science trivia night at the pub with no beer.

Understanding b-trees is pretty useful. If you work with relational databases, most questions concerning query performance require some understanding of b-trees.

I wouldn't ask a candidate to implement b-trees (or a sorting algorithm, or red-black trees, etc.), but asking about their basic properties seems reasonable. Or better yet, asking about the performance of a realistic SQL query which uses a b-tree index.
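The performance intuition transfers even without writing real SQL: an index is essentially a sorted structure, so point lookups descend it in O(log n) instead of scanning every row. A toy Python sketch, using a sorted key list as a stand-in for a b-tree index (the table and column names are made up for illustration):

```python
import bisect

rows = [{"id": i, "name": f"user{i}"} for i in range(100_000)]

def scan(rows, key):
    # Full table scan: touches every row -- what you get without an index.
    return [r for r in rows if r["id"] == key]

# The "index": sorted keys, standing in for a b-tree's sorted order.
keys = [r["id"] for r in rows]  # already sorted by construction

def index_lookup(rows, keys, key):
    i = bisect.bisect_left(keys, key)  # O(log n), like descending a b-tree
    return [rows[i]] if i < len(keys) and keys[i] == key else []

assert scan(rows, 42_000) == index_lookup(rows, keys, 42_000)
```

A real b-tree adds wide nodes to minimize disk reads, but the question "will this query use the sorted order or scan everything?" is exactly the one a candidate should be able to reason about.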


> If your preference stops you from learning something as basic as this as a programmer, then it doesn't seem likely that you will be motivated to keep up with even more abstruse concepts.

No one is arguing that there isn't any value in knowing CS. Rather, the argument basically is that for the vast majority of developers, studying algorithms is a net loss because it's time that could be better spent learning more valuable skills.

If you're the person whose job is creating Redis, then yeah, you should probably know something about algorithms and data structures. For most other people, having a cocktail party level familiarity with that stuff and a good understanding of the phrase "tools not rules" is probably good enough.


> Rather, the argument basically is that for the vast majority of developers, studying algorithms is a net loss because it's time that could be better spent learning more valuable skills.

But the op is basically implying that you don't even need the "cocktail party level familiarity." Knowing just enough first principles chemistry to know how CO2 and CO are produced can save your life. It's one thing to just know the rule you shouldn't run your car in a closed garage. It's another thing to know the general conditions where you might be producing CO, so you also know not to run your laser cutter in a poorly ventilated room.

First principles knowledge is a set of tools for reasoning about the world, your program, your software libraries, etc. Programmers thinking they don't need even the most basic algorithm knowledge is basically the anti-intellectual stance of eschewing first principles knowledge.


I find your stance well considered and it intrigues me because I don't think I agree with it. However I might agree in principle and differ on what exactly it means for someone to know first principles.

I've been through a CS education track, including algorithms and complexity, my day job has been software development for over 5 years since I've graduated college. In practice I have been able to understand complexity tradeoffs when selecting approaches and building implementations and have been able to recognize shortcomings in the same done by others and been able to improve upon what was built. The difficulty I have encountered has never been in the implementation of an algorithm. I have even taught the first principles of complexity to others sufficiently well that I have seen them make informed design decisions.

I will openly admit that not once, during all of that time, have I ever been able to whiteboard an optimally efficient (or nearly so) algorithm implementation _and be confident it was such_. Mind you, I'd love to be able to, and I'm definitely not arguing that understanding complexity isn't useful knowledge. However I would contend that being able to bang out an optimal b-tree implementation from memory on-demand isn't first principles knowledge, but is merely _trivia_ ability.

I would further argue it's not even first principles knowledge that is most important, but rather a combination of critical-thinking, enough self-awareness to notice when you've exceeded your current knowledge/experience, and sufficient humility to fix that lack rather than plod along with blinders on.

Granted, I might be less competent than I believe, but then, in practice, it would seem that incompetence can lead to genuine success. Perhaps the scope of my 5 years real-world experience is so narrow that I've simply not had the chance to encounter a situation where others would say "if you're not able to whiteboard a b-tree, you're not the right person to solve this problem". But then, I'm a full-stack developer that regularly uses Java, Javascript, and Python and have occasionally had reason to use R and C#, so it seems unlikely that my experience is that exceedingly narrow.

Sorry that this became a bit rambling, it's just a topic that's always intrigued me in how divisive it can be.


> However I would contend that being able to bang out an optimal b-tree implementation from memory on-demand isn't first principles knowledge, but is merely _trivia_ ability.

a) What the heck is an optimal B-Tree? (Don't you have to tune the parameters?) b) I would also contend that it's trivia, and merely a way of getting at the ability to apply First Principles knowledge.

> I would further argue it's not even first principles knowledge that is most important, but rather a combination of critical-thinking, enough self-awareness to notice when you've exceeded your current knowledge/experience, and sufficient humility to fix that lack rather than plod along with blinders on.

So you're not disagreeing with the value of 1st principles knowledge. Then you cite an even deeper kind of knowledge. Where exactly are we disagreeing here?


I'm curious to meet these devs that claim that even having a basic understanding of algorithms/data structures fundamentals is not worth their time. Something tells me that the work they do is more related to design/front end and as a consequence they've either never needed to use computer science fundamentals or have been able to get away with naive implementations due to working on trivial problems.

Usually I don't bother engaging in arguments with them, because I know the standard algorithms interviews will weed them out so that I never have to deal with them. But I've started to notice a trend of more of these people finding their way into the "big 4" type companies and ending up on large scale/infrastructure projects, where they make very critical mistakes; see https://news.ycombinator.com/item?id=13700452


I think you're misquoting the article, and the "Dunning-Kruger" is uncalled for. Here's the full quote:

> If she would have started her email with "We're looking for an algorithm expert," we would never have gotten any further and would not have wasted our time. Clearly, I'm not an expert in algorithms. There is no point in giving me binary-tree-traversing questions; I don't know those answers and will never be interested in learning them.

We should avoid the notion that naive string concatenation is a problem that impedes our ability to deliver software; except for niche applications, it is not.


> We should avoid the notion that naive string concatenation is a problem that impedes our ability to deliver software, except for niche applications, this is not the case.

The attitude that really basic Computer Science concepts like algorithms and algorithmic complexity are irrelevant is exactly why software projects are so frequently FUBAR.

The Dunning-Kruger reference is spot on. The original author doesn't even know what they don't know, or why they should want to learn it. Being proud of ignorance of a cornerstone topic isn't something to be celebrated.


I would say in my experience most projects are messed up from a lack of separation of concerns, on many levels, followed by different teams maintaining the same code base, each with their own standards.

Being generally messy and not considering the overall design hurts way more than gaps in basic CS knowledge.

I have seen plenty of performance issues related to CS knowledge but a lot of times those are masked by messiness.


Can you provide an example of an open source code base which is messed up due to ones inability to understand algorithms?


I don't have an open source example; I tend to actively avoid such projects, and so I tend to accumulate lists of the ones that seem well engineered rather than the opposite.

The Quora app, though: If I'm writing a reasonably long answer and I delete a paragraph, it can take more than 10 seconds to complete. There's some profound inability to understand algorithms in there somewhere, I can guarantee it.

There was just a headline on Hacker News a few days ago about a "reject!" function, if I remember correctly, that ended up with an n^2 algorithm for several major Ruby releases because someone fixing a bug didn't understand the consequence of their change. I don't have the link, though.

EDIT to fix the name of the Ruby function and to add the link. [1] Thanks to user nighthawk454 who dug it up!

[1] https://news.ycombinator.com/item?id=13691303
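The Ruby details are in the link, but the general pattern is easy to reproduce in any language: deleting matching elements from an array one at a time is O(n) per deletion, so the whole pass goes quadratic, while a single filtering pass stays linear. A sketch of the shape of the bug (in Python, not the actual Ruby code):

```python
def reject_quadratic(items, pred):
    # Deletes in place, one element at a time; each del shifts the
    # tail, so rejecting k of n elements costs O(k * n).
    i = 0
    while i < len(items):
        if pred(items[i]):
            del items[i]
        else:
            i += 1
    return items

def reject_linear(items, pred):
    # One pass, one new list: O(n).
    return [x for x in items if not pred(x)]

assert reject_quadratic(list(range(10)), lambda x: x % 2 == 0) == \
       reject_linear(list(range(10)), lambda x: x % 2 == 0)
```

Both are "correct," which is exactly why the regression shipped: only someone thinking about the cost of each deletion would catch it in review.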


Given that the Quora interview process is reputably difficult [0], and not lacking in algorithms questions [1][2], this is quite a puzzling situation.

[0] https://www.quora.com/Which-companies-have-really-hard-algor...

[1] https://www.quora.com/challenges

[2] http://www.businessinsider.com/heres-the-test-you-have-to-pa...


I work at a "big 4" company that's also known for having difficult algorithms questions and I can confirm that sub-par developers still pass through. After talking to a lot of my friends in the industry, my theory is that most devs these days are basically just using sites like leetcode to train and memorize implementations of algorithms which end up being the same questions used by interviewers that use leetcode as a question bank.

I'm not against this type of training, I'm very familiar with and have competed in ACM ICPC, TopCoder, etc, but what I've noticed is that there seems to be a divide in the type of devs that are merely brute forcing and solving as many questions as possible in order to create their own solutions bank vs the ones who take a more structured approach of learning problem solving paradigms and algorithms/data structures in a way that enforces why they're used in the situations they are and how those can be adapted and composed to solve larger problems.

If you want horror stories... I've worked with a dev that didn't understand the concept of passing by value vs by reference. This was a person being paid 6 figures and has been writing code for cloud infrastructure with a fundamental misunderstanding of the underlying memory model. I've also had devs that didn't even know that an abstract data type like a map could be implemented with either a hash table or self balancing search tree and therefore no idea of when you should/can/can't use one or the other. This type of thing wouldn't really bother me if I was talking to a front end dev, since I imagine most of what they do is solve design problems, but it's very worrying when these are people working on the lowest layer of a public cloud infrastructure...
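The map example matters because the choice of backing structure changes what operations are cheap. A quick sketch of the trade-off (using a sorted key list with bisect as a stand-in for a self-balancing search tree; a real tree would keep the keys sorted under insertion too):

```python
import bisect

# Hash-backed map: O(1) expected lookup, but no useful key ordering.
prices = {"banana": 1, "apple": 2, "cherry": 5}
assert prices["apple"] == 2

# Tree-like map: O(log n) lookup, but ordered iteration and range
# queries come for free -- something a plain hash table can't do.
keys = sorted(prices)  # ['apple', 'banana', 'cherry']

def range_query(keys, lo, hi):
    # All keys in [lo, hi), via two binary searches.
    return keys[bisect.bisect_left(keys, lo):bisect.bisect_left(keys, hi)]

assert range_query(keys, "a", "c") == ["apple", "banana"]
```

A dev who doesn't know both implementations exist can't make that call, which is the worrying part.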


See, your case is exactly when I would expect to be quizzed for algorithmic complexity and the like. And I would most likely fail! I understand pass by reference / pass by value but I would have to study hash tables (which I'm familiar with) and self balancing search trees (which I have heard of, but have not, to my knowledge, encountered).


Interesting.

I wonder what the history of their mobile app is, though? It feels like it's a hybrid app, and it feels like it's developed iOS-first, Android-as-afterthought, so it may be the ugly stepchild of the team that gets no attention. "Does it build on Android? Ship it!"

Also, if it is hybrid, it may be a JavaScript team -- and JavaScript still has that "top developers love to hate it" aura [1], so it may be that some of the least talented Quora developers populate the team? (Apologies to the Quora app team, but ... 10 seconds to delete a paragraph?)

The worst of the problems seem to be when I'm also using the SwiftKey keyboard, so they may not see them internally. Somewhere along the way they're doing something pathological, though, because SwiftKey works like lightning in other apps, and only in the Quora app can get bogged down and end up so slow that I have to disable it and use the standard keyboard just to type anything at all. There's also a bug where I sometimes can't hit enter after a link I've pasted, and again I have to switch to another keyboard.

SwiftKey may be reacting to notifications that the text has changed, for instance, and Quora is sending 200 change notifications, one for each character changed, and then SwiftKey is querying the entire source document again each time... Or some such. Don't know.

It's not just SwiftKey related issues, though. Whatever they were using for rich text editing has been broken in different creative ways every few months, where edits wouldn't take, or what you're actually seeing would vary from what gets posted, or you can't edit the link text, or you can't GET OUT of the link text, so all the new text you type ends up part of the link, or ... It's been broken in so many ways that I've lost count. A recent bug I encountered was when I was pulling the text down to get to the top of what I'd written, I scrolled it down once too often, and that triggered a "refresh" ("drag down from the top" to refresh) which lost all of my text. Sigh.

The Quora app is really in need of a solid team experienced in app development, robust data handling, and proper UI behaviors. If it has competitive programmers working on it, they're solving the wrong problems.

[1] I used to be guilty of this, but JavaScript got better, linters now help prevent the worst legacy JavaScript issues, I changed my opinion in large part, and I use TypeScript now anyway. :)



Yes, this link! Thanks!


> The attitude that really basic Computer Science concepts like algorithms and algorithmic complexity are irrelevant is exactly why software projects are so frequently FUBAR.

You must mean "some software projects" and not "frequently." And who's proud of ignorance? Ignorance of what? And, what's more, why should it be considered a negative if someone isn't concerned about, or is even proud of, ignorance of certain things?


Other people have cited numbers, but in my opinion, most software projects (greater than 75%) end up:

* Over budget in dollars or time or both

* Fragile

* Insecure (sometimes profoundly so)

* Having major UI issues

* Missing major obvious features

* Hard to extend

* Full of bugs, both subtle and obvious

Pick any five of the above at least. Keep in mind that most software projects are internal to large companies, or are the software behind APIs, or are otherwise niche products, though there are certainly plenty of mobile apps that hit 4-7 of those points.

Who is proud of ignorance? Original author:

> There is no point in giving me binary-tree-traversing questions; I don't know those answers and will never be interested in learning them.

He proudly states that he will never be interested in learning these topics. These topics that are profoundly fundamental to computer science.

Why should it be considered harmful? Because anyone who is writing software should always be learning, and should never dismiss, out of hand, interest in learning core computer science concepts. They should be actively seeking such knowledge if they don't have it already. It's totally Dunning-Kruger to think that you don't need to know these things. His crack about learning "object oriented design" instead made me laugh: As if knowing OOD means that you don't need to know algorithms. To the contrary, if you don't understand the fundamentals, you can create an OOD architecture that can sink a project.

It's like the people who brag of being bad at math -- only worse, because this is the equivalent of mathematicians being proud of their lack of algebra knowledge.


So if the developers knew how to traverse a binary tree, how many of the 75% would succeed?

There is something I've mutated to my own liking called the 80/20 rule. In software, you will spend 20% of your time getting the application to 80% of its maximum potential performance. You can then spend the remaining 80% of your time gaining the extra 20%. If that is cost effective for your company, then by all means do it. If it isn't, 80% is just fine and you've cut your development costs by 4/5ths.

For me, being able to "rote" an algorithm on some whiteboard falls into the last 20%, maybe. Collection.Sort, whatever it uses, is good enough. Hell, you get about 78% of that 80% just by using indexes properly on your RDBMS.


> So if the developers knew to traverse a binary tree, how many of the 75% succeed?

I would say it's necessary but not sufficient. There is no perfect interview strategy. But for projects that are entirely developed by people who can't traverse a binary tree, I'd say the odds of failure are very, very high.

Sure, a strong developer can hit that 80% of performance quickly (in probably less than 20% of the time), but a weak developer who doesn't know how to optimize probably won't even make 5% of "maximum potential performance."

I had to work with a tool once that had a "process" function that would take 2-3 minutes to do its work. It was used by level designers, and it was seriously impeding their workflow. The tool was developed by a game developer with something like 10 years of professional development experience (he was my manager at the time), and he thought it was as fast as it could go -- that he'd reached that maximum potential performance, with maybe a few percentage points here or there, but not worth the effort to improve. He didn't want to spend another 80% of his time trying to optimize it for a minor improvement either, so he left it alone.

I looked at what it was doing, figured out how it was inefficient, and in less than an hour rewrote a couple of key parts, adding a new algorithm to speed things up. Bang, it went from minutes to 200ms. No, I'm not exaggerating. Yes, I had a CS background, and no, the experienced developer didn't.

If you end up with accidental n^3 algorithms baked into your architecture, 5% performance in development can be 0.1% performance in the real world, or worse as your N gets large enough. And yes, that's even if you index your data correctly in your database.
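A contrived sketch of how such a thing sneaks in without anyone writing an obviously dumb loop (hypothetical names; nest one more of these and there's your n^3):

```python
def common_users_slow(purchases, visitors):
    # Looks innocent, but `user in visitors` is a linear scan when
    # visitors is a list, so the whole loop is O(n*m).
    return [user for user in purchases if user in visitors]

def common_users_fast(purchases, visitors):
    # One-line fix: hashing makes each membership test O(1) on average.
    visitor_set = set(visitors)
    return [user for user in purchases if user in visitor_set]

purchases = [f"user{i}" for i in range(0, 1000, 2)]   # every 2nd user
visitors = [f"user{i}" for i in range(0, 1000, 3)]    # every 3rd user

# Same answer, wildly different scaling behavior.
assert common_users_slow(purchases, visitors) == common_users_fast(purchases, visitors)
print(len(common_users_fast(purchases, visitors)))  # 167
```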

And that's when your site falls down the moment you have any load, and you end up trying to patch things without understanding what you're doing wrong. In my example above I improved the speed by nearly a factor of a thousand. In an hour. That can easily mean the difference between holding up under load and falling over entirely, or the difference between being profitable (running 4 servers that can handle all of your traffic) and hemorrhaging money (running 4000 servers and the infrastructure to scale dynamically).

Which is why you need a strong developer to run the project to begin with. Maybe you're that strong developer; I'm not really speaking to you in particular. But I know a lot of developers who just don't have the right background to be put in charge of any projects and expect that they'll succeed.


You must mean "some software projects" and not "frequently."

In the 2000s and prior, it was common knowledge that the majority of software projects failed. Even if they succeeded on paper and shipped, they weren't actually used. By some estimates, it was something like 75% of software projects.

If you look at the contents of "ecosystems" like Steam and the various app stores, you'll see much the same. Most of the software out there is a de facto failure, and much of it is due to the ignorance of the programmers resulting in substandard programs.


Do you have a citation supporting the claim that those failures were due to poor performance and not far more significant problems like failing to correctly model the actual business problem or handle changes? That era was dominated by waterfall development which is notoriously prone to failure due to the slow feedback loop.

This is highly relevant because one not uncommon problem with highly-tuned algorithms is the greater cost of writing that faster code and then having to change it when you realize the design needs to be different, especially if pride + sunk cost leads to trying to duct tape the desired feature on top of the wrong foundation for a while until it's obvious that the entire design is flawed. That failure mode is especially common in large waterfall projects where nobody wants to deal with the overhead of another round.


Very large numbers of shovelware apps in the iPhone app store had problems with crashing.

This is highly relevant because one not uncommon problem with highly-tuned algorithms is the greater cost of writing that faster code and then having to change it when you realize the design needs to be different

Can you give me a specific example of this? I'd say, give me a specific example, and most likely, I'll give you a reason why the software architecture of that example is stupid.


> Do you have a citation supporting the claim that those failures were due to poor performance and not far more significant problems like failing to correctly model the actual business problem or handle changes?

"Poor performance" isn't the only negative result of using an insufficiently experienced developer, or a developer who doesn't have a full grounding in algorithms and data structures.

Someone without a full CS background (and without the ability to remember much of that background) is likely to know a few patterns and apply them all like a hammer to a screw. This leads to profoundly terrible designs, not only when you take performance into account, but finding the best model for the actual business problem, handling changes, permuting data in necessary ways, and other issues.

> That era was dominated by waterfall development which is notoriously prone to failure due to the slow feedback loop.

Extreme programming was the first formalized "agile" approach, and the very first agile project to utilize it was a failure. [1] A big problem was performance, in fact:

> "The plan was to roll out the system to different payroll 'populations' in stages, but C3 never managed to make another release despite two more years' development. The C3 system only paid 10,000 people. Performance was something of a problem; during development it looked like it would take 1000 hours to run the payroll, but profiling activities reduced this to around 40 hours; another month's effort reduced this to 18 hours and by the time the system was launched the figure was 12 hours. During the first year of production the performance was improved to 9 hours."

Nine hours to run payroll for 10,000 people. We're not talking about computers in the '70s with magnetic tapes. This was 1999 on minicomputers and/or mainframes. If that wasn't a key algorithmic and/or architectural problem, then I would be amazed.

When you're designing a system using agile, it often ends up with an ad hoc architecture. Anything complicated really needs BOTH agile and waterfall approaches to succeed. You need to have a good sense of the architecture and data flow to begin with, and you need to be able to change individual approaches or even the architecture in an agile manner as you come across new requirements that you didn't know up front.

> This is highly relevant because one not uncommon problem with highly-tuned algorithms is the greater cost of writing that faster code and then having to change it when you realize the design needs to be different

I'm going to say [citation needed] for this claim.

I gave an example in another thread of having improved the speed of a game development level compiler tool by a factor of about 1000, with no major architectural changes, and it took me about an hour.

At the same time I performed a minor refactor that made the code easier to read.

Bad design == Bad design. That's it. Good design can include an optimized algorithm. A good design tends to be easy to extend or modify.

Very rarely it makes sense to highly hand-tune an inner loop. The core LuaJIT interpreter is written in hand-tuned assembly for x86, x64, ARM, and maybe other targets. It's about 3x faster than the C interpreter in PUC Lua, without any JIT acceleration. That is a place where it makes sense to hand-tune.

> pride + [sunk] cost[s] leads to trying to duct tape the desired feature on top of the wrong foundation

That's another orthogonal error. It makes me sad sometimes to throw away code that's no longer useful, but I'll ditch a thousand lines of code if it makes sense in a project.

Every line of code I delete is a line of code I no longer need to maintain. I'm confident enough in writing new code that I don't feel any worry about deleting old code and writing new. This is as it should be. Sad for the wasted effort, yes. But I know that the new code will be better.

[1] https://en.wikipedia.org/wiki/Chrysler_Comprehensive_Compens...


The C3 system only paid 10,000 people. Performance was something of a problem; during development it looked like it would take 1000 hours to run the payroll, but profiling activities reduced this to around 40 hours; another month's effort reduced this to 18 hours and by the time the system was launched the figure was 12 hours. During the first year of production the performance was improved to 9 hours.

And I happen to know for a fact that naive string concatenation was a big part of the performance problem! (Bringing it back to my original comment.)


You really think that the majority of failures on Steam and app stores are due to lack of basic algorithm performance knowledge?

A project can die a thousand deaths before performance becomes its death knell.


The large number of iPhone apps that crashed constantly before ARC made memory management easier is a good example of bad programming killing a project.


At every single company I've worked at for the last 15 years, I haven't had any algorithmic problems to solve; most programming jobs are just making software as quickly as possible and fixing performance issues only when they appear.

I really don't think that every programmer at Amazon or Google faces algorithm problems every day; that's probably localized to the projects that deal with them, not the majority of the programmer population.

Expecting every programmer to know how to solve a given problem on a whiteboard is like expecting every life gourd to know how to do a water escape (because it involves swimming AFTER you solve the problem of tied hands).


I agree. Most companies SAY they want algorithm experts, API ninjas, and computer science wizards, because it sounds like the right kind of thing to say. But most of the actual work out there is either 1. plumbing some data from one layer in the stack to the other (typical CRUD app) or 2. writing glue code to integrate one vendor's middleware to another vendor's middleware, or 3. fixing bugs in some 10 year old legacy application, or if you're lucky, 4. getting an application to work barely well enough in order to ship it on time. An in-depth knowledge of traversing binary trees is not required for any of this stuff.


But most of the actual work out there is either 1. plumbing some data from one layer in the stack to the other (typical CRUD app) or 2. writing glue code to integrate one vendor's middleware to another vendor's middleware, or 3. fixing bugs in some 10 year old legacy application, or if you're lucky, 4. getting an application to work barely well enough in order to ship it on time.

And most of a lifeguard's job is sitting around blowing a whistle at troublemakers. Does that mean you'd be fine with a lifeguard who only has those two actions in his skillset? Also, as someone who has held a number of jobs that involve your 1 through 4 above: knowing basic algorithms is beneficial to the job. You don't need that knowledge often, but when you do, it has outsized benefits. Also, you wouldn't know that, unless you had that knowledge while doing that job. If you didn't have that knowledge, you wouldn't know how it would have benefited you.

The fact that "in-depth knowledge" is applied to "traversing binary trees" makes me sigh and feel uneasy.


I'm not sure the lifeguard analogy is apt. There are plenty of programming jobs where algorithm knowledge is helpful but not strictly necessary. You and I might not find ourselves in them but they exist.

It's more about testing the range of skills that will actually be applied. To toss in another incomplete analogy: If you're interviewing someone to build you a deck in your backyard don't spend 90% of the interview time asking him how he'd chop down trees for wood and manufacture the nails. You probably also don't need to ask him his opinion on the local building height restrictions or if he knows how to install an attic fan. You want to know that he can build a deck.


I'm not sure the lifeguard analogy is apt. There are plenty of programming jobs where algorithm knowledge is helpful but not strictly necessary.

I think a lot of the above analogy mismatch and "not strictly necessary"-ness is because an application slowdown is much less severe than a drowning death. But if you account for that difference, it is very apt.

If you're interviewing someone to build you a deck in your backyard don't spend 90% of the interview time asking him how he'd chop down trees for wood and manufacture the nails. You probably also don't need to ask him his opinion on the local building height restrictions or if he knows how to install an attic fan. You want to know that he can build a deck.

But "how he'd chop down trees for wood and manufacture the nails" isn't apt at all! As a programmer, understanding algorithms is much more like 1st principles knowledge behind understanding building materials. Your contractor needs to know the structural properties of different kinds of wood, the properties of the soil in your backyard, and some basic chemistry/engineering/physics. Otherwise, she might put the wrong materials together and invite galvanic corrosion, or use the wrong materials in the wrong context and invite premature wood rot. You'd want a carpenter to understand wood harvesting, milling, and treatment in how it affects the properties of wood. Could you get by most of the time, just knowing how to saw boards and drive nails with power equipment? Sure. But someone with the right knowledge is going to avoid the unlikely but very costly pitfall and save her customer thousands of dollars otherwise.

You want to know that he can build a deck.

On time and under budget. She's more likely to do that if she's armed beforehand with certain experience and 1st principles knowledge. All things being equal, the contractors who built decks on time and under budget had good trades instruction or otherwise had the right experience and/or 1st principles knowledge. Also, if you look at the longevity and long term structural integrity of deck projects, you will likely find such a correlation.

(Also, it would be a really good idea if your contractor had a knowledge of local building regulations concerning the deck.)


I haven't had any algorithmic problem to solve, most programmer jobs is just making software as quickly as possible and fix performance issues only when they appear.

I sense a contradiction between the 1st clause and the last clause.

Expecting every programmer to know how to solve given problem on a whiteboard is like expecting every life gourd to know how to do a water escape

Expecting every programmer to know the basics of recursion and about 10 basic algorithms, and the reasons behind their time/space complexity analyses, is like expecting every life guard to know what a drowning person looks like (nothing like what they show in movies and TV shows), to be able to tread water without using their arms, and to know how to tow a panicked, struggling swimmer without getting kicked in the stomach.

A programmer who can't explain the three bits of knowledge in my comment above is like a lifeguard who simply knows how to swim and doesn't know what a drowning person looks like. Over 90% of the time, all you have to do is sit in the tower chair and blow your whistle at troublemakers, but in the occasional instance, you will have to know something special, with a higher cost than usual if you don't know.


I like the analogy of jet fighter pilots vs. airline pilots. Most programmers are metaphorically airline pilots, doing a routine job that is more about following the processes than elite skills, but some people wish that they were doing something more complex than they are.


And yet, there are still occasions where the airline pilot needs to draw on deeper skills. We know this from many deaths averted by good pilots, and needless deaths resulting from poorly trained pilots.

The failed landing of Asiana Airlines at SF and the Air France Airbus that dropped into the ocean are two good examples.

Most programmers are metaphorically airline pilots, doing a routine job that is more about following the processes than elite skills, but some people wish that they were doing something more complex than they are.

Programmers who think they can only get by with the "routine" skill level are like airline pilots who think it's just fine for them to do the checklists, run the autopilot, and know nothing else. Who would you rather have piloting your plane?


This gourd could solve it: http://veggietales.wikia.com/wiki/Mr._Lunt


> Nearly every programmer nowadays knows that naive string concatenation is inefficient, and so they should use a stream or something like that.

You and I live in different worlds, or have vastly different definitions of 'programmer.'


> I'd rather hire someone who knows exactly why it's O(n^2)

"How do we satisfy unlimited wants with limited resources?" - economics to the rescue again.

Everyone would love to hire Linus Torvalds for that matter, but there are plenty of programming jobs where you don't really need to, or don't want to, or can't spend that kind of money. There are plenty of useful, productive positions for people who aren't up on their O notation. The important thing is to be clear what kind of person you're looking for.


Everyone would love to hire Linus Torvalds for that matter

Torvalds is very lucky to have found his niche because he is unemployable - the first sweary rant at a teammate and he'd be shown the door. Well except maybe at Uber.


I suspect you've got the cause and effect backwards. More likely, because he found that niche where he does not have to learn to moderate, he does not.


That's like the old advice that you should always watch how your date treats the waiter...


I would not hire Linus Torvalds. For all his technical capability (which I've followed as a linux and git user for some 25 years now) many of his externally visible statements are not acceptable for any modern workplace (this has been discussed a number of times, I know people have differing opinions, but if Torvalds worked at a modern corporation with a real HR department, his words would get him in trouble.)


I strongly suspect that Linus Torvalds does not care at all about "real HR departments."


> Nearly every programmer nowadays knows that naive string concatenation is inefficient

Now that's what I call naive! Or maybe hopeful. Just saying you're giving an awful lot of credit here...


Just saying you're giving an awful lot of credit here...

I did spend a while as an expensive "consultant" where I was often hailed as a genius just because I applied that tidbit of knowledge and sped up the application by some large fraction. However, I've also noted that there are some mainstream programming communities where people know this about string concatenation and spread this knowledge around.


There are vast amounts of basic information in the world that provides opportunities. In turn, knowing that information provides the opportunity to learn new information that is now "basic" as a result of learning the prior information.

Believing any one concept is so basic that everyone should know it seems unlikely to pay off: if everyone knows it, the odds of finding someone who knows it are high, and the odds of gaining value from knowing it are small.


It is a basic question about one very narrow skill set that is relevant for some, but not all, programming jobs. It is trivial only for people who had to slog through only-sort-of-relevant CS degrees, who may or may not be the most productive programmers. I could just as easily say "we should be asking every programmer to handle a six-box grid layout" on a white board. It is just as trivial, it is relevant to the same percentage of programmers, but if that were the intro criteria I would probably have a problem hiring people to do the algorithms work.


It is a basic question about one very narrow skill set that is relevant for some, but not all, programming jobs.

I could just as easily say "we should be asking every programmer to handle a six-box grid layout" on a white board.

It's a skill set that's relevant to almost all programming jobs. Quick: give me an example of how time/space complexity can be relevant to layout manager code. (Can't do it? Dunning-Kruger just reared its ugly head again.)

You don't necessarily need to write your own layout manager. But to properly shop for one for some demanding use cases, you may well benefit from just cocktail-party level first principles knowledge, so you can pick the right library. Or do you just have faith that Apple or [Big Company] knows what it's doing, and assume that everything can handle anything you throw at it? That works over 90% of the time. It's the exceptional case where you need the 1st principles, and such cases usually come with outsized penalties if you don't know.


> Can't do it? Dunning-Kruger just reared its ugly head again.

@stcredzero, I've seen your name pop up time and again on this thread. I don't think you realize how rude you're being.

You clearly think that Data Structures 101 is important knowledge, to the point of saying that most software project failures are caused by lack of that knowledge. Others in this thread take a more moderate position.

Can you accept that this is a difference of opinion, not a sign of incompetence, and stop being so rude about it?


I don't think you realize how rude you're being.

I am being what I need to be. I find it remarkable that some people are receiving this information and prioritizing protecting their ego, rather than investigating what motivates such sentiments. Perhaps that difference should be exploited in interviews?

You clearly think that Data Structures 101 is important knowledge, to the point of saying that most software project failures are caused by lack of that knowledge. Others in this thread take a more moderate position.

Let me clarify: I think that Data Structures 101 is important knowledge, to the point of saying that a significant number of software project failures are caused by lack of that knowledge.

Others in this thread take a more moderate position.

They either don't have the knowledge and don't appreciate what they don't know. Classic Dunning-Kruger. Or, they have the knowledge but haven't been in the right kind of projects where someone not knowing really bit them hard. ("It never happened to me, so it must not be that big a deal.")

Find me someone who has the knowledge and who has been "bit hard." You can find them in these threads. Read what they say carefully. It's much more than a matter of opinionated debate.


> why adding to the end of an array that doubles when it expands is O(n) amortized

Not to be overly nit-picky here, and technically O(n) is correct as well (it's also the un-amortized worst case of insertion), but you probably meant to say that the amortized time is O(1), or θ(1) to be even more precise :)


to be even more precise :)

Or, you should demonstrate the self awareness to realize that you're just playing with the ambiguity of English in a hastily written comment. Exercise: in what precise context would time/space complexity be O(n), and in what precise context would it be O(1)? (Though the emoticon is probably an indicator that you already knew all this.)


I'm well aware how to analyze a dynamically growing array (each insert "pays" for two moves in the future), but I'm not sure what you're getting at. The amortized complexity of appending an element to the end is θ(1), worst-case non-amortized θ(n) when the growing needs to happen.
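The whole exchange is settled by a ten-line copy-counting simulation: across n appends the total element copies stay under 2n, which is exactly the content of "θ(1) amortized, θ(n) worst case per append". A sketch that counts copies rather than timing anything:

```python
def append_with_doubling(n):
    """Simulate n appends to a doubling array; return total element copies."""
    capacity, size, copies = 1, 0, 0
    for _ in range(n):
        if size == capacity:      # buffer full: allocate double, copy everything
            copies += size
            capacity *= 2
        size += 1
    return copies

for n in (10, 1000, 1_000_000):
    total = append_with_doubling(n)
    print(n, total, total / n)   # copies-per-append never reaches 2
```

The copies form the geometric series 1 + 2 + 4 + ... < 2n, hence constant amortized cost per append even though individual appends are occasionally O(n).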


Then do that N times. Then re-read the text and think of the context being discussed.


Actually, there are lots of skills and competencies that are valuable on engineering teams other than the ability to have all basic algorithms on call in your head at all times. If you filter the engineers you are willing to hire based on any one factor (unless you are looking for that specific specialty), then you are missing out on a variety of perspectives and experience that could make a better team. You don't need everyone on your team to be an algorithm expert. If you don't recognize that there are a huge variety of other skills necessary to successful software projects, then you are the one experiencing the Dunning-Kruger effect, not the op.


Point of clarification: I don't want everyone to be an algorithm expert. I do want every programmer to have the basic background knowledge. If you are passingly familiar with binary trees and recursion, then you can figure out the answer from First Principles. If you are not familiar with the skills at that level, and don't have the basic skills you're supposed to get by studying such things, then perhaps one would characterize that as merely memorizing.

Actually, there are lots of skills and competencies that are valuable on engineering teams other than the ability to have all basic algorithms on call in your head at all times.

Where did I ever say such a preposterous thing? Please provide a quote.

And again, if you understand algorithms at a basic level from First Principles, then you don't have to memorize every algorithm like baseball card trivia. The important thing is having the background knowledge. If you don't think such knowledge is foundational background, then you probably don't know enough to know that. Hence: Dunning-Kruger.


Optimal algorithms are not the only important thing a software developer should know. It depends entirely on the type of job. In most cases, experience with "design patterns" matters most.


I'd rather hire someone who can actually get the job done in a timely manner. We can optimize it later, especially after profiling to know exactly what we need to optimize, and especially if there are security implications that mean we can't ruthlessly optimize everything. People who optimize too soon end up with solutions that are hard to change later, or solutions that don't even do the job they're supposed to. It's all too easy to get a half-working version that's super fast by focusing on the speed part more than the working part. https://alibgoogle.wordpress.com/2011/02/18/scheme-vs-c/ is worth a read. "Real efficiency comes from elegant solutions, not optimized programs. Optimization is always just a few correctness-preserving transformations away."

There are some "free" optimizations I'd expect a lot of seasoned programmers to be familiar with, like as you mention the string concatenation issue, but those are easy to catch during code review. Similarly for certain optimizations that are even 'freer' in the sense that modern compilers will take the easier-to-read (if naively slower) version and produce the fast version, I expect to not have to tell people during code review that the 'slower' version isn't slow because the compiler does the right thing.


Often enough, implementing the optimal solution is not really more time-consuming than the brute force solution. Often it's not about micro-optimizations, but the difference between O(n^3) and O(n log n).


I guess I think of programming in a big company as a team exercise, so the hiring pipeline can take advantage of specialization and good-enough economics, as well as teaching people more on the job. It's a large burden to demand that everyone on the team know the same first principles (and of course there may be disagreement on what those are).

Even if I personally want to know as much as I can, and get along with others who do too, I'm not really going to hold it against a team member if they produce a poor-complexity algorithm that in review I (or someone else; I think it's good to have at least one person on the team with algorithm knowledge) notice and can suggest an easy alternative for. If they do it again (for the exact same thing) I might raise an eyebrow and remind them of the repeat mistake. If they do it a third time I might write them off as incapable of learning new things.

But this pattern applies to other aspects of software too. E.g., getting people to write testable code can itself be a hassle if they haven't been exposed to it before. Or something as dead simple as getting people to follow the rest of the team's coding style guidelines: I've seen interns who still had problems with that near the end of a summer internship, and I wouldn't be very happy if it were a senior colleague...


The problem is when the goal is to get something out the door as soon as possible. In such cases developers tend to settle for the first thing that comes to mind, which is rarely an elegant solution.

These require experience and time spent carefully designing the system. A profiler shows you what is slow in your current implementation, not how to convert it to a more elegant one.


I think you got it right. The fundamental issue seems to be that the recruiter flattered the candidate and made him/her think that they had genuinely looked through their profile and determined that they would be a good candidate. Most recruiters will not do this, simply because their priorities are not aligned that way. I think the OP revealed a very important mechanism: recruiters' job is just to get bodies in the system, which needs more people continuously for replacement and expansion.


Whenever I hear a programmer talk badly about algorithms it leaves a poor impression with me. It means they have not taken the time to do basic research about their field and shows their aversion to researching new topics.

If you look at the current AI, machine learning, and deep learning fields it's an absolute must that you have some experience with algorithms.


I have done professional programming for decades. I have never written code to traverse a binary tree. I almost certainly never will.

If I ever have to write it, it's a pretty simple thing to look up. This piece of algorithm trivia has no relevance for software engineering jobs in 2017.

There is of course one reason to learn it: It keeps coming up in interviews.


Could you explain what you mean by "stream"?


>Nearly every programmer nowadays knows that naive string concatenation is inefficient, and so they should use a stream or something like that. I'd rather hire someone who knows exactly why it's O(n^2)

I'd love to hire a programmer that knows this is actually O(n^3), and not O(n^2).

(Well, it is if you are naively concatenating N strings, each of length N characters).

(Had a manager who insisted it was exponential. Wisely did not argue with him).

(I await attempts to recruit me based on this comment).


I'd love to hire a programmer that knows this is actually O(n^3), and not O(n^2).

Hint: Don't ever work for or hire anyone who would exploit the ambiguity of the English language in a hastily written comment to gain the false appearance of geeky superiority. (Exercise: how would you interpret the situation referred to in the comment to wind up with O(n^2) and how you interpret it to wind up with O(n^3)?)


While I wrote it without much seriousness, I think my comment does highlight a problem with the whole discussion here.

People like to boast about how important it is to know these kinds of complexities, but in reality they just use mental heuristics to come up with them (which is how we got n^2 and exp(n) from my manager).

For string concatenation, is it important they know how to do it in a linear fashion? Or is it important that they can actually calculate the complexity? Or do you want an in-between where it is OK that they do not really know the complexity, but can tell you that it is definitely super-linear?

I've seen people be fussy that they should be able to explain why it is n^3 (or n^2 or whatever), in a somewhat rigorous fashion. I say "somewhat" because when I then turn to them and ask them to derive the exact expression (not just in big-Oh notation), they will usually fail. Then they will jump through hoops trying to explain why it is important to know it in big-Oh notation but not all that important to be able to derive the exact formula.

(There is, of course, a camp that does not believe much in big-Oh and actually prefers to know the leading constant, etc. My example is not contrived.)

I find people will nitpick to their level of granularity, and it is always easy to come up with some justification for their level of preference. But to others, it is just nitpicking.

For me, it is important to know that this is super-linear, whereas one can do it linearly. I don't care if the candidate knows it is n^3 or n^2. That is also why I did not point out my manager's error at saying it is exp(n).


The reason I won't work at Google is because Google is incapable of hiring the engineers I want to work with. It's not a matter of whether I could pass that interview; it is whether I want to work with the code of people who can pass that interview. Whether I want to get code reviews from people who can pass that interview. Whether I want to rely on the code of people who can pass the interview not to break down in interesting and novel ways.

I have seen the worst apps written by "Very Smart People" who obviously had never built an Android application before. It doesn't matter how smart you are, the first time you do something it will suck. I have had catastrophic failures caused by premature optimization, because locking a tool into a fancy algorithm before you know where the actual bottleneck is is a recipe for disaster. I have seen so many problems caused because people couldn't take feedback or didn't ask for help, because they were so wrapped up in being The Kind Of Person Who Knows The Answers.

Frankly, passing algorithm questions is a great way to signal that I probably don't want to have to deal with your code.

Personally, I love working with people coming out of the good agencies because building 15 or 30 applications from scratch in an environment with strong mentorship and rapid feedback is, in my experience, more likely to produce a good programmer than all the smarts in the world.


> The reason I won't work at Google is because Google is incapable of hiring the engineers I want to work with

That's a very bold (even arrogant, sorry) statement about 57,000+ (googled it) employees.

So 57,000 are worse devs than you? This is what I hear from your statement; correct me if I am wrong.

Many extremely talented people want to work for Google. They will take the tests, even if they don't agree with the hiring system at all.

Maybe they want to work at Google despite its incapability (according to you) in the hiring and testing process.

> it is whether I want to work with the code of people who can pass that interview

Have you found any correlation between poor code and being able to pass Google's interview?

(edit: new lines)


I don't think he/she means that they are worse. We all want to work with developers who can help us learn and get better. From what I understand, he/she just means that their hiring process has no correlation with the quality of code people produce. Their hiring process is, by design, incapable of selecting for the qualities he/she is looking for in coworkers, though sometimes they do end up hiring capable engineers, with no credit to their process.


57,000 employees does not equal 57,000 engineers, by the way. Google has a lot of people that do things other than code.

> So 57,000 are worse devs than you? This is what I hear from your statement; correct me if I am wrong.

The idea that you can be objectively better or not better at programming, and that it's decidable through one or two simple factors is kinda wrong and goes with OP's point.


Since Google is an advertising company that even hires their own chefs, I would not expect 100% of their employees to be software developers.

It's more on the order of 20k developers, not 57k.

https://www.quora.com/How-many-software-engineers-does-Googl...

>Have you found any correlation between poor code and being able to pass Google's interview?

cough cough Angular 2


> cough cough Angular 2

There is an enormous difference between poor code and a library you don't like. I may not be a huge fan of Angular 2, but it's certainly not poorly written code.


Back in the day I worked for a company that had, in a single division, more engineers than Google has employees :-)


Who? IBM?


BT's Systems Engineering division, effectively the heir to Tommy Flowers.


> So 57,000 are worse devs than you?

Probably not. There are plenty of workers better than me whom I wouldn't want to work with.


I never said that they were bad coders, or that I wouldn't want to work with some Google employees. I said they are incapable of hiring many developers I do proactively want to work with, because of their hiring process. It's who they exclude that is the problem; not who they hire.

I am also specifically not passing some objective value judgments here. They may very well be hiring the best developers for what they are doing: they just aren't capable of hiring the type of developers I most enjoy working with. This is a subjective preference, not an objective one.

There are absolutely talented developers who don't value working with the kinds of developers I want to work with and they won't care that Google can't hire them. I have enjoyed working with Google developers in other contexts, but many of them wasted a bunch of time and energy jumping through Google hoops because they decided it was worth it. It is my opinion that they would be better developers if they had spent that time on something useful instead.

Google makes it easy for people I don't care about working with to get hired, and hard for people I proactively want to work with to get hired. Nothing about that is about my skill at all.


>So 57,000 are worse devs than you? This is what I hear from your statement; correct me if I am wrong.

He never said they are worse.

>Many extremely talented people want to work for Google.

"Most people do/think XYZ."

Do I have to explain why that's not a good argument?

>They will take the tests, even if they don't agree with the hiring system at all.

Other people don't mind shitty treatment, so what?


> He never said they are worse

This is why I asked for clarification. The OP said 'Google is incapable of hiring the engineers I want to work with', which implies that the engineers he is looking to work with are NOT there. Google is incapable of hiring them, and since they are top-quality devs, someone else (more capable) did.

> Do I have to explain why that's not a good argument?

Yes, please explain, because you only took the first sentence of my argument. The second sentence was that those "many people" are going to do what it takes, even if it means taking 'stupid' tests, dealing with smug interviewers, and some other large-corp bureaucracy (which to me personally is more concerning, but is not specific to Google, rather to any large corp). And why? My guess is that it's probably so cool, curious, and challenging to work on something like Google Maps/Balloons/self-driving cars that they will be willing to make a lot of sacrifices and tradeoffs. THAT was my argument.

> Other people don't mind shitty treatment, so what?

Please explain; I didn't understand your point.

(edit: spacing)


>Yes, please explain, because you only took the first sentence of my argument. The second sentence was that those "many people" are going to do what it takes, even if it means taking 'stupid' tests, dealing with smug interviewers, and some other large-corp bureaucracy (which to me personally is more concerning, but is not specific to Google, rather to any large corp). And why? My guess is that it's probably so cool, curious, and challenging to work on something like Google Maps/Balloons/self-driving cars that they will be willing to make a lot of sacrifices and tradeoffs. THAT was my argument.

"Many people want to go to Heaven, and they are going to what it takes, even if it means answering stupid questions, pretending to feel bad about trivial things, and some other large org bureaucracy."

My guess is that it's probably so cool, curious, satisfying to be in Heaven that they will be willing to make a lot of sacrifices and trade-offs.

"Many people voted for Trump", my guess is that it's probably cool, curious, challenging to finally tackle real issues that they will be willing to make a lot of sacrifices and trade-offs.

That's stupid, right? But why?

>My guess is that it's probably so cool,curious, challenging to work on something like Google Maps/Balloons/Self driving cars that they will be willing to make a lot of sacrifices and tradeoffs.

Those people might believe that it's cool, curious, whatever. But then you should end up with "many people think X about Y", not "Y has a property X (because many people think it does)".

>This is why I asked for clarification. The OP said 'Google is incapable of hiring the engineers I want to work with', which implies that the engineers he is looking to work with are NOT there. Google is incapable of hiring them, and since they are top-quality devs, someone else (more capable) did.

He doesn't want to work with them because they are unpleasant to work with (or be around in general), not because there is an issue with their programming capabilities.


> and since they are top quality devs, someone else (more capable) did.

That's an interesting assumption to make. I know many programmers who are excellent at what they do and have also outright refused offers from Google, Amazon and Facebook; it's usually on the principle that such large, bureaucratic organizations are quite defective insofar as they are adept at suffocating those who are happiest with maximal flexibility and responsibility.

There's a group who jump from contract to contract, startup to startup, because the thrill of new challenges and little to no barriers from organizational structure is worth the uncertainty in compensation.

It also helps that a few aren't willing to relocate to the USA.


Just as a matter of correctness: only 1/3rd of Google employees are engineers/development guys.


How many Google Engineers do you know?

I know quite a few of them. All of them, without exception, are incredibly brilliant engineers. You might think this is just anecdotal evidence; that's correct. But consider also that Google engineers are responsible for many top-quality, world-class software projects. How could that be possible without really good engineers?

Yes, the interview process is tedious and can be frustrating. But denying that they have some of the best software engineers in the world is just silly.


Q: How many Google engineers does it take to screw in a lightbulb?

A: Build great content.

https://twitter.com/dr_pete/status/623981710003187712


I don't get it.


It's the standard Google answer to every question.


> It's not a matter of whether I could pass that interview

Rrright.

> Personally, I love working with people coming out of the good agencies

What does that mean? What agencies? What makes these people good to work with?

I'm seriously flabbergasted by most of the points you make.


> Frankly, passing algorithm questions is a great way to signal that I probably don't want to have to deal with your code.

Wow that's some serious sour grapes.


What is it with developers and their code-centric view of employment? Any idiot can "write code" in 2017. It's not particularly difficult. You might do it slightly better but still your profession is not by a long shot "difficult".

Can you work in a team? Can you communicate? Do you dress in a presentable manner? Can you explain technical ideas simply? Do you know anything beyond cutting code?

It's simply not good enough to be some lone-ranger ubercoder anymore. Coders are a dime a dozen.

I'd actually rather take someone with a degree in statistics or physics and let them learn to code on the job; I'd be able to apply them to an order of magnitude more problems than your typical mathematically illiterate self-styled genius who can write a bit of code.

Programming is the easy part, stop deluding yourself into thinking what you do is difficult. It isn't.


What are these "agencies" you speak of, and how do you tell the good ones from the bad?

(This is a serious, non-ironic question: I don't know what you mean, but it sounds interesting.)

Edit: I guess you mean recruitment agencies? How does one find the "good ones," other than through bitter experience and/or luck?


He means dev agencies that contract for projects. Agencies act as general contractors, augment your staffing, or provide consulting services. They build whole or partial products, often for stuff you use on a daily basis. For example, there's an agency that has relationships with many Y Combinator companies and builds most of their mobile or web apps.


Thanks.

Any idea how one finds the "good" agencies? Is it a networking thing?


You hire agencies the same way you hire people.


But Google's code quality is quite high...


I don't know about their code quality, but we consume some of their data via APIs and they periodically, and seemingly at random, change their non-nullable data types to nullable. Fun times hotfixing those in prod.


When data comes in over the wire, you probably shouldn't be making assumptions about what's nullable. You should write a parser that fails gracefully if it receives unexpected input.

It's the robustness principle.

https://en.wikipedia.org/wiki/Robustness_principle
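
As a sketch of what "fails gracefully" can look like here (in Python; the `count` field name is hypothetical): treat every field from the wire as possibly missing, null, or the wrong type, regardless of what the documented schema says.

```python
import json

# Defensive parsing sketch: never trust the documented schema of data
# that arrives over the wire. Returns None on any unexpected shape.
def parse_count(payload: str):
    try:
        data = json.loads(payload)
    except json.JSONDecodeError:
        return None                 # fail gracefully on garbage
    value = data.get("count")       # the key may be absent...
    if value is None:
        return None                 # ...or explicitly null
    try:
        return int(value)           # ...or silently become a string like "42"
    except (TypeError, ValueError):
        return None

print(parse_count('{"count": 42}'))    # 42
print(parse_count('{"count": null}'))  # None
print(parse_count('{"count": "7"}'))   # 7
```

The caller still has to decide what `None` means for the business logic, which is where the real disagreement in this thread lies.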


> When data comes in over the wire, you probably shouldn't be making assumptions about what's nullable.

I have to work with their APIs often (as well as Amazon, Facebook and a couple other big players, they're all equally awful).

Their documentation will tell you a type is an integer and will never be null. They will do this for 3 months, and then change it to send a string that can be nullable or empty string, without telling you or updating API versions. It goes beyond robustness. It's the "BigCorp will actively lie to you in their documentation" principle.


You're missing the point. Sure, you should make it fail gracefully on unexpected input. But Google shouldn't change APIs in a way that causes 3rd-party apps to fail at all.


Third party apps wouldn't fail if they were correct in the first place. Failing gracefully on unexpected input is the way a competent engineer would structure their program.


Is there a functional difference between "my production code suddenly stopped processing work because it rejected a newly nullable input value" vs "my production code suddenly stopped processing work because it crashed on a newly nullable input value"?


If the crash gives an attack vector for a security threat then there is.

Personally, I think "Be conservative in what you do, be liberal in what you accept from others" is a bad policy in the modern world. It should be "Be conservative in what you do, and be strict, as if you don't trust them, in what you accept from others".


That's a valid point if it's security-critical. My initial response was with regard to grandparent comments arguing over semantics, when the end result in both cases was "successful processing ceased because of an API change."


'Liberal in what you accept from others' in this case would mean being prepared to handle nullable JSON fields.


Ideally it's more like "my production code skipped over 1% of batch jobs because it didn't expect the null values" or something like that. Handling errors gracefully is a bit of an art unto itself.
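
A minimal sketch of that skip-and-record pattern (the `value` field and the doubling step are placeholders, not anyone's real pipeline): one malformed item shouldn't stop the whole run, but it should be recorded, not silently dropped.

```python
# Skip-and-record batch handling sketch: process what we can,
# keep the rejects for later inspection or alerting.
def process_batch(jobs):
    done, skipped = [], []
    for job in jobs:
        value = job.get("value")
        if value is None:
            skipped.append(job)   # record the malformed item
            continue
        done.append(value * 2)    # stand-in for the real work
    return done, skipped

done, skipped = process_batch([{"value": 1}, {"value": None}, {"value": 3}])
print(done)     # [2, 6]
print(skipped)  # [{'value': None}]
```

The `skipped` list is what feeds the watchdog or notification mentioned downthread, instead of relying on someone reading crash logs.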


Indeed it is an art.


Production code that rejects a suddenly-nullable input value is more deterministic than "something crashed" and can be handled better (triggering a watchdog or a notification) rather than relying on someone paying attention to the crash logs.


"Failing gracefully" is still failing. I don't want my app to fail, gracefully or otherwise.


Funny you point this out. Google's JSON Style guidelines[1] say this:

>If a property is optional or has an empty or null value, consider dropping the property from the JSON, unless there's a strong semantic reason for its existence.

[1]: https://google.github.io/styleguide/jsoncstyleguide.xml?show...


That's a different issue. The commenter is referring to keys that can take on null values.

This style guide entry suggests a course of action in the event that one of these keys currently has a null value.

The style guide isn't prohibiting nullable keys.


Yes, some of their APIs and tools are a bit shoddy, e.g. unable to parse an XML sitemap to spec, and that spec was only a 3-page RFC!


Have you ever worked with the Android SDK?


In my opinion, Android's code organization is a result of questionable architectural constraints applied early in the development of the platform.


Isn't that exactly the point roguecoder made?

> I have had catastrophic failures caused by premature optimization, because locking a tool into a fancy algorithm before you know where the actual bottleneck is is a recipe for disaster.

EDIT: Or, are you saying that it was from the decisions made before Google acquired Android?


I think that highly depends on what you value in code.


That's a bit biased.


I certainly wouldn't want to work with you!


amen!
