You can be a nobody without connections or degrees, and if you can prove you have skills during an interview process, you may be hired.
Contrast this with other hiring processes which are more irrational, like med residency match, investment banks favoring "target school graduates", law firms favoring "top 14 graduates", etc.
I think whiteboarding is dumb and I'm increasingly unwilling to go through with it. That's a hurdle.
But in many careers the hiring processes are walls, as in there is 0% chance you'll get hired. Because you don't know someone, because you didn't go to the right school. The barriers have little correlation with ability to do the job, and more importantly after some milestone has been crossed it's impossible for you to improve your chances.
The negative thing is that most tech companies heavily favor "top school" candidates and actively recruit for them. They would rather hire someone provably less qualified from a "top school" than someone else. They track and boast about how many "top school" candidates they hire.
Tech companies are hugely biased in favoring the upper class. And then they misguidedly try to pay a recompense for this unethical bias by discriminating on the basis of race in favor of "underrepresented minorities". Of course, they still really want those "URMs" to come from a "top school".
Their goal is to counter their active classism through active racism. As if they somehow cancel each other out.
Everyone said it was impossible.
She went to LinkedIn, found people with the right skills (strong data and ability to communicate), and had a massive fight with HR because none of the candidates came from "top" schools.
She won the argument, and all of the hired candidates did a great job.
People (especially US people for some reason) seem overly obsessed with the university someone attended, when it doesn't seem to be that predictive of workplace success.
Definitely not. French companies actually have engineering salary tables depending on your age and university.
It discusses how there are 6 ranks for business schools, and your salary for the first N years will be based on that. Same for engineering. It’s funny that one of the companies is proud to declare that they can move salaries by “up to 5%!!!” based on the candidate themselves (whereas the school can make a 20-30% difference...)
Source: am a dropout of a shitty university, still have no degree. It has not seemed to bother HR, as far as I've seen so far.
That happens literally everywhere, especially at big companies that make easy money. The elites don't like sharing the pie with the great unwashed.
Perhaps it's bad in other places too. I personally think it's idiotic, as the point of interviewing is to find great candidates, and I have never felt like the University they attended was a particularly good predictor of that.
Can I have a citation?
Good job prospects upon graduation is one of the things that makes a school a "top school" and attracts smart students. If you wanted to hire people with no work experience, and money was no object, a "top school" would be the logical place to go to first. And I say this as someone who didn't go to a top school. So tech companies actively recruit from top schools only insofar as every other company in every other industry does. But that doesn't mean they recruit exclusively from top schools either. Stanford, MIT, and the Ivy League literally don't graduate enough students for that to be a feasible new grad hiring strategy.
You'd have to provide evidence for the first half of that statement though. My personal experience is after you've worked a few years, no one in software engineering cares where (or even if) you went to school. And any software engineer with a pulse located in the SF Bay Area can get at least a phone interview with any of the top tech companies.
I've worked for 10 years and recently applied for a position. The manager told me I was a good candidate, and that I checked the box for coming from a top school.
He didn't use the phrase "checked the box" but did explicitly say that my coming from a top school meant he could skip most of the technical portion of the interview and just focus on the people aspect.
But for the most part I agree with you. It usually is important for the first job.
As someone who went both to a top school and a very average school, I do have a problem with it. If you've not been to an average school, you may be surprised at how many bright and motivated students there are. And if you've not been to a top school, you may be surprised at how average most of the students are.
I don't know if this generalizes, but it was my observation: Top school students tended to be a bit less honest (soft cheating, etc). At least where I was, it appeared to be clearly tied to the competitiveness needed to get in and get top grades.
 My grad school group-mate, who had only been at top schools, once went for an internship in a national lab. He was shaken at the fact that another intern from the University of Alabama-Huntsville was as capable/smart as he was. I saw this often in top school students, where they just assume that if they're doing well in school, that they are somehow better educated than the rest of the country.
If I were in a position to interview and hire someone, graduating from a top school would at least garner some attention, assuming the degree was relevant, but it's not a 'free pass' through any of the steps of the interview process, and may even earn them a more critical assessment in the implicit 'culture fit/personality' category.
This is just my opinion on the matter, not trying to make any sort of factual claims.
That may be true, but likely both of these have poor GPAs and thus are filtered out anyway. Usually you'll be evaluating candidates with at least a decent GPA.
I'm not claiming the average is the same between the two. But when there are a lot more average schools than top schools, chances are that numerically most good candidates do not come from top schools.
When I look at resumes of new grads, I ignore the school altogether. GPA has to meet some not-high threshold, and then it's just a peek at interesting projects they may have done.
And it has always made me wonder why they were offering such high pay for such low experience, when salaried positions in London are lower than NYC, and most of the contract work I see in NYC are half that rate.
Google told their recruiters to actively not hire white or asian males for certain roles.
Perception shaping is always unsavoury, but that's pretty dark.
Ignore yourself. The system surrounding you is not unbiased and never was. Here are some things I'm aware of that happened at Google/other comparable tech firms:
1. Recruiters tracked the quality of interviewers (as judged by candidate and hiring committee feedback) and assigned the best interviewers to women/minorities.
2. Sourcers could get much higher bonuses if they recruited women.
3. Comp can end up artificially higher for women, which obviously is a form of recruiting. At Microsoft managers were given bonus pots that could only be allocated to women.
4. Women who failed phone screens were presented for on-site interviews anyway in the hope that they could somehow make up for it. Men were dropped immediately.
5. Women are targeted with specialist recruiting teams, fought over to a dramatically higher extent than men.
6. Men are sometimes just excluded from recruiting events completely, e.g. "Code Jam to IO for Women".
And you seem to have chosen to ignore flashing red alarms like recruiters filing lawsuits with copies of emails where they were told to stop recruiting white men.
BTW, don't look at the firing process. Unlike hiring+promotion, engineers don't control that, HR does (PeopleOps or whatever it's called now). It's an open secret that at Google it's nearly impossible to get fired if you're a female engineer, even if your performance is terrible and your team hates you. At worst they'll start moving you around.
If cardinality(pigs + sheep) > cardinality(work horses), the truth will be downvoted.
In software, the work horses need to be more proactive and not give up what is theirs.
Not my claim that Google is bias free. I am not denying what you have claimed; it is just that I have not come across such incidents, and if you are a qualified person it is extremely unlikely that I would misjudge your performance because of your gender, race, or ethnicity.
There is no doubt that Google has gone down the SJW route in the last few years, but many of us put conscious effort into fixing those problems.
You're trying to argue that processes to encourage women to join somehow make it easier for them to be hired. Those aren't the same.
The unfireable nature of female engineers there was rather well known, at least a couple of years ago. The last I heard on that was from a fairly senior manager who after a couple of whiskeys reported he knew of managers fighting to keep female transfers off their teams. Not due to any innate sexism but because they'd realised that female transfers were far more likely to be troublemakers or poor performers than male transfers, due to HR's desperate attempts to recast unacceptable behaviour as just "not being a good fit for the team" and constantly moving them around. I had one on my team who was constantly lying to her teammates, as well as being a completely incompetent coder. For instance she was mystified by a CL she reviewed one day that contained hexadecimal, something she'd apparently never seen before! Some people left the team specifically to get away from her. But, untouchable because the bosses boss was a feminist who thought this young woman with clear management ambitions was just wonderful. Result: she was rapidly promoted into management where she wanted to be, to the disbelief of her remaining teammates.
Most Googlers were never really aware of these practices. Nonetheless, to believe Google is unbiased requires an incredible suspension of disbelief given the rather extreme publicly stated positions Pichai and the remaining senior management have taken, not to mention the Damore fiasco.
I've heard lots of things from recruiters that were wrong. So much so that I generally advise people I know to check with me before believing anything a recruiter says. Not because they lie on purpose, but because they're often misinformed.
This goes for compensation, process, and policy questions where recruiter statements reliably break with policy and practice. So pardon me if I don't find recruiters to be a reliable source for hot corp goss.
> Not due to any innate sexism but because they'd realised that female transfers were far more likely to be troublemakers or poor performers than male transfers, due to HR's desperate attempts to recast unacceptable behaviour
Sounds like innate sexism to me, given that the same thing happens with men. It's really hard to get fired. I've had to deal with men not being fired for ages.
> For instance she was mystified by a CL she reviewed one day that contained hexadecimal, something she'd apparently never seen before!
Depending on the language and background, this sounds reasonable. I wouldn't expect a Java or frontend person to necessarily know hex. So yeah, you're making my case for me. Sounds like bias against women.
To contribute my own, relatively unique, anecdote: I've interviewed both as a man and as a woman, and the process is considerably easier when you just get to coast through on the "white nerdy guy, must know tech" stereotype.
Showing bias in favour of women is very easy: just quote the executive leadership saying things like "we want more women", cite pro-women policies or present one of many other pieces of hard evidence. No such evidence exists for a pro-male bias which is why this argument always ends up relying on logical fallacies and innuendo.
Maybe try talking to actual women in the field before making such wildly false claims. I do find it hilarious that there's this overarching "feminist propaganda" and despite all that tech companies still routinely have essentially no women in the engineering staff.
Given the differences in the genders of who chooses to study the relevant qualifications, that's obviously a false assumption.
The amount of unsourced vitriol in your comment is unapproachable.
My comments are phrased in a level, factual manner. They're mostly retellings of things seen or experienced first hand, thus I am myself the source. But if you want sourced evidence of similar claims, by all means, go read the recruiter lawsuit against Google that was filed. It has plenty.
> Maybe try talking to actual women in the field before making such wildly false claims
If you're going to assert a claim is false you need to pick something specific and show it's false, otherwise you're just blustering. And having direct experience of talking to women about this, I can tell you that many recognise the built-in advantage they have and are quite uncomfortable about it.
> I do find it hilarious that there's this overarching "feminist propaganda" and despite all that tech companies still routinely have essentially no women in the engineering staff.
It's pretty ironic that you put a citation number in square brackets and then don't actually provide one, given your moaning about unsourced claims. As for "essentially no women", you mean about 15-20%, which is a far cry from essentially none. It's this sort of thing that justifies my claim of propaganda; it's normal for jobs to have unbalanced distributions of genders. Very few jobs have exactly equal proportions of men and women. For instance, HR has a higher proportion of women than software has of men, but I don't see much talk of the terrible anti-male bias that must obviously pervade the HR industry. /s
Edited previous comment for the missing source, that's my bad. (and despite calling me out you still can't find a single source for your claims (short of a vague command to go read a document you clearly haven't read, which is just, beautiful))
Let's even abandon, for the sake of argument, any desire to see ratios in engineering even approach demographic ratios, and instead just look at the rates graduating with CS degrees. That puts the ceiling closer to between 30 and 40 percent [0, for a representative top tier school], and, by your own admission, we're at close to half that on average. The numbers fall off faster if you consider technology leadership or look at smaller companies (which is harder to source, since a lot of places aren't very open with regards to their hiring stats, but in my experience working in NYC I've only seen sub-10% (N=3)). Sub-10% is to me essentially none, since that can basically evaporate with normal engineering churn. If we were to assume there was a grand bias, you'd expect an over-representation relative to the rate graduating, at the very least.
“Thing exists” does not imply “thing normal” or “thing ideal”. That’s a common logical fallacy used to justify traditionalism in all forms. Also, as an aside, people are talking about inequality in the HR field; you’re just not paying attention to it (tldr: it is weird that there are more women, and even with the numerical advantage they’re still underrepresented in leadership, which reflects in their comp). When we look at technology it’s especially strange because there is no clear mechanism (outside of social bias) that might explain why we’d see the ratios present. Despite what men on the internet like to believe, there’s no evidence that women who go into math or computer science are worse at it than men. Estrogen is great but it doesn’t change your ability to write code. Hell, there's no mechanism to explain why the ratios are more skewed than in medicine or law, even.
As for women being “uncomfortable talking with you about this”, I’d suspect that has a lot more to do with your fear of a nonexistent feminist bogeyman and repeated claims that they don’t deserve their jobs than any kind of conspiracy. Imposter syndrome acts across genders, and this repeated narrative plays to a lot of people’s insecurities.
This was far more effort than you deserve, but, I can only hope one day the culture at some of these major tech companies start to change, if only so I don't have to hear think pieces about how hard it is to hire from people that auto exclude 50% of the population.
I can't imagine why women are uncomfortable talking you, a proud sexist that openly claims there's feminist propaganda involved in their hiring. I can't think of any reason short of shame of being involved in such an obvious conspiracy.
This isn't even a grammatical sentence, but you appear to be suggesting that being told what to read isn't providing a source, which is nonsensical.
Go read: https://www.documentcloud.org/documents/4391847-18-CIV-00442...
To repeat - for most of what I've written I'm the source. Make of it what you want. What I've seen is consistent with similar claims made by others, many times in many contexts. The tech industry discriminates against men systematically, and it's because of the distorted ideological beliefs of people like you!
> That puts the ceiling closer to between 30 and 40 percent [0, for a representative top tier school]
GA Tech isn't representative. Even your own linked document says that: "Georgia Tech also awards more engineering degrees to women than any other U.S. institution"
GA Tech is famous for having a much higher proportion of women on its courses than normal. I guess someone told you it's a success story and now it's your go-to example.
They "achieved" this by systematically discriminating against men, which has led to a Title IX complaint against them for no less than ten different programs:
They routinely ban men from all sorts of events so if you believe this is an example of an unbiased selection process you're making my case for me. Men are systematically discriminated against and women never are: the disparate outcomes reflect fundamental differences and NOT some sort of non-existent bias against women.
> Let's even abandon, for the sake of argument, any desire to see ratios in engineering even approach demographic ratios
You act like it's an absurd position to "abandon", but it's an absurd position to have in the first place. Let's not do for-the-sake-of-argument, let's deal with reality. Nearly all jobs have distributions different to base demography.
Here's a chart you should look at: https://www.bostonglobe.com/metro/2017/03/06/chart-the-perce...
You're picking on engineering here, but why not pick on:
1. Kindergarten teachers, 97% female
2. Dental hygienists, 97.1% female
3. Nurses, 90% female
4. Phlebotomists, 86% female
5. Insurance claims processors, 85% female
All these jobs are less representative of the population than programming, which at merely 80% male is significantly less far from 50/50 than a huge number of teaching and medical related roles.
If you scroll the list you'll see that most professions aren't even close to 50/50.
> “Thing exists” does not imply “thing normal” or “thing ideal”. That’s a common logical fallacy used to justify traditionalism in all forms
Actually this kind of thinking is itself a logical fallacy. You're starting from a base point of assuming you can understand the reasons for absolutely every fact about the world, which clearly isn't the case. To believe you can decide what is ideal in any area of human existence requires a vastly over-exaggerated sense of one's intellect.
What you call traditionalism is really just a starting assumption that when studying complex evolved systems there are reasons for its current state that you may not understand. This is a perfectly rational assumption and made all the time in e.g. medicine. It's an assumption of incomplete information and inaccurate methods, that can lead to creating new problems instead of solving them. It's what led to "first, do no harm" as a medical concept.
> When we look at technology it’s especially strange because there is no clear mechanism (outside of social bias) that might explain why we’d see the ratios present.
This is the root of the problem - that belief is pure ideology. The obvious explanation is that women find technology less interesting than men because they're women and women are different to men, in all sorts of complex ways. This statement is like saying "there's no clear mechanism for why almost everyone who works with children is a woman". Of course there's a clear mechanism for it: they're women, they have babies, they evolved to want to care for children as a result and thus women very often enjoy children's company more than men do. The idea that anything other than the base 50/50 case must be bias ignores not only vast amounts of basic evolutionary theory but also common sense.
In the end I'm arguing with you because it's people like you who ultimately argue for and implement anti-male discrimination, on the belief that you're on some grand moral quest to eliminate discrimination against women. But like Animal Farm, the evil you think you're fighting is in fact yourself - the only gender based discrimination I've ever seen in my entire career was done by feminists.
Having had direct experience of how it works over the years, absolutely, incompetent women are more likely to get through the process. You can't constantly, for years, tell everyone that reducing the proportion of men is a critical priority and not have people bend the rules and make exceptions as a consequence. They're only doing what they're told to.
If companies like google were actually actively discriminating against competent asian/white male developers in favor of minorities their engineer demographics wouldn’t be 80%+ asian/white male
Likely the proportion would be higher. But yes, it's hard to change the demographics in areas where hard skills are measurable and where women don't really want to be anyway. Probably that's why feminists are moving on from targeting engineering roles: their current thing is leadership positions where less tangible "soft" skills are more important, comp is higher (the ultimate goal) and it's easier to manipulate the recruiting process. Hence laws enforcing that women be allocated board seats, things like that.
And lots of men have witnessed women being put into management roles in software they were completely unsuited for, over and over. I think most guys have a story like that by now.
And it's always HR... they aren't impacted at all by ok-ish hires.
Fact is, hiring is in the instant a zero sum game. If recruiters are prioritising women it means they're putting men to the back of the queue in the hope they won't be forced to hire them. It's sexism, it's wrong and it makes a mockery of everything feminists claim to believe.
The active recruitment is to counterbalance the fact that referrals, one of the biggest sources of talent, are not a diverse pipeline. Everyone's network is mostly male and white or Asian. This is even true of engineers from underrepresented groups. If you want a shot at hiring qualified underrepresented candidates, you have to actively recruit them. Your existing workforce cannot help identify them. That's what's meant by diversity and inclusion.
Now whether you agree that diversity and inclusion are worthwhile is another discussion altogether.
Other posts in this thread make claims that oppose that.
One person says that bad phone screens for men meant no call back, while bad phone screens for women meant a call back and a face-to-face to give them another chance.
that's the definition of "more attempts".
Whether said comment is real and honest is unknown (random internet comment) and whether "diversity and inclusion" are worth it (actively choosing ("recruiting") someone on race/color/etc to battle perceived racism is... a form of racism itself) is of course another battle...
I just want to call out that diversity and inclusion are not about battling "perceived racism". Diversity and inclusion measures are to counterbalance the fact that professional (and personal) networks in tech are not diverse. The status quo left alone would bias itself toward white and Asian males irrespective of intent. By actively looking for underrepresented candidates, companies can counterbalance network effects in hiring.
Yes they do. They are not subject to the same cool down period on a phone screen failure. Remember, google pitches it as “looking for a good signal” so retrying until the candidate passes isn’t lowering the bar in their mind (even though it is because phone screens are flawed but that’s another discussion).
> If you think otherwise just ask an engineer from an underrepresented groups about their recruiting experiences.
I have, I worked there when this started several years back. Several got a chance at a phone rescreen sooner than the normal back-off and one got an invite to come back for a second on-site because “the signal wasn’t clear” on the first.
Whether or not you feel this is a problem, it's worth reviewing the data.
[And for the record, I've enjoyed every female or minority colleague I've ever worked with, and made efforts to ensure their success, whatever their ability. I don't particularly object to AA hiring, but I don't like wasting my time on "fake" interviews, so I think publication of stats like this should be required.]
Is this true? I can see this being the result of poorly implemented hiring processes, but I can't see this being the explicit goal at a reasonable company doing reasonable things.
It wasn't that there were not qualified people from each school, it was just easier to find them at schools with hard-<subject> educations.
There were other schools that were physically closer even within a few miles, but he could go all day without finding a promising candidate.
He hires many minorities, women, people of different orientations, so it wasn't that.
It was more like, "does anyone understand a linked list?"
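For context, "does anyone understand a linked list?" really is about as basic as screening questions get. A minimal sketch in Python (class and function names are my own, purely illustrative):

```python
class Node:
    """One node of a singly linked list: a value plus a pointer to the next node."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def to_list(head):
    """Walk the list from head to tail, collecting values into a Python list."""
    values = []
    while head is not None:
        values.append(head.value)
        head = head.next
    return values

# Build 1 -> 2 -> 3 and walk it.
head = Node(1, Node(2, Node(3)))
print(to_list(head))  # [1, 2, 3]
```

If a candidate can't sketch something like this, no amount of school pedigree compensates.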
Years ago, I had an interview for Yellow Pages. I know, who the hell still uses Yellow Pages? Well, this was back in 2014, though I'm still wondering this now. Anyway, the entire interview process was very depersonalizing. Nobody asked me about my background, why I was interested in working for them, or anything like that.
After those shenanigans, they brought me into a board room with 6 other people, and they asked me to solve several brain teasers, including the "burning rope" problem. These people were stone cold! No humor about them. Fortunately, I had memorized most of these brain teasers from the internet and previous interviews, so this part wasn't so much difficult as I had to act like I hadn't heard those questions before.
I didn't get hired, and it was for the best because I'm not a robot, and I don't like brain teasers.
YellowPages.com looks a lot better than it did in 2014, but let's be honest, it's a glorified ripoff of Yelp with shitty search results. In fact, it looks nearly identical to Yelp. I wouldn't have been proud to work on that.
However, YellowPages.com ripped off the concept of yellow pages just as much as Yelp.
I'm saying that the YellowPages.com is a ripoff of the Yelp design and experience. If you changed just the logo and the color, it would look identical to Yelp, or at least how it looked before the recent redesign. Of course business listings and ratings are nothing new.
The same with diagram drawing, 10 boxes and arrows, but missing most aspects of the architecture that actually matter. As a hiring manager I much more prefer to talk through the problem to see if the candidate can ask the right questions, thinks about NFRs, can suggest alternative solutions in light of new information.
As business gets more intertwined with tech, workers need more technical competency - in parallel with automation of traditional jobs.
My take is that the more traditional companies will have to compete with tech firms on attracting talent, which in turn means that they'll have to change how they do their hiring.
I think the old days of "We only hire Harvard / Yale / Princeton / Stanford grads with 3.5+ GPA and top internships" are starting to die out. Yes - some of the most competitive jobs will probably continue to use proxies like that, but even firms with the worst gatekeepers are starting to see that there's talent everywhere, and that modern tools can help identifying them. (Remember, one major reason that prestigious firms only hire candidates from top schools and programs, is that they only do their campus recruitment at those schools, because it's a pretty labor and resource costly activity - you can't visit every school in the country, and check out the top students there)
That's what we want to see.
That said it’s at least a partially objective skill check and I don’t know a better way.
Out of the handful of companies I interviewed at, Google was the only one who offered feedback on where my interview went wrong. I thought that was nice of them.
Did you go to MIT, Stanford, Berkeley, CMU, Harvard, or another highly selective university?
Were you previously working at another highly selective/famous company?
If so it's pretty easy to get the interview, if not it's pretty hard without a referral.
I just went to the jobs page and clicked the "I'd like a job please" button. And what do you know, a recruiter contacted me a few days later to setup an interview.
I think people underestimate how desperate big tech companies are for warm bodies right now. It's not that hard to get an interview.
I went to RPI which isn’t a bad place, but isn’t MIT. I had a good gpa and decent projects but got ignored or instant rejections from Google and Facebook, I was able to get other interviews (Twitter, Palantir) and after working at a famous company now it’s easy to get interviews, but there’s a randomness to it.
If you don’t have a brand name school or don’t know someone it’s still difficult. Not the fault of the companies really, there are just too many applicants.
Since I was in school there are more companies tackling this like triple byte so maybe it’s better now?
Let's be honest. It totally helps to know someone even in tech even with technical interviews. If you know someone in a group you're interviewing in they'll often give you hints about what the questions will be. They'll even give the exact questions that will be asked. This helps a lot in narrowing down your studying for the test... err... interview.
My ability to do really well on written tests actually got me into Stanford, despite coming from a pretty normal middle class background -- my dad worked for suburban city government. If I had majored in CS instead of science, I would be a shoo-in, but alas I didn't.
If they could just do the tests slightly differently I could ace them. Instead, I'm mainly just avoiding the FAANG world. If I spent another 80 hours practicing them I might have a shot, but eh...
I was stuck in stages 2 and 4 (anger and depression) of 'unable to land a faang(ish) job grief' for years and would make angry comments on this weekly thread.
I think I've made peace with it and I hope to be in stage 5, acceptance. I've been leetcoding every single day for the last 3 months; 300 problems in, I can hit my target of one easy and one medium within a 1 hr goal. I hope to reach my target of 1 easy, 1 medium, and 1 hard within one hr. I regret not doing this sooner.
Even if I don't land a faang job, this process has led me to accept the process and make peace with it. I really don't care if hiring is 'broken'; it is what it is, it's an obstacle I need to overcome. Bring on the 'trapping rain water' and 'describe one time you had conflict with your team' garbage. I am ready!!
Considering how standard it is, we might as well just make it a part of a software developer certification/license that you have to do once to break into the industry.
Then maybe companies can actually focus on hiring for the job?
Even then, I've started to ask what "hiring for the job" means. General aptitude in our field should be a good indicator of ability to learn and pick up skills in different specialties.
The funny thing is, despite our best efforts to not become a real standard profession we are behaving a lot like one, except we don't realize it and keep making candidates jump through the same hoops repeatedly.
Getting many software jobs is still about network and recommendations.
Getting many software jobs is about a standardized base level skillset and knowledge (i.e. leetcoding).
Getting many software jobs is about specializing in a domain and skillset (for e.g. ML or finance or cyber security and all their respective languages and frameworks).
And as many commenters have mentioned, we aren't as meritocratic as we would like to believe. We still bring our biases to the hiring process. We still hire people we like for subjective reasons over others.
My point is this. Maybe, just maybe, it's time we as an industry standardized the profession officially and codified what it takes to get certain positions. That's what I can offer to this conversation constructively.
Yes, knowing algorithms and data structures IS important to being a good software developer, even if you are building CRUD or mobile apps. But how many times do I need to prove I know them? Yes, showing leadership skills IS important to being a good software developer. But isn't being a leader mostly about conflict management, moral obligation, and being ethical?
Maybe we can stop fearing becoming a real profession that is beholden to standards and public scrutiny and embrace it. It will end up being better for everyone. Then we can revisit the criteria regularly to make sure the tests we need to pass represent what it means to do our jobs and do them well.
because this hiring thing is the ultimate dead horse of HN. Every single comment here is a rehash of something that has been said a million times.
Here is the one from last week
exact same comments.
I think it is only a matter of time for this to happen. All it would really take would be two major companies deciding to standardize on some set of criteria, and smaller companies would follow suit for the sake of simplicity (and because no one really feels like they know what they're doing anyway).
Your credentials and political connections are what allow your resume through the ten arbitrary filters before you even get to the technical screen, collaborative coding, etc., and those same credentials / connections are what get you hired over everyone else who completed the technical assessments just as well as you did.
The tech screens exist as a political gatekeeper filter. It allows people who are fighting over petty authority among codemonkeys way down the food chain to enforce arbitrary and capricious standards, especially on cultural capitulation, to ensure they are hiring people that meet a good mix of (a) competent, (b) don’t know what they are worth, and (c) can’t quickly politically outrank you because you forced them to comply with your tribal pecking order bullshit on the way in the door.
That’s all it has ever been about. It has never ever been the least bit about meritocracy.
One of the best programmers I've ever met.
He couldn't do complex big-O analysis because he never learned it, having started work straight out of high school, but other than that, his code was meticulous, he gave excellent code reviews, and he just had this natural understanding of technology and how to program. Probably the best programmer on our team, much better than me, and I often sought out his opinion on things, even though I could have been his dad. I learned something every time I talked to him.
He is a multi-millionaire now, and still works as a programmer but does it for fun.
I wish more of them were millionaires now. Many of them are still programming (degrees become more important as you move up), which is a shame because they’d probably make fantastic people managers.
Go to small startups if you want it to be about competence and teamwork.
And I do 3-4 hiring interviews a week here in NYC. What connections? First show that you can code; a lot don't pass this filter. Then show that you can architect a large distributed system on a whiteboard; a lot of people applying for a senior software engineer position lack the breadth of knowledge and the consideration required.
And after that you'd speak to CTO, where maybe you can say or do something so silly to be rejected; this happens very rarely.
I mean, hell, that's a great skill. I wish I had it. Then again, I've seen people forced to build "large distributed systems" to solve things I can solve on one thread on one CPU because they don't have my skills. It's almost as if there is more than one skill that qualifies you as "senior" or something!
I put all of those in quotes because they are relative terms. 'High-scale' means 10,000s of concurrent users; 'single server' means one server for the web app (with a separate database server, or a utilities server for cron jobs, offline processing, etc.); and 'efficient' means nothing, really.
Non-senior SE candidates don't have to do this exercise.
So, genuine question then: what is the final and most important criterion you would use to differentiate between two completely successful candidates who perform equally well in your pipeline?
I think the answer to that will give more credit to the comment you are trying to negate.
Yikes! You may want to review laws about hiring... immediately.
This isn't to say that we never hire anyone from out of state. We have done so on many occasions. But if there are 2 equally qualified candidates, we are more likely to choose the one that already lives close by and can start immediately.
Some people have very little idea about load-balancing, DNS, database scaling (replication, sharding, etc), fault tolerance, graceful degradation, queues, throttling, caching, you name it.
An ideal interview should reflect the actual work you'll do on the job. And I think system design interviews accomplish that pretty well (certainly much better than coding interviews). Will you ever have to design and build a system like you did during the interview? Likely not - a startup, while it does require building a lot of things from scratch, rarely requires the scalability and intricacy outlined in these interviews. Large companies, meanwhile, will have the components broken up by team, so you'll often be silo'd into just one part.
That said, knowing how your systems work at a high-level is much more useful than knowing something like shortest-path algorithms. Even if you are silo'd, understanding upstream/downstream interactions can be crucial.
This is just every candidate, certainly at the senior level. It’s impossible to use these questions to discern anything. Even asking them to give these details about past real systems they designed you’re just getting the same 5-layer-deep inception rote memorization of everything they could be asked and every follow up contingency.
You might find it as amusing as I did to learn that the word ‘meritocracy’ was intentionally coined to mean more or less the opposite of what you think and how you used it here — meritocracy was a derogatory word intended to highlight the social injustice of thinking that ‘merit’ is somehow fair and can be measured and used to rank people.
Anyway, I’ve gotten hired into a good spot in a company without knowing anyone there. Maybe the dozens of responders are some indication it’s more than round-off error? Why are you so certain?
> Your credentials and political connections are what allow your resume through...
Wait, aren’t credentials one type of “merit”?
Credentials are only a weak signal of merit. They are often gained illicitly and are no substitute for actual ability.
When I hire people, I care about potential and attitude more than merit. I don’t care as much about what someone has accomplished as I do about whether someone wants to learn, is smart and eager to be part of a team. I’ve seen with my own eyes people that come with a lot of accomplishment and a ton of skill, people who have a lot of merit by any definition, and who are terrible people to hire who cause real damage on teams. I don’t believe that “merit” is a particularly strong signal for job performance.
What are you referring to about credentials being gained illicitly? How often does this happen? Are you talking about degrees? I’ve never met someone with an illicitly gained degree, and it would be stupid to try that; you’d get caught immediately. It’s not as easy as just lying. If you’re talking about degrees, I serious doubt that happens “often”, but feel free to provide some evidence that it’s more than statistical noise or anecdotes. If not degrees, what are you talking about?
Lastly, and maybe most importantly, the meta point here is that “actual ability” is to some degree a socially unjust metric. IQ correlates with family income. Why? Because people with more money get better nutrition, more training, better schools, stronger business networks. The advantages in life are, statistically, a major component of what leads to “ability” in the first place.
I do think standardized tests do a pretty good job. Test mostly on algorithms/tech related questions and you'll mostly find people with interest in these topics. You're right that people who do well on these types of tests may still be terrible people, but it hasn't been my experience that interviews are good at snuffing out terrible people as opposed to people who aren't like the interviewers (ex. people who grew up in poor environments whereas the interviewer didn't).
> What are you referring to about credentials being gained illicitly? How often does this happen?
Regarding degrees, it's well known for example that the Ivy League have large biases towards legacy admissions and for example Asian Americans are highly penalized due to affirmative action. Any effects caused by these types of biases are immoral to me, maybe "illicit" wasn't the right choice of word.
> IQ correlates with family income. Why? Because people with more money get better nutrition, more training, better schools, stronger business networks. The advantages in life are, statistically, a major component of what leads to “ability” in the first place.
This is not true. In the US at least, the effect of shared environment on IQ is known to be very low by late adolescence. IQ correlates with family income because smarter parents have higher income and pass their genes down to their children.
Other than the writing tests, math and science standardized tests fail to predict college and especially graduate school performance very well, and it gets worse for predicting career performance. Standardized test scores absolutely correlate with SES, according to the testing agencies themselves.
If you think Ivy League racial bias is immoral, why do you believe that “merit”, which is racially biased in the US isn’t immoral?
Graduates from Ivy League schools represent only a tiny percentage of people, and what you’re actually referring to is Harvard, not even the whole Ivy League, a lawsuit at one single school brought by people with a political agenda against Affirmative Action. This was your argument for dismissing credentials and claiming they’re a weak signal. Are you reconsidering this point of view? That seems like really thin evidence for the strength of signal that credentials do or do not provide. If you believe that standardized tests are a strong signal, and standardized tests are used for college admissions, then doesn’t it follow that gaining the degree credential is at least as strong a signal as the standardized tests you advocate?
I don’t know anything about the anonymous blog post you’ve linked to, but it’s not a scientific source, nor a meta-study, and it appears to be cherry picking and have an agenda. I certainly wouldn’t blindly adopt the “inferred” conclusions you read there, just because it all seems plausible or convincing to you. Claiming that money doesn’t affect IQ or merit or outcomes doesn’t even pass the smell test, there’s strong evidence that being poor hampers ability, even stronger if we’re talking about extreme poverty.
There are pretty well known, well documented problems with cultural bias, in the US and globally. Financial inheritance and pure financial advantage are real; money can and does overcome the disadvantages of low IQ. Being rich has immense advantages in every way. If you are convinced that social biases don’t affect merit and that being rich doesn’t influence merit dramatically, you’re certain that poor people must be poor due to IQ, and you aren’t at all curious about why some smart people believe “merit” might be a subtle way to perpetuate the ideas of Social Darwinism, then we should probably stop here.
> You can be a nobody without connections or degrees and if you can prove you have skills during an interview process you may be hired.
I think it stops being seen as a breath of fresh air when e.g. Google's explicit expectation is that you will spend multiple months studying for the privilege before interviewing with them.
And that's after investing years and hundreds of thousands of dollars in extra schooling.
You also don't get to retake those every couple years.
Now, recruiters could have been lying about that, but on the other hand, they obviously know that I applied before yet want me to apply again. I'd imagine a regular company would have done something about their recruitment process if they had that many false negatives. Then again, if I wanted to get dedicated employees, that would be exactly how I'd hire.
None of that is true of the interview. There is no notional standard. Assessments of the same person vary wildly from sample to sample. If you know the material, you are nevertheless expected to fail.
What? Yes you do.
If you mean set in terms of money, lawyer pay is strongly bimodal these days, with most lawyers earning the lower amount: https://www.lawcrossing.com/article/900049851/Is-Law-Still-a...
Of course their bar to entry is very high; they are one of the highest-paying employers in tech. If you want the pay and prestige, you have to play their game.
Where I would agree with you is when non-Google companies use Google tactics for hiring but pay like a mom and pop shop.
I've heard of it, and I've heard that they just sort of take their good old time (and yours) with round after round after round of the process.
I've heard that people do it, though.
If you're not a fresh graduate, likely you've had much of the preparation as part of your previous job experience.
This can be beaten by brute force by memorizing a huge amount of material, but that isn’t the goal. It isn’t like the interviewers think you’ll need to implement their question in your day job.
On the other hand working at FANG encourages developing soft skills (cross team wrangling etc) that no amount of leetcode would teach you.
But very often you have to understand which algorithm to choose. Then you can pick an existing implementation.
In interviews I conduct, I gladly allow candidates to read the Wikipedia page with a reference implementation (say, for the aforementioned Dijkstra's algorithm), or pick an implementation from a language's standard library (say, for a priority queue). What I'm looking for is a conscious and reasonable choice of an approach, and an understanding of its trade-offs.
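As a sketch of what that might look like in practice (my own hypothetical example, not the commenter's actual exercise): Dijkstra's algorithm leaning on the standard-library `heapq` as the off-the-shelf priority queue, where the interesting discussion is the trade-off (lazy deletion of stale heap entries versus a decrease-key structure):

```python
import heapq

def dijkstra(graph, source):
    # graph: {node: [(neighbor, weight), ...]} with non-negative weights.
    # heapq has no decrease-key, so we push duplicate entries and skip
    # stale ones on pop -- a deliberate, defensible trade-off.
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, node = heapq.heappop(pq)
        if d > dist.get(node, float('inf')):
            continue  # stale entry; a shorter path was already found
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float('inf')):
                dist[neighbor] = nd
                heapq.heappush(pq, (nd, neighbor))
    return dist

g = {'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}
print(dijkstra(g, 'a'))  # → {'a': 0, 'b': 1, 'c': 3}
```

Being able to explain why the `continue` line exists is exactly the kind of "understanding the trade-offs" signal described above, as opposed to reciting the algorithm from memory.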
How was it in the '80s or '90s? Could you be hired just because you could fix or implement something, even if you didn't fit that well with a group?
90s: we need to hire everybody who can code
I've no doubt your code is the best in the world (have you a github repo?), but with that attitude you'd be shown the door quite soon.
I suspect certain threads, by title, attract different population segments that differ on their programming interests, sensitivity, defensiveness, individuality, and so forth. Hiring threads normally have increased sensitivity, and I suspect the sensitivity is ramped up right now with many people being out of work.
As for my personal bias: software hiring is horribly subjective. When I am interviewing, the technical portions of the interview are generally time-consuming substance of no practical value in candidate selection. Knowing that going in, I prefer to watch the interviewer to glean what decisions they are forming and not disclosing.
So we added an option to interview by way of a paid trial period (work part-time nights/weekends for a few weeks with the hiring team). Figured maybe some people might prefer that, but probably wouldn't be feasible for most.
Every candidate since has chosen that route, with very good outcomes - so far, everyone who did well in the trial period has been a good hire. Some of our best hires did not interview well (and would not have been hired under our old process), but were outstanding in the trial period. And, a couple candidates interviewed so well we almost skipped the trial period, but they struggled to complete even simple tasks during the trial.
We've now optimized interviews for that process, where the decision is primarily about whether it's worth moving to a trial period. That usually only takes a short screening call and a 1-hour call with the team (we're a remote team, even before quarantines).
BTW, if you'd like to experience this first-hand, we're hiring - https://news.ycombinator.com/item?id=22753515
I would guess that most of your hires are single young people. And you are already kind of teaching them to work nights and weekends, which sounds like you are, again, choosing people who would not have a problem with a bad work life balance.
I also wouldn't be surprised if your process isn't teetering on the illegal side. Not on purpose, but it might be accidentally favoring or disfavoring a race/gender. Probably not enough to get you in trouble though, because those things are hard to spot. And unless your company is really big, probably nobody cares.
Contrast with the alternative: I'm a senior, considering switching jobs, but I have no real way to evaluate the potential new employer beyond word-of-mouth and interview experience. What if it turns out to be a place I don't like and I just gave up my former position for something worse?
Working a few evenings & weekends (and getting a little extra $$) to evaluate the new company sounds like a superb way to gain the confidence to jump ship without regrets. In the best case, I get a new job I like. In the worst case, I get some extra money and keep my old job, which proved to be better than the new one. Win/win? What's the worst that could happen?
Everyone we have offered the option has taken it, and they're usually pretty enthusiastic about it vs traditional interview process. We're working with them a lot during the trial period, and after if they get hired - lots of opportunity to find out how they're doing, and so far no negative feedback.
At the point where we move to the trial, we don't usually know age (because we haven't even seen the candidate) or family status. From what I've picked up, it's a mix of ages (usually a bit older - we've found some experience is necessary to be successful as remote-only), and mix of single, married, and families.
For those who have ended up hired - I think most are parents with younger families.
Other than the trial period, nobody usually works nights or weekends.
Almost all processes do (though in this case, family status is probably the more relevant protected class it biases on.) That's not illegal, if it's sufficiently connected to the actual needs of the job, and they seem to have at least tried to evaluate that for their hiring process more than most places do, so they are probably less at risk of disparate impact discrimination than most hiring processes in tech.
Wow, if you say so, but that seems weird to me: every job I've ever had (and everybody else I've ever seen hired at any job I've ever had) it's taken quite a while before I was really able to contribute much productively. Just getting acquainted with the codebase, figuring out the deploy process, and learning all the unwritten rules about the company culture takes a non-negligible amount of time.
Most candidates have a local dev environment up and running in the first couple of days, and complete their first task (from our real backlog) in the first week. Within the 2-3 week trial period, most are able to finish several meaningful tasks.
Also, do you give them tasks on your actual code base? If yes, you are likely biased towards people already familiar with your tech stack (getting really productive with an unfamiliar tech stack takes more than a week, but short enough to make hiring somebody from a different background still worthwhile).
Any thoughts on this? (I realize that some amount of bias in the process is OK)
We do a subset of on-boarding, just enough to get them running a local dev environment and able to push commits and connect with the team.
We pick real tasks from the backlog that don't require a lot of ramp-up to complete.
We assign someone to work with them during the trial period, help get their environment going and orient them to the tasks, help them if they get stuck.
We discuss with the candidate how things are going each week, how they're feeling about things and how we think things are going. Usually by the 2nd week it becomes pretty clear if it's a good fit.
Since we are remote-only, candidates have to be capable of figuring things out mostly on their own - technically and organizationally, so we look for more experienced people - who probably are more likely to be successful with this kind of interview process.
Oh, that changes a lot. I think it would be a lot easier for people to log in a couple hours here and there as part of a trial than to commute to an office and spend an entire day.
I actually like this a lot, thank you for posting. Maybe the COVID-19 situation provides companies around the world with enough experience on remote work to make this kind of remote trial more attractive.
IANAL, so some of my phrasing may be wrong.
This seems to be a pretty popular opinion. It totally might be right, but it’s not my own experience, so I have a serious question because maybe I don’t know what’s happening out there with most hiring today - what are the broad-stroke outcomes that demonstrate that hiring isn’t working? Are there statistics that show that hiring has problems? All of the reasons given in the article are claims without evidence, nor objective comparison to hiring for other industries. When looking for jobs, I’ve never been evaluated on IQ or code alone, it has always come with communication and personality and culture fit evaluations, among many other things. When hiring, my own team does everything this article claims isn’t being done. So I might be completely unaware of the major trends out there... how can I see those trends from where I sit? Are people not getting jobs who should, has there been high unemployment? Are companies not able to hire people? If hiring is broken, what are the problems it’s causing?
> It's a disguised IQ test
It is amusing to me that many blog posts and comments around here argue for exactly that under the same banner of 'hiring is broken'. Lots of developers are frustrated about being evaluated by how well they communicate and not by their code. Lots of people here complain about in-person interviews and tests and argue that take-home coding should be the norm, or that coding tests should not be used at all.
Whiteboard interviews are a skill unto themselves, with only a glancing relationship to day-to-day job content, artificially high-pressure, overly performative, etc. You measure hours invested in Leetcode, not suitability for the work.
Take home projects probably collect good signals, but present a high and (crucially) asymmetric burden on the candidate. No one wants to burn a weekend or vacation days pouring effort into something that the employer can just glance at for 5 seconds and throw in the trash. Also, many candidates would prefer to reuse their time investment over arbitrarily many interviews instead of starting from scratch on each company's assessment.
Interviews focused on personality, culture fit, communication skills, and "gut feel" overemphasize the interviewer's personal beliefs. Likable candidates aren't especially likely to be good, and good candidates aren't especially likely to be the kind of people the interviewer wants to have a beer with.
See I would much rather invest time and money into somebody that I like being around and who I think is capable, rather than have some jackass who is really really good off the bat.
Of course there are biases, but that's true whether you hire for skills, personality, or both.
I think a lot of programmers think that your skill level is the only thing you should be judged on, but I think that's mostly because there are a lot of very unlikable people in this field and they'd rather hide behind their perceived skill level than attempt to be a semi-likable person.
There is a sweet spot: somebody who has skill but is also not terrible to be around, and can learn quickly. The problem is that most interviews swing too far one way or the other to properly assess for both.
Hiring software devs is a damned if you do, damned if you don’t proposition—somebody is always gonna bitch.
There's a problem at the moment in the market: there's a huge amount of pent-up demand for senior developers. The market has responded with bootcamps and the like, and we have a ton of junior devs with very little knowledge and experience pouring into the workforce. The sad truth is that most devs fresh out of college or a bootcamp aren't valuable enough to be worth hiring at large tech companies like Google. Most people who apply to any role you advertise won't really be able to program effectively. (And if they can, they won't know the conventions of the language they use, they won't know anything about security, they won't be able to read code others write in order to debug it, etc. etc.)
Programming is hard. It probably takes about a decade of practice to become good at it, and I don't think schools have figured out a replicable way to take someone off the street and teach them programming yet. (For example, most fresh grads have never had to read the code another programmer wrote. Imagine teaching people how to write without teaching them how to read!)
I think there's lots of angry junior folks out there saying "Hey, I can write a simple for() loop in python, and lots of people are hiring - but I apply to lots of jobs and keep getting knocked back! The hiring process must be broken!". And a few angry senior engineers out there saying "Why do I have to keep writing fizzbuzz? It's like I have to prove over and over again that I can program at all!".
Of course, nobody wants to admit that the reason they failed their 6th whiteboard interview in a row is that they aren't very good at system design. And the reason for that is that their college didn't teach them any system design skills, they have no experience, and they never learned how to use words to communicate technical ideas. And ArbitraryProductCo doesn't have the runway to train you up.
Of course there's the occasional person who is really skilled and somehow still manages to fail programming interviews all the time. But if the goal of a technical interview is "make sure we don't hire anyone who wouldn't be effective at the job", I think they're fit for purpose. I think the real sin is that we're afraid to tell people they aren't very good at programming yet, and we use technical interviews as a scapegoat.
.. which is really unfortunate, because those big companies are the ones that have the most resources to hire, train, and mentor juniors. At the opposite end we have small companies that can literally go bankrupt when their hire doesn't work out and is unable to deliver working software. Even if the hire works out well enough, they're not learning as much as they could in a well resourced team with enough serious talent & seniors to mentor them.
> I think the real sin is that we're afraid to tell people they aren't very good at programming yet, and we use technical interviews as a scapegoat.
I don't know if it's useful to tell people that they aren't very good. It's a serious chicken & egg problem, because everybody wants to hire seniors who can hit the ground running and nobody wants to train the juniors. The juniors really don't need to be told that they aren't good enough; they need a place where they can get good!
In theory. I'm a pretty senior programmer now as in "been doing this professionally since the mid-90's" and as far as I can tell, modern project management principles are explicitly designed to make sure that I spend as much time as possible cranking out code and as little time as possible helping newcomers out. That may not be what the "scrum manifesto" says, but it's how the project management professionals charged with executing it interpret it.
My point is, if we look at the other end of the spectrum, we have small companies that literally cannot afford to mentor juniors while paying them a salary (and tying up the seniors' productive time). Here it's not a question of how your company chooses to manage things, it's a question of whether they can afford to spend 20% of the budget on something that may turn out not to produce anything of value. Even if they're willing, it's a big gamble and can really put the company out of business in worst case.
So does this mean that programming education is broken? That companies should invest more in training? That bootcamps should revamp what they teach? That there should be industry standards for what programmers at different levels should be expected to know?
The most maddening thing about interviews imo is the opacity. There's always uncertainty about where exactly things went wrong, and how it should be fixed next time. It's almost a meta-engineering problem of its own. And it's a metaphor for the lack of software engineering standards.
Programming education is broken. I did one year of computer science at one of the top universities in the world (switched into mathematics after that), and I'd see that course as useless if not actively negative. The only way of learning that I've seen really work for anyone (myself included) is more like a craft apprenticeship, working closely with someone more experienced. We shouldn't be surprised that that produces widely different approaches.
Frankly the field isn't mature enough to have standards. If you tried to set a professional exam based on today's best practices, in five or ten years the answers would mostly be wrong. We still don't know the right way to write software. Million-dollar systems still come with laughably simple bugs.
What does the interview process look like for a craftsperson? That's probably the best we can expect from a field as unsystematic as ours. The one thing that strikes me is that in creative fields it's normal for people to show a portfolio of past (client) projects, whereas in software past employers usually keep all copies of your code. I have no idea how we'd go about changing that norm though.
Welcome to my shop. Here's some wood. Make a chair!
In most of the interviews I conduct, I get the candidate to write some code following a spec we've written. And I get the candidate to debug a program with some failing unit tests. About half of the candidates I interview fresh out of school have no idea how to get started debugging a program they didn't write. You need to do that just about every day at work, and it's usually not even mentioned at school.
But I've worked as a teacher too. I will move heaven and earth for my students, but in my darkest moments I think taboo thoughts. Maybe IQ is a real thing, and maybe some people just don't have the neural hardware to ever be good at programming. If thats true, we do those people a huge disservice. We steal years of their lives and tens to hundreds of thousands of dollars training them in a skill they can never master. I'd love to see the stats on how many people graduate from a CS program, try but never find reliable work in our industry. I worry the numbers might be damning.
A few months ago a candidate asked for feedback at the end of the interview. He wanted to know what I recommended he practice so he could improve. I said he should pick an open-source project on GitHub - preferably something that's not too big - and look through the issue tracker. Pick a simple-looking bug, try to fix it, and submit a PR. His whole manner changed after I said that - the idea of doing that was so incredibly daunting to him. But that right there? More or less, that's the hiring bar. As an interviewer I'm trying to answer the question "If I hire you, can you do the work?". Read, think, understand, modify, write code, communicate it to your team. That's the work and that's the bar.
Approaching an open source project cold is a bit higher than the bar.
In a way, this is mirrored by the organizations themselves. Startups that make the right moves, and work hard through grit and 60-hour weeks, can become unicorns. And some startups are led by visionary founders and cannot fail.
So all of this, buttressed by real-world labor demands, creates incentives for people to try to become programmers. Even those who aren't "cut out to be programmers." Our increasingly cutthroat and unequal society also incentivizes people shifting to programming as a safe career choice. "Learn to code."
I don't know how we can stop "pretending otherwise." Tech companies continue to complain about the engineering talent shortage. Bootcamps and online courses continue to promise people that they can become that talent. There aren't any agreed-upon industry standards by which to exclude people who truly aren't fit for it. FAAMG has infinite money and power in the industry to continue their entrenched practices. Most startups cargo cult the leading megacorps' processes. So instead, candidates are encouraged to continue grinding Leetcode and apply, apply again.
Those seem like orthogonal aspects. If we stressed the idea that you had to do (say) a 4-year degree at a great university, or something akin to the bar exam, would that be any more compatible with the idea that some people are 10x better than others? If anything I'd say the opposite: we'd expect most of the Harvard graduating class to be on roughly the same level, it seems a lot less wild that some self-taught people could be 10x better than others.
> So all of this, buttressed by real world labor demands, create incentives for people to try to become programmers. Even those who aren't "cut out to be programmers." Our increasingly cutthroat and unequal society also incentivizes people shifting to programming as a safe career choice. "Learn to code."
This happens in every field though? You get people who are desperate to become a doctor and apply to med school year after year, despite being completely unsuited to it. You get people who insist they're gonna make it as an actor/musician/comedian and spend decades working crappy day jobs so they can live where the action is, when really they'd be better advised to pick a career they're good at.
> I don't know how we can stop "pretending otherwise." Tech companies continue to complain about the engineering talent shortage. Bootcamps and online courses continue to promise people that they can become that talent. There aren't any agreed-upon industry standards by which to exclude people who truly aren't fit for it. FAAMG has infinite money and power in the industry to continue their entrenched practices. Most startups cargo cult the leading megacorps' processes. So instead, candidates are encouraged to continue grinding Leetcode and apply, apply again.
Well, if we told people outright that programming is a matter of IQ, and gave an actual IQ test rather than an IQ-like test in interviews, that might help some people realise it's not for them. You're right that what catches on in the industry is largely a function of what the most successful companies pick, but ultimately that list of top companies is not static and we'd hope that companies with better hiring practices will (eventually) rise to the top.
That's not the point I was trying to make: I'm saying that in programming we prize both talent born of nature and skill honed by nurture. (Though admittedly that may exist in many other disciplines.) Because of the latter emphasis on grit, hacker culture encourages self-improvement and going beyond the capacities one started with. That dogma of self-improvement goes against the notion that some people are not cut out to be programmers.
Though of course, this could also be a marketing ploy for recruitment on behalf of management: "Anyone can code, you should learn to. But we only hire from the best." By encouraging an increase in talent, they have a larger labor pool to choose from (and potentially undercut wages), while plucking out the few that can pass their interviews.
> This happens in every field though?
To some degree, but the details vary. Medicine or law used to be seen as safe, secure careers into the (upper) middle class, but doctors are limited through the AMA, and law is currently a notoriously difficult and costly profession with dwindling prospects. Entertainment and the arts are universally known as a risky proposition. We're talking about software, which has had the reputation of being the current surefire path to a stable, even successful, career for at least the past two or three decades.
> Well, if we told people outright that programming is a matter of IQ, and gave an actual IQ test rather than an IQ-like test in interviews, that might help some people realise it's not for them.
Leaving aside the legality of using IQ tests to exclude candidates, that opens up the questions of whether there is a direct correlation between writing good software and IQ, why programming out of all STEM fields should focus so heavily on IQ, and why all of those other technical and engineering professions don't need to resort to IQ tests for hiring.
So do your students do this? Why not?
I see it as like if I were hiring for something as generic as "writer". It's easy to have a generic "writer" produce a small sample for you on the spot, similar to a CS interview. Of course you can always practice writing directly itself, but I would imagine someone who had completed a lot of coursework in linguistics, classics, literature, etc. would on average be very well-prepared if they were a good student. But you could still practice and teach yourself on your own if you wanted to
Perhaps, but it's still the closest thing the industry has to a "programming education"; I think it's the first thing employers look for, rightly or wrongly.
> it's kind of silly to judge a whole subject by the first year
How many of my limited days on the planet am I supposed to sink into something before I'm permitted to pass judgement? At some point Stockholm Syndrome would take over.
> A good CS student, who understands the coursework and doesn't cheat, should easily become a good enough programmer just from completing coursework in a mostly theoretical program to get hired basically anywhere. Programming is something you learn incidentally because it's intertwined with what you're doing anyway;
That's not what I saw happening (unless you count the official lectures/colloquia as "cheating"; certainly I saw cases where the meat of the answer to a supervision question was spoon-fed to us directly). The people who could program at the end of first year were the people who could program at the beginning or who "got it" immediately. I never saw people struggling with a new concept but then gradually being taught it (which is something I did see happen a lot in the mathematics course), and a frightening proportion of the students I was friendly with were coming out of that first year knowing seemingly nothing, certainly not being able to program or talk coherently about algorithms or computability. I suppose it's conceivable that those students were somehow getting something out of the system design type courses, but it seems implausible.
I understand there was a shake-up in that CS department a few years after I graduated, so maybe I went through it during a bad time. But the students who graduated there in the meantime aren't going to get a do-over.
> I see it as like if I were hiring for something as generic as "writer". It's easy to have a generic "writer" produce a small sample for you on the spot, similar to a CS interview. Of course you can always practice writing directly itself, but I would imagine someone who had completed a lot of coursework in linguistics, classics, literature, etc. would on average be very well-prepared if they were a good student. But you could still practice and teach yourself on your own if you wanted to
I'd suspect the overwhelmingly important part of writing is actually writing; I only know one person who I'd call a great writer, and hanging out with him the thing you notice is that he writes the way other people check their phone. All the things you list can enhance writing, certainly, but if you don't actually write then any amount of knowledge of linguistics or classical literature is meaningless (at least in terms of how it affects your writing ability).
More than the first quarter of it. I took at least a year of math classes, and I don't feel qualified to pass judgement on the math department.
There's a huge difference between intro and upper-level classes. Just like there's a difference between Calc I and a proof-heavy upper-level math class.
That being said I think the overall pedagogy is better in the math department, but then again they've been doing this for a lot longer.
I don't have a medical degree, but I still trust medical science. Why? Because it achieves positive results, and people I trust for other reasons trust them.
There's no way you were rigorous enough in tracking this for that statement to be useful to anyone. The people who don't fit the mold stand out.
Sure, but has anyone claiming the opposite done rigorous analysis? Is there any evidence that having a CS degree makes for better programmers than not?
> The people who don't fit the mold stand out.
I'm not thinking about people who stood out as particularly unusual. Most of the time I didn't find out which field someone's degree was in until months into working with them.
I'm not making that claim; you're the one claiming, without evidence, that people with degrees other than CS are better programmers.
I'll only make the claim that a CS degree made me a better programmer. Specifically the upper level theoretical classes. I can verify that there are many problems I've solved because I realized that the problem I was working on had already been solved 50 years ago.
I also worked about a decade as a professional programmer without a CS degree, before I went back. Personally, I am a better programmer for it.
Would I have been an even better programmer had I taken another few semesters of math classes instead of CS classes? Who knows? Absent any other evidence though, the simplest explanation is that domain specific knowledge is likely useful.
>I'm not thinking about people who stood out as particularly unusual. Most of the time I didn't find out which field someone's degree was in until months into working with them.
The point is that the more unusual someone's background is, the more likely you are to remember it. Particularly if there is some confirmation bias involved.
Disagree; surely the null hypothesis for any given training programme is that it has no effect.
> The point is that the more unusual someone's background is, the more likely you are to remember it. Particularly if there is some confirmation bias involved.
There isn't anything unusual about professional programmers having a degree in maths or physics rather than CS. At least in my experience it was pretty close to an equal split.
That's not what's under test here though. It's training program A that includes domain specific knowledge or training program B that does not.
>There isn't anything unusual about professional programmers having a degree in maths or physics rather than CS. At least in my experience it was pretty close to an equal split.
Look at the number of graduates: the only way that is true is if almost every single physics or math graduate goes into programming. The fact that you think it's true is just further evidence of bias.
According to the Stack Overflow Developer Survey, about 8% of professional developers with degrees majored in math or natural sciences vs. 63% in CS, software engineering, or computer engineering. There could be some sampling bias, but that's a huge difference.
I suspect SO surveys are heavily biased towards younger developers, but even taking those 2019 numbers at face value: about 20% of professional developers have no degree, and of those with degrees it's about 75% CS/information systems/sysadmin/webdev, 17% maths/physics/engineering, and 8% other. So a typical 15-developer team would be 9 with CS degrees, 3 with no degree, 1 with an engineering degree, 1 with maths/science and 1 other. The non-CS folk are not exactly rare unicorns.
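That team breakdown can be sanity-checked with a quick sketch. The percentages are the approximate survey proportions quoted above (not exact figures from the survey itself):

```python
# Rough sketch of the team-composition arithmetic above, using the
# (approximate) proportions quoted in this thread: ~20% of developers
# have no degree; of those with degrees, ~75% studied CS/IS-type
# subjects, ~17% maths/physics/engineering, ~8% other.
team_size = 15

no_degree = round(team_size * 0.20)          # 3 people
with_degree = team_size - no_degree          # 12 people

cs = round(with_degree * 0.75)               # 9 people
maths_phys_eng = round(with_degree * 0.17)   # 2 people
other = with_degree - cs - maths_phys_eng    # 1 person

print(no_degree, cs, maths_phys_eng, other)  # 3 9 2 1
```

So roughly 6 of the 15 would not hold a CS-type degree, which is where the "not exactly rare unicorns" point comes from.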
Well then why did you say this:
You said explicitly math and physics degrees vs. CS degrees. And you previously said the best programmers tended to have math or physics degrees, or something similar.
This isn't me being pedantic, it was the entire context of the discussion.
The point is that you are prone to confirmation bias as evidenced by your belief that it's close to an equal split. Your mental model is overrepresenting people with physics and math degrees likely because it confirms your belief that they are better programmers.
>So a typical 15-developer team would be 9 with CS degrees, 3 with no degree, 1 with an engineering degree, 1 with maths/science and 1 other. The non-CS folk are not exactly rare unicorns.
That's not the point. It's that you are more likely to remember the background of the 1 person on a team who has a math degree, because she is relatively rare compared to all of the people with CS degrees. This is a well-known and well-documented phenomenon, and it's one of the primary reasons that anecdotal evidence, even a large amount of anecdotal evidence, is so often wrong.
I was giving those as examples of degrees that are normal and don't stand out. We don't think there's anything particularly odd about a programmer with a maths or physics degree. That's all I was saying.
> you previously said the best programmers tended to have math, or physics degrees or something similar.
A category which would include engineering, at which point we're at 20-25% of professional programmers with degrees by your numbers (which I still think are significantly biased).
> The point is that you are prone to confirmation bias as evidenced by your belief that it's close to an equal split. Your mental model is overrepresenting people with physics and math degrees likely because it confirms your belief that they are better programmers.
My mental model is that it's an equal split between CS degrees and not, and per your own sources that's accurate. You're fixating on a couple of specific examples of non-CS degrees that I mentioned when that's completely beside the point.
>The best programmers I've worked with have mostly not had CS degrees (tended to have degrees in maths, physics, or that sort of area).
That's the entire context of the discussion. The assertion that CS majors have worse outcomes (with respect to programming ability) than math, physics or similar majors.
>A category which would include engineering, at which point we're at 20-25% of professional programmers with degrees by your numbers (which I still think are significantly biased).
If you are including math, all natural sciences, and all other engineering degrees you get 17%, not 20-25%.
>My mental model is that it's an equal split between CS degrees and not, and per your own sources that's accurate. You're fixating on a couple of specific examples of non-CS degrees that I mentioned when that's completely beside the point.
There is no other logical way to parse that statement than that you were talking specifically about math and physics.
I get it: you don't like that there are numbers that contradict you, so you're grasping at straws, trying to find alternate interpretations to reconcile your statement with them. You obviously don't like being wrong. I don't either, that's fine, but no one other than us is reading this far down. There's no point denying you farted when there are only two of you in an elevator.
I don't entirely disagree with you, there were certainly some later classes that were similarly not useful in the long run, but it wasn't a total crapshoot. My senior year included a year-long group project on a team of ~10 people that basically took us all the way through the lifecycle of a project -- from inception, to design and architecture, to polish and QA, to 'releasing' it. It was a very useful course that placed you into a startup-like atmosphere.
But I think this kind of confirms that apprenticeships may be far more useful to the programming field than college degrees. If my CS program did not include that senior course, I would probably pretty vehemently agree with you. And anecdotally, at my current workplace, one of our best programmers is a kid we hired basically right out of high school who has since grown in skill considerably thanks to the attention of more senior engineers.
Of course it is broken: it rarely mentions naming or debugging, and never emphasizes reading code. The real fundamentals are not there, and you learn them on the job.
The attention of senior engineers is the single most valuable, expensive and scarce commodity of a modern software company. The incentives are absolutely not aligned for most companies to make it worth their time to hire junior engineers.
But obviously that's a problem - because, as you say, where else will senior engineers come from? And I don't think we have a good answer here. The old-school answer was apprenticeship: a company trains you up, and in exchange you work for them for a time. But most companies are loath to do that because you're likely to quit and go somewhere else for a pay bump as soon as you have enough skills and experience that you're worth hiring.
For my money this, right here, is the real problem with modern programming / career development. Whiteboard interviews are the least of our problems.
This is only true in bad companies. Good companies understand that people move on, and don't assume that someone is hired into a role forever. Once you realise that hiring is an expensive process that you will always be doing, you can optimize it appropriately.
Some really good companies even use it as a point in job adverts. There are plenty of senior developers who actively want to teach and mentor juniors, and will be happier working in companies that encourage that process. It's a good way of retaining those staff.
Yes, they understand this, so they simply do not hire people who need to be trained for a year or two to be effective hires.
One problem is that the companies with the means to fund this - FAANG and the like - are also the ones incentivized to gatekeep the most, to maintain the elite status and engineering superiority that gives them their business edge. And startups, despite being naturally less risk-averse, are still unlikely to run formalized apprenticeship programs for juniors.
What happens instead is that you get the half-hearted approach of hiring college students to be interns or co-ops. Not all students are able to intern. The ones who do have a leg up once they graduate. Hence the stellar reputation of U of Waterloo grads.
Also, it seems to ignore the fact that training people is actually an incredibly useful way to hone your own skills as a senior dev, and that having to teach forces you to crystallize, simplify, and explain thoughts and processes that you possibly never challenged before.
Tbh, I think I'm more productive and learning more when I have a decent intern to coach in my team than otherwise. So it's really a win-win situation.
Investing in your workforce for the long term: when you're big enough, I think you should do it. Having a "reserve" is also useful. You don't have devs doing nothing; you have devs learning, teaching, and ready in case a business opportunity appears.
But there are bigger problems. I've experimented with training people in various ways over the years.
Reality is programmers move around a lot. They are attracted by interesting new problems where they feel they're learning. Staying at one firm for 20 years isn't likely these days. There's nothing wrong with that, but it means if you spend a few years training someone then after that time period they may leave anyway, even if their comp is reset to be competitive, because the new place can offer them equal comp + new problems.
Another issue is that a lot of training junior-to-senior is about imparting experience, wisdom, beliefs etc. At some point they can code and it's about the decisions being made, rather than inability to do them. A characteristic of junior devs that are growing their skills is they tend to latch on to trends harder and quicker than senior people who have maybe seen it before, and can differentiate their CV without buzzwords. If a junior comes to work one day and says "We need to Kubernetize all our things" and you say "Actually, our server count is stable and low, we don't need to use Kubernetes, but we do need this new customer-facing feature implemented" then it's quite possible they'll get frustrated, want to argue with you. Of course replace Kubernetes with Haskell, Linux distro of the day, Rust, Go, whatever seems hip and new.
It can just end up being draining for everyone. Of course debating these issues can be teaching of a form, but often juniors don't see it that way. The student wants to become the master faster than sometimes makes sense.
If it takes years of training until the person is useful and worth it, then there is something wrong with the way training is organized. We give juniors easier tasks than we give seniors, and train them, but we also expect them to be useful basically from the start.
It really should not take years until they produce enough work to cover salary + training.
> If a junior comes to work one day and says "We need to Kubernetize all our things" and you say "Actually, our server count is stable and low, we don't need to use Kubernetes, but we do need this new customer-facing feature implemented" then it's quite possible they'll get frustrated, want to argue with you. Of course replace Kubernetes with Haskell, Linux distro of the day, Rust, Go, whatever seems hip and new.
There is no difference from a senior wanting to change things. This sort of conflict is normal, and the final decision is not made by the junior.
Most people actually can handle not being given their way all the time. If they have no zone of autonomy or decisions of their own, then they will get frustrated. But that zone can be smaller than a massive architectural decision.
Moreover, having a portion of people pushing toward new technology and willing to experiment is something a company needs to stay healthy. Otherwise you will all stagnate.
> There is no difference between senior wanting to change things
Seniors are more likely to have been through several jobs and environments, and learned that tooling choices aren't that big a deal at the end of the day. They've also already (hopefully) got some accomplishments under their belt and don't need to redo other people's work to have something to show - they're more likely to do higher risk strategies like trying new things as a consequence.
“Why would we train them up, they’ll just leave anyway” -> “screw this place, I am going to leave when I get the chance” -> leaves -> go to step 1
1) Anyone with >0 years of experience outperforms a Waterloo internship candidate on the coding or algorithms round.
2) Anyone interviewing for a senior position performs well on the other rounds and fails only system design.
I'm pretty sure it's large companies that can afford to play long-term big-picture strategies with talent, and small ones that have such low-rent concerns as which languages and frameworks you know.
But is leetcode really the way to find them?
What removing a metric tells you is that they regretted hiring people who looked fine on that metric. They have no way to determine whether people who failed on other metrics were better than the people they hired.
The problem is that if you measure someone by a metric you can’t actually disregard it in your decision process. Even a double blind affects the subject’s behavior.
You didn't address this part of the problem. This friction is one of the reasons the job market is so distorted.
When you apply to a company, they have to assume you aren't very good, because most people who apply aren't very good. (Because people with strong skills get snapped up, and people with weak skills spam their resume everywhere they can.) Figuring out who's worth spending time interviewing is a hard problem in itself. And there's no silver bullet here: people lie about their work experience all the time. There are so many user accounts on GitHub with copies or forks of random people's code, with basically no changes. I suspect they exist to support lies on resumes.
I don't think it distorts the market, but it is annoying. A recruiter I talked to a few years ago said she thinks it's crazy we don't use an agency model for programmers like actors do. The idea there is that good programmers pay a small percentage of their salary to a manager, whose job is to find you the best roles that suit your skills, negotiate pay on your behalf, and so on. She tried to set up a business doing just that, but she couldn't get enough clients to make it work. Good programmers balked at the idea of paying a few percent of our salary to someone, on an ongoing basis, to look after our career. Having to prove your skills in each and every interview, and form those social connections on our own? That's a choice we make.
It's extremely saddening how prevalent this prejudice is.
But what if I am still looking for work and companies literally don't reply to my applying to them? I might be the next John Carmack but if nobody gives me a chance (for reasons outside of my control and unrelated to my proficiency) then according to you I suck. :(
It's very broken to assume that skilled people get snapped up immediately so whoever is available must be mediocre (or bad).
It is an awfully imprecise assumption!
Look, companies wouldn't be advertising roles if they didn't think qualified candidates (like you?) were out there. But it's a numbers game; of course there are more unqualified people looking for work than qualified people at any given moment. Qualified people don't get snapped up immediately, but they aren't usually actively looking for work for long. And they're often picky about which places they apply to - for good reason. A highly qualified candidate might apply for 3 roles, get 2 offers and accept 1 of them. A weak candidate might apply for 50 roles. If those are the only two people sending out resumes (or, in general, if the pool is 50% strong candidates and 50% weak candidates), still only 6% of resumes will come from strong candidates.
That's not none. And I really hear you about how frustrating it must be when companies don't even bother to reply - I mean, that's pretty rude. But... what behaviour do you expect? What would you do with a pile of 100 resumes if you expected only 6 of them to be strong candidates? Should they bring all 100 people in for interviews, just in case there's a young John Carmack amongst them who has terrible resume-writing skills? (I've interviewed 2 people who fit that description out of the 400 or so I interviewed in the last year. They definitely exist. But finding those people is prohibitively expensive for most companies.)
Flawed? Yes. Biased? A little. Prejudiced? That seems like a stretch. Can you think of a better system? That conversation seems more interesting than just complaining about it.
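The numbers game is easy to sketch. The application counts below are the illustrative figures from the comment above, not real data:

```python
# Why most resumes in a pile are weak even when half of all candidates
# are strong: strong candidates send few applications, weak candidates
# send many. Figures are illustrative, taken from the comment above.
strong_apps = 3    # roles a strong candidate might apply to
weak_apps = 50     # roles a weak candidate might apply to

# With one strong and one weak candidate (or any 50/50 pool):
strong_share = strong_apps / (strong_apps + weak_apps)
print(f"{strong_share:.0%}")  # 6%
```

The asymmetry in application volume, not the quality of the candidate pool, is what makes any individual resume a weak signal.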
BTW I am not exactly young - 40 y/o with 18 years of professional experience (not claiming anything about quality). I was just objecting to your general premise that if somebody isn't snapped up immediately then they must be mediocre, because I've witnessed programmers many times better than me (whose sole efforts turned entire companies around) slog around jobless for 6 months, unable to move beyond a 2nd interview, even when EVERYBODY told them they liked their expertise and demeanour and that they were a good cultural fit.
But you are very likely correct that it's a numbers game and that various circumstances prevent companies from actively looking for the gems.
So again, I do my best not to take anything personally.
On the surface it's a numbers game. But two-thirds of getting hired is a confidence game.
Some programmers are really good at what they do but they just suck at playing the game. Or more likely they suck at it worse than the average hiring manager.
As for confidence, I guess I should learn to fake it already. I am a realistic, down-to-earth guy who doesn't deny it when he doesn't know something - nobody can know everything. But that's likely not the point; more likely it's about projecting an image of "nothing can give me pause"?
It's a skill: some people are naturals and some aren't, but everyone can get better at it.
A 10 means "The interview process is fine. It judges people fairly and objectively, and works well for both candidates and companies".
A 1 on the scale means "The whole interview process does a disservice to almost everyone it touches, and reflects badly on our industry as a whole".
Where do you think we are?
Personally I think we're at about a 7. Which is to say, I agree with you. I've interviewed people like that and it breaks my heart every time to hear their stories. But I think there's a silent majority for whom the interview process works as intended.
- About 0.5% of people I've interviewed had amazing skills but were unable to explain those skills on a resume, and wouldn't be able to get their foot in the door at most companies they applied to
- At a guess there's probably another ~3% or so of candidates who just for one reason or another don't come across well. They don't look or sound like they know what they're talking about, or something else is going on for them. But they actually have great technical skills when you get down to it. I suspect those people struggle to find work in most normal hiring processes.
- (I have other criticisms too - like how we don't give people feedback after interviews)
But that's ... I mean, it matters, but I think the cohort of "unappreciated gems in the rough" has to be in the small single-digit percentages. There are a lot of blog posts complaining about hiring in general that hit the front page of HN, but I don't think they're a fair representation of the state of software engineering hiring across the board. Great people getting overlooked are the exception. The awful truth is that most people, most of the time, are judged fairly in job interviews based on their skills. It's just that programming is really hard and almost everyone in the world is terrible at it. It's so hard to learn that you can go into massive debt and spend years studying it in school, and still be mostly unemployable out the other end. In fact I suspect most people fresh out of school struggle to find work, because they just aren't very good yet.
So no wonder these posts get upvoted on HN. People have a really good reason to feel angry and let down by the system. The story that you're a gem being passed over by soulless corporations is a much easier pill to swallow than the idea that you're being passed over because you aren't very good at programming. And your degree means nothing, and from here it'll take you years to get good at programming, if you ever manage it. And nobody wants to take the time to teach you on the job. And nobody thought to tell you any of this while you were in school.
I don't know whose fault it is - if anyone's. Companies are doing what's in their best interest. Schools are doing their best to teach CS to everyone who wants to learn it. People think going to school to learn programming means they can find useful work out the other end. I'd like to think that most do, eventually. But there's a chasm in the middle that nobody talks about. A chasm between knowledge and programming jobs. Many people never find their way out of that gap. We don't even tell people it's there.
I think people with your mindset have just repeated this over and over again to the point that we all take it as axiomatic. It also strokes our ego ("I can do something that basically nobody can even learn how to do!") so it's easy to maintain the farce.
This is the second time in this thread that I've seen the claim that the anger is driven by people who feel let down by the system. That's not me; the system has worked really well for me. But I still think it is a hall of mirrors that could use serious improvement. Not a 7, more like a 3 or 4, and held back by both people who think it works really well and people who think it's a bad solution but that there isn't really a better one (which I'm more sympathetic to).
I've often posited among friends in the industry that it's partly due to apathy and mismatched expectations. College students go into CS because they expect a good paying job, not because of any specific aptitude or interest. CS isn't the study of software engineering, nor of the act of programming. They graduate, with no appreciable skill, and no real desire to learn, and can't find work. We can argue about fairness until we're blue in the face, but the reality of the situation is that motivated students who are actually interested in learning have vast free resources available to them to learn and free resources available to them to demonstrate their competence. Anyone not taking advantage of these resources is going to be subpar compared to others in the labor market.
Pretty much anyone who went to college in the last 20 years should have been able to anecdotally identify the stand-outs in their classes, and in general on the other side of the pipeline those are the only folks having an easy time getting hired. I dropped out of college, even though I was one of the stand-outs, because I wasn't learning anything I hadn't already taught myself. I've had less difficulty in my career than some of my former classmates I keep up with, and my salary is multiples of theirs when they are employed, even though we're all the same age and they're technically more credentialed. It comes down to the fact that most people are just not very good at programming or at systems design. In fact, I'm not a good programmer; my skill-set is very much in the arena of systems design.
This is not unique to CS. I feel this is the general mindset among most people who go to college: "I get a degree and get a job." It's the reason so many young people come out of college with a 4-year degree and no job.
Just getting a degree doesn't mean you are smart, curious, or driven. I have friends who got liberal arts degrees who are smart, curious, and driven and went on to have great careers and friends with business degrees who don't care and are unsurprisingly "underemployed" by their degree, but not by their personality and the way they live their life.
Which is, really, as good a way to learn about someone as most of the techniques we try. You only have ten or fifteen minutes to ask the company questions. What did you think was worthwhile to spend that time on?
I also am kind of picky although I am not sure how wise that is in the current crisis.
I think it depends on the context. To be picky, you could just send out a lot of resumes, and I as a prospective employer wouldn't have any clue how many you've sent, as long as you are still at your old company the whole time.
However if you have a good emergency fund, your old company folds, and you are being picky, people start to suspect you are unemployable after a bit.
Is this a problem for other professions? Why not? When an accountant with a successful career spanning 15 years applies to a new job, does the company "have to" assume they know exactly nothing?
It does distort the market. It trades off against the desire to switch jobs, which distorts the availability of job switchers. When I get an email from a recruiter, I'm not just thinking "would that job be better for me?", I'm also thinking "am I willing to practice and perform the whiteboard code ritual right now?".
Such things do exist (at least in some form) but I haven't had a good experience. Same old story: like everyone else, they're looking for seniors, turning down juniors, and complaining about shortage of talent.
Storytime: I used to work at a huge Japanese multinational (you can probably guess which one), and I rose through the ranks pretty quickly. After two years (which given their turnaround, made me pretty ancient), I left the firm for greener pastures.
A few years later, they started a new sub-division, and a personal contact recommended me. It was a decent pay bump, and despite the negative press, I hadn't minded working there during my tenure. I figured I'd be a shoo-in. It was for a non-technical role (PM), but they wanted someone with a tech background, which I had. They then wanted to give me an online engineering test, just as a formality. It was a quick test, and I solved it with little issue.
Fast forward 2 weeks and I get a rejection email, saying that my code wasn't up to snuff, and they'd be moving on. It took everything in me to not write back saying, "Hey, go to your main service and log-in. Ok yeah, so the interface and client backend to your SSO and the user management services that millions of your users interact with each day -- I wrote all of that. Oh yeah, your absurdly complicated coupon system -- I wrote a lot if not all of that too."
Companies keep complaining that they can't find talent, while their interviewers are asking candidates to implement red-black trees on a whiteboard in 5 minutes.
Sometimes it is worth writing back, because people often don't realize there are flaws in the process they are using. Either they will ignore the feedback and it's business as usual, or they will take it and work toward improving the system.
Have you gotten any actual work done in the last year?
> in the hiring space.
Oh wait. This is your job.
a) Open time. I give them a couple of problems in the morning and they can take as long as they want to think about them before the interviews. This takes away the 'deer in headlights' situation that so often happens to people - all of us. I'm good on my feet, but 1 time out of 3 'I just don't see it' immediately. But if I have some stress-free time, I usually do.
b) Don't test their knowledge of algorithms. Who cares if they've practiced certain things a lot? Doing homework is a small measure of what you want. I give them fairly basic problems that have nice side-show questions to ask and just have a discussion.
c) I try to test for knowledge where they should have it: if they've been on tools & build or dev ops, they should have a good grasp of things there. Another way to say it: you're looking for strengths, not weaknesses. I keep an open mind and think 'how can we leverage this person's talent?' Maybe they're not good over here, but better over there.
d) Fairly simple code, idiomatic, etc. not rocket science.
e) Mature communicator, and by that I almost literally mean not too crazy. I think most of us are 'abnormal' and that being a 'decent communicator' is a little bit rare. Companies are full of weird dynamics; people have to be a little bit resilient.
I’ve definitely interviewed with companies where I was put in really unnatural situations that I struggled to shine in. For example, I bombed an interview in which I had to do a live coding challenge where I had to talk through what I was thinking while coding, which the interviewer insisted on so they could understand “how I think”.
I thought it was an incredibly cumbersome and useless request because I never talk when I code, and coding in front of others in an interview is tough enough as it is. I asked if we could discuss the solution once I completed the problem, but they declined that idea.
Needless to say, I didn’t get the job and the assessment felt like a huge waste of time. I think a lot of people see companies doing stupid stuff like this and get frustrated because candidates are often capable while being placed in high-pressure situations that don’t enable abstract creative thinking, which programming requires.
FWIW, there is limited time to collect data about a candidate in the context of an interview, and being able to observe a candidate's thought process can help a lot with that. Obviously, this biases hiring towards candidates that are more observable, all else being equal.
I’m sure it varies between interviewers but the lineup I had at google clearly didn’t want to dedicate more than a minute or two to non-algorithm questioning.
Whenever hiring comes up on reddit or HN, I see a lot of posts that can only come from people who haven't been an interviewer much.
First rule of hiring: most of the people you talk to want your money. The candidate is selling themselves to you. They know firing people is hard and there are almost never consequences for misleading interviewers.
What happens when people are asked to talk about their prior projects, experience or really anything that isn't a highly controlled and repeatable coding question?
1. They pass off things they saw or read as their own experience.
2. They claim a team's accomplishments as their own.
3. They massively exaggerate or try to BS you in other ways.
4. They give uselessly vague answers, not necessarily deliberately.
Maybe you don't do these things, but back when I bothered asking these sorts of questions I did encounter such answers pretty frequently.
Companies have converged on live coding because that's something concrete, real and largely un-bullshittable. Yeah, toy programs in interviews aren't "real" programming but it's a lot closer to real than listening to someone ramble in a disorganised way about "their" previous project, and at the end realise you still don't know what that person actually did and what was done by others.
The "tell me a little about an interesting problem" question is usually there to gauge your ability to summarize a cool problem and solution down to an appropriately sized story. Sometimes when I ask this question, candidates just launch into stream-of-consciousness expositions, talking just-in-time as the details come to their brains. At the 2 minute mark, I start thinking come on, don't do this to yourself. At 5 minutes, I will gently try to hint to the candidate to try to summarize and wrap it up. Some people just try to fill every silence with words and I have to finally firmly yank them back and cut it off. I hate to have to do it, but if the candidate can't time-manage the answer, I have to time-manage the questions.
* they're not cheap (although not that expensive all things considered), assuming you spend say 10 person hours on two calls and an on site.
* they produce false negatives. on the spot exam style algorithms interviews don't reflect day-to-day work, and they filter out people who might be great at the job, but aren't used to the specific type of stress that interview creates.
* they can produce false positives. failure to check for interpersonal skills in leet koders can leave you wishing you had never hired them.
* they're painful for the candidate. ideally your role is the best fit for a candidate. people only have so much emotional and logistical bandwidth. they may take an offer after interviewing with a handful of companies before they ever find you.
now, I don't have numbers for those last three points. I don't know if anyone does. but we can look at most hiring processes and see that they are a problem, we just can't tell if it's a big enough problem to consider the process "broken".
I’m not seeing a compelling problem with the outcome. The question is: are companies not able to hire good people? And: are good people not able to find jobs?
Interviews are also expensive for a company. And I don't really see a clear or compelling argument that interviews somehow cost the candidate money. How much money? If the candidate doesn't have a job, then what exactly is the money or time cost to the candidate? Doesn't the eventual job with income outweigh all interviewing costs? Isn't the cost somewhat irrelevant if it's not egregious, and there's really no choice if you want a job? Isn't the alternative, choosing not to interview, potentially way more expensive?
If you’re going to frame it as expensive, I think it’s more than fair to counter with you need to average the cost of the interview over the time period that you’re employed. If you interview for 10 hours, and you work there for 5 years, the cost of the interview is 20 seconds per day.
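A quick back-of-the-envelope check of that amortization claim (assuming a 365-day year; the 10-hour and 5-year figures come from the comment above):

```python
# Amortize interview time over the length of employment.
interview_hours = 10
years_employed = 5

interview_seconds = interview_hours * 3600       # 36,000 seconds of interviewing
days_employed = years_employed * 365             # 1,825 days on the job
cost_per_day = interview_seconds / days_employed # seconds of interview per day worked

print(f"{cost_per_day:.1f} seconds per day")     # roughly 20 seconds per day
```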
> they’re painful for the candidate
I agree that interviews have a time cost and that candidates have a finite budget. That means someone looking for a job should invest wisely, figure out how and when to decline an interview, and study the companies before interviewing.
This is subjective, and it’s not a solvable “hiring problem”. This is something the candidate has to fix for themselves. Nobody is going to give you a six figure salary and ask you to work with them for many years without vetting you. It will always take time, and it will always be somewhat uncomfortable and emotionally draining. Some people prepare for interviews, and some people like to interview. If you view it as a skill, one that you can learn and improve at, maybe it will become less painful for you. But don’t expect that companies are ever going to do that for you.
It doesn't matter what the process is, some form of song and dance is needed.
It would require some sort of longer-term data gathering around the rejection / ghosting process.
For example, Google claims to be very objective and data-driven about their hiring process, but once a candidate is rejected there is no further data.
How would one solve such a problem?
If we agree that it's just window dressing around a "song and dance" (I prefer to call it "the rituals of gate keeping"), there is no problem to be solved.
But I still feel like giving in to cynicism is the wrong answer.
If Google could know when it was about to deny a speculative false negative, it could hire them exclusively into a part of the company where everyone is a speculative false negative. Then you could just watch them compete with those who pass and try to measure something objective.
For what it's worth: Someone who works well in an established company isn't always the right person to hire in a young company.
My point is that there's no control group and therefore no way to test the process in an empirical way and possibly improve it.
But the idea of finding a way to measure the performance of people who didn't get hired seems ludicrous, so we're back to being stuck in the status quo.
That's what I'm on about. I would love to see an article that talks about this rather than the screeds and diatribes that show up on HN so often.
Every job is likely to legitimately reject good candidates.
Most hiring is for a specific position. If you get 10 viable, or even excellent candidates, you can often still only hire one of them.
The one that gets it will take the job in their own direction. Almost by definition, the winning candidate is therefore the best fit.
Whatever roles the other candidates end up in will go in their own directions. Even if, a year on, you can objectively compare the performance of the one you hired against the one you didn't, their performance was within the context of the role they did win. They may not have been as good a fit for the role you didn't offer them.
I see it slightly differently; I think that 95% of the success of companies is driven by key moments or decisions by 5% or less of the work. (That’s usually by a similarly small slice of the workers as well.)