This article neatly demonstrates that resumes are not necessary¹ and that not using them can unlock new sorts of candidates.
However, I don't think there's a conclusion to be made about the actual method used here. I suspect that it worked because it was different, not because it carried a fundamentally strong signal. If everyone did this, project descriptions would be gamed even more than resumes—it would select for people who prepared for the selection process² more than anything else.
This reminds me of various captcha strategies I've seen used by small forums to great effect—solving some math, typing a word into a text box, choosing a popular character's picture, etc. They all work, perfectly. But only because spammers don't care about the small fry: it's not worth their time to modify their bots for your little site. If any given captcha becomes widely used, or your forum grows big enough, they will bypass it trivially.
Now, an essay like this isn't quite as bad as a captcha, but the idea is the same: it works because it's new and different. If everybody used it, it would probably be a step back.
Ultimately, I think the real moral is that more companies should do their own thing, even if that thing is not great in the abstract. Being different carries a value of its own, and it breeds biodiversity that's healthy for the system as a whole. (Of course, many of the things companies try are really bad for various reasons, but that's a different story…)
¹ In particular, most people have a bunch of "red flags" they look for with, at best, cursory rationale—everything from passing on people who didn't go to the right school to those who have breaks in their work history, based on "common sense" or "experience" rather than anything meaningful. Most of these criteria seem counter-productive.
² I also think this is really true for college admissions and especially the admissions essay. A project blurb for hiring is more or less the same idea in a new context.
I find myself almost irresistibly drawn towards the conclusion that people will game the hiring process, because it is a game. It has winners and losers. Everybody is trying to make themselves look better than everybody else, even to the point of making themselves look better than they really are: job candidates, employers, internal and external recruiters, everybody who is selling a hiring product, and so on. Everybody is trying to emphasize their positives and hide their negatives, while searching for the negatives of the others.
A drawback to "do your own thing" is that it is enormously inefficient for candidates, and the results may turn out to be based on the luck of a particular candidate guessing what a particular employer's game is, or on being generally better at games.
I don't know a solution. What strikes me is that as broken as the system seems, we still manage to hire good people most of the time.
In academia, someone observed this: "The revolutionary idea of one generation becomes just the stuff you say to get tenure in the next generation."
Maybe most companies shouldn't be different. But there's plenty of value in not being like everybody else.
Interestingly, the simplest conclusion would be that everyone is good, most of the time. So if we threw hiring out the window and picked employees randomly, we might still get the same result.
Which actually isn't too outrageous: if we have 10% unemployment, wouldn't that mean 90% of people (or more, actually) are adding value?
You might be right that randomly picking candidates might be just as efficient, but I can't imagine anyone taking that risk.
That's a load of highly-enriched equine fertilizer. I have had this discussion here before. I asked for information. I didn't get much. Some worst-case hypotheticals that, to my knowledge, have never happened anywhere. A buttload of management failures surrounding bad hires where management was responsible for the vast majority of the costs, not the hire. A couple of legitimate bad hires where the costs were significant, but not in the same ballpark as the figures people like you parrot.
Also ignored in this conversation: what is the cost of keeping your req open? How much money are you losing, directly or indirectly, by not having someone in that role? Why do you think someone can't grow into the role?
The bottom line is that "slow to hire and fast to fire", as difficult as it is, is very sound advice!
Without knowing you and jumping to a conclusion based on your comment, I assume you don't appreciate that employee-boss relationships are adversarial by nature, and there are often considerations management makes that might seem bad but are really a product of their understanding of what's possible and realistic within limiting circumstances...
I used to complain with my friends about all the things we would change in our boss and the company we worked for and now after many years I realize that at the time I had no flipping clue about anything.
Could the onboarding process be more akin to a mentor-apprentice relationship and use open source as the avenue, with the burden of work placed on the apprentice? Your company has both closed and open source projects that many of your engineers contribute to and manage. An apprentice-level candidate works on and applies a patch with feedback from the mentor-level engineers. At some sufficient level of acceptance based on performance, the apprentice is brought in for a culture-fit type interview and potentially offered a position.
From the view of the apprentice this may seem like MORE work than writing resumes and prepping for technical interviews. But from my point of view as an apprentice, I'd be learning skills that seem more useful than gaming resumes, screens, and technical interviews, and adding to a portfolio that may never get glanced at: skills like communication and coordination with a team, real-world coding experience, and pushing to production. From the viewpoint of the mentor, I see candidates that have already been introduced to the internal workflow and show the communication necessary to work with my engineering team.
It is hard to pull wool over someone's eyes for 3 months especially in a results oriented field like programming.
The test seemingly only has to be 0.1% effective to pay for itself. The thing that's hard to imagine is that the test is even less than 0.1% effective, because it's a scam. This is also how things like a $5 extended warranty for a flash drive get sold.
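To make the break-even arithmetic concrete (the dollar figures below are illustrative assumptions, not numbers from the comment):

```python
# Hedged back-of-envelope numbers: assume a test costs $100 per candidate
# and a bad hire costs $100,000.
cost_of_test = 100          # dollars per candidate screened
cost_of_bad_hire = 100_000  # dollars lost to one bad hire

# The test pays for itself if it prevents bad hires at a rate of at least
# cost_of_test / cost_of_bad_hire per candidate screened.
break_even_rate = cost_of_test / cost_of_bad_hire
print(f"{break_even_rate:.1%}")  # prints "0.1%"
```

Under those assumptions, the test only needs to catch one genuine bad hire per thousand candidates to break even, which is why "it pays for itself" is such an easy sale.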
Do you know what's even more expensive than a bad hire? Letting an extraordinary hire get away because your snake-oil test didn't work. But that loss won't make it onto a spreadsheet, so who cares?
Could you actually quantify these costs? A good agency will offer refunds if a candidate is let go shortly after hiring.
I'll add a note that it's not necessary for every company to do something different. It's perfectly okay for your company to copy the way this one did things. It's perfectly okay for five or ten companies to do that. Only if a large fraction of the market starts copying a particular strategy do you need to switch to something else.
That'd be a strategy that would better suit both the hunters and the hunted.
Which reminds me of Jeff Atwood's original "captcha" on the Coding Horror blog. It was a static image of the word "orange" every time. It was the most trivially defeatable captcha ever, but he didn't care because it worked. It would have been senseless for him to invest time and effort into implementing a complex captcha engine, when the existing solution was filtering spam bots just fine.
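As a sketch, the entire validation logic of a "static answer" captcha like Atwood's amounts to one string comparison (the names here are illustrative, not his actual code):

```python
# Every visitor sees the same static image of the word "orange", so checking
# the captcha is a single case-insensitive comparison.
EXPECTED_ANSWER = "orange"

def captcha_passes(submitted: str) -> bool:
    """Return True if the commenter typed the (fixed) captcha word."""
    return submitted.strip().lower() == EXPECTED_ANSWER
```

A generic spam bot leaves the field blank or stuffs it with junk and fails; any human who can read the image passes. The security comes entirely from no bot author bothering to special-case one blog.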
This really resonated with me - especially as a startup, doing what feels right just makes sense, particularly when it's consistent with your team's culture and even sense of humor.
Real example - when we're not getting a sense of the 'real' person we're interviewing, we take them out for a friendly game of foosball. It's brilliant at removing nerves, but also gives us great insights into their team skills, competitiveness, and more.
Early in Google's hiring, they heavily selected for Python, which was a relatively new language at the time. It's debatable if there was something technically unique or strong about Python at the time. But many of the people who knew it at the time also happened to be programming enthusiasts, Google's right kind of crowd.
It's similar to how ingredient X isn't as important as X's relationship to the status quo. Almost... hipster hiring, in a way.
I find this a funny one. At one previous job the head of HR would highlight "breaks in work history" on the CV with a pen, three months here, six months there, as though that meant something.
I have to say though, that in my experience, these experiments in sourcing work quite well when your hiring is small. The moment you hit some sort of scale, it becomes very very difficult, if not impossible to run and rely on such experiments.
E.g. in the first growth phase at Box, we were tasked with hiring 25 engineers a quarter. At that scale, the company deals with too many resumes and too many stakeholders in the hiring process. And at that point, you also have a group of people explicitly looking at resumes, less involvement from actual hiring managers, deadlines to meet, land to grab, etc. Not saying one thing is better than the other, just that hiring at scale is an entirely different game.
The other thing, which is implied in the article but may get lost if the reader isn't careful: regardless of how a candidate is sourced, the interview bar still remains the same. That is, AJ must have had to clear the same or similar technical interviews as the other engineers who got hired there.
Yeah, scaling this stuff is hard. I do think there's a big danger in scaling it by offloading filtering powers to non-technical people, because then you have to rely on proxies. Proxies aren't inherently bad, of course, but the ones we have now (school and past employment) are pretty bad, and if, ultimately, we're getting things wrong more than we're getting them right, that outweighs the temptation to cut costs and time.
For a company like Box with a super strong engineering brand, it's OK to have a pretty high false negative rate, of course. You can reject a lot of good people and still have a revolving door of others who want to work there. However, smaller companies often take their cues from big ones and adopt the same processes without realizing that they may not work the same way.
And yes, thank you so much for calling that out. AJ had to meet the same bar as everyone else. Fortunately, he killed it.
It would be great if the majority of companies used both the resume and the cover letter effectively. It feels like most companies that require a cover letter only do so to screen out the laziest 10% who can't be bothered to write up a generic 1 page essay filled with ass-kissing and vague jargon.
The cover letter is just a relic from the olden days, when the application process was slower and more formal. There were fewer applicants for each position, so HR probably had more time to read and screen.
This study presents an interesting alternative: Let people submit some text along with their resume on any topic of any length, and see how their personality comes through in the writing. Probably wouldn't work extremely well at a large company, but it seems like it served KeepSafe quite well.
I am biased: I think most HR people are useless, and most tech people aren't great at picking candidates either. They seem to have their own biases that they can't get over (for example, putting too much weight on technical skills and none on soft skills).
Honestly, if I were a company, I'd send select employees to school. Either take classes or, better yet, teach a class. You'd get a semester-long interview process, and the employee would get something out of it.
When I interview, I tend to spend most of the time asking in-depth questions about the projects I find most interesting on the resume. What was easy? What was hard? X sounds like it would be a problem, how did you solve it? What was fun? What was head-bang-on-the-wall miserable? Generally this gives a sense of whether or not there's any bullshitting going on, and whether the candidate has a good head for thinking about hard problems.
Finally, I'll ask a few questions to probe for "difficult-to-work-with" red flags and finish with a few fairly easy "technical challenges" that offer the candidate the opportunity to either walk away having solved the problem, or walk away having solved the problem and demonstrated understanding of the solution from top to bottom.
Tell me about the most frustrating time when you needed a thing, or consensus on a thing, and you had to go through way too much to get it.
Tell me about the most frustrating time when you needed a thing, or consensus on a thing, and no matter how hard you pushed, you never got it.
And, of course: The fastest way to be shown the door around here is to be an agitator of your colleagues on the grounds of race, sex, religion or any other attribute that has little to do with work. Nobody here is in the business of policing behavior and nobody wants to be. This requires perhaps more discipline than other places as the hammer falls harder and quicker here if things go awry, so it demands either heightened discretion or a heightened sense of self awareness and awareness of those around you. Do you think you would be able to work under such conditions?
It's quite possible the interviewee thinks the item is worth fighting, but you don't because of different principles/axioms. Are you looking for a justification in this case?
On the negative side, you might have someone complaining at length about a bikeshed issue (see http://blue.bikeshed.com/), or complaining about processes they had to follow that sound reasonable to you (for instance, "one change per commit", or "don't break the build"). Or someone complaining bitterly that they don't get to use technologies invented five minutes ago.
It's a lot easier to get information about what people stand for and care about by finding out what they fight against.
Actual quote from an interview ^
Glad to see an article about trying something different in recruitment, it is a BS industry, partly because it is so difficult to measure "success" and follow up the process with meaningful data.
The experiment from the article basically does what you describe, cuts out the stuff to ignore and goes straight to the project history. It would be beneficial for this to spread even if some people can suss it out in the current process, because there is still a large culture of looking heavily at things like alma mater and GPA.
This kind of feels like the general "software isn't rigorous like the real engineering disciplines" angst, which as far as I can tell is also bullshit. In reality, the "rigorous" old-school disciplines still manage to make colossal messes of complex, unprecedented projects, in the same way that software companies often make colossal messes of complex, unprecedented software projects. Sure, civil engineers can build normal roads and bridges and buildings reliably, but then again, software engineers have no problem throwing up CRUD apps and wordpress blogs. There are plenty of bridge collapses, exploding batteries, stalled tunnel-borers, and so on to match all of software's spectacular failures.
It turns out that complicated things are really hard to build, regardless of your field.
This thread isn't about who is more likely to fail. We are talking about hiring and who is likely to succeed. My point that companies have dumbed down what they call success so that managers can have jobs has nothing to do with engineering catastrophes. What is being called a success, I call a failure.
I don't think software is "dumbing down" by ignoring credentials. We're ignoring credentials because we are finding that they have no predictive power whatsoever for competence. Most software companies do still pay attention to experience (at least while sourcing), on the belief that it is not useless as a predictor. This article is interesting because it provides strong data indicating that experience (as presented by the candidate on their resume) is also useless as a predictor.
The point is that people with good resumes are often incompetent, and competent people often have thin resumes. This article presents data indicating that filtering by resume is no better than filtering by coin flip. Thus, if OP's data and analysis are correct, paying heed to credentials and experience is irrational.
I disagree with that actually.
It's my own anecdotal evidence, but from what I've seen, people with degrees write generally better code. They have a better understanding of algorithms, are more aware that what they're writing in a high level language isn't running by magic but is being translated into lower level constructs which may or may not be very efficient, they're more likely to realize there's an existing algorithm for what they're doing, etc.
I guess I would sum it up that people without degrees tend to work harder and not smarter.
That's why to be an ME, a degree is a gatekeeper. To get a degree, you have to pass the math classes.
Yes, I know engineers who can't do the math, despite having a degree. I've never known a mechanic to learn the math on the job.
Apprentice -> Technician -> Engineer
Or alternatively you can like myself start at the Technician level.
And yes, I did correct the "proper engineers'" maths on occasion; there is a bridge in Saudi that I had to fix.
The premise is not of ignoring them, it's about not elevating them to unreasonable degrees at the filtering stage. In most engineering disciplines, if you have project experience all that matters degree-wise is that you have an accredited one. That it came from MIT instead of Florida Atlantic University might add a couple points, but is otherwise irrelevant. The highlights of your projects get you the initial interview, where you talk details. Then you move to the real technical gauntlet (if there is one).
This industry is different. Too many companies use trivial screening criteria (because resume deluge) and undergrad-level technical quizzes (because fakers) up front, and end up throwing away significant numbers of people they actually want before the process really gets started. This experiment was actually about making the software interview process more like the rest of engineering, with the difference that software as of right now does not require a degree.
Management is the problem! Why doesn't management recognize me? I worked hard for my degree and I deserve the rewards. Fuck the fucking people that 'live for this shit'.
Where's their degree? I am trained in <tech sold as 'enterprise level' today>. I'm qualified. I deserve the job.
There is often a degree of "will this guy show how little I really know and take my position" in hiring decisions.
One of the marks of a really good manager is being able to make use of people's abilities without being threatened by them. I've found this ability somewhat rare.
Ford, Edison, etc
I don't really care where anyone went to school. It doesn't mean anything. Really, going to school at all doesn't mean much. I need to see what you've done outside of that to make any meaningful evaluation. It doesn't matter if it's a huge project. You can give me a couple 10-line things that do something useful and I'll still get to see how you name things, format code, use built-in libraries, etc. Then we can chit chat about project management and how much you love or hate it.
For example, I don't do well in whiteboard interviews, which is odd because I normally don't have a public speaking issue. It feels like there's some muscle memory attached to coding that isn't well replicated with poor handwriting in a room full of people.
Whiteboard lines of code are simply not the manner in which developers work once hired. That is the reason for the disconnect between speaking well about projects (easy to verbally explain and sketch) and the programming portion (bizarre).
Right, the industry is doing the equivalent of interviewing lawyers by asking them to write a legal brief on a whiteboard.
We're testing the wrong thing: a proxy for the work, when we could easily test the work itself.
I much prefer work sample tests rather than whiteboard Q/A as it better replicates the actual job. Give me a few hours with problems I would actually face on the job, my dev environment, internet access, and a set of problems that truly reflect the work, and I find it much more natural.
Is it too much to ask that an interview measure skills the job actually requires, in an environment that emulates the work?
The benchmark was performance in a long form coding interview for TripleByte, whereas Aline's is the final offer, so not exactly apples to apples.
What we're actually comparing here, though, isn't coding vs. describing projects. It's describing projects vs. resumes. And I expect that, there, resumes provide a lower signal.
I've done a lot of work that people would probably find very interesting and useful. But I tend to choke on whiteboard code interviews because they're so high stakes. Any time spent thinking about the problem looks bad, so you have to talk a lot. But I can't really think and talk at the same time. So I end up talking rather than thinking and I do poorly.
Now obviously I'm going to push to move the status quo towards something that doesn't put me at a competitive disadvantage. So we both know that I'm biased.
But the idea that people can talk about what they've done and answer any questions that you have (about what they did and programming in general) but still screw up on the actual "coding" part might mean that the part where you make them write code is more noise than signal.
The problem is that you never find out because if someone bombs the coding part you simply chuckle and say "well that person is clearly a liar, or something!" and they don't go any further in the hiring process. So they never get hired, and because they're never hired, you can't evaluate their work performance. Which might be excellent when they're not being actively scrutinized by multiple people all at the same time in a high stakes situation.
Unfortunately to try and get some objective data on this you'd have to hire several people who talk about their projects well but don't do well on the coding part. An understandably impossible task unless your client is a Google or Microsoft and they know it's just a big experiment regarding hiring.
But until someone does that and reports back (and they won't because it'll be a competitive advantage) it's tough for me to swallow the "talks good but can't code so NOPE" that I tend to see bandied about.
Putting someone in a pressure cooker and then measuring their performance will only tell you how they perform in a pressure cooker. Which is usually quite distinct from what they're going to do day-to-day.
For what it's worth, when I observed a disconnect between how well people spoke about their projects and how well they coded, it was generally a situation where someone had perfected a pretty polished self-pitch rather than a situation where I drilled down deeply into what they had done, asked them what they'd have done differently if we varied certain constraints, etc. And when they fucked up on coding, it was on warmup problems that you'd reasonably expect anyone with some experience to be able to do (e.g. explain why you might want to use a hash table over a linked list in certain scenarios, or reverse a string in place).
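For scale, the "reverse a string in place" warmup is only a few lines. Since Python strings are immutable, the usual in-place version operates on a list of characters:

```python
def reverse_in_place(chars: list) -> None:
    """Reverse a list of characters in place using two pointers."""
    i, j = 0, len(chars) - 1
    while i < j:
        chars[i], chars[j] = chars[j], chars[i]  # swap the ends, move inward
        i += 1
        j -= 1

s = list("interview")
reverse_in_place(s)
print("".join(s))  # prints "weivretni"
```

That's roughly the level of problem being discussed: not a trick question, just a sanity check that the candidate can manipulate indices without stumbling.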
That said, one of the reasons I'm really psyched about interviewing.io (the thing I'm working on now) is that we're getting a lot of comparative interview data, i.e. where the same person gets interviewed a bunch of different ways. Excited to see if we can draw some good conclusions about what works and what doesn't.
The problem of poor performance under scrutiny, where there is pressure for superior performance (as in a job interview) is well-described as a form of social anxiety disorder (or social phobia), a common condition affecting ~10% of adults in the US (lifetime prevalence). Of course, there's a range of mild to severe symptoms, nonetheless it affects a significant population.
The implication is that in a not-so-pressured setting candidates might perform very differently. Furthermore, writing code, solving software problems are generally incremental processes more akin to watching paint dry than putting out fires for all the externally visible action there is to see. A "whiteboard" exam likely isn't a good model of the real requirements of the job.
There's an enormous amount of research on testing methodology, testing is a huge industry. Ironically enough, one that is extremely reliant on software for analyzing test data in order to determine what is a good test of sets of knowledge or actual abilities. Seems like there's a clue in there somewhere about how software enterprises could find out who is really good at creating software.
So I think that only works if you hire everyone, whether they interview well or not. Or else the process is biasing the results and it's not representative anymore.
If you really wanted to get better information you'd have to go interview people who are already employees at a particular company and have outsiders (people who don't already know them) conduct the interviews. Then when you're done you can compare the simulated hire/no hire results and the interviewers recorded confidence numbers in their evaluation against the performance evaluations of the interviewed employees.
So long as the outsiders conduct many different types of interviews (especially besides what the company normally does) you might get a clearer view into what kind of interviewing works well and what doesn't.
I know some people that applied to and got hired by Google. Google seems painfully aware of how uncorrelated their interviewing process is with their hiring results. The hoops that these guys jumped through I never would. So even if I was talented enough to work at Google (I won't speculate here) they'll never actually be able to hire me unless they actively recruit me and don't make me run the gauntlet.
The whole problem is a really tough nut to crack. I suspect that all the pipelines are going to be biased one way or another. If I were in charge of hiring, I'd want to try and use several of them so as to not miss out on good candidates who are undervalued for whatever reason.
There's a lot of talent out there, despite everyone thinking that there's a talent shortage. The error actually lies in trying to have a one-size-fits-all solution to a problem that's definitely not uniform. Companies are failing to adapt to the human-ness of their "human resources" and it's costing them.
This. I've been job-hunting lately and getting asked to do "technical challenges" and the like, which are useless to me because they are entirely asymmetric. They tell me nothing about the company except that they are following the latest fad in candidate screening.
I pointed out in feedback to one of the testing companies a while back that they had no empirical basis for believing their evaluations didn't reject more qualified candidates for less qualified ones in terms of ability to do the actual job, I got a reply saying, "No, we have all kinds of empirical data! Our clients save $TIME in the hiring process!"
Which is great until you realize that perfectly competent engineers are being locked out of the hiring process by this nonsense. We saw a fad for this kind of testing in the mid-90's, just as the dot-com boom was starting to roll, and it didn't end well. The few companies I interviewed at that used coding tests of one kind or another all failed quickly, although it did give me the opportunity to ask an incompetent hiring manager at one of them how I'd managed to get a PhD in physics while having "below average mathematical abilities" according to one of their tests (which I swear had been written by an innumerate.)
HR people will simply assert that anyone who fails these kinds of tests is incompetent, and anyone who complains about it is just expressing sour grapes at their own failure, but that all side-steps the issue that there is no significant empirical validation on the quality of hires that such tests produce.
Their only real use from my point of view is that if the "interview" process is heavy on "coding challenges" and the like, I'm a lot less likely to bother with going through it, because it speaks to a company that has bought into policies that have no empirical basis and that provide the least amount of information to job seekers, and I'm not all that interested in working in the kind of monocultures such processes produce.
For senior people my favoured interview form is to mostly talk about a few obscure language features in their language of choice, and then have a free-form discussion about language design. Senior people who are any good care about languages, and have thought about languages, and can have nuanced, intelligent discussions about languages and the trade-offs involved. It acts as a good foundation for talking about other kinds of design issues as well. For junior people, some basic test of coding competency may be useful, but over the intermediate level they are very likely testing for the wrong thing, and either way we have no evidence.
Same goes with profiles from Cisco, VMWare, Intel, Synopsys, IBM, and pretty much all the big companies that were pinnacles of business and your career at one point, but aren't considered hip anymore.
Looking at Github profiles, however much people talk about it, also doesn't regularly happen. It's a chore to type one out in the browser if you're looking at a paper resume or it's not hyperlinked. And again, if you worked at NG, what possibly interesting things could you have done? It's not Pinterest or Uber.
Applicant: Well, I wrote a code to optimize a UAV flight path to avoid enemy radar and minimize fuel consumption, and debugged another controls code to prevent people from dying. Did some debugging on a computational electrodynamics code for radar simulations...
Recruiter: Oh. You know what's cool? Photo sharing.
At Raytheon, if you were the one developing radar avoidance algorithms and code, that was probably your title.
"Algorithms": another word whose slightly different meanings lead to major misinterpretation.
At my company, you're permitted to choose a public title from either your job description (e.g. Lead Data Scientist), your "engineering rank" (e.g. Lead Scientist/Technologist/Developer/Engineer), or your consulting rank (e.g. Lead Associate). All of these are equivalent in rank and promotion opportunities, but it does give you some flexibility in what you can place on your resume.
I'm sure it happens sometimes. I just haven't personally seen it with the roles "Software developer", "coder", and "programmer". That is effectively the same thing everywhere I've seen it. Now if I'm a programmer and say I worked in primarily QA or database position we have a problem, but there's no real nuance between those three titles.
"I saw the best minds of my generation... writing spam filters." — Neal Stephenson, Solve For X
People extoll the virtues of a solid university CS degree not because it explicitly teaches skills for industry, but because it creates a background that helps with new and novel problems. Certainly your previous work background provides a similar advantage.
If you can perform those tasks, you can certainly learn whatever consumer-focused skills are needed. As Aristotle said, "For the things we have to learn before we can do them, we learn by doing them."
When you're reading resumes you go into awful zombie mode (I do this too). If you're writing one, pretend your audience is a braindead zombie that needs to be spoonfed everything. If you have cool projects, list them and describe them concisely in a way that makes clear that they're interesting and a big deal. And link to each project if possible to increase the odds of a click.
Since I work in and around Github and open source, usually someone's Github will tell me what I really want to know about them (not always, but really good candidates have a Github profile that stands out).
A lot of these positions are highly competitive and involve a great deal of skill and education. Around here, they're still seen as major resume builders.
Anecdotally, I haven't had trouble securing interviews with West Coast firms even though my background is with a major East Coast company.
Maybe that's because I ran a funded startup for a few years, or maybe it's just the demand in my field, but "hipness" seems like a very bad metric for evaluating previous employers.
As for the interesting things they are executing at these firms, trust me, there are plenty of very innovative products, even in the commercial spaces such as healthcare. Machine learning, big data, practical cryptography, and basic research problems abound.
Look through some unclassified materials and talk to some people about their civilian work experiences and one can hear all about many fascinating technologies, all of which certainly trounce Pinterest in terms of interesting projects.
Rogue nuclear weapon location intelligence, international fugitive tracking, and human trafficking interdiction are all driven by highly sophisticated machine learning pipelines operating at sub-second analysis speeds.
Water basin management, flood gauge tracking, flood inundation maps, roadway construction modeling... Your average mid-senior developer from one of these companies has expertise in a variety of fields: data analysis, civil engineering estimation (hydrology, hydrodynamics), physical simulation, DevOps, embedded software, and low-level machine architecture.
If the National Guard ever rescues you from a flood, you'll be grateful for the HEC-RAS software, the government contractor who created the estimated flood plain using GIS, and the civil engineer who verified the estimates at the local level.
The "hip"-ness of previous employers is a bad metric. Maybe the employee would need to adjust to the business culture of the West Coast, but in my experience people generally adapt to new working environments quite well.
Disqualifying someone over working at a major East Coast tech firm is a heuristic with a very high false negative rate.
Also, you'll find many people from those firms looking to make a fresh start, work for less to get started, and a lot of them are refugees looking to escape the bureaucracy of their current positions, which renders them extra-motivated as a new hire.
From all the H-1B stuff, I had gotten the impression they were more of an overseas firm.
I'm also in the Valley.
A few months ago I launched a side project, doing (of all things) resume review and revision services. When my clients want a review of a resume that I know won't get results, and I ask "Give me more to work with", the types of things I hear are eerily similar to the "awesome stuff" quotes in this post. I try to incorporate those things into the resume when possible.
Is it the resume itself that is the problem, or is it that candidates are just less inclined to include additional details (that may seem irrelevant) that could differentiate them from others? Some resumes list accomplishments that make the candidate's qualifications rather clear, but not everyone has that luxury.
When a candidate doesn't have a long list of work accomplishments, do they think to include this type of content that might get our attention?
Sometimes when you see someone's resume and they have great work accomplishments, like "last 4 years at [successful, engineering focused company], building their [product everyone knows about] platform", deciding to interview that person is fairly easy. However, there is that issue when someone is either just starting out from college, doesn't have a lot of work experience, or hasn't had the best of work experiences (usually not their fault). For these people, the main thing I look for and try to evaluate is their ability to learn on their own. The main thing to consider when reviewing a candidate like that is: if they aren't as up to speed on the technology as we need them to be, how long do I think it will take to on-board them and get them self-sufficient on our platform? If someone does not have the best resume, but their Github profile is full of projects, even half-finished projects, of them trying out different languages or frameworks or maybe making (or trying to make) contributions to open source, it usually makes it easier to say "let's give them a chance," especially if they have been playing around with the languages/frameworks we use.
Hope that helps answer your question.
The "especially if they have been playing around with the languages/frameworks we use" comment hit home a bit. Part of that may be just the asset of exposure to those technologies the hiring company uses (hitting the ground not quite running), but I've had a few clients that seemed to use interest and curiosity in their languages/tools as an indicator of a likemindedness which was viewed as a strong positive. Particularly in the FP world.
Though Aline didn't give hard numbers on this, I would not be surprised if a majority of the applicants completely ignored Keepsafe's instructions. She mentioned that a bunch just dumped generic cover letters or links in the box instead of actually answering the question.
The difficulty we had was not seeing a strong correlation between talking about projects and doing well at programming during an interview.
Ultimately though people are aware they're being watched and assessed under timed conditions so it's going to be somewhat stressful. If we think someone is so nervous they're clearly not able to code at all, we'll offer a take home test as well before making a final decision.
I think "take-home" style tests are the only reasonable way to evaluate this kind of technical competency. They are the closest to realistic. Throw people a small problem of the kind they'll actually be asked to work on. Let 'em deal with it--including documenting their solution!
Example: I developed a little state-machine framework for managing complexity in a large, legacy code-base. It allowed me to refactor a lot of ad hoc distributed logic into the transition table and clean up a lot of weird corner cases that made the code fragile and difficult to change.
Questions might include: "Why did you write your own rather than use an existing state machine framework like the one in boost?" (for C++ frameworks there's pretty much always one in boost, so even if you don't know anything about the area you can throw this in for fun and see what they say). Also: "Why a state machine rather than some other approach to refactoring?" And so on. This process gets at taste and good judgement, it gives you a sense of how tolerant of alternatives they are, and so on.
Additional edit: one of the things I look for in answers is people who say, "Yeah, that particular decision might have been a mistake... I always wondered what would have happened if instead I had..." Good developers are able to admit that not everything they do is perfect, and are willing to give alternative views a bit of credence.
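For readers who haven't seen the pattern from the example above, the transition-table idea can be sketched in a few lines. This is just an illustrative toy (the original anecdote was a C++ code base; the states and events here are made up), but it shows why the approach makes corner cases auditable:

```python
# Minimal table-driven state machine. All transition logic lives in one
# table instead of being scattered through ad hoc if/else branches.
# States and events are hypothetical.
TRANSITIONS = {
    ("idle",    "start"): "running",
    ("running", "pause"): "paused",
    ("paused",  "start"): "running",
    ("running", "stop"):  "idle",
    ("paused",  "stop"):  "idle",
}

class Machine:
    def __init__(self, state="idle"):
        self.state = state

    def handle(self, event):
        key = (self.state, event)
        if key not in TRANSITIONS:
            # A weird corner case shows up as a missing table entry,
            # rather than silently falling through some conditional.
            raise ValueError(f"illegal event {event!r} in state {self.state!r}")
        self.state = TRANSITIONS[key]
        return self.state
```

The payoff is exactly what the comment describes: illegal or forgotten transitions become visible as gaps in one table, which is much easier to review than distributed conditionals.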
What do you do with candidates like this besides weed them out at the resume review stage?
Have you been able to place anyone yet? I'd love to know out of the 300 or so interviews you've done, how many have led to accepted offers, and what those successes had in common.
> It was AJ, a candidate that Zouhair Belkoura, KeepSafe’s cofounder and CEO, readily admits he would have overlooked, had he come in through traditional channels.
This was the story of my job search three years ago. It still kinda is.
I ended up reluctantly taking a job working for the State of California, mostly because the schedule was flexible and it was a 5 minute commute. In the last 18 months I've gotten two promotions, including one four months after I started, which is unheard of in state service (and which my bosses had to fight HR to get). I spearheaded the acquisition and implementation of a version control system, which no one here had ever used (they were just FTPing files all the time, risking overwriting others' work, etc.), and got buy-in from all the developers, who now say they couldn't imagine working without it. I also now wear a variety of hats besides programming, including sysadmin work, DBA work, architecture design, etc. My bosses rely on me more and more every day just for my opinions and advice, let alone the work I do.
Yet now, when I send out my resume, it's almost always crickets. A year and a half ago, when the first item on my work experience list was "failed entrepreneurship", the response rate to my resume was about 80% (not even talking about interviews, just getting a response at all). Now it's more like 20%, all because the first item in my work experience section is my current job working in public service.
I admit, many of my coworkers probably deserve the reputation that public sector work has. A significant number of them are clock-watchers that the bosses don't even try to assign anything important to, because they know they don't give a shit and can't easily be fired (union), they are just filling a chair waiting for their public pension to accumulate over 20-30 years. Nevertheless, I also list on my resume all of the above, including the pioneering (for us) work I've accomplished here implementing version control etc. I never intended to stay here more than a couple of years, but I also never intended for this job to have such a negative impact on my job prospects.
I think I am probably going to revise my resume to leave this job off completely, and just say I've been working for myself for 9 years instead of 7. I predict sadly that this will return the response rate back to what it was before I had this job.
The fact that you can and did push change in public sector is huge, HUGE to anyone with half a brain. It's all in how you tell your story.
That doesn't sound too far off from a lot of corporate america as well.
My cv makes reference to some experiments I worked on at my first job which are described in very generic terms to avoid any problems with the official secrets act.
e.g. "complex problems involving digitizing data, from both still and high-speed film cameras."
I do like the idea of a straight-up pitch using the more interesting jobs I have done.
In that sense he had an advantage over me. The sec people know the deal with the engineers who work for the government on this stuff. I did radar signal processing code; without details I can't share it becomes very hard for me to explain how my background is relevant even though I could prove it in a practical task.
Comments like this make me wonder if one reason for the ridiculously high failure rate in programming interviews is that the resume screening stage is actually acting as a negative filter, one that disproportionately removes the candidates you want and passes the ones you don't.
Let me illustrate:
Growth Engineer One line Resume: "I was employee 7 at Snapchat when we had 6 engineers and 400,000 users. During my tenure as the only growth engineer, our userbase grew to 10 million users over the next 6 months."
SEO resume: "I joined XYZ when it was ranking on page 10 for major industry keywords. Nine months later? Google the following keyword, BDHDUYD, which accounts for 40% of your market. If you find the company in the top 3 results, we should schedule an interview."
Software engineer: I do not know, but I am sure you can insert a short paragraph here.
This is really not the case.
I've found that some people's greatest accomplishments are trade secrets that they can't show you. Some people's greatest accomplishments are in hobbies you don't quite understand. Some people are good at things even though they don't have a heroic narrative of great accomplishments building up to their job application.
Meanwhile, people who come in to an interview boasting "Look at this! I did this!" are often just taking credit for what their co-workers did.
My greatest accomplishments in the domain I want to work in are trade secrets of the US Government. :) I can talk at length about my current domain, though, which I want to get out of. :P
That's true, but a good cover letter can go a long way toward helping with this. Most cover letters are generic, bland, and obviously copy-pasted from a template. (Or more often from a previous application, sometimes with info about the previous company left in!) A cover letter that talks about something exciting you've done recently, and ideally how it might be related to the job, or even just how it demonstrates skills you'll use in the job (and describes exactly how), is awesome in comparison. A letter like that would absolutely get you an interview with me, almost regardless of experience. One of our current co-op students actually had almost no programming experience on paper; he had actually switched out of a theatre degree iirc. But his cover letter was awesome (the theatre degree probably not being coincidental). Got him the interview, which got him the job, and I haven't regretted it. Just a co-op of course, but the point stands. The cover letter is probably the most important part of your application. Take the time to write a good one.
It comes down to whether or not the people doing the recruiting all have the same subjective opinion.
Companies have the same problem the military does. They hire a certain stereotype because they've always hired a certain stereotype and the people in charge match that stereotype.
If there's any opportunity for disruption, it would be at tiny companies where there are people that 1) just need talent and 2) know that the process that made them successful is broken.
I'm not advocating for that. I could argue either side of it without any cognitive dissonance. Just pointing out the additional possibility.
Edit: Clarifying that this is for non-engineering roles.
Seems your example changed the incentives, got useful information in return, and that led to a positive result. Unsurprising in hindsight. I'm going to send your article to a few people to see if I can get any to try that approach.
This lady claims that the company is competing for candidates with Google, although the only thing they do (if I read it right) is provide an encrypted version of Dropbox. How does this require world-class engineers? I once coded a file-syncing app quite quickly as a personal project, and I don't think I could call myself even a regular developer. I do not believe such an application would be even remotely as complex as anything Google does.
I am a little puzzled, though, about why others seem to find resumes so opaque. It seems like resume-reading is a lost art. A resume is usually a document that someone has spent a lot of effort on to make themselves look good. If you learn to read them, that can tell you a lot about the author. (Note: searching for buzzwords is not "reading.") A resume should not be regarded as simply a collection of facts - of course you'll be misled if you do that; a resume should be regarded as a document of self-expression. After a while, you can see useful patterns in what people put in resumes - a least for more-experienced applicants. Almost every resume suggests a bunch of next questions, which can be asked in a phone screen or interview to get a pretty good idea of what a person is about.
It's worth recalling that absolutely all software engineers at all software companies in the world, from the first ones around 1955 up to 2002, were hired without benefit of LinkedIn, StackOverflow, or Github. Almost all of these engineers submitted resumes, which were reviewed prior to offering interviews. Yes, there were hiring mistakes in the old days, but I don't see a huge number of people talking about how the hiring process now is so much easier, smoother, and more foolproof than it used to be.
Huh? The companies this woman hires for don't look at Github? Not looking at public code that someone has published is more broken than relying on resumes. If someone has published code and it doesn't suck, I'll probably bring them in for an on-site, period. I may even tell them, "We're going to talk about file foo.c in your code, where you implemented feature Z. So be prepared."
And, I suspect with startups it was more a case of "How many years were you in government? That would make us so unhappy that we would leave. Why didn't you?" That's a different way of asking "Is this really the place for you?"
As a hiring manager in a startup, when I knew I only had 9 months of runway without more funding, I'd feel REALLY bad about taking someone with a family away from their very stable job. As someone who has recruited employee single digit, I often have made a point to meet the family when recruiting someone--even if I have to fly to them. I need both the prospective employee and their partner to understand that the big probability is that the company won't be around in 24 months, there won't be any payoff, and a new employment search is likely to be the result. Yeah, there is a small probability that we'll survive and an even smaller probability that we'll get some money. It's a really delicate balance for me, at least, to properly sell the company (Startup! Options! Novel!) and reality (Bankrupt! Flameout! Layoffs!).
I'd say I'm batting about 50%. For every employee I scare off, I absolutely convince one to join. Funnily enough, every single one who didn't run away said the same thing: "My wife told me I had to work with you." They were stunned that someone so important (Hah! Management in a startup is a good way to understand how unimportant you are really quickly ...) would take the time to make sure the family was informed properly about the risks and rewards.
I was a help desk pleb at a well known inkjet/scanner/camera company 15 years ago and this company "extended" their clipper database to record third party cartridges, but recorded them in .ini format. That's right, one file per record, in key=value pairs. I was bored and accidentally mentioned to the guy whose job it was to copy and paste the data from each of all 90,000 ini files into an Excel spreadsheet that Perl could do it, and I'd even use references to hashes to do it. He had no idea about that last bit, but I did it for him on the proviso that he didn't tell anyone, and reduced 10 weeks of work to 30 seconds. They unfortunately made me employee of the quarter but neglected to tell me so I missed my awards ceremony.
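For the curious, the same trick is only a few lines in any scripting language. Here's a rough sketch of the idea in Python rather than Perl (the directory layout, filenames, and keys are invented for illustration; the original used hash references):

```python
# Sketch: collapse a directory of one-record-per-file .ini dumps
# (key=value pairs) into a single CSV, instead of copy-pasting
# each file into a spreadsheet by hand.
import csv
import glob

def load_record(path):
    """Parse one ini-style file into a dict of key=value pairs."""
    record = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip section headers like [cartridge] and blank lines.
            if "=" in line and not line.startswith("["):
                key, _, value = line.partition("=")
                record[key.strip()] = value.strip()
    return record

def merge_to_csv(pattern, out_path):
    """Merge every file matching `pattern` into one CSV at `out_path`."""
    records = [load_record(p) for p in sorted(glob.glob(pattern))]
    fields = sorted({k for r in records for k in r})
    with open(out_path, "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=fields)
        writer.writeheader()
        writer.writerows(records)
```

Ninety thousand files is nothing for a loop like this, which is how ten weeks of manual copy-paste turns into seconds.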
Worked out really well. Not sure it's to be duplicated, but for me it went fantastically.
Your comment sort of sounds like a shill post, fyi.
Not saying the process described in the article is bad, even though I believe anything can be gamed, but I don't really see a big difference. The main change is the way recruiters looked at what they got; resume or essay wouldn't have changed a lot, I think.
Maybe off topic but if companies want the best people, maybe THEY should write the essay explaining why people should join instead of sitting in their high tower waiting for minions to come.
However, how can we conclude anything from a procedure that only examines the hired population and none of the unhired?
I see a couple of issues with this:
- Could it get you in legal trouble?
- Are you capable of evaluating who are the best employees AFTER they are hired?
- Only a large corporation would have the resources to gamble on hiring a random sample of people who failed their interview process.
Here's one half-hearted way to do it. Pick a random sample of resumes you reject, and give them phone interviews anyway. Pick a random sample of people who fail your phone interview, and give them an on-site interview anyway.
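The arithmetic behind that calibration is simple enough to sketch. This is a toy (the candidate pool, sample size, and "interview" predicate are all hypothetical stand-ins for a real process):

```python
# Estimate how many good candidates the screen throws away: interview a
# random sample of rejects; the fraction who pass anyway approximates
# the screen's false-negative rate.
import random

def estimate_false_negative_rate(rejected_candidates, sample_size, interview):
    """`interview` is any callable returning True if the candidate passes
    the next stage. Returns the pass fraction within the sample."""
    sample = random.sample(rejected_candidates, sample_size)
    passes = sum(1 for candidate in sample if interview(candidate))
    return passes / sample_size
```

The same function works at the next stage too: sample phone-screen failures, run them through on-sites, and you get an estimate of that filter's false-negative rate as well.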
1. Github a/c
2. StackOverflow a/c
3. Their blog
4. Anything they made online
If a candidate fails to submit a link for any of the above, then just don't interview them. I would guesstimate this simple check filters out 70% of the junk resumes and probably 20% of the good resumes. It can scale like crazy and be expanded even more (for example, use APIs to get their profile information and rank resumes).
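As a rough illustration of the "use APIs to rank" suggestion, GitHub's public REST API exposes profile fields like `public_repos` and `followers` with no auth (though unauthenticated calls are rate-limited). The scoring weights below are entirely made up; the point is only the mechanism, not a claim about what actually signals quality:

```python
# Toy sketch of ranking candidates by public GitHub profile data.
import json
from urllib.request import urlopen

def github_profile(username):
    """Fetch public profile data from GitHub's REST API."""
    with urlopen(f"https://api.github.com/users/{username}") as resp:
        return json.load(resp)

def toy_score(profile):
    # Arbitrary illustrative weighting -- not a claim about what matters.
    return profile.get("public_repos", 0) + 2 * profile.get("followers", 0)

def rank_candidates(profiles):
    """Highest toy_score first."""
    return sorted(profiles, key=toy_score, reverse=True)
```

Which, of course, is exactly the kind of mechanical scoring the replies below object to.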
I would never be called based on that criteria as I have no inclination to spend my free time doing stupid shit online for hipster new developers that think GitHub is the end-all.
ps: to be fair, I probably wouldn't want to work in a company that has this mentality, so maybe that really does work
Not sure why you've latched onto GitHub, he said it could be any one of the four. The principle is obviously just about demonstrating work that you've actually done. I'm pretty sure that's reasonable and not cause to go off on hipsters. Yes, there would be some qualified candidates that would not be able to show those things for various reasons, and that is probably the reason for the high estimate of false negatives, but presumably the decision-makers would be permitted to use common sense to make exceptions.
However, there is another, stronger counterpoint that I've argued with people around me. As developers, pretty much all of us have used StackOverflow to get answers, used someone's blog to learn something, or looked into a GitHub repo for some code. It just feels natural and ethically responsible to me that we also return something to the community from which we consume so much. People who are sweating out this content without any expectation of financial gain or even fame are obviously sacrificing their free time to help others. Why can't you return the favor? The lack of any evidence of your contribution to the community may not reflect a deficiency in your skill set, but it does put a question mark on the self-initiative you might take in your job, your willingness to help colleagues even when it doesn't benefit you, and your drive to make others better from your learnings. After a certain stage in a career, skill sets are a given, and what matters is your ability to multiply your impact by leveraging and empowering others. Your participation and contributions to the community are good indicators in this area; however, false negatives and edge cases always exist.
Since when has every person worth hiring had their own blog?
Edit: sorry, your use of "any" was ambiguous, I read it as "if missing any of the above".
Even if not using GapJumpers itself, you can follow the concept by requesting solving a problem or submitting a piece of original technical content along with the resume.
I can just see this guy going out to Web devs, System devs, DBAs etc. and them all disagreeing because they're looking for different things (and value things differently).
This article starts out with an air of science and ends with a completely unproven conclusion.
While I do agree in my gut that resumes are not an amazing filter, she has completely failed to present evidence that her alternative interview process is better.
And in fact, while KeepSafe still has the no-resumes option open, they are now accepting resumes again -- I do not have great confidence that the alternative system was anything more than a PR move by the company.
I think she did a great job of doing exactly what she set out to do, and since this is just one anecdote, any qualitative or numerical data she presents won't be worth much, all the more reason to omit it and just share the story.
I'd still love to know why the company didn't switch to this full throttle.