> The most significant feature by far was the presence of typos, grammatical errors, or syntactic inconsistencies.
I've always used attention to detail when writing (although I give a pass on grammar/spelling to some extent to non-native speakers) as a proxy for general professionalism/intelligence with relatively high accuracy.
If you can't be assed to write accurately and consistently in your native language (and/or don't avail yourself of the myriad automated tools to achieve this end), how can anyone expect you to be accurate and consistent in any other endeavor? It's not like language and written communication is unimportant; indeed it may be the most important skill any information worker can have.
If I were considering hiring a native speaker of English, a single homonym misuse (its/it's, effect/affect, et c.) or lack of subject-verb agreement in any pre-hire communication (emails, CV, their public-facing website, READMEs on Github) would be sufficient for a NO HIRE in my book.
While I agree that immaculate attention to detail is important in documents you're creating for formal consumption (e.g. CVs/resumes), I don't see how the same extends to emails, websites and READMEs.
I'm a native English speaker and despite being well aware of homonym misuse, my brain doesn't always play the game and occasionally I find myself using the wrong word in my writing with no rational explanation.
Similarly, I often find myself leaving out words in my writing that I don't notice are missing even after re-reading a paragraph many times. (I fall for "PARIS IN THE THE SPRING" almost every time it comes up.)
Alternatively I'll change the wording in a sentence and not notice stray words are left over in the wrong order, again even after re-reading. Usually I have to do something else and look back at what I've written to read it with "fresh eyes". I imagine dyslexic people have similar problems.
Despite these issues I've been complimented many times on the quality of my technical writing and my code, so I don't consider my natural language issues as much of a disability when it comes to programming or technical work. That is, until I encounter somebody who holds your beliefs.
It seems obvious to me that there's a vast difference between somebody writing a CV using obviously incorrect spelling or random formatting, or inconsistent capitalisation, punctuation, tenses, etc., and somebody who makes one or two typos in a blog post on their personal home page.
It is almost impossible to proofread a document you wrote yourself for typos, specifically repeated/replaced words, immediately (while you can still remember writing it). I don't know if this is true in all languages, but it's definitely true in English.
This is because your brain subvocalises (reads back) what you meant to say, not what you actually wrote, when you re-read and can still remember the sentence in your head.
I know of at least one example of a Deputy Headmaster, who was also my maths teacher, accidentally typing a swearword into a school report. My dad (the IT manager, whose job it was to check all the reports before they were posted) got him to read it back aloud a day or so after he wrote it, and he read out the sanitised/corrected version from a piece of paper with the swearword staring him in the face. He was very embarrassed when my dad pointed it out.
In cases like that, you wish you could define a specialised dictionary without the swearwords, so that dangerous typos jump out at you!
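One way to approximate that without touching the spellchecker's dictionary at all is a small blocklist pass over the text before it goes out; a minimal sketch in Python, with a hypothetical blocklist you'd obviously want to extend:

    # Sketch: flag "dangerous" words that an ordinary spellchecker happily accepts.
    # The blocklist entries here are hypothetical placeholders.
    import re
    import sys

    BLOCKLIST = {"damn", "hell"}  # hypothetical; add whatever must never appear

    def flag_dangerous_words(path):
        """Print every line of the file that contains a blocklisted word."""
        with open(path, encoding="utf-8") as f:
            for lineno, line in enumerate(f, start=1):
                for word in re.findall(r"[A-Za-z']+", line):
                    if word.lower() in BLOCKLIST:
                        print(f"{path}:{lineno}: found '{word}'")

    if __name__ == "__main__":
        for report in sys.argv[1:]:
            flag_dangerous_words(report)

Run over a batch of reports, it prints the file, line number and offending word, which is exactly the "jump out at you" behaviour you'd want.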
To GP: tooling can only get you so far, and you're battling against your own brain the rest of the way. A 2AM README typo is forgivable; a CV/cover letter issue (where you were supposed to be concentrating, and/or getting someone to proofread for you) less so.
Typos are one thing; misuse of homonyms or lack of subject/verb agreement is another entirely. I wouldn't begrudge someone a "teh" in a readme THAT much (though, again: where the fuck is your spellcheck?), but a misuse of "your" for "you're" immediately makes me start to question whether they have a clue what they're doing.
> somebody who makes one or two typos in a blog post on their personal home page.
You'll note that the best bloggers/essayists (e.g. pg) have other people proof things before they post them for public consumption.
I didn't say it doesn't have false negatives. I just said that it's a good proxy for attention to detail and diligence. Life isn't a warm-up round or a first draft.
MVPs aside, when you release something for public consumption, it should be as high-quality as you can possibly make it.
That's amusing, considering that you made two rather bad errors in the above comment ("If you can't be assed" and "language and written communication is unimportant"). Granted, a Hacker News comment is hardly the same thing as a CV, but I still detect a touch of hypocrisy here. What makes you so sure that your own GitHub READMEs are free of errors? People have a tendency to gloss over their own mistakes.
As for "language and written communication is unimportant", the full statement was: "It's not like language and written communication is unimportant...". That seems correct to me.
Technically, it should be "It's not as if language and written communication are unimportant."
If we were to nitpick style, we'd say that the double negative is unnecessary, and that "It's..." construction is flabby. So we'd reduce the sentence further to "Language and written communication are important." Debatably, even this statement is reducible to "Written communication is important." From there, we could even cut down to "Communication is important," assuming that "written" is redundant in context.
Not trying to be cute; just illustrating a point. You could argue that I'm reducing too far and slicing away some nuance. Nevertheless, a primary goal in writing is to make a point as succinctly as possible. This is doubly true in short-form writing, such as in resumes and cover letters.
I don't see the errors there. "If you can't be assed to do X" seems standard English to me, if informal. What would you say is wrong with that?
"Language and written communication is unimportant" seems arguably OK to me. Both are mass nouns / uncountable, so the "is" there seems quite defensible to me (v.s. "are", which I assume is your contention?).
Did you actually read the blog post? It's rife with blatant grammatical errors.
Awful use of conjunctions and a bizarre obsession with commas, plus errors that a spellchecker will miss (e.g. "roughly out 9 out of 10 applicants" under the method section).
Apparently you wouldn't hire the author, and she wouldn't hire herself.
One of my old managers used to throw them in the bin if they had spelling mistakes. I asked him about it once and his response was "If they're too lazy to press F7 I don't want them working for me".
One of the best programmers I've ever met misspelled a core technology on his resume, 3 times, in different ways. There are always exceptions, and this kind of resume triage misses them.
"As soon as you get someone who’s never been an engineer making hiring decisions, you need to set up proxies for aptitude. Because these proxies need to be easily detectable, things like a CS degree from a top school become paramount."
No doubt this is true to some extent, and perhaps even to a large one. But let's not give the HR drones too much credit here.
How many of us have ever had to sift through a stack of 100 resumes or more in, say, a week or less? It's not an easy task. It's especially difficult for hiring managers, because they have day jobs to perform. They may not mean for things like "Google," or "Harvard," or "L33t CS Degree" to become proxies for our honest, thorough, intellectually rigorous analysis of every resume in the pile. But these things become a sort of shorthand.
HR types seem more prone to overemphasizing the letter of the law, to putting pedigree on a pedestal, and to thinking as un-differently as possible. But given a thick stack and a few measly hours, I doubt most of us fare significantly better.
If we're serious about moving toward a better hiring process, we need to start by recognizing the limitations of the resume itself as a normative tool.
TL;DR It would be awesome if candidates had the option to do a coding assignment and submit a writing sample in lieu of a traditional resume/cover letter.
> I interviewed roughly 300 people for our back-end/full-stack engineer position
Q1. - Given that most companies interview less than 10% of the total number of applicants, I would love to know how one job received 3,000 applicants.
Q2. - Why did you have to interview 300 people to fill one job? There is a serious flaw in that strategy. If you spent one hour with every candidate you interviewed, that works out at just under 6 weeks of back to back interviews. That's not taking into consideration the logistics involved in filtering and processing these 300. All that aside, do you not consider it to be exceptionally flawed that you had to interview 300 different people in order to find one suitable person for the job?
He is likely saying that he interviewed 300 people to fill about six engineer positions, given that he states a 1-in-50 hire rate from interviews. Still, that means 500 applicants per open position, which, I agree, seems high.
In tech-ese, his use of the word "position" is referring to the abstract job role, rather than an actual instance of that role.
A 1-in-50 hire rate is absolutely shocking regardless of the company size, industry, etc. Whoever is responsible for prescreening these applicants prior to being interviewed simply isn't doing their job.
Not really, no. I've found resumes to be terribly poor indicators of hiring suitability, and so I end up conducting brief first-round phone screens with most at-least-slightly-suitable candidates.
This correlates well with the article here, as it turns out that resumes were more useful to this company as a writing-sample assessment than as a work-experience assessment.
To be honest, I misinterpreted the ratio. I wasn't aware that the 50 included people who were phone screened. Fortunately the ratio makes a lot more sense in that context.
0. Peroni, thank you for reading this. I've been a fan of your writing for years, and your take on hiring was one of the things that ultimately helped me make the decision to switch from coding to recruiting.
1. Yeah, over the course of a year, we got several thousand applications/resumes for this position. The interview rate was roughly 1 in 10.
2. We interviewed 300 people and made offers to 6, i.e. the hit rate was 1 in 50, or 2%. Moreover (and fortunately!), each individual filtering round and subsequent interview took much less than an hour.
Thanks for the kind words. I feel exceptionally guilty for my harsh feedback now.
I'm still bothered by the 1 in 50 ratio. Maybe I'm misinterpreting it. Does that 50 include people who get phone screened or is that 50 exclusively people you have met face to face?
If it's the former rather than the latter, then consider my incredulity null and void. If it's the latter then you should drop me an email as I would jump at the chance to help you to halve that number.
The 50 includes people who got phone screened. The onsite to offer ratio was something like 3 to 1, if memory serves. I will update the original post to make that clearer.
When you say 'typos matter', what you mean is that there was a correlation between 'typos' and whether TrialPay hired the candidate? The resumes with the fewest typos turned out to be the ones TrialPay hired (presumably because the technical team judged them to be the best candidates).
That's different from saying the candidates with the fewest 'typos' turned out to be the best engineers.
Making a predictive statement like that is probably a bit premature. All I'm saying here is that the group of people who got offers differed significantly from the group that didn't get offers when it came to how many typos and other errors they had.
Moreover, whether an offer is a good proxy for whether someone is a great engineer is unknown. As I mentioned in another comment, it would be awesome to track on-the-job performance and use that as the dependent variable.
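A minimal sketch of that kind of group comparison, with made-up counts standing in for the real offer/typo split (which isn't broken out in the post); Fisher's exact test is a natural choice here because the number of offers is so small:

    # Sketch: is "resume had typos/errors" independent of "got an offer"?
    # The 2x2 counts below are hypothetical placeholders that merely sum to
    # the thread's funnel of roughly 300 interviews and 6 offers.
    from scipy.stats import fisher_exact

    #            offer   no offer
    table = [
        [1, 150],   # resumes with typos/errors   (hypothetical)
        [5, 144],   # resumes without             (hypothetical)
    ]

    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")

With real counts for each attribute you could report an effect size and p-value per attribute rather than just saying "the groups differed".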
Interesting results that seem to challenge some received wisdom. However, N=1!
What this study tells you is the systematic relationship between resume-detectable attributes and getting hired at this particular company. If this same analysis were carried out at 29 more companies, we would have N=30, which is still considered a small sample size. To do that effectively, the author would need to put out a detailed description of the method for coding resumes and, ideally, a means of determining inter-rater reliability, and so on.
Interesting conversation starter - but please no one make any important decisions based on this.
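On the inter-rater reliability point above: in practice that just means having two people code the same resumes independently and measuring their agreement. A minimal sketch using Cohen's kappa, with hypothetical labels:

    # Sketch: inter-rater reliability for resume coding via Cohen's kappa.
    # Two raters label the same 10 resumes as containing errors (1) or not (0).
    # The labels are hypothetical; a real rubric would cover every coded attribute.
    from sklearn.metrics import cohen_kappa_score

    rater_a = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]
    rater_b = [1, 0, 1, 1, 1, 0, 1, 0, 0, 0]

    kappa = cohen_kappa_score(rater_a, rater_b)
    print(f"Cohen's kappa = {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance-level

Publishing the coding rubric plus an agreement score like this is what would let other companies replicate the analysis.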
OK, you avoided "the plural of anecdote is not data" :) (the rant from a few days ago)
The study may also simply be distinguishing between rushed, unprepared applications (resumes/cover letters) and well-thought-out, more serious ones. In other words, it may simply show how seriously the applicant takes the particular job opening.
Depends. From the point of view of the applicant perhaps N=1 as only one company's hiring was investigated. But from the point of view of someone scanning resumes, N=300 (there were 300 applicants considered).
From the perspective of lessons learned, that is, generalizable truths, one can only generalize to that company. So the reader of this article can only really say: this is highly relevant and accurate if I intend to apply for an engineering position at this particular company. Beyond that it falls into the same anecdotal category as any other advice dispensed by recruiters or HR personnel responsible for screening applicants entering the interview process.
The big difference is that this recruiter has gone to the trouble of actually measuring a number of factors and assumptions and can say with great confidence: this is how it works in a particular company. In that regard it is certainly more weighty than the type of blog post that says "I have been doing screening for 5 years and this is what matters to me. And, by the way, it seems to correlate to hiring decisions at the back end of the process."
I was getting sick of reading about "cultural hiring" on HN. Finally someone has some numbers instead of trying to justify why they should only hire people who look and sound like themselves.
Don't read too much into those numbers. They were calculated over a total of 5 offers, with a huge selection bias inserted by the pre-screening routine.
As a person who is not totally secure with my command of English, I assure you I check for typos twice as hard for my job applications, including agonising over the choice of US or UK spellings when applying to US MNCs while living in a UK spelling locale.
Professional proofreading costs money. If you are modifying your resume for each job you apply to, it will cost you a fortune. And asking your friends who speak English natively (very few immigrants have those) is not so good, since most native speakers have grammar problems (unless you have a friend with a PhD in English lit).
Is "people using their second or third language will make more grammatical errors than people using their primary language" really that controversial of a concept?
I wouldn't so much call it controversial as I would call into question its relevance. Most immigrants I've spoken to are very aware of how hard it is to build a career when nobody can understand you. They don't come out and say it, but you get the feeling that they put a LOT of time and effort into learning how to become effective communicators in English. I would expect a foreign applicant to make fewer spelling/grammar errors because of this.
The exception to this is when I find an "enclave" of foreigners all working at the same shop where the boss is foreign as well. English communication in these environments is worse than usual. But meeting random foreigners in places normally dominated by native speakers like coffee shops and bars? You don't want to know how many hours they've put into getting better at English. It will make you feel like a lazy, racist asshole.
People who speak English as a second language very well, in my experience, make fewer grammatical errors than native English speakers like myself. This is because they've taken the time to learn the correct grammar rules, and are also careful to apply these rules in speaking.
This was a problem for me when I took the GMAT exam, because there is a grammar section there that's based on some formal grammar rules that I wasn't aware of. For example, look at the example sentences for Rule #5 at the end of this article: both sentences sound correct to my ear: http://www.knewton.com/blog/gmat/2011/06/14/what-to-memorize...
Be careful with those numbers. He mentions that half the applicants didn't list a GPA. He also mentions that, because of this, there are likely biases in those numbers. He makes mention of this in the comments.
There are lots of strange conclusions in the article. My favorite one is that degrees don't matter, when they pre-screen the candidates who don't have one more carefully. But maybe my favorite should be that a study of 5 job offers has any significance at all.
Anyway, it's very good to look at data for a change.
Nice data! However, I'd love to understand the relationship between those attributes and quality of employee. Perhaps a government funded study where you select 100 resumes at random and hire all of them for 6 months? Piece of cake.
Yeah, tracking on-the-job performance would be much more interesting than just looking at whether or not someone got an offer. I hope to be able to do this kind of study in the future with the advent of more data.
Nice article! Two thoughts on the issue of grammar/spelling mistakes:
1. The anecdotal argument: I know a great developer, very detail-oriented, a math/stats genius, a great team player, with an elite company background. But he's dyslexic. Let's hope you wouldn't apply your filter there.
2. Great teams need a balance between detail-oriented and not so detail-oriented folks. I know quite a few excellent programmers who can get lost in the details. People with less focus on the details are often the more visionary ones, the ones with the big picture. Truly great people can switch between the two, but they are rare.
> Of all the companies that our applicants had on their resumes, I classified the following as elite: Amazon, Apple, Evernote, Facebook, Google, LinkedIn, Microsoft, Oracle, any Y Combinator startup, Yelp, and Zynga.
I'm not sure an "elite" filter is a great filter, but it's not surprising that it is used. What is surprising is that the OP considers "any Y Combinator startup" to be an "elite" company.
More than 500 companies have passed through Y Combinator and I doubt that most of them would be instantly recognizable as Y Combinator companies on a resume unless Y Combinator was mentioned. Grouping all former employees of Y Combinator startups into an "elite" category seems like it requires an unnecessarily big leap of faith on the part of the prospective employer.
Does that mean that it makes more sense to know how top companies hire, rather than small ones, since small companies just shadow decisions made by bigger ones and check for typos and grammatical mistakes?
And further to this: which top companies? In TrialPay's case he lists his dozen elite companies. But what happens if you're hiring in a market where none of those companies have an office? For example the city I work in has a lot of large financial firms, but do those count as a "top company" on someone's CV? If you're hiring for one of those firms then probably yes, but if you're hiring for say a web startup then it wouldn't help.
So the vast majority of companies hiring will have to build their own "top companies" list, based on local conditions. Which in most markets means doing your own research and getting to know local conditions, aka networking.
To be honest, most of the top programmers or engineers that I know have terrible, terrible grammar and spelling. A number of them do not speak English as a first language but others just don't seem to care that much.
Would I base my hiring decision for a new programmer or engineer on the number of spelling mistakes in a resume? I would give it thought, but I would be more interested in actual real-life working experience.
> I ran this analysis on people whom we decided to interview rather than on every applicant; roughly out 9 out of 10 applicants were screened out before the first round.
Although the author acknowledges this will taint the results, it seems like a major filter, especially when you are crafting rules for screening by non-engineers. For example, the results for "Offer Likelihood as a Function of Highest Degree Earned" could shift quite a bit if those with higher degrees were basically passed through the first cut.
Why not run a subset of the analysis using the full set of applicants?
It's somewhat interesting that grammar/typos etc. matter this much. It's a little hard for me to believe because I'd think "projects, projects, projects, get stuff done" trumps all.
Did you adjust for native speakers vs. non-native speakers in any way?
Also maybe I'm naive but why can't engineers do the HR even at big companies? Imo for knowledge workers you really can't afford to outsource HR to people that have no domain expertise.
At the big companies, HR doesn't handle the complete hiring process, but a good HR department can do effective pre-screening for the hiring manager. I like the fact that the OP points out that if you provide sufficient criteria to HR, they can effectively screen resumes. (The problem that I've seen - which can be avoided - is that either no criteria are given to HR or the criteria are too tight.)
Why should an engineer spend time pre-screening for typos? And, if the hiring manager is adamant that the candidates must be proficient in language/methodology X, why should an engineer waste time looking at resumes that don't even mention language/methodology X? (Maybe the hiring manager shouldn't be adamant about language/methodology. Maybe judgement should be used. So that's a criterion that should not be passed on to HR, in that case. If it is, that's the fault of the hiring manager, not HR.)
Also, a good HR department can keep you from getting into legal trouble with illegal questions/behavior.
We didn't adjust for native vs. non-native speakers. Despite that, we did end up hiring several people from outside the U.S.
In general, I think native and non-native speakers should be held to the same communication bar. No one is demanding perfection, but it's important to be able to get ideas across without noise. Moreover, and I think some commenters mentioned this below, a resume isn't written on the fly. Ideally, someone should be self-aware enough to know that grammar/spelling isn't their strong suit, whether they're a native speaker or not, and that should inform whether they opt to have it proofread.
If it's true, then whatever trait leads one to be punctilious about spelling also makes one a great engineer.
Forget the guy who's really good at linear algebra or can write his own compiler ... the guy who is perfect with his spelling will be the better engineer.
I'd love to see more research on this because it would be a really fascinating result if true.
In an age where word processors highlight mistakes and offer suggestions, errors of the kind that would be caught automagically are akin to checking in code that doesn't compile.
I don't really care about email (esp. if written on a phone) but a CV is, as others have said, a formal document.