Hacker News

I regret causing confusion here. It turns out that this correlation was true on the initial small data set, but after gathering more data, the correlation went away. So the real lesson should be: "if you gather data on a lot of low-frequency events, some of them will display a spurious correlation, about which you can make up a story."
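The multiple-comparisons effect described here is easy to reproduce. A minimal sketch (all names and parameters are illustrative, not from the original study): generate many rare events that are independent of performance by construction, and some of them will still correlate with it by chance in a small sample.

```python
import numpy as np

rng = np.random.default_rng(0)

n_people = 200   # small initial sample
n_events = 50    # many low-frequency events tracked per person

# Each "event" is rare and, by construction, unrelated to performance.
events = rng.binomial(1, 0.05, size=(n_people, n_events))
performance = rng.normal(size=n_people)

# Correlate every event with performance; the true correlation is zero.
corrs = np.array([np.corrcoef(events[:, j], performance)[0, 1]
                  for j in range(n_events)])

# With 50 independent tests, a few |r| values look notable purely by chance,
# and each one invites a plausible-sounding story.
print(np.abs(corrs).max())
```

Gathering more data shrinks these chance correlations toward zero, which is exactly the pattern the parent comment describes.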

So can you now say with some confidence that competition performance doesn't correlate with job performance? That's still kind of interesting, although less so than the original conclusion.

The explanation of the effect did seem a bit too convenient.

You and/or Google HR were also prominently quoted as saying, IIRC, that GPA, standardized test scores (and interview ratings?) had no observable correlation with job performance either. I always wrote that off as just range restriction/Berkson's paradox, but did those also go away?

This made me reread the article[0] (if we're talking about the same one), and I don't see any mention of interview ratings and job performance.

[0] - https://www.nytimes.com/2013/06/20/business/in-head-hunting-...

Hm? It's right there at the start:

"Years ago, we did a study to determine whether anyone at Google is particularly good at hiring. We looked at tens of thousands of interviews, and everyone who had done the interviews and what they scored the candidate, and how that person ultimately performed in their job. We found zero relationship.

...One of the things we’ve seen from all our data crunching is that G.P.A.’s are worthless as a criteria for hiring, and test scores are worthless — no correlation at all except for brand-new college grads, where there’s a slight correlation. Google famously used to ask everyone for a transcript and G.P.A.’s and test scores, but we don’t anymore, unless you’re just a few years out of school. We found that they don’t predict anything."

Seems counterintuitive. Naively, high GPA = high work ethic + IQ, which surely play a role in job performance, no?

Of course, but that's where selection processes and psychometric considerations start to play havoc with naive correlations.

Interview ratings would be a surprising one; it's hard to imagine not overhauling the system after discovering that.

I remember reading something about Google interview ratings being uncorrelated to job performance, but that's clearly a conditional correlation based on the person being hired. For all we know, the ratings might be highly correlated with job performance up until the hire/no-hire cutoff. After all, their primary purpose is to make that binary hire/no-hire decision. Hopefully, the scoring system is hyper-optimized to be a good signal right around the hire/no-hire boundary, as the scores themselves aren't that useful for obvious hires and obvious no-hires: the scores are a decision-making tool.

In order to really assess whether the interview ratings were effective, they'd also need to hire a random, unbiased sample of those who fail the interview process. There are alternative ways of slicing the data to get some insight, such as looking only at those who barely passed the interview process, or only at the bottom 10% of performers. However, when you're looking at such a highly biased sample (only the small-ish percentage of people hired), it's hard to say what the correlation is across the entire interview population.
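This range-restriction effect can be sketched with a toy simulation (the model and cutoff are hypothetical, chosen only to illustrate the mechanism): interview scores and job performance both track an underlying skill, but once you condition on being hired, the observed correlation shrinks.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical model: both measures reflect a latent skill plus
# independent noise, so they correlate in the full population.
skill = rng.normal(size=n)
score = skill + rng.normal(size=n)        # interview score
performance = skill + rng.normal(size=n)  # later job performance

full_r = np.corrcoef(score, performance)[0, 1]

# Only candidates above a cutoff are hired, so performance is
# observed only for them.
hired = score > 1.5
hired_r = np.corrcoef(score[hired], performance[hired])[0, 1]

# The correlation among the hired is much weaker than in the full
# population, even though the scores genuinely predict performance.
print(full_r, hired_r)
```

So "zero correlation among hires" is compatible with the interview process doing its job; the hired-only sample simply can't show it.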

At the risk of repeating myself, we don't particularly care about the predictive power of the scores across the whole range, only their predictive power over candidates who aren't obvious hires or obvious no-hires. That's the range where the interview scores matter most as a decision-making tool.

Also, if two metrics disagree, it's not clear which one is problematic. It's possible that a poor correlation indicates that there's a problem with the performance rating system.

> I remember reading something about Google interview ratings being uncorrelated to job performance

You haven't; Google's interviews are correlated with job performance. They have data on it internally, which people who work there can look up. What you probably read was that brain teasers like "why are manhole covers round?" don't correlate with job performance.

It was while I worked there that I read something brief about them being uncorrelated, but it was probably just some popular-press mischaracterization. I'd done over 100 interviews for Google SWEs and wasn't aware of where to look up the data internally. In any case, the article I read wasn't interesting enough for me to do more digging.

I guess I should have been more critical at the time. Thanks for the clarification. Is it widely known where to look up this data internally now? I left Google over a decade ago.

When I was there, I just searched for it on Moma, and they had a paper on it showing the correlation coefficients.

Also, for any given title, people on HN will come up with anecdata to support its assertion.

Thanks for clearing that up. Was job performance still positively correlated with higher GPA in the larger dataset?

Ah, I remember hearing this from you in person in 2011 and have repeated it occasionally since. Thanks for the update!

Notice this comment is from Peter Norvig. ^^^^^
