The computers rejecting job applications (bbc.co.uk)
198 points by mnw21cam on Feb 8, 2021 | 251 comments



The next logical step would be to train an adversarial AI against the hiring AI. You'd have the system generate your resume and application to maximize your chances. Then the hiring AI would need to be re-trained to account for this. And so on.

In the far future, this feedback loop creates an economy where every job application is total gibberish. No human can possibly explain why their resume is a recipe for carne asada, an excerpt from Moby Dick, and a bunch of Wingdings. But it's predicted to increase final offer salary by 13.54%, so nobody questions it. Anybody who still writes out their resume by hand is considered a luddite weirdo, and definitely not someone you'd want to have join your company.


I have a section on my resume called:

'Technologies I'm Unfamiliar With'

Underneath is a laundry list of today's sexiest technologies.

It's not quite as advanced as your suggestion, but it's pretty effective. It's also a good icebreaker when you have the in-person interview.


I would title it "Growth opportunities I look forward to exploring", and then yes, absolutely.


This must be an American thing; if someone wrote that on a CV in the UK, this sort of text would probably put people off.


The 'American thing' is putting keywords in white text on a white background.

This is just normal stuff.

I get people hitting me up on LinkedIn for tech that I outgrew 10 years ago. I can either take them off entirely (making me look very one-dimensional) or counterbalance that with things I'd love for someone to pay me to learn.

I haven't made any decisions so far so the status continues to remain quo.


I think the joke is that the automated scanners weeding out resumes, as in the posted article, look for keywords like "React" or "Machine Learning" without context, so when they see "Machine Learning" on a CV they are more inclined to accept it, not knowing the context. As an interviewer you can simply play dumb on that aspect.


I find job application advice and takes very polarising. Some people are adamant about including other interests in a CV/resume while others consider it a waste of space that implies something negative about the candidate. There is also huge variance in what a CV/resume should contain and be called, such as how many pages it should have[1] or whether there should be a picture, and these are hills people will die on.

This is normal! If the hiring managers will reject CVs that have hobbies and you find them valuable information, what you're experiencing is a culture mismatch. That's effective filtering, before you're saddled with a new job contract. Viewing a job application like school, where you're guaranteed a certain result if you have the right talent and work ethic, is the wrong model and will lead to disappointment. It's fundamentally a relationship, so should be viewed like dating.

If I one day become one of HN's fabled star engineers that can have any job they want, I'd like to write a CV in a markdown text file. That's how I really disseminate information in my working life, not LaTeX.

[1] Having the proper number of pages is an important signal that you are also proper and well educated, i.e. the correct social class.


The picture thing is a no-no in the US, due to discrimination fears, but is reasonably common in Europe.


If I were a hiring manager, and I liked everything else I saw, I'd get a good laugh out of this, which might propel them to the top of my "to be interviewed" pile.


It'd piss people off in the USA, too


That would tell me so much about a candidate.

The desire to learn and adapt is probably one of the most important things in software.

The ability to take a look at oneself and identify blind spots/weaknesses in one's knowledge and skills is also a great indicator that this person won't become an "expert beginner".


This is the biggest big-brain move I've ever heard of when it comes to gaming these types of stupid auto-filters. Definitely stealing this for future use.


Resume Index:

Personal Information ... 1

Education and Certifications ... 1

Previous Experience ... 1:2

Known Technologies ... 2

Technologies I'm Unfamiliar With ... 2:348


I've always viewed keeping up with the latest tools, technologies, and practices as part of my job. I've also found that when people tend not to do this, or display the sentiment you touched on in your comment, it's usually because they've carved out a large enough amount of success or a niche with what they're already working with. Am I reading too far into this?


The breadth of the industry makes it completely impossible to keep up with everything. I started typing out a list, but the list would be impossibly long.


I agree that it is impossible to keep up with everything. I am more viewing it from a prospective employability position and safeguarding my future interests. I may do very well writing apps in containers with traditional load balancers, but I'd better learn some serverless and JAMstack-type patterns as well, as a completely random example.

It's part of the filtering process to determine what subset you want to target, but learn something new as often as you can. The bonus I've learned over time is the more paradigms, tools, and technologies I learn and practice configuring and launching sample apps or solving simple challenges with, the more they all start to look and feel the same.


Also once you've done it for about 10 years you realize there's not a lot that's really new.


Like nitrogen explains as well, it's just too much, even if you concentrate on what you are actually usually working on/in, depending on what your field is.

E.g. if you are working on anything that is currently a 'web app' of some sort, then depending on what company you were at, you would be using one framework or another. So you might write in the "I know technology X" column things like "jQuery, React", because your last job used jQuery, then they grew up and switched to React. In the "I don't know technology Y" column, you write things like "BackboneJS, AngularJS", etc. (the actual lists would be much longer; this is just an example). You're still current whether you use Angular or React; it just happens that you (or someone at your company) chose one and not the other. And then there's the myriad of other frameworks that come and go, or that a particular niche of companies might prefer. If you're a web app FE or full-stack guy you can still pick any of these up easily enough, so it makes sense to list them for the "pattern matching HR drones" (or computers :)) to get the interview.


As someone who gets to participate in the hiring process more than I'd like to, I look for balance on resumes.

If you bring nothing but J2EE 1.4 experience I'm going to assume you've carved out a focused niche. I'll steer the interview towards broad, modern practices and technologies.

If your resume talks about nothing but modern technologies, I'll try to dive deep to be sure you aren't a dilettante with surface knowledge in a ton of things but no deep capabilities where it matters to $company.

In general the best resumes show a balance of deep expertise in foundational tech and exposure to new technologies. A dev might have done Python for a decade but has started poking at Elixir, an ops person knows Linux like the back of their hand before they start talking about Kube, etc.


Do you have hard data demonstrating that your method actually has greater predictive power for job performance than other methods?


Absolutely not. It's one more approach riddled with bias, human foibles and questionable correlation to results - just like all hiring.

All I can say: in my experience, the engineers I've hired with deep (but potentially old) experience and with balanced resumes have been better performers and more likely to stick around than those who bounce around tech to tech.


I mean, I'm in the same boat as you. I just hate that I have to rely on this type of information, subject to all the flaws and biases of relying on personal experience.

In my experience, it's useful to have a couple 'jack-of-all-trades' types on the team, but you wouldn't want all your engineers to be broad but shallow. I did a stint in the military to pay for college, and something we'd point out was that it was better to have something 'good enough' right now than something perfect a thousand miles away. Breadth of experience is also handy when trying to innovate a novel solution.

But I don't have any better evidence than you. I've never worked anywhere large enough to have that sort of data where we could demonstrate quantitatively that one method works better than another, and of course the metrics are all fairly arbitrary themselves.


If it's part of your job, then you spend time in your regular 9-5 day keeping up with technologies, correct? Otherwise it's not part of your job; it's part of your personal time.

Employers tend to want people with the right skills but they seldom allow time to develop those skills.


This is similar to Stack Overflow's 'Technologies I don't like' resume section. Every time I encounter something truly horrible I add it there. I also add tech to filter out roles I'd rather not get tricked into doing. So far I get asked about this section a lot, mostly by recruiters curious to see how flexible I am, because internally the company does a little X or Y and needs to know if it's a deal breaker. It can really tell you a lot about places you would rather not work at, though, if you know for sure there are processes you want to avoid.


Brilliant, take a page from dark patterns and make the font white on white or light light gray.


If only I had space left on my big brain resume for this.


Very clever!


I'd say your name pans out, that seems pretty bold to me.


Yeap. There then ought to be a whole cottage industry of gaming AI applications and AI hiring.

So, we're basically voluntarily letting the machines take over and replace human jobs, nuance, and critical thinking?

Oh and then the next logical progressions are AI performance reviews and AI management. Within a decade, your AI boss will be slinging around a coffee cup and saying "yeah, I'm gonna need you to come in on Sunday too."

You are a true believer, blessings of the State, blessings of the masses. Work hard, increase production, prevent accidents and be happy.


There are numerous resume consultants who specialize in getting past HR screens and keyword filters.


Who? Where?

Asking for a friend...


Or just send in an image of a koala with a noise filter over it. 99% confidence for a prime candidate.


I kind of rambled on about this in another comment on this thread, but I paid someone $250 to optimize my resume with key words to appease AI. It worked. I got automated replies to schedule phone screenings from jobs that had rejected my old resume. It's already happening.


I am curious what they did - can you share high-level details? E.g., if you're a developer, did they add more technologies, or what was the type of thing?


The very fact that they serve many clients gives the service a good view on what generates increased response rates.

I once scraped a bunch of job postings and did a simple word frequency count. Just eyeballing the list, I picked out a few of the most commonly mentioned techs and skills. By just rewording a few of my resume entries, it seemed to increase my response rate by quite a bit.

For example, in the description of one project I added "using JSON". It wasn't particularly important to that project that the exchange format used was JSON, but apparently many of those doing hiring are seeing fit to include that in their searches.
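
In case anyone wants to try the same trick, here's a minimal sketch of that word-frequency pass, assuming you've already scraped the posting text into a list (the sample strings and the token regex below are just placeholders, not what I actually used):

    import re
    from collections import Counter

    # Hypothetical sample; in practice this list holds the scraped posting texts.
    job_postings = [
        "Senior engineer: Python, REST APIs, JSON, Kubernetes, CI/CD",
        "Backend developer with JSON/REST experience, Docker, Postgres",
        "Full-stack role: React, TypeScript, JSON APIs, AWS",
    ]

    term_counts = Counter()
    for posting in job_postings:
        # Keep simple alphanumeric tokens plus common tech punctuation (c++, ci/cd, node.js).
        term_counts.update(re.findall(r"[A-Za-z][A-Za-z0-9+#./-]*", posting.lower()))

    # Eyeball the most common terms and fold the relevant ones into resume wording.
    for term, count in term_counts.most_common(15):
        print(f"{count:3d}  {term}")

From there it's just a matter of rewording a few bullet points to include the terms that actually fit your experience.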


Business plan: Undercut Pymetrics by offering a free service. Dominate the market. Then, monetise it by selling optimisation to the applicants.


This has a strong Kurt Vonnegut smell to it.


I also thought of Player Piano when I saw this


he's ded, you know it, right?


So it goes.


We can just replay the history of SEO. I believe we're at the point (was it the late 90s for SEO?) where it's time to put all the keywords onto your resume in a white font.


The scarier part is how many life-critical processes might end up (or already are) like this. That being said; balls have zero to me to me to me to me to me to me to me to me to. But if you don't have access to that kind of technology, i i can i i i everything else . . . . . . . . . . . . . . [0]

[0] https://thenextweb.com/artificial-intelligence/2017/06/19/fa...


You underestimate the conflict of interest by far. What makes you think the company that creates the hiring AI won't create the resume AI under a different company name? It's like TurboTax. Create a problem and sell the solution.


I just wanted to say I loved this comment, thanks. Subbed.


Building on this, what are the chances this just leads to a cyberpunk underground economy of interview-passing images/videos?


I would read that if it was a book.


Lol.


I was part of very in-depth evaluations of both Pymetrics & HireVue for a large company recently (2 different evals, among others). Some interesting things we observed:

(1) It was not clear to us, with any level of certainty, whether either tool did anything more useful than filter out people who don't want to take the test / don't trust "AI".

(2) It was not clear to us, with any level of certainty, whether (1) was a useful, arbitrary, or counterproductive selection criterion for the roles we were hiring for.

(3) It was very clear to us, with a high degree of certainty, that both impressed a LOT of the higher-ups.

(4) It seemed somewhat clear that both were unlikely to get us sued (note: this is a high bar). However, HireVue (as of a year ago) could not in ANY meaningful way substantiate the claim that "AI is more impartial than a human interviewer, as it has no bias," and it angers me that they make that claim.

After conducting a bunch of reviews like this (including other tools), I concluded that most AI recruiting software is a combo of (a) very useful process automation and (b) hocus-pocus, magical, pseudo-AI. I had people telling me they used supervised learning because they had a team of supervisors in India. I had people telling me they had "Custom AI" when they were just calling some random unverified third-party API. And I OFTEN heard nonsense, unverified claims of "unbiased" AI, as if, e.g., training a hiring model using geographical factors won't tell you "don't hire non-college-age people from the south side of Chicago".

If anyone is looking to use recruiting AI, get a good IO psych person and listen to them.


The pymetrics test mentioned in the article involves, among other things, clicking on a balloon till it pops and pressing the keyboard as many times as possible in a minute. As much as people love to hate on leetcode-style interview programming questions, they are at least remotely related to performance on the job.

Personally, I've found it to be a good filter in terms of deciding where to interview at. Companies that have adopted these pop-psychology tests (management consulting companies and big banks like BCG, JP Morgan) signal that they've let non-technical upper management hold sway over recruiting in comparison to tech-centric companies/hedge funds etc. that seem to have more rigorous recruiting standards.


> Personally, I've found it to be a good filter in terms of deciding where to interview at

Emphasizing this point for others. This is the best way to approach it IMO.

The company should work to impress you as much as you work to impress them.


Got a call from a recruiter that was looking for an oddly specific combination of skills. Me, so it’s Company X? He was flabbergasted. I gave 3 months notice. I was literally the only person in the world with that combination of skills. So I asked how much, went to bosses, I’ll stay on longer if you pay me that rate.

Good money for a few months.


I suppose that speaks positively of the recruiter's ability to find the right person for the job!


This. I got headhunted for a very very specific skill in my city. Asked them if it was company X? “How did you know??” “I wrote that rec”. I then proceeded to work directly with the recruiter to find a candidate and got an easy $1000 for about 4 hours of total work.


I'm curious about what the combination of skills was?


Obscure mainframe tech, Ruby, and VB. About 15 years ago.


> The company should work to impress you as much as you work to impress them.

This becomes more true the further in your career you go. But for entry level positions it's really just a numbers game.


That's the balloon task; it measures willingness to take risks in contexts of ambiguity (the chance of it popping is not known). The button-press task is probably used to normalize response time data, since some of the between-person differences in average response time are due to (probably uninteresting) differences in motor control of the neuromuscular system. Were there any other tasks, like ones that asked you to identify the color of the text regardless of what the word said (e.g. a blue word "red")?


The "willingness to take risks" is probably also "willingness to take crap" by an unemployed or underemployed person. A gainfully employed person may move on when the pseudoscientific video games come up during the screening interview. It is a risk to go work for such companies.


I think I would dispute the characterization of this as pseudo-scientific. The balloon analogue risk task is well validated, if in some ways poorly understood. That is, while standard economic risk models don't map onto it very well, it does correlate fairly substantially with real-world risky behaviors (so again, not economic risk where there is variance of outcomes, but more like the kind of risk taking where there are predictably bad outcomes, like trying heroin).

On the other hand, the use of these measures by big corps very well might be pseudo-scientific. I generally object to psychometrics being applied to prospective employees; it feels very dystopian to me. I also know that these measures require more subtle interpretation--and caution--than you're likely to get from anyone in HR, especially when we are talking about making inferences about individuals rather than groups of individuals.


Why you are being down-voted is beyond me. Your response is reasoned, clear, and open-minded, as well as informed. A citation might improve it, but as it stands, it's a fine exemplar of the kinds of comments we should be promoting here on HN.


I'm an IO psych at a large tech company that evaluated pymetrics against job performance, and there was no relationship, just near-zero correlations, so we recommended not using them and we don't. These AI tools are not transparent enough in explaining their outcomes and lack what we call face validity, or job relevance. A coding test is at least a good filter at the top for entry-level roles with large pools because it has relevance and false negatives aren't as much of a concern (though we still test for disparate impact); otherwise, for most roles a structured interview is the best option.


This is why tech companies are good. Our output basically said "this is entirely indistinguishable from random choice," and we got the response "Great! Can't get sued over randomness!"


Even fairly strong correlations break down on restricted ranges of inputs. E.g., if you're hiring 80th-90th percentile candidates (top ones get better offers, lower ones get filtered out), yes, the correlation will get swamped by noise; not so much if you are hiring across the full 0-100 range.
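
A rough simulation of that range-restriction effect, with completely made-up numbers (a true correlation of 0.5 between a screening score and later performance across the whole pool), shows how much it attenuates once you only look at the narrow band you actually hired from:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical full applicant pool: screening score correlates ~0.5 with performance.
    n = 100_000
    score = rng.normal(size=n)
    performance = 0.5 * score + np.sqrt(1 - 0.5**2) * rng.normal(size=n)

    full_r = np.corrcoef(score, performance)[0, 1]

    # Restrict to the band you'd actually hire from, say the 80th-90th percentile of the
    # screening score (the best take better offers, the rest get filtered out).
    lo, hi = np.percentile(score, [80, 90])
    band = (score >= lo) & (score <= hi)
    restricted_r = np.corrcoef(score[band], performance[band])[0, 1]

    print(f"correlation in full pool:          {full_r:.2f}")        # ~0.50
    print(f"correlation in 80th-90th pct band: {restricted_r:.2f}")  # close to 0, swamped by noise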


Based on how these things go, it's almost a certainty that this will be found to be discriminating in some way that is unethical/illegal within the next few years.


> (1) It was not clear to us, with any level of certainty, whether either tool did anything more useful than filter out people who don't want to take the test / don't trust "AI".

There's at least one way this is beneficial for the employer: Desperate, obsequious job-seekers will be more willing to tolerate the degradation of being judged by a computer. It's perfect for finding employees who will ask "How high?" when you tell them to jump.

I applied to a few jobs at United Healthcare and they kept sending me HireVue interview invitations. Every time I rejected them and when prompted for feedback I said I don't want to be evaluated by a computer. Never heard anything from them (and didn't expect to).


How do you find good psych people? Showing my bias but it’s a genuine question. I remember in school all the psych and social psych folks being total jackoffs in blowoff classes.

How do you find people you can take seriously? Who don’t just quote psychology today and act like it’s insight? Like I know the field is professional and has skilled professionals - but how do you find them?


Well, first you put out a job ad. Then because the psych people are so numerous, you'll be inundated with resumes. Since you don't know what to really look for, you decide that an AI from some company will help out with that. So you use an AI to hire people to get rid of the AI, except in the case where you hire people to get rid of the AI, because you don't know how to hire those people. ;)


So how do you find the right psych person to monitor the psych hiring AI?


I actually built an AI to do that for you


Who's gonna monitor that AI?


It made a strangely compelling argument that it would be fine if it forked off a subprocess to monitor itself.


But it also has to fork off a subprocess to hire that subprocess from a pool of randomly generated subprocesses


For hiring and assessment design, look for people with at least a master's degree in industrial-organizational psychology. A PhD isn't necessary because it's an applied field and there aren't many applied PhDs, so work experience is more important. Most start out in the public sector, which is much more rigorous in its interviewing and in defining roles than the private sector is. SDSU and SFSU have great programs in CA and their professors can connect you to their alumni networks. SIOP is the professional organization. IOPredict, Biddle, and RocketHire are some consulting firms.


You don't hire right out of university, but middle-to-senior people who have established themselves in academia.


That makes sense - thank you.


> I had people telling me they used supervised learning because they had a team of supervisors in India

Love it.


This makes me think of the (most likely apocryphal) story of the hiring manager faced with a huge stack of resumes.

He divides the stack in half, and tosses the top half in the trash. "We don't want to hire those people. They're terribly unlucky."


Frankly, the logistics of hiring make this a reality. I remember helping my boss hire interns and picking a few out. Later on, they found out that they weren't able to recruit at all of the schools they wanted; they could only choose two. Where did all the people from the other schools go? Straight in the trash. Same deal with hiring full-time employees, in my experience. Random, usually understandable circumstances preclude a huge swath of applicants from even being evaluated. And for many places, it doesn't even matter, because they have so many candidates that they can't distinguish between them at that level. But from a personal POV, it can be disheartening to get rejected from job after job if you don't know that like 75% of the time it has nothing to do with you.


The current large tech company I work for had a recruiter posting messages on LinkedIn like “we’re hiring like crazy in <my town> for people with <my background>!” I had already applied on their careers site and got no response. Messaged this recruiter and got no response. Applied again to a similar position maybe 2 months later, got a response, got an interview, and now I’m working there. Guess my point is there’s clearly a lot of randomness in the screening process.


Any application process is mostly luck after you meet a certain threshold IMO.


They're also mostly luck until you meet a certain threshold.

Therefore, they're mostly luck.


I applied to my current employer 3 times in total. Better candidates would already be working for other companies after the first time.


This is why I always laugh when someone trots out the tired "shortage of engineers" excuse about why they can't seem to hire. For any tech job posting, the employer needs a super-aggressive filter just to get the list of applicants down to some manageable double-digit count, and this filter is not always going to be nice or fair unfortunately.


> shortage of engineers

They mean qualified engineers. There are a lot of completely clueless applicants for every programming job out there. The lack of formal certification sure doesn't help with that.


There are resumes you're going to see today that are a terrible fit, but might be exactly what you're looking for in three years. It matters how you go about rejecting those people. Charity is contagious, so is indifference. Disdain, on the other hand, is virulent.

You don't want people to delight in the prospect of working with one of your competitors. That's far, far worse than just missing out on hiring them and never getting a second shot.

There is also, I think, a false economy in having one person talk to a candidate on their own. There is no one to see how that interaction goes. If the candidate is in a protected class, you've also created a liability for yourself by having nobody to corroborate the exchange.

We are trying to make an inherently expensive process cheap and we are breaking everything in the process. We didn't fix deployments by doing fewer of them. Why do we think we're going to fix the onboarding process by avoidance? Finding and training people is part of building a team, which is necessary to build a product. If you have a bunch of people pushing back on the obvious parts, they're probably pushing back on the rest of it too, making little empires for themselves at the expense of their peers, the product, and the company.

We used to use referrals instead of cattle calls to fix this problem, but we don't like the kinds of hiring biases this brings in and so we threw the baby out with the bathwater. Now we have all of the worst, dehumanizing attributes of a lottery system, and not really many benefits from doing so.

Probably what we are all not learning from this decades long experiment is that if you haven't solved your diversity problem before you are famous enough to have an embarrassment of riches in your inbox, then you never will. Everything that comes after is a series of rear-guard actions trying to replace bad decisions with less bad ones, failing as often as succeeding, and justifying your arbitrary decisions as more merciful than the alternative, when in fact you mean more merciful for yourself.

Right now, for instance, I've been keeping my eye on José Valim (Elixir), because I have a suspicion that his team may have cracked that nut already, or soon will.


Choosing two schools and then saying the process is random seems disingenuous.


This is such a common story that a lot of people have actually witnessed it. I mean you might as well do that with a stack of CVs, so people actually do it. A friend of mine witnessed this, and clearly the boss hadn't thought of it himself.

Variations:

"The way we hire traders is we don't hire unlucky ones."

"The most important thing in sales is selling yourself."

Followed by unceremonious dump into bin.


This always annoyed me, because it's almost certain those candidates were in some kind of order.

It'd be either based on the file-name when they were all printed, or the order they came in, or based on some weird method the printer uses, but there'd be some implicit order.

So the candidates were unlucky; it's just that the hiring manager couldn't be bothered to think about the biases... oddly relevant when talking about managers outsourcing their racism/classism/ageism/sexism/elitism/whatever-ism to AI.


Less harm there in the trash than in the cloud to be used by perfect strangers to profile you.

"This will go down in your permanent record" comes to mind.


I once read about a company that fired everyone who had a below-average performance score.


It's more typical to fire the bottom 10% but that's derisively called rank and yank. Microsoft, HP and others have allegedly used it at one time or another. The assumption is if you do your job well you'll rank well. The reality is your job becomes ensuring you rank well, which may not 100% align with doing the job you were hired for well.


It also neglects that you'll always have a bottom 10%, even if those people are absolutely amazing: hire 10 Harvard PhDs and one of them will be at the bottom.

I can't believe stack ranking ever caught on.


Yeah, defenders of the institution hand-wave and say that once you get enough people in the mix, the math works out. I'm skeptical when you're asked to force-rank people into a bell curve no matter how small the population. People who land at the bottom because of the forced curve tend to stay there or near there.

Your bottom-most Harvard PhD is unlikely to find himself in the top 10% when mixed into a general population because other human factors come into play, not the least of which is it's incredibly time consuming to review and re-rank every person. So managers tend to leave people where they landed because fatigue eventually sets in.

This in turn inspires other behaviors that are not really what the designer envisioned. For example one manager used to carry around a book of every mistake other teams made: if anyone challenged the ranking of one of his employees he'd start firing off potshots at the other manager's org. Needless to say his rankings were left alone. The irony is not lost on me that this manager eventually left to start a company that allegedly uses AI somehow to help people identify the best candidates. He spent his days subverting a system meant to measure employee performance and now claims expertise in finding the best performing employees.


If the ranking system is accurate (that's a huge "if"), then it doesn't really matter what the bottom x% look like on paper.

The question is whether it's ever possible to devise an accurate measure of value quantitatively. Even if your job is press the red button, a person with low numbers might be such an inspiration to the team that everyone else works 15% faster, more than making up for their shortcoming. And you can't possibly anticipate all such factors in advance.


And even if the evaluations are perfectly accurate and retained staff are equally or more motivated, it's still quite possible that interviews, training and slow productivity ramp up for replacements results in more of a productivity hit than retaining relatively unproductive staff.


> If the ranking system is accurate (that's a huge "if"), then it doesn't really matter what the bottom x% look like on paper.

Unless every employee does the exact same work your ranking system will never be accurate. The difference in the difficulty of jobs will dominate the difference in competence.


It also means that if you're doing a bad job of supporting certain groups of employees, they'll get laid off instead of fixing the problem. You'll winnow out anyone who disagrees with you along with the people who just aren't good at their jobs.


Well, I guess the joke works perfectly well for 10% too. The company will just live for a bit longer.

The fact that the practice is immediately visible as stupid does not stop real companies from adopting it.


"We're the best, because we fire the worst!"


Oh yes, and I know of at least one country that is considering setting the minimum salary as a percentage of the average salary...


Just think, in only a couple of years everyone will be a millionaire


GE?


Of course if the job sucks, then you're screening for luck the wrong way.


You joke but this is actually how most recruiting AI systems work


See 'the secretary problem' [0], which is a more sophisticated version of the above. It's a process that can be summarised roughly as follows.

Given a pile of CVs

Read approximately the first 37% of them (1/e, or ~0.368)

Dump that ~37%

Select the first candidate that is better than the ones you have just dumped

[0] https://en.wikipedia.org/wiki/Secretary_problem
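
A quick way to convince yourself of the ~37% figure is to simulate the rule directly; here's a small sketch (candidate "scores" are just random numbers, and the function names are made up):

    import math
    import random

    def one_over_e_rule(scores):
        """Skip the first n/e candidates, then take the first one who beats all of them."""
        cutoff = int(len(scores) / math.e)
        best_seen = max(scores[:cutoff]) if cutoff else float("-inf")
        for score in scores[cutoff:]:
            if score > best_seen:
                return score
        return scores[-1]  # forced to take the last candidate if nobody beat the benchmark

    def simulate(n=100, trials=20_000):
        hits = 0
        for _ in range(trials):
            scores = random.sample(range(n * 10), n)  # distinct scores in random order
            if one_over_e_rule(scores) == max(scores):
                hits += 1
        return hits / trials

    # Should land near the theoretical 1/e ~ 0.368 chance of picking the single best candidate.
    print(f"picked the best candidate in {simulate():.1%} of trials")

Of course, as pointed out below, real hiring rarely forces an irrevocable yes/no on each CV as it arrives, which is the assumption that makes the 1/e rule optimal.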


The secretary problem says that once you pass on a CV you cannot go back and later decide you want that person. I don't think that is a restriction that exists in real-life hiring.

It is possible to review all the CVs and then decide which one to pick.


When I was involved in hiring, we received a steady stream of CVs and then had to decide if we want to interview and eventually hire or reject a candidate (typically we decided right after the interview or programming task).

While there is a short time window in which multiple candidates might be evaluated, this approach is pretty close to the assumptions of the secretary problem.


I assume that's pretty typical. It's certainly my experience. We interview someone, have a call, and it's usually either an enthusiastic yes, or an "OK, I guess" (or just a no), in which case we keep looking. I'm not sure I can think of a case where we were "I guess they'll meet our needs if no one better comes along."


> I'm not sure I can think of a case where we were "I guess they'll meet our needs if no one better comes along."

You may not have been at an organization with a policy that unfilled positions after X time get removed. When you get close to the end of the time, there's pressure to hire someone, even if they're not great, because otherwise you'll lose the position.

On the other hand, I have had a case where someone was not hired for a position, and then later was asked to interview for a different position with the same hiring manager, and was hired for that.


Many hiring managers read your CV for the first time during the interview.


This is a reason why you

1. Should always insist on a first phone interview

2. Should ask the hiring manager why they believe you are a good match for the position.


About a year and half ago I was applying for a software engineering internship at Goldman Sachs and they made me take a Pymetrics test. One of the tasks was to press the spacebar as many times as possible in 30 seconds. I didn't do the task and withdrew my application.


They successfully filtered out one of the applicants who won't put up with their bullshit.

It's a pure win on their side.


This. People think stupid tests and ineffective recruitment stand for incompetence. They may actually signify a well-tuned process to identify compliant, mediocre individuals that don't produce results, stand out, nor are self-motivated, but rather aim to please the boss, without question, and to "fit in".

As an engineer, you want to increase your value contribution to the org, and the org should see you the same way. This usually means looking for software product companies with at least some meritocracy. It should be about software all the way down.

In any other scenario, it's about something else, usually pleasing the boss, you are a servant, and good luck with that.


>They may actually signify a well-tuned process to identify compliant, mediocre individuals that don't produce results, stand out, nor are self-motivated, but rather aim to please the boss, without question, and to "fit in".

You might be giving them too much credit. I suspect that may have been the original intent of the question, but was mindlessly aped after appearing in an article by countless hiring managers and recruiters.


It doesn't even have to be conscious. Implementing a meaningless test still provides a valid test as far as candidate compliance and desire to fit in are concerned.

Merit-driven applicant: This test is not backed by science, does not measure anything related to performance relevant to the job, and is arbitrary. Oh, and did you check how the developers of the test address the fundamental problem of the lack of replicability and reproducibility in psychology?

Hiring (if they can actually articulate and self-reflect, a rare occurrence): So you think you're smarter than us?

Subservient applicant: I gave this test 110%, thank you for this opportunity! I look forward to learning more about how I can contribute to the continued success of [org name]!

Speaking to a mediocre manager with a very stable career, it was a revelation for me to hear them say "smart people get frustrated quickly. I've learned through the years to select those that are agreeable". Value and value creation were not on that manager's radar.


>"smart people get frustrated quickly. I've learned through the years to select those that are agreeable"

That might make sense if you're hiring people to dig holes in the ground but seems absurd to use as a filter for any sort of skilled professional position.


The idea here is that the hiring manager is likely to care about things other than raw productivity of their direct reports, such as how easy they are to manage.


A different take on your last paragraph:

Ignorance can be fixed relatively easily compared to poor personality traits.

I think somewhere here is a conflation between "ignorance" and "intelligence".

First priority is to get someone who doesn't stir up unnecessary shit.

Second priority, get someone smart.

Third priority, get someone who already knows the tech.


That sounds like it could be discriminatory against people with certain disabilities.

Besides the fact that it's stupid and somewhat embarrassing.


I literally was embarrassed. I didn't want to work for a company that selected talent like that.


Perhaps you aspired to find a job that rewarded on merit. They perhaps aimed to form an orderly line of obedient sycophants. The test worked, to the benefit of both parties.


Yeah I wouldn’t feel proud of getting that job.

We had some of those too. There'd also be this webcam screening thing where you answer vague interpersonal questions as the camera looks at you.


A finger-tapping test is often used to normalize other kinds of response time, essentially to control for uninteresting between-person differences in neuro-muscular performance. Was there some kind of cognitive control task too, e.g. where they ask you to refrain from responding to certain stimuli? Typically in something like that you'd have a hard and an easy condition, and the difference in average response times between those conditions is a proxy for "cognitive control" or impulsivity. If an employer were interested in that they'd want to remove the effect of finger speed.

I don't think companies should be administering psychometric tests. But it doesn't make the task you refused to do meaningless.
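
For what it's worth, here's a toy sketch of the kind of normalization being described (all numbers fabricated): strip the tapping baseline out of the condition response times, either with a difference score between conditions or by regressing the baseline out.

    import numpy as np

    rng = np.random.default_rng(1)

    # Fabricated per-person data in milliseconds. Raw speed differs a lot between people,
    # mostly due to uninteresting motor/neuromuscular variation.
    n_people = 8
    tap_interval = rng.normal(180, 30, n_people)            # finger-tapping baseline
    rt_easy = tap_interval + rng.normal(250, 20, n_people)  # easy condition
    rt_hard = rt_easy + rng.normal(80, 25, n_people)        # hard (inhibition) condition

    # 1) Difference score between conditions: the motor component largely cancels out.
    control_cost = rt_hard - rt_easy

    # 2) Regress the tapping baseline out of the hard-condition RTs, keep the residuals.
    slope, intercept = np.polyfit(tap_interval, rt_hard, 1)
    residual_rt = rt_hard - (slope * tap_interval + intercept)

    print("difference scores:", np.round(control_cost))
    print("baseline-corrected RTs:", np.round(residual_rt))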


All data is meaningless if interpreted or used incorrectly. Is there any evidence supporting the use of psychometric evaluations in the context of making hiring decisions? Just intuitively, I would think the predictive power would be low.


agreed.


> But it doesn't make the task you refused to do meaningless

Did they say it was? I'd withdraw my application too. There's no good reason for it, just bad reasons that they think are good.


There was an arcade game many years ago where that was part of the gameplay, and this “test” could probably be manipulated in the same way: “Because the game responded to repeatedly pressing the "run" buttons at high frequency, players of the arcade version resorted to various tricks such as rapidly swiping a coin or ping-pong ball over the buttons, or using a metal ruler which was repeatedly struck such that it would vibrate and press the buttons.”

https://en.m.wikipedia.org/wiki/Track_&_Field_(video_game)


There are MANY programmable keyboards (and just plain USB device controllers). The chances that this can't be gamed are zero.


Pushing candidates who actually question the requests being made to select themselves out of the running sounds like the ideal recruitment software for a consulting firm.


Fun fact: Pymetrics has hired a lot of former C64 game programmers.


Is that a reference to the infamous C64 Summer / Winter Games? Where you had to twiddle your joystick as fast as you can?

Fascinating that should carry over into the 21st century hiring process.


Yeah, I might have lied when I wrote "fact"...


Wasn't that the game mechanics for the sprint event in Summer Games? Diving was the best in that.

I did really enjoy surfing in California Games.


I enjoyed California Games more than the already great Summer/Winter games. For a little boy far, far away from California, somehow they managed to pack a whole lot of California Feelin' into 320x200 pixels... http://www.youtube.com/watch?v=mSp8XHpEKzw&t=0m33s


Also seems biased towards programmers that prefer spaces instead of tabs...



Yes, hence Py-metrics.


Good one. C64 games almost always required a joystick though. Maybe ZX Spectrum game programmers?


Details... ;-)


To be fair, I've been training for that task my entire life


Cordless drill with a small paddle bit held the right distance from the space bar would have aced that test.

Of course there are software tricks you could have used as well.


Or a software input device


Or, presumably, holding the spacebar down and letting the repeat function do its thing


`yes | tr 'y\n' ' ' >/dev/input`


Is it bad that my first reaction would be to bind autohotkey to spam out space bar events?


Probably, too try-hard. I think mine would be to press the spacebar once. I'd get rejected for not following instructions.


Sounds like 50% of video games in the market. Just pretend it's about how high you can jump with infinite jump enabled!


Good on you, mate.

It would be great if you gave them feedback along the same lines, although chances are it would just get ignored.


Bravo. Genuinely well done..


I just timed myself. 224 times. I wonder what that says about me?


It says you're going to be buying a new keyboard soon. The one you usually use has a worn-out space bar.


You have too much free time - not good for our company sorry.


Computers decide if you get a job (and access to a resource called "money", pretty important for living).

Computers decide if your Google accounts gets deleted, without warning.

Computers decide if you can get credit, or open a bank, or participate in society.

And there is 0, yup, 0 accountability. Computer doesn't like you? Tough shit, get fucked, nothing you can do. If you don't have an internet connection, you're fucked. If you don't have fast upload speeds, the capability to record video, and a willingness to store it on someone else's computer (where they will datamine it and use it against you in the future, or it'll be "leaked"/sold), then you're fucked.

So now, to get a job I need:

To participate in surveillance and trading of my data where it'll be data raped

A smart phone

A fast internet connection

So great, fuck you to all the poor people I guess who find technology hard, they can just go die.

I'm beginning to think computers were the biggest mistake humans ever made. We need regulation and fast against this sort of thing, to make the hiring process fair and accessible to everyone.


Computers are just a tool, they're great at processing lots of data very fast. That makes certain types of tasks a good fit for them. None of the things you mentioned were necessarily better before computers, they simply used other methods to do the selection. But that doesn't mean those methods were better.

> I'm beginning to think computers were the biggest mistake humans ever made. We need regulation and fast against this sort of thing, to make the hiring process fair and accessible to everyone.

It was never fair and accessible to everyone (for those companies that would employ algorithm based resume screening). Just that instead of whatever other arbitrary mechanism they had used until now to filter out candidates they now use "AI".

It's good to be trying to find better solutions to these problems, but don't rail against computers, because computers might be part of the solution.


In regards to society computers are still a relatively new invention. The behaviour we're seeing at the moment will either be regulated away or the affected populations will end up as neo-serfs, subjects to decision making systems they are allowed no control over. The EU is at the forefront of the fight against this: people can demand an explanation of algorithmic decision making and GDPR prevents organisations from holding your data without consent.

Computers are powerful tools and can be put to both constructive and destructive purposes. I'm hopeful that at least some societies will figure out sensible limitations on what use they can be put to.


I'd actually go beyond the computers. Why does taking any time away from work make you a liability? Don't answer that, I know all the excuses, but this is a horribly oppressive policy, even if it happened "accidentally". What if you want to travel the world, try to get your band a touring presence, look after a sick relative, or god forbid you want more time with your kids (especially as a man)? You're viewed as a traitor to capitalism for taking your nose off the grindstone for 1% of your life.

Don't you dare get fired either. Better not accidentally work for a company that tells you to do something grossly unethical, or illegal. Don't accidentally be part of a negative news/viral social media story. Doesn't matter if it's wrong; they'll just print a tiny retraction in fine print and leave the main article at the top of Google.

Then there's the justice system. Why does getting busted for anything permanently make you a second-class citizen? We talk so much shit about China, but the USA and the elite in our country definitely have a no-no list that hands out a life sentence in the "untouchable class".

Of course, we let a few talented "untouchables" advance to the middle class (it helps if you were previously a member). I actually have a bit of a conspiracy theory about this. Amazon and others have commercials bragging about hiring ex-cons. I think they might use this market to bring wages down. Essentially, they pick from the cream of the crop, with the most benign offenses, and best sob story; you can pay them whatever, they'll be thrilled for any middle class chance!

P.S. I know what China does is worse, and some of the specific reasons why, but sometimes I'm not so sure.


This is maybe true in Silicon Valley. Not in the real world.

Also, nobody needs a Google account.


I wish it were as you say. Effectively, at least a facebook account was needed at two of the universities I was at. I've also seen instances of important (even life threatening) information being transmitted via social networks, exclusively. So yes, sometimes you pretty much have to be part of that game.


10 years ago I was so excited and optimistic about the future technology was going to bring us. As that future comes closer it's quickly becoming quite clear that this entire industry is bringing us a giant dystopian future. Tech companies have more power than governments, individual privacy is gone. Corporations, already pretty impersonal, have completed the transition to treating people as "resources", completely dropping the word "human".

We all need to think about the future we're building here.


I wish it wasn't true, but the path we're headed down is pretty much set now.

I can't help feeling that the habit of removing identifying markers (names, ages, etc) from resumes to try to ensure equality in hiring is also part of the same ill-fated direction.

I dearly wish that the world I was leaving my kids wasn't so dehumanised.


> I can't help feeling that the habit of removing identifying markers (names, ages, etc) from resumes to try to ensure equality in hiring is also part of the same ill-fated direction.

What? There is well documented and clear evidence that these indicators actually lead to highly biased decisions regarding who gets interviewed. Removing them is the right thing to do.


There are positives, and there are also negatives. Both have been well-documented.

My point was a more general one, that we are being sucked into a strangely anti-human world where those in power would like nothing more than to reduce us to a mere employee number. Just data in a spreadsheet. It's so much less messy that way, right?


It goes beyond resumes. Our criminal justice system also uses AI to make judgement decisions. I understand the need to eliminate bias from systems (though I question how much bias is actually being eliminated via these systems) but the need to humanize people when they are most vulnerable is important.


My plan when I retire (about 10 years) is to have no internet and no mobile phone. I really can't see what I'd be losing.


Sounds a bit Black Mirror Complex. People were complaining about being treated as resources since the industrial revolution - and before then serfs knew that they were resources! More power than governments, seriously? The only time I have seen that rhetoric come up is complete fucking morons complaining about encryption and wanting to outlaw math.


Help help! I’m being repressed! https://youtu.be/ZtYU87QNjPw


FWIW (at least in the tech industry), apparently the automated filtering systems are a myth spread by companies who make money on the promise to optimize your resume to get past the filter.

https://twitter.com/GergelyOrosz/status/1292909844886945792

(The OP of the tweet was an engineering manager in several big tech companies, wrote The Software Engineer’s Guidebook last year, and talked with many recruiters in the process)


I don't know if there are automated filtering systems that will send a rejection email on their own, but I know as a manager who used LinkedIn job posts I had what was basically a "qualified" folder and a "not qualified/spam" folder for applications. New applications were auto-sorted into one or the other depending on their resume/LinkedIn profile/some other factors I was not aware of.

I could still find those profiles, but it was auto-filtering the candidates it thought were relevant and those that were not. To at least some extent these types of automated filtering systems do exist. I don't know why recruiters/managers say they do not.


I talked to a recruiter for a three-letter company, and they told me that they actively search for keywords on people's resumes and for the applications that best match the role. This would in fact be considered filtering. Now I don't know if the software they use (BrassRing) automatically "suggests" applicants (which would imply automated filtering) or not.


>(at least in tech industry), apparently the automated filtering systems are a myth

Sounds like horseshit. There's no shortage of graduates in tech, on top of graduates of various "bootcamps", as everyone and their dog has been telling young people they just need to "learn to code!" and they'll have a luxurious, joyous future.


I paid someone $250 to get my resume past ATS systems, it worked.


And what would have happened if you hadn't?


Care to share the service/company?


Reminds me of that old trick: Pay me $100 and I'll cast a spell to get you accepted to university. Your money back if it doesn't work.


Huh? This directly disagrees with my experience. I believe this person is mistaken. Resumes are always filtered before they hit my desk; I just never even knew the filter happened. Apparently they toss out a huge amount.

Reading this tweet in depth, it disagrees with a lot of my experience in Fortune 500 tech companies. Can't figure out the misunderstanding.


If you never knew it happened, how do you... know it happened?

I can't conclusively say one way or the other. I have been a hiring manager at a F500 tech company and have never seen direct evidence of any automatic filtering or rejections.

Anyone speaking with complete confidence in either direction needs to bring actual evidence...


As in, my managers always talked about their piles being “presorted by HR”, which was just code for “a computer did it”. I never thought some well-paid HR person was actually sorting through those, but maybe, I guess.

There's no evidence to bring? Like, in the case of a bunch of people bringing personal anecdotes, you can still create a sense of truth without evidence. Christ.


The companies I worked at had sourcers and resume screeners who did exactly that. Mostly but not all contractors. Not necessarily well-paid, but humans nonetheless.


jakub_g refers to the claim that "automated filtering systems are a myth". That doesn't mean that HR aren't filtering them manually.


Doing an online application to initiate contact with a potential employer is already a strike against the applicant.

The whole point of these systems is to whittle the pool of candidates down to single digits. Of course each candidate is going to "perfectly match" the stated faux-objective criteria, because that's what the users of these systems think they want. Whether or not that's what they need is another question entirely.

For the jobs where it's possible, it's much better to reach out directly to human beings and exercise one's human professional network. The best jobs are filled by referral and/or reputation, almost always.


Google did a study on what elements of the hiring process correlated with good performance and found that references were almost useless.


Google is Google. They hire _VAST_ numbers of people straight out of college. It's practically a pipeline for them. I am not saying that neglecting referrals from professional contacts doesn't work for Google, but rather that it's commonplace and "good enough" for other places.

Moreover, from the point of view of an individual rather than the employer's performance metrics, it's generally better to make a human connection if you're just talking about the chances of getting hired.


I don't see what good references are since the applicant gets to choose them. How many people legitimately can't find a single person from a previous job to list as a reference? Or find a person that can convincingly pretend to be a reference.

It would be different if you could randomly call 3 people they worked with at their last company, but I don't think anyone does that.


Are we talking about candidate provided references?

Or telling your boss "hey, my buddy is looking for a new job, I'll shoot you the resume"?

The former seems fairly useless, the latter seems useful


That's been my experience, but it probably does legitimately depend. I think there are some people whose skills match sought-after keywords and they're fine with those positions. If you've got a more eclectic background (often more senior, with varied types of experience), keywords probably don't help much at all.


I don't bother with online applications anymore - I don't think I've ever had a single experience where doing so didn't result in an automated rejection letter or simply no response. Seems like a total waste of time.

Applying via human connections (whether it be a friend, coworker, or even a recruiter - first or third party) has been tremendously more successful.


Related story: I applied for a position with United Airlines a few months ago. I started the job application, then stopped in the middle for whatever reason. United’s job system emailed me at 9 PM a reminder to continue my application, and I completed it less than a half hour later. A half hour after that (just after 10 PM) United emailed me a rejection notice. So either some United recruiter was working late (I’m in the same time zone as United’s HQ) or my resume was completely rejected by a bot.

It’s pretty annoying to put in all that work and get a rejection so quickly.


Very early in my career, I did an open house where I chatted with their principal architect and got hired. A few weeks later, after I had been working for a week, I received a letter from HR rejecting me. I wandered over to the HR office and meekly asked if I was still employed - and thankfully was. Apparently they had just gotten to screening the resumes that were handed in at the open house.


And now you're presumably much less likely to try applying again, regardless of how well you fit future openings. They burnt the bridge. The equivalent for you would be turning down a job offer with 4-letter words. In many areas there aren't enough skilled candidates for this to be sustainable.


Don't know if this is the case with United but I experienced a few companies where the first round of recruiting defense was in India - so it's possible someone got your app in the middle of their workday.


I wonder if it's possible to get the job applicants to evaluate each other?

That way you have a process that automatically scales to the number of applicants you have.

Exactly how to structure it so it can't be gamed/subverted is tricky, but you might imagine some team activity where the team does a task together, and then each person in the team is asked to do some kind of review of the other people in the team.

If you make the team activity be a small nugget of the actual work you need done, and you pay all the interviewees generously for their time, I can't imagine interviewees walking away unhappy at the end, even if they don't get the job.

You then interview the top 2 people from each team.


> Exactly how to structure it so it can't be gamed/subverted is tricky

Most companies pay a commission to the recruiter or a referral bonus. You could distribute that to those candidates who contribute to the recommendation of the winning candidate. You're incentivized to recommend the best person, because you can still make money even if you don't get the job.

Also you could always hire for a batch of positions, or at least more than one person at a time. That way, just because someone else gets the job doesn't preclude you from also getting it.


>> I wonder if it's possible to get the job applicants to evaluate each other?

I can't tell if this is the stupidest idea or brilliant.

On one hand you're saying "take a bunch of people we don't know are qualified to work here and empower them to make our hiring decisions" which sounds dumb.

But on the other hand, if people chosen through this process do well, you should go back and hire those who "hired" them as managers :)


Any time people are asked to review their peers, there's at least some compulsion to rate others poorly because it'll make you look better.

IME, the only time it can go well is with people who don't think the results will matter. In which case, why do it?

The one exception is someone who is already somewhat of a leader and isn't looking for a promotion, and so they don't have much desire to make themselves look better.


Yes, those AI systems surely create garbage results right now, and worse, once they are trained perfectly, they will provide perfect cover to codify existing hiring biases:

1. Take existing employee base, hired through biased process.

2. Use resumes to train multi-million parameter machine learning model.

3. The system will replicate those biases, while the process stays completely opaque to review, even by the people who created it (a toy sketch of this follows below).
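
As a toy sketch of that loop (invented data, not any vendor's actual pipeline), a classifier fit to past hire/reject decisions will happily learn whatever proxy those decisions encoded, e.g. the university name rather than the skills:

    # Toy sketch only: a model trained on past (possibly biased) hiring
    # decisions learns the proxies behind those decisions. All data invented.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    resumes = [
        "BSc Computer Science, University of Bucharest, Java, Spring",
        "BSc Computer Science, University of Lisbon, Java, Spring",
        "MSc Software Engineering, Oxford, Python, Django",
        "MSc Software Engineering, University of Porto, Python, Django",
    ]
    hired = [1, 0, 1, 0]  # historical human decisions the model must reproduce

    vec = TfidfVectorizer()
    model = LogisticRegression().fit(vec.fit_transform(resumes), hired)

    # Skills are identical within each pair, so the only signal left is the
    # university name; a new Porto graduate is scored down for it.
    print(model.predict_proba(vec.transform(
        ["BSc Computer Science, University of Porto, Java, Spring"])))

Nothing in the fitted model can be pointed at as "the discriminatory rule"; it's just weights over tokens, which is exactly the opaque-to-review problem.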

But then again, existing hiring practices are just as bad, e.g. hiring European engineers only if they graduated from Oxford, Cambridge, or the University of Bucharest: https://twitter.com/shaft/status/1355696154990628864

Or the still widespread practice in Switzerland of demanding handwritten job applications for executive positions, in order to have the applications evaluated by graphologists…


This has to be a prank somehow.

The list of North American universities makes no sense. Some co-op technical college I've never heard of is listed alongside Stanford/MIT.


You can pay about $250 for someone to optimize your resume with certain keywords, punctuation, and formatting choices to make it through these systems. I did this and it worked. I was going for entry-level finance roles in tech and finance. I went to a top-40 finance program, interned at Morgan Stanley, a hedge fund, and a respectable incubator, but had a hard time getting replies to my job applications. I used all the help and resources from my university's career center and followed everything on the infamous "Mergers and Inquisitions" website, which is supposed to be the holy grail of career guidance and resume formatting for finance majors. I had my resume looked at by everyone I could get critiques and edits from. I wasn't getting answers from my applications. After hiring someone to change the formatting, I got automated replies saying that they liked my resume and would reach out to schedule a phone screening. Same person, same level of "professionalism" on the resume, same experience, just different keywords and seemingly arbitrary formatting choices to please the AI overlords.

I found 5 people claiming to optimize resumes for algorithms for the sorts of jobs I wanted. They all had high reviews from customers. I then asked them all questions through email and phone calls to see if they actually knew what they were talking about. I picked the one who appeared the best. Not only did she have some personal proximity to the space, but she seemed to know all of the nuances of how to please the resume algos. It was the best $250 I ever spent.

Point is, people are cracking these processes. It feels arbitrary, but that's how the game works now if you're a job seeker. Whether it be something like a resume screener or something of this level that claims to be able to understand your personality in 25 minutes, you can crack the system. If you're hiring, I can't imagine how confusing this whole mess is on top of your already busy work day, but it's easy to manipulate these screening software products to make them think whatever you want. I've met many sketchy software salespeople who just schlep useless enterprise products to businesses. I wonder how much of the enterprise SaaS world is made up of this sort of thing.


I did something similar, and for me it was also definitely worth it. I collected data over a full year of applications, and in addition to the updated CV (from fecak here on HN), I also started to contact recruiters directly on LinkedIn. For the same number of applications (before and after the new CV), I had 11 times more interviews than before.


I'm applying to stuff right now, mind emailing me this person's information?


I'd also really appreciate this person's info! shawndebray@gmail.com


Did you end up getting an email?


Yeah, it sucks that computers do this, but it's important to remember people can be equally cruel. Big tech companies who need talent arbitrarily accept/reject candidates based on resume buzzwords and whether they can leetcode. But the long tail of small companies, often run by idiots or frat bros of some kind, will reject you for reasons stupider still - don't like your hair, not hot enough, can only drink 2 beers?, only knows Java LMAO.

In other industries, psychometrics are common and even effective. I remember one dull self-help author talking about how insurance companies screen for optimism because, lol, you're going to need it to consistently make 100 calls a day knowing only 5 people will buy. Meaning forget merit: EVEN IF you make it past the randomness filter, your interview and job performance are determined by largely immovable parts of your personality. Don't forget, the corporate world outside of the rarefied space of tech is an absolute shit show.

Whenever I feel like ranting about how shitty AI is, I remember that the majority of humans suck too. It gives me a warm, cozy feeling, like I'm a turkey the day before Thanksgiving.


"AI can help evaluate all those candidates in a very consistent way," he says."

Which, if applied across an industry, means someone will be consistently rejected; whereas now they might have a chance of finding a position somewhere, as the intellects in each hiring process are different.


There's no way in hell I'd do an AI-based interview. It speaks a lot about the way a company treats its employees if they can't even be bothered to speak to you until the second round.


It is a truth universally acknowledged that you are more likely to be hired on if your parents are friends with the owner of the company. As for a video interview, I would totally clam up. If you do want your application to make it to the top of the pile, then what you do is this: fill the page with white-on-white very tiny words that are designed to hit the AI buzzword detector. You can extract these from the company "Mission Statement" :s

https://dilbert.com/strip/1996-09-01


When I took one of these things for a temping company a few years back and it was asking me about basic Excel things, I took a look at the network traffic in the browser. I could see it was just sending something like "question 2, pass yes, response time 10s" in JSON. So I fed it through PortSwigger's Burp proxy and modified the responses as they went. I still think I was demonstrating basic IT skills.
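
For what it's worth, that kind of client-side grading is trivially tamperable with any intercepting proxy. A hypothetical mitmproxy-style sketch (the URL fragment and field names are guesses, since the actual payload isn't shown above):

    # Hypothetical sketch: if the browser reports its own pass/fail in JSON,
    # an intercepting proxy can rewrite it in flight. Field names invented.
    import json
    from mitmproxy import http

    def request(flow: http.HTTPFlow) -> None:
        if "assessment" not in flow.request.pretty_url:
            return
        try:
            body = json.loads(flow.request.get_text())
        except (TypeError, ValueError):
            return
        if "pass" in body:
            body["pass"] = True            # grade was decided client-side...
            body["response_time_s"] = 5    # ...so the client can claim anything
            flow.request.set_text(json.dumps(body))

Run with `mitmproxy -s tamper.py` and point the browser at the proxy; the server never sees the real answers, only whatever the client chooses to report.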


I used to work as a programmer in this field. I'm torn: on one side, the "science" behind it is based on statistics and is supposed to be validated, and your view depends largely on whether you trust the validity of psychometrics. On the other side, the idea that someone makes a test, wraps some statistics around it, and declares that it's valid and measures what it's supposed to can seem a little far-fetched.


It doesn't seem like science, or it's selectively chosen science.

If we must do these tests, I should be able to take one or a few of them, keep that as a record, and not have to do them again. But instead, I have to repeat the process for every company that gives them (recently Berkshire Hathaway, and it had to be done before anyone would even speak about the position). These things rarely test for job-specific skills and are supposedly generalized intelligence tests, so the skills should be widely applicable, and therefore I shouldn't have to repeat this often. Hell, I do less for a driver's license renewal, and 2 tons of metal at 60 mph is more lethal than most jobs behind these tests. Typically they've been math, reading comprehension, word association, memory, personality, and abstract/spatial thinking, which, frankly, was already covered through high school and college. If the goal was finding signals for successful applicants, they already had those signals.

To me, that says they're less interested in finding out how smart you are, and it's really more of a filtering tool where science is being used to give it more credibility than it deserves.

And of course, referring to the article, because it's AI, there's a gray area of accountability that I don't think the legal process has caught up with yet.


I would guess that if you actually dig into it, the reason each company that uses these tests runs them separately, rather than accepting previous results, is a legal one.


I work with a lot of psychological testing, and IMO it's absurd to expect these kinds of tests to describe individuals. They're fine for creating norms and viewing statistical deviations, but there's a reason why few things are diagnosed solely on testing alone, even in the ideal case: a gold-standard diagnostic test, obvious symptoms, and hundreds of thousands of previous cases to look at. And these hiring tests likely don't get anywhere close to any of those ideals.


How can you "validate" it? Are companies making all their staff take these tests and then measuring performance? But the people in your company isn't RCT. Are companies basing this on research that is done on random people? Okay, but skills required on most jobs aren't correlated to some "general" intelligence. How do you deal with the fact that IQ appears to be correlated to income? How you deal with the fact that IQ doesn't correlate to success? Seriously, I don't understand how this makes sense to anyone who has studied stats at all (this is taught to 16 year olds where I am).

This stuff is totally out-of-this-world crazy. I am reasonably intelligent; I did an IQ test when I was 12 and it was 130... and I am extremely bad at relatively basic elements of life and work. I am good at other things, but there is no way this can all be measured or even quantified. What's more, the main thing determining my ability is still experience, and my skills change significantly over the course of a few years. This isn't how statistics or science is really supposed to work. Slightly obviously, and contrary to what people think, this entrenches inequality.

I understand why companies do this:

* Hiring is hard. It is hard to get people who are actually good at hiring; there are no courses for this at university, and there are no real rules outside of experience. Ironically, companies get over this by doubling down on their weaknesses.

* Underemployment is massive. It used to be hard to apply for a job; now everyone just shotguns applications out, and no one has a job they want. Employers can do whatever they like: I am in the UK, and even for menial jobs like working in a supermarket, they are running weird, totally unscientific tests.

* Everyone is terrified of hiring. Employment lawyers get richer, laws get more complex, and the risk of being vilified in the press for an interview gone bad is massive. Ironically, these totally "objective" processes just solidify inequality. They are designed within a context; there is no totally abstract notion of intelligence relevant to the workplace, so they produce a stratification of intelligence that likely does not translate to any real-world scenario. These approaches cause the very thing they are trying to prevent, because that thing has become so terrifying for companies.

Just my 2c, but this stuff is, imo, one of the worst modern corporate behaviours. Whether you think it is right or wrong morally, the outcomes are not good... the solution to bad hiring processes isn't pseudoscience. It is hiring people who can do their job (again, these are linked; I have noticed that some companies are in a death spiral of bad processes hiring bad people, while a company that does it well just exists totally outwith this cycle).


I know a number of people who have paid for someone to do the personality tests for them, but with video it's gotten harder. It's also gotten easier for companies to reject based on tacit bias such as gender, race, and any number of other factors. There don't seem to be any safeguards against it: we didn't do it, the software did it.


>It’s also gotten easier for companies to reject for tacit bias such as gender, race, and any other number of factors

In some European countries like Germany, Spain, and Austria it's not even tacit, as it's traditional to have your photo and birthday on your resume, especially at more traditional companies. It has gone so far as to generate scandals in Austria and Germany, since at some companies being black or having a Slavic/Turkish/Arabic name got your CV rejected by default, even when the CV fit the requirements.

But it's ok, since they write in the footer of their career page that they don't discriminate on such things. /s

Edit: There's even some German satire about the racism job applicants face over having photos in resumes:

https://www.youtube.com/watch?v=ih5k7g8vUmE


There's an elephant in the room: hiring is by nature biased.

I have yet to witness a hiring process that isn't biased. As an anecdote, in a past small-business job, a colleague (who also wore the hiring manager hat) used to choose people (always of the opposite sex) by photo, and even had the literal approval of the company owner to do so. To corroborate this, I remember two people "hired by photo" commenting that they had never had any trouble getting job interviews. It may indicate that "hiring by photo" is rather common.

We like to forget that most jobs don't leave room for 10x'ers. Choosing "the best" is normally just a form of "early optimization". My take on this (but I might be biased) is to trial most candidates and only hire those who add enough value.


Like I said below, you cannot fully outlaw and police the tribalism, human bias, and cronyism that has been evolutionarily ingrained in us as a species since the dawn of man.


It is also the practice in some Asian countries. I applied for a job at a Japanese university and they wanted a photo.


Yeah, those footers are just for the "more equal" crowd. Anyone I know who actually wants to discriminate still gets away with it. Those who don't, don't. But at least now the average diversity idealist is appeased.


[flagged]


AFAIK, photos and birthdays are not allowed in the US to prevent such discrimination, so how would they do it there?


They're referring to diversity programs as discrimination.


I am surprised that Germany hasn't been taken to the European Court of Human Rights over this abuse.

Maybe those German privacy activists going after Google et al. might want to look at this.


Because in Germany/Europe discrimination is officially illegal, so it is assumed nobody does it, even though people know better, and because you cannot fully outlaw and police the tribalism, human bias, and cronyism that has been evolutionarily ingrained in us as a species since the dawn of man.

Putting more legislative hoops in place means HR will just need to find more creative ways to formulate job requirements in order to exclude certain classes of people, or more creative ways of justifying not taking your application further, but it ultimately won't solve the core issue.


There's also the Arbeitszeugnis thing (a summary of how they see you as an employee, given when you leave the company). If your boss doesn't like you, they can fuck up your future a bit by giving you a bad Zeugnis (future employers typically want to see those).

It's effectively illegal to complain about employees in a Zeugnis, but they can reorder words and skip certain phrases to imply that you're a bad employee. If you're not German, you might even think you got a good Zeugnis because it looks fine on the surface, but there's an entire code you need to learn to read them.


Ah the "Forget it, Jake. It's Chinatown" defence.

Or maybe you need some high-profile sackings of HR directors at a few big German companies, à la GitHub.


Sackings? On what grounds? Proving there's any kind of malice or bias in rejected applications is nearly impossible, as the main job of HR is to protect the company from any hiring bias leaking out. That's why your rejection email is usually some generic copy-paste message, and why they usually refuse to give you any feedback upon request.


Except they straight up ask you for your racial ethnicity, disabilities, and gender.


I interviewed for my first employee a few weeks ago. I just opted for an audio call. I didn't want to judge candidates based on their coronavirus nest, coronavirus haircut, or anything that didn't influence their performance in a strictly remote job. I also didn't want to dress up.

That being said, German culture dictates that resumes have pictures, so it's sort of moot.


This reminds me of when I was working for an EdTech startup nearly a decade ago where the CEO invented his own “Mini Myers Briggs” test, and we developed a “feature” to reject any teachers applying who came back INTJ after filling out that stupid test. The CEO adamantly believed INTJs didn’t make good teachers.


This reminds me of a time in the mid-2000s when I was looking for a job. I had a friend who was a manager at EA. He was trying to get me a job as a game tester, but no luck. I had to apply online through their system, but they couldn't find my application.

After days of applying and not appearing in the system, someone told him that it was possible my application was being rejected. They found my name in the rejected pile. I don't know what exactly in my application was turning me into a red flag, or if he didn't really want to hire me, but unless the system accepted me, I was not going to get hired.

I was never hired.


The obvious connection that we'll see in the news soon is how companies will combine this with all the digital tracking and profiling being done. Every message somebody sends to a teammate in an online game or a Discord conversation will be a factor in a job application result 5-10 years later.


I’ve worked in the HR tech space and we integrated with all of the big applicant tracking systems.

Many of them have screening questions. So you can ask something like "Can you lift 50 lbs?" and set it up so that answering no will screen the candidate out. But even in that case, recruiters and hiring managers will usually still look at the candidate anyway.

There are also 3rd party addons that will score candidates, and resume parsers that will let recruiters search resumes for keywords.
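
For illustration, a minimal sketch of those two mechanisms, knockout questions plus keyword scoring; the field names and keywords are invented and it mirrors no particular ATS:

    # Minimal sketch of knockout screening questions plus keyword scoring.
    # Invented field names; not any particular vendor's behaviour.
    KNOCKOUT = {"can_lift_50_lbs": True}            # required answers
    KEYWORDS = ["forklift", "inventory", "safety"]  # scored, not required

    def screen(resume_text: str, answers: dict) -> tuple[bool, int]:
        passes = all(answers.get(q) == v for q, v in KNOCKOUT.items())
        score = sum(kw in resume_text.lower() for kw in KEYWORDS)
        return passes, score   # recruiters typically still see the candidate

    print(screen("Forklift certified, 5 years inventory management",
                 {"can_lift_50_lbs": False}))   # -> (False, 2)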

But as far as I know there are no widely used AI solutions that will actually flat out reject candidates with no human intervention.

Even if it worked, the risk is too great that your model will start over-rejecting people from protected classes and no one will notice until you're hit with a lawsuit.


Most of the AI I've seen is really just a database query dressed up with a HAL-9000 eyeball camera or something. A good example is a license expiration date: the system says the license expired four years ago, but the problem is that candidate data is largely parsed from randomly formatted resume files, so it's not that accurate, and as a result the college graduation date ended up in the license field.
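
A toy sketch of that failure mode (the resume text and field names are entirely invented): a parser that grabs the first date-looking thing, without anchoring on the right section, puts a graduation year into the licence-expiry field, and the downstream rule then "reasons" about garbage.

    # Toy sketch of naive resume parsing: the first year found ends up in
    # whatever field is being filled, so graduation dates masquerade as
    # licence expiry dates. Resume text and field names invented.
    import re

    resume = """
    B.S. Computer Science, State University, 2017
    Professional Engineer licence, expires 2029
    """

    def first_year(text):
        m = re.search(r"\b(?:19|20)\d{2}\b", text)
        return m.group(0) if m else None

    parsed = {"license_expiration": first_year(resume)}
    print(parsed)  # {'license_expiration': '2017'} -- "expired four years ago"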


While there is a natural "that's not fair!" reaction to this, and also I think that a lot of these companies are selling snake oil, I also have to wonder if it can be much worse than the human screening most applicants go through for these kinds of jobs.


A lot of this is probably mostly BS-y stuff like personality tests and so forth. But if there's a stack of 100 resumes for some job, there's going to be a lot of arbitrariness in any heuristics used to cull the pile down to a reasonable size, even once "obviously" unqualified candidates are filtered out (and even that filtering can be a bit arbitrary).


This reminds me of the NFL combine, but for jobs. There are plenty of examples of players that do well in the combine but fail in the NFL. Your test just selects for individuals who are good at testing and not necessarily good at their potential job.


> "Everyone wants the right job, and to hire the right person. It doesn't benefit anyone for the match to be off. Trying to use these AI systems in smart ways is to everyone's advantage."

I mean, that's flagrantly untrue. There are plenty of, I hesitate to use the term "bad actors", but people who will lie and embellish their resume, tailor their answers to these personality quizzes in an optimal way, and otherwise game the system to maximize their payoff - which is to get hired.


I'm not certain you can quantify the qualitative - it's too much. How would you properly set up what you are looking for concerning the specifics of your company's culture? It would be great to have follow-ups - like, did this person who passed all these tests and got hired really turn out to be amazing? How do they factor it in if the hire gets another job offer and leaves shortly afterwards? When humans can't even read humans, how are they going to program something that will?


The value provided by this "AI" solution is that it diffuses accountability for a decision away from any individual making a hiring decision. Would it be ethical if it were used to operate a food rationing system, or maybe a vaccine rationing system, and at what point does the level of plenty make it acceptable? I could convolve or convolute any scheme to make the attribute I was selecting for seem "random" and then refuse to acknowledge the views of anyone who couldn't explain gradient descent as well. There may be a general principle here where if you are mediated by machines, you cease to be a person and just represent a sample in a set of samples, and hiring processes like this indicate an economically inferior-good kind of employer that people select away from as soon as they can afford to.


What responsibility is there to diffuse? There is no obligation to form a relationship. Internal ass-covering maybe? But really, that would just shift to whoever decided on the AI system - if they had a system that just rolled a die and picked a resume by index, their bosses would not be pleased.

There is clichéd angst about being mediated by machines, but that angst has existed since bureaucracies, where the parts were humans. Sadly, bureaucracies still qualify as an improvement over literal fiefdoms.


I don't think this is bad at all. If there are so many job applications that humans can't scale to process all of them, it's probably a big firm with a reputation, and those companies are spammed with applications from candidates of all levels. It is just a spam detector with a job-application flavor.


I read another post about talent. If you use AI in your process and try to industrialize your hiring to make it as cheap as possible, I think you will be either very, very good at finding and fostering talent... or very, very bad.


Automated reviews like this are probably not legal in the EU, according to GDPR Article 22:

"The data subject shall have the right not to be subject to a decision based solely on automated processing"

https://gdpr-info.eu/art-22-gdpr/


>"solely"

Which presumably means that they can be used as a filter so long as a human is making the final selection.


From Pymetrics' CEO's Linkedin:

Avid technophile who believes audited AI can act like CRISPR: retaining all the wisdom of humans while stripping out ineradicable bias.

lol wut


Disappointing that a BBC piece does not cover any UK-based business in this sector, like Arctic Shores (https://www.arcticshores.com/). The journalist lives in NYC but these days that shouldn't really matter...


I find the concept of not being allowed to hire (or not hire) whoever you want horrifying.

My life goal is to get my kids to a level where they can afford to refuse to take demeaning tests. But if companies want to employ them, they should be allowed to do so.


Are resumes valid anymore?


Yes, just not everywhere. We use them to filter our candidates.


I assume that not all roles go through this process. If you are hiring for a sensitive position (CEO, other C-suite, any type of Compliance and Internal Audit), the process is different.

I will also assume that the "AI" does only the first clean-up: basically moving applications from the 'pile on the floor' to the 'table', picking the 'top 100' for a role.

Fun story: when I was working for a US tech company some years back, we were looking for IT auditors. Someone saw the word "audit" in an ad and applied. The guy was a "night auditor" (hotels). He didn't bother reading the ad; he just applied. HR didn't review his CV at all; they just set up the meeting. We laughed, apologized, got him a coffee, exchanged funny/horror stories, and he went his merry way.

An AI system would have picked that up in 1ms. A lazy/overwhelmed HR employee did not.


Maybe they should use more AI to hire CEOs and upper management from a very large pool. We might get positively surprised.

(Or not, in which case I'd be surprised.)


Even just a random process might give better results - sometimes.

From THHGTTG:

> To summarize: it is a well-known fact that those people who most want to rule people are, ipso facto, those least suited to do it. To summarize the summary: anyone who is capable of getting themselves made President should on no account be allowed to do the job.



