
New app scans your face and tells companies whether you’re worth hiring - Futurebot
https://www.theladders.com/p/26101/ai-screen-candidates-hirevue
======
adityab
Some comments are focusing on how this exacerbates aggregate phenomena like
diversity stats and wage gaps. That is true and a huge negative.

But that's not the only thing we should be worried about. Far more damaging
would be the dystopia that these "cognitive surveillance" products will bring
upon us.

It claims to micro-analyze facial expressions, intonation, and non-verbal
signals while the candidate is interviewing. This is a vile, hostile
interaction in my opinion, and it is startling to see that companies like
Vodafone, Intel, and Oracle are their customers [1].

At best, this is a way to sweep unaccountable decision-making under the carpet
of "the software said so". More likely, though, such products will make
society a living hell for everyone until these practices are entrenched in the
industry and it is too late to roll them back.

The creators of these products are not stupid; they made a conscious choice to
grab low-hanging fruit (now that they have the technology available) and
enrich themselves while making the world a worse place. Let's not kid
ourselves that the consequences of their actions did not occur to them.

As a research student in DL/AI, I realize this may make things marginally
worse for my career, but right here is the reason we should regulate AI usage
_now_: not Skynet, but these attempts to "disrupt" social norms for no stated
reason other than "progress" and "efficiency". We should keep some
technologies out of the public sphere and make it vocally clear that they are
unacceptable, lest we end up with a world where everyone is wearing Google
Glass and you have no way to maintain a personal facade because some
combination of blood flow and facial-muscle twitches betrays your thoughts.

Your argument against this stuff _should not only be_ that it has bug X or
that its execution isn't sound due to bias Y. We must oppose such things on
principle; I would rather kill myself than live in some sort of Black
Mirror-esque dystopia where this is mainstream.

[1]: [https://www.hirevue.com/customers](https://www.hirevue.com/customers)

~~~
tzs
> It claims to micro-analyze facial expressions, intonation, non-verbal
> signals while the candidate is interviewing. This is vile, hostile
> interaction in my opinion.

Is your objection to the use of those things in general, or just to the use of
an AI instead of a human to evaluate them?

I'm curious because these things are, I think, just the components that go
into demeanor, and humans routinely use demeanor to judge how much trust to
put into what someone else is saying.

In fact, one of the main reasons that witnesses in a criminal trial in the US
testify in person in front of the jury is so that the jury can see their
demeanor and use it to judge credibility.

I'm pretty sure almost every human interviewer uses demeanor evidence, albeit
not consciously. Short of eliminating face to face interviews I doubt that
there is a way to stop such judging because it is almost completely
unconscious.

~~~
seangrant
My understanding is that their objection is to having a black box AI that can
make or break a hire. What's to stop them from not hiring anyone because "the
box said no"? What's to stop the hiring process from being entirely composed
of the black box, with nobody knowing what it takes to beat? Sure you have the
skills and experience, but unfortunately your facial pattern makes you seem
impersonal. Don't worry, they'll keep you on file just in case.

~~~
brador
Wouldn't the free market solve for this? I mean, if the AI system leads to
worse hires, the companies hiring on its signals will suffer in the market
and ultimately lose out to a superior competitor who picked up hot superstar
rejects with facial tics on the cheap, right?

------
pella
Related:

"The era of blind faith in big data must end" | TED Talk |

[https://www.ted.com/talks/cathy_o_neil_the_era_of_blind_faith_in_big_data_must_end](https://www.ted.com/talks/cathy_o_neil_the_era_of_blind_faith_in_big_data_must_end)

_"Algorithms decide who gets a loan, who gets a job interview, who gets
insurance and much more -- but they don't automatically make things fair.
Mathematician and data scientist Cathy O'Neil coined a term for algorithms
that are secret, important and harmful: 'weapons of math destruction.' Learn
more about the hidden agendas behind the formulas."_

...

_"Algorithms are everywhere. They sort and separate the winners from the
losers. The winners get the job or a good credit card offer. The losers don't
even get an interview or they pay more for insurance. We're being scored with
secret formulas that we don't understand that often don't have systems of
appeal. That begs the question: What if the algorithms are wrong?"_

~~~
icelancer
These same algorithms run in our brains with far more flawed edge cases than
big data. Not that I am defending blind use of "algorithms," but that word is
a boogeyman at best.

~~~
sidlls
No, "these same algorithms" don't "run in our brains". Our brains produce
thoughts in ways we don't even begin to understand. We have conjecture and
some observations (e.g. electrical signal monitoring) that only scratch the
surface of describing what the end result of a "thought" looks like.

The comparison of human thought, and human intellect in general, with poorly
understood applied computational statistics has really gone too far.

Now, if you mean to say that humans themselves have biases and "wrong"
behavior, that's a different matter.

------
electic
This will invariably identify and rate characteristics like race. It will
judge your vocabulary and detect whether you are a native speaker or an
immigrant. It will probably also infer your upbringing and class from your
language and grammar.

Let the lawsuits begin.

~~~
blunte
Reminds me of this problem:
[https://gizmodo.com/5431190/hp-face-tracking-webcams-dont-recognize-black-people](https://gizmodo.com/5431190/hp-face-tracking-webcams-dont-recognize-black-people)

There are a thousand reasons why this is probably a garbage system. And
chances are, it's just another AI-hype-driven thing to sell to ignorant
companies/"decision-makers" who should be out of business already.

------
justinjlynn
Holy shit, what HR department would _ever_ approve such a thing?! Photographs
on resumes are discouraged - can you imagine the risk involved in asking for a
biometric scan and doing feature analysis on video interviews? How would this
interact with GINA and polygraph laws? It raises far, far too many questions -
unless the app makers are providing indemnity, I can't see any reasonably
competent hiring professional openly adopting such a risky tool.

~~~
Myrmornis
According to the article,

> Goldman Sachs, Under Armour, Unilever, and Vodafone are also among the
> companies that have used the platform.

~~~
snissn
According to the article they've used the HireVue platform, but not
necessarily this new (and weird) feature

~~~
justinjlynn
Lawsuits flying and discovery/settlement processes commencing in 3.. 2.. 1..

------
rafiki6
Is there any actual, sound, peer-reviewed research indicating that the
results of their predictive system will lead to more great hires? Because it
didn't seem so in the article. The article made it seem like HireVue is
working off a false causal assumption: the best hires performed like this in
their interviews, therefore we should compare all new candidates to them,
without any shred of evidence that this type of interview performance
indicates a good hire. I.e., snake oil...

~~~
blunte
Since when have science and reason driven management decisions? 20 years ago
the silver bullet was OOP. 10 years ago the silver bullet was offshoring. Now
it's AI and/or blockchain.

------
kotrunga
This leads to some pretty scary questions. What about different cultures?
Everyone has different ways of speaking, acting, living. What if the app
considers something bad behavior, and it's just a cultural difference? Or it's
how someone was raised?

Maybe the biggest question we should ask is what does this solve? What problem
is this trying to solve? And once we figure that out, ask ourselves... is that
the problem we should be solving? Is it the best way to solve it?

------
vbuwivbiu
can we rename machine learning to "Prejudice Amplifier" now ?

~~~
jschwartzi
I will certainly start calling it that informally.

------
amb23
There's no way this software is not inherently biased on very basic measures:
a lisp caused by a disability probably wouldn't pass muster, nor would a
Southern accent. But the real tragedy here is that it equates the way someone
presents themselves -- not the content of their words or the strength of their
skills -- with job performance. It exacerbates the worst vices of HR, hiring
practices, and performance reviews. It's an algorithm for superficiality.

This company should remove this product from the market immediately. How this
was passed off as a good idea all the way through the product development
cycle is beyond me.

~~~
notyourwork
> This company should remove this product from the market immediately. How
> this was passed off as a good idea all the way through the product
> development cycle is beyond me.

It sounds like you are assuming that a business is supposed to act the way a
single human with ethics and a moral compass should. I agree with your
conclusion, but to ponder how this was passed off as a good idea is silly. If
I came up with software to sell to businesses that replaced a huge amount of
their recruiting cost (people, time, operational overhead), wouldn't they be
inclined to do business with me?

Consider a global company: when you look at hiring at scale, it becomes easy
to justify streamlining the whole process.

I still think it's a bad direction to go, but what do you expect?

------
rayiner
Incredible that they can now get phrenological measurements from a video
capture.

~~~
bostik
Hell, if this system starts spreading[0], _retrophrenology_ might actually
become a thing in job interview prep.

0: Think of the plague

------
braxxox
Fuck this. A whole company whose business model is to increase the current
diversity gap in a field.

~~~
blunte
This would, if allowed to run for generations, play out like incest - a dead
end.

Fortunately for US-based companies, the quarterly earnings-per-share cycle
creates so many decision changes within a year that a system like this
wouldn't survive more than two years max. And that's about the tenure of the
average employee, so it will wash out. Meanwhile, a boss will get a bonus for
implementing it, and the vendor will make some cash. All is well, right?

------
diimdeep
Great plot for Black Mirror episode.

~~~
andrei_says_
May I recommend rewatching the Waldo episode, especially if you live in the
US?

------
colbyh
To put skepticism aside for a second - I could see this working (maybe) in a
field where face-to-face communication is incredibly important (e.g. sales).
But the vast majority of roles in a modern company don't need people who are
good at communicating verbally, let alone in front of a camera.

Can you imagine putting a scientist through this process? A janitor? No way.

~~~
hardlianotion
I think it's hard to avoid a lawsuit. An interview process that uses visual
input to help it decide who to hire? It almost doesn't matter what else it
considers - you'll have difficulty explaining to folk how it is not
discriminating in some nefarious way.

------
bogomipz
Accessing this page results in an instant pop up box that says "Thanks for
reading, like us on Facebook", even before you've had a chance to read a word
in the article. What a fail. Won't read, pass.

~~~
hardlianotion
popup blocking does still work, after all.

~~~
bogomipz
Really? How do you block an HTML5 element?

------
Brakenshire
Looking forward to 3D mapping of the skull, it will surely open up a utopia of
completely unobjectionable, final judgment of character.

~~~
Broken_Hippo
Yay, phrenology for the digital age...

[https://en.wikipedia.org/wiki/Phrenology](https://en.wikipedia.org/wiki/Phrenology)

------
Rjevski
I'm curious as to how this judges the skillset of a particular candidate. To
me the only skill this would be able to judge is confidence, which also makes
it very easy to trick - simply believe in your bullshit and you'll get
through, no matter whether you actually have the skills.

~~~
OtterCoder
It says right in the article. You train it on your 'top performers' and then
use it to increase groupthink and tribalism in your company.
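
To make the mechanism concrete, here is a toy sketch (all data invented, and
not HireVue's actual method) of why training any hiring model on your current
"top performers" tends to reproduce their makeup rather than measure merit: if
the incumbents happen to share some irrelevant trait, the model learns that
trait as a proxy for "good hire".

```python
from collections import Counter

# Hypothetical training data: (trait, outcome) pairs. Suppose, for purely
# historical reasons, 9 of the 10 incumbent "top performers" share the
# irrelevant trait "A", while most past rejects had trait "B".
training = ([("A", "hire")] * 9 + [("B", "hire")] * 1
            + [("A", "reject")] * 1 + [("B", "reject")] * 9)

def fit(rows):
    """Majority-vote 'model': for each trait, predict its most common label."""
    votes = {}
    for trait, label in rows:
        votes.setdefault(trait, Counter())[label] += 1
    return {trait: counts.most_common(1)[0][0] for trait, counts in votes.items()}

model = fit(training)

# Two equally skilled candidates who differ only in the irrelevant trait
# now receive opposite decisions: the model has encoded the incumbents'
# makeup, not their competence.
print(model["A"], model["B"])  # hire reject
```

Any real classifier trained on richer features (facial movement, intonation)
has the same failure mode; the extra features just make the learned proxy
harder to spot.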

~~~
blunte
Well, according to US political polls, 35% of the population is very happy
with groupthink and tribalism despite any arguments. So there obviously is
some room in business for a product like this.

~~~
OtterCoder
35%? I think you underestimate the bad blood and ill faith in the US political
system. Both major parties are mere parodies of genuine debate. I will concede
that 35%, being Trump's approval rating, I assume, is an easy and valid lower
bound.

------
noir-york
A modern take on anthropometry and craniometry. Because we know the recent
history of those.

------
zebraflask
What are they really testing? A candidate's willingness to go through a
trendy interview format (which comes across as the HR version of a video
dating profile) and be chipper about it? I mean, I hate video conferencing in
general; I'd probably fail just because I wouldn't like the format, and it
would show.

This kind of thing just provokes my bias towards only allowing recruiters, HR,
etc., to handle processing paperwork, like clerks, and leaving the real
decisions to the departments that have to work with the people who get hired.
Something like this would be laughed out of the building where I work.

------
trapperkeeper74
Let's assume there are some people who have many years of hiring people
effectively. Who among them is going to trust a computer at this point in
time? Also, what potential candidate is not going to be immediately insulted,
or lose respect for a company so lazy it can't even be bothered to do one of
its most critical tasks without outsourcing it? Sounds like another
"something for nothing" panacea service.

PS: Just imagine all the shallow correlation biases a deep learning net will
discover. People with red hair? Nope. People with insufficient bilateral
symmetry? Nope.

------
FullMtlAlcoholc
Is this a joke or a modern incarnation of phrenology?

As society moves forward, we must not assume that just because a tool is
based on AI, it is flawless or unbiased.

Judging someone's competency based upon their face and its expressions is not
much more accurate than a witch test, even before considering differing
cultural norms and behaviors. Will American firms discriminate against
Russians because they don't smile as much?

------
rainbowmverse
I'm sure this won't instantly fail anyone with gender dysphoria that kicks
into overdrive at seeing their own face. /s

------
gchapiewski
"Judging a book by the cover" taken to the extreme? And I'm sure this leads to
great diversity in the workplace too :)

------
blunte
Another sensational (and misleading) title...

While there can be plenty of debate over the long-term value of an AI-based
filter, at least the filter here includes much more than just "your face".

Honestly, I only clicked to see if they were really just viewing a still
image of your face and trying to determine if you would be a good hire. I
knew better.

------
deusum
*Persons with facial deformity need not apply.

------
jorgemf
I always thought that if a person's skills are highly shaped by their genes,
that would be reflected in body traits. So basically you could get a good
idea of the mental skills and personality of anyone just by analyzing the
body and other external features. One of the things that led me to this
theory is how mental illnesses do affect body features.

But even if this theory is right, it only gives you one part of the person,
because the environment and a person's own decisions also shape them. For
example, even if a person is the perfect fit for the job position based on
the analysis, and the analysis is right, that person's success also depends
on attitude. If the person has developed a natural laziness over their life,
they won't be productive at the job. And the opposite can be true: a less
genetically predisposed person who has worked hard all their life could be a
much better match for the job.

EDIT: when I say the skills are encoded in genes and also expressed in body
traits, I don't mean gender or race; it could be irrelevant things like the
length of the fingers, or the way you walk.

~~~
whateverman3
This is only true if the body-trait genes are correlated with success. I
believe such traits do exist, and many of them, but I am not immediately
convinced that most traits interesting to businesses would be revealed this
way.

Also, your example of laziness is a classic case of thinking yourself out of
quality employees: laziness is _good_ for identifying waste and inefficiency
and for building automation that scales. So it's not even clear whether
"laziness", without context, is interesting to businesses.

Of course, it could also indicate a lack of work ethic.

~~~
jorgemf
My point is that there are more factors than biological capability (or
whatever an algorithm says) involved in being good at something.

------
diggernet
"Would you be nervous with an artificial intelligence evaluating your
interview skills?"

Yes, because the prospective employer _should_ be evaluating my job
performance skills instead.

------
transverse
What makes a hire a good hire? Is a good hire one who sucks off management, or
one who does the right thing even if it provokes management ire?

~~~
blunte
As long as the company has excessive cash flow, it's the former. Once the
company must actually work to compete, it's the latter. Maybe.

------
Overtonwindow
We've got discrimination down to a science

------
olleromam91
Who actually thinks this is a good idea?

~~~
blunte
The people funding the project with the fairly realistic expectation that they
will cash out with a good profit one way or another within the next couple of
years.

