
Ask HN: Critique this hiring procedure - Idontknowmyuser
I've been thinking about hiring processes in general and I think some of the problems that they face are a result of collapsing a complex question very hastily into a binary decision.

Therefore I propose a "multi-layered" decision process, where the final step is made automatically. (This is obvious and I know it's probably not new; if your company tried this in the past I will be happy to hear about it.)

First essential rule: "if the candidate has the highest number of points and is willing to accept the offer, s/he gets hired"

Now, unlike other systems which try to prescribe, this system attempts to describe.

We define "a reason" as any true, legal (as in non-discriminatory) statement about the candidate that is precise enough.

Each reason can be assigned a number of points (negative or positive). The interviewer has complete freedom to assign as many points as he wants to any reason but is advised to respect previous jurisprudence and to justify himself in the cases where he goes against it.

Valid reasons can range (just examples) from "s/he got Y GPA in University X" to "s/he didn't say hello at the start of the interview".

Invalid reasons include, for example, "unprofessional attire"; this is invalid because it is vague. Reasons need to be blunt and precise. "S/he wore X and Y" should be valid.

Claim A: this makes the hiring process more traceable and open by providing a paper trail.

Claim B: this fights part of the unconscious bias. By requiring interviewers to think and write down reasons, they can examine their decisions more critically.

Claim C: this helps control for overreaction. Assigning points helps ensure that the effect of each reason is reasonably proportional. I think many would be ashamed to write (-50) for just not saying hello, but I have worked with people petty enough to reject for less.

Char limit; 1/2, to be continued at:
https://news.ycombinator.com/item?id=16105130
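The first essential rule and the per-reason scoring can be sketched in a few lines of Python. This is only an illustration of the proposal's data model; every name here is made up for the example:

```python
from dataclasses import dataclass, field

@dataclass
class Reason:
    """A precise, true, non-discriminatory statement plus its point value."""
    statement: str   # e.g. "Didn't say hello at the start of the interview"
    points: int      # negative or positive, at the interviewer's discretion
    interviewer: str

@dataclass
class Candidate:
    name: str
    reasons: list = field(default_factory=list)
    willing_to_accept: bool = False

    def score(self) -> int:
        # A candidate's total is simply the sum of all reason points.
        return sum(r.points for r in self.reasons)

def decide(candidates):
    """First essential rule: the highest-scoring candidate who is
    willing to accept the offer gets hired automatically."""
    for c in sorted(candidates, key=lambda c: c.score(), reverse=True):
        if c.willing_to_accept:
            return c
    return None  # nobody willing to accept
```

Note that the decision step itself is mechanical; all the human judgment lives in the creation and peer review of the reasons.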
======
JSeymourATL
> some of the problems that they face are a result collapsing a complex
> question very hastily into a binary decision.

I don't see anything particularly attractive or helpful here from a candidate
experience perspective. The objective of hiring is to find people with the right
combination of skills/experience/attitude. Slow, bureaucratic processes
(interviews by committee) tend to turn off good people.

Relative to reducing bias in the process, HBR offers interesting advice >
[https://hbr.org/2017/06/7-practical-ways-to-reduce-bias-in-your-hiring-process](https://hbr.org/2017/06/7-practical-ways-to-reduce-bias-in-your-hiring-process)

------
indescions_2018
Honestly, it's just very difficult to find talented people right now. 99% of
your effort will be spent on recruitment, not on the design of the hiring
process. Which, you are probably right: it's biased, unfair and skewed against
inexperienced candidates. But ultimately, it is an instinctual decision. "Do I
wish to work with this person?"

Of course, to attract the top talent in the first place, you have to build
something they would be proud to be a part of ;)

Good luck and keep iterating!

Also, check out CNBC's The Job Interview. I am sure it will hit close to home:

[https://www.cnbc.com/the-job-interview/](https://www.cnbc.com/the-job-interview/)

~~~
Idontknowmyuser
The idea is that instinct has in itself an unconscious bias. By making the
reasoning part conscious we hope to eliminate some of it.

------
Peroni
Quick caveat before I get stuck into your post: I've been working in the
hiring space for the past ten years with companies of all shapes and sizes.
Your thought process throughout this post is absolutely on point but the big
tl;dr here is that your proposed solution already exists in well established
hiring platforms.

Feedback:

The primary difference between the process you describe and the process that
most companies already follow is assigning points to influencing factors. In
fact, some Applicant Tracking Systems like Greenhouse[1] already do exactly
that.

 _check if the point values are reasonable_ \- Who decides this? If it's a
person who has met the candidate, they are immediately skewed by their own
bias. If they had a positive experience with the candidate, they are going to
struggle to see how negative scores could be considered reasonable.

 _Claim E: this process, if well documented, should be a valid and easy defense
against allegations._ \- I'm afraid I have to argue the opposite here. In the
EU for example, new GDPR regulations[2] mean that this level of documentation
must be made available to the candidate on request, which categorically makes
you more liable to legal challenges.

[1][http://www.greenhouse.io/](http://www.greenhouse.io/)

[2][https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/](https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/)

~~~
Idontknowmyuser
> check if the point values are reasonable

I meant that the number of points is not a major outlier, and if it is, the
interviewer should have written an argument for why it's different from the old
cases. If the other members find the argument convincing, it should be fine.

>some Applicant Tracking Systems like Greenhouse

That's exactly the type of feedback I wanted, thank you very much.

> Claim E

If we are innocent, why wouldn't this be in our favour?

------
Idontknowmyuser
2/2

Step 2: peer review. The highest-scoring candidate is selected, interviewers
exchange notes and are asked to:

* check and discuss whether their colleagues' reasons are valid and reasonable (for example "the hello" reason might be seen as too petty by some, and will be discarded)

* check if the point values are reasonable (checking old similar data to flag outliers is a possibility)

* check if some of the reasons presented by other colleagues apply to their own candidates. For example, if the "hello" reason was not discarded and your candidate didn't say hello, you must apply it to him too.

Interviewers are asked to focus most on the current best candidate's reasons.

If, after a peer-review round, the best candidate changes, another round is
held, or the list is shortened and the remaining candidates are re-interviewed.

If it does not change, the best candidate is offered the position.
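The "checking old similar data to flag outliers" idea can be as simple as a z-score test against the points previously assigned for the same (or a similar) reason. A sketch; the threshold of 2 standard deviations is an arbitrary assumption:

```python
import statistics

def is_outlier(points, history, z_threshold=2.0):
    """Flag a point value as a major outlier relative to the point
    values historically assigned for the same kind of reason."""
    if len(history) < 2:
        return False  # not enough precedent to judge against
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return points != mean  # all precedents identical
    return abs(points - mean) / stdev > z_threshold
```

A flagged value would not be rejected automatically; per the process above, it would just oblige the interviewer to write a justification for the committee to review.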

Claim D: this helps smooth out harsh or lenient interviewer bias.

Claim E: this process, if well documented, should be a valid and easy defense
against allegations.

Data analysis on hiring decisions becomes way more interesting. I'm sure there
are tons of trends you can seek out.

Claim F: this allows earlier detection of discrimination. For example, if you
find that an interviewer or a committee removes points from a certain group for
subjective reasons more than others in the company do, it might be an early
warning sign that a problem needs to be addressed.
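As a sketch of the kind of early-warning query Claim F describes, one could compare each interviewer's mean deduction across groups for subjective reasons. The 2x ratio threshold is an arbitrary assumption, and a flag is only a prompt for human review, not evidence of discrimination:

```python
from collections import defaultdict
from statistics import mean

def flag_skewed_interviewers(records, ratio=2.0):
    """records: (interviewer, group, points) tuples for subjective reasons.
    Flags interviewers whose mean deduction for one group is much larger
    than their mean deduction for the other groups."""
    deductions = defaultdict(lambda: defaultdict(list))
    for interviewer, group, points in records:
        if points < 0:  # only deductions are of interest here
            deductions[interviewer][group].append(-points)
    flags = []
    for interviewer, by_group in deductions.items():
        means = {g: mean(v) for g, v in by_group.items()}
        if len(means) < 2:
            continue  # cannot compare with a single group
        worst = max(means, key=means.get)
        others = [m for g, m in means.items() if g != worst]
        if means[worst] > ratio * mean(others):
            flags.append((interviewer, worst))
    return flags
```

Because every reason carries its own paper trail, a flagged pattern can be traced back to the exact statements that produced it.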

Claim G: this allows for a higher quality debate on controversial issues like
sexism and racism in hiring.

Machine learning can be used to give suggestions to interviewers about the
number of points to give for reasons. (They should be suggestions, not
decisions, to avoid the lost-nuance pitfalls that classic machine-based
decisions might suffer from.)

Example: a fairly easy one is that, after a high enough number of universities
and GPAs has been collected, one can aggregate the point values into scores that
take into account the differences in grading and quality between universities.
This might help decide if GPA x in Y is better or worse than GPA z in C, a very
hard question to answer fairly if we didn't have the data.
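A minimal version of this aggregation, assuming only that per-university GPA samples accumulate over time, is to z-score each GPA within its own university. (This is a deliberate simplification: it adjusts for grading differences between schools, but says nothing about quality differences, which would need outcome data.)

```python
from collections import defaultdict
from statistics import mean, pstdev

def normalized_gpa(records):
    """records: (university, gpa) pairs gathered over many hiring rounds.
    Returns a scoring function mapping (university, gpa) to a z-score
    within that university, making GPAs roughly comparable across schools."""
    by_uni = defaultdict(list)
    for uni, gpa in records:
        by_uni[uni].append(gpa)
    stats = {u: (mean(g), pstdev(g)) for u, g in by_uni.items()}

    def score(uni, gpa):
        m, s = stats[uni]
        return (gpa - m) / s if s else 0.0

    return score
```

Under this scheme, "GPA x in Y vs. GPA z in C" becomes a comparison of how unusual each grade is at its own school rather than a comparison of raw numbers.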

I think this method offers a more traceable, open and perhaps fairer way of
hiring, without suffering from the lack of nuance that traditional automatic
hiring suffers from.

Problems:

\- Money: this process might require more manpower.

\- Tooling: to be efficient, this process requires tooling and automation.

I would be happy to hear your ideas, improvements and experiences with similar
systems. I think the idea of this system is very similar to that of a neural
network.

