Job Applicant Resumes Are Effectively Impossible to De-Gender (unite.ai)
59 points by Hard_Space on Dec 17, 2021 | 97 comments



Original paper: https://arxiv.org/pdf/2112.08910.pdf

The entire premise of the research relies on the resume samples being well matched: in other words, on the quality of male candidates being equal to that of their female counterparts. The paper says:

  Specifically, we perform 1-1 matching without replacement such that for each male resume, we find a female resume that is within 2 years of experience, has the same degree, field of study, and has a resume similarity score (i.e. cosine similarity of resume vector representations) of at least 0.7.
I am not sure this method is sufficient to create a matched dataset. In other words, it is entirely possible that (legitimate) differences in experience or quality between resumes may act as the signal for the ML algorithm.
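
For concreteness, here's a minimal sketch of what that matching step might look like (this is my own assumption of the implementation, using TF-IDF vectors and greedy first-match; the paper doesn't publish code):

  # Hypothetical sketch of the paper's 1-1 matching without replacement.
  # TF-IDF vectors and greedy first-match are assumptions, not the
  # paper's published implementation.
  from sklearn.feature_extraction.text import TfidfVectorizer
  from sklearn.metrics.pairwise import cosine_similarity

  def match_resumes(male, female, threshold=0.7, max_exp_gap=2):
      vec = TfidfVectorizer()
      vec.fit([r["text"] for r in male + female])
      pairs, used = [], set()
      for m in male:
          m_vec = vec.transform([m["text"]])
          for j, f in enumerate(female):
              if j in used:
                  continue
              # Hard constraints from the paper: same degree and field,
              # experience within 2 years.
              if (f["degree"] != m["degree"] or f["field"] != m["field"]
                      or abs(f["years_exp"] - m["years_exp"]) > max_exp_gap):
                  continue
              if cosine_similarity(m_vec, vec.transform([f["text"]]))[0, 0] >= threshold:
                  pairs.append((m, f))
                  used.add(j)  # without replacement
                  break
      return pairs

Even with those constraints satisfied, a 0.7 cosine similarity leaves plenty of room for real quality differences between the two resumes in a pair.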

It would be interesting to see descriptive analytics on male and female applicants. IIRC men and women differ in self-selection for precisely things like this. It is plausible (or even probable) that such differences in self-selection crept into the dataset despite the researchers' resume-matching process.

(As a clarification, this is a critique of the research methodology; I am not claiming one gender is less qualified than the other in general.)


> The model had learned through historical hiring data that men were more likely to be hired, and therefore rated male resumes higher than female resumes.

A lot of machine learning sounds like the computers are just cargo-culting. But this concrete example is spectacularly so.


Well, it's not the computer's fault that the people designing the system failed to (and probably are completely unable to) provide a proper measure of "value once hired". In the absence of such a signal you're stuck looking for proxies, and it's not very surprising that the model then ends up doing the same as many humans and assumes that "past success at getting hired" is a good proxy for "value once hired".

The machine learning is "cargo-culting" in many instances because humans are cargo-culting, and thus are creating models that repeat the same motions rather than trying to provide qualitatively better data.

But the interesting thing is that in doing so it does reveal a lot of things about human decision making that may help inform improvements.


If one gender really did provide more 'value once hired', would it be reasonable for the AI to favour them?


Not necessarily, because it could have given more value for the wrong reasons.

Imagine a misogynistic but horny boss who only hires hard-working ugly men and hot, dumb women, specifically looking for these traits in order to have a "playing field". If the AI got in here it would be constrained to the same parameters, and if it was looking for good workers of any gender it would prefer men.

And the thing is that although this boss is part of history (we hope), the influence they had on the workforce at large was enormous, and it'll be a long time before this is reversed.

And that's without taking into account all the coworkers who could hinder your job because you are not part of their cadre (gender, football club, Fortnite clan, whatever).

We really should experiment with equality, or even the opposite extreme, before considering one gender "superior" to another for a given role. And even then there would be exceptions.


How much more value?

Suppose men were providing $10 in additional value over the time they work for you. That's barely even worth a Starbucks gift card, the sort of thing you probably give away to people who guess the number of marbles in a jar at a company mandatory-fun event. You definitely shouldn't hire James rather than Deborah because of an average $10 gender difference unless they are literally interchangeable. If their sole distinguishing feature were that Deborah's smile was nicer, or James' laugh was more annoying, that's probably still enough difference to outweigh this $10 lifetime value. And the reality is your candidates just aren't that similar. Deborah's six years of experience in a similar role while James is applying with no relevant background should not lead to the conclusion that all that matters is that the average "value once hired" for James will be higher based on gender averages.

Now, suppose women are providing $10M in additional value. Well, I think we'd have noticed something that dramatic by now right? "Gee, my start-up with five women broke even in the first six weeks because of how awesome women are". So we can rule that sort of large figure out as implausible.

I think a discussion about whether it's reasonable needs to take the magnitude of the difference into consideration, which would first mean (if you wanted to do this) finding out for sure how large that difference even is. My guess is, if you did this, you'd get a very small answer, and then the discussion is really easy: no.

But before you could do that, you'd need to have a decent objective measurement of this "value once hired", and that part is far harder so it won't get done.


If you are in a gunfight, would you rather have a slight advantage?

You also need to consider the risks and circumstances; sometimes you want every tiny edge possible.


The problem with this is that it presumes you know which side actually has an advantage in that scenario, to within the error margins.

In the $10 example, I'd argue chances are very small that you actually know who provides the most value because just a tiny shift in the measurement criteria could shift things in the other direction.

If such a tiny edge matters, it's an indicator you have a problem.


Not to mention someone’s value ends up varying based on what they are assigned to and who they work with. Frankly the whole idea of precisely measuring individual value is bean-counting bullshit.


Sure, you spend as long as you like figuring out how best to achieve the slight advantage, meanwhile I shoot you in the head.

Analysis paralysis: figuring out what to do costs time, and time is expensive, so you should avoid this unless you've got a good reason to expect it actually matters and you believe you can actually do it correctly.


In this instance most societies have decided that discrimination by gender is a negative for society irrespective of whether or not you can come up with a metric that gives one gender a higher value to the company, so no.

But it would still be important to understand how the AI assigns value, because you'd need to ensure that it does not apply criteria that are illegal.

At the same time, what this seems to suggest is also that purely judging by other factors in resumes might effectively act as a proxy for directly judging by gender. As such the more interesting question to me is whether that means that deciding based on such factors might put employers at risk of claims of gender discrimination.


Yes, because if the AI is picking my brain surgeon, I want the best person, and I'm absolutely opposed to any equity; just raw talent and merit when it comes to someone operating on my brain again.


The AI is just a product of its inputs; how it's used, and whether it violates discrimination laws, is totally up to the users.


Sort of, but we have to be careful not to wash the hands of people all too willing to discard responsibility. It's correct to say "it's just statistics on data", but it also provides cover to the processes that use it. "How can you be against hiring the best person for the job," the argument goes, completely ignoring that "best" isn't scientifically defined, and includes the same sexist biases we were trying to get rid of.


I read that as specifically saying that the ethical and legal responsibility is entirely on the people. (i.e. I think you're agreeing with the comment you're replying to)


Yes, sorry for the lack of clarity. Users shouldn't blame the machine when they control every aspect of it and how it's used. It's on you as a user to understand your tools and data.


There are differences between human learning and knowledge and machine learning, one being the kind of errors machines make compared to humans. A human is unlikely to confuse the side of a truck with the sky (Tesla), or the image of an item with the actual item. Humans, in turn, are affected by stress, metabolism and other biological factors.

For an AI to actually predict which employees are going to benefit the company, it needs to understand both how the company has evolved over the data period, how it will likely continue to grow, and that some employees who have successful careers are not necessarily the people who contribute to the success of the company. An AI might think nepotism is a great indicator of a highly retained and employable employee; a human would recognize it for what it is but may accept it as an acceptable level of corruption. It would be the same conclusion (i.e., hire the person), but for two very different reasons.


> A lot of machine learning sounds like the computers are just cargo-culting.

That's precisely what ML is. Though to be fair, interviewing done by real humans is also largely based on cargo culting.


I would call it "pattern recognition" rather than "cargo culting". Cargo culting is an example where pattern recognition fails.


I mean. This seems like a good example of pattern recognition probably failing?


That depends on how you measure it. It accurately reproduced how previous recruiters worked, but that approach might not be the best. The thing here is that the metric "the best candidate" is very hard to measure, so people use the metric "candidates that were previously recruited". I would call it an instance of the "streetlight effect": https://en.wikipedia.org/wiki/Streetlight_effect


I think the key insight is that the humans the AI replaces were probably also cargo culting. They'd just get a bit more defensive if you suggested that.


Have you ever tried to tell an ML engineer that their model is biased (or, even worse, sexist)? I think the defensiveness is about equal.


FWIW, at least at the company I work for this gives women an advantage. I get a nice bonus for anyone I recruit, but recruiting a woman also increases my manager's KPI.


We have quotas we need to fill. Hiring a white male is basically a nonstarter without prostrating yourself before HR


This reads to me as it's okay to discriminate as long as it's for the "right group".

I honestly feel a bit sick about all this. Maybe it's the only way, but it doesn't feel like it is from my perspective.


It is absolutely just another form of discrimination. In the past, people thought they had good reasons too.

If you want compensatory justice and you are male, the only thing you could do is resign and give your job to a woman. If you demand the same from others, you are just as bad as someone who discriminated in the past.

There is no higher motivation behind this. HR just follows laws, though, so you should boot out your legislators for this.

Additionally, it influences work relations far more negatively for women than for men, so they don't even profit from this aside from a select few high-earning positions. And even those will still have a problem with authority.


It's not the only way. They want you to think it is. Don't let them break you.


[flagged]


> What a horrible injustice for an HR team to try and bring some diversity to a company that mostly hired white males.

White people are underrepresented at tech companies in the USA. Only Asians are overrepresented. The only reason to discriminate against white people there is that you hate white people. White people are less underrepresented than black people, true, but white people are still underrepresented.


You started with a lie, wonderful wonderful: https://www.eeoc.gov/special-report/diversity-high-tech

> Compared to overall private industry, the high tech sector employed a larger share of whites (63.5 percent to 68.5 percent), Asian Americans (5.8 percent to 14 percent) and men (52 percent to 64 percent), and a smaller share of African Americans (14.4 percent to 7.4 percent), Hispanics (13.9 percent to 8 percent), and women (48 percent to 36 percent).

But I'll play ball, I'm sure you can show me a stat that says white people are underrepresented by a tiny amount by changing the scope, maybe by considering worse paying jobs where white people are definitely underrepresented!

That will totally make your point... right? White people are not getting enough of the worst jobs, only the best ones! Kind of why I said industry wide stats are useless though.

Of course, also link it so I can show how that same source probably shows blacks and hispanics as being even more underrepresented than mine and I can laugh in your face for acting like white people are "underrepresented" in tech because there's like a 90% correlation to population size instead of 100%.

-

But actually I mean let's roll with your lie for a second!

Do you think, just maybe, white people are less likely to be underrepresented at the companies that are working hardest to hire non-white-males?

Like a company is all-hands-on-deck trying to get some marginalized people working there, that's because...??? Like who... who do you think is currently making up most of the org?

-

Again... broad industry stats about diversity are utter bullshit.

Don't take my word for it, apply some critical thinking.

Do you think white males are underrepresented more or less as you climb the ladder?

https://www.thisdot.co/blog/diversity-statistics-in-tech-a-d...

You can look at that first diagram and you tell me that white males are being discriminated against?

Like, how out of touch with reality are you?

We're in an industry where the only reason there's any underrepresentation of whites is because companies were stuffing minorities into the worst-paying roles and bringing over H-1Bs to underpay them whenever they can.

The moment HR tries to change the status quo the white males start crying racism. Cry on brother.


You are just a racist and sexist. The problem is you, not other people.


I'm racist and sexist (against myself) for accurately pointing out the trend where a white male dominated industry normalized drinking in the office and how that affected their definition of "culture"?

https://tech.co/news/tech-workplace-drinking-culture-2019-02

Or are you just insecure about your own biases?


> Or are you just insecure about your own biases?

---

> Be kind. Don't be snarky. Have curious conversation; don't cross-examine. Please don't fulminate. Please don't sneer, including at the rest of the community.

> Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.

> When disagreeing, please reply to the argument instead of calling names. "That is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3."

- https://news.ycombinator.com/newsguidelines.html


This would be great copypasta if you didn't have to purposely ignore the entire rest of the comment to make it work.

And it's not an insult; people are clearly insecure about their biases, even you, for being unable to address a very clear point and immediately taking solace in playing wannabe mod to somehow invalidate the actual point.

You have biases, I have biases. We all have biases.

Some people hear one guy give his career history with one accent and another guy give it with a country twang, and they interpret them differently. They see one guy who's real old and one guy who looks 18 say the same thing, and they subtly interpret it differently.

That doesn't make you racist or ageist; it makes you human. We're not robots raised in a vacuum and covered in Teflon so that we never let biases from a million sources, from upbringing to media, rub off on us.

-

This incredible insecurity that white males (amongst others, but most openly expressed due to the fear that the tech industry is now oppressing them) have picked up about ever admitting they're not robots is infuriating to me as a black man.

Because the moment you bury your head in the sand like this, you're almost deciding "I will continue to be biased and make no attempt whatsoever to fix it"

Like fuck, you're actually going to pretend there are no candidates at all that you maybe get a little excited about because you feel like you connect with them on a personal level in a way that benefits from having a shared culture and is maybe not so helped by being from different cultures???

Because if you are, I can tell you it's bullshit: https://www.sciencedaily.com/releases/2012/11/121129093008.h...

What a person actually interested in dealing with this problem does is accept that, yup I did end up shooting the shit with this candidate, that did improve my image of them a tiny bit, let me look at my feedback again and remove anything that was influenced by that.

9 times out of 10, HR is trying to do this for people, and suddenly they act like the tech industry is now built on affirmative action, white men are unable to get ahead (lol), and any non-white-male with a heartbeat is getting hired for walking through the door.


No, it works fine regardless of context. You can't make an argument and then tack on "and if you disagree it's because you're insecure"; your comment would be more compelling if you didn't resort to ad hominem.


Your comment would be more compelling if you had something of substance to say instead of taking a cop-out and whining that the black guy is too mad about racism in the industry for your liking, not just once but twice now...

You're acting like I wrote some expletive-filled rant; it's 99% content and an unbiased, honest question to the reader: are you insecure about the biases we all have?

If even being asked that bothers you so much maybe that's also something to think about.


Not my experience, and I doubt this is true in any way. Maybe in young startups by young founders, but even there I have never seen it. I am not from California and not even the US, but the demographics of IT look pretty similar in many countries.

I would expect such behavior in some cases in the established industry branches and large corporations. But that is nepotism, and gender or race does not play a role in the slightest.


I don't even know how this is up for debate!

It is universally known people are biased to hire people they'd like to spend time with: https://www.sciencedaily.com/releases/2012/11/121129093008.h...

> I am not from California and not even the US

Oh so you're not from the country in question with these policies?

The whole "tech bro" persona did not come into existence for no reason.


I don't have an opinion on either side of this debate, but your link is based on a "survey with more than 1000 responders" that seems to be about the current state of things. It also doesn't talk about volume, just treats all drinking as the same. I don't think it supports your claims.


It doesn't support my claims without asking some critical thinking of the reader; that's a tough ask, I know...

There are more studies than you could ask for showing we hire people we'd like to hang out with: https://www.sciencedaily.com/releases/2012/11/121129093008.h...

Alcohol is one thing that gained traction in the "tech bro ecosystem".

So put two and two together and what do you get?

https://www.wired.com/2016/12/techs-alcohol-soaked-culture-i...

https://www.softwareiseasypeoplearehard.com/does-the-tech-in...

https://www.forbes.com/sites/csr/2011/03/18/drinking-at-work...

https://technical.ly/philly/2016/05/02/alcohol-tech-events/

https://www.teamgather.co/blog/ping-pong-tables-and-beer-keg...

https://www.huntonlaborblog.com/2019/01/articles/employment-...

HN is the only place people would actually pretend alcohol is not part of the tech-bro persona lol.

Can't admit there's a problem right?

At least in the last few years people have realized it's a bad idea. All those pesky HR departments and their "inclusion" and "drinking culture shouldn't replace a work culture" noise, amiright?


I've never worked in the US, and don't live there. I don't think asking for more evidence than a comment and a survey of 1,000 people in a sector of at least a million people is a display of a lack of critical thinking. In fact, I'd say it's quite the opposite. You talked about the last two decades, yet nothing in the article talked about that either. The articles you linked there are more substantial.

I personally live in France. I don't know about the tech industry as a whole, but drinking is the way young adults socialize. I personally don't drink much, so I know how it feels to have people pressuring you to drink more every time you're at a social event.

I understand that it's the internet, that you don't know me and that you probably faced plenty of people pretending to be neutral while they were trying to refute what you said, but this wasn't the case here.


That sucks to hear as a white male but I still put the responsibility on myself. I should be so good at my job that they don't care that I'm a white male.


The context here is that applicants are overwhelmingly white and male. Most of the time it won't matter because you're the only category in the pool.


Depending on what country you are in, this is illegal no? Not hiring someone based on their ethnicity who is otherwise qualified/the best candidate is racial discrimination, regardless of what that ethnicity is.


In Brazil this is illegal, but when I saw it happening and told the government about it, I got an e-mail from a female judge saying that hiring only women is fine because it "corrects the distortions", despite it being against the constitution... This is anecdotal, but I can say from personal experience that companies can get away with it.


I was horrified to find this [0] written in law for my country:

> Appropriate measures aimed at achieving true equality are not regarded as discriminatory.

[0] https://www.fedlex.admin.ch/eli/cc/1996/1498_1498_1498/en#ar...


I have a white male friend who works in academia in NYC. They had a staff meeting at the college about hiring for open positions in several departments and were explicitly told that they were looking only for "diverse" candidates and that white men would not be considered.


We happily take anyone qualified and I get my bonus. But as mentioned, the percentage of female employees is a KPI; it makes my boss look good and it makes the CEO happy in all-hands meetings.


I am an engineer involved in hiring (technical interviews, assignment reading; we have a home assignment for candidates) and I have had a couple of experiences in which I felt my decision was overruled.

One was a female applicant. I was reading her assignment, felt it was subpar, and my decision was a thumbs down. We later went forward with the application to the technical interviews because she was a woman and we "really needed to hire one" (to be clear, I didn't even know the applicant was a woman until my decision was overruled; we don't receive CVs or names while reviewing home assignments).

The other was a minority applicant from a different country. My decision was thumbs down because the applicant basically lied in his CV about his past experience. HR overruled that by saying something like "...in his culture it is hard for people to admit their weaknesses..." and, to top it off, "we don't have anyone in our company from country X!", and he got hired.

So yeah, positive discrimination means you are not really looking for qualifications, and most things can be ignored to fill some quotas. And you won't be getting that bonus when someone else refers a minority who gets hired.


And to add: when I was hired I was also the "first employee from country Y". Still wondering if I was a diversity hire.


It is crappy to think you got a job not because you deserved it/are good at it, but yeah... Well, if you need the opportunity, I guess that's nice.

I have played the poverty card before regarding student loans.


My attributes ensure that I won't face quotas that would be a barrier to employment (well, maybe my gender; I'm male), so I'm just curious: if there's one position to fill, the male candidate scores 60%, and the female one scores 58% (using my hypothetical ranking system which actually works and is fair, etc., etc.), so just slightly lower, would your company pick the female candidate in this case?

I guess answering this question "the wrong way" would get a lot of butthurt people showing up to this thread, so feel free to not answer.


Not OP, but I've seen some of this (though not terribly often). I don't think I've ever seen it come to a head and be explicitly about an attribute like that. I think that would make a lot of people uncomfortable.

Usually what I see is faster movement through the recruiting pipeline. In your case, the woman scores 58% a week before the man takes the test. There's no overt pressure to just accept her score and move on. She's just here, qualified, and ready to save us from having to do more interviews haha.

I don't have a poignant take on whether that's a good or bad thing. It definitely feels better than how I imagine having to pick between the two would, though it still feels discriminatory. Then again, diversity in some sectors is low, so perhaps it makes sense for the hiring pipeline to give underrepresented candidates some form of advantage, and this might be the least bad option.

The whole thing is a mess of moral ambiguity.


People need to realize that actual support for diversity means internalizing that there is a benefit.

So if you actually believe there is a benefit to having a team of men and women (I do), then it's not "wrong" to let that guide you to the woman when any score above 50% indicates someone who will be able to actually do the work.


You're pushing shit uphill with that argument on HN. A lot of these guys have never worked on a team with anything resembling gender balance, and they have an irrational fear of women.


>Hiring a white male is basically a nonstarter without prostrating yourself before HR

I struggle to believe what I'm reading.

I'm just curious what people of the future will think about this.


In most companies hiring anyone requires some form of prostration to HR, so a little extra work to hire a white male is not going to slow anyone down.

A friend of mine works in a large organization in which the head passed down a rule that only women would be hired for positions 'unless' the only available qualified candidate happened to be male.

Can you guess what happened? It hasn't increased the relative number of women hired but now everyone thinks the men are extra special.


The thing kinda is that it's not only a male/female thing; race is also involved.

Biases, biases and biases.

I think we should try to reach equal opportunity by providing access to education, mentors maybe, an ability to prove themselves, not this kind of "discrimination".


Exactly. It sets up for failure anyone who comes under even slight suspicion of having profited from these decisions. Not only through judgement from others; it also undermines the faith of the candidates themselves.


This isn't something to claim without elaborating.

You're saying your company won't hire white males without extenuating circumstances?

I'd love to see the incredibly diverse teams there then... or is this a case of your company having previously not hired non-whites and now playing catch-up so it can beat a drum about D&I, while the culture that led to the lack of diversity marches on internally?


> I’d love to see the incredibly diverse teams there then…

This is exactly the line of thinking that causes the aforementioned reluctance to hire yet another white male.


What on earth are you talking about?

You think the reason they're so desperate that they're willing to gaslight minorities over their abilities (according to this anonymous, unelaborated source) is because they have diverse teams?

Did you just stop reading after the ellipses or what?


  recruiting a woman also increases my manager's KPI.
That explains why my manager was looking specifically for female candidates.


Where I live it is very, very common to see job offers that only take disabled people, because the company gets $$$ from the government (that is, from all of us, including those ostracised by these policies) when it hires a disabled person.


That's literally the worst.


No, there is no hate against men here compared to what sometimes gets described elsewhere; it is just that women are preferred and get slightly higher pay on average, if my contacts in management are to be believed. But keeping women on board is a priority, and they also pay us men very fairly.


No hate - that's an interesting perspective. Such practices, or even just results without intention, would for sure be described as hate (or something negative) if men were preferred and got slightly higher pay.

Don't get me wrong, I'm not calling you out, you just provided description without judgment and that's great. It's just a reflection on how those topics are usually framed, and that's a sad state of affairs.


> Such practices, or even just results without intention, would for sure be described as hate (or something negative) if men were preferred and got slightly higher pay.

To be clear I disagree with calling it hate also if it goes the other way.


"Positive" discrimination is the worst, and makes it pretty clear to everyone on the floor, that the person has been put where he/she is for reasons that have nothing to do with how he/she performs at work. I've seen it first hand, and it was quite embarrassing to watch a female member appointed to a specific committee just for the sake of having a women, that no one listened to, that clearly had no business there, etc... People deciding what women employment rate and where, should be, reminds me a lot of certain communist bureaucrats deciding what the GDP of the country should be and passing a law saying "this year, the GDP will be x"... you just cannot fake reality.


It's a bit sad that there are no examples of processed CVs, especially those where gender was masked by removing some words. I wonder how the NLP models did their job; I have a feeling the NLP is looking for tanks using clouds instead (see https://www.gwern.net/Tanks).


I'm not sure gender inequality can be solved by some magic machine making gender disappear. Rather, we can progress by making inequality visible and challenging it: only when attitudes change will we slowly improve.

The HN crowd might like Lauren Klein's approach to this via data science [1] - I do.

1: https://data-feminism.mitpress.mit.edu/


Thank you, it seems to be an incredible resource; I really appreciate you sharing this.


In my native language every verb has a masculine or feminine ending depending on the subject's gender.

I'm also sure a woman has a far better chance of being hired into tech than a man with the same qualifications.


To be fair, getting hired in tech is still pretty easy in most cases and countries. My country puts way too much weight on formal qualification, but in the current economy you find work nearly everywhere.


On the importance of de-gendering the recruitment process, I have a parallel to submit from a discipline I know well: the professional classical music industry.

A few decades ago, it was commonly thought that the brain of women couldn't understand music. "Women have other preferences," "Their brains are not made for that." If it reminds you of some rhetoric seen here and there, trust your instincts.

Here is what happened in the 70s, using extracts from the book "Blindspot":

"In 1970, fewer than 10 percent of the instrumentalists in America's major symphony orchestras were women, and women made up less than 20 percent of new hires."

"Starting in the 1970s, several major American symphony orchestras experimented with a new procedure that involved interposing a screen between the auditioning instrumentalists and the committee, leaving the applicants audible but not visible to the judges."

"The next twenty years provided interesting evidence. After the adoption of blind auditions, the proportion of women hired by major symphony orchestras doubled—from 20 percent to 40 percent."

Fun fact: those blind auditions didn't start because of gender ethics; they began because the classical music industry was rife with clannishness and nepotism at all levels, which gave orchestras the incentive to limit the impact of influential professors.


But that does not mean we have the same situation here. We are living 50 years later, and the pipeline for IT looks different than the pipeline for music.

Also:

https://statmodeling.stat.columbia.edu/2019/05/11/did-blind-...


Thank you, you are totally correct; we are living 50 years later, and those pipelines are different. I tried, without great success, to create perspective in a very one-sided thread.

This link is interesting indeed. While there was some fair criticism of this study, I do agree with the conclusion:

"I agree that blind auditions can make sense—even if they do not have the large effects claimed in that 2000 paper, or indeed even if they have no aggregate relative effects on men and women at all."


Link to the actual paper here: https://arxiv.org/pdf/2112.08910.pdf

Edit after having read it: it's short and easy to read; I would suggest reading it instead of the linked article. I think more work is needed before making suggestions. For example, is there a correlation between the number of words and the gendering of a resume? Maybe something like "I was the manager of the soap team for 3 years, during which sales increased by 25.6%" compared to "2012-2015: soap team manager". Though the latter conveys less information. Maybe a KPI box for each job?

Another thing that I would have liked would be to compare that to what actual recruiters can do. If a simple model can do 0.75 and a regular recruiter 0.53, it's not the same as if a recruiter does 0.92.

Lastly, the "list of gender indicating words" seems really small. Just 26 words for 348k resumes.


The obsession with equality (of outcome) is a blight upon our world.


What about this is promoting equality of outcome? The paper is investigating whether information preventing equality of opportunity can be successfully obfuscated.


My boss recently transferred to my team a lady who was lacking very basic skills. I was, and still am, happy to mentor her and have her on our team, like everyone else who joins us. But I suspect that the only reason we keep her is my manager's KPI, which is driven by the (well-known) views of higher management in our company. This makes me feel very bad towards all those candidates I rejected who had much more experience than her, but still not enough for the roles we are recruiting for (we set the bar pretty high). This favoritism is implicitly discrimination against those candidates. But you don't fix discrimination with more discrimination. Instead of moving forward, we repeat the same mistakes with different decoration. I want to move somewhere far away :(


I assume this controlled for the fact that the genders don't apply to certain roles at the same rate? I mean, I assume most people reading this right now are men, simply because this site is a business/tech forum and men tend to have more of an interest in these two things.

Similarly, if I got an anonymized job application for a Rust developer role, I would expect that 90%+ of the time it's a man applying. If it were a UI/UX designer, I'd be less confident; maybe a 50/50 split, based on my experience. Were it a social media manager role, I'd then assume it was a woman.


The article says

  Male and female samples were matched 1-1, and a subset obtained by pairing up the best objectively job-appropriate male and female candidates, with a margin-of-error of 2 years, in terms of experience in their field. Thus the dataset consists of 174,000 male and 174,000 female résumés.
but that is assuming you are able to match two candidates objectively.

Also "with a margin-of-error of 2 years, in terms of experience in their field" is a quite a margin for junior-mid positions.


Reading the conclusion of the article (not the paper), it sounds like the differences in writing between genders are too strong for current techniques to overcome.


I wonder what happens if you work with someone of the opposite sex to prepare your resume.


Yes! Or use the AI to identify gender-indicating words of the opposite gender that are synonyms of words you've used, and swap those in - might be a start (although deeper things like sentence structure could still influence the classification).
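
A rough sketch of that word-swap idea (the mapping below is invented for illustration; the paper's actual indicator words aren't listed here):

  # Toy de-gendering pass: replace known gender-indicating words with
  # opposite-gender counterparts. SWAPS is a made-up example mapping.
  import re

  SWAPS = {
      "chairman": "chairwoman",
      "salesman": "saleswoman",
      # ...extend with whatever words the classifier flags
  }

  def swap_gendered_words(text):
      pattern = re.compile("|".join(re.escape(w) for w in SWAPS), re.IGNORECASE)
      # Note: this simple version doesn't preserve capitalization.
      return pattern.sub(lambda m: SWAPS.get(m.group(0).lower(), m.group(0)), text)

As noted, sentence-level signals (phrasing, verb choice, structure) would survive this kind of surface substitution, which seems consistent with what the paper found.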


I'd say the current techniques work too well :)

Also it says something about the current wave of "AI" being able only to mimic the human thought process and not being able to come up with anything different. The I in AI is still out of reach.


But you are not trying to de-gender it for AIs, right? It's humans that still have to distinguish.


What if we take something like GPT-3 and have it rewrite all the resumes before they are sent to the NLP models for processing?
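
As a sketch (using the OpenAI completions API of that era; the prompt and parameters are just illustrative guesses):

  # Speculative sketch: ask GPT-3 to normalize a resume's wording
  # before it goes into the screening model. Prompt and settings are
  # illustrative, not tested.
  import openai

  openai.api_key = "YOUR_API_KEY"

  def neutralize(resume_text):
      resp = openai.Completion.create(
          engine="davinci",
          prompt="Rewrite the following resume in plain, standardized "
                 "language, preserving all facts:\n\n" + resume_text,
          max_tokens=512,
          temperature=0,
      )
      return resp["choices"][0]["text"].strip()

Whether the rewrite would actually strip the stylistic gender signal, rather than just paraphrase it, is exactly the open question.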


If we grant as correct the research saying the upper echelons of society are filled with sociopaths and psychopaths (https://www.forbes.com/sites/victorlipman/2013/04/25/the-dis...),

can we train ML to recognise them from a CV? Do they recognise each other?


Automation should be removed from the hiring process.


But to what level? Basic automation, i.e. sorting by proficiency, is not really done better by a human than by a machine. The problem doesn't lie in automation but in how we apply it. Don't blame the tool, blame the user.

I think there is a lot of room for removing over-automation and creating transparency. Though just outright abandoning automation would hurt even more, since human routines are even harder to evaluate/make transparent.


If you have too many CVs to look at, you can simply trash a random set (or automatically send a rejection email), then look at the remainder. You should still have roughly the same ratio of skilled candidates.
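
A quick simulation of that claim (numbers invented just to illustrate that uniform random rejection preserves the skilled ratio in expectation):

  # Toy check: randomly trashing 90% of a pool keeps the skilled
  # fraction roughly constant. The 20% skill rate is made up.
  import random

  pool = ["skilled"] * 200 + ["unskilled"] * 800
  random.shuffle(pool)
  kept = pool[:100]  # keep a random 10%
  print(sum(1 for c in kept if c == "skilled") / len(kept))  # ~0.2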


To still have the same ratio of skilled candidates would mean the automatic rejection has zero merit. For example, what about filtering for a minimum of x years of work experience for a senior position? Surely your ratio of skilled and appropriate candidates will increase.


What kind of proficiency do you mean, exactly? To me, the concept sounds too nebulous for a machine to analyse, as it needs precise instructions/criteria.


How do you "sort by proficency"?


[flagged]


Because it is religious dogma at this point, and not based on facts.



