Students of color are getting flagged because testing software can’t see them (theverge.com)
17 points by throwaway888abc on April 9, 2021 | 21 comments



Just train the model on other races, what’s the problem?


Hi - I'm the person who extracted their facerec algo and ran the test. Another problem is that they're pretending it wasn't a problem.

See "fewer than five complaints" on the third/fourth paragraph of page 21 of this letter Proctorio sent to the senate [0].

From anecdotes in my circle of friends alone, I've heard at least three, some correlated with race, and I've seen many tweets at @Proctorio about it, so I didn't buy it, ran the test myself, and, tada: bias.

Also, it's less of a "just train" problem and more of a "complete redesign" problem. Haar cascades work on greyscale images, which generally means a loss of contrast on darker skin and makes detection difficult. You'd likely need a full-color single-shot network, such as BlazeFace, to make it work.
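For the curious, here's a minimal sketch of that greyscale pipeline using OpenCV's stock Haar cascade; the image path is a placeholder, and Proctorio's exact integration isn't public:

    import cv2

    # Haar cascades run on a single channel: color is discarded before
    # detection, which costs darker skin tones much of their remaining contrast.
    img = cv2.imread("webcam_frame.jpg")
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"faces found: {len(faces)}")  # often 0 on low-contrast frames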

[0] https://epic.org/privacy/dccppa/online-test-proctoring/Proct...


So do that, what’s the problem?


They... aren't doing that?


I see your point, but I feel like the non-tech-informed general public will extrapolate that there was intent to institute racial bias.

However, now that we know what the issue is, ongoing inaction is racist.


This might be a naive question, but is darker skin just harder to see for shitty web cameras and cameras generally, especially in low-light indoor study environments? So the software doesn’t get the data it needs in the first place. I know that cinematographers often illuminate dark and light skin differently to account for the way light behaves when reflected off different colors.


Your question may be more relevant than you think. Yes, a lack of illumination will obscure the capture of any scene if it gets dark enough. This is a consequence of our imaging technology: even great cameras are severely limited in how they can make sense of the light that reaches them, never able to reproduce more than a slice of the available contrast (dynamic range).
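As a toy numpy illustration of that slice (the pixel values are made up): each photographic stop of lost light halves everything, but after quantizing back to integers, the darker region is left with far fewer distinct levels for a detector to work with:

    import numpy as np

    light = np.array([180, 200, 220], dtype=float)  # bright region of a frame
    dark = np.array([40, 55, 70], dtype=float)      # darker region

    for stops in (0, 1, 2):          # each stop halves the captured light
        scale = 0.5 ** stops
        l = np.round(light * scale)  # quantize back to integer pixel values
        d = np.round(dark * scale)
        print(f"-{stops} stops: light spans {np.ptp(l):.0f} levels, "
              f"dark spans {np.ptp(d):.0f}")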

Your cinematographers seem aware of this, of course, and it touches on an idea a photographer introduced me to: that one could draw parallels between photography as a technology and the issues now surfacing in image recognition, biased deep learning, and so on.

Photography has been shaped by environments even less concerned with such issues: technology refined and calibrated over decades to attain the perfect shot, by those with the means to access a camera, of those most often in front of it.


Even worse, an education company built a solution that works for only ~10% of the human population. The customers should get a refund and damages, and throw that software in the trash.


I'm not sure I understand. Which ten percent?


Does that really matter? The product makes more errors on dark skin. All you did was suggest a reason for that fact. How does that change anything?


Diagnosis is helpful in solving the problem.

It's also not uncommon to ask questions purely out of intellectual curiosity.


If you don't care about the reason for a problem, you don't really care about solving it.


So basically the only thing wrong with the software is that they ship a model with no more than 75% accuracy to production. There is no racial bias here; every skin color is detected almost equally poorly.


I made the point below and agree that there was no intent in making a racist algorithm. However, now that we know it’s inadvertently prejudiced, not fixing it would be racist.


Four paragraphs in:

>Not only did the software fail to recognize black faces more than half the time, it wasn’t particularly good at recognizing faces of any ethnicity — the highest hit rate was under 75 percent

Surely 30-50% of students aren't failing their exams because of this. Teachers aren't idiots. Something doesn't add up here.

>While racial bias in code is nothing new, it’s especially distressing to see it affecting students who are just trying to do their school work

How is it affecting students? If the teacher has to review the video recording whenever a student is flagged as absent, then, assuming teachers can detect black faces just as well as others, literally no student is being affected by this at all, let alone black students specifically. Why is America obsessed with construing everything like it's an attack on African Americans?


Hi there, I'm the person who extracted the algo and ran the original test.

I picked my words mostly in response to the "fewer than five complaints ... due to race" statement Proctorio included in a response letter to the US Senate [0] (page 21, para 3/4). If they had not made that claim, I would not have used the word "racist" in my original blog post, instead opting for a more neutral term like "biased". How the press chose to represent my work is their editorial choice.

> How is it affecting students? ...

Here's some insight into how Proctorio works: when your exam window opens, you have to run through a set of "pre-exam checks", which takes a minute or two in many cases, more on slower computers. Once that's done, it runs a "test" of their facerec, and if it cannot see your face, you cannot enter the exam. No easy professor override, no "professor can review the recording". The only options are to shine a bright light on your face or something so it can see you, if that works, or to have a conversation with customer support that essentially goes "this algorithm literally doesn't see me as a human".

Also remember that all of this is going down while the exam clock is ticking: most exams are only open for the time slot set for them, so if you start late because you're fighting a phone/chat tree (sometimes with wait times of 5+ minutes, from friends' anecdotes) or a facerec algo, that lost time counts against you. And there's the stress of having to write an exam on top of all that.
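As a hypothetical reconstruction of that gate (every name here is invented for illustration; Proctorio's client is closed source):

    import time

    def run_prechecks() -> None:
        """Stand-in for the "minute or two" of pre-exam system checks."""

    def face_detected() -> bool:
        """Stand-in for the facerec check on the current webcam frame."""
        return False  # what the detector returns for many darker-skinned students

    def enter_exam(window_closes_at: float) -> bool:
        run_prechecks()
        while time.monotonic() < window_closes_at:
            if face_detected():
                return True  # gate opens; whatever time remains is your exam time
            # No professor override: add more light on your face, or call
            # support, while the exam clock keeps running.
        return False         # locked out when the window closes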

So yeah, students are definitely affected by it.

[0]: https://epic.org/privacy/dccppa/online-test-proctoring/Proct...


You have given me no reason to believe you are the author, but if you are, I think you may have given away a key piece of information that is unsurprisingly absent from the article.

>The only options are to shine a bright light on your face or something so it can see you

So what this is really measuring is the percentage of students who try to get out of taking the test by turning off most of the lights in their room and telling the teacher the computer wouldn't let them take it. I would love to hear what the company that designed the test has to say about its performance. I doubt it failed to recognize faces when they tested it in proper lighting, since just detecting that a face is present is extremely easy for a computer to do.


This affects black students in many ways. For example, people suffer from biases too: if you are a teacher and you mostly see black students flagged, you'll probably become biased toward assuming black students cheat more. It's not a rational deduction; it's an artifact of the way our minds process information (Hebbian learning: neurons that fire together wire together).

This is just one hypothetical way it might harm black students.
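To make the "fire together, wire together" point concrete, here's a toy version of the Hebbian update rule, dw = eta * x * y, run over an invented stream of (flagged, black student) co-occurrences; the numbers are made up, but note the association weight only ever grows with exposure:

    # Hebbian update: co-occurring signals strengthen the link between them.
    weight, eta = 0.0, 0.1
    for flagged, black in [(1, 1), (1, 1), (1, 0), (1, 1)]:  # biased flag stream
        weight += eta * flagged * black  # dw = eta * x * y
    print(round(weight, 2))  # 0.3: the spurious association has strengthened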

You are also assuming that teachers do their jobs and review the video. What about lazy teachers who don't, and decide to trust the computer? Can you really assume there is no teacher like that? If such teachers exist, this software's bias may very well manifest as an accusation against an innocent student. If the software makes more mistakes with black students, then that innocent student will probably be black. Even if it's sorted out after some investigation, being accused of cheating is rattling. And the whole ordeal happens just because you are black.

I also really dislike the culture- and race-war narratives coming out of the US. That said, I'm still very concerned about trusting crappy software with meaningful, life-altering decisions. Mishandling skin color is just one way this software is crappy; it's not the root issue.


Students are in the loop in the software - it doesn't just silently report to teachers. It would be a huge distraction to be accused by the computer of truancy or cheating for no reason.


Reviewing dozens of exam videos takes longer than grading them; it is simply not going to happen.


You don't need to review the entire video: if you look at a single frame where the software says the person is absent and they're clearly there, you know it was a false positive.
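That spot check is a few lines with OpenCV; the file name and timestamp below are hypothetical, and real proctoring exports will differ:

    import cv2

    cap = cv2.VideoCapture("exam_recording.mp4")
    cap.set(cv2.CAP_PROP_POS_MSEC, 14 * 60 * 1000)  # jump to the flagged minute
    ok, frame = cap.read()
    if ok:
        cv2.imwrite("flagged_frame.jpg", frame)  # eyeball it: is the student there?
    cap.release()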



