Hacker News

Four paragraphs in:

>Not only did the software fail to recognize black faces more than half the time, it wasn’t particularly good at recognizing faces of any ethnicity — the highest hit rate was under 75 percent

Surely 30-50% of students aren't failing their exams because of this. Teachers aren't idiots. Something doesn't add up here.

>While racial bias in code is nothing new, it’s especially distressing to see it affecting students who are just trying to do their school work

How is it affecting students? If the teacher has to review the video recording whenever a student is flagged as absent, then, assuming teachers can detect black faces just as well as any others, literally no student is being affected by this at all, let alone black students specifically. Why is America obsessed with construing everything like it's an attack on African Americans?




Hi there, I'm the person who extracted the algo and ran the original test.

I picked my words mostly in response to the "fewer than five complaints ... due to race" statement Proctorio included in a response letter to the US Senate[0] (page 21, para 3/4). If they had not made those claims, I would not have used the word "racist" in my original blog post, instead opting for a more neutral term like "biased". How the press chose to represent my work is their editorial choice.

> How is it affecting students? ...

Here's some insight into how Proctorio works - when your exam time window opens, you need to start a set of "pre-exam checks", which can take a minute or two to run through, more on slower computers. Once that's done, it runs a "test" of their facerec - if it cannot see your face, you cannot enter the exam. No easy professor override, no "professor can review the recording". The only options are to shine a bright light on your face so the algorithm has something to see, if that works, or to have a conversation with customer support that essentially goes "this algorithm literally doesn't see me as a human".
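Proctorio's actual code isn't public, but the gate described above - no override, retry until the detector sees a face or give up and call support - can be sketched roughly like this (the detector and webcam frames are stand-ins, not the real thing):

```python
import time

def detect_face(frame):
    """Stub for the real face-detection model (NOT Proctorio's actual code).
    Returns True if a face is found in the captured frame."""
    return frame.get("face_visible", False)

def pre_exam_check(capture_frame, retries=3, delay_s=2.0):
    """Hard gate: the student cannot enter the exam until a face is detected.
    Deliberately no professor override, mirroring the behavior described above."""
    for _ in range(retries):
        if detect_face(capture_frame()):
            return True  # check passed, exam can start
        time.sleep(delay_s)  # student adjusts lighting and tries again
    return False  # only remaining option: contact customer support

# Hypothetical webcam frames: poor lighting twice, then a bright light added.
frames = iter([{"face_visible": False}, {"face_visible": False},
               {"face_visible": True}])
print(pre_exam_check(lambda: next(frames), delay_s=0.0))  # True on third try
```

The point is the shape of the failure mode: whenever `detect_face` keeps returning False for a given student, that student is locked out entirely, with no human in the loop until they reach support.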

Also remember that all this is happening while the clock is ticking on the exam - most exams are only open for the time slot set for them - so if you start late because you're fighting a phone/chat tree (sometimes with wait times of 5+ minutes, per anecdotes from friends) or a facerec algo, that's time counting against you. And there's the stress of having to write an exam on top of all that.

So yeah, students are definitely affected by it.

[0]: https://epic.org/privacy/dccppa/online-test-proctoring/Proct...


You have given me no reason to believe you are the author, but if you are, I think you may have given away a key piece of information that is unsurprisingly absent from the article.

>Only options are to either shine a bright light on your face or something for it to see you

So what this is really measuring is the percentage of students who try to get out of taking the test by turning off most of the lights in their room and telling the teacher the computer wouldn't let them take the test. I would love to hear what the company that designed the test has to say about its performance. I doubt it failed to recognize faces when they tested it in proper lighting as just detecting that a face is present is extremely easy for a computer to do.


This affects black students in many ways. For example, people suffer from biases as well. If you are a teacher and you mostly see black students flagged, you'll probably be biased to assume black students cheat more. It's not a rational deduction, it's an artifact of the way our minds process information (Hebbian learning: fire together, wire together).

This is just one hypothetical way it might harm black students.

You are also assuming that teachers do their job and review the video. What about lazy teachers who don't, and decide to trust the computer? Can you really assume there is no teacher like this? If so, this software's bias may very well manifest as an accusation against an innocent student. If the software makes more mistakes with black students, then that innocent student will probably be black. Even if it gets sorted out after some investigation, being accused of cheating is rattling. And the whole ordeal happens just because you are black.

I also really dislike the culture and race war narratives coming out of the US. That said, I'm still very concerned about trusting crappy software with meaningful, life-altering decisions. Handling darker faces badly is just one way this software is crappy; it's not the root issue.


Students are in the loop with the software - it doesn't just silently report to teachers. It would be a huge distraction to be accused by the computer of truancy or cheating for no reason.


Reviewing dozens of exam videos takes longer than grading them; it is simply not going to happen.


You don't need to review the entire video, if you look at a single frame where the software says the person is absent and they're clearly there, you know it was a false positive.
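That spot-check is cheap to describe: for each "absent" flag, pull one frame at the flagged timestamp and see whether a face is plainly there. A rough sketch of that triage, with the video and the face check both stubbed out as assumptions:

```python
def face_present(video, timestamp):
    """Stub: in reality this would decode one frame at the timestamp and
    either run a detector on it or just show it to the instructor."""
    return video.get(timestamp, False)

def triage_flags(video, absence_flags):
    """Split the software's 'absent' flags into obvious false positives
    (a face is clearly visible in the flagged frame) and flags that
    actually merit watching the surrounding footage."""
    false_positives = [t for t in absence_flags if face_present(video, t)]
    needs_review = [t for t in absence_flags if not face_present(video, t)]
    return false_positives, needs_review

# Hypothetical recording: face visible at 12s and 95s, genuinely absent at 300s.
video = {12: True, 95: True, 300: False}
print(triage_flags(video, [12, 95, 300]))  # ([12, 95], [300])
```

One frame per flag is seconds of work, not a full re-watch - though it still assumes the teacher bothers to look at all.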



