Hacker News — antran's comments

So basically the only thing wrong with the software is that they shipped a model with no more than 75% accuracy to production. There is no racial bias here; every skin color is detected poorly at almost the same rate.


I made the point below and agree that there was no intent to make a racist algorithm. However, now that we know it's inadvertently prejudiced, not fixing it would be racist.

