
Did the developers intend their product to manipulate race in its outputs or not? It's a simpler question than these contorted reformulations you present.

But probably not simple enough to pierce ideological bias. I'll ask instead: you really believe that it wasn't tested in historical and other contexts where white people were erased?



> I'll ask instead: you really believe that it wasn't tested in historical and other contexts where white people were erased?

Occam's Razor: Of course it wasn't, or they wouldn't have released such a ridiculous, embarrassing product! C'mon, I think you're the one wearing blinders here.


We'll see what comes out later, but I give it 90:10 odds that this behavior was seen in testing and accepted by developers/DEI commissars - the failure was in not foreseeing the reaction of the non-DEI-extremist public.

Of a piece with 'Google more or less explicitly said I won't be promoted because I'm White' stories.


> Of course it wasn't or they wouldn't have released such a ridiculous embarrassing product!

Then please do explain why their first reaction was to double down on "diversity" in historical contexts, before eventually retreating?


Because adding the word "diverse" to the text of their output is a lot easier than retraining their model? They're always going to try for the easy fix first.

Have you never tried tweaking the error text first before giving up and redesigning the form that's confusing your users?


Hmmh. Fair enough. I hadn't thought about it from the perspective of "error text", but with that mental model in mind, I think you're right here.



