Did the developers intend their product to manipulate race in its outputs or not? It's a simpler question than these contorted reformulations you present.
But probably not simple enough to pierce ideological bias. I'll ask instead: you really believe that it wasn't tested in historical and other contexts where white people were erased?
> I'll ask instead: you really believe that it wasn't tested in historical and other contexts where white people were erased?
Occam's Razor: Of course it wasn't, or they wouldn't have released such a ridiculous, embarrassing product! C'mon, I think you're the one wearing blinders here.
We'll see what comes out later, but I give it 90:10 odds that this behavior was seen in testing and accepted by developers/DEI commissars - the failure was in not foreseeing the reaction of the non-DEI-extremist public.
Of a piece with 'Google more or less explicitly said I won't be promoted because I'm White' stories.
Because adding the word "diverse" to the text of their output is a lot easier than retraining their model? They're always going to try for the easy fix first.
Have you never tried tweaking the error text first before giving up and redesigning your form that's confusing users?