
Twitter consistently centers image previews on whiter faces - anonymfus
https://twitter.com/bascule/status/1307440596668182528
======
slg
Perfect example of how not proactively making an algorithm anti-racist can
yield results that end up being racist. If you train a machine learning
algorithm on photos with predominantly white faces, your algorithm is going to
think people with lighter skin are more important than people with darker
skin.
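
The imbalance effect described above can be sketched with a toy example. Everything here (the embeddings, the 950/50 split, the distance-to-mean "saliency" score) is hypothetical and made up for illustration; it is not Twitter's actual model or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "face embedding" dataset: group A is heavily overrepresented,
# standing in for a training set dominated by lighter-skinned faces.
group_a = rng.normal(0.0, 1.0, size=(950, 8))
group_b = rng.normal(3.0, 1.0, size=(50, 8))
train = np.vstack([group_a, group_b])

# Naive "saliency" score: similarity to the typical training face,
# i.e. negative distance to the dataset mean. Because the mean is
# dominated by group A, group-B faces automatically score lower.
mean_face = train.mean(axis=0)

def saliency(x):
    return -np.linalg.norm(x - mean_face)

new_a = rng.normal(0.0, 1.0, size=8)  # unseen majority-group sample
new_b = rng.normal(3.0, 1.0, size=8)  # unseen minority-group sample

print(saliency(new_a), saliency(new_b))
```

The model never looks at group labels at all; the skew falls out purely from what "typical" means in a lopsided training set, which is exactly why this kind of hole only shows up if you test on both groups.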

~~~
xyzzy_plugh
Why are people downvoting this? The parent is correct. If you don't test your
models adequately then they'll have holes like this one.

~~~
slg
HN doesn't like talking about race or numerous other "controversial" topics.
This story is already flagged off the front page.

~~~
rvz
> HN doesn't like talking about race or numerous other "controversial" topics.

There were plenty of discussions on HN when GitHub was renaming master/slave
terminology [0] and when other open source projects were doing the same.
Essentially, they break the HN guidelines when it is a company or person they
'like'.

On the other hand, when a topic attacks their 'narrative' or their side of
the argument, or comes from someone they don't like, they downvote, flag, and
censor it to make sure it is never seen, even if it is factual and has
evidence.

[0] https://news.ycombinator.com/item?id=23518123

------
tzs
In case anyone else did not realize this and is as confused as I was,
apparently you can attach more than one photo to a tweet, and Twitter will
then show previews of all of them in a gallery. The previews are formed by
taking some part of the original image. Clicking on any of the previews opens
a view of the corresponding full image.

I knew about Twitter taking part of an image and using it for a preview, but
had not realized that there could be more than one image in a tweet.

------
d0ne
Why is this flagged?

------
nokya
Wow.

(nothing else to say)

------
benmmurphy
'Consistently' in the title is misleading because this is essentially n=1.
The Twitter user did try transforming the image to produce a different
result, but we don't know whether there is something not race-based in
Mitch's face that the algorithm prefers over Obama's. We also don't know
whether the user cherry-picked this example.

