Hacker News
Instagram prioritizes photos of scantily-clad men and women (algorithmwatch.org)
93 points by n_kb 17 days ago | 84 comments

It seems like the possibility that photos of undress are promoted by organic engagement is not discussed? (i.e., people seem to be clicking on them).

Confession: this is the first thing that sprang to mind for me. Not that algorithms suck or companies suck (though both of those are true) but that people suck.

That was my first idea too! Not that people suck (not sure why you must suck if you enjoy looking at good-looking people), but that it's organic. Instagram just prioritizes what people click on, and people are attracted to attractive people, so they click on that.

Follow-up questions:

- Does Instagram prioritize scantily-clad photos of attractive people over scantily-clad photos of relatively less-attractive people?

- Is the prioritization based on a machine analysis of the photo, or on the response of earlier users to the photo?

- "While [the skew towards nudity] was consistent and apparent for most volunteers, a small minority were served posts that better reflected the diversity published by content creators." Are the majority people who have clicked on plenty of attractive scantily-clad photos? Are the minority people who have been presented with such and avoided clicking them?

> Is the prioritization based on a machine analysis of the photo, or on the response of earlier users to the photo?

I would bet the latter. Why go to the trouble of analysing difficult stuff about body structure, when you can just let users "decide"?

I think we should formulate a law, something like "every internet imageboard, if left to user-moderation, will eventually turn into pr0n".

It's likely that FB uses user responses to posts to decide how to prioritize them. However, several patents describe systems where they analyze pictures (using CV) and make decisions about their importance before they are published.
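If it is the response-driven version, the loop is simple to sketch. Here's a minimal, entirely hypothetical illustration (all names are invented, not Instagram's actual system): rank posts by observed click-through rate, so whatever users click most gets shown most.

```python
# Hypothetical sketch of engagement-driven ranking: no image analysis,
# just "let users decide". All names here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    impressions: int = 0
    clicks: int = 0

def engagement_score(post: Post) -> float:
    """Smoothed click-through rate; the prior keeps new posts from being buried."""
    return (post.clicks + 1) / (post.impressions + 20)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Posts users click most per impression float to the top.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("dinner", impressions=100, clicks=2),
    Post("swimwear", impressions=100, clicks=30),
])
```

Because ranking feeds back into impressions, a small initial click advantage compounds over time, which is the vicious cycle other comments describe.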

Remember kids, just because a Megacorp has a patent on something doesn't mean they're using that approach anywhere in production.

We didn't have enough data to test these hypotheses. If more people contribute their data, we'll be able to test that: https://algorithmwatch.org/en/instagram-algorithm/

> - Does Instagram prioritize scantily-clad photos of attractive people over scantily-clad photos of relatively less-attractive people?

If my theory of "It's not Instagram that prioritizes attractive people, people prioritize attractive people" is right, then yes, less attractive people are less attractive and therefore less prioritized.

I'm sure this is exactly the case. I bet far more people interact (regardless of whether they 'like' it) with the swimwear/sexy pose shots than they do the photo of last night's dinner.

I like a lot of photos, but in my own 'explore' timeline I've noticed categories of photos that appear from time to time, despite me never liking them. Until I did the whole "I do not want to see content like this" thing and made them disappear, I was seeing a lot of fishing videos, or bizarrely, people slipping their feet into shoes filled with foam (?). I'm fairly certain that as you scroll down the explore timeline, just pausing over a playing video - or particularly opening something (even if it's really odd or gross) just because you're like "what the hell is that" - registers as interest. I think if that topic is pretty "niche" you start seeing it a lot more pretty quickly.

I guess a bit like reading a single story on Quora about going to prison, or surviving an aircrash - you can pretty much expect your daily digest to be full of those for the next 3 weeks.

Point being, I assume few people will admit to it, but I bet their eye is drawn to the scantily clad photo every time. They open it/pause on it - and it's probably registered as 'most interesting'.

> I guess a bit like reading a single story on Quora about going to prison, or surviving an aircrash - you can pretty much expect your daily digest to be full of those for the next 3 weeks.

Man, this but on Youtube. There is a definite chilling effect, where I don't click on ANY ads or anything that looks targeted or served, solely because I don't want to get spammed for the next month. You click one video about Joe Rogan and then it's all you see.

When Jordan Peterson got big I broke down and watched a couple of his vids to see what he was about. And holy crap I got nothing but conservative videos spewed at me for weeks, PragerU ads for months. Even now I'm hesitant to watch any BLM, COVID, or anything that's trending because the algo will just spew more crap at me.

While it's probably based on engagement, I still feel it's a flaw of the algorithms. Just because other people engage more with some content doesn't mean it's the correct content for a given hashtag / following.

For instance, I mostly follow cycling and running stuff on IG. The suggestions / explore feed is therefore full of women in short running shorts and bicycle jerseys with the zipper down. It's not really relevant content considering what I actually follow, but I think the engagement those things get pushes all other cycling / running related content away.

Have you tried the GCN app for browsing cycling photos?

Didn't know they had their own app. I regularly watch them and the triathlon version on YouTube, so will check it out, thanks.

Of course it is. I'd even say that what we found is probably similar to the issue of offensive suggestions by search engines. A minority of Instagram users see the platform as a free source of soft porn images; their behavior is probably picked up by ML systems and amplified, and pictures of nudity are pushed to all users, in a vicious cycle. Just like search engines spread far-right conspiracies by suggesting them to millions of users after a few thousand searched for them.

But as opposed to "offensive suggestions", there is nothing wrong with showing skin.

Unless you don't want to, but are obliged to use IG because your business focuses on a demographic on which IG has a monopoly (the 15-25 demographic in the EU).

That's where models, brand ambassadors, and influencers come in. It's not like lifestyle businesses just found out that sex sells.

I'm sure there are worse jobs than using Instagram

There's a difference between using Instagram and having to wear a bikini on Instagram even if you don't want to.

Nothing worse than owning a car repair shop and having to force your mechanics to wear a bikini for Instagram ads. Imagine having to have that conversation with a bunch of short-tempered hairy dudes.

Nobody is "obliged" to use it.

You can also buy ads if you want to show your posts to people, like any other business does.

To be fair, the linked article does suggest that this effect occurs for ads as well as organic posts.

Where's the suggestion that it applies to ads? The only coverage is two unsourced instances of the ad creative review algorithm falsely flagging the content.

How confident are you that Instagram’s algorithm isn’t just optimizing for that particular user? And that it’s just that most users of Instagram seem to engage with scantily clad photos more than other photos?

The technical term for these is "thirst trap" ( https://www.urbandictionary.com/define.php?term=Thirst%20Tra...).

The article points to a Facebook patent for an "engagement metric" created at time of posting (that is, before user interaction) which specifically calls out "state of undress" as a factor to consider:


...and this metric is used to determine what gets shown in people's feeds.

The patent link was not working for me, here is an alternative: https://patents.google.com/patent/US8929615B2/

They can patent an architecture?

People suck because they go online to look at attractive people?

Algorithms suck because they take this as an engagement metric. Or rather people who design algorithms in this way.

If you’ve ever visited the profile of a woman with a very sexualised or glamorous LinkedIn profile, you’ll see the “people also viewed” results are almost entirely other women with glam photos. The normal case is that “people also viewed” is others at the same company or with similar industry/job type.

> "but that people suck."

It's in the definition of "attractive", things which attract attention and interest. Do people suck for wanting to eat food instead of mud? For wanting to listen to pleasant music instead of screeching noises? It seems like the least sucky creature in the world implied by your words would be Buridan's Ass[1].

Hyper-stimuli are one of the big problems of our time; hyper-palatable food, hyper-comfortable indoors sedentary pursuits, hyper-engaging gambling and gaming environments, hyper-engaging edited pictures and videos (and adverts and films and TV shows). The highlights reel, on steroids.

[1] https://en.wikipedia.org/wiki/Buridan's_ass

People suck? For being sexually attracted to members of the opposite sex? That mechanism is the very reason you are here. Your parents were attracted to each other and mated to produce you. Don't demonise sexual attraction.

What's actually happening here is some people are exploiting this natural quality for their own personal gain. Giving men access to countless pictures of naked women is like giving an alcoholic access to a brewery. It destroys people and it destroys relationships.

This brought to mind a quote from Roger Ebert's book "Great Movies":

> Of all the arts, movies are the most powerful aid to empathy, and good ones make us into better people.

Even if people suck, could Instagram tune their recommendation algorithm into recommending content that made them into better people?

And who gets to decide what is 'better'?

Here’s my guess: there is an ML model running at Instagram to prioritize content shown to users in the service of some target: engagement, revenue, shares, etc.

This model takes many of the features of the image into account: the account posting, the account’s previous likes, the number of existing likes and the time frame they were received in, the GPS location of the poster, etc... and possibly... deep learning features from the image itself.

If the model has the deep learning features... it will take all of 10 minutes of IG-scale data to overpromote hot naked people. It will also learn all of their userbase’s biases, and underpromote content from black people that isn’t “stereotypically black,” overweight women, short men, etc.
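A toy version of that speculation (entirely hypothetical; this is not Instagram's model, and every name and number below is invented): a linear scorer over account features concatenated with a deep-learning image embedding. If one embedding dimension happens to track skin exposure and training assigns it a large positive weight, over-promotion follows without any explicit rule anywhere in the code.

```python
# Hypothetical sketch of an engagement-prediction ranker that mixes
# account/context features with an image embedding. All values invented.
import numpy as np

def score(account_feats: np.ndarray, image_embedding: np.ndarray,
          weights: np.ndarray) -> float:
    """Predicted engagement = weights . [account features | image embedding]."""
    return float(weights @ np.concatenate([account_feats, image_embedding]))

rng = np.random.default_rng(0)
n_account, n_embed = 4, 8
weights = rng.normal(size=n_account + n_embed)
weights[n_account] = 5.0  # suppose training learned: embedding dim 0 = "skin shown"

acct = rng.normal(size=n_account)
plain_pic = np.zeros(n_embed)
skin_pic = plain_pic.copy()
skin_pic[0] = 1.0  # identical post except the "skin" dimension is lit up
```

Same account, same context: the post whose embedding activates the learned dimension scores higher and gets promoted, which is exactly the "10 minutes of IG-scale data" failure mode the parent describes.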

With the rise of black-box deep learning, it's entirely possible that this is actively reinforced by Instagram and not just an organic process.

Instagram’s explore tab has learned that I like pictures of English gardens. It works amazingly well. But if I refresh too often, it seems to run out of new garden photos and just shows hot young women and luxury cars instead (I’m not interested in either). I don’t know if it’s the algorithm or a deliberate choice but it’s gross either way.

It's the same with Pinterest for me by the way.

It shows the pictures I'm interested in and when it runs out of them it starts showing bikini models and cars.

To be fair, those are probably the two largest centroids in whatever reduced-dimension space they've estimated to rank photos.

> While our results show that male and female content creators are forced to show skin in similar ways if they want to reach their audience, the effect could be larger for females, and be considered a discrimination of female entrepreneurs.

Overall this article strikes me as a large collection of pseudo-science but this sentence baffles me the most. Is the author arguing against their own conclusions?

I don't disagree about the style/tone of the article but this conclusion is borne out by their own data:

> Posts that contained pictures of women in undergarment or bikini were 54% more likely to appear in the newsfeed of our volunteers. Posts containing pictures of bare chested men were 28% more likely to be shown.

The effect is clearly larger for women, and if you have no problems with their data I don't think it is a reach to assume it disadvantages female entrepreneurs more than male entrepreneurs, especially given that in many cultures the pressure to dress modestly is stronger on women than on men.

> The effect is clearly larger for women

I don't think it's that simple, given that their data show both the share of such posts posted and the share shown as lower for women.

And there has to be a non-linear effect from increasing the share of posts with nudity, given that if it were, say, 66%, it would be impossible to boost it by 54%.
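That cap argument can be checked with one line of arithmetic: a uniform "54% more likely" multiplier applied to a 66% base share would put the boosted share over 100%, so the boost cannot be linear across all base rates.

```python
def boosted_share(base_share: float, boost: float) -> float:
    """Share of the feed after multiplying likelihood of being shown by (1 + boost)."""
    return base_share * (1 + boost)

print(boosted_share(0.66, 0.54))  # ~1.0164: more than 100% of posts, impossible
print(boosted_share(0.10, 0.54))  # a low base share boosts fine
```

So the 54% figure can only describe behavior at the base rates actually observed in the study, not a universal multiplier.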

How do we "fix" it? Neural implants to change what we desire?

This test is not scientific, the conclusion is highly suspect and the discrimination angle is a giant reach.

I think it's called "subtlety" but I'm no native speaker.

From your profile I can gather that you are affiliated with the source, so I'd really be interested in what you mean by this.

I guess the authors didn't like what conclusions their data leads to so they just came up with this weird paragraph.

It's like they wanted to throw in some discrimination out there but they weren't sure how to do it.

This is very true. As an early-20s guy, I don't follow a single female celeb; the only women I follow are women from my family / girlfriends, and it still shows me unknown barely-clothed women and makes me embarrassed when I use it in public. I've never once liked a picture of a semi-nude woman, because it's just not my thing and I don't want to contribute to the degradation of society.

Therefore all I use Instagram for is keeping up with friends and family and sharing vacation photos, not viewing unknown celebs/people I don't care about.

I would be very happy if they stopped trying to show me trash.

I find myself realizing that as I clicked on the post, I was hoping it would include example pictures of attractive instagram models. I expect the results may be predominantly a reflection of human nature...

Hmm, if people press like on those photos more, isn't the algorithm just faithfully reproducing human preference on average? Why say Instagram is prioritising? Makes it sound like Instagram is sentient.

Wow, a social media network the average Joe visits promotes stuff the average Joe might enjoy. How is this news? Any sane algorithm would tailor posts' visibility to the lowest common denominator, and that's what IG seems to do here. I don't even use IG and it makes sense.

Anecdotal: my explore feed has been 100% populated by semi-naked women for a long time. I realized that it could lead to some embarrassing moments when my friends saw it, and I decided to change the feed. However, I was clicking on them, so it was at least somewhat organic. The issue was that, since it was the ONLY thing on my feed, after some time I had no other option.

At some point a couple of months ago I decided that I wanted to change it. So I started to consciously like and follow hashtags and people around memes, standup, art and sports.

Now, I primarily like meme pictures and funny pieces, and never any semi-naked model. However, my feed has only changed to 80-20 (models-memes). I still think there is some Instagram push there, since obviously my preferences on content have changed, but the algorithm barely gave a damn.

Do the detailed results of the study contain all the data for the participants interests?

And if Instagram personalized the newsfeed of each user according to their personal tastes, the diversity of posts in their newsfeeds should be skewed in a different way for each user.

That seems like a wild underestimation of how much a group of 24 Instagram users choose to look at sexy pictures over other topics. "Instagram behaves as if people are thirty percent more interested in sex than any other topic" frankly seems low.

It's probably somewhat lowered by the other provider of sexual imagery, namely porn.

The website is down, but can still be read using google cache: https://webcache.googleusercontent.com/search?q=cache:Dk1y0u...

That title (not the HN one, the article one) is so editorialized that I doubt the credibility of the analysis. It feels like something people would think about and then find data to validate their perception.

As unfortunate (is it though?) as it might be, people do like that content and it creates engagement with the platform. I see it as business as usual for social media.

If you tap on a pic in the explore feed, and then tap the three dots in the upper right hand corner of the pic, and then tap "Not Interested", it sends a strong signal to the recommendation algorithm. After doing this a few times on similar pics, you will generally not see that category of picture in your feed.

You used to be able to click attractive profiles in the sidebar of LinkedIn to successively attain a sidebar that was full of model-looking people. I wish the around-the-network sidebar for Stack Exchange worked like that but eventually gave you an absolutely insane list of questions.

Bears prioritize forest locations for toilets

There's a catch-22 here. Instagram will nuke accounts of models who make their money from appearing scantily-clad in photos. Damned if you do, damned if you don't.

This seems rather thin on describing the feeds that these users are following. How "diverse" are they? What does that even mean? Did they check which posts are getting the most interactions (by their other followers)?

Unless these posts are completely new, how can they know that this isn't Instagram prioritizing the posts that already have the most engagement organically?

We didn't have enough data to test these hypotheses. If more people contribute their data, we'll be able to test that: https://algorithmwatch.org/en/instagram-algorithm/

That's a major missing hypothesis that seriously undermines your conclusion.

Why do you need me to share data at all? Just create brand new accounts and test what they see in a more controlled manner. Then you'll be able to test what's recommended from the same accounts over time while building any specific profiles yourself.

That's against FB's and IG's terms of service, so as an investigative outlet, they can't really do that and then promote the results.

The source google doc at https://docs.google.com/document/d/1L7A5hmskm3Y3huSXHNtIIoiV... is very informative while the website is down.

This is a sample size of one, but I think I may have helped the algorithm learn this behaviour.

This research is flawed, as they have no way of telling what the ratio of scantily-clad vs. dressed images is in the entire collection of Instagram.

It could be that there's just a higher ratio of these types of images.

They analyzed images with the Google Vision API, but did not even attempt to correlate the labels with number of likes, comments, etc. What a waste of time for a so-called investigation.

A social network for promoting scantily-clad women prioritizes photos of scantily-clad women?

You don't say.

And I dread to think what the tiktok algorithm prioritises.

Algorithm? Tiktok puts its hands on the scale deliberately: https://theintercept.com/2020/03/16/tiktok-app-moderators-us...

> the label “beauty”, for instance, was only returned for females

Males just don't look good, or something.

Edit: as someone noted, my point was not that men look bad. More that it's a ridiculous bias I hadn't even noticed before.

People disagreeing with this comment: please consider what the post is trying to highlight.

There's an apparent asymmetry in how imagery of females is perceived vs. similarly posed males. Commercials employing 'sex sells' tactics are still biased towards using the female sex to sell. Reddit's r/gonewild and its offshoots are pretty huge, and filled with females. This is a truism.

The label 'beauty' being only returned for females highlights this asymmetry. There is something deeply absurd about this: on a first order approximation, ~50% of the population should be attracted to males. To me personally it is enlightening to see how 'smart' data driven systems expose the systemic biases in our society.

Yes, that was my point. Not that I think males look ugly; I don't. More that beauty being female-only is a clear example of bias. I saw a similar bias with the word "elegant" just today: it was a prompt in the list of photos, and the results contained exactly one guy and many women.

> To me personally it is enlightening to see how 'smart' data driven systems expose the systemic biases in our society.

I also think they make that systemic bias bigger. Like, if there is a small gender difference in who counts as elegant in real life, the algorithm will make it larger. Human bias works as an amplifier too, but I find the algorithmic kind harder to control and even worse in its amplifying effect.

Also most women find most men unattractive or average looking. Most men find most women average looking or attractive.

As a non-native English speaker I wonder — is it common to call a man "beautiful" in reference to his looks?

It's rare. To me, it's not just a synonym for "handsome", but rather implies a certain kind of elegant good looks. The male characters in JoJo's Bizarre Adventure are the first example that springs to mind.

I don't find it common for humans in general, with the exception of Trump? It's kind of not how people normally talk.

But just today I saw a similar phenomenon with "elegant": a search for elegant ended with exactly one guy and a lot of young women in dresses, most of them brides. Yet "elegant" is commonly used for good-looking men.

Never seen a beautiful table?

I don't even need to read the article to know that a more accurate title would be 'Undress or fail: Instagram's user patterns strong-arm users into showing skin', or the classic 'Sex Sells'.

Not really. It is about the algorithm picking up and showing some pictures more, regardless of the personal taste of individual users or content creators.

> If Instagram were not mingling with the algorithm, the diversity of posts in the newsfeed of users should match the diversity of the posts by the content creators they follow. And if Instagram personalized the newsfeed of each user according to their personal tastes, the diversity of posts in their newsfeeds should be skewed in a different way for each user. This is not what we found.

So it is more the algorithm amplifying an effect that exists in general at a much smaller scale, pushing people toward an aesthetic that would not "naturally" be their preferred thing to look at or produce.

You might be surprised.
