Confession: this is the first thing that sprang to mind for me. Not that algorithms suck or companies suck (though both of those are true) but that people suck.
- Does Instagram prioritize scantily-clad photos of attractive people over scantily-clad photos of relatively less-attractive people?
- Is the prioritization based on a machine analysis of the photo, or on the response of earlier users to the photo?
- "While [the skew towards nudity] was consistent and apparent for most volunteers, a small minority were served posts that better reflected the diversity published by content creators." Are the majority people who have clicked on plenty of attractive scantily-clad photos? Are the minority people who have been presented with such and avoided clicking them?
I would bet the latter. Why go to the trouble of analysing difficult stuff about body structure, when you can just let users "decide".
I think we should formulate a law, something like "every internet imageboard, if left to user-moderation, will eventually turn into pr0n".
If my theory of "It's not Instagram that prioritizes attractive people, people prioritize attractive people" is right, then yes: less attractive people draw less attention and are therefore prioritized less.
I like a lot of photos, but in my own 'explore' timeline I've noticed categories of photos that appear from time to time, despite me never liking them. Until I did the whole "I do not want to see content like this" thing and made them disappear, I was seeing a lot of fishing videos, or bizarrely, people slipping their feet into shoes filled with foam (?). I'm fairly certain that as you scroll down the explore timeline, just pausing over a playing video - or especially opening something (even if it's really odd or gross) just because you're like "what the hell is that" - gets registered as interest. And if that topic is pretty "niche", you start seeing a lot more of it pretty quickly.
I guess a bit like reading a single story on Quora about going to prison, or surviving an aircrash - you can pretty much expect your daily digest to be full of those for the next 3 weeks.
Point being, I assume few people will admit to it, but I bet their eye is drawn to the scantily clad photo every time. They open it/pause on it - and it's probably registered as 'most interesting'.
Man, this but on Youtube. There is a definite chilling effect, where I don't click on ANY ads or anything that looks targeted or served, solely because I don't want to get spammed for the next month. You click one video about Joe Rogan and then it's all you see.
When Jordan Peterson got big I broke down and watched a couple of his vids to see what he was about. And holy crap I got nothing but conservative videos spewed at me for weeks, PragerU ads for months. Even now I'm hesitant to watch any BLM, COVID, or anything that's trending because the algo will just spew more crap at me.
For instance, I mostly follow cycling and running stuff on IG. The suggestions / explore feed is therefore full of women in short running shorts and bicycle jerseys with the zipper down. It's not really that relevant considering what I actually follow, but I think the engagement those posts get pushes all other cycling / running related content away.
You can also buy ads if you want to show your posts to people, like any other business does.
...and this metric is used to determine what gets shown in people's feeds.
It's in the definition of "attractive", things which attract attention and interest. Do people suck for wanting to eat food instead of mud? For wanting to listen to pleasant music instead of screeching noises? It seems like the least sucky creature in the world implied by your words would be Buridan's Ass.
Hyper-stimuli are one of the big problems of our time; hyper-palatable food, hyper-comfortable indoors sedentary pursuits, hyper-engaging gambling and gaming environments, hyper-engaging edited pictures and videos (and adverts and films and TV shows). The highlights reel, on steroids.
What's actually happening here is some people are exploiting this natural quality for their own personal gain. Giving men access to countless pictures of naked women is like giving an alcoholic access to a brewery. It destroys people and it destroys relationships.
> Of all the arts, movies are the most powerful aid to empathy, and good ones make us into better people.
Even if people suck, could Instagram tune their recommendation algorithm into recommending content that made them into better people?
This model takes many of the features of the image into account: the account posting, the account’s previous likes, the number of existing likes and the time frame they were received in, the GPS location of the poster, etc... and possibly... deep learning features from the image itself.
If the model has the deep learning features... it will take all of 10 minutes of IG-scale data to overpromote hot naked people. It will also learn all of their userbase’s biases, and underpromote content from black people that isn’t “stereotypically black,” overweight women, short men, etc.
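That feedback loop is easy to demonstrate with a toy sketch. Nothing below is Instagram's actual system - the categories and click rates are invented - but it shows how a greedy engagement-trained ranker only needs one category to get slightly more clicks before it floods the feed with it:

```python
# Toy model: a ranker estimates click-through rate (CTR) per category from
# observed engagement and greedily shows whichever category scores highest.
# The click rates are made up for illustration; this is not Instagram's code.
true_ctr = {"swimwear": 0.12, "business": 0.04}  # assumed user behavior

clicks = {c: 1.0 for c in true_ctr}  # smoothed counters (1 pseudo-click,
shows = {c: 2.0 for c in true_ctr}   # 2 pseudo-shows) to avoid divide-by-zero

feed = []
for _ in range(10_000):
    # exploit the current best CTR estimate
    best = max(true_ctr, key=lambda c: clicks[c] / shows[c])
    feed.append(best)
    shows[best] += 1
    clicks[best] += true_ctr[best]  # expected clicks, kept deterministic

share = feed.count("swimwear") / len(feed)
print(f"swimwear share of the simulated feed: {share:.1%}")
```

Both categories start with the same optimistic estimate; after roughly ten impressions the lower-engagement category's estimate sinks below the other's and it effectively stops being shown - regardless of how much of it creators actually post.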
It shows the pictures I'm interested in and when it runs out of them it starts showing bikini models and cars.
Overall this article strikes me as a large collection of pseudo-science but this sentence baffles me the most. Is the author arguing against their own conclusions?
> Posts that contained pictures of women in undergarment or bikini were 54% more likely to appear in the newsfeed of our volunteers. Posts containing pictures of bare chested men were 28% more likely to be shown.
The effect is clearly larger for women, and if you have no problems with their data I don't think it is a reach to assume it disadvantages female entrepreneurs more than male entrepreneurs, especially given that in many cultures the pressure to dress modestly is stronger on women than on men.
I don't think it's that simple, given that their data show that the share of posts both posted and shown is lower for women.
And there has to be a non-linear effect from increasing the share of posts posted with nudity, given that if it were, say, 66%, it would be impossible to boost it by 54%.
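The ceiling implied by that multiplier takes one line of arithmetic to check: if creators post such content at a baseline share p and the feed shows it at 1.54 × p, then p cannot exceed 1/1.54. A quick sketch:

```python
# Sanity check on the reported "54% more likely" multiplier: a shown share
# of 1.54 * p can never exceed 100% of the feed, which caps the baseline p.
boost = 1.54
max_base_share = 1 / boost
print(f"largest baseline share compatible with a 54% boost: {max_base_share:.1%}")
print(f"a 66% baseline boosted by 54% would need {0.66 * boost:.1%} of the feed")
```

So the reported multiplier can only hold while the baseline share stays under roughly 65%, which is what forces the effect to be non-linear near the top.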
It's like they wanted to throw in some discrimination out there but they weren't sure how to do it.
Therefore all I use Instagram for is keeping up with friends and family and sharing vacation photos, not viewing unknown celebs/people I don't care about.
I would be very happy if they stopped trying to show me trash.
At some point a couple of months ago I decided that I wanted to change it. So I started to consciously like and follow hashtags and people around memes, standup, art and sports.
Now, I primarily like meme pictures and funny posts, and never like any semi-naked model. However, my feed has shifted to about 80-20 (models to memes). I still think there is some Instagram push there: obviously my content preferences have changed, but the algorithm barely gave a damn.
> And if Instagram personalized the newsfeed of each user according to their personal tastes, the diversity of posts in their newsfeeds should be skewed in a different way for each user.
That seems like a wild underestimation of how much a group of 24 Instagram users choose to look at sexy pictures over other topics. "Instagram behaves as if people are thirty percent more interested in sex than any other topic" frankly seems low.
As unfortunate (is it though?) as it might be, people do like that content and it creates engagement with the platform. I see it as business as usual for social media.
Unless these posts are completely new, how can they know that this isn't Instagram prioritizing the posts that already have the most engagement organically?
Why do you need me to share data at all? Just create brand new accounts and test what they see in a more controlled manner. Then you'll be able to test what's recommended from the same accounts over time while building any specific profiles yourself.
It could be that there's just a higher ratio of these types of images.
You don't say.
Males just don't look good, or something.
Edit: as someone noted before, my point was not that men look bad, but that this is a ridiculous bias I hadn't even noticed before.
There's an apparent asymmetry in how imagery of females is perceived vs. similarly posed males. Commercials employing 'sex sells' tactics are still biased towards using the female sex to sell. Reddit's r/gonewild and its offshoots are pretty huge, and filled with females. This is a truism.
The label 'beauty' being returned only for females highlights this asymmetry. There is something deeply absurd about it: to a first-order approximation, ~50% of the population should be attracted to males. To me personally it is enlightening to see how 'smart' data driven systems expose the systemic biases in our society.
> To me personally it is enlightening to see how 'smart' data driven systems expose the systemic biases in our society.
I also think they make that systemic bias bigger. Like, if there is a small gender difference in who gets called "elegant" in real life, the algorithm will make it larger. Human bias acts as an amplifier too, but I find the algorithmic kind harder to control and even worse in its amplifying effect.
But just this day I saw a similar phenomenon with "elegant". A search for "elegant" returned exactly one guy and a lot of young women in dresses, most of them brides - even though "elegant" is commonly used for good-looking men too.
> If Instagram were not mingling with the algorithm, the diversity of posts in the newsfeed of users should match the diversity of the posts by the content creators they follow. And if Instagram personalized the newsfeed of each user according to their personal tastes, the diversity of posts in their newsfeeds should be skewed in a different way for each user. This is not what we found.
So it is more the algorithm amplifying an effect that exists in general at a much smaller scale, pushing people toward an aesthetic that would not "naturally" be their preferred thing to look at or produce.
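The test the quote describes can be sketched in a few lines. All the categories and shares below are invented for illustration: compare the mix of content published by the accounts a user follows against the mix that actually reaches their feed, and check whether the same categories come out over-represented for every user.

```python
# Illustrative sketch of the diversity comparison AlgorithmWatch describes;
# all numbers are made up. Ratios above 1.0 mean a category is shown more
# often than the followed accounts actually post it.
published = {"swimwear": 0.21, "food": 0.40, "travel": 0.39}  # creators' output
shown = {"swimwear": 0.30, "food": 0.35, "travel": 0.35}      # user's newsfeed

def over_representation(published, shown):
    """Shown share divided by published share, per category (1.0 = neutral)."""
    return {cat: shown[cat] / published[cat] for cat in published}

ratios = over_representation(published, shown)
for cat, r in sorted(ratios.items(), key=lambda kv: -kv[1]):
    print(f"{cat}: shown {r:.2f}x as often as posted")
```

If the ratios sat near 1.0 for everyone, the feed would simply mirror what creators post; if the skew differed per user, that would look like personalization. The study's claim is the third case: the same category inflated across (almost) all volunteers.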