This whole press release is about the "steps" they have taken, but very few of them describe those steps in detail or explain how we can verify them. There is no pledge to do better and no consequence for not meeting these goals. It feels to me that FB is a mega-corp where the people in it want to do better, but the machine is so much bigger than the people inside it that they are currently lost on how to do it without torching the whole company.
Give me a break.
Super Size Me, while dishonest about how bad McDonald's is in just one month, was shooting fish in a barrel.
Waiting for Superman, a charter school infomercial featuring discredited superintendent Michelle Rhee, showed the danger of the form.
Maybe it's just that the whole concept - Algorithmic curation of content, instead of human editors - is flawed and impossible to fix.
Maybe YouTube has the exact same problem, only with fewer real names and a higher barrier to entry (no text posts).
How can one ever assume it's possible to design an algorithm that hands every individual a customized newspaper that is "good for him"?
Better solution: limit the freemium model for software services on anti-competitive grounds.
If Facebook and Instagram stop being free, I would think a huge portion of the consumer base will quickly reconsider the value they provide, and alternative companies will have a better chance to enter the market on a more level playing field.
That aside it would be a very good thing for society if all companies started addressing stakeholder concerns and contributed constructively in the societal discourse around their actions.
Not only that, but they seem to focus on the steps they take rather than the results they achieved. Sure, you can take some steps to reduce addictiveness or polarization; that doesn't mean they had an actual effect, nor does it take away the central argument that the documentary is making.
Also, really every media format so far has relied heavily on curation. I think this is ultimately the problem to be solved if they still want to act as a news distributor. The Like button doesn't really work, especially since it's not possible to dislike; the "like emojis" only indicate how emotionally charged a topic is.
Since this is infotainment, I bet it's way less than that. Another advantage when trying to get an objective perspective from reading is that there is no dramatic or eerie music in the background giving you cues on how to interpret the contents at a nearly subliminal level. Furthermore, there would not be any faces for the less advanced parts of your brain to attach feelings to.
The Social Dilemma could probably have been condensed down to a 5-page PDF. All these dramatizations are maybe good if you want to discuss this with 5-year-olds, but it's not dense with facts in any sense.
Social media companies sell the ability to influence you. That's why the product is often free.
Let's say I want to set up a small town of 10K people anywhere in the world. As soon as I try to do so, about a hundred different government departments will descend on me to enforce all kinds of rules built up over hundreds of years to guarantee health, water/sewage, fire, police, traffic, housing, education, labor rights, etc.
Take public safety and administration of justice - there is always a formula to calculate how many police/judges/public defenders a town needs. There are even Sustainable Development Goals from the UN setting up targets for different countries.
Now Fuckerberg/YouTube et al. set up these virtual towns with 2 billion people. And there are no formulas, yet they think the 2 billion people within will patiently allow them to work a formula out while the world burns.
More likely Social Media execs are going to be swinging from trees in various parts of the world soon.
What's the solution?
Profiting by deluding yourself is just lying and deflecting your guilt. Either give back the money you made from your "accidental" misbehavior, or stop pretending it's accidental.
I'm still on HN, LinkedIn and WhatsApp. Every time I open LinkedIn I'm shocked at how addictive the feed is. I go there to message someone, and before I know it 10 minutes have gone by and I've forgotten what I went to do in the first place. WhatsApp is really great, except that it is owned by FB and they still extract value from me.
In 2015 I turned off all notifications on my phone. Quitting FB and having no notifications have really improved my state of mind.
Overall, I'm pretty happy with how I use social media, but I'm very worried about how my kids will be able to handle it in their teens.
I wonder how many other engineers feel similarly. It feels increasingly like having "Facebook" on your resume is a badge of shame. Sure, you can make good money working there, but you also have to be able to sleep at night.
The biggest factor after education would be the ability to be a nonconformist. That requires you to have some leverage in the group. If you don't, you will quickly become a target for exclusion and bullying.
In school, it's not easy to avoid bad apples, and many parents are or will be negligent. So you will still have a few students using social media and trying to pull others into it even if the current situation changes.
Teens also want to prove themselves, and the dynamics of social media provide cheap validation. You need intrinsic motivation about something to fight it, as well as support or acknowledgement from adults.
Recommended resource: https://www.privacytools.io/
It links articles and communities on privacy, surveillance, and related issues.
I think the first step would be to self-host your own social media and other services (Nextcloud?) with the family. Not every kid will be interested, but I presume at least one would be curious in your case. Start it as a hobby on a weekend and invite them. Get a Raspberry Pi and it will be a fun tinkering experience.
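For the self-hosting idea, a minimal sketch: the official `nextcloud` Docker image has ARM builds that run on a Raspberry Pi. The port and volume path below are illustrative placeholders, not a hardened setup:

```yaml
# docker-compose.yml: minimal single-container Nextcloud for home tinkering.
# Data lives in a local folder; in practice, put it on an external drive.
version: "3"
services:
  nextcloud:
    image: nextcloud
    restart: always
    ports:
      - "8080:80"   # reach it at http://<pi-address>:8080
    volumes:
      - ./nextcloud-data:/var/www/html
```

Run `docker compose up -d` and finish the setup wizard in the browser. For anything beyond family tinkering you'd also want HTTPS and a separate database container.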
How did you get an offer without going through their extensive interview process?
Oh, and high-paying: it was like 3x my previous salary, but not high by SV standards.
This was before the company was viewed as completely morally bankrupt. They pay probably the highest in SV now to make up for that.
“What we really mean is that we keep it free for you [because you are the product, exactly like they said; this is the only reason advertisers come here, not for the traffic but for the precision targeting]”
> now Facebook, the world's biggest social networking site, is facing a storm of protest after it revealed it had discovered how to make users feel happier or sadder with a few computer key strokes.
> It has published details of a vast experiment in which it manipulated information posted on 689,000 users' home pages and found it could make people feel more positive or negative through a process of "emotional contagion".
It's overly dramatic in the way that I could say "Axe-wielding man splits homeowner's door uninvited" and "Fireman had to break down a door"; both are technically true, but only one is apt.
Facebook's product is influencing my behavior, technically, maybe, but it's a clumsy description made for the sake of drama. It's more accurate to say their product is targeted advertisements.
The researchers found that Facebook's news feed does influence sentiment in a contagious manner. Which implies it does result in behavior change on the platform; otherwise they wouldn't have a way to measure the effect in the first place...
And don't forget to check the Acknowledgement section:
> We thank the Facebook News Feed team, especially Daniel Schafer, for encouragement and support; the Facebook Core Data Science team, especially Cameron Marlow, Moira Burke, and Eytan Bakshy; plus Michael Macy and Mathew Aldridge for their feedback.
It's not like they hide it, that's exactly why advertisers and political parties partner with Facebook in the first place.
I personally don’t see my goal, the goal of projects I work on, the goal of companies I work for, or anything else I contribute to, as being about manipulating people’s behavior without their knowledge and/or consent.
Facebook has been doing completely unethical experiments since forever (are you aware of their role in the Rohingya genocide in Myanmar?).
They have been open about a lot of them, bragging about how good they are at manipulating crowds.
And yes, they are crazy effective. And they have the scale. Their behaviors are unethical, effective, and applied at the scale of humanity.
Was there more to it than that?
Existing in a filter bubble has a strong effect on your perception. Perception has a direct influence on your actions. Couple this with interfaces which are purposefully addicting ("high user engagement" is a euphemism), and you can very directly influence behavior.
The pervasiveness of smartphones means that these apps are only a few clicks away for virtually the entire world population. And worse, once these apps are installed on your phone, they relentlessly pull you back in with an endless stream of notifications.
It is not only Facebook. Applications like Reddit, YouTube, Instagram, Twitter, and TikTok all follow the same basic patterns.
It is not an overly dramatic description. If anything the public has been frightfully unaware of the influence that these companies can exert on the world. I am glad that this film has brought these issues into the spotlight.
The biggest danger in my opinion is how this time it's not just the ads that are changing your behavior but the platform itself.
> the gradual, slight, imperceptible change in your behavior and perception that is the product
It is not hard to argue that an ad fits such a description. Ads change behavior. We don't just have ads to buy things; we also have political ads, ads for charity, ads for religion, ads for mental health, ads for public awareness. The common argument that ads are just about selling you things doesn't reflect reality, and it's strange to find it coming from people in a country going through a major election, where they are being accosted by ads encouraging them to vote. Considering we've been doing this for a few hundred years, I'm pretty sure someone would have noticed the wasted money if ads didn't influence behavior.
The answer of course is somewhere in between depending on context. People don't have zero agency, but they also don't have 100% agency.
If you are looking for a camera, going to a review site will give you a fair chance to find the product that fits your needs (most likely a lens module for your phone). Looking at Canon ads is not that.
A short version of this: any product-information media output that doesn't detail a product's principal flaws is manipulation intended to sidestep your agency.
When did you last see a fizzy-drinks (pop, soda) advert that said "tap water is better for you and cheaper, but our brand advertising is supposed to associate drinking our drinks with being cool, so choose to avail yourself of our continued wide-scale brainwashing of your society to help you feel socially superior", or "we make sure our batteries' MTBF is 2.5 years, with a narrow standard deviation, so they need replacing just after the warranty period expires; the battery, though, is glued in. Clever, eh!"?
The central tenet of market capitalism's optimisation relies on consumers having perfect information and acting on that information. Advertising, at least every ad I've ever seen, is a direct effort to circumvent that.
Yes, people still have some agency; otherwise ads wouldn't be needed. But the purpose of ads is to erode that agency.
Your post sounds like the attempted justification of someone who uses advertising?
Facebook’s manipulation is arguably much harder to spot.
Sure they’re both forms of advertising, but at some point a new technology is so much more effective that it should be treated like a new thing.
I think that’s what people are arguing here.
I wonder what happened in 2018
> [Cambridge Analytica] closed operations in 2018 in the course of the Facebook–Cambridge Analytica data scandal, although related firms still exist.
I'm sorry, but if you're going to lead with a policy change that was made after pretty much the largest scandal in your company's history, I think you know you're doing something wrong. If you want to claim the moral high ground changes need to happen before they become scandals. It is okay to make mistakes. It is okay to fuck up big time. But this just feels disingenuous. The reason people like The Social Dilemma is because it is real people saying "sorry, we fucked up. We take the blame, but let's solve the problem." This response feels like a childish response of "We didn't fuck up, you did."
> We don't sell your information to anyone.
IIRC the movie didn't claim this. Most people criticizing FB aren't claiming this. FB is selling access to the data.
> just like any dating app, Amazon, Uber, and countless other consumer-facing apps
I'm reminded of my mom saying "If all your friends jumped off a cliff would you?" I can't be the only one. Just because Netflix is guilty of similar crimes doesn't mean you're in the right. No one respects this defense.
This response is weak and does not feel genuine.
50M hours / 1.5B DAU is about 0.03 hours, or about 2 minutes per person. If the average person spends 1h15m on FB, this is less than a 3% drop in overall time, but scrolling time likely shows more ads than time fixed on a single video.
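A quick back-of-the-envelope check of that arithmetic. The 50M-hour and 1.5B-DAU figures are the comment's own rough numbers, and 1h15m is the average it assumes, not official Facebook statistics:

```python
# Back-of-the-envelope check of the comment's arithmetic.
hours_removed = 50_000_000          # daily hours of video FB said it cut
daily_active_users = 1_500_000_000  # approximate DAU at the time

minutes_per_user = hours_removed / daily_active_users * 60
print(f"{minutes_per_user:.1f} minutes per user per day")  # -> 2.0

avg_daily_minutes = 75              # the 1h15m figure the comment assumes
share_of_time = minutes_per_user / avg_daily_minutes
print(f"{share_of_time:.1%} of average daily time")        # -> 2.7%
```

So the claimed 50M-hour cut does work out to roughly 2 minutes per user, just under a 3% share of the assumed daily usage.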
Source? That sounds very high. Regardless, it's definitely a silly statistic. It could be that people just lost interest in Facebook.
Not sure how they collected the data though.
https://www.broadbandsearch.net/blog/average-daily-time-on-s... puts it at 58 minutes, and https://www.nytimes.com/2016/05/06/business/facebook-bends-t... (from 2016) notes that it's 50 minutes (apparently reporting by FB in an earnings report).
(for those who don't remember https://www.theverge.com/2018/10/17/17989712/facebook-inaccu... )
How are you providing value to people when you show the same person articles from the same perspective? I would value a product that gave me different perspectives on the same issue. I guess their definition of value is distorted. And therein lies the problem, and the reason for filter bubbles.
Try logging out of Facebook and then logging in after a week. You will see a significant increase in the rate of notifications. How is this not driving usage?
>> We provide advertisers with reports about the kinds of people who are seeing their ads and how their ads are performing
Well, technically you are not the product. But when you combine hundreds of data points from millions of people and give advertisers access to that data, you are part of the product. So I guess it should say, "You are a tiny part of the product". There, I fixed it.
>> Facebook uses algorithms to improve the experience for people using our apps—just like any dating app, Amazon, Uber, and countless other consumer-facing apps that people interact with every day. That also includes Netflix, which uses an algorithm to determine who it thinks should watch ‘The Social Dilemma’ film, and then recommends it to them.
The key difference between Facebook and other services is that Facebook is a "social network". Things I post on Facebook are viewed by my friends, family, and colleagues, which has an impact on how others perceive me and on my social status. Facebook has the potential to literally shape my public image and my relationships. They completely fail to address this. I definitely don't get to choose the articles spewed by their algorithm, and the articles shown to me definitely influence my thinking.
>> The overwhelming majority of content that people see on Facebook is not polarizing or even political
True, but there are certain topics which are "hot" topics. People usually have strong opinions on things like religion, politics, sexual orientation, etc. I wouldn't care that my friend is a cat person while I am a dog person. However, it would matter to me if my friend supported a candidate that I vehemently oppose. People usually lose their senses when it comes to the "hot" topics, so a post on one of them has a disproportionate effect compared to a post about a vacation my friend is taking.
It's super easy to cherrypick data that backs your point. I'm sure "The Social Dilemma" did this too. If instead of looking at content, they looked at engagement (comments, likes, shares) then I think the political content would be much higher on the list.
I thought about writing a point-by-point rebuttal to this, about the obvious lies and wilful misdirection ...
But then I realized that I just don't care anymore. I love being free of this company and I almost got sucked back in to writing flame war comments about [thing that gets more attention].
Folks, just give this company up. You'll be much happier, I promise.
Ha, I just finished watching the movie an hour ago with a friend. Of course it's obvious how the debate was going to be framed from the beginning: Facebook is supposed to be "unbiased", and from this point of view political inaction is the high-minded route over traditional low-minded autocratic rule. Instead of curating a high-quality news feed for users to consume, users get to pick what they want to listen to, because forcing users to watch curated news feeds would be biased against the news organizations that operate and compete for engagement on the platform. So Facebook does have smart leadership, and with its fake-news initiatives it has proven itself to resemble an unbiased company more than an autocratic government.
However, our individual news feeds are not unbiased. Letting individuals control what they see increases their bias. From this perspective, Facebook is now inherently biased towards extremism (our sad reality proves this point).
The way out of this situation is confusing, and neither Facebook nor the movie talks of any real action. How about, instead of clutching our unbiased pearls, we collectively learn to appreciate and understand the inherent biases in everything? Everything involving humans and our psyche is inherently biased, and there is no objective end truth to the big issues surrounding our differences in values.
Now the question becomes not one of bias vs. facts, but of better and worse bias. If we could create a Facebook news stream that is inherently biased towards bringing people together rather than splitting us apart, it would solve this new question quite cleanly and, as a plus, fix our pathetic situation. This isn't a radical new idea created in my head 2 hours ago; it's a rather established view on this important issue: https://youtu.be/ZbPt66TYsFM
The big question, I think, should be how we design this "good" bias into our social media, and how we convince everyone that everything is biased (there is literally no point in seeking unbiased sources of anything), so that the debate is reframed as good versus bad bias.
Yeah, no thanks. Algorithms aren’t the solution to this particular problem.
I'm gonna watch The Social Dilemma myself now. Cheers for vouching for it, FB.
Does anyone know if this is indeed a safe tool to recommend to friends and family looking to thin their social media presence?
Are there equivalents for other social media platforms like TikTok, Twitter, Reddit, Instagram, etc?
It worked as advertised, and I deactivated my now-empty account and went on my merry way.
`unless you give us permission`
If you've spent even a minute working in the marketing world, you'd know that yes, yes they do. They think exactly that far, and then about another 10 steps beyond that.
Those prompts are most definitely designed to be waved away by the user without allowing them to fully grok the consequences. Unfortunately, it's not measurable, provable, or enforceably illegal.
"The overwhelming majority of content that people see on Facebook is not polarizing or even political—it’s everyday content from people’s friends and family. "
But in my case, and for many of the people I know, you can't even open Facebook without seeing some highly charged political bullshit. I've largely abandoned the platform, as it seems all it is is an echo chamber of people I know, of all political leanings, posting stuff that strongly, and often inaccurately, supports their political viewpoint.
It's no longer a platform of people casually sharing their personal lives; it's seemingly an endless stream of worthless shares and reposts.
I really want to see a social network that limits the ability to repost and "share" content like this, maybe going so far as to do copy-and-paste content checks to make sure it's not just shitpost copypasta.
Instead, the answer was to trot out some absolutely inexcusable, banal platitudes. And now we get to see similar effects roiling the USA and seeping more and more into the world at large. The fact that they're continuing to trot out such platitudes this far in is simply indescribable. I understand the psychological foibles that can lead very smart people to deny the monstrosity of their own creations, but I am nevertheless filled with fathomless rage that Mark Zuckerberg prioritized his ego and his shareholders above the good of all mankind, whether that was conscious or driven simply by his unwillingness to accept the true nature of his creation.
So literally like any product saying "We of product X recommend product X".
And there's also the fact that if they didn't put out this document, they might later get sued, and the lack of a response to "The Social Dilemma" could be used as an argument that they don't care. And as such, they'd lose a case.
But about that data...
See, I've always heard Facebook makes the largest part of its money by selling data about its users.
Is that not true? Or, even worse, was it wrong of me to assign value to articles and news stories that stated this? If so, why?
What about the reverse? Would there be those who consider it foolish to place stock in a statement that says Facebook does not sell user data, or sells it anonymized?
It feels to me like a deliberate attempt to move all conversation on the subject into the territory of "you can't really know for sure, so SHUT UP".
Get people to stop talking about it, by virtue of casting doubt on everything anyone says.
Meanwhile, Facebook keeps doing whatever it's doing. Nothing changes.
Facebook could fix this problem by opening up about what it does with the data. But it refuses to do that. Trade secrets, I suppose?
Either way, it's Facebook who's handling the data, not us. As such, all responsibility falls on them, not us.
Categorically untrue. They sell advertising space. The data is used to target the ads, which are often paid for based on performance.
Selling the data would be a terrible idea... if anyone else had it, it would undermine their value proposition.
Directly responding to the film by name is such an obviously terrible idea PR-wise, I’m shocked to even be seeing this from Facebook. It must be an accident.
... “Ah, so they’re doing nothing.”
I haven’t watched “The Social Dilemma”; I think it is odd to mix fiction and documentary. But just looking at this denial, I don’t buy it.
"The overwhelming majority of content that people see on Facebook is not polarizing or even political—it’s everyday content from people’s friends and family."
When I used FB I didn't find it polarizing or political but I did find I couldn't see content from people I knew. Other pages and crap overtook it and filled my feed. It was essentially useless to me.
I don't think FB is evil.. I think they are just about competent. Everything from them has been crappy. Remember when they spammed emails, chat was terrible, the apps were terrible, the notification icons didn't work on the website, FB Messenger didn't work well, etc.? Some things have improved now, but their talent is pretty lacking for a big tech company.
This seems totally dependent on how political your friends and family are. The past few months, many of my friends and family have been overtly political on Facebook, and I ended up seeing it. I don't think that's easily avoided in a contentious election season.
I made an attempt at stopping using Facebook about 3 weeks ago, and have been successful so far. Then I just learned another family member joined and I'd like to connect. With my family spread out so far, it's one of the few ways most of us can stay connected. So... I'll probably be back in the FB world, in small doses, in the next week.
That said, I didn't find leaving all that liberating. Some people talk of some sort of new found freedom after stopping using facebook, but I didn't feel that way. I just have a bit of extra time during the week, which I've mostly filled with phone calls to other people, so there's probably not a huge net win from a time perspective. There may be some benefits re: privacy/data collection, but I've been on it for ... 11 years at this point. I'm not sure a few weeks off makes much of a difference yet. :)
If they were really doing the best they can, the logical conclusion is no amount of effort can actually fix this, and a global, engagement-based social network is too toxic to exist.
If so, that’s a positive thing for them. A PDF doesn’t have the Facebook Like tracking and shadow tracking. They can still see some traffic from the raw HTTP GETs, but it skips the ubiquitous Facebook analytics that the documentary talks about.
This was my conclusion too. There were some genuine good parts, but most of it felt too hyped because the explanations were too flavorful and simplistic, or taken too far. It felt like I was watching The Secret or Zeitgeist.
> Rather than offer a nuanced look at technology, it gives a distorted view of how social media platforms work to create a convenient scapegoat for what are difficult and complex societal problems.
This sentence lacks the nuance they criticize. While I think it's a fair position to say the film gives a distorted view (due to simplification, for example), that doesn't mean its creators are creating (or intending to create) a convenient scapegoat.
> The film’s creators do not include insights from those currently working at the companies or any experts that take a different view to the narrative put forward by the film.
Well, current employees have a lot to lose by being honest if they disagree. And the documentary did include experts with a different view, telling a different story than the main narrative. I was actually on the lookout for this, because I was upset with the simplistic cookie-cutter bullshit it was feeding us.
---- Reacting to the points ----
1. They say they're making efforts to make responsible use possible. The issue with this is that we can't check it. So reputation and trust are all we can go on. So I'm putting this in the neutral category.
Wasn't Cambridge Analytica a thing? If so, then I don't care whether you sell or don't sell; the info gets out there. It feels disingenuous to put that sentence in there as a response to "you are not the product".
I agree that the "you are the product" mantra is nonsensical. Uh-huh, right, because it's short and sweet it must be true? Nonsense; none of the true explanations I've ever read devolved into mantra-like statements. IMO it's a "code smell" that something is off.
I'll stop here, this is getting way too long.
Long story short: Facebook is being questionable here at best, even though I agree with them in general that The Social Dilemma is nonsense.
The incentives here are all warped :/
Also, fuck you Facebook. You had multiple chances to redeem yourself and start being not-so-corrosive for the society and you intentionally went out of your way to make the wrong choice every single time.
Maybe you don't owe Facebook better, but you owe this community better if you're commenting here.
Thanks for pointing it out.
/s that was the part I was missing from the PR post. Somebody at Facebook doesn't know how PR works, it seems.
I would have gone with a more editorialized title but the submission guidelines prohibit it.
The Social Dilemma explains how the stupid 'you are the product' meme is incorrect, for instance.
It also explains that the emergent behaviours are not planned, so equally they are hard to stop with planned actions.
Compare the parallel logic of "steers are not the product: this ranch is funded by meat buyers so it remains free for cattle."
Go da we da peyeting. (follow the money)
I think the point is not that this statement is incorrect, but rather that it doesn’t accurately convey the severity of the situation. More accurate would be: “manipulation of their audience is their product”.
Manipulation of the audience in pre-TV days, ca. fourth century BC:
> "The emotions are all those affections which cause men to change their opinion in regard to their judgements"
Remember the Maine!
It will explain it and other concepts for you better than me. It's a good doco to start on.
I'm not my eyeballs. Facebook et al don't care about most of me.
My sex and age are 90% of the metrics they know how to use. The other 10% is stuff I've searched recently.
What they are buying is time.
The product concept is from TV, social media is different - https://quoteinvestigator.com/2017/07/16/product/