What “The Social Dilemma” Gets Wrong [pdf] (fb.com)
144 points by martingordon 17 days ago | hide | past | favorite | 121 comments



I found this press release pretty disappointing. It is very weak. It's like McDonald's responding to Super Size Me, the documentary about how bad McDonald's is for you, with a statement saying the filmmakers were unfair because they didn't show all the steps the company takes to keep harmful bacteria out of your food and ensure you aren't poisoned.

This whole press release is about the "steps" they have taken, yet very few of them are described in detail or in a way we can verify. There is no pledge to do better and no consequence for not meeting these goals. It feels to me that FB is a megacorp where the people in it want to do better, but the machine is so much bigger than the people in it that they are currently lost on how to do it without torching the whole company.


Your example is interesting because, in retrospect, Super Size Me was alarmist and unfair. Morgan Spurlock misled viewers into believing that his health problems were caused by McDonald's when they were in fact caused by alcoholism.

https://www.wsj.com/articles/a-big-mac-attack-or-a-false-ala...


Are you claiming that the Social Dilemma was likewise alarmist and unfair?


He didn't but I would. Towards the end of the film, one of the interviewees claims the US is heading for a civil war. The solution? Turn off notifications.

Give me a break.


I think what they're "claiming" is that it's worth considering whether that's the case here as well without picking a particular side yet.


Super Size Me was a turning point where the power of dishonest propaganda as "documentary" became a major force in culture.

Super Size Me, while dishonest about how bad McDonald's is after just one month, was shooting fish in a barrel.

Waiting for Superman, a charter school infomercial featuring discredited superintendent Michelle Rhee, showed the danger of the form.


McPaywall



> It feels to me that FB is a megacorp where the people in it want to do better, but the machine is so much bigger than the people in it that they are currently lost on how to do it

Maybe it's just that the whole concept, algorithmic curation of content instead of human editors, is flawed and impossible to fix. Maybe YouTube has the exact same problem, only with fewer real names and a higher barrier to entry (no text posts).

How can one ever assume that it's possible to design an algorithm that hands every individual a customized newspaper that is "good for him"?


It's not like traditional newspapers/outlets are good for people either. It's not necessarily the algorithm's fault.


It's really not comparable, since nobody knows what extreme outliers the algorithms produce. For newspapers, we know them.


No. That would be repressive and would quickly invite censorship from influential people and the state.

Better solution: limit the freemium model for software services on anti-competitive grounds.

If Facebook and Instagram stop being free, I would think a huge portion of the consumer base will quickly reconsider the value they provide, and alternative companies will have more of a chance to enter the market on a more level playing field.


Are Facebook or Instagram even freemium? I think of them as being ad-supported in their primary business model.


That would include free, too. In general, I think they should add some barrier to entry and force platforms to provide an ad-free way to consume media.


The bad points aside, posting a PDF press release in response to a documentary people watched on Netflix is kind of stupid. How many people who watched The Social Dilemma will actually see this press release, let alone evaluate its points?


The PDF is for the people who are hearing about the criticism (through people who have watched the documentary), and want to feel better about enabling facebook through their work or investment or continued use.


The press release is for press to read and report on which they can then pass on to policy makers.


Yes, it's probably not the best way to cool down criticism.

That aside, it would be a very good thing for society if all companies started addressing stakeholder concerns and contributing constructively to the societal discourse around their actions.

edit: tempus


> This whole press release is about the "steps" they have taken, yet very few of them are described in detail or in a way we can verify.

Not only that, but they focus on the steps they took rather than on the results they had. Sure, you can take steps to reduce addictiveness or polarization, but that doesn't mean they had an actual effect, nor does it take away the central argument the documentary is making.


Yes, they should put more effort into it, and use more than one page to argue against a 1.5-hour documentary. Speaking of McDonald's, they should also show that they are able to learn from other related businesses and not suffer from Not Invented Here. Election news has existed for hundreds of years. For instance, in Germany on election day the media can only do limited coverage of the election until the polls are closed.

Also, every media format so far has relied heavily on curation. I think this is ultimately the problem to be solved if they still want to act as a news distributor. The Like button doesn't really work; in particular, it's not possible to dislike, and the "like emojis" only indicate how emotionally charged the topic is.


Meh, documentaries are far less information-dense than written text. A very quick approximation: speed of speech is ~120 wpm, so 1.5 hours of constant talking would be roughly 10,800 words, and that is without all the pauses and dramatic music. An A4 page comes out at roughly 500 words, which means you could fit 1.5 hours of _constant_ speech in about 22 A4 pages.
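Redoing that arithmetic from the comment's own inputs (120 wpm, 90 minutes, 500 words per A4 page, all of which are rough assumptions rather than measured figures):

```python
# Back-of-the-envelope: how much text fits in a 1.5-hour documentary
# of non-stop speech. All inputs are ballpark assumptions.
words_per_minute = 120      # typical speaking rate
minutes = 90                # 1.5 hours
words_per_a4_page = 500     # dense single-spaced A4 page

total_words = words_per_minute * minutes        # 10,800 words
pages = total_words / words_per_a4_page         # ~21.6 pages

print(total_words, round(pages, 1))  # 10800 21.6
```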

Since this is infotainment, I bet it's far less than that. Another advantage of reading, when trying to get an objective perspective, is that there is no dramatic or eerie music in the background giving you cues on how to interpret the content on a nearly subliminal level. Furthermore, there are no faces for the less advanced parts of your brain to attach feelings to.

The Social Dilemma could probably have been condensed down to a 5-page PDF. All the dramatization is maybe good if you want to discuss this with 5-year-olds, but it's not dense with facts in any sense.


I can condense it in one line:

Social media companies sell the ability to influence you. That's why the product is often free.


True. They can't afford to say they have no clue about what they are doing.

Let's say I want to set up a small town of 10K people anywhere in the world. As soon as I try to do so, about a hundred different government departments will descend on me to enforce all kinds of rules built up over hundreds of years to guarantee health, water/sewage, fire, police, traffic, housing, education, labor rights, etc.

Take public safety and administration of justice - there is always a formula to calculate how many police/judges/public defenders a town needs. There are even Sustainable Development Goals from the UN setting up targets for different countries.

Now Zuckerberg/YouTube et al. set up these virtual towns with 2 billion people. And there are no formulas, and they think the 2 billion people within will patiently allow them to work a formula out while the world burns.

More likely Social Media execs are going to be swinging from trees in various parts of the world soon.

What's the solution?

De-scale.


> people in it want to do better

Profiting by deluding yourself is just lying and deflecting your guilt. Either give back the money you made from your "accidental" misbehavior, or stop pretending it's accidental.


Yesterday I watched The Social Dilemma, and I just wanted to share some of my experiences: I quit FB as a user in 2011. In 2013, when I was 25, I received a high-paying job offer to work at FB. Going to SV was a long-time dream, but in the end I made the hard decision not to take the offer. Ultimately, I really don't like what FB does. The perks and salary don't compensate for that.

I'm still on HN, LinkedIn and WhatsApp. Every time I open LinkedIn I'm shocked at how addictive the feed is. I go there to message someone, and before I know it 10 minutes have gone by and I've forgotten what I went to do in the first place. WhatsApp is really great, except that it is owned by FB and they still extract value from me.

In 2015 I turned off all notifications on my phone. Quitting FB and turning off notifications have really improved my state of mind.

Overall, I'm pretty happy with how I use social media, but I'm very worried about how my kids will be able to handle it in their teens.


Facebook recruiters periodically contact me. I've considered interviewing with them in the past, but there's no way in hell I'd do it now. As far as I'm concerned, they produce a toxic product, and their sludge is infecting all of society in ways that no one fully understands.

I wonder how many other engineers feel similarly. It feels increasingly like having "Facebook" on your resume is a badge of shame. Sure, you can make good money working there, but you also have to be able to sleep at night.


> I'm very worried about how my kids will be able to handle it in their teens.

The biggest factor after education would be the ability to be nonconformist. That requires having some leverage in the group. If you don't, you will quickly become a target for exclusion and bullying.

In school, it's not easy to avoid bad apples, and many parents are or will be negligent. So there will always be a few students using social media and trying to pull others into it, even if the current situation changes.

Teens also want to prove themselves, and the dynamics of social media provide cheap validation. You need intrinsic motivation about something to fight it, as well as support or acknowledgement from adults.

Recommended resource: https://www.privacytools.io/


Thank you! I already notice how very different my three kids are, and being nonconformist is going to be easier for some than for others, for sure. The link you've added seems mostly to be about software tools that block social media; I think fostering a resilient / nonconformist personality is going to be the toughest challenge.


> The link you've added seems mostly to be about software tools that block social media

There are also articles and communities linked there on privacy, surveillance, and related issues.

I think the first step would be to self-host your own social media and other stuff (Nextcloud?) with the family. Not every kid will be interested, but I presume at least one would be curious in your case. Start it as a hobby on a weekend and invite them. Get a Raspberry Pi and it will be a fun tinkering experience.


> I received a high paying job offer to work at FB

How did you get an offer without going through their extensive interview process?


I went through the process. They reached out to me, and at first I was very excited; I just wanted to see how good I was. Honestly, I didn't think I'd make it far. When I reached the offer stage after two months, I took a long time to think about whether this was what I really wanted, and the fact that it was FB was one of the deciding factors.

Oh, and high paying: it was like 3x my previous salary, but not high by SV standards.


> Oh, and high paying: it was like 3x my previous salary, but not high by SV standards.

This was before the company was viewed as completely morally bankrupt. They probably pay the highest in SV now to make up for that.


The film explains that when they say "you are the product" they mean Facebook is selling a chance to influence your behavior. It's an interesting perspective, but this PR piece from Facebook seems to willfully misinterpret it.


Facebook is giving perspectives that aren't even rebuttals, just rewordings of the same thing, which is hilarious.

“What we really mean is that we keep it free for you [because you are product exactly like they said, this is the only reason advertisers come here, not for the traffic but for the precision targeting]”


I'm surprised they bothered to make a response at all. The popularity of the documentary must be making someone at FB a little nervous.


Maybe they are already able to measure an effect of the documentary. That would explain the nervousness.


"Influence your behavior" seems like an over dramatic description of "show me ads".


They literally did it, it’s not over dramatized.

From 2014:

https://www.theguardian.com/technology/2014/jun/29/facebook-...

> now Facebook, the world's biggest social networking site, is facing a storm of protest after it revealed it had discovered how to make users feel happier or sadder with a few computer key strokes.

> It has published details of a vast experiment in which it manipulated information posted on 689,000 users' home pages and found it could make people feel more positive or negative through a process of "emotional contagion".


Pretty significant difference between "Facebook has done some research on changing moods" and "Facebook's product is changing your behavior."

It's overly dramatic in the way that I could say "Axe-wielding man splits homeowner's door, uninvited" or "Fireman had to break down a door," and both are technically true, but only one is apt.

Facebook's product is influencing my behavior, technically, maybe, but it's a clumsy description made for the sake of drama. It's more accurate to say their product is targeted advertisements.


I don't see what else you want. In 2014 it was revealed that Facebook did run experiments to see how much of an impact they can have on people's sentiments. Just check the study, it's public: https://www.pnas.org/content/111/24/8788.full.

The researchers found that Facebook's News Feed does influence sentiment in a contagious manner, which implies it does result in behavior change on the platform; otherwise they wouldn't have had a way to measure the effect in the first place.

And don't forget to check the Acknowledgement section:

> We thank the Facebook News Feed team, especially Daniel Schafer, for encouragement and support; the Facebook Core Data Science team, especially Cameron Marlow, Moira Burke, and Eytan Bakshy; plus Michael Macy and Mathew Aldridge for their feedback.

It's not like they hide it, that's exactly why advertisers and political parties partner with Facebook in the first place.


It's not that I "want" anything else. It's that Facebook's product is ads, even if they also ran experiments to change sentiment. If I want to make a million people sad, I can't buy that from Facebook (directly). Conversely, if I want to show a million people an ad for my widget, I can absolutely directly buy that from Facebook.


The PNAS paper showed very small effect sizes, to be clear. But, yes.


Yes, that's true. At the same time the argument of "The Social Dilemma" is that at Facebook scale you just need a very small effect to influence the crowd in a way that matters. Paraphrasing the documentary (from memory): tuning human behaviours 1% in the direction you want, worldwide, is what Facebook sells and what their customers pay for.


Isn't that the goal of just about all human behavior, to influence other humans' behavior? Is the issue with Facebook that they are too effective at it, or is it that most marketing and advertising behavior is unethical?


I don’t understand this hyper relativism.

I personally don't see my goal, or the goal of the projects I work on, the companies I work for, or anything else I contribute to, as being about manipulating people's behavior without their knowledge and/or consent.

Facebook has been running completely unethical experiments since forever (are you aware of their role in the Rohingya genocide in Myanmar?). They have been open about a lot of them, bragging about how good they are at manipulating crowds.

And yes, they are crazy effective. And they have the scale. They have unethical behaviors that are effective, applied at the scale of humanity.


My understanding of the Myanmar incident is that FB didn't swiftly block material used to inflame already-existing ethnic tensions. From what I know, the fault there is that they allowed communication in a language that their AI and human reviewers couldn't understand.

Was there more to it than that?


They do not just show ads. They also display content.

Existing in a filter bubble has a strong effect on your perception. Perception has a direct influence on your actions. Couple this with interfaces which are purposefully addicting ("high user engagement" is a euphemism), and you can very directly influence behavior.

The pervasiveness of smartphones means that these apps are only a few clicks away for virtually the entire world population. And worse, once these apps are installed on your phone, they relentlessly pull you back in with an endless stream of notifications.

It is not only Facebook. Applications like Reddit, YouTube, Instagram, Twitter, and TikTok all follow the same basic patterns.

It is not an overly dramatic description. If anything the public has been frightfully unaware of the influence that these companies can exert on the world. I am glad that this film has brought these issues into the spotlight.


The movie raises the question of what advertisers are paying for, and concludes it must be behavior modification. From public service announcements that smoking is bad for you to Tom Selleck endorsing reverse mortgages, advertisers pay to "show me ads" believing it will alter someone's behavior at least a little. Given the scale of Facebook's data and audience, advertisers can use Facebook to make small, surgical modifications to a vast slice of humanity. To me, this is a somewhat compelling argument that Facebook's "show me ads" business model really is pay to "influence your behavior".


"Show me ads" has, since the dawn of time, always been an effort to influence behavior. Magazine ads. Newspaper ads. TV ads. This is just a different medium that is much more surgical with its targeting.

The biggest danger in my opinion is how this time it's not just the ads that are changing your behavior but the platform itself.


The exact quote from the movie revises "you are the product" to:

> the gradual, slight, imperceptible change in your behavior and perception that is the product

It is not hard to argue that an ad fits such a description. Ads change behavior. We don't just have ads to buy things; we also have political ads. We have ads for charity. We have ads for religion. We have ads for mental health. We have ads for public awareness. There's a common argument that ads are just about selling you things, but that argument doesn't reflect reality, and it's strange to hear it from people living in a country going through a major election, where they are being accosted by ads encouraging them to vote. Considering we've been doing this for a few hundred years, I'm pretty sure someone would have noticed the wasted money if ads didn't influence behavior.


They persistently allowed ads aimed at influencing elections in nefarious ways and misused their users' data.

https://en.m.wikipedia.org/wiki/Cambridge_Analytica


Why is it over dramatic? They are trying to make you buy something, sign up for something, or vote for someone.


Take "make you buy" - are they "making" you do it, or are you in the market for a camera and they're providing you with a really nice choice of one that will likely suit your needs?

The answer of course is somewhere in between depending on context. People don't have zero agency, but they also don't have 100% agency.


Continuously feeding someone false information can change their perception of what’s right in the first place. That’s the whole point of impression ads. That’s why companies will pay so much to imprint an image on viewers.

If you are looking for a camera, going to a review site will give you a fair chance to find the product that fits your needs (most likely a lens module for your phone). Looking at Canon ads is not that.


They are making you associate brands or products with certain lifestyles, beautiful people, or status. This will bias your choices towards spending too much money on things that do quite little for your happiness in the end.


If an advert doesn't do a detailed comparison with other products, then it's trying to convince you by manipulation, not presenting you with information and relying on your agency.

A short version of this: any product-information media output that doesn't detail a product's principal flaws is manipulation intended to sidestep your agency.

When did you last see a fizzy drink (pop, soda) advert that said "tap water is better for you and cheaper, but our brand advertising is supposed to associate drinking our drinks with being cool, so choose to avail yourself of our continued wide-scale brainwashing of your society to help you feel socially superior"? Or one that said "we make sure our battery's MTBF is 2.5 years, with a narrow SD, so that it needs replacing just after the warranty period expires; the battery, though, is glued in. Clever, eh!"?

The central tenet of market capitalism's optimisation relies on consumers having perfect information and acting on that information. Advertising, at least every ad I've ever seen, is a direct effort to circumvent that.

Yes, people still have some agency, otherwise ads wouldn't be needed. But the purpose of ads is to erode that agency.

Your post sounds like the attempted justification of someone who uses advertising?


Ads are intended to influence you to buy things, even if you don't need them.


Shades of gray.

Facebook’s manipulation is arguably much harder to spot.

Sure they’re both forms of advertising, but at some point a new technology is so much more effective that it should be treated like a new thing.

I think that’s what people are arguing here.


I didn't realize how much these platforms police our thoughts until I started digging around for stock footage on Wikimedia Commons. It only took a few hours before I was watching air strikes on Iraqi communities, nuclear tests, people jumping to their deaths during 9/11. This stuff gets buried on social media, and it doesn't even appear on TV, yet I think it's important for all modern Americans to view it, because I believe most people would think differently and make different decisions if they saw such things.


I am curious to hear what you think advertisements are supposed to do.


“Show me ads” only sounds less menacing than “try to change your behavior” because we’ve become accustomed to ads for a century.


> For example, in 2018 we changed our ranking for News Feed to prioritize meaningful social interactions and deprioritize things like viral videos.

I wonder what happened in 2018

> [Cambridge Analytica] closed operations in 2018 in the course of the Facebook–Cambridge Analytica data scandal, although related firms still exist.[0]

I'm sorry, but if you lead with a policy change that was made after pretty much the largest scandal in your company's history, I think you know you're doing something wrong. If you want to claim the moral high ground, changes need to happen before they become scandals. It is okay to make mistakes. It is okay to fuck up big time. But this just feels disingenuous. The reason people like The Social Dilemma is that it is real people saying "sorry, we fucked up. We take the blame, but let's solve the problem." This response feels like a childish "We didn't fuck up, you did."

> We don't sell your information to anyone.

IIRC the movie didn't claim this. Most people criticizing FB aren't claiming this. FB is selling access to the data.

> just like any dating app, Amazon, Uber, and countless other consumer-facing apps

I'm reminded of my mom saying "If all your friends jumped off a cliff, would you?" I can't be the only one. Just because Netflix is guilty of similar practices doesn't mean you're in the right. No one respects this defense.

This response is weak and does not feel genuine.

[0] https://en.wikipedia.org/wiki/Cambridge_Analytica


"Wow! 50M hours a day of time saved seems like a lot. How could they be evil if they're getting people to spend less time on FB?"

50M hours / 1.5B DAU is about 0.03 hours, or about 2 minutes per person. If the average person spends 1h15m on FB, this is less than a 3% drop in overall time, and likely a larger proportion of the remaining time is spent scrolling past more ads than fixed on a single video.
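Redoing that arithmetic with the figures cited above (none of which I have independently verified):

```python
# Sanity-check the time-saved claim: 50M hours/day spread over 1.5B DAU.
hours_saved_per_day = 50e6
daily_active_users = 1.5e9
avg_session_minutes = 75    # the "1h15m per person" figure cited above

saved_minutes_per_user = hours_saved_per_day / daily_active_users * 60
drop_fraction = saved_minutes_per_user / avg_session_minutes

print(round(saved_minutes_per_user, 1))  # 2.0 minutes per user per day
print(round(drop_fraction * 100, 1))     # 2.7 percent of daily usage
```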


> The average person spends 1h15m on FB

Source? That sounds very high. Regardless, it's definitely a silly statistic. It could be that people just lost interest in Facebook.


Source is here: https://www.emarketer.com/content/average-social-media-time-...

Not sure how they collected the data though.


That's total time spent on social media. According to the source, Facebook's usage is 38 minutes a day. I suppose this also counts people who don't use Facebook though.


Ack, you're totally right. I pulled that link from https://sproutsocial.com/insights/facebook-stats-for-markete..., which makes the stat sound like it's Facebook-specific.

https://www.broadbandsearch.net/blog/average-daily-time-on-s... puts it at 58 minutes, and https://www.nytimes.com/2016/05/06/business/facebook-bends-t... (from 2016) notes that it's 50 minutes (apparently reporting by FB in an earnings report).


I wonder if Hacker News counts as social media.


Good thing fb never fudged data for the purpose of selling more advertisement

(for those who don't remember https://www.theverge.com/2018/10/17/17989712/facebook-inaccu... )


>> Instead we want to make sure we offer value to people, not just drive usage.

How are you providing value to people when you show articles from the same perspective to the same person? I would value a product that gave me different perspectives on the same issue. I guess their definition of value is distorted. And therein lies the problem, and the reason for filter bubbles.

Try logging out of Facebook and then logging in after a week. You will see a significant increase in the rate of notifications. How is this not driving usage?

>> We provide advertisers with reports about the kinds of people who are seeing their ads and how their ads are performing

Well, technically you are not the product. But when you combine hundreds of data points from millions of people and give advertisers access to that data, you are part of the product. So I guess it should say, "You are a tiny part of the product." There, I fixed it.

>> Facebook uses algorithms to improve the experience for people using our apps—just like any dating app, Amazon, Uber, and countless other consumer-facing apps that people interact with every day. That also includes Netflix, which uses an algorithm to determine who it thinks should watch ‘The Social Dilemma’ film, and then recommends it to them.

The key difference between Facebook and other services is that Facebook is a "social network". Things I post on Facebook are viewed by my friends, family, and colleagues, which affects how others perceive me and my social status. Facebook has the potential to literally shape my public image and my relationships. They completely fail to address this. I definitely don't get to choose the articles spewed out by their algorithm, and the articles I read or am shown definitely influence my thinking.

>> The overwhelming majority of content that people see on Facebook is not polarizing or even political

True, but there are certain "hot" topics. People usually have strong opinions on things like religion, politics, and sexual orientation. I wouldn't care that my friend is a cat person while I am a dog person. However, it would matter to me if my friend supported a candidate I vehemently oppose. People usually lose their senses when it comes to the "hot" topics, so a post on one of them has a disproportionate effect compared to a post about a vacation my friend is taking.


>> The overwhelming majority of content that people see on Facebook is not polarizing or even political

It's super easy to cherry-pick data that backs your point. I'm sure "The Social Dilemma" did this too. If instead of looking at content they looked at engagement (comments, likes, shares), I think political content would rank much higher on the list.


I've been on FB since its first year, and after seeing all the major changes delivered with heaps of hype about how mega-engagement would bring the entire world into one big kumbaya circle, I find it hilarious that the PR team thought they could get away with pointing fingers at someone else for having too much sensationalism instead of substance.


If you block facebook domains: https://archive.is/6zw6N


...you'll find the PDF here.


I haven't had Facebook since I gave it up for Lent in 2017 as part of a social media "time out", and I haven't looked back. I watched "The Social Dilemma" with my fiancée last weekend, and she deleted hers after (time will tell if it's permanent, I certainly hope so... she's kept Instagram for the time being but I'm hoping to convince her to get rid of that as well).

I thought about writing a point-by-point rebuttal to this, about the obvious lies and wilful misdirection ...

But then I realized that I just don't care anymore. I love being free of this company and I almost got sucked back in to writing flame war comments about [thing that gets more attention].

Folks, just give this company up. You'll be much happier, I promise.


Reposting my comment from a dead thread here on the subject:

Ha, I just finished watching the movie an hour ago with a friend. Of course it was obvious how the debate would be framed from the beginning: Facebook is supposed to be "unbiased", and from this point of view political inaction is the high-minded route over traditional low-minded autocratic rule. Instead of curating a high-quality news feed for users to consume, users get to pick what they want to listen to, because forcing a user to watch a curated news feed would be biased against the news organizations that operate and compete for engagement on it. So Facebook does have smart leadership, and with some fake-news initiatives they have been proving themselves to resemble an unbiased company more than an autocratic government.

However, our individual news feeds are not unbiased. Letting individuals control what they see increases their bias. From this perspective, Facebook is now inherently biased towards extremism (our sad reality proves this point).

The way out of this situation is confusing, and neither Facebook nor the movie talks of any real action. How about, instead of clutching our unbiased pearls, we collectively learn to appreciate and understand the inherent biases in everything? Everything involving humans and our psyche is inherently biased, and there is no objective truth on the big issues surrounding our differences in values.

Now the question becomes not one of bias vs. facts, but of better and worse bias. If we could create a Facebook news stream that is inherently biased towards bringing people together rather than splitting us apart, it would solve this new question quite cleanly and, as a plus, fix our pathetic situation. This isn't a radical new idea created in my head 2 hours ago; it's a rather established view on this important issue: https://youtu.be/ZbPt66TYsFM

The big question, I think, should be how we design this "good" bias into our social media, and how we convince everyone that everything is biased, that there is literally no point in looking for unbiased sources of anything, so that the debate can be reframed as good bias versus bad bias.


> The big question I think should be how do we design this "good" bias into our social media

Yeah, no thanks. Algorithms aren’t the solution to this particular problem.


I hate it too (pointing to the out-of-context quote); I don't use social media, but it's here and it's causing a problem. The solution, however, is to remove the algorithm. The end goal of this proposed solution (not mine, again) isn't to bring "good" bias to social technology through algorithmic means, but to find good bias through a 100% human-led experience.


I don’t think that skim…milk said anything about this having to be done by algorithms.


One of my favourite internet quirks is that I always seem to hear about these things due to the Streisand effect

Am gonna watch this Social Dilemma myself now. Cheers for vouching for it, FB.


This Chrome extension named Social Book Post Manager (https://chrome.google.com/webstore/detail/social-book-post-m...) seems to be the easiest way for users to delete all their past history (or selective portions) on Facebook, seeing as how Facebook provides no way to bulk delete your likes, posts, photos, etc except by deleting your account entirely. The authors claim it is completely safe to use:

> This extension is completely written in Javascript, and running completely within users Chrome browsers. All the source code can be viewed with Chrome Developer Tools. The code has been analyzed many times by many other developers. The extension has been used by nearly 200,000 users. If there is anything bad, it would have already been reported. You can feel absolutely safe to use it.

Does anyone know if this is indeed a safe tool to recommend to friends and family looking to thin their social media presence?

Are there equivalents for other social media platforms like TikTok, Twitter, Reddit, Instagram, etc?


While I can't say for sure it was that exact extension, I did use something with a very similar name about a year or two ago to completely scrub my Facebook profile. Apparently they had renamed it a couple of times, I assume to purge trademarks from the extension name.

It worked as advertised, and I deactivated my now-empty account and went on my merry way.


`We provide advertisers with reports about the kinds of people who are seeing their ads and how their ads are performing, but we don’t share information that personally identifies you unless you give us permission.`

`unless you give us permission`

lol


They mean via an explicit permission dialog. Not something hidden in the TOS/Privacy policy


One of those dialogs people click "OK" on without reading because they're lazy or in a rush looking something up.


"Yeah but it's not like they're banking on that, nobody thinks that far ahead."

If you've spent even a minute working in the marketing world, you'd know that yes, yes they do. They think exactly that far, and then about another 10 steps beyond that.

Those prompts are most definitely designed to be clicked away by the user without them fully grokking the consequences. Unfortunately, it's not measurable, provable, or enforceably illegal.


I think the problem is that the following statement may be true when you consider all posts from all across the world.

"The overwhelming majority of content that people see on Facebook is not polarizing or even political—it’s everyday content from people’s friends and family. "

But in my case, and for many of the people I know, you can't even open Facebook without seeing some highly charged political bullshit. I've largely abandoned the platform, as it seems all it is is an echo chamber of people I know, of all political leanings, posting shit that strongly, and often inaccurately, supports their political viewpoint.

It's no longer a platform of people casually sharing their personal lives; it's seemingly an endless stream of worthless shares and reposts.

I really want to see a social network that limits the ability to repost and "share" content like this, maybe going so far as to do copy-and-paste content checks to make sure it's not just shitpost copypasta.


Summation of this article: “you’ve got it wrong. Our cigarettes aren’t bad. We pack them full of flavor to provide you with ‘value’”


It seems to me that if your platform nurtured social conditions that caused an actual, literal genocide, you should immediately, radically rethink the whole way the platform works--or, safer still, just shut it down and use your power to lobby for legislation to stop a thing like that from ever happening again.

Instead, the answer was to trot out some absolutely inexcusable, banal platitudes. And now we get to see similar effects roiling the USA and seeping more and more into the world at large. The fact that they're continuing to trot out such platitudes this far in is simply indescribable. I understand the psychological foibles that can lead very smart people to deny the monstrosity of their own creations, but I am nevertheless filled with fathomless rage that Mark Zuckerberg prioritized his ego and his shareholders above the good of all mankind, whether that was conscious or driven simply by his unwillingness to accept the true nature of his creation.


I see a bunch of "bad thing X was said about our product, we developers of said product say that our product does not do aforementioned bad things".

So literally like any product saying "We of product X recommend product X".

And also, if they didn't put out this document, they might later get sued, and their lack of response to "The Social Dilemma" could be used as an argument that they don't care, costing them the case.

But about that data...

See, I've always heard Facebook makes the largest part of its money by selling data about its users.

Is that not true? Or, even worse, was it wrong of me to assign value to articles and news stories that stated this? If so, why?

What about the reverse? Would there be those who consider it foolish to place stock in a statement that says Facebook does not sell user data, or sells it anonymized?

It feels to me like a deliberate attempt to move all conversation on the subject into the territory of "you can't really know for sure, so SHUT UP".

Get people to stop talking about it, by virtue of casting doubt on everything anyone says.

Meanwhile, Facebook keeps doing whatever it's doing. Nothing changes.

Facebook could fix this problem by opening up about what it does with the data. But it refuses to do that. Trade secrets, I suppose?

Either way, it's Facebook who's handling the data, not us. As such, all responsibility falls on them, not us.


> See, I've always heard Facebook makes the largest part of its money by selling data about its users.

Categorically untrue. They sell advertising space. The data is used to target the ads, which are often paid for based on performance.

Selling the data would be a terrible idea... if anyone else had it, it would undermine their value proposition.


Was this intended for public consumption?

Directly responding to the film by name is such an obviously terrible idea PR-wise, I’m shocked to even be seeing this from Facebook. It must be an accident.


The entire lingo of this “release” is mind-numbingly vapid: “made improvements in...”, “partnered with ${vague}”, “take steps to...” Many words saying nothing.


“Exciting, FB responding to criticism. Let’s see what they’re doing.”

... “Ah, so they’re doing nothing.”


The butcher insisting they are 100% vegan.

I haven’t watched “The Social Dilemma”. I think it is odd to mix fiction and documentary, but just looking at the denial, I don’t buy it.


I think this is a pretty good response. My only issue with it is:

"The overwhelming majority of content that people see on Facebook is not polarizing or even political—it’s everyday content from people’s friends and family."

When I used FB I didn't find it polarizing or political but I did find I couldn't see content from people I knew. Other pages and crap overtook it and filled my feed. It was essentially useless to me.

I don't think FB is evil.. I think they are just about competent. Everything from them has been crappy. Remember when they spammed emails, chat was terrible, the apps were terrible, the notification icons didn't work on the website, FB messenger didn't work well.. etc, etc. Some things have been improved now but their talent is pretty lacking for a big tech company.


> The overwhelming majority of content that people see on Facebook is not polarizing or even political—it’s everyday content from people’s friends and family

This seems totally dependent on how political your friends and family are. The past few months, many of my friends and family have been overtly political on Facebook, and I ended up seeing it. I don't think that's easily avoided in a contentious election season.

I made an attempt at quitting Facebook about 3 weeks ago, and have been successful so far. Then I just learned another family member joined, and I'd like to connect. With my family spread out so far, it's one of the few ways most of us can stay connected. So... I'll probably be back in the FB world in small doses in the next week.

That said, I didn't find leaving all that liberating. Some people talk of some sort of new found freedom after stopping using facebook, but I didn't feel that way. I just have a bit of extra time during the week, which I've mostly filled with phone calls to other people, so there's probably not a huge net win from a time perspective. There may be some benefits re: privacy/data collection, but I've been on it for ... 11 years at this point. I'm not sure a few weeks off makes much of a difference yet. :)


Facebook feels threatened by a dramatized documentary? That’s unexpected.


I guess they saw an effect of people leaving it.


Well, they’ve had 5 years to deal with their problems. We can take them at their word that they tried really hard and made these absolutely massive investments, but the platform is still an addictive petri dish of misinformation, conspiracy theories, and continues to drive polarization.

If they were really doing the best they can, the logical conclusion is no amount of effort can actually fix this, and a global, engagement-based social network is too toxic to exist.


I was surprised this is a pdf, but now wonder if they did that to reduce tracking of readership and sharing.

If so, that’s a positive thing for them. A PDF doesn’t carry the Facebook Like tracking and shadow tracking. They can still see some traffic from the raw HTTP GETs, but it skips the ubiquitous Facebook analytics that the documentary talks about.


> But ‘The Social Dilemma’ buries the substance in sensationalism.

This was my conclusion too. There were some genuine good parts, but most of it felt too hyped because the explanations were too flavorful and simplistic, or taken too far. It felt like I was watching The Secret or Zeitgeist.

> Rather than offer a nuanced look at technology, it gives a distorted view of how social media platforms work to create a convenient scapegoat for what are difficult and complex societal problems.

This sentence lacks the very nuance they criticize the film for missing. While I think it's a fair position to state that it gives a distorted view (due to simplification, for example), that doesn't mean the filmmakers are creating (or intending to create) a convenient scapegoat.

> The film’s creators do not include insights from those currently working at the companies or any experts that take a different view to the narrative put forward by the film.

Well, current employees have a lot to lose by being honest if they disagree. And the documentary did have experts who took a different view and told a different story from the main narrative. I was actually on the lookout for that, because I was upset with the simplistic cookie-cutter bullshit it was feeding us.

---- Reacting to the points ----

1. They say they're making efforts to make responsible use possible. The issue with this is, we can't check this. So reputation and trust is all we can go on. So I'm putting this in the neutral category.

2.

> We don't sell your information to anyone.

Wasn't Cambridge Analytica a thing? If so, then I don't care whether you sell or don't sell; the info gets out there. It feels disingenuous to put that sentence in there as a response to "you are the product".

I agree that the "you are the product" mantra is nonsensical. Uh-huh, right, because it's short and sweet it must be true? Nonsense; no true explanation I've ever read devolved into mantra-like statements. IMO it's a "code smell" that something is off.

...

I'll stop here, this is getting way too long.

Long story short: Facebook is being questionable here at best, despite that I agree in general with them that The Social Dilemma is nonsense.

The incentives here are all warped :/


A company that claims it isn't manipulating its users, yet has teams dedicated to identifying user emotions in order to change user interactions.


It's all about agency; the intermediary becomes accountable through its manipulation of the conversation. To avoid these problems, features of the system should emerge from negotiations between participants, not be announced by decree.


*Voting is one such mechanism; it works best together with a preceding conversation.

sakisv 17 days ago [flagged]

I wasn't planning on watching Social Dilemma, but seeing that it rattled FB enough to issue a "wait, let us explain why they're wrong" statement I think I will watch it.

Also, fuck you Facebook. You had multiple chances to redeem yourself and start being not-so-corrosive for the society and you intentionally went out of your way to make the wrong choice every single time.


This comment breaks the site guidelines, which include: Please don't fulminate. https://news.ycombinator.com/newsguidelines.html

Maybe you don't owe Facebook better, but you owe this community better if you're commenting here.


Apologies Dang, it wasn't my intention to do so.

Thanks for pointing it out.


Appreciated!


Only commenting because of the /wp-content/ in the URL. From FB? I had expected better...


"Facebook's experts say Facebook not evil, Facebook says."


"Tobacco companies say cigarettes are good for you"


"Here, come take one. I'll light it up for you. That's better, now you look like a badass cowboy."

/s that was the part I was missing from the PR post. Somebody at Facebook doesn't know how PR works, it seems.


[flagged]


People may upvote the submission because they care about the discussion, not because they think Facebook response is good (I didn’t upvote myself, but that what I would do).


Yep, I submitted not because I agree with it (I actually think it's a pretty laughable response) but because I think it warrants discussion.

I would have gone with a more editorialized title but the submission guidelines prohibit it.


this press release is hilariously BAD


I have to say I'm normally on Facebook's side (it's not the toxic, fair-election-destroying, divisive hole Twitter is), but this document is junk.

The Social Dilemma explains how the stupid 'you are the product' meme is incorrect, for instance.

It also explains that the emergent behaviours are not planned, so they are equally hard to stop with planned actions.


How is "you are the product" incorrect? Facebook's customers buy eyeballs, so how are eyeballs not the product?

Compare the parallel logic of "steers are not the product: this ranch is funded by meat buyers so it remains free for cattle."

Go da we da peyeting. (follow the money)


> How is "you are the product" incorrect?

I think the point is not that this statement is incorrect, but rather that it doesn’t accurately convey the severity of the situation. More accurate would be: “manipulation of their audience is their product”.


One of the nice touches in "Gravy Planet" (1952) is the throwaway sentences that show, via revealed preferences, how the ad-exec protagonist is also a product of the system he helped create.

https://news.ycombinator.com/item?id=24576623

====

Manipulation of the audience in pre-TV days, ca. fourth century BC:

http://www.perseus.tufts.edu/hopper/text?doc=Perseus:text:19...

> "The emotions are all those affections which cause men to change their opinion in regard to their judgements"

Remember the Maine!


Watch The Social Dilemma.

It will explain it and other concepts for you better than me. It's a good doco to start on.

I'm not my eyeballs. Facebook et al don't care about most of me.

My sex and age are 90% of the metrics they know how to use. The other 10% is stuff I've searched recently.

What they are buying is time.

The product concept is from TV, social media is different - https://quoteinvestigator.com/2017/07/16/product/



