
Ask HN: What do you think about “The Social Dilemma”? - messo
For those who have seen the film on Netflix already; what are your thoughts? I am especially interested in hearing from people involved in any algorithm-driven platforms.

This is the first time I have seen such a coherent, powerful and accessible explanation of the mechanics of algorithms and the negative consequences of social media, and I wonder if this film can be the push that non-technical people need to take a step back and maybe even delete their accounts.

Anecdotally, it was my non-technical, Instagram-loving partner who saw this film first and recommended it to me. She re-watched it with me and is now asking serious questions about the platform and her continued use of it. She can't be the only one.

What will the cultural impact of this movie be?
======
newscracker
Though there was hardly anything that _I didn't already know_, I liked the
movie very much for its presentation, which drove the point home for people who
may not have understood how these platforms work, the privacy implications and
the dangers therein. I've already recommended this movie to some more people.

As for the cultural impact of the movie, I don't think it's going to be much.
Cambridge Analytica had so much news coverage during its time (along with
public hearings by lawmakers and documentaries about it) and still did nothing
material to the bottom line of these companies. They've actually grown bigger
and become a lot richer since then.

1. Most people just wouldn't care enough to give up these platforms. While
I've been enraged for a long time about these platforms, the big gap here is
that there is no good answer to the question, "what are the better
alternatives?" Don't tell me that Mastodon and Mastodon clones can be
replacements for Twitter, Facebook, Facebook groups, Facebook events,
Instagram, Snapchat, TikTok, etc. Where are the nice(r) mobile apps (not just
some website designed for desktops) for any replacements?

2. Governments will not regulate these platforms in meaningful ways that
create fundamental changes. Regulatory capture is what's looming, where
the current biggies make the rules and ensure that nobody else can beat them.

I will keep pushing people to switch to better platforms (even if they seem
deficient in comparison), but I'm sadly not very optimistic about big changes
in the next decade or so.

~~~
rapnie
> Don't tell me that Mastodon and Mastodon clones can be replacements for
> Twitter, Facebook, Facebook groups, Facebook events, Instagram, Snapchat,
> TikTok, etc.

Of course not. Mastodon is a federated microblogging service; as such, it is
a good alternative to Twitter. My Mastodon client has a better UX than
Twitter: no algorithmic feed, no ads, no trackers, no distracting
recommendations, and, importantly, a friendly, open and largely
non-toxic culture.

Sure, it only has a tiny number of users in comparison to Twitter (about 4
million), but there is no reason it can't be many, many more. And it is hosted
on a federation of thousands of servers, so there is much less risk of being
de-platformed from your favorite walled garden for obscure reasons.

The Fediverse is only at the start of its evolution, and things are becoming
exciting for us devs. Diverse applications, PeerTube, PixelFed, write.as,
Lemmy, many more, all start to be integrated together. A gradual process, as
the underlying standards mature and new ones (content addressing, object
capabilities, datashards) are being developed. Besides federated ActivityPub
there's work ongoing to extend to full p2p contexts.

I'd say you should have developer FOMO, or you'll miss out on opportunities to
stand out, and on being part of a path that could be part of the solution to
our social dilemma :)

~~~
auganov
> Much less risk to be de-platformed from your favorite walled-garden for
> obscure reasons.

Federation does absolutely nothing for this. The risk is probably even higher
given how small most of these instances are. Much more likely to run into an
admin/moderator.

~~~
vorpalhex
In the fediverse, you're more likely to be able to say "Hey Lisa, why did you
ban me?" than you are to get in contact with the YouTube/Facebook/etc. forces
that be.

That may or may not be better, because your admin may respond "Well, because I
think you're a jerk and disagree with your political views", but at least you
can still migrate to a new instance instead of just being banned for all
eternity.

~~~
auganov
If you want to keep your followers you'd need to migrate before they ban you.

But yes, I guess you can start over somewhere else while on big ones you're
out forever. So in that sense I guess it is a tiny improvement. But if most of
your followers were concentrated on server[s] that don't like you, you may not
be able to reach them anymore.

~~~
MrGilbert
If your followers are "real" followers, and if you are important to them,
they'll get in touch outside the fed and will know where to find you. Maybe
that's what we should strive for. I have 200+ followers on Twitter, but if you
ask me, I can only name 15 with real, social interaction.

And I think that's great.

~~~
auganov
If that's how you define "real" followers then you might as well use an IM
application instead. IMO the whole point of (Twitter style) social media is
having effortless reach and interactions without explicitly sending stuff to
anybody.

I had over 10 thousand followers and many interactions with people, but I
wouldn't feel like bothering them on other platforms (unless it was of the
same nature).

And the few people who bothered to contact me personally felt a little awkward
too. I don't necessarily want to be talking to these people directly.

The magic is you just publish whatever you want and people can look at it.
They can comment but there's no obligation to respond. Having actual
conversations with that many people would be a huge chore. The whole thing
isn't about having relationships with people at all but about spreading
information. It could be about yourself in which case there would be some
implicit one way relationship but that's just one use case.

~~~
searchableguy
Maybe that's why people are feeling lonely. They think their voice should
reach thousands of people while they simply don't care about those thousands.
How can I expect someone to care about me when I don't care about them beyond
a useless number or marketing campaign?

People are thinking of others as consumers while expecting to be treated as
friends?

------
alexmingoia
I think what I think about docudramas in general: Hyperbolic. I think the
harms of social media are exaggerated, and not entirely unique to social
media.

I don’t have a problem with Facebook knowing what content I interact with and
using that to serve me ads, or content that I want to spend time consuming. I
don’t agree with the characterization of publishing or volunteering
information (likes, posting photos, profile details) as private information.
Facebook makes clear what information is public, what is shared to friends,
etc. I think most people are relatively aware their interactions determine ad
selection, as it is intuitive. It’s really not scary to me that Facebook knows
which city I live in to target ads.

I don’t agree with the characterization of engineering apps to provide
entertainment or social value as “manipulation” in the harmful sense. The fact
that something is enjoyable to use or is used a lot isn’t necessarily a bad
thing, and is not necessarily an addiction. To use another example, I would
want game developers to “manipulate” me to enjoy playing a game. That’s the
point. The fact that people spend lots of time on social media just
demonstrates that they derive something from it, not that it’s addictive or
harmful. Is social media a worthwhile use of your time? Maybe not. Is it an
intrinsically harmful activity? No.

I think the Internet in general allows for the more rapid, selective spread of
information, and that it lowers the cost of publishing. It follows that ideas
can spread quickly, including conspiracy theories. I don’t agree that belief
in pseudoscience, rumor, or superstition is any more prevalent than it was
before social media. I think it’s just more visible to everyone how many
people do believe wacky ideas. Regardless, social media isn’t going to
suddenly make me believe the moon landing was fake, so I don’t see it as being
harmful.

In short, I don’t think ad targeting is harmful or a violation of my privacy.
I don’t think social media is harmful either, though passive consumption of
media is not the best use of time, much like video games or watching TV.

~~~
save_ferris
> The fact that people spend lots of time on social media just demonstrates
> that they derive something from it, not that it’s addictive or harmful.

I was totally with you until this moment. This is like arguing that people
with alcoholism get something out of drinking too much.

It’s one thing to say that you’re comfortable with the amount of information
that’s collected about you and how it’s used, but how can you ignore the fact
that these companies hired experts in psychology and paid millions to keep
people using their apps, just like child psychologists were deployed to TV
advertising in the 80s and 90s? These apps are absolutely addictive because
they were explicitly and intentionally designed to be.

Spending hours in an infinite scroll wormhole does not automatically mean
you’ve derived something important or worthwhile. It just means you were
entertained enough by the possibility something might be good if you just keep
scrolling, and that’s a problem for some.

~~~
blueterminal
> These apps are absolutely addictive because they were explicitly and
> intentionally designed to be.

Same goes for fast food, TV shows, etc. It's your responsibility not to
over-consume anything; and if you have children, to teach them not to do so
either. The problem is not with FB, Google, etc. The problem, which is
over-consumption of everything, is cultural. Blaming specific corporations is
easy, but it won't lead to any positive solutions. We desperately need
stoicism in the West.

~~~
diob
Speaking as someone who came from a rather colorful childhood of abuse, I'm
wondering what your personal past is? You're assuming so much about the lives
that folks lead, and to me it speaks of a lack of empathy for the personal
struggles each individual is enduring.

Stoicism will fix things in the West? I assume you mean the USA, but as a
person in the USA, I have to say that we are about as stoic as you can get. We
lean on personal responsibility for every little thing.

"Oh, you went bankrupt from cancer. Why didn't you plan for that? Don't you
know around 50% of folks get that?"

"Oh, you went to college for a useless degree? Why didn't you have someone in
your life teach you that was a mistake? Why didn't you reject the counselors
in public schools who encouraged you down that path?"

"Oh, you eat too much sugar? Why didn't you reject the industry sponsored
studies that downplayed their health effects?"

"Oh, you got covid? Why didn't you stay at home? What do you mean you're an
essential worker? Why didn't you take care of your health so you weren't
predisposed to it?"

It's not stoicism the west needs, but a dire injection of empathy and
community.

~~~
merpnderp
It's not just an argument about stoicism vs. empathy, but freedom vs. control.
How much paternalism do you want from your politicians? How much control over
your daily routine do you want to give some barely accountable millionaire
legislator/governor/president whose loyalty to you depends only on how many
votes their donors can buy (the rate differs greatly depending on the
politician)? The more effectively money buys their votes, the less they give a
crap about you.

~~~
jessaustin
We need a better word than "paternalism" for "not inclined to use a pandemic
as an excuse to give trillions of dollars to the rich".

------
nip180
It was very ironic that Netflix, which once said that sleep is its biggest
competition, released a documentary about how its rivals in the attention
economy are harmful.

Yes, Netflix doesn’t advertise directly to me, but many of the effects are
still present. The way I think and behave can be changed by programming I
watch on Netflix. Netflix uses a recommendation engine to keep me engaged.
Netflix has implemented several design patterns to keep people engaged, such
as auto loading the next episode.

It was specifically ironic for them to attack YouTube for kids while Netflix
has a children’s offering, and most of the known downsides of screen time
for children apply whether or not advertisements are present.

~~~
aoshifo
Well, the difference is that YT needs you to watch more in order to show you
advertising to make money. Netflix just needs you to pay your subscription.
That is a big difference, as far as the payoff of using psychological hacks
goes. I believe that if Netflix were ad-based it would be a hell of a lot more
addictive.

~~~
nip180
TV addiction is a huge problem in the Western world. The average American
spends more than 2.5 hours watching TV a day, and Netflix is a large part of
that. Even worse, Netflix promotes “binging”, i.e. watching TV for long
periods of time.

Netflix is clearly addictive, and its business model is based on people
spending large amounts of time watching its programming and talking about
it.

------
hef19898
The funny thing is, I always wondered what it would have been like to live in
the 30s: mass media, populist movements and all of that. I kind of get it now.
Social media is the new mass media. And as the last time something similar
happened, smart populists used it to their advantage, industry supplied the
tools, and we all know how it ended. Not saying we are heading the same way
just yet, but the possibility exists.

One could probably have made the same documentary about radio and TV 80 to
100 years ago. Just watching the film now; thanks for posting this
question, I could have missed it otherwise.

EDIT: The guy who invented the Like button is right up there with the guy who
invented CFCs. Not that he had any bad intentions. Which is what worries me:
a group of people with good intentions, doing good things, can easily create
dangerous, exploitable things in the wrong people's hands.

~~~
lefrenchy
I’ve often thought about this too, I think the printing press is a similar
time to examine as well [1]. I do think the internet provides an entirely new
scale, the dissemination is global and instantaneous, and that makes the
previous problems that much more explosive and harder to contain.

[1]: [https://www.theatlantic.com/magazine/archive/2020/01/before-...](https://www.theatlantic.com/magazine/archive/2020/01/before-zuckerberg-gutenberg/603034/)

~~~
hef19898
Yeah, scale and speed are maybe the worst part compared to other periods of
the past. Ways to manipulate the masses have always made some extremely
powerful. Religion and church in medieval Europe for example.

Ultimately, we will figure it out. We found ways to cope, as societies, with
mass media, radio, TV and so on. We will with social media as well. I'm just a
tad worried about the way we have to pass to figure it out and what might go
horribly wrong along that way.

EDIT: They lost me a little bit on the AI scare part. Up to now, we are
talking about ultra-complex algorithms, the kind of algorithms that screwed up
Wall Street. Hard to understand, sure, but still "stupid" algorithms. The
difference between Wall Street and social media? Wall Street has closely
monitored, public, real-time KPIs to measure, indirectly, what the algorithms
do. Social media has no such thing.

Oh, and we should read up on Goebbels theories on propaganda. The principles
still apply, just the scale is so much bigger. The algorithms have no
intentions yet. People using these algorithms do. And we have to find a way,
as societies, to cope with that.

------
jordwest
I don't work on algorithm-driven platforms, but I built a tool [1] to help
deal with them.

I'm very glad to see this reaching the mainstream. Like most people in the
tech industry with some understanding of how these things work, I've been
increasingly worried about the harms it's doing to myself, others and society
at large. Once you've seen the damage it's doing, you can't unsee it. It
drives me crazy that others can't see the same things.

However, I think this film did a great job of conveying to non-tech people how
the systems work and take advantage of them. I think of it a bit like those
documentaries that expose the worst of the meat industry, where it's not
uncommon for viewers to become vegetarians.

My optimistic outlook is that more and more people will wake up and turn their
backs on these harmful things, and the era of Facebook etc will, thankfully,
be over.

[1] [https://chrome.google.com/webstore/detail/news-feed-eradicat...](https://chrome.google.com/webstore/detail/news-feed-eradicator-for/fjcldmjmjhkklehbacihaiopjklihlgg?hl=en)

~~~
zls
Thanks for NFE!! That and FB Purity have reduced the time I spend on Facebook
to nothing, except for messages.

I put a bit of effort into finding extensions like this, plus Distraction Free
YouTube, Inbox When Ready, Amazon Lite, etc., this year, and I've been shocked
at how much less "sticky" I'm finding the internet. Willpower and personal
responsibility are great, but better yet is not even having to try.

When I use my phone, the difference is absolutely incredible, since I can't
bring this huge suite of extensions with me. As a result I've been
uninstalling apps gradually over time -- you can disable surprisingly core
functionality on Android, including the web browser.

~~~
jordwest
I'm glad to hear it's helped you :)

> Willpower and personal responsibility are great, but better yet is not even
> having to try.

I think as humans we overestimate just how much power we have over our minds.
Especially up against these algorithms that have been trained and honed with
billions of compute hours by thousands of engineers, we just have no chance.

------
nyxtom
Social media is this generation's smoking issue. One of the predictions made
in this movie was that humanity will not be able to survive if we continue to
be blinded by our own willful ignorance due to these addicting sources.

I think they hit the nail on the head that ultimately the goals of the
platforms are at odds with your own intrinsic goals and the goals at large for
society. Something is going to have to change eventually.

~~~
Nasrudith
Believing that willful ignorance is due to some external X is itself willful
ignorance. We have it documented as far back as the Bronze Age philosophers.

------
rogerdoger123
The problem is that many people, including my wife, are OK being shown ads.
After watching the documentary, I uninstalled Instagram and logged off
Facebook as I hated the thought of being manipulated. My wife (and her
friends, for that matter), though concerned over the impact on kids, was back
on Instagram about an hour later: "I'm ok with all these ads, some are
useful."

So though it should help parents navigate this better for their kids' sake,
I'm not sure it will impact everyone as much as we think it might.

~~~
Nextgrid
I think that advertising is the main problem. Most issues caused by social
media ultimately originate from this way to generate money out of wasting
people's time. Cut that source of revenue and suddenly there is no longer any
incentive to get people addicted, encourage outrage, promote
misinformation/clickbait, etc.

Related: [http://jacek.zlydach.pl/blog/2019-07-31-ads-as-cancer.html](http://jacek.zlydach.pl/blog/2019-07-31-ads-as-cancer.html) (I
totally agree with all of it)

~~~
ricksharp
I think you may be right. I've always hated ads, but never thought about why.

They have never done anything good.

If I need something, I can search for it. If I don't need it, I don't want an
ad to try to instill some desire in my heart.

If something is really useful, word-of-mouth is a good way to learn about it.

What would it be like to live in an ad-free society (that still promotes
healthy mutually beneficial capitalism)? Is it even possible?

------
jmdocherty
I'm a fanboi! I found this film incredibly powerful (esp. the comments during
the closing credits). I've become a real bore when chatting with friends and
other parents about this subject. Now I can simply say "watch The Social
Dilemma" and return to discussing the weather!

I found the tips on [https://www.humanetech.com/take-control](https://www.humanetech.com/take-control) very helpful for interrupting my
addiction to my phone (particularly the monotone one).

~~~
luizfzs
It is great indeed. The only thing that disappointed me was the way they
presented the increase in depression data: they mention a 100-and-something
percent increase, which is a percentage computed from two percentages.
Other than that, I think that it brings to light what (possibly) happens
behind the covers of those systems, as it's always trying to get you back on
the platform. I remember that every time I get a notification about something.

This film and The Great Hack form a great duo, in my opinion.

------
fellellor
I couldn’t help thinking of the movie/documentary itself as an act of
manipulation and an attempt to distract from the real issue.

The real problem is the way big venture capital funds these companies,
allowing them to become so big while suppressing all competition in order to
concentrate profits. This is what has made these companies into the monsters
they are now. In this era of big tech, the focus has shifted, IMO, from
creating and implementing fair and simple common standards to, among other
things, walled gardens and egregious bullshit wherein people don’t even own
most of the things they’re paying for.

The act of manipulating others into situations that are favourable to yourself
is not something new and has probably been practiced for centuries. The
documentary, which I admittedly couldn’t force myself to finish, seemed to
gloss over the big money aspect to focus (in the most cringe way) on the
psychological manipulation aspect, which made its insights unoriginal and
shallow.

~~~
23B1
I'd wager a movie on a mass distribution platform like Netflix is going to be
more effective at making people aware of the problem and helping them mitigate
its effects on an individual basis... than, say, trying to change the way
capital flows.

~~~
fellellor
My point is that it’s easier to make an argument about this aspect of SM than
to talk about the money, trace it, and pinpoint blame. Nowadays, there is
plenty of evidence available about the psychological effects/harm of social
media, so it’s easier/harmless to fund this kind of exposition than anything
that talks about the finance side.

------
mikewarot
It told me many things I already knew... but it enabled me to summarize them
in a new way. Facebook, YouTube, Twitter, etc. all exist to sell your
attention to advertisers. This means they want to keep you on site for as long
as possible. The algorithms find the nearest rabbit hole and gently push
you in that direction. The hole might lead to a new skill, or just plain fun,
or could be horrible for you and society. They don't care about the effects,
they only care about the profits.

Evil isn't the intent, it's just the side effect.

~~~
bob33212
Zuck isn't evil, he just wants to give people what they want and take no
responsibility. "If they were not on Facebook they would be on Twitter or Fox
News or MSNBC. I might as well get rich instead of someone else."

------
timeuser
It was a good explanation of some issues with current social media platforms.
But I don't think that ad driven business models and algorithmic content
manipulation are the biggest problems we are facing in all this. The
fundamental problem is that we humans aren't wired to deal with information
and social connection at this scale and speed. It's too easy for just about
anyone to find and confirm anything they want to believe and so many others
that believe it. On the other side of that coin it's too easy for anyone to
reach an audience with whatever they want. Focusing on the algorithms and
business models is a bit of hubris and a distraction from the harder problem
of empowering mobs. I know people caught up in the conspiracy theories. They
aren't finding them through Facebook. Some of it is through YouTube rabbit
holes but a lot of it is just websites found through Google (which is really
just popularity ranking) or text messages shared between friends & family
directly. People have these biases and want to believe these things, and the
access to these easy tools of connection and communication is amplifying us.
Maybe regulation, moderation, filtering and defensive algorithms can help?
Maybe we will learn to deal with it better and society will change to
accommodate it but it's becoming a rough transition at the least.

~~~
ses1984
I think you're underestimating the power of algorithmic manipulation.

Take a simple ecommerce AB test as a starting point. Should our shopping cart
look like A or B? Apply Bayes' theorem and it doesn't take that long to gather
evidence that A increases conversions by 1%.

Now replace the simple AB test with a multi armed bandit.

Now run the multi armed bandit continuously across all social media platforms
and keep tweaking to maximize sentiment in your favor.

That's even without applying data science to the social network graph.

The "fundamental problem" that you lay out is definitely related. The
"fundamental problem" is that we are so susceptible to such algorithmic
manipulation.
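The step from a one-off AB test to a bandit can be made concrete. Below is a minimal sketch of Thompson sampling over two hypothetical shopping-cart variants; the variant names, conversion rates, and traffic volume are made-up illustration, not anyone's real system:

```python
import random

class ThompsonBandit:
    """Each arm keeps a Beta(alpha, beta) posterior over its conversion
    rate. To pick a variant, sample a rate from each posterior and show
    whichever arm drew higher (Thompson sampling)."""

    def __init__(self, arms):
        self.stats = {arm: [1, 1] for arm in arms}  # [alpha, beta], uniform prior

    def choose(self):
        draws = {arm: random.betavariate(a, b)
                 for arm, (a, b) in self.stats.items()}
        return max(draws, key=draws.get)

    def update(self, arm, converted):
        if converted:
            self.stats[arm][0] += 1  # success bumps alpha
        else:
            self.stats[arm][1] += 1  # failure bumps beta

random.seed(42)
bandit = ThompsonBandit(["A", "B"])
true_rates = {"A": 0.11, "B": 0.10}  # "A increases conversions by 1%"

for _ in range(5000):
    arm = bandit.choose()
    bandit.update(arm, random.random() < true_rates[arm])

# alpha + beta - 2 = number of times each variant was shown; traffic
# drifts toward whichever arm the posterior currently favors.
shown = {arm: a + b - 2 for arm, (a, b) in bandit.stats.items()}
print(shown)
```

Unlike a fixed AB test, the allocation itself shifts as evidence accumulates, which is what makes running this continuously across a platform so potent.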

~~~
timeuser
I understand what you are saying and agree that algorithmic manipulation is a
problem. I think it's very profitable for these companies and people can be
manipulated by it. I still think it's a small aspect of the problems of the
network & scale of information access/publishing which are leading to
conspiracy content and mob amplification. Problems which exist even without
algorithmic manipulation being involved. At this point, just due to the
network of EVERYONE using mobile devices to connect & consume content it's
similar to the eternal September problem but at full scale and saturation. Is
the algorithmic manipulation needed for Fox News & other biased media to
reach, feed & manipulate its audience? I think you could delete Facebook from
everyone's phones and we'd still have a polarized world with these bubbles,
conspiracy and mobs continuing as long as everyone still has their
smartphones. I also think pagerank and similar popularity based ranking is
just as big of a problem as the algorithmic timelines because it directly
amplifies what we say and want to hear even without the personalization of
search aspect. I'm not defending Facebook and algorithmic manipulation, I just
think there is too much focus on that aspect of the problem and don't think we
can fix things with that focus.

~~~
intended
Yup, after watching this I figured that we are eventually going to see a law
which prevents behavior management of humans through code; all recommendation
and tailoring algorithms will be put into a central, transparent repository
with public access and study of their impact.

------
dsanchez97
Like a lot of other people that replied, I enjoyed the documentary because it
provided the information in a more accessible way. I think the documentary
does a good job explaining the current harms of these companies, but I think
the problem can be viewed more clearly through the lens of externalities
([https://en.wikipedia.org/wiki/Externality](https://en.wikipedia.org/wiki/Externality)).
As discussed in the film, these companies' curated feeds are shaping people's
realities and creating a more divided society. Like with most externalities,
the way to account for this cost is through regulation. I think that a tax
that increases exponentially with each ad served (over a certain period of
time and then it resets) would be an interesting idea that could adjust the
business model to not optimize for addictive behavior.
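A sketch of how such a tax schedule might look; the base rate and growth factor here are hypothetical numbers chosen purely to illustrate the shape of the idea:

```python
# Hypothetical sketch of the proposal above: each additional ad served
# to a user within a window is taxed more than the last, and the
# counter resets when the window ends.

def marginal_tax(nth_ad, base=0.001, growth=1.5):
    """Tax on serving the nth ad (1-indexed) within the current window."""
    return base * growth ** (nth_ad - 1)

def window_tax(ads_served, base=0.001, growth=1.5):
    """Total tax owed for serving `ads_served` ads in one window."""
    return sum(marginal_tax(n, base, growth) for n in range(1, ads_served + 1))

# Because the marginal tax grows exponentially, past some count the
# nth ad costs more to serve than it earns, capping the incentive to
# maximize time-on-site within any one window.
```

The design choice doing the work is the exponential growth plus the reset: a platform can still serve a modest number of ads cheaply, but piling them onto one captive user becomes unprofitable.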

------
Everhusk
The movie does a great job at explaining the real problem behind our current
version of online social networks. We need to start developing much more
advanced systems that have the communication tools that Facebook and Instagram
have, but also innovate on the financial markets/governance systems that the
most advanced nation states have.

The only way out of this problem is to accept that governments and
regulators are not going to be able to fix this, and to create a new system
that is owned by the people. And as you see from the movie, the chance of us
hitting a singularity with an AI coupled to an optimization function that
domesticates humans for profit is high. It's probably one of the greatest
challenges of our generation, and the hardest part is that over time the
people you need to change the system are so addicted to the crack that it
gives them that they won't want to.

I do think movies like this are extremely important for getting the message
out though, and I hope they effect some change. A few engineers and I are
working on a solution at social.network; let me know if you are interested in
helping out.

------
aaron695
It was well done. It certainly understood how social media works far better
than anything I've seen on HN, which is interesting when you are aiming at
non-technical users.

The back end minions were well done as an explanation.

It did well to be bipartisan. "Extreme centre" was clever. Their examples were
sometimes political, but it's hard to know how to have bipartisan real-world
examples; they seemed to try.

I think it'll have a far reach and it will affect people, but there's no
solution: what would you protest, and where?

"Gasland", "Waco: The Rules of Engagement", "Blackfish", "What the Bleep Do We
Know!?" all seemed to be actionable for instance. It's easy to buy guns, or go
to a protest or ditch your meds.

For "The Social Dilemma" it's hard to change everyday life and it's hard to
legislate algorithms where no one knows how they work.

------
intended
I think the movie will be pretty major, since it summarizes _many_ points that
people following this stuff "know", but in one place. I essentially live and
breathe this stuff at this point, and yet it is ridiculously hard to talk
about.

You can see it in the start of the show - when people are asked what the
problem is. Everyone's brains just stall with the scope and challenge of
putting it into words.

It is great at explaining concepts and problems to people, which are very hard
to discuss just using conversation.

I was discussing polarization increasing since the year 2000, and its
implications, with someone. It took me almost an hour of discussion to get an
acknowledgement of the possibility.

One look at the Social Dilemma's animation on polarization trends over time?
Never have to make that argument again.

------
_mkef
Good movie overall.

I also suggest _Ten Arguments for Deleting Your Social Media Accounts Right
Now_, by Jaron Lanier.

Jaron is in the documentary and makes additional points. "Social media is
making you a jerk" is a great one.

While he doesn't directly critique online dating, he does mention catfishing
and Ashley Madison being mostly bots. If you Google 'FTC sues Match' you'll
find bots are very common across the industry.

Nowadays you can't detach online dating from social media: you need to get
your matches to add you on Snapchat, so you never actually meet anyone.
Just off pure stats, people are more alone than ever before. By design it
doesn't work for most people. And what company would want to lose subscribers?

I found myself an anxiety-filled 'lab rat' when using that junk. I deleted all
my social media a year or so ago, and I've had no issue meeting great folks
since.

Generally I go out to do things I want to do. Everyone that's joined me along
the way, including a fellow .NET programmer, has been a bonus.

All that said, I see it getting much worse before it gets better. Social
media is eventually going to be regulated, but it's already destroyed a
generation. Rates of suicide have exploded since social media became
mainstream.

------
aimindcontrol
I'm actually worried about the similar issues with Google search results at
the moment.

2 examples.

1) The new suggested answers on search results. As an example, search for "is
salt bad for you". The top result might be correct, though I've seen other
searches where I believe it isn't. Under that is this new "People Also Ask"
section listing all kinds of questions, each with a single answer. AFAIK the
answers are just whatever Google's algorithms decided are the most popular.
Given there is one answer for each question (and the one answer to the
original search at the top), it all comes across as authoritative. I think if
you try various search terms you'll quickly find lots of examples where you
vehemently disagree with the answers Google is presenting.

I don't know exactly why the previous style of results felt better, but the
old list of links below the "People Also Ask" section came across as "here
are semi-random answers from unknown sources, so be aware", whereas, to me at
least, the new top answers above it come across as authoritative, and that
scares the crap out of me.

2) Basically the same issue, but if you search for "<some phrase> in <some
language>", Google will give you one answer, often labeled "Community
Verified", yet it is often flat wrong. Often the phrase is ambiguous, so
there can be no "one true answer" without more context, and yet Google
presents the result as "this is the one true answer because it's community
verified". This basically proves my point in #1 above. The fact that Google
shows these questions with just one answer seems like pure thought
manipulation.

~~~
messo
They include an example in the movie regarding search results (and auto-
complete suggestions) to illustrate that the answer you get is influenced by
the algorithm's knowledge of your world view. The example they use is:
"Climate change is …", where people are shown their "truth". I wonder how
long it takes before Google makes a change to the results of that exact
search string, hehe.

~~~
aimindcontrol
That's a similar but different problem IMO.

I think I mostly want Google to help me find things. If I search for
"windows" I'm 99% more likely to be looking for Microsoft Windows, whereas
some carpenter or interior decorator is probably looking for glass windows.

This new issue is Google claiming to tell me the right answer about any
topic; really it's Google's AI claiming to know the answer on any topic. I
admit it might be me. The old presentation of webpage hits suggested that
Google was not taking responsibility for the answers themselves, only for
"search results". The new presentation makes it appear these are official
answers. I know somewhere in the fine print Google will claim no
responsibility, but the presentation suggests otherwise.

------
clircle
I was somewhat disappointed that the movie didn't much discuss the downsides
of regulating social media, e.g. censorship and slippery slopes. Like other
commenters say, I didn't learn anything new, but I'm glad this kind of
documentary is out in the open for non-techies to watch and consider.

------
knobcore
It will have the same effect the DMCA did on piracy after Napster. That's what
will happen. Even if the big companies come clean and start policing, that
will just move the extremists who have already built their army into
decentralized encrypted chat platforms, and they will still recruit over these
tools too.

You've gotta basically build the entire internet from scratch, with some kind
of government and some kind of identification system, so that if someone
comes to take down social discourse they are removed from not just one
platform but all of them, and they can't come back without some kind of
appeal. In addition, all political discourse should be required to adhere to
scientific and scholarly standards. I have no problem with people who aren't
from institutions challenging the status quo; sometimes that has benefits.
But if they aren't armed with a pile of evidence to prove why the status quo
is wrong, they should not be amplified, and that regulation should be
central.

Until you have both of these things, anything else you do will be a waste of
time. It will either become like some sort of vegan alt-lifestyle or morph
into yet another set of big platforms. Both are untenable, as fixing this
requires the participation of the entire social system.

~~~
messo
The movie does not argue for policing content, but directs criticism towards
massive data gathering, analysis and micro-persuasion through algorithms.

~~~
knobcore
Well, that's what I'm arguing. I think algorithms have less power than people
think they do, even the computer scientists. Social sciences/economic
theories that study how crowds work all say that it's driven by information.
You don't need an algorithm to do this; you just need someone in your group
to share that information with you, and that is what is actually happening
when you look at it and get out of the data and charts.

People are leaving these algorithm-run places in droves, especially now,
because they already believe that the government is working with these
companies to track their movements and radical plans. They're already going
to decentralized networks, and once that happens you will not be able to
control the radicals, just like you couldn't control the pirates after they
switched to decentralized piracy networks. This isn't trying to stop someone
downloading a couple of Metallica songs. This is trying to stop the
proliferation of fascist/totalitarian thought and foreign actors stoking it
up. It's much more dire, and the world of John Perry Barlow is dead. Not to
mention the last few right-wing terrorist attacks weren't planned on
Facebook, nor were the attackers radicalized there. Neither was QAnon. That
all came from 4chan/8chan/etc. There are no algorithms there, no AI, and
anyone could basically code something on that level of complexity in a day.

Information should not be free. It should have limits. Once such information
threatens the proliferation of a free society and brings people back towards
totalitarianism, the slippery-slope "first they take away the press and then
we get Hitler" argument falls flat on its face, because large swaths of the
population freely gravitating towards Hitler gets you the exact same effect.
And besides, a society trying to regulate information for its own general
health and a fascist dictator doing the same are completely different things.

~~~
messo
> I think algorithms have less power than people think they do, even the
> computer scientists.

Algorithms are powerful because they can cause tiny, incremental and often
completely unnoticeable changes in opinions and perception. These changes add
up when they affect large enough populations. I agree that people who gather
in groups and share information make up the bulk of this equation –
algorithms do not operate in a vacuum.

I do think that the negative effects caused by algorithms are largely
unintended consequences. Profit is the motive, not malice.

> People are leaving these algorithm ran places in droves especially now
> because they already believe that the government is working with these
> companies to track their movements and radical plans.

A tiny fraction of technically literate people escape the big platforms; the
rest stay on even though they are somewhat conscious of the negative effects
and ruthless business practices.

> That all came from 4chan/8chan/etc. There are no algorithms there, no AI,
> and anyone could basically code something on that level of complexity in a
> day.

Many people joining extreme communities have been nudged by algorithms,
especially Youtube's. What happens from there is usually just plain old
groupthink and tribalism, feeling the comfort of finding a community to
belong to. The point is that those nudges from suggested videos gently push
you down a rabbit hole you otherwise would not be exploring in such a rapid
and captivating manner.

Like they say in the film: algorithms are not evil on their own; they just
tend to enable and amplify some of the worst tendencies in people who know
how to exploit this tool for their own gain, political or otherwise.

> Information should not be free. It should have limits.

I disagree. What I do think is that the many "information outlets" should be
held responsible for their editorializing. Newspapers, TV and social media
platforms all editorialize; only social media has left this task to the
algorithms – which is rather careless and naive. These companies have indeed
moved fast and are now breaking things.

~~~
knobcore
> Algorithms are powerful because they can cause tiny, incremental and often
> completely unnoticeable changes of opinions and perception.

Stafford Beer predicted in 1972 that increasing variety without regulatory
variety to combat said variety would send society towards catastrophic
collapse because there are so many possible states in the social system it
becomes as complex as things like weather and wave formation.

[https://archive.org/details/DesigningFreedom](https://archive.org/details/DesigningFreedom)

But we got the American Skinner Box model, mixed with heavy doses of
hipsterism instead.

> A tiny fraction of the technically literate people escape the big platforms

I suggest you look into it a bit more and read about the associations between
people like Nick Land, Milo Yiannopoulos, Steve Bannon, etc. While it's true
some of it was done by social media, conservative media has always been this
close-knit juggernaut hype machine, ever since Rush Limbaugh appeared in the
depths of AM radio hell. To them, it's merely a faster way of organizing the
way they have for years, because there isn't any cost. They no longer have to
print 5000 copies of something. It's low-hanging fruit. Most of the alt-right
are ex-Ron Paulers who were already into things like Bitcoin through the
libertarian party since like 2012. They never "escaped the big platforms";
those were always recruitment points for normies, which doesn't even require
them to do anything but promote mainstream conservatism and then say "go here
for more stuff" that can't be tracked by Facebook's AI. Maybe your grandma
hasn't "escaped", but your grandma probably isn't on the streets shooting
black people or lefties. Bitcoin was also transacted between these
Russian/American groups according to the Mueller report.

> Many people joining extreme communities have been nudged by algorithms,
> especially Youtube's. What happens from there is usually just plain old
> group think and tribalism, feeling the comfort of finding a community to
> belong to.

The algorithms merely Bayesian-filter a giant database of what people like
and feed it back to them for increased engagement. These people would have
chosen such content on their own. And the fact that content exists in the
first place that plays to their _real_ views (not the views they show in
public) – views that have existed for hundreds if not thousands of years and
are passed down through blood lines (if your mom is a dem or rep or kkk,
you're gonna be a dem or rep or kkk 80% of the time) – legitimizes the worst
of human behavior, which used to be socially unacceptable.
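
For illustration, the "feed people back what they like" loop being described
can be sketched in a few lines. This is a toy model with made-up tags and
posts, nothing like a production recommender, which would use far more
signals:

```python
from collections import Counter

def recommend(history, catalog, k=2):
    """Rank candidate posts by how often their tags already appear
    in the user's engagement history, i.e. serve people more of
    whatever they have clicked on before."""
    tag_counts = Counter(tag for post in history for tag in post["tags"])
    ranked = sorted(
        catalog,
        key=lambda post: sum(tag_counts[t] for t in post["tags"]),
        reverse=True,
    )
    return ranked[:k]

# A user who has only ever clicked outrage/politics content...
history = [{"tags": ["politics", "outrage"]}, {"tags": ["outrage"]}]
catalog = [
    {"id": 1, "tags": ["cooking"]},
    {"id": 2, "tags": ["outrage", "politics"]},
    {"id": 3, "tags": ["outrage"]},
]
# ...gets served more of the same; the cooking post never surfaces.
print([p["id"] for p in recommend(history, catalog)])  # → [2, 3]
```

Nothing in the loop is malicious; it just maximizes the chance of another
click, which is exactly the engagement-feedback point being argued here.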

But algorithms don't do that; communication without moderation does. You
aren't going to get a Trump supporter to start watching CNN or MSNBC, and if
the satellite stopped carrying it they would cancel their subscription and go
somewhere else. So if the internet offers media that matches their views,
they will seek it out, and if the algorithms amplify this by sharing what
other people in their social circles say, even better according to them.
Saying that these people were innocent and normal before Facebook came along
and made them radicals is just flat out wrong. They can just say the n-word
to each other with millions of people instead of having to hide it among
close friends like they're smoking weed or something.

This polarization was happening long before Facebook and the Internet even
existed, especially during the Clinton administration and the things that got
Tim McVeigh to bomb Oklahoma City over Ruby Ridge, Branch Davidians, and the
Assault Weapons ban. That's who these people are, always will be, and giving
them a communication platform without strong information control is asking for
trouble.

You aren't going to get away from business interests, religious interests, or
racial interests; it's basically impossible. You just need to create a sense
of civil society on the web by ejecting bad actors from the public square. I
know if I walked into a gay bar and started yelling homophobic slurs I
probably wouldn't make it out of there alive, and if I did, I'd be kicked out
for life with a nice bouncer at the door to greet me if I tried again. The
Internet has no such protections.

~~~
messo
People and groups with strongly held beliefs are not the most affected by
algorithms, as you are pointing out. Worst case, their echo chamber becomes
even more impenetrable or they become even more extreme.

Algorithms do the most damage to people not currently holding any strong
beliefs about a given topic, but who do have some vague leanings one way or
another. These people can be swung, and their views amplified and
radicalized, without much effort or financial input, as the film The Great
Hack makes a good case for.

Google, Youtube, Facebook and Twitter (who all offer fine-grained ad
targeting) can be weaponized to push "normal" people towards the extreme, pit
groups against each other and affect democratic processes in a big way. This
was Cambridge Analytica's business model, and it worked really well.

~~~
knobcore
I think a lot of people confuse the appearance of a lack of strong beliefs
with simple social filtering. There is no social filtering on the web because
you don't have to worry about your reputation.

The thing people try hardest to avoid is social isolation and being an
outcast. People will lie all the time just to get along with the crowd. You
wouldn't know someone held strong beliefs unless you found out what they had
in their library. It's just that the filters are removed and laid bare for
the whole society to see, and it's ugly, and it always has been.

"In 1981, former Republican Party strategist Lee Atwater, when giving an
anonymous interview discussing Nixon's Southern Strategy, said:[28][29][30]

You start out in 1954 by saying, "Nigger, nigger, nigger." By 1968, you can't
say "nigger" – that hurts you. Backfires. So you say stuff like forced busing,
states' rights and all that stuff. You're getting so abstract now, you're
talking about cutting taxes. And all these things you're talking about are
totally economic things and a byproduct of them is [that] blacks get hurt
worse than whites. And subconsciously maybe that is part of it. I'm not saying
that. But I'm saying that if it is getting that abstract, and that coded, that
we are doing away with the racial problem one way or the other. You follow me
– because obviously sitting around saying, "We want to cut this," is much more
abstract than even the busing thing, and a hell of a lot more abstract than
"Nigger, nigger."[31]"

[https://en.wikipedia.org/wiki/Dog_whistle_(politics)](https://en.wikipedia.org/wiki/Dog_whistle_\(politics\))

~~~
messo
This is a great point. At the same time I observe a night and day difference
in how some people close to me behaved before and after they had access to
Facebook and Youtube. Pre-social media I could have a long and rather nuanced
discussion on controversial topics, and we both left the conversation with
slightly altered opinions and some new perspectives. Today, all nuance is gone
and the conversation is scattered all over the place, spiced up with whatever
conspiracy theory they "discovered" on Youtube lately.

Maybe one of the effects of algorithms has been to push people to extremes to
such a degree that social filtering is discarded? People I know seem to be
almost apologetic in their approach, as if their lives depended on convincing
people of their views, with no care for the social consequences of constantly
"preaching". This form of polarization is not healthy – basic respect for
other people is lessened, and listening to counter-arguments is considered a
weakness.

This is just an anecdote from my life. From what I read and hear from others,
this experience is not unique.

------
i8code
Doubtful it will have much impact. I don't buy the central premise of the
film: that social media is so addictive, and people so easily manipulated by
it, through the power of 'genius' algorithms working against vast stores of
data. There is no mention of free will. Are we really so easily controlled?
If this technology were so powerful and able to manipulate people, how are we
not using it to end racism, fight climate change, teach basic math skills,
fight obesity, etc.? The reason is that the technology isn't that good, and
at its best it suggests things that appeal to our base nature. It doesn't
control us or even come close. Also, the fucking hubris of these people to
think they really had that big of an impact is a bit much. Free will and
human nature are real things, and social media isn't that powerful. Also,
with free will comes personal responsibility; this film seems to suggest that
we are such sad victims and the problem isn't us but the powers that be. "I'm
so weak that I can't help myself, so it must be social media's fault." Give
me a break.

------
cik
In every conversation like this I always bring up Cathy O'Neil's excellent
Weapons of Math Destruction ([https://www.bookdepository.com/Weapons-of-Math-
Destruction-C...](https://www.bookdepository.com/Weapons-of-Math-Destruction-
Cathy-Oneil/9780141985411)). I think it's an ethical must read for anyone in
tech.

~~~
approxim8ion
I also enjoyed Robert Elliott Smith's Rage Inside the Machine
([https://www.rageinsidethemachine.com/](https://www.rageinsidethemachine.com/))
which is in a similar vein.

------
nikivi
I personally never understood the value of ads. I subconsciously turn off my
brain and press skip whenever I encounter any kind of ad, because I know it
was made to influence me into taking an action (and I don't want to waste any
mental energy evaluating whether I am being tricked or not). And most ads are
bad:

[https://twitter.com/JonErlichman/status/1304793136494006272](https://twitter.com/JonErlichman/status/1304793136494006272)

So to me the whole thing simply doesn't make sense. Who actually buys
something from an ad?

The only signal I value for trying out (buying) new things comes from real
people.

The privacy implications of storing so much personal data are the other
question; that data can be used for bad things beyond just serving ads. It
would be nice to have a law mandating differential privacy for ad-based
social networks.

\--

Perhaps I am losing out by ignoring all ads wholesale like that. I even
sometimes view products that do 'a lot' of advertising in a bad light and try
to avoid them altogether.

~~~
robjan
Many people buy things after seeing ads. I was running an e-commerce business
where the majority of customers came either from direct ads or indirect
advertising like influencer marketing. It's a great way to gain new customers
for a product business (although in my view the "lifestyle brand" space is
pretty saturated now).

~~~
hungryfoolish
Also, there is nothing wrong with advertising as a concept. Advertising has
existed as long as products have existed. The problem is tracking and data
collection/exploitation around some very granular aspects of users.

Also, even if you take away advertising, the problem of addictive apps and
'persuasive' design will still remain. Tackling that is a whole different
discussion altogether.

~~~
nikivi
Guess I am missing out ignoring and blocking all ads I see. I do think
there're better ways to conduct business than basing entire business model on
ads.

------
mooreslaw
I was surprised that given where the movie went, there seemed to be little
discussion of the idea of banning personalized advertising. While it would be
difficult to stomach for the companies who've thrived on that business model,
the social consequences and business incentives of personalized advertising
are corroding our society.

~~~
082349872349872
Read Bernays (or the Creel Report if you can find it) for an account of how
personalised advertising and manual social graph exploitation worked about a
century ago.

Of course, back then, they targeted with much less data. For comparisons from
half a century ago to now, consider [https://opendatacity.github.io/stasi-vs-
nsa/english.html](https://opendatacity.github.io/stasi-vs-nsa/english.html) .

~~~
burnaway
Never heard about the Creel Report before, but here it is:
[https://archive.org/details/completereportof00unit/page/8/mo...](https://archive.org/details/completereportof00unit/page/8/mode/2up)
Thanks for the tip.

------
eivarv
Just about nothing new wrt content.

I remember I kept thinking how naive, unimaginative and ignorant of history
these people must be if they genuinely never thought critically about what
they were doing – if no one ever saw any potential for abuse.

That, and the fact that they kept pushing unfounded conclusions: e.g. that
the correlation between diagnosed mental issues and the appearance of social
media allegedly points to a causative relationship; or that this thing is so
"new" and "different" from every other tool that has played on weaknesses
inherent in the human mind that we just can't deal with it; etc.

I tend to think proper education – including critical thinking skills (logic
and cognitive biases) and critique of media – as well as defining privacy
more clearly as a human right (including explicitly stating that data
subjects own data about themselves), and enforcing this hard, will solve many
problems.

------
ianopolous
There's a great article with more technical exposition by Jeff Seibert, one of
the interviewees: [https://medium.com/@jeff_seibert/the-mechanics-and-
psycholog...](https://medium.com/@jeff_seibert/the-mechanics-and-psychology-
behind-the-social-dilemma-719d618aa8ce)

------
randyzwitch
I found it to be persuasive, but the after-school-special vibe of the scenes
between interviews was corny.

------
codekansas
I suspect there's a growing market for summer camps and other forms of tech
hiatuses. I would gladly subscribe to a service that forced me not to use
social media or whatever, since I don't think my own brain can be trusted with
that

------
partiallypro
I think the presentation was kind of silly, to be honest. I liked the talks
with experts, etc., but when they had dramatizations it felt so forced and
silly.

~~~
messo
It is a bit silly, but at the same time an effective way to illustrate some
very technical concepts to non-techies. Judging from the response I got from
my partner, it did the trick.

------
anonu
> What will the cultural impact of this movie be?

A few months back, a bunch of big Facebook advertisers (Nike, etc.) pulled
back some of their ad spend to punish Facebook over its stance on certain
cultural issues. What happened to Facebook? Nothing... revenues actually went
up...

Facebook is an incredible tool and platform. No platform (none amongst the
FANGs) has the same capability of tailoring your internet experience the way
Facebook does. As an advertiser, this is incredibly powerful: being able to
put a targeted, tailored ad in front of exactly the demographic you want. No
other platform even comes close.

The cultural impact will be zero. The dollars speak for themselves. FB users
keep using the platform and advertisers will keep chasing them on it.

The only way to put this in check is for government to step in and put in a
comprehensive set of laws and rules that governs how your data is shared. A
market-driven response to Facebook's insidiousness will not happen by itself.
Government needs to nudge it forward with a thoughtful set of policies that
promotes competition and makes these big companies liable for the false
information they promote.

~~~
Nextgrid
The Facebook boycott was complete hypocrisy. It was framed as being in support
of BLM & similar movements despite:

1) The companies involved in the boycott were often the ones with a terrible
track record regarding their own treatment of minorities, so it was absolute
bullshit virtue-signaling.

2) We were in the middle of a pandemic where the economy essentially shut
down, so it made total sense to pause advertising anyway. This was purely a
financial decision, framed as some kind of social-impact action by some
terrible and disgusting companies.

Of course, as soon as the market started recovering and they saw a need to
advertise again they were back in full force as if nothing happened.

------
monster2control
I think everyone should watch it. I think there could be a lot of good done
on a platform like Facebook if its goal wasn't to get you hooked, and if it
had real moderation of content.

The proliferation of fake and misleading information has clearly caused a lot
of harm to the United States and I'm sure other countries as well.

------
marianov
I think it does a great job of explaining to non-techies things I have
explained for years without luck. The biggest points are about kids and
explaining echo chambers. I live in a very polarized country (Argentina)
where everybody just follows the media and the people on either of two sides.

------
el_dev_hell
Like most people here, I was already aware of the technical side (and I'm
guilty of implementing several of the features such as infinite scroll).

The majority of users aren't giving up their addiction. I think that's a
given. But I do hope parents are more aware of the effects on children.

------
mensetmanusman
It should be called something else besides 'The Social Dilemma', because
Netflix also does this to make sure people watch nonsense as much as
possible...

every company hiring the best algorithm developers wants people to spend as
much time as possible on their 'thing'

not sure what to call this though...

------
xutopia
In the first 10 minutes of the movie they talk about what happened after what
seemed like a revolution at Google. It was essentially nothing. People will
not care, given how convenient staying the same is. It's sad, really.

------
capkutay
I think the main takeaway was the influence on children. I don't think there's
precedent for providing kids with constant access to dopamine-driven, socially
adjacent experiences. At least with TV and video games, they're finite
activities with time limits (hopefully).

Yet giving a pre-teen a phone where they're constantly being spammed with
addictive content is dangerous. I can't even imagine all the psychological and
developmental implications.

In terms of adults being addicted to apps and their data being used for ad-
targeting...eh. User beware. I think we have bigger issues facing us.

~~~
josalhor
I do agree the main problem is the influence on children.

As for the adults, I do believe the biggest takeaway is not advertising. It is
the effect on information propagation. I have seen people close to me share
clearly fake information that wouldn't have made the cut on most traditional
media outlets.

People are giving way too much credibility to information that cannot be
traced with a proper source. In the past people relied on the media to do the
research, as biased as they may be. Now this research phase has disappeared.

------
HumblyTossed
I watched a little this morning with coffee. I find myself wanting to skip
the scripted parts and just watch the interviews. I'll try again when I have
more time.

------
glitchdout
I wish they had presented more solutions. The way I see it, infinite scroll
feeds generated by algorithms should be outlawed. Same with Like counts. (It
seems Instagram has begun testing this
[https://techcrunch.com/2019/11/14/instagram-private-like-
cou...](https://techcrunch.com/2019/11/14/instagram-private-like-counts/))

Facebook wasn't that bad when you had to visit each person's wall or when the
feed was in chronological order.

~~~
schiavi
I am thinking a lot about this problem space and trying to integrate some of
my ideas into my project
([https://www.confidist.com](https://www.confidist.com)). I am also interested
in solutions and alternatives.

Centralization of power

I think this is the difference between social media platforms being a good
way to share information with friends and family, and major social media
platforms dictating culture through self-interest. Taxing data collection and
storage is a good idea presented by the film. I believe social media
companies should stay smaller in size and scope. These companies should make
guarantees about data usage, hold each other accountable through regulation
and industry organizations, and adopt direct-to-user business models.

The attention economy

Again, I think some regulation is needed here. In addition to looking closely
at infinite scroll feeds and publicly displayed status information, we need
to look very closely at notification management and data ownership. Taking
email as an example, we must have an accessible way to unsubscribe from
everything. In parallel, all platforms should be required to have an
accessible and straightforward "unsubscribe from all notifications" toggle.
The obvious attention loophole of just creating new notification categories
to continuously alert users needs to be closed.

And lastly, data ownership: all social platforms that collect data not only
need to be accountable for the money trail your data takes, but should also
allow users to export their data in a usable format. That means I should be
able to export all my friends and connections from Facebook and delete my
account. This would prevent the psychological gotcha that plays on the fear
of losing what you have built on a given platform. It is also
anti-competitive towards other social platforms if data is not easily
exportable. All of these companies know this and make it very difficult.

Intermittent Variable Rewards

You hit on this with the infinite scroll, but we can likely do more. We need
to help people be patient again. Guess what? Notifications can help manage
our time and attention instead of hurting them, if used narrowly and as a
service to the user. For example, if I make this post on Hacker News and I
have notifications disabled, I'll probably keep checking back to see if any
updates occur. That involves the intermittent variable reward. However, if I
know very clearly that I'll receive a notification when an update occurs, I
can respond exactly when I need to. Companies like Apple and Google, who
manage device notifications, should be responsible for their intermittent
variability as well. We should require the ability to have "notification
digests" instead of immediate notifications. We know notifications can cause
traffic accidents, but they also waste a lot of time and are rarely all that
important.
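
The digest idea above can be as simple as batching raw events into fixed time
windows instead of pushing each one immediately. A rough sketch – the
windowing scheme and event format are entirely made up for illustration:

```python
def digest(events, window_secs=3600):
    """Group (timestamp, message) events into fixed time windows, so the
    user gets one periodic summary instead of a drip of alerts arriving
    at intermittent, variable intervals."""
    buckets = {}
    for ts, msg in sorted(events):
        buckets.setdefault(ts // window_secs, []).append(msg)
    # One (window_start, messages) pair per non-empty window.
    return [(w * window_secs, msgs) for w, msgs in sorted(buckets.items())]

events = [(10, "reply to your post"), (200, "new like"), (4000, "another reply")]
for window_start, msgs in digest(events):
    print(window_start, msgs)  # two digests instead of three interruptions
```

The point of the batching is exactly the one argued above: the user knows a
summary arrives on a fixed schedule, so there is no variable reward to keep
checking for.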

Echo Chambers

You spoke of the like counter. We all know status is a huge part of social
media, and it is especially important to young people. Viewpoint diversity
and acceptance play off of status as well. So how do we help remove the
perception of our social media status? In the film, they spoke of setting a
minimum age before social media is allowed under the law. I think that is a
good first step before we fully understand how to make improvements through
the user experience. I am requiring 18 and over.

What I am doing with my platform is to privatize as much interaction as I can,
balanced with moderation and a rating system that is not displayed back.
Sometimes good old-fashioned one-on-one communication is the way to go. If
that can be used more, in a distributed way, to substitute for one-to-many
interactions, I think that goes a long way toward relieving the social
pressure of the group.

I think special attention also needs to be paid to each type of media:
pictures, videos, text, virtual reality, augmented reality, and more. I
isolate interactions to text only, similar to here on HN. This removes social
pressures and "shiny objects" from easily manipulating our attention. It
prevents every contact with another person from being about appearance and
prejudices. But it also allows users to fill in the unknowns with blanket
assumptions: if you are just a username, and you said something I don't agree
with, maybe you are everything I dislike in the world wrapped into one
anonymous internet persona? To try to bring a little more empathy into the
picture, I am trying out showing various attributes users have in common when
they interact, as part of an introduction. In a virtual reality social
situation we could show emotions from users in a more effective way without
revealing their identity, which is pretty neat. Pictures, especially of users,
can enable targeted objectification and the hyped, unrealistic comparison with
our peers that hits younger women hardest. We should have regulations around
pictures and videos that require a visible indication that a filter was used.
I think allowing young people to share pictures and videos of themselves will
become less of a problem the more decentralized social media becomes. We need
to change the interaction from "wow, did you see the <media> of x on y?" to
"Hey, I posted x on y, the niche platform I use, check it out." Sharing
pictures and videos needs to be more of a niche hobby, and less the thing
people need to do to be relevant.

Echo chambers, I think, also directly relate to growth and growth hacks.
People interact more, and enjoy interactions more, when they are validated.
Eight friends within eleven days: that gets users focused on the status number
of friends. Or the number of connections, which exploded LinkedIn. Guess who
people mostly know, and who can help grow the platform? People who are very
similar to each other and who generally like each other. This is great for a
niche platform where we stay connected with close friends and family, but it
is a disaster for a platform that aims to connect everyone and is also a way
to share news, opinions, etc. So again, the scale and scope matter. What about
platforms that aim to form communities or groups based on a common interest or
identity? People love forming their echo chambers, and I don't think there is
anything we can do about that human desire. What we can do as system engineers
is use our algorithms to promote diversity of thought and perspective within
and among groups.

On my platform Confidist, I am working on a design for a connection system
called "Orbits". Instead of encouraging connections with just about anyone you
encounter, I want to leverage the data that users have volunteered to promote,
similar to a dating platform, the diversity and health of your Orbit. This
might equate to having multiple strong foundational connections, such as being
an introvert from a rural community, while prioritizing attributes you have
not yet been exposed to, such as someone who is socially liberal.
Additionally, I created a spotlight event system that is tied to experience-
based rewards. As the system designer, I can cross-promote these events from
one community to another. The site can also encourage a spirit or culture of
openness.
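The actual Orbits scoring isn't something I can publish yet, but as a toy
sketch of the idea (the function name, attribute-set model, and weights here
are purely illustrative, not the real system), a diversity-weighted suggestion
score might look like:

```python
def orbit_diversity_score(orbit_attrs: list[set], candidate: set) -> float:
    """Score a candidate connection higher the more attributes they add
    that are not yet represented in the user's existing Orbit.
    Each person is modeled as a simple set of attribute tags (toy model)."""
    # All attributes already represented somewhere in the Orbit.
    seen = set().union(*orbit_attrs) if orbit_attrs else set()
    new = candidate - seen     # attributes the Orbit lacks (novelty)
    shared = candidate & seen  # common ground, still worth something for empathy
    # Weight novelty above overlap so suggestions pull the Orbit
    # toward viewpoint diversity rather than more of the same.
    return 2.0 * len(new) + 1.0 * len(shared)
```

The design choice is simply to invert the usual growth-hack incentive: instead
of ranking "people like you" highest, the system gives extra weight to
attributes your existing connections don't cover.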

I did not touch on solutions and alternatives to the advertisement model;
however, I think plenty of people are experimenting, and I am interested in
hearing what others think are the most successful alternatives. To me, the key
is staying relatively small in size and scope, prioritizing the user directly
with your business model, and being very conscious about how you grow and
regulate your system. Don't make something you wouldn't want your kid to use.

~~~
burnaway
It reads like you have put a lot of time and energy into thinking the issues
and possible solutions through. Here is a tip: as a first-time potential user,
the site gives off an awfully complex vibe, and it's hard for me to get the
layout, what I should do next, etc. Compare that to the Facebook/Instagram
sign-up / onboarding / first-steps experience. I suggest enlisting a UX
researcher/designer who has done it before to tackle this problem.

~~~
schiavi
Agree, thanks for taking the time to check it out and give some critical
feedback. Much appreciated. Any UX'r who wants to help please email me
info@confidist.com.

------
fishywang
I see that Netflix's info page calls it a "documentary drama hybrid". That's a
weird but accurate description, because as a documentary it's too biased and
makes me feel that it tries too hard to push its agenda on me. I agree with
most of its points (I deleted my Facebook account several years ago), but the
way this "docu-drama" presents its points makes me unable to take it
seriously, because there's just too much drama in it.

------
oqtvs
When I was watching it, I was thinking of ways to fix these problems from the
user's perspective, and it seems to me that the user should use the same
technology to block these effects and help them avoid falling into these mind
traps. Their use of this tech wouldn't be bound to others' interests, like
selling ads, so maybe the system's optimization could better serve them. But I
don't know how a system like this could be implemented.

------
snegu
It was mentioned on my Nextdoor feed, so it has clearly reached even the tech
novices :)

I haven't watched it myself yet, but I'm excited that it's getting such reach.

------
x87678r
Honestly, I don't think most people care about privacy or advertising. For me,
the biggest argument against social media (including HN) is the sheer amount
of time it sucks up for very little long-term benefit. I complain that I don't
have time to visit friends or exercise, but somehow I manage to spend way too
much time on my phone, and that is even without Facebook or Twitter.

------
ricksharp
"We are more profitable to a corporation if we are spending our time staring
at a screen, staring at an ad, than if we're spending our time living our life
in a rich way." (Justin Rosenstein ~1:25:30)

This is ironic, because what is best for all of us as a whole and what is best
for each individual - is that each of us are productive, helpful, and loving
to one another.

However, what is more profitable in a short term for a single person (or
entity) is what is often destructive to others (a drug dealer is a good
example of making profit from the destruction of others).

Instead, we have to care about the long-term benefit and health of each person
(our friends, family, community, and all humanity) and not just amass some
money for ourselves so we can live selfishly.

Also, to be clear, I believe capitalism is a vital part of that - when it
works with a win-win focus. (I innovate to provide a good and beneficial
service that helps you, and you help me by providing me with resources I need,
etc. - big win-win for everyone).

However, when we turn it to pure profit and divorce it from the human side of
working together for mutual good, that is when it destroys instead of
empowers. That is exactly why that quote above struck me.

If we use social media to enhance our human relationships, then it is
beneficial. But if all we do is stay up late at night watching video after
video of emotionally triggering content, then it's time to hit the delete
button.

------
elorant
I doubt this will have the impact we wish it could have. Best-case scenario,
some people will moderate the number of notifications they receive. Other than
that, I can't see anyone deleting their accounts, because social media is so
interwoven into their everyday life.

------
wombatmobile
I read Shoshana Zuboff's book The Age of Surveillance Capitalism, which was
the jolt I needed to quit Facebook. I was already aware that the FB experience
was more dark than fun; the book's reveals of the exploitation nudged me into
leaving, because I don't like being exploited.

Then I found and joined HN. It's a better experience. I think _the guidelines_
make it so. They aren't 100% followed 100% of the time, but the intent makes
HN a vastly better experience than FB.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

Be kind. Don't be snarky. Have curious conversation; don't cross-examine.
Please don't fulminate. Please don't sneer, including at the rest of the
community.

Comments should get more thoughtful and substantive, not less, as a topic gets
more divisive.

When disagreeing, please reply to the argument instead of calling names. "That
is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3."

Please respond to the strongest plausible interpretation of what someone says,
not a weaker one that's easier to criticize. Assume good faith.

Eschew flamebait. Don't introduce flamewar topics unless you have something
genuinely new to say. Avoid unrelated controversies and generic tangents.

------
iamwpj
I'm tired of how, every few months, we all have to rediscover that our
complicated obsession with social media, and the data-collection-based
advertising that results, is leading to a total loss of individuality and
privacy.

We (society) need to _know_ these things and act accordingly.

~~~
messo
I think this film can do just that, but for non-techies. We can't really blame
non-technical people for not understanding or caring about these things when
the underlying mechanisms are so technical in nature. That is why I really
liked The Social Dilemma: I can watch it with family or friends and make them
aware of the negative effects of algorithms, which are less obvious from a
regular user's perspective.

------
ArtWomb
As a documentary film, I bailed after the first few interviews. The novelty of
peeking inside internet giants was lost on me. It just felt like awkward
confessions from some of my peers with ethical second thoughts.

I also feel like one method was singled out among others: the Facebook
algorithm, which exploits its proprietary model of the links between
individual humans to boost signal. I don't use the product myself, but I have
to ask those who do use it regularly: how can any of Jaron Lanier's darkest
predictions come as any surprise?

If you are in the mood for a free doc feature that will utterly blow your
mind, I can sincerely give a high rec to The Real Story of Paris Hilton. It's
actually a real-life horror movie that I found to have a similar energy to
Satoshi Kon's Perfect Blue. But nonfiction ;)

[https://www.youtube.com/watch?v=wOg0TY1jG3w](https://www.youtube.com/watch?v=wOg0TY1jG3w)

------
gverrilla
I think it's shallow, naive, simplistic, and achieves probably nothing, except
making this particular narrative a more attractive social media loophole, and
more money for netflix. It's based entirely on false premises.

------
koolhead17
Why was Netflix not there? How are they different from other addictive
platforms?

~~~
messo
Netflix is also guilty of using algorithms to recommend content, but to be
fair, they do not gather nearly as much data on your behavior and life in
general as, e.g., Google and Facebook/Insta/WhatsApp. They also have different
incentives, as all of their users are actually paying for their product and
are not shown ads to the same extent (only product placements in movies, as
far as I know).

~~~
SketchySeaBeast
Funny enough, of all the platforms I take part in (Facebook, Twitter, Reddit),
Netflix is the one with the most notifications and emails. I imagine that's
simply because I don't find them as offensive as the others, so I haven't
neutered them yet.

------
rohanaed
There were many things in the documentary that I honestly did not know. Very
educational and presented brilliantly.

------
peteyPete
I don't know what the cultural impact of this documentary will be, since most
won't care to change their behavior one bit. People are building walls around
their echo chambers and are protective of them.

I was well aware of most of this and had been complaining lately about how I
don't see a good ending in sight, considering how much growth there seems to
be in the extreme right and extreme left. I've been finding it harder and
harder to have conversations with many people I know, or even family, because
of their extreme stances on many of today's core issues. Most if not all of
their positions, or perceived understanding of these issues, come straight out
of their Facebook echo chambers. They started with an ignorant stance and had
all their thoughts and ignorance echoed and amplified back at them, empowering
them to feel even stronger about it all. My feed is peppered with propaganda
simply because I still bother to comment to family, trying to share facts,
trying to pull them back a bit. How can they not know better? Right... the
information never reached them, because they relied on social media to get the
initial news, and once suckered in, they only sought to confirm their biases.

There's no questioning why we're in the world we're currently in. Nothing
about this is normal, especially when we claim to have access to information.

I know better, yet I still find myself scrolling my feed robotically. I had
just scrolled it a bit ago, but found myself scrolling it again. I didn't plan
on it. I just had a blip in focus while I was watching something, and my newly
programmed behavior was to pick up my phone and start scrolling.

When they say kids' mental health is severely affected, I'm not surprised one
bit. I see it. My nieces and my gf's nieces are all hooked. I only see my
nieces once or twice a year, since we live pretty far from the rest of my
family, but I was shocked to see they knew how to find content and operate my
brother's iPhone before they knew how to read. And if there was a YouTube
video they talked about, they knew how to find it again. Again, they couldn't
read or spell yet.

My girlfriend's older nieces are locked on TikTok. They determine how they
feel based on others' perceptions of their online accomplishments. All these
kids are trying to become influencers. The feuds that arise: who collaborated
with whom, who commented what on whom, who bullied them, etc. Bullying in
schools is nothing new, but at least it used to have a schedule. You got
bullied in the hallways at school, or around town, or during recess. Now
there's no turning it off. Kids get bullied around the clock, and some of it
leaves a permanent mark online for every other kid to see. That's not healthy
for kids. They don't have a safety net once they get home.

Politics are a clusterfuck right now. One can't keep up with events while
trying to find the actual facts on everything and not simply believing
everything being targeted at them. And we hope to have positive change soon?
Come on. As long as people can be bought, this is not going anywhere.

The one thing the documentary didn't touch on, and I guess it's the cause of
all this in the first place: our capitalistic systems are unsustainable.
Everything must grow, indefinitely. Every quarter, companies must meet their
growth targets or take a hit. At first, you work on improving production and
cutting waste. Then you optimize every other aspect you can. Eventually you
cut corners; eventually that's not enough anymore, so you outsource everything
and move production to the cheapest place you can. Then all these tools are
available to market your products in the most targeted way possible. Facebook
et al. don't care what you pay attention to, as long as you pay attention and
they can throw ads at you. The companies will pour as much into this as they
can get out. They'll squeeze every drop out of that lemon.

This whole system is like one big dirty coffee filter being wrung out too
hard. Eventually it'll rip and people will fall out; civil war is definitely
not that far out of sight. I think they touched accurately on the big picture
and where things are headed if things don't change. What's sadder is that we
know it, we see it, and we're unwilling to change it, because that would
drastically affect the finances of too many parties, and there's one thing
that rules above all, and that's the almighty dollar.

~~~
messo
Do you think watching The Social Dilemma with family or friends could help
spark a conversation about how they are being manipulated by "Big Tech"? I
plan on doing this with someone who has been greatly affected by the
algorithms, and although I do not expect an immediate change, I do expect it
to spark some new thoughts and questions.

Another comment here mentioned this site as a good starting point for taking
tangible action against being influenced by algorithms:
[https://www.humanetech.com/take-control](https://www.humanetech.com/take-control)

~~~
peteyPete
I think it's a great way to start the conversation, as most will likely see
behavior in there that they can identify with, or see in their immediate
family. The information is also presented in a way that everyone should be
able to understand what's at stake.

