
Facebook's employees reckon with the social network they've built - uptown
https://www.buzzfeednews.com/article/ryanmac/facebook-employee-leaks-show-they-feel-betrayed
======
creato
I feel like all of this debate is painfully missing the point. The problem is
not some content moderation policy, the problem is that social media has
changed social conversations from small local interactions into monstrous
virtual fight clubs between millions of people simultaneously, and where the
most extreme opinions are rewarded with the most attention. Boring, level-headed
opinions used to at least have a chance of rising above the noise. Not anymore.

~~~
Nextgrid
The other problem is that the business model of social media is based on
generating "engagement" at all costs, so the platforms are built to encourage
outrage as it generates lots of engagement, among other addictive behaviors
(the infinite "algorithmic" feed for example). Social media was supposed to be
a tool that serves people but its current business model encourages it to work
against people.

There were plenty of other technologies that could've been used to organize
large-scale virtual fight clubs (forums, BBSes, chatrooms, maybe even the
telephone) but this didn't happen because nobody actually wanted to foster
such toxic behavior.

~~~
acdha
> There were plenty of other technologies that could've been used to organize
> large-scale virtual fight clubs (forums, BBSes, chatrooms, maybe even the
> telephone) but this didn't happen because nobody actually wanted to foster
> such toxic behavior.

“Nobody” is leaving out some pretty notorious sources of toxicity (e.g.
4chan). I think a key difference is that these huge platforms dramatically
increase the reach of those communities by giving them much better tools,
highly-available servers, etc. and in particular mainstreaming them into the
same place everyone else is, making it easier to recruit and share outside
those communities.

In the forum era, people had to learn about a particular site, learn the
community, maybe create an account, etc. to know these existed — now it's just
one Facebook share away and there's an advanced “engagement” system ensuring
that anyone who likes something widely shared will continue to see other
content from the same source without needing to seek it out. Brigading used to
be most noticeable as a wave of new accounts, and there wasn't something like an
ML system making that activity drive unrelated users to see it.

~~~
Retric
4chan is only 4 months older than Facebook and is hardly that profitable.
Which is what changed: online dumpster fires were not attractive to
advertisers, but add a veneer of social networking and some basic location /
demographic data and suddenly things change.

~~~
shadowgovt
And some moderation. 4chan wasn't ever going to attract ad revenue because
nobody at Coca-Cola wants their brand run alongside a goatse link.

But massive political fights aren't something that the advertisers (until
recently) see as bad imagery for their brands.

~~~
Talanes
4chan has moderation. They just tend to take their duties exactly as seriously
as everything else on the website.

------
artfulhippo
I loved Max Wang’s point that Facebook has chosen to focus on political
engagement over social engagement. Like many people who used FB early, I
remember when it was more about friends than allies (and opponents).

Facebook profits massively from the politics of outrage, and for Zuckerberg to
claim that Free Speech is a moral imperative is for him to take a Strategy
Credit[0]. While I strongly believe that Free Speech is a moral imperative, I
don’t believe that Facebook makes decisions on a moral basis.

[0]: [https://stratechery.com/2013/strategy-credit/](https://stratechery.com/2013/strategy-credit/)

~~~
jcroll
Facebook makes decisions on a profit basis, and by that metric it is
meteorically successful. These Facebook employees who spent years collecting
their $500k pay cheques might want to consider this before posting videos
criticizing the company's direction.

~~~
kccqzy
Max Wang, the person who posted the video criticizing the company, left the
company.

~~~
jcroll
My argument is you can't have it both ways. If you want to collect $500k
salaries, you're going to have to live with your employer pursuing immense
profits.

------
dr_dshiv
I think the idea is we need diversity in online social media. I'd like to see
human editors come back.

Here's how Benjamin Franklin dealt with his social media platform:

"In the conduct of my newspaper, I carefully excluded all libelling and
personal abuse, which is of late years become so disgraceful to our country.
Whenever I was solicited to insert anything of that kind, and the writers
pleaded, as they generally did, the liberty of the press, and that a newspaper
was like a stage-coach, in which any one who would pay had a right to a
place... "

~~~
1234letshaveatw
As long as the editors are "our guys" and not any of "their guys"

~~~
SaltyLemonZest
On the contrary, I regularly read articles and watch videos edited by "their
guys". I don't have much interest in listening to people scream on social
media, but I'm genuinely happy to learn the perspectives of people who don't
think the same way as me.

~~~
nullc
A consequence of this is that you may find yourself changing your views-- or
at least being more open minded about things your friends don't approve of.

If it's just politics with no direct relevance to your life-- are you really
better off that way? Does being more right, or just more open-minded, about
something you have no influence over do you any good if it alienates you from
friends or family?

~~~
SaltyLemonZest
Being more open minded allows me to maintain a friend group who's similarly
open minded, and wouldn't ostracize me for holding opinions they don't approve
of. From what I've seen this strategy works for most people, and life seems
like it'd be really stressful otherwise; if I for example got a stalker and
needed to start carrying a gun, I wouldn't want to be in a situation where my
friends will abandon me if they find out.

I of course don't begrudge anyone who feels they have to keep a closed mind in
order to maintain a reputation in their community.

------
vannevar
Facebook is not a "social network" in any meaningful sense of that word. It's
a largely unmoderated narrowcast platform accessible to anyone willing to put
down the money to use it, regardless of motivation, as long as it isn't
obviously pornographic.

~~~
notacoward
Why can't it be both? I see people using it both ways, every day.

~~~
vannevar
You see people _trying_ to use it that way, while wading through a morass of
paid advertising and promoted memes. I can use a brick as a hammer, too, but
that doesn't make the brick a hammer.

~~~
notacoward
Why do so many people forget that _they control their feed_? Maybe less than
some of us would like, but I carefully curate my feed and use mute and so on
and my feed has _very_ little junk in it. Mostly family stuff and jokes/memes,
with some ads for stuff that actually does interest me. It's not hard. Most of
us here solve harder problems every day.

~~~
tantalor
I did that until my feed was mostly empty and uninteresting, so I deleted my
account.

~~~
dingaling
But that assumes that the feed is the primary value of FB. For me the value is
in the social graph. I don't even follow the posts of people.

------
koheripbal
This long article doesn't address the most critical point.

If employees have an ethical issue with working there, why don't they quit?

This isn't a new dilemma. People working for tobacco companies have always
been in the same boat.

~~~
paulgb
> This long article doesn't address the most critical point.

> If employees have an ethical issue with working there, why don't they quit?

The article opens with a story of someone who quit.

~~~
ciarannolan
It took him 9 years of collecting a fat paycheck and enabling Facebook to find
his moral compass.

Is it wrong to be skeptical of these millionaire developers who ride off into
the sunset to retire but leave a final "oh btw, lots of problematic stuff at
my former employer" message?

~~~
paulgb
It's not wrong to be skeptical. I do think there's a sort of boiling frog
effect, though. Internal company culture never matches public perception, and
you're always able to tell yourself a story that the public misunderstands
what drives the company. After all, you hang out with your colleagues and you
know they aren't bad people. But then some big controversy snaps you out of it
and you realize that the company's internal collective fiction is no more
accurate than the public perception, and perhaps you are the baddies after
all.

~~~
mxawng
hello, i'm the ex-employee in question and can speak authoritatively on the
subject of myself :)

i joined FB for the first time as an intern almost a decade ago. i suspect i
have a substantively different view of the company and the people who built
it—even compared to many other employees, and certainly compared to folks who
have not been in the ~room where it happens~ (heh).

even accounting for some amount of insider bias, i think there's still a
material discrepancy between how the public viewed certain major FB
"scandals"—via the lens of media spin and profitable reporting—and how many
folks like myself, who were privy to additional context and private
information, viewed them. to some people, the recent discord around hate
moderation might seem like just more of the same FB badstuff. not so for me!

even folks at FB who don't work directly on certain products and policies are
often immersed in discussion about them (unless they aggressively
unsubscribe). these discussions, again, have their bias, but i hope we can
concede that it's still a lot of passive brain cycles being spent swimming
among these topics. folks develop deeper intuitions for how troubling X or Y
publicized issue actually is, relative to all the things (of all flavors) that
happen at FB and among its users. there's also a ton more discussion of
legitimately positive societal work and how to extend those successes, whereas
those rarely make good media narratives.

the good doesn't cancel out the bad—that's never how it works—but you do have
to consider both together at every decision point.

in the leaked audio of my internal video post, i say that my long road to the
door started in 2019. i still stand by that inflection point, and feel that
before then, FB—while clearly on the back foot for some time due to rampant
abuses of its product by powerful groups—was on balance moving in a direction
i supported.

you may have made a different choice—or you may feel you would make a
different choice, though you may not have the full slate of inputs right now
to be sure you would, in situ. but, at least for me, i have reasons why the
questions around the 2016 election or fake news or data breaches were things
which, to me, seemed categorically different from the questions concerning
hate moderation.

feel free to DM any earnest questions about why these things don't all congeal
into one mass of problem for me (and i suspect other current and former
FBers).

~~~
ciarannolan
Thanks for this, Max. Easy upvote.

I'm still of the opinion that you chose personal gain over doing what you knew
to be the right thing. It's something everyone (me included) does many times
over. But the scale of wrongdoing by Facebook is exceptional.

Equally possible is that our intuitions about "the right thing" are wildly
different. I think that democracy, societal cohesion, and personal privacy are
important and that Facebook has permanently damaged all three.

Which do you think it is?

~~~
mxawng
i think you may be setting up a few false dichotomies here.

> the scale of wrongdoing by Facebook is exceptional

absolutely—but i think you (and certainly, many many people in the world) may
be missing that this is in large part due to facebook's scale, period, not any
specific wrongdoing-at-scale. facebook is enormous and has enormous impact.
enormous quantities of good and bad things happen on facebook and through
facebook every day. it's not enough to /just/ point to its "scale of
wrongdoing" to say that it's "wrong" to associate with it. every government in
the world causes harm at scale. should we embrace true anarchy? are we all
culpable for participating in society? i mean, maybe yes to that last
question; but it's not a very interesting answer, possibly because it's not a
very interesting question.

i think you have to consider things holistically. facebook does harm—that's
bad! does it do good at scale, also? how much of the harm is facebook's
"responsibility", at least from the perspective of assigning moral
culpability? (b/c obviously it's better to treat all fouls as your
responsibility, because you control only your own actions.)

consider the 2016 election mess on social media. do we pin the blame squarely
on social media here? if so, why? i think social media was caught off-guard.
social media started its life as small circles and communities, grew into a
media and meme distribution platform, and then somehow became hijacked by
powerful entities fueled by state money (including states themselves) as a
dezinformatsiya and propaganda side hustle. was it FB's responsibility to
predict and prevent that, even when so few others did or could? if so—are
those other people and platforms not morally culpable for not making enough
noise or action? are the state actors themselves not morally culpable for
their atrocious agency?

in late 2016 and early 2017, i read lots of opinions and articles on the NYT
about how FB didn't defend against disinformation and propped up negative
political content. why did the NYT write so few opinions and articles about
how the NYT published story after story of tabloidal but-her-emails drama, or
how the NYT gave the trump sitcom team so much free press?

facebook is very big and powerful but it is not beyond exploitation. should we
hold it morally culpable for being exploited in these ways? facebook is not
always just the messenger, but a lot of the time it is, and perhaps we should
not be so quick to proverbially shoot it.

> I think that democracy, societal cohesion, and personal privacy are
> important and that Facebook has permanently damaged all three

dovetailing off the above: are the instruments of the state not causing this
same damage? are traditional media conglomerates not causing this same damage?
are political entities accepting massive cash injections not causing this same
damage? are certain social institutions like megachurches not causing this
same damage? are social norms sculpted by late capitalism not causing this
same damage?

do you curse your server for bringing you bad food?

this question is rhetorical. the answer is obviously yes, people do that all
the time, but like, maybe they shouldn't.

so for me the question is, how much of this damage do i feel that facebook is
"immediately" responsible for, versus "secondarily" responsible for. i brought
up the 2016 election above. my view is that FB was in large part taken by
surprise and exploited in the course of those events.

obviously, i have insider bias here. but much as both-sides-ing every single
argument is not actually a "neutral" position, neither is being on the
"outside" and not having insider bias. neutrality is a fiction; there's
nothing truthier about being on the inside vs. the outside.

but my position gave me (and other FBers) access to information about motion,
decisions, and human actions which informed my evaluation of culpability. many
of the "scandals" in FB-the-darling-child-of-the-media's history had a similar
feel: facebook had some posture, it largely worked alright and allowed for
good or neutral stuff to happen, conditions changed, "small" (but still at-
scale) but high-profile harm was incurred due to some bad-faith actors, and
facebook responded. not interesting to me, someone who watches this process
happen literally every day.

employees didn't have trust in FB because they were given literal kool-aid to
drink; it's because they had extra information and sometimes it led to obvious
conclusions that are utterly non-obvious without that information.

but i think facebook's response to hate moderation has a different texture.
it's had years to adapt to the new reality of constant assault from political
forces (though i would never, ever expect or want perfection here). but rather
than pushing back—as it did in pretty much all past privacy or data
breaches—it seems to be adjusting to explicitly allow for and tolerate some of
this behavior which i consider "bad".

i stayed at FB past the moment i developed this concrete concern for entirely
selfish reasons. but don't conflate that inflection point with
the whole history of FB's narrative, as told by the media. you may feel the
whole story has a uniform texture, but i don't, and i suspect other FBers do
not, and i suspect it's because there are good reasons to feel that way.

------
mherdeg
Facebook's famously tight-knit and tight-lipped engineering organization has
become increasingly leaky in the past couple of years, as posts and data make
it to the press with surprising regularity.

What changed? Is it morale? Or something that happens at scale? Does the rate
of leaks say anything about the health of the company?

~~~
mandevil
Talk to a reporter about why people leak. The reason people leak is when they
feel that the organization doesn't listen to them and they can't steer the
organization any other way. If your employees feel respected and listened to-
and that their internal warnings are being heeded- they won't leak to an
outsider. If they feel that they are a Cassandra, doomed to be correct but
ignored, they will leak.

Now sometimes workers are wrong- the problem they were obsessed with wasn't
actually that serious, their boss did take their advice into account and
decided to resolve it a different way, etc., but in general, lack of respect
is why people leak.

The implications for Facebook are obvious.

------
tech-historian
> "Yaël Eisenstat, Facebook's former election ads integrity lead, said the
> employees’ concerns reflect her experience at the company, which she
> believes is on a dangerous path heading into the election.

> “All of these steps are leading up to a situation where, come November, a
> portion of Facebook users will not trust the outcome of the election because
> they have been bombarded with messages on Facebook preparing them to not
> trust it,” she told BuzzFeed News.

> She said the company’s policy team in Washington, DC, led by Joel Kaplan,
> sought to unduly influence decisions made by her team, and the company’s
> recent failure to take appropriate action on posts from President Trump
> shows employees are right to be upset and concerned."

This is pretty damning stuff, coming from named, authoritative sources inside
the company. I'm hopeful that the recent advertiser boycott will help shift
priorities with leadership at the company, but I'm not holding my breath.

~~~
creaghpatr
This is extremely predictable though, because a sizable segment of Facebook
users did not trust the outcome of the 2016 election. Why would it be any
different in 2020?

~~~
tomrod
People generally trust the outcome, whether they like it or hate it. What
folks aren't trusting is the inputs.

2020 is not 2016. Polling is suggesting a large shift. That may cause distrust
in the outcome itself if the outcome doesn't align with polling due to voter
suppression efforts and other possible issues.

~~~
jfengel
Is the polling really all that different from this point in 2016? As I recall,
in 2016, the eventual winner was way down in the polls right up until election
day.

~~~
creato
By the time of the election, the polls had narrowed considerably. There were a
lot of clowns with predictions of 99% chances and whatnot, but people that
actually understood statistics and polls had the race a lot closer to a coin
flip at the time of the election (IIRC 60-40 odds).

~~~
Tempest1981
FiveThirtyEight had Trump at 29%. Most others had him at 1-15%. Betting
markets had him at 18%.

[https://fivethirtyeight.com/features/why-fivethirtyeight-gave-trump-a-better-chance-than-almost-anyone-else/](https://fivethirtyeight.com/features/why-fivethirtyeight-gave-trump-a-better-chance-than-almost-anyone-else/)

------
lr
Social media, and all of the "talking heads" on network news and talk radio,
are basically in an arms race, just like advertisers have been for years. For
instance, Coke can't stop advertising because then Pepsi or another brand will
take over, i.e., they have to keep it up. We are now at that same point with
politics: everyone has to keep it up, and even more so; constantly upping the
ante. Facebook is the dominant platform for upping the ante, which gives them
money, and more importantly, power.

------
foofoo4u
There seems to be a growing consensus here on HN that Facebook's ad-based
revenue model is a root cause of the extremity, polarization, toxicity, and
usage addiction occurring on the platform. All of these elements drive up
engagement, and the more engagement you have, the more profit the company
makes. It's a feedback loop that helps the company and shareholders, but at
the cost of the stability of society at large.

So here's one idea to address the problem that I'd like to get others'
thoughts on. What if we declare ad-based revenue streams for social media
platforms a form of market failure? What if we made it the law that social
networks must charge a premium, a subscription model if you will, to use their
services? "Want to use Facebook? You'll need to subscribe for $2/month." What
do you think the experience of using Facebook would be like then and, to an
extent, society at large? It has been my experience that services I must
subscribe to do a better job of respecting my data and privacy, are less
toxic, have higher quality content, and are less addicting. What I am
suggesting is that the desired behavior can be directed through proper
incentives.

------
xoxoy
it took Covid for me to really believe how dangerous social media has become
to society. it really has made people more divisive and angry and
conspiratorial.

didn’t really buy it when the Russia bot thing happened but now I do 100%

------
seddin
The media always blames Mark for every bad action that his company does, but
what about all the other employees, aren't they just as guilty as him?

~~~
site-packages1
It's possible to be an employee at what you think is an immoral company but be
doing something innocuous that's just a part of a bigger whole. Does improving
an internal observability tool as your job make you just as culpable as Mark?
At what point do your actions become immoral? Just working for the company?
Building anything for the company while an employee? Maybe you have to be the
last-step implementor of some bad policy before the immorality attaches. I
don't really know.

But the answers to these questions aren't clear and are up for debate. What is
harder to argue is whether the person directing everything, with a bird's-eye
view of the direction of the company, is culpable for that company's misdeeds.
I think this makes it more clear to blame Mark than "all the other employees."

~~~
viklove
> At what point do your actions become immoral?

When your livelihood is funded by this kind of vile deception and
manipulation, your actions are undoubtedly immoral.

~~~
mxawng
sounds like you should cease participation in capitalism and also society
then. i know a guy popping out of a well who would like to have a word with
you!

you might not like the messy reality, but there are lots of my former
coworkers who feel inclined to stick around because of their work visa, or to
pay off their loans, or because they felt their FB offer was a lucky break and
they want to best support their kids.

if you're not responsible for all violence or oppression in your nation as a
citizen, trust me—folks employed by a given employer are likewise not wholly
culpable for the actions of their companies.

------
nacho2sweet
Do people who work at Facebook proudly tell people they just met at a bar that
they work at Facebook? 'Cause it wouldn't play well in my circles.

~~~
KaiserPro
no, my team at least says they got bought out.

------
tims33
FB's religion around A/B testing for engagement and their views on free speech
(which are mostly good) create a complicated product that brings out much of
the worst in people. They need to reevaluate their engagement-at-all-costs
philosophy.

------
room505
I used to have a Facebook account and deleted it in 2009-2010. This is just me
brainstorming, but for such a huge company, would it make sense to have to
submit a photocopy of your ID card to create a real-name account, for the sake
of creating something of a safe online forum/community? I would think people
would be more responsible with what they post. I had done so for OfferUp,
which makes sense to me, because people need a sense of safety and security
when interacting with each other. I'd like to hear what others think.

~~~
foofoo4u
Years ago, Facebook for some reason thought I was a bot and disabled my
account. They required that I upload a government-issued ID to verify I am a
real person. I was finding Facebook to be detrimental to my health at that
time, so I felt it a perfect opportunity to leave the platform altogether.
Plus I didn't exactly feel comfortable giving them that kind of information.

------
KaiserPro
Two things annoy the living shit out of me:

1) The content policy for the public is clear, well thought out, and easy to
understand, but that doesn't apply to politicians. The rules that apply to
politicians are hidden, opaque, and made up on the fly.

2) The moral compass of the leadership team is fundamentally blinkered and
naive. They do not understand, or seem to want to understand, the viewpoints
of other people. Unless you are posh, rich, and upper-middle-class, you have
to battle really hard to get your point across.

------
deckar01
Is there a source for the video? I am only seeing an audio clip in this
article and I can’t seem to find the video by googling “Max Wang facebook”.

~~~
minimaxir
The engineer just posted the video themselves to YouTube:
[https://www.youtube.com/watch?v=oyBQ1_a70KI](https://www.youtube.com/watch?v=oyBQ1_a70KI)

------
EGreg
We should probably make a 2020 version of this post:

[https://qbix.com/blog/2019/03/08/how-qbix-platform-can-change-the-world/](https://qbix.com/blog/2019/03/08/how-qbix-platform-can-change-the-world/)

Another month, another 20 reasons to have an open source alternative.

------
rbanffy
I have the impression we invented that Big Machine that drove the Krell to
extinction in Forbidden Planet. It's just that ours is far less spectacular
than in the movie.

------
Miner49er
One thing that's interesting is that Facebook has also come under fire for
banning/censoring anti-racists and black people for talking about racism:
[https://www.usatoday.com/story/news/2019/04/24/facebook-while-black-zucked-users-say-they-get-blocked-racism-discussion/2859593002/](https://www.usatoday.com/story/news/2019/04/24/facebook-while-black-zucked-users-say-they-get-blocked-racism-discussion/2859593002/)

It would be one thing if they didn't censor anything, but censoring anti-
racism but not racism seems to show that FB is taking a pro-racism stance.

~~~
d_burfoot
From the article:

> "White men are so fragile," she fired off, sharing William's post with her
> friends, "and the mere presence of a black person challenges every single
> thing in them."

> It took just 15 minutes for Facebook to delete her post for violating its
> community standards for hate speech.

I'm pretty sure that comments like that don't count as "anti-racism".

~~~
Miner49er
They definitely do.

But anyway, that's only a single example. One woman was banned for posting a
screenshot of a racist message she received.

------
lowdose
Does WeChat cause the same problems in China as Facebook & Twitter in the
west?

~~~
Lammy
Maintaining societal cohesion is the justification for a great many of PRC's
actions (I would say 'evils', but that's my judgement), including the Great
Firewall, extermination of Uyghurs, etc. It is fundamentally a waste of time
to compare that society to America's melting pot on the level we're
discussing.

~~~
lowdose
None of the solutions China comes up with are worth discussing?

~~~
Barrin92
I've found it of little use even to bring up the frameworks of other Western
countries. I once got a lot of angry replies because I brought up the German
approach of mostly just tackling hate speech and having independent non-profit
fact-checking organisations label and correct posts, which is about as non-
invasive as it gets while trying to deal with the problem, and still got a ton
of responses about censorship and free speech.

The US is basically free-speech fundamentalist. Expression trumps truth; with
that view in mind, you by definition can't deal with the problem.

~~~
Lammy
Those concepts of "truth" and "fact" may feel like absolutes but are a product
of the observer. When people with similar beliefs/cultures/backgrounds/goals
observe the same thing, it's easy to agree on what's true and what's not. In
general I think it's not a good look to tell somebody that the world they
perceive is untrue, and that's why I support free speech as a basic human
right :)

~~~
Barrin92
we're not talking about some highbrow philosophical disagreement or argument
about cultural values, we're talking about very basic facts, which have the
benefit of actually having truth value regardless of whatever people think.

The pandemic is a pretty good example. There are actually a lot of protocols
and behaviours that have been shown to keep hundreds of thousands of people
from dying, regardless of what timezone or nation or religion or political
party you're in, yet, aided by a constant social media barrage, the US is so
divided it can't even manage to enforce them.

------
worksmart122
We need a culture that has integrity, transparency, and social responsibility.

------
ktrl
I have been wondering why Facebook has become the only platform that tolerates
right-wing speech. The other platforms took the convenient route and kicked
out right-wing voices.

Why exactly is Facebook risking both bad press and a revolt from its
overwhelmingly progressive employees?

~~~
bhupy
Facebook can afford to do so because they have higher margins allowing for
more experimentation/risk-taking. Businesses in more competitive spaces are
unable to take tough stands.

The New York Times cannot be the New York Times of New York Times v Sullivan
anymore, because their ad monopoly has been decimated. But Facebook can be the
NYT of NYT v Sullivan, because they have 40% margins on their ad business.

One thing I have been wondering is: why is Google so lacking in conviction?

------
nailer
> a post from President Donald Trump that seemingly called for violence
> against people protesting the police killing of George Floyd.

This seems very unlikely, and there's no link or screenshot of the actual post
to support the statement. Twitter has previously taken statements that laws
will be enforced as 'calls to violence' so it wouldn't be surprising if the
same thing is occurring here.

~~~
bhupy
The steel-man argument is that the President's statements practically amount
to an _indirect_ call for violence. There's no other way for the state to
enforce laws except to use the messy apparatus of the police, which may result
in collateral damage/violence.

~~~
drak0n1c
The truly steel-man assessment of the President's statement of "when the
looting starts, the shooting starts" is that he meant it empirically, not
normatively -- describing what inevitably occurs in an environment of
sustained looting (shootings committed by people guarding property, looters,
and law enforcement).

If you want to be extra charitable, he possibly made that statement
empirically to describe a tragic situation he wanted to prevent entirely by
dispersing protests early. Such a justification and reasoning would be very
authoritarian and is arguably wrong in efficacy, but that interpretation does
not involve an indirect or direct threat of shooting.

My grandmother (an English teacher) once castigated email and predicted it
would condition people at all levels to not be precise in language, and warned
of how communications and ideas would degrade as a result. As a young nerd I
thought that was absurd and the opposite would be true, but now seeing Twitter
I have changed my mind.

~~~
bhupy
Yeah that's the steel-man counterargument.

My steel-man was an attempt to justify the deletion of Trump's post: by making
the best possible case that it's actually a normative call to violence (even
if indirect).

~~~
erichocean
Since Trump literally explained his comment, your steel-man seems entirely
wrong/misplaced.

I'd rather Twitter just ban Trump entirely because he's an asshole and they
don't like him. Such a policy would be far more honest.

------
sowellecho
"If a man is not a Socialist at 20 he has no heart, but if he remains one at
30 he has no head." I don't know who said this, but it's a great example of
how our minds mature.

The probably unintentional ageism in tech has a major, quite literally
civilization-changing consequence.

We are creating power centers which have too many young people who haven't
really thought a lot of things through, certainly not the unintended economic
consequences, and who glibly dismiss people who disagree with their premises
as some kind of "ism".

~~~
orhmeh09
That’s a crass dismissal of the people involved here _and_ socialism. You
think that the problem really is that everyone at Facebook is too young and is
thus more inclined toward recklessness (somehow involving socialism)?

PS: Would you be shocked to learn that some become socialists after age 30?

------
tijuco2
Social media companies have been silencing every conservative opinion on their
platforms, because most of their employees can't stand different opinions.
These people are unmanageable and want to engage in activism using a platform
they don't own. At the same time, they look the other way when it comes to
progressive comments. If he doesn't like different opinions, intrinsic to
democracy, maybe we should ask Mr Wang's parents if they miss China.

~~~
dang
Turning this into a nationalistic slur and personal attack is no way to make
your case. It also breaks the site guidelines and will get you banned here, so
please don't go that route.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

