
Facebook to add 3k people to community operations team to improve moderation - tinodotim
https://www.facebook.com/zuck/posts/10103695315624661
======
hartator
I wonder if they will push some political agenda or if they will manage to
stay neutral.

Even if I have my own opinions on the subject, I think the whole "fake news"
thing seems to be politically motivated. That doesn't mean they are wrong to
do this, but it's hard to claim they are still somehow neutral.

~~~
diogenescynic
Neutrality is a mirage. Journalism can and should be fair without trying to
appear 'balanced' or neutral. For example, there aren't really two sides to
climate change when 99% of scientists and experts agree. It's really only the
news who tries to position issues as if they are competing arguments like
sports teams. This seems like it's had a negative impact over all by giving a
false equivalency, for politically motivated reasons, to issues that are by
and large not contestable or debatable.

~~~
nonbel
>"there aren't really two sides to climate change when 99% of scientists and
experts agree"

So what exactly is this one side that 99% of experts agree upon?

~~~
vkou
That it's happening, that we're causing it, and that if we don't change
something, we are fucked.

The only disagreements are on the extent to which we will be fucked - will
we see hundreds of millions of climate refugees, or billions?

~~~
nonbel
Can you be more specific? For example, I really doubt anyone thinks climate
change is not always "happening" or that human activity has exactly zero
influence on it.

~~~
lordCarbonFiber
A certain commander in chief has in public espoused that:

"The concept of global warming was created by and for the Chinese in order to
make U.S. manufacturing non-competitive."[0].

There are plenty of misinformed people who will gladly tell you there is no
such thing, and that if there were, human activity would have no effect.
Pretending that there aren't, and that even if the issue is complex there
isn't a set of clearly _wrong_ opinions, is disingenuous in the extreme.

[0]
[https://twitter.com/realdonaldtrump/status/26589529219124838...](https://twitter.com/realdonaldtrump/status/265895292191248385?lang=en)

~~~
nonbel
I asked about "climate change". You are talking about "global warming" as if I
asked about that.

Are you saying these concepts are equivalent to the aforementioned 99% of
scientists and experts?

I'm really not trying to act dense, just figure out what it is people even
believe about this topic.

~~~
vkou
This is splitting hairs. The consequences of human-driven climate change are
widely understood to include net warming of the globe.

~~~
nonbel
For example, if a new ice age were triggered by human activity, would that
agree with both the global warming and climate change ideas?

~~~
vkou
If cows were spherical, we could agree that cows were both cow-shaped, and
sphere-shaped.

They aren't, though.

~~~
nonbel
Sorry, I have no idea how this is relevant. I thought I was asking a pretty
simple question: "What is it that the average person concerned about climate
change actually believes is going on?"

It appears this is the wrong place to get an answer to that question.

------
blfr
What's weird is that Facebook cannot rely on their users to report blatantly
criminal acts witnessed by thousands of people. It probably says more about
Facebook users than the platform and makes me doubt that doubling this or that
team size can make a meaningful difference.

Especially with this approach of manual monitoring which will probably just
result in more questionable deletions Facebook is already known for.

~~~
nikcub
Users do report posts, from what I understand quite regularly. It still
requires manual moderation, though, otherwise the reporting process can be
abused (think anti-competitive purposes).

I've yet to hear of a way this is solved without hiring thousands of people
into what are horrible jobs (see Adrian Chen's excellent reporting on the
issue[0][1]).

I have no doubt that this will eventually be solved or assisted with ML, and
that the solution is likely to come out of FB or G.

[0] [https://www.wired.com/2014/10/content-
moderation/](https://www.wired.com/2014/10/content-moderation/)

[1] [http://www.newyorker.com/tech/elements/the-human-toll-of-
pro...](http://www.newyorker.com/tech/elements/the-human-toll-of-protecting-
the-internet-from-the-worst-of-humanity)

~~~
jsmthrowaway
Those Chen articles hit close to home when I read them, because I've had to
verify and act on child pornography complaints at a hosting service. Almost
eight years later, I still have nightmares about the (thankfully) little I've
seen. When I got into hosting I didn't realize dealing with stuff like that is
table stakes; that was my first, and last, hosting employment, but I'll carry
that aspect until I die.

That experience really makes me feel bad for the 3,000 new hires, honestly. I
couldn't imagine moderating the human condition, which one could argue
Facebook basically is. They'd better not clean out Adecco, and actually pay
those people with the long-term damage of the job in mind, but that won't
happen. Would actually make for a good union...

~~~
nikcub
Completely sympathize. I had an experience in my black hat days - I broke
into a server and found folder upon folder of JPGs. Stupidly, I downloaded
some of them and opened the first, to find an image so disturbing that I
can't even begin to describe it.

We were a bit conflicted about what to do (more how to do it), and ended up
reporting it to both the US and Australian feds (which I suspect may have
given me a free pass on one of the crazier things I later did).

I really didn't take it well, but one of the guys in our group was inspired to
start a vigilante group that would hunt these distribution networks down and
it achieved some success in the 90s.

Hopefully these employees can eventually be protected with some basic level
of ML that would filter out the worst of the worst (apparently Microsoft
Research has a well-developed fingerprinting system for child exploitation
images) - because I'd really hate to imagine the scenario you and Chen
describe, and what I briefly experienced, becoming more common.
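For anyone curious, the matching idea behind that kind of fingerprinting can
be sketched in a few lines (a toy average-hash over a pixel grid, not
Microsoft's actual algorithm, which isn't public):

```python
# Toy perceptual hash: represent an image as a 2-D list of grayscale pixels,
# hash it by thresholding each pixel against the mean brightness, then
# compare hashes by Hamming distance. Near-duplicate images (recompressed,
# slightly edited) yield nearby hashes; unrelated images do not.

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # One bit per pixel: 1 if brighter than the mean, else 0.
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    # Number of differing bits between two hashes.
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [190, 20]]
slightly_edited = [[12, 198], [188, 25]]  # e.g. a recompressed copy
unrelated = [[200, 10], [20, 190]]

h = average_hash(original)
print(hamming(h, average_hash(slightly_edited)))  # small distance: match
print(hamming(h, average_hash(unrelated)))        # large distance: no match
```

The point being that moderators would only need to review images whose hash
doesn't match anything already known, rather than every upload.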

------
kensai
3,000 people on top of the current 4,500 is a big addition. If all these
people are dedicated to promptly handling complaints and TOS violations, it
might indeed make a difference.

I don't ask whether it is economically viable; I guess he knows what he is
doing. Facebook is not losing money anytime soon.

~~~
swalsh
That's 7,500 people who will be "forced", as much as freely chosen
employment can constitute force, to do nothing but look at questionable,
often offensive, content.

What a funny way our economy has evolved. 100 years ago many of these
people might have worked on farms; 50 years ago they might have worked in
factories. Today they sit in an office looking at ostensibly offensive
material that Facebook deems the rest of society should not be exposed to.

~~~
educar
So? We have thousands in the Bay Area building the next chat bot, cat-pic
AI engine, etc.

~~~
anigbrowl
Equating these two classes of activity is not even wrong.

------
malandrew
Since social media use itself contributes to lower self-image and
depression, how much will they look into their own product as contributing
to the problem getting worse for an individual? It would seem that the work
being done to drive engagement is most problematic for those at risk.

~~~
xatan_dank
Probably not at all. Facebook's collective desire for higher engagement and
profit will likely outweigh any concern for individual or collective users.

~~~
zackmorris
I hear what you're saying, but from a slightly different angle. I wonder if
the profit motive creates a conflict of interest the same way that corporate
news tends towards bias to satisfy advertisers.

I am thinking of exiting Facebook for at least a couple of months because my
posts/shares (which tend to have a political slant or at least broader
perspective to them) don't seem to get any reaction or be shared anymore.
Neither do music, alternative culture, or sustainability/environmental posts.

If Facebook is unable to give people the dignity to fail at debating one
another and be challenged by new ideas, then that may not be compatible with
democracy. I hope they fix whatever is going on with their feed algorithm, and
maybe 3000 people training AIs will help, but I wonder if the problem isn't
technology.

------
unklefolk
I suspect the long term plan is to create a training dataset labelled by the
3000 people and, when they have sufficient training data, let machine learning
/ AI take over.
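The pipeline would be conceptually simple: each moderator decision becomes a
labeled example, and a classifier learns from the accumulated labels. A toy
sketch (hypothetical labels and a bare-bones Naive Bayes, nothing like
Facebook's actual systems):

```python
# Hypothetical sketch: moderator decisions become (text, label) training
# pairs, and a simple Naive Bayes text classifier learns from them.
import math
from collections import Counter, defaultdict

def train(examples):
    word_counts = defaultdict(Counter)  # label -> word frequencies
    label_counts = Counter()            # label -> number of examples
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    total = sum(label_counts.values())
    vocab = {w for c in word_counts.values() for w in c}
    best, best_score = None, float("-inf")
    for label, count in label_counts.items():
        score = math.log(count / total)  # log prior
        n = sum(word_counts[label].values())
        for word in text.lower().split():
            # Laplace smoothing so unseen words don't zero out the score.
            score += math.log((word_counts[label][word] + 1) / (n + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

labeled_by_moderators = [
    ("free money click here", "remove"),
    ("click for free prizes", "remove"),
    ("happy birthday to my friend", "keep"),
    ("great photo from our trip", "keep"),
]
wc, lc = train(labeled_by_moderators)
print(classify("free click now", wc, lc))  # -> "remove"
```

Of course, real moderation decisions hinge on context and imagery that a
bag-of-words toy like this can't see, which is why full automation seems far
off.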

~~~
tyingq
100% automated ML doing this well seems far off to me.

Google's ML for rich snippets thinks a quarter is worth 50 cents.
[http://imgur.com/1nNreR2](http://imgur.com/1nNreR2)

Really parsing language for meaning is hard. Sentiment analysis, which seems
simpler than this problem, still has relatively poor accuracy.

I could see it perhaps thinning out how much has to be done manually, but I
don't see it removing that need entirely.

~~~
Spooky23
That's amazing. Nearly every fact presented there is completely wrong.

Plus, the picture is a standing liberty quarter that is nearly 90 years old
and pretty much completely out of general circulation.

~~~
fdsfsafasfdas
Could you specifically address his concerns for those not in-the-know about
ML?

~~~
Spooky23
Sorry, my comment was ambiguous -- I was referring to the Google result.
Thanks for pointing it out. The commenter that I was responding to made a
great point that I agree with and was highlighting as a particularly good
example of ML gone awry in the wild.

Other than the subjective statement that quarters are "useful", every fact in
the google response is wrong.

Quarters are:

- Worth $0.25 (not $0.50)

- Not the largest. (Dollar coins and half-dollars are larger)

- Made of a blend of copper and nickel. (No silver content for circulating
coins)
coins)

The picture depicts a standing liberty quarter that is no longer a circulating
coin.

It's amazing that Google would put that in production.

------
olivermarks
[https://www.buzzfeed.com/reyhan/tech-confessional-the-
google...](https://www.buzzfeed.com/reyhan/tech-confessional-the-googler-who-
looks-at-the-wo?utm_term=.voVYAdXJ7#.fjplEpV02)

The horrible reality of trying to keep offensive materials from appearing
online.

1.86 billion FB active users divided by 3000 thought police equals 620k
accounts per clean up team operative...
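For what it's worth, the back-of-envelope division above checks out:

```python
# Back-of-envelope check of the ratio stated above.
active_users = 1_860_000_000   # ~1.86 billion FB monthly actives
new_moderators = 3_000         # the newly announced hires only
print(active_users // new_moderators)  # 620,000 accounts per moderator
```

(Counting the existing 4,500 as well would bring it down to roughly 248,000
accounts per person, still an absurd ratio.)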

~~~
wu-ikkyu
Buried somewhere deep inside their 1,000 page employment agreement is probably
something along the lines of "you willingly agree to subject yourself to this
obvious psychological torture and we are in no way responsible for the mental
problems which will inevitably result from your job duties."

------
bluetwo
Was just saying last week that the last thing any of these large internet
companies wants to do is hire a large room full of low-paid workers to do
anything, especially here in the US.

If they are making this move they must see some large liability looming on the
horizon.

~~~
bogomipz
I didn't understand the "especially here in the US" part of your comment. Can
you explain?

~~~
walshemj
It's not like low-paid and probably part-time workers have many employment
rights in the USA.

~~~
cookiecaper
Compare to Bangladesh, Vietnam, or the Philippines. Low-wage American workers
are not very competitive in a global marketplace. If the cost of shipping and
importing goods and services < the cost of complying with U.S. regulations and
paying American workers, there's every economic reason to have the work sent
out to less regulated, lower-paid regimes.

The way to fix this is to even the playing field by ensuring that it's _not_
cheaper to pay to have this done overseas. Otherwise, the "American-made"
companies just have to hope that's enough to convince customers to help them
make up the gap in profits ... and it's usually not.

~~~
malandrew
That would require a thoughtful examination of our housing policies in the US
that are designed first and foremost to drive up property as an asset class.
It's probably the single biggest driver of inflation in the US that
necessitates increases in wages.

------
blauditore
I wonder if this will reduce the problem of fake accounts. I regularly get
such friend requests, and it starts to get annoying.

Also, those seriously affect the attractiveness of ad campaigns. I dipped my
toes in once, but it looks like a large percentage of the gained "users" are
just fake ones...

~~~
tyingq
My wife made an account for our dog, with an appropriate headshot as the
profile photo, about 9 years ago. He posts regularly, and we tag him in
photos.

My guess is that they only block fake accounts if they are manually reported
by someone.

~~~
Flammy
You would think so, but as a counter-example: Xbox (back in the 360 days)
had a "report gamertag as offensive" option (i.e. an offensive username).

Users could pay $ to change their gamertag (~$10).

It was fairly common to see people publicly asking "please report my name as
offensive" as it was known that if you got enough reports, you would be
"forced" to change your name for free. People who wanted to change their name
anyway would solicit reports against themselves to avoid a $10 fee.

Clearly no human was reviewing these reports... and that was a paid service!

------
socrates1998
Long overdue. I sort of get these online social media companies skimping on
moderation while they are growing and don't have cash.

Facebook is rolling in cash, and this has clearly hurt their brand. Hiring
moderators to take out the worst of Facebook could go a long way toward
dealing with the utter bullshit that goes on.

Reddit and Twitter have similar problems, they want to either farm out
moderation to volunteer users (Reddit) or automate everything and only step in
when the NY Times gets a hold of something (Twitter).

Either way, their moderation leaves a lot to be desired.

Reddit is particularly strange. I don't know how this could be true, but they
claim that without the volunteer mods, they couldn't exist. Either they are
lying or are just awful at running a business, neither would surprise me.

Does anyone know if Reddit actually makes money? And if they do, how? Ads seem
sparse and selling "gold" just doesn't seem like much.

------
zoul
Slashdot’s moderation and meta-moderation system
([https://slashdot.org/moderation.shtml](https://slashdot.org/moderation.shtml))
always comes to mind. Could something like that work for Facebook self-
moderation?

~~~
fsiefken
Yes, I always think of this. Why is this system not in use at Reddit or HN?
Is it patented? I would pay 1,000 euros if they could implement it for my
Facebook group; with 2,000 members it takes an hour a day, which would be
halved with such a system - 14 hours a month.

~~~
Crespyl
Even without the meta-moderation aspect, just having multiple dimensions for
evaluating any given post is so useful and seemingly unique.

"Upvote/Downvote" is not nearly as useful for ranking as the range of, say "+1
Funny, +1 Insightful, +1 Agree, -1 Disagree, -1 Spam, -1 Flamebait/Troll"
(I've forgotten exactly what set /. uses)
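From memory, the idea is easy to sketch (label names approximated; the real
Slashdot set and scoring rules differ):

```python
# Sketch of Slashdot-style multi-dimensional moderation. Each vote carries a
# reason, so ranking and filtering can treat "Funny" and "Insightful"
# differently instead of collapsing everything into one up/down number.
from collections import Counter

VOTE_WEIGHTS = {
    "insightful": +1, "funny": +1, "informative": +1,
    "troll": -1, "flamebait": -1, "spam": -1,
}

class Post:
    def __init__(self, text):
        self.text = text
        self.votes = Counter()  # reason -> number of votes

    def vote(self, reason):
        self.votes[reason] += 1

    def score(self):
        # Overall score, clamped to Slashdot's familiar -1..5 range.
        raw = sum(VOTE_WEIGHTS[r] * n for r, n in self.votes.items())
        return max(-1, min(5, raw))

post = Post("Interesting take on moderation at scale")
for reason in ["insightful", "insightful", "funny", "troll"]:
    post.vote(reason)
print(post.score(), dict(post.votes))
```

The per-reason tallies are what make it interesting: a reader who wants to
hide "Funny" but keep "Insightful" can filter on the reasons directly.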

------
flexie
In parts of Eastern Europe, Asia, South America, and Africa, Facebook could
hire 3,000 college educated employees for less than $3M monthly, including
taxes. So while, from a human point of view, this is a very decent thing to
do, it's not necessarily as costly as one might think, especially when
considering the possible liability or regulatory backlash they may run into
if they do nothing and Facebook becomes the place for suicides and violence.

Now, if they add all those jobs in SF, it's another thing entirely.

------
problems
This is sort of ridiculous; they can't seriously be expected to be held
responsible for every video served on their platform. Nor is there anything
inherently worse about the live-streaming of violence than about that
violence occurring in the first place.

I can only see this as a good thing if they manage to catch people before the
act and intervene. Is this a primary goal of the program?

~~~
cookiecaper
Yeah, random acts of violence getting exposure as they're live-streamed on
Facebook has been interesting, to say the least. Without the criminal's
willful publication of their act, the crimes may have gone unsolved and
ignored for a very long time.

I'm not sure that we really need to stop people from publishing videos of
themselves executing old men at random. It certainly seems to lead to a speedy
resolution and really heightens the impact over the impersonal "Another
shooting today..." on the evening news. It's virtually the same as walking
yourself into the police station, with the added bonus that the whole country
now hates you and is looking for you.

~~~
jacquesm
Without an audience the crimes might never have happened in the first place.

~~~
anigbrowl
That seems highly unlikely to me, given the abundant history of covert
violence. Obviously it's hard to assess for individual cases, but across a
distribution people seem to avoid notoriety.

Perhaps we can make inferences by parallel; trolls engage in obnoxious
behavior when granted anonymity or insulation from direct consequences. Is it
likely that absent an audience for their trolling, they would be nice all the
time?

Seems unlikely, and the historical record suggests that cruelty, criminality,
and impunity are a common combination.

------
GCA10
Can't help but think of Deming's famous advice on how to deal with quality
issues: "Eliminate the need for inspection on a mass basis by building quality
into the product in the first place."

If Mark Zuckerberg and his product teams could travel back in time a decade or
so, would they still have built everything the way they did?

~~~
mcrad
Market share was likely always more important than quality.

Besides, do app developers even understand statistical quality control? I
tend to think this kind of software grew up with a very black/white,
functional/broken, bug-tracker mindset - a very narrow view of quality as a
concept.

------
devdoomari
I don't expect much from this...

I've reported countless spammy comments like "free ladies!" and "free $50
per click!", but FB's reply is always: "...but our community deemed that
comment to be OK with FB guidelines."

FB should definitely see what Korean FB has been like recently...

------
avar
I predict that Facebook's current PR problem related to this is going to be
replaced by a swatting problem before long. I.e. someone reports someone on
Live as suicidal, SWAT/police show up, shoot their dog etc, public outrage
against Facebook ensues.

------
paradite
Western social media seems to be one step behind the Chinese counterparts in
terms of moderation strategies.

The live-streaming scene in China already went through this entire
discussion, and various implementations of moderating offensive and
inappropriate content, sometime last year when it was growing rapidly.

Then again, with government intervention it is so much faster and easier to
enforce moderation standards for private companies to follow.

~~~
kyledrake
> Western social media seems to be one step behind the Chinese counterparts in
> terms of moderation strategies.

I totally agree. We have the NSA so the total information awareness network is
there, but we've really fallen behind China on using it to address suicide
intervention.

~~~
paradite
I understand your sarcasm but I believe you have misunderstood my point.

I'm not talking about secret operations, but rather some transparent
guidelines or laws issued by the government (legislative branch in the case of
US? I'm not sure.) that are enforceable by private entities like Facebook.

------
fortyniners
Machine Learning and AI are dropping the ball here?

------
6stringmerc
I get the feeling the job itself might have some side-effects, if this article
from quite a while ago has any relevance to the subject at hand:

[https://www.buzzfeed.com/reyhan/tech-confessional-the-
google...](https://www.buzzfeed.com/reyhan/tech-confessional-the-googler-who-
looks-at-the-wo)

------
user5994461
What is "CO"?

~~~
stingraycharles
Community Operations, it's mentioned in the article.

------
narrator
China has 30,000 people who work full time moderating the Internet:
[http://certmag.com/the-great-firewall-how-china-polices-
inte...](http://certmag.com/the-great-firewall-how-china-polices-internet-
traffic/)

------
avivo
They already had 4500 people. And use of the moderated features is likely
still growing rapidly.

It's great that Facebook is increasing its moderation numbers, but it's
unclear whether this was already planned and is simply being used
(successfully) as a PR response to recent events.

------
deegles
I wonder if Facebook made much money on the ads displayed alongside this
type of content (or perhaps on pre- or mid-roll ads). Do they have a
responsibility to treat this income differently?

------
genkaos
Recommended mini-documentary: "Field of Vision - The Moderators"
[https://vimeo.com/213152344](https://vimeo.com/213152344)

------
zepto
Does anyone else think it is dangerously Orwellian to describe speech as
people 'hurting themselves and others'?

We seem to have a serious problem resulting from people living in bubbles of
information sources that only confirm their own viewpoint.

How can the solution to that problem be to have a single corporation design
the bubble for everyone?

(Note: I know he's talking about actual videos of violence taking place.
However, my point is that the violence is already happening, and hiding it
from public view is 'out of sight, out of mind'.)

~~~
artursapek
Nobody's forcing you to have a Facebook account, or participate in it. It's
not "for everyone".

~~~
apostacy
Unfortunately, it kind of is becoming that way. Facebook is becoming a de
facto public utility.

The phone and telegraph companies started out as "optional" services. If you
didn't like the telegraph company arbitrarily blocking journalists that
criticized them, you were free to not use telegraphs.

Facebook is being tied to credit worthiness, and job applications, whether we
like it or not.

And in some parts of the developing world, it is more important than a phone
number. It is recognized as being so vital, that providing Facebook access is
subsidized.

We can no longer pretend that Facebook is some sort of toy, and dismiss
criticism by saying that if you don't like it you shouldn't use it.

~~~
artursapek
> Facebook is being tied to credit worthiness, and job applications, whether
> we like it or not.

Where, how? I've never heard of someone needing a Facebook account for credit
and any employer asking for your Facebook profile is a huge red flag.

~~~
apostacy
> Where, how? I've never heard of someone needing a Facebook account for
> credit and any employer asking for your Facebook profile is a huge red flag.

Here[1]. Saying "wasted" in a status update can affect a credit score. And
even if it isn't disclosed, employers and other entities may use your
Facebook activity. This is a totally valid issue. So yes, Facebook does
matter, whether you like it or not. It isn't just a toy, and you can't just
stop using it without it potentially harming you. I think it's totally
unfair, but people now have to consider their online presence when hunting
for a job.

And they don't have to ask for your Facebook account specifically. Companies
like ru4.com can tie your financial identity to your social media
identity[2]. Just as the 3 credit bureaus will find data on you even though
you never submitted anything to them, the same is becoming true with social
media. I can't log into chase.com without allowing scripts from ru4.com.
Luckily, I use a sandboxed Firefox profile that has never touched Facebook.

[1] [https://www.theguardian.com/media-network/media-network-
blog...](https://www.theguardian.com/media-network/media-network-
blog/2014/aug/28/social-media-facebook-credit-score-banks)

[2] [https://yro.slashdot.org/story/13/05/02/1521239/even-the-
ad-...](https://yro.slashdot.org/story/13/05/02/1521239/even-the-ad-industry-
doesnt-know-whos-tracking-you)

------
davexunit
Not optimistic, but hopefully things will improve. As of now, Facebook
can't even remove the account of a hate blog local to my area.

------
chinathrow
We laughed when China added x thousands to their great firewall content
moderation team.

------
zoew
Hopefully things will improve and we will see better things in our feed

------
anigbrowl
Why doesn't Facebook give people a way to have input to their 'community
standards'? Basically it's a black box that's presumably stuffed with lawyers,
marketers, and some analytics people. I see zero evidence that there is any
actual input from the people who use FB. It's essentially a dictatorship
dressed in a costume of democracy, and I would far prefer it if the 'community
standards' were called what they are, 'Rules of Mark's Club.'

This is a sore point for me as an artist. It's tedious when posts are removed
because they depict or seem to depict nudity and you have to go through and
assure some anonymous and wholly unaccountable person that they're not. One of
my friends teaches art history at UCLA and - surprise - he posts lots of fine
art on his wall. He has to have 8 or 10 accounts because he is constantly
getting temp banned for posting famous paintings of people with no clothes.

It also bothers me on a more general level. E.g., it's fine if I take a
picture of myself with my shirt off, but if one of my female friends does
the same thing she risks being restricted from posting or having her account
terminated, because her breasts are apparently worse than the extreme gory
graphic violence that comes with a warning but is nevertheless acceptable to
post.

That's sexist bullshit that turns women into second class citizens. I utterly
fail to understand how it's OK to share pictures of just about any violent
subject matter, but any kind of nudity, sexual or not, is grounds for having
your account terminated.

Here's a list of some of the things I've seen on FB over the last year, some
with an automatic clickthrough content warning (which is a good idea and
mostly well implemented) and some not. As far as I'm aware none of these have
resulted in account terminations for people who posted them:

Beheadings (video, multiple examples); hanging; people being shot/have been
shot; serial killers and their refrigerators stuffed with human meat;
disembowelments; autopsy photos. In each of these cases I don't mean grimy
thumbnails where you can sort of imagine what was going on, but photos and
video of sufficient clarity to be used in a news broadcast if not for the
disturbing nature of their subject matter.

I'm leaving out other stuff that I found sufficiently disturbing that I prefer
not to even describe it. I'm not into gore, beyond watching a few horror
movies in a given year. But I'm pretty open with my friends list and allow
people to join me to groups, so I'm exposed to a certain amount from trolls
and of course there are episodes of violence in the real world that are
newsworthy, and I prefer my news without censorship of any kind.

You'll notice that I'm not calling for this stuff to be removed or banned from
FB. I think the 'graphic content, are you sure?' warning strikes a sensible
balance between protecting people's sensibilities and allowing free discussion
and information. We live in a world that is often violent and I believe that
concealing the ugliness of violence often allows it to proceed unchecked. It's
also true that some people become obsessed with or celebrate violence, and
that admitting it as cultural currency risks desensitization or normalization
of violence. Those are tricky questions to which I do not believe any one
person, firm, or society has a perfect answer, but given that the instinct of
criminal persons and regimes is generally to conceal rather than reveal
transgressions, exposure and condemnation is probably a more effective
response than obscurity and censorship.

After that unpleasant detour into the pits of human awfulness, I _really_ want
to hear from someone at Facebook:

a) why it's OK to engage with the reality of people inflicting horrible
violence on others, but it's not OK to let people engage with the reality of
sexual or aesthetic expression, and

b) why the 51% female majority of the population are subject to tighter
restrictions than the male minority, and

c) why the 'community standards' don't offer any formal mechanism for
community input and decision-making.

Think about it, folks. A picture of a healthy naked body is grounds for
account suspension or a ban, but it's totally OK to show that same body
hacked to pieces? That's some grade-A bullshit, and platitudes about how 'we
try to reflect the prevailing standards of society' aren't going to cut it.

Automation _intensifies_ whatever process you choose to automate, and if you
automate a standard whereby erotic desire and self-expression are constrained
but extreme violence and interpersonal aggression are less constrained, guess
which you'll end up with more of? Likewise if men are allowed freedoms that
are systematically withheld from women, guess whose freedoms are going to be
expanded and whose are going to be reduced?

I demand answers on this. Facebook is one of the most powerful political
entities on the planet and those who own it need to explain why, within
Facebook, there is greater tolerance for violence than nudity or sexuality,
and why one half of the population is subject to greater restrictions than the
other half.

------
mmahemoff
Dear mods, I enjoy TLAs as much as the next geek, but can we please change CO
to Community Operations?

 _edit - that was quick, thanks!_

------
d1ffuz0r
Censorship is coming

------
gm-conspiracy
Are they hiring pre-cogs from Mars?

------
OedipusRex
The issue with work like this is that it is very distressing. Having to
look at videos and pictures of horrific acts (suicides, child porn, etc.) is
not a pleasant job. Many of those who do this in law enforcement (mainly
child porn cases) have high rates of PTSD and other issues.

------
losteverything
Mr. Zuckerberg's involvement signals that this is a huge problem, one they
are unable to spin away.

People always figure out a way around the latest attempt at control - and
these measures will be overcome by bad actors.

So if FB admits, by hiring 3,000 additional enforcement agents, that it has
a community safety and fake news problem, why would one stay with FB if
there were an alternative? Which there isn't.

I would like a FB Lite. Photo sharing, comments, and discovery of old
friends. No news. No menu of features.

I wish there could be an alternative. ("It's just like FB, but without the
news and crap.")

~~~
xatan_dank
I think some alternatives do exist, but I do not believe a near-exact clone
of Facebook with fewer of Facebook's problems exists.

Maybe it's just infeasible to properly monitor such a massive social network?
I don't know for sure, but I left Facebook long ago and have not regretted it
at all.

