
After a livestreamed suicide, TikTok waited to call police - crivabene
https://theintercept.com/2020/02/06/tiktok-suicide-brazil/
======
cj
This (again) highlights the need for tech companies to take responsibility for
what happens on their platform. And it probably won't happen without increased
regulation.

Imagine you build a real-world football stadium. Then, you open the doors up
to the public for free with minimal restrictions or rules for how the space is
used.

You brand the stadium as a "platform" and you invite the public to treat it as
such. 6 months later you have people doing drugs on one side of the stadium,
holding a white supremacy rally at the other end of the stadium, and then a
group of people in the bleachers playing russian roulette with the occasional
casualty.

The police eventually track down the owner of the stadium. The owner says
"Yes, I own the stadium, it's my property, but sorry Mr. Police, the stadium
is a Platform for free expression. I have no control what the public does in
there."

--

Introducing new regulations is usually an unpopular suggestion. It raises the
barrier of entry for competitors. It limits the market to players who can
afford to comply.

But at a certain point, IMO we should stop giving tech companies a free pass
just because they're a "platform" that's "too large to moderate".

~~~
big_chungus
> You brand the stadium as a "platform" and you invite the public to treat it
> as such. 6 months later you have people doing drugs on one side of the
> stadium, holding a white supremacy rally at the other end of the stadium,
> and then a group of people in the bleachers playing russian roulette with
> the occasional casualty.

So what? People are responsible for their own individual actions. None of what
you described is hurting anyone else. It may not be a pleasant place, but
that's okay because it's private property. Now consider that in the digital
world there's no possibility for physical harm and the example becomes
entirely pointless in any case.

~~~
leppr
It doesn't take much imagination to swap the grandparent's contrived examples
for your own definition of actually threatening deeds and still get their
point, does it? Would "operating a slave-manned factory" or "training
volunteers in crafting concealed remotely-detonated explosives" fit the bill?

~~~
big_chungus
You're putting words in my mouth. Your examples aren't consistent, either:

> operating a slave-manned factory

This is wrong. Slavery violates the personal liberty of the enslaved.
> training volunteers in crafting concealed remotely-detonated explosives

I don't see an issue with this. The information is already available online.

------
tomatocracy
This article doesn't mention that there is good evidence that publicity
surrounding high-profile suicides often leads others who are at risk to
attempt suicide too. That's why (along with respect for families) suicides are
often described more obliquely as 'x has died' when the deaths are reported.

Ensuring that these types of incidents have as low a profile in the press as
possible may well have saved lives.

------
jennyyang
This article sounds skewed against TikTok. Were the police never contacted by
anyone about the suicide? Maybe the police had already been contacted? This is
a big question the article leaves out, and omitting it makes it sound as if
TikTok alone was responsible for calling the police.

~~~
freeone3000
Bystanders have a responsibility to report, at least where I am. Even if they
are not the only ones responsible, they are still responsible.

~~~
drivingmenuts
Bystander has a different meaning online, though. Is a viewer located halfway
around the world still a bystander when they are not governed by the law
surrounding an event and may not have any way to influence said event? Is this
same person culpable?

~~~
freeone3000
Yes, it'd be the same as if you were on the phone with the person.

~~~
brianTheDog
That doesn't make sense. With a person on the phone, you most likely know
where they live. If you're on an 800 number, how do you contact the correct
authorities? A public account on TikTok can't tell you who to call... unless
it has a video of Ghostbusters, but that won't help.

------
dredmorbius
The principal failing here seems to be the lack of a prepared plan of action
for such events, and confusion within the TikTok offices as to how,
specifically, to respond.

When you are dealing with The Public and public activities, at scale and
volume, you've got to be prepared for such actions.

This means not only detection (which functioned poorly), but response.

- Contact first responders and emergency services.

- Disable harmful, inappropriate, or disturbing content.

- A set of prepared communications for specific circumstances (and adaptable
to others) such that you're _not_ scrambling to put together language in the
middle of a crisis.

It's worth noting that with networks such as Facebook and Google (the late
unlamented G+, YouTube) having literally _billions_ of users, simple actuarial
statistics mean that _tens of thousands_ of accounts are likely to represent
deaths on any given day. Contingency planning simply _has_ to account for more
than puppies and rainbows.
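That "tens of thousands per day" figure is easy to sanity-check. As a rough
sketch (the rate and user count here are my assumptions, not from the
article: a global crude death rate of about 7.5 per 1,000 people per year,
applied naively to a 2-billion-user network):

```python
def expected_daily_deaths(users: int, deaths_per_1000_per_year: float = 7.5) -> float:
    """Naive actuarial estimate of daily deaths among a platform's users.

    Assumes the user base dies at roughly the global crude death rate
    (~7.5 per 1,000 per year); real user populations skew younger, so
    this overstates somewhat, but the order of magnitude holds.
    """
    deaths_per_year = users * deaths_per_1000_per_year / 1000
    return deaths_per_year / 365

# A network with 2 billion accounts:
print(round(expected_daily_deaths(2_000_000_000)))  # 41096
```

Even discounting heavily for a younger-skewing user base, that lands squarely
in the "tens of thousands of deaths per day" range the comment describes.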

During my own time on G+, at least two of the thousand or so people I followed
closely, both within a relatively close circle, took their own lives. One was
a complete shock, though he'd been acting somewhat oddly (purging activity and
accounts, then restoring them) over the preceding months. The other had made
threats over the course of a year or so, including a previous instance in
which I'd tried frantically to find a way to alert local authorities thousands
of km away. The first time there was an intervention. The second time, there
was not. Thankfully, in both cases the evidence wasn't a stream, but silence.

I followed up as best I could with several Googlers at the time. There's
frustratingly little which can be done, though some capabilities were offered.

TikTok's response here strikes me more as simply poor planning, and the
_Intercept_'s criticisms perhaps unduly harsh. Though yes, foresight,
planning, _drilling of those plans_, and an ever-so-slightly less
self-serving slant to the response would be preferred.

 _Requiescat in pace_, Dieter and Dawn.

------
stronglikedan
Sounds like they did the right thing. The police couldn't have helped by the
time the company was aware of the post. Certainly at least _one_ of his fans
had already called them anyway.

------
Grustaf
I just heard today that TikTok is live-moderating its content; perhaps they
weren't doing that back then?

~~~
dredmorbius
As noted in the article, they were:

 _The team of moderators, referred to internally with the anodyne moniker of
“technical team,” is the only unit that works alone, apart from other TikTok
operators and staff. This team’s responsibility includes pulling any content
that violates TikTok’s terms of use or is offensive to the app’s user
community. Live broadcasts, however, are monitored by an even more elite team
operating in China, where ByteDance has its global headquarters. The team in
China was responsible for monitoring the app and discovering João’s suicide as
it was livestreamed. Yet no one saw it._

[https://theintercept.com/2020/02/06/tiktok-suicide-brazil/](https://theintercept.com/2020/02/06/tiktok-suicide-brazil/)

------
xerox13ster
[]

~~~
AdmiralAsshat
Spoiler warning, please!

