
A petition is calling on Pornhub to prevent non-consensual videos being posted - jimnotgym
https://www.theguardian.com/global-development/2020/mar/09/pornhub-needs-to-change-or-shut-down
======
justsomedude11
I'm so happy this made the news. I signed it and I wrote to my local
representative about the petition too.

I'm a 35-year-old guy and I was never anti-porn; I was always pro-porn.
However, a few weeks ago I read an article on the BBC (it was posted on HN, I
think) about a 14-year-old girl being gang-raped, the rapists pardoned, and
the rape video uploaded to Pornhub. Pornhub refused to remove the video at
first; they only did so later, when the girl wrote a letter pretending to be a
lawyer.

Imagine having to go through all that and still be mentally strong enough to
deal with your peers mocking you and millions of strangers masturbating to the
torture you went through. Imagine being the parent. Reading that article
changed how I view porn. It also changed my life in a way. God knows how many
videos that end up on Pornhub (and the like) are rape or pedophilia, and we
don't even know it. I haven't browsed since and probably won't watch porn ever
again.

These corporations need to be held responsible. If we can do it in retail and
tech, then we can regulate the porn industry a bit too.

------
ocdtrekkie
This falls in the exact same vein as Google and Facebook and the inherent
issues with Section 230: platform immunity very directly enables and promotes
illegal content, because platforms lack an incentive to moderate it
effectively.

The article is from the UK perspective, but this is arguably an easy example
of where the United States could use the new SESTA-FOSTA to take down
Pornhub, since Section 230's immunity no longer applies to sexual exploitation
of children or sex trafficking. And the article gives multiple examples of
videos of underage persons posted without their consent.

On top of that, in light of the recent case against Girls Do Porn, PornHub has
allegedly done an extremely inadequate job of preventing re-uploads of videos
that constituted rape and abuse, and whose creators are now in jail.

~~~
adontz
I am in no way defending PornHub, but how exactly should it be [self]
moderated?

Can we use AI to detect underage and drunk people? PornHub already uses some
computer vision technology to recognize the kind of sex taking place and put
time-codes on videos. What are the limitations of this technology? Can CV
detect a drunk person and not confuse facial expressions with an orgasm? We
have seen epic failures of fully automated AI in our industry, like YouTube
taking down videos of birds singing as DMCA copyright violations. Also,
consent and legal age are defined very differently across the world, and even
within the US. Which jurisdiction do we refer to, and why? A video can be
legal for one website visitor and illegal for another at the same time. What
do we do in that case?

Can we have some kind of feedback form to report a video as... what exactly?
Private? Leaked? Amateur videos are very common. Independent models make a
living from posting videos. Any feedback system will be abused by competing
content providers, which are mostly private persons, not legal entities. So
should the reporting party take a selfie for identification, or what exactly?

What about all the hardcore BDSM content, where modeling non-consensual sex is
the very goal? Are we going to ban BDSM? What about all the fetishes, which
may include an underage-sex fetish, rape play, etc., which are legal when
performed by adults? Will computer vision recognize these exceptions? Will
people? Should we make people make the final decisions? YouTube moderators
deal with mental health issues. Honestly, I have no idea what a terrible job
moderating porn videos must be.

I understand that PornHub makes money, violates privacy, and should be
regulated, but without a sane solution we'll just punish everyone without
really solving the problem. New keywords will be coined to search for specific
content, that's it. We already have lots of them, like MILF, ATM, DP. We'll
just get DT for "drunk teen", that's it. And when that gets banned (for
instance, "rape" is a banned word on PornHub), another keyword will be coined.

~~~
ocdtrekkie
Why should AI be employed? Why is fully-automated superhuman scale something
companies should be allowed to achieve?

I mean, let's be real: how many new pornographic video uploads per day does
PornHub actually need to serve its audience? And the vast majority of content
on PornHub likely falls into two categories: major brand videos, for which
they can work directly with the responsible legal entities, and user uploads
of primarily copyright-infringing content. The former can be
streamlined by connecting directly with brands, making sure they have all
their legal ducks in order and that all of their performers are properly
employed. For the latter, a slower, non-automated approval process is probably
perfectly fine. Individual/amateur users who want to publish their own content
could enter into a business relationship with the site, to achieve a pipeline
closer to the first group, establishing a channel/brand that identifies the
performers in the video and verifies their consent.

Fetish/BDSM content is arguably pretty important to the platform (and sex
positivity in general), but again, my guess is a large portion comes from
established adult film brands. If PornHub works directly with the actual
people filming the video, verifies performer age eligibility and consent, and
then approves the videos, there is no problem with the content of the video
shown itself.

Unchecked open upload of whatever content a user has on their computer has no
real need to exist here. At the very least, every video should be associated
with the performers present in the video, and upon the revocation of consent
from that performer, all of the videos containing them could be removed and
prevented from being added again. This not only prevents abuse of
non-consenting performers, but ensures the revenue generated from a video
actually makes it back to consenting performers, since piracy is no longer
possible on the platform.

~~~
adontz
>> How many new pornographic video uploads per day does PornHub actually need
to serve its audience?

As much as possible? It was 13 videos per minute in 2019. Each minute, almost
three hours of content are uploaded.

>> And the vast majority of content on PornHub likely falls into two
categories: Major brand videos, in which they can work directly with the legal
entities responsible for them

This assumption is very natural, but the days of professional studios are
gone. There are about 100 thousand amateur models registered on PornHub.

>> For the latter, a slower, non-automated approval process is probably
perfectly fine.

Even if we estimate that just half of the content is amateur, which it is not,
PornHub would need at least 100 moderators working simultaneously at every
moment to review every minute of all uploaded videos. I don't think anyone can
watch porn for 8 hours per day, so that's at least 4 shifts. Take into account
vacations and other things, and that is easily 500-1000 trained moderators who
know the specifics of the genres and will not punish fetish performers. And
they'll need offices, computers, the help of psychologists, and security
checks, because they could be the source of a major leak. I think we are
talking about tens of millions of dollars now.
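
The staffing estimate above can be checked with quick back-of-envelope
arithmetic. A hypothetical Python sketch: the 3-hours-of-footage-per-minute
upload rate and the 50% amateur share are the thread's own figures, and the
6-hour shifts and 1.4x-2.8x headroom multipliers are illustrative assumptions,
not reported data.

```python
# Back-of-envelope check of the moderator staffing estimate.
# Thread's figures: ~3 hours of new footage uploaded per wall-clock minute,
# roughly half of it assumed to be amateur content needing manual review.
footage_min_per_min = 3 * 60   # minutes of new video arriving per minute: 180
amateur_share = 0.5            # assumed fraction requiring human review

# A moderator watching in real time covers 1 minute of video per minute,
# so the on-duty headcount equals the footage rate to be reviewed.
on_duty = int(footage_min_per_min * amateur_share)   # ~90 ("at least 100")

# Round-the-clock coverage in 6-hour shifts (24 / 6 = 4 shifts per day),
# plus assumed headroom for vacations, sick leave, and turnover.
base_headcount = on_duty * (24 // 6)                 # 360 before headroom
low_estimate = round(base_headcount * 1.4)
high_estimate = round(base_headcount * 2.8)
print(on_duty, base_headcount, low_estimate, high_estimate)  # 90 360 504 1008
```

Under these assumptions the real-time review load alone demands ~90 people on
duty at once, and shifts plus headroom push the payroll into the 500-1000
range the comment lands on.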

>> my guess is a large portion comes from established adult film brands

There are big BDSM brands like Kink, but if you sum up the minutes of content,
it's mostly amateurs.

>> If PornHub works directly with the actual people filming the video,
verifies performer age eligibility and consent, and then approves the videos,
there is no problem with the content of the video shown itself.

So PornHub now has to perform full KYC, because a US citizen would not know
what a Hungarian driver's license looks like. This is not a cheap service at
all. And PornHub now has to verify that posted videos contain only verified,
identified models, which I have no idea how to do. Also, you now force PornHub
to keep a lot of personally identifiable information about all performers,
which does not sound like a good idea in the first place.

>> Unchecked open upload of whatever content a user has on their computer has
no real need to exist here. At the very least, every video should be
associated with the performers present in the video, and upon the revocation
of consent from that performer, all of the videos containing them could be
removed and prevented from being added again.

You will just punish PornHub for being big; there is no way to enforce the
right to be forgotten across the entire Internet.

~~~
ocdtrekkie
> As much as possible?

Why? A given viewer is going to watch how many minutes of content a day? Even
allowing for a wide spectrum of preferences, there's no reason PornHub needs
three hours of new content every minute. Nobody has "already watched
everything on PornHub", have they? If they can't manage it ethically, they
shouldn't be allowed to.

> I think we are talking about tens of millions dollars now.

Good thing PornHub's reported 2015 revenue was nearly half a _billion_. They
can afford to operate ethically.

> So PornHub now has to perform full KYC

Not really know-your-customer. In this case it's know-your-content-provider. I
certainly don't suggest every viewer of porn needs to register. But every
publisher of it, including amateur channels, should.

> This is not a cheap service at all.

Again, this company operates at a half-billion-dollar-a-year scale. They can
afford to hire someone who can verify what a Hungarian driver's license looks
like. They aren't poor.

~~~
adontz
>> As much as possible?

> Why?

Because PornHub is a platform. Why not limit uploading videos on YouTube, just
because they can't moderate them all?

> Good thing PornHub's reported 2015 revenue was nearly half a billion. They
> can afford to operate ethically.

> They can afford to hire someone who can verify what a Hungarian driver's
> license looks like. They aren't poor.

First, define "ethically" in a worldwide-accepted way.

Second, you are asking a business to take a loss just because you want them
to. I don't think "ethically" is a strict legal term.

I believe there should be an unambiguous law that takes into account the
interests of all parties. People get killed by cars, but cars are legal; we
can't punish only the drivers. And we can't punish only the content providers.

> In this case it's know-your-content-provider.

I have posted about anonymity here:
[https://news.ycombinator.com/item?id=22529650](https://news.ycombinator.com/item?id=22529650)
It's simply dangerous.

~~~
ocdtrekkie
> Because PornHub is a platform. Why not limit uploading videos on YouTube

Platforms aren't an inherent good. I absolutely think YouTube should be held
accountable for its behavior and the way it radicalizes people by recommending
increasingly extreme videos. YouTube is a platform that is _out of control_,
and another example of why unchecked immunity for platforms is a dystopian
nightmare.

> define ethically in a worldwide accepted way

Not profiting from the distribution of child pornography. I think that's
pretty universally understood to be required of an ethical business. "Ethical"
isn't a strict legal term, but "distribution of child porn" is.

------
magduf
This seems like a very good example of how poor the enforcement is. Child porn
is very, very illegal, and gets "regular people" thrown in prison for the rest
of their lives just for having downloaded it and having it on their machines.
Yet a big site like this can have it posted, and reposted, over and over, and
that's perfectly OK.

Why is there no federal prosecution of not only the site for abetting, but
also tracking down these ex-bfs or whoever it is uploading the videos?
Usually, producing and knowingly distributing CP results in a very, very long
prison sentence in the US.

~~~
vorpalhex
The article states that Pornhub did comply and take down the videos. Like most
companies, they also reported these cases to the FBI or other appropriate
agencies.

What, exactly, do you want them to do?

~~~
Spare_account
Pornhub should be required to prevent illegal content (specifically in this
case child porn) from being hosted on the site at all. They should be held
responsible for content they make available.

Presumably this would require human intervention to greenlight each video once
all the necessary verification has been completed.

All platforms should be held to this standard, frankly. I don't understand how
we've gotten to the point where anything less is acceptable.

~~~
jedberg
Should Hacker News be responsible for stopping you and everyone else from
posting possibly illegal material?

The same law that protects HN protects PornHub.

~~~
ocdtrekkie
HN has competent moderators who do a reasonable job of removing illegal or
harmful content. (Hi dang <3)

Section 230 protects bad actors, not good ones. Platforms that invoke Section
230 as a defensive measure are generally either refusing to moderate, or
entrusting moderation entirely to algorithms, claiming they're "too big" to
moderate their content.

~~~
vorpalhex
> Section 230 protects bad actors, not good ones

Section 230 is why you can comment here without dang having to approve every
comment you make first. It's why Facebook and Twitter and Reddit and HN can
exist. It protects _all_ actors.

~~~
ocdtrekkie
This is completely untrue. Legal cases already have the concepts of intent and
liability. Section 230 provides blanket immunity for abuses that would
otherwise clearly be defined as criminally negligent or willful behavior.

For example, without Section 230, one could define dang removing a harmful
post shortly after it was flagged (usually within hours, I imagine) as best
effort for a two-man moderation team, while finding PornHub, one of the most
valuable web properties in the world with significant manpower, criminally
negligent for taking several weeks to take down a child porn video, and hold
the company responsible. (Arguably, SESTA-FOSTA already enables the latter, as
Section 230 no longer applies to sexual exploitation of children.)

------
IAmEveryone
Is it actually a criminal offense in the US to distribute “rape porn” or
“revenge porn”? I seem to remember it’s only a civil matter.

That makes all the difference here, because Section 230 is specifically
limited to have no impact on federal criminal law.

That would also explain why child pornography is extremely rare on these
sites.

~~~
toyg
In legal terms, unauthorised revenge-porn is one thing, underage porn is quite
another - that’s sexual exploitation of minors, not a civil case.

------
Red-Ted
For a site which built its business on piracy this is no surprise!

------
gaogao
Pornohub -> Pornhub in title

~~~
unfunco
Is it not a clever Grauniad joke?

