Hacker News
Laws hold the porn industry accountable for dangerous content (onezero.medium.com)
54 points by whatthe91 4 days ago | 58 comments





This is one of the worst articles I've read in a long time. The author weasels out of being held accountable for their opinion by saying that if porn can "thrive" with such extreme moderation, social media can "thrive" with a lesser degree. The problem is that they never specify what, in particular, social media companies should do.

The inability to post content without it being moderated beforehand is treated as if it were a good thing: the author suggests it's something social media companies should do, while staying far enough from the implication to maintain plausible deniability.

Especially ironic is that many porn sites make much of their money through ads that harvest personal data or prey on male insecurity.

There are strong arguments in favor of regulating social media that I disagree with. Those arguments are also prescriptive, not a general "we should do something". This is a very weak one.

Downvotes are the single best indicator of the problem with volunteer content moderation. People don't just downvote bad comments; they downvote comments they disagree with. Good luck getting edgy content through YouTube's Volunteer Force; it'll hurt causes you may agree with.


"We have had a legion of volunteers... who review uploads in exchange for in-site rewards, as well as the health of the community."

...or in exchange for seeing the stuff they can't see anywhere else? I wonder for how many of these "volunteers" that is the main motivation. Is this a quasi-legal way to see illegal content?


That's a really weird assumption: if you're looking for dodgy videos, it's trivial to find them on the dark web. Moderators who do the same job for companies like Google or Facebook frequently say they are traumatized by the content they see; it's not just sexual stuff, there's plenty of really explicit gore and abuse video that comes through. And if you're the kind of person who gets off on that stuff, then like I said, it's not exactly difficult to find on the dark web.

My point is: doing this "quasi-legally", like you said, must be the worst way of doing it, as you're giving the company all your real details in order to be employed. The very second anyone suspects you're doing it for the thrills (maybe you are the only employee who actually watches every video to the end?), you'd be in deep trouble.


> the very second anyone suspects you're doing it for the thrills (maybe you are the only employee who actually watches every video to the end?)

This would be unlikely. Stipulating that ordinary xHamster users are watching the ordinary videos for the thrills... how many of them watch every video to the end?


I mean, if you see an abuse video, a regular employee would flag it immediately and stop watching. Someone doing it for the thrill would watch it to the end and then flag it. It would be pretty obvious from the data, I think.
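To make that concrete, here is a toy sketch of the kind of signal that could fall out of review logs. Everything here is invented for illustration (the field names, the numbers, and the 0.8 threshold); it is not anything the article or the site describes.

```python
# Toy sketch: spot reviewers whose watch behavior on flagged videos is
# anomalous. A regular reviewer flags early; a "thrill" reviewer watches
# most of the video first. All data below is made up.

# (reviewer_id, fraction_of_video_watched_before_flagging)
flag_events = [
    ("alice", 0.04), ("alice", 0.07), ("alice", 0.05),
    ("bob",   0.96), ("bob",   0.99), ("bob",   0.91),
]

def mean_watched(events):
    """Average watched fraction per reviewer across their flag events."""
    totals = {}
    for reviewer, frac in events:
        totals.setdefault(reviewer, []).append(frac)
    return {r: sum(fs) / len(fs) for r, fs in totals.items()}

def suspicious(events, threshold=0.8):
    """Reviewers who, on average, watch most of a video before flagging it."""
    return sorted(r for r, m in mean_watched(events).items() if m > threshold)

print(suspicious(flag_events))  # ['bob']
```

A real system would need to control for video length, seeking behavior, and queue pressure, but the basic idea is exactly the one above: compare each reviewer's watch-to-flag pattern against the population.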

> Someone doing it for the thrill would watch it till the end and then flag it.

Why do you believe this? It's not at all similar to how the regular user base watches videos. Compare PornHub's in-video display of how popular various timestamps are.


You're getting hung up on the exact wording. Maybe not to the end, but they're not going to get any enjoyment if they close it five seconds after the part they're seeking starts.

Or you download it, flag it, and watch it at home once you are alone.

I never thought about the fact that the reviewers themselves can have bad intentions. Interesting.


I can only assume you would use data loss prevention software/techniques for tasks like this. Locking out USB ports and logging web traffic on the terminal would be the first step, but there are probably smarter things you could do.

True, indeed. If you want, you can somewhat prevent it.

Although, to be honest, videos are typically buffered and even cached by the browser, so logging incoming web traffic won't help there.

Preventing outgoing traffic might help. Preventing storage would be interesting. I think those machines need to be really just terminals where you can't do literally anything but click yes/no and go to the next in the queue. It'd be interesting to know how they do it.
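A minimal sketch of that "just a terminal" idea (purely hypothetical; every name below is invented, and nothing here describes how any real site does it): the reviewer only ever sees an opaque queue and a yes/no verdict, never a file handle or a URL they could save.

```python
# Sketch of a locked-down review loop: the client gets video ids, not
# files. Streaming and storage stay server-side; the only write path
# the reviewer has is a binary verdict.
from dataclasses import dataclass, field

@dataclass
class ReviewQueue:
    pending: list = field(default_factory=list)   # video ids awaiting review
    verdicts: dict = field(default_factory=dict)  # video id -> "ok"/"rejected"

    def next_item(self):
        """Hand the reviewer the next id; playback happens server-side."""
        return self.pending[0] if self.pending else None

    def submit(self, video_id, verdict):
        """The only action a reviewer can take: a yes/no verdict."""
        if verdict not in ("ok", "rejected"):
            raise ValueError("verdict must be 'ok' or 'rejected'")
        self.verdicts[video_id] = verdict
        self.pending.remove(video_id)

queue = ReviewQueue(pending=["vid-1", "vid-2"])
queue.submit(queue.next_item(), "rejected")
print(queue.verdicts)  # {'vid-1': 'rejected'}
```

The design point is the narrow interface: if the terminal can only call `next_item` and `submit`, there is nothing to exfiltrate through it, and every verdict is attributable.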


> I wonder for how many of these "volunteers" this is the main motivation - is this a quasi-legal way to see illegal content?

Well, the article said that only 1 in 20,000 uploaded videos ends up getting taken down: "Because we're very aggressive in our patrol of content, the criminals know not to use us". I'm not sure a 0.005% chance of seeing illegal content would really be enough "reward" to affect anybody's behavior.


This is similar to the argument for recruiting people who enjoy violence and killing into the military (people who haven't committed crimes, but get an adrenaline rush around these topics). Don't you want people who are good at it and enjoy it? Won't they get it done most effectively? If you have to do a job that involves "bad" things, do you get more moral points for not liking it?

I did a bit of googling and it doesn't look like there is a different section of 'review' users - it looks like all users gain rewards for reviewing (comments, votes, reports) all content, with some places rewarding unreviewed content more.

At least they are useful ^^ How many psychopaths / would-be murderers do you think enroll as snipers?

While there might be some legal factors to it, I think the problem spaces are a little different on these two websites (or generally speaking any porn site compared to facebook).

Facebook revolves around user interaction, that is videos, photos and text (in various forms, even links to external sources). Adult entertainment deals with those too, but those sites are mostly used like YouTube with lower session times.

Every minute, "510,000 comments are posted, 293,000 statuses are updated, and 136,000 photos are uploaded"[0] on Facebook; on Pornhub it is 22 video comments and 122 messages[1]. xHamster should be comparable to that. The difference in volume is striking.
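As a back-of-envelope check on those quoted figures (the per-minute numbers come from the two sources cited above; the rest is just arithmetic):

```python
# Rough ratio of Facebook's per-minute comment volume to Pornhub's,
# using the figures quoted above.
facebook_comments_per_min = 510_000
pornhub_comments_per_min = 22

ratio = facebook_comments_per_min / pornhub_comments_per_min
print(f"Facebook sees roughly {ratio:,.0f} times the comment volume per minute")
```

That is a gap of about four orders of magnitude, which is the point: pre-moderation that is feasible at 22 comments a minute may simply not scale to half a million.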

Now, with lower volume it is easier to moderate. Scanning videos isn't easy, but AI should help. AI these days still fails for text, at least for anything beyond an easy classification task. Hate speech isn't just random insults; there is more to it, and current AI systems can't possibly understand the whole context of something.

Even if content weren't released immediately on Facebook, there aren't enough people in the world to moderate the amount of content posted on Facebook every day.

I really think the comparison of those sites doesn't make sense.

[0]: https://zephoria.com/top-15-valuable-facebook-statistics/

[1]: https://www.forbes.com/sites/curtissilver/2018/12/11/pornhub...


Of course porn websites will say they have advanced measures to moderate content. The reality is that these platforms do host illegal content [1]. Even beyond child pornography, they facilitate the sharing of revenge porn, which content-wise might not be considered illegal, so I don't see any way an algorithm would remove it on its own (even a human would probably be unable to do so without additional information). Once that is out, victims are unable to do much about it. The original video might be deleted, but copies spread around, even on the same website, and are not taken down.

[1] https://www.thetimes.co.uk/article/766b96c0-fdb0-11e9-8343-7...


Beyond facilitation of child pornography distribution, there have been lawsuits related to rape being filmed and distributed. During the course of these lawsuits the videos typically remain up for streaming. There's also the underlying human trafficking problem that makes consumption of any pornography, even mainstream, a probabilistic endorsement and generation of demand for trafficking.

Most pornography involves willing young women who just want money and don't care about sex. Just watch the Netflix docs.

Not saying it's better, but it's not clear at all that pornography drives human trafficking.

Prostitution however...


> There's also the underlying human trafficking problem that makes consumption of any pornography, even mainstream, a probabilistic endorsement and generation of demand for trafficking.

That does not sound like a common perception of pornography. There are a number of quantifiable potential problems with the consumption of porn, but correlating porn consumption and human trafficking seems like a stretch.


Finding that 3 people out of 10,000 are using a set of tools to do something terrible does not equate to "consumption of any pornography, even mainstream, [being] a probabilistic endorsement and generation of demand for trafficking", any more than some restaurant owner in Carolina having kept a mentally challenged slave working in his restaurant for 10 years means anyone who eats at a diner is promoting slavery. IMHO.

The various rape allegations against some filmmakers are complex, and sadly I'd believe half of them. People should be aware of these types of things when considering porn as a profession, and when considering working with different groups of people as opposed to solo or bf/gf-only content, or via contract with a studio like Wicked.

Sadly, these kinds of things happen in Hollywood, in the middle of nowhere in the USA, and in random jobs with no skin showing in many places around the world. I've read news about this kind of thing being rampant in some corporations in India, but that doesn't mean it happens there more than other places.

Education and human rights (including women's rights and rights for sex workers) should continue to be a thing. Not fearmongering that industry A is terrible because some Harvey Weinstein was exposed coercing desperate people, so renting a movie endorses rape or coerced slavery.


Would you mind linking to some resources? I've heard this before but could never track down any good sources.

Recent case of mainstream pornographers being charged with sex trafficking [1].

Also here's a [2012] article about the connection https://www.rescuefreedom.org/wp-content/uploads/2017/10/sla...

[1] https://www.justice.gov/usao-sdca/pr/girlsdoporn-owners-and-...


Thanks, those describe some horrific scenarios. I'm also trying to find more information on how common it is and policy options that would help.


xHamster is Russian owned and controlled. I know all of their C-levels; they are here in Cyprus. It is really a brilliant company, and they know perfectly well what they are doing.

Quite unlike Facebook: there is a Russian company that makes and sells fake Facebook accounts, and, used correctly, they can live for months and even years without being banned (they're used to buy ads advertising dodgy stuff). Facebook is really helpless to stop it.


The author is essentially arguing that it's okay to regulate the tech industry because the porn industry survived regulation. It's fine to try to do good by regulating an industry but it doesn't free us from the burden of evaluating the value of each proposed law. Just because it's okay to regulate in general doesn't mean every law is reasonable.

It sounds like a pretext for end-to-end encryption regulation.


TL;DR:

1. They don't have to worry about latency due to review delays.

2. They can use vision-based detection algorithms for classification, which are much more reliable than NLP-based understanding.

3. They recruit users to do reviews for in-site benefits.

Unsaid but I suspect they also don't have to worry about user driven random virality for any content.


Surely there's a volume difference too? I would expect the images, videos, etc. that people post to Facebook to outnumber what people post to porn sites by many orders of magnitude.

There is also the issue that a lot of data posted on FB is meant to be semi-private, while content posted on these websites is meant to be public.

> They recruit users to do reviews for in-site benefits.

I think this would be the best thing for a social media platform. Using a site requires using credits, and you can get credits one of three ways:

1. Paying a monthly fee

2. Performing maintenance activities like content reviews

3. Opting-in to advertising
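A minimal sketch of that credit model (every activity name and amount below is invented for illustration; nothing here comes from the article or any real platform):

```python
# Hypothetical credit ledger for the three earning paths listed above.
# Prices are made up; the point is only the mechanic: every post costs
# credits, and credits come from paying, reviewing, or watching ads.
EARN = {"monthly_fee": 100, "content_review": 2, "ad_opt_in": 1}
POST_COST = 5  # credits consumed per post

class Account:
    def __init__(self):
        self.credits = 0

    def earn(self, activity, times=1):
        """Credit the account for one of the three earning activities."""
        self.credits += EARN[activity] * times

    def post(self):
        """Spend credits to post; refuse if the balance is too low."""
        if self.credits < POST_COST:
            raise RuntimeError("not enough credits: review, watch ads, or subscribe")
        self.credits -= POST_COST

acct = Account()
acct.earn("content_review", times=3)  # review three items -> 6 credits
acct.post()                           # spend 5, leaving 1
print(acct.credits)  # 1
```

One nice property of this mechanic is that moderation labor, subscription revenue, and ad exposure all become interchangeable currencies, so users who refuse one can pay with another.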


This is the same strategy Google and Apple use with their app stores. This level of moderation is definitely possible in the valley.

To get on the App Store you have to pay $99 a year and show some type of proof of identity. Not to mention that you have to have a Mac and know how to code.

I don’t see any similarities between what Apple does and a social media site.


Ah yes, the very properly moderated Google Play Store.

I absolutely don't want the web to end up as shitty walled gardens

Facebook isn't the web, it's already a shitty walled garden.

the problem is that any regulation that affects FB will affect the web

The answer has always been obvious. Facebook (and Google, etc) would rather keep the tens of billions of dollars in profit they make each year and keep growing that number, too.

They have no need for "cost centers" such as customer support, moderation teams, etc. Facebook doesn't even have a large enough support team for advertisers, let alone anyone else who might have real issues with their accounts, which I think should probably be illegal. You can't offer a service and then not respond to your customers' issues/problems, especially if they pay for it.

There definitely needs to be more regulation in tech regarding this aspect. These companies have all the money they need to solve some of these issues, and then some. They'd rather keep that money for themselves and try to enter completely different markets so they can become an even bigger monopoly instead, though (which is yet another good reason to put restrictions on this stuff, because the goal should be to minimize the number of monopolies).


I vouched for this comment, because I don’t see what was so objectionable about it. It seems like a genuinely held opinion, expressed in a relatively measured tone. If you disagree, reply with your reasons rather than trying to get it removed.

Plenty of Facebook/Google employees over here.

Plus he said it should be regulated. Plenty of laissez-faire folk over here too.

Nobody tried to get that comment removed.

Downvoting is effectively voting for a comment to be removed.

How so? Downvoted comments can still be seen.

When was the comment downvoted?

Hmm, it does look like they did, though.

Looking at @mtgx's history, almost all of their comments have been flagged and downvoted until they're [dead]


No, the account was banned, therefore his comments are automatically dead unless vouched for. https://news.ycombinator.com/item?id=20587230

Interesting. I didn't realise that banned accounts could comment at all. I'll keep an eye out for that in the future before vouching - in retrospect it does seem like an account worth ignoring.

I have a Facebook account, but I haven't logged in for many years now. I've never had the app installed on any of my phones (the last time I used Facebook, chat still worked in the mobile version of the web app), and recently some kids seriously asked me what Facebook even is. As far as I'm concerned it's dead, and I wonder how they are still even making any money?

Is it really just some old people there now, and a bunch of influencers seeking self-confirmation through meaningless likes, and then the remaining majority perhaps some bots and political activists spamming those few old people with stuff that nobody outside their bubble gives a damn about? This is a serious question: what's happening on Facebook nowadays that still keeps that domain alive?

> If a porn site were the subject of an investigation that revealed it had been home to millions of images of child abuse content, its parent company probably wouldn’t remain in business for much longer. For Facebook, however, a Times piece that alleged an epidemic of child abuse on its platform was just a bad news day. That’s a stark difference in reaction to the same offense, and one that should give us all pause.

So true. How is Facebook not being penalised really badly for this? Facebook distributing child abuse through its unmoderated platform is the same as if Facebook had built a bunch of robots that handed out child abuse magazines to millions of households. It is literally the same: a machine, which allegedly can't see, publicising horrendous content to millions of people. In the latter case Zuck would be locked in a prison cell at Guantanamo Bay for the rest of his life; in the former he just shrugs it off like nothing and keeps being one of the richest people on earth because of it. Quite a stark contrast.


Wow. It's a public company, there's no need to speculate based on personal habits.

https://investor.fb.com/investor-news/press-release-details/...

Q3 revenue: $17.34 billion from advertising

Daily Active Users were 1.62 billion on average for September 2019, an increase of 9% year-over-year

Monthly Active Users were 2.45 billion as of September 30, 2019, an increase of 8% year-over-year


> recently some kids seriously asked me what Facebook even is. As far as I'm concerned it's dead and I wonder how they are still even making any money?

This[0] is how they make money. 2.41 billion monthly active users.

"Active users are those which have logged in to Facebook during the last 30 days."

Advertising on Facebook seems to be also very profitable (at least that's what I read several times). I don't use Facebook, however, it's still a thing. Saying it's dead is blind.

[0]: https://www.statista.com/statistics/264810/number-of-monthly...


Facebook is still alive because it serves its primary purpose as a searchable book of faces. Even if gen Z kids are not signing up anymore, millennials will occasionally check in on their high school and college cohort and use it to look new people up. Meanwhile, boomers actively use it to stay connected with family, especially if they have family abroad. It doesn't matter if everyone is in a little bubble; that's what they want. FB can still happily serve them ads. Twitter is for people with a global focus, where bubbles clash; Facebook is for confirmation and support from friends and family. Different people want different things.

You do realize that the oldest "millennials" are now 38? What I've found is that the older people get (over 25) and the busier their lives become as they move away from family and friends, the more they use FB. If you are a teenager in high school, you see your friends every day.

I'm saying millennials check in on their high school and college cohort after they graduate. Like, "how did he or she do in the last 10 years?" sort of thing. I'm aware that all school-age kids are gen Z by now.

> Twitter is for people with a global focus, it is where bubbles clash, Facebook is for confirmation and support from friends and family.

Mhh.. that's a great way of describing it.


Facebook’s financial and publicly given business info is available online. One of the most valuable and profitable companies in the world that is still growing revenue in the double digits per year (and unlikely to end in the next two years) is anything but dead.

FB did $40B, $55B, and ~$68B in revenue for 2017-2019. Instagram has done ~$4B, $7B, and ~$15B for 2017-2019. Almost all of that revenue are from ads or other services via Facebook.com itself. And then Instagram.

FB barely monetizes two of the three most dominant chat/messenger apps in the world - FBM and WhatsApp. WhatsApp is closing in on 1.75B monthly active users. FBM is closing in on 1.5B monthly active users. Both targets should be met in 2020. FBM has Facebook in its name. Not to mention the billions of active users Facebook and Instagram (well 1B+) each have.


>I wonder how they are still even making any money?

Instagram is how they are making money


Not true at all. Estimates vary. But say Instagram is doing around $1.5B rev a month now. FB as a whole does over $5.5B rev a month.

FB did $40B, $55B, and ~$68B in revenue for 2017-2019. Instagram has been ~$4B, $7B, and ~$15B for 2017-2019. IG as a part of revenue has certainly grown from ~10% to ~22%. However it’ll be a while before it gets to be the majority. IG has possibly had its biggest growth period by now and has already had a lot of ads added.

I wouldn’t bet against it topping 1/3 revenue in 2021. Hard to predict that and beyond that. FB may also [begin to] monetize FBM and/or WhatsApp or something else by 2022.




