The inability to post content without prior moderation is presented as if it were a good thing, suggesting it's something social media companies should do while keeping enough distance from the implication to maintain plausible deniability.
Especially ironic is that many porn sites make much of their money through ads that harvest personal data or prey on male insecurity.
There are strong arguments in favor of regulating social media that I disagree with. Those arguments are at least prescriptive, not a general "we should do something". This one is very weak.
Downvotes are the single best indicator of the problem with volunteer content moderation. People don't just downvote for bad comments, they downvote for comments they disagree with. Good luck getting edgy content through YouTube's Volunteer Force, it'll hurt causes you may agree with.
...or in exchange to see the stuff they can't get to see anywhere else? I wonder for how many of these "volunteers" this is the main motivation - is this a quasi-legal way to see illegal content?
My point is that doing this "quasi-legally", as you said, must be the worst way of doing it, since you're giving the company all your real details in order to be employed. The second anyone suspects you're doing it for the thrills (maybe you're the only employee who actually watches every video to the end?) you'd be in deep trouble.
This would be unlikely. Stipulating that ordinary xHamster users are watching the ordinary videos for the thrills... how many of them watch every video to the end?
Why do you believe this? It's not at all similar to how the regular user base watches videos. Compare PornHub's in-video display of how popular various timestamps are.
I never thought about the fact that the reviewers can have bad intentions. Interesting....
Although, to be honest, videos are typically buffered and even cached by the browser, so logging incoming web traffic won't help there.
Preventing outgoing traffic might help. Preventing storage would be interesting. I think those machines need to be really just terminals where you can't do literally anything but click yes/no and go to the next in the queue. It'd be interesting to know how they do it.
Well, the article said that only 1 in 20,000 uploaded videos ends up getting taken down: "Because we're very aggressive in our patrol of content, the criminals know not to use us." I'm not sure a 0.005% chance of seeing illegal content would really be enough "reward" to affect anybody's behavior.
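As a quick sanity check on that number, 1 in 20,000 really does work out to 0.005%:

```python
# Figure quoted from the article: only 1 in 20,000 uploads is taken down.
takedown_rate = 1 / 20_000
print(f"{takedown_rate:.3%}")  # prints "0.005%"
```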
Facebook revolves around user interaction, that is videos, photos and text (in various forms, even links to external sources).
Adult entertainment deals with those too, but those sites are mostly used like YouTube with lower session times.
Every minute, "510,000 comments are posted, 293,000 statuses are updated, and 136,000 photos are uploaded" on Facebook; on Pornhub it's 22 video comments and 122 messages per minute, and xHamster should be comparable. The difference in volume is striking.
Now, with lower volume it is easier to moderate. Scanning videos isn't easy, but AI should help; for text, though, AI these days fails at anything other than an easy classification task. Hate speech isn't just random insults. There is more to it, and current AI systems can't possibly understand the whole context of something.
Even if content weren't released immediately on Facebook, there aren't enough people in the world to moderate the amount of content posted on Facebook every day.
I really think the comparison of those sites doesn't make sense.
Not saying it's better, but it's not clear at all that pornography drives human trafficking.
That does not sound like a common perception of pornography. There are a number of quantifiable potential problems with the consumption of porn, but correlating porn consumption and human trafficking seems like a stretch.
The various rape allegations against some filmmakers are complex, and sadly I'd believe half of them. People should be aware of these kinds of things if they're considering porn as a profession, and consider whether to work with different groups of people as opposed to solo or bf/gf only, or under contract with a studio like Wicked.
Sadly, these kinds of things happen in Hollywood, in the middle of nowhere in the USA, and in random jobs with no skin showing in many places around the world. I've read news about this kind of thing being rampant in some companies in India, but that doesn't mean it happens there more than in other places.
Education and human rights (including women's rights and rights for sex workers) should continue to be a thing. Not fearmongering that industry A is terrible because some Harvard Weinstein was exposed coercing desperate people, so renting a movie endorses rape or coerced slavery.
Also, here's an article about the connection: https://www.rescuefreedom.org/wp-content/uploads/2017/10/sla...
Quite unlike Facebook: there is a Russian company that makes and sells fake Facebook accounts, and, used correctly, they can live for months or even years without being banned (they're used to buy ads advertising dodgy stuff). Facebook is really helpless to stop it.
It sounds like a pretext for end-to-end encryption regulation.
1. They don't have to worry about latency due to review delays.
2. They can use vision-based detection algos for classification, which are much more reliable than NLP-based understanding.
3. They recruit users to do reviews for in-site benefits.
Unsaid but I suspect they also don't have to worry about user driven random virality for any content.
I think this would be the best thing for a social media platform. Using a site requires using credits, and you can get credits one of three ways:
1. Paying a monthly fee
2. Performing maintenance activities like content reviews
3. Opting-in to advertising
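The three-way credits model described above could be sketched roughly like this (a hypothetical illustration; all class and method names are made up):

```python
from enum import Enum

class CreditSource(Enum):
    """The three ways a user could earn credits in this model."""
    SUBSCRIPTION = "monthly fee"
    MODERATION = "content review"
    ADVERTISING = "opted in to ads"

class Account:
    def __init__(self):
        self.credits = 0

    def earn(self, source: CreditSource, amount: int) -> None:
        # Any of the three sources tops up the same balance.
        self.credits += amount

    def spend(self, amount: int) -> bool:
        # Using the site draws down the balance; block access at zero.
        if self.credits < amount:
            return False
        self.credits -= amount
        return True

user = Account()
user.earn(CreditSource.MODERATION, 10)  # reviewed some flagged content
print(user.spend(3), user.credits)      # prints "True 7"
```

The interesting design question is pricing: how many credits one review, one month's fee, or one ad impression should be worth.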
I don’t see any similarities between what Apple does and a social media site.
They have no need for "cost centers" such as customer support, moderation teams, etc. Facebook doesn't even have a large enough support team for advertisers, let alone anyone else who might have real issues with their accounts, which I think should probably be illegal. You can't offer a service and then not respond to your customers' issues and problems, especially if they pay for it.
There definitely needs to be more regulation in tech regarding this aspect. These companies have all the money they need to solve some of these issues, and then some. They'd rather keep that money for themselves and try to enter completely different markets so they can become an even bigger monopoly instead, though (which is yet another good reason to put restrictions on this stuff, because the goal should be to minimize the number of monopolies).
looking at @mtgx's history, almost all comments have been flagged and downvoted until they're [dead]
Is it really just some old people there now, a bunch of influencers seeking self-confirmation through meaningless likes, and then, as the remaining majority, perhaps some bots and political activists spamming those few old people with stuff nobody outside their bubble gives a damn about? This is a serious question: what's happening on Facebook nowadays that still keeps that domain alive?
> If a porn site were the subject of an investigation that revealed it had been home to millions of images of child abuse content, its parent company probably wouldn’t remain in business for much longer. For Facebook, however, a Times piece that alleged an epidemic of child abuse on its platform was just a bad news day. That’s a stark difference in reaction to the same offense, and one that should give us all pause.
So true. How is Facebook not being penalised really badly for this? Facebook distributing child abuse through its unmoderated platform is the same as if Facebook had built a bunch of robots that handed out child abuse magazines to millions of households. It is literally the same: a machine, which allegedly can't see, publicising horrendous content to millions of people. In the latter case, Zuck would be locked in a prison cell at Guantanamo Bay for the rest of his life; in the former, he just shrugs it off like nothing and remains one of the richest people on earth because of it. Quite a stark contrast.
Q3 revenue: $17.34 billion from advertising
Daily Active Users were 1.62 billion on average for September 2019, an increase of 9% year-over-year
Monthly Active Users were 2.45 billion as of September 30, 2019, an increase of 8% year-over-year
This is how they make money: 2.45 billion monthly active users.
"Active users are those which have logged in to Facebook during the last 30 days."
Advertising on Facebook also seems to be very profitable (at least that's what I've read several times). I don't use Facebook; however, it's still a thing. Saying it's dead is blind.
Mhh.. that's a great way of describing it.
FB did $40B, $55B, and ~$68B in revenue for 2017-2019. Instagram has done ~$4B, ~$7B, and ~$15B for 2017-2019. Almost all of that revenue is from ads or other services via Facebook.com itself, and then Instagram.
FB barely monetizes two of the three most dominant chat/messenger apps in the world - FBM and WhatsApp. WhatsApp is closing in on 1.75B monthly active users. FBM is closing in on 1.5B monthly active users. Both targets should be met in 2020. FBM has Facebook in its name. Not to mention the billions of active users Facebook and Instagram (well 1B+) each have.
Instagram is how they are making money
FB did $40B, $55B, and ~$68B in revenue for 2017-2019. Instagram has been ~$4B, ~$7B, and ~$15B for 2017-2019. IG's share of revenue has certainly grown, from ~10% to ~22%. However, it'll be a while before it gets to be the majority. IG has possibly had its biggest growth period already and has already had a lot of ads added.
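Those share figures follow directly from the revenue numbers quoted above:

```python
# Revenue figures from the comment above (USD billions, 2017-2019).
fb_total = {2017: 40, 2018: 55, 2019: 68}
instagram = {2017: 4, 2018: 7, 2019: 15}

for year in fb_total:
    share = instagram[year] / fb_total[year]
    print(year, f"{share:.0%}")
# prints:
# 2017 10%
# 2018 13%
# 2019 22%
```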
I wouldn’t bet against it topping 1/3 revenue in 2021. Hard to predict that and beyond that. FB may also [begin to] monetize FBM and/or WhatsApp or something else by 2022.