According to the article, Facebook did make the deliberate decision not to dedicate resources to Myanmar and other non-western countries, so I don't think it's negligence. "Greed" is more appropriate.
> Guy Rosen, Facebook’s VP of Integrity, told Sophie Zhang that the only coordinated fake networks Facebook would take down were the ones that affected the US, Western Europe, and “foreign adversaries.”
That's frankly sensible? There are nearly two hundred countries on earth; monitoring all of their "fake networks" would take an organization like the CIA. It's like saying it's greed that Apple doesn't solve the family issues of all their workers. It's not reasonable.
> it's greed that Apple doesn't solve the family issues of all their workers. It's not reasonable.
Of course it's greed, and of course it's reasonable. Apple is directly responsible for the work conditions of its workers and the contractors they use, absolutely. If they can't be responsible then they can close shop. There is no obligation for them to have a business.
Maybe they should consider not operating in countries where their unmonitored presence may lead to genocide?
There's no obligation to operate everywhere. Instead, as discussed in previous parts, they invested heavily to get a monopoly in third world countries.
I have a hard time following this reasoning. I would definitely consider Facebook partly responsible for this genocide if they had deliberately made an editorial decision to spread the genocidal rhetoric.
But they didn't. The situation was, as you put it, mostly unmonitored. In other words, the spread of genocidal content was organic. This is the equivalent of blaming the phone company or the ISP for what the users do with the communication channels.
Yes, a history of trying to get to the truth of matters even if it's not popular in the zeitgeist.
Look at fsociety's comment in this same thread to see just how much misinformation is out there (which he helpfully corrects)
It just so happens a lot of misinformation is anti-corporate since many people want to believe every bad thing they hear about them. So I end up being one of the few commenters who say, actually, the big company did try here.
This is important because if every big company is supposed to be equally bad, there's no reputational penalty for the actually bad ones.
Funny, talking about the truth when your comment history is neck deep in right wing conspiracy and talking points.
The point people are making here is not that Facebook should have done the impossible and created a perfect tech solution to moderate content; it's that they should not have operated in these countries at all if they couldn't do it safely. They did not even try. Instead, they ignored internal reports on these problems for years and actively made the problem worse with their single-minded focus on growth. And all for, as you put it, only a few dollars in advertising, fueling the fires of genocide.
Considering I've voted only democratic in every election in my life, and volunteered to help progressive candidates manage their campaigns in swing states, you're waaaay off base on my politics.
I really don't want to ask this since it runs contrary to HN rules, but did you really read the article?
No one expects Facebook to remove all misinformation/hate speech/etc. That would be unreasonable. But they had literally one data scientist monitoring the whole non-western world.
In a year where their annual profit is $23.9bn.
Assuming a data scientist is paid $500k/year, surely it is sensible to dedicate $20mn, less than 0.1% of their annual profit, to hire 40 more such data scientists and literally increase their effectiveness on this topic by 4000%? At the very least, that is what I would expect from a company with even a shred of morality.
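The arithmetic holds up, for what it's worth. A quick back-of-envelope check (using the figures above, which are of course rough assumptions, not audited Facebook financials):

```python
# Back-of-envelope check of the numbers above. All inputs are the
# commenter's assumptions, not verified figures.
annual_profit = 23.9e9        # assumed annual profit, USD
cost_per_scientist = 500_000  # assumed fully loaded cost per data scientist, USD/yr
extra_hires = 40              # proposed additional headcount
current_headcount = 1         # the single data scientist mentioned above

budget = extra_hires * cost_per_scientist          # total cost of new hires
share_of_profit = budget / annual_profit           # fraction of annual profit
headcount_increase = extra_hires / current_headcount * 100  # percent increase

print(f"budget: ${budget / 1e6:.0f}M")                    # budget: $20M
print(f"share of profit: {share_of_profit:.3%}")          # share of profit: 0.084%
print(f"headcount increase: {headcount_increase:.0f}%")   # headcount increase: 4000%
```

So under those assumptions the whole program costs well under a tenth of a percent of one year's profit.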
> A strategic response manager told me that the world outside the US/Europe was basically like the wild west with me as the part-time dictator in my spare time. He considered that to be a positive development because to his knowledge it wasn’t covered by anyone before he learned of the work I was doing.
When you write “they have literally 1 data scientist to monitor the whole non-western world”, I assume you mean Sophie Zhang. I agree with a lot of what she said, and I mean a lot, but you have a misconception.
The integrity org, back when I was there, had thousands of folks and was aggressively hiring. They were not a lone island and were also supported by other orgs like data infra, FAIR, security, privacy, product teams and more.
Site integrity, Sophie’s team, was a small piece of the work done there. And they, like any team, relied heavily on other teams. You have all this unbelievable tooling internally, and internal teams are incentivized to get you to use it.
These issues were not from want of trying. The reality is that problems like this at scale are incredibly difficult. In my opinion, Frances trivialized a lot of this in her whistleblowing, but it makes a good news story so what can you do.
It pains me when people trivialize it as “just do the machine learnings to fix it all” or “if these tech companies actually gave a damn this wouldn’t be an issue”. It’s an incredibly hard problem, and much like security it will never be “solved”.
That doesn’t excuse when technology has a bad effect on the world. We need to be better, and for the most part as a society we are trying hard as hell. It’s also why the work is interesting and impactful. More should get into it.
I talked to FB people (recruiters, managers) on the site integrity team back in 2018. Admittedly I had some reservations up front, but during the on-site day I made up my mind to never even consider FB as a potential employer again. During a chat session between interviews, I asked about the prospect of doing proactive education: essentially, detecting users who had been caught up in influence operations and then surfacing a note to them that they had been targeted by such activities, so that they could make educated decisions themselves.
The senior manager I was talking with at the time was visibly taken aback by the very idea. "We don't do that!"
From that experience I drew the inference that FB are fundamentally, as an organisation, incapable of doing the right thing. I suspect it's less about the cost, and more about the prospect of openly accepting accountability for what their platform is really used for.