There is a moral difference between doing and not doing that you seem to be glossing over. Facebook didn't do enough to stop all sorts of things. But the question is what moral responsibility do they have for not recognizing issues soon enough, or not committing enough resources to responding, etc. It's not obvious that they have the backwards-looking responsibility that your argument requires.
The role of Facebook and other social media in swaying politics is almost always presented as if it began in 2016.
In reality, what the U.S. saw was more like version 3 of a society-manipulation tool.
Looking at the recent events in France, the 2016 election, Brexit, the Arab Spring, and to some degree the 2008 election, we can see traces of targeted information inflaming a populace for political goals going back a decade.
This is not about Facebook specifically; it's more an observation about the kinds of social media manipulation these tools make possible. The question becomes, how much of this was an intended consequence of building these tools? A cynical view could make a strong case that it was intentional.
The role of social media in the Arab Spring was spun as a positive thing at the time though, as was the way it was used by Obama in the 2008 and 2012 elections.
Sure, but that proves Facebook was aware of its influence at least that early and had enough time to consider the implications. If you find that politicians are using your product to sway public opinion, it's your moral responsibility to think about what that means and how it can be abused. You can't just shrug your shoulders ten years later and pretend you're surprised.
> There is a moral difference between doing and not doing that you seem to be glossing over.
Sure, I'm glossing over it, because it's not really a useful distinction for me to make in this case.
The difference between doing and not doing is that one is a deliberate criminal action while the other is negligence. Both are, in various ways and in various situations, considered immoral and illegal.
The harshness of the punishment may vary, but the fact of punishment does not.
By way of analogy: if a company operates a nature preserve, invites people to walk through it, then sells licenses for other people to come and shoot at the visitors, and even gives them lessons on how to shoot people, that company will probably not be in business much longer.
But negligence implies an obligation to act. Did Facebook have an obligation to recognize the breadth and depth of the Russian operation against the U.S.? I don't see how.
These are the questions that folks who want to see Facebook suffer consequences of some kind need to address. Your argument is not made simply by pointing out that bad things happened on Facebook.
You're implying the problem is Russia or that the "attack" was somehow particularly sophisticated.
It was entirely possible, using the tools Facebook provides in exactly the way they are intended to be used, to do extremely unethical and illegal things without Facebook intervening.
If I run an unprotected server and let people execute arbitrary code on it, it should be pretty clear I was being reckless if it turns out, ten years later, that script kiddies have been running elaborate botnets on it the whole time. Especially if the code examples I provide take users 90% of the way to doing that.
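To make the analogy concrete, here's a minimal, purely hypothetical sketch of the kind of "unprotected server" I mean (the handler name and port are made up for illustration). Nobody operating something like this could plausibly claim surprise at what gets done with it:

    # A deliberately reckless, hypothetical example: an HTTP server that runs
    # whatever code anyone sends it, with no authentication or sandboxing.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class RunAnythingHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            code = self.rfile.read(length).decode()
            exec(code)  # executes arbitrary code from any anonymous client
            self.send_response(200)
            self.end_headers()

    if __name__ == "__main__":
        # Anyone who can reach port 8000 can make this process do anything it can do.
        HTTPServer(("0.0.0.0", 8000), RunAnythingHandler).serve_forever()

The point isn't the specifics; it's that when your tool hands strangers that much capability by design, "we didn't notice how it was being used" stops being a defense.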