I read it differently than "shaping public opinion". The reality is that social media (and Twitter especially) is toxic. That underlying toxicity comes from humans of course, but recommendation algorithms accelerate and highlight it because it leads to high "engagement", and thus more money. These sites already "guide" conversation, just using metrics that can end up being harmful.
A social media CEO that's interested in breaking the cycle there and trying to recommend content that's more constructive than inflammatory sounds like a great thing to me. Yes, there are a dozen pitfalls awaiting anyone that tries, but it's still worth attempting.
The problem here is that "toxic" is subjective: what I find toxic and disagreeable may not be the same thing you find toxic and disagreeable. Take, for example, the hot-button issue of gun ownership. I believe it is an essential right and an extension of self-defense; others view any talk about guns as toxic speech that should be banned.
Whose worldview should win? Mine, yours, Twitter's?
IMO platforms like Twitter should not be making the choice as to what is or is not toxic; they should be giving users the ability to curate their own feeds.
> others view any talk about guns as toxic speech that should be banned.
Do they? Are you sure? I’ve certainly seen many calls for the glorification of violence to be limited, stuff like that. But banning actual discussion of guns? I’d be interested to see examples of people advocating for that. In any case, Twitter can simply ignore people asking for that because it isn’t a reasonable request.
You can have sensible, level headed discussions about guns and gun control. You can also have inflammatory, toxic discussions about them. It’s interesting to think how you’d develop a system that prioritises the former without bringing along the latter.
> Whose worldview should win? Mine, yours, Twitter's?
What does "win" mean, here? If you're the CEO of Twitter then Twitter's worldview should always win. Of course, as CEO you also get to choose what Twitter's worldview is.
> I’ve certainly seen many calls for the glorification of violence to be limited
Guns were probably a bad example for Twitter; it would have been a better one for YouTube, which has recently cracked down hard on firearms content.
However, even on Twitter, "glorification of violence" is very subjective. For example, people celebrating the jury verdict in the Rittenhouse trial, which many believe was an attack on the very right to self-defense, have been reported by many as "glorifying violence".
Even more recently, I have seen attempts to censor conversation and video around the shooting death of Chad Read.
Then, if you are going to have a conversation around the self-defense use of guns, it will include violence; there is no way around that. If you are going to censor violence, then by necessity you have to censor guns, or relegate the discussion to hunting only.
> However, even on Twitter, "glorification of violence" is very subjective
Sure. Running a company is subjective! That's why we celebrate CEOs rather than try to perfect an objective CEO algorithm that runs in the cloud. Lines have to be drawn somewhere. You could draw the line at "absolutely anything is allowed" but that might not be a wise business decision.
> who many believe
> has been reported by many
> I have seen attempts
I'm sorry but these are all very vague assertions. I don't know what any of us are supposed to do with them.
Yes, they are pretty vague when you take them out of context and ignore the overall statement. If you read the entire thing, however, it is pretty clear what I am talking about. If you followed that news event, you would also be more aware of what I was referring to; if you did not, you may have a harder time with the context, but I think it is still pretty clear.
Twitter will amplify the most extreme positions on both ends, the ones that cause the most outrage:
- Ban all guns, completely!
- No restrictions on gun ownership should be allowed whatsoever, not even age limits!
These cause the most outrage / emotion => therefore they get retweeted the most => creating a distorted mental picture of even deeper division. We're never going to solve anything that way.
The problem isn't one of free speech and censorship. It's a problem of amplification of emotionally manipulative content. The amplification is exponential (because the retweet process is exponential). This is a disaster.
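The exponential nature of that amplification can be sketched with a toy branching-process model. Everything here is a made-up assumption for illustration (the retweet rates, the follower count, the number of rounds), not real Twitter data; the point is only that a small difference in per-exposure retweet probability compounds into an enormous difference in reach.

```python
# Toy branching-process model of retweet amplification (illustrative only).

def total_reach(retweet_rate: float, followers: int = 100, rounds: int = 10) -> int:
    """Each round, every newly exposed user retweets with probability
    `retweet_rate`, exposing `followers` more users. Expected reach grows
    geometrically with ratio r = retweet_rate * followers."""
    r = retweet_rate * followers
    exposed = followers           # the original tweet's audience
    newly_exposed = followers
    for _ in range(rounds - 1):
        newly_exposed *= r        # expected new exposures this round
        exposed += newly_exposed
    return int(exposed)

# A measured take (0.5% retweet rate) vs. outrage bait (2% retweet rate):
measured = total_reach(0.005)   # ratio r = 0.5 -> the cascade dies out
outrage = total_reach(0.02)     # ratio r = 2.0 -> the cascade explodes
print(measured, outrage)
```

With these assumed numbers, a 4x difference in retweet probability yields a reach difference of several hundred times, which is why ranking by "engagement" so reliably surfaces the most inflammatory takes.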
It's their tool, so they get to build it to their liking, until a law is created requiring them to do otherwise. The questions in the meantime are, "what is moral?" and "do we as an entity (Twitter) enforce our morality at all, and to what extent?"
For example, millions of people believe the lie, the fabrication that the 2020 presidential election result was fraudulent. If there are organizations on your platform amplifying messages that are fraudulent or intentionally misleading in nature, should Twitter take action?
The new CEO seems to think the answer is "yes".
IMO people should be able to curate their own feeds, but Twitter has the full right (and perhaps the moral obligation) to flag content that is bullshit as bullshit. The danger, of course, is that the kinds of people who fall for conspiracy propaganda from fascists on the right will then think, if their leaders' lies are called out, that the calling out is itself a conspiracy, and further entrench themselves rather than heed warnings.
"How to keep a social media platform from enabling anti-democratic demagogues" is an unsolved problem.
There are many facets of this comment I would like to address.
First, let's assume I agree that false information should be (or even can be) curbed on social media, things like Flat Earth... The problem here, as we saw with COVID, is that picking "authoritative" sources is not always accurate and tends to curb legitimate dissent as much as it does false information. Everything from the origins of COVID to the flip-flopping nature of mask guidance to discussions over mandates has been censored in various ways under the guise of curbing false information. That is very, very dangerous IMO; in fact, to me it is more dangerous than the false information itself. It is akin to the legal standard of "better 10 guilty people go free than 1 innocent be imprisoned falsely": to me, it is better that 10 false statements be spread than 1 true statement be suppressed.
Then you have to take into account the clear political bias in deciding what is "false" information. You talk about the "big lie" of election fraud, but what about the continuing lies about the Rittenhouse trial, the protests / riots, the Waukesha atrocity, Russiagate, and many others, which continue to be spread by the "authoritative sources" that many of these platforms use as Ministries of Truth? None of that gets any kind of censorship or fact-checking attached to it; it seems only one political camp has these fact-checker flags deployed against it. If you are going to fact-check the "Big Lie" about election fraud, then I want to see fact checks on all those other topics as well.
Then you talk about Twitter "flagging" content, and I actually agree that is the correct path. What Twitter (and YouTube) does to add a flag or content message directing people to different sources is a good thing. I have no problem with this; I just want it deployed in a politically neutral, fact-based way. Today it is not being done that way.
What I do have a problem with is suppression, bans, and other direct forms of censorship often employed by Twitter and other platforms. I am a firm believer that the solution to speech one believes is false or "bad" is more speech one believes is true or "good", not attempts to censor and suppress, which often have an amplifying effect.
There are better ways to win the hearts and minds of the population than authoritarian control of what people see, hear, and say. No one likes to be controlled by someone else. Censoring discussion of the issue is only going to make more people assume the election was in fact "fraudulent". Similarly, Sam Harris has said on a podcast that laws against Holocaust denial create more Holocaust deniers than they prevent, because they automatically make people assume you have something to hide, even if you don't.
In general, intelligent people feel an intellectual responsibility to question what they are told.
The irony is that millions of the people you’re trying to defend as free-thinkers who can look at any speech and make good choices, are literally controlled by Fox News and Alex Jones propaganda.
Side note: In fact I believe there’s a legal path to suing Tucker Carlson out of existence by proving, with real data, that people really do believe the nonsense. The only reason Carlson is still trumpeting destructive lies from a megaphone is that so far, judges have accepted the argument that “no one in their right mind believes that what Carlson says is true; he’s obviously a satirist.”
> are literally controlled by Fox News and Alex Jones propaganda.
You think you've got it, and though this may seem obvious, I must set you straight even if it scares you as much as it does me: it is the other way around. Fox News and Alex Jones are mere reflections of their audience, ultimately under the audience's control (as much as anyone controls their beliefs).
Your parenthetical statement casts a large shadow over the rest of your statement, because yes, control varies, and people with low control (correlates with low education) are precisely the ones who fall prey to cynical propaganda.
There's this idea that there are millions of mind-numbed conservative zombies out there, blindly following Trump or Tucker Carlson or spokesman x, but I don't see it. What I think is closer to the truth is that there are a ton of angry cultural conservatives who distrust everybody, but begrudgingly watch Carlson because they perceive him as better than the actively-hostile rest-of-the-news. That doesn't necessarily mean they're making good choices, but it's good practice to understand why people make the political choices they do.
The theory that a huge percentage of Tucker's audience watches knowing full well that Tucker is a full-of-shit, zero-content "satirist" just isn't convincing at all. Unfortunately, it has been convincing to some judges, but like I said, I hope that will change.
That said, I know a bunch of people firsthand who are getting brainwashed by Tucker and Alex Jones. They really do believe the crap.
The problem, of course, is that he won't ever do that consistently. Instead he will simply decide that people he doesn't personally like are "lying" and "spreading misinformation", while people who are powerful or whom he does like never do.
Consider that if Twitter censored everyone who believed a lie or fabrication, every public health official who claimed masks didn't work and then that they did would lose their Twitter account or be hidden. Guess what: they will never do that.
Thus it is reasonable to interpret their use of the word "healthy" to be "heavily left wing biased".
To avoid promoting toxicity, one could simply not amplify anything that's a hot-button or divisive issue. However, the incentives don't align with that.
The key issue is amplification: the promotion, in your feed, of content from accounts you're not following.
Of course, changing it back to showing only content from those you follow, in chronological order, and letting you curate from there would solve that problem as well, but there's no way they go back to that, as there's far less money involved.
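The follows-only chronological feed described above is, mechanically, a filter and a sort; there is no ranking model to game. A minimal sketch, where the `Tweet` type and the data shapes are hypothetical stand-ins rather than any real Twitter API:

```python
# Sketch of a "chronological, follows-only" feed: filter to followed
# accounts, sort newest-first. No engagement-based ranking, so nothing
# from outside your follow graph gets amplified into your feed.
from dataclasses import dataclass

@dataclass
class Tweet:
    author: str
    timestamp: int  # e.g. Unix seconds
    text: str

def chronological_feed(tweets: list[Tweet], following: set[str]) -> list[Tweet]:
    """Return only tweets from followed accounts, newest first."""
    return sorted(
        (t for t in tweets if t.author in following),
        key=lambda t: t.timestamp,
        reverse=True,
    )
```

The design point is that the user's follow set, not an engagement metric, is the sole input, which is exactly why it monetizes worse: there is no lever for the platform to inject high-"engagement" content.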
It doesn't mean "something I don't like". It means an outright lie, or a statement made in order to mislead or bait people into useless or malicious behavior.
Trolling, lying, saying stupid and libelous things with the intent to anger. Bad faith is about intent.
I also look forward to the day that "censorship" is allowed nuance. If you think Twitter deciding it won't be a party to disinformation campaigns is "censorship", we have bigger issues.
It probably is not what you personally mean, but outright censorship is what will likely happen if we do not actively resist the calls for silencing the deplorables. Freedom of expression is not the default state of the world.
Do you believe that Twitter banning the Hunter Biden laptop story (to the extent that you couldn't even DM a link to it to other users) before the election wasn't an act of political corporate censorship?
YouTube, for one, has banned a lot of discussion around guns, and gun channels. There is a very limited set of things they allow, and gun channels are walking on eggshells. The Rittenhouse trial caused a lot of bans, strikes, etc. as well, including one of the most popular law channels getting taken down for a time.
That is one example; I could instead highlight any number of other topics, like abortion, pronouns, gender, sexuality, or any of the other "culture war" topics.
I subscribe to multiple high-sub-count YouTube channels that post videos about guns on a regular basis. Are you referring to their rules about violent/explicit content? That is absolutely not a subject matter ban like 'you can't discuss guns' and the facts don't support a claim that they ban guns.
YouTube's rules enforcement for videos is notoriously bad and has been forever, but that doesn't change their actual policies.
Enforcement is more pertinent than policy. If their policy allows for such videos, but in practice removes them, it doesn't really matter what their policy is.