The public sphere changed with the advent of social media. At first, corporations and governments found themselves at the whims of people, and then it flipped. I believe we may be able to bring back some balance with reviewable moderation such as via a site I made, https://www.reveddit.com/about/faq/
Perhaps so; every bit counts. But I cannot see how these problematic matters are going to be fixed unless the individuals who have the power to affect the lives of others are brought to account by one means or another.
We should know who these individuals are, whether they're making decisions in our interests or otherwise.
As it stands now, many of the individuals who have the power and who have altered our lives for better or worse by their decisions remain completely anonymous behind both corporate and government bureaucracies. Anonymity has, in many instances, enabled them to act without accountability and with complete impunity.
This is the crux of the problem: until these individuals know that their decisions will be gone over with a fine-toothed comb by the population at large, and that they will be held accountable, nothing will change.
Both now and in the past, the argument against this approach is that such scrutiny will make decision-makers ineffective, in that they'll be too scared to make hard or effective decisions. This is a weasel argument made by bureaucracies to protect themselves and to make their lives easier. There are numerous ways to protect individuals who make 'sensible' decisions from, say, an over-zealous citizenry or unreasonable shareholders who may not understand all the facts, but that's for another discussion.
Change has to come because this anonymity has allowed fairness and due process to be so abused that our institutions now have an ethical crisis. It's why so few people have faith in or have respect for them anymore.
> I cannot see how these problematic matters are going to be fixed unless the individuals who have the power to affect the lives of others are brought to account by one means or another.
It's happening, even while they are anonymous. When unscrupulous mods cannot maintain the theater of innocence while continuing to manipulate, they tend to step aside themselves or be put aside by Reddit [4-6].
Deanonymizing the internet isn't going to happen because it would be hugely unpopular. And even if it were attempted, you'd need to wall the network off, because it will always be possible to be anonymous in some parts of the world.
It may not be enough to know who's in charge. Consider how many actions public figures take that we don't know about.
> This is the crux of the problem: until these individuals know that their decisions will be gone over with a fine-toothed comb by the population at large, and that they will be held accountable, nothing will change.
This is precisely it. Knowing their actions alone may be enough. It's hard to grow a following, so when faced with knowledge of their actions, mods are incentivized to incorporate users' feedback into their moderation style. Perhaps surprisingly, this can all happen while they are anonymous! And that's a good thing because again, there are parts of the world we just can't deanonymize, to say nothing of the backlash such a policy would have in the free world.
The more who know about Reddit's secret removal feature [1], the more difficult it will be for Reddit to keep that feature as is. I've observed positive changes in communities that know about it. Mod behavior does change when users are informed about removals.
Still, Reveddit is not widely known, and it can be difficult to share since mods often remove links to it. When I first made the site I assumed it would go viral immediately. I overlooked the fact that mods would remove mentions of it or even add it to their automod configurations [2] so that it becomes unmentionable.
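For context, AutoModerator rules are written in YAML. A hypothetical rule suppressing any mention of the site could look something like this (illustrative only; I'm not quoting any subreddit's actual config):

```yaml
# Hypothetical AutoModerator rule: silently remove any comment
# mentioning reveddit.com. The author is not notified.
type: comment
body (includes): ["reveddit.com"]
action: remove
```

Because `action: remove` sends no notification, the comment's author typically has no idea the mention was suppressed.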
> Both now and in the past, the argument against this approach is that such scrutiny will make decision-makers ineffective, in that they'll be too scared to make hard or effective decisions.
Yes, I've heard that. Former presidential advisor Ben Rhodes, at a recent panel on disinformation, remarked [3] that it used to drive him insane that in 2014-2015 he was unable to immediately declassify imagery of Russian weapons and personnel flowing into eastern Ukraine, even though that imagery was commercially available. He says 2022's instant declassification process is clearly much more effective.
> Change has to come because this anonymity has allowed fairness and due process to be so abused that our institutions now have an ethical crisis. It's why so few people have faith in or have respect for them anymore.
In my opinion, it's not the anonymity that is the significant issue, it is that the actions themselves are hidden in a way that users do not expect. That may be why some mods consider it to be so effective.
You're correct: anonymity isn't the principal problem; the lack of accountability and proper governance is. That means all actions of government should be open to public scrutiny. The only exceptions should be operational security matters, and even then they need proper independent oversight (e.g. is, say, CIA operation xyz in the national interest or not?).
Exposing individuals in a fucked system is a blunt tool, but it's better than nothing (and useful when other approaches fail or are unavailable).
Right, the digital era has thrown up challenges that are far from resolved. The problem as I see it is that far too few people are interested in seeing that these issues are solved and even fewer are actively engaged in doing something about solving them. The world seems more preoccupied with nonsense and trivia like yesterday's news headlines about a Kardashian wearing an ancient Monroe outfit than resolving issues that would actually improve people's lives.
I'm cynical enough to believe that these trashy distractions are encouraged; after all, the technique's not new. One of the reasons Emperor Vespasian built the Colosseum was to distract the masses from serious issues such as food shortages.
The video is good but long so I've only seen the beginning of it so far.
> The problem as I see it is that far too few people are interested in seeing that these issues are solved and even fewer are actively engaged in doing something about solving them.
I forgive people for not knowing. The worst mod abuses are hard to see. When you reveal them, the behavior disappears; either the mods leave of their own accord or Reddit removes them. Then, when you can no longer point to an egregious case, people who didn't see the previous abuses wonder what the problem is: what they see isn't that bad. I guess the solution is to continue sharing and chip away at the problem.
About the video, I only meant to share that linked clip, 30 seconds or so, although that last panel is quite good.
> I'm cynical enough to believe that these trashy distractions are encouraged
I don't think that's cynical. The book by Pomerantsev that I mentioned in another comment, "This Is Not Propaganda", argues such distractions are part of the propaganda targeting democracies.
I think revealing what gets removed gives people a better idea for how forums are managed and encourages healthier conversations by treating us all as equally worthy of access to information. Not necessarily free access, but equal access to what's publicly accessible.
It isn't enough to mark news as fake. Some people won't believe you. But when you start to show what is removed, then people start to have a better sense for what is propaganda and what is not.
> I forgive people for not knowing. The worst mod abuses are hard to see. When you reveal them, the behavior disappears
Right, we're (a) not training people properly at a young enough age and (b) not telling them the truth, so they end up cynical or misinformed or both.
Haven't read Pomerantsev, but his background would give him great insight. Similarly, Chomsky's works are full of it; here's just one: Manufacturing Consent (https://en.wikipedia.org/wiki/Manufacturing_Consent). Trouble is, Chomsky is heavy going for the average person: it's dense and almost turgid, though it's got all the good oil if one is prepared to mine his books for it. (I actually met him once on a lecture tour 20-plus years ago; he's much easier to understand in person than from reading his books.)
We need to get simple versions of these ideas out to the population at large but it's always proven difficult.
The issue, of course, is that for quite some time we've had a pretty clear overview of what the problems are, but we don't have the same power and resources to implement fixes as the 'Establishment' does. (Again, there's no doubt that we know the facts: Pomerantsev, Chomsky et al. have documented this stuff ad nauseam, and the latest developments, social media etc., only add nuance to the basic facts. The problem is that we've failed to implement the fixes.)
> It isn't enough to mark news as fake. Some people won't believe you. But when you start to show what is removed
Right again, but this takes training (and it can be slow) if it's to be done well. It comes back to what I was saying above about training. One doesn't start training a six-year-old with fake news (although that's difficult to avoid these days); rather, you teach basic logic, such as that arguments have subjects and predicates or otherwise they make little sense, and simple falsifiability tests for false or contradictory statements, e.g. 'All swans are white' versus 'Western Australia has black swans'. Even very young kids can quickly see that just one exception, even one on the other side of the planet, makes the first statement false.
In essence, kids need to develop good bullshit filters at a young age. As they grow older, they'll learn to fine-tune and better filter what they hear.
What concerns me is that this sort of training isn't done as a matter of course. (I suppose I was luckier than many, it was taught at my primary school.)
> Right, we're (a) not training people properly at a young enough age and (b) we're not telling them the truth - so they end up cynical or misinformed or both.
Could be. With or without education, many people are shocked to discover that social media platforms permit anonymous moderators to remove content while still presenting that content as live to its author. The fact that it's possible to set up a system that way doesn't mean it is expected. I wouldn't say people are dumb for not knowing or expecting that.
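As a hypothetical illustration (the comment IDs below are invented, and real tools such as Reveddit work against Reddit's public APIs rather than toy lists), the core idea behind revealing these removals is just a comparison between what the author sees and what everyone else sees:

```python
# Sketch of detecting "secret" removals: a comment the author still
# sees on their own profile, but which no longer appears in the
# public, logged-out view of the thread, was removed silently.

def find_silent_removals(author_view, public_view):
    """Return IDs visible to the author but missing from the public view."""
    return sorted(set(author_view) - set(public_view))

author_sees = ["c1", "c2", "c3", "c4"]  # author's profile shows all four
public_sees = ["c1", "c3"]              # logged-out readers see only two

print(find_silent_removals(author_sees, public_sees))  # ['c2', 'c4']
```

The point of the sketch is that the author's own view gives no hint of the removal; only comparing the two perspectives exposes it.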
> Chomsky
I understand Chomsky has a lot to say about propaganda. How much attention does he give to the censorship part of the manipulation equation? Pomerantsev, I am discovering, gives no attention to it that I can find.
> Right again, but this takes training (and it can be slow) if it's to be done well.
Yes and no. Show someone something they posted that's been removed without them knowing and they immediately understand the potential for wider abuse. The trick may be showing people things they care about being removed.