I feel like it's more ethical to just outright ban someone than to shadow ban. This is not only censorship but false advertising.
When you interact on social media you expect a certain level of fairness and visibility of your content.
This masquerades as providing that fairness when in fact you are screaming in an empty room. Shady.
Others, not so much. When they came back repeatedly with new accounts after multiple bans, with no change in behavior, they got shadowbanned. The couple of users who figured it out and came back again got the nuclear option: imperfectly, and with no guarantee against collateral damage, it linked the user's IPs, accounts, etc. They could register a new account, but the shadow ban followed them unless they were very careful; in which case we just applied it manually.
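The "ban follows the user" idea can be sketched with a union-find structure: accounts that share an identifying signal (IP address, cookie, device fingerprint) get merged into one cluster, and a shadowban on any account in the cluster applies to the whole cluster. This is a minimal illustration of the approach described above, not any real platform's implementation; all names are hypothetical.

```python
class AccountLinker:
    """Toy sketch: link accounts by shared signals, propagate shadowbans."""

    def __init__(self):
        self.parent = {}           # account -> union-find parent
        self.signal_owner = {}     # signal (e.g. an IP) -> first account seen with it
        self.banned_roots = set()  # roots of shadowbanned clusters

    def _find(self, a):
        self.parent.setdefault(a, a)
        while self.parent[a] != a:
            self.parent[a] = self.parent[self.parent[a]]  # path halving
            a = self.parent[a]
        return a

    def _union(self, a, b):
        ra, rb = self._find(a), self._find(b)
        if ra == rb:
            return
        self.parent[rb] = ra
        if rb in self.banned_roots:        # carry the ban over to the new root
            self.banned_roots.discard(rb)
            self.banned_roots.add(ra)

    def observe(self, account, signal):
        """Record that `account` was seen with `signal`; merge clusters on overlap."""
        if signal in self.signal_owner:
            self._union(account, self.signal_owner[signal])
        else:
            self.signal_owner[signal] = account
            self._find(account)  # ensure the account exists in the structure

    def shadowban(self, account):
        self.banned_roots.add(self._find(account))

    def is_shadowbanned(self, account):
        return self._find(account) in self.banned_roots
```

Note that this is exactly where the collateral damage comes from: merging on shared IPs alone will lump siblings, classmates, or anyone behind the same NAT into one cluster, which is why the comment below about a game popular with kids gave up on it.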
The thing is, you have to make the system easier for you to control than it is for them to exploit. Otherwise they’ll just grind you down until they “win”.
Shadow banning is way more effective. IP bans don't work because even tech illiterate users can figure out what's going on and get a proxy working.
We ended up going with normal account-level banning after experimenting with IP-based shadow banning. The game was popular with kids, who would tell their siblings/school friends about it. So lots of IPs were associated, and at the time teasing apart different traffic streams from the same IP was sci-fi level stuff.
But we did learn that normal banning was way less effective. If it weren't for the collateral damage, we would've kept shadow banning with IP/account association.
The shadowban slowed them way down by denying them feedback. Another user would have to respond to them for them to even know they'd gotten through. If one did, we'd manually shadowban that account too, and they wouldn't realize for a while.
Lots of work for them. Little for us.
This does not masquerade as providing fairness at all. It's nothing to do with fairness. It is removing that user in a way that makes it difficult for them to come back. That is all.
Firstly, not all shadow bans are guaranteed to be fair; some are due to judgment errors by moderators or moderation algorithms.
Secondly, I would argue that sites like Facebook are uniquely positioned to cause harm to users for whom its moderators or algorithms have failed. It has been commented on and studied that heavy Facebook use is correlated with depression. It has also been commented that Facebook feeds your dopamine response cycle. A user who has been unfairly targeted by FB moderation may feel isolated from friends and family, may feel like they suddenly lost their dopamine fix, and may therefore experience psychological distress and isolation. It's not hard to imagine how a decision to shadow ban can have real consequences for the banned, who may in truth be decent human beings not deserving of such treatment.
There is a similar dopamine response cycle in gambling addicts, but a casino barring an addict for their own safety and that of other customers is hardly seen as "unfair" or unethical, regardless of how the barred person feels. As with social media bans, it is generally seen as being for the benefit of the community and even of the banned members themselves.
A banned FB addict might spiral into depression just like a barred gambling addict, but in both cases, if anything, the ban helps the person face and overcome their addiction, whereas continued service would enable or perpetuate it.
Or: the gambling addict isn't typically nearly-forced into using a casino to keep in touch with loved ones.
The idea is that the person being shadowbanned is either communicating in bad faith (trolling or trying to underhandedly manipulate others), or is a paranoid schizophrenic.
In practice, though, shadowbans are applied to people who are persistent annoyances, even if they're not intentionally so, and even if they're not delusional. The result is a shift of the forum's Overton window. Gradually, other people become annoying outliers on contentious topics, and they have to be shadowbanned as well to "keep the peace". On and on it goes.
People who are justifiably shadowbanned — trolls and paranoids — probably check periodically to see if their main account has been shadowbanned. Otherwise they're poor trolls or not very paranoid paranoids. If a troll discovers this, they will be enraged and redouble their efforts if it's technically possible to get around the ban. If a paranoid discovers a shadowban, it will feed their paranoia. Either case is very bad for the forum, when the toxic user inevitably re-registers (assuming they're able; if they're not why couldn't they just be banned outright?), causing chaos until they're identified and shadowbanned again. The only person this really works on is the unintentionally annoying poster who discovers the shadowban and becomes depressed over being gaslighted, and might leave the forum on account of that.
Shadowbans seem like they're part of a war of attrition against justifiable targets, with quite a lot of collateral damage.
> If a troll discovers this, they will be enraged and redouble their efforts if it's technically possible to get around the ban.
In my experience shadow-banning people on my forum, outright banning someone tends to enrage people much more, since they realize they're banned while still in the heat of the moment that got them banned. One benefit of shadow-banning is that they may have cooled off by the time they realize, and frankly people tend to handle it better, sometimes even with an "ah, touché" sort of mentality.
I've noticed my own posts on HN individually shadow-banned by looking at my comment history in incognito mode and I get it. "Yeah, I knew I shouldn't have worded that post so strongly."
Better mod processes help with the issue. On my forum, shadow-banned people pop up in a review queue at intervals. If they seem to have cooled off, they can be unbanned. Mods can also vouch for certain posts, like people can on HN with showdead on. There's a view in the mod panel that sorts people by vouch count, and we can potentially unban people that way as well.
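The review-queue-plus-vouches process above could be sketched roughly like this. Everything here is hypothetical (the interval, the class and method names); it just shows the two views: banned users resurfacing for review on a schedule, and a mod-panel list sorted by how many of their posts were vouched for.

```python
import heapq
import time

class ReviewQueue:
    """Toy sketch: periodic review of shadow-banned users, sorted by vouches."""

    def __init__(self, interval=7 * 24 * 3600):  # review weekly, say
        self.interval = interval
        self._heap = []   # (next_review_time, user), a min-heap
        self.vouches = {} # user -> count of their posts other users vouched for

    def shadowban(self, user, now=None):
        now = time.time() if now is None else now
        self.vouches.setdefault(user, 0)
        heapq.heappush(self._heap, (now + self.interval, user))

    def vouch(self, user):
        """A vouch on one of the user's dead posts, as on HN with showdead."""
        self.vouches[user] = self.vouches.get(user, 0) + 1

    def due_for_review(self, now=None):
        """Users whose review interval has elapsed, for mods to reassess."""
        now = time.time() if now is None else now
        due = []
        while self._heap and self._heap[0][0] <= now:
            _, user = heapq.heappop(self._heap)
            due.append(user)
        return due

    def unban_candidates(self):
        """Mod-panel view: banned users sorted by vouch count, highest first."""
        return sorted(self.vouches, key=self.vouches.get, reverse=True)
```

The vouch-sorted view is the interesting design choice: it lets other users' behavior, rather than only a mod's memory, surface the people who were probably never bad-faith posters in the first place.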
Nobody would need a shadow-banning mechanism if moderators had unlimited resources. People forget that it takes effort to stay on the right side of the trapdoor function.
Meanwhile, I wager that people who don't think shadow-banning is ever just are coming from an idealism you tend to have before you've actually tried running a forum and realized the extent to which a single person can harm your community and waste everyone's finite time.
I think the next level of shadow banning (if it isn't happening already) will be shadow supporting, or shadow questions/responses: an entire thread keeps progressing, keeping the dopamine flowing, to dissuade a banned user from investigating their true status.
I think the world is much better when it gives people chances to improve.
Also, you don't have a right not to be censored by private companies.
Not that I'm necessarily arguing for GP's point here, but the point of a bill of rights is to grant or in some way codify rights that people think they should have. They are not necessarily lists of natural rights that won't be violated.
You kind of do. They can't claim they offer one experience and then knowingly provide another, lesser one.
Proprietary platforms are under no obligation to host your content. Choosing to deny you access to, or full use of, their products isn't "censorship".
And we'll never get one as long as everyone who wants one is not being the change they want to see in the world.
I'm thinking YouTube, Reddit, Facebook, Google when I'm talking about these sorts of things.
We've also got the fun game of "Platform or Publisher?" and responsibility about the content on the site itself.
Is it right for a CEO to get charged with a crime if someone on their website offers sexual services (SESTA / FOSTA)? Well, if it's an open forum, absolutely not. If the company is in complete control of their content? That's a different story. Now the question becomes "how much control over content can a CEO give up and still remain responsible for the content on the site?" and where we as a society decide we need to end up on that gradient is where we answer the question: "what should site owners be able to do on their site."
I agree with you about software patents, though. I think software patents across the board do more harm than good.
That said, I would never support something such as disallowing shadow banning in an internet bill of rights. That goes way overboard and just ties the hands of legitimate platform operators.