Given Reddit's already unequal application of the site rules (some people can shamelessly vote manipulate, others receive hellbans for merely following links), and the propensities for specific groups to treat mild disagreement as "harassment", this does not bode well.
It gets really blatant if you've been on the site for any length of time. I really hope this doesn't kill the site, but it looks bad.
> I really hope this doesn't kill the site, but it looks bad.
The issue with Reddit is: why would anyone spend a couple of hours writing thoughtful comments when there is some algorithm that just deletes them at random? The obvious answer is that they wouldn't. And so, by and large, the people who used to contribute good content have already left.
I doubt this change will push the site into a death spiral, but in general, when rules like this are created, it signals that there probably weren't many obviously better options.
>writing thoughtful comments when there is some algorithm that just deletes them at random?
Huh? I'm a very active redditor across many subs and I've never encountered anything like that. You're the first person I've ever seen even mention it. Unless I'm not understanding what you mean?
> Given Reddit's already unequal application of the site rules (some people can shamelessly vote manipulate, others receive hellbans for merely following links), and the propensities for specific groups to treat mild disagreement as "harassment", this does not bode well.
It's not even mild disagreements. Most of the subreddits are run like cults. If you look at the mods of the major subreddits, you'll see that most of them are run by a select few users: the same mods show up on every other subreddit.
Is this literal? I've heard that subreddit mods have been caught promoting/demoting content for money, or even pushing Russian propaganda. But I've never had an in with that community to be able to verify these claims.
Famously, /r/technology was demoted (removed from "default" status, meaning you see it when you're not logged in, meaning it's visible to the majority of Reddit's users) after it came out that they had a bot removing submissions with certain words in the title. Those words included things like "CIA" and "Snowden".
/r/skincareaddiction was caught red handed (with a price list, even) offering promotion services.
Those are the big ones I can think of - most of the suspicion comes from the fact that you find a lot of the same people moderating a lot of the same subreddits. It smacks of the cabal.
(Given communities with a few million subscribers, how is one supposed to do their job on a single one of these, let alone multiple?)
It's the control freaks, the people who stay late every day and never go on vacation, who are most likely to be robbing you blind. Each mod should have mandatory time off, or be banned for a few weeks out of every year, at least so they don't get too comfortable with their position.
It's how we caught which manager was stealing from us and got the evidence that our accountant was funneling money out to herself (although it was noticing when she bought that new car and put her children into an expensive private school that created the suspicion).
I mean, rules are applied by the mods of each different subreddit, right? I don't doubt some subreddits have huge problems, but IMO the majority don't, and the site itself works just fine.
A moderator can't shadowban you. But someone criticizing Ellen Pao got a shadowban recently, which means the actual administrators (Reddit employees) did it. I don't know if it was for criticizing her or for something else.
I am not a habitual Reddit user and know little about the controversy with Pao, but I am interested in the subject of Internet site moderation so I have been looking into what has been posted in this thread. I found this. Pao linked to this post by a non-reddit admin recently as an example of how she thinks shadowbanning should be done: https://www.reddit.com/r/blog/comments/35ym8t/promote_ideas_...
Selected quotes:
>You will never be told exactly what will earn a shadowban, because telling you means telling the sociopaths, and then they will figure out a way to get around it, or worse, they will file shitty, frivolous lawsuits in bad faith for being shadowbanned while "not having done anything wrong"[...]Shadowbans are intentionally a grey area, an unknown, a nebulous and unrestricted tool that the administrators will use at their sole discretion in order to keep reddit running[.]
The first thoughts that come to my mind are "this isn't compatible with a culture of transparency" and "there has to be a better way" but I can appreciate how hard it must be to keep a site the size of Reddit running.
Maybe I'm in the minority, but I don't see anything wrong with her explanation. Anyone who has run/moderated a public forum knows the pain all too well.
Of course, this only works well when the overlord behaves in a way that the subjects agree with. As soon as the overlord makes one too many bad calls, the subjects revolt.
I don't know how you can be interested in Internet moderation and think that "transparency" is anything other than a direct road to concern-troll / rule-lawyering hell.
It may have worked for meatball wiki, but they were i) small and ii) disciplined.
Because I've done it. It is not necessary to put up with people rule-lawyering and concern trolling just because you're very clear on why people were banned.
Ellen Pao is on a power trip. Let's not forget she just filed a frivolous lawsuit against her employer. That's something that liars and other people without morals do, they try to take advantage of others by feigning victimhood when they know they're wrong.
Now she won't tell you the rules because, like every unprincipled person, she wants to be able to change the rules as it pleases her. That's the type of person she is.
The quotes make her sound like a tyrant. That's what leftists become when they're given any power. I'll watch Reddit go down from far away, as I do not want to be on the same website as liar Ellen Pao - she might frivolously sue me.
Well, that's actually relevant to the policies being discussed here. If they were criticising Pao in a reasoned, sensible way, then the ban is a problem. But if, as often happens on Reddit, they flew off the handle and were abusive/inflammatory/etc., then perhaps they deserved it. It's a similar system to how Hacker News operates, AFAIK.
Permanent, site-wide shadowbans are not intended for use outside of spam/bot countermeasures and are not policy against users, especially for a nasty one-off, no matter who it was against.
That they are happening to users at all is evidence of either an unreported change in internal policy or high-level vigilante modding. And as users, there is no way to find out how many are affected by this.
Another issue: after a person discovers their shadowban and requests its removal, they are denied, because no note was left explaining why they were banned. What if the ban were placed by a Reddit mod and not a sub mod? How could anyone make a reasonable appeal in that case?
Their opaqueness is why shadowbans are effective, but mods treating user appeals like gag orders with top secret clearances is not in the spirit of their duties. After a user discovers their shadowban, it's no longer effective, so there is no reason not to disclose information to them about it during their appeal.
Opening the appeals process so users can see for themselves why they were banned, reversing appeals against noteless bans without contest, limiting the duration of all bans not made against spam/bots, a system of blind review, and increasing mod accountability would all serve to strengthen users' ability to stay active on the site without compromising the intended function of the shadowban.
Do you know what a shadowban is? It's not just a page that says "you were banned for being abusive"
It's actually an anti-spam measure, because when you post while being shadowbanned you see your posts, but nobody else does. That means you could be posting for a long time and not even know all of your posts are actually invisible.
That's not just telling someone to fuck off the site. That's actually erasing them from the site because they won't even make a new account - they'll keep posting and not even know they're invisible.
I am aware of what it means, yes - though I'm not really sure why the nature of shadowbanning is relevant in whether someone deserves to be banned or not. As you suggest, if someone is outright banned they'll likely just make a new account. Shadowbanning is, if anything, a far more sensible type of ban.
And it's something that - at the risk of repeating myself - Hacker News also does.
I think that's very severe and undeserved. Just criticizing someone in a post shouldn't get you hellbanned. That's for persistent trolls that just won't go away.
The actual content of the post that appeared to merit the shadowban is as follows:
> Buddy Fletcher, husband of Reddit CEO Ellen Pao, is being described as being the operator of Ponzi scheme
> ~144 million dollars of a pension fund was lost
> Ellen Pao is now accused of frivolous lawsuits to try and stay afloat and some other shit. Seeing as she is a CEO of a large company and has a fraudster for a husband I think it's safe to say we have a textbook ASPD/Sociopath on our hands
No, it isn't; that's exactly the concern the initiator of this thread of conversation had.
It doesn't matter if it's reasonable or not as long as it isn't ultimately harmful to the person.
If you dox someone or follow them around for a length of time attacking them, you should be banned for it as it creates a hostile environment.
But that is a far cry from telling someone in no uncertain terms that you think their opinion is ridiculous.
Personally, I've been wanting a reddit alternative for a while, but everything out there sucks in some manner (don't get me started on hubski...). I would love for this to push enough people onto the other platforms for the content to be there. I doubt it will, but that doesn't stop me from wishing it would happen.
It's deeper than that. Entire posts are being removed when the conversation about Pao turns negative. I'm amazed the board is leaving her in place after this. This kind of behaviour will send the community elsewhere pretty fast. Reddit is losing its core principles, so it will be interesting to watch.
Overall I believe a better solution would be to give users the ability to block people, ideally via a digital fingerprint so people can't just use a fresh username. This would be more in line with an open community in control of its own destiny.
That's actually not completely accurate. Reddit has this bot called AutoModerator which has, among its numerous other functions, the ability to blacklist users. A blacklisted user's comments/posts will be deleted as soon as the bot sees them.
In theory, this only applies to subreddits where automoderator is installed, but in practice, automoderator is installed on every sub above a certain size simply because it does a lot of cool things.
In essence, a user can be "soft" shadowbanned this way.
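For illustration, here is a minimal sketch of that effect using the PRAW library. This is not AutoModerator's actual rule configuration; the subreddit name, credentials, and usernames below are placeholders.

    import praw

    # Hypothetical mod-account credentials; placeholders, not real values.
    reddit = praw.Reddit(client_id="...", client_secret="...",
                         username="mod_bot", password="...",
                         user_agent="blacklist-sketch/0.1")

    BLACKLIST = {"some_user", "another_user"}  # hypothetical usernames

    # Watch the live comment stream and silently remove anything written
    # by a blacklisted author: the "soft" shadowban described above.
    for comment in reddit.subreddit("example_sub").stream.comments(skip_existing=True):
        if comment.author and comment.author.name in BLACKLIST:
            comment.mod.remove()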
Some popular subreddits have pretty petty mods, and there is no real recourse if you're banned from the sub, even if you explain your position and why it didn't violate any posted rules; if anything, challenging their decision seems to send most of them into a tizzy.
The real challenge will be users attempting to use the system to shut down other subs or the posters within them. There are some seriously hate-filled subs that people do try to get shut down with false claims of Reddit rule violations, namely vote manipulation.
Given that they also freely allow multiple accounts, will they shut down all accounts they can prove are related? How do they handle multiple accounts on one IP where the users truly are distinct?
I think for it to kill the site, there has to be a replacement waiting in the wings. Digg faded under much more innocent circumstances because Reddit was gaining in popularity. But, this crackdown might prompt somebody out there to launch a replacement.
Voat does indeed seem to be a viable replacement. Of course, it's just starting out, and they are in a self-described alpha state, so they may not be ready for a huge influx of users.
Voat's become a magnet for the GG and other seedy Reddit communities that have cropped up over the past few months. More power to them, but Voat can keep them.
I'll go elsewhere once Reddit crashes and burns. Away from the MRAs and SRS. Somewhere where the noise is squelched far from the signal.
Can you elaborate a bit on the vote manipulation? Other than certain Reddit power users regularly getting their posts to the top of certain subreddits, I haven't really seen anything suspicious.
So Reddit's got this community called "Shit Reddit Says", also known as 'SRS'. It originally started as a group of Something Awful users doing what can be described as a trolling operation.
SRS's stated goal is to highlight people on Reddit saying bad things (casual racism, stuff like that). Someone from the community finds a post somewhere on reddit they feel meets the criterion, and they post it there. Often immediately afterwards, the post, and the user who made it, begin losing karma at a rate that cannot be described as coincidental.
One of the rules in SRS is "don't touch the poop" - you're not supposed to participate in a thread that gets linked there. However, looking at the relation between post time on SRS, and karma over time, there's no other explanation for what happens to the scores of comments that get posted there.
Before you say "meh, racists, who cares?" - two problems. First, SRS's aegis has expanded to cover any criticism of the social justice movement, rather than actual hurtful comments. Second, what's happening is explicitly against the site rules regarding vote manipulation (vote brigading): other subreddits that engaged in similar behavior had to institute rules that mandate either "no participation" links (a CSS hack that disables the voting buttons) or outright banning of intra-subreddit linking, at the urging of site administration staff.
It's interesting that there's a whole subculture of this stuff now. There's a ton of subreddits starting with "bad," such as badhistory, badlinguistics, and badphilosophy whose purpose is to collect links to comments seen as "bad" in the context of whatever field it's about. This could actually be a valuable service, where you get experts in to correct misconceptions or incorrect claims that are getting a lot of attention. Unfortunately, they're never about that, but are simply about making fun of people behind their backs.
I'm reminded of the Robbers Cave Experiment, wherein all it took to create conflict between two groups of summer campers was the act of dividing them into groups. The subreddit system is genius for allowing one site to serve so many different communities, but it also sets the stage for subreddits to see other subreddits as the enemy.
(And if this comment was on Reddit, I imagine there would pretty quickly be a link to it from /r/badphilosophy talking about how badly I've misunderstood the experiment and how stupid I am, all while not actually educating me or any of the people reading my comment.)
Do you have any data that shows that trend? I frequently see threads where people complain about how blatant the effect is and how hard they're being brigaded, usually with 300+ karma
One simple experiment is to look at the front page of the subreddit - each link description has, per the rules, the amount of comment karma the comment had when it was submitted. Look at relatively new posts, and compare the karma at submission time with the karma as time goes on.
I tried this a few times months ago, and being linked from SRS led to a significant decline in scores on every post in the top 25.
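Here's a rough sketch of how one might automate that comparison with the PRAW library; the credentials are placeholders, and it assumes the "[+N]" score-at-submission convention from the sub's titles:

    import re
    import praw

    # Hypothetical read-only credentials; placeholders only.
    reddit = praw.Reddit(client_id="...", client_secret="...",
                         user_agent="brigade-check/0.1")

    # Per the sub's rules, titles carry the linked comment's score at
    # submission time, e.g. "[+25]".
    score_at_submission = re.compile(r"\[\+(\d+)")

    for post in reddit.subreddit("ShitRedditSays").new(limit=25):
        match = score_at_submission.search(post.title)
        if not match:
            continue
        then = int(match.group(1))
        # The submission's URL points at the original comment; fetch its current score.
        now = reddit.comment(url=post.url).score
        print(f"{then:>5} at submission -> {now:>5} now | {post.title[:60]}")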
>''I enjoy hating on people who think 'fat acceptance' is a legitimate thing...they're easy targets'' [+1521, gilded] (reddit.com)
>[–]RodneyHFarva 1706 points 21 hours ago
>"A bunch of din du nuffins. But when these thugs get killed their mom will cry murder and say he was such a good boy." [+27] (reddit.com)
>esiper 50 points 12 hours ago
>"It started with /r/jailbait, but I wasn't a ephebophile so I didn't speak up. Then they came for /r/thefappening, but I didn't speak up because I wasn't into fuzzy pictures of people I don't know. Then /r/gamergate, and I didn't speak up because I wasn't a gamer. I'm speaking up now." [+28] (reddit.com)
>[–][deleted] 194 points 22 hours ago*
It looks like there's an increase in karma score after it gets linked from SRS
Two things I noticed - all of those comments are more than 12 hours old, which is long enough for the communities the comments came from to counter the downvoting.
It also appears that these are copypasted straight from the SRS front page; you'd have to click through and look at the score on the linked subreddit.
The second part of each line was the comment score/username after I clicked through the link; I'll add a timestamp of when I viewed them.
(submitted 33 minutes ago by so_srs) "The modern view on gender: Acceptable: Female and spending most of your time on the celphone, reading literotica, and playing candy crush. Unacceptable: Male and spending most of your time on the computer, watching porn, and playing World of Warcraft." [+25] (reddit.com) -> (at 1:16 CST)[–]comosayllama 68 points an hour ago
(submitted 3 hours ago by cakevodka ) things Hitler did were terrible for sure, but, at the same time, it would be intellectually dishonest to not call him a great politician and orator. The fact that he tried to wipe out a couple races and nationalities does not make his achievements in politics and economics any less significant [+66] (reddit.com) -> (1:17 CST) [–]_Pornosonic_ 144 points 7 hours ago
Those, combined with the 3 earlier comments I posted, make up the last five submissions to /r/shitredditsays (with the exception of one where the original comment was deleted). The first part is the linked comment and the second is the comment score (now with timestamp!).
I can't actually find a single instance from the front page where the score goes down after being submitted. The opposite is usually the case, where the score increases after being posted to SRS
Besides the condition listed below, it is not permitted for users of one sub to cast votes on another sub by request/suggestion/etc. There is leeway if the voting user is subbed to both, but it is not supposed to happen.
HN uses it too sometimes. Basically, a hellbanned user doesn't know they've been banned: they can still comment, add submissions, etc. The difference is that all their activity is hidden from normal users.
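As a minimal sketch of that mechanic (the data model here is made up purely for illustration), the visibility rule amounts to something like:

    # The shadowbanned author still sees their own activity;
    # everyone else's view silently omits it.
    def visible_comments(comments, viewer, shadowbanned):
        return [c for c in comments
                if c["author"] not in shadowbanned or c["author"] == viewer]

    comments = [{"author": "alice", "text": "hello"},
                {"author": "bob", "text": "hi"}]

    print(visible_comments(comments, viewer="bob", shadowbanned={"bob"}))    # bob still sees both
    print(visible_comments(comments, viewer="carol", shadowbanned={"bob"}))  # carol sees only alice's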
This is a topic I'm legitimately curious about. 4chan existed in a form for a very long time such that (aside from any stated rules) harassment was ubiquitous (at least in /b). That being said, that same domain formed a basis for massive swaths of the internet community and culture we see today.
In any place I've seen these sorts of gestures of "safe-ification", they seem to serve only to 1. be ineffective: whatever moderation you have, it's going to have edge cases and a time delay for enforcement; 2. split the community, creating _more_ (and often subtler, now that it has to conform to some rulebook) harassment/drama between the split halves; 3. drive away productive parts of the community, for any variety of reasons (for me it's a mix of "I've seen this pattern before and don't like where it goes" and "I don't agree that trying to dictate social standards is the proper way").
Caveat, and I'm somewhat frustrated that I feel like I need to say this (whether that's at myself or the fear that I'll be jumped on if I don't): I was _VERY_ heavily bullied/harassed for much of my childhood.
I caveat as such to try and say that I don't hold this stance from a position of ignorance, but that if I hadn't interacted with bad actors in places that are frankly more insulated from real life (the internet) I would have never come to (yes, painful, but still VERY useful in hindsight) realizations about human interaction and my role in it.
This rambled a bit, so tl;dr,
Dealing with bad actors is how we learn to deal with bad actors, which is a _critical_ life skill.
This is even aside from the fact that the strategy being applied does not have a successful history, from what I've seen.
(I'm trying to avoid the slippery slope argument, although that's kinda implicit in my second conclusion, I'd rather hear what other people think on this concept of growth through pain, or if it's all in my head)
About 2: one thing I have seen more and more as rules come into play on various sites is baiting, in particular when the rules are automatically enforced.
I don't like it, but Reddit can do whatever it wants with its platform. Reddit is not a "free speech platform"; it is whatever it wants to be at any moment.
In Portugal we have a saying that translates to something like this: "Whoever is not well/happy should move to another place". And that's what's going to happen. Some users will go to other platforms.
That is exactly what it is, by their own reckoning[1]:
Create a safe space to encourage participation.
Embrace diversity of viewpoints.
Allow freedom of expression.
Be stewards, not dictators. The community owns itself.
As long as they claim to be holding themselves to that standard, people are fully within their rights to call them out on hypocrisy.
I'd be curious about their exact definition of "safe space". That term on college campuses is rapidly evolving exactly into ensuring that a diversity of viewpoints is not even seen or heard, let alone embraced.
I mean this straight; I'd honestly be interested in hearing an elaboration from them, not as snark about how they can't possibly mean it or something. Again, no sarcasm or snark, high principles are hard.
A safe space for you, if we are at odds, is an unsafe space for me.
Only if you are determined to be at odds. It's quite possible for an atheist and a die-hard Christian to get dinner together, if they both commit to focusing on the topics they share rather than the ones they disagree on.
Safe space is just code for coddling the first person to claim offense.
seems like a willful misrepresentation of the aims behind the concept.
>Only if you are determined to be at odds. It's quite possible for an atheist and a die-hard Christian to get dinner together, if they both commit to focusing on the topics they share rather than the ones they disagree on.
That's not honest debate.
>seems like a willful misrepresentation of the aims behind the concept.
As a white male I am often told that I'm not allowed to participate in discussions because of my race and gender due to the discussion taking place in "safe space." No matter what the original aims are behind the concept, it is easily corruptible and almost never used in the "best case scenario." It's usually used to set up echo-chambers where individuals can feel safe from criticism or debate. It has no place in a healthy society, IMO.
Who says they have to have an honest debate? Why can't they just have a conversation with each other? That's the problem with battling attitudes - there isn't any reason why the two of them must come to blows about their opposing beliefs. They can simply respect each other and move beyond it.
> As a white male I am often told that I'm not allowed to participate in discussions because of my race and gender due to the discussion taking place in "safe space."
Interesting, because as a white male myself I have never been told that. I have been told that my opinion in a conversation is not an important one, which is often (not always!) a valid point. But I have never been told I cannot participate in a discussion solely because it is a "safe space". Where are you having these conversations?
Is there a meaningful and substantial difference between those two things? Both amount to "your input is not welcome here", both amount to classifying one's identity more important than the content of their speech, and the entire concept of a "safe space" is one in which mere disagreement is explicitly disallowed.
Absolutely there is a difference. If a group of women are talking about their experiences of sexism in the workplace, they ought to be able to do so without a man coming in and "actually"-ing them.
If a group has been marginalized, belittled, and excluded for long enough, it can damage the people in that group. They can need a safe place to heal, to realize that they matter, that they are as worthwhile as anyone else, and that they don't have to be what their oppressors said they were. A safe place can be very important in that process.
Or a safe place can be a place where a bunch that feels oppressed because somebody looked at them funny, and feels more oppressed because nobody else thinks they're oppressed, can get together and rehash their sense of victimhood and exclude anyone who disagrees.
Human nature being what it is, the second is perhaps more likely than the first. But that does not make the first invalid (when genuinely needed).
"It's quite possible for an atheist and a die-hard Christian to get dinner together, if they both commit to focusing on the topics they share rather than the ones they disagree on."
Actually, nothing even precludes them from discussing religion. Just because we're wired to get into angry shouting matches on the matters that pertain to tribal identification (which I used descriptively, not pejoratively... human psychology cannot be understood until you understand that we are deeply tribal, and religion is a huge component of that) does not mean that we are obligated to give in to those impulses.
It's just... really hard. For everybody. And increasingly, giving in to those impulses is being held up as a virtue rather than a vice. This bodes poorly for the future of civil society.
But doesn't a civil society benefit from discussion on topics rather than ignorance of them?
Because as far as I'm concerned safe spaces only promote ignorance, one sided narratives, and echo chambers.
America, during the red scare, was a "safe space" for capitalism. Even so much as agreeing with a socialist or communist idea would get you publicly denounced at the least, and jailed at the most.
Safe spaces are private communities for people who can't handle alternative views.
A site-wide harassment ban is one thing. A site-wide "safe space" is entirely different.
I'm not "them" anymore but I'll try to explain what I can.
Safe space used to be a term for a very small group of people who agreed to be positive and nurturing towards one another no matter what. This is important for sexual assault survivors, closet gays, the heavily marginalized, etc. to have a space in which they can actually speak with other human beings about their experiences. This is, in reality, a very important thing.
However, it doesn't scale well. Some people have thought, "Hey, I like these safe spaces...what if we make everywhere a safe space? Wouldn't that be great?" What they miss is there isn't one universal definition of safe space and you can't just force your definition upon others.
In a very general sense, at the reddit scale, "safe space" means protecting those who need protecting. That means protecting the disempowered, the marginalized, the minorities, etc. from the ignorance and violence of the masses. Protecting them from the hegemonic power of the majority, which means the patriarchy in general and if we're being honest white males in particular.
It sounds good in principle, but who is "empowered" and who is "disempowered" isn't an easy question to answer. To make things worse, the tools of enforcing a safe space are so powerful that the people fighting against oppression often become oppressors themselves. In the case of reddit there are many members who are openly abusive and believe it's morally justified because they're "punching up" and it's only real abuse when you hurt people lower on the social ladder than you. (As an aside, this is a big reason to claim victimhood: once you're a "victim" you get to do things you otherwise wouldn't.)
What do you mean by "Create a safe space to encourage participation"?
Define a safe space. I can't physically hurt anyone, so that's a given. But are they considering a "safe space" a place where I can't say anything potentially insulting, offending or hurtful? In that case, I don't have freedom of expression and neither is Reddit encouraging (my) participation.
I would say the original statements and your version of the statements are the same thing. If they were not, they would read as:
Embrace some viewpoints.
Allow some freedom of expression.
With whatever variation on the language as necessary. If the statement doesn't originally place a restriction on the idea, then it isn't necessary to specifically state there is no restriction.
Therefore, I would say that technically:
Embrace viewpoints.
Embrace all viewpoints.
Are, in fact, the same thing.
As for maintaining decorum, that's where the restrictions come into play and have to be specifically stated:
Embrace diversity of viewpoints by permitting all
statements that follow community guidelines on language
pertaining to harassment, crudeness, personal attacks,
etc. Any statements may be censored for not following
community guidelines.
It's a fuzzy statement of values from a corporation. Reading it as if it is a computer program is a mistake.
I guess you can choose to give them more or less credit based on how closely they hew to a pedantic squeezing of the meaning, but my point was that they do not emphasize that they are absolutes (even where simple language is available to do so).
We say similar things in English. One might seriously say, "If you don't like it, move to Russia," implying that if you disagree with anything then you're sympathizing with the enemy, and that as long as we're better than somebody it doesn't matter what's wrong.
Maybe the Portuguese version doesn't have those implications, but I find it to be one of the worst arguments a person can make.
Eh, that is usually applied to discussions about living in America, not using a specific service. After all, it doesn't even make any sense in this context (move to Russia and... use Reddit still?).
You're right that this one is an awful argument because it's not within anyone's power to move to Russia, usually (or otherwise move away from the various laws and regulations that apply to them in the US). However, they absolutely can use a site other than Reddit without problems.
Um, yes, that specific phrase would be applied to living in the US, not using a specific service. I'm describing a similar kind of statement with an explicit example, not saying that exact phrase also applies here.
And the problem is not that you can't do it, it's that it's a way of shutting down criticism without addressing it. It's inherently a non sequitur. It basically says, "The problems you point out are not actually problems, because you can go away."
In Portuguese it literally means "going someplace else".
An example: I enter the subway and sit next to a man. For some reason, I stink. The man is uncomfortable and asks me to move. I shouldn't move. I am OK where I am. If he's not OK, he should be the one moving somewhere else.
I'm in favour of a few criminal limits - perjury can't be allowed for example.
A person who supports placing legal limits on expression because they offend a group, religion, political party, etc. can't sincerely claim to be an advocate for free speech. Better to avoid the doublethink and stop defining free speech as speech that you approve of.
The protections are practical and necessary, someone complaining that the example gets overused and abused in random arguments isn't really relevant to this discussion.
The discussion is about free speech and "hate speech" (whatever that should mean); the "example" used for free speech limits is "fire in a crowded theater"... how does this example even remotely relate to hate speech? And if it doesn't relate, why is pointing that out not relevant to this discussion?
The current state of reddit includes a large group going around trying to be offended and then bringing the outrage back to home-base so the group at large can also get that outrage-dopamine kick. For their troubles the person who found the offensive comment or post gets some internet points and their group membership is solidified.
It's fascinating in the sense that it happens but I don't think it's reasonable to lurk in social groups you don't belong to and complain about the local norms.
You'll easily get banned from a subreddit dedicated to North Korea, as a joke. Create another account from the same IP and try to post something else on that subreddit and now you're hellbanned on the whole site for that IP because of some hidden rule.
On the surface it seems like a good thing. Taking a stance against the worst "systematic and continuing harassment" should help the most vitriolic cases. Some might argue that Reddit should be an "extreme free speech platform" and this obviously goes against that, but I'm not sure Reddit sees itself that way.
I think most people are afraid of the slippery slope and of censorship if this is taken too far. Making Reddit "too politically correct" would transform the current culture into something else. Many (or a loud minority) currently don't seem to trust Reddit not to do that. What makes things worse is that people assume this is a "monetization of a service" story they've seen played out many times before with catastrophic results.
In the end I guess it comes down to the execution and agenda.
Possibly. The frustration over Pao seems to be on the verge of going mainstream despite it being relatively pointless in terms of actual things changed. The shadowbanning of users discussing Pao is more of a problem than Pao herself so far IMO.
My guess: the anti-SJW crowd will leave to go to voat (and if we're unlucky, here) and the SJW & neutral crowds will remain, warping reddit into something more along the lines of tumblr with more cat pictures.
On the other hand, even before this recent Pao stuff, reddit had been getting a pretty solid reputation for being an advertising/PR platform. The shilling and advertising is pretty horrendous, and almost all of the default subs are completely overrun by paid content-- if you don't believe me, just take a gander at how frequently McDonalds, Doritos, TacoBell, etc make the front page with content-free posts that are often as simple as a picture of their products.
I guess you can say "avoid the larger subreddits" but there is a lot of content-based shilling too-- try posting anything about Israel or Russia in the context of a serious discussion and see what happens. It isn't a coincidence.
It would really depend on implementation wouldn't it?
The incredibly strict anti-trolling policies around HN do make it a great place to come back to years later. Though HN also takes care to engineer the social interactions here too. (scores are hidden, upvotes only until you get to a certain karma level)
I do tend to worry a little about bias and inconsistent enforcement of their policies, but also believe that we'd have to see that play out over time.
Without question, the stated goal of "diversity" is noble, valuable, and deserves to be pursued. The problems arise in their definition of diversity. Does it mean what I think it means?
What about their definition of "safe"? We're really going to have to wait and see.
I think calling this the demise of reddit is a bit premature.
Though I also struggle to believe that deleting the extremes will make it a more interesting or compelling place to be around.
There's a reason that maniacs stop taking lithium.
HN's strategy of burying unpopular posts is very, very strange to me. While reddit's is similar, at least you can disable it manually - to do that for HN would require tampermonkey or the equivalent. Half the time I wind up just highlighting the text to read what someone wrote anyways. I don't really get it.
I still can't downvote after almost 3 years. I don't comment very often, so I don't have an incentive to chase the downvote privilege. But the thing is, after all this time, I don't want to downvote anymore, the way I wanted to when I first came here. I just read and enjoy the place.
I've been around here close to 5 years and I still can't downvote. I wouldn't say that it is a bad thing -- it just causes me to use HN very differently than Reddit (which seems to be the point).
"Think of the children" justifications for normalizing private sector censorship seem to be a contemporary ruling class focus. Pao's contribution is a smart career move.
Isn't reddit feeding off the entire nation of self-professed internet activists who are out to solve the world's injustices? These people are going to lose their sense of purpose if nobody is attacking them. Don't bite the hand that indirectly feeds you, reddit.
Reddit is a propaganda site and has been for a while now. Many years ago, it was predominantly run by the users ( commenting and voting ). Now it's just a tool for admins and mods to push their agenda/views.
They might as well get rid of the voting system. Reddit is a joke and the sooner it goes away the better.
They're not doing it out of the goodness of their hearts, so I see three obvious possibilities: making the website a more attractive place to advertise, external pressure hurting their image (like that time they banned some subreddits only after being featured on CNN), or trying to appeal to more women. Did I miss anything?
I assume it's some combination of the above, leaning heavily towards advertising, considering the number of house ads I see them running and the anecdotal things I've heard about advertisement sales on 4chan and other uncouth websites.
It's fun to watch parallels develop between Reddit site operators versus subreddit moderators, and the federal government versus states' rights. (For anyone not familiar with this long-running American political theme: http://en.wikipedia.org/wiki/States%27_rights)
I said it in another thread: I visit a subreddit with very strict moderation. Very strict. The place is wonderful for regulars and newcomers. It is healthy and strong. One of the rules is:
>If we find that you are active in any subreddits that promote hatred toward any particular group of individuals, your participation rights in this sub will be revoked.
> It added that it defines harassment as "systematic and/or continued actions to torment or demean someone in a way that would make a reasonable person (1) conclude that reddit is not a safe platform to express their ideas or participate in the conversation, or (2) fear for their safety or the safety of those around them".
In point (1) she basically just described how SRS operates...I doubt they'll be willing to use this to punish them, though.
Speaking of SRS, maybe it's time the community starts asking A) how many Reddit employees are involved with SRS and B) how many other subs the anonymous* SRS mods also moderate.
*mostly anonymous. It's public knowledge one of their mods used to be a site admin.
Less possible when you're getting hundreds of messages per day on all your online accounts with, e.g., pictures of your children leaving school or your workplace.
No that actually sounds like a really good time to start ignoring things. You don't feed the trolls, real life or internet.
What's more inflammatory, ignoring a troll after he happens to find a picture of your children, or aggressively attacking their actions on their medium of choice? Don't feed the trolls.
Well, at least they are doing something. It seems like Reddit has tripled staff and VC money but has been doing exactly zero, nada, zilch. The site is the same, as are the ads, the downtimes, the loading problems, the ...
Hint: it's an ad revenue based cash-out. They don't actually need more servers or features, they need more curators and ways to channel money.
Sell off the valuable real estate and make the site cleaner for advertisers so that there's less bad PR when one of the uglier subreddits gets critical mass and finds public attention.
I'd just like to point out that Pao and Reddit can do anything they want on their property and it is not censorship. Private parties cannot censor; only the government can censor, which is why the recent nationalization of the US Internet by the FCC is so dangerous for free speech. When the FCC comes out with its "Internet Anti-Harassment and Diversity Guidelines" that must be followed under threat of shutting down your blog, that is true censorship and should not be tolerated.
Actually, the everyday meaning of censorship quite clearly includes the private suppression of speech. Them having the right to engage in it is independent of that.
I can't agree with many of the comments here. I have a 7-year-old Reddit account, go there daily, am still subbed to many defaults, and I just haven't seen much of the terribleness and inevitable downfall professed here. In my experience, the growth of the site hasn't changed it much at all; maybe there are more reposts to the front page, but that's not a big surprise.
I see the submission title has been edited so that it's no longer the title the BBC used. I get what the social pressures are around here, but c'mon son.