Relationship Advice lead-ish mod here. We were one of the subs "called out" as being controlled by Gallowboob with the original post and subsequent reposts.

We (buu700 and I) added Gallowboob some years back largely to have insight into how other teams modded their subs at a time when we were growing quickly and needed to know what to do to keep up. While useful in meeting that goal, we quickly realized it was pretty meaningless for /r/relationship_advice because of the nature of the questions we were getting. The only sub like ours is /r/relationships, and the two subs differentiate from each other through different rules and content creation controls.

As best as we know, Gallowboob enjoys the place and is pretty decent at modding, so we're pretty happy to have him. Most mods burn out quickly because of how dark the questions get as well as because of how meaninglessly violent people become when their posts are modded.

---

Separate comment on an item in the story:

> Allam believes his time on the site has made him a more “paranoid” person and led him to develop “borderline PTSD.”

Moderating Reddit's larger subreddits is absolutely capable of resulting in PTSD-like symptoms. I've been dealing with some on and off after a post some years ago where somebody who requested advice followed through with the best course of action only to find that his wife killed their kids soon after. And Reddit has absolutely no support system for things like this.




Why do you do free work for a for-profit social network, especially if you say yourself that the work negatively impacts your mental health and that the community you are moderating is toxic?

What are you getting out of this?


> Why do you do free work for a for-profit social network

They don't. They do the work for a community they're part of. The social network hosts that community, and tries to make a profit doing so.

I organise a meetup that's held in a pub (well, not at the moment). People coming to the meetup spend money on the pub's beer and pizza. Am I doing free work for a for-profit public house?


> People coming to the meetup spend money on the pub's beer and pizza. Am I doing free work for a for-profit public house?

Yes. Unequivocally, yes. If I were the owner of X thing and Y person was sending lots of people my way I would be more than happy to give Y person something in return. In your example I would offer free pizza and beer at the absolute minimum. You are selling yourself short.


Then the other group members will wonder why you get stuff for free while they have to pay, or they might doubt why you chose that particular pub. If you really need a physical reward for doing something you enjoy, it would make more sense to agree a deal with the pub, like "second drink on the house" for the whole group.


Selling friends or peers to an establishment for pizza and beer is not really how a trusted human relationship works.


Yes you are. Social organizers, advertisers, trendsetters, etc. frequently get free product, kickbacks or some money for their 'work'. That's the work you have done.

It's not your intent to do work and it's probably not the pub owner's desire to pay for meetup organizers, but regardless of your and the pub owner's intent, work has happened.


Ok, so in addition to the community getting value out of it, the pub benefitted too, that's just called win-win. Why does the pub have to lose in this situation for you to be ok with the result?


They didn't really claim the inverse, just that any form of community organizing inside a private for-profit business profits the business, making it free work for them.

It's one thing to organize a bake sale in a public park, but when you choose one pub over other competitors, and they profit from it, you're doing work for the pub for free.

This is why experienced community managers work with businesses to get prices for the event discounted, or set up an agreement for profits to go towards the event.


I don't disagree that incidentally involving a business in a community's activities benefits ('does work for') the business. But that doesn't mean that the business absorbs all the value a priori and the community no longer gets any value.

I'm sorry I wasn't clear: this point is not directed at friendlybus but at the general sentiment upthread (e.g. ramphastidae) that seems to assume that because reddit benefits from the existence of the community, the community cannot benefit from the existence of reddit. The underlying reason is that individuals benefit from the community.


Nobody is saying that a for-profit business absorbs all the value. Is your premise that if the community gains value from an action, then that action is justified no matter who else profits from it?


My premise is that if a community and individuals in it benefit from organizing themselves, then it's reasonable for the business the community chose to host its interactions to make some amount of profit.

Also I would question if reddit actually makes much profit from individual communities like r/relationshipadvice, especially compared to the pub scenario which I find generally acceptable.


I am okay with the result. The guy asked if he was doing work, and I said yes he has done work. Why would I not be okay with that? The pub & friends situation is a win-win.

Giving up free time to a toxic community is a net negative for a reddit moderator; payment may offset the health cost.


Individuals seem to get a lot of value out of interacting with the community, so it's not obvious to me that the moderator's sacrifice (which primarily comes from a small number of individuals) is unjustified.


A hotel conference room costs on the order of $50-300 an hour depending on location and group size and that's without the benefit of a bar so the host has to front the money for all the drinks. Most other available venues are either just as expensive or forbid alcohol, loud noise, and so on. A four hour party would cost at least $200 plus drinks before the hosts even collect any money - or if they do, that's another cost and more risk for everyone.

The options are to do "work for free" or to shell out hundreds of dollars.


Is it not possible that the situation is a win-win for all parties and that all parties feel like they are well-compensated in terms of the value they've received, and that therefore no money actually needs to change hands?

You may as well ask why people volunteer their time at all.


Both points are valid I think.

It's fair that both parties feel like they're benefiting, so no direct transaction needs to occur.

Comparing this to volunteering in general isn't quite right. I volunteer with local registered charities because I believe in their mission, and the volunteer work I do directly improves the lives of the people I'm working with.

Which may be the crux of the GP's question. Why volunteer in the community in a way that results in a for-profit organisation monetising you for their gain, instead of finding a non-profit in your community to volunteer your time towards? The goals of a non-profit or charity in your community are more likely to be aligned with your own than those of a for-profit social network.


It’s a false dichotomy.

A moderator’s work primarily and directly benefits the users, because it allows them to communicate safely. The platform only benefits indirectly.

Charities often have salaried or reimbursed employees; do you volunteer in order to get their salary paid? Of course not, you do it because it helps actual users; the fact that this allows people to get paid, somehow, is an indirect benefit. Same here, really.


There's a difference between a charity and a regular for-profit corporation.

If the charity started stalking me to sell my data to advertisers in order to pay the salaried staff or other stakeholders more money, I'd stop doing any work through them.

The charity's mission is not maximising revenue / profits.

Any VC/PE backed corporation's mission is to maximise revenue / profits at any cost.


Hey I'm not saying money needs to change hands. I'm just pointing out it's work, like in the physics sense. Something valuable has happened, both in the business sense and social sense, but obviously the priority for parent poster is the socializing.

In my opinion, the benefit from socializing is not worth the reddit moderator role, given the psychological load they operate under.


You can likely get a kickback from the pub, but you can also negotiate a deal where everyone wins. E.g. discounts or freebies for meetup members, or you can run a small contest every night with a prize fulfilled by the pub.


AOL would use volunteers, called Community Leaders, to moderate chats/BBSes, etc. They even got perks, like free Internet. I volunteered as a Host Guide when I was 16. CLs eventually filed a complaint with the DoL claiming AOL violated the Fair Labor Standards Act by using non-employees to do unpaid work that could have been done by paid employees. In fact, it's still illegal even if the volunteer doesn't mind or doesn't want to be paid, so long as the company is for-profit. AOL paid out something like $15 million in a class action. [1]

The same will, and should, happen to Reddit. One of the largest sites on the Internet is making money hand over fist while moderators end up paying the price for no real benefit.

1. https://en.m.wikipedia.org/wiki/AOL_Community_Leader_Program


In your favored outcome, the government intervenes in a way that

a) prevents people who want to be mods, from being mods

b) breaks the way the site works

What gives you the right to do that? Seriously, if you want to live in a nanny state, please, not in my back yard.

I do not want you to be my nanny.

Reddit's moderation problems are not so big that you are justified in putting a gun to other peoples' faces to force them to do it the way you want.


>What gives you the right to do that?

The Fair Labor Standards Act: https://webapps.dol.gov/elaws/whd/flsa/docs/volunteers.asp

Reddit's preference for exploiting unpaid labor, and its failure to plan for the inevitable event where that exploited labor decides it wants to be compensated for its work, is entirely on Reddit Inc. A company valued at $3 billion should have understood the potential risk.

All Reddit mods who feel burned out or are otherwise struggling because of the work they've done as mods should file a complaint with the Dept. of Labor and request financial compensation for the work performed, and coverage of any medical treatment stemming from the results of moderating toxic communities.

https://dol.gov/agencies/whd/contact/complaints


This is just a forced wealth transfer to people who have managed to partially capture the government.


Because the continued existence and quality of the subreddit brings value to them. It’s no different than volunteering. The fact that Reddit is for-profit doesn’t really enter into the equation.

I don’t get it for relationship subreddits because it’s so mentally exhausting but for fandoms and niche communities the experience is super cool.


It's very different than volunteering - since most people don't volunteer at a FOR PROFIT company.

I've volunteered at all sorts of community centers, schools, etc... Never once at a for-profit business in town.


As far as the business of reddit is concerned, it provides the tech to run communities on; reddit itself doesn't manage the communities themselves (with the exception of admins stepping in to remove illegal content, I suppose).

In your analogy reddit is the property owner or landlord of the community center, but it does not operate the community center, so volunteers are necessary.


I don't think I've ever volunteered for a profit-maximizing private venture. Is that a thing people commonly do in real life?

I mean, I've done internships, but the quid-pro-quo there was quite clear.


(Serious question) what's the difference between an unpaid internship and volunteering?


An unpaid internship is done at a place where you either expect to get a job, or where the place and the work you do there are important to subsequently getting a job or school admission.

E.g., I did a quality improvement internship at a hospital to get a job in quality improvement there, once upon a time. It was "unpaid," but it was an audition for a job that I wanted.

As opposed to my time as a volunteer in a local free clinic, where no benefit accrued to me at all.


I would assume most mods don't put "reddit mod" on their resume, for one.


> It’s no different than volunteering.

It's certainly volunteering in the synonymous sense of 'doing something for nothing because you can', but I feel this statement, left in the vacuous state of its own brevity, comes loaded with an unsaid implication that "volunteering" on Reddit is no different than "volunteering" in one's local community, and I'm not sure it's that simple.


A lot of volunteer groups meet in for-profit places like cafes, and use for-profit ISPs and for-profit email providers or IM operators to talk to each other. An online forum is basically the same thing: a way to communicate.


Meeting together, talking about %thing%? Okay that's fair, I don't disagree that there are similarities.

It does still feel like just as limited a definition of 'volunteering' if you take the grandparent comment at face value.


That community does genuinely help real people struggling with their relationships, regardless of whatever value its existence also provides to reddit. I don't think the motivation is all that different from volunteering in one's community; both come from a place of wanting to help others.


i disagree. it is that simple. the community consists of the people that contribute to the discussion. the fact that this community is using reddit as their tool to communicate is secondary. if they as a group were unhappy with how things are done they could move. stackexchange, discord, whatever.

moderators should be volunteers from that community either way, not paid outsiders.


> What are you getting out of this?

Not the OP, but most people just want to help a community that they are a part of and take a hand in making it better and helping members of the community.

You have to remember, this issue has played out in pretty much every online community that has likely ever existed, in some form or another, long before the term "social network" came into use.


Judging from most moderators that are active it would appear to be a power thing for them.


Oh for sure, for some people that's definitely true. And honestly, the community itself self-selects for people for who that's true as well, since most reasonable people realize that being a moderator is actually kind of terrible in reality. So the people who last the longest tend to be the people who aren't just in it out of altruism (users work hard to drive that motive into the ground).

But at the end of the day, you need moderators. And the job sucks and is unpaid, so you can't exactly select only the "perfect" candidates.


You're being downvoted, but GP directly lists one of his reasons for modding as "because Reddit will grow in influence"

Being the invisible hand that shapes the narrative of an entire community to one that you see fit is surely an alluring power. Especially if you think the powers of that influence will grow over time.

It's not crazy to assume that a portion of them are in it for the power. People love positions of minor authority. See: the fragmented history and (hilarious) mod power struggle that led to Seattle now having a bunch of differently managed subreddits.


This is absolutely true, which is why we tend to be pretty conservative with adding new rules - to reduce the risk of people getting carried away with the hammer. Some communities are the exact opposite - they'd rather people hammer away specifically because they're extremely selective with content, so people with a "power thing" are a better fit for those communities.


> What are you getting out of this?

For a nontrivial number of people, this community is the only place to turn. If we step away, do people incur harm as a result?

Most likely.

(in other words, it's almost a psychological obligation of sorts at this point)


It used to be that mods ran little fiefdoms where they advertised things to buy in the subreddit's feed. Those micro-business success stories seem far less common in the past few years. I don't know why you would bother anymore.


this will certainly be unpopular but if they don't get any money out of it then most likely it's an ego / power thing


I moderate a sub because I found it useful, I want to make sure it thrives, and it's my way of giving back to the community, even if I don't participate as much as some other redditors.


Power tripping is very addictive.


>how meaninglessly violent people become when their posts are modded

Maybe if mods were more transparent across all the subs.

Most people get angry when someone removes their post and yet they see it reposted and approved hours or even minutes later (Gallowboob is infamous for that but regular users do it too)


> Maybe if mods were more transparent across all the subs.

This is a common refrain I've heard going back well before Reddit ever existed.

I helped moderate the Vault Network boards for a while back when they were a thing.

It's hard to overstate how amazingly...disturbed and vitriolic some of the individuals of a community can be. And dishonest. And spiteful.

No amount of transparency helps when all it takes to refute any evidence is to label the mods as lying or "corrupt".

Let's say a user says something bad and you mod the post and give them a warning. You tell them exactly what the offense was and why it was moderated. They might even act civil in response.

Then you later see them talk about "X mod totally censored their post for no reason and refuses to explain why. X and the rest of the mods are totally corrupt".

So, what do you do? Post a screenshot of the private messages exchanged (something, for instance, we weren't supposed to do)? Take a screenshot of a browser window with the "mod view" (aka uncensored) of the original post? Something said user will point out can be easily edited, because, well, it's a browser showing a website, not exactly hard to alter.

And that happens every day, all the time. And no one is being paid and there is a constant stream of other stuff you are trying to stay on top of.

And sure, some of the mods are shitty and "corrupt" in a sense. But I would say it's as much a reflection of the community itself as anything.


As someone who used to moderate a pretty large gaming community back in the day I think there are two things to keep in mind.

The first is that transparency doesn't "fix" hostile community members. What it does is justify the actions of those in power to the rest of the community. Without this the community will very quickly lose faith and the situation just devolves into factions and hostility.

This is very indicative of your last line really. It's like any form of government. When the people lose faith in those in power due to perceived capricious or opaque behavior there tends to be a lot of civil disorder. This usually leads to those in power entrenching themselves and enacting even more draconian measures. It's a vicious circle. Perhaps all governance is subject to the laws of entropy and will eventually fail, but I believe transparency, consistency, and effective communication are the only methods to slow such an eventuality.

The second point is one of size. As the population of a community grows and the proportion of those who wield power shrinks you also end up with a lot of discontent. Many community members will no longer feel they can effectuate change as they're a small voice amongst many without any real connection to the small group in power.

Also note that my experiences are in relation to fairly tight knit communities. Reddit is a little different in the sense that plenty of subreddits are far less communal. Effective communication is very difficult when the community is mostly transitory right up to the point where mob mentality takes over.


Yeah, you definitely raise some good points here.

> When the people lose faith in those in power due to perceived capricious or opaque behavior there tends to be a lot of civil disorder.

I would be really interested in figuring out how to combat "perceived opaque behavior" especially when the source of the complaints is more artificial, where the complaints are being used as a tool for manipulating the community rather than being based in an actual grievance.

That's the reality you run into sometimes, like that oft-used quote from the Batman movies "Some men just want to watch the world burn."


> how to combat "perceived opaque behavior"

As the GP said, transparency for the punished is not sufficient. Governance must be transparent to the public, otherwise it will seed distrust. The behavior you described isn't "perceived" to be opaque, it is opaque.


I've been thinking of what I would do if I were a forum moderator, and I came to the conclusion that I would have to implement a minutes-keeping rule like they have in actual government. In the UK, for example, you have the 30 year rule, which says that all the minutes of cabinet meetings get released after 30 years.

Maybe you could keep complete raw backups of the "minutes" (or mod logs in this case). So you would take a backup every 24 hours, encrypt it, and upload it to an independent write-only server (which proves you couldn't tamper with it). And after e.g. 1 year (let's reduce it from 30 years), you would release the encryption key to that mod log backup you made. This ensures transparency and trust.
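
The delayed-release part is only a few lines of code. A minimal sketch, assuming mod actions are already exported as JSON, with upload_to_archive and key_escrow standing in for whatever append-only storage and key-release mechanism you trust (those names are placeholders, not a real API):

  # Sketch only: encrypt each daily mod-log snapshot and park the key for
  # release a year later. Uses the third-party "cryptography" package.
  import json, time
  from cryptography.fernet import Fernet

  def snapshot_mod_log(actions, upload_to_archive, key_escrow):
      key = Fernet.generate_key()                    # fresh key per snapshot
      blob = Fernet(key).encrypt(json.dumps(actions).encode())
      name = "modlog-%d.bin" % int(time.time())
      upload_to_archive(name, blob)                  # public, write-once storage
      key_escrow(name, key, release_after_days=365)  # key published a year later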


That sounds like a job for public automated mod logs.

Naturally people will still complain since it's impossible to fix people, but I'd imagine having an "authoritative" list of every moderator action and the accompanying explanation would help stave off the corruption/lies accusations. That combined with a reputation of transparency.
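
For illustration, such a log entry wouldn't need to be anything fancy. A rough sketch; the field names are made up for the example, not any real Reddit API:

  # Sketch of a public, append-only mod-log entry.
  from dataclasses import dataclass, asdict
  from datetime import datetime, timezone

  @dataclass
  class ModAction:
      moderator: str   # who acted
      action: str      # e.g. "remove_comment", "ban_user"
      target: str      # post/comment id or username
      reason: str      # the explanation shown to everyone

  def log_action(public_log, moderator, action, target, reason):
      entry = asdict(ModAction(moderator, action, target, reason))
      entry["timestamp"] = datetime.now(timezone.utc).isoformat()
      public_log.append(entry)   # world-readable, never edited after the fact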


> That sounds like a job for public automated mod logs.

I mean, I think that would be an interesting experiment, as I can't think of a large community that provides such data.

But again, if the underlying assumption is "the mods are corrupt", then that accusation can easily be transferred to whatever logs are provided as well.


The Something Awful forums, which have been around for a very long time and have been relatively successful in maintaining a decently sized community, have a public moderation log[0].

Something Awful also has a number of other moderation features that I think more sites should emulate:

1. Temporary bans from posting ("probation") of variable length, to allow for punishment short of a full ban. Usually 6 hours to a couple days, depending on the offense, occasionally a week or longer.

2. A nominal registration fee ($10, one time) to register an account, to cut down on bad actors just making new accounts.

3. Normal bans for being a dick are reversible by paying $10 (same cost as registering a new account), unless you get "permabanned" for either repeated bad behavior or posting illegal content. If you get permabanned, any new accounts you create get permabanned as well (assuming the admins can find them, which they do remarkably effectively using IP and I think payment info).

That last point sounds like it incentivizes the mods to ban users, so that the forums get more money. But it doesn't seem to actually have that effect, possibly because most of the mods are not paid.

There have also been a few interesting experiments in moderation that were less useful, but are definitely entertaining, such as the ability to limit an account to a single subforum (usually the one for posting deals, or one of the ones for shitposting). It's also possible to view a user's "rap sheet" of moderation actions from any of their posts.

[0] https://forums.somethingawful.com/banlist.php


Mods apparently inflict their PTSD from bad posters on good posters, given the common ban first, ask questions later approach.

I really don’t care how shitty another poster was to you. I only care how shitty you are treating me.


While I am sure that occurs, I have been personally banned from several subreddits not for rules violations or violent comments, but because the moderator simply disagreed with my comment or I was harsh in my commentary against the product the subreddit was about. For example, a certain web browser subreddit banned me because I talked badly about a policy of that web browser.

I have much more experience dealing with corrupt and biased mods than with anything else.


Just to provide the other side of this.

Imagine seeing a post exactly like yours, literally word for word, with the same civil tone and all.

Except the user in question had been banned for posting a tubgirl (it's gross) image by a moderator that happened to be female, and had responded in private messages on a clear alt account (minutes after the ban, with the same IP address) calling that moderator a slut who deserved to be raped and killed.

And that style of interaction being relatively common.


Imagine treating a civil poster as if they were a jerk, just because a different civil poster once turned out to be a jerk.


It's not about treating people like jerks, just that you learn quickly to never trust a person at face value when they say "I didn't do anything wrong", because that's usually the first and most common phrase a person who does something wrong uses.

I was merely pointing out that the "public" persona a user portrays doesn't have to match the truth.


Which can equally be applied to moderation staff. Mods that say "We are not biased, and are always fair, only banning people that break the rules" are just as likely to be bad actors as the regular users in your narrative.


Definitely. I mean, in the vast majority of cases, the moderators are just "regular users" who are given power over other users (or chose it, in the case of creating a subreddit). Even if the moderators are employees they are still fallible, with personal motives, just like the people they moderate.

In the end, that's why its such a difficult problem. You take the normal conflict that occurs in communities, add in the potential for malicious actors on both sides and it's no surprise that the normal conflicts can spiral out of control. Especially in the virtual and relatively anonymous setting of online communities.

And from a person on the outside looking in, it can be impossible to actually know what the truth is.


Not sure how that invalidates the equally if not more common occurrence of banning based on philosophical, political or other disagreements with the moderation staff


> Maybe if mods were more transparent across all the subs.

We (r/relationship_advice) are rarely transparent with removal reasons. Our templatized removal reasons generally look something like this:

> u/[user], please message the mods:

> 1. to find out why this post was removed, and

> 2. prior to posting any updates.

> Thanks.

or

> User was banned for this [submission/comment].

The reason is that we have a high population of people who:

1. post too much information and expose themselves to doxxing risks, and

2. post fantasies that never happened.

So in order to protect the people who inadvertently post too much information, we tend to remove these posts using the same generic removal template. However, if people know that the post was pulled for one of these two reasons, the submitter may still end up on the receiving end of harassment as a result, meaning we have to fuzz the possibility of the removal being for one of these two reasons by much more broadly withholding removal reasons.

This is specific to r/relationship_advice. Other subreddits have other procedures.


> Maybe if mods were more transparent across all the subs.

Don't get me started.

Particularly "muting" posts with auto-moderator (silently hiding them for others without notification/warning/explanation). It was originally created for spam control but is regularly abused for generic moderation. It needs more controls placed on it.

To give a recent example, I wrote a long reply in /r/fitness's daily discussion but it was muted because it contained an offhand remark about COVID-19 (vis-a-vis getting hold of fitness equipment right now). Why are they muting all comments that contain "COVID?" Who even knows, but the /way/ it was done was pretty irksome and resulted in wasted effort on my part for a comment that violated zero published rules or etiquette.

/r/politics has a huge dictionary of phrases and idioms that result in auto-muting. None of which are defined in their rules.
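
To be concrete about the mechanics (this is just an illustration of how such keyword rules typically behave, not AutoModerator's actual config syntax), the rule is usually a bare substring or regex match, which is exactly why an offhand mention gets a comment silently hidden:

  # Illustrative keyword auto-mute check; MUTED_TERMS is a hypothetical
  # per-subreddit dictionary, not taken from any real sub's config.
  import re

  MUTED_TERMS = ["covid"]

  def should_mute(comment_text):
      pattern = "|".join(re.escape(t) for t in MUTED_TERMS)
      return re.search(pattern, comment_text, re.IGNORECASE) is not None

  # should_mute("Hard to find dumbbells during COVID-19")  ->  True (hidden)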


> /r/politics has a huge dictionary of phrases and idioms that result in auto-muting. None of which are defined in their rules.

This is true here on HN too. One such word that I’ve seen cause comments to get auto-killed is “m∗sturbation” (censored for obvious reasons), and I am sure there are others.


Here's the list:

  masturb circlejerk faggot nigger feminazi shitstain lmgtfy.com #maga
...plus a bunch of spam sites. Why those words and not others? Shitstained if I know. These things tend to get added when somebody does something, and then stay there.


Plenty of people complain about overmoderation here on HN as well... I am one of those people


I firmly believe that the only way an online forum, particularly an anonymous one with free signups, can remain relatively civil is very heavy-handed moderation. This is not a free speech zone (it's a privately owned space), and the only way to prevent bad actors is to be very liberal in applying heavy moderation.

If a few "medium" actors get banned by accident, that's the price to pay for the rest of us getting to enjoy discussing tech without dealing with a toxic cesspool.


I strongly disagree. Bad actors thrive just as well if not better in places with moderation and even sign up fees.

Metafilter is a forum which used to be very diverse in opinion and is now basically captured by a small vocal minority. How did they do this? The minority produced a large amount of content for the forum and was very active. 95% of that content was high quality and on topic, but the remainder was biased and very opinionated. As a result of their interaction with the site they were closer to the mods, regarded more highly, and given the benefit of the doubt in "bad actor" conversations.

Now today Metafilter is a dying community of territorial users who eviscerate anyone who doesn't know how to play the game. The minority won and created their little clubhouse corner of the internet. Metafilter as a forum though is a shell of its former self. There are fundraisers to keep the site alive and formerly paid mods and devs are now either retired or do it for free. Not only did it drive away old and new users but the minority also seems to have become bored without constant drama and moved on (as seen by new posts and commenting activity falling off a cliff).

I know the common refrain on HN and in general is "on a private platform there is no such thing as free speech" but be careful. Don't let blatantly toxic users run your platform but beware of users who are quick to call everything toxic.


Every community has its rise and fall, and there is no one solution for all of them, but I still think without heavy moderation that fall comes much quicker and more violently.


Yes, moderation might be necessary, but what the anecdote illustrates is that simply having moderation will not be enough.


>>how meaninglessly violent people become when their posts are modded

> Maybe if mods were more transparent across all the subs.

Does the latter really justify the former?

If you disagree with a moderation decision, take it up with them politely. If you consistently disagree, maybe this community just isn't for you!

Mods are just people donating their time. Even if they're inconsistent or "corrupt", there's no reason you should respond in any way that can be described as "violent".


What if you take it up with them politely and this is their response? https://imgur.com/a/jhmGXzJ


Find or create another community.

There are basically two scenarios here. One, a lot of people agree with you — in which case you should be able to appeal the decision or splinter off successfully. Two, most people disagree with you — in which case the mod is probably right, or at the very least you're simply not welcome in the community.

Let's also not forget that we're talking about violence in response to moderation decisions. So even if that's their response, it's still not okay to e.g. threaten them.


The problem with the first suggestion is the fact that the moderator in the above screenshot mods 1000+ subreddits. You can be pretty sure they mod the other subreddits with the same mindset; how many of them do you recreate? This is the whole problem that has caused the drama in the past few days.

> Let's also not forget that we're talking about violence in response to moderation decisions. So even if that's their response, it's still not okay to e.g. threaten them.

Agreed.


> If you consistently disagree, maybe this community just isn't for you!

Ah the classic "don't let others make changes but outcast them" approach.


If you are in conflict with enough other members of a community, you are de facto outcast anyway. And if enough other members agree with you, just break off and form your own community!

We’re talking about online communities — low stakes to join, low stakes to leave. There are exactly zero reasons that anyone should resort to doxxing/threats/etc in response to moderation decisions.


As a mod - how much time do you commit to this? Are you just sitting there reading comments all day? I'm fascinated by this whole mods thing especially since it's volunteer (!!) work - which strikes me as insane.


If Gallowboob and friends are just helpful innocent people, why were the calling-out comments, dozens upon dozens of them, deleted, and the people posting them banned?

If you're helpfully, innocently, looking after dozens of top subs, and people mention that and wonder what's going on, you don't censor them, you have a flipping AMA about it!


> If Gallowboob and friends are just helpful innocent people, why were the calling-out comments, dozens upon dozens of them, deleted, and the people posting them banned?

Because personal witchhunts are against Reddit's rules as a form of targeted harassment, and mods are de-modded by Reddit for not enforcing Reddit's rules -- or worse, subreddits that consistently see significant Anti-Evil Ops (effectively Reddit's on-payroll God-Mods) action may be quarantined.


I looked at some of the deleted comments, and if I were a mod, I would delete them as well.


I looked at some of the deleted comments, and there was no reason except hubris to delete them.


> And Reddit has absolutely no support system for things like this.

My immediate thinking involves referring affected parties to professionals and specialists who deal with this sort of thing, but in your opinion, what should Reddit (the company) be doing?


If anything Reddit (the company) should be working to limit the influence of a single "power" user like this particular moderator, not make it easier for him to control the site.


That's fair, I was speaking more to the question of moderators affected by PTSD. I don't necessarily disagree with what you've said here, but it doesn't quite answer my question. Or are you suggesting that Reddit admins limiting mods' ability to be mods is a solution to this particular problem?

I guess I'm having some trouble unpacking your suggestion, can you help me gain some clarity on what you're saying?


Paying for the counselling of the moderators.


I more or less quit Reddit when I realized what kinds of traits moderation tends to select for: https://jakeseliger.com/2015/03/16/the-moderator-problem-how.... Perhaps you're an exception, or the exception, and, based on Reddit's growth, it seems that my preferences are minority ones.


Why should reddit offer you any support? You volunteered; no one forced you.


[flagged]


This is a little more serious than "you're a piece of shit", no?

> I've been dealing with some on and off after a post some years ago where somebody who requested advice followed through with the best course of action only to find that his wife killed their kids soon after.


I've got two thoughts after reading this. One is that trauma isn't a binary thing, and it can accumulate over time. Some of these mods are dealing with serious events, like the user who was given advice which led to his wife killing their children. Others got doxxed and receive death threats.

Two, if someone feels upset from getting messages like "you're a piece of shit", I wouldn't say they're an immature child or a weakling. I'd say they could be sensitive, perhaps also very considerate, and in that moment they may worry that another human being is upset and they're responsible for it. They might hurt from the pressure to moderate something that's important to them. Maybe they struggle with social interaction, even if it's online, and experiences like this can be very hurtful.

I don't care why something hurts someone, I care more that they are hurt. Chronic exposure to these things, even if they do seem benign or minor from the outside, absolutely can lead to trauma.

Trauma is a result of exposure to acute or chronic damage to your physical or mental well being. This can occur in a staggering number of ways from person to person. How each of us handles that will vary, but if it leads to a lasting impact, it's trauma.

I'm sorry you experienced what you did. It's arguably worse than what the moderators are describing, but it isn't exclusive to those experiences. It's also not a contest.


Unfortunately (or fortunately, as the case may be) PTSD and trauma in general are not yours to define or gatekeep.


Calling someone with PTSD “weak” is exactly how society brushed aside soldiers and child abuse survivors with PTSD for decades.


> I'm so sick of people on the internet saying they have PTSD

> Calling someone with PTSD “weak”

That's a non sequitur derail. The post you are responding to is about faux self-diagnosis.


> That's a non sequitur derail. The post you are responding to is about faux self-diagnosis.

"weakling" is right there at the end and the GP is going on about a strawman not mentioned by the OP anyways, as several of the child comments point out.


> "weakling" is right there at the end Again, this is not related to the assertion.

Name calling (as a proxy for illegitimacy) is all the same for the intent of the point. Claiming damage without evidence is not compelling, even if it makes for tasty fluff content.


> A stranger saying, "you're a piece of shit," is not fucking traumatic to anyone but an immature child or a weakling.

Think more everything from graphic descriptions of rape to child porn. Reddit generally lets comments of the "piece of shit" nature stand.


Negative social judgements, death threats, and attempts at doxxing and property damage are a heavy burden when you know (or believe) them to be coming from real people that you can't control and who are unpredictable. It is a reasonably similar tactic to torture. The increased perception of danger grows your amygdala and it doesn't ever shrink again; it's something you have to know how to deal with now. If you didn't before, it can be traumatic.

Watching ones you love die is terrible but it can also bring about a sense of peace. Being actively predated on is no fun. I have also seen mental and physical abuse as a child, fwiw.


Have an upvote.

We're conditioning a culture to be hypersensitive, and at the same time, for some of them, hyperaggressive. Lack of experience has always clouded the vision of humans, but many find themselves, willingly or not, living within silos and echo chambers, which reinforce their beliefs and behaviors. What they consider trauma does not begin to approach yours.

In other times, their vocal ignorant statements would be squashed immediately upon utterance. In these times, they receive reinforcement, from some.


Why is that subreddit so similar to Jerry Springer? How did it become so trashy?


You shouldn't be seeking relationship advice from random strangers or a reality show in the first place. Given that no one is a verified counselor, the sub is largely redundant at best, knowingly harmful at worst. It is rife with low-effort karma farming and toxicity, among the defaults.


Largely true with two caveats.

1. We're trying to keep the subreddit as accessible as possible to people with nowhere else to turn. This is *hard*.

2. There's at least one verified crisis counselor who frequently comments: u/ebbie45. We've verified their credentials.


I stand corrected.


drama-farming.

We're still figuring out how to deal with it short of a figurative nuke.


> Most mods burn out quickly because of how dark the questions get

Care to give us some examples?



You don't know if that's what he's referring to.

In fact, I'd expect that the best examples got modded out.


The "̶b̶e̶s̶t̶"̶ most representative examples often involve egregious cases of abuse against parties either without consent or without a capacity to consent.


> Moderating Reddit's larger subreddits is absolutely capable of resulting in PTSD-like symptoms

There was a moderator of the gaming subs that killed himself fairly recently. He said largely the same thing, that modding was not healthy, but he continued to do it.

Why do you think that is? I suspect it was because he had control over something and found that too appealing to let go of. That’s not really a soldier’s dilemma of duty and responsibility.

So that anecdote aside, I’ve worked with a special forces vet that actually has PTSD.

Respectfully, if you think moderating an online forum is any sort of analog even to be “PTSD-like” you are either mistaken or have a far more gruesome task than I think possible.


> Why do you think that is? I suspect it was because he had control over something and found that too appealing to let go of. That’s not really a soldier’s dilemma of duty and responsibility.

Speaking solely on behalf of myself: we see a notable volume of fantasy and fetish posts as well as legitimate pleas for help that veer into extremely disturbing territory. The result is a situation where mods may well find themselves feeling substantially troubled with extended exposure.

I'm not about to impose that on someone else, and as a result of inevitable scope creep from the sub gaining readers, we've now got to sustain an environment that people use as either a first- or last-resort option while at the same time turning away significant populations of people (a subset of followers from influencers such as https://twitter.com/redditships/) who appear to relish creating drama from people calling for help. Great example: https://twitter.com/eganist/status/1263534755045412870

When staying imposes a burden on myself but leaving heightens the risk that people may be harmed, it's a lose-lose, and the trauma arises from this.

I'd show you some of the stuff we've had to mod out, but it's too dark for Hacker News.


> but he continued to do it. Why do you think that is? I suspect it was because he had control over something and found that too appealing to let go of.

Why do people volunteer on anything online, like open source projects even? Sometimes people like a thing, want to keep it good, and feel that it's less likely to happen without them. Leaving is condemning the thing to possibly get worse and decay through less contribution, and interrupts their social connections formed through it and their established routine that gives the satisfaction of contributing. It's not something easily abandoned after investing years in.



