This is why emergency exit doors must open outwards. People have died in fires in theaters trying to push on doors that said “Pull”. The crash bar on an emergency exit door is what’s known in design as an affordance.
When you present people with a door handle that affords “pulling”, they will pull even if the sign says “push”.
Now consider the fact that the primary affordance of social media is the “reaction”. Is it a surprise that content that garners a reaction will trend towards the outrageous?
If proactivity is the road to a more fulfilled, more civilly minded life and society, maybe we need to think of our affordances. Because we’ve made it awfully easy to be reactive, and awfully cumbersome to be proactive.
The comments below correctly point out an error; what I should have said was that there’s a danger in putting a “pull” handle on an emergency door that pushed outward, because of the confusing affordance.
To stretch it a bit, I'd say there's an important difference between a fire door and reddit. The fire door knows its job is to prevent people burning to death. Reddit... does it know its job is to prevent outraged reaction?
I think this is one of the things that made Facebook so problematic in politics: it can't tell the good likes, comments, and shares from the bad ones. I'm not sure they really had a concept of better and worse. Some stuff isn't allowed, but otherwise?
Imagine one post, where Mrs X invites neighbors to meet a local candidate at her house for revolutionary thoughts and biscuits. Another post, where Ms Y rants about Macron voters, Trump, taxes and kids these days. Both are political. One is actually democratic and participatory. The other is cheap, nasty, unproductive and divisive. Does Facebook, in any meaningful sense, value one over the other? Does reddit?
Reddit has its non-censorship values. I respect that. It's important that someone does. I also think they want to house the weird, and I respect that too. But, I think unrestricted speech may be an insufficient value, like nondiscrimination or atheism. It's not enough to build on. You need positive values too.
Free speech is also problematic, when taken as 'all speech is equal.'
Ha, ha, ha.
Cause that'll increase pageviews.
Outraged reaction is exactly what all news media is going for because people rant and rave and in passing see more adverts. It's just they want to sanitise the topics according to their advertisers wishes.
Reddit doesn't have non-censorship values; it's heavily censored. Not all of it from the top, admittedly, but the sanitisation that's gone on in the last few years is huge, as Conde Nast have moved to make it a more tempting platform for advertisers.
If you think reddit doesn't have those values, I guess we disagree. I don't see any way of coming to that conclusion apart from fundamentalism, it's either absolute or it's bullshit.
https://www.wired.com/story/bad-actors-are-using-social-medi... — is Reddit the first entry? Is it in the list at all? Nope.
I think Conde Nast are prepared to stoop pretty low in the search for dollars before integrity.
That said, it certainly seems that, if not as an explicit business decision, Conde Nast has no problem with their properties sensationalizing their media for views/clicks/etc.
It looks like you are imagining that every piece of speech has an objective "value" which can be determined (maybe it's hard to do, but if we throw enough "big data" magic dust onto it we can get at least close), and that speech can then somehow be sorted by value.
I think it's completely wrong from the premise up. The value of the speech is a subjective measure. Some people value invitation of candidate X, some people value rants of Ms Y. In fact, if salaries of talk radio hosts and late night comedians tell us something, way more people value rants than measured, polite discussion. Thus I think "value" exists only in the eye of beholder, and trying to objectify it would only mean dismissing the values of part of the audience and emphasizing values of some other part of the audience. I can see why a site may want to do it and why one may want it to happen, if he or she happens to belong to the latter part, but I see no real justification for it.
And yet we need to do it. As you observed, "way more people value rants than measured, polite discussion", and I'd claim that this is a problem. The "value" may be subjective, but the consequences of both "kinds of speech" are real, and so it would be great to incentivize the kind that leads to a more stable, more just society, and disincentivize the one that causes thoughtless destruction.
Yes, I'm aware that "disincentivize" is getting dangerously close to "ban", but I feel we need to try and walk that fine line, if we want to have a society that's better than just random.
Who are "we"? And what is "it"? Do you just take on yourself the mantle of decider for the whole world who is worthy and who is not? Or how is it determined? Who is worthy to wear that mantle? I don't think any human is.
> I'd claim that this is a problem
Maybe human nature is a "problem". But what are you going to do, replace humanity with a better species? Do you have one in mind? Beyond that, I don't see how declaring it "a problem" helps anything, unless there's a fix to this problem. History teaches me that all attempts to create a "new, better human race" didn't just end badly - they ended so awfully that when people point at it, other people get offended that you dare to compare their ideas to that. So, we have to deal with what we have now - and have had for millennia. Given what we did with it - well, not ideal, but certainly there has been some improvement.
> it would be great to incentivize the kind that leads to a more stable, more just society
How do you know what leads to more just society? Maybe rants would lead to more just society faster? As for stability, stability is only good when we're at the optimum. Are we?
> I feel we need to try and walk that fine line, if we want to have a society that's better than just random.
Which fine line? Everybody has their own fine line, that's the point. You could maybe find a bunch of people whose fine line is similar to your own, if you don't look too far into the future (and if you do, what you get is https://www.youtube.com/watch?v=WboggjN_G-4) - but pretending there's some line that's good for everybody is just willful blindness. And out of all possible ways of building a better society, I don't think dismissing people that have different views as something that doesn't matter is the best one to start with.
I agree that the root of the "problem" currently lies within people and not process. I agree that changing people wholesale is not easy and not desirable. I agree that there has been some improvement.
I also think human nature is a product of the environment. The way people behave on different websites, in different countries, and in different social situations shows this rather clearly. There is no fixed set of anything that constitutes the whole of how people act. Put someone in a nudist colony, and the environment changes, and the way they act changes (with time). If a reddit user starts going to 4chan, the environment changes, and the way they act changes. Put someone who follows the "always defect" strategy in a community of "always cooperate" people, and the environment changes, and the way they act changes. Put a racist in a racially diverse community of acceptance, and the environment changes, and the way they act changes.
If you accept this idea, it follows that certain environments can be better for society as a whole. Case in point with Reddit: they decided an environment without bestiality and certain violent elements would be better. Maybe they were wrong, but I don't think so. I'm not suggesting I have a wonderful theory of what the best environment is, only that there are better and worse ones. The problem of what we value is hard, but that doesn't mean it's not worth trying.
This ties the loop back: human nature is a product of environment, and environment is a product of humans. We have the power to create environments that make thoughtful discussion easier and hate harder. We can put energy toward solving the problem by changing the environment. HN has an environment I am very fond of, despite being here for only a couple years. I appreciate the work that has gone into making the comments an insightful, respectful, and generally nice place.
We can't replace humanity. We can't change people without changing the conditions they exist in. We can change the circumstances of our struggle in order to grow together as a species.
> I don't see how declaring it "a problem" helps anything, unless there's a fix to this problem.
This seems wrong to me.
The process of (as a group) clearly identifying things that are problems, and coming to agreement that they are problems, and coming to agree on whether they are important, is of fundamental value, regardless of whether we have solutions at hand.
Without this step, folks will either be ignoring problems because they don't know about them, or proposing "solutions" to things that others don't even see as problems, neither of which can lead to any good...
Presenting what the problem is.
The first step to solving a problem is understanding it. It's not solving it. Trying to solve a problem immediately is like trying to write code before you fully understand the requirements.
If you lack the mental fortitude to simply look at a problem without having an immediate solution to it, you're not going to be able to solve major, ugly, nasty, uncomfortable problems like this.
But, inevitably, these problems will show up and knock on your door. Running away from them is not a good plan.
We can tell people it's not attractive to be ranty, but I'm not very comfortable going further than that, at the risk of wandering into thought policing.
Look at some modern political opinion; I'm sure you've seen it as much as me. Ranting is cool. Calling everything under the sun "problematic". The tricky thing is it's not always wrong. There's surely a nearly infinite list of "problems" one could identify, and some of them are truly important. But we need to turn down the heat on the criticism for just a second. And you can't just ask partisans on both sides to listen to each other more. We need to make it less cool to be blindly partisan. We need to increase the value of being able to identify with anyone. And we need to make it really uncool to judge hundreds of millions of people you've never met with deep assumptions.
I don't know, it's an interesting problem and I haven't thought of it this way very much. All I'm sure of is that this growing lack of interest in protecting free speech is about the only topic in modern politics that I give a shit about.
I don't want to see it going much further than that either. I was thinking more along the lines of making it so being thoughtful is "sexy" and being ranty isn't, the way today owning a car is "sexy" and smoking isn't.
Free speech has its positive and negative consequences on stability and happiness; I do not want to fight free speech, I'm looking for ways to reduce the negative consequences. I'll protect your (and mine) right to rant about whatever you want, but I sure as hell would like the general policy discussion to involve less rants and more thoughtful cooperation.
I agree that free speech is an essential aspect of what makes us humans, and that it comes with both many positive and negatives.
In implementing a plan to mitigate the negatives, though, I much prefer a private entity such as Reddit censoring whatever they wish, since if people believe it becomes too harsh they can simply leave. I'm paranoid about allowing an entity like the government (where constituents can't easily just leave) to get involved, as it allows for many conflicts of interest. These conflicts could be instances where the ruling party or minority parties push to label an opposing belief as more divisive, or where the ruling majority seeks to 'disincentivize' a minority or outside belief/religion by saying it is offensive to what they deem our values.
I feel like we should push for the civilizing of speech to be a societal change, not a policy-based change.
On a slightly different note, people have been saying that language and civil discourse have been going to hell for a very long time. George Orwell rather famously wrote an essay titled "Politics and the English Language" in the early-mid 20th century, wherein he detailed how society was moving towards using unclear and imprecise language to pander to the many without being forced to use falsifiable statements. Anthony Burgess wrote "A Clockwork Orange" in the 1960s, where he highlights the main characters' savagery in part by highlighting their usage of 'barbaric' dialect. William Langland wrote in 1386 that "There is not a single modern schoolboy who can compose verses or write a decent letter." While civil and educated discourse is an important issue, people have been saying its decline will lead to the downfall of society for a very long time; in many cases language is just changing, and the entrenched powers dislike having to cope with that change.
Says who? Let's get a little more objective first. Its job is to make money, presumably, if not encourage participation by any means necessary. Its job isn't to moderate. Its job is to allow the creation of sub-communities that can be moderated in any conceivable way. Most moderators aren't interested in reducing reactionism, they're just interested in reducing whatever they or their community doesn't like.
I agree with this statement, except for the inclusion of atheism in the list. Atheism is literally the lack of a belief. You may as well say "The lack of belief in astrology isn't enough to build on. You need positive values too."
Unrestricted speech and nondiscrimination argue for something. Atheism literally argues for nothing.
Free speech is not problematic, as long as everyone has the right to speak and falsehoods can be debunked - there should be no "safe place" for the exchange of ideas, whether they are good or not. Starting by saying that 'free speech' has a problem is a very, very dangerous place to go to.
What do you think a "safe space" is? If you are arguing that there should be platforms where people can speak without being shouted down when the audience strongly disagrees with what they're saying, that is a safe space. In order to construct that space, you have to deny some rights of the audience to speak in that context.
And this is the whole problem with naive free speech advocacy. Unrestricted free speech is not possible anymore than unrestricted freedom in general is possible. People cannot possibly hear every single person's viewpoint, so some people will always be denied a platform to speak to some other people.
The question is how best to structure our societal discourse. What values are important, and how do we protect them? And the question needs an answer more complicated and nuanced than "free speech". Because when we don't acknowledge the complexity of this question, we become blind to the de-facto decisions that we're making about which speech to prioritise.
No, I am not asking for that kind of platform. I am saying to let people express what they want to say, and the only restriction on free speech should be "direct incitement to violence" (such as publicly calling for someone to be lynched), as recognized in US constitutional law. Everything else should be able to be said and heard, and debated between people as long as they want to debate. And of course you will be responsible for what you say, as an individual, and you will have to face the consequences of your words. But it goes both ways.
Restricting Free Speech puts power among the ones in control of Speech. Allowing Free Speech is the only thing you can do to allow even the marginal points of view, even unpopular ones, to be heard.
But what does that actually mean? If all you're saying is that the state should not stop them, then relatively few people disagree with that, but the argument usually goes further. There are many ways people's speech can be limited without the involvement of the state. Be that de-platforming, protests or economic or social limitations.
>Restricting Free Speech puts power among the ones in control of Speech.
This is true, but the reality is there will always be restrictions on speech. It is not physically possible to let everyone speak to everyone, or even just those willing to listen. However we structure our societal discourse, it will always privilege some speech over other speech.
We have to engage with, and be ready to criticise, the implicit decisions being made about what speech is privileged and why. Because if we don't, then we cede power to those who already restrict speech with these decisions.
Just saying "don't restrict speech", and thinking that gives everyone a voice, is incredibly politically naive.
I don't see how something that is a fundamental piece of Western civilization (at least in most of the English-speaking world) can be "incredibly naive".
> It is not physically possible to let everyone speak to everyone
No, but first not putting any filter on the nature and contents of speech, as long as it is not violent, is something we should stand for. The "How" is irrelevant.
> However we structure our societal discourse, it will always privilege some speech over other speech.
This should be an individual's choice to make, as in what "speech" you want to listen to. When you go on social networks for example, it should be expected and natural to find people who share different views, no matter how revolting they might appear to you. And we should find comfort in the fact that they are allowed to be expressed, because in turn we are allowed to express ourselves just as well. So in fact, there is no intervention needed by any state actor - on the contrary, free speech is the tool that enables us to discourse and experiment with ideas. Just shutting the door, or filtering inconvenient speech is not making it go away, and certainly will end up having no positive effect towards those who profess such speech, because it could further solidify their opinions and prevent them from being receptive in the future.
It may be good for me to hear an uncomfortable truth, but ultimately I get to decide whether I listen or not. If I want to live in an echo chamber, I can do so, but it should be my choice. Equally if I want to hear uncomfortable opposing views, I should be able to. (Somewhere. Not necessarily on Reddit.com. The site's owners can publish whatever they want.)
This is true in practice, because as a last resort I can put my fingers in my ears and say “la la la not listening” loudly. So frankly, advertisers should give up trying to force me to see their adverts — they can't — and persuade me to listen instead.
Also, affordances aren't to support habit or unconscious behavior. Affordances are to imply meaning or how to use something by their design (whether digital or physical).
That's actually called a signifier. https://ux.stackexchange.com/a/94270
One could even write a book about the discourse of affordances. It would be boring and irrelevant, but it wouldn't be wrong.
Sorry, just defending the mechanics of meaning.
Asking people to build user-interfaces that promote self-control seems a bit naive though. There's a reason self-control is identified as one of the primary "fruits" of a Christian life (Galatians 5:22-23), since it's exactly the opposite of default human behavior.
Automatic negative feedback mechanism.
Much of internet usage is, essentially, a giant slot machine. We've studied the things that make gambling addictive and intentionally incorporated them into social platforms to keep eyeballs on. The goal is to generate a reflexive subconscious impulse to check for new inputs with specific relation to the platform. A good high-level overview of this is the book Hooked, about how to build addictive computer interfaces. As we've "gamified" and "optimized for engagement", we've created an apparatus that many have difficulty understanding or overcoming.
This concept cemented more for me as I watched an Amazon FBA seller just impulsively hit "Match Price" over and over again, despite the fact that it was leaving him with margins of less than 25c per unit. He couldn't be reasoned out of this and insisted he had to match price with the lowest seller, even though his units would've sold for much higher margins over the ensuing weeks if he hadn't.
I realized then that really, he was simply gambling. He hits the button, sees a corresponding spike in sales, and sees a corresponding increase in his "income", despite the fact that letting it increase at a slightly more moderate rate would've easily netted 3x-4x more money (not talking years here -- just a few weeks to sell out as non-cheapest). He does it because he likes the impulse. He likes pressing the button and correlating that action with numbers that say he increased his money.
Facebook is the slot machine; Likes and Comments (broadly "engagement") are the currency; Your Posts are the input. You drop in a post and find that bland, mature, predictable posts don't generate a bright animation or engagement. You respond by posting more and more stuff that has high-engagement quality, which is, of course, the more controversial things like religion, politics, and identity.
The check on this is one's own identity is tied back to it, but that generally just means that you take the things you conceptualize as positive self-image to an extreme that warrants reaction, in particular in these controversial fields that generally have high-yield output from the slot machine.
So Republicans are put on a path toward cartoonish Republicanism, Democrats on a path toward cartoonish Democratism, and so forth. It has a distorting, reinforcing echo-chamber quality on everyone. It encourages people who are more moderate on certain issues to disengage, it encourages us to select them out and weaken our bonds, it makes it easier for our own little world to become all-consuming and feel all-encompassing. It has an overall corrosive effect on public discourse and relationships in general.
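The variable-reward loop described above can be made concrete with a toy simulation. This is only an illustrative sketch of a variable-ratio reinforcement schedule; the post names and engagement probabilities are made up for the example, and no real platform's ranking is modeled here.

```python
import random

def simulate_feed(posts, trials=10000, seed=0):
    """Toy variable-ratio reward loop: each post type pays out
    (gets engagement) with a fixed probability; we tally which
    behavior the intermittent payouts end up reinforcing."""
    rng = random.Random(seed)
    rewards = {name: 0 for name in posts}
    for _ in range(trials):
        # The user "pulls the lever" on a random post type...
        name, p_engage = rng.choice(list(posts.items()))
        # ...and the payout is intermittent and unpredictable,
        # the schedule that conditions the strongest habit.
        if rng.random() < p_engage:
            rewards[name] += 1
    return rewards

# Hypothetical engagement probabilities: outrage "pays out" most often.
posts = {"bland update": 0.05, "hot take": 0.30, "outrage bait": 0.60}
counts = simulate_feed(posts)
print(max(counts, key=counts.get))  # prints "outrage bait"
```

Under these (assumed) payout rates, the behavior the loop reinforces is exactly the high-controversy posting the comment describes: the user learns which lever pays.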
The more I think about it, the more I think the physical constraints of "real life" on dialogue and relationships, like being in a room with someone who can punch you if you sink too low, or making instinctive reactions and adjustments based on a conversation partner's body language, are much more valuable than we've assumed.
Well, maybe, but is there a way it could be otherwise?
We can't even notice the other possibilities, let alone consider them.
Is this true? I always assumed it was because it's pretty much impossible to pull open a door when you are being crushed by people pressing against one another trying to escape.
>An inward-swinging door - three times cited as a code violation by West Warwick inspectors and three times replaced by club managers - was blocking the exit closest to the stage.
The door in question's main problem appears to have been that bouncers wouldn't let people use it, not its orientation.
But I think your assumption is probably more correct, past a certain point they have no option to pull anymore.
There are engineering reasons to prefer a round hole, but there are sometimes other considerations which push a different shape.
Manhole covers exist in several shapes, as illustrated on the Wikipedia page.
You can see lots of utility covers that are hinged, square, or other options, but all are lipped for safety. While a circular profile makes the lipping easier, it doesn't seem to influence many utility covers.
Stop for a moment next time you walk in an urban environment. Look for the circular utility covers you see, as opposed to square ones. Look at the features of the circular ones vs the non-circular ones.
One of the reasons this myth irritates me so much is that everyone is so certain that they know exactly the answer but their actual daily experience doesn't line up with the results at all.
I've heard several people offer explanations from a geometric safety option (which is attractive for free-standing covers) to the simplicity of manufacturing (e.g., that it's very easy to make circular molds and get even density compared to square molds) to simply what the contractor suggested. I've also heard people suggest that metal cylindrical templates were something very common to manufacture for a variety of industrial uses.
I'm not sure what features I would look for because the lip on a rectangular cover would be under the lid. How do I know that the rectangular covers have sufficiently large lips to prevent the lid from falling in? I'm willing to accept that in some cases the lids are rectangular and there is a risk of them falling in.
Then why do rectangular (including square) and triangular manhole covers exist? Rectangular ones are quite common, triangular less so.
Grandparent is correct, lips are what, in practice, prevents manhole covers from falling in, not being circular (which many are not.)
(Not always, but quite frequently.)
They have the same gravitational-gradient dynamic as manholes.
Also fun fact: That used to be a Microsoft interview question.
You're implying that in every thread where a troll shows up to derail the conversation we all have to stop what we're doing and give thoughtful responses to the troll to show them the error of their ways. But then the only possible conversation is debates with trolls. But that gives too much power to the trolls. Sometimes you just have to shut the trolls up so you can talk about what you want to talk about.
"Poisoning a conversation" is a bad metaphor that conflates two different things which do exist, but need to be dealt with in two different ways:
1. Baiting: trying to say something horrible to anger people for their own entertainment. The proper response to this is simply to ignore it: if you aren't entertaining the baiter gets bored.
2. People saying things they actually believe, even when those things are genuinely terrible. Responding to these people prevents their beliefs from going unchallenged, and is the only way we can possibly hope to change those beliefs.
If it were just group 1, you could just ban those people and that would be fine. But the problem with that is that sometimes people are actually in group 2, and engaging those people and correcting them is part of arriving at shared values in a functioning society. The tendency to accuse people who are genuinely expressing their (awful) opinions of simply baiting so that you can ban them is problematic for open discussion.
> Sometimes you just have to shut the trolls up so you can talk about what you want to talk about.
Contrary to what you're saying, I think it's very possible to have conversations about what you want to talk about while letting these people say what they want: isn't this what comment trees exist for? On Reddit and HN, person A and person B want to have a conversation, person C can say whatever they want, and it doesn't affect the continuity of A and B's conversation as long as A and B respond directly to each other's posts, and not to C's posts. Every platform I know of supports private messages.
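The structural point about comment trees can be sketched in a few lines. This is a minimal illustrative model, not any platform's actual data model; the `Comment` class and its fields are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    text: str
    replies: list = field(default_factory=list)

    def reply(self, author, text):
        """Append a child comment and return it, forming a branch."""
        child = Comment(author, text)
        self.replies.append(child)
        return child

# A and B converse in one branch; C's comment spawns a sibling
# branch that never interleaves with theirs.
root = Comment("A", "original post")
b = root.reply("B", "thoughtful response")
b.reply("A", "continuing the conversation with B")
root.reply("C", "something A and B would rather ignore")

print(len(root.replies))  # prints 2: two structurally independent branches
```

Because replies attach to a specific parent, C's branch can grow arbitrarily without ever displacing or interrupting the A-B thread, which is the isolation the comment is pointing at.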
Underlying what you're saying is an assumption I'd like you to reconsider: why is it that you think that a public conversation in a forum where anyone can respond should only be about what you want it to be about?
These things are called poison because even a small amount can cause serious problems if left unchecked, and the effect creeps out across an area (how many people are hit), like poison spreading.
I think what you're missing is that a very large number of people simply do not enjoy a certain style of discourse, to the point that they'll opt out of a community that doesn't ban that discourse. You may not like it, you may think those people are weak or something, but the rest of us want to talk to them, and we want them to feel welcome.
Which form of trolls from my post are you talking about? I insist that we not pretend these are the same group of people.
I think that good moderation filters out the baiters and lets the people who believe what they're saying stay. And contrary to what you're saying, I don't think that such communities end up with just trolls. There were plenty of reasonable conversations on Reddit before Conde Nast took over and dropped the banhammer.
> I think what you're missing is that a very large number of people simply do not enjoy a certain style of discourse, to the point that they'll opt out of a community that doesn't ban that discourse. You may not like it, you may think those people are weak or something, but the rest of us want to talk to them, and we want them to feel welcome.
I'm not missing that--in fact, I don't enjoy talking to people with hateful beliefs either.
But the alternative you're proposing is an echo chamber where you don't have to hear those people, but they still believe what they believe, and those beliefs become our leaders and laws. If we ignore bigots on the internet we get bigots in office.
The reason we get bigots in office, by the way, is not because trolls are banned. It's because powerful interests want bigots in office. Bigotry sells. It's very easy to screw people over if you can distract them by having them hate on some out-group. It is naive to think that talking with bigots will change this.
And here is the point: social change doesn't proceed through rational discussion. Never has, never will. Real change requires organization and solidarity and protesting and marching and uncompromising demands.
If you want to waste time engaging with trolls have at it. You will find that these people have nothing but contempt for discussion and no interest in being swayed by logic. For the rest of us we have far better things to do and banning trolls and bigots is the obvious choice.
"Have to" and "obligation" in a general sense are things I try to avoid saying. They don't exist in my belief system, and I apologize if I mistakenly said otherwise.
What I'm saying is that if we want bigots to change, we can't just expect it to happen.
> The reason we get bigots in office, by the way, is not because trolls are banned. It's because powerful interests want bigots in office. Bigotry sells. It's very easy to screw people over if you can distract them by having them hate on some out-group. It is naive to think that talking with bigots will change this.
I think you've confused cause and effect here. Some powerful interests certainly see bigotry as an end goal, but I think most powerful interests who support bigotry see it as a means to an end. As you said, bigotry is a distraction to achieve other goals. Bigots are easily manipulated if you don't care about bigotry: you just pretend to be a bigot and that gets you power, and then you can do what you actually want to do. If there were not bigots to be manipulated, powerful interests wouldn't push bigots into power.
> And here is the point: social change doesn't proceed through rational discussion. Never has, never will. Real change requires organization and solidarity and protesting and marching and uncompromising demands.
Organization and solidarity and protesting and marching aren't incompatible with rational discussion, and in fact none of these things work if they aren't a means of putting forward a rational discussion.
Modern protest movements need to read Martin Luther King's writings and understand what he really did. Every single protest he led was carefully designed to make a point in the rational discussion of the time. The bigoted viewpoints of the time: that people of color were violent, dangerous, less intelligent, etc., were struck down one by one on public television by MLK's protests. Bigotry is based on lies, and MLK made it impossible for people not to see the truth. When bigots feared people of color would be violent, he showed them people of color peacefully being beaten. When bigots feared takeovers by blacks, he showed that people of color only wanted normal things like sitting where they wanted on the bus and drinking from the same water fountains. He didn't simply try to talk over the people he disagreed with, he listened to their concerns and showed their concerns to be invalid.
Harvey Milk, as far as I know, didn't write about his tactics, but they are clear in what he did and said. When bigots saw homosexuality as a foreign, unusual, threatening thing, he encouraged people to come out so that bigots could see that gays were normal people all around them. When bigots saw homosexual culture as an invasion of their neighborhood, he showed it also brought economic benefits ("You don't mind us shopping at your liquor store." "We both pay taxes for your child's school").
Can you explain to me how you think protests work to change policy? If all they are is simply trying to yell your opinion louder than your opponent, why should people in power care? If protests don't persuade anyone, what's to stop everyone voting for the same people and getting the same bigots in power? If our only tool is escalation, they'll just escalate back, and they can escalate further because they have guns. :)
> If you want to waste time engaging with trolls have at it. You will find that these people have nothing but contempt for discussion and no interest in being swayed by logic. For the rest of us we have far better things to do and banning trolls and bigots is the obvious choice.
If by trolls you mean people who are saying inflammatory stuff to enrage people for their own entertainment, sure, engaging with them only entertains them.
But if you're talking about people who are just trying to live their lives and think that bigotry is the way to do that, I very much doubt you have tried talking to these people, because this has not been my experience at all. If you approach talking with someone about their bigotry as if they were a human, with compassion, and address the actual fears and hang-ups that cause them to be bigots in the first place, people do change. It doesn't always happen quickly or at all, but sometimes it does. And more importantly, I've never seen it happen any other way.
I haven't found it particularly worthwhile to distinguish people who are saying terrible things to troll from those who believe them. They're very often the same group, because reasonable, empathetic people are going to neither say nor believe those terrible things.
If you cannot carry a conversation with consideration for the other people in it, I do not want you in my community.
> There were plenty of reasonable conversations on Reddit before Conde Nast took over and dropped the banhammer.
I perceive Reddit as a very good example of the kind of community I _don't_ want, because any discussion that is deep, complex, or otherwise not aligned with popular opinion is impossible there, so I'm afraid we're at an impasse.
> But the alternative you're proposing is an echo chamber
Luckily, I haven't proposed an alternative, so it's rather interesting what imagined alternative you're making that statement about...
> If you cannot carry a conversation with consideration for the other people in it, I do not want you in my community.
Okay, you can want whatever you want, and I understand why you want that. I also have the gut reaction, when someone says something bigoted, to avoid the person so I don't have to see it, or to respond with vitriol and ostracization, because that's what feels good in the moment. But if people continue to insist on putting their heads in the sand and taking actions that feel good rather than actions that actually address the problem, these problems are only going to get worse.
> I perceive Reddit as a very good example of the kind of community I _don't_ want, because any discussion that is deep, complex, or otherwise not aligned with popular opinion is impossible there, so I'm afraid we're at an impasse.
I have had fairly in-depth conversations and said plenty of unpopular stuff on Reddit all the time, so I'm not sure what you're basing this on.
> Luckily, I haven't proposed an alternative, so it's rather interesting what imagined alternative you're making that statement about...
You said, "If your community is ignoring trolls instead of banning them, it's effectively sending the message that the trolls are very welcome."
I'm not sure how you go from "I don't want trolls in my community" to "it just feels good, you're putting your head in the sand". I'm not putting anything anywhere; I know exactly what I am doing. I don't want trolls in my community.
> these problems are only going to get worse
Not in my community they won't.
> I have had fairly in-depth conversations and said plenty of unpopular stuff on Reddit all the time, so I'm not sure what you're basing this on.
This is a subjective thing, obviously, but it's not like it's some new sentiment I made up; plenty of people find Reddit a terrible place to have interesting conversations. One thing you'll see mentioned often is that shorter, less complex posts tend to be liked more than longer, more complex posts that take a lot of effort to write.
> You said, "If your community is ignoring trolls instead of banning them, it's effectively sending the message that the trolls are very welcome."
Which is not a proposal. It's a statement of consequences. A proposal looks like this: "To have a well-functioning community, you need to have this, this, and this, and not that." I've said nothing of the sort. Communities are complicated and require design, and there's a lot of variety within communities besides just "free for all" and "echo chamber".
You have a conversational style which seems to like to presume that the person you're speaking to is doing something they never claimed they're doing (keeping their head in the sand, or suggesting an echo chamber), which might be why you find Reddit tolerable, because this is very much the kind of interaction I find really annoying and could do without. It's always easy to feel right about everything if you just put words in the other person's mouth.
I have never seen it play out that way in practice.
> I've never observed that a no-moderation community is free of trolls
I've never observed a community with moderation and no trolls. You will never eliminate them completely, so the better approach seems to be to ignore them and just ban outright spam.
And I have observed an almost unmoderated community that had trolls; nobody cared about them, and everything was fine.
> very large amount of people simply do not enjoy ...
I think you are doing a lot of projection here. Maybe people agree with you, but considering how broken your arguments are, I would not take your sweeping generalizations seriously.
If we treated, say, driving a car the way people treat Internet discourse, you would be dragged out of your car and stoned to death the first time you cut someone off.
Yes, sure, ignore posts that seem like trolling. Filter or block them, even. But perhaps you could give the people behind those posts a second chance, before writing them off for life for one post.
On the other hand, I would fully support a law that put a temporary suspension on people's driving licenses if there were a reliable way to tell that they have a habit of cutting other drivers off. And make it a permanent ban on operating any kind of vehicle for repeat offenders. Again, it would be greatly inconvenient for them, but there must be some point where the rights of the public supersede the rights of individual assholes.
Oh haha well done...
This isn't anything visually graphic, but the opinions expressed in this subreddit make me believe that these people are preparing for a violent insurrection, full stop.
CBTS stands for the Calm Before The Storm. These people follow an anonymous online poster named "Q" who posts vague, short posts on some other site (not sure where), and then lots of other subscribers repost these and form general conspiracy theories that all revolve around the deep state, the NWO, and other nefarious groups colluding to remove Donald Trump from office. It's not your standard fare related to the ongoing investigations in Congress or the special counsel; these people are the dangerous combination of paranoid, gullible, and angry. As an outsider just perusing, it's obvious that this place is crawling with charlatans and con artists who understand that they are addressing a crowd of people who are more prone to believe an idea simply because of the tantalizing ramifications if it turned out to be true. Anyone can spout the most harebrained idea, and three people will show up with vague, outrageous stories that confirm it.
I don't have an answer, but surely there must be a level of fomenting anger and general mob action that deserves some sort of modulation/regulation.
What makes you think so? I've looked at it (admittedly, I didn't spend too much time) and it looks like pretty standard fare for a subreddit, or any other forum like it. Its slogan is "BE LOUD. BE HEARD." People who are preparing a violent insurrection don't want to be loud and heard. They want to be silent and invisible until they have enough people and materiel to overthrow the government. People who want to participate in a democratic debate want to be heard. There's no point in being heard by the other guy if the next thing you're planning to do is shoot him (well, maybe if you demand surrender, but I see no such demands there, and it would be weird to do it on Reddit). The only point I can see in being loud and heard is to convince somebody, or at least gather support, e.g. for winning an election, or for pressuring an elected representative into doing something by showing them how many people demand it. All that is part of the normal democratic process.
Even if the people there hold some unacceptable views (I have no idea if they do, but even if we assume for a minute that they do), that doesn't mean they are planning violence. What is the evidence that they are?
I see a lot of explanations, especially on the left, of how expressing certain views is akin to violence. If we had tons of actual violence happening, or clearly imminent, we wouldn't need any speeches about how words are similar to violence. It would be clear to us that there's actual violence, and there would be a lot of pointing to it instead. So I take it as a sign that there's actually not much violence to be pointed at, if we are pointing at words instead.
I realize that there is a spectrum of using psychology to influence people, and I can't tell you where the line is of too far, but can we agree that the line exists?
Oh I don't say you do. I am saying the need of so many people to do it suggests there's a distinct lack of violence to point at, otherwise they'd be pointing at it, instead of pointing at words. And since these people are highly motivated to find anything to point at, their failure to find it suggests maybe there's indeed not much political mob violence to find.
> The level of fervor I see on some of these subreddits tells me that someone is programming these people.
For some definition of "programming", maybe. But for that definition, everybody who debates on the internet "programs" everybody else participating in the debate. It's just a nefariously sounding way of describing mundane things, just like writing "contains chemical compounds!" on food packaging.
> People are using the tools of psychology to achieve goals that would not be achievable otherwise
Not sure what you mean by "otherwise". People communicate. Some of them use knowledge of human psychology to make their message more persuasive. It's not something that appeared today or yesterday or this century or this millennium. Is it harder to convince somebody of something if you ignore human psychology? Of course. But there's nothing nefarious about it; it's like saying "people are using tools of physics and chemistry to achieve goals that would not be achievable otherwise". Sure they do, all power to them! That's why we spend all the big bucks financing the science!
> My point is that there is a singular event, Trump leaving office prematurely, that large groups of the right will inevitably interpret as a coup.
That would largely depend on the manner of said leaving, I'd assume. For example, if he becomes gravely ill or suddenly dies, that sounds unlikely. If Democrats win a majority in the Senate and House in the next election and immediately decide to impeach Trump "because he's bad", without proof of any real crime committed, that sounds much more likely. But that would be a consequence of highly inappropriate behavior resulting in a loss of trust in the democratic system by citizens. The cure for it is not to behave like that. If there's no such behavior, then history shows there would be no significant violence. I've heard rumors that Bush would cancel elections and institute martial law, then that Obama would cancel elections, and no doubt I'll hear about Trump cancelling elections, and then whoever is elected after Trump will cancel elections too. There's always talk about this, because it's easy.
> I realize that there is a spectrum of using psychology to influence people, and I can't tell you where the line is of too far, but can we agree that the line exists?
Not really. There's no "using psychology" but plain old persuasion, and no persuasion is "too far".
Well, of course, if you use violent methods like torture, you can also achieve psychological effects, but if we're talking about persuasive speech alone, then there's nothing "too far" in that. There are no words that can make robots out of people, and in fact convincing somebody to change their mind on a political question by just throwing words at them is really hard. Possible, but hard. People may be "programmable", but not very easily. Usually if they become convinced of something, there are a lot of reasons for it and a lot of background for it, not just some nefarious article on some forum.
How do you explain cults? How do you explain the effects of advertising? How do you explain the uniform levels of discipline achieved by basic training? How do you explain phone scammers? How do you explain the success of the public relations industry? How do you explain Bernie Madoff? How do you explain cigarette smokers? How do you explain the effects of what we refer to as echo chambers? Every one of these consists of people being programmed or brainwashed in one way or another.
You hear the word brainwashing and immediately think of someone that's hypnotized, or a zombie, the typical Hollywood trope. But it's a far more common thing.
If you're interested in reading about this, there is a book by a psychologist named Robert Cialdini called Influence: The Psychology of Persuasion.
One interesting persuasion trick is to start with extreme opening bids in negotiation, and then back off to what you really want. This is how the actors of Watergate were able to convince others to go along with the plan to break into the Watergate. The original plan was far more involved, with a $1,000,000 budget, and included kidnappings. G. Gordon Liddy used this as an extreme opening bid, and eventually convinced everyone that what eventually took place was a reasonable compromise. After all, it's not like they kidnapped anyone, and they only needed $250,000.
Take Scientology for instance. I have access to the whole of their printed words online, yet I am not a Scientologist. You wouldn't be either. You can sub out "Scientology" for any other cult or any other odious group (including race nationalists, terrorists, etc.) and the statement remains true.
The reason being that nothing is being engaged here other than reading, writing, and thinking. No money is changing hands, no leader is demanding my obedience at literal or metaphorical gunpoint. In the case of a web forum the absolute worst thing that could happen to me is that I'll get rude comments or be unable to participate further.
The only difference between discourse and propaganda is the aim of the people doing the talking. This is a subjective value judgment on what you think of the speech, and when it comes to analyzing it, this amounts to noise. Are you trying to propagandize at me? :)
Narcissists and con artists don't need to threaten violence to achieve their means. The threat of violence is simply one of the most effective methods of placing someone under emotional stress. It is once these people are under emotional stress that their defenses are down, and they are vulnerable to brainwashing.
I don't think it's controversial at all to say that both Facebook and Reddit are brainwashing people. Not to say that the companies themselves are doing the brainwashing, just that they are effective platforms for anyone to do it on. Do you really not know anyone who literally lives on one of the two sites? They are both highly addictive echo-chamber services that people willingly return to, hoping that the next page load will contain the unicorn story that confirms everything they want to be true. It's the dopamine cycle that makes social media interesting in the first place. These are two of the four most visited sites in America. I can't control for people's previous experiences, or say that Reddit or Facebook are the exclusive causal factors, but I think that just the fact that they are excellent at getting users to self-select for interesting content, at the expense of content that may challenge their opinions, is enough to count these services as brainwashing platforms. People are figuratively screaming into the void "I want to be entertained!!!" Trump answered the call. The repeated dopamine cycle of discovering outrageous content, and then eventually petering out to boredom, is the stress. Once you are sufficiently stressed out, you are far more susceptible to believing that the media is lying, the FBI is biased, and the intelligence community and government at large are filled with evil actors with their own agenda. I mean, how interesting would that all be, right?
Another point to consider is that I don't necessarily think any one organization has caused all of this to happen. Internet addiction has obviously existed nearly as long as the internet. My theory is that Facebook and Reddit effectively teed up millions of people for someone else to come along, and capture their minds.
As I've said throughout this thread, I'm not suggesting a minority report situation, or that circumstantial evidence should depose a president, or that censorship is the answer to any of this. What I'm more lamenting is that the charlatans are winning, and their methods are nearly impervious to defense in a modern democracy. As best I can tell, all I can do is try to convince people of what I think is at play, and hope that it resonates.
Re everything else: This is a reply that doesn't do justice to the effort you put into it, but I think with your definition of "brainwashing", we've made that term functionally useless, and I think bringing partisan politics into it apropos of nothing has made any further honest conversation on this matter impossible.
Keeping the original topic in mind, we're talking about online comments. Not cults, not marines, not anything other than words on screens.
Once someone has become afraid of something, it doesn't just vanish after the event that caused it is over.
People have a strong drive to belong to an in-group. There are multiple experiments showing that assigning random markers to random people and making them participate in certain activities leads to "group cohesion" effects, and people start assigning deep meaning to those markers despite them being completely random. A cult is just when people take this to the extreme, likely because they didn't find satisfaction for their in-grouping drive elsewhere.
> How do you explain the effects of advertising?
Which effects, specifically? It's mostly brand recognition, aka the availability heuristic: if you have brands A, B, and C and you heard before that A is good, you're more likely to choose A than unknown B and C. And just plain informing people that things like brand A exist. And a touch of in-grouping ("if you drink Coca-Cola, you are in a group of cool people"). And a tad of signaling ("if we have money to buy an expensive ad spot on TV, we must be a successful company that can afford to create a good product, would not disappear tomorrow, and values its reputation, so we would not cheat you").
> How do you explain the uniform levels of discipline achieved by basic training
They are not that uniform, but again, in-grouping plus the fact that the other guys will literally be shooting at you (though research shows a lot of this shooting is much less targeted than previously thought).
> How do you explain phone scammers? How do you explain the success of the public relations industry?
That's plain persuasion, with various ethical fences either present or absent.
> Everyone of these consists of people being programmed or brainwashed in one way or another.
Again, if by "programmed" you mean "persuaded of something they were most likely inclined to believe from the beginning due to selection and grouping effects", then sure. People can be persuaded to buy a shampoo, especially if they wanted to buy one already; people can be deceived, especially if they're already out looking for something the deceiver seemingly offers; and people can be blind to arguments, especially if those arguments challenge their prejudices.
The only thing different here is tobacco smoking - that's physical addiction, it is a different mechanism.
> If you're interested in reading about this, there is a book by a psychologist named Robert Cialdini, called Influence, the Psychology of Persuasion.
Yeah, I know about Cialdini. It has a lot of nice tricks, but it's not magic. It's much less magic than it's made out to be. And yes, reframing and anchoring is one of the tricks. If you watch Trump carefully, you can see all these tricks employed; he does it all the time. He didn't invent it, of course: expensive stores have been putting items with outrageous prices on prominent display for ages, to make regular item prices seem lower in comparison. It does confuse some heuristics for people. But anybody is capable of approaching the prices, or Trump, rationally and seeing the actual price. There's no "programming" to prevent it; if one makes minimal effort, one can always do it.
My point is that there are large swaths of conservatives that have been radicalized by the last 25 years of Fox News and Rush Limbaugh. And now they are being told that mainstream media is lying about everything.
It's so complete in many people that they don't hear themselves when they lay out their political motivations. The number of people whose primary goal in achieving any particular political end is to anger their political opponents is staggering. It's like a catch-all: if someone can't be convinced of the traditional conservative stance on an issue, they can fall back to "well, at least liberals will lose their minds".
This radicalization is primarily the cumulative effect of conservative media and its vilification of the left.
Cults can't exist without a confident authority figure dominating the flow of information to the adherents. These things don't just happen out of a vacuum, without a leader in on the scam. Just because reading Dianetics is not 100% effective in converting people to Scientology, that does not mean the book isn't an effective tool of persuasion that, combined with other effects like in-group psychology, can get people to join a cult that will bankrupt them without a second thought.
With regards to advertising, I mean the use of sex, patriotism, or other emotionally charged concepts to trigger positive associations with the subject being marketed. I don't claim that all people are impervious to all instances; I mean that enough people are susceptible to it, and it has persisted over a long enough period of time, that it has produced a significant number of people with seriously warped views of how government works, how the 20th century played out, and what the powers that be are planning to execute imminently. I think there are literally people out there being primed to support whatever totalitarian aspirations Trump may have, and that they are being convinced of the righteousness of his cause. Do you really think that hundreds of thousands of people are being facetious when they refer to Donald Trump as "God Emperor"?
With regards to basic training, the effectiveness can be attributed directly to the process I describe. Convince people that they are in danger for long enough; reinforce this with threats and screaming. After the people are sufficiently scared, give them a path to escape the danger. Apply the imaginary danger in proportion to how far people stray from your prescribed path. You posit that since the danger is controlled and not as real as the grunts are led to believe, the effect is somehow negated. It works in the military because you are isolated from any contrarian information by the military itself. There's no source of information to tell the trainees that it's a controlled, safe environment. It works in political echo chambers because the people have isolated themselves willingly. There's no _trusted_ source of information to temper the vitriol from the echo chamber.
Your model of people and how they take in and react to information ascribes a lot more agency and rational decision making than what I posit. Your argument is that since everyone technically has the tools available to them to become educated enough to not fall prey to devious persuasion, devious persuasion is automatically defanged, because all people avail themselves of all education.
Tobacco smoking, specifically nicotine addiction, is not a different mechanism at all. In fact it is a highly illuminating example that shows how insidious constant bombardment with persuasion can be.
Nicotine has only the slightest physical withdrawal symptoms. The cravings smokers exhibit are a product of the brainwashing. People don't wake up in the middle of the night from cigarette cravings. They've convinced themselves that smoking a cigarette removes stress, instead of ensuring its perpetuation. When they crave the cigarette, they fantasize about how much they will enjoy it, but when they actually smoke it, addicts are surprised 20 times a day to find out they don't enjoy it at all. But this surprise doesn't free them from their mental prison. They remain of the opinion that cigarettes modulate some completely unrelated stress in their life. Think of talking to a smoker and just telling them how dumb they are, by repeating obvious facts that we all agree upon. How does that work out for you? They immediately put up defense mechanisms and dig in. But in their private moments, they know that everything you said was true. I posit that many Trump supporters are in the same predicament. Faced with the prospect that their entire world view is wrong, that they aren't the woke geniuses they've assured themselves to be, that they are just Bernie Madoff investors, tools of Vladimir Putin, the choice to double down on supporting Donald Trump is no choice at all.
Trump supporters are addicted to him like nicotine, and they will irrationally defend the nicotine even as they cough and wheeze, because after all, what's more emasculating and cuckold-like than admitting you've been catfished by a charlatan to the very people that have been screaming this fact at you for over a year?
I'm not familiar with that subreddit, but I think I'm familiar with the mindset you are describing. Nothing in your description seems to set it apart from the "variety of opinions" model the GP proposed. Other than your feeling that these ideas are "beyond the pale" what exactly differentiates this subreddit from being a group of people whose ideas you disagree with?
> These people follow an anonymous online poster named "Q" who posts vague, short posts on some other site
I thought the comments by Occams-shaving-cream in this thread were a good take on Q: https://www.reddit.com/r/conspiracy/comments/82qpk5/in_case_.... He suggests that Q is essentially a marketing team within Trump's campaign, casting out ideas and seeing what resonates with potential voters. He hypothesizes that it may not have started this way, but given the obvious utility of having such a mechanism, it has likely become such by now.
> I don't have an answer, but surely there must be a level of fomenting anger and general mob action that deserves some sort of modulation/regulation.
Maybe, but it seems possible that attempts to censor discussion may paradoxically fan the flames. Given a group of paranoid conspiracy theorists who believe that the "powers that be" want to silence them, "modulation" seems equally likely to make them believe even more fervently that there are powers who do not want certain knowledge to be known. And they're right! The difference would seem to be only that they believe this knowledge is truth, and you believe it is a dangerous lie. Public discussion seems like the best way we have of resolving this dispute. Trying to stamp out an idea like this seems more likely to produce violence than to prevent it.
Did you know that this phrase originally meant "outside the reach of English law"? https://englishhistoryauthors.blogspot.com/2013/03/the-origi...
I guess the crux of my argument is that these people are the victims of psychological warfare. I'm not throwing out the word brainwashing for emphasis, but I truly believe a lot of these people are actually brainwashed. To give context, I don't believe its controversial at all to say that boot camp in the American armed forces, and most assuredly other countries as well, is brainwashing, through and through. Put people in a prolonged state of stress. After sufficient time, tell these people you can end all of the stress if they just follow your directions. Run till you puke. Have trained soldiers screaming in your face. Be woken at all times of the night to both run till you puke and have trained soldiers scream in your face.
Instead, many conservatives in America have been fed a constant diet of outrage/dopamine cycles about how evil Bill and Hillary Clinton, Barack Obama, and anyone associated, are. They're given 30,000 emails to peruse that are eventually framed to "unveil" a pedophile ring tied to all of the current players in the democratic establishment.
Then most importantly, a billionaire with no political experience, no political capital to lose whatsoever, comes along and trolls, and proves wrong, nearly the entire national media for over a year. Nearly every week, he makes some offensive remark that leads every veteran of any election anywhere to believe he has committed political suicide. Thus, a year and a half of reporting that Trump will soon quit the race and has no chance to win. But since, unlike all of those previous politicians, Trump has no "betters" to please, no one in American politics had anything with which to pressure Trump to do anything he didn't want to do. So when Trump won the election, the brainwashing was complete. Trump had led them out of the "prolonged stress" of Barack Obama's 8 years, and the year-and-a-half-long prospect of Hillary Clinton being President for 4 years after that. Everything he said about the lying media turned out to be "correct". Witnessing Trump win under the unique circumstances in which he won has had the psychological effect of making him a nearly god-like figure in the eyes of his base. He has cut through political correctness, sexual assault claims, and ethics concerns, taken his red meat to the Supreme Court, and won. In the eyes of someone already inclined to pull for the guy on their side, Donald Trump became nothing short of Luke Skywalker. If you think this is hyperbole, here's another example of how people can be brainwashed by witnessing uncanny success against all odds, especially when that success stands to improve the brainwashed's lives immediately.
There is a known email scam where the scammer emails a sufficiently large pool of marks the winning team for a single National Football League game every week, ahead of the game, so the marks can bet on it. Every week (out of 16), the scammer simply takes the group of people who "won" the previous week's game and splits them in half, telling half that Team A will win, and half that Team B will win. Obviously, no one will listen to someone who picks losers, so assume that every week, half of the marks leave the scam. If the pool is sufficiently large (only 32768), after 15 weeks there will still be marks who have been given the winning team ahead of time, for 15 consecutive weeks, by some anonymous stranger. It's not controversial to see why those marks may have been brainwashed by the process, and would be willing to pay large sums of money for that 16th pick.
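The halving arithmetic behind that scam is easy to sketch. A minimal illustration, assuming (as the scam requires) that exactly half the remaining marks are told the winning team each week, so the other half drop out:

```python
def marks_remaining(initial_pool: int, weeks: int) -> int:
    """How many marks have received only 'winning' picks after the given weeks."""
    pool = initial_pool
    for _ in range(weeks):
        pool //= 2  # the half told the losing team leave the scam
    return pool

# 32768 = 2**15, so after 15 weekly halvings exactly one mark remains
# who has seen 15 consecutive correct picks from the scammer.
print(marks_remaining(32768, 15))  # prints 1
```

The scammer's only cost is sending email; to manufacture n weeks of "perfect" picks for at least one mark, the starting pool just has to be at least 2^n.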
Obviously, Donald Trump didn't run this scam, but it goes to show that people can be casual observers of information, and be presented with just the right information to be essentially brainwashed.
Nonetheless, I don't want to censor your ideas. Let people make their own minds up. McCarthyism censored communist ideas because they were too dangerous and people might get brainwashed. Was that a good idea too? I thought the whole free speech ideal of America was to keep political ideas out in the open, where they can stand and fall on their own merits, rather than silencing them and breeding the groups of violent supporters you get in Venezuela or Egypt, which leads to revolution after revolution.
For an example of what might be a dangerous idea of the left, how about the popular one that blacks are poor because whites oppress(ed) them? That sounds like a recipe for endless failure. Don't work to improve yourself; just get angry at the bogeyman. They did it in Zimbabwe and it ruined them. They're doing it now in South Africa. And American blacks are listening to the left's mantra of oppression too. All it can do is make people angry and hateful. What if it leads to race riots?
I realize that the topic of brainwashing is loaded enough that any rational discussion of it can easily be derailed by someone simply making the strawman you just made, comparing my arguments to claiming the end of the world or the collapse of America.
Isn't it ironic that you can easily spot a culture of victimhood in others, the psychological effects, the futility of it, but I imagine you don't see the same thing in your average revanchist conservative? The level of hatred and frothing at the mouth over 8 years of Barack Obama, fed by people like Donald Trump, Sean Hannity, Alex Jones, and other charlatans created a victim complex that seems to persist even now in many Trump supporters.
Let's break it down point for point:
- lopmotr framed my comments as a conspiracy theory. I retorted that there is obvious evidence all over reddit that supports my claims.
- lopmotr brought up examples of victimhood as causes of the decline of states, and drew a line to what's happening in America with blacks as equivalent. I retorted that conservatism has this same victim complex, and that it's ironic that people can see it in others much more easily than in themselves.
You, on the other hand, compromised your otherwise reasonable argument by flippantly claiming that approximately half the US voting public is "willfully ignorant". This is gratuitous name-calling, and el_cid was right to call your attention to it. I didn't vote for Trump, but I have intelligent friends and relatives who did. Regardless of whether their choice was wrong, insults like this are counterproductive to changing anyone's mind. So stop it.
Personally, I think you are right about much of the behavior you see on the right, but you seem to be missing (or at least not mentioning) the equivalent online psychological tricks that mislead the left. As your penance, here's a story from someone on the right detailing how he sees some of the matters you are referring to: https://imprimis.hillsdale.edu/the-politicization-of-the-fbi.... I thought it was an interesting read that I haven't seen in the mainstream press.
I developed the roots of this idea over December '16, after several conversations with one of my lifelong friends I hadn't seen since a few months before the election. He's an attorney, and someone I know to be extremely bright. He voted Trump. The level of spite and schadenfreude in all of his arguments, and his repeating the phrase "I've never been so sure of anything in my life" in regards to his confidence in Trump fulfilling his campaign promises, was very jarring. All of these traits were completely foreign to my friend before Summer '16. He hadn't gotten less intelligent in any other avenue of his life.
I only came to this theory through the realization that susceptibility to public relations tactics, weaponized persuasion, whatever you want to call it, is not a matter of intelligence at all.
I think you should focus on continuing to build your case on reddit and then you should definitely return here in a few years once its ironclad!
"This subreddit was banned due to a violation of our content policy, specifically, the prohibition of content that encourages or incites violence and the posting of personal and confidential information."
I can't think of any non-contrived situations where those behaviors aren't equally damaging to social bonds in any situation, whether it's an online community or a government or a family or an organization. They all reduce enlightenment, are always irrational, and always increase human misery, even if it's just a teensy bit in the most benign cases.
All of these are products of the failings/weaknesses of humanity, not causes of them. Their presence is to be expected and crops up in everybody, not just in some "toxic" subset of people that can be excluded.
In Robert Sutton’s book, “The No Asshole Rule”, he describes what it takes to be a “certified asshole”:
> A person needs to display a persistent pattern, to have a history of episodes that end with one “target” after another feeling belittled, put down, humiliated, disrespected, oppressed, de-energized, and generally worse about themselves.
Put another way, in the series “Justified”, Raylan Givens opines (paraphrasing here):
> If you come across an asshole in the morning, well, you just met an asshole. If you’re coming across assholes all day long, maybe _you’re_ the asshole.
Assholes at work create a genuinely toxic work environment. People get sick, quit, and even commit suicide.
It can be argued, with some merit, that this differs from the Internet in that the assholes are usually in a position of power to abuse their subordinates, while on the Internet - at least in chat rooms and the like - people can withdraw from hateful environments by just closing the tab on the browser.
That’s not my point, though; I’m asserting that people who show a “persistent pattern” of promoting hatred of particular groups, inciting violence, and convincing people of harmful information through lies, half-truths, and myths, deserve to be labeled as “toxic”, and can be far more dangerous to society than a common or garden corporate asshole, because their messages can - and do - influence millions of people towards antisocial or, at very least, irrational thoughts and activities.
People with very controversial beliefs (and in a free & diverse society, everything is controversial in some relative axis, hence we need to extend measures of freedom to each other) can still act civilly, or they can flail and be problematic on forums. That's not a feature of their beliefs, but a feature of their behavior (stubbornness, arrogance, etc). People who have fully "correct thinking" in some scope can also be disruptive, poorly behaved members of discussion-based communities.
In literally any part of reddit, if there is any post on anything that could be construed as racial or about any gender, even innocuous, hordes of extremist right wing trolls descend upon it and spew horrifying screeds of hate. They abuse people into silence. They are toxic. These things leak. And toxicity is real. Hate begets hate, saying nasty horrible things to people and advocating for genocide are not innocuous "beliefs that other people might have" they are unacceptable behavior in civil society.
If you do this in real life you are ostracized, beaten, you lose your job, you are abandoned by your family and friends. And this is good. Social signals and actions to prevent "toxic" behavior have existed since forever. But the Internet is the property of a few companies who are loath to enforce those same social rules. Sometimes, they get pushed far enough they feel have to.
Edit: The exception, of course, is if the behavior that is at issue is actually encouraged by a foundational belief of the group.
I'm glad you added this, I agree with you in general. I'm not interested in "other-ing" right-wing people, Republicans, moderates, conservatives, but I'm very interested in "other-ing", e.g. neo-nazis or Klan members. I am not worried about neo-nazis becoming more extreme (? is this possible?) and I also am not willing to let their sensibilities or concern for their feelings dictate any part of my or society's behavior.
Consider these, generally, hygiene factors.
There are substances which are toxic in specific concentrations or circumstances, which are otherwise healthy: oxygen, CO2, water, vitamin D, salt, nutritional iron. Certain forms of discourse.
My view increasingly isn't that there are things, but interactions or behaviours. A thing isn't, but is what it does or how it behaves.
(This ... tends to simplify numerous ontological questions.)
The "belief in things" is itself a cognitive simplification we make, probably for the sake of efficiency. To use a programming analogy, toxicity as a concept is a function of at least three arguments - toxic(what, to-what, context) - but we attach it as a label to the first argument and store it there.
Compare e.g. with beauty, itself a function of at least two arguments - beautiful(what, to-whom). But we usually assume to-whom = "human like me", and stick the whole thing as a label on a thing, because 99% of the time, that's the correct thing to do (incidentally, the quote "beauty is in the eye of the beholder" is literally a reminder that the concept of beauty is a function of multiple values).
Confusing the "arity" of concepts seems to be the cause of quite a lot of misunderstandings between people.
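The arity point can be made concrete with a minimal sketch. The function names and the lookup table below are entirely illustrative, not real toxicology; the point is only that the answer depends on all three arguments, not on the substance alone:

```python
# Toxicity modeled as a three-place relation rather than a one-place label.
def toxic(what, to_what, context):
    # Illustrative cases only: the same substance flips between toxic
    # and benign as the other two arguments change.
    cases = {
        ("oxygen", "human", "normal pressure"): False,
        ("oxygen", "human", "high partial pressure"): True,
        ("salt", "human", "trace amounts"): False,
        ("salt", "human", "large dose"): True,
    }
    return cases.get((what, to_what, context), False)

# The everyday shortcut collapses the relation into a label on `what`,
# silently fixing the other arguments to defaults like "human":
def toxic_label(what):
    return toxic(what, "human", "large dose")
```

Calling `toxic_label("salt")` returns `True` while `toxic("salt", "human", "trace amounts")` returns `False`, which is exactly the information the one-place label throws away.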
"Things" with some bounded shape or form (or other properties) may be related to perceptual apparatus.
We see, or hear or smell or taste or feel, etc., the boundaries, emissions, or other perceptible manifestations of objects or phenomena. If you will, those are their interfaces.
As in other domains, an interface may reveal, or conceal, some more complex back-end, inner working, or larger system.
As in, beauty(who, to-whom, &optional context) confused as beauty(who).
I've been thinking on Aristotelian categories over the past couple of years. Thought occurs that all of them are relations, though with varying degrees of dependence on the observer.
Related, a ~1835 essay on value by W.F. Lloyd notes that all value is relative. That's not a universally held view, but is, I believe, correct.
I agree that people get considered toxic because they're expressing views that are unpopular within the group in question. All groups have their sacred cows and enemies (where the belief in the truth of these views is often spread socially rather than developed intellectually), and people tend to not look kindly upon someone going against the standard line.
The second thread is that toxicity is not just a matter of the opinions being expressed but also the way they're being expressed. Eg when people are going out of their way to be rude to others, or where they're not arguing in good faith.
Small amounts of copper are required in your diet to keep your body functioning. However, large doses of copper are toxic.
Same goes for most drugs.
There seems to be tap dancing around the issue that reddit is a 1960s Playboy magazine fifty years in the future. There are just enough excellent articles to keep the advertisers amused, but the fact has to be faced that 99% of the traffic is young men looking at scantily dressed young women. You need a little submarine PR, staying controversial about subreddits nobody statistically reads, to keep things looking legit, whereas all the traffic and money is over there at /r/randomsexiness.
Just like the old saying about Playboy, I only read it for r/ama not the nekkid ladies.
I'm not complaining; I'm just pointing out that reddit is THE most successful pr0n site out there with the most brilliant strategy I've ever seen. Please don't confuse it, and its achievements within the pr0n industry, with legacy news media or anything like that.
While there is doubtless overlap that influences demographics and discussion, that influence does not preclude non-porn subreddits from relevance or discussion.
I completely agree, there are zillions of different experiences depending the subreddit and many of them can amaze you. For example, I have asked very specific questions in subreddits like sysadmin and networking on how AWS and Google Cloud handle layer 2 network protocol like Ethernet, and received the correct answers that I never received posting on the specific AWS or Google Cloud groups outside Reddit.
I suspect in my case that this has a lot to do with how people who are browsing porn use Reddit, versus how people use the rest of the site at large. Most subs moderate posts and discussion actively and have norms (and moderators) that act against posting frequently, cross-posting and self-promotion. On the other hand, most porn subreddits are quite happy to accept x-posts, self promotion, etc, because their goal/approach is to provide more porn faster.
This seems to be an emergent property of Reddit rather than a deliberate strategy, but they seem happy enough to keep the revenue coming (as am I).
That's a thought-provoking way of putting it, though I'd add that the /r/gonewild subreddit and its variants make it considerably more meta. A lot of people, particularly females, are posting explicit imagery of themselves, revealing a hidden culture of exhibitionism, as the vast majority of them do it for little profit beyond comments and upvotes. There are people on Reddit who do actually make money from posting explicit content, but the vast majority of users who do it are in it for a sense of self-worth from the "updoots" and the thrill of exposing themselves relatively anonymously.
Taking your observations about advertising into account, it's a complex ecosystem. The closest thing I've seen to Reddit is Usenet, but in an age where digital cameras are ubiquitous.
No offence intended but could it be that well-proportioned, model-like subjects are more likely to have their profile viewed, so it's a case of sample bias?
While no, I don't have any hard data, so my points are anecdotal, I tend to be disinterested in subjects that are more model-like, as the point for me is to look at everyday people getting their kit off. And from what I can tell, those everyday people make up the lion's share of self-posted content.
Not only do many females seem to enjoy exhibitionism on subs like /r/gonewild, there's also a not-so-hidden culture of exhibitionism that takes place in real life that "strangely" seems to have so far escaped being raised in the ongoing gender wars discussions. Human beings are very complex beings.
I'm not sure what discussing a lack of clothing or dressing provocatively would achieve - it's never an adequate defence for poor behaviour or assault by a third party. Any nuanced debate about the topic is difficult and likely to be a no-win scenario.
One of the few reasonable points to be made is that revealing clothing can be inappropriate in professional settings, however I think other females need to be the ones who encourage appropriate attire in the office. A male delivering the message would lead to anger and resistance as it'll be viewed as a form of control rather than something formed from consensus.
This might be an overly simplistic way of looking at it, but female exhibitionism is not dissimilar from a guy flashing how much power and resources they have. It's a signalling system, and it attracts both wanted and unwanted attention, but in a civil society assault and harassment are never acceptable no matter how provocative someone's behaviour might seem.
Well, no, wearing revealing clothes isn't considered sexual assault when done by either sex. Though I suppose if you mean that male public upper body nudity is treated as acceptable whereas female public upper body nudity is treated as criminal and a sex offense, but were just using slightly hyperbolic language about “sexual assault”, you'd have a point.
Though that seems to be in the opposite direction of the fantasy you are trying to sell.
I'm not seeing apples for apples in your argument - it'd be valid to say the popular female opinion (whatever that is) is hypocritical if their stance was that men couldn't wear revealing clothing, but you're making the point that they're hypocrites for wearing provocative clothing and then being upset if they're treated indecently or worse. You could argue their exhibitionism is inappropriate, but that is subjective, shaped by culture, and entirely different to saying their position is hypocritical.
If I'm rich and wave my money as I walk down the street, it's still a crime to rob me even if you feel as though I was asking for it.
Isn't that their stance? If a man went out in public with 1/3 of the flesh of his penis showing, everyone would be cool with it? The reality is, he'd be arrested for indecent exposure. In this case, the equivalent of what women do on a regular basis is quite literally illegal.
> but you're making the point that they're hypocrites for wearing provocative clothing and then being upset if they're treated indecently or worse
No I'm not.
> If I'm rich and wave my money as I walk down the street, it's still a crime to rob me even if you feel as though I was asking for it.
I'm not saying otherwise.
This conversation is actually not a terrible example of my overall point.
> If a man went out in public with 1/3 of the flesh of his penis showing, everyone would be cool with it? The reality is, he'd be arrested for indecent exposure. In this case, the equivalent of what women do on a regular basis is quite literally illegal.
I don't understand what the specific double standard is that you're alluding to. I don't think a woman with exposed genitals is going to escape an indecent exposure charge.
Upthread you liken men wearing revealing clothing to sexual assault, but I've never seen this as a talking point in the gender discourse.
Then again, these young men probably aren't paying attention to the ads (unless maybe they're porn ads!), so those subreddits aren't a target audience.
> 100 to 1000 times the sheer posting volume
> 99% of the traffic
Also, just to clarify - do you consider porn something not legit or bad? Your tone suggests it, but I may have misread, so just want to make sure. If so, why?
Because I've noticed that the amount of posting on mainstream subreddits is maybe 5-7 gazillion times more than on the porn ones, so he's full of shit.
- Real identities (Facebook comments)
- Voting and self moderation (Reddit, HN, etc.)
- Strong moderation (Reddit, HN)
They all result in toxic comments, trolling, an echo chamber, or worse, a complete lack of participation. There's no real solution to this problem. However, if you create consequences or a cost to commenting, you'd eliminate toxic comments and trolling at the cost of less participation and an echo chamber (though you could argue that not all participation is equal and you're mainly removing poor participants).
There's no perfect way to do this because even if you made a subscription necessary, for instance, you may just create an echo chamber. As part of the solution you'd need to prevent the creation of new accounts to circumvent any punishment received.
I'd say the most straightforward solution is that you have a forum and you get an account. Physical mail is sent to your house in order to get a single account. Then, regular moderation practices would be taken seriously as there's no way to create another. The community would be left with those who care enough to not be banned. The problem is that the moderators themselves may be corrupt or wrong.
Facebook/Reddit/Twitter/etc promote a regression towards the mean understanding... viral content reflects and amplifies the "average user's" sentiment and values. Acceptable for entertainment, but inherently prone to misinformation, propaganda, and demagoguery. Education requires valuing subject matter experts. Opinions which may not be widely held, or even popular, but supported by people who are vetted as being knowledgeable on the subject.
Traditional media could be regulated because they were largely centralized, but centralization also creates an establishment that regulates away counter-culture ideas. In contrast, the internet is anarchic. Online anonymity impairs delegation of trust: any idea can be published, so every individual must rationally evaluate what they consume. Attempting to regulate away undesirable behavior on the anarchic internet is just cat-herding. At best, you create a walled garden for a select few.
As I see it, the paths forward are either:
* public education, emphasizing civics/rationality, to support distributed self-regulation
* centralizing with state regulations
I want the former but the latter seems most likely, considering how the underlying networks are consolidating, and increasing awareness of how amplified public ignorance creates political/economic instability that hurts those with power.
That's great but it's a slow cultural change. Well-educated countries can still fall into extremism, which is driven by emotional and atavistic factors as well as economic and political ones, and can't simply be dispelled with doses of Rationality (tm). Arguably, the failure of rational utilitarianism to engage with this aspect of humanity and to simply dismiss everything that can't be quantified as irrationality exacerbates the growth of toxicity.
On a more practical level, the US is a country where part of the population rejects the theory of evolution on religious grounds, historical narratives are intensely contested, and political life is objectively and increasingly polarized. Educational change happens over generational timescales, and if it were as simple as making it available all our social ills would have been dispelled long ago.
Of course education and critical thinking skills are essential for a healthy social body, but when I see people saying 'we just need better education' I feel like I'm on a bus that's headed towards a cliff edge and well-meaning people are suggesting that the solution to this is better driving lessons.
There may not be a solution that preserves the open internet, if this system is fundamentally incompatible with social realities.
I agree 100% as long as you put me, or people of my worldview, in charge of the curriculum and personnel.
While thought-policing media, schools, churches, and any other possible venue of "indoctrination" may "work" to a superficial extent, it mostly just completely destroys the credibility of your authority and leads to stunning implosion and destabilization. See: the Soviet Union.
Like it or not, you can't just beat "undesirable" views out of people, either with schools or social media moderation.
Education teaches critical thinking, science, history, numerical literacy, and the general skill and toolset to differentiate fact from falsehood, rhetoric, and manipulation-- regardless of where it is coming from.
Education is an immune system for the mind. Generally it is the manipulators who don't like an educated populace, because it decreases their power. They tend to be the ones labeling education as "indoctrination".
Right, in principle, it's agreed that "education" is "good knowledge" and "indoctrination" is "bad knowledge" and/or "fake news".
As long as you think that the inoculations being administered in the school system are valid, you'll call it education. Once you stop thinking that, you'll call it indoctrination.
So you're not really arguing anything. Every side calls training that biases you toward their preferred narrative "education" and training that biases in the opposite way "indoctrination". Is your point that "sometimes people disagree"?
This tends to last until someone realizes that an educational system is a wonderful indoctrination tool to advance their goals. Often enough this is followed by enacting that.
This isn't new. "Give me the child for the first seven years and I will give you the man."
That quality is also known as adaptability and it's crucial to successful survival and prosperity, for exactly the same reason that it's useful in mathematics: global optima are generally difficult to deduce, if they can be conclusively and authoritatively determined at all.
Saying Side X is "not being logical" or "can't think critically" is virtually always just a cop-out. It says you either a) don't understand or b) don't want to admit the validity of some of their concerns.
Most of the time when the other side's argument is understood, the disagreements are a matter of priority and/or credibility, not nonsensical thinking. And those priorities are usually determined intrinsically; values as such can't really be programmed or taught. They're the result of the years of experience each individual has endured in the real world.
A good example of this is that many engineers are known for a just-the-facts, no-frills approach. This is because engineers tend to prioritize facts and correctness over aesthetic and emotional value. Other people who don't do this aren't objectively wrong -- they just put different weights on the considerations, leading them to different conclusions.
Another example is outlet credibility. Your average Fox News viewer may believe that MSNBC is propaganda secretly dictated by the shadowy figures in the background, and vice versa. If you believe this, the logical conclusion is to dismiss or at least discount the perspective of the propagandist.
You cannot "prove" that one side is propaganda and the other side isn't, because it is impossible to definitively deduce the intentions and motives of other people. Reports that say reports from MSNBC were more frequently errant are of no value, because you can just say "Oh yeah, says who? The same shadowy figures?" to that.
It is important to understand that humans hold a variety of totally non-falsifiable beliefs -- things that cannot be definitively proven one way or the other, even if you try, like the state of mind of the speakers we're around. These have to be approached from the subterranean to be understood, let alone addressed.
All we can do is understand that our own perspective is not the default or de-facto correct one, and that other people are entitled to their own assumptions and unfalsifiable opinions just as we are. They're entitled to their own credibility heuristics and decisions about who is worth trusting. People are free to make their own decisions and conclusions, whether we agree with them or not.
Understanding that is critical to learning that it's OK to disagree with people, without having to pretend that they're insane just to preserve your own ego and self-worth.
For opinions, perhaps. There are also people who reject facts. I don’t consider rejection of evolution or young-earth views as legitimate. Thus, those who cling to these views are empirically wrong.
Most people don't reject facts, they reject certain interpretations of facts.
For example, some people believed epilepsy came from evil spirits. They didn't deny that the person was shaking on the ground. They just had a different explanation for it than we do now.
When you build a runway for a plane or a long bridge, you do.
A model is not necessarily useful in all contexts. People still use the flat earth model in useful ways, because it's simpler to assume the earth is flat in some situations. Of course, once you go beyond the capabilities of the flat earth model, your numbers will wildly diverge into the realm of the useless, while the spherical or ellipsoidal models keep providing useful numbers for longer.
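To put rough numbers on that divergence, here's a minimal sketch, assuming a spherical Earth of radius 6371 km and using the straight-line chord between two surface points as the "flat" figure:

```python
import math

R = 6371.0  # assumed mean Earth radius in km (spherical model)

def arc_km(theta):
    """Great-circle (spherical model) distance for central angle theta (radians)."""
    return R * theta

def chord_km(theta):
    """Straight-line ("flat") distance between the same two points."""
    return 2 * R * math.sin(theta / 2)

for theta in (0.001, 0.1, 1.0, math.pi / 2):
    arc, chord = arc_km(theta), chord_km(theta)
    err = 100 * (arc - chord) / arc
    print(f"angle {theta:5.3f} rad: arc {arc:9.1f} km, flat {chord:9.1f} km, error {err:5.2f}%")
```

At small angles (a few km of travel) the flat figure is indistinguishable from the spherical one, which is why the flat model is fine for building a runway; at a quarter of the globe the error is around 10%, which is why it isn't fine for intercontinental navigation.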
And yet engineers are over-represented (compared to people with other degrees) amongst Creationists and conspiracy theorists and, I would guess, terrorists. I think engineers value simplicity and direct causation more than facts or correctness.
Appreciate the multiple snide attacks, though.
Is there not such a thing as conspiracy fact? Aren't some conspiracies, in fact, real? It seems both sides of the political aisle have pet conspiracy theories these days, so it's really hard for this to hold water anymore.
> people who don't understand facts
As another commenter said, people will usually agree on the clear and present facts, e.g., Donald Trump won the presidency. Where you'll find more disagreement is on rationale: either he won because he gave a voice to the discontented American working class, or he won because he worked in cahoots with Vladimir Putin to subvert American democracy.
People don't refuse to acknowledge the obvious state of affairs. They have different interpretations, based on different values and credibility heuristics, of the likely impetus for that state of affairs.
>people who are highly opinionated about things they don't understand, etc.
aka virtually everyone. How many of us know enough to hold our own with the experts in something that we're "highly opinionated" on? If we can in anything, it's very narrow. Are all of our other opinions invalid now? Humans use credibility heuristics to try to determine who is right about something, and then they follow based on that.
> are not behaving logically
I dunno, it sounds logical to me, at least in the practical sense. If we pretend we live in a world of infinite resources and time, you might be right, but considering the constraints of reality, the logical approach seems to be to have and express opinions in the moment according to one's best judgment, since everyone else is going to be doing that too. Just gotta try not to be too haughty about it.
> They can have valid concerns and still be behaving irrationally. You clearly think these are mutually exclusive but they aren't.
I agree someone can have a valid concern and also behave irrationally. I don't agree this is what you started out saying, though.
>Appreciate the multiple snide attacks, though.
No offense intended. The edit deadline has passed, but I wasn't thinking I put any such things in. My apologies if you felt I was being condescending or passive-aggressive.
These 20,000-odd people unequivocally lack the type of critical thinking skills GP is referring to. I find it hard to believe that they are all under professional care. These people are straight out of The Da Vinci Code or National Treasure. They truly believe that they have uncovered a massive conspiracy to overthrow the current American government, and they are organizing to stop it. Many subreddits choose a sort of mascot that defines their subscribers. For instance, the people who subscribe to the tongue-in-cheek /r/evilbuildings are "6509 villains plotting"; they post pictures of buildings that have a nefarious appearance, with no conspiracy in the comments. /r/CBTS_Stream has "21,333 Operators", as in mercenaries/militiamen. These people are rabid Trump supporters, seem to have a strong fundamentalist Christian bent, and appear to be extremely gullible and susceptible to any sort of theory that involves revenge upon the previous administration. They even have their own prophet, "Q". Everything from occult references, to Nazis, to big pharma killing off holistic doctors, to arranging Trump's tweets into an 11x11 grid and then playing word search to reveal a secret message. These people swear that Donald Trump's televised rallies are chock full of encoded messages and symbolism, both in what Trump is saying and in the clothes/posters of supporters in the background. These people buy toothpaste from Alex Jones because it doesn't contain fluoride. These people believe that all mainstream American history since the American Civil War is a lie created by the perpetrators of the current hoax they have uncovered. They also believe that Trump has already secretly met with Kim Jong Un and will soon unveil a world-saving peace treaty that will "make the libs' heads explode".
The truly sad part of this is that a lot of these people are also members of other subreddits dedicated to people who have escaped Mormonism, or Jehovah's Witnesses, or similar groups. So these people have already thrown off the shackles of psychological warfare once. But they believe now that they are "woke", and seem completely beyond talking down.
Good luck explaining to these people that they are being radicalized by Russians, or whoever. Good luck getting any of these people to not believe that any censorship is obvious proof that the sleuths are hot on the case, and that the global elite are silencing them.
Oh please, you think the average person has sufficient critical thinking skills to read the newspaper and pick out the parts that are "stretching the truth", use specious reasoning or various other logical fallacies, etc? You must roll with a different crew than I.
> Your average Fox News viewer may believe that MSNBC is propaganda secretly dictated by the shadowy figures in the background, and vice versa.
If they had critical thinking skills, wouldn't they be able to get a pretty decent handle on the degree to which they are propagandists?
It sounds to me like what you're saying is, most things within this realm are not knowable, except for the parts that are. The world is complex and confusing, but I don't think it's that confusing.
I don't know, I don't think we have actively tried yet.
You absolutely can. An example: https://www.theguardian.com/commentisfree/belief/2012/sep/22...
Religion sucks, but Soviet Communism's anti-religious nature doesn't excuse its foibles.
People like you scare me.
I have a feeling a lot of people would have issues with this approach though.
Of course this doesn't work when politics is taboo.
Take the same comment or opinion and air it among three friends in person (or a very tight social network). You only need to convince two or three people who likely trust and respect you already, and who are not inclined to want to spend an infinite number of hours debating such trivia across all time zones.
Not every conversation has to become a burned bridges and salt the earth affair. If the other person is just trying to "win" then disengage from the argument. If the other person is arguing with you in good faith then maybe you're wrong or have something to learn from a new perspective.
There is the perception that "the entire world" is watching you on the web, criticizing your every move, but that's not a fact.
Logical fallacies would be one place to start, you can see examples of this all day long on reddit for example.
I've been hearing claims of the need to teach "critical thinking" since I was in high school. To me it always came across as one of those things that can't easily be taught, particularly in a traditional academic setting. Everyone agrees it should be taught, but if there were a clear way of doing it, we would.
> If it's not being taught, then we'd need to do something differently.
When reading the news, forums, or overhearing conversations, do you not regularly encounter people who obviously have no significant skills in critical thinking?
Right. I took a logic class for my undergraduate degree. It's actually the source of the "modus" in my username. I guess to me that's a far cry from what people refer to as "critical thinking." Being able to identify textbook logical fallacies isn't the same thing as rationally and objectively forming a judgment about something.
It's certainly a helpful part, but I doubt most would remember it any better than geometry or 1800s history.
> When reading the news, forums, or overhearing conversations, do you not regularly encounter people who obviously have no significant skills in critical thinking?
I do, but it's rarely a clear-cut example of misunderstanding a logical fallacy. More often than not, it's the blind acceptance of supporting evidence while rejecting opposing evidence. Or assigning way too much value to a poorly-sourced news story. Or approaching the issue with a different worldview / values. Or any number of other biases that affect decision-making.
To be clear, though: I agree it's clearly not being taught. I'm just not convinced you can take a bunch of high schoolers, put them in a room, and after X weeks of doing something, they'll be critical thinkers. I agree you could probably teach them logical fallacies well enough to pass a test on them, but that's not the same thing.
If not, if critical thinking doesn't work, what could we do to improve this situation?
Of course not.
> If not, if critical thinking doesn't work, what could we do to improve this situation?
I'm not sure "well-informed" and "critical thinking" are even relevant to each other, but putting that aside, I genuinely don't know. That's why I asked how you teach critical thinking.
It's possible people are bound to retreat to their biases and it's a futile effort. I'm just not convinced attempting to teach people "critical thinking" will work, because it hasn't.
Implying it's been tried, and failed.
Where has widespread teaching of critical thinking been tried?
> Public school teachers and administrators will tell you that one of the mandates of public education is to develop critical thinking skills in students. They believe that curricula are designed, at least in part, with this goal in mind. 
> Common Core, the federal curriculum guidelines adopted by the vast majority of states, describes itself as “developing the critical-thinking, problem-solving, and analytical skills students will need to be successful.” 
> Many teachers say they strive to teach their students to be critical thinkers. They even pride themselves on it; after all, who wants children to just take in knowledge passively? 
Are you willing to acknowledge educators / curricula commonly claim to teach critical thinking? To me it's always come across as something claimed to be taught pretty much everywhere. Yet we both seem to agree it's not working.
We could try teaching critical thinking differently and potentially meet some success, but that doesn't change how it's been claimed to have been taught for some time with poor results.
> Are you willing to acknowledge educators / curricula commonly claim to teach critical thinking?
I'm not in denial of some sort ffs, I'm frustrated at watching our society coming apart at the seams because the vast majority of the population seems to be incapable of intelligently reading a newspaper article, and will fall for seemingly any trick in the book.
Of the examples of "critical thinking education" listed above, do any remotely approach the critical thinking specific education I'm talking about here?: https://news.ycombinator.com/item?id=16572861
People are absolutely inundated with propaganda nowadays, like no other time in history, with social media being the most powerful weapon by far. We are graduating our children and sending them intellectually defenseless into this new world. I don't know if the average human mind can be brought to a level sufficient to cope with the propaganda created by world-class experts in persuasion working for a variety of deep-pocketed entities, but at least we could try.
Well no, but my claim wasn't that your suggestion has been tried. It's that other people have been claiming they've been teaching critical thinking for some time, and it's not working.
I agree it's a problem--I just don't think a class in logic will do it. I'm not sure it's teachable at all, and even if it is, I'm not sure those same skills won't be ignored the moment the argument questions one's identity or becomes emotional.
Is it worth trying? It's easy for me to say "sure," but it's not on me to implement, and I'm certainly not sure how to assess whether it'd be successful.
But to be fair, the longer we get into the current era of politics, the harder it is to distinguish between earnestness and satire. Young people who watch the movie Network today don't see Howard Beale as satirical, because there are too many people like him today who are deadly serious.
What country are you writing from, and could you give some specific examples?
Comparing the experience in a niche subreddit vs. a default subreddit, it's clear that the real problem is allowing casuals in. If people have to go out of their way to participate, you wind up with only the ones who care to do so. And they have, in their reputation, something they don't want to lose.
But if people are allowed to participate by default, you get the enormous masses of people who, collectively, by virtue of their shifting roster, are immune to moderation. And you get people who set out that morning to share as many opinions as possible, instead of the people who set out to participate in that community exclusively.
On the subreddits I manage, I allow broad participation with no barriers. But they aren’t popular enough to have enough trolling to break me down and add hoops.
Facebook, Reddit, Twitter, and Snapchat all started as entertainment and switched to clickbait reaction/outrage-culture news, and normal news has turned into a mess as well.
I say regulation wiping the trending news feed off Facebook, Twitter, and Snapchat is a start. Perhaps any group that meets set-in-stone criteria, regardless of political affiliation, gets some federal funding to make up for the cost of doing real journalism. That journalism must be fact-checked, and we hold them accountable.
People are desensitized to everything now. When shocking news is made every 15 minutes and the world is so connected, we become numb to so much, and that is incredibly dangerous.
The restoration of the Fairness Doctrine would also help stymie some of the biggest promulgators like Fox News *
* you can search for "fox news viewers misinformed" and encounter studies and results like http://publicmind.fdu.edu/2011/knowless/
Similarly, there are still excellent news sources. The Economist is often cited in these discussions, and the New York Times is also vigilant in its reporting and in correcting errors when they occur.
What we’ve seen is a breakdown in trust of institutions, largely disconnected from actual mistakes on their part. People will quickly demand proof and invoke conspiracy theories when, for example, the three-letter agencies accuse Russia of interfering in elections. They have learned to invoke the “appeal to authority” fallacy too well, without offering an alternative, because you cannot evaluate a news story without in some way deferring to the reputation of the publisher.
The Economist and the New York Times may have good practices with regard to errors, but there is a clear gap between their reporting and what independent fact-checking sites find. To make matters worse, even those examples of "excellent" newspapers tend to have a clear and open political alignment. With increased political polarization, this then results in a rather natural distrust of news institutions, even those that are vigilant in correcting errors after they have occurred.
Re: trust and appeal to authority; your example made me realize people are drawn to grand conspiracies because unverifiable theories are infallible... Luring in people unfamiliar with probabilistic reasoning and consilience.
Sure, McDonald's is not good, but there is also plenty of organic that is equally bad (or worse).
That depends. Sometimes European organic veg is preferable to Chinese industrially farmed veg when your local supermarket offers only those two choices. This is definitely true of garlic: Chinese garlic tends to be notoriously bitter and lack juice, but Spanish organic garlic is very sweet, pungent, and juicy. Now, the fact that the European organic choice was made according to the limitations of organic farming may well be irrelevant to its goodness, but there is a strong enough correlation with quality to guide consumers, and it was likely chosen by your supermarket as an alternative to the Chinese imported product precisely because they wanted to cover the organic segment.
Although this distrust does have negative impacts to our society, I view this distrust as an overall good thing.
nope, this won't work.
People will always need a place to voice their vile comments in a cowardly manner
Forum communities with actual strong moderation, e.g. Something Awful or ResetEra, have near-zero issues with hate speech, Nazis, and other things that reddit has let fester.
One of the major factors that keeps it relatively clean is that user registration costs $10. That's a strong financial disincentive against trolling, bots, etc.
I really believe that successful online communities of the future will have paid signups.
I'd totally pay $10 for a less shitty reddit clone.
When I use Facebook I am the product, not the customer. This means the platform is optimized to put my eyeballs on advertisements or provide data about me to marketers. It is not optimized to provide high quality conversation.
No wonder you got banned so quickly there...
Having a toxic community that internally declines to moderate is not strong moderation.
1) Namespace of subreddits. The subreddit which snags the most obvious name for a topic has a much better chance of becoming canonical for that topic than competitors with worse names.
2) Cross-subreddit identity and supporting tooling. For example, I can easily search for all recent posts made by a particular user, but I can't easily search for all posts made by a particular user within one specific subreddit. This sort of thing promotes "cultural leakage" across subreddits and makes people think of all of Reddit as one community with one culture.
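The missing per-subreddit search could be approximated client-side once you have a user's full history. A minimal sketch, assuming comments are plain dicts with hypothetical `author` and `subreddit` keys (illustrative only, not Reddit's actual API shape):

```python
def comments_in_subreddit(comments, author, subreddit):
    """Filter a user's comment history down to one subreddit.

    `comments` is any iterable of dicts with hypothetical
    'author' and 'subreddit' keys; input ordering is preserved.
    """
    return [
        c for c in comments
        if c["author"] == author and c["subreddit"] == subreddit
    ]

# Example with made-up data:
history = [
    {"author": "alice", "subreddit": "evilbuildings", "body": "nice lair"},
    {"author": "alice", "subreddit": "AskHistorians", "body": "source?"},
    {"author": "bob",   "subreddit": "evilbuildings", "body": "ominous"},
]
print(comments_in_subreddit(history, "alice", "evilbuildings"))
```

The point is that the data supports this view; the site just doesn't surface it, which nudges people toward treating all of Reddit as one community.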
Related: see https://news.ycombinator.com/item?id=16573842 which summarizes a study that finds cultural leakage across subreddits.
Real identities would require verification, which sites like FB only do after the fact.
I fear that a huge repercussion of the election issues is that we will get there. A real ID may be required for you to post comments on all websites. And I'm not sure how I feel about that. The reality is that it would drastically reduce comment trolling if your real identity were viewable and searchable by all you know. But at what cost?
I really do wonder what is the root of trolling. What makes a "normal" person take an alter-ego/view/counter view, solely based on lack of face to face interaction...
I mean, you get toxic behavior even on something like Nextdoor, where you pretty much know it's the neighbors across the street. Technology has just made things more convenient -- social media removes any curation, and technology also has made some means of harassment much easier to execute.
Myself, personally, I avoid social media that encourages toxic behavior (which usually means, smaller, special interest type sites; social circles that you know; etc.). This involves some degree of moderation or self-selection.
I don't see a good way around limited moderation for Reddit either, which is unfortunate in that it is hard to moderate something that size well (it's usually inconsistent and often arbitrary-ish).
That is not at all clear, especially as a percentage of comments. There are plenty of sociopaths who are perfectly willing to troll under their real name. Meanwhile, more reasonable people may quite rationally be worried that expressing any opinion on a controversial issue will lead to online mobs trying to get them fired from their jobs, kicked out of school, or otherwise ostracized. Not to mention scenarios like being a gay teenager in a very socially conservative environment.
I don't think it would. Lots of people post horribly objectionable material under their own names on a regular basis, depending on their level of financial security, peer group, and social milieu.
> What makes a "normal" person take an alter-ego/view/counter view, solely based on lack of face to face interaction...
Some people are horrible, and are just as unpleasant in real life as they are online.
Is this true? I've seen various studies saying that the opposite is actually true, but I can't currently find any of those studies. Does anyone have any sources?
EDIT: some sources, though I don't know the strength of their validity:
The cost is that it would prevent people from anonymously reporting abuses, which means that fear of retaliation will have a chilling effect. We've already seen this where people get death threats, houses burned down, etc, when they do things like report sexual assault.
I can see at the cost of making stolen identities worth even more.
As it is, websites completely suck at keeping our 'anonymous' identities secure. Our emails and passwords are hacked so commonly that there are websites dedicated to tracking it. Now you're just adding 'real identities' to the brokered data. Any real trolls will be able to use this data from the dark net, pretending to be you. Even worse, since real names are required, employers will look for you on the internet, see your "I'm an anti-gay right-wing pro-Russian" profiles online, and say "you are not a cultural fit for our company". You will have to take the time and effort to clean up what is said about you. Since it's a real identity, it's not going to change, and they already have all the information they need on you.
Good luck in that terrible future.
Those who post offensive content and do care, find it completely trivial to get a fake name. FB’s enforcement of their policy is basically nonexistent. This is the second-biggest group of users.
I honestly don't think this will ever be the case. Because there is a lot of profit from an unmoderated comment store. Also because it'd be monumentally hard to actually make all sites compliant.
4chan also exists and it works and is marvelously non-toxic once you realize that any insults hurled at you are impersonal because they can only attack what you have immediately posted previously. Your attack surface is tiny, assuming basic information hygiene.
Post history On:
You get to follow a user's comment history. If you read an insightful comment by them, and want to read more, then having a history is nice.
Users are people
Post history partially On:
e.g. comments could decay to anon after some period. (Cue the sites that collect all post data and match to users)
Slightly increases the cost of doing a deep dive on a user's history.
Users are people, fading to ideas
Post history Off:
Lowered attack surface for people who are actively trying to find an argument with you.
Less pressure to have a persona consistent with the typical one of any particular community.
Users are ideas
For example, you can make multiple identities easier to maintain. I make use of that here via Firefox containers; it is quite nice.
You can make anonymous the default and track the user under the hood, allowing them to claim a post at a later point if they feel confident enough to attach their name to it.
You could limit who can see the user identity. Or only display something more vague than a specific identity, a profile-lite behind a one-time ID.
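One way to sketch that "profile-lite behind a one-time ID" idea: derive a pseudonym per user per thread with an HMAC, so an identity stays stable within one discussion but is unlinkable across threads without the server's key. Everything here is illustrative, not any site's actual scheme:

```python
import hashlib
import hmac

# Illustrative only: in practice this key would live in a secrets store,
# never in source code or the database alongside the posts.
SERVER_SECRET = b"keep-this-out-of-the-database"

def thread_pseudonym(user_id: str, thread_id: str) -> str:
    """Stable handle within a thread; unlinkable across threads without the key."""
    msg = f"{user_id}:{thread_id}".encode()
    digest = hmac.new(SERVER_SECRET, msg, hashlib.sha256).hexdigest()
    return f"anon-{digest[:8]}"

# Same user in the same thread gets the same handle; a different
# thread yields an unrelated handle.
a1 = thread_pseudonym("user42", "thread-1")
a2 = thread_pseudonym("user42", "thread-1")
b1 = thread_pseudonym("user42", "thread-2")
print(a1, a1 == a2, a1 == b1)
```

The server can still attribute every post internally (so users could later claim a post, as suggested above), while readers see only the per-thread handle.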
The only way I see to save reddit is to set a maximum size for a subreddit, and shut down or otherwise isolate every subreddit that grows bigger than the maximum threshold.
This is why I'm quite pessimistic about the current situation: the current in-vogue business model of surveillance/advertising capitalism demands massive size beyond what can be moderated, and thus makes this problem inevitable. And it only gets worse when the most toxic users are the most profitable, viz. Twitter's refusal to ban Donald, even though by any reasonable interpretation of their TOS, he breaks it every other day.
Cheap, if you ask me.
The most common form is anarchic moderation (i.e. no moderation). When a forum is small, unwritten social rules keep things under control. However, as the forum grows, that breaks down, and things become more chaotic.
Metafilter essentially has an authoritarian moderation culture: the rules of discussion are both made and enforced (selectively or not) by the same group. There's a wall to keep outsiders out (the paywall). It avoids chaos, but its failure mode is ossification, devolution into an echo chamber, and eventually desertion, as public forum behavior comes to more-or-less rigidly reflect the opinions and preferences of the moderators.
Reddit's somewhere in the middle of the above two forms. There are anarchic hordes in the less moderated reaches, and little authoritarian kingdoms without the walls to keep the hordes out.
I don't think anyone's tried real democracy in a forum (with elections, politics, checks and balances, and the time investment that all entails). It'd be interesting to see how such a forum would fare, and whether it could avoid chaos without becoming an echo chamber. Democracy isn't the public-opinion-style voting we see in forums today, but actual accountability of the moderators to the users.
Not claiming this is a novel insight, but it's new to me.
In the end, the wizards published "LambdaMOO Takes Another Direction" and took back control, concluding:
> Over the course of the past three and a half years, it has become obvious that this [refraining from making social decisions] was an impossible ideal: The line between 'technical' and 'social' is not a clear one, and never can be.
A great deal has been written about this experiment (and a Web search will find much analysis, along with full text of LTAND and LTAND2), and there are a wide variety of perspectives on why LTAND failed, but one conclusion that nearly everybody seems to reach is that attempting to give a community democracy with no higher guidance is almost guaranteed to be a recipe for disaster.
Reflecting on all of this, I have no idea how the US founders managed to get something that worked at all, much less as well as it does. (And notice that it took them several tries to get it right.)
I think maybe a good solution would be to (1) pay mods and (2) make everything they do transparent. This will give you better mods to start out with but also gives users the power to notice mod overreach before it spirals out of control.
I mean, yeah, it’s structured mostly around links, and you can certainly use it as a source of Interesting Links. But there’s conversation and community there if you look around a little.
Staying relatively on topic is an unrelated aspiration and one that we mostly let flex a lot depending on the specific thread and context.
The real-world has that figured out long ago. If you find something that is truly useful & timely, you would be willing to pay real money for it.
Google Answers (answers.google.com) had tried an approach wherein a price can be put on a question, and any legit reply which answers that question can claim it. 'Reputation' definitely still plays a role in this, but the system is flexible enough to allow a newcomer to attempt answering a question and stake a claim to the funds.
The real world has many of these aspects sorted out, like calling a plumber or a carpenter from your neighborhood to get your work done. The problem is we have embarked on creating a 'global' network (aka FB) without first having adequately understood how to create strong family & community networks.
An echo chamber arguably arises more when we actively suppress dissident viewpoints. Reddit is infamous for moderators doing that simply by deleting comments under some pretext of the 'spirit of the subreddit' or such. With Facebook there's a first-and-last-name-and-picture-visible shaming that can be scary and damaging, repelling the opposite viewpoints. At a more extreme level, you can help foster an echo chamber by organizing a large group of people to scream and picket and threaten a speaker who has the wrong views, reminding all the others of what happens.
HN to me is an oasis. Even if I get downvoted when I have a minority view, I still feel as if intelligent arguments are considered.
With Reddit, how many intelligent comments are there? The English grammar alone is awful, full of shortcuts, cliches and new millennial-speak. But worse: the responses are short. One-liners. And even worse: argumentation is ad-hominem and emotive.
In summary, I think Reddit is about emotional expression, and HN is about (an attempt of) rigor and rationality.
Downvotes are not only for inflammatory language; a comment can be a negative contribution to the signal-to-noise ratio, and even violate the commenting guidelines, without using inflammatory language.
I avoid poking the moderator lions (I used to post political articles maybe a year or more ago), but I do wish HN would have another view of that particular topic. It's rather unavoidable that adults (and we are adults), highly-educated ones at that, would sometimes slip into politics when science or tech news (or legal news about tech or science) is discussed.
But yes, you're generally right about that.
I think emotive political discussion is useless, but rational policy discussions aren't useless.
Where HN has been falling short (lately, in my observation) is where discussions about the ethics of certain business models get lost via the "buried" option or killed off completely.
You cannot come to HN to discuss the potentially negative ecological or economic impact of a YC company. The voting rings will literally send your comment or post to the void: buried or killed off completely. HN does still post lots of interesting links, but for truly interesting discussion that isn't, for lack of a better word, tainted by bias, I prefer Reddit these days.
- discussing the risks of psychoactive drugs.
- pointing out flaws in overhyped press releases about the next wonder drug/treatment
I guess you're right that you can avoid getting downvoted by being exceptionally polite and spending about 15 minutes crafting a response saying "crap science, uncontrolled trial, possible placebo effect", but sometimes I just don't have the time and energy for that. I'd prefer it if people here didn't automatically assume I'm full of shit when I point out a flaw in an argument without writing my response absolutely perfectly the first time.
That's definitely not true! People say negative things on HN about YC companies all the time. We moderate HN less, not more, when YC or a YC-funded startup is at issue. That doesn't mean we don't moderate it at all—that would leave too much of a loophole—but we do moderate it less. This is literally the first principle that we tell everyone who moderates Hacker News. You can find many posts I've written about this over the years via https://hn.algolia.com/?query=by:dang%20moderate%20less%20yc....
What I meant is that one cannot start a discussion of such things without being willing to lose lots of points and karma: observations that Airbnb might be doing more harm than good to cities with "housing crisis" issues, or that Uber and Lyft are actually harming public transportation ridership and putting more automobiles on the roads (creating congestion).
Two issues I've seen brought up here that get downvoted into oblivion. Why risk that? It's far easier for people to jump on the "attack the poster" bandwagon... as they have done to me in this thread.
Granted, I've been reading HN for over 11 years now, and the site is not the same as it used to be. A lot of interesting posters have left. Probably I need to lower my expectations for what to see when I come here.
Plenty of comments arguing that Airbnb/Uber might be doing more harm than good routinely get heavily upvoted, so I'd question your overall generalization.
Something might be totally off topic or funny, but if it made me laugh, do I downvote it?
Slashdot's model of tagging posts was a pretty good idea I think and allowed one to filter out the 'funny' or 'offtopic' comments.
So, upvotes and respond.
If it's a net positive contribution, you shouldn't be downvoting.
> Slashdot's model of tagging posts was a pretty good idea
It's a good model for a customizable user experience, and a bad model for a community. Those two goals are often opposed.
Dragon, you commented on and downvoted something that you are doing right now, which is commenting in a comment system. No? At least you had the decency to reply, which most Redditors don't. Which makes Reddit toxic.
I reserve downvotes for when a comment is being needlessly toxic, doesn't contribute to the discussion or otherwise not helpful for an open discussion.
I think the best cure for "downvote to disagree" is to firmly hold to the principle that the opposing side of the argument has the best intentions to the extent of their knowledge, and that at the end of the discussion, all participants should have learned something. You should also always be willing to change your mind on what you argue about.
Voting on HN barely has an effect, and I suspect that the average votes per comment on HN has gone way down year over year. People just don't vote on posts as often as you'd think, not anymore. A related problem is that commentary doesn't go on for very long. In the usenet days you could have a good thread that would last for months and months and continue to spawn good and interesting commentary; a flash-in-the-pan thread might only last a few days. On HN the window of commentary for a post is rarely more than a day, and typically only a matter of hours. It's just people strafing comments into the void and then disengaging. Long comments, for example, typically don't get read, upvoted, or commented on.
OK, I'll bite! A cursory look at the data shows a clear increase in average votes on comments from 2007 until 2012, which is the only year with a dip, followed by steady growth until the present all-time high.
Of course, there are lots of "all else being equal" implicit assumptions there. First, if the population doubles but stories move off the frontpage in 0.7x the time, then you'd only get 1.4x as many votes—and this is one of InclinedPlane's points. Second, the newer crowd could be significantly more, or significantly less, active. To control for these two things, the measure you might use instead is "votes per comment per pageview", or "votes per comment per second a user spends on the page". Third, there might be more comments posted—well, duh, it would be weird if the new users never posted any comments.
Fourth—and I think this is another thing InclinedPlane wants to focus on—comment quality could have changed. Comment sorting is relevant too, because I'm sure lots of users don't read everything. If we suppose that, due to an increase in population, we get 2x as many comments but they have the same quality distribution, and if we suppose the best comments always go to the top, then the average quality of the top n comments should increase; you can see something like this in extremely popular Reddit threads, where the top several highly upvoted comments are clearly optimized for something (often clever jokes). If we suppose a decent population of users only read the top n comments, and always use the same function that maps "quality of a comment" to "probability of upvoting", then, when the set of comments doubles and (by assumption) the best rise to the top, we'd expect these users to generate more upvotes overall, and hence "average votes per comment viewed" should go up. (It's also possible that people's standards would rise. But I think people's changing standards would lag behind the changes in what they're viewing.) That said, for the comments that aren't in the top n, the fraction of people that view them (and consequently might comment on them) would go down.
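The "votes per comment per pageview" control is easy to state as arithmetic. A toy sketch with invented numbers, just to make the doubled-population, 0.7x-tenure point concrete:

```python
def expected_votes(base_votes, population_factor, tenure_factor):
    # Naive model: total votes scale with both audience size and
    # how long the story stays exposed on the frontpage.
    return base_votes * population_factor * tenure_factor

# Population doubles, but stories leave the frontpage in 0.7x the time:
# 2.0 * 0.7 = 1.4, so a baseline of 100 votes becomes ~140, not 200.
print(expected_votes(100, 2.0, 0.7))

def votes_per_comment_per_view(total_votes, total_comments, pageviews):
    # The normalized measure proposed above: controls for both the
    # number of comments posted and how many people saw them.
    return total_votes / (total_comments * pageviews)
```

Under this model, raw "average votes per comment" can rise even while per-reader engagement falls, which is why the normalization matters.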
The question of how long threads sit on the frontpage is relevant, both for comment exposure and for InclinedPlane's point about conversation longevity. (There are also pages like "new" and the no-longer-linked-at-the-top "best".) I wonder how best to quantify that... perhaps "the frontpage tenure of the thread with the longest tenure of all threads on that day".
After reading a few discussions over the last few days, I was thinking to myself that HN was better than ever, and very good (with one serious shortcoming). Even the echo chamber is much better than I remember.
EDIT: The shortcoming, IMHO, is the abandonment of politics. HN is the ideal place to solve that problem, with a sophisticated audience open to and interested in experimentation and problem solving. The goal of propagandists is not to persuade you, but to paralyze you; to shut down real discussion and debate. HN is, unwittingly, capitulating to and cooperating with them. HN is another success for them.
No, it's really not, as HN demonstrates most of the time it interacts with politics.
That's an illusion, for reasons I attempted to describe here:
> That's an illusion, for reasons I attempted to describe here
That implies that it's an unsolvable problem, if I understand correctly. There's no reason to think this problem is any more difficult than all the other 'unsolvable' ones and this one is particularly, I would even say 'extremely' valuable to work on.
I don't believe we could simply introduce political topics and it would work due to some HN magic. It would take serious work and experimentation to find a solution, but I think HN is better suited than other places to do that work. And a solution could change discourse in the country and the world, at a time when discourse on the Internet (a problem SV itself invented) has become a very dangerous weapon for some, and is tearing society apart.
I realize that "we" means you and sctb more than anyone, and so it's a request and encouragement. I still think it's the most valuable thing HN could do, potentially world-changing. Previous generations had books and leaders that changed the course of history; this time it might be software or a software-based technique that turns the tide. I hope that at least you will keep it in mind.
Our first responsibility is to take care of what we have. The way to take care of a complex system is to be sensitive to feedback and adapt. We can apply that principle here. Look at what happens when the political taps get opened beyond a notch or two. Discussion becomes nasty, brutish, long, and predictable. That's what we want less of, so opening the taps all the way is not an option. For similar reasons, closing them all the way isn't an option either.
I don't disagree completely. I think there's a chance HN can slowly develop greater capacity in this area. But it would need to be very slow and not something we try directly to control. Anything as complex and fragile as HN needs a light touch.
But I'm not sure that they should be eliminated. The alternative is to leave moderation as the only way to deal with bad (abusive, off-topic, trolling, unintelligible) posts. I'm not sure that having people flag every bad post they think they see, and letting the moderators sort it out, is really the optimal way to do things.
That's a common misconception. Downvoting for disagreement has always been ok on HN:
I think people have the wrong idea about HN downvotes because they think Reddit rules apply to HN. Sort of like how Canadians think we have Miranda rights because we've all watched American TV shows.
The problem with your argument is that it doesn't reckon with just how lousy bad internet comments are, or how many of them there are, or how completely they take over if allowed to. To a first approximation, bad internet comments are the entire problem of this site.
It's easy to imagine an HN that would be just like the current HN, only with some negative X (e.g. bad downvotes) removed. Most of these ideas are fantasies, because the thing you'd have to do to remove X would have massive side effects. You can't hold the rest of the site constant and do that.
Bad != disagree. I think we all agree that it should be ok to downvote and hide "bad" comments. However the problem is that many good comments are downvoted simply because people disagree with them.
I think it might be better to remove the downvote and replace it with "flag", so people can flag bad comments (spam, abusive, pointless, etc). At least that way people would need to think a little before the comment gets flagged, which would hopefully result in fewer minority viewpoint comments getting hidden.
I think you have more success if you ask everyone else to upvote unfairly downvoted comments.
Some subreddits do have strong moderation; some subreddits think they have strong moderation but they have fucking idiot mods who call down trolls; and there are some subreddits that have permissive moderation and those subreddits leak.
> However if you create consequences or a cost to commenting you'd eliminate toxic comments and trolling.
You genuinely wouldn't. There are forums where you have to pay real money to be able to read and post, and where acting like a jerk will get you banned. They haven't eliminated the jerks. About the only advantage is paid mods, which ensure some consistency.
While I can't come to a conclusion on it (whether or not it's a good/bad idea) - I wonder what others in the HN community think about it. I apologize in advance if this is a bad place to comment, I've been mostly a lurker thus far!
In addition to all that, I still fail to see how a blockchain would solve any of these fundamental issues.
There's nobody watching the watchmen, essentially. That leads to a lot of frustration, anger, mistrust, and abuse.
This means there was a "gold rush" in the early days and if some shitter is sitting on prime real estate (like brand names of products) there's nothing you can do about it. If they decide to close the subreddit by making it private, nothing you can do about it. If they go inactive and are still squatting on prime digital real-estate, nothing you can do about it... if they later get hacked but are still inactive, doubly nothing you can do about it.
That's how NextDoor works and it seems to work reasonably well.
There are some big communities that seem relatively healthy to me. For example, Instagram is easily my favorite social network. I see photos that my friends take and that's about it. I pull it up and am done with it in a minute or two.
In the long run push media is always going to be troll-ier and offend more people than pull media.
HN seems to be doing a great job in that regard (in my opinion) and it is a model other websites can learn from.
Secondly, I notice no mention of deliberate, paid propagandizing, i.e. professionally divisive sock-puppets employed by sock-puppet firms.
Any serious discussion of threats to a healthy public discourse must address deliberate attempts to undermine the legitimacy of the common voice.
More karma, more posting; less karma, less posting. Everyone starts every month/week/day with only so much, with no roll-over per timeframe. Modify it so that 'popular' threads cost more to post in. You can give karma to others too via the upvote and take it away with a downvote, but still no roll-over. Troll/shill accounts would still get upvoted en masse, but less so, and it would be 'easier' for mods to tell. (I'm sure you can model this without too much effort, vis-à-vis the prisoner's dilemma.) You'd have to pick and choose which threads to comment in. Posting content would work similarly, but with a slight modification to the cost of posting.
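A minimal sketch of the karma-budget idea above (class name, allowance, and pricing rule are all hypothetical): each account gets a fixed allowance per period with no roll-over, and posting in a popular thread costs more.

```python
class KarmaBudget:
    """Posting economy sketch: fixed allowance per period, no roll-over."""

    def __init__(self, allowance=10):
        self.allowance = allowance
        self.remaining = allowance

    def new_period(self):
        # No roll-over: unused karma is discarded when the period resets.
        self.remaining = self.allowance

    def post_cost(self, thread_popularity):
        # Hypothetical pricing: popular threads cost more to post in.
        return 1 + thread_popularity // 10

    def try_post(self, thread_popularity):
        cost = self.post_cost(thread_popularity)
        if self.remaining < cost:
            return False  # out of karma until the next period
        self.remaining -= cost
        return True

user = KarmaBudget(allowance=5)
print(user.try_post(thread_popularity=0))   # True: cheap thread
print(user.try_post(thread_popularity=50))  # False: costs 6, only 4 left
```

The interesting design property is the forced trade-off: with a hard per-period budget, even a shill account run at full throttle can only touch a handful of popular threads before it goes silent.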
If you're curious: https://www.bloomberg.com/news/articles/2018-03-01/the-nra-h...
I had the same idea at first but I don't think it would work in practice
Twitter actually does this, but only for some types of accounts. Ever see the "see more" button under replies? That's where people of low quality go. Twitter has a sort of rating for figuring out whether someone is low quality, and those people usually get hidden. Even their likes and retweets get hidden.
Hasn't seemed to help overall on Twitter, though I have noticed I get fewer assholes replying to me.
I'll agree that reddit doesn't seem to expose easy to use moderating tools, though.
Remember that Reddit ultimately only imparts one vote per person regardless of whether you're an admin or moderator or whatever. End that system. We've already established that these systems are not libertarian - the leadership has an opinion (in /r/Science the opinion is that you must post good science) and we want to empower them to enforce it.
Not only that, but provide negative feedback. If you endorse a terrible person, make it impact your credibility.
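One way to read the "negative feedback" idea above (names, weights, and penalty are entirely hypothetical): track a credibility score per user, weight their votes by it, and dock everyone who endorsed a user who is later sanctioned.

```python
# Hypothetical credibility model: votes are weighted by the voter's
# credibility, and endorsing a user who is later sanctioned costs you.
credibility = {"alice": 1.0, "bob": 1.0, "carol": 1.0}
endorsements = []  # (endorser, endorsed) pairs

def endorse(endorser, endorsed):
    endorsements.append((endorser, endorsed))

def vote_weight(user):
    # A user's vote counts for as much as their credibility.
    return credibility[user]

def sanction(bad_user, penalty=0.25):
    # Negative feedback: everyone who endorsed bad_user loses credibility.
    for endorser, endorsed in endorsements:
        if endorsed == bad_user:
            credibility[endorser] = max(0.0, credibility[endorser] - penalty)

endorse("alice", "carol")
sanction("carol")
print(vote_weight("alice"))  # 0.75: endorsing a sanctioned user hurt alice
print(vote_weight("bob"))    # 1.0: bob never endorsed carol
```

The point of the sketch is that endorsements stop being free: vouching for someone puts a slice of your own standing on the line.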
If googling myself found comments I'd make on "platform X," you can be sure I'd carefully consider my comments on that platform.
Having anonymous commenting is important, but if you want a less toxic platform for mainstream comments, real identities + indexing is a good start.
The real WTF moment for online discussion is yet to come. When ML chatbots are able to comment indistinguishably from humans, then purely as a matter of time and scale, comment threads will eventually become chatbots arguing with chatbots, drowning out actual discussion by humans.
Naaah. Gotta monetize every inch of everyone’s life. It’s the American way.
My hypothesis is that Nextdoor primarily appeals to those who already have that kind of territorial "they're all out to get us" mindset since territoriality is literally what the app is about. It's the angry "get off my lawn" old man of social networks and it attracts exactly those kinds of people. Mix in some Internet depersonalization effects and you get something pretty nasty.
My neighborhood is 50%+ non English Speaking Chinese. There are posts almost everyday that say things like..
"Some people in this neighborhood need to OPEN THEIR EYES and stop hitting the gate with their cars. I'll post this in the appropriate languages so everyone and the community can appreciate the significance of my message. Ching-Chang Chow... Bang Bong, Bing bong bing."
Imagine a blank graph. Randomly place 1 million circles on it. No two with identical boundaries. This is our hypothetical Venn diagram of people's preferences for how communication happens online. Drop another million circles to represent how those people will actually act online.
It doesn't matter what points on the graph are labeled "toxicity," "echo chamber," "uncomfortably friendly," or "sparse comments, experts only." It only matters that the circles are not all identical.
I don’t know an easy way to scale that model.
It would also help reddit monetize.
Real real identities (i.e. government-issued digital ID) have never been done. I am sure they will come eventually. The political process is just very slow compared to the pace of technology.
Compare the outcomes of totally anonymous reputation-based forums such as HN, reddit or 4chan with near-real identity forums such as Facebook or Linkedin.
There is a very open flow of ideas and debate on 4chan and other reputation-based forums.
There is at least as much hate speech and trolling going on at Facebook as on HN, yet Facebook has near-real identities.
LinkedIn has at least as much spam and as many criminal phishing posts going on as HN, yet LinkedIn has near-real identities.
HN would not be a better forum if everyone had to register with their government ID. The main benefit would be to make it easier to ban one person from accessing the forum, and silence that individual.
My vote is for reputation-based forums.
I disagree. There IS a solution, but no one likes it.
Segue: Football, aka Soccer, had a problem back in the day. Games became (more?) boring because teams would go up a goal and just play "kick the ball to the goalie". Goalie'd pick up the ball, bounce it, pass to a player who kicked it back to the goalie, rinse and repeat. Then they instituted the back-pass rule - https://en.wikipedia.org/wiki/Back-pass_rule - where, basically, if you pass back to the keeper he can't use his hands, only his feet. A generation of goalies had to learn to play with the ball at their feet, and the bit people enjoyed happened more often.
Rugby Union has had a similar evolution, although more intense. Rugby's goal is to have a game that can be played by people of all shapes and sizes, and they mostly succeed. George Gregan was a great player at 5'9", and most second rowers are over 200cm/6'7". But rugby always gets boring, because the coaches start playing boring rugby (lots of kicks, lots of positional play, less running). So rugby, every few years, overhauls the rules. For a year or two the game is exciting again, IMHO the best sport in the world, then it is boring all over again as coaches play it safe. We then get another overhaul.
I think the Soccer back-pass rule and rugby's ever-changing rules are models of how changes in rules can dramatically affect the quality of an activity, but I don't think they are doable for online sites at scale. Instead, the only solution I can see working is to constantly move to new platforms.
I started in SEO circa 2001, and there were heaps of forums run on phpBB et al. They had all recently started, so they were figuring themselves out. The forums all interacted, but they all had their own feel and their own rules. Over a few years they developed "personalities", and everyone could find a place that suited them. This "personality" then morphed into a kind of group think, and the forums grew into echo chambers where the rules were strictly enforced to keep out the others. Each forum became hostile to all the evil others, and they devolved from fun places to hang out into the same old same old.
SM then came along, with sites like FB, Digg and reddit being born, and this process started again. A new place, no real rules yet, and it was the same exciting process of discovery. Over time, the bad parts set in, and these places became stale echo chambers filled with all the bad bits everyone talks about.
That's why I think the only real solution is to tear it all down and start afresh, because this process has repeated several times now. I think that partly explains why SnapChat and the other platforms exist, and why, IMHO, SnapChat et al will grow to a point and then fail to grow anymore, as the "freshness" fades and all the nastiness intrudes. Unless sites can figure out a way to change this process, which, after almost two decades of watching it repeat, seems either unlikely (pessimistic) or too soon to tell (optimistic).
TL;DR when a platform gets entrenched, it starts to exhibit more nasty traits, and a new platform started afresh is the only solution.
The judicial system determines whether someone is guilty based on evidence that it itself filters.
What about more complex things? Like what if I say "P = NP"? Is that a lie?
And then, do you want a trial for every comment? That will not work even for a fraction of comments.
That can't be determined to be a lie unless someone solves the problem. It's only a conjecture or assertion until then.
>And then, do you want a trial for every comment? That will not work even for a fraction of comments.
It only really has to apply to political statements made by the most powerful office holders, and only when contested, and only when there is an imminent intent to deceive and impact policy.
It's not really an all-or-nothing situation. It's just a matter of how much can be achieved within a reasonable cost.
All within the context of rampant financialisation, land policies returning us to feudalism and continuous lying about the banking bailout including removing the one candidate who was going to take on the banks (Bernie).
Maybe it's not "the internet" that's the problem. Maybe the dissemination of information as we slide into this rentier hellpit is causing people to be pretty pissed off?
- European feudalism was an unusual system in that the government had no taxing power. The lord who owned land held the taxing power ("feudal dues") over that land, and the king funded himself by collecting feudal dues from land he owned personally, rather than e.g. by taxing the dukes. This might contrast with a more advanced state of civilization in which the caliph / emperor / whatever collected taxes directly from everywhere by virtue of being the supreme ruler, and paid a salary to his lower administrative functionaries. Or it might contrast with a system where the use of land wasn't much of a source of taxes. Or both of those latter things might be true simultaneously.
The US system of property taxes has a lot in common with the system I've described above, and some obvious differences. Similarities:
- The federal government ("king") can't assess property taxes. Only the states ("local lords") can do that.
- People other than the government cannot own land outright, but must pay the property tax ("feudal dues") for its use every year.
Of course, rather than the federal government receiving tax income based on federally owned land, it instead double-taxes the citizens of the states. (But on mercantile revenue rather than on land.) This is arguably worse than the feudal system.
Night and day.
"Devolved local state" is slightly more accurate than "private landlord", because unlike a landlord in a more commercialized society, a feudal lord was not legally able to sell the land he owned. He was legally able to govern it.