Humans are hardwired to dismiss facts that don’t fit their worldview (niemanlab.org)
145 points by hhs on Jan 31, 2020 | 125 comments



Twice I have been through interviews that use hours of psychology assessments to filter candidates. One of those was a large hedge fund. Through that I have learned that objectivity is a measurable and identifiable personality trait that is astonishingly rare.

Most humans are hardwired to seek mutual security, which means banding together as a stress-coping mechanism. That incidentally means a natural revulsion toward originality, objectivity, individuality, and even honesty. More important are group conformance, mutual reassurance, and perceptions of social security.

This is readily observable when you take a group of people who have never worked together and are generally unfamiliar with the outdoors and drop them into an outdoor military training exercise, such as college ROTC junior classmen. They naturally congregate closely together and wait for somebody to tell them what to do, showing minimal or no initiative, all of which is a bad idea in modern military training.


Re: I have learned that objectivity...is astonishingly rare.

As somebody who has been in many political debates, I have to agree. A good many people are addicted to authority figures telling them how to think.

Another frustrating aspect of human nature is that most people don't know how to be logical. I strive to be logical as a personal goal, but it's often not valued in the workplace, or anywhere else.

Keep in mind "logical" is not necessarily the same as "being right", at least not my working definition. Being logical is being able to back up your viewpoint and claims using a clear chain of logic. I define any assumptions I make as givens, and my conclusion can be formed directly from those givens. I don't necessarily write it out in formal logic, but if one questions a specific informally-written step, I can and do formalize/clarify it.

Most don't return the favor. "My guts have proven accurate and I'd rather trust my gut feelings" is a common response. Screw Guts, I want logic, damn humans! Human guts have proven dumb.


The problem with trying to "define any assumptions" and use "a clear chain of logic" is that, if you do this right, you will realise that there are really very few things you can actually say about anything at all. This is a problem when it comes to day-to-day dealings.

On the one hand, the average person does not agree on what can be construed as "self-evident". On the other hand, what seems self-evident within our limited capacity is often misguided or frustratingly incomplete. Even if we sorted both of these out, being logical means having to deal with details. My experience has been that the average person does not like to deal with details.


First you find out what each side agrees on and what they don't. Part of the "logic" exercise is teasing out such info. The two sides may not agree in the end, but at least each has a better idea of what assumptions (givens) the other is using, and maybe can test those once they're narrowed down.


> you will realise that there are really very few things you can actually say about anything at all

This really only applies if you require absolute certainty. But most of the time being reasonably certain is enough to operate in the world. And you can be reasonably certain about lots of stuff.


Ask each side to give a probability. If both sides have a similar estimate, then you move on to the next logic step(s). Don't bother debating things you already agree on. We don't have to dissect everything.

If it varies too much, then explore the reasons for the different estimates. Oftentimes it comes down to different past experience, which is obviously going to vary per individual. (Ex: "when users see X on UI screens, they usually do Y, in my experience".) But at least one can tie the recommended decision logic to observed patterns from history. Narrowing down the point of disagreement is often still helpful even if both sides still disagree.


I suspect that there are many situations where the "facts" are poorly established or just wrong, and "logic" is just a set of assertions rendered plausible by the rhetorical skill and popularity of the debater.

Stubbornness may be a survival trait for a social animal. We humans are flooded with facts and logic contrived to cause us to sacrifice our own self interest in favor of someone else's. I've even read that the game of social manipulation is why we evolved our huge brains.

I was born stubborn, according to my parents. The same people who accuse me of being stubborn are often the ones who are the most vigorously opposed to gathering additional information that they can't predict or control.


The "logical" thing to do in most real world situations is to be extremely uncertain. I think just about only way that logic enters into real world considerations is through maintaining a set of not-logically-arrived assumptions and heuristic arrived at models and then using some logic and math to keep your behavior consistent.

A lot of what people call logic are things like "not expecting a different result from the same behavior" and "not following an impulse with no relation to reality", but even those are heuristics. Usually good heuristics, but not logic.


Re: A lot of what people call logic are things like "not expecting a different result from the same behavior"

That can be semi-formal logic if it's stated as a given rule-of-thumb. Most of the time your debate partner(s) will agree with such rules of thumb, barring some odd edge case. If an edge case is allegedly in play for the context, that path is then explored in more detail.

In IT, rules of thumb often conflict, such as "a stitch in time saves nine" versus "keep it simple" (YAGNI). In such cases I often use labor hours and finance-like decision-making techniques, not unlike deciding on insurance: what are the monthly premiums (ongoing costs), how likely is the targeted/prevented event to happen, and what is the cost of not having the prevention or preparation in place? It's not a perfect model, but it at least gives a numeric framework to discuss rather than "I feel that...". Probabilities and money (labor hours).
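
As a rough sketch of that insurance-style comparison (all numbers and names below are invented purely for illustration):

    // Hypothetical numbers: compare the certain labor-hour cost of building
    // a safeguard against the expected cost of absorbing the failure.
    public class PreventionMath {
        public static void main(String[] args) {
            double buildHours   = 40.0;      // one-time cost of the safeguard
            double upkeepHours  = 2.0 * 12;  // the "premium": 2 hours/month for a year
            double failureHours = 400.0;     // recovery cost if the event occurs
            double failureProb  = 0.10;      // the group's agreed yearly estimate

            double withPrevention    = buildHours + upkeepHours;   // 64 hours, certain
            double withoutPrevention = failureProb * failureHours; // 40 expected hours

            // Here "do nothing" wins on expected hours; the safeguard only
            // pays off if the agreed probability rises above 64/400 = 16%.
            System.out.printf("with: %.0f hours, without: %.0f expected hours%n",
                    withPrevention, withoutPrevention);
        }
    }

The point isn't precision; it's that the disagreement gets pinned to a couple of numbers (a probability and a cost) that can be debated separately.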


It's not always logic vs. guts. Logic is great when you are dealing with theoretical problems that are well defined, where you know the variables involved, etc. Real-life problems often have way too many variables, are time-bound, and aren't as well defined. It's in those cases that people resort to gut.


I've found the lack of an ability to use basic logic to be extremely widespread, down to simple conversational matters. It severely impedes and distracts conversations (online and off). I wish it were primarily a problem when dealing with theoreticals, but I find it's a problem just about any time you converse where there are many people. Emotionalism is rampant.

I have yet to find a forum - in ~25 years of near daily discussions on forums online - where incredibly obvious logic failures don't happen persistently. And I'm only talking about the basics, nothing complex.

The worst and most common, I believe, is: because you said X, you must therefore believe, endorse, or be implying Y. It's some kind of emotional transitive-property logic failure.

Example concepts:

I say: Bush did X. Response: Yeah, but Obama did a thing; he's worse; how can you support him?!? (Obama isn't part of the conversation at all; there was no statement endorsing Obama whatsoever.)

I say: the US did X. Response: Yeah, but Russia (or China, etc.) is evil and did a thing; they're even worse because of a thing! (The other countries aren't part of the conversation at all; I never suggested the other countries are good or bad, or did or didn't do a thing.)

I say: I'm in favor of an immigration system like Canada's or Australia's. Response: How can you support internment camps and murdering children at the border? Or, more calmly: Why are you against immigration? (There was no mention of being against immigration at all; Canada allows plenty of immigration via their approach.)

Some of it is obviously an emotional attempt at diversion, an irrational reflex to change the conversation away from what it's pointing at for one reason or another. Logic is in part about self-control and my observation is that it's a rare quality.

I've yet to find a forum where this doesn't happen constantly, basically in every large thread. You spend half your effort on forums either trying to pre-empt very primitive logic-failure responses through how you structure what you write, so you don't have to waste your time later correcting people, or wasting time responding after the fact and noting that no, in fact, you didn't endorse X, Y, or Z.


I always feel ambivalent about this kind of argument. I agree that there is a lack of rational discussion, but on the other hand, if someone really cares about Y, then they will be quite defensive about perceived attacks on X, a tangentially related topic.

Ideas and subcultures are in a constant fight for mindshare. I dislike how pervasive and hypocritical this battle is, but it is not as if there is no reason for it.


I suppose this is a fundamental flaw of 'online communication': it has low bandwidth when it comes to conveying information.

We humans spent most of our history interacting with people in a more direct way, and having our 'gut' conclude things about another person based on what they say, what they do, how they look, how they sound, and so on was and is important to our survival.

If I meet a person face to face who says 'Bush did X', I might also notice the tone of voice, that they have a Southern drawl, that they're wearing camo and a red hat with MAGA on it. In this situation I'm not likely to conclude that they're pro-Obama despite their critical statement wrt Bush.

I've noticed this flaw in myself as well, and it's frustrating and takes effort to counter. When I see a politically loaded comment on HN, I really have to make an effort to not jump to conclusions.

All that said, I have a little plugin that allows me to tag users and while plenty of commenters surprise me, I find that most are almost shockingly consistent when it comes to which 'bucket' I put them in (right, alt-right, conservative, liberal, socialist, libertarian, evangelical, etc.). So perhaps it's not so strange or inaccurate that many of us jump to conclusions based on very limited information.

(Not that I think it's a good thing to do. I do agree with your comment.)


> If I meet a person face to face who says 'Bush did X', I might also notice the tone of voice, that they have a Southern drawl, that they're wearing camo and a red hat with MAGA on it. In this situation I'm not likely to conclude that they're pro-Obama despite their critical statement wrt Bush.

Stereotypes are convenient when you need to make a decision in a pinch, but they are horribly ignorant and misguided when talking to people. You can easily dispel your ignorant bias by asking a single pointed question.

As far as politically loaded hyperbole and labeled stereotype buckets go, I find that Bush and Obama have far more in common with each other than either does with Trump, and that Trump and Clinton have far more in common. When I look at these people I don't care what their politics are or how charming they are; I am trying to examine their motivations and how they interact with people. I am not sure which labeled bucket that would put me in, and I don't really care, because I despise political labels.


There's another deeper level here.

As with stocks: the price depends on what other people think the price is, so you can be correct and still lose your shirt because everyone else believes something that is wrong. (Which also makes them right!)

Similarly, it's likely that common illogical arguments like whataboutism persist exactly because they are effective in convincing non-logical people and may be a logical strategy if you actually want to achieve some kind of political change.


This is why options are dangerous. Other people set the price of the stock.

That said, stocks are fundamentally safer. If you own company X and it succeeds and profits well long-term, you receive great dividends regardless of the stock price.


Not directly a forum, but...

https://slatestarcodex.com/about/


[flagged]


That is interesting, because there are plenty of logical proofs of the opposite.

There are also plenty of logical proofs of how Schiff conducted a biased investigation.

It is easy to prove any point by selecting your evidence.


There isn't proof of either of those things you said.


That's fine and all, except we should at least recognize when we've decided by gut and when by logic. Unfortunately this doesn't happen in most cases, which means that when weighing alternatives we give equal credit to gut- and logic-based decisions because we don't even know which is which.

Saying "I don't know" is one of the most honest and correct things a human can say about most problems. Then starting from that they may attempt to gather some (partial) information and build a logic to support some solution. But if instead we're starting with "let's do this" then we already lost.


I agree on the honesty part, but in many cases "because I think it is right" is a valid argument.

It is wrong to pretend there is logical reasoning behind your gut feeling where there is none, but there is nothing wrong in not wanting to work with a company because they give off a "bad feeling".


Yeah, our gut/brain is like a machine-learning tool. It will reach a conclusion based on how experience has structured our brain, yet we often do not know the reason why. For simple, everyday things it is very often right, but the more unusual and complex the scenario, the more likely it is to misjudge. And quite often, when we get that gut feeling, we start to search for logical arguments only afterward, at which point there is a danger that we will only look for arguments supporting our gut.


One should strive to avoid relying on one's "gut". First, guts are often biased; second, it's hard to communicate gut feelings. Consider it a bad habit that should be corrected or reduced for things that impact others.


I completely disagree with this; gut feeling is how we are scared of unusually silent alleys, gut feeling is how some people just have the wrong smile and look scary, gut feeling is how we perceive the infinite complexity of the human experience.

It is wrong to codify gut feelings into objective rules ("smiling too much is now a crime" obviously does not work). We should also strive not to be dominated by our gut feelings and to be self-critical about how appropriate they are in any given situation.

Sometimes the rational position is to realize that strict rationality is not the perfect solution to every problem.


Slightly aside; I wonder ...

How large is the difference between arriving at an answer by "using your gut", vs. arriving at the answer using a Machine Learning system?

Especially in cases where the ML system is a simulated neural net?

In both cases you are using a neural net based optimizer, and in both cases it is hard to apply (post-hoc) introspection.


I think this is a false dichotomy.

I value objectivity a lot, but I recognize that there are lots of issues in human interactions that you just cannot solve well with this approach. In trying to break them down into something tractable, you throw out the important bits.

I suspect that sometimes the resistance you find is not due to you applying logic, but instead misapplying it. For what it's worth, I also suspect that sometimes it is more like you present it. And sometimes you just have different axioms, even if neither party can articulate them formally well enough to see this.


It's not a false dichotomy. Objectivity allows you to assess data and make decisions based upon that data, as opposed to a gut feeling. Objectivity is very helpful when it comes to persuading and manipulating people, as it means you are making decisions based upon your observations of them, as opposed to your personal opinions of them.


Our senses are not objective, and our mental processing of said sensory input is even less so.

While generally I do agree that trying to be 'objective' in the way you describe is a solid approach, I've met plenty of people and had plenty of experiences where this backfired because I or others had too much trust in the ability to be 'objective' about something, overruling our very subjective but also correct 'gut feeling'.


That is largely why objectivity is a personality trait. Most people, at least non-cognitively, don't want objectivity. They want comfort or reassurance.

You can’t pretend to be objective when the appearance of such is convenient, because then you are just lying to yourself about dodging the emotional considerations that compose a decision. This sort of self deception is far more common than you might find comforting. Even highly objective people sometimes need guard rails to ensure their decisions are objectively quantifiable.

All decision making is a highly emotional experience in all cases for all people. It’s just how our primate brains work. For highly objective people a careful consideration of data and perceptions of balance are a fundamental part of that emotional experience in a way they cannot be for other people.


What I was trying to say is that in many common cases it is intractable to get the right data, and in others the problem is unsolvable because of different premises. In the latter case it is of course good to be objective about what the premises are, but often you can't derive a solution there; it's possible, but it may not be worth the time needed.

Handwavy: often what you need is rough-and-ready approximate statistics more than any real logic. You are still approaching it with an objective mindset, but you are being realistic about the limitations of your data and your modeling capabilities.

And at the end of the day a lot of what is important is also subjective.


> Another frustrating aspect of human nature is that most people don't know how to be logical. I strive to be logical as a personal goal, but it's often not valued in the workplace, or anywhere else. Keep in mind "logical" is not necessarily the same as "being right", at least not my working definition. Being logical is being able to back up your viewpoint and claims using a clear chain of logic. I define any assumptions I make as givens, and my conclusion can be formed directly from those givens. I don't necessarily write it out in formal logic, but if one questions a specific informally-written step, I can and do formalize/clarify it.

Logic can only get you so far. Your argument can be perfectly chained together logically, but logic says nothing about whether your premises/givens are true; this is the difference between sound and valid arguments. This is probably what you meant by being logical not always being right. Honestly, you can be as valid in your arguments as you want, but that doesn't always make your argument sound, because when your givens turn out to be false your whole chain of logic is moot and your conclusion is not true (as some might think) but instead inconclusive [1]. Likewise, the conclusion of someone with invalid logical chains, and possibly even correct assumptions, would be inconclusive rather than false, as some may presume (I'm thinking of the fallacy fallacy [2]). Sure, it's important to have a valid argument, but people with unsound valid arguments can create a lot of confusion too, possibly even more so.

> Most don't return the favor. "My guts have proven accurate and I'd rather trust my gut feelings" is a common response. Screw Guts, I want logic, damn humans! Human guts have proven dumb.

The givens you describe when setting up your logic could also be categorized as gut feelings. As they say, it's turtles all the way down.

Now don't get me wrong: I agree we should be more logical in our arguments. I just think people can take your reasoning too far, which is why I added some input.

[1]: https://simple.m.wikipedia.org/wiki/Argument_from_false_prem...

[2]: https://en.m.wikipedia.org/wiki/Argument_from_fallacy


Re: but logic says nothing about whether your premises/givens are true; this is the difference between sound and valid arguments.

As I mentioned nearby, finding each side's "givens" is part of the process of narrowing down the points of disagreements.

For example, after long debates with die-hard conservatives I've discovered that they are not necessarily against social safety nets, but rather they feel that the more society relies on them, the more powerful gov't gets in general, giving it more control over OTHER things, not just safety nets. They'd rather risk dying than let "the beast" grow. That's their personal choice. It's the "feeding the beast" theory. But they won't admit this up front; one has to tease it out of them.


> As I mentioned nearby, finding each side's "givens" is part of the process of narrowing down the points of disagreements.

I think this is a very good way to look at it. If you can find each other's givens, you can know how you differ. But one given is not always more correct than another; sometimes it just shows a different perspective. For example, by changing certain axioms of Euclidean geometry you can create other, non-Euclidean geometries. Neither is more correct than the other; they're just true under different circumstances.

> For example, after long debates with die-hard conservatives I've discovered that they are not necessarily against social safety nets, but rather they feel that the more society relies on them, the more powerful gov't gets in general, giving it more control over OTHER things, not just safety nets. They'd rather risk dying than let "the beast" grow. That's their personal choice. It's the "feeding the beast" theory. But they won't admit this up front; one has to tease it out of them.

I'm a pretty die-hard conservative myself, and social safety nets are good for the downtrodden, those who are disabled, and those who cannot work for whatever reason. But sometimes what I've heard is that they disincentivize work for those who are able, and this hurts not only those being taxed but also the economy, and likely the able-bodied person collecting welfare. I think feeding the beast is another problem; the government obviously needs to reduce spending, and cutting spending on government programs could help with that, in my mind. Do economists really agree on these things, though? Is the given that feeding the beast is inherently bad really true? Feel free to educate me; my mind is a pretty open book. I think the main problem with big government is that it could lead to tyranny and less economic freedom.


Re: but sometimes what I've heard is that they disincentivize work for those who are able

I agree it can be hard to distinguish between the truly needy and intentional slackers. But it would cost even more tax money to inspect harder to find out; inspectors ain't cheap. It may be counter-intuitive, but it's sometimes cheaper to let some degree of riff-raff slip through the cracks. The third alternative is that nobody gets anything.


> I agree it can be hard to distinguish between the truly needy and intentional slackers. But it would cost even more tax money to inspect harder to find out; inspectors ain't cheap. It may be counter-intuitive, but it's sometimes cheaper to let some degree of riff-raff slip through the cracks. The third alternative is that nobody gets anything.

That is exactly what I’ve heard and I completely agree with your statement. Although I never really meant to imply that we should have investigators follow people around and kick them off welfare.

Again, I'm not against welfare for the disabled, poor, and destitute, but I also think it disincentivizes work for those who are able, and that people are disincentivized to work for economic reasons and not just social ones. The economic reasons we can fix, and the social disincentives might be fixable too, but I wouldn't think hiring inspectors is the right way to do that. For an example of an economic disincentive to work in the US welfare system: if someone makes just enough money to lose particular kinds of welfare because they got a job, or even a better-paying job, they may end up making less overall, which could disincentivize them from working or from holding a better job. The answer, as pointed out here, is to wean people off of benefits in a way that would make financial sense to the individuals themselves [1].

Basically, my earlier point that these bad incentives can hurt the people on welfare themselves is this: if we disincentivize people from climbing the job ladder above a certain rung, they may feel inclined to always stay below that rung. That leads to their own poverty and hurts the economy as well, both because we have to continue paying their welfare and because they don't contribute as much to the workforce.

I understand that, for the most part, people on welfare personally intend to pass that barrier/rung and get off welfare, but they still face some amount of disincentive to do so; if the welfare program were designed better, maybe it would have better incentives.

[1]: https://www.reddit.com/r/PoliticalDiscussion/comments/32rr5p...


> A good many people are addicted to authority figures telling them how to think.

To be fair, if you have (for example) a high school education, simply taking the word of someone who has (for example) a PhD in X subject is a best practice.

From the article:

> In theory, resolving factual disputes should be relatively easy: Just present the evidence of a strong expert consensus.

I've spent time in an anti-vax community (it was a good source of useful health info if you could sidestep the emotional baggage and conspiracy theories), and some members had PhDs; at least one was also a published author. So they had their own respected and credentialed experts to argue against the respected and credentialed experts that friends and family were trying to use to attack them.

> Most don't return the favor.

Regardless of argumentation style or approach, most people are basically saying "This is my best understanding of life, the universe and everything." This includes people claiming to have the logical high ground for some reason, and lots of people have been burned by those looking to win the argument while making no effort to understand why others might see it differently.


Good point. The world would be a better place if we at least learned /whom/ to trust when it comes to expertise...


I've lost track of how many people around me claim they're "logical". In reality, they're perceived by others exactly as you're perceiving others here (yes, me included, of course).


> Another frustrating aspect of human nature is that most people don't know how to be logical.

Many human philosophies reject logic (or any mechanical system of derivation) as a valid epistemological device. Logic in the West was heavily shaped by Greek philosophers like Plato and Aristotle, and then by Latin Catholicism's embrace of them in medieval times (logic is not particularly important in the Eastern churches). Without the effort of the Church in the West, logic would not exist as a common form of 'proof' today. In Arabia, Greek logic was embraced by the Muslims. But it is not a universally embraced philosophical framework.

In fact, there's not even any such thing as 'logic'. There are multiple logics in Western thought (and in mathematics and philosophy and even computer science) that are often at odds with one another and cannot be reconciled. Western thinkers like Gödel even admit that logic cannot explain everything.

Of course, 'logic' (and it's now in quotes to refer to your conception of it, which we've already established is ill-defined given the number of extant logical systems) in normal discourse now includes the law of the excluded middle, which undermines the entire system and is incompatible with logic's own embrace of rationality.

Let's take a quick tour around the world though.

Several branches of Western philosophy do not admit logic (the way you think of it, at least) as a valid epistemology:

1. Mathematical constructivists reject the universal applicability of the law of the excluded middle. In the intuitionistic logic they use, the phrase 'P is either true or not true' is not one that can be assigned a truth value. This is not to say there are no propositions that one could argue successfully are individually true or untrue; they just reject the universal applicability of the law. This is really important, because in 'formal logic' as you conceive it, the law of the excluded middle makes the entire logic unsound.

2. Pragmatists (John Dewey, Charles Sanders Peirce) often reject logic outright (https://en.wikipedia.org/wiki/Pragmatism#Logic).

Many branches of non-Western philosophy do not admit logical inference as fully valid, or they augment it such that it no longer fits Western conceptions.

1. For example, the Charvaka philosophy from India rejects logically inferred knowledge as always doubtful. In that philosophy, the only unqualifiedly true knowledge is that which one observes. Studying 'logic' in an attempt to be 'rational' is not useful in that system: https://en.wikipedia.org/wiki/Charvaka.

2. The Vaisheshika school of philosophy accepts inferences based on prior knowledge, but also accepts as true the testimony of reliable experts (shabda), thus invalidating logic as the sole means to truth (truth in the Western sense). https://en.wikipedia.org/wiki/Pramana#Vaisheshika_school

3. The Mohist philosophy of China (from the followers of Mozi) admits analogy as a form of knowledge, but rejects formal logic in the Western way.

4. Zen Buddhism often castigates logic as not only useless but potentially harmful in one's quest to arrive at truth: http://ccbs.ntu.edu.tw/FULLTEXT/JR-JOCP/jc26600.htm

Your idealization of 'logic' (and given that you didn't even specify which logic, I'm taking it to mean you think there is only one, and that it is right) is, when considering the grand arc of human history, indefensible. You argue that it is 'right' and the way to the 'truth', but you can't even fathom that many societies, cultures, groups, and philosophers did not think it useful or necessary or valid.

Simply believing logic is right, without any consideration of an alternative, amounts to a religious belief. Indeed, given that logic remains popular mainly due to the theological musings of Thomas Aquinas, it is, moreover, a heavily Christian (from Aquinas) or Muslim (from the Muslim theologians studying Aristotle) belief.


>Many human philosophies reject logic (Or any mechanical system of derivation) as a valid epistemological device.

People can reject what they want, but rejecting hammers doesn't make them less capable of hitting nails.

>logic remains popular mainly due to the theological musings (...)

It's a flourishing field, popular for its usefulness in formal proofs, etc., not a dead remnant of the past.


Logic is not universally useful in formal proofs because logic as you conceive it is incomplete. This is a provable result in the basic logic one would learn in an introductory formal methods course.

Real formalization either does not admit very obvious axioms, like excluded middle, or cannot prove theorems we would consider obvious, like function equivalence.

In formal methods in computer science and math, we do not yet have a complete logic that can encapsulate all the things we want to prove (homotopy type theory is an attempt, but there is no accepted computational interpretation). Stop pretending 'logic' is a thing that is self-evident.
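
To make that concrete, a minimal Lean 4 sketch (my own illustration; the theorem name is made up): the core logic is constructive, and the excluded middle only becomes available by opting into the classical axioms. Classical.em is derived from the choice axiom via Diaconescu's theorem, not from the basic inference rules.

    -- Excluded middle is not a primitive inference rule in Lean 4's core logic.
    -- Classical.em is proved in the library from the axiom of choice
    -- (plus propositional extensionality), via Diaconescu's theorem.
    theorem em_demo (p : Prop) : p ∨ ¬p :=
      Classical.em p

    -- #print axioms em_demo  -- reports: propext, Classical.choice, Quot.sound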


>logic as you conceive it is incomplete

Don't we all know that, at least since Gödel?

>very obvious axioms, like excluded middle

I don't consider it something obvious. If you read Nagarjuna's Mulamadhyamakakarika, you might even consider it something inappropriate.


> I don't consider it something obvious.

Then that puts you in the minority of those using 'logic' when arguing about non-mathematical things. By and large, most lines of reasoning engaged in by those attempting to be 'rational' rest on the excluded middle.


Implicit learning is real. Some guts are more accurate than others. Just like not everyone is a great logician.


I like your goal.

I also wrote a literature review on when humans should and should not trust their gut.

It was a personalized assignment that I fought to obtain as it was not a standard assignment.

Writing it taught me a lot.

If you’re interested, email me and I’ll send it to you.


>Through that I have learned that objectivity is a measurable and identifiable personality trait that is astonishingly rare.

How does one test for this? I would like to mark my own belief to market.


Here is what it was like in the cases where I interviewed. Both involved several, maybe six, online assessments of intellectual and emotional health. Think of these like IQ tests, except that they are divergent tests instead of convergent tests: they test for things like vocabulary and creativity instead of multiple-choice questions with a previously identified accepted answer.

After testing at the hedge fund I was able to see my scores and assessments. The other place was an elite military group, and I was not allowed to see my scores unless I accepted the job. The military group followed the on-screen tests with a one-on-one session with a psychologist.

I passed both assessments but chose not to accept either job, due to unrelated factors: a competing job offer, bad timing, life circumstances.


Still interested in boosting personal effectiveness with teammates who apply some of those ideas? Email in profile.

PS. You don't really pass/fail the assessments in a binary way. They tend to inform whether and how you and others can complement or guard-rail each other on a mission as a team.


It's easiest to tell when someone is in a mildly stressful situation, and to see if they can focus on the task and evaluate tactical options to move forward, or if they get defensive.

Also, hello username buddy.


Defensiveness, stress tolerance, and problem solving are not objectivity.


>A human being’s very sense of self is intimately tied up with his or her identity group’s status and beliefs. Unsurprisingly, then, people respond automatically and defensively to information that threatens their ideological worldview. We respond with rationalization and selective assessment of evidence — that is, we engage in confirmation bias, giving credit to expert testimony we like and find reasons to reject the rest.

>Political scientists Charles Taber and Milton Lodge experimentally confirmed the existence of this automatic response. They found that partisan subjects, when presented with photos of politicians, produce an affective “like/dislike” response that precedes any sort of conscious, factual assessment as to who is pictured.

>In ideologically charged situations, one’s prejudices end up affecting one’s factual beliefs. Insofar as you define yourself in terms of your cultural affiliations, information that threatens your belief system — say, information about the negative effects of industrial production on the environment — can threaten your sense of identity itself. If it’s part of your ideological community’s worldview that unnatural things are unhealthful, factual information about a scientific consensus on the safety of vaccines or GMOs feels like a personal attack.

If you are objective, you are not threatened by surprising new information, even if it might have some negative implications that jump to mind. You evaluate it against every other relevant information source you have on the topic, sometimes requiring more complex chains of information, and assign a level of trust to the new info.


> This is readily observable when you take a group of people who have never worked together and are generally unfamiliar with the outdoors and drop them into an outdoor military training exercise, such as college ROTC junior classmen.

Thinking about it, the only way this course of action is "not logical" is that it is not "deciding on a course of action and following it regardless of the approach of one's fellows", or maybe that it's not oriented toward reading the terrain instead of following emotions. But all of these are more like skills or "personality makeup", and as such they aren't even applicable to every situation. If you're entering a situation where individuals will be bonding over a long period, congregating with one's fellows may be a good or "objective" strategy.


In a tactical environment the problem with clustering is that modern weapons will quickly take out everybody. There needs to be some space between people, maybe a couple of meters.

The problem with a lack of initiative is that it undermines security. If you don’t know what you should be doing then at least you can secure part of the group’s perimeter by seeking cover and looking outward until told otherwise.


Sure, but I don't think you're talking about "logic" as such. It is more a matter of thinking about the conditions of the environment.

I.e., you're talking about specific skills and understanding as if they were a general thing, "logic". Not that I'd disagree with the comment about what should be done.


How is that measured? What are the tests that measure for it? TIA.


For what jobs were you interviewing?


Isn't that obvious? Think of the human brain as a Byzantine-fault-tolerant distributed system. You are collaborating with other "nodes" that may be malicious or manipulative, or may omit critical information. Therefore every message you receive has to be validated, but since there are no objective criteria that can be used to form a global consensus, the only way to validate messages is by drawing upon your existing knowledge. When you are young you are more receptive to messages because you haven't gathered enough knowledge. Your "worldview" is being developed, and then, once it has matured, you can use it to validate further messages.

Lock-in happens when you build a whole network of beliefs that are fully dependent on each other. Admitting A can mean that B is no longer valid, but you are fully convinced C must be true, and if C is true then B is true as well. Therefore denying A is the only way to maintain your worldview.


Yeah, I'm still looking at the headline and I'm thinking "and why is that a bad thing?"

I mean, the problem mainstream commentary is really talking about is the accumulation of irrational, garbage worldviews. But the mainstream is hampered in its ability to describe this in particular.

Essentially, journalism, in America especially, has presented itself as having a "non-worldview", "just the facts", while really holding a secular, pro-science, pro-capitalist, etc., position. A variety of viewpoints could be considered, but not an infinite variety. And the endorsed variety was the "unbiased" one, supposedly willing to consider anything. And it seemed plausible for a bit.

Moreover, this mainstream view certainly wanted some things to be taken fully on authority while other things could have their validity debated. In particular, schools and newspapers can't come out and say religion is bunk; they have to couch things as "religion is a fine thing but shouldn't impinge on the domain of science or the state". And when you're talking about a worldview that is, in my secular worldview, screamingly irrational, that leaves a big hole in your position.

My personal, biased view is that the rise of an educated populace in the 1960s led to that populace questioning a lot, and to the elites then being happy to gut education. That led to an uneducated populace more ready to embrace idiocy. Researchers say "maybe people should have tools to sort facts from trash", but how did the situation start anyway? Sure, the Internet accelerated this kindling, but you had problems with the rationality of the citizenry beforehand.


> This approach succeeds most of the time when the issue is, say, the atomic weight of hydrogen.

The article is interesting, but I think it misses one critical point. Most "facts" aren't as clear-cut as the weight of hydrogen. Most answers to questions will differ according to conditions. As an example, not all people will respond the same way to a treatment regimen. Low carb doesn't work for everyone, for different reasons, so most "facts" on low carb will likely be disputed.


Aren't Southern history books still teaching the War of Northern Aggression?

https://www.washingtonpost.com/archive/politics/2002/03/26/c...


Anecdotal, but I'm from the South and learned it as the Civil War. However, I wouldn't be surprised to learn that many schools teach it as the "War of Northern Aggression", given how it's viewed in rural areas.


Not just in the South: I have a friend who grew up in eastern Washington State whom I had to disabuse of that notion.


If we're talking about weight loss, low carb does indeed work for everyone; it's just that not everyone can stick with it, because the link between brain and gut is complex and strong, physical activity levels vary (which affects the speed of progress), etc.

It is a fact that there is no process by which someone eats less than 20 g of carbs per day and does not lose weight.


...except eating 4000 calories a day of fat and protein.

Conservation of energy is the law.


Sure, yes, of course, but protein especially is harder to digest, which means that for a constant mass of food eaten, lower carb will cause faster weight loss than a higher proportion of carbs in the diet.

Also, regarding your HN bio: Google no longer claims "don't be evil" as a motto, as of 2018.


Wow. They've got us over a barrel if they don't even bother to lie to us any more.

As for the ease of digestion, I know protein has the same caloric content as carbohydrate (both 4 cal/g). Fat is 9 cal/g. Seems straightforward to me.


It is remarkable how these debates reveal how many people actually have a "scientific" attitude: generally very few.

There is no such thing as "done science". Science is about hypotheses that are questioned and challenged in order to come up with better hypotheses.

I'm astonished that this simple thing is not evident in much of the debate.


You have to be aware that human nature comes with greed, dishonesty, and countless other nasty things. Maybe what we call "gut" is another checkpoint to guard us against the real monster inside each of us.

Take GMOs: I am still skeptical of the science behind their safety, especially knowing that corporations fund these starving scientists. There is, for example, a hidden MIT study (unpublished, maybe due to retaliation fears) that found 10 correlations between diseases and the rise of GMOs. Feel free to fact-check; HN is great at this (study below): https://people.csail.mit.edu/seneff/glyphosate/NancySwanson....

You can accuse the less educated or less talented of bias all you want, but ultimately the smartest elites lie for personal gain more often than the humble and innocent rest (the 99%)... Perhaps we should be pointing fingers at our own society, which glorifies lies and hides truth. Ex: "relative truth and morality" was invented by this culture.

Perhaps there is something in our human nature that still strives for truth after all, but it is well hidden among all the lies and the rest of our mostly evil nature. Some people call it "the moral law". There is still a sliver of hope...


Fun fact: whether you agree or disagree with this premise, you will prove this point right.


Nicely done.


Does the research cited distinguish between hardwired vs culturally (or developmentally) entrenched behaviour? I haven't followed all the links so I'm asking, not answering, but if the research doesn't make that distinction, then the 'hardwired' aspect is unwarranted, and the title should have been something like "Contemporary humans tend to dismiss ...".

In general, most social psychology research is still conducted on WEIRD subjects (Western, educated, industrialized, rich, democratic), and if the current Euro-American 'global' culture continues to spread, this is going to become an increasingly hard problem to tease out. Richard Nisbett and others claim to have shown that many cognitive-behavioural regularities merely assumed to be 'hardwired' turn out not to be universal, and therefore not hardwired at all.


And next to that is a link to https://arxiv.org/abs/2001.10488, "Statistical Consequences of Fat Tails: Real World Preasymptotics, Epistemology, and Applications" by Nassim Taleb, which states: "Many 'biases' found in psychology become entirely rational under more sophisticated probability distributions".

May be "hardwiredness" is not all that hard wired. Or, put it other way, the hardwiredness is not in humans under study but in humans who study them.

I don't think humans would have achieved what we have today if we were hardwired to dismiss facts that don't fit our worldview.


Obviously, we aren't absolutely unable to consider other views; if we couldn't, people would never change their views, and yet they do. Furthermore, evidence isn't the same thing as a self-sufficient body of premises: we interpret evidence in light of prior beliefs, and there are disagreements where those prior beliefs are concerned. The truth is often difficult to know, and even when you have it, convincing others is still going to take a lot of work. Don't trivialize that difficulty.

What this article is really circling is not these things, which are natural and with which there is nothing wrong per se, but cultivated prejudice. Cultivated prejudice can be overcome, but it requires humility, that is, an openness to considering other views to see whether what they claim is plausible, likely, necessary, or whatever. Should you change your mind, it may also require courage to state your change of beliefs to your milieu, but it is possible, even if risky in some circumstances.

I would caution against the opposite errors of relativism and intolerance toward others’ opinions. What has happened of late (a matter of decades) is that quotidian political squabbling has turned into a full-scale war because of the Left’s vicious and total attack on Western tradition and the very foundations of Western culture (in most cases, in profound ignorance of the depths of that culture). In turn, many on the Right have overcompensated by moving away from the cautious, principled pragmatism of conservatism. If you want a more measured atmosphere, where exchanges of views can take place without the strident nonsense that dominates today, then the Left has to stop its total, emotionalist, destructive, oikophobic siege on “the West”, especially as it offers no better alternatives. Nature abhors a vacuum and it will be filled with all manner of demons far exceeding the boogeymen haunting the Left’s imagination.


This article fails to provide facts to support the use of the term hardwired.


>dismiss facts that don’t fit their worldview

I find this disconnect from reality especially fascinating in the software field, where nature's answers are right at our fingertips and we just have to open an editor or launch our program to verify our expectations.

I once told one of our architects that we were creating a lot of (Java) garbage, and he stared at me incredulously before saying: "but... no, we are using factories!" (The code was indeed bloated with factories, but no object was ever returned to them, since no one knew when instances could be released.)
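
To make the misconception concrete, here is a minimal sketch (hypothetical names, not the actual codebase): a factory centralizes construction, but unless instances are pooled and handed back for reuse, every call still allocates a fresh object for the collector to reclaim.

    // Hypothetical sketch: a factory does not prevent garbage.
    class Widget {}

    class WidgetFactory {
        Widget create() {
            return new Widget(); // a fresh allocation on every call
        }
    }

    public class FactoryGarbageDemo {
        public static void main(String[] args) {
            WidgetFactory factory = new WidgetFactory();
            for (int i = 0; i < 1_000_000; i++) {
                Widget w = factory.create(); // used briefly, then unreachable
            }
            // Nothing was ever "returned to" the factory above, so each
            // iteration produced an object the GC must eventually collect.
        }
    }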

In the less verifiable and more nebulous matters of the human psyche and philosophy, all hell breaks loose, which is unfortunate since they are also the most essential. I remember my philosophy professor in college telling us, as we were leaving him for math class: "Ah, now the serious stuff!"


Accurate, but fundamentally it is a type of laziness.

Changing your worldview takes a serious amount of effort. Most people don't use their one-year New Year's resolution gym memberships after the second week of January, and rewiring your beliefs in the face of evidence and reason makes gymming look like sitting on your couch in your undies.

This is why it's fundamentally an education issue. People's false beliefs and weak work ethics are usually baked in back at school.

If school was improved, this problem would be diminished.


This could be a side effect of how intelligence works. Intelligence is at least largely about making parsimonious models, and that involves a lot of throwing away of noise. So this is sort of a failure mode of that system. Sometimes contrarian but otherwise good information gets misidentified as noise.


This is hardly new. Leon Festinger was writing about this in the late 1950s with his theory of cognitive dissonance.


I'm enjoying the amount of pushback this is getting here on HN, given that we all probably think we're not subject to this while actively demonstrating that we are.


> Within the conservative political blogosphere, global warming is either a hoax or so uncertain as to be unworthy of response. Within other geographic or online communities, vaccines, fluoridated water and genetically modified foods are known to be dangerous. Right-wing media outlets paint a detailed picture of how Donald Trump is the victim of a fabricated conspiracy.

To this day you'll find most well-off (and extreme) Democrats avoiding GMO foods, protesting vaccines, and avoiding fluoride in water. I don't think that's a Republican issue. Especially in Portland, which consistently votes Democratic but also votes no on GMOs, fluoride, and vaccines.

Reading that middle sentence, maybe it's a little unclear whether they are tying that to conservatism, but it's definitely surrounded by it.


> To this day you'll find most well-off (and extreme) Democrats avoiding GMO foods, protesting vaccines, and avoiding fluoride in water. I don't think that's a Republican issue. Especially in Portland, which consistently votes Democratic but also votes no on GMOs, fluoride, and vaccines.

I found that claim surprising so I did a quick "fact check".

About fluoride: true. "For the fourth time since 1956, Portland voters reject fluoridation"

https://www.oregonlive.com/portland/2013/05/portland_fluorid...

About GMOs: mixed. "Sixty-two percent of voters in Portland's Multnomah County supported" a measure to label (but not ban) GMOs, which was defeated statewide.

https://www.agweb.com/article/oregon-voters-reject-gmo-label...

About vaccines: uncertain. I wasn't able to find evidence of this.


"Humans are hardwired to dismiss facts that don’t fit their worldview." Fact checkers and journalists are human, and it's not hard to guess at the worldview held by the author of this article.


I'm supposed to be avoiding politics here, but:

I thought vaccines were kind of a 'fringes of both sides' issue until our Republican rep here in Bend, Oregon got the rug pulled out from under her by her own party:

https://www.oregonlive.com/health/2019/05/vaccination-boosti...

I was surprised, I didn't think it'd go down that way.

The sheer vitriol that our rep gets on her social media because of that one thing is quite something to behold.


The unifying thread for anti-vaxxers is a fear of authority. You can find that anywhere. Left, right and center.


Yep. I did not expect the voting to be so partisan, though.


A lot of people believe their side is smarter. Try to prove them wrong and they'll take it as a personal attack, etc.; see the original article.


Well, that's not what I'm saying: I thought that it would be more centrists of all stripes vs. fringes on both sides.

I did not think it was a 'sides' issue.


Vaccine "skepticism" used to be more prevalent among the (mostly left) urban health freaks. But that is changing rather rapidly, see for example https://www.politico.com/story/2019/05/27/anti-vaccine-repub....

This might be a good example actually of people self-sorting into one of the two tribes. And Republicans (not conservatives) have conquered the science-is-bad spectrum so thoroughly they don't need to fear competition in the foreseeable future.


It doesn't make much sense to use "left" and "right" to describe quite a bit of US politics, but even more so out west, where strange combinations of political outlook and motivation are easy to find.


It's not just the US. Where I live (Northern Rivers, Australia), large numbers of people in the hippy areas would situate themselves politically as ultra-left and/or deep green: highly anti-capitalist, anti-racist, etc. They'll attack the right for its hostility to science when it comes to climate change, yet ignore any science relating to GMOs, nuclear power, vaccines, chemtrails, or 4G/5G ("Death Rays!" say the local signs). They're very pro-welfare, so kind of statist, yet left-anarchist on drugs, attitudes toward cops, etc.

People are a mixed bag, and becoming more so (I think). Longstanding political maps are being redrawn almost everywhere it seems.


When looking at specific claimed values, outside of a specific core, not so much.

When looking at general orientation, particularly in terms of concentration vs. extension of establishment power, there are trends persistent across millennia.

This passage from the opening of A.H.M. Jones, Augustus, sticks with me:

It was the agrarian problem that sparked off the violence that was ultimately to destroy the Republic. Tiberius Gracchus' bill [Lex Sempronia Agraria, Ed], enacted in 133 B.C. for distributing the public land, after leaving a generous allowance to the occupiers, in small lots to poor citizens, excited such furious resistance among the senatorial landowners that a group of them lynched Gracchus. This was the first in a series of violent clashes between two groups who called themselves the optimates and the populares. The nucleus of the optimates was the small clique of nobles (men whose fathers, grandfathers, or more remote ancestors had been consuls) who more or less monopolized the highest offices and dominated the Senate, but they had wide support among the propertied class, even, as Cicero says, prosperous freedmen; otherwise they could not have maintained their unbroken hold on the higher magistracies. They were conservatives, who regarded the rights of property as sacred, and therefore resisted bitterly any attempts to redistribute land or cancel debt. They were upholders of the constitution and of religion, which could be used to block any revolutionary legislation. Though at times they had to yield to popular pressure, they always remained the government.

The populares were a much less well defined group. Their leaders were individual politicians or very small groups of politicians, who at intervals attempted to legislate in the interests of the people, by which they meant the common people....

https://old.reddit.com/r/dredmorbius/comments/6i2h0e/ahm_jon...


Portlander here.

Sometimes it's hard being a liberal when you have to deal with the fluoride idiots or the "EM radiation causes autism" crowd... and not just the less intelligent, but intelligent-yet-highly-misinformed people (e.g., I know several doctors who believe Wi-Fi routers might be dangerous and are worried about 5G deployment... yes, I said DOCTORS... show me some double-blind studies, people).

Every group has its wingnuts, but I sure as heck am glad my liberal wingnuts aren't calling for the extermination of nonwhite/non-Christian/non-heteronormative people.

So I got that going for my side. Yay? Feh.


There are no double-blind studies of the effect of cell phones on cancer. That would be impossible: how would you create a control group that thought they were using cell phones but actually weren't? And even if you could magically solve that problem, who would pay to run such a study on thousands of people for years?

And Democrats have their own bigots. They're probably worse than Republicans on appearance discrimination (the entertainment industry in particular, and every Democrat who's ever made jokes about Trump's skin or hair), and minority conservatives are subjected to the most vile racist attacks.

But even if you limit the discussion narrowly, there have been plenty of calls to violence from prominent Democrats, like:

Lea DeLaria: "pick up a baseball bat and take out every f--king republican and independent I see"

Mike Malloy: "But, you know, the NRA members are the current incarnation of the brownshirts... I look forward to ... the Night of the Long Knives when [they are] slaughtered and dumped in the nearest ditches".

Allan Brauer, the communications chair of the Democratic Party of Sacramento County: "May your children all die from debilitating, painful and incurable diseases."

Michael Feingold: "I personally think [Republicans] should be exterminated before they cause any more harm."

But things like that aren't widely reported in Democratic media, because the media (left and right) likes to do the job of filtering facts that don't fit the audience's worldview for them.


Please don't use HN for political and ideological flamewar. It's not what this site is for.

https://news.ycombinator.com/newsguidelines.html


I apologize. I should refrain from responding to flamebait.

But I waited hours before responding to:

> Every group has their wingnuts, but I'm sure as heck am glad my liberal wingnuts aren't calling for the extermination of nonwhite/nonchristian/nonheteronormitive people.

and it was still highly upvoted and had no mod response. I thought someone should respond.


"Don't feed egregious comments by replying; flag them instead."

https://news.ycombinator.com/newsguidelines.html


[flagged]


You can't post like this here, regardless of how wrong someone is or you feel they are. If you'd please review https://news.ycombinator.com/newsguidelines.html and stick to the rules when posting to HN, we'd appreciate it.


> Right-wing media outlets paint a detailed picture of how Donald Trump is the victim of a fabricated conspiracy...None of that is correct

Day 1 after the election. On Day 1 after that guy was elected the media--95 percent of it, in lockstep--said the following: "Russia Russia Russia Russia Russia Russia Russia Russia Russia Russia Russia, etc, etc, etc." For years.

And then they dropped it when the shit didn't stick as planned.

So, yes, the title of the article is correct.


Was curious if this was true. Looked at the front page of papers from day 1 after the election, and couldn’t find a single mention of Russia.

https://www.nytimes.com/2016/11/10/business/media/trumps-vic...


>So, yes, the title of the article is correct.

I take conspiracy to be an active process where folks get together and agree on something. I think what has happened is that folks with different goals acted without communicating.

The Democrats want to prevent re-election/gain removal, the media wants ratings/profit and activists want attention/change. I don't think they all got together, met, planned things out and ran a conspiracy.


The most popular news network by far is Fox News due to a few of its hosts that support the political right. If profit were the primary concern of the other networks, they would hire some conservative journalists to steal Fox's market share. Instead, they have dozens of shows with the same political perspective.


To be fair, the media was also talking about Russia before the election too, and so was Trump himself:

https://www.youtube.com/watch?v=3kxG8uJUsWU

Also, I'm not sure what you mean by "didn't stick as planned":

https://en.wikipedia.org/wiki/Links_between_Trump_associates...

https://en.wikipedia.org/wiki/Criminal_charges_brought_in_th...

but I'm glad we both agree about the correctness of the article title.


Just because no drunks were found in a weaving car doesn't mean investigating it is a "hoax". The Tinted One is just a bad policy driver who kisses up to Mr. P because he likes Mr. P.


What the header says sounds very obvious to me. And it is also encouraged everywhere. Does post downvoting ring a bell?


"Everything we hear is opinion, not fact; everything we see is perspective, not the truth" - Marcus Aurelius


In the book Superforecasting, one of the top scorers in a prediction contest wrote a program to show him news sites whose biases differed from those he had viewed in the past. Social news, including HN, does the opposite.
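Something like that is easy to sketch. A toy version in Python -- the outlet names, bias labels, and reading history here are all invented for illustration, not taken from the book:

    from collections import Counter
    import random

    # Hypothetical outlets tagged with hypothetical bias labels.
    SOURCES = {
        "outlet-a": "left",
        "outlet-b": "center-left",
        "outlet-c": "center",
        "outlet-d": "center-right",
        "outlet-e": "right",
    }

    def suggest_contrarian(history):
        # Pick a source whose bias bucket is least represented in the
        # reading history, i.e. deliberately diversify the input diet.
        seen = Counter(SOURCES[s] for s in history if s in SOURCES)
        rarest = min(set(SOURCES.values()), key=lambda b: seen[b])
        return random.choice([s for s, b in SOURCES.items() if b == rarest])

    print(suggest_contrarian(["outlet-a", "outlet-a", "outlet-b"]))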


HN, unlike many other examples of social media, doesn't show "personalized" news feeds. So in that respect it's not nearly as bad.


I really would like to read a business/product-analyst version of HN, just to see their perspective on tech on a regular basis.

Every other post titled ‘What your developers don’t understand ...’


HN doesn’t show different results to different people, does it? Though I suppose things that the majority disagree with are less likely to be upvoted.


Not based on user behaviours, excepting whether or not users have opted in to "showdead".

Placing new or low-rated stories on some but not all front-page views, effectively by random assignment, was tried in the past; it fared poorly. The idea was that stories that picked up votes would then be ranked more highly. The result was that readers reacted viscerally and negatively to the junk.
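If I understand the mechanism described, a minimal sketch might look like this (the explore rate, slot count, and data shapes are my assumptions, not HN's actual code):

    import random

    EXPLORE_RATE = 0.1  # assumed fraction of page views given an experimental slot

    def front_page(ranked, new_pool, slots=30):
        # One reader's front page: mostly top-ranked stories, with a new or
        # low-rated story randomly swapped into one slot on some views.
        page = ranked[:slots]
        if new_pool and page and random.random() < EXPLORE_RATE:
            page[random.randrange(len(page))] = random.choice(new_pool)
        return page

Stories that picked up votes in those experimental slots would then rise in the regular ranking; per the links below, readers hated seeing the experimental junk.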

Discussion:

Apparently tried, but reverted (from 5 years ago):

https://news.ycombinator.com/item?id=9333178

And from a month ago:

When we tested a variant of this idea, we found that it didn't work for what I guess you could call psychological reasons: readers reacted negatively, even violently, to seeing junk on HN's front page. By "junk" I mean stories that were placed there randomly and that weren't of high-enough quality to otherwise be credible candidates for the front page. Basically, it was an "ow my eyes", "get that shit away from me" effect. (I don't mean to demean the submissions people posted that got randomly placed in this way; it's just that words like junk and shit reflect how users reacted to the random placement thing.)

https://news.ycombinator.com/item?id=21868365

Apparently my theory on front-page submissions and karma accumulation is officially validated:

https://news.ycombinator.com/item?id=10228866


The only way it does (that I know of) is whether you have "showdead" turned on or not.


The headlines are shown to only a subset of people: those who choose to visit the site.


Inability to understand reality will lead to the species' extinction. I guess that's indeed how it's supposed to work.


Although the author was clearly trying for fair "equivalence" claims, I found that his best effort at this fell short:

> As researcher Dan Kahan has demonstrated, liberals are less likely to accept expert consensus on the possibility of safe storage of nuclear waste or on the effects of concealed-carry gun laws.

There's the very big question of what "less likely to accept" really means. Presented with good evidence of the above two points, I believe I would say "well, I accept X, Y and Z, but my objections to this were not limited to those points, and I still object to expanding these because <orthogonal reasons>".

That's quite different from saying "that's fake news! it's all a bunch of crap! i'm not listening to any of that!"

Studies that fail to differentiate between these two responses to "facts that don't fit their worldview" are somewhat pointless.


Hardwired and perhaps trained as well. If you ever develop serious skill at debugging logic problems, it will be because you had no choice but to accept facts that don't fit your worldview, or perhaps because you stopped holding a rigid worldview after realizing, again and again, that you weren't seeing the bigger picture.


A number of times, I've had to debug something where the impossible was happening. The question then became "Which thing that I think is impossible is actually possible? And how do I find out?"


I like to think of it as "it is impossible at this layer", which then can point to a failure at another abstraction layer.
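A tiny classic of the genre (just an illustration, in Python): arithmetic that looks impossible at the decimal layer is perfectly lawful one abstraction layer down, in the floating-point representation:

    a = 0.1 + 0.2
    print(a == 0.3)     # False -- "impossible" if you reason at the decimal layer
    print(f"{a:.20f}")  # 0.30000000000000004441 -- explained at the IEEE 754 layer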


It's a good filter to cut through distracting BS that would otherwise bog you down.

If you're surviving and thriving, your worldview is not so bad.

A worldview is a paradigm of reality constructed over decades. It makes sense to require higher standards of evidence for apparent exceptions to the hard won rules eked out from decades of observation and insight.
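In Bayesian terms (a toy calculation with made-up numbers, not from the parent): a rule held with 99% confidence barely budges under one piece of moderately contrary evidence.

    def posterior(prior, likelihood_ratio):
        # Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
        odds = prior / (1 - prior) * likelihood_ratio
        return odds / (1 + odds)

    # Evidence 5x more likely if the rule is wrong (LR = 1/5)
    # drops a 0.99 prior only to about 0.95.
    print(posterior(0.99, 1 / 5))  # ~0.952

That's the "higher standard of evidence" in action: overturning a strong prior takes either very diagnostic evidence or a lot of it.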


I'm increasingly convinced that worldviews / mental models are not simply modeling devices, but information rejection tools.

Borrowing from Clay Shirky's "It's not information overload, it's filter failure", the world is a surprisingly information-rich space, and humans (or any other information-processing system, biological or otherwise) simply aren't equipped to deal with more than a minuscule fraction of it.

We aim for a useful fraction. It paints an incomplete, but useful picture.

Even a bad model has utility if it rejects information cheaply: without conscious effort, without physical effort, and without lingering concerns or apprehensions. It's a no-FOMO mechanism.

Usually, what happens is that we apply our bad models to a given scenario, act, process the new resulting scenario, notice that it is obviously not favourable, and take appropriate actions to correct the new circumstance. Net loss: one round of interaction. Net gain: not succumbing to analysis paralysis or having to hunt for a new and improved worldview (especially: a new consensus worldview shared with numerous others, creating a large coordination problem).

Sometimes that doesn't work out and people (or companies, or governments, or cultures) get stuck in a nonproductive rut, often characterised by "doing the one thing we know how to do, only harder".

The big problem comes when there's a recognition that a former large-scale world model no longer applies. I'm leaning strongly toward the notion that this is behind many psychological conditions: grief, denial, melancholia, depression, PTSD. Possibly burnout and ADHD.[1]

Classic grief is triggered by the loss of a loved one, or in the "five stages of grief" study, news of the subject's own impending mortality (a fatal disease prognosis). That triggered denial, anger, bargaining, depression, acceptance.

It's a pattern that, once recognised, one sees repeated across numerous scenarios -- almost any disaster, epidemics, global catastrophic risks, wartime attacks, business failures, relationship breakups, and so on.

What's curious to me is what the threshold for grief or denial is. There are some surprises which don't elicit this response: almost all humour is based on the principle of surprise, and horror films and thrill rides are based on the premise of surprise or extreme experience, but rarely result in a traumatic response. We go through our daily lives experiencing small and medium-sized surprises and disappointments all the time. The grief/denial response seems to be triggered only above a magnitude or repetition threshold. (Though that can differ markedly between individuals.)

________________________________

Notes:

1. I'm not claiming that all PTSD, burnout, and ADHD are grief responses, but rather that there are at least strong similarities. Early psychologists linked grief and melancholia (itself then considered a much stronger longing, to the point of mental illness). The mechanisms for overload might be internal -- chemical, physical, illness, injury, or genetic in origin -- or external. But there's a common thread that seems to run through these conditions, ultimately an inability to cope with a level of change.


Heh, you've hit on a point I thought of years ago. That is, the brain isn't so much a learning device as one of the most powerful filtering devices ever created.


Both, probably more.

It filters incoming information, but also determines relationships, matches patterns, and makes predictions.


[flagged]


Funny, but what about those who agree with it? Do they agree with it because it's true? Or just because they've dismissed the facts that don't agree?

This idea means that you can't really trust anything that you think is true, including this idea. To some degree it is, not quite self-refuting, but at least... self-skepticism-causing? I don't know a good word for what I'm trying to say. But it leads you to question the truthfulness of it, at least to some degree.

That said, observation leads one to think there is an element of truth to it. It can be taken too far, though.


> If it’s part of your ideological community’s worldview that unnatural things are unhealthful, factual information about a scientific consensus on the safety of vaccines or GMOs feels like a personal attack.

There's nothing unnatural about genetic manipulation. Nature modifies genes at every level from base pairs to whole chromosome-sets, far more promiscuously than us humans.

You learn all about this if you study biology in college. Arguably, you learn nothing but.


Because I took Biology in college, I can't be skeptical of GMOs?


You can be skeptical of almost anything you want. One thing that surprised me was how tolerant scientists are of dissent. Your views can run from the unorthodox to the downright eccentric and, so long as you pay attention to data and argue politely (and maybe even not so politely), people will listen to you. You can't be a scientist without being humble, and humble people listen even to the weird arguments.

But if you believe GM foods have adverse health effects, and you design a study to look for them, and try to get it funded, you'll be turned down. Literally thousands of scientists have had the same idea and done the same thing, and they all found nothing. Scientists listen to data, so if you wanted to go on looking for health effects, you would probably start with very subtle ones that haven't been looked for yet. If it were me, I would start by trying to find ways genetically modified protein gets past the digestive system, which destroys genetic information by breaking all protein down into its constituent amino acids. Not an easy problem, but better than some of the others.

If you don't want to work on the problem scientifically, you also have the option of thinking about it non-scientifically. A non-scientific opinion can be whatever you want, but holding one means lumping yourself in with a lot of people who object to genetic engineering without knowing the central dogma or even what a gene is.

https://www.nytimes.com/2018/04/23/well/eat/are-gmo-foods-sa...



