It takes a while to get to the point, but this is what the page is about:
As we explained in a recent review paper, researchers have repeatedly found evidence that Autistic individuals are, on average, more consistent, less biased, and more rational than non-autistic individuals in a variety of contexts.
Specifically, many Autistic people seem to be less susceptible to cognitive biases, and therefore better able to make judgments and reach decisions in a more traditionally ‘rational’ manner.
Interesting if true; it could indicate that at least mild Autism is a beneficial adaptation. Though those biases probably came about for good reasons, it could be they've become obsolete and are no longer worth it.
I know what you mean by "mild autism" but an article [1] that was recently discussed here [2] explains that "mild" vs. (say) "severe" does not quite capture the nuance of the condition. Just pointing it out here because I found it interesting.
yeah, autism is very much an "umbrella term". we define it by the effects, not by any identifiable causes. there are still broad "less impacted" and "more impacted" cases, arguably a spectrum, but it's very very much not linear.
cystic fibrosis used to be under the (all-encompassing) umbrella "chronic fatigue" because, well, they were chronically fatigued. when its niche finally gained enough data to escape the umbrella, diagnosis and treatment greatly accelerated. of course, you'd expect that once a cause is identified, but umbrellas tend to contain many totally unrelated sub-causes with wildly different subtleties that just happen to fit a vague description that matches others.
Well, if it leaves one unable to function effectively in society, or seriously hampered in that, it does make sense to differentiate from a less strong case.
If a person can't, e.g., talk to someone to buy food, and the huge majority of people can, then they have a problem. That expectation isn't severe: it might not be perfectly accommodating, but it's reasonable. And of course society does try to help in many ways (counseling, people being understanding, parents, school experts, medicine, etc.)
Alright, well I can't agree to that part. It all depends on your goals, and some people are less able to accomplish their own goals than others due to those traits from the spectrum. There's no need to conceive of utility to others in order to speak of grades of functionality.
Sure, but the point is that the thing that is ‘severe’ isn’t ‘autism’. An autistic person can be severely disabled, but the disability is itself a trait that only some autistic people have, so it doesn’t make sense to call the disability ‘autism’.
That doesn't seem like a useful place to draw the distinction. If someone has more autistic traits, they can be considered to be more autistic but not necessarily more disabled.
Well, you can have a benign tumor or one that is killing you; by that theory, we should say the one that is killing you shouldn't really be called a tumor.
I think you are showing bias; there is nothing in that comment to suggest its point is calling autism a disease. I see it as juxtaposing the same condition with different outcomes.
The commenter uses cancer as an analog for autism. That implies disease. It’s not a neutral comparison, and it’s not my bias that introduced the concept of disease.
If you think it’s possible to make the same point without reference to a disease, then by all means do so.
The difference, of course, between our two analogies is that a tumor, like autism, is something that has been observed by people and not constructed by them. We do not know much about why some tumors are benign and some are not, and we do not know much about why some people with autism are severely disabled and some are not. But we do know why some scooters have electric motors and some do not.
In fact, it may be that at some point in the future the autism spectrum will be broken up and identified as several different disabilities: let's say Preboscot's behavioral pattern for those with a range of light autistic behaviors, and Ternobyni's syndrome for those with what we would describe as heavy autism today. In that imagined future, these different disabilities would have no actual connection to each other but would just manifest in some similar symptoms, the same way that diamonds and clear quartz might have some similarities in appearance.
But until that imagined future comes to pass, we live in a present where the extremes of the autism spectrum are still defined as autism.
I hope that my explanation is acceptable to you. And if you feel a need to morally elevate yourself over others via the sport of internet commenting, I hope you pick another target: the night has just started where I am, and my severely disabled autistic child sometimes only lets me have a few hours of sleep as it is. I would at least like to spend the time before he wakes up and wants to jump about relaxing, instead of in meaningless argumentation.
I remember when that article was trending on HN. I wonder how well supported by data that article is, because if it is scientific, it would be a wonderful quick and easy reply to people using autism and the spectrum for various arguments.
As somebody who sits on the autistic scale, I personally have a deep-rooted ethic of fairness, even if it means I yield an advantage. I'm just as uncomfortable being unfairly treated as I am being over-fairly treated, and to empathize I just look at the other's position as if it were mine and ask how I would feel about that; at least, that is how I empathize.
I also tend towards common-sense bluntness over diplomatic word-dancing more often than not; Greta Thunberg sets some good examples of that.
One final aspect: my thinking is still as wide in scope as it was in childhood, and with that I will happily ask the awkward question, and equally see things from a perspective others tend to overlook.
One time during an exit interview I pointed out (some concrete feedback they could take action on, aaah, how naive of me) how bad the DevOps team had it (80+ hour weeks, constant weekend work, all hours on call, etc) as one of my reasons for leaving and the CEO could NOT understand why I would care about this at all. His response was about how our team (BI) had it so good, which we did, so why would that matter!
He literally could not understand that I had empathy for another team and it affected my perception of the company.
Perhaps this sounds a bit weird, but: isn't empathizing with another team normal? I would not want to be in a company that was OK with some team continuously getting the short end of the stick... It says a lot about people, and, well, I would (and do) feel bad. The response of the CEO of course might be one of avoidance or narcissism... (Of course, nobody can psychoanalyze people from a distance.)
Oh totally, but this is where it sunk in that I didn’t judge things relatively like others. I have a pretty rigid moral scale and I have a hard minimum.
My coworkers were sympathetic or empathetic, but would just shrug and agree saying there were “culture problems”.
I don't know if you meant this to be an example of "autistic" perception, but fairness is deeply embedded in a lot of animals, including humans. It sounds more like the CEO was sociopathic. Also, of course it makes sense to care for another team for a multitude of reasons, starting from group harmony and ending with self-care, because you never know when you or your team will be at the short end of the stick.
No, not really. It was a game company, and the justification he used was that it was SOOO much better than the studios he'd been at before.
The only “autistic” part would be the degree I cared compared to people around me. This was where it sunk in that my moral scale wasn’t relative like most people’s.
Fairness is deeply rooted in humans in general. Behavioral studies show that even kids as young as 7 will anonymously distribute candy in a more egalitarian way, even when they have the option to take it all for themselves. (https://www.zora.uzh.ch/id/eprint/3833/3/2008_Fehr_NatureV.p...)
I have what you would deem mild Autism, and it CAN be a beneficial adaptation, but it is highly context-dependent. I think it's only in the last couple of decades that it's become true, specifically in information work, knowledge work, or science.
Think of the cartoons where the "nerds" are trying to fight or play sports by calculating optimal trajectories, etc. The additional rationality slows down reaching conclusions significantly, and I'd argue in most cases the added accuracy is of marginal value.
Basically, I believe a lot of those biases are shortcuts that give a good-enough answer in significantly less time, e.g. a few iterations of Newton's method instead of computing the exact answer.
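As a hypothetical illustration of that shortcut idea (my own sketch, not from the comment above): a handful of Newton steps get "close enough" to a square root without ever computing it exactly, trading a little accuracy for a lot of speed.

```python
def approx_sqrt(x, iterations=4):
    """Approximate sqrt(x) for x > 0 with a few Newton (Heron) steps:
    a cheap 'good enough' shortcut instead of an exact answer."""
    guess = x / 2.0 if x > 1 else x  # crude starting point
    for _ in range(iterations):
        # One Newton step on f(g) = g*g - x:  g <- (g + x/g) / 2
        guess = 0.5 * (guess + x / guess)
    return guess

print(approx_sqrt(2.0))  # already within ~1e-6 of the true value
```

Four iterations are usually more than enough for everyday precision; the point is that stopping early is a deliberate bias toward "fast and approximately right."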
Society is getting so complicated that spending the time to find the actual right answer is supplanting various "going with your gut" heuristics, since it's about 50-50 whether the simple answer or the counterintuitive answer is right. The whole "blink" thing of just knowing the right answer is deeply busted.
We still value people who strongly and quickly assert what they think is the correct answer, though, and view that as a sign of intelligence (and to be fair it is, but it's more about the intelligence of knowing how to convince than the intelligence of knowing what is correct).
Suppose you're a judge in a contest. The contest has rules. If you apply the rules the same to all the contestants, that might be considered a disadvantage when you have the opportunity to apply the rules more favorably to your friends. Whereas the other guy who interprets the rules to favor his friends creates the expectation that the friends will return the favor someday.
But the advantage isn't always an advantage. If the other participants view you as biased then they won't even show up or pay entry fees anymore. Then there is no more contest and your friends lose even the possibility of winning.
It's kind of like asking if there's a disadvantage in not being a sociopath. Turns out, maybe not.
Yeah, I suspect the fact that Autism reduces a person's ability to relate emotionally and socially to themselves and others allows them to dedicate more brain power to thinking rationally. In a mild enough case, with a supportive tribe, they could be a useful advisor. No autistic members = tribe has trouble making good collective decisions. Too many autistic members = tribe can't collaborate. That's just my armchair psychologist theory though.
That's a bad stereotype. It's far more common for autistic people to have the ability to relate emotionally to others cranked up to overwhelming levels (hyper-empathy), which makes them avoid situations that require relating to others because of the severe stress those situations usually cause them - leading other people to perceive them as unempathetic.
I almost wonder if empathy is the right word then, maybe this would best be described as something like "susceptibility to emotional contagion." Empathy as commonly defined includes an impulse to comfort and stand by, as opposed to avoidance.
Empathy has a performative component to neurotypicals. If you are not acting as though you are empathizing, you have "no empathy" and are thus a strange form of psychopath. Actual psychopaths can pass the performative part of neurotypical empathy with flying colors because they are excellent maskers and mirrorers -- that is, of course, when they can be bothered to try at all.
Neurotypical psychology is deep, complex, and fascinating. They devote significant brainpower to constantly evaluating and testing other people's behavior against a constantly evolving set of rules in order to ascertain whether they are a member of the neurotypical's tribe or ingroup. The rules have to change and evolve because ingroup members will be able to predict how they will change, and so catch any outgroupers who have heretofore successfully infiltrated the ingroup. It's like you have a monster CPU with a lot of cores, and then devote half (or more!) of those cores to the world's most elaborate DRM scheme. We benefit because much of that CPU power in us is freed to do other exciting things, like programming or particle physics; but we also suffer because most of the people around us cannot attest that we are legitimate humans running a legitimate copy of the human OS.
Relatedly, I love Japan and I love the Japanese people but... Japanese society has one of the most elaborate, impenetrable set of social rules in the world. If you want to know why hikikomori are such a thing there, it's simple, really: so many more people are frustrated with their failure to conform to the elaborate ruleset it takes to simply be Japanese and tired of being flagged as impostors in that game of Among Us that they simply give up and withdraw into whatever brings them comfort.
I would have thought the main performance bottleneck to social calibration would be the unspoken mind-reading requirement that seems to be prevalent in American society.
Perhaps I'm wrong, but from what I know of Japanese society, it's pretty blunt in its expectations. So there isn't this "mental searching tax" to make the "right" social choice that seems to be de rigueur in the States. Everyday behavior in Japanese society has explicit procedures that don't change all that much.
As I understand it, social failure in Japan is the result of one of two things:
the inability or lack of interest to follow these procedures (e.g. hikikomori, non-conformists, etc.), or following these procedures but with mistaken assumptions about how the consequences would turn out (e.g. "herbivore" salarymen who have done everything right but are unable to find wives like their fathers could).
That's interesting. A German friend of mine who's on the spectrum and had established a second life in Japan told me that Japan's explicitly defined social mores, and its slightly more chilly and formal relations between people, suit him much better than the Western default. His German buddies who also have one foot in Japan all seem like they're on the spectrum too. Because of this, I've come to think of Japanese culture as a safe harbor for autistic folks.
I've commonly said that the Japanese are Germans to strangers and Italians to family and close friends. Sometimes as a quip I add that they're Irish in the bar, lol.
Japanese society is not really a safe harbor for Japanese people on the spectrum. It is, however, quite gracious to foreigners. As a foreigner, no one will say anything to you, for instance, if you use the wrong honorific or something; most will be impressed that you can speak the language at all.
Once you've been living and working in Japan for some time and have started to assimilate, though, you are on and you've got to perform the appropriate rituals or people will start to think you're being aggressively rude.
I've generally had the impression that German society is itself quite procedural and legalistic. So much so, that I'm wondering how much of a difference between the two societies your friend and his friends perceive. Was that his only reason to go to Japan or were there others?
Well, for one thing, we Americans claim to value honest communication to each other in our personal relationships. Whether we actually live up to those values is a different matter, but my point here is the Japanese do NOT. Honne/tatemae is pretty ingrained into Japanese society, and you must avoid embarrassing yourself and, more importantly, your ingroup (family, company, club, etc.) by being too honest around outgroupers. The Japanese are so pressured to not lose face that they are actively encouraged to hide their feelings and intentions, even when showing them would be mutually beneficial. You see it in business -- the old saw about circumlocutions like "We will give your proposal the consideration it deserves" meaning "no freaking way"[0] -- but you also see it in modern Japanese drama. Taro loves Hanako and Hanako loves Taro, but they are from different social strata and their parents would be shocked to find out they're in love, so neither of them says anything and neither of them knows the feelings of the other. Plus Taro is going to America to play baseball and Hanako is going to medical school. Will one of them work up the courage to go against the social grain and the wishes of their family, and confess their feelings before it's too late? Or will they just say shouganai and go about their lives without ever knowing what could have been? That sort of thing.
So as a Japanese person you are tasked with not only following the rituals, but also sussing out from the vaguest of cues what your friends, family, potential mate, etc. are thinking because they're following the rituals too instead of engaging in explicit communication.
Regrettably, I had to learn a lot of this by reading; I don't have a lot of personal experience with this because I'm a Westerner. The Japanese are generally more willing to be open with foreigners because of the relative lack of social repercussions for honesty with foreigners than with Japanese. They don't have to be "on", they don't have to actively be Japanese in front of us and that makes for some interesting and refreshing barside conversation, lol.
[0] Earlier negative stereotypes of Japanese as being "sneaky" and untrustworthy are partially rooted in this sort of thing. They mask their true intentions to avoid embarrassment, but to Americans it looks like they're trying to trick or defraud us. And they see us as loud, pushy bulls in china shops who are unable to handle delicate affairs with any nuance, even if we're well-meaning.
> Empathy as commonly defined includes an impulse to comfort and stand by, as opposed to avoidance.
Right, and thinking something is delicious includes an impulse to eat rather than avoid, but lots of people still avoid food they think is delicious in order to diet. When empathy becomes too strong then it starts hurting you as a person a lot every time you see someone who has problems, so you learn to predict and avoid those situations, or you might even learn to fear them since the empathy creates too much agony in you. Empathy is just a feeling, your rational part can still work around it.
Empathy is seeing the world from someone else's perspective. While it's common to comfort and stand by someone you empathize with, particularly if they are going through tough times as you presumably recognize that they want to be comforted and stood by, that impulse isn't itself empathy.
Exactly - the overwhelming nature of these signals (e.g. the searing brightness of eye contact) pushes people away from them. This then produces difficulties as a result of not seeing these signals (and not learning about them). The result is a lack of Cognitive Empathy - inability to read signals - which is often confused with a lack of Affective Empathy as seen in sociopaths.
Or maybe it was a fact, but the definition of autism/Asperger's has changed, so it is no longer true under the new definition?
There seems to be a great bundling going on, where people with a wide range of various problems get bundled under large umbrella terms like "ADHD" or "Autism". Autism and Asperger's used to be distinct diagnoses; now they are the same, and some even argue that ADHD and Autism are the same thing.
Edit: Btw, there was one stereotype that was never true: that Autists didn't have empathy. Autists always felt empathy. What they were said to lack was the ability to read people, not the ability to feel empathy for people. Not being able to read people can be said to reduce "a person's ability to relate emotionally and socially to themselves and others", so that statement wasn't wrong under the old definition/understanding.
Autistic people can have trouble accessing emotions, but a lot of the reason for that stereotype is just that they communicate their emotions and emotional reactions differently and/or that their emotional reactions to certain situations are different to those of neurotypical people: not that they're not actually feeling emotions at all.
Later in life I considered myself to be on the autistic spectrum as it explained many of my quirks. Growing up, I thought everyone was normal like me but less rational. Very recently I found out about Alexithymia, the inability to identify, describe or express one's own emotions, which actually nails it. Many of the outward interactions are similar but the internal experience is different. The best way I can describe my experiences is that the 'feeling' or even awareness of existence of an emotion lags, sometimes by hours, until it becomes sorted-out and conscious. I also don't much have emotional overloads other than being drained by certain forms of interactions that I'd chalked up to being introverted, which was odd because I'm very extroverted at times. I don't know if it's a good or bad thing that I don't put effort into interacting emotionally outside my closest circles.
Maybe it's like smell, where there are many bad scents and even if it didn't impact my survival I wouldn't want to live without the sense, and I should indulge more. The only concrete thing I've learned is that significantly reducing my caffeine intake helps but also distracts.
It’s less about being unable to access our emotions than our emotions being frequently influenced by sensory (over)stimulation — we learn to ignore them because they often tell us things that aren’t useful.
My main trigger is light — too bright, bad color, too much flicker, all of which can cause me to get “irrationally” angry in a conversation about any mundane topic. I’m not really that passionate about most of the things I get overwhelmed by, so my heightened emotional state because of some sensory stimulus is not useful.
All my anger/sadness tells me is that it’s bright and I need to either put on some sunglasses or turn off the lights. I have learned over time not to blow up at other people about it because it’s not their fault, and they’ll think that I think it is if I have a meltdown in front of them.
I doubt autism correlates that strongly with resilience and grit. But I have long thought that groups benefit from having a portion of the population having autism. Variance in thinking means more potential strategies for success. Too much variance probably hurts group cohesion.
IMHO the most effective adaptation for "solo or small tribe survival" that we and other primates have is all the factors that decrease one's chances of being put in the very disadvantageous solo or tiny tribe situation. (For example, various submissive behaviors and the quite interesting concept of crying seem to be adaptations towards that - continuing to live in a larger tribe instead of leaving) It's simpler and more effective to try and avoid or fix that problem in the first place, instead of trying to optimize for tolerating the problem.
This seems rather obvious to me, but glad it's being directly said (I have made a habit of studying biases for the last couple of years).
Emotion is a known irrational influence on decision making, and autism is often associated with a lack of emotion in certain contexts (or an inability to understand the emotion). Having less of the thing that makes you irrational would make you more rational by default.
It's similar to how Charlie Munger's devotion in life isn't to being smart; it's to figuring out how not to be dumb. How can you not be irrational? Don't let emotion impact your decision making.
* To be clear, I'm not saying those on the spectrum don't have emotions (they do), though in my experience it comes across quite differently. It feels more like "another factor to be analyzed", which can easily be disregarded in some contexts, than an "invisible hand" behind the scenes influencing decisions.
> Though those biases probably came about for good reasons, it could be they've become obsolete and are no longer worth it.
I suspect (but don’t know how to test the hypothesis) that cognitive biases are why human learning can produce good results with dramatically less data than machine learning. More rational, yes, when you get there; but harder to learn at all.
No disagreement there! But, at the risk of cargo-culting, it might be interesting to see the effect of deliberately trying to reproduce such biases in an A.I.
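As a toy sketch of that idea (entirely my own illustration, with made-up numbers): a learner with a built-in "bias" toward a prior answer can beat an unbiased one when data is scarce, as long as the bias roughly matches reality.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: only two noisy samples of the true relationship y = 1.0 * x.
x = rng.uniform(-1.0, 1.0, size=2)
y = x + rng.normal(0.0, 0.3, size=2)

# "Unbiased" learner: the least-squares slope estimated from the data alone.
slope_ols = (x @ y) / (x @ x)

# "Biased" learner: the same estimate shrunk toward a built-in prior belief
# that the slope is near 1 (a crude stand-in for an innate cognitive bias).
prior, weight = 1.0, 5.0
slope_biased = (x @ y + weight * prior) / (x @ x + weight)

print(slope_ols, slope_biased)
```

When the prior matches the true slope, the biased estimate's deviation from the truth is the unbiased one's deviation scaled down by x·x / (x·x + weight), so with tiny samples the "biased" learner is reliably closer; the flip side, of course, is that a wrong prior would drag it away from the truth.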
They probably came about for good reasons; and the current world is doing a poor job at utilizing the full ability of its people, of many different types.
Oh, it's obviously true. I have a lot of experience with "high-functioning" autistic people, and the rationality is like the most obvious thing to me. Or maybe #2 after sensory overload.
On the flip side, many autistic people have trouble understanding neurotypical people because they miss nuance in their communication that other neurotypical adults would find to be obvious.
Autism isn't some reasoning superpower, it's just a difference in processing stimuli.
This is also true the other way around, however: neurotypical people miss nuance in autistic people's communication that other autistic adults would find to be obvious.
I would agree with your characterisation as a difference in processing stimuli.
> it could indicate that at least mild Autism is a beneficial adaptation
Not picking on you or your post, but it's interesting that we still consider this an adaptation. What if this is humanity's natural state and allistics are the adaptation?
I think you meant this tongue in cheek, but this seems highly unlikely.
The neocortex handles rational thinking and reasoning, so an increased reliance on it would put autism further from evolutionary predecessors. Also, you would expect the ratios of allistics:autistic to be reversed as well.
Also, as a mildly autistic person, I don't believe autism would be a beneficial trait in the wild. I would probably be fine, but some of my tendencies would lessen my likelihood of survival.
If that were the case, then it would also manifest in the behavioral traits of non-human primates, since they would not yet have this relatively recent adaptation.
I'll bet it does. We don't have a generic description of what autism means yet. I'm sure what we describe as autism is a cluster of differences.
So it's hard to imagine what it would look like in other primates.
Like these[1] loners in slime mold colonies. We don't even know if these kinds of variation are common or not. Autism-like variations might be like this or it could be human specific.
Actually this seems unlikely given the sequence of evolution. But...
Here is a spoof of Allistic Spectrum Disorder imagined as if it affected a small minority of people (trigger warning for those obsessed with status).
From [nonexistent] DSM-VI: Hyper-Social (Allistic) Spectrum Disorder
HSSD is a syndrome in which there is an over-focus on social phenomena at the expense of other aspects of the world. Contrast with Autistic Spectrum Disorder, which is in many ways the opposite.
Diagnosis: Any 5 of the following are present:
Inability to express self clearly; use of ambiguous and vague language; discomfort with clear language
Obsessive interest in knowing personal details of acquaintances or strangers e.g. celebrities, or even fictional characters
Unfounded belief in being able to read other people's minds, in particular to know if someone is lying or not.
Difficulty in thinking in a systematic logical way, e.g. to do math or program computers
Tendency to try to bend and stretch rules for no obvious reason. Discomfort with accurately following instructions and processes.
Forms beliefs based on the opinions of others rather than on facts and evidence
Tendency to affiliate with groups and to align all opinions to the group
Frequently lies, mostly for social convenience (studies suggest 3-5 times a day)
Preoccupied with social status and “looking the part”
Focus on status symbols, and symbols of virtue and group affiliation
Focus on appearances more than underlying reality
Intolerance of diversity of opinion
Intolerance towards people who do not have HSSD
Spends large amounts of time on shallow “social” activities with little actual content. May lead to destructive activities such as substance abuse e.g. alcohol, and over-eating.
Lack of interest in mastering difficult, especially technical, subjects in depth
Tendency to stare into people's eyes, and to believe that this gives great insight into the other person's mind. Usually unaware that this can create discomfort in the other person.
Tendency to think that staring into people's eyes demonstrates trustworthiness.
It doesn't mean that at all; it's just that the real world is not made for rationality, and even though we have created special contexts where it is, that doesn't mean it can be generalized outside of them.
I would argue exactly the contrary: the real world, the seasons and stars and seeds, is pitilessly rational. It cannot be tricked, pleaded with, or emotionally manipulated. It is harsh, but equally so to everyone, and according to an inexorable logic that cannot be altered but can be exploited. It is the special contexts the humans have created, like churches, courts, and tribes, where the laws of rationality can be imperfectly and temporarily suspended, replaced by a "virtual reality" that is merely a social consensus.
Humans and other primates are social animals, and pretty much all aspects of personal success and ability to influence the external world - survival and safety, access to nutrition, mates, and other resources, and general power - are mostly determined by social factors. So "winning" at the social factors has been more important than what a single individual can achieve by exploiting the "real world" since before homo sapiens existed. "Individual fitness" at the expense of social fitness is maladaptive in the environment where humans live and lived; Starving or not starving depends on social factors more than on individual hunting prowess, and the same goes for procreation and for changing the world in various ways, most of which depend on how many other people you can motivate to go along with your plans. These "special contexts the humans have created" have dominated human life as long as humans have existed, and before that too, as we can see in non-human primate communities, where living or dying in a power struggle or inter-tribal war is largely a matter of social standing and not the strength of some individual ape.
There's no "merely" social consensus, quite on the contrary, the social consensus has always dominated all the things that matter; being exiled from the tribe was effectively a death sentence even if the tribe did not directly kill you, and a dominant position in the tribe gains larger benefits than dominating against the real world, both in a hunter-gatherer environment and in modern society.
Almost everything in your first paragraph is correct (though you accidentally capitalized "starving".)
But everything in your second paragraph is incorrect. Even before the industrial revolution, it was commonplace for banished people to find a new place to live, either as hermits or as part of a new tribe; the outlawing and persecution of individual refugees and "stateless persons" is a Late Modern aberration. Since the advent of the industrial revolution, a subordinate position in a tribe like Japan that is very "dominant against the real world" gains larger benefits than a dominant position in a tribe like the Wola that is much less "dominant against the real world". For example, as a Japanese person, you live twice as long, you probably won't get raped, you are at no risk of being executed for witchcraft if you fall from favor, and if at some point the two tribes come into armed conflict, the Wola will be entirely at the mercy of the Japanese.
Even in the first paragraph, though, there is a significant error. You say, "Starving or not starving... procreation... [and] changing the world in various ways [mostly] depend on how many other people you can motivate to go along with your plans." But in fact they do not. These things depend jointly on whether you get teamwork on plans, a social question, and on whether the plans are any good in the first place, a rational question. This is what sunk the Great Leap Forward: Mao was suffering from the delusion that you so clearly expressed here. He evidently motivated people to go along with his plans to an almost unprecedented degree, but many objective, non-social aspects of the plans (notably backyard smelting, the Four Pests campaign, deep plowing, and close planting) were destined to produce catastrophe, especially if they were executed thoroughly. The greatest famine in human history was the predictable consequence, killing some 40 million people.
The industrial revolution was a consequence of Galileo's rebellion against this subjectivist view: he dared to look through his telescope at the real world and believe what he saw, despite its incompatibility with the socially constructed virtual reality of his time. It took some time, but Italy paid for its rejection of Galileo with centuries of penury and destitution. Ultimately Galileo influenced the external world, as you say, far more than the crabbed Inquisitors who persecuted him.
I stand with Galileo and against Mao. Will you join me?
I'd argue that even in the horrific example of the Great Leap Forward, Mao and those who went along with him mostly succeeded in their personal goals and secured all kinds of long-term benefits for themselves, granted by a higher social status in the party, while those who went against him and had better plans failed in all their goals, often starting with the primary goal of immediate survival. In this scenario, having the better plan was not useful, and trying to execute it was not rational, as it only hurt your interests.
Using your example of Galileo: his effectiveness in propagating his science was severely limited by a scientifically irrelevant feud with church officials. Had he been more politically savvy, he would have avoided tying the scientific issues to the personal conflict, and would not have provoked the church into it. IMHO the historical evidence indicates that it was perfectly plausible for him to get the church to support his position, which would have served both his personal interests and the general progress of science, but he failed at that due to his personal qualities with respect to the social aspects.
Often that does happen in the short term, although in this particular case, it led to Mao losing control of the Party for six years and arguably delayed Mainland China's economic boom by 20 years. Certainly many of the people who tried to resist the Great Leap Forward died as a result, but so did many of the people who most enthusiastically practiced it.
I don't think a Galileo who spent much of his time acquiring political savvy and cultivating allies would have been able to make the progress he did make. Such a Galileo might have simply decided not to believe what he saw through the telescope, or to keep quiet about it. The Church had already burned Giordano Bruno at the stake for heresy, and there are many other such stories: Bach was imprisoned for too stubbornly demanding release from his post at Weimar; Swartz committed suicide while facing imprisonment for downloading too many academic papers; Turing committed suicide to escape persecution for being openly gay; Newton lived to a ripe old age but certainly had a life full of interpersonal conflict; Champollion deciphered the hieroglyphs while awaiting trial for treason.
Fundamentally, rationality is insubordinate, and social graces frequently demand dishonesty, so that those who most love the truth are never those who get along best with others.
And those are my heroes, not Donald Trump or Mao Zedong.
I really find it entertaining that all naive science supporters believe this myth about Galileo. The real story is very different: he wasn't persecuted, he was put under house arrest, not for daring to science, but because he publicly mocked his friend the Pope. Anyway, it's a really interesting period to take a deep dive into.
As usual, the people posting smug dismissals to HN claiming to find it entertaining that someone might disagree with them, and to themselves know "the real story", are not well versed in the subject. While of course in some sense the real reason for any interpersonal conflict can never be disagreement over a question of facts, Galileo was in fact prosecuted, and the overt justification for his prosecution was, as my unfortunate interlocutor puts it, "daring to science." Quoting the introduction to https://en.wikipedia.org/wiki/Galileo_affair:
> The Galileo affair (Italian: il processo a Galileo Galilei) began around 1610 and culminated with the trial and condemnation of Galileo Galilei by the Roman Catholic Inquisition in 1633. Galileo was prosecuted for his support of heliocentrism, the astronomical model in which the Earth and planets revolve around the Sun at the centre of the universe. ...
> Galileo's discoveries were met with opposition within the Catholic Church, and in 1616 the Inquisition declared heliocentrism to be "formally heretical." Galileo went on to propose a theory of tides in 1616, and of comets in 1619; he argued that the tides were evidence for the motion of the Earth.
> In 1632 Galileo published his Dialogue Concerning the Two Chief World Systems, which defended heliocentrism and was immensely popular. Responding to mounting controversy over theology, astronomy and philosophy, the Roman Inquisition tried Galileo in 1633, found him "vehemently suspect of heresy", and sentenced him to house arrest, where he remained until his death in 1642. At that point, heliocentric books were banned and Galileo was ordered to abstain from holding, teaching or defending heliocentric ideas after the trial.
The rest of the article provides an even more thoroughgoing rejection of the confused ideas in the comment to which I am regrettably replying; the atom of truth in it is that, 16 years after first being prosecuted, he included the new Pope's own counterarguments in his book along with a rebuttal, which displeased the Pope, who had previously favored Galileo.
Ah yes, in the long run it's all rational, but in the long run we're also all dead. Even if that were the case, you are not a star, you are not near equilibrium, you are alive and don't have time to play long term rational games.
If you try to walk across the desert without drinking water you will be dead in two days. That's not "the long run."
If you carry water and salt with you, you can make it a week or more, but not if you strategize poorly: walking during the day instead of at night will deplete your water much more rapidly, and if you treat your canteen carelessly you will lose the water. If you have the knowledge to navigate to places with drinkable water along the way, or the knowledge and materials to distill water from crushed plants, you can make it for months, longer if you brought food or can find it. (Me, I caught and ate raw grasshoppers.) You cannot emotionally manipulate the desert; you cannot trick it; it will not treat you more gently because you beg it for mercy. Rationality (knowledge, skill, heedfulness, and above all epistemic humility) is your only hope. It's no guarantee, because a rattlesnake or a hailstone may strike you at random, but it's your only hope.
It's not just the desert. The same is true of the ocean, of mushroom hunting, of wasp's nests, and of the frozen North with its alpine sweetvetch. Nature's ways are subtle and merciless, but they are amenable to understanding, and rationality permits you to order your life in harmony with them and thus survive and prosper a little while; though not, as you say, in the long run.
The whole world is like this, all except for tiny special contexts humans have created where the ruthless laws of Nature are suspended a little bit, where mercy and humanity and fellow-feeling hold sway.
That's not really the concept of rationality I or the article is talking about (consistency, non-bias, etc.). Knowledge, or its use, isn't the same thing; it's more like a priori knowledge, where the concept is applied immediately by universal rules. This specific kind of rationality is good in formal games, where the rules are universal and the concept can be applied immediately, but it doesn't work for empirical contexts (life, science, engineering, etc.).
They aren't really different concepts of rationality; consistency and non-bias are about not fooling yourself, so that you can come to the conclusions that the available evidence would justify. That's how people as a group can empirically acquire knowledge about the world. Of course, for individual people, social aspects are often even more important, since learning from someone else's experience can be much cheaper than learning from your own—as in the case of alpine sweetvetch; but even resisting deception and knowing whose opinion to listen to benefit from consistency and non-bias. Indeed, perhaps even more so, since the alpine sweetvetch isn't trying to emotionally manipulate you into believing it.
You use bias as a dirty word, but it's really just the weighting of an opinion; there is no knowledge without bias. As for consistency, it comes secondary to categorization, and it's easily abused for 'foolish' consistencies that can also be manufactured with framing effects.
Sorry, I'm using "bias", "consistency", and "rationality" in the statistical, logical, and philosophical senses, respectively. So, I think, is the article. Your use of different definitions for those words probably explains why you reached conclusions that read as obvious nonsense to me. You might think about rereading the article with those definitions in mind.
I can also recommend reading about alpine sweetvetch.
I've mentioned this before, but this is why I believe it's "easy" for RMS to be so single-mindedly incorruptible, and why he can be trusted to never waver.
I am curious if this lack of bias stems from the imposed way of life or some other effect and not the mild autism itself.
E.g. certain professions instill biases in you, or force you to pick them up. Examples: police officers, medical staff, politicians, social workers. And I bet those are professions that people with degrees of autism avoid.
I am curious whether the lack of bias exists in other conditions that end up presenting in a sort of unempathetic way (for different reasons, as noted).
Personally, I think both are needed in the gene pool, just as handicaps during an economic boom in peacetime teach the tribe compassion, which has a group benefit for the offspring. Nature is a higher-order system; we are not the sums of our parts. Most discoveries are mistakes or roadblocks on the path to another objective. Like sperm, the best approach is the shotgun approach. In space, legless people use less space, oxygen and food.
Isn’t this why there’s a correlation between Asperger’s and engineers?
The same kind of logical, exacting thinking necessary for mastery of physical systems is in tension with the kinds of thinking used in social games. Some brains are better at one than the other — and we have disorders at both extremes.
I’ve always wondered if autism and dyscalculia are something of “polar opposites”.
>I’ve always wondered if autism and dyscalculia are something of “polar opposites”.
I don't think they are. Plenty of autistic people are bad at maths (you just don't meet these people in engineering circles!), and plenty of "social butterflies" are good at it.
Probably also related to a lot of other factors, like problems with social interaction making people with Asperger's more likely to, for example, spend evenings nerding out in their own room.
Yes, hi: "Asperger's" is an unfortunate nomenclature and many autistic folks (myself included) strongly resent it. It was named after a Nazi doctor (Hans Asperger) and used to classify autistic folks into "useful" and "non-useful" people -- as Nazis and eugenicists are known to do. It would be appreciated if you could refer to folks on the spectrum as such, without referencing the outdated nomenclature (the DSM-5 replaced it for diagnostics; now everything falls under the Autism Spectrum, rather than viewing the "higher functioning" folks as having a distinct diagnosis).
I invite everyone here who has stood up against a murderous totalitarian dictatorship at the likely cost of their life to tell us how Asperger should have done better.
> now everything falls under the Autism Spectrum
This is only true in the US. And people who were previously diagnosed with Asperger's retain that diagnosis, even in the US.
I want to strongly second this. Asperger was a complicated person with a complicated story in a brutal context, but ultimately a sympathetic and insightful man. His story is told in "Neurotribes" which is a thorough history of autism, and highly recommended.
What information do you feel can be communicated and understood with that moniker that is not served by Autism Spectrum? And why do you feel those distinctions (if any) merit a wholly distinct diagnosis?
"Autism Spectrum" is a deliberately vague term that has been created and stretched to bring a variety of minor social and emotional functional differences under the general label of "autism". As far as I can tell, in the US the major purpose of this has been to divert special education funding from severely impaired children to less-impaired children from higher socioeconomic strata, and it has been very effective in doing so.
So to directly answer your questions, "Asperger's" (or whatever substitute term you find acceptable -- I'm perfectly fine with a substitute) is very useful to distinguish people with minor social and emotional functional differences -- those people who are, for example, able to hold down a tech job and post about autistic politics to Hacker News -- from highly impaired people such as my daughter who will never hold a job and whose verbal skills are at a three year old level.
These distinctions are vitally important to ensure that appropriate funding goes to these highly impaired children rather than being siphoned away to children of well-connected or politically savvy parents who are fully capable of succeeding in the mainstream educational system without aid.
> I'm perfectly fine with a substitute) is very useful to distinguish people with minor social and emotional functional differences -- those people who are, for example, able to hold down a tech job and post about autistic politics to Hacker News -- from highly impaired people such as my daughter who will never hold a job and whose verbal skills are at a three year old level.
There are plenty of people with the diagnosis of Asperger's who will never hold down a job. I'd hardly consider it "minor", even if it is relative to your daughter.
Will one of you please just get to the damn point, and explain to us garbage Nazi-lovers exactly what language you'd like us to use to distinguish different levels of impairment/functioning, given that we want to discuss different levels of impairment/functioning? Or do you just not want it discussed at all? The lot of you successfully derailed this sub-thread and prevented that discussion from happening. But thank god you set us all straight on problematic etymology! Close one!
I am autistic, btw. An Autist. An Aspie. High-functioning. So's my brother. So's my father. None of us give a toss about these terms, but the subject of our traits and our getting on in society remains of interest.
I'm with you on that. That person above complaining about "Asperger's" doesn't speak for me. In the end, it's always going to be up to the individual and any sort of generalizing is going to fail unless you go about bullying people into it.
Sorry. I couldn’t decide which of these comments to respond to and did not mean to single you out or attack anybody. Obviously I find this frustrating and I feel slightly attacked myself.
I hear the phrase "high-functioning" more than "aspies". I think the distinction is useful in social contexts: just knowing Bob's son has autism is not enough info when writing party invitations or considering transferring Bob overseas.
Please ignore AussieWog93. georgestephanis is correct: autism politics are indeed messy but as an autistic person who in a different age would be classed as Asperger's, I detest the term for the same reason georgestephanis does.
A term that is only "perfectly acceptable" outside some circles is, by definition, not acceptable, at least not "perfectly".
And the (undisputed) fact that Asperger was quite the Nazi should, just by itself, disqualify the term. OPs comment linking the dual terms to the similar binary classification into useful/useless human beings goes even further by showing that usage of the term doesn’t just glorify someone who doesn’t deserve it, but shows how that practice derives from and continues the namesake’s hateful ideology.
Sorry... are you arguing for a term that separates autistic people into "productive" and "non productive" that was created by a literal Nazi?
I am autistic, pretty much all of the people I know are autistic, and even most of the people I know through my workplace are autistic (it's explicitly a neurodiverse workplace), and I've pretty much never seen anyone need to use the term "aspergers" in general conversation. As in, when talking about symptoms, when talking about diagnosis, when talking about anything to do with it, people just talk about the thing, rather than branding it as "aspergers versus autistic". I'll go further and say that, not only is it not in general parlance, but also that if you used the term "aspergers" in or around these circles, you would be lightly corrected, looked on disfavourably, or given a side-eye, at the least.
>I am autistic, pretty much all of the people I know are autistic
I'm not trying to tell you what words you should and shouldn't use; obviously in your circumstance it's a word more likely to cause a political schism and lead to misunderstandings.
Most people, though, live in a NT-dominated culture where the terms "Aspergers" and "Autism" both carry extremely different connotations.
One invokes images of an aloof professor who has misunderstandings but means well, the other invokes images of a child that screams and shits themselves.
In this instance, describing yourself as "Autistic" has real negative consequences that can be greatly ameliorated simply by making a slightly different language choice.
> invokes images of a child that screams and shits themselves
This is a generalized problem with the self-proclaimed neurodiversity/"autism rights" movement, though. They do very well at expressing the wishes of reasonably high-functioning folks with autistic traits, but don't seem to relate to the kids who can't speak intelligibly and spend their time banging their head against the wall any better than everyone else. Saying that "we shouldn't talk about low vs. high functioning autism, because it's more complex than that" feels like a cop out.
What is there to say about low vs. high functioning autism? People with low functioning autism can't be helped by social movements about recognizing human diversity, they can be helped by medical research and support for their parents. High functioning autism probably can't be helped by social movements either (what's the plan, to talk everybody out of using the subconscious screening system that makes them not like people with neurological disorders? can the discomfort that is felt when someone with MS is making jerky movements be reasoned out of people's guts?) but that's another question.
Maybe this debate is nothing more than a sink for the energies of people who honestly care but can't change anything, keeping them occupied until medical science sends the whole issue the way of dwarfism.
> they can be helped by medical research and support for their parents.
Many people in the "autism rights" movement oppose these things, often with strident rhetoric. They view any "medicalization" of the condition they happen to share with their lower-functioning fellows as inherently inhumane.
Pre-Godwinning the discussion doesn't raise discourse, it dumbs it down. Yes, that's the history of the term, but it was also the accepted term until its removal from the DSM in 2013, and not everyone who grew up with that term is as plugged in as you and has gotten the memo to move over to the new term yet. It's fine to be angry at neurotypical people who use it as a slur, but you won't win many converts attacking people who aren't, especially using an anecdote about how you don't use the word as supporting evidence. Btw, the plural of anecdote is "anecdata", which is not data.
This is a distinction without a difference. If someone holds Nazi beliefs and actively works with the Nazi party to achieve Nazi goals, most people would consider these people Nazis even if they didn't sign a bit of paper certifying that fact. Hans Asperger was loyal to the Nazi regime and took (horrible) actions to further the Nazi cause, and was rewarded by Nazi leadership for it.
Ok, so someone above already cited how "Aspergers" is a literal, direct reference to a Nazi. My burden of proof is covered, now you have the claim and must prove it. Go on :)
It's good to see appreciation of different neurotypes for their strengths.
Many people with ASD put a lot of time and effort into learning and altering their natural behavior in order to better understand and interact in a way that is perceived as normal by neurotypical people.
I'm hopeful the inverse will happen more over time as well, neurotypicals putting effort into learning and adjusting their own behavior to better interact with and understand autistic people.
Making it normal to include input from all neurotypes (as opposed to excluding) is a great step forwards.
One of my best friends is autistic. He's definitely a weirdo at times (I mean that in a positive way, for instance he likes old movies and watches them constantly, but when he says a movie is good he has never been wrong) but he's a good guy, would never betray anyone and is always social and fun to be around as long as it is inside of his comfort zone.
>but when he says a movie is good he has never been wrong
I have a crazy anecdote like that too. My severely autistic (barely verbal, had his own barely intelligible language) step brother would watch a grand total of two movies on repeat. So many times that for a decade afterward you could say a single line from either movie, and I could finish the script for you.
Those two movies? Who Framed Roger Rabbit and Willow.
After a couple of decades of a break from being forced to watch them I finally did, and they're exemplary.
And Willow is one of the best fantasy epics ever made, IMHO.
He was maybe ~6 when WFRR came out on VHS and could already spot a good movie. He's probably still mentally a poorly functioning 6 despite being 36 now, but there's a lot more to him than his obsession with Christmas. :)
I feel like in the years to come we're going to learn a lot from and benefit greatly from autistic folks. (I also might be one.)
The parenthetical is evaluated separately from the sentence. It would equally read this way:
He's definitely a weirdo at times but he's a good guy, (I mean that in a positive way, for instance he likes old movies and watches them constantly, but when he says a movie is good he has never been wrong)
as the way I originally wrote it:
He's definitely a weirdo at times, (I mean that in a positive way, for instance he likes old movies and watches them constantly, but when he says a movie is good he has never been wrong) but he's a good guy
The "but he's a good guy" never syntactically follows the information about the movies, it is always tied to my opinion that calling my friend a "movie fanatic" isn't a strong enough descriptor and he teeters on the verge of being weird about it.
Nah. The "but" differentiates a positive trait from a neutral trait, and the parenthetical phrase clarifies that the neutral trait isn't necessarily bad.
One thing worth noting about the "spectrum" in autistic spectrum disorder is that it does not mean what many people assume: it is not referring to a range of severity with mild on one end and severe on the other.
Rather, it's more a spectrum as in a spectrum of colors: there are a number of traits to autism, not all of which might be present in a person diagnosed with ASD, so single-criteria tests ("identify the emotions in these photographs", for example) don't really work well as diagnostic tools.
> Imagine you have bought two non-refundable tickets to different trips, one much more costly. You are then told that you must cancel one of them. In this case, many people will cancel the cheaper trip regardless of which one they would prefer to go on – and even though they will have spent the same amount of money either way.
I think a better example of sunk cost bias could be found than this one, as people usually pay more for the trips that they prefer more in the first place.
Yeah and not just that, they speak about weighing the available information but they don’t give any more information about the situation.
If I have paid for two trips and have to cancel one of them with no refunds, I assume that I really wanted to go on both.
So when I am choosing which one to cancel, I am also likely choosing that I will later repurchase the trip that I am cancelling now. So at that point I’d be looking at which of the two trips is cheaper to replace. And if I am not allowed by the rules of this thought experiment to do so, then I must assume that the more expensive one of those two will cost more to buy again later also.
Then also as you say, which one is more preferable in the first place and again, if I was willing to pay more for one of them in the first place then presumably that one.
Unless there was something special about the cheap one. For example, maybe it’s a trip somewhere that I cannot go in the future, only now. Or a trip with someone I want to go there with and they can only go at this time. But again, all of that kind of stuff is left unspecified in the question. So if they force us to make a choice on so little information, what are they expecting, and in what sense is the kind of question they are asking anything but a straw man kind of deal?
What even were the possible answers that respondents could give? If “I don’t know”, or “too little information to determine” are an option then I’d pick one of those, but if the only answer we can give is “cancel the cheap one”/“cancel the expensive one”, then I would say cancel the cheap one, but they can’t then just go and say “oh this is a fallacy and you fell for it”.
It also ignores that if you booked a trip to a place, you likely did so because you want to go to that place. Thus, if you were forced to cancel your trip due to a conflict, it is implicitly more likely that you would book a trip there again in the future - a rain check, essentially.
If the question was "Which of these trips would you like to pay for twice?", then it's immediately obvious that the cheaper trip should be cancelled.
Because in the world I live in, there's a strong correlation between a previous price and a future one. While it's not always the case and all kinds of factors apply, previous price is a quite relevant signal. In the absence of any additional specific information, if I know that ticket to city A was cheaper than a ticket to city B, then it's reasonable to assume that a future ticket to city A is a bit more likely to be cheaper than a future ticket to city B.
That doesn't answer my question at all - the statement was that it's "easier", not "cheaper" to go to that city at later time. Price is just one single factor that may influence how easy it will be later compared to now.
Maybe a significant chunk of your motivation to visit city A was to see it covered in snow and do some winter sports, so if you cancel it now you'll likely have to wait a whole year to do it again?
Or maybe you booked these trips to see some bands playing live? What if the band playing in the cheaper city is less likely to play again within your reachable area in the foreseeable future?
There's a huge amount of reasons why canceling the cheaper trip may not be the best option, and I believe noticing that is what the article was actually talking about.
All of your examples apply equally to both A and B (since we don't have any other information about the differences between A and B) and thus have no impact at all on the expectation about the difference between future cost or easiness or benefit/utility of the travel to A versus B, so these arguments can and should be ignored.
On the other hand, the previous price is a signal that does provide some information about the differences between travel to A and travel to B, and allows a better-than-chance decision, rather than treating both options as equal.
The example from the article explicitly mentions that there are some (unspecified) differences in motivation other than the price that are often overlooked by non-autistic people in such cases, so these arguments can not be ignored without misinterpreting the article's example.
The two options are the cheap and the expensive one. If we're making the determination of which one to drop based on price, and are given no information about the options besides price, presumably price is the determining factor. All else being equal, cheaper means easier.
> In this case, many people will cancel the cheaper trip regardless of which one they would prefer to go on
"All else" is explicitly not equal, there's a difference in preference for some reason other than price. This is not a puzzle with a correct answer, this is an example of specific behavior in specific cases observed in specific people who have all needed information available to them.
That's in reference to the desirability, not the availability. When I say "all else being equal" that is in reference to the future version of the same comparison. If one option is cheaper now, it will most likely be cheaper in the future too, and if it is cheaper then that means it is less demanding in terms of resources to acquire, and if it is less demanding in resources it is "easier". You might have a special case where the less expensive option will suddenly become much more difficult than the expensive option to acquire, but there is no reason to assume in any particular instance that's true.
For a more concrete example - if you have bought a $10 and a $100 ticket, buying a second $10 ticket is, all else being equal, preferable to buying a second $100 ticket, regardless of how desirable either option is.
> if you have bought a $10 and a $100 ticket, buying a second $10 ticket is, all else being equal, preferable to buying a second $100 ticket, regardless of how desirable either option is
Not if you won't desire it anymore (or will desire it less) at a later date. Or if you won't be able to take it at a later date at all (which influences desirability at present). Or when the price doesn't really matter because you'll be able to afford it anyway. Or in a multitude of other circumstances that may be relevant to the given example, which you may think of once you decide to stop missing the point of that example, since it's very far from implying that "all else is being equal" ;)
Again, when I say "all else being equal" that is the explicit assumption that the circumstances in the future are essentially the same (ie equal) as they are now. We are assuming all else is equal because we are given no reason to believe otherwise. If something were, for example, a once in a lifetime opportunity, we'd refer to it as the once in a lifetime option, not as the less expensive option. The fact that we are talking about this decision in terms of price implies these are normal, purchasable commodities whose relative prices are reasonably stable, and that monetary expense is a significant concern.
Of course there are situations where you can't or wouldn't want to buy the same thing at a later point in time, but in general you can. Starting with the assumption that you purchased both a $10 and a $100 ticket to two different events because you wish to experience both and further assuming that you can still experience both by purchasing a duplicate of one ticket which is available at the same price, as is generally the case, keeping the expensive ticket and buying a duplicate of the less expensive ticket is the rational choice. Without an explicit good reason to do otherwise, it is not an example of the sunk cost fallacy to reduce your future expenses by keeping the more expensive ticket.
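The arithmetic behind that choice can be made explicit. Here's a minimal sketch using the hypothetical $10/$100 figures from above, under the stated assumption that a duplicate of either ticket can later be rebought at its original price:

```python
# Hypothetical figures from the example above: a $10 and a $100
# non-refundable ticket. Assumption: a duplicate of either ticket
# can later be rebought at the same price.
cheap, expensive = 10, 100
sunk = cheap + expensive  # already spent either way; identical for both choices

# Cancel the cheap ticket now and rebuy it later:
total_if_rebuy_cheap = sunk + cheap

# Cancel the expensive ticket now and rebuy it later:
total_if_rebuy_expensive = sunk + expensive

print(total_if_rebuy_cheap)      # 120
print(total_if_rebuy_expensive)  # 210
```

Since the sunk portion is identical in both branches, only the replacement cost differs, which is why keeping the expensive ticket minimizes total outlay under these assumptions.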
> The fact that we are talking about this decision in terms of price implies these are normal, purchasable commodities whose relative prices are reasonably stable, and that monetary expense is a significant concern.
Are trip prices "reasonably stable"? Isn't there actually a huge seasonal variance, last-minute offers and so on?
> Of course there are situations where you can't or wouldn't want to buy the same thing at a later point in time, but in general you can.
The example is about trips, which - at least in my experience - are usually chosen based on a huge set of variable incentives to go at specific time to a specific place.
Anyway, the article gives no reason to assume that "the circumstances in the future are essentially the same (ie equal) as they are now". It actually gives a reason to assume that there are other incentives than the price, and doesn't mention whether they change in the future - so we can't assume that they won't. (And even if we could, it would still be irrelevant to the point the article is trying to convey - all it says is that autistic people are apparently more likely to take those other circumstances into account, which can lead to a different outcome.)
For the third time, we are told that there are two trips, one of which is "significantly more costly". It doesn't matter if the prices fluctuate, the fact that one is the "significantly more expensive" one, as opposed to any other description which does not utilize price, indicates that the relative difference in price is useful for distinguishing them, and indeed more useful than some alternative method of distinguishing them. You would not refer to something as the "significantly more expensive option" if you reasonably expect it to be cheaper than the "cheap option". It would be one thing if we were told that there was a last minute offer or a seasonal variation at play, but we were explicitly not. The question is what is the logical choice given this information, and that logical choice is to go on the more expensive trip. It doesn't matter if you'd make a different decision under different circumstances, it's fine for something to be logical under some circumstances but not under others.
The whole purpose of this discussion is that the point the article is trying to convey - that autistic people are more likely to consider other circumstances and are thus more logical because they avoid the sunk cost bias - is incorrect, because this is not an example of the sunk cost bias. The non-refundable cost you sunk into two things in the past is relevant to the decision of which one to keep, because you can use it to predict future prices. What the article claims most people do is in fact the rational approach, and what the article claims autistic people do is generally irrational, thus working against the thesis of the article.
> The question is what is the logical choice given this information
Not enough information is given to make any choice, and there's no intention to give it, since this is not a puzzle. It has no correct answer. It just mentions that some non-financial motivations are at play, but does not specify them (because it's not needed to make a point). You can't make a decision, because you don't have the data.
> The non-refundable cost you sunk into two things in the past is relevant to the decision of which one to keep because you can use it to predict future prices.
You don't have enough information to assume that, or that any future prices will be relevant at all. In fact, the article explicitly says that this cost is "irrelevant" - which makes sense, since you may have no intention to rebook that cancelled trip at all!
Not only that, but usually people would want to have more information than what is being presented. If the location is one I'd more like to visit, or if the location is the same but one of the modes of transportation is nicer, then I'd obviously choose the more expensive of the two. Also, if my intention is that I want to visit both locations anyway, then I would also choose the more expensive option.
So I don't know the specifics of the question at hand, or whether these autistic people were even able to ask these questions, but they seem rather important, and if they were in fact NOT asking them but had the opportunity to do so, then I'd question the value of some of the assumptions this article seems to make.
My girlfriend was diagnosed with autism 3 weeks ago and we had a related conversation just today. She said she feels more open-minded / less biased than other people. I thought it was because her different experiences were invalidated by society throughout her life. But this makes it sound like there's more to it. Very interesting.
This might sound strange, but I'm autistic and the first time I really noticed something was "wrong" was when I started school, met new people and noticed that all the adults and children thought in a really different way - in my mind they couldn't explain their actions or thoughts logically, and it made no sense to me why they were fine with that. Why were they happy to have a conversation where I found a gaping hole in their logic and... just go on with their lives without exploring it further? I always had a meltdown if I was asked to do something that could not be logically explained to me, and questioned my parents and peers for days or weeks if I learned things about the world that nobody could completely explain. It felt like I had accidentally been born on the wrong planet, among a different species with completely different thought processes that I couldn't understand.
It's very interesting. Some of the things my girlfriend described really highlight how weirdly we neurotypicals act. From many social norms and conventions to empathy, a lot of what we do only seems to make sense in a historic or evolutionary context, not in the moment, when you actually think about it. It really does feel like different species. I have no data, but I suspect people on the spectrum would get along great with other people on the spectrum; you just have the disadvantage that you are in the minority and seem to not be well understood by society.
It's interesting, as I have this conversation with my kids - i.e. I actually let them know there's no good reason for this, it's just a societal rule. However, I then try to explain that it is often advantageous to follow these rules even when they don't seem to make any sense, because of the advantages you get from being cohesive with society. It's a good exercise, and it also helps me think: there really are some dumb rules, and the calculus is weighing up when to disregard them.
That is exactly what I wish my parents had done with me (my meltdowns usually started after a few answers of "It's a tradition/rule") and what I do with my kid. I think one of the most important people for me growing up was my 5th-6th grade teacher, who instead of punishing me for doing something wrong actually took the time to understand me and explain the logic behind why I shouldn't do it - so instead of being punished for some specific silly reason I couldn't understand, I got a better understanding of society and could apply that logic to a wider spectrum of situations. And he understood that if he just explained things to me like that, there was no reason to punish me, as it wouldn't happen again anyway.
I never really had the "meltdowns" that I hear about in autistic people, unless particularly bad temper tantrums when I was 2-5 years old count, but I also agonized over how little sense "social rules" made. I would spend a lot of time pondering and thinking about these things.
Well, if we consider different causal factors, your proposed cause could be just as valid as a direct genetic cause.
It may be that the societal rejection autistic people face give them a perspective that enables a more rational worldview, or it could be directly caused by the autism itself.
> It may be that the societal rejection autistic people face give them a perspective that enables a more rational worldview, or it could be directly caused by the autism itself.
I think this would only work if all the other people who were also rejected (people with schizophrenia, bipolar disorder, ADHD, etc.) were also more rational, and I don't think this is the case (or even close to being the case). I'm guessing the researchers would have accounted for that.
Basically, there's a fundamental trade-off between the ebb and flow of social interaction and cohesion, which follows predefined and implicit rules, and the autistic ability to actually be objective and think rationally without being clouded by norms.
Everyone says they want rationality and unbiased thinking, but they don't really want it if it takes extra effort or time. This threw me for years. When I would point out gaps or mention hidden assumptions, I would get dismissed almost every time.
Like just because you have data to back up your point/idea doesn't mean your point is right. That's NOT what data-driven actually means. No, what people want is numbers to add to make a story seem more trustworthy.
Once I started treating "rational" or "data-driven" like a buzz word everything made sense.
It is very good to see more positive recognition for people that are neurodivergent including how we actually improve and fit into society as well.
I would argue that the greatest issue with neurotypical society over all is that it tends to value a singular mode of thinking and being as somehow inherently more valuable than others, failing to recognize that in our many differences we are actually stronger as a whole.
First names of elements started being capitalized. Then there was the "Black" thing, followed by the "White" thing. We seem to be headed back to the 1700s, when Important Words were capitalized.
I think the OP is referring to the article, which uses "Autistic" regardless of sentence position. Strangely, they use "non-autistic", though I'm unsure of any hard style rules about this.
We know that people aren't perfectly rational. The point of the article is that autistic people tend to be more rational on average than neurotypicals. The emotional weight that affects neurotypicals and causes them to fall into biases more often (on average) doesn't apply as often (on average); an autistic person will be less likely to behave differently when confronted with "80% fat free" vs "20% fat", to borrow an example from the article.
That's likely because (many) autistic people (myself included) have to learn to function with emotional regulation issues largely by second-guessing them. So the 'gut feeling' a neurotypical would tend to go with gets overridden by subsequent analysis in the autistic individual, in their attempt to 'calm the storm'. In my case, this causes me to 'throw out' most political hyperbole.
Unfortunately, if the shit really does hit the fan, this process can lead to validation of the emotions and an 'autistic meltdown'. So it's a double-edged sword, to be sure.
Could be; was just pointing out that the parent maybe wasn't really speaking to the point of the article.
Still, I find stuff like this super interesting. It helps dislodge the narrative that autism is a problem or disease, something that needs 'fixing', rather than just being different.
Supposing autistic people are more rational than the neurotypical: then couldn't we generalize the thesis to, "highly rational people challenge preconceived ideas about rationality"? I don't think the research claimed that people with autism have rationality-enhancing properties that are unavailable to people who don't. For example, "reduced use of stereotypes" applies to some otherwise-neurotypical people -- we would probably call these people "highly rational".
Put another way, I suppose I'm saying that there may be a second group -- "highly rational people" -- that intersects and overlaps significantly with "people with autism", and we could be making statements about one group that should be attributed to the other.
(I'm not trying to touch any nerves here. I'm putting aside the fact that the article is a positive piece about people with autism and makes some enlightening points. I'm just claiming that the "challenging preconceived ideas" may be mostly true, but also too narrow and possibly misleading.)
>> Imagine you have bought two non-refundable tickets to different trips, one much more costly. You are then told that you must cancel one of them. In this case, many people will cancel the cheaper trip regardless of which one they would prefer to go on – and even though they will have spent the same amount of money either way. Autistic individuals appear to be more likely to make a choice based on their personal preference rather than on an irrelevant cost.
This is missing a huge piece of context. What about the possibility that you may later decide to repurchase whichever trip you decline now? My future cost is reduced by canceling the cheaper trip now. If both are in the realm of "I'd like to go there someday" and close in appeal, it is more rational to take the more expensive one. If we don't consider that larger (possible) context, then obviously take the one you prefer the most.
ADHD people have a different calculus that affects their project management.
Non-ADHD people typically respond to deadlines with increased urgency, commitment, and re-factoring assessments.
ADHD may respond to deadlines by abandoning tasks and starting new extraneous tasks.
Is that rational? It's what happens, and since it is such a dramatic and consequential difference, it warrants a reconsideration of the meaning of the term "rational", and its limitations.
ADHD isn’t an isolated minority. It’s a transient condition in a significant number of people, prevalent enough in the population to make economic text books about monolithic “rationality” unfit for purpose.
In what sense!? From what god is Rationality (TM) handed down? Which parts of society are Rational? Which of my sentences is Rational?
Ostensibly Rational just means “with reasons” or “calculated” but there are many different calculuses under which one could operate.
I agree with GP. Economic rational agents are supposed to do things like maximize utility or minimize loss. I know that I for one do things like “anticipate others’ needs”, “feel ambivalent about eating the entire box of donuts”, and “take a day off work to practice origami”, and I’m not sure how those actions fit into the economic model.
It's not necessary to pull economics into the picture - the general definition is that a rational choice is making the choice that best furthers your goals. If you make suboptimal choices (with respect to your own goals - sacrificing "socially desirable" goals in order to achieve some desire of your own isn't irrational), that's an irrational choice, and people often do that for various reasons. We're not horribly bad at achieving our goals, but we're also nowhere near perfect. Some types of suboptimal choices are systematic, predictable deviations which we can study as biases.
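That definition ("the choice that best furthers your goals") can be sketched as a toy utility maximization. The options and utility numbers below are invented for illustration; the only point is that rationality is relative to the agent's own goals, and a bias is a systematic deviation from this maximum.

```python
# Toy model of rational choice: pick the option that best furthers
# your own goals. Options and utilities are made up for illustration.
options = {
    "eat the entire box of donuts": -5,        # conflicts with a health goal
    "eat two donuts": 3,
    "take a day off to practice origami": 4,   # furthers a personal goal
}

# The rational choice is simply the option with the highest utility
# as measured against the agent's own goals.
rational_choice = max(options, key=options.get)

# A bias, in this framing, is any decision rule that predictably
# selects an option with lower utility than this maximum.
```

Note that "take a day off to practice origami" is perfectly rational here precisely because the utilities reflect the agent's own goals, not "socially desirable" ones.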
Is it truly transient? I know some people "grow out of it", but it seems plenty of people are afflicted for life. Am I misunderstanding what transient means in this context?
It's not transient, it's a neurodevelopmental disorder that changes the fundamental structure of the brain and how it functions (this is a bit of a simplification, but it's true enough for this context).
People who 'grow out' of having ADHD either:
1. Developed coping methods that lessened the impact of their symptoms, making them functional enough to not be diagnosable (ADHD is only diagnosed if it negatively impacts your ability to function in two or more of the domains of work, social life, and home/family life). Often people who do this are still negatively impacted by their condition, but their problems are invisible and go unnoticed.
2. Never had ADHD to begin with, and instead had one of the many other psychiatric or physical conditions that can impact executive functioning (e.g. depression, sleep disorders, anxiety, malnutrition, etc.).
(wow I used the word impact a lot in this paragraph)
Right, that’s what I thought. I’ve been diagnosed and it doesn’t feel transient in the slightest so that seemed like a peculiar statement. Thanks for your impactful response!
That the "rational man" is still used as a concept as often as it is, is itself proof that the concept is as broken as can be. The modern consumer economy is based on irrationality.
I wouldn't say it's based on irrationality, but based on other things that aren't necessarily rational or irrational. Just manipulation of our rational+emotional brain sitting on top of our monkey brain which is sitting on top of our lizard brains.
In psychology undergrad, I read about a study of anorexic people: they had people rate their own attractiveness, and had strangers rate their attractiveness. The normal population consistently rated themselves higher in attractiveness than strangers did on average. However, anorexics' self-ratings were much closer to the stranger ratings.
(May not be remembering this correctly and I assume this study was done in the 90s when thinness was more in fashion - wouldn't be surprised if it didn't replicate if done today)
See also: the Dunning-Kruger effect.
My takeaway is that the 'normal' human brain lies to itself in many ways which protect the ego. Some disorders are caused not by disconnection from reality, but rather too accurate a view of reality.
I wouldn't be surprised if there are many autistic traits that have a similar origin: the brain's socio-protective instinctual lies are failing, and autistic people are actually acting more rationally.
> Some disorders are caused not by disconnection from reality, but rather too accurate a view of reality.
Yeah, I've heard this about depression. [1]
> Depressive realism is the hypothesis developed by Lauren Alloy and Lyn Yvonne Abramson that depressed individuals make more realistic inferences than non-depressed individuals. Although depressed individuals are thought to have a negative cognitive bias that results in recurrent, negative automatic thoughts, maladaptive behaviors, and dysfunctional world beliefs, depressive realism argues not only that this negativity may reflect a more accurate appraisal of the world but also that non-depressed individuals' appraisals are positively biased.
"Autistic individuals are, on average, more consistent, less biased, and more rational than non-autistic individuals in a variety of contexts. (We use identity-first language, eg, ‘Autistic people’, as it is preferred by many people on the autism spectrum.)"
Hmmm. What are they implying by that parenthetical comment placement?
There's quite a bit of politicking in the autism movement over whether one should preferentially say 'people with autism' or 'autistic people'. The former (formally called "person-first language") is occasionally objected to because it frames 'autism' as a disease (compare 'people with cancer') as opposed to a sort of plausibly-benign divergence in cognition. As one might expect with such matters, there are reasonable arguments for either choice.
In psychology and social services there is a push to use terms like “person with Autism” or “person with autistic traits” as opposed to “autistic person” as many people seem to enjoy not being primarily labeled by something that only makes up a small part of their personality and an even smaller part of their personhood.
Think of it like primitive vs boxed types in Java: 'int a = 7; Integer b = 7;'
"a" is 7, but "b" just holds a reference to a 7.
---
I guess autistic people must not mind, or may even prefer, identifying this way, but it can really help in a therapeutic setting not to call someone "a borderline woman" or "a psychotic woman", because for better or worse labeling someone as mentally unhealthy is colloquially tantamount to an insult.
The point here is that autistic people who prefer this term don’t think it has anything to do with mental health or disease, and do think it’s a fundamental aspect of who they are, which can’t be separated.
The ironic part is that non-autistic people won't trust autistic people's valuable unbiased opinions because of their bias against non-conforming people.
I had an autistic coworker who could not understand using pronouns outside of the already established ones (he/she). She was otherwise very “progressive” but didn’t consider herself such. It had to do with he/she mapping to (99% of the time) defined biological features.
Most people go along with new pronouns etc. due to peer pressure. You haven't convinced them rationally; rather, you've convinced them that it costs more to fight the movement than to just fall in line. But a person who needs rational reasons to change how they speak will be much harder to "convince", since they won't change anything until they fully buy into the new usage.
The peer-pressured group might think they properly thought it through, but mostly all you need is an appeal to authority and they fall in line.
Yes, this is exactly how most people in the U.S. and the West more generally would relate to the whole "pronouns/grammatical gender" thing. SV is a bubble.
I think many people are reluctant to use non-traditional pronouns. I’m not trying to condemn or condone that, just pointing out that to me this example seems unrelated to autism.
Judging from how I was changing my opinion on similar matters when I was growing up, I guess she simply lacks the insight into why someone would feel the need to reject the established pronouns - she probably doesn't feel that need herself, so she doesn't have any frame of reference to be able to consider that until someone explains it to her, which makes her naturally gravitate towards seemingly unambiguous and clear grammatical rules that "make sense".
I'd guess that it's pretty common for autistic people to fight concepts like singular "they" just out of the sense of maintaining linguistic order, uncorrelated with whether they actually see the need for gender-neutral and non-binary pronouns or not (which can be a source of frustrating misunderstandings that assume bad intent when there's none).
For me, it only "clicked" once I understood that gender and sexuality are completely arbitrary and subjective social constructs that try to describe a whole spectrum of multidimensional behaviors and (potentially repressed) feelings, so there's little point in trying to objectively categorize them - it's all about the subjective impression of the person themself, which makes it obvious that the language should be able to actually express their identities and that it doesn't help anyone to try to force some categorization on them.
"For me, it only "clicked" once I understood that gender and sexuality are completely arbitrary and subjective social constructs that try to describe a whole spectrum of multidimensional behaviors and (potentially repressed) feelings, so there's little point in trying to objectively categorize them - it's all about the subjective impression of the person themself"
Sex is a fundamental property of living organisms; it is not determined by thoughts or feelings.
Sex and gender are different things. Neither is easily defined in a strict unambiguous manner, despite what we might be taught in high school biology.
To be clear, this isn’t a political view point, it’s a scientific view point, there’s no singularly accepted way of defining sex in human. Unfortunately nature has this amazing ability to conjure up exceptions to every seemingly reasonable definition of male/female, and it doesn’t give two shits about our desire to arrange the world into neat little categories.
Sex is almost always unambiguously defined, despite there being exceptions. There is no problem with having biological categories. It is not controversial in science that humans are a sexually dimorphic species.
To claim otherwise is definitely political.
Gender is an entirely more complex social phenomenon and indeed isn't directly related to sexual dimorphism, and evolving gender politics are totally legitimate.
I agree in the vast majority of individuals sex is clear cut and unambiguous. It’s also clear that humans are sexually dimorphic.
However that doesn’t mean that ambiguity in sex doesn’t exist, and that sexual ambiguous in individuals is impossible. Sexual ambiguity isn’t common, but equally it doesn’t represent an aberration or break some natural law.
So I take issue with the idea that determining sex is universally trivial, and with those who dismiss real cases of sexual ambiguity as political correctness gone wrong. It's just that sometimes people are born who don't fit neatly into commonly held categories. That doesn't make them special; it just means they're unique on an axis that most people aren't. Most of the time that's nothing more than an interesting observation, but sometimes these people need help to understand how they fit into a world that culturally assumes they don't exist.
“Of course, but that's not really related to what I said.”
You said sexuality is a “completely arbitrary and subjective social construct” and that “there's little point in trying to objectively categorize” it. Sex has been studied since animal husbandry existed. So what you said was obviously wrong.
> so there's little point in trying to objectively categorize them
I think there's more than a little utility provided by the communication it enables. I'm all for non binary identities and letting people identify across them as they want, however with any change we must also recognize the utility in the previous norms so that we can preserve some useful aspects as we construct new norms.
Can you give an example of such utility? Frankly, I don't see it - in my opinion, the best options are either not having genderized pronouns at all (and that's the option I'd actually prefer), or expressing the whole range of identities with them. Going somewhere in the middle is nothing more than just asking for dissonance to happen, which isn't useful.
Sure, the current utility is that most people do currently comfortably fit into the binary. We should introduce language to account for people that don't, and if demographics shifted such that most people didn't feel like the current binary fit them then we should adjust, but currently the majority of the population is happily self identifying within the binary and it's a great shortcut for them to communicate some assumptions about their identity. Only assumptions, not hard rules, but there's still utility in that.
It's true that most people fit into binary, but I don't really see how that's relevant.
1) If we assume that it's essential to genderize pronouns, it doesn't really matter what the majority fits into, because the existence of other options does not affect that majority at all. The only case where it matters is when someone doesn't fit. The utility remains unaffected (in fact, it actually increases because of better expressivity).
2) If we assume that it's not necessary to genderize pronouns, then it may be argued that we're losing some information (the binary that the vast majority of people comfortably fit into) - but I don't really understand why we actually need that information. When I refer to other people, it's extremely rare that I do it in a context that requires me to mention their gender identity (or what they have between their legs). In those rare cases where it's actually relevant, I wouldn't mind having to express it explicitly, so overall the utility seems dubious.
> It's true that most people fit into binary, but I don't really see how that's relevant.
It's relevant because it's efficient more than 99% of the time and removing it introduces ambiguity 99% of the time. The person you're responding to even said they didn't have a problem with adding more pronouns, just not making it worse by removing them.
Your second point is wrong. Obviously it narrows specificity by half the room on average. I don't know why you'd argue against that obvious fact.
It's basic math: if you have a set C that is the union of two sets A and B, where A and B have the same cardinality, referring to "a C" leaves you with twice as many possibilities as referring to either "an A" or "a B". So it's measurably twice as specific to do the latter. Since so many unrelated languages in the world ended up with such a system (or very close to it), it's reasonable to think that that efficiency was worth it. Since most of those systems are not much more specific, it's reasonable to think that being more specific wasn't worth it (one can always specify further using more words).
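The halving can be restated in information terms. This is a toy model with invented numbers, assuming an exact 50/50 split between the two sets:

```python
import math

# Toy model: a group of 8 people, split evenly between "he" and "she".
group = ["he"] * 4 + ["she"] * 4

neutral_candidates = len(group)          # "they" could mean any of the 8
gendered_candidates = group.count("he")  # "he" narrows it down to 4

# Under this even split, a gendered pronoun conveys exactly one
# extra bit of information about who is being referred to.
extra_bits = math.log2(neutral_candidates / gendered_candidates)
```

Whether that one bit is worth the grammatical overhead is, of course, exactly what the rest of this thread is arguing about.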
Of course it narrows down specificity. My point is that in today's society, it seems mostly useless to me. Most of the time I don't need to narrow it down this way at all. This may have been different in a world where people were segregated by sex so intensively that half of the population didn't even have the right to vote, but today I fail to see the usefulness of it.
Then you fail to see the usefulness of specificity and efficiency in speech, which is both weird and explains why it took you so many words to say that.
If I'll want to refer to you and this conversation when talking to someone else, I don't need to refer to your gender at all. Just like I don't need to refer to your race, your social class or color of your hair. Stopping to consider whether I should use "he" or "she" (or maybe something else) is the exact opposite of efficiency in speech.
It seems to me that it's actually you who misunderstands the usefulness of specificity. It's not useful to be overspecific.
Obviously it depends on your goals and the context. Both specificity and generality are useful. To ban one is foolishness. They coalesced into short words for a reason: people use them, a lot.
We can exchange truisms all day ;) But that doesn't change the fact that in my experience specificity related to gender pronouns is needed (or even helpful) only in a very tiny minority of everyday contexts.
For the record, my native tongue is much more gendered than English (it has gendered nouns, verbs and adjectives; not just pronouns) - I don't understand how it's useful at all, I don't miss it in English.
I think you're underselling the importance people find in their gender.
Personally I agree; for me gender holds little importance. If I'm being most true to myself, I identify as non-binary simply because I don't really identify with a gendered label. That motivation has also led me to be ok with being gendered male, because it just doesn't matter to me.
I understand I need to look outside of my own experience to see the importance people place on gender, though. You can say all you want that most people don't care, but I feel that if you misgender people, a lot of them would be very upset. Trans people are very vulnerable to suicide because of this; to diminish the importance of gender (including the binary, which many trans people want to fit into) to these people is, at best, lacking in empathy.
Now if you're approaching this from a gender-abolitionist angle, where you believe all this attachment to gender is socialized and that we should push to de-emphasize gender's role in society, then I believe that's a far more defensible position. But I feel you need to at least recognize the importance gender has to people today (socialized or not) if you're to have any hope of bridging that gap with people.
To the contrary, I believe that people's gender is usually extremely important to them. What I find less important is having to specify their gender whenever I'm talking about them just because of language constructs. In English it isn't actually that bad, since it's limited to using correct pronouns; but in some other languages I have to be careful to not misgender anyone pretty much whenever I talk about them or to them - regardless of whether their gender is relevant to what I'm saying or not.
So I'm only a grammatical gender abolitionist :) I don't see the point of gendering people when I talk about them unless I talk specifically about their gender. As a happy side effect, this would also massively reduce the risk of accidentally misgendering someone.
The argument that gender (and pronouns) are an arbitrary social construct does not imply that they're the individual's unilateral choice. The fact that naming people and referring to them is a social construct means that social consensus determines how people are addressed, and the individual does not get a veto - e.g. if someone asserts that their identity requires them to be called Your Majesty, society will simply ignore that demand. So the demand for non-standard pronouns is essentially up to the society: someone may want others to use e.g. xe/xir, but that desire does not necessarily have to be honored. Since it's an arbitrary social construct, it can plausibly differ between subcommunities; in some, these pronouns fit the social construct and using them is mandatory, while in others it goes against the social construct and demanding their use is considered unreasonable.
There are many other parallels - e.g. the criteria for using (and expecting or demanding the use of) formal vs informal "you" in many languages, or how mandatory it is to use specific prefixes or honorifics (Sir/Ms/Dr). In all these cases it's an arbitrary social construct, and the wishes of the individual can be and are shunned whenever they go beyond what the locally prevailing social norms require.
Sure it does. We're a sexually dimorphic species that can only reproduce with one XY and one XX.
Differentiating between them is vital to the continued existence of the species.
Talking with your buddies of the same gender about how you were hanging out alone with one of the opposite was a tribally significant thing in the early days of speech. It still today implies you might be mating! And babies might be forthcoming.
Maybe for some people? Most of my buddies are of the opposite gender, and me hanging out with one of them doesn’t imply anything different from me hanging out with someone of my own gender.
There are definitely people it matters to. I sometimes get tired of the usual remarks and end up awkwardly using "they" or phrasing tricks to try to avoid them.
We're sexually dimorphic, but linguistic gender is not based on biological sex, and sex identification is merely one of a literally infinite number of potential bits of information that one may wish to convey with language. The fact that an attractive woman in her reproductive prime is referred to with the same pronouns as your grandmother and the Titanic is indicative of how useful pronouns are for identifying who may be mating.
It reduces ambiguity, narrowing the possible referents by half the room on average. I suspect that efficiency trade-off showed up in so many unrelated languages because it was useful. It didn't get more specific because that wasn't useful enough. Languages evolve too.
Yeah, imagine getting upset when someone with a disease almost defined by being slow on the uptake and honest about it hasn't acquired a cultural change that happened during their adulthood.
I don't think that's a trait common to neurotypicals or any group in particular. All people lend greater trust to the opinions of others who think similarly/have similar experiences to themselves.
As we explained in a recent review paper, researchers have repeatedly found evidence that Autistic individuals are, on average, more consistent, less biased, and more rational than non-autistic individuals in a variety of contexts.
Specifically, many Autistic people seem to be less susceptible to cognitive biases, and therefore better able to make judgments and reach decisions in a more traditionally ‘rational’ manner.
Interesting if true; it could indicate that at least mild Autism is a beneficial adaptation. Those biases probably came about for good reasons, but it could be that they've become obsolete and are no longer worth the cost.