Epistemic Learned Helplessness (2013) (slatestarcodex.com)
146 points by norswap on June 6, 2019 | 60 comments



Happy to see this here again.

I suspect that argument from authority, like "goto considered harmful" and Moore's Law, has entered the public consciousness in a warped popular form that barely resembles the original.

In topics in which I am not a subject matter expert, I can and should defer to the actual subject matter experts, absent very compelling evidence to the contrary. I'm not perfect at this, but I first read Epistemic Learned Helplessness at about the right time several years ago and it re-appears unbidden in my consciousness to hit me in the back of the head any time I start thinking an expert might be wrong in their area of expertise.

Reluctance to do this because of misguided applications of the argument-from-authority fallacy is probably one of the root causes of antivaxx, climate "skepticism", flat-eartherism, and other hot political topics with a vocal population who have convinced themselves that authorities are not to be trusted. You can never successfully argue, for instance, that "AGW is probably true because 98% of climate scientists agree with its foundations", because the answer is always, "that's the argument-from-authority fallacy", even though reliance on authority is the correct approach. What you inevitably get instead is two or more laypeople clumsily trying to debate the nuances of a subject that they have no academic background in.


I think the (or at least an) underlying issue is the conflation of arguments with their respective fallacies. Saying "98% of climate scientists agree that AGW is happening" IS an argument from authority, but it's not an argument-from-authority fallacy, because they are in fact authorities and we should listen to them. It's only the argument-from-authority fallacy if there isn't sufficient evidence that this authority is most likely correct on this count.

Just as pointing out a slippery slope may be entirely correct: it's only a slippery-slope fallacy if there's no evidence that the slope is, in fact, slippery.


> Saying "98% of climate scientists agree that AGW is happening" IS an argument from authority, but it's not an argument-from-authority fallacy, because they are in fact authorities and we should listen to them.

That is the claim that people make when they make an argument from authority. That doesn't make it not a logical fallacy.

> It's only argument-from-authority if there isn't sufficient evidence that this authority is most likely correct on this count.

Sometimes this works (the Earth orbits the Sun) and sometimes it doesn't (sanitizing hands/scalpels is actually a good idea between dissecting cadavers and doing c-sections).


> Sometimes this works (the Earth orbits the Sun) and sometimes it doesn't (sanitizing hands/scalpels is actually a good idea between dissecting cadavers and doing c-sections).

This is actually a perfect example of the difference between argument-from-authority and argument-from-authority-fallacy!

Galileo didn't say "the Earth orbits the Sun because I say so and shut up." He said "the Earth orbits the Sun, here take a look."

Not fallacy.

The physicians who refused to wash their hands didn't say "washing hands has no measurable impact on mortality and here's our data to prove it", they said "gentleman doctors do not have dirty hands, and shut up."

Fallacy.


> That is the claim that people make when they make an argument from authority. That doesn't make it not a logical fallacy.

This is the point where branding things as fallacies breaks down, and is very much the point of the article.

Calling “argument from authority” on somebody’s point is one of the weakest rebuttals you can provide, because you’re not really tackling the substance of the argument.

In this case, “98% of scientists agree that AGW is happening” implicitly says “and I have all the arguments provided by those 98% on my side”. Presumably, if AGW isn’t happening, their arguments are wrong somehow, and you should be able to point out how. It’s not just an appeal to authority; it’s saying “here’s a big body of research, tell me where it went wrong”.

Ultimately, nobody is an expert on everything, and heuristic thinking has to take over at some point. Overwhelming expert consensus is a damn good heuristic to go by.


> Ultimately, nobody is an expert on everything, and heuristic thinking has to take over at some point. Overwhelming expert consensus is a damn good heuristic to go by.

Sometimes.

I don't know enough about physics to even really understand the standard model. I still believe that it's pretty much true.

However, this approach is a bit limited. There simply aren't experts in macroeconomics who could convince me of anything on an appeal-to-authority basis in that same way.

Ultimately, arguments are much cleaner and more convincing if you can simply and directly argue the point. For those arguments that are not so amenable, expert consensus may be a practical alternative, but at best it's a least bad choice, not something to be lauded.


> any time I start thinking an expert might be wrong in their area of expertise

But on the other hand....

Why Most Published Research Findings Are False https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1182327/


Yeah. A well-functioning civilization would be epistemically like an efficient market. In ours, it's relatively hard to get rewarded to improve the consensus estimate about most things: https://www.theatlantic.com/science/archive/2018/08/scientis...


That's a very interesting approach in the article, and yet something like that will likely never see the light of day. In my opinion, much of the blame for the decrease in public trust of experts lies with the experts themselves, for their lack of epistemic honesty and apparent lack of interest in self-improvement.


It seems to me that another name for this vice is something like the Dunning-Kruger effect, or, equivalently, simple laziness. Sure, I might think that I've found a mistake in general relativity or something. But I also know, or I should know, that physicists have put in years and years of work to understand those arguments and I just haven't.

So the failure to exercise an appropriate degree of epistemic learned helplessness might also be interpreted as a failure to recognize, or a lazy unwillingness to accept, that acquiring subject-specific knowledge (as well as the subject-specific metacognitive knowledge that tells one that one lacks substantive knowledge) takes actual work which one has not put in.


> "In topics in which I am not a subject matter expert, I can and should defer to the actual subject matter experts..."

I'd contend the word 'defer' is too strong. One should accept that SMEs have made reasonable arguments based on information you don't have (like the many wrong lines of reasoning they've already attempted). "Epistemic learned helplessness" would then lead you to defer to the expert, but that's not the only rational response in every case.

There's a different skill, discussed secondarily in the essay, for sniffing out BS. That's probably best termed 'wisdom', or informed meta-pattern-matching. So another rational response is to use your wisdom to decide how much to trust the expert. Trusting the expert is not necessarily an either-or thing.


I also like to think of it as a lack of imagination about how deep knowledge in an area can go, when it comes to people with advanced degrees in a given field. It's either that they assume expert knowledge is shallow, or that the principles from their own knowledge area apply elsewhere. At the same time, people with advanced degrees in an area are the only people who are constantly contending with what humanity DOESN'T know about a subject, so they are less likely to position themselves as authoritative. To a bystander, an expert saying they're making a calculated guess (which in reality has a five-nines chance of being true) can wither against an argument from someone who speaks with absolute, ignorant conviction.


There is a connection in his last paragraph to entrepreneurship - I can't put my finger on the exact phrasing, but I've seen before on HN an essay about the need for an entrepreneur to be "all in" and genuinely swayed by their own arguments. I've definitely observed that in successful business people around me, and observed that I can't seem to do that to myself. While I'm not an entrepreneur, I've often thought to start businesses, but I can't ever seem to summon the self-certainty to commit to any ideas that I suspect are good and would generate value.

Put another way, if you are susceptible to argument, you must be willing to accept the risks, or be unaware that there are risks, in committing fully to an argument and a premise. This seems particularly applicable to startups, where oftentimes evidence of demand, market size, or growth is conspicuously absent at the moment of inception.


I like the notion of "strong opinions, weakly held": https://bobsutton.typepad.com/my_weblog/2006/07/strong_opini...

""" Bob explained that weak opinions are problematic because people aren’t inspired to develop the best arguments possible for them, or to put forth the energy required to test them. Bob explained that it was just as important, however, to not be too attached to what you believe because, otherwise, it undermines your ability to “see” and “hear” evidence that clashes with your opinions. """

I think these strong opinions change when there is information to do so, and that's probably where the whole "stay focused and keep shipping" idea comes from: collect soft feedback, hard metrics, and otherwise information by providing a product-as-a-hypothesis. The more often you ship and measure, the better your direction and your alignment with what people will pay for.


Or to put it another way, the entrepreneur believes the conclusion as if it were the premise and then tries to find the appropriate rationalization to support it.

Just like, say, a theist or an atheist, between whom sits the agnostic (the epistemically helpless person in this case).


Just about everyone who ever did anything significant or groundbreaking had a whole passel of people nay-saying and saying they were crazy, trying to drag them back like crabs clutching at each other. Of course, there's survivor bias there, where the cranks who are right are remembered, but the legions of cranks who are wrong largely fade into oblivion.


See also: Fyre Festival, Elizabeth Holmes


But also see Steve Jobs's reality distortion field?


> Arguments? You can prove anything with arguments.

This reminds me of a very interesting (and accessible!) article by the mathematician Terence Tao about what major mathematical results tend to look like, and how to be more skeptical of your own work: https://terrytao.wordpress.com/career-advice/be-sceptical-of...

> If you unexpectedly find a problem solving itself almost effortlessly, and you can’t quite see why, you should try to analyse your solution more sceptically. In particular, the method may also be able to prove much stronger statements which are known to be false, which would imply that there is a flaw in the method.
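
A hypothetical illustration of that test (my own example, not one from Tao's article): suppose a purported argument claims to show that every bounded real sequence converges, without ever using anything stronger than boundedness. Checking it against a statement known to be false:

$$a_n = (-1)^n, \qquad |a_n| \le 1 \ \text{for all } n, \qquad \text{yet} \ \lim_{n \to \infty} a_n \ \text{does not exist},$$

so the method "proves" a known falsehood and must contain a flaw, even before anyone has located where that flaw is.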


"Epistemic Learned Helplessness," is simply shorthand for, "We are all finite and have an incomplete understanding for everything, yet we must still pragmatically make our way in the world."

You don't have to instantly believe arguments. Sit back and gather data to assess the predictive power of the model.


Except the problem the author identifies is that in many fields, such as history, there's no way for the lay person to judge the predictive power of competing theories, short of expending outsized effort to become an expert themselves.

Since they cannot judge theories scientifically, and they cannot judge them based on how well they're presented, they can only fall back on social consensus and instinctive sense of weirdness. "Trust the academy", would seem to be the more accurate shorthand.


> Since they cannot judge theories scientifically, and they cannot judge them based on how well they're presented, they can only fall back on social consensus and instinctive sense of weirdness.

Why judge at all? Or rather, why develop certainty when you lack the capacity to judge accurately?

The answer is that certainty, even when it is unwarranted, is sometimes useful and necessary. Certainty can also lead to major oversights and failures so it is something we should use with caution and care. When assessing the appropriate level of certainty we need to address not only our own level of expertise, but also the utility (and risk) which adopting a particular certainty provides.

However, I find this article rather misguided in its conclusions:

> And so on for several more iterations, until the labyrinth of doubt seemed inescapable. What finally broke me out wasn’t so much the lucidity of the consensus view so much as starting to sample different crackpots. Some were almost as bright and rhetorically gifted as Velikovsky, all presented insurmountable evidence for their theories, and all had mutually exclusive ideas. After all, Noah’s Flood couldn’t have been a cultural memory both of the fall of Atlantis and of a change in the Earth’s orbit, let alone of a lost Ice Age civilization or of megatsunamis from a meteor strike. So given that at least some of those arguments are wrong and all seemed practically proven, I am obviously just gullible in the field of ancient history. Given a total lack of independent intellectual steering power and no desire to spend thirty years building an independent knowledge base of Near Eastern history, I choose to just accept the ideas of the prestigious people with professorships in Archaeology, rather than those of the universally reviled crackpots who write books about Venus being a comet.

The author has discovered their lack of expertise in ancient history. Knowing that, it seems to me that adopting a high level of certainty about ancient history based on "mainstream" or "the academy", without a clear idea of the utility of adopting that certainty, is epistemically irresponsible or at least unnecessary.



Yeah, this doesn't really seem that groundbreaking to me. You just try your best and take new information in as it becomes available (assuming the information is trustworthy). Then again, I build software. Even when there are competing ways to do something, you just weigh the pros and cons (that you are aware of) of each solution across various dimensions to determine which is better (assuming you can weight the relative importance of each dimension - which itself isn't always easy to do). The more information you have, the easier it is to decide on a course of action.

I think for stuff like religion or philosophy, you really move away from the domain of "correct/incorrect" to more conditional arguments: philosophy Y makes sense if your values are X. One is only better than another if it aligns more with your values and experiences. Your values are going to be determined by the totality of your life experiences and genetics. You can argue that holding a certain position isn't pragmatic within a given context, but you can't really say it is wrong, in my view. Of course there are lots of smart people on this site, so I'm curious to see what others may think of my views on philosophy.


This resonates with me strongly. I remember reading G.K. Chesterton writing about how the most certain people in the world are locked up in the asylum, absolutely certain of completely false things ("I am Napoleon") while the rational among us grasp even our religious beliefs with a certain amount of healthy uncertainty.

Political comment threads strike me as a good example of people on every side of every issue being 100% convinced of things that could not possibly all be true.


Indeed, taking one idea at a time seriously leads to crazy conclusions. You have to take all the ideas you believe seriously.

If you took seriously the one idea that "the purpose of human life is to produce more humans" [SolaceQuantum below], you should be trying to have as many children as possible. For a male with upper-middle class resources, 1000 children should be achievable. (You don't have to raise them, just "produce" them.) A billionaire with strong organizational skills might manage 10^6 children.

This conflicts with the (unstated) idea that children should be raised well, in a nurturing environment with good parenting. Also, that overpopulation is bad. And that other people have an equal right to children. And ...

So I don't think you need epistemic helplessness. You just need a broad perspective.


> But to these I’d add that a sufficiently smart engineer has never been burned by arguments above his skill level before, has never had any reason to develop epistemic learned helplessness. If Osama comes up to him with a really good argument for terrorism, he thinks “Oh, there’s a good argument for terrorism. I guess I should become a terrorist,” as opposed to “Arguments? You can prove anything with arguments. I’ll just stay right here and not blow myself up.”

Wow! This explains why so many antivaxxers and the like are highly educated people, often with engineering-like backgrounds. I never realized that until reading this. They're smart enough to never have developed a thick enough skin against convincing-but-wrong arguments, which means that once someone does convince them of something (despite their smartness), they're (we're?) all-in.

Makes me wonder how vulnerable I am to all this. I'm sure I'm in the target demographic here. Maybe Osama would turn me into a suicide bomber too? Or into an antivaxxer :-)


I'd take his argument even further. I am pretty good at programming. When I have written a program, I believe it's right.

I always test, before moving to production. Always. Even if it seems trivial to prove beyond a shadow of a doubt, by argument, that this code is correct. Always test, before moving to production.


We programmers get our noses rubbed in our own mistakes many times a day. Once in a while we get to find out that code we were sure of for years was embarrassingly wrong. You would expect people with so much experience of fallibility to become especially aware of it, especially ready to have their minds changed by argument, when it comes to unrelated even-more-complicated topics like politics and economics... if you were an alien unexposed to human nature.


It always amazes me how much code I encounter that clearly no one has ever tried to execute. People will have a week of check-ins building some API out, and it maybe compiles, but as soon as you try to just hit it with Postman with a real request, it blows up like a grenade going off in a glass shop.

I just don't know how people do it. I have to iterate and poke at things constantly as I go.


> I don’t think I’m overselling myself too much to expect that I could argue circles around the average uneducated person.

Everyone thinks they are better than average ;)


True, but if you read a bit of Scott's writing it will become apparent that in this respect he is better than average, by a long way. Of course being good at writing is not the same as being good at talking, but I am willing to trust his opinion that he's good at talking too. (He's a psychiatrist by profession and presumably does a lot of Talking To People About Stuff.)


Not I. I'm sure that my humility and self-doubt are WAY higher than average! ;)


A fun Ask HN thread would be, "At what, or in what way, are you below average?" It's one thing to admit you suck at something, but quite another to recognize that you are below the mean, or worse, the median.

It's too easy to fall into the trap of inverting a positive trait and say, "I'm probably less un-awesome than average," or to exaggerate false humility.

Sample bias comes in as well, since among my rich friends, I am below average net worth, but still above the 75th percentile nationally. It's like that stupid job interview question, "what is your greatest weakness?[0]"

It's an interesting exercise anyway.

([0]Apparently one wrong answer is, "suffering fools gladly.")


Scott Alexander (SlateStarCodex) is most definitely above average when it comes to writing and argument-making. He might be the most convincing writer I know of.

https://slatestarcodex.com/2017/10/02/different-worlds/


He's certainly convinced you ;)


Perhaps you’d be willing to share an example of an idea about which he has convinced you to change your mind?


I'm curious to query the HN crowd: how do we apply this learned helplessness to subjects for which there are no definitive studies? (e.g. is the purpose of a human life to produce more humans? How does one formally prove that, and how does one prove those proofs are valid, and that the organizations that prove the proofs are valid are also valid?)


I feel like people who study subjective topics like the value of human life (philosophers) have already internalized "learned helplessness". It's empirically oriented people like us who have a harder time grappling with the idea of not knowing everything.


> It's empirically oriented people like us

Translation: People who have spent so much time playing with highly abstracted and greatly simplified models, they've lost their grasp on how messy and complicated the real world can be.


More than that, we're used to dealing with maps when the territory may be inherently unmappable. It may not just be hard to produce an accurate map for the entire territory, it may be impossible.


I think that being empirically-minded requires, in part, accepting that not everything is known. Science differentiated itself from the rest of philosophy when it put aside big but unanswerable questions in favor of smaller questions that could be addressed empirically.


> I feel like people who study subjective topics like the value of human life (philosophers) have already internalized "learned helplessness".

This.


You can't do that, because any proof relating to how humans should live would have to start from axioms like "suffering is bad" or "other people are probably real rather than imaginary" or "suffering is bad, and other people are real, and it's bad when I suffer, and it's also bad when other people suffer", and you're not going to get everyone to agree on those.

The best you can do is to make arguments like "if you accept all the axioms I mentioned above, you should also accept this conclusion".


The online version is just to avoid believing things. We don't have independent knowledge of most of what gets reported on the Internet and we're not here to accomplish anything much, so we don't have to commit to any beliefs.

(This is the opposite of the "instant expert syndrome" to which many people seem susceptible, where they are certain they know which side to take about whatever was recently in the news.)

In real life, you probably do have commitments and decisions to make.


It seems to me that these issues would fall into the author's category of things you need not take seriously: in the first case, don't look for proofs of a proposition that is not well-defined enough to be amenable to either proof or refutation, and in the second, don't expect to reach the end of a line of inquiry that you can see will never end.


Check out his follow-up essay, which raises the possibility that tradition in general might be much more likely to be a useful authority than we give it credit for.


At the root you should ask whether a question is a mechanistic one or a philosophical one.


I dislike the name, but I can't think of a better one for the mental condition.

I find the easiest way out of the "we're in a simulation" conundrum is to ask why you would behave differently, ethically, inside a simulation than outside: doesn't the fact that you display consistency in your behaviour define you as you?


>That is, if you have a valid argument for something, then you should accept the conclusion. Even if the conclusion is unpopular, or inconvenient, or you don’t like it. He envisioned an art of rationality that would make people believe something after it had been proven to them.

Lewis Carroll wrote an essay on this topic:

https://en.m.wikisource.org/wiki/What_the_Tortoise_Said_to_A...



Pascal’s Wager seems trivially countered by pointing out that there’s an equal amount of evidence for the opposite hypothesis.

Or, in a more refined way: there are effectively infinitely many theories that have equal evidence (namely, none), and consequently you should give each one only 1/infinity of your attention, i.e. epsilon, i.e. basically zero.
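
A minimal sketch of that counting argument, assuming a symmetric prior over the competing hypotheses (the uniform-prior framing is my own, not the commenter's):

$$P(H_i) = \frac{1}{N} \ \text{for } i = 1, \dots, N, \qquad \lim_{N \to \infty} \frac{1}{N} = 0,$$

so as the number of mutually exclusive, equally unevidenced hypotheses grows without bound, the credence (and attention) warranted by any single one of them shrinks toward zero.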


Note: The reason this is reposted now is that he used it as a leading post in a sequence on the subject. The next post in sequence is "Book review: The Secret of Our Success"

https://slatestarcodex.com/2019/06/04/book-review-the-secret...


This piece would be better without all the preemptive self-congratulations.


For how good he actually is, he's being quite humble.

And more importantly, the goal is not to make himself look good, but to relate a point as he experienced it.


It would.

But it is a repost from LiveJournal, so it's probably over 10 years old. I can understand not wanting to rewrite it.

His newer writing is not like this. On the other hand, his newer writing is not so short.


I would guess that a lot of appearances can be sculpted, by hitting a vulnerable person with priors.

I have no idea how frequently this happens, or how it resolves.


This is essentially the argument I make to friends whenever someone brings up some homeopathic remedy. Most of us are not smart enough or well educated enough to really sift through this stuff and sniff out fact from fiction. Lucky for us there is a whole field of trained professionals who (hopefully) are.

At the same time though, I think it's worth remembering that the placebo effect is real and measurable so if someone claims X, Y or Z cures their headaches maybe they're not lying :-)


> smart enough or well educated enough

A really good consideration is experience and observability. Those trump smart and well educated most of the time. It's the difference between seeing something and inferring it or simply repeating orthodoxy.

The observability test is good when dealing with putative authority. Instead of asking if they are right, you first ask: is this something they can observe? If not, their opinion means a lot less.

A good modern example is climate change. Almost no one has the training, experience, and familiarity with actual data to form a reliable opinion. Without all three, a person's opinion isn't worth much.


Happy to see a SlateStarCodex classic.



