Hacker News
People with Greater Intellectual Humility Have Superior General Knowledge (opensociet.org)
565 points by pseudolus on June 7, 2019 | 254 comments



A few years ago, the guy I was sharing an office with at work was cleaning off his desk and found a stack of resumes. He said they were all from the batch that I was hired from and asked if I wanted to know why I was picked. Very cautiously, I said yes. He told me I was the only one who, when really pushed during the technical interview, said "I don't know". Their initial strategy had been to start easy on some topic and just keep going deeper until they had a measure of what level the applicant was at. After a certain point, they were just surprised at how far people would carry on into nonsense.


The older and more confident I got, the less I felt a need to 'know everything.' I'm not perfect, but these days, when it comes to anything, whether it be technical or cultural (e.g. bands or movies), I strive not to be afraid to say "I don't know what that is" and to be excited to learn about it.

There's so much out there. I actually picked this up from reading Hacker School's user manual, "no feigning surprise"[1] and of course an xkcd comic[2].

It was a comfort to see an engineer who I highly respect (Dan Abramov) post a list of things he doesn't know[3].

[1] https://www.recurse.com/manual

[2] https://xkcd.com/1053/

[3] https://overreacted.io/things-i-dont-know-as-of-2018/


We HNers really love these kinds of submissions since we think we're so smart.

The top comment is literally someone bragging about how they once got a job with their superior applause-worthy character of saying "idk".


I like that we're using the word "we" to make our aggression look more passive. We wouldn't want to give the wrong impression, would we?


I agree with his point.

> people with greater humility have superior...

It's a contradiction in terms. It makes humility look like a tool. One could instead just say "try to be humble".


That is an extremely uncharitable read of the comment.

I don't think it comes off as bragging at all, but just an observation of what worked, and how it's interesting that people will go into levels of nonsense to avoid saying they don't know something.


"More intellectual humility was also associated with less “social vigilantism”, defined as seeing other people’s beliefs as inferior."


More like they have a right to their beliefs, inferior or not, as long as they don't try to impose them on me.


Comments like these are why I spend more time reading HN comments than the articles


It's really interesting to get a retrospective, once you get to know your co-workers/managers, on what they were thinking when hiring you. It is DEFINITELY NOT what you might expect!

I was once hired as a "career-changer" because I had worked in one of my early jobs as a science museum staff-member doing demonstrations for school groups. For some reason, the hiring manager thought that was indicative of communication ability and curiosity. I guess it was, but I never thought it to be very important nor formative at the time.


Maybe not formative but selective.

If you lasted at that job it meant you already had those attributes.


As an interviewer at one of the FAANG companies I can confirm this is one tactic we use as well. A lot of people are very confident spewing trash for as long as you'll let them.


Likewise. I open my interviews with the explicit statement that I’d like them to disclaim when they don’t know, and that furthermore, the discussion is arranged around challenging them until we reach that point. We have less than an hour together and I need to judge your technical abilities in an intrinsically imperfect medium. Help me help you - I can only work with what I’m given. If you bullshit, what I’m given isn’t good.

I don’t really get it. I can perhaps understand that some candidates might have gotten good feedback from blatantly guessing in the past, but that’s why I now explicitly tell them to disclaim guesses. If anything it looks more impressive when you honestly don’t know something but intuit the substantially correct answer (as long as it’s something that could be realistically intuited).

Yet even with my disclaimer, I've still conducted phone screens and onsite interviews where the candidate eventually started bullshitting. It's one thing to say you don't know and give a wildly incorrect answer - at least then I can try to steer the interview towards another of the candidate's strengths. It's even okay to preface your wild guess with an "I think...". But the cavalier way in which people will just spout nonsense is disturbing. Even if you've been performing well up to that point, engaging in bullshit is nearly immediate grounds for me to discount you as a candidate.


Obviously that is part of the point of the interview, though, right?

You are able to identify these over-confident people and prevent them from being a toxic influence on a team.


Well yeah, that's why I continue to ask. I guess what I was getting at is that I'm sort of shocked people will still bullshit despite my explicit declaration ahead of time.


After they said "I don't know" I'd then encourage (or nudge) them to derive a solution. It gave me a very good insight into their thought process and their ability to apply their knowledge to solve an unseen problem.

For instance, students (during campus interviews) would at some point reach a dead end while explaining a process scheduler. However, after being encouraged to work it out, about 70% ended up with some version of a timer interrupt. It was fascinating to watch them go through the process.
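The timer-interrupt idea those students converged on can be sketched as a tiny round-robin simulation: a process runs until its quantum expires (the simulated interrupt fires), then it is preempted and requeued. The process names and quantum value here are hypothetical, purely to illustrate the mechanism:

```python
from collections import deque

def round_robin(jobs, quantum):
    """Simulate preemptive round-robin scheduling.

    jobs: dict mapping process name -> remaining run time (ticks).
    quantum: ticks a process may run before the timer interrupt
    fires and forces a context switch.
    Returns the order in which processes finish.
    """
    ready = deque(jobs.items())  # simple FIFO ready queue
    finished = []
    while ready:
        name, remaining = ready.popleft()    # dispatch next process
        ran = min(quantum, remaining)        # run until done or interrupted
        remaining -= ran
        if remaining == 0:
            finished.append(name)            # process completed
        else:
            ready.append((name, remaining))  # timer interrupt: requeue
    return finished

print(round_robin({"A": 3, "B": 6, "C": 2}, quantum=2))  # → ['C', 'A', 'B']
```

C finishes first because it fits within its second quantum, while A and B keep getting preempted; a real scheduler adds priorities and I/O blocking on top of exactly this skeleton.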


It depends on what you mean by "derive a solution". If a candidate doesn't know an algorithm, they couldn't be expected to derive it on the spot.

In many cases the popular algorithms were carefully designed by Computer Scientists as part of their research. People don't pass that type of test because of the ability to derive solutions on the spot, they pass based on their skill at rote memorisation and application.

(I absolutely agree that if despite not knowing they show signs of being able to reason their way towards something sensible - robust, likely to be efficient, etc - that is a very good sign.)


Same thing when I interview people. "I don't know" is a sign of confidence to me. It is the right answer in so many situations. Unless you have a reasonable shot at deriving the answer on the spot, it's what more people should be saying, and not only in interviews.


Agree. Although a better answer is (sometimes) "I don't know, but..." followed by some indication of how they might get to the answer.


That's my usual prompt: how much do you know about the domain, and how long will it take you to learn it?


That's reminiscent of the nasty version of 20 questions, where everyone bar the subject is told the only rule is to give an answer that is consistent with previous answers.

Never been on the receiving end, but from what I've seen, you don't want to be.
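For what it's worth, that "nasty" variant can be sketched as an adversarial answerer that never commits to a secret: each answer simply preserves the larger half of the still-consistent candidate set, so it is automatically consistent with everything said before. The candidates and questions below are made up for illustration:

```python
def nasty_answerer(candidates, predicates):
    """Answer yes/no questions without ever picking a secret.

    Each answer keeps the larger portion of the still-consistent
    candidate set alive, so every answer remains consistent with
    all previous answers (the only rule of the nasty variant).
    """
    alive = set(candidates)
    answers = []
    for pred in predicates:
        yes = {c for c in alive if pred(c)}
        no = alive - yes
        if len(yes) >= len(no):   # keep the bigger consistent set
            alive, ans = yes, True
        else:
            alive, ans = no, False
        answers.append(ans)
    return answers, alive

animals = ["cat", "dog", "eagle", "shark"]
questions = [
    lambda a: a in ("cat", "dog"),  # "is it a pet?"
    lambda a: len(a) == 3,          # "is its name 3 letters?"
]
answers, alive = nasty_answerer(animals, questions)
print(answers, sorted(alive))  # → [True, True] ['cat', 'dog']
```

This is the same trick as "Evil Hangman": the subject can only win by asking questions that split the remaining candidates so finely that the answerer is finally cornered.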


"The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge" - Stephen Hawking


That’s been my go-to interview technique for years. I want to see candidates say they don’t know under pressure.

Unless it’s for a very senior role, I think attitude and personality are greater predictors of success than technical competency, so long as their technical competency is ‘good enough’ to begin with.


I'd say attitude and personality are greater predictors of success than technical competency for very senior level people too. I can't think of many people who reach senior levels with a poor attitude and personality. And those that do often hit a wall. Eventually, their attitude and personality get in the way of their brilliance and other people's willingness to work with them.

I think it doesn't matter only in cases where you were the guy who started it all or you're some crazy skilled mercenary for hire brought in only for emergencies. You can treat people like crap as much as you want if you're in control like that. Maybe it's even necessary for those unicorn situations. And yet, even Linus Torvalds has said that he needed to change.

Finally, I'd say poor attitude and personality lends itself to arrogance, which also makes technical people hit a wall because their arrogance decreases their technical curiosity. Improving one's skill really does require a certain intellectual humility.


> I'd say attitude and personality are greater predictors of success than technical competency for very senior level people too.

Yeah I’d agree. But I think somebody with a good attitude, who can get on well with others, has a greater chance of being able to quickly fill in any competency gaps they have in non-senior positions. For senior positions, the baseline is higher. I’d hire somebody with good character, but inadequate competency (within reason) for a lower IC position, but for a senior position, I’d want both.


"I don't know" is often the best answer.

I was listening to some Christian on a video criticizing atheists and saying something related to the god of the gaps fallacy.

He commented something along the lines of "you have to have an answer to these things!"

No. No you don't.

You don't just invent some fairy tale just because you don't understand something.

Often the best answer is "I don't know"


Please keep religious flamewar off this site.

https://news.ycombinator.com/newsguidelines.html


I do a lot of interviewing... I've found that "I don't know, but I'd guess x based on y" is a fantastic answer. It lets me know how they are thinking as they walk through an educated guess.


Maybe that knowledge makes them hireable after they go to Heaven.


To be fair, both Christians and atheists who are interested in debating each other tend to have a hard time understanding the limits of their knowledge. The criticism can be directed equally in both directions.

The entire sum of scientific knowledge is based on unprovable axioms, just the same as the entire sum of religious knowledge is based on unprovable axioms. Faith is a belief in something you can’t prove, and people have faith in scientific knowledge in exactly the same way that others have faith in religious knowledge.

A wise person would understand that scientific knowledge and the various forms of religious knowledge are based on completely different foundational axioms, and as a result, any attempt to debate the merits of one over the other is entirely pointless. A wiser person might understand that criticising somebody for what they choose to have faith in is both pointless and hypocritical.


Faith is a belief in something without evidence, correct?

Science is rooted in empirical evidence. It may be with imperfect models and imperfect measurement, but there is indeed fairly reasonably-measured evidence.


Faith is a belief in something you can't prove. The Münchhausen trilemma invalidates all logical proofs. As such, all forms of knowledge are based on unprovable axioms from which knowledge is derived. If you were to try to generalise some of the axioms that form the foundation of scientific knowledge, you might say that 'empirical evidence is a source of absolute truth' or that 'everything we don't understand about the physical world will either be explained by science, or is not possible to explain'. Any individual might consider those ideas and decide that they are worthy of having faith in, or they might not. But any unprovable axiom has no more or less merit than any other unprovable axiom. To say otherwise is to be ignorant of the limits of your own knowledge. You might say that scientific knowledge is more complex or rigorous than religious knowledge, but that doesn't speak to its merit either. Flat earthers rigorously promote an incredibly complex system of beliefs; it is naturally more complex than science, because it needs to add additional ideas to explain scientific ideas that conflict with their own. That doesn't speak to its merits at all.


What is your definition of "prove"?


Just the ordinary dictionary definition. Using evidence and arguments to establish a fact or truth.

Of course the arguments and evidence that you use to prove your truth must also be proven themselves, and so on and so forth. No matter what it is you're trying to prove, there are only three possible outcomes: circular reasoning, infinite regression, or stopping at an arbitrary point (usually described as an axiom). This is known as the Münchhausen trilemma.

Although this question has led me to see that my previous argument is incomplete. It is possible to believe a truth without faith; that is, through ignorance: a failure to scrutinize your belief sufficiently to understand that it is based on an unprovable axiom, and is thus an act of faith. People who debate the merits of science vs religion tend to be ignorant of this, equally on both sides.

Nothing about what I’m arguing is even remotely controversial. It would be a part of any entry level course on logic at any university.


The Faithful would also call just about anything “evidence”.


Faith is not belief in something without evidence, it’s belief in something without proof. The mere existence of evidence does not constitute proof.


A wise person might also be aware that words can have multiple meanings, that "faith" is one of those words, and that equivocation fallacies don't make for good reasoning.


The general meaning of the word is belief in any idea that you can’t prove. Belief in scientific knowledge requires no more or no less faith than belief in any religious system.


Yes, I am aware that you can use equivocation fallacies to explain that we can't know anything and all claims are equally likely to be true. I was talking about what a wise person would do, though.


Please don't cross into rudeness in HN threads. Also, please do religious flamewar somewhere else, not here.

https://news.ycombinator.com/newsguidelines.html


Would you mind explaining what about my comment was either rude or a flame, given the context in which rationality was named as a sign of being unwise?


Any comment of the form, "Yes, I am aware that you can do stupid thing X, but I was talking about intelligent thing Y" is rude and a flame.


... when "stupid thing X" is making an argument that doesn't address the problem, and "you can do stupid thing X" is actually an explanation of what the problem with that argument is, and it's also not calling anyone names, but just calling out fallacious reasoning as what it is: Fallacious reasoning? Fallacious reasoning might well be correlated with stupidity, but that doesn't make pointing out fallacious reasoning an insult in itself.

Equally, if the topic of the discussion is what makes a certain decision wise or not, as it happened to be the case here, I don't see how pointing out that a suggested methodology does not qualify due to fallacious reasoning in that methodology is either rude or a flame. That is, unless you consider the start of that discussion (https://news.ycombinator.com/item?id=20129547) a rude flame, which you possibly could, even though I don't think it was intended as such.


I'm afraid I'm not really following this. In a way it doesn't matter, though, because even if I missed some subtlety in perceiving your comment as rude, I can guarantee you that most readers would miss that subtlety as well. Since there are plenty of ways to make your substantive points, why not just choose ones that don't straddle that line?


[flagged]


Please don't cross into rudeness in HN threads. Also, please do religious flamewar somewhere else, not here.

https://news.ycombinator.com/newsguidelines.html


I’ll take your feedback on rudeness, but I never made any posts promoting or deriding any particular religion, or religion in general.


> I’ve only asserted that it is impossible to prove any absolute truth.

No, you have also asserted that there is no way to distinguish different levels of justification for "non-absolute truths", or else your whole argument makes no sense. Which is also exactly what an equivocation fallacy is: Claiming that two things are the same because you can decide to ignore the differences.


I made no such assertion. I made no comments that attempted to ignore the differences that exist between different belief systems, nor any comments that said you couldn't qualitatively differentiate them. Simply that you can't use truth as the basis of that differentiation. All belief systems, whether they're a belief in science as a source of truth or a religion as a source of truth, are based on assertions that cannot be proven, and are therefore all equally unprovable. This is a perfectly rational equivalence, and doesn't present a fallacy on any level. If you were more intellectually humble this might be easier for you to accept, but as it stands, you are no different from anybody else who chooses to reject the limits of their knowledge.


> All belief systems, whether they’re a belief in science as a source of truth, or a religion as a source of truth, are based on assertions that cannot be proven, and are therefor all equally unprovable.

Or in other words: There is no such thing as absolute (mathematical, provable) truth about reality, and if you ignore that there is such a thing as evidence for claims about reality, then all claims are equally justified, as long as you make assumptions that are consistent with your claims.

> If you were more intellectually humble this might be easier for you to accept, but as it stands, you are no different from anybody else who chooses to reject the limits of their knowledge.

Well, yeah, it is tragic how religion poisons minds to the point where everything about the world is upside down.

The funny thing is, you yourself don't actually believe that that supposed limit to our knowledge is there. You yourself constantly make decisions preferring empirical evidence over other "belief systems". When the stove is hot, you don't put your hand on it "because it can't be proven that I will hurt myself (true!), and if I assume that heat doesn't hurt you, that is just as proven as the assumption that empirical evidence tells me something about reality (true!), so I am just as justified in my belief system that putting your hand on a hot stove won't hurt you as people with a scientific 'belief system'! Believing the science requires just as much faith as believing that a hot stove won't hurt you!" You are not actually stupid enough to believe that. In your daily life, for the most part, you constantly act consistently with the belief that empirical/scientific methodology gives you reliable information about reality, and inconsistently with the belief that any other assumptions, instead of whatever possibly underlies scientific "belief systems", are just as justified/just as much "faith based". Nor would you accept this from anyone who disagrees with you about something. If someone made the assumption that murdering people made their loved ones happy, and started murdering people on that basis, you would not say "oh, well, it's their assumption, and it can not be proven, but science can't either, so it probably makes people happy". You would immediately call that out as completely moronic reasoning that way overstretches the implications of an interesting conundrum of epistemology, treating it as more real than the obvious immediate experience that killing people does not make their loved ones happy.

All of this is not actually a set of principles that you believe in. It's a set of excuses you give so as to avoid seriously examining the epistemic foundations of this one particular belief that you happen to have, and that you apply only very selectively to that one claim.


This is really just an incoherent anti-religious rant, with a whole lot of completely unfounded assumptions you’ve made about me personally thrown in.

I have a tremendous amount of faith in science, I have a degree in physics. I simply happen to understand the contextual limitations of the knowledge that I’ve derived from it, something you seem far too arrogant to do yourself.


> I simply happen to understand the contextual limitations of the knowledge that I’ve derived from it

Such as?


I don't consider this a fair description of the two realms of inquiry.


Of course. Choosing to value a set of beliefs that make the most sense to you is how we give meaning to our lives. We all have opinions about what makes sense and what doesn’t. The only thing we can’t do is prove anything to be an absolute truth.


A questionnaire from the article:

1) I am willing to admit if I don't know something.

2) I like to compliment others on their intellectual strengths.

3) I try to reflect on my weaknesses in order to develop my intelligence.

4) I actively seek feedback on my ideas, even if it is critical.

5) I acknowledge when someone knows more than me about a certain subject.

6) If someone doesn't understand my idea, it's probably because they aren't smart enough to get it. (reverse)

7) I sometimes marvel at the intellectual abilities of other people.

8) I feel uncomfortable when someone points out one of my intellectual shortcomings. (reverse)

9) I don't like it when someone points out an intellectual mistake that I made. (reverse)

Critique: I can see some confounding problems right away. When I was younger I was pretty painfully shy, so (4) I would not seek feedback, but not out of a lack of humility. Also (3) I don't get, because while I am extremely motivated by intellectual curiosity, I have maybe zero motivation to "develop my intelligence" like it was some kind of fitness thing. Also maybe I'm a weirdo, but I don't think so much in terms of people being "smart"; I think it is something you do. That is, it's hard work.

And with (8) and (9), it takes training to not feel uncomfortable or to not dislike it when someone points out your mistakes or shortcomings. I don't think that is really related to humility. In fact if you are super critical of yourself and judge yourself poorly compared to other people, you still hate it if anybody else points it out.
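For anyone curious how a scale like this is typically totalled: the items marked "(reverse)" get their ratings flipped before summing, so agreement with them counts against the score. A minimal sketch, assuming a 1-5 Likert range (the article doesn't specify the scale, so the range and function name here are assumptions):

```python
def humility_score(responses, reverse_items, scale_max=5):
    """Total a Likert questionnaire with reverse-coded items.

    responses: dict item number -> rating on 1..scale_max.
    reverse_items: items where agreement signals LESS humility,
    so their ratings are flipped before summing.
    """
    total = 0
    for item, rating in responses.items():
        if item in reverse_items:
            rating = scale_max + 1 - rating  # flip: 1<->5, 2<->4, ...
        total += rating
    return total

# Example: strongly agree (5) with every one of the 9 items above.
responses = {i: 5 for i in range(1, 10)}
print(humility_score(responses, reverse_items={6, 8, 9}))  # → 33
```

Note how blanket agreement does not maximize the score: agreeing with items 6, 8 and 9 pulls the total down, which is exactly what the reverse coding is for.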


I'm not sure your critiques invalidate the questions. This questionnaire isn't a value judgment on who you are as a person, it is simply exploring your behavior. Having good reasons for not giving the "right" answers doesn't mean the questions aren't poking at valid points.


I think you are right that they are getting to valid points, but I am saying they don't necessarily relate to a theory of intellectual humility.


Maybe that's why there are 9 questions, not one or two, so that together they show a more exact picture.


> 6) If someone doesn't understand my idea, it's probably because they aren't smart enough to get it. (reverse)

One of my high school teachers said something like: if someone doesn't understand your explanation, then you don't know it as well as you think you do.


It's also quite possible they lack the domain knowledge to receive a succinct explanation of the problem and evaluate if a solution actually solves it.


That is possible, but it is unwise to bet that way.

Why?

First, because far more things can be explained to an intelligent lay person than most of us realize, and figuring out how to do so improves our own understanding. Therefore it is worthwhile to make the attempt.

Second, because it is far too easy to fool ourselves into thinking that we're making a succinct explanation that requires domain knowledge when actually we're spouting just enough jargon to remind those who agree with us of shared and possibly unexamined dogma. There is a real cost to mental habits that make such mental mistakes easier to make and sustain.

So yes. The other person may be too ignorant/stupid/whatever to understand you. But the alternative explanations should be disproven before you conclude that.


It's a classroom setting so the discussions are between peers with similar knowledge.


It is even more true in the real world when the discussions are between people with different knowledge. Do people recognize someone from another background as a peer or not?


If all parties are interested in the discussion they can ask questions to fill in gaps in the knowledge. Doesn't always happen of course.


Not entirely correct.

If someone doesn't understand your explanation, it might be that you don't understand the person well enough, or that you don't know how to explain it to them in plain language.


I struggle with this when everyone outside of tech asks me what I do.

My answer always changes since I haven't narrowed down a way to describe "cloud engineering" to people.

I've heard this saying before and every time I think, "maybe I don't know what I do".


I always respond in layers. The first layer is to say I work in computers. If they want to know more, they'll ask. If there is a follow-up question that doesn't make it clear they're tech savvy, I'll explain one level deeper. I work on the website for x company, or I help secure your credit card info at such and such bank. It doesn't matter if you're front end, back end, dba, network, sysadmin, help desk, etc. Just something that a regular human might recognize as being a thing. Most people don't know what those job titles mean, but they know people must do something to make websites work and secure credit cards. And so on and so forth until their curiosity is satisfied.


The key thing is knowing that their level of curiosity may not be what you prefer. I've come to terms that people often don't actually care about understanding what I do and would rather just make assumptions about it. If you can't accept that and let it be, you can become a very despondent person. :) And then when you try to correct their misunderstanding, you just make them unhappy too. Everyone loses.


Yep, if they're not actually interested, which is usually the case, you're either going to bore them or insult them because it will come off as bragging.


Richard Feynman said, "If you can't explain something to a first year student, then you haven't really understood."


Note that being great in every one of these aspects can lead to suffering, because most people aren't, and you'll be talked down to regularly, to the point of breaking. And it's utterly difficult to keep respecting people when you try to be balanced, team-oriented, and generous while others only know how to assert their POV and limited experience.


I agree with you. My honest answers to each of questions (save 8 and 9) are what the researchers would probably consider the "correct" response. Personally I don't like it when people point out my shortcomings or errors, but I'd argue those questions are under-specified. I may not like it, but I'll welcome it and earnestly assume they have some insight I do not.

In any case I think it makes me an unenjoyable person to be around. I more or less don't engage in opinionated discussion with people unless I'm intimately familiar with the topic at hand. Then it's no longer discussing opinions but trading facts. When I do engage, it's usually to ask questions of the other person's opinion without really challenging them. While I might agree with their opinions, I'm reluctant to refute them unless I can systematically prove them wrong. That's usually not possible, because they often know more than I do about any given topic. When I am asked questions directly I hedge my answers extensively.

When I was younger I used to take pride in this, but now I find it isolating. It's difficult to relate to people like this.


I feel similarly and agree about the responses. I typically avoid political arguments or heavily opinionated conversations. Maybe it's because I'm still young (I'll be 30 in a couple years) but I don't think this makes me an unenjoyable person to be around.

I like having _good_ conversations, and can still have them about controversial or opinionated subjects. If I don't know enough about a topic to offer my opinion I like to ask questions that will offer some insight as to why the person feels so strongly. Usually (not always) when someone feels very strongly about something, there is some kernel of truth somewhere that will at least be interesting. It's fun to at least figure out _why_ people think they way they do.

I used to have a bad habit of being a devil's advocate. I'm sure it was annoying when I was even younger, but I've found that faking (exploring) an opinion can help give the conversation some depth. Nowadays I do that less, but can get away with it if I preface it nicely enough.


Like most people I also dislike being wrong. I’m also game for disagreements, which means I’m wrong a lot, easily the most on my team at work. I’d hesitate to say that makes me more “enjoyable” to my teammates.

More likely people find me difficult to take seriously because I’ll passionately argue for something I don’t necessarily have the best evidence for and then immediately give up and say I’m wrong when someone gives me the evidence I’m looking for.


Fun counter intuitive social trick :)

There's a thing called... hmm... positive negotiation? Where you interact with someone by always aiming at finding a middle ground/compromise, rather than arguing for the sake of it.

Your point reminds me of something that I'm seeing (IIUC). A lot of the time people will assert more than they know (I do that sometimes too[0]) and the discussion will stop. By insisting even at the risk at being wrong, you force everybody to show their hand and sometimes they'll realize that they may be wrong or off point and that they need to reevaluate the situation. .. We're tribal, even in scientific fields.


> By insisting even at the risk at being wrong, you force everybody to show their hand and sometimes they'll realize that they may be wrong or off point and that they need to reevaluate the situation

If this was more common, I think the world would start to become a much better place surprisingly quickly, at a cost of people suffering a little minor intellectual humiliation until they started to be more disciplined about their beliefs.


This resonates with me to a surprising extent. My perceived threshold for expressing an opinion is very high, and I also find that I don't really have any particularly _strong_ opinions to begin with.

The "hedge my answers extensively" bit is spot-on, as is the isolating nature of this "trait", unfortunately.


I agree with all your points, especially (8) and (9). I think people who answer "no" to these questions because they lack self-awareness or because they are giving "right" answers instead of honest ones will far outnumber the people who have managed to completely eradicate their insecurity.


> Also (3) I don't get, because while I am extremely motivated by intellectual curiosity, I have maybe zero motivation to "develop my intelligence" like it was some kind of fitness thing.

Given the amount of cognitive bias each of us has, I'd say an important part of intellectual curiosity is making effort to learn to think better. I mean, a flat-Earther 9/11-truther astrologer could still claim to be primarily motivated by "intellectual curiosity".


The flat-Earther 9/11-truther astrologer probably really is primarily motivated by "intellectual curiosity"; he probably just has poor priors. Which is to say, he's seeing the world as he sees it; it's just that something in his lower intellectual foundation is rotten.

The interesting question is whether that specific bit of rotten reasoning is still accessible to alteration after one has passed through that specific phase of neuroplasticity.


This test is aimed at the median of the population. However it could be altered slightly.

Answer yes or no AND write a short explanation of why.

Then the judge of the answer sheet should decide if points for or against are awarded for each question.

As an example, I'm just really bad at complimenting others, both because my own overall standards are so high and because that was never a skill I learned, so I'm actually very shy about doing so. Maybe I should have been complimented more, and better trained as a child to compliment others.

I explained 6 in a different reply under this thread. However a different way of getting the data (which would be better and yield a clearer view) could be to ask: "If someone doesn't understand my idea what are the possible reasons?"

7) This one is just painful. It really depends on who you're exposed to on a daily basis and how you relate to them. I _have_ marveled at how smart, driven, or successful some others are... mostly the successful and driven parts though.

8/9) Emotional reactions are different from how you try to handle emotional reactions.


A lot of what has helped me evolve is listening to HBR Ideacast podcasts.

There are a lot of discussions which basically come down to how humans interact and communicate with one another, and what works and what doesn’t.

I noticed I compliment others quite a bit more since listening.


I would add some important evaluations. These are some less obvious, deeper aspects of intellectual humility:

10) I accept that someone with less experience may have better ideas than I have, even in my own field of expertise.

11) Even if I am not certain of myself, I am willing to suggest and defend my ideas because they deserve equal treatment.

12) I am willing to forgo new ideas when they hurt momentum too much.

13) I am willing to forgo momentum when the benefits of new ideas outweigh the costs.

14) When I feel it is right to assert my position, I am able and willing to back down if I receive feedback that changes my assumptions.

15) I know when to humbly defer decisions to the team even though I believe the team's decision is less than ideal. People are more important than problems to be solved.


>>10) I accept that someone with less experience may have better ideas than I have, even in my own field of expertise.

There are also people who assume their ideas in fields they don’t have experience in are good, precisely because of that lack of experience. They think their “outsider’s perspective” significantly increases their likelihood of being correct because they “think outside the box” or some such. It’s the opposite of humility, and supremely frustrating to deal with!


Care to explain the difference between 12 and 13?


12 is about recognizing when you're changing too quickly, 13 is about recognizing when you're changing too slowly. A lack of intellectual humility can impede you either way.


> Also (3) I don't get, because while I am extremely motivated by intellectual curiosity, I have maybe zero motivation to "develop my intelligence" like it was some kind of fitness thing.

A more apt example: suppose you are writing code that involves some level of networking. You think, "I don't remember/know much of the OSI model, so I should go read up on it to better understand the code I am writing", instead of just trying different things and adapting examples without understanding what they actually do.


That sounds as if trying to figure it out on your own were antithetical to curiosity. That doesn't sound right to me.


I agree about not thinking about people being "smart" or not. I think everyone's interest just varies and sometimes their interests intersect with what people think of as "smart things".

One of my friends called me a genius the other day, but I'm just really nerdy about something that makes money. He's really into comics and is a walking encyclopedia of characters, timelines, stories and arcs, but wouldn't consider that "smart stuff", even though it is to me.


> Also (3) I don't get, because while I am extremely motivated by intellectual curiosity, I have maybe zero motivation to "develop my intelligence" like it was some kind of fitness thing.

I don't think of this in terms of intellectual fitness, but when I realise I made a wrong step in my modelling of the world I generally take a moment to reflect on where I went wrong and what I could do differently in the future to avoid similar pitfalls.


My problem lies in being open minded to hearing out someone who, after enough conversing, has an opinion that you’ve delved into deeply before and have seen the glaring shortcomings of, and when that individual is not open to those factual shortcomings.

That is, I find it incredibly difficult to stay open when dealing with stubborn, close-minded people; in the best case I just move on.


Sounds to me like you enter conversations with a pre-made idea of what you want the conclusion to be.

Questions to ask yourself if you feel that way regularly:

Have you considered that various shortcomings might hold different weights to people depending on their background, experience in other adjacent domains, etc? That what is factual for you might not be so to others?

How do you know that your analysis of the shortcomings is correct/pertinent? That the person you're talking to hasn't delved into it as deeply as you have and just reached a different conclusion? Are you the close-minded one for having decided early in the conversation that, because you've thought about it in a certain way in the past, any interlocutor who thinks of it differently is necessarily wrong and stubborn?


Note how I said ‘after enough conversing‘.

I do give the benefit of the doubt for as long as I can. I admit to understanding positions that are premised on different values. I have a hard time when once I understand the values, inaccurate steps are taken to reach a false conclusion.

What I’m talking about is more akin to talking with someone making a mistake in a proof, being pointed out the mistake, but still digging their feet in to not admit the mistake.

I’m fine if assumptions aren’t shared, but not when conclusions don’t follow logically from assumptions.

Being open minded isn’t akin to listening to everything and anything- there are limits.

So no, it is not that ‘I enter conversations with a pre-made idea of what [I] want the conclusion to be.’


Actually, you likely do. You appear to expect rational self-consistency.

Not all humans present that. They have reasons, some of which may just be their nature.


> Actually, you likely do.

Well sure, everyone does, to some degree. But I thought dyarosla's clarification was quite sound, particularly this part: "I admit to understanding positions that are premised on different values."

In my experience, a lot of people are completely ignorant of this important idea, that values (or axioms) are a crucially important part of disagreements, that someone may be approaching the same general topic from a very different perspective than you. Not only do some people not know/appreciate this, I've encountered several people who completely reject this idea if you point it out to them.


I am not judging on "sound", just pointing out that self-consistency is the core expectation.

That statement on "values", coupled with the others, supports strong self-consistency.

How people evaluate things can, does, and will vary from a highly rational POV.


Is logical consistency not a core tenet of a rational argument?

I’ll concede that some people don’t value logical consistency - but that doesn’t make them more rational in their arguments.

If there is subjectivity in a ‘logical inference’, I’ll err on the side of being open to it.


It absolutely is.

However, people are often not entirely rational creatures.

Advocacy, for a very effective example, is a combination of reason, emotion and character.

How people feel matters. Who they are interacting with and or referencing matters.

Roll all that up, and we are likely to encounter people who are not self consistent.

That is OK; it's human. I just noted that it was a predetermination, that's all.

Secondly, there is no requirement they be more rational in their arguments. They may not even see something as one, depending on what it is.

They may, for example, seek better mutual understanding.


I readily affirm that feelings are most important, but it's pretty common to refuse to abandon one's feelings or one's facts and logic in the name of consistency.

It's not that someone can expect you to feel differently because they've presented a logical argument. That's not likely or expected.


Does any of that matter?

The OP said they did not predetermine, and I just pointed out that they do. (And I do not care)

No worries, just information.

Understanding others helps considerably when having conversations IMHO. My comments here speak to that.

Just know that others may or may not make the value judgements presented in this thread, that's all.

It is far more likely than you think. Consider a politically charged issue, some matter where religion is involved...

This all just is not the set piece implied.


> and when that individual is not open to those factual shortcomings.

That's when you should play the part of being stubborn and close-minded yourself, if this is what it takes to ensure that your points are heard. Intellectual humility is the best case, but sometimes it is best to cut one's losses.


>> and when that individual is not open to those factual shortcomings.

> That's when you should play the part of being stubborn and close-minded yourself, if this is what it takes to ensure that your points are heard.

Do you honestly believe that this is an effective technique in spreading your ideas?


While I cannot 100% agree with 0815test's statements (mainly because I don't see discussion as a win/lose battle, counting points, cutting losses...), there is in fact a certain motivation to take the stubborn position. When you look at real-world examples of people who successfully spread their ideas, unfortunately quite often there will be a certain degree of stubbornness. It might be called different names - determination, believing in one's goals - but the effect is the same: at a certain point they will refuse to accept an alternative idea, even without any hard data. And by doing that, they might earn a social position that allows them to spread their ideas among many. Sometimes they even turn out to be right, sometimes not, but it was the aura of close-minded belief in themselves that brought them followers. Honestly, I'm a little disappointed that it works like this, because I would really prefer if deliberate, polite, factual discussion were always the norm.


I didn’t downvote you, but I also don’t agree with the methodology- it doesn’t seem conducive to progress.


I always think it is pretty neat when somebody responds, “I’ve never thought about it like that.”


> Also (3) I don't get, because while I am extremely motivated by intellectual curiosity, I have maybe zero motivation to "develop my intelligence" like it was some kind of fitness thing.

What's the difference between curiosity and a drive to develop your (crystallized) intelligence?


I mean I'm not driven to be "smarter", whatever that is. I am immersed in finding out more about everything that interests me.

Like I don't read to be well read. I don't listen to news to be informed. I read and listen out of my interests.


And during the process of learning what interests you, do you ever find something tangentially related that you don't fully understand and feel you should learn more about?

I don't think anyone literally thinks, "I need to know this to improve my intelligence, as it is a weakness." It's usually more abstract.


Most of us are smart enough. Really, most things come down to doing the work, and doing it for reasons that make sense.


I feel like for 8 and 9, the right answer is actually that you are uncomfortable and don't like it.

The only people who would deny those things are phonies who aren't in touch with themselves. Accept it, acknowledge it, account for the bias it brings. Then maybe one could work on it.


You are assuming your reaction is the only legitimate reaction.

Some people like being uncomfortable because it motivates them.

Being comfortable for some people can lead to complacency.


> Also maybe I'm a weirdo but I don't think so much in terms of people being "smart", I think it is something you do.

Weirdo in a good way. Thanks for this one - sometimes I forget that thoughts/actions are smart, not people.


> When I was younger I was pretty painfully shy, so (4) I would not seek feedback, but not out of a lack of humility.

To account for such things, the questionnaire has 9 questions. The maximum score you could possibly get is 9 × 7 = 63, but probably not many people would score that high. Even those who are super humble.

> Also (3) I don't get, because while I am extremely motivated by intellectual curiosity, I have maybe zero motivation to "develop my intelligence" like it was some kind of fitness thing.

So you do not reflect on your weaknesses in order to develop your intelligence? Or do you do it sometimes, when it is obvious to you that your task demands more intelligence from you? Maybe a 3 would fit as an answer?

> I don't think that is really related to humility.

Good questionnaires are not like random internet ones. There are special methods to ensure that a questionnaire measures what we want to measure, not something else. For example, you can measure intellectual humility by interviewing a participant, and then ask the participant to fill in a questionnaire with 50 questions (or maybe 150, as many as your creativity can devise) that you think might be good. Get 50 participants, or maybe 100. More is better, but interviewing 100 people is a long and difficult task, and if you used several PhD students as interviewers, then you also need to control for the interviewer, because different people can get different results even when asking the same questions. So you would need even more participants to reach statistical significance. The number of questions itself also increases the number of data points needed.

After you have gathered the data, you would run it through R to find the questions that fit best, to reduce the questionnaire to a bearable size. You should then probably test your new questionnaire using the same tactic you used to devise it. You might even need to do this if at the first stage you got an insufficient number of participants. But now you would need no factor analysis or the like; a simple correlation would be enough, or maybe chi-square.

I didn't read the article itself, but it is a published paper in a reviewed journal, so I bet it was done well, not just the authors picking 9 questions at random. Therefore I bet that if you don't think it is really related to humility, then you should probably change your mind and think the other way around. If it is somehow important to you, I'd suggest finding the full text of the paper describing the methods used to devise the questionnaire and reading it through. Maybe you are right after all. There might be statistical mistakes (mistakenly applied statistical methods, for example) or some issues with the data gathering.
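To make the item-selection step above concrete, here is a minimal sketch in Python rather than R. Everything in it is hypothetical: the responses and the interview-based criterion scores are randomly generated purely for illustration, and a real study would use much more careful psychometrics than a raw correlation.

```python
import numpy as np

# Hypothetical data, randomly generated for illustration only:
# 100 participants answer 50 candidate questions on a 1-7 scale,
# and each participant also has an interview-based humility score
# (the criterion the questionnaire is meant to approximate).
rng = np.random.default_rng(42)
responses = rng.integers(1, 8, size=(100, 50))
interview_scores = rng.normal(0.0, 1.0, size=100)

# Correlate each candidate question with the interview scores...
corrs = np.array([
    np.corrcoef(responses[:, i], interview_scores)[0, 1]
    for i in range(responses.shape[1])
])

# ...and keep the 9 items that track the criterion most strongly,
# shrinking the questionnaire to a bearable size.
best_items = np.argsort(-np.abs(corrs))[:9]
print(sorted(best_items.tolist()))
```

On random data the "best" items are meaningless, of course; the point is only the shape of the procedure: gather responses plus an external criterion, correlate, and prune.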


> The fact is, people who base their self-worth on being right about everything prevent themselves from learning from their mistakes. They lack the ability to take on new perspectives and empathize with others. They close themselves off to new and important information.

From The Subtle Art of Not Giving a Fuck.


> They lack the ability to take on new perspectives and empathize with others.

It often feels like this, "everyone is dead inside but me", and when we make these sweeping generalizations we'll find plenty of instances to justify them. But we really can't peer in each others' heads, and over time I realized that when I acted on these kinds of assumptions, I behaved like a fucking tool.


https://xkcd.com/610/

It can be a bit uncomfortable to realize that what we think of as "us", these thoughts and feelings we internalize and think of as unique, aren't so unique. I can understand some resistance to recognizing the depths and thoughts of others... or to being like everyone else.


Came across that in audiobook form not long after a really toxic abusive relationship with someone who suffers from a textbook case of narcissistic personality disorder.

The chapter about the author's wannabe hustler friend described them like it was a biography.


I would argue that is a bit poorly phrased. One can base their self-worth on being right and simultaneously be humble enough to understand one can never truly know it all and thus be open to criticism and improvement.

Or is there something deeper from the book that better explains the point?


The book is full of gross generalizations and anecdotes. It's a self help/pop psychology book meant to make you feel good, not a proper scientific analysis. That said, it's still a good book that I enjoyed reading and I have a list of quotes saved in a text file that have helped me understand certain family members.


The Subtle Art of Not Giving a Fuck

Also make sure you're not overly invested in having the attitude of NGAF.


> The fact is, people who base their self-worth on being right about everything prevent themselves from learning from their mistakes.

I've never heard of this happening. Can you give examples from this site?


It's difficult to think of examples "from this site" because I rarely remember the posters' names.

I would point to cryptocurrencies as a rich source of extremely wrong takes: how bitcoin would replace paypal and credit cards, how government conspiracies would try to fight it, etc. None of those predictions came to pass, to put it mildly. What I can't offer is specific instances of people pivoting from "it's a currency! It even has 'coin' in its name!" to "it's not a currency! It's a store of value". But I'm sure there must be some.

I also remember when here (and, possibly, on slashdot before HN) one overdone "meme" was criticising CSI-style shows for "enhance!" making low-res images of license plates readable. "You cannot recover information that isn't there!" the comment would go, and it was the easiest upvote to get.

Today, there are plenty of AI demos that can, indeed, reconstruct license plates from low-res. Turns out the information wasn't actually lost. Unfortunately, I'm denied the gratification of all those people writing apologies, and I can't prove that they are the ones posting "It's not intelligence, it's just statistics!" today.

Nuclear power might be another example, on a pretty good multi-decade run of various other reactor technologies (pebble bed, fusion, etc.) always being on the cusp of a breakthrough. This example is especially funny, because the actual scientists working on energy, and even the supposedly stupid politicians, have now created alternatives that are safe, clean, and close to competing with even coal in economic terms, let alone the far more expensive nuclear tech. Yet the wider tech community disregards this economic argument, and insists on fighting the public on safety. They just can't let go, because they feel they were wronged on that issue in the 80s and 90s. Which is at least plausible, but it just doesn't really matter any more. There's a strong undercurrent of cultural grievance here, as if people were forever living in the science fiction of their youth.


person you replied to here:

well yes, I was being sarcastic but I'm glad it sparked this high value comment you wrote. It is easy to see how the 'consensus narrative' changes, as the accounts are censored and their comments are in fact promoted by the collective community. But it is impossible to see who was part of that collective community.


are *not censored


I've worked with my previous boss's son. This guy felt like he was the best at everything, would get annoyed if people gave him critical feedback on anything he did, and would insist he was right. Therefore, even when more experienced people told him things he disagreed with, he wouldn't accept their opinion and kept doing it his way.

This caused a lot of people to not want to work with him and, in the end, was a major part of why I ended up quitting.


People like that are impossible to work with. It's a flavour of narcissism where the person in question is never wrong, cannot be wrong, so any mistakes are obviously someone else's fault.


I don't think this could ever be considered factual, but it makes sense to me.

In times where I'm invested in being right, it's harder to accept learning that I had been wrong, and I'm certainly not seeking out more information with which to challenge my understanding.


I saw many junior devs who thought they knew so much that they almost stopped being curious and seeking out the truth and knowledge out there, and I saw many devs who thought they were experts and just stopped learning anything.


Confirmation bias is basically this.


A good friend once told me, after listening to me bitch about someone who loved to comment on my work, "You can learn something from everyone".

I spent some time trying to dismiss that before considering that, if he was right, I'd already missed out on a lot. That certainly humbled me, because when I started looking for it I found he was right.

The flip side to this is that sharing what you know, when it contradicts what others are envisioning, can result in those same others responding with animosity.


Years ago I saw someone doing a community outreach lecture tour, for Next Generation Science Standards. On behalf of NSF I think. She was getting hostile questions, many of the too-broken-to-even-be-wrong variety. Ones where normally I'd just shake my head, perhaps think "nutter", and hope the speaker was able to move on without getting bogged down.

She addressed the questions with a grace, insight, and empathy that were awesome to watch. I'd not have thought it possible. I wish now I knew who she was, and had video. If anyone knows of this skill set being taught, I'd love a pointer.

So I no longer think of the quality of a question, and of a questioner, as being a worthy bound on the quality of an answer.


I learned this lesson from the videos of the Harvard course on justice. The lecturer got explanations that I thought at the time were laughable, but somehow in his response he rephrased them to be reasonable and a great platform to deepen the conversation. It requires a great deal of empathy and intelligence to do that and I try my darndest to read as much sense as possible from any question directed my way. Here's an example:

https://youtu.be/kBdfcR-8hEY?t=2446


That lecturer was truly amazing. I thought the trolley car thought experiment wasn't worth much more than a few minutes' thought, but he not only proved otherwise, he was patient enough to ask questions and wait for the class to stumble slowly through the arguments.


I try to give good answers to misguided questions, but it can be really hard to do well. You have to guess what wrong assumption they're proceeding from in real time, and then respond with something that addresses that while still being interesting to the rest of the audience. Some people are great at it, through some combination of genius and practice.


A perspective I try to keep in mind ever since it occurred to me years ago, is that every utterance is a true statement, given the right context. No matter how mistaken or deliberately deceitful, it adds to your knowledge about the world if you figure out how it fits in. Conversely, every lie is an instance of not telling "the whole truth". It's common sense for most people that there is a clear and moral difference between a straight out lie and leaving out some context, but I think there is a smooth gradient and no bright line between them.


“Every man I meet is my better in some way, and in that I learn from him.”

Ralph Waldo Emerson’s secret of the true scholar.


I've not seen that before but it is perfectly put.

That is exactly the approach I've learned to take. I honestly started out trying to disprove the notion because I was full of myself. But it really didn't take long to prove it and then doing that soon became a goal.

Canoeing is a good example. My first shot was with my wife on the Buffalo River in Arkansas. I was working my ass off and about 2 hours into our float the guy who shuttled us to the dropoff came gliding by with his wife pretty much effortlessly while I floundered and ricocheted from bank to bank.

A while later I saw them sitting on the bank chilling out, so I paddled over and flat out asked him, "How do you make this look so easy?"

He told me everything I needed to know in just a few minutes and the rest of that float was a joy.


I find myself in a similar situation - a friend's perspective is that you can learn from anything. Mine is that you should triage endlessly after a certain point, I guess that makes me dismissive.

Of course, if I told him "So, you can browse facebook all day and be better off?", he would probably say "You know what I mean."

Real "wisdom" is knowing when to be dismissive and when not to be, either extreme is bad, but being able to discern is tacit knowledge that no one can really say anything useful on without being super specific.


To me, to "learn from anything" requires a certain diversity in your sequence of "anythings". Every now and then you need to try something completely different, and/or explore several dimensions of life simultaneously.


It's likely the inverse: the more knowledge, the more intellectual humility, which is just rephrasing Aristotle, “The more you know, the more you know you don't know.”


Not necessarily. At one point, before the rise of Google, most of my answers to various general knowledge questions would be more or less just a somewhat selective average of things I'd heard and read from sources. I have an educated family and background so these answers were probably "about average" for the educated but still entirely wrong a significant part of the time.

At a certain point in my development, I got in the habit of asking myself "how do I know that?" and Googling if I didn't know. Which is to say, I think the more intellectually humble someone is, the more likely they are to actually verify their ideas through research.

The thing about all this is that it is happening in the era of Google, where, if one has some humility, nothing keeps you from learning more (though a basic background is also needed to filter out idiocy). Pre-Internet, the situation might have been different.


Sounds like you're proving Aristotle correct


I wouldn't be positive about that. Someone with more intellectual humility may be more likely to seek out more information and to continue learning than someone who doesn't.


I wouldn't be positive about that either. It could be that both are true, and create a virtuous cycle.

Or it could be that neither is true, and it's really that both characteristics are engendered by having a curious temperament.


FWIW, I'm not sure that a curious temperament is materially different from intellectual humility.


I think curiosity implies intellectual humility but the converse is not necessarily true.


I think intellectual humility without curiosity is a form (or symptom) of depression.


Why do you think so? I've met plenty of people who lead a simple life and are fine with knowing they don't know much, especially in more traditional communities, where superfluous knowledge isn't valued.


Sure, but I just want to note I never made the claim it was the case - I was just offering one possible alternative to what OP claimed.


Every door of knowledge I "crack" yields twenty more locked doors.


It's a finding more pithily stated by the poet W.B. Yeats:

"The best lack all conviction, while the worst are full of passionate intensity" - The Second Coming (poem).


I also like this one:

"The trouble with the world is that the stupid are cocksure and the intelligent are full of doubt." - Bertrand Russell


Reminds me of a Robert Hughes quip: "The greater the artist, the greater the doubt; perfect confidence is granted to the less talented as a consolation prize".


Is Yeats saying lack of conviction is a good thing? Or commenting on the fact that the wrong people have conviction?


I have always read that as "The good people are paralyzed by doubt (right now)" - a bad thing for them and for us. The context is a list of ways the world is falling apart.


Yes he is saying that lack of conviction is a good thing.


Even though the verse context is listing a bunch of bad things that are happening?

Bad thing;

Bad thing;

Bad thing,

Bad thing, and

Bad thing;

Good thing, and Good thing!


How sure are you about this?


+1 for Yeats


You two might not realize how happy that makes me feel. It's like when I discovered how popular The Master and Margarita is on HN. Never underestimate the depth and diversity of interests among your peers!


Ha, I just finished that book after being turned on to it by HN. Thanks, everyone!


I just found another google search.


Politics is interesting from a meta standpoint because if there are millions of people taking a political standpoint different from yours, the odds are quite strong that there are people who are well-informed with good intentions who have thought through their position in-depth and have reached a different conclusion than you have. It's also obviously a place where humans act the most like herd animals. So when you find a really smart person who has a different opinion yet wants to know more, it's an opportunity for both of you to test out your own reasoning skills.

This can be quite fascinating and intellectually humbling. Instead of viewing politics as something for dumb people who want to clan (which it can be for many), you get a personal feel for how much each of us relies on definitions and language that prevents us from doing much introspection. And if it's true in the political arena, it's certainly true in all of the other parts of our lives.

But that's just the first step of the journey. It leads naturally to questions like "How do we 'know' things?" which is a great adventure on its own. Eventually you lead to questions like "How do people of vastly different cultures and background communicate on anything? If I were dropped in the middle of a neanderthal tribe 40k years ago, could each of us learn the other's language and culture? If so, how?"

You end up with a profound sense of ignorance, but that's okay, because most stuff works most of the time and we don't have to go around poking at the foundations of knowledge simply to drive to the grocery store. But then you see one of us tech folks make some dramatic and overly confident statement like "C++ is obviously better than Java", and you think across the panoply of human experience, and you're able to have a good conversation without feeling personally threatened.

Intellectual humility is not only learnable, it's the first step on a grand adventure of realizing where we all fit into the universe.


There was a study some time ago that used PET scans to gauge responses to various topics, and the majority of the folks used the same part of their brain for both politics and sports teams.

Using their ‘gut’, not logical reasoning.


There has been a lot of research that points to the brain acting first, then justifying whatever actions it took.

I believe this to be the natural state of affairs. I think the brain is as maximally lazy as it can be. This translates to clanning with a large-ish group of people, then trusting that group to make a good collective decision. Act immediately on social observation and emotion and then if absolutely necessary come back and revisit the action to figure out "why" you did it.

If this is the way it works, then the default viewpoint regarding anything we do or say should be highly suspicious from a rational standpoint. It also means that we tech folks have spent a lot of time and effort trying to create artificial intelligence that reasons from logic to action, the complete opposite of the way real intelligence works.


I'll say this gets causality backwards. Intellectual humility is a luxury of the obviously skilled. In a room of smart people where everyone is competing for influence and attention, admitting your lack of knowledge can be a real handicap. When I was younger, rather than admit anything I just kept my mouth shut. Now when I am in a room, no one questions that I know what I am talking about, and I can usually establish my expertise with a few choice observations. This gives me a lot of latitude to ask questions and admit what I (or what we all) don't know, because I have the social standing to get away with it.

I'm not sure this is necessarily a mark of my humility.


Yes and no. Yes, being known as one of the ones who knows gives you freedom to say "I don't know" without being judged. In the past few years, I've started feeling that freedom. (Maybe others wouldn't have judged me before. Maybe the issue is that now I don't judge myself for not knowing.)

But no, being the one who doesn't know in a room full of those who do shouldn't stop you from learning. There are two different cases here, though.

If you're in a sane (not too political, and not too toxic) environment, just ask when you don't know something. (Don't do it so often that you hijack the meeting, but do it once or twice a meeting.) You will often be speaking for others as well who don't know but are unwilling to admit their ignorance (unless you're the only junior person). Your asking can free them to also ask and also learn.

If you are in a toxic or political environment, you still need to learn, without getting destroyed. That's harder. You need to try to find out who it's safe to ask. But the bigger task in that environment, in my opinion, is to not learn to be political or toxic yourself.


You're also getting at why it's difficult for those underrepresented in industry to make headway or gain knowledge. It's not as easy as "we have a diverse team". :)


Which brings us to an even more important facet of humility, being aware enough to know you're uncomfortable with certain people _because you're unfamiliar_. Most people are not nearly as cosmopolitan as they imagine themselves to be.


I think that is a good point. However, when I skimmed the article my thoughts were of the opposite people: the people who brag about their skills but then end up having very few skills in that area.


There's a saying in some Indian languages, which I'll paraphrase: the tree which bears a lot of fruit is always bent over. In other words, the person (tree) with lots of knowledge (fruit) is humble (bent over).


नमन्ति फलिनो वृक्षाः नमन्ति गुणिनो जनाः |

शुष्क काष्ठस्य मूर्खश्च न नमन्ति कदाचन् ||

Namanti phalino vrukshaah namanti gunino janaah.

Shushka kaashthasya moorkhashcha na namanti kadaachan.

Meaning: noble and virtuous persons are always humble and committed to helping others; the verse compares them to a tree bowing down with the weight of its fruits. In contrast, dull and foolish persons are always rigid and unbending, like dried wood.


And it's knowledge itself that humbles them? I think this is a great saying.


Yes, since fruit is heavy, and the more fruit-laden a tree is, the more it bends.


I think this article overlooks a crucial factor, which is ego insecurity. People with exaggerated confidence often learn and achieve more, because their overconfidence serves as a buffer against insecurity, and their inflated self-importance gives them motivation to work hard.

A certain amount of overconfidence is an integral part of being a healthy, thriving human being. People who are depressed have more accurate self-assessments than healthy people. Humility exposes a person to a lot of unpleasant, aversive feelings that can convince them that their energy is not being well spent, making it hard to continue investing at a high enough level to achieve much.

I think it's good to learn the ideas and skills of humility. I think it's important to recognize that it's unpleasant to find out that we're wrong, and we have a natural tendency to avoid unpleasant things, so we have to consciously and actively compensate for our tendency to preserve our own mistaken assumptions. However, I wonder if it's really healthy to internalize humility, to give up a subconscious way of thinking associated with high functioning and adopt a subconscious way of thinking that is associated with depression.

Ideally we'd all have such unassailable emotional security that we wouldn't be bothered by an accurate degree of humility. In reality, I think we have to fake it, and I think a lot of the people preaching this gospel of "you shouldn't feel bad when you find out you're wrong" are faking it too. But that's okay. Our full being is a combination of our messy human psyche that isn't designed for the things we try to use it for and the conscious ideas by which we interact with it. Sometimes these parts are compared to an elephant and a rider — the tiny rider on top is the part of our mind that we consciously control. I think it's okay if that's the only part that really practices humility.


I think this is an interesting point, but responding to the last part ("...a lot of the people preaching this gospel of "you shouldn't feel bad when you find out you're wrong" are faking it too. But that's okay.")

I think there are different levels of "feeling bad". When I'm wrong about something, it certainly doesn't feel good, and in some cases it really sucks, but I don't then generalize that feeling to represent something intrinsic to myself. I think the crux is separating the bad feeling of the instance from an overall bad feeling. I.e., I got this one wrong, but that doesn't mean I'm stupid.


One of the ways I think about humility is that it is the ability of a person to successfully integrate information that requires some non-trivial refactoring of one's internal model of the world.

Refactoring can be a costly operation, so it makes sense that the mind shouldn't take on that task too often, especially if one's mind is more likely to suffer net harm from an unsuccessful or partial refactoring.

However, if a person is gifted at learning in the broadest sense, then it would make a lot of sense that their mind would undertake this process much more gladly, and as a result, form a very strong model of the world.


refactoring is a very appropriate term here. how is it bad or costly? it should be done constantly, every day. if i can make myself less full of shit at least a tiny bit today, i will take it.

having seen many characters in the line of duty, the clear pattern is - if someone is not afraid to say "i was wrong" or "i am not the smartest guy", then there is no doubt who is the smartest guy in the room.


why would a partial refactor be bad for the mind?

(I don't consider the mind to be digital, and the brain even less so)

how can you know when you have completed a refactor?


You know that scene from Animal House where the 60's college kids are sitting around smoking a joint, and someone proposes the idea that every electron is a universe, and their minds are all blown? That's a parody, but it also really is what a human mental model refactor looks and feels like. (Both from the outside and the inside. Yes, I'm speaking from personal experience.)

> how can you know when you have completed a refactor?

Before you've completed it, other people might think you're annoying because you want to talk about one particular thing a whole lot, and they're waiting for you to get over it. Most likely, you'll roll back some of the changes and make some bug fixes, and you'll stop feeling "whoa" and euphoric, and people will start treating you normally again.

The above answers:

> why would a partial refactor be bad for the mind?

Probably good for your mind, but an interruption for your social progress.


Based on my understanding of the learning process, you can typically tell a refactor is occurring when you go through "foundational collapse," meaning you've learned something that causes almost your entire understanding of a subject, even one you've known for a while, to fail. This lasts until you are able to rework your entire understanding of the subject to fit the new criteria. It happens more easily the better you understand the subject in the first place, as long as you understand what is happening and give your brain an opportunity to contextualize the information.

Since this is done by the default mode network (DMN) instead of working memory, when this occurs I think the best thing you can do is take a weekend off. When you are learning aggressively, as at a high-level university, I imagine a student will go through several of these a year in various subjects.


I suppose you have an incomplete model where you haven't thought of and understood all of the corner cases.

I don't think you ever 'finish' a refactor! But you get to be able to deal with your everyday without bumping into conflicting information that causes you to have to refactor every day.

I think a partial refactor would be bad because you're working on an inconsistent model, and if it's wildly inconsistent then it can cause you issues, and might not be a complete model.

For example, imagine you just learned that sugar is bad for you! But that's the sum of your new dietary knowledge. So you start going for fatty foods with lots of preservatives. This isn't going to be any better for you. And you'll have to 'refactor' when someone points out your new diet is still bad for you.


Probably not bad, but taxing. Unlearning and re-learning is difficult so it's avoided at the expense of retaining bad habits or knowledge. Like the adage about people getting more stubborn as they age, teaching an old dog new tricks etc.


It's funny, I always considered myself one of the most humble people i know. Guess this explains why I am so knowledgeable about so many topics.


The irony in this statement is fantastic. Thank you for humbly making this post and giving me a good laugh.


I'm not humble, but I've learned to fake it pretty well.

The trouble is most people believe me when I say I'm not that smart.


100% this.

And so was born the pervasive humble brag.


Prior to reading this article, I was arrogant and conceited. Now I know to proceed through life with intellectual humility in order to amass knowledge.


I'm at least 10% humbler than you are.


Came here to say that - actually, I'm impersonating a humble person, which is even better. I feel like I'm still getting the benefit of superior knowledge.


You can be humble and still believe you know more than other people. The issue arises when your self-reported knowledge exceeds your actual knowledge.


I think I'm much more humble than you would understand.


> I always considered myself one of the most humble people i know.

This is the paragon of humble-brags.


... Not really? There's nothing humble about his statement, it's just a brag.

Also, it's intended to be ironic.


> I always considered myself one of the most humble people i know

That's a brag about being humble.


The article title is clickbait nonsense:

“The findings in relation to knowledge acquisition were mixed. While an online study involving 604 adults (and using the more comprehensive measure of intellectual humility) found the aforementioned link between greater intellectual humility and superior general knowledge, another involving college students (and the briefer intellectual humility questionnaire) found that those higher in intellectual humility achieved poorer grades.”


Thanks for this. I read the title, and my thought process was:

"I agree".

Then:

"There's no way the study backing the article is good enough to draw meaningful conclusions. This is clickbait."

Then:

"I guess if i'm going to be intellectually humble I should still read the article."

I think based on your comment I can safely skip the read, and save myself some time. Intellectual humility is one thing, but I can go read something with a higher probability of being informative instead.


There's more to it than the excerpted quote. It's worth a read. The article does not breathlessly treat any of the studies it discusses as the word of the flying spaghetti monster, and it does a good job of presenting a broader perspective on the question.


One study showing the opposite result likely means that all of their results come from confounding variables.


I'm not seeing the contradiction.


Then sometimes, you meet someone with the attitude, "I must be very intellectually humble, because I already know just about everything!" This sounds like a cheap joke, but I'm serious about this observation. It's easy to fall into this for someone who has been inculcated with the academy's values with regard to knowledge. This not only includes academics, but disciplines with some degree of intellectual rigor, like programming and software development.

Here's a good rule: don't get so pumped up that someone else wants to poke you to deflate you.


Isn’t there a chance it’s the other way round? People might start off arrogant, but once they acquire enough superior general knowledge they figure they know jack, which forces them to be humble. It’s a common enough fallacy to get cause and effect backwards.


> People might start off arrogant but once they acquire enough superior general knowledge they figure they know jack and forces them to be humble ?

There are too many other variables at play - i.e. inherent psychological disposition, environmental and social factors, institutional incentives - to know how cause and effect work in the development of intellectual humility.

But sometimes arrogance in general (and perhaps intellectual arrogance too) is itself just a thin defense against a dominating unexamined insecurity about oneself.

If acquiring more general knowledge helps make you more comfortable with yourself, it might make you more intellectually humble, but it could also do nothing, or even the opposite.


There is also a chance these concept correlate because they share many of the same characteristics, i.e. they are mostly the same thing.

Measuring a psychological concept is far from rigid, and if you align one question or answer to one concept there is a good chance you are actually measuring the other, or both. In fact, the very concept of general intelligence has been severely criticized for this, to the extent that many psychologists (and laymen like myself) don’t even believe it exists.


Is the article implying causation or just correlation? Because either cause->effect pathway would imply correlation. The pathway you describe seems like a common way to get to humility, through actually learning how little domain knowledge you have.


I know that there isn't a lot of scientific support for Myers-Briggs vs the Big 5 personality test - but I always found the "judging" vs "perceiving" aspect of Myers-Briggs to be somewhat meaningful in understanding people's mindsets. I think this article clarified for me that the factor I was actually looking for boiled down to intellectual humility. It would be interesting if they did Myers-Briggs and other personality inventories alongside these studies, just to see if it really does match up.


I'm reminded of something Jesse James said on Monster Garage (I'm sure he wasn't the first to say it, and he is not the best role model himself, but...): if a guy says he knows everything, he probably doesn't; the real experts are the ones who are more humble about their skills.


TBH I doubt this very much. In fact I doubt, in general, that such broad assertions can even be made. Anecdotally, I have worked at 2 research labs by now (one of them was MS Research, another relatively unknown), and of the few truly genius-grade people I met there none really had much "humility" on display. They know a staggering array of stuff really well, and they know they know it. If you don't know something, they'll let you know without much consideration for your self-esteem.


Is it possible there were people in those labs that knew more than these geniuses, but you never noticed because they presented themselves modestly?


Unlikely. My bar for a "genius" is pretty high. It's actually pretty obvious when you're in a room with one - it almost feels like they're a different species. Stuff that's insurmountable to you is easy as pie to them, and they can explain it in simple terms. They couldn't care less about the fact that you weren't able to understand it on your own.

And some of them present very modestly most of the time, but if you're wrong about their stuff, they won't mince words. Nor would they be humble about things to get along with you, or doubt their knowledge just because it's trendy to do so. They don't necessarily shove themselves into everyone's face.


The article points out that there are conflicting studies. Personally I don't believe there is a strong correlation (one way or the other) between intellectual humility, intelligence or breadth of knowledge. But my opinion on that matter is entirely unqualified; all I have to go on is basic anecdata I've observed along with my own interiority.

While reading through the article I found this interesting:

> In terms of insight, higher scorers in intellectual humility were less likely to claim knowledge they didn’t have (the researchers tested this by assessing participants’ willingness to claim familiarity with entirely fictitious facts that they couldn’t possibly know), and they also tended to underestimate their performance on a cognitive ability test.

I can appreciate the reasoning behind asking that question, but the fact that it's so useful for judging intellectual humility saddens me. Why would a person spontaneously respond, "Yes" when asked if they're familiar with a thing they know they aren't? What does that say about us as a species, that this behavior is so prevalent?


We have a long history of education based on punishment, reward, and PR.

Myself, I've felt compelled to pretend to know it all out of fear of being excluded. Being excluded is no trivial thing: it could mean not getting any job, falling into disgrace, and being hated by a majority of people. It happened to me for saying the wrong thing in the wrong place at the wrong time to the wrong people.

There is also the problem of PR, the post-modern fallacy, people who think they can convince anyone of anything just to get what they want, because truth is relative, and it doesn't really matter if you are good or competent as long as you get a seat at the big table.

Modern western society has a low tolerance for ignorance, which is sad given that many people are ignorant not by choice; they were just dealt a bad hand. Also, it is very difficult for ignorant people to get out of their ignorance if nobody helps them; there are unknown unknowns which they cannot see without external help.


Actually had an opportunity to make something like this a teaching moment for my kids yesterday. I'd been reading articles about the bridge between Abell 0399 and Abell 0401. There was a Forbes article that said this provided supporting evidence for dark matter. https://www.forbes.com/sites/startswithabang/2019/06/06/scie...

Dark matter bothers my sense of cosmic aesthetics. I'm not a physicist, I just like reading about how the universe works, so please don't attack my ignorance; there's only so much I can read. I'm painting with very broad strokes here.

Going back to the Forbes article, it provided some good arguments that this was supporting evidence. I had to concede that the blocks making up the argument seemed sound, so I started rearranging mental furniture.

Then there was the correction at the end of the article, and I'm at the position where I can understand how you can provide a situation that can at least provide supporting evidence to support dark matter.

---

However, that's not the main point, it's just the foundation for where I'm going.

I used it to show my kids that you can understand/believe something that seems right or is "known" to be true. But when you come across evidence that shows you're wrong, or presents valid evidence to support a differing point of view, then you have to consider it outside of what you desire to be true. Mind, after examining it, poke at holes that exist and shred it if it's garbage, but if it's not susceptible to that, you need to change your world view.


I believe this is largely the reason Galileo is often considered the first scientist. When Copernicus argued for a heliocentric world-view, he was largely doing so on the basis of aesthetic value: it simplified the mathematics. But it was always theoretically possible for another mathematical model to make the geocentric perspective more appealing, so anybody who was attached to it could simply cast his model off as a mathematical trick.

But Galileo built a telescope that anybody with a pair of eyes could use, no matter your aesthetic preferences, and he was compelled to transform his world-view on the basis of the facts he observed in it. He didn't really have to make any mathematical argument at all; he just had to explain what he observed, and it directly contradicted the Aristotelian model of reality.


Sometimes people can be acting passive-aggressive in the name of humility. It's hard to understand yourself well enough to find out if you are truly willing to learn or just trying to prove another person wrong. I think I have lived on both ends of the spectrum in different phases of my life, so I can vaguely relate.


> A final study showed that participants who read a popular magazine article about the malleability of intelligence (designed to foster a “growth mindset”) subsequently scored higher on intellectual humility than another group who read an article about intelligence being fixed.

> What’s more, those in the growth mindset condition went on to display a more positive approach when imagining dealing with someone with opposing views, and this seemed to be driven by their increased intellectual humility.

I find this part interesting, as it might explain why some communities are so toxic and others... nicer: the niceness can spread between members. So the first few members are very important in setting up the community's culture.


My understanding of the great geniuses is they tend to not be very humble...is this concept empirically supported, or does it just sound nice?

In my experience, the really knowledgeable people tend to be quite opinionated, while also being willing to question their own ideas and receive critical feedback. So, it seems they have both great humility and arrogance, not one or the other.


The canonical genius has to be Albert Einstein, who was known for humility. As to some others, people make history as geniuses for (1) doing genius things and (2) making sure everyone knows they're the genius that did it, so you have to consider the effect of (2) when you try making a list.


If they manage to do #2, they are still a genius, which contradicts the notion that geniuses must be humble.

Personally, I prefer a bit of arrogance so that people clearly state what they believe and why, even if it is a bit blunt. The clarity of this approach seems to lead to better ideas, since my personal observation is that great ideas also tend to be fairly simple and can be stated in a straightforward manner.

Plus, when an idea is clearly stated, it is easier to know if the idea is right or wrong, which is necessary for the humility bit of self questioning.


Scanning the comments, I don't really see anyone discussing the value of "superior general knowledge".

I like knowledge, so I find value in attaining it. However, depending on your goals, intellectual humility will set you back a great deal. Most people use confidence and certainty as a signal, and suppressing these means they'll pick someone over you (for a promotion, for a relationship, for a contract, for your knowledge/expertise (even though you likely have more of it than the other guy)).

The people who are ahead are rarely the ones who are the most knowledgeable (with the possible exception of academia).

I've been told: "Be humble and tentative on the inside, but self-promoting and confident on the outside".


One can very much be humble and confident.

Especially in an interview or promotion situation, I'm proud of and speak to my accomplishments and the skills and knowledge I've been able to obtain, but I can also speak with real honesty about the mistakes I've made, tend to make, and my own shortcomings. I talk about both without hesitation.

Though I know it's entirely anecdotal, I've found most of the time those traits were taken quite well in an interview context, and certainly I've appreciated people that display those traits when I'm interviewing them.

The key I've always found was the tone in which both your humility and confidence are displayed. After all, confidence is a totally different animal than cockiness.


Yes, I did not mean to say that the two are necessarily exclusive. You can be confident about things you know exceptionally well, or about things from your past (accomplishments). But if you show humility for other things, and start using qualifiers like "most likely", "I suspect", "generally", the majority of folks are likely to pick the guy who doesn't use them.

>Especially in an interview or promotion situation, I'm proud of and speak to my accomplishments and the skills and knowledge I've been able to obtain, but I can also speak with real honesty about the mistakes I've made, tend to make, and my own shortcomings. I talk about both without hesitation.

These are things for which there is no uncertainty: you know your accomplishments, and you know your failures. And it is a very confined scenario (interview/promotion). What I've found is that demonstrating uncertainty in day-to-day matters can greatly impact whether you'll be considered at all for a promotion/job.

My first manager always wanted certainty. If you said "I'll see what I can do" or "I'll do my best", that was perceived negatively. And it would be more negative if a conversation followed where you outlined the reasons for the lack of certainty (various risks, dependencies, etc.). If you said "I'll get it done", he'd be happy. Over time I noticed it was irrelevant whether you actually got it done. If you said "I'll get it done", and failed, and explained why you failed, things were good. If you said "I'll do my best" and failed, and explained why you failed, his belief was "He didn't try hard enough".

Granted, not all managers are like this (thankfully!). But I've found most people in the general public are like him. Both online and in my personal life, when I've expressed any doubts/risks in an endeavor I'm pursuing, I've been told that I'm setting myself up for failure. And when it comes to non-solo pursuits, they're usually right, but not for the reasons they believe. The key was this:

People who express doubt are less likely to receive help from others.

People who express unrealistic confidence are much more likely to get help from others.

So once again: Hide the humility and fake confidence on the outside.


I think it takes a really courageous person to hear someone share their negative traits and still trust them, just like it takes one to buy a product on Amazon with a lot of 1-star reviews. I'm trying to be that person, but it's hard! In my experience, voicing negative traits about oneself can put a kind of iceberg into someone else's mind, where they think "what else must be wrong with this dude?" People who know and trust you already won't be impacted by this, of course.


I’m curious where the term “Superior General Knowledge” in the headline came from. It’s not explicitly defined in the article and ironically is not an intellectually humble way to describe the article’s conclusions.


Makes sense. Well educated people have educational credentials that protect their self-esteem. They can afford to be intellectually humble. Admitting to mistakes or ignorance doesn't hurt as much.


Some of them are just intelligent enough to understand that they don't know everything.


Intelligence can't transform the unknown unknowns into known unknowns. Awareness of one's own ignorance has to be taught.


I hate to nitpick but "superior general knowledge" should probably be "more depth and breadth in general knowledge." Superior is subjective and in this context sounds elitist.


Using "aperçus" in an article about intellectual humility? Ironic, I'm pretty sure. Unless I'm just unfamiliar with inexplicable pluralization being appropriate.


I took it as a tongue in cheek thing, but if you look here https://trends.google.com/trends/explore?date=now%207-d&geo=... you might find the results amusing. Apparently a decent amount of people reading an article about intellectual humility are the type to google a word they don't know.


Good for them! I looked it up, too, to see if there was a common usage I'm unfamiliar with. I am a non-native French speaker, and the pluralization didn't make sense to me, but I thought maybe this was a case where I was unaware of a word being re-appropriated and used differently in English. It does seem to show up as plural, but without the accent. I still don't really know, though.


The pluralization does exist, as “those who have perceived” or “those which have been perceived”, but it is indeed a grammatical error in the article's usage. The ç is always correct, but accents are a common casualty of transliteration and informal writing.


As an artist I must say that this is actually the opposite of what you want to be when making innovative art. You have to push your world view so hard that it becomes real.

Ignorance creates art[1], stubborn ignorance guided by personality.

Ps. This might explain why so many artists are narcissistic

[1] https://youtu.be/25kmuPSt60w around 2:48 guy makes a good point about this


Or do people with "superior general knowledge" recognize it is socially advantageous to present as humble?


"When an honest man discovers he is mistaken, he will either cease to be mistaken or cease to be honest."


i try to remember: "i like liking thing."

this seems to help me avoid intellectual-ier-than-thou attitudes and behaviours. it's simple in its elegance to me:

1) i like to like

2) in order to like something, a manner of understanding is always involved (in maths it's empirical, in watching a magic stage show that fools you it's in a less direct understanding such as 'i understand there's a person attempting to create a sense of awe and wonder even though i do not understand the methodology of the trick')

3) so intention to like -> (as leads to) liking

4) and intellectual humility -> understanding

5) so intellectual humility -> understanding -> liking things

inversely,

intellectual arrogance -> ignorance -> not liking things

i don't want to do that thing. the ignorance and hate thing. it is - at least figuratively and perhaps moving toward literally - the only thing i actively do not like.


So as the levels of intellectual humility increase there comes a point where a person's superior general knowledge on a subject can actually be greater than a domain expert's specific knowledge.

At this point maintaining the intellectual humility so as not to descend into idiocy becomes a real challenge.


Guess what? They confirm in the article that different studies had opposite results, yet we got this title.


There is an Indian saying: a tree full of fruit hangs closer to the ground than one without.


Curious whether cause and effect could be reversed here. Perhaps it's "People with superior general knowledge have greater intellectual humility"? Or perhaps it's a feedback loop where each reinforces the other.


知者不言,言者不知

"The one who knows doesn't speak, the one who speaks doesn't know"

- Dao De Jing


Those who know don’t talk, and those who talk don’t know.


Ah yes. Always knew my superior general knowledge was due to my absolutely incredible intellectual humility.


Me too. I always seem to know a little or even a lot more on any subject than others do. I realise now, it's down to my outstanding intellectual humility.


Noam Chomsky comes to mind immediately. I have seen very few people who are as humble.


The bigger the attitude, the smaller the talent, I have always noticed.


The Socrates paradox goes "I know that I know nothing".


The Socrates paradox: "I know that I know nothing"


The corollary being: a shallow brook babbles the loudest.


Socrates got it right more than two thousand years ago.


This is a description of the Dunning-Kruger effect. While Dunning-Kruger is a popular topic in tech circles, I'm not so sure about the general population. I wish people weren't so ignorant of their own ignorance.

Edit: Adding clarity

People who have superior general knowledge have greater intellectual humility because they are more aware of their own ignorance and the complexities of the subject matter. Those who do not have superior knowledge are subject to the Dunning-Kruger effect.


Ignorance of what? Your reality or their own?


Reality isn't subjective. People are just imperfect antennae with varying receptions and demodulators.


Isn't it subjective when we consider the reality of the fish versus the reality of the human? Even among two humans the cornerstones of your psychic landscape can be vastly different. Are you talking about elemental reality? Any "elemental reality" that leaves out the vital elements of psyche, behavior, habit, preference, and perception would clearly omit a lot of vital information in coming to know Reality with a capital R.

