Intellectual Humility is one of the most exciting fields of study in current psychology and philosophy. Recent research has convinced me that getting better at it can change our lives—and make a big difference in the world. This article lays out much of what we know today about it, based on research from scholars across several disciplines, along with some original data of my own. If you’re interested in training exercises for developing intellectual humility in yourself and others, check out our new Dream Teams program at Snow Academy.
HAVE YOU EVER FELT frustrated by someone who refuses to change their mind about something, even when they’re clearly proven wrong?
If you’ve ever used Facebook, I suspect you have.
Take a second to recall just how frustrating that feeling is.
Now take a second and try to recall the last time you admitted you were wrong about something important.
When was the last time you changed your mind about something really close to your identity—say, a political issue, a spiritual belief, or what you wanted to do with your life? How did that feel? How tough was it?
Doing this kind of thing is hard. We rarely do it. But it’s almost always worth it.
The people I have respected most in my own life have been living examples of this: my grandfather, who converted his life of vice into a life of generosity; my journalism mentor, who went from crack addict to family man; my mother, who dared to raise her kids differently than she’d been raised—and thereby broke a cycle of abuse.
Gallup’s list of Most Admired People is populated with people who have changed, repented, evolved, and grown—not people who got it all right the first time. Saul the Christian persecutor changed his ways and became Paul the Christian apostle—and wrote half of the New Testament. I could go on.
We live in an era full of social pressure to appear perfect all the time, to never admit fault or weakness or else risk losing that promotion, that election, those followers, etc.
It’s an era where changing your mind or admitting you were wrong is fuel for your enemies—even when it was the right thing to do—and where blaming Others is the go-to strategy for dealing with setbacks.
And even without social pressure, changing our minds about things we’ve made our minds up about is psychologically painful. As humans, we tend to avoid the pain that comes with unlearning.
But the people who make the most positive difference in the world are not like this. They are the ones who are able to discern when they need to change and then be brave enough to do so even when the cost is high.
These kinds of people make amazing teammates because they are able to consider people and ideas that others won’t.
They make amazing leaders, because they are able to choose the right thing to do over the easy thing to do.
They make amazing citizens, because they don’t stop learning and thinking as the world around them changes.
Scientists have a term for people like this: Intellectually Humble.
*
WHEN WE LOOK AT THINGS that have shaped history—as well as breakthroughs in our own lives—two truths stand out:
Nothing great was done alone. Progress is always the product of working together, or being influenced by the work of others; and
Breakthroughs only happen when we think differently than before.
Indeed, our capacity to change the way we think—driven by the unusually high neuroplasticity of our brains versus other animals—has a lot to do with why we were able to work together and win Planet Earth in the first place. The sabretooth tigers were bigger than us, but together we learned how to outsmart them.
Intellectual Humility is one of the most fascinating fields of study in current psychology, because it helps us understand how to get better at both of these things—working together and thinking differently. We’ve only recently begun to understand how intellectual humility works in our brains and how a person can get better at it. But the ramifications are exciting!
This article digs into everything we know today about intellectual humility and how to get better at it. I’m convinced that this is the number one thing that any leader, team member, citizen, or family member can learn in order to become more effective tomorrow than they are today.
And it’s especially relevant in today’s world of “alternative facts” and deep ideological divisions. The more intellectual humility we have, the better we can work with others who aren’t like us, the more innovative problem solvers we can be, and the better people we can become.
Warning: This post is long. Feel free to skip around to the parts that interest you most.
*
MALCOLM X’S HOUSE WAS burnt down by racists twice.
The first time was when he was a child. White supremacists torched his family’s home, shortly before one murdered his father.
These events propelled Malcolm to become a fundamentalist preacher who, among other things, taught that white people were devils—literally not human—and that it was thus okay to commit violence against them.
It’s hard to blame him, given the circumstances.
Malcolm built an enormous following by preaching this. He denounced Martin Luther King Jr. and the civil rights movement for trying to get the races to get along. He taught that it was “a beautiful thing” when innocent white people died. He declared that women were weak, Jews were evil, and JFK had it coming when he was assassinated. He even started discussions with the American Nazi party to work on promoting the thing both he and they wanted: to separate black and white people from each other.
Dr. King in turn denounced him. “The hatred expressed by Malcolm X is not shared,” he declared.
*
THE SECOND TIME RACISTS burnt Malcolm X’s house down was after Malcolm suddenly began preaching that racism was evil.
He’d gone on a trip abroad for a few months, and when he got back, everything had changed. He began giving sermons and television interviews preaching the opposite of his earlier stump speeches. He declared that his past ideas about race were wrong. He apologized, and said that women should be equal to men, and that the human family would need Jews and Christians and Muslims and even Atheists in order to make progress.
Indeed, Malcolm began preaching that Dr. King’s quest for equality and civil rights was good.
“A man should not be judged by the color of his skin, but rather by his conscious behavior, by his actions… I believe in taking an uncompromising stand against any forms of segregation and discrimination.”
– Malcolm X
We revere Malcolm X as one of history’s heroes because toward the end of his life, he changed his mind.
He rethought the fundamental ideas on which he’d built his whole career.
He rejected the philosophies on which he’d staked his reputation for decades—knowing full well that it could get him killed.
It did get him killed.
But not before Malcolm X was able to contribute to the Civil Rights Movement, converting many disaffected people to the causes of integration and nonviolence.
And after his death, his autobiography went on to inspire generations of people to stand up for themselves and live better lives.
I love Malcolm X’s story, and not only because it’s a righteous bit of history.
I love it because it can help us understand a question that’s extremely relevant to all of us today:
What does it take to change our minds about important things?
I spent a couple of years exploring this question for my book Dream Teams, which is about the science of becoming better together. And it turns out that the answer is one of the biggest keys to making progress in teamwork, creative solo work, and society: a virtue called intellectual humility, or IH for short.
If everyone in the world developed more of this virtue, a lot would change. Innovation would skyrocket. War and violence would plummet. Facebook arguments would actually be productive.
So what exactly is IH, and how do we develop it? In order to really understand, first we need to talk about a misunderstood term we’ve been kicking around for ages: open-mindedness.
*
SOME PEOPLE ARE PARTICULARLY good at changing the way they think and do things. Sometimes we just call them good learners. But often we give them a different label: open-minded.
But what exactly does that mean?
Colloquially, we often use open mindedness to describe someone who has a “willingness to do or believe lots of things.”
But Dr. Jason Baehr, professor of philosophy at Loyola Marymount University, defines it a little more specifically:
“An open-minded person is characteristically: (a) willing and (within limits) able (b) to transcend a default cognitive standpoint (c) in order to take up or take seriously the merits of (d) a distinct cognitive standpoint.”
Open mindedness, in this definition, is linked to more creativity, less fearfulness (especially of people not like us), and better problem-solving.
Unfortunately, there are a few problems with open mindedness, too.
The first is perhaps the most surprising.
For centuries, philosophers and psychologists have been unable to pin down how humans become more or less open minded. That’s largely because we’ve had no way to actually measure it.
The second problem is that too much open mindedness, especially in the colloquial definition, can turn out badly. Changing your mind based on every new piece of information, or just because, is not good for survival—or business.
As Carl Sagan once put it: “Keeping an open mind is a virtue, but… not so open that your brains fall out.”
Finally, there’s the problem that I uncovered in a 1,249-person study that I conducted for Dream Teams. About three-quarters of adults claim they are more open minded than the average person.
Statistically, most of them must be wrong. Even worse: less than 5% of adults think they are more closed minded than average:
Source: Dream Teams 2018 IH Survey
In other words, we’re confused about what open-mindedness really is, we think we’re open minded even when we’re probably not, and scientists haven’t been able to do much about it.
The one related thing psychologists have been able to measure is a personality trait called Openness To Experience, or OTE. This is part of the Big 5 assessment that a lot of companies use to figure out if a job applicant is going to fit their culture.
(Side note: that’s a bad idea.)
But OTE is only your willingness to try something new. Like, pickle flavored ice cream. Or that new Netflix show about chinchillas.
It’s not a measure of your willingness to change your mind about pickle flavored ice cream. Or of your willingness to transcend your default cognitive standpoint on chinchillas.
Open-mindedness begins only once we’ve taken in the new experience or information. And philosophers say that humans have two ways of doing that:
The first is confirmatory thought, or “a one-sided attempt to rationalize a particular point of view.”
The other is exploratory thought, or an “evenhanded consideration of alternative points of view.”
You can be open to thinking about things while still considering them in a one-sided way—just engaging with other viewpoints so you can confirm your current thinking. That’s not true open mindedness. (We all know that person who’s willing to do things with no intention of changing their mind about them just to “prove” that they’re right about what they think.)
Malcolm X was willing to debate his intellectual rivals—like leftist icon Bayard Rustin—with no intention of changing his own far-right views.
Malcolm was open to the experience of hearing another argument, but it took developing something else to get him to actually change his mind.
*
INTELLECTUAL HUMILITY IS WHAT’S known as a moral virtue. As Aristotle defined it, a moral virtue sits between two vices. It’s the temperate thing between defect and excess.
For example, “courage” is considered a moral virtue because it sits between “cowardliness” (too little courage) and “recklessness” (an excess of courage):
This is different than a personality trait, like being funny or extraverted or having OTE. Many if not most personality traits are not inherently good or bad from a moral perspective.
There’s no morally superior place to be on the spectrum between funny and not funny:
So while OTE is helpful for learning new things, there’s nothing morally wrong with not wanting to try something you don’t want to try. It’s not a sin to be too nervous to go snowboarding, to not want to talk about politics with your uncle, or to have no interest in trying chicken foot soup.
You may live a less interesting life if you don’t try these things out—just like your life might be more fun if you’re funny. But it doesn’t make you immoral either way:
On the other hand, philosophers agree that it’s morally good to be able to decide when to change your mind, and to know when you shouldn’t:
Over the years, most IH research stayed in the realm of religious philosophy. But that changed as psychologists started discussing how this ability to change your mind (and know when you shouldn’t) was important for us in everyday contexts outside of religion.
And with this came a breakthrough that gives us a better way to think about the whole “open mindedness” thing.
That breakthrough came in 2016, when two professors from Pepperdine University, Drs. Elizabeth Krumrei-Mancuso and Steven Rouse, broke IH down into its components and then developed a test people can take to measure them.
Mancuso and Rouse defined IH as “a nonthreatening awareness of one’s intellectual fallibility.” They said this should result in four things:
Respect for other people’s viewpoints
A lack of intellectual overconfidence
Separation of one’s ego from one’s intellect
A willingness to revise one’s viewpoints
and ultimately: the ability to change your mind when the evidence warrants it.
In many ways, these four factors feed into each other. Though it’s possible to rank high in some and not in others, to be truly intellectually humble, you need all these things.
As Duke professor Mark Leary put it to me, “If I respect others’ viewpoints—that is, I don’t disrespect or reject people for what they believe—yet I always think that I am right, I’m certainly not intellectually humble.”
Mancuso and Rouse created a peer-reviewed assessment for measuring these components of IH, and published it in the Journal of Personality Assessment. Around the same time, Leary and his colleagues published a similar scale for measuring IH—focusing in addition on the relationship between IH and curiosity, tolerance for ambiguity, and low dogmatism—in the Personality and Social Psychology Bulletin.
The explosion of research on IH since we’ve been able to measure it shows that this virtue comes with tons of benefits.
Studies already show that people high in IH pay more attention to evidence and are interested in the reasons that other people disagree with them, rather than just overcoming their opponents.
People with lots of IH also react less emotionally to ideas they don’t agree with. And they’re better at distinguishing between fake news and truth.
Lord knows we need more people who can do that today.
The thing I like about Mancuso and Rouse’s four components of IH is that they give us an easier way to think about developing it than just saying, “Be better at changing your mind when you should!”
So for the next half of this post, we’re going to dig into research on each of them, so we can go over what it truly takes to master this virtue.
But before we get to more on that, let’s talk about an important caveat:
It’s possible to have all the IH in the world and never get the chance to use it. What good is being willing to change if you never take in any new information? To unlock situations where we can use IH, it helps to have some OTE:
Like I said before, there’s nothing morally wrong with being scared or unwilling to try something new. Not being open to a new experience doesn’t make you bad. (I don’t want to try PCP, #sorrynotsorry.)
But, in the same way that being funny can be more helpful in depressurizing tense situations than not being funny, having more OTE is useful. It gives us the ability to get more out of IH, because it leads us to get more information to consider.
Very few people have zero OTE. It’s just that those who are willing to try new things get the opportunity to use IH more.
*
THE FOUR DIMENSIONS OF IH add up to the ability to do the kind of thing that Malcolm X did, to change our minds when it’s the right thing to do—even when it’s risky.
When you combine these four factors with the OTE assessment from the Big 5 Personality Assessment, it gives you a pretty decent approximation for how “open minded” a person is. (Dr. Mancuso told me that this is the most well-rounded way to measure open-mindedness that she’s heard of so far.)
So I did just that: I combined the assessments for IH and OTE to create a 5-factor Open Mindedness test. You can take the assessment yourself at the bottom of this post. (Click here to jump there now in a new window!)
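If you’re curious what that combination looks like mechanically, here’s a minimal sketch of how a combined assessment like this can be scored. The factor names follow this article, but the three-items-per-factor demo, the 1-to-5 Likert format, and the scoring math are my own illustrative assumptions, not the actual published instruments:

```python
# A minimal sketch of scoring a combined IH + OTE assessment.
# Factor names follow the article; item counts and the 1-5 Likert
# format are illustrative assumptions, not the real instruments.

from statistics import mean

FACTORS = [
    "respect_for_viewpoints",
    "lack_of_overconfidence",
    "ego_intellect_separation",
    "openness_to_revising",
    "openness_to_experience",  # the OTE add-on from the Big 5
]

def score_factor(responses, reverse_coded=()):
    """Map 1-5 Likert responses onto a 0-1 scale (1 = best)."""
    adjusted = [6 - r if i in reverse_coded else r
                for i, r in enumerate(responses)]
    return (mean(adjusted) - 1) / 4  # 1..5 -> 0..1

def score_assessment(responses_by_factor):
    return {f: score_factor(r) for f, r in responses_by_factor.items()}

demo = {f: [4, 5, 3] for f in FACTORS}  # three hypothetical items per factor
print(score_assessment(demo))           # each factor -> 0.75, i.e. about average
```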
Based on data from thousands of people who’ve taken this assessment—and combined research across neuroscience, psychology, and sociology—the rest of this post explores what we currently know about how to get better at each of the aspects that play into IH, which in turn gives us a pretty clear path to becoming better collaborators and more innovative thinkers.
INTELLECTUAL HUMILITY HAS FOUR main components, according to psychologist Elizabeth Krumrei-Mancuso of Pepperdine University, one of the foremost researchers of the topic who has developed a scale to measure it.
Scoring low in any of the four components tells us we’re not that intellectually humble, but also tells us what we can work on to get there:
As someone who had written more on IH than any other journalist I knew of, I presumed that I would score top marks on all fronts when I took the assessment.
But that thought itself foreshadowed my test results.
Here’s how I did on each dimension, on a scale of 0 to 1, with 1 being the best and 0.75 being about average:
As you can see, I scored average or above on most of the factors.
But holy smokes… apparently I have an ego issue.
Upon reflection, I can see it. As a journalist and explorer, I’m quite curious about new things and people and ideas. And I’ve changed my views on things like religion and politics in big ways in my life.
So I’m open to being wrong about things.
But man does it tend to hurt when that happens.
My conclusion is that since I was a kid I’ve staked my identity on being “a smart person.” So when I’m proven wrong about something, it cuts at my identity, and my ego feels attacked. This surely leads me to have less productive debates, and to avoid digging into subjects where the truth could hurt.
The good news was now I knew exactly what I needed to work on!
Research shows that genetics and parenting each have an impact on general IH levels, as does living in a culture that values evidence-based learning.
But it’s also shown that people can change. We’re not just at the mercy of our circumstances when it comes to IH—if we work at it deliberately.
As The John Templeton Foundation points out in an excellent overview of the psychology of intellectual humility:
“People differ not only in their general level of IH but also with respect to particular beliefs and attitudes. People may be intellectually humble with regard to some of their beliefs while being arrogant about others.”
In particular, research summarized by Duke University shows that we tend to throw out IH when we feel we are under threat—such as during times of war, economic hardship, thoughts about our own mortality, or when waves of immigrants move into town.
So the scores one receives on a self-reported IH assessment like the one above only really tell us how someone operates generally. For each of us, there will be times and topics that trigger us to start operating with lower IH.
For this reason, understanding the building blocks of IH can help even the best of us operate at a higher level, especially during times when we feel threatened.
Let’s go through the research on each of the IH dimensions in order:
INTELLECTUAL HUMILITY FACTOR 1: RESPECT FOR OTHER VIEWPOINTS
HUMANS HAVE TWO COMPETING psychological motivations when it comes to group interactions.
The first is our need to belong.
We survived as a species by banding together in tribes and groups. Getting kicked out of the group means you might not make it. So built into our mental wiring is a deep desire to do what we can to stay in the group.
The second is our need to be distinct.
We’re more valuable to the group if we bring something unique to it. The one person who knows how to build a fire is more useful than the tenth person who knows how to pick berries. So our brains are always trying to maintain a balance between belonging and being unique.
This psychology gets complicated by another thing built into our brains: our threat detection system. The crude evolutionary story is that at a certain point the biggest threats to our survival were no longer big animals or bad weather. We’d conquered those by banding together. So after we overcame nature, our biggest threat became groups of other humans.
So we developed something called in-group/out-group bias.
Basically, every time we encounter a person, our brains decide very quickly whether that person is safe or not. Can we turn our back on this stranger? Or are they liable to club us for our woolly mammoth steak? Our brains decide this in less time than it takes us to think about it, and then we go on defense, ready for fight or flight.
The cues our brains use for this in-group/out-group categorization were useful thousands of years ago, but not so useful today. Those cues were:
Does this person look like me or my family?
Does this person talk like me or my family?
Does this person think and behave like me or my family?
If the answer is ever no, our brains go into threat prevention mode. We get ready to eliminate that person (fight) or avoid that person (flight) in order to survive.
Unfortunately, this is ancient brain wiring, so we all have this problem.
We categorize people like us into the safe group and treat them with respect—we listen to them and generally trust their intentions. And we categorize people not like us as unsafe. And we treat them and their ideas with less respect.
Research indicates we’re more willing to listen to advice from people from similar demographics to ourselves.
Fortunately, we’re also evolved enough to consciously override the fight or flight mechanism that happens subconsciously.
Our brains may flinch at foreign people and their different ideas, but after that we have the conscious capacity to decide what to do next.
Will we disrespect the out-group and take away their power?
Or will we pause, and consider them and their ideas?
The upshot is that when we encounter a viewpoint that doesn’t line up with what we currently think, we have an opportunity to evaluate whether we can learn and grow from it. But if we don’t have respect for things that don’t line up with our own thinking, it’s a nonstarter. We’ll be biased against the new information from the get-go.
So what exactly does it mean to respect someone with a different viewpoint? The concept of respect is generally framed in terms of what you don’t do, but it amounts to not taking away the person’s power to express themselves.
In other words, respect for other viewpoints includes:
Listening to viewpoints that are not your own without interrupting
Not disparaging or otherwise attacking the person behind any viewpoint, even if you don’t agree
Treating the person or viewpoint with the same kind regard that you’d treat your own ideas or self
In other words, respect is treating humans as inherently worthy of being considered no matter how good or bad we think their viewpoint is.
This is particularly hard to do when an idea we’re dealing with is abhorrent to us. Or when we’re dealing with a person who doesn’t have that same respect for others. It would be hard to sit down with Hitler and actually listen to his ideas without calling him an asshole. But you don’t have to agree with Hitler to be respectful. And you can even conclude that Hitler’s viewpoints are wrong and he needs to be locked up for his crimes, while still employing human respect.
Respect breaks down into two sub-categories:
Earned Respect is the kind of respect that we give people because they bring something valuable to the group. This is the kind of respect that people in our out-group can get from us—if they can prove they deserve it somehow.
Owed Respect is the default respect that we owe all human beings because they are humans. It’s being civil, listening, not being assholes to them. We tend to give more of this respect to our in-groups by default. Even if we’re generally disrespectful to everyone, we tend to give more respect to “our” people.
Neuroscience, psychology, and IH research show us a few hacks for getting Earned Respect for people we deal with in person. And they show us how we can be more humble with people or ideas we’re not dealing with face-to-face, by broadening our Owed Respect to generally include more kinds of people.
*
Here are three quick ways to generate respect for specific people who make us flinch, or whose ideas make us scratch our heads:
Dr. Jonathan Haidt of NYU (author of The Righteous Mind and the new bestseller The Coddling of the American Mind) is one of the pioneers of research on “moral psychology.”
His research on Moral Foundations digs into the underlying morals behind humans’ decisions. It explains in large part why good people can disagree so viciously on things like religion and politics.
In other words, it explains why I hear my good-hearted politically conservative, Mormon and Protestant friends back home in Idaho say the same thing that my good-hearted liberal, Atheist and Agnostic friends in New York say about them: “I can’t believe someone could believe in that!”
Haidt’s research says that we can develop respect for differing viewpoints if we make the effort to unearth their moral motivations.
Few people actually think of themselves as evil. So, unless you’ve got something wrong in your brain (e.g. you’re a malignant narcissist or a psychopath), you will tend to justify your decisions to help you feel like a “good” person. Under the surface, you’ll create good reasons for what you think.
Moral Foundations theory says humans share at least six innate moral foundations that serve as the universal building blocks of morality. For the most part, evolutionary psychologists and spiritual belief systems agree on these. They are:
Care. Being kind and preventing harm.
Fairness. Justice, and not cheating people.
Loyalty. Patriotism and self-sacrifice for the group, not betraying the group.
Authority. Deference to legitimate authority for the good of the group.
Sanctity. Striving to be noble, clean, and not contaminated.
Liberty. Rights, freedom, rejection of constraints and of oppression.
Studies show that we give our in-group lots of benefit of the doubt because we think “they’re good people.” We understand their underlying morals.
But we don’t afford our out-groups the same benefit of doubt. They might be “bad people,” so we justify not respecting them. (You can see this any time someone calls someone else a “liberal” or “right-wing” in a derisive way and implies through the label that the person is evil and therefore what they say is suspect.)
So, Haidt says, when we’re dealing with others, it pays to step back and identify the underlying morals they are operating from. Once we can isolate the moral values driving someone to think what they think, we can more easily respect them even if we disagree.
As a hypothetical example: Let’s say that my buddy back home and I disagree on a charged topic—like, what to do about immigration to the US.
Now, my buddy might make some common anti-immigration arguments about crime and economic impacts. He may say that people sneaking into the US burdens the system, and breaking the law to get here is wrong.
I might make a pro-immigration argument, saying it’s wrong to prevent people from living where they want to live. I may say that our immigration laws are unnecessarily cruel. I may point out that my best friend, my girlfriend, and my roommate are all immigrants, and that they make my life and this country better.
Underneath, what each of us is really doing is using post-hoc justifications to back up a moral intuition that we value most. And so as the argument continues, we’ll trot out statistics or stories that confirm our biases. We’ll tune out inconvenient evidence that calls our particular stance into question. The fact that my Brazilian best friend pays hella taxes, and my Guatemalan girlfriend makes everyone around her a better person might be dismissed by my buddy’s anecdote about a foreign gang member shooting someone in Texas.
We may not even realize it, but we’re not respecting or considering each other’s arguments while we’re so busy defending our own. This conversation likely won’t go anywhere, and is likely to leave us disliking each other.
But say we forced ourselves to dig out the moral motivations behind our immigration stances. We might end up unearthing this:
My buddy values Fairness and Authority above all else. So he thinks it’s not fair that some people can break the law and get away with it (entering the country illegally). Even if the law is a little cruel, breaking the law is a betrayal of society. And he thinks it’s not cool to disrespect the Authority of a country by breaking its laws, even if the law is not cool. Finally, my buddy might be worried about the Sanctity of the country. Letting in anyone means we might let some bad guys in, too. It’s good to not risk contaminating the swimming pool, he’d say.
Once we unearth this, I can acknowledge that my buddy’s motivations are good, even if I disagree with his conclusions. After all, I can get down with Fairness and Authority too. Even though those aren’t my primary morals, I understand that my buddy is coming from a place of trying to do the right thing.
In contrast, I can help him see how I value Care and kindness above all else. If he’s listening, he’ll agree that that’s a good thing, too. I can explain how I think we should treat people like they’re valuable no matter where they were born. This explains why I think restricting immigration the way we do is unkind. And he might be surprised to discover that I also value Fairness. The way I see Fairness in the case of immigration is that it’s not fair to tell one human they can live here and another they can’t. We don’t choose where we were born, and I think it’s unfair to restrict someone for that.
So we both value Fairness, we just apply it in different ways.
Once we unearth these moral foundations, even if we still don’t agree on a conclusion, we have earned respect for each other’s viewpoints. I see my buddy as a good person. He has good moral motivations behind his arguments. And he sees the same in me.
This means we might be able to have a more productive conversation about what to do. We might just be able to employ the next three parts of IH and get somewhere together.
As Dr. Haidt summed it up in his first TED talk: “A lot of the problems we have to solve are problems that require us to change other people. And if you want to change other people, a much better way to do it is to first understand who we are—understand our moral psychology, understand that we all think we’re right—and then step out, even if it’s just for a moment, step out of the moral matrix, just try to see it as a struggle playing out, in which everybody does think they’re right, and everybody, at least, has some reasons—even if you disagree with them—everybody has some reasons for what they’re doing.”
In the final chapter of Dream Teams I explore one of the most surprising recent discoveries in neuroscience: how stories help our brains develop empathy.
The short version of the science is this: Our brains pay special attention to stories, engaging more areas of the mind than when we hear or see facts. And when we learn a good story, our brains synthesize the neurochemical oxytocin. This helps us feel others’ emotions and empathize with them. Scientists have shown that high oxytocin levels—whether we snort it or get it naturally through hearing a story—lead us to donate more to charity, be more interested in people’s well-being, and have more respect for “others” who aren’t like us.
As Dr. Paul Zak, one of the world’s leading oxytocin researchers put it to me in an interview: “Oxytocin melts the in-group, out-group divide.”
In other words, if we want to develop earned respect for someone, it’s a pretty good idea to sit down and hear their personal story.
In recent years, companies as big as BlackRock (the world’s largest investment management firm) have caught on to this. They’ve started using personal storytelling as a way to get people to get along better when they don’t see eye to eye at work. Importantly, in these “storytelling interventions,” people are encouraged to identify the emotions they felt in their stories. This helps generate even more of that oxy. (tocin, that is!)
I experienced this effect a few years ago at my last startup company. We had hired a VP to run sales, and after a few months it became clear that she and I did not see eye-to-eye on some things. I soon found myself trying to find fault with anything she proposed. I questioned her motivations. And I am ashamed to admit that I even started treating her rudely in meetings and emails.
Things changed dramatically after I somehow ended up at a dinner at this VP’s house. As I remember it, I mentioned at work to the team that I wasn’t going home to Idaho for Thanksgiving, and she extended an invite to me and whoever else didn’t have a place to go. I felt like I couldn’t say no, so I showed up. And at dinner, I met her sister. I saw her baby pictures. We cooked together. We sang karaoke in the living room. I learned her story of growing up in the south, how her father was a captain in the Air Force (just like a family member of mine), and how much she loved and missed her family.
After that, it was like a switch had flipped. I found myself saying hi to her at work and actually being happy about it. I started considering her ideas in meetings, backing her up in person and standing up for her when she wasn’t around. We still were very different, but she had turned into someone who I respected—and I ended up learning from her a great deal.
I’m going to her wedding in the Spring. All because I learned her story.
BlackRock bigwig Jonathan McBride (formerly the head of staffing for the Obama White House) put it to me well in an interview last year. “You need people to care about each other,” he said, if you want them to respect their different viewpoints. “And how you get people to care is through emotional narrative.”
I once made friends with a scary homeless man in Philadelphia. (You can read the story in this free bonus chapter of my book.) All it took was a game of chess.
Whereas at first the man’s appearance made me not want to go near him—much less listen to anything he might have to say—after playing chess for an hour, I found that, inexplicably, I was no longer afraid of him. In fact, I decided I loved the guy. He had gone from my out-group to part of my in-group. I later learned from psychology research that this was precisely because we played together.
Researchers have found over and over that play builds bridges between people from different walks of life. It explains how anti-Semitism dropped in Argentina when Jewish kids started playing soccer with Christian kids. It explains how this 22-year-old rapper became real-life pals with an 80-year-old lady because of Words With Friends. And it explains how we can hack the in-group/out-group psychology and earn respect for people who aren’t like us.
In a nutshell, play and humor put us in a sort of “magic circle” where everyone who’s in on the game is psychologically “safe” for the moment. Subconsciously, play simulates a situation of anxiety, only our brains know there’s no actual danger. This is how we learn to handle stress, so when the danger is real we can handle our shit. Cats play with each other to learn how to hunt. Monkeys and lemurs play together in order to get less scared of other monkeys and lemurs.
When we step out of the magic circle, studies show that we’re more likely to respect the people we played with. This in turn helps us to respect their viewpoints.
For more on this, check this great post by Charlie Hoehn about using play to overcome anxiety, or Chapter 3 of Dream Teams.
*
We’re not always going to be able to deal with the individuals behind the viewpoints we encounter. So while it’s great to use the above tactics of unearthing moral foundations, learning each other’s stories, and playing together, what about when we come across information that we don’t like, presented by people we don’t know? Ideally, we ought to be able to explore ideas that don’t jibe with our own so that we can determine whether we need to change our minds because of it. But we can’t even dig into the validity of a viewpoint if we don’t have basic owed respect in the first place.
Building owed respect—i.e., expanding your in-group to include more of humanity, including people you don’t know—boils down to training your brain to see other people as part of your family, and therefore just as valid as you are.
In the same way that a big family like mine (I’m the first of seven kids) with lots of different viewpoints can sit down for Thanksgiving and be nice, a person who develops a broad in-group can sit down and listen and treat other people’s viewpoints more easily than others.
This translates to more possibilities for creativity (considering broader arrays of ideas increases our chances of coming up with new solutions to problems) and more productive collaborations.
The best way to build this kind of owed respect is to reinforce the idea in our brains that there can be more than one “right” way to do something. IH research indicates that there are three simple ways to do this:
Psychologists have found that people who have lived in other countries are more likely to be creative—which means that their brains are more open to considering ideas that are outside of the expected.
My IH study found that living in lots of different countries or states (enough that you’ve likely had to truly immerse yourself in cultures outside of your own) or living in another country for at least 3 months (enough to have to actually slot into the other country’s way of living and not just Vacation Mode), correlated with a small but real boost in Respect For Viewpoints.
Another surprise from the IH study is that people who read a book every month (or more), or people who watch a couple hours a day of TV, tend to score higher on Respect For Viewpoints.
Knowing the neuroscience of storytelling makes the likely reason for this obvious, because what is fiction, if not stories of people who aren’t like us? Those stories unlock empathy (hello oxytocin!) and reinforce the idea that other people can have valid lives and ideas even if they’re not like us.
A series of studies published in 2014 by a group of Italian psychologists found that reading Harry Potter significantly reduced people’s prejudices. High school and university students who read the books were more likely to have respect for people in their out-groups—in particular immigrants and refugees—than average.
Curiously, my IH study didn’t find any correlation between reading news and Respect For Viewpoints. News is good for being informed, but not for building respect, it would appear.
Brain scans show that multilingual people have physically different brains than people who speak just one language. And these studies show that multilingual people’s brains generally gain an advantage in problem solving and focus. People who can speak more languages generally gain the capacity to look at things from more angles, studies show, and they tend to have a higher chance of being more creative.
While there’s not much research directly studying the links between multilingualism and IH yet, an easy hypothesis to make from these observations is that the more your brain reinforces the idea that there’s more than one “right” way to speak, the easier it becomes to consider that there might be more than one valid way to think about other ideas, too. In other words, it’s not a stretch to say that having multiple languages in your head builds your respect for other viewpoints.
INTELLECTUAL HUMILITY FACTOR 2: LACK OF INTELLECTUAL OVERCONFIDENCE
CONFIDENCE IS A CRUCIAL SKILL for living our best lives. But it can be a double-edged sword.
Too much confidence in our own way of thinking can lead us to close ourselves to alternative ways of thinking. Intellectual overconfidence means we’re not as able to learn.
The brain-science of confidence is tricky, and we’re still learning about it. But what we do know is that confidence is associated with rewards. Our brains feel rewarded when we turn out to be right about something. So when we’re confident, our brains get ready to feel good about themselves.
The downside happens when our brains too closely associate being right with feeling good about ourselves. Most of us develop a habit of seeing ourselves as right as much as possible, so we can trigger that mental reward and feel good. And this reward can feel better than the reward our brain gets from feeling like it’s learning and growing—because after all, that can hurt.
Two extremely common things help us become intellectually overconfident:
The first is a bad social habit: the fact that we reward each other for being right. This reinforces the need in our brain to be “right at all costs.”
As Caroline Mehl, the director and lead author of OpenMind, a new evidence-based online training program for developing IH, points out in the course: “In many ways, our culture instills in us an aversion to being wrong. Universities reward students who have perfect GPAs and SAT scores, and social media enables us to curate the illusion of living a flawless and failure-free existence.”
We praise kids for getting the correct answers, rather than for exploring and questioning. We promote the employee who suggests the safe idea, rather than the people who push the envelope. (Studies show that people with creative ideas are much less likely to be put into leadership than people with run-of-the-mill ideas.) In politics, we reward the candidates for “winning” the debate, rather than learning and changing their minds in light of the debate.
The second, and trickier, thing that leads to intellectual overconfidence is our own success. The more we win at something, the greater the chance that we get stuck in our ways later, when we may need to pivot.
Researchers call this cognitive entrenchment. Dr. Erik Dane of Rice University explains in his seminal paper on the subject: “As one acquires domain expertise, one loses flexibility with regard to problem solving, adaptation, and creative idea generation.”
To help explain this concept, I’d like to introduce an analogy from Dream Teams called Problem Mountain, inspired by the work of Complex Systems professor Dr. Scott Page from University of Michigan.
Any problem in the world can be visualized like a mountain range with lots of peaks of different heights. We can pretend that each peak represents a solution to whatever problem this is. In real life, most problems have several solutions, some better, some worse, and some MUCH better or MUCH worse. Kind of like this:
When we’re working on a hard problem, it’s like we’re hiking around the mountain range in the fog. We can’t see all the possible solutions, so we do our best to cobble together the people and information we need to find the best spot. Once we’ve found a mountain peak, we have to figure out whether we’re able to (or whether it’s worth it to) climb down and find a better one.
What often happens is somebody finds a really good solution to the problem, and it’s like they’ve climbed to one of the high peaks on the range:
And then they tell everyone about it, and soon we’ve developed a “best practice” for how to attack this particular problem / mountain range.
Unfortunately, our success here—and every reinforcement that this is the best way to tackle this mountain range—makes it less likely that we’ll be able to see other solutions that might be out there. In this case, there might be a better mountain that we don’t see. We can’t think of a different way to explore this range.
Or in another case, pretend we’ve made it all the way to the best peak on the range…
But then an earthquake happens elsewhere, and a bigger peak forms. When we’re entrenched on our own mountain peak, it becomes unlikely that we’ll explore past it and find the new one, because hey—we really killed it up here.
And this is how disruption happens:
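If you like, you can see Problem Mountain as the “rugged landscape” from complex-systems research that inspired it: a greedy hill-climber that only ever steps uphill will stop at the first peak it reaches, which is cognitive entrenchment in miniature. Here’s a toy sketch (the two-peak landscape is invented purely for illustration):

```python
# A greedy hill-climber on an invented two-peak landscape.
# Like an entrenched expert, it never steps downhill, so where
# it ends up depends entirely on where it starts.

def height(x):
    # Two peaks: a small one near x=2 and a taller one near x=8.
    return max(0, 3 - (x - 2) ** 2) + max(0, 7 - 0.5 * (x - 8) ** 2)

def climb(x, step=0.1):
    # Keep stepping toward the higher neighbor until neither side is higher.
    while height(x + step) > height(x) or height(x - step) > height(x):
        x = x + step if height(x + step) > height(x - step) else x - step
    return x

print(climb(1.0))  # settles near the small peak at x ~= 2, never sees x = 8
print(climb(9.0))  # settles near the tall peak at x ~= 8
```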
People don’t like to admit they’re wrong for several reasons. Among them: it hurts, it lets others off the hook, and it makes us appear weak. But perhaps most of all, we don’t admit we’re wrong because we don’t think we’re wrong. We haven’t allowed for the possibility that we don’t know everything. And one of the best ways to put ourselves in this position is, ironically, to do something that works. To become successful.
Research indicates that simply acknowledging our fallibility helps us to be more open to following arguments wherever they lead, without pre-judging them based on our own knowledge. It primes us to be more humble. But that kind of acknowledgement can be hard in a world where we’re rewarded for being right.
Fortunately, we can turn this need to be right to our advantage with a couple of hacks:
Ben Franklin was often the smartest person in the room, and he knew it. But he was also smart enough to know that even he would not always be right. So he developed a little trick to help him overcome his overconfidence.
Whenever he was about to engage in an argument, he would start by saying something to the effect of, “I could be wrong, but…” and then smash you with his argument. He’d also avoid using any absolutes, like “always” or “certainly” or “never.”
In this way, he would avoid putting his debate opponent on personal defense. But even more important, this became a clever way for him to leverage his confidence in a way that still allowed him to learn.
After all, if some new argument or bit of information convinced him that he was wrong, Ben could still say he was right. Because remember? In the beginning he said, “I could be wrong.” And it turned out he was right about that!
Over the last few years, researchers like Carol Dweck have shown the importance of having a “growth mindset,” or believing that we have the ability to get smarter. If we believe that intelligence is fixed, it becomes psychologically painful to admit we’re wrong, because that’s like admitting we’re stupid forever. So we often compensate by doubling down on being right, which leads us to become further cognitively entrenched.
There’s a one-word hack that helps us overcome that. It’s a word that I associate with my girl Hermione Granger from the Harry Potter series.
And that word is: yet.
Hermione was a know-it-all. But she was also an avid learner. If there was something she didn’t know, she didn’t throw a fit; she decided to learn about it. And if she happened to be wrong about something, instead of feeling stupid, she would decide that she just had incomplete knowledge. She could say, “I don’t know… yet.”
Yet allows us the possibility of changing our minds without feeling bad about ourselves.
INTELLECTUAL HUMILITY FACTOR 3: SEPARATING EGO FROM INTELLECT
WHEN IT COMES TO ego and IH, it’s helpful to tease apart the different ways people define ego itself:
Colloquial definition of ego: “a person’s sense of self-esteem or self-importance.” E.g. “a boost to my ego.”
Psychological definition of ego: “the part of the mind that mediates between the conscious and the unconscious and is responsible for reality testing and a sense of personal identity.”
Philosophy / metaphysics definition of ego: “a conscious thinking subject.”
What we mean when we talk about the “bad” kind of ego: I think psychologist Dr. Scott Barry Kaufman puts it well when he says ego is “that aspect of the self that has the incessant need to see itself in a positive light.”
(Note: This podcast episode on ego with my pal Ryan Holiday is definitely worth checking out.)
Ego is the thing that makes you think you are you, separate from other things. It is inherently self-focused. And though your ego isn’t necessarily smarmy or counterproductive, it often is.
Separating Ego from Intellect is about not making ideas about you. It’s about not feeling threatened by disagreements, dissonance, or ambiguity. It means not making ideas personal.
One of the hardest things about this is how often other people make things personal when we disagree. Being able to recognize when this happens and not reflect the personal-ness back is a difficult but valuable skill.
Dr. Haidt’s work shows that not only are we good at finding arguments that support our existing beliefs, but we actually tend to form our beliefs first and then justify them post-hoc.
We do this for many reasons. But one of the big ones is that we tend to attach our beliefs to our identity (ego), which means it’s psychologically painful to reconsider them. Questioning something that is core to your identity is like questioning your identity itself. And that’s the worst.
So we invent our justifications to support the things we hold close and personal. Separating Ego from Intellect is about making ideas about ideas, and not about us. Easier said than done!
Here are some tactics that can help with ego separation:
By definition, it’s going to be difficult to intentionally separate your ego from ideas if you don’t know how to spot your ego in the first place. And in my own experience, getting to know my ego’s tendencies has been difficult, but extremely useful for my everyday life.
As former monk Hari Prasada, co-founder of Upbuild and instructor for a workshop called Excavating Your Ego, puts it, “When we confront our ego it is a rude awakening. But this necessary suffering will help us avoid a lot of unnecessary suffering.”
Upbuild teaches workshops that help people identify the components of their egos. Their goal is for you to understand your motivations, triggers, and the social masks you put on when your ego feels threatened.
Taking a long hard look at my own ego tendencies helped me wake up to my tendency to “achieve at all costs”—which makes me unnecessarily competitive and image-conscious—and to see that my ego rears its head primarily when I feel ashamed, rather than when I feel angry or afraid. It also helped me understand the paradox between those ego-driven tendencies and my tendency toward generosity in good times, and irrational people-pleasing in my worst times.
This explains why keeping a daily journal of my white lies this summer helped me take more control of my ego. (I learned to spot the exaggerations that were purely about my image as an “achiever” and stop myself from giving in to the temptation.) It also explains why a lie journal might not work for you if, say, your ego gets triggered by the fear of disloyalty rather than by shame.
Look for the signs, in yourself and others, that a conversation has veered from the realm of ideas and into the realm of the personal. If it helps, watch some debates on CNN and look for the following:
Defensive body language like tenseness, sudden arm crossing, or other protective posture
Self-justification using “I feel” statements instead of “I think” statements
Getting emotional (which usually doesn’t happen if you don’t personally feel threatened)
Starting to fight dirty
Ad hominem arguments, or when someone switches to attacking a person’s character as a means of discrediting their ideas
When you notice these things, pause, identify that it’s gotten personal, and rewind.
It’s much easier said than done. But identifying when others make ideas personal is a good way to build the muscle of ego separation yourself.
Many meditation practices are built around the idea of noticing and identifying emotions without acting on them. Being able to do this in the course of life can help with ego separation. The ego is the part of us that feels emotional, so identifying our emotions can help flag when the ego has gotten involved in the conversation.
Further, research shows that having more words to describe our emotions helps us regulate them better. If we’re able to consciously identify competing emotions (e.g. bittersweetness), we’re likely to be able to better identify when ego is getting in the mix.
This classic essay by Paul Graham makes a compelling case for “keeping your identity small” when debating and exploring ideas.
A lot of our language actually reinforces the bond between identity and ideas—in English especially. We say, “I’m a Liberal” or “I’m a Republican.” That’s attaching our ego to our ideas, which means it will be painful to change our minds about them. (Instead, say “I currently tend to side with the Republican platform on this issue.” It takes longer, but it leaves you room for nuance, or to change without much ego pain.)
Research on psychological priming indicates that people who are reminded of their association with a particular group are likely to act or argue with a bias toward that identity in mind. You’re likely to prompt a different response with the question, “What do you think?” than with the question, “As a man, what do you think?” Invoking identity this way invites people to put a subjective hat on, and makes it riskier to be wrong, because you’re now “representing” an identity group. This makes it harder to revise our viewpoints, and makes us more likely to fight dirty.
Quieting the ego is a concept recently promoted by psychologists Heidi Wayment and Jack Bauer in their book Transcending Self Interest, and explained magnificently in this recent Scientific American article by Dr. Kaufman.
Quieting the ego involves four things:
Detached Awareness. Essentially, this could be described as mindfulness. This is a reason that mindfulness meditation, which helps us build the habit of observing things patiently and without intervening, is so good for ego management.
Inclusive Identity. Seeing all other people as part of our “team” helps us not take things personally. (For more on how to do that, you can check out that book I keep mentioning. :))
Perspective Taking. Seeing things from other people’s point of view helps us focus less on our own identity. (The OpenMind program I mentioned before has a very good section on learning to do this.)
Growth Mindedness. Maintaining a focus on learning. Check out Carol Dweck’s book Mindset. (It also appears on the list of book recommendations from guests on The Tim Ferriss Show, and is recommended by Tony Robbins, as well as polymath Josh Waitzkin.)
Writes Dr. Kaufman, “I don’t think it’s an overstatement to say that the cultivation of these skills in our society would lead to greater mental health, useful reality-based information, as well as peace and unity among humans. Instead of destroying each other how about we learn from each other?”
I’m obliged to start with this warning: Don’t try this at home. But honestly, the quickest way to improve this aspect of ego is probably via a supervised psychedelic trip. Clinical trials show that people come back from psychedelic therapy reporting greater connectedness to others and less self-centeredness.
For everything you want to know on this, check out Michael Pollan’s excellent book How To Change Your Mind.
INTELLECTUAL HUMILITY FACTOR 4: OPENNESS TO REVISING VIEWPOINTS
IF WE CAN MASTER the above three dimensions of IH, this one follows pretty naturally. Changing our minds requires us to consider other viewpoints, acknowledge we could be wrong, and not take ideas personally. At that point, revising our viewpoint is almost a piece of cake.
My IH research, supported by various studies in psychology and neuroscience, shows us one very straightforward thing we can do to build up our ability to revise our viewpoints: Travel. Either physically, or through fiction.
IH research makes the benefits of travel quite clear: people who have traveled to 2 or more countries have higher Openness to Revising Viewpoints.
Research by Dr. Adam Galinsky of Columbia University and several of his colleagues finds that people who travel become better at “idea flexibility” or being able to solve problems in multiple ways. Travel, their studies show, also “helps overcome functional fixedness.”
In other words, being away from our safe home turf makes our minds more receptive to rethinking our old ideas.
Why is this? Why is this? What’s happening in our brains when we’re in travel environments?
The first thing that occurs is the same thing that makes it easier for people to swear in a foreign language than their own. Research from University of Warsaw finds that our “home” language is more emotionally connected to our identities, but when we speak in foreign languages, we become psychologically disembodied.
When we’re in unfamiliar places, the lens through which we see the world becomes less connected to our precious identities.
This may explain why, in my study, fiction readers were over 40% more likely to score high on this dimension of IH. Psychology research further shows that fiction readers have less need for “cognitive closure,” or, in other words, they are less likely to need to set ideas in stone than people who don’t read much.
Dr. Zak’s neuroscience research on storytelling has shown that the most powerful format for generating “immersion,” or full engagement of our brains (which includes oxytocin synthesis), is video. So it’s not a big stretch to hypothesize that fictional television is helpful for developing IH, too. Though I didn’t ask them what TV they watched, people who said they watch 1-2 hours of television a day tended to score higher in IH than non-television watchers. Polls like this one from The Hollywood Reporter, which show that a high percentage of people who changed their minds on gay rights attribute their shift directly to Glee and Modern Family, seem to support this conclusion.
Traveling opens the door for something interesting to happen.
There’s a concept in psychology called “Balance Theory,” first described by psychologist Fritz Heider, that explains why we like or dislike people and things by association, and how travel can help us change our minds about things that are connected to our identities.
Balance Theory says that our brains don’t like inconsistency. So when something is out of balance, our brains adjust to put it back.
It’s like this:
Say you have two beliefs. You think Triangles are good. You think that Octagons are good. You find out that Triangles are big fans of Octagons. This is great news for you, because you like them both. You are in balance.
However, if you find out that Triangles think Octagons are bad, you become out of balance. You can’t think highly of both Triangles and Octagons if Triangles think Octagons are bad.
This will bother your brain until one of two things happens: either you’ll decide that Octagons must be bad, or you’ll decide that you’re wrong about Triangles.
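If it helps to see the rule written out, here’s a minimal sketch of Balance Theory’s sign logic in Python. (The +1/-1 encoding and the function name are my own illustration, not something from the research.)

```python
# A toy version of the Balance Theory sign rule: encode "like" as +1
# and "dislike" as -1. A triad of relationships feels balanced when
# the product of its three signs is positive.

def is_balanced(you_like_a: int, you_like_b: int, a_likes_b: int) -> bool:
    """Return True if the three-way relationship is consistent."""
    return you_like_a * you_like_b * a_likes_b > 0

# You like Triangles (+1), you like Octagons (+1),
# and Triangles like Octagons (+1): all is well.
print(is_balanced(+1, +1, +1))  # True

# But if Triangles think Octagons are bad (-1), the triad breaks,
# and your brain will itch to flip one of the other signs.
print(is_balanced(+1, +1, -1))  # False
```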
When we embed ourselves in places with a different culture than our own, we often get confronted with paradoxes about our beliefs. We realize that the way we have balanced the world may not actually line up. We’re faced with the choice of either rebalancing the equation, or adding more nuance to our balance charts by dividing them up into smaller, more specific loops, as in the sketch below.
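Sticking with the shapes, that “dividing up” move might look something like this in the same toy notation (again, purely illustrative):

```python
# Instead of flipping a whole belief, you can restore balance by
# splitting a too-broad category into more specific sub-categories.

# Out of balance: blanket beliefs, plus "Triangles dislike Octagons."
beliefs = {"Triangles": +1, "Octagons": +1}

# More nuance: divide the category so each smaller loop can balance.
beliefs = {
    "Triangles who respect Octagons": +1,
    "Triangles who bash Octagons": -1,
    "Octagons": +1,
}
```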
Laboratory research shows that many of the things we are biased about—the negative associations we have with things—are the result of our brains trying to balance things out with our prior beliefs, not the result of rational arguments.
When we travel outside of our home environment, our brains can start to let go of the barriers that prevent us from properly analyzing the information for or against our prior beliefs.
This can lead to a couple of results. Since we’re less tethered to our egos, travel experiences can make it easier for us to change our minds about things.
And visiting unfamiliar places primes us to decouple things our brains automatically link via Balance Theory. It helps us decategorize people and cultures, and separate individuals from stereotypes.
And this was what ultimately happened to Malcolm X.
*
IN 1964, MALCOLM X decided to do something he had aspired to do for a long time—make a pilgrimage to the holy city of Mecca.
The pilgrimage involves journeying through the desert of Saudi Arabia and re-enacting rituals established by the prophet Muhammad, in remembrance of Abraham and his family. It culminates in gigantic gatherings of Muslims from around the world praying together in harmony.
Malcolm was incredibly moved by the experience.
“Islam brings together in unity all colors and classes,” he wrote in his diary.
“Everyone shares what he has, those who have share with those who have not, those who know teach those who don’t know.”
He saw pilgrims of every skin color treat each other with kindness.
He saw people who in America would be classified as white, but who “were more genuinely brotherly than anyone else had ever been.”
And he saw brown and black people smiling and praying with them. It was like they even liked each other!
And when he saw this, his heart started to soften.
After decades of preaching division and violence, and years of opposing the Civil Rights cause of integration and peace, Malcolm said this:
“I have eaten from the same plate, drunk from the same glass, slept on the same bed or rug, while praying to the same God… with fellow Muslims whose skin was the whitest of white, whose eyes were the bluest of blue… [for] the first time in my life… I didn’t see them as ‘white’ men,” he wrote.
It “forced me to ‘rearrange’ much of my own thought-pattern and to toss aside some of my previous conclusions.”
It was impossible to hold onto the idea that an entire race of people were all bad after hanging out with and feeling connected to so many good people from that race. The only choice Malcolm had was to either decide that Muslims were bad, or that he was wrong in his generalization about white people. In other words, he had to break apart the elements of his Balance Theory loop into more nuanced loops.
Malcolm changed his mind because he generated earned respect for people he hadn’t respected before.
He traveled to a place where he was a cultural fish out of water, which made it easier to separate his experiences from his ego and “home.” This in turn made it easier to revise his viewpoints on race, equality, violence, and rights. It helped him see that the struggle for civil rights was actually part of a broader struggle for human rights.
And that epiphany released him from the confines of his previous beliefs.
After Mecca, Malcolm decided to spend some time living in Africa. As IH research would have predicted, this further opened his mind up to rethinking his ideas and changing his life.
Malcolm X is not the only person to have developed and exercised intellectual humility after traveling. But I love his story because the things he changed his mind about were extremely tricky, and totally connected to his identity. If he can do that, so can we.
And here’s a small but interesting parallel to this story:
A half-century after Malcolm X, a young white man named Derek Black, the godson of former KKK leader David Duke and a golden child of the white nationalist movement, underwent a similar change of heart about race. He converted from white supremacist to champion of equality. The catalyst? You guessed it: moving by himself to a new place (college, across the state) and living in a different culture, where he made friends with Jewish and Peruvian students.
If you think about it, how many of us have changed our core beliefs—and not just about race or social issues—after moving away to college, or spending a semester abroad, or taking a journey across the country in a beat-up station wagon? I certainly did when I moved from Idaho to Hawaii, and then again when I moved to New York City.
There’s magic in traveling away from our home. Whether the adventure takes place in a book or IRL, a journey opens a door for us to change our minds if we need to.
As Malcolm’s daughter said of her father, “The more he traveled, the freer he became. The freer we all became.”
HOW THE HABITS OF INTELLECTUAL HUMILITY CHANGE EVERYTHING
IMAGINE IF EACH OF us could do what Malcolm X did, and “‘rearrange’ much of [our] own thought-pattern and toss aside some of [our] previous conclusions” whenever it was important to.
Imagine all of the things that would change if we all could do that—if we all got a little more intellectually humble.
Imagine if our elected officials were graded on their ability to take in new information and adapt—and so they prioritized IH over being right.
Imagine if the arguments they had in Congress were about building on each other’s different perspectives and coming up with better ideas than any one side could come up with on its own.
Imagine how much better our family lives, our relationships with our neighbors, and our collaborations at work would get if everybody gave each other a little more respect, and if everybody made their ego a little quieter.
What would political pundits even have to talk about at that point?
As Caroline Mehl recently put it to me, if everyone took a course in IH (like the OpenMind program she created with Dr. Haidt), we’d have a world “in which people are less susceptible to biased and tribal thinking, and are more tolerant of moral, political, ethnic, religious, and cognitive differences.”
The cool thing is that even just learning about IH helps us increase it.
If you were to take the Open Mindedness Assessment below before and after reading this post, you’re pretty likely to score better the second time.
OpenMind’s data shows that “48% of people were more intellectually humble” after taking a quick course about it. Even more impressively, 70% of people who learned about both IH and Haidt’s Moral Foundations Theory became less polarized against political opponents.
In a pinch, the tips in this post can help us operate day to day with more IH. But going places we’ve never been—and even better, spending time living among other cultures—may just be the biggest hack of all.
So give yourself a little permission to explore.
Learn about things outside of your field. Read a novel. Watch Stranger Things. Take that vacation to Tokyo. Even better, schedule a sabbatical in Mexico City and train your brain to see things in new ways.
A lifetime of intellectual humility will be well worth the cost.
Take an official intellectual humility self-assessment here »