Recently, I spent over a year researching the little-known history of how the CIA used fake news to overthrow the government of Guatemala in the 1950s. That story itself is wild, and worth a read or listen. But one of the most fascinating elements to me about this story is how the top leaders of the United States—some of the smartest people in their fields—fell for false information themselves while running the caper.
The brief history is this: The U.S. Secretary of State, John Foster Dulles, and his brother Allen Dulles, the CIA Director, were the chief masterminds of the plot to overthrow Guatemala. They believed, despite plenty of evidence to the contrary, that the Guatemalan president was in cahoots with the Soviet Union and planned to turn the Central American nation into a beachhead for Communism in the Western Hemisphere. The two brothers also had a personal financial interest in the ouster of the Guatemalan president, which likely clouded their judgment.
But perhaps the most unbelievable thing was what the Dulleses did after they succeeded and replaced him with a leader of the U.S.'s choosing.
The CIA raided the Guatemalan ex-president’s home and offices... and couldn’t find the evidence of a Soviet connection that the Dulleses had hoped for. Yet instead of admitting that they’d been wrong, they ordered CIA agents to plant some evidence.
And they didn’t do this because they were worried about being caught on the wrong side of history (which they were).
Fudging some evidence would simply get the press off their backs, while they worked harder to prove what they were sure was true.
This isn’t an isolated kind of story.
I’ve worked with smart business leaders who have launched company initiatives and, in the face of clear evidence that their strategy was failing, have refused to try something else.
Startup founders will often pitch me ideas, and when I point out problems, they’ll argue hard that those problems don’t exist, because they badly want their idea to be right.
The 2018 documentary Behind the Curve captured this very human tendency when it followed people who believe the earth is flat as they tried to prove it with scientific experiments. When their experiments didn’t work (and instead indicated that the earth was indeed round), the experimenters decided, over and over, that something must have been wrong with their equipment. The film was so eye-opening because it showed that Flat Earthers were not just a bunch of low-IQ jerks; they were engineers and businessmen, generally well-educated, nice people who wanted to find the truth. But because they presumed they already knew what the truth was, no amount of contrary evidence could dislodge them. In fact, they worried that if they ever found out they were wrong, people would think they were stupid.
This is why dumb-shaming incorrect ideas doesn’t work.
It’s easy to disparage people who believe things that aren’t right. We call them stupid, and don’t feel that bad about it. It’s their fault, after all, right?
But if we put ourselves in the shoes of a person who, say, believes a false conspiracy theory, or can’t let go of a bad idea at work, or whose rationale for a strong opinion makes no sense to us—what we find in every case is something universally human:
Nobody actually wants to be wrong.
This is a key starting point if we want to understand how we all can fall for incorrect ideas.
Research and history are clear that how often we are wrong about things is not strongly correlated with our IQ. Smart leaders in business, politics, and the hard sciences are routinely wrong about things they strongly believe.
Show me a person who hasn’t worked for a boss who had bad ideas and I’ll show you a person who has just never had a boss.
In other words, what causes us to believe false things—and even double down on them in the face of new evidence—is often not stupidity. It’s often one of two things:
- We have ulterior motives that make us want something to be true so badly that we resort to intellectual dishonesty and fool ourselves about it; or
- We don’t want people to think we’re stupid, so we do mental gymnastics to justify what we've said we believe.
In the case of the Dulles brothers overthrowing Guatemala, ulterior motives, both political and financial, made them want to justify what they did. And because of the amount of fake history they generated to do so, we are still untangling the details of what exactly happened there 66 years later.
But in the case of your team leader who won’t let go of a bad strategy, or your eccentric uncle’s opinion that smoking is actually good for you (and, I suspect, in the Dulleses’ case as well), sticking to a wrong belief is also very much about the human desire to not be thought of as stupid.
So as leaders and teammates and family members, this is where I think our societal habit of treating people like they’re stupid when they’re wrong needs to be rethought.
As long as changing our viewpoint is tantamount to an admission that we are not smart enough, we will do anything we can to avoid it. We’ll ignore evidence, or twist logic, or lash out, or lie as a means to justify the end we hope is right.
And the smarter we are, the better we actually can be at that kind of self-deception.
But what if instead, we treated each other as if we’re each doing our best with the information we have? And then what if we treated changing our minds after being wrong in the past as a good thing?
What if believing something incorrect in the past and changing our minds was considered a strength?
What if we actually praised people for admitting they’d changed their minds, instead of calling them idiots or flip-floppers?
This kind of shift in attitude might not dislodge people from their ulterior motives—and I doubt it would have stopped the Dulleses from trying to overthrow all the countries they overthrew—but it would do a lot of good generally. Lord knows if we stopped using the label "stupid" in today's discourse, we'd have less doubling down on unproductive behavior and attempts to destroy one another, and we'd have more useful conversations.
I think that if we didn’t have so much social pressure to be right about everything, we’d be far better at steering our communities and companies. After all, in both organisms and organizations, raw ability is not nearly as valuable as adaptability.