The more you think you know about something, the less you actually do. I just finished reading Think Again: The Power of Knowing What You Don’t Know, by organizational psychologist Adam Grant.
The book’s full of great stories and interesting facts (and great cartoons!). For example, here’s an exercise directly from the book:
Compared to most people, how much do you think you know about each of the following topics — more, less, or the same?
· Why English became the official language of the United States
· Why women were burned at the stake in Salem
· What job Walt Disney had before he drew Mickey Mouse
· On which spaceflight humans first laid eyes on the Great Wall of China
· Why eating candy affects how kids behave
On the questions above, if you felt you knew anything at all, think again. America has no official language, suspected witches were hanged in Salem but not burned, Walt Disney didn’t draw Mickey Mouse (it was the work of an animator named Ub Iwerks), you can’t actually see the Great Wall of China from space, and the average effect of sugar on children’s behavior is zero.
This pattern is known as the Dunning-Kruger effect, after research by psychologists David Dunning and Justin Kruger: when I lack competence in a particular area, I’m more likely to be overconfident about it. In the original study, people who scored lowest on various tests of reasoning, grammar, and the like had the highest opinion of their skills, believing they did better than 62 percent of their peers when in reality they outperformed only 12 percent.
It seems we think we know more than we actually do.
Why “What We Don’t Know” Is Important
The same week I finished reading Think Again, a friend recommended an On Being podcast interview with Daniel Kahneman, the Nobel Prize-winning psychologist — Why We Contradict Ourselves and Confound Each Other — about rethinking everything from what we think we know to how we choose to live our lives.
In his 80s now, Kahneman had me backing up several times to hear his wisdom and research again. Whenever I take a stand on something now, I recall his theory that the opinions we hold so dear are not based on reasoning. Rather, the opinion is already formed, and we look for reasons to support it.
Example: When you ask my opinion on climate change, Kahneman says the opinion I give you was formed early on because of who I am and other beliefs I hold, and the reasons to support it came later. Which is why it’s difficult to change someone’s beliefs with logical reasoning.
All of this is to say that we have to stay curious about our beliefs and be willing to re-examine them. And that talking to others about their strongly held beliefs is more about learning when and how they developed their views, why they think as they do — and what (if anything) would cause them to change their thinking. Clearly, pushing our message will never cause them to change theirs. They (or we) will only change our thinking if it makes sense to us to do so, not because it makes sense to someone else.
When in doubt, ask a (useful) question. And stay curious!