Last week I bought Robert Sapolsky’s new book on free will, or perhaps more accurately, on our lack of it. I’m not finished with it, but the book promises to make a comprehensive argument that free will is an illusion and that our biology and environment determine our life outcomes.
The strongest argument I’ve heard against Sapolsky’s thesis is that it doesn’t match our intuition, that life just doesn’t feel that way. We feel that we are in control, that we are the authors of our own experience. We’ve built our institutions around this intuition: governments, religions, meritocracies, prisons, and so on.
But the intuition account is not a strong argument. Myriad discoveries we’ve made run counter to intuition. For a long time we thought the world was flat because it looked that way. Is space-time intuitive? Try asking a physicist to explain quantum theory. What about the self? Our intuition is that we have a self. That it’s an actual thing we can know. But close inspection reveals that the self is just a thought, appearing in our brains like any other.
Intuition is useful, but it has flaws. It makes mistakes, many of which are systematic and predictable. And we shouldn’t allow it to get in the way of facts and reason.
A recent study makes it painfully clear just how destructive our cognitive flaws can be when politics is involved, and how they can warp both our intuition and our reasoning. Using a rather elegant set of experiments, the researchers showed that activating political identity, whether conservative or liberal, impaired participants’ reasoning. On issues like gun control, immigration, abortion, marriage equality, capital punishment, and the Affordable Care Act, participants were less able to detect flawed reasoning than they were with similarly structured arguments on neutral topics. Critically, the researchers point out that “individuals on the liberal side of the spectrum fall prey to this handicap just as much as individuals on the conservative side.”
When we encounter a politically charged topic, it activates our tribal affiliation, and our intuition tells us we must represent or protect our tribe. Once in this state, our capacity to think clearly declines. “We align our conclusions with our beliefs,” as the researchers put it.
A mentor once told me, “You don’t really know something until you can teach it.” More recently, a wise 14-year-old explained to me what happens in high school debate: you have to develop the best arguments you can for both sides of a proposition. Getting good at this takes discipline; you need to override some of your default settings. This sort of recalibration and awareness takes practice. It requires you to be charitable and ask yourself at least two questions:
How might I be wrong?
What’s the most charitable interpretation of what that person I think I don’t like is saying?
Wisdom is difficult to define, but this seems an important ingredient. Being open and curious enough to at least attempt to fully comprehend the best arguments against your position will help you understand your own beliefs all the more clearly. It’s an exercise worth doing.
—
I’m away for a few days and won’t publish a post next week. If you find these posts useful, please consider recommending A New Angle to a friend. And perhaps consider supporting us further with a paid subscription.
Thanks!
jA