Last week I posted about avoiding faulty thinking, drawing on Chip and Dan Heath’s book Decisive: How to Make Better Choices in Life and Work (Random House Digital, Inc., 2013). Here’s another great book on decision making; if you haven’t read it, I suggest you do. Nobel laureate Daniel Kahneman writes in Thinking, Fast and Slow (Farrar, Straus and Giroux, 2011):
My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely.
Kahneman simplifies the mind’s decision-making process by dividing it into Systems 1 and 2.
- System 1 is fast, routinely guiding our thoughts and actions—and it’s generally on the mark. Our associative memory maintains a richly detailed model of our world, as well as a vast repertoire of skills acquired over a lifetime of practice. This allows us to produce remarkable solutions to everyday challenges.
- System 2 is slow. It represents our rational self (who we think we are). It articulates judgments and makes choices, but it often endorses or rationalizes ideas and feelings generated by System 1.
But System 2 isn’t merely an apologist for System 1; it also prevents many foolish thoughts and inappropriate impulses from becoming overt expressions. System 2 is not always rational, and we don’t always think straight. We often make mistakes because we don’t know any better.
System 1 simultaneously generates answers to related questions, and it may substitute a response that comes to mind more easily for the one that was actually requested. By using heuristics, it quickly provides probable answers that are often correct—but sometimes quite wrong.
There is no way for System 2 to know whether a System 1 answer is a skilled response or a heuristic shortcut without slowing down and attempting to construct an answer on its own. But that is a slow and arduous thinking process, which the brain resists.
And so we are prone to thinking errors. System 1 is not readily educable. The only recourse is to recognize that you are in a cognitive minefield, slow down, and call in System 2 reinforcement.
No warning bell rings. “The voice of reason may be much fainter than the loud and clear voice of an erroneous intuition, and questioning your intuitions is unpleasant when you face the stress of a big decision,” according to Kahneman. “More doubt is the last thing you want when you are in trouble.”
It’s usually easier to spot a minefield when you watch others wander into it. This is why smart leaders work with senior leadership teams and executive coaches: observers are less cognitively mired and more open to information than those who are intensely involved.