Take a moment to try this exercise:
On a scale from 1 (you can't tell the tank from the seat) to 10 (you're practically a plumber), how well do you think you know how a toilet works?
Now explain how it works.
Now rate your knowledge of how it works again.
We bet that your rating is lower the second time around—after explaining the mechanism—than it was initially.
This is called the illusion of explanatory depth: the idea that we think we know more than we do. Explaining how something works makes us realize that our feeling of knowledge exceeds what we actually know. Deliberating forces us to reflect. We come to appreciate how complex things really are.
In their book The Knowledge Illusion, Steven Sloman and Philip Fernbach explain how human knowledge is a "community of mind." Humans divide up cognitive labor and rely on specialists: people who know things. We rely on others for knowledge, and we collaborate to get things done. For this to work, we need a common goal or purpose.
When we share a common goal, we can share our knowledge: we have a place to put our own knowledge, and we can reason about someone else's. But, as Sloman and Fernbach explain, the illusion gives rise to a problem: we fail to distinguish between the knowledge in our own heads and the knowledge in someone else's. As the exercise above demonstrates, we have a strong intuition that we know something, but usually we know less than we feel we do.
We are cognitive team players. Because our thinking is done in communities, it doesn't matter who knows something; what matters is that someone does and that we feel confident we can access that knowledge. But the knowledge illusion tells us that when we work with others, our intuitions can lead us astray: we think we bring more to the table than we do.
So we are left with a powerful and humbling insight: we rely on our communities for our knowledge. And because we are mostly unaware of when we outsource our thinking, we have limited awareness of the situations in which our own intuitions are wrong.
For intuition to be reliable, we need timely feedback on our judgments, a relatively stable environment, and time to reflect. Humans are unique in our ability to think about our thinking; this metacognition is essential for learning.
Self-reflection is human and gives rise to powerful emotions, such as regret. Only humans can experience regret. A dog might feel disappointed if it doesn't get a treat, but it doesn't have the capacity to reflect on what it could have done differently. Regret is a metacognitive process: reflecting on it involves thinking about thinking, counterfactual reasoning, and forecasting difficult emotions. Not everyone uses regret as a platform for making better decisions in the future. But if someone tells you they live life with no regrets, you can reasonably wonder about their metacognitive skill level.
Great Human Strength: We can build, share, and store knowledge in our community. We can accept feedback from someone we trust who knows us well and has our interests at heart. We can learn from our choices and choose to be better.
Great Human Weakness: We can think we know more than we do. We are prone to overconfidence in our own knowledge and judgments.
Machine Opportunity: Designs that prompt us to reflect on what we know versus what others know.
Machine Threat: Designs that encourage us to overestimate the state of our knowledge.