Being a leader is hard. You know how rewarding it can be when it all comes together. You want more of those moments.
So we’ve assembled our favorite nudges from our book Make Better Decisions and paired them with some of the trickiest situations you face as a leader:
How to Balance Stability with Responsiveness
How to Bring Together Diverse Viewpoints
How to Structure Problem Solving and Know Which Decisions Matter
You’ve set a plan. It’s well thought out, rigorous, robust, and data-driven. Then new information comes in that makes you question what you know. Do you adjust course or stay the course?
For example, your product strategy is to develop an app that helps late-stage investors perform better due diligence, but early customer feedback hints that early-stage investors are more active users. Should you pivot to focus on them instead?
One of the most important jobs you have as a leader is to make decisions such as this. The difference between making a good decision and making a bad decision is balancing stability with responsiveness.
Responsiveness means recognizing that the world has changed and you need to change with it. But stability is valuable too: people understand the goal and can continue working towards it.
This balance is tricky because it can be legitimately hard to know whether the benefits of change outweigh the costs and risks of change. We don’t have access to perfect information so we have to use our judgment. Making this harder still is that changing direction can feel deeply uncomfortable for some people while, for others, it’s perpetually exciting to be responding to new information from the world.
When you have to decide how to respond to new information that conflicts with your well-laid plans, here are two decision nudges to help you improve your judgment.
Nudge: Feelings Come First
All decisions are emotional. Findings from modern neuroscience tell us that thinking and feeling are the same. Emotions are indispensable for rationality. Feelings can point us in the right direction and take us to the appropriate place in a decision-space. Once there, we can use the tools of logic and critical thinking.
Do you resist responding to new information because you want things to be more certain and predictable than they are?
Or do you sometimes overreact to new information when it triggers fears or insecurities?
The first place to look is inside yourself. Nudge yourself to notice how information makes you feel, because feelings come first. How we feel influences how we respond to new information. Noticing how you feel about a chart or table helps you see how those feelings frame what you take from it.
When you get new information that raises difficult questions about the direction of a project or initiative, take a moment to consciously recognize the intensity of your feelings. The more intense your feelings, the more likely you are to resist information that doesn’t match your beliefs.
When you constantly feel like you’re fighting fires, reflect on how your emotions drive your reactions. Do you enjoy putting out fires because it makes you feel valued? Or do you find it frustrating that you’re constantly in reactive mode?
When someone tells you to drop everything you’ve been doing and solve a different problem, consider whether your feelings towards them alter your assessment of the problem.
Nudge: Be a Mental Yogi
Making better decisions requires being better at predicting the future. One of the most important attributes of a good forecaster is the ability to change their mind. People who rethink their forecasts are more mentally flexible, and this translates into being more accurate.
Changing your mind is not always comfortable. There’s a reason for this. Certainty feels good. Our beliefs are interlocked, so changing one belief may have a flow-on effect.
People who are more cognitively flexible are better decision makers. This doesn’t mean you have to change your mind all the time or have a big change of opinion. An unstable opinion is just as likely to work against good decision-making. A more effective strategy is to remain open-minded and experiment. Can you make a small change in your opinion?
Instead of trying to figure out whether the world has changed, be a mental yogi and update your opinion a little. Be sensitive to small perturbations and build your intuition over time. Focus on gaps in your knowledge; doing so will release you from trying to anticipate a precise outcome.
Being a mental yogi takes experience but you’ll get better with practice.
When the data changes, instead of modeling a big change, take an incremental step and make a small update to your forecasts. Aim to do this more frequently and make smaller, more regular error corrections.
When you feel stuck, look for small changes in the world and experiment with whether they are relevant. How would your decisions change if small changes were indications of a big new trend?
When people demand a certain answer to an uncertain situation, help them differentiate between what is uncertain and what can reasonably be decided.
Your most important role as a leader is to unlock every individual’s unique perspective and bring others’ skills and knowledge to the table. It takes a lot of skill to stay open to new ideas and to reason precisely with others, especially if you disagree.
The key to better reasoning with others is to remain actively open-minded. Active open-mindedness is the disposition to be fair towards different conclusions even if they go against one’s initially favored or pet conclusion. So the real trick is twofold: resist the urge to form an initial conclusion in the first place, then encourage everyone to assign a probability to their knowledge.
These two nudges will show you how it works.
Nudge: Delay Intuition
In his book Thinking, Fast and Slow, Daniel Kahneman details how humans make judgments about the world. Human thinking follows two modes: fast (intuitive and emotional) and slow (deliberate and more logical).
The first, commonly called System 1, is easy, fast, and efficient. System 2 is harder work. When we make intuitive judgments via System 1, we are prone to particular errors in predictable ways.
For example, loss aversion is a bias toward avoiding losses over acquiring equivalent gains: individuals would rather avoid losing fifty dollars than find fifty dollars. These are called cognitive biases, and we have many. We can’t avoid them. Even Kahneman says he is subject to all of them! But we can become more aware of these biases and of when they may be operating.
The trick with cognitive biases is not to memorize or unlearn them but to minimize their impact on your judgment. Kahneman advises delaying forming an intuition. This can be hard to do.
As humans rely more and more on data for decision-making, it’s important to value effortful thinking over intuition because AI and big data operate beyond the limits of human intuition.
Intuition works best when we get salient and timely feedback on our judgments. Our gut feelings are more reliable when we make predictions within the range of the “normal.” Extreme emotionality narrows what we attend to. Intuition is great for avoiding a lion but less useful for handling a disagreement over an abstract fact.
Start by resisting the feeling that you know the answer. Hold back, slow down. Write down the ways you may be wrong.
Delaying intuition can be especially valuable when you are making a significant choice that may be influenced by brand loyalty. If you are in the market for a new car, chances are your choice is already strongly influenced by which brands speak to you. Maybe you think it would be a waste of time to shop around. But maybe you’d be wrong.
The pandemic changed how people buy cars. With shortages making many people’s first choices unavailable, they were forced to look at vehicles they never would have considered. One buyer, loyal to Chevy’s Silverado but unable to buy one, ended up purchasing a Nissan Titan, which he hadn’t previously considered at all. He loves it. “As a big guy, I found the seats to be extremely comfortable.”
The situation in which we use our intuition matters. Just as city roads need different speed bumps than parking lots, we need different ways to slow down our thinking to make a good decision.
Delaying or suppressing intuition keeps you actively open-minded, and open-mindedness is one of the best predictors of making better decisions.
Nudge: Calibrate Confidence
Overconfidence may be a decision-maker’s greatest enemy. Psychologists Don Moore and Max Bazerman call overconfidence the “mother of all biases.”
The Nobel Prize-winning psychologist Daniel Kahneman echoes a similar sentiment. Kahneman has called overconfidence the “most significant of the cognitive biases”—the bias he would most like to eliminate if he had a magic wand.
Overconfidence is so deeply wired in our brains that changing it would change how our minds work. This is the conundrum: overconfidence contributes to social success because it makes for charismatic leaders (most people don’t want to follow someone who says they aren’t really sure) but it contributes to our greatest errors of judgment. We must work with its presence in our thinking.
Moore says that we likely think about confidence in the wrong way. The self-help movement has framed maximizing confidence as necessary for success. But decision scientists know that overconfidence leads to an inflated view of our own abilities, which can lead to overuse of erroneous intuitive judgments.
Confidence calibration is when someone’s subjective probability of being right matches how often they actually are right.
We used Dave as the guinea pig for confidence calibration and had him answer four trivia questions. He answered each as either true or false and added a rating for how confident he felt about his answer.
Here’s how Dave did on a small sample from an online quiz.
Question: English is the top language used on the Internet.
Dave’s answer: True.
He was 90% sure he was right.
Question: The ferrule connects the bristles to the handle on a paint brush.
Dave’s answer: True.
He’d never heard of a ferrule so he was 50% confident of his answer.
Question: Australia is larger in area than Brazil.
Dave’s answer: True.
He wasn’t certain, but reasoned that Australia is a large continent on its own, so he rated his confidence as 70%.
Question: “Perigee” refers to the point in the orbit of a satellite that is farthest from the earth.
Dave’s answer: False.
He was confident that the apogee is the point farthest away, so he rated his confidence at 80%.
The answers are: True, True, False, False. Dave’s answers are: True, True, True, False. His accuracy was 75%.
What about his confidence calibration? It’s the average of his confidence ratings: 72.5%, close to his 75% accuracy. Not bad!
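The arithmetic behind Dave’s score is simple enough to sketch in a few lines of Python; the data below just transcribes the four quiz questions above:

```python
# Dave's quiz results: (his answer, correct answer, his confidence).
answers = [
    ("True",  "True",  0.90),  # English is the top Internet language
    ("True",  "True",  0.50),  # ferrule connects bristles to handle
    ("True",  "False", 0.70),  # Australia larger than Brazil
    ("False", "False", 0.80),  # perigee is the farthest orbital point
]

# Accuracy: fraction of answers that match the correct answer.
accuracy = sum(given == correct for given, correct, _ in answers) / len(answers)
# Calibration check: compare average confidence with accuracy.
mean_confidence = sum(conf for _, _, conf in answers) / len(answers)

print(f"accuracy: {accuracy:.1%}")          # prints "accuracy: 75.0%"
print(f"confidence: {mean_confidence:.1%}")  # prints "confidence: 72.5%"
```

Well-calibrated confidence simply means these two numbers track each other over many judgments.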
A trivia example may not represent real life, but you can take this idea and travel with it. It’s a useful way to clarify the language people use when they express their views. The word certain can represent a wide range of probabilities. If you practice pairing each estimate with an estimate of how correct you think you are, you can develop a better feel for how right you actually are versus how right you think you are.
You can try this yourself using an online confidence calibration quiz. You can have your team take the quiz. You can start to associate your own knowledge with a probability of how correct you think you are. Try making a prediction of something that matters to you—sales for the month, prices at the end of the quarter, someone’s response when you praise them, or how much snow will fall on the weekend. How confident are you that your estimate will be correct? Write it down, check what happens, measure how well you did.
As you build up skill in calibrating your confidence, you will begin to notice how well others calibrate theirs. You can start to question people. You can ask an employee who seems perpetually underconfident to explain how she might be right, or ask your overconfident boss how he thinks he might be wrong.
The more we practice calibrating confidence, the easier it becomes to detect both our overconfidence and our underconfidence.
Overconfidence will lead you to make decisions where you underestimate risk, overestimate success, and overuse intuition. Overconfidence feels good and can be difficult to check because it feeds on itself. As you make more intuitive judgments, decisions get easier, judgments become more fluent, confidence increases.
Underconfidence will make you miss out on opportunities. Well-calibrated confidence is the sweet spot.
On any given day, as a leader you don't just have to make big decisions, you have to make many small decisions. You also have to tackle a wide range of problems, many of which seem open-ended and vague. It can be difficult to untangle decisions from problems and know what matters.
Some decisions feel small but may set a bad precedent. For example, you may feel pressured to make an exception for an especially difficult customer but worry about what happens if you have to make the same exception for everyone. Is this decision part of a bigger problem? If so, how should you tackle it?
Some decisions feel small but they lead to bad leadership habits. For example, you may be tempted to check your phone in your weekly one-on-one meetings and think that it isn’t a big deal because it’s just this one time. But what message does that send?
Some decisions feel small but may cut off future decisions and options. For example, a product design choice may make sense today, but you worry it could limit the breadth of product functionality that’s possible in the future.
Some decisions feel small but may have unintended consequences. For example, in complex systems of people and technology, small variations at the start can lead to a huge variation in outcomes. You worry about a butterfly effect—a reference to how, in chaos theory, there may be a sensitive dependence on initial conditions in a non-linear system—and that you’ll be ill equipped to deal with the consequences.
In our fast-changing world, it’s not easy to anticipate the impact of a small decision today or of a problem left for another day. If you worry that the one meeting you missed could have been the one that mattered, or that by cutting a feature now you’ve ruled out a lucrative direction, or that by making one exception you’ve just hammered in the thin end of the wedge, you are caught in a bind.
Here are two decision nudges to help you improve your judgment when you aren’t sure what matters.
Nudge: Break Up Problems Early
Problems have structure. Some problems are well-defined and have all the information you need to solve them. Some problems are ill-defined, where the elements required to solve the problem are missing.
Problems can also be rich or poor. They can be semantically rich, where you can bring a vast store of prior knowledge and skill to the problem. Alternatively, a problem can be semantically lean, where no experience in your past prepares you to face it.
Problems can be simple or complex. Sometimes a solution falls into place once you take a single, crucial step; other times, many interdependent steps are needed before a path becomes clear.
Sometimes a problem can seem overwhelming, especially when there is a lot of data. Break a problem up into its known, unknown, and unknowable components. This helps “flush ignorance into the open.”
Use the concept of MECE. MECE (pronounced mee-see) stands for “mutually exclusive and collectively exhaustive.” It matters because early analysis is a mess without it.
Mutually exclusive means that each branch is self-contained. Collectively exhaustive means that every element that matters is captured somewhere. MECE structuring can simplify an otherwise unorganized problem.
Take the problem of what you and a group of your friends want to eat for dinner. Do you want Chinese food? Do you want to cook instead? Do you want to order take out? Do you want pasta? Or sushi? Or tacos? You can keep brainstorming and making lists of ideas but all this does is add to the confusion.
A MECE approach breaks this down first into two mutually exclusive branches—eat in or go out. You can’t do both. You can then further break down “eat in” into cook, takeout, or delivery. Each of these branches has two sub-branches: Asian food or non-Asian food. On the “go out” branch, Asian and non-Asian food are also sub-branches. You now have all choices captured. None of the parts overlap, and all of them together account for the whole.
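The dinner example can be sketched as a small tree. The nested-dict representation below is just one illustrative way to encode it; the branch names mirror the ones described above:

```python
# The dinner decision as a MECE tree (names taken from the example;
# the nested dict/list encoding is our own illustrative choice).
mece_tree = {
    "eat in": {
        "cook": ["Asian", "non-Asian"],
        "takeout": ["Asian", "non-Asian"],
        "delivery": ["Asian", "non-Asian"],
    },
    "go out": ["Asian", "non-Asian"],
}

def leaves(tree, path=()):
    """Enumerate every final option as a full path through the tree."""
    if isinstance(tree, list):  # a list holds the final sub-branches
        return [path + (option,) for option in tree]
    result = []
    for branch, subtree in tree.items():
        result.extend(leaves(subtree, path + (branch,)))
    return result

options = leaves(mece_tree)
# Mutually exclusive: no option path appears twice.
assert len(options) == len(set(options))
# Collectively exhaustive: the eight leaf paths cover every choice.
print(len(options))  # prints 8
```

Listing the leaves this way makes it easy to check that no option is double-counted and none is missing.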
A critical step in decision making is to choose the error you’d rather correct if the worst-case scenario happened. Like the party game that poses a dilemma in the form of a question beginning with “would you rather,” describe what could go wrong on either side of the decision. Which of these can you live with?
For example, would you rather waste time trying to predict something that can’t be predicted, or not predict what could have been? Would you rather keep options open on your product design and deal with the additional complexity or have to reengineer later?
When you feel overwhelmed by a decision, ask what error you can live with. Identify the worst-case scenarios and decide which outcome you’d rather deal with.
When you feel swamped by a problem, break it up into smaller pieces. What do you know? What is unknown? What is unknowable? Use MECE to make sure you’ve captured all the elements.
When you need to make good guesses, use the crowd. Diverse groups which include experts can filter and sort the big decisions from the small ones.
Nudge: Synthesize Later
Eventually problems need to be put back together so you can make a decision. For every good decision there is typically a counterargument and the process of integrating various perspectives is an important one. Integration yields insight.
Integrating perspectives is particularly important when decisions come with ethical consequences. Facial recognition technology is a case-in-point. The technology is increasingly used for everything from consumer applications to city surveillance.
We worked with a team wrestling with the ethics of the technology in a consumer setting. Facial recognition can be used to streamline check-in for travelers and it can simplify security and access. The problem is that the technology is less accurate when used on people of color and women. In addition to being distasteful, this problem introduced a host of other problems with fairness. How would it play on Twitter if women were disproportionately standing in lines to speak to a person? How would staff feel about being accused of defending a technology that was discriminating against people of color?
For every person in this scenario who was staunchly opposed to facial recognition technology, there was someone in favor of it. The only way to come to a decision was to step slowly through the issues, iterate positions, and nudge each side. We had people make a list of the things that would nudge them toward each other. Eventually an agreement was reached. The organization decided to hold off adopting the technology until they had a more mature understanding of its perils and benefits.
The back-and-forth between opposing views gave way to a common perspective: what mattered most was more complicated than efficiency versus parity. The decision to proceed had to recognize the complexity of new problems that would arise, and the organization’s ability to handle those problems.
Dialectical thinking refers to the ability to view issues from multiple perspectives and arrive at the most economical and reasonable reconciliation of seemingly contradictory information. In decision-making, we can think of this as a process of clashing causal forces—developing one view, putting up the opposing view, then coming to a hybrid perspective by synthesizing a new position.
Tetlock and Gardner write, “There are no paint-by-number rules here. Synthesis is an art that requires reconciling irreducibly subjective judgments.” If done well, synthesis yields more nuanced views and decisions that are more likely to work.
Good decisions require resolving subjective judgments. Have each side make a list of what would nudge them to the other side.