Hindsight Bias: Why You Make Terrible Life Choices

Nir’s Note: Discover other reasons you make terrible life choices like confirmation bias, distinction bias, extrinsic motivation, the fundamental attribution error, hyperbolic discounting, and the peak-end rule.

By: Erik Johnson and Nir Eyal

In 2000, a 69-year-old man began experiencing a persistent cough, chest discomfort, and weight loss. His physician recommended a chest radiograph to identify the root of the issue, which revealed a large tumor. A biopsy confirmed the worst: malignant thymoma, a cancer hiding between the lungs, consuming the patient’s body from the inside out (https://www.ajronline.org/doi/full/10.2214/ajr.175.3.1750597). The tumor had been growing for years; left untreated, the man would die of the disease within 16 months.

Shocked, the man struggled to understand how this could have happened. Three and a half years earlier, he’d had the same chest radiograph done as part of a routine examination. The radiologist who did the original exam found nothing out of the ordinary. Now, because of the bad test results, there was little time for treatment.

The patient decided to sue the doctor who missed the tumor.

In the trial, the patient’s attorney showed the initial X-rays to other radiologists, who had no trouble identifying the tumor. There it was; they could all see it. The doctors agreed the patient had suffered a clear case of medical malpractice.

Or had he?

The radiologist’s attorney defended his client by claiming that the other radiologists only spotted the cancer because they already knew what to look for. After all, why else would a lawyer ask doctors to give an x-ray a second look?

The radiologist was innocent, his lawyer claimed, because without knowing there was something to find in the scans, he had made a reasonable diagnosis. It was only in hindsight that the cancer could be easily spotted. The other radiologists’ opinions couldn’t be trusted because they had succumbed to “hindsight bias.”

The jury had to decide whether to hold the radiologist liable based on how much they attributed to the benefit of hindsight. Was the cancer as obvious, and the radiologist as incompetent, as they now seemed? Or did knowing the outcome make it impossible to fairly blame the radiologist for making the wrong call?

What is Hindsight Bias?

Hindsight bias occurs when people feel that they “knew it all along” – when they believe that an event is more predictable after it becomes known than it was before it became known (https://journals.sagepub.com/doi/abs/10.1177/1745691612454303).

In other words, when we’re looking back at an event after it already happened, knowing that outcome influences our perception of the events leading up to it.

A woman on the ground after slipping on a banana, saying how she knew all along how she was destined to slip on it

We experience hindsight bias because our brains are constantly trying to make sense of the world. To do that, we connect causes and effects, linking events into chains. Understanding cause and effect is an essential survival skill, but when we succumb to hindsight bias, we oversimplify those explanations. We see unpredictable events as obvious. “I knew it all along,” we end up thinking.

Just as the radiologists were more likely to spot the malignancy when they knew something was wrong with the patient’s x-rays, we allow our knowledge of the present to affect our interpretation of the past every day of our lives.

Examples of Hindsight Bias

Hindsight bias influences our perception of all sorts of life events:
  • An investor might think, “Of course the stock market crashed this year. I knew it would!” despite not having sold out when there was a chance.
  • A disappointed birthday girl might tell her party guests, “I just knew it was going to rain on my birthday,” even though there’s no way she could have predicted the weather.
  • A sports fan might claim with confidence, “I always knew my team would win the championship this year,” despite having had little clue when the season began.

What these three examples have in common is the tendency to see the past as certain, but only after the fact.

Hindsight bias isn’t just about the past, though. It hurts us the most by manipulating how we make decisions for the future.

 Example of hindsight bias: a woman placing a final piece in a puzzle, saying she knew beforehand where it would fit

How Hindsight Bias Tricks You Into Making Terrible Life Choices

When we misremember the thinking that preceded an outcome, it colors our judgment when making future predictions.

“The important thing to know about hindsight bias is that it not only changes how you see the world, but also how you see yourself in it,” Neal Roese, a professor of marketing at the Kellogg School of Management at Northwestern University, told the New York Times. “You begin to think: ‘Hey, I’m good. I’m really good at figuring out what’s going to happen.’ You begin to see outcomes as inevitable that were not.”

For instance, an investor burned by not getting out of the stock market before a crash might avoid getting back in, even when he logically knows it’s the right thing to do. In contorting the past to fit a narrative that he had known the market would crash all along, even when he didn’t, the investor might listen to his faulty intuition and get burned yet again.

“If you feel like you knew it all along, it means you won’t stop to examine why something really happened,” wrote Roese (https://journals.sagepub.com/doi/abs/10.1177/1745691612454303). “It’s often hard to convince seasoned decision-makers that they might fall prey to hindsight bias.”

One way this misinterpretation happens is in assuming the causes of past events were much simpler than they were. Kathleen Vohs, a social scientist at the University of Minnesota’s Carlson School of Management, found the bias’s consequences included “myopic attention to a single causal understanding of the past (to the neglect of other reasonable explanations)” (https://journals.sagepub.com/doi/abs/10.1177/1745691612454303).

This can also lead to incorrectly assigning blame or credit for an outcome. Overly simplistic explanations of the past make it easier to pin the consequences on individuals, rightly or wrongly.

Incorrectly assigning credit can also cause distorted thinking. According to Vohs, the simplistic explanations of the past provided by hindsight bias breed overconfidence in the certainty of one’s judgments. “It makes people think they can look back at past events and interpret something; it makes them think they have new ability to predict,” Drew Boyd, executive director of the University of Cincinnati’s MSc Marketing program, explained to the BBC.

When we concoct such false explanations of the past—and of our own predictions—we can’t learn and adapt for future decisions. When we don’t take what really happened to heart, we fail to learn from our mistakes and tend to repeat them.

How Does Hindsight Bias Affect Certain Fields?

Hindsight Bias in Business

Nobel Prize-winning American economist Richard Thaler believes business executives may be more prone to hindsight bias than people in other fields. In one study, researchers surveyed 705 entrepreneurs from failed startups. Before the failure, 77.3% of entrepreneurs believed their startup would grow into a successful business with positive cash flow. But after the startup failed, only 58% said they had originally believed their startup would be a success (https://www.effectuation.org/wp-content/uploads/2017/05/An-investigation-of-hindsight-bias-in-nascent-venture-activity.pdf).

Such a disconnect clouds business leaders’ judgment and ability to learn from the past. “It happens in business a lot when you think that something that has happened before is going to happen again,” said Boyd to the BBC. “It seems to make sense. But then it doesn’t happen again and you wonder what happened.”

Hindsight bias in business: a person eyeing a sales chart with a weak quarter, exclaiming they knew it was going to happen
“Business people will decide on a strategy because it worked for them before. But the conditions in the next environment are going to be different: it’s a different market situation, different people, and it’s a mistake to immediately assume that what worked before is going to work again,” he continued. This tendency makes it more likely that business leaders will take on risky, poorly thought-out ventures.

Hindsight Bias in Law

As you saw in the radiologist lawsuit described earlier, legal disputes and deliberations can also be uniquely affected by hindsight bias. Judges and juries, who know the outcome of a situation at the beginning of a trial, are especially susceptible to the oversimplifications of hindsight bias when assessing blame. Separating the facts of the case from the blurred vision caused by knowing the outcome is extremely difficult.

Hindsight Bias in Medicine

Medical professionals are also susceptible to hindsight bias. In one classic experiment, a group of physicians was given the medical case history for a patient, along with a list of four possible diagnoses, and asked to estimate the probability of each being correct (https://psycnet.apa.org/record/1981-23945-001). Some of the physicians—the “hindsight” group—were told that one of the four diagnoses was correct before making their estimates. The hindsight group assigned far greater probability estimates to the diagnoses they were told were “correct” than the rest of the subjects did. Hindsight bias significantly altered their perception of the symptoms and their cause.

Hindsight Bias in Politics

Political judgment is also clouded by hindsight. Winston Churchill remarked that politics is “the ability to foretell what is going to happen tomorrow, next week, next month and next year. And to have the ability afterward to explain why it didn’t happen.” This is clear in political punditry, in which commentators always have neat explanations for election results or legislative decisions, regardless of their own advance predictions.

In an experiment conducted by psychologists at Loyola University in Chicago, students were asked to predict the likelihood that then-president Bill Clinton would be convicted or acquitted in his impeachment trial (https://www.researchgate.net/publication/232943492_I_Knew_It_All_Along_Eventually_The_Development_of_Hindsight_Bias_in_Reaction_to_the_Clinton_Impeachment_Verdict). They gave estimates 22 days and 3 days before the verdict, and were then asked to recall those estimates 4 days and 11 days afterward. Eleven days after the Senate’s acquittal vote, students inflated their original estimates by about 10 percentage points, while their estimates of what a friend would have guessed showed no such inflation.

Political campaigns are aware of voters’ susceptibility to hindsight bias and exploit it to win votes. With the benefit of hindsight, it’s easy for voters to develop a narrative explaining the results of a politician’s service, and campaigns are happy to help craft it. If a swing voter made a difficult decision in the previous election, for example, a challenger can exploit the bias to create ‘voters’ remorse’ for the incumbent. “As the challenger, you need to win over some of those voters,” a former adviser to President George W. Bush, Mark McKinnon, said to the New York Times. “You need to give them an out for their ‘voters’ remorse.’”

Hindsight may make it easy for those voters to justify their initial reservations about the incumbent and make them more willing to vote for the challenger. Of course, this can also work in the opposite direction: voters are just as likely to justify their original decision, which may be why incumbents win more often.

Hindsight Bias in History

 Illustration of a hiker looking back on the path to their goal, overconfident with the benefit of hindsight

Finally, hindsight bias is also difficult to avoid in studying history itself. According to Professor Nick Chater of Warwick Business School in an interview with the History News Network, “Looking back into the past, we often think we can understand how things really were—like what caused the 2008 financial crisis, the collapse of communism, or the first world war—because we know how things turned out. But now we know about hindsight bias we should be suspicious of this ‘feeling of understanding’. The idea we can look back on history and understand it should be viewed with scepticism.”

This hurts our ability to learn from history. In 1962, the historian Roberta Wohlstetter wrote of the attack on Pearl Harbor, “It is much easier after the event to sort the relevant from the irrelevant signals. After the event, of course, a signal is always crystal clear.”

No one wants to imagine that they might have owned slaves had they been a white man in the antebellum American South. It’s incomprehensible to think of ourselves as Nazis had we lived in Germany in the 1930s. Yet unconscionable acts were carried out by ordinary, everyday citizens, just like us.

We don’t want to think of ourselves as capable of hideous acts, so we use hindsight bias to protect ourselves from the truth. We’d much rather believe bad people do bad things or assign blame to a sinister mastermind like Adolf Hitler, but that rationale misses the real lessons of history. By refusing to understand the circumstances that made good people (like us) do horrible things, we risk not learning from and therefore potentially repeating the mistakes of the past.

The History of Hindsight Bias

Though the term is modern, examples of hindsight bias have been documented for centuries.

In his 1867 novel War and Peace, Leo Tolstoy described hindsight bias as the “law of retrospectiveness, which makes all the past appear a preparation for events that occur subsequently.”

Hindsight bias was formally named, documented, and studied in the 1970s. In 1975, Baruch Fischhoff, a doctoral student of Nobel Prize-winning psychologist Daniel Kahneman, conducted the first experimental study on the concept (https://www.nifc.gov/PUBLICATIONS/acc_invest_march2010/speakers/2aFischoff.pdf). Fischhoff wanted to test what he called the “creeping determinism” hypothesis: that people tend to perceive outcomes as having been relatively inevitable.

In the study, participants were given brief descriptions of real historical events with four potential outcomes for each and asked to predict the probability of each outcome being the true one. However, some participants were told that one of the choices was the true outcome in advance (though the given answer was not always the correct one). They were then asked to answer as though the answer hadn’t been given.

The results supported Fischhoff’s hypothesis. When given the “true” outcome in advance, participants were significantly more likely to view it as inevitable. Further, when asked to explain the reasoning behind their choices, they emphasized the data points supporting the chosen outcome as most relevant. Given an answer in advance, they were both more confident in their choice and more cognizant of information confirming it.

Fischhoff concluded that creeping determinism biased not only participants’ impressions of what they would have known without knowledge of the outcome, but also their impressions of what they themselves and others actually did know in foresight.

In the decades since Fischhoff’s initial study, researchers have conducted many more experiments documenting hindsight bias in a variety of ways. The bias has been found in a wide range of task types, such as confidence judgments in the outcome of events, choices between alternatives, and estimations of quantities (http://library.mpib-berlin.mpg.de/ft/uh/uh_hindsight_2000.pdf). It has also been documented in a wide span of knowledge domains, including political events (https://link.springer.com/article/10.1007/BF03186737), medical diagnosis (https://psycnet.apa.org/record/1981-23945-001), outcomes of scientific experiments (https://psycnet.apa.org/record/1978-26684-001), economic decisions (https://journals.aom.org/doi/abs/10.5465/256462), autobiographical memory (https://www.sciencedirect.com/science/article/abs/pii/0010027781900111), and general knowledge (https://link.springer.com/article/10.3758/BF03197054).

In a 1991 meta-analysis of hindsight bias studies, Jay J. J. Christensen-Szalanski and Cynthia Fobian Willham found the effects of hindsight bias to be statistically strong across the literature, concluding that the bias is robust and difficult to eliminate (https://www.sciencedirect.com/science/article/abs/pii/074959789190010Q).

Why do we have Hindsight Bias?

The consequences of hindsight bias are clear. What’s less clear is why we’re so susceptible to it in the first place. Why would we so naturally rewrite history when it can have such negative effects on our learning and decision-making?

Like most cognitive biases, it’s the result of mental processes that work well in many situations, but backfire in others. To understand why, we first need to become familiar with those processes that combine to create hindsight bias.

For one, we naturally rely on the information that is easiest to recall, not the information that is most relevant or accurate (the availability heuristic). For example, our fear of flying will often increase after hearing of a crash on the news, even though the statistical risk is unchanged. The vividness and recency of the memory weigh more heavily than objective data. Once we know an outcome, we retrieve memories that align with the result more easily than those that contradict it.

We are also often overconfident in our abilities. 93% of Americans rate themselves as better-than-average drivers, for example (https://www.sciencedirect.com/science/article/abs/pii/0001691881900056). Overconfidence leads us to paint a rosier picture of ourselves than we deserve. Thus, with the benefit of hindsight, it’s natural to look back at our interpretations of events as obvious predictors of an outcome.

The anchoring effect is another component of hindsight bias. Anchoring refers to our mental tendency to latch onto any available information as a reference point for interpretation and action, regardless of its relevance. Once we know a result, we naturally anchor to it when considering the preceding events. In most cases, we can’t help it. In one study (https://academic.oup.com/qje/article/118/1/73/1917051), researchers asked participants to write down the last two digits of their Social Security number, then state whether they’d pay the equivalent dollar figure for a variety of consumer items. Participants with higher numbers were willing to pay higher prices for the items; for the highest numbers, the prices were as much as three times higher. They were anchored to the arbitrary numbers, despite those numbers having no relevance to the products’ value.

These biases work together to produce hindsight bias. The anchor of the outcome eases recall of coinciding data points via the availability heuristic. Then overconfidence crafts the story of how we knew it all from the beginning.

Beyond these biases, the way we observe the world also contributes to the tendency.

For one, our brains have evolved to find patterns, even when none exist. The ability to fill in the blanks between disparate pieces of information made it easier for humans to spot and avoid predators, for instance, creating a major evolutionary advantage.

“Superior pattern processing is the essence of the evolved human brain,” Johns Hopkins neurologist Mark Mattson explained to Psychology Today. It is “the fundamental basis of most, if not all, unique features of the human brain including intelligence, language, imagination, invention, and the belief in imaginary entities such as ghosts and gods.”

Anchoring and pattern recognition as psychological components of hindsight bias, resulting in overconfident judgments

The success of pattern recognition can also lead us to overapply it, though. The same mental processes that aided primitive survival help breed conspiracy theories. When we observe an event, we are more likely to assume it was the result of an intentional series of actions than of randomness. This is related to the fundamental attribution error, our tendency to prefer dispositional explanations to situational ones. Similarly, aligning past events with a known outcome creates a neat, easy-to-understand narrative (https://journals.sagepub.com/doi/abs/10.1177/1745691612454303).

Aside from recognizing patterns, the mind is also designed with an adaptive thinking process that modifies perception as new information is received. As we take in new information, our mind naturally recalibrates its understanding of the world around us.

Some researchers argue that hindsight bias is simply a feature of that adaptive process (see the Cognitive Illusions anthology). They contend that humans are naturally iterative thinkers who update their priors as they receive new information, so it’s natural that we would update our perceptions of the past with the new information of the results. The “knew it all along” effect is simply a by-product of that framework. Since we are cognitive misers with limited mental capacity, it’s more efficient to update our interpretation of past events as we would for anything else.

How to Overcome Hindsight Bias

Perhaps the most effective tactic to overcome hindsight bias is to document the inputs to decisions and predictions as they happen, leaving a clear record that isn’t at the mercy of biased memory.

“The surest protection against hindsight bias is disciplining ourselves to make explicit predictions, showing what we did in fact know. That record can also provide us with some protection against those individuals who are wont to second-guess us, producing exaggerated claims of what we should have known (and perhaps should have told them),” writes J. Scott Armstrong, professor of marketing at the Wharton School and editor of Principles of Forecasting.

Richard Thaler supports this method as well, according to an interview with McKinsey. “One suggestion I make to my students…is ‘write stuff down.’ I have a colleague who says, ‘If you don’t write it down, it never happened…any company that can learn to distinguish between bad decisions and bad outcomes has a leg up.’”

 Illustration of a balance holding a notebook of documented decisions, outweighing a brain biased by hindsight
When documenting the inputs that went into making decisions, be sure to weigh them, as well. “We try to teach people to use what we call Bayesian thinking,” said Drew Boyd to the BBC. “[Eighteenth-century English statistician] Thomas Bayes’ premise was to consider all sources of information but weight them: some information is more valuable, but all information has some value. Weight that information appropriately and you tend to make the best decision… make decisions based on what the data says is likely to happen, not what you think is going to happen.”

For example, imagine deciding whether or not to invest in a company. There are many inputs that factor into the choice: profit margins, growth rates, industry trends, cash flow, share price, and so on. When making the decision, the investor should document each data point and assign a weight to each one quantifying its perceived importance to the outcome.

Documenting such weights ahead of time prevents the revisionist history that accompanies hindsight bias. When evaluating outcomes that were not documented like this, our minds retroactively assign probabilities. The paper trail counteracts that tendency.
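As a rough illustration of the weighting Boyd describes, the sketch below scores a hypothetical investment decision from documented inputs and weights. The input names, scores, weights, and threshold are all invented for illustration; this is a sketch of the idea, not a real valuation model.

```python
# Weight documented inputs and combine them into one decision score.
# Every number below is a hypothetical example.

def weighted_decision(inputs: dict[str, tuple[float, float]],
                      threshold: float = 0.5) -> tuple[float, bool]:
    """Each input maps name -> (score in [0, 1], weight).
    Returns the weighted-average score and whether it clears the threshold."""
    total_weight = sum(w for _, w in inputs.values())
    score = sum(s * w for s, w in inputs.values()) / total_weight
    return score, score >= threshold

inputs = {
    "profit_margins": (0.8, 3.0),   # judged a strong signal: weighted heavily
    "growth_rate":    (0.6, 2.0),
    "industry_trend": (0.4, 1.0),
    "share_price":    (0.3, 0.5),   # judged a noisy signal: weighted lightly
}
score, invest = weighted_decision(inputs)
print(round(score, 3), invest)  # 0.638 True
```

Because the scores and weights are written down before the outcome is known, a later review can ask which specific inputs were misjudged, rather than retroactively inventing a story about which signal was “obviously” decisive.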

If we haven’t documented the information leading up to an event, we can “consider the opposite.” When you construct an interpretation of the past, flip it and ask: how could this not have gone the way you anticipated?

“Once you have your little narrative in your mind, think: how could the outcome go in a different direction or not occur at all?” Kathleen Vohs told the BBC. “If you flip the script like that in a number of ways you can reduce hindsight bias.”

This also helps disrupt our natural overconfidence. “Anything that reduces people’s confidence in predicting something will happen or the pathway in which something will happen is a good way to go about it,” Vohs added.

Conclusion and Summary

In the case of the radiologist accused of malpractice, hindsight bias was a core part of the defense attorney’s strategy. He argued that reviews of the original scans by other experts could never be accurate (https://www.ajronline.org/doi/10.2214/ajr.175.3.1750597).

One of the experts on the defense team argued that hindsight bias was impossible to avoid in such a case, even when the radiographs were shown without any context.

“I’ve never had an attorney bring me a normal radiograph,” he said. “Whenever an attorney shows or sends me radiographs, the first and only question that comes to my mind is, what was missed on these films?”

The plaintiff’s experts, of course, argued otherwise. They claimed they had reviewed the initial radiographs before they had looked at subsequent studies and before they had been told that the case centered on an alleged missed lung tumor.

“But didn’t you surmise that if an X-ray was brought to you for review by an attorney, that X-ray would necessarily be abnormal?” asked the defense attorney. “Not at all,” answered one of the radiology experts retained by the plaintiff. “The lawyer could well bring normal radiographs.”

The jury bought the latter argument, finding the defendant radiologist liable for malpractice by a 10-to-2 vote and awarding $872,000 to the family of the deceased patient.

Hindsight bias has real consequences. How much knowledge of an outcome shapes our interpretation of the past has important implications. Because of how our cognitive processes work, we cannot help but let that knowledge influence our perception of the events preceding it.

Illustration of a man looking into a mirror to see another mirror behind him, representing hindsight bias
We can, however, mitigate these retroactive judgments with clear records. By documenting predictions and the conditions surrounding events, we can develop a more accurate understanding of the past. With that, we can better prepare for the future.

Don’t Follow Your Gut (and What to Do Instead)

How should we make decisions in life? Dr. Gleb Tsipursky, a behavioral economist and cognitive neuroscientist, says that whatever you do, Never Go With Your Gut. It’s such bold advice that Dr. Tsipursky decided to make it the title of his latest book. In this interview, Dr. Tsipursky discusses his unorthodox approach and warns against the dangerous mental blindspots that lead to decisions we later regret. He generously allowed NirAndFar readers special access to the assessment on dangerous judgment errors from his new book, available here.

Nir Eyal: Why did you write your book?

Gleb Tsipursky: TL;DR: To overcome one of the most dangerous pieces of decision-making advice: “follow your gut.”

My book empowers readers to defeat the dangerous judgment errors (called cognitive biases) that come from following your gut and lead to decision disasters in our careers and businesses. To do so, it combines cutting-edge research in cognitive neuroscience and behavioral economics with pragmatic real-life business case studies to help readers make the wisest and most profitable decisions.

But to give you the full story, I’d have to start with my background.

As a kid, my dad told me, with utmost conviction and absolutely no reservation, to “go with your gut.” I ended up making some really bad decisions in my professional activities, for instance wasting several years of my life pursuing a medical career.

My conviction that the omnipresent advice to “follow your gut” was hollow grew stronger as I came of age during the dotcom boom and bust and the fraudulent accounting scandals around the turn of the millennium. Seeing prominent business leaders blow through hundreds of millions in online-based businesses without effective revenue streams – Webvan, Boo.com, Pets.com – was sobering. I saw the hype that convinced investors to follow their intuitions and put all this money into dotcoms.

Even worse, I saw how the top executives of Enron, Tyco, and WorldCom used illegal accounting practices to scam investors. They must have known they would inevitably be caught, have their reputations ruined and, in many cases, go to jail. Why this seemingly irrational behavior? They were willing to follow their gut, letting their short-term fear of losing social status and being seen as failures drive terrible long-term choices.

As someone with an ethical code of utilitarianism – desiring the greatest good for the greatest number – I felt a calling to reduce suffering and improve well-being by helping business leaders and professionals avoid dangerous judgment errors. I believed that’s the best way I could apply my knowledge and skills to improve people’s lives.

NE: You’ve done some fascinating research. From what you’ve learned, what surprised you the most?

GT: I’ve been surprised by many things, but perhaps the strongest surprise – and one of the most negative ones – is that many traditional methods used by leaders to try to address the weaknesses of human nature are often more harmful than helpful.

For example, consider business strategic planning assessments such as the very popular SWOT, where a group of business leaders tries to figure out the Strengths, Weaknesses, Opportunities, and Threats facing their business. The large majority of SWOT assessments fail to account for cognitive biases.

One of the most dangerous mental blindspots for business leaders performing SWOT is overconfidence bias. You might not be surprised that those who were most successful in the past are the ones who grow most overconfident. In fact, such people tend to believe themselves to not be prone to dangerous judgment errors, which is itself a mental blindspot called bias blind spot. To quote Proverbs 16:18, “Pride goeth before destruction, and an haughty spirit before a fall.”

A related problem is the optimism bias, our tendency to look at life through rose-colored glasses. Research shows that top leaders — whether CEOs or entrepreneur-founders — are especially likely to be excessively optimistic about their success, which harms their ability to make effective strategic plans. They tend to overvalue their skills, knowledge, and ability. Such optimism results in problems ranging from too-high earnings forecasts to paying too much when acquiring companies to bad corporate investments.

As a result, business leaders tend to list way too many strengths and opportunities and not nearly enough threats and weaknesses during SWOT. Their overconfidence and optimism biases lead them to disregard risks and overestimate rewards. Such problems apply not only to SWOT, but also to other popular strategic assessments.

It’s ironic and tragic that we put so much effort into techniques that provide false comfort and often lead to the exact business disasters that we seek to avoid by using such techniques.

NE: What lessons should people take away from your book regarding how they should design their own behavior or the behavior of others?

GT: One big one is a meta-lesson: distance yourself from what feels comfortable when designing intentional behaviors for yourself and others. Gut reactions cause us to behave in ways that feel comfortable but are often very dangerous for us. Our gut reactions are adapted to the savannah environment, when we lived as hunters and foragers in small tribes, not to today’s incredibly complex, multicultural world. Thus, you need to avoid the temptation of going with what’s comfortable and assuming that what feels right is what’s actually good for you.

Instead, plan out your goals, and then design behaviors from those goals. If some behaviors go against your instincts, deliberately go further than feels intuitive to you to accomplish your goals. Otherwise, you won’t address the anchoring effect, the cognitive bias of not going far enough from our current patterns even though we think we’ve done what we needed to do.

Another big lesson is to be wary of frequently given advice about designing your own behavior or the behavior of others. For example, many business gurus and advice columns heap praise on leaders who make quick gut decisions — about the direction of their company, whether or not to launch a new product, which candidate to hire.

“Trust your instincts” feels very comfortable to us, and we tend to choose what’s comfortable rather than what’s true or good for us. Sadly, gurus who tell people what they want to hear and what makes them comfortable get paid the big bucks, while experts who speak uncomfortable truths usually get ignored. What would you intuitively rather hear: someone describing a delicious, delightful, delectable dozen donuts, or someone explaining how to maintain your physical fitness?

“Go with your gut” is the dozen-donuts dessert of business advice. The box contains more calories than we should eat in a whole day, yet our gut wants the donuts instead of the healthy but less appealing fruit platter. Likewise, the option that is most appealing to your gut is often the worst decision for your bottom line, just as the donuts are the worst choice for your waistline. Too often, we choose an attractive dessert (or business option) that we later regret (myself included).

NE: Writing a book is hard. What do you do when you find yourself distracted or going off track?

GT: I had to learn the hard way that I need more breaks than I intuitively feel I do, and a good sign for that is when I feel myself getting distracted or going off track.

Since I’m so passionate about helping folks avoid decision disasters, my gut reactions push me to be a workaholic. So I’m naturally tempted to keep working and working and working.

However, the longer I work without a break, the more my productivity suffers. I find myself unintentionally checking email and social media, reading irrelevant articles, clicking on cat pictures.

My work quality goes down as well. When I look back at writing I did over a long stretch of uninterrupted work, I have to edit what I wrote at the tail end of the stretch much more heavily than what I wrote at the beginning.

I had to retrain my intuitions and design my behavior around taking regular breaks. Afterward, I return to my writing refreshed.

NE: What’s one thing you believe that most people would disagree with?

GT: I get a lot of pushback when I insist we need to evaluate decisions more by the decision-making process than the outcome.

It’s intuitive to look at what happened as a result of a decision and judge the quality of decisions by that. Did you meet the quarterly earnings? Did you get that job? Did you make the sale? That’s what we tend to focus on in our assessments.

Yet we forget the role that luck plays in decisions and outcomes. Someone might have had a terrible decision-making process and gotten lucky. Someone else might have done everything right but had rotten luck. Focusing on outcomes over process is known as the outcome bias, one of the many cognitive biases that lead to decision disasters.

Instead, we should reward those who did the right things – who have the best decision-making process – rather than those who got lucky with a bad process. After all, good or bad luck can happen to anyone. The most anyone can do is adopt the most effective decision-making process to maximize the chance of good outcomes and minimize the chance of bad ones. This is a counterintuitive approach that most people disagree with, but it’s the one that gets the best outcomes over time.

NE: What’s your most important good habit or routine?

GT: I have a set morning routine that really helps pave the way for my day. After I wake up and do my morning hygiene, I have a series of yoga and cardio exercises that take about 45 minutes. Then, I do about 15-30 minutes of journaling about what happened yesterday and what I learned from it, as well as setting my goals and priorities for the day. After that, I can get to doing writing and other work with a clear head and a good plan.

NE: Are you working on changing any bad habits?

GT: I’m constantly working on keeping harmful intuitive tendencies in check. For example, I am frequently tempted to stay up too late, being a night owl by nature. Yet I often have morning commitments – coaching meetings, podcast interviews, webinars, and so on. So I strive to keep a good sleeping schedule. It helps that my wife and I have a mutual commitment to spend time together on our couch for about an hour before going to sleep. That mutual commitment helps keep me honest. I struggle to maintain that good sleeping habit when I travel, though. It can be tough.

NE: What one product or service has helped you build a healthy habit?

GT: Well, it was certainly helpful to read Indistractable, but I guess folks who follow Nir already know about that.

One useful but simple tool has been timers. It’s too easy for me to lose track of time, such as when I’m working or when I’m staying up too late. Making a commitment to setting a timer for activities where I tend to lose track of time has really helped me manage myself better.

NE: What’s the most important takeaway you want people to remember after reading your book?

GT: The most important – and most challenging – takeaway is that what feels most comfortable is often exactly the wrong thing for us to do. In our technologically disrupted environment, the future is never going to look like today. We have to adapt constantly to an ever-changing environment to ensure the success of our businesses and careers. That ever-intensifying pace of change means our gut reactions – which are suited to the ancient savannah environment – will be less and less suited to the future, and relying on them will lead us to crash and burn.

The ones who survive and flourish in the world of tomorrow will recognize this paradigm shift. They will adopt counterintuitive, uncomfortable, but highly profitable techniques: avoiding disasters, making the best decisions, and designing behaviors that address the systematic and predictable judgment errors we all tend to make. It is my fervent hope that readers of the book do so, minimizing suffering and maximizing wellbeing for themselves and the people they care about.
Why We Should All Be Wearing (and Making) Face Masks Right Now

Everyone should be wearing a face mask now, whether they are sick or healthy. We can make our own masks to ensure we’re not taking them away from health care workers. In several Asian countries that are successfully lowering the number of infections from Covid-19, mask wearing is widely promoted, so why isn’t the U.S. following suit?

In China, authorities use drones fitted with loudspeakers to scold people who don’t wear masks outdoors. In South Korea, the government rationed masks, staggering days when citizens could buy a limited number so there would be enough for everyone. Leaders in Hong Kong and Taiwan set a good example by wearing masks when making public appearances, often flanked by a cadre of mask-wearing staffers.

The consistent messaging from authorities in Asia is in stark contrast with American officials who have fed confusion and distrust. On February 29th, Dr. Jerome Adams, the U.S. Surgeon General, tweeted, “Seriously people- STOP BUYING MASKS! They are NOT effective in preventing general public from catching #Coronavirus, but if healthcare providers can’t get them to care for sick patients, it puts them and our communities at risk!”

Dr. Adams’ recommendation is dead wrong. It fails to account for consumer psychology and, in time, will prove to have cost thousands of lives. While the standard advice to stay home and wash your hands still stands, many people work in essential services outside the home, and almost everyone needs to go out once in a while for groceries and other supplies. Dr. Adams makes three arguments that lead to one terribly wrong conclusion.

As a behavioral designer, I believe we need to adjust our public health messaging for the way people behave in the real world.

Scarcity Encourages Hoarding

First, it’s true that people should stop buying masks. But telling people to do so only encourages hoarding. The scarcity heuristic, well known to social psychologists and anyone with common sense, tells us that people are more likely to value something when it suddenly becomes scarce. Telling people that there’s a run on face masks only makes them more likely to stock up before supplies run out.

Rather than telling people to stop buying and hoping they’ll listen, authorities should follow South Korea’s example and ration masks while there is a critical shortage. Heavy-duty N95 masks and other medical-grade respirators should be reserved by law for health care facilities, with an amnesty that gives people who hoarded them an incentive to sell them to hospitals eager to buy.

Social Proof Changes Human Behavior

Second, it’s true that face masks alone do not prevent all infections. However, wearing masks has been shown to be more effective at preventing transmission of disease than hand-washing alone and significantly more effective when combined with other interventions.

Furthermore, healthy people wearing face masks protect the greater population. People need social proof to change their behavior. Asking only sick people to wear a face mask makes little sense, since people are unlikely to want to advertise they are ill. Covid-19 can infect people who never show symptoms, don’t know they’re sick, and yet may infect others. If we all started wearing face masks in public, we’d limit the spread of the disease by making mask-wearing socially acceptable, as it is today in much of Asia.

Some argue that people don’t know how to wear a mask properly and may end up touching their faces more. However, no studies have shown increased transmission rates when the general public is encouraged to wear masks and people can learn the simple steps for handling masks properly.

Contradiction Hurts Credibility

Dr. Adams’ third mistake is his failure to explain why a health care provider without a mask is at risk while the rest of us can walk around mask-free and risk-free. The virus doesn’t ask to see your medical credentials after all, and the contradiction hurts his credibility.

Adams should clarify his message and issue a call to action. Masks do appear to reduce transmission, and there are other options than buying them. Americans should start making their own masks as quickly as possible.

Most people have the basic materials to make a mask right at home right now, and studies find homemade masks are more effective than no protection. Making our own masks will help save existing stock for healthcare professionals as manufacturers ramp up production in the coming months. It can increase the supply so that no mail carrier, food courier, or janitor needs to be without a mask.

It’s time to call forth the “can do” American spirit and encourage people stuck at home to start sewing. Instead of ill-conceived mixed messages, we can use social proof to make wearing a mask the new norm. Instead of wearing a button or lapel pin as people have done to show their support during past crises, let’s think of wearing a face mask as an act of solidarity. Wearing a mask shows we’re all in this together.

Fundamental Attribution Error: Why You Make Terrible Life Choices

Nir’s Note: This post is part of a series on cognitive bias co-authored by Nir Eyal and illustrated by Lakshmi Mani. Discover other reasons you make terrible life choices like confirmation bias, hyperbolic discounting, distinction bias, extrinsic motivation, hindsight bias, and peak end rule.

There I am, sitting in a packed movie theater. I’ve waited two years for this sequel, and I’ve got enough popcorn and diet soda to last me a full three hours. Fifteen minutes in, the hero and villain are facing off for the first time when a woman bursts into the theater. Trying to find a seat, she awkwardly squeezes into the middle of the row in front of me, blocking the best part of the movie. “What a rude and inconsiderate person!” I think to myself as I dodge her when she shuffles by.

A week later, I’m rushing to catch another film with my friends. It’s pouring rain and traffic is crazy. I hope to make it before the previews end, but when I reach the theater (soaking wet, I might add), the movie has already begun. I have to turn on my phone’s flashlight to find my seat and accidentally step on a few moviegoers’ toes. I hear tuts and loud sighs. It’s clear these people think I’m a complete jerk.


Peak-End Rule: Why You Make Terrible Life Choices

Nir’s Note: This post is part of a series on cognitive bias co-authored with and illustrated by Lakshmi Mani. Discover other reasons you make terrible life choices like confirmation bias, hyperbolic discounting, distinction bias, extrinsic motivation, hindsight bias, and fundamental attribution error.

It’s New Year’s Eve. There I am on the dance floor – it’s teeming with people and there’s hardly space to breathe. Loud thumping music pierces my eardrums and I have no idea where my friends are.

Then the guy next to me takes a misstep and spills an entire cup of beer down my shoulder. I gasp as the cold brew winds its way down my back, but he’s too drunk and the music is too loud for him to notice. Is this supposed to be fun? What am I doing here? I hail a ride home.

At home, after wringing out my shirt and getting ready for bed, I take a minute to pull up my phone and glance at my Instagram stories.

There I see the plate of chocolate cake from the dinner with friends that started the evening.