
Nir’s Note: Hindsight bias is only one of many cognitive biases—discover other reasons you make terrible life choices, like confirmation bias, distinction bias, extrinsic motivation, the fundamental attribution error, hyperbolic discounting, self-serving bias, and the peak-end rule.

By: Erik Johnson and Nir Eyal

In 2000, a 69-year-old man began experiencing a persistent cough, chest discomfort, and weight loss. His physician recommended a chest radiograph to identify the root of the issue, which revealed a large tumor. A biopsy confirmed the worst: malignant thymoma, a cancer hiding between the lungs and consuming the patient’s body from the inside out (https://www.ajronline.org/doi/full/10.2214/ajr.175.3.1750597?cookieSet=1). The tumor had been growing for years; left untreated, the man would die of the disease within 16 months.

Shocked, the man struggled to understand how this could have happened. Three and a half years earlier, he’d had the same chest radiograph taken as part of a routine examination. The radiologist who read the original exam found nothing out of the ordinary. Now, because the tumor had gone undetected for so long, there was little time left for treatment.

The patient decided to sue the doctor who missed the tumor.

In the trial, the patient’s attorney showed the initial x-rays to other radiologists, who had no trouble identifying the tumor. There it was; they could all see it. The doctors agreed the patient had suffered a clear case of medical malpractice.

Or had he?

The radiologist’s attorney defended his client by claiming that the other radiologists only spotted the cancer because they already knew what to look for. After all, why else would a lawyer ask doctors to give an x-ray a second look?

The radiologist was innocent, his lawyer claimed, because without knowing there was something to find in the scans, he had made a reasonable diagnosis. It was only in hindsight that the cancer could be easily spotted. The other radiologists’ opinions couldn’t be trusted because they had succumbed to “hindsight bias.”

The jury had to decide whether to hold the radiologist liable, and that decision hinged on how much they attributed to the benefit of hindsight. Was the cancer as obvious, and the radiologist as incompetent, as they now seemed? Or did knowing the outcome make it unfair to blame the radiologist for making the wrong call?

What is Hindsight Bias?

Hindsight bias occurs when people feel that they “knew it all along” – when they believe that an event is more predictable after it becomes known than it was before it became known (https://journals.sagepub.com/doi/abs/10.1177/1745691612454303).

In other words, when we’re looking back at an event after it already happened, knowing that outcome influences our perception of the events leading up to it.

A woman on the ground after slipping on a banana, saying she knew all along she was destined to slip on it

We experience hindsight bias because our brains are constantly trying to make sense of the world. To do that, we’re always connecting causes and effects, linking events into chains. Understanding cause and effect is an essential survival skill, but when we succumb to hindsight bias, we oversimplify those explanations. We see unpredictable events as obvious. “I knew it all along,” we end up thinking.

Just as the radiologists were more likely to spot the malignancy when they knew something was wrong with the patient’s x-rays, we allow our knowledge of the present to affect our interpretation of the past every day of our lives.

Examples of Hindsight Bias

Hindsight bias influences our perception of all sorts of life events:
  • An investor might think, “Of course the stock market crashed this year. I knew it would!” despite not having sold their holdings while there was still a chance.
  • A disappointed birthday girl might tell her party guests, “I just knew it was going to rain on my birthday,” even though there’s no way she could have predicted the weather.
  • A sports fan might claim with confidence, “I always knew my team would win the championship this year,” despite having had little clue when the season began.

What these three examples have in common is the tendency to see the past as certain, but only after the fact.

Hindsight bias isn’t just about the past, though. It hurts us the most by manipulating how we make decisions for the future.

 Example of hindsight bias: a woman placing a final piece in a puzzle, saying she knew beforehand where it would fit

How Hindsight Bias Tricks You Into Making Terrible Life Choices

When we misremember the thinking that preceded an outcome, it colors our judgment when we make predictions about the future.

“The important thing to know about hindsight bias is that it not only changes how you see the world, but also how you see yourself in it,” Neal Roese, a professor of marketing at the Kellogg School of Management at Northwestern University, told the New York Times. “You begin to think: ‘Hey, I’m good. I’m really good at figuring out what’s going to happen.’ You begin to see outcomes as inevitable that were not.”

For instance, an investor once burned by not getting out of the stock market before a crash might avoid getting back into the markets, even when he logically knows it’s the right thing to do. By contorting the past to fit a narrative that he had known the market would crash all along, even though he didn’t, the investor might listen to his faulty intuition and get burned yet again.

“If you feel like you knew it all along, it means you won’t stop to examine why something really happened,” wrote Roese (https://journals.sagepub.com/doi/abs/10.1177/1745691612454303). “It’s often hard to convince seasoned decision-makers that they might fall prey to hindsight bias.”

One way this misinterpretation happens is in assuming the causes of past events were much simpler than they really were. Kathleen Vohs, a social scientist at the University of Minnesota’s Carlson School of Management, found that the bias’s consequences include “myopic attention to a single causal understanding of the past (to the neglect of other reasonable explanations)” (https://journals.sagepub.com/doi/abs/10.1177/1745691612454303).

This can also lead to incorrectly assigning blame or credit for an outcome. Overly simplistic explanations of the past make it easier to pin the consequences on individuals, rightly or wrongly.

Incorrectly assigning credit can also cause distorted thinking. According to Vohs, the simplistic explanations of the past provided by hindsight bias breed overconfidence in the certainty of one’s judgments. “It makes people think they can look back at past events and interpret something; it makes them think they have new ability to predict,” Drew Boyd, executive director of the University of Cincinnati’s MSc Marketing program, explained to the BBC.

When we concoct such false explanations of the past—and of our own predictions—we can’t learn and adapt for future decisions. When we don’t take what really happened to heart, we fail to learn from our mistakes and tend to repeat them.

How Does Hindsight Bias Affect Certain Fields?

The effects of hindsight bias are especially significant in certain areas.

Business

Nobel Prize-winning American economist Richard Thaler believes business executives may be more prone to hindsight bias than people in other fields. In one study, researchers surveyed 705 entrepreneurs from failed startups. Before the failure, 77.3% of the entrepreneurs believed their startup would grow into a successful business with positive cash flow. But after the startup failed, only 58% said they had originally believed their startup would be a success (https://www.effectuation.org/wp-content/uploads/2017/05/An-investigation-of-hindsight-bias-in-nascent-venture-activity.pdf).

Such a disconnect clouds business leaders’ judgment and ability to learn from the past. “It makes people think they can look back at past events and interpret something; it makes them think they have new ability to predict,” said Boyd to the BBC. “It happens in business a lot when you think that something that has happened before is going to happen again. It seems to make sense. But then it doesn’t happen again and you wonder what happened.”

A person eyeing a sales chart with a weak quarter, exclaiming they knew it was going to happen

“Business people will decide on a strategy because it worked for them before. But the conditions in the next environment are going to be different: it’s a different market situation, different people, and it’s a mistake to immediately assume that what worked before is going to work again,” he continued. This tendency makes it more likely that business leaders will take on risky and poorly thought out ventures.

Law

As you saw in the radiologist lawsuit described earlier, legal disputes and deliberations can also be deeply affected by hindsight bias. Judges and juries, who know the outcome of a situation at the start of a trial, are especially susceptible to the oversimplifications of hindsight bias when assigning blame. Separating the facts of the case from the blurred vision caused by knowing the outcome is extremely difficult.

Medicine

Medical professionals are also susceptible to hindsight bias. In one classic experiment, a group of physicians were given the medical case history for a patient along with a list of four possible diagnoses and asked to estimate the probability of each being correct (https://psycnet.apa.org/record/1981-23945-001). Some of the physicians—the “hindsight” group—were told that one of the four diagnoses was correct before making their estimates. The hindsight group assigned far greater probability estimates to the diagnoses they were told were “correct” than the rest of the subjects did. Hindsight bias significantly altered their perception of the symptoms and their cause.

Politics

Political judgment is also clouded by hindsight. Winston Churchill remarked that politics is “the ability to foretell what is going to happen tomorrow, next week, next month and next year. And to have the ability afterward to explain why it didn’t happen.” This is clear in political punditry, in which commentators always have neat explanations for election results or legislative decisions, regardless of their own advance predictions.

In an experiment conducted by psychologists at Loyola University in Chicago, students were asked to predict the likelihood that then-president Bill Clinton would be convicted or acquitted in his impeachment trial (https://www.researchgate.net/publication/232943492_I_Knew_It_All_Along_Eventually_The_Development_of_Hindsight_Bias_in_Reaction_to_the_Clinton_Impeachment_Verdict). They gave estimates 22 days and 3 days before the verdict and were later asked to recall those estimates 4 days and 11 days afterward. Eleven days after the Senate’s acquittal vote, students had inflated their original estimates by about 10 percentage points, while their estimates of what a friend would have guessed showed no such inflation.

Political campaigns are aware of voters’ susceptibility to hindsight bias and exploit it to win votes. With the benefit of hindsight, it’s easy for voters to develop a narrative explaining the results of a politician’s service, and campaigns are happy to help craft it. If a swing voter made a difficult decision in the previous election, for example, a challenger can exploit the bias to create ‘voters’ remorse’ for the incumbent. “As the challenger, you need to win over some of those voters,” a former adviser to President George W. Bush, Mark McKinnon, said to the New York Times. “You need to give them an out for their ‘voters’ remorse.’”

Hindsight may make it easy for those voters to justify their initial reservations about the incumbent and make them more willing to vote for the challenger. Of course, the bias can also work in the opposite direction: voters are just as likely to justify their original decision, which may be part of why incumbents win more often.

History

 Illustration of a hiker looking back on the path to their goal, overconfident with the benefit of hindsight

Finally, hindsight bias is also difficult to avoid in studying history itself. According to Professor Nick Chater of Warwick Business School in an interview with the History News Network, “Looking back into the past, we often think we can understand how things really were—like what caused the 2008 financial crisis, the collapse of communism, or the first world war—because we know how things turned out. But now we know about hindsight bias we should be suspicious of this ‘feeling of understanding’. The idea we can look back on history and understand it should be viewed with scepticism.”

This hurts our ability to learn from history. In 1962, in her book on the attack on Pearl Harbor, the historian Roberta Wohlstetter wrote: “It is much easier after the event to sort the relevant from the irrelevant signals. After the event, of course, a signal is always crystal clear.”

No one wants to imagine that they might have owned slaves had they been a white man in the antebellum American South. It’s incomprehensible to think of ourselves as Nazis had we lived in Germany in the 1930s. Yet unconscionable acts were carried out by ordinary, everyday citizens, just like us.

We don’t want to think of ourselves as capable of hideous acts, so we use hindsight bias to protect ourselves from the truth. We’d much rather believe bad people do bad things or assign blame to a sinister mastermind like Adolf Hitler, but that rationale misses the real lessons of history. By refusing to understand the circumstances that made good people (like us) do horrible things, we risk not learning from and therefore potentially repeating the mistakes of the past.

The History of Hindsight Bias

Though the term is modern, examples of hindsight bias have been documented for centuries.

In his 1867 novel War and Peace, Leo Tolstoy described hindsight bias as the “law of retrospectiveness, which makes all the past appear a preparation for events that occur subsequently.”

Hindsight bias was formally named, documented, and studied in the 1970s. In 1975, Baruch Fischhoff, a doctoral student of Nobel Prize-winning psychologist Daniel Kahneman, conducted the first experimental study of the concept. Fischhoff wanted to test what he called the “creeping determinism” hypothesis, in which people tend to perceive outcomes as having been relatively inevitable.

In the study, participants were given brief descriptions of real historical events with four potential outcomes for each and asked to predict the probability of each outcome being the true one. However, some participants were told that one of the choices was the true outcome in advance (though the given answer was not always the correct one). They were then asked to answer as though the answer hadn’t been given.

The results supported Fischhoff’s hypothesis. When given the “true” outcome in advance, participants were significantly more likely to view it as inevitable. Further, when asked to explain the reasoning behind their choices, they emphasized the data points supporting the chosen outcome as most relevant. Being given an answer in advance made them both more confident in their choice and more attentive to information confirming it.

Fischhoff concluded that creeping determinism biased not only participants’ impressions of what they would have known without knowledge of the outcome, but also their impressions of what they themselves and others actually had known in foresight.

In the decades since Fischhoff’s initial study, researchers have conducted many more experiments documenting hindsight bias in a variety of ways. The bias has been found across a wide range of task types, such as confidence judgments about the outcome of events, choices between alternatives, and estimations of quantities (https://library.mpib-berlin.mpg.de/ft/uh/uh_hindsight_2000.pdf). It has also been documented across a wide span of knowledge domains, including political events (https://idp.springer.com/authorize?response_type=cookie&client_id=springerlink&redirect_uri=https%3A%2F%2Flink.springer.com%2Farticle%2F10.1007%2FBF03186737), medical diagnosis (https://psycnet.apa.org/record/1981-23945-001), outcomes of scientific experiments (https://psycnet.apa.org/record/1978-26684-001), economic decisions (https://journals.aom.org/doi/abs/10.5465/256462), autobiographical memory (https://www.sciencedirect.com/science/article/abs/pii/0010027781900111), and general knowledge (https://idp.springer.com/authorize?response_type=cookie&client_id=springerlink&redirect_uri=https%3A%2F%2Flink.springer.com%2Farticle%2F10.3758%2FBF03197054).

A 1991 meta-analysis of hindsight bias studies by Jay J. J. Christensen-Szalanski and Cynthia Fobian Willham found the effect to be statistically strong across the literature, concluding that the bias is robust and difficult to eliminate (https://www.sciencedirect.com/science/article/abs/pii/074959789190010Q).

Why Do We Have Hindsight Bias?

The consequences of hindsight bias are clear. What’s less clear is why we’re so susceptible to it in the first place. Why would we so naturally rewrite history when it can have such negative effects on our learning and decision-making?

Like most cognitive biases, it’s the result of mental processes that work well in many situations, but backfire in others. To understand why, we first need to become familiar with those processes that combine to create hindsight bias.

For one, we naturally rely on the information that is easiest to recall, not the information that is most relevant or accurate (the availability heuristic). For example, our fear of flying often increases after hearing about a crash on the news, even though the statistical risk is unchanged. The vividness and recency of the memory weigh more heavily than objective data. Once we know an outcome, we retrieve memories that align with the result more easily than those that contradict it.

We are also often overconfident in our abilities. Ninety-three percent of Americans rate themselves as better-than-average drivers, for example (https://www.sciencedirect.com/science/article/abs/pii/0001691881900056). Overconfidence leads us to paint a rosier picture of ourselves than we deserve, so it feels natural, with the benefit of hindsight, to look back on our interpretations of events as obvious predictors of the outcome.

The anchoring effect is another component of hindsight bias. Anchoring refers to our mental tendency to latch onto any available information as a reference point for interpretation and action, regardless of its relevance. Once we know a result, we naturally anchor to it when considering the preceding events. In most cases, we can’t help it. In one study (https://academic.oup.com/qje/article/118/1/73/1917051), researchers asked participants to write down the last two digits of their social security number, then state whether they’d pay the equivalent dollar figure for a variety of consumer items. Participants with higher numbers were willing to pay higher prices for the items; for those with the highest numbers, the prices offered were higher by a factor of three. They were anchored to the arbitrary numbers, even though those numbers had no real relevance to the products’ value.

These biases work together to produce hindsight bias. The known outcome acts as an anchor, which makes coinciding data points easier to recall via the availability heuristic. Then overconfidence crafts the story of how we knew it all from the beginning.

Beyond these individual biases, the way we make sense of the world also contributes to the tendency.

For one, our brains have evolved to find patterns, even when none exist. The ability to fill in the blanks between disparate pieces of information made it easier for humans to spot and avoid predators, for instance, creating a major evolutionary advantage.

“Superior pattern processing is the essence of the evolved human brain,” Johns Hopkins neuroscientist Mark Mattson explained to Psychology Today. It is “the fundamental basis of most, if not all, unique features of the human brain including intelligence, language, imagination, invention, and the belief in imaginary entities such as ghosts and gods.”

Anchoring and pattern recognition as psychological components, resulting in overconfident judgments

The success of pattern recognition can also lead to overreach, though. The same mental processes that aided primitive survival help breed conspiracy theories. When we observe an event, we are more likely to assume it was the result of an intentional series of actions than of randomness. This is called the fundamental attribution error: our tendency to prefer dispositional explanations to situational ones. Similarly, aligning past events with a known outcome creates a neat, easy-to-understand narrative (https://journals.sagepub.com/doi/abs/10.1177/1745691612454303).

Aside from recognizing patterns, the mind is also designed with an adaptive thinking process that modifies perception as new information is received. As we take in new information, our mind naturally recalibrates its understanding of the world around us.

Some researchers argue that hindsight bias is simply a feature of that adaptive process (see the Cognitive Illusions anthology). They contend that humans are naturally iterative thinkers who update their priors as new information arrives, so it makes sense that we also update our perceptions of the past once we learn the result. The “knew it all along” effect is simply a by-product of that framework. Since we are cognitive misers with limited mental capacity, it’s more efficient to update our interpretation of past events just as we would anything else.

How to Overcome Hindsight Bias

Perhaps the most effective tactic to overcome hindsight bias is to document the inputs to decisions and predictions as they happen, leaving a clear record that isn’t at the mercy of biased memory.

“The surest protection against hindsight bias is disciplining ourselves to make explicit predictions, showing what we did in fact know. That record can also provide us with some protection against those individuals who are wont to second guess us, producing exaggerated claims of what we should have known (and perhaps should have told them),” states Jon Scott Armstrong, professor of Marketing at the Wharton School and editor of Principles of Forecasting.

Richard Thaler supports this method as well, according to an interview with McKinsey. “One suggestion I make to my students…is ‘write stuff down.’ I have a colleague who says, ‘If you don’t write it down, it never happened…any company that can learn to distinguish between bad decisions and bad outcomes has a leg up.’”

 Illustration of a balance holding a notebook of documented decisions, outweighing a brain biased by hindsight
When documenting the inputs that went into making decisions, be sure to weigh them, as well. “We try to teach people to use what we call Bayesian thinking,” said Drew Boyd to the BBC. “[Eighteenth-century English statistician] Thomas Bayes’ premise was to consider all sources of information but weight them: some information is more valuable, but all information has some value. Weight that information appropriately and you tend to make the best decision… make decisions based on what the data says is likely to happen, not what you think is going to happen.”
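
To make that advice concrete, here is a minimal sketch of Bayesian updating in Python. The scenario, the prior, and the likelihood figures are invented for illustration and are not taken from the article; the point is simply that each piece of evidence shifts your belief in proportion to how diagnostic it is, rather than by gut feel.

```python
# Minimal illustration of Bayesian updating: weigh each piece of evidence
# by how likely it is under competing hypotheses, rather than by gut feel.
# All numbers below are made-up assumptions for illustration only.

def update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Hypothesis: "this product launch will hit its sales target."
belief = 0.30  # assumed base rate: 30% of comparable launches hit their target

# Evidence 1: strong pre-orders -- common when launches succeed, rarer when they fail.
belief = update(belief, p_evidence_if_true=0.70, p_evidence_if_false=0.20)

# Evidence 2: a competitor cut prices -- slightly more likely when launches fail.
belief = update(belief, p_evidence_if_true=0.40, p_evidence_if_false=0.55)

print(f"Updated probability of hitting the target: {belief:.0%}")
```

Starting from a 30% base rate, the strong pre-orders raise the estimate to 60%, and the competitor’s price cut pulls it back to roughly 52%—a written record of exactly how much weight each input received at the time.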

For example, imagine deciding whether or not to invest in a company. There are many inputs that factor into the choice: profit margins, growth rates, industry trends, cash flow, share price, and so on. When making the decision, the investor should document each data point and assign a weight to each one quantifying its perceived importance to the outcome.

Documenting such weights ahead of time avoids the revisionist history that accompanies hindsight bias. When evaluating outcomes that weren’t documented this way, our minds retroactively assign probabilities to what we once believed. The paper trail counteracts that tendency.
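
As one hypothetical way to build that paper trail, the sketch below records a decision’s inputs, the weight assigned to each, and the explicit prediction, stamped with the time it was written. The structure, field names, and numbers are illustrative assumptions rather than a prescribed format; a notebook or spreadsheet capturing the same information works just as well.

```python
# A minimal "decision journal" sketch: record what you knew, how much weight
# you gave it, and what you predicted -- before the outcome is known.
# Structure, field names, and values are illustrative assumptions, not a standard.
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class DecisionRecord:
    decision: str               # what is being decided
    inputs: dict[str, float]    # data point -> weight (perceived importance)
    prediction: str             # the explicit forecast, stated up front
    confidence: float           # probability (0.0-1.0) assigned to the prediction
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

journal: list[DecisionRecord] = []

record = DecisionRecord(
    decision="Invest in ACME Corp (hypothetical company)",
    inputs={
        "profit margins improving": 0.30,
        "revenue growth above 15%": 0.25,
        "industry tailwinds": 0.20,
        "free cash flow positive": 0.15,
        "share price below peers": 0.10,
    },
    prediction="Stock outperforms the index over 12 months",
    confidence=0.60,
)
journal.append(record)

# Persist the journal so later reviews compare outcomes against what was
# actually written down, not against a memory reshaped by hindsight.
with open("decision_journal.json", "w") as f:
    json.dump([asdict(r) for r in journal], f, indent=2)
```

When the outcome arrives, comparing it against this record makes it possible to tell a bad decision from a bad outcome, rather than letting hindsight rewrite what was believed at the time.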

If we haven’t documented the information leading up to an event, we can “consider the opposite.” When constructing an interpretation of the past, flip it and ask: how could things not have gone the way you anticipated?

“Once you have your little narrative in your mind, think: how could the outcome go in a different direction or not occur at all?” Kathleen Vohs told the BBC. “If you flip the script like that in a number of ways you can reduce hindsight bias.”

This also helps disrupt our natural overconfidence. “Anything that reduces people’s confidence in predicting something will happen or the pathway in which something will happen is a good way to go about it,” Vohs added.

Conclusion and Summary

In the case of the radiologist accused of malpractice, hindsight bias was a core part of the defense attorney’s strategy. He argued that reviews of the original scans by other experts could never be accurate (https://www.ajronline.org/doi/10.2214/ajr.175.3.1750597).

One of the experts on the defense team supported the view that hindsight bias was impossible to avoid in such a case, even when the radiographs were shown without any context of the case.

“I’ve never had an attorney bring me a normal radiograph,” he said. “Whenever an attorney shows or sends me radiographs, the first and only question that comes to my mind is, what was missed on these films?”

The plaintiff’s experts, of course, argued otherwise. They claimed they had reviewed the initial radiographs before they had looked at subsequent studies and before they had been told that the case centered on an alleged missed lung tumor.

“But didn’t you surmise that if an X-ray was brought to you for review by an attorney, that X-ray would necessarily be abnormal?” asked the defense attorney. “Not at all,” answered one of the radiology experts retained by the plaintiff. “The lawyer could well bring normal radiographs.”

The jury bought the latter argument, finding the defendant radiologist liable for malpractice by a 10-to-2 vote and awarding $872,000 to the family of the deceased patient.

Hindsight bias has real consequences. The question of how much knowing an outcome shapes our interpretation of the past has important implications. Because of the way our cognitive processes work, we cannot help but let the knowledge of an outcome influence our perception of the events preceding it.

Illustration of a man looking into a mirror to see another mirror behind him, representing hindsight bias
We can, however, mitigate these retroactive judgments with clear records. By documenting predictions and conditions surrounding events, we can have a more accurate understanding of the past. With that, we can better prepare for the future.
