Imagine you’re given two statements about a man named Chris, who has had a distinguished career with the Red Cross and has completed an MBA focused on corporate social responsibility. You’re asked which scenario is more likely:

A) Chris works for a major bank.

B) Chris works for a major bank, where he runs its Third World foundation.

Which one would you choose?

Many would opt for B, but that choice is a mistake. In reality, option A is far more probable. Here’s why: the second option adds an extra condition. While Chris’s background might align well with running a bank’s Third World foundation, that detail restricts the statement to a much smaller subset of everyone who works for a bank. And by definition, a subset can never be more likely than the set that contains it. This misstep is what psychologists call the conjunction fallacy, a bias to which we all fall victim.

Understanding the Conjunction Fallacy

The conjunction fallacy is a subtle yet pervasive cognitive bias that leads people to judge two events happening together as more likely than one of those events happening alone, even though the combined probability is always lower or, at best, equal. This mistake arises because our brains are wired to prefer coherent, plausible, or “harmonious” stories, even at the expense of accuracy. When presented with additional details that fit neatly into an established narrative, we are drawn to them and believe they make the story more likely. In reality, the more specific the combination of events, the less likely it is to occur.
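Formally, the bias runs up against one of the most basic rules of probability. For any two events A and B, the chance of both occurring together can never exceed the chance of either one alone:

```latex
P(A \cap B) \;=\; P(A)\,P(B \mid A) \;\le\; \min\bigl(P(A),\, P(B)\bigr)
```

Since P(B | A) is at most 1, every condition we attach can only hold a probability where it is or shrink it; no amount of plausible detail can push it up.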

In the case of Chris, when presented with the two options, whether he works for a major bank or works for a major bank and runs its Third World foundation, the added condition makes the second seem more likely. Why? His background in humanitarian aid and corporate social responsibility suggests he would be a perfect fit for such a role. By adding this extra detail, however, we narrow the field: far fewer people run a Third World foundation within a major bank than work at a major bank in any capacity, so the probability of the specific scenario is necessarily smaller than the general one. This is the essence of the conjunction fallacy: an additional condition raises the perceived likelihood even as the actual probability falls.
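To make this concrete, here is a minimal sketch in Python. The probabilities are invented purely for illustration; only the product rule and the resulting inequality matter:

```python
# Invented, illustrative probabilities; the point is the inequality, not the values.
p_bank = 0.02                    # P(A): someone with Chris's profile works for a major bank
p_foundation_given_bank = 0.01   # P(B|A): he runs its Third World foundation, given A

# Product rule: P(A and B) = P(A) * P(B|A)
p_bank_and_foundation = p_bank * p_foundation_given_bank

print(f"P(bank)                = {p_bank:.4f}")                 # 0.0200
print(f"P(bank and foundation) = {p_bank_and_foundation:.6f}")  # 0.000200

# Because P(B|A) can never exceed 1, the conjunction can never exceed the single event.
assert p_bank_and_foundation <= p_bank
```

However plausible the foundation role sounds, multiplying by a conditional probability can only shrink the number.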

This fallacy arises from our natural inclination to weave facts into a story that makes sense, often disregarding mathematical probabilities in favor of what feels intuitively true. Our minds are prone to this type of reasoning because narratives help us quickly make sense of complex information. We are wired to believe in stories; we find them comforting and easy to understand. However, the cognitive shortcut of treating additional details as evidence that an event is more likely can lead to serious errors in judgment.

Real-World Examples of the Conjunction Fallacy

The conjunction fallacy plays out in real-world scenarios far more often than we realize, particularly when we make judgments about everyday occurrences. We tend to be drawn to details that make a story seem more plausible, often overlooking other, equally likely explanations that don’t fit neatly into our mental models.

Take the example of the Seattle airport being closed. You’re asked which of the following is more likely:

A) “Seattle airport is closed. Flights are canceled.”

B) “Seattle airport is closed due to bad weather. Flights are canceled.”

Most people would choose option B, assuming that the added detail of bad weather makes it a more plausible and reasonable explanation. Bad weather is a common cause of airport closures and flight cancellations, and we are naturally drawn to explanations that fit our understanding of how the world works. However, option A is more likely precisely because it imposes no additional constraints: the airport could have been closed for any number of reasons, and bad weather, however frequent, is only one of them. The added condition in option B reduces the overall likelihood of that specific event.
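A quick simulation makes the gap visible. The sketch below uses made-up cause probabilities (assumptions for illustration, not real aviation statistics) and simply counts how often the airport is closed at all versus closed specifically because of bad weather:

```python
import random

random.seed(42)

# Hypothetical closure causes and their daily probabilities (invented for illustration).
CAUSES = {"bad weather": 0.010, "security incident": 0.002, "technical failure": 0.003}

TRIALS = 200_000
closed = 0
closed_due_to_weather = 0

for _ in range(TRIALS):
    # Each cause may independently trigger a closure on a simulated day.
    triggered = [cause for cause, p in CAUSES.items() if random.random() < p]
    if triggered:
        closed += 1
        if "bad weather" in triggered:
            closed_due_to_weather += 1

print(f"P(closed)                 ~ {closed / TRIALS:.4f}")
print(f"P(closed AND bad weather) ~ {closed_due_to_weather / TRIALS:.4f}")

# A weather closure is a subset of all closures, so its count can never be larger.
assert closed_due_to_weather <= closed
```

Every run shows the same pattern: “closed” always occurs at least as often as “closed because of bad weather,” because the second event is contained in the first.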

This is a classic case of the conjunction fallacy, where the addition of seemingly plausible details leads us to believe that the event described is more probable than it is. The extra condition of bad weather makes the event feel more “complete” and thus more likely. But in reality, this extra detail narrows the pool of possible causes, making the scenario less probable.

Even in everyday situations, we make similar errors. For example, if someone tells you a story about a person in need of help, and that person happens to be from a developing country, you might be more inclined to believe the story involves charity work or humanitarian aid, simply because the detail fits a narrative you expect. Yet that extra detail about the person’s background does not make the event more likely; it only adds a layer of bias that makes the story feel more likely.

Experts Are Not Immune

The conjunction fallacy isn’t limited to casual thinkers. Even experts—highly trained and experienced in their fields—are not immune to this bias. This is particularly evident in situations where complex data is being interpreted or where professionals have extensive knowledge that they rely on when making decisions.

The 1982 experiment conducted by Daniel Kahneman and Amos Tversky demonstrates how even academics can fall prey to the conjunction fallacy. In this study, participants were divided into two groups. Group A was told: “Oil consumption will decrease by 30%,” while Group B was presented with a more detailed forecast: “A dramatic rise in oil prices will lead to a 30% reduction in oil consumption.” Both forecasts end in the same outcome, a 30% drop in oil consumption, but Group B’s version attaches a cause. That extra detail about rising prices added a sense of clarity and context, making the prediction feel more grounded in reality, even though the combined event of a price rise and a consumption drop cannot be more likely than the drop alone.

This highlights a key aspect of the conjunction fallacy: the human tendency to lean on additional, seemingly relevant details when judging likelihood, even when those details do nothing to improve our predictions. Experts, too, are influenced by the allure of plausible narratives. The fact that Group B felt much more strongly about its forecast, despite that forecast being, if anything, less probable than Group A’s, underscores how deeply ingrained this bias is, even in individuals expected to make rational decisions based on facts and data.

The conjunction fallacy can lead to significant misjudgments in economics, finance, healthcare, and other fields that depend on expert judgment. An expert may feel more confident in a prediction because it aligns with a detailed, plausible story, rather than critically examining the facts and recognizing that those details do not increase the likelihood of the outcome. This is why, even in specialized fields, it is crucial to challenge intuitive judgments and attend to the actual probabilities rather than being swayed by a convincing narrative.

Intuitive vs. Rational Thinking

The key to understanding why we fall for the conjunction fallacy lies in the interplay between two distinct types of thinking: intuitive and rational. Daniel Kahneman’s work on dual-system thinking distinguishes between these two cognitive processes. System 1 is fast, automatic, and intuitive. It operates on impulse and is based on gut feelings. System 2, in contrast, is slow, deliberate, and rational—it involves careful thought, logical reasoning, and analysis.

Intuitive thinking is essential for navigating day-to-day life efficiently. It allows us to make quick decisions based on patterns and past experiences without much cognitive effort. However, this automatic system can also lead us astray. When we encounter a decision or scenario, our intuitive brain quickly jumps to conclusions, often relying on vivid, emotionally charged details that seem to fit the story. This is exactly what happens in the conjunction fallacy: our minds gravitate toward the extra details, believing they make the scenario more likely when in fact they do the opposite.

System 2, on the other hand, is the more deliberate, analytical mode of thinking. It requires conscious effort and attention to detail. It’s crucial to engage System 2 to evaluate all possible outcomes carefully and rationally when making important decisions. The problem is that System 1 often operates so quickly that it overrides the more logical System 2 before we even have a chance to engage it. This is why we often make snap judgments that feel right but are flawed.

For example, in the case of Chris and the bank, our intuitive mind is drawn to the extra details that make the scenario feel complete. His background in aid work makes the idea of him running a Third World foundation at a bank seem plausible. However, our rational mind—if we slow down to consider the statistics—would quickly recognize that the first option, where Chris simply works for a major bank, is far more probable.

This dynamic between the intuitive and rational systems is at the heart of why we fall for cognitive biases like the conjunction fallacy. It’s not that we’re inherently bad at reasoning; it’s just that our brain’s default mode of thinking is to make decisions quickly, relying on patterns and heuristics, which can lead us to make errors in judgment.

The Importance of Conscious Thought

To overcome the conjunction fallacy and other cognitive biases, it’s essential to consciously engage the rational System 2 part of our brain. Kahneman’s research highlights the importance of recognizing when our intuitive thinking might lead us astray and taking a step back to engage in more deliberate, careful reasoning.

When faced with a decision that involves additional details, as with Chris and the Third World foundation, take the time to evaluate whether those details actually increase the likelihood of the scenario. In many cases, the extra information simply narrows the pool of possibilities, making the situation less likely, not more. By slowing down and engaging System 2, we can override the automatic biases of System 1 and make more accurate decisions.

Training ourselves to be more aware of our cognitive biases is crucial for improving decision-making in all aspects of life. When confronted with a scenario that seems too good to be true or an event that seems particularly likely because of the extra details, we must pause and critically assess whether those details are truly relevant. Often, they’re not; by recognizing this, we can make decisions that are more grounded in reality.

This process takes effort, but the rewards are significant. By deliberately slowing down our thinking, questioning our assumptions, and avoiding the seductive pull of extra details, we can avoid the conjunction fallacy and make better, more informed decisions.

The Takeaway: Avoiding the Conjunction Fallacy

We must develop an awareness of how our minds operate to avoid falling into the conjunction fallacy. Our natural inclination is to favor stories that feel complete and plausible, but this leads us astray. When faced with a decision, especially an important one, remember that adding details and conditions does not make an outcome more likely. It makes it less likely.

When you encounter an option padded with extra, specific details, pause for a moment and ask whether those details are necessary or have merely been added to make the scenario seem more plausible. If they are unnecessary, set them aside and focus on the core of the scenario. By doing so, you’ll make decisions based on logic rather than the seductive pull of a convincing but flawed narrative.

In summary, the conjunction fallacy reminds us that the more complex and detailed a story sounds, the more we must be on guard. Extra details may deepen our understanding of a scenario, but they can only lower its probability, never raise it. Whether you’re making personal, professional, or financial decisions, it’s critical to recognize when your intuition is being led astray by a seemingly plausible but ultimately flawed story.

This article is part of The Art of Thinking Clearly Series based on Rolf Dobelli’s book.