In his short story Del rigor en la ciencia (On Exactitude in Science), Jorge Luis Borges paints a picture of an extraordinary country where the art of cartography has reached perfection. There, the mapmakers go to such lengths that they create a map at a scale of 1:1, as large as the country itself. The citizens, however, soon discover that such a map offers no new insight: it merely replicates what they already know. Borges’ tale is an extreme yet insightful metaphor for what we experience today: information bias, the delusion that more information leads to better decisions.

In a world where data flows like an unstoppable river, it’s easy to assume that the more information we gather, the more accurate our choices will be. But, as with Borges’ map, we can drown in the detail while all the extra data amounts to nothing more than a mirror reflecting what we already know.

A Hotel Search Gone Awry

In today’s digital age, it’s almost impossible to escape the sheer volume of information at our fingertips. With websites, apps, customer reviews, and travel blogs, we have endless sources to help us make decisions, especially about purchases and services. This was exactly the situation I found myself in when searching for a hotel in Miami. Having narrowed my options to five highly rated hotels, I was ready to choose. But of course I wasn’t satisfied with just those five; I felt the need to dig deeper.

I began reading through customer reviews, which were a mixture of glowing praise and critical feedback. Some reviewers raved about a hotel’s proximity to the beach, while others complained about noisy neighbors or outdated furniture. Still others focused on service quality, mentioning everything from check-in efficiency to staff friendliness. Then I moved on to travel blogs that offered more personalized recommendations, but even these conflicted: one touted a hotel’s amazing location, while another criticized the same spot for being too far from restaurants and attractions. I went in circles, questioning whether I had found the best hotel for my needs.

Two hours later, after scrutinizing reviews, watching countless video tours, and comparing photos of the hotels, I arrived back at my original decision: the hotel I had been drawn to from the start. In hindsight, I realized that the additional information didn’t help me make a better decision; it only wasted my time. If I had trusted my instincts and booked the hotel right away, I would have saved myself hours of research. The paradox of choice had taken over, and the additional data had complicated the process without offering any new insight.

This scenario is a perfect example of the dangers of information overload. The more information we gather, the more likely we are to second-guess our initial instincts. Ultimately, the information I sought didn’t lead to a better decision; it just delayed it. By overanalyzing, I had turned a simple decision into a drawn-out process. The extra data didn’t add value; it created unnecessary complexity and made it harder to focus on the aspects of the choice that mattered most.

The Illusion of Additional Information in Medicine

Information bias becomes even more critical in fields like healthcare, where lives and well-being are at stake. In one of Jonathan Baron’s experiments, doctors were asked to choose a course of action for a patient with an 80% probability of having disease A; if disease A wasn’t present, two other diseases (X and Y) were equally likely, at 10% each. Given those odds, the doctors should logically have recommended the treatment for disease A straight away.

However, Baron also offered a diagnostic test meant to distinguish between diseases X and Y. If the patient had disease A, the test would return a positive result in 50% of cases and a negative result in the other 50%. Even though the result could provide no additional value, since the probability of disease A would remain much higher either way, the doctors were still inclined to recommend conducting the test.

The experiment is a striking illustration of how extra information can skew judgment. Whatever the outcome, the test would not change the likelihood that the patient had disease A, nor the treatment the patient should receive. Yet the additional data seemed necessary to the doctors, even though it contributed nothing to their decision-making. It points to a common cognitive error: the allure of more information leads us to take steps that are not only unnecessary but counterproductive.
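A quick Bayes-rule check makes this concrete. The sketch below uses the numbers implied above (an 80% prior for disease A and 10% each for X and Y) plus one assumption of our own about the test: that it is always positive for X and always negative for Y, alongside the stated 50/50 behavior for disease A.

```python
# Posterior probability of disease A after the test. Only the 80/10/10
# priors and the 50/50 behavior for A come from the experiment as
# described; the test's behavior for X and Y is our assumption.
priors = {"A": 0.8, "X": 0.1, "Y": 0.1}
p_positive = {"A": 0.5, "X": 1.0, "Y": 0.0}  # assumed P(positive | disease)

for result in ("positive", "negative"):
    # Likelihood of this result under each disease
    likelihood = {d: (p if result == "positive" else 1.0 - p)
                  for d, p in p_positive.items()}
    evidence = sum(priors[d] * likelihood[d] for d in priors)
    posterior_a = priors["A"] * likelihood["A"] / evidence
    print(f"P(A | {result} test) = {posterior_a:.2f}")
```

Both branches print 0.80. Whatever the test says, the probability of disease A, and with it the sensible treatment, is exactly what it was before the test was run.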

In medicine, as in other fields, the constant accumulation of information can lead to overdiagnosis and overtreatment. The obsession with getting more data, whether through additional tests or second opinions, can often result in wasted resources and, in some cases, harm the patient. The additional information, rather than helping to clarify the decision, only complicates it. In such cases, the best decision might be to trust the initial diagnosis and avoid the extra data layers that provide no real value.

This experiment illustrates the broader problem of information bias: the false belief that more information always leads to better decision-making. Sometimes, the most valuable thing a decision-maker can do is to filter out the irrelevant data and focus on the essential facts. In medicine, this could mean relying on proven diagnostic tools rather than getting lost in a maze of inconsequential details.

A Bias Towards Familiarity: More Information, More Confusion

When it comes to decision-making, familiarity can often be a double-edged sword. In Gerd Gigerenzer’s experiment involving students from the University of Chicago and the University of Munich, a seemingly simple question exposed this bias. He asked the students which city had the larger population: San Diego or San Antonio. On the surface, the question should be easiest for those who know both cities; the results revealed instead how our existing knowledge shapes our decisions, and how more of it can actually get in the way.

The Munich students, who had heard of San Diego but knew little about San Antonio, overwhelmingly answered that San Diego was larger, which happens to be correct. The Chicago students, familiar with both cities, were more likely to second-guess themselves and more often chose wrongly. In other words, the German students, with less information overall, relied on simple recognition and made the right call, while the Chicago students, weighing everything they knew about both cities, overcomplicated the question and misjudged the answer.

This experiment illustrates how having more information, especially about things that feel familiar, can lead to poor decisions. When we hold several pieces of information about the options, we tend to overanalyze and second-guess, making the decision more complicated than it needs to be. The extra data doesn’t always make the choice clearer; it can muddy the waters, causing us to overthink and miss the simplest solution.
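Gigerenzer’s name for the Munich students’ strategy is the recognition heuristic: when you recognize one of two options and not the other, infer that the recognized one scores higher. Here is a minimal sketch of the idea (the function and the example calls are illustrative, not taken from the study):

```python
def recognition_heuristic(option_a, option_b, recognized):
    """Pick the recognized option; abstain if recognition can't discriminate."""
    a_known = option_a in recognized
    b_known = option_b in recognized
    if a_known and not b_known:
        return option_a
    if b_known and not a_known:
        return option_b
    return None  # both or neither recognized: the heuristic gives no answer

# A Munich student who has only heard of San Diego gets a clear answer:
print(recognition_heuristic("San Diego", "San Antonio", {"San Diego"}))
# -> San Diego

# A Chicago student who knows both cities gets no help from recognition
# and must fall back on weighing everything else they know:
print(recognition_heuristic("San Diego", "San Antonio",
                            {"San Diego", "San Antonio"}))
# -> None
```

The heuristic works only from partial ignorance; once you recognize both options, it falls silent and the harder weighing of details begins.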

In a world that increasingly values information, the bias towards familiarity is a powerful force. It’s easy to assume that the more you know about a subject, the better your decision will be. However, as this experiment shows, the opposite can be true: more information, especially about familiar things, can lead to confusion and errors in judgment. When making decisions, it’s important to notice when familiarity with certain details is influencing you and whether it is clouding your judgment.

The Fallacy of Financial Forecasts

In the world of finance, information bias can have disastrous consequences. The years leading up to the 2008 financial crisis were characterized by an overwhelming amount of economic data and analysis. Financial professionals, including economists, analysts, and traders, relied on a massive amount of information (research reports, mathematical models, economic indicators, and data from stock exchanges) in an attempt to predict the future. With so much data available, the assumption was that the more information they had, the better they could understand market trends and prevent major crises.

However, when the crisis struck, all that information proved to be of little value. The countless research reports and economic models that had been meticulously compiled did not predict the collapse of the housing market or the ensuing global financial disaster. Despite the mountains of data, financial institutions were blindsided. The extra information didn’t prevent the disaster; it contributed to a false sense of confidence.

The financial world’s obsession with data highlights the flaw in the logic that more information leads to better decisions. As financial analysts worked tirelessly to predict the future, they overlooked the complexity and unpredictability of the global economy. The more data they had, the more they trusted their ability to forecast the future. Yet when the crisis hit, it became clear that all that data had not prepared them for the unexpected.

This situation offers a stark reminder that even plentiful data is no guarantee of accuracy. The financial crisis showed that overreliance on information can lead to overconfidence, poor decision-making, and, ultimately, failure. In finance, as in other fields, the key is not accumulating endless data but interpreting the information you have meaningfully. Sometimes less information is more, especially when it helps you see the bigger picture.

The Power of Simplicity

In a world inundated with information, it’s easy to forget that simplicity can often lead to the best decisions. The key is to know when to stop gathering data and focus on the essentials. Rather than becoming trapped in a cycle of endless research and analysis, the most effective decision-makers learn to cut through the noise and focus on what truly matters.

This approach can be seen in business, medicine, and personal decision-making. When presented with many options, it’s tempting to search for the perfect answer by gathering as much information as possible. But more often than not, the simplest solutions are the best. As Daniel J. Boorstin famously said, “The greatest obstacle to discovery is not ignorance—it is the illusion of knowledge.” The more information we have, the more we believe we have all the answers. But this belief can lead us astray.

The ability to embrace simplicity and distill complex problems into their core elements is crucial for making better decisions. In many cases, the best approach is to ignore the noise and focus on the key facts that matter most. By stripping away the extraneous data, we are left with a clearer path forward.

This mindset shift is not about discarding information entirely but about recognizing that not all information is equal. In many cases, pursuing more data distracts us from making the most effective decision. In a world that prizes constant analysis, simplifying can help us navigate the noise and find the clarity we need to move forward.

Conclusion

Information bias, fueled by the delusion that more information guarantees better decisions, is a trap that can lead us astray. Borges’ fictional map, at a scale of 1:1, is the extreme manifestation of this bias. Whether we are choosing a hotel, diagnosing a patient, or forecasting markets, excess information often proves futile and time-consuming. It is crucial to recognize the value of simplicity and of focusing on essential facts rather than succumbing to the allure of surplus knowledge. By doing so, we can navigate the sea of data more effectively, make informed decisions, and avoid the illusion of knowledge.

This article is part of The Art of Thinking Clearly series, based on Rolf Dobelli’s book.