We like to believe we have a knack for predicting the future. After all, our models, projections, and analyses seem impressive. The uncomfortable truth is that we are adept at forecasting the expected but blind to the surprises, the curveballs that redefine everything. These unforeseen events are not footnotes; they are the main act.

The largest, most consequential risks always emerge from the shadows of our blind spots. If no one anticipates them, no one prepares. And when they arrive, their impact is magnified, their devastation amplified. It’s a brutal reality that no amount of planning can fully erase.

A Cautionary Tale from the Edge of Space

NASA’s mission to conquer space was never just about rockets and astronauts; it was a symphony of precision, rigorous testing, and painstaking attention to the smallest details. Before humans were catapulted into orbit, every piece of technology had to prove itself under conditions mimicking the unforgiving vacuum and freezing cold of near-space. One of the critical steps in this journey involved sending pilots up in high-altitude balloons, floating at the very edge of Earth’s atmosphere.

On May 4, 1961, Victor Prather, a Navy flight surgeon, embarked on such a mission. Accompanied by pilot Malcolm Ross, Prather ascended in a balloon to an astounding altitude of 113,720 feet, more than 21 miles above sea level, skimming the boundary between Earth and space itself. The goal was straightforward yet vital: test NASA's newly developed space suit in real near-space conditions, verifying its ability to keep a human alive in an environment where the air thins to near nothingness and temperatures plummet.

The mission was a technical success. The suit maintained pressure, oxygen supply, and temperature flawlessly. It was a triumph for engineering and science—proof that humans could soon venture beyond our atmosphere wearing this suit as a protective shell. The astronauts of the future owed a debt to flights like this one.

However, the true test came on descent. As Prather neared altitudes where breathing unassisted was possible, he opened the faceplate of his helmet to take in fresh air. This action, natural and reasonable in the moment, had devastating consequences.

Once he splashed down into the ocean, the designated landing zone, he was to be retrieved by a rescue helicopter. But during the transfer, Prather slipped while hooking himself to the helicopter's rescue line. Ordinarily, the space suit's buoyancy and watertight seals would have kept him afloat, a fail-safe against drowning. With the faceplate open, however, the suit's integrity was compromised: seawater flooded in rapidly, and the suit that should have saved him could no longer keep him afloat.

Despite the presence of the rescue team, Prather drowned. The tragedy was a direct result of a tiny, overlooked factor: opening the helmet’s faceplate too early. It was a detail so small it escaped notice in thousands of hours of planning and preparation.

This incident encapsulates a profound lesson about risk. NASA, arguably one of the most planning-obsessed organizations in history, with thousands of experts obsessing over every contingency, failed because of a risk no one saw coming. This was not a failure of competence or diligence—it was the inherent nature of risk itself.

In complex systems, the unknown risks—the "leftover" risks Carl Richards describes—will always exist. No amount of meticulous planning can eliminate the blind spots that lurk just beyond our imagination. When the environment is hostile enough and the stakes high enough, even a tiny misstep unseen in rehearsals can prove catastrophic.

For Morgan Housel, this story is a vivid metaphor. It illustrates how, in investing and life, the most devastating setbacks often stem not from foreseeable dangers but from those elusive “unknown unknowns.” It underscores the futility of believing any plan is ever truly complete. The space suit worked perfectly—except for the one unanticipated action that changed everything.

The caution here is clear: risk isn’t merely the hazards on your radar. It’s what remains after you’ve imagined, tested, and prepared for every known threat. And that residual unknown? It’s where true risk thrives.

History’s Greatest Shocks: The Unforeseen Giants

History is a mosaic of moments that have shaped civilizations, economies, and societies—many of which share a striking feature: their arrival caught nearly everyone off guard. The cataclysms that carved the deepest grooves in our collective memory—COVID-19, the September 11 attacks, Pearl Harbor, the Great Depression—weren’t merely significant due to their scale. Their defining characteristic was the sheer surprise they delivered.

Take the Great Depression, for instance. The 1920s were a decade of exuberance, a heady period where economic growth seemed unending, stock prices soared, and optimism ruled. It felt like prosperity was permanent. This sentiment was encapsulated in October 1929 by Irving Fisher, a highly respected economist, who famously proclaimed that stock prices had reached “what looks like a permanently high plateau.” It was a statement of confidence, a declaration that the good times would last.

From today’s vantage point, such confidence appears naive, even reckless. How could an economist be so blind? Yet Fisher was not an outlier. Countless intelligent, well-informed individuals shared this belief. No widespread warning about an impending collapse circulated. That is the paradox of hindsight: what seems obvious in retrospect was shrouded in uncertainty and optimism at the time.

I recall an illuminating conversation with Robert Shiller, Nobel laureate and one of the leading voices on economic bubbles. When I asked if anyone predicted the Great Depression, his response was unequivocal: “Nobody forecasted that. Zero. Nobody. Some people said the market was overpriced, but nobody predicted a decade-long depression.” This blunt honesty strips away any illusion that foresight is common or easy.

This disconnect admits two explanations. Either those living through these events were collectively deluded, unable to see the dangers lurking beneath the surface, or we today are victims of hindsight bias, fabricating clarity where none existed.

The latter is the more compelling explanation. Our clarity is born of knowledge unavailable to those who experienced the events in real time. History looks obvious because we know its outcomes; at the moment, uncertainty dominates.

This pattern repeats through history. Even prestigious and diligent institutions miss the biggest stories until they unfold. For example, The Economist’s January 2020 issue made no mention of COVID-19. Its January 2022 issue ignored the looming Russian invasion of Ukraine. These are not failures of journalism but reflections of an unavoidable truth: the most consequential risks and events are often beyond our anticipatory reach.

Why does this matter? Because it reshapes how we think about risk. The gravest dangers are not the ones we can list or quantify. They are the “unknown unknowns”—the wild cards hiding beyond our foresight. This makes risk a slippery adversary.

Morgan Housel often emphasizes that this is why risk can never be fully controlled or mastered. The largest upheavals are shaped not by the risks we calculate but by those we cannot envision. Recognizing this is crucial—it invites humility and cautious preparation over hubris and blind certainty.

These shocks are more than history lessons; they are warnings etched into the fabric of our existence. They remind us that surprises will come. They always do. And when they arrive, they change everything.

The Blinding Nature of Limited Perception

One of the most unsettling truths about risk is that even the most catastrophic events often unfold under a veil of collective ignorance. The Great Depression stands as a stark example of this phenomenon. Though today we regard it as an obvious turning point, at the time, many influential observers completely missed the warning signs. In 1930, well into the depths of the economic crisis, the National Economic League polled its members about what they perceived as the gravest problems facing the United States. Surprisingly, their top concerns were justice administration, Prohibition, lawlessness, crime, and world peace. Unemployment—the very heart of the Depression—was ranked a distant eighteenth.

A year later, in 1931, unemployment had climbed the ranks to fourth place, still trailing behind other issues like Prohibition and law enforcement. This striking misalignment between reality and perception exemplifies how limited human cognition and selective attention can distort understanding during a crisis. People simply weren’t mentally or financially prepared because the scale and immediacy of the disaster were hidden in plain sight.

This cognitive myopia isn’t just a relic of the past; it’s a recurring human limitation. History, after all, is an incomplete and imperfect tapestry. It’s pieced together from photographs, official records, personal writings, and interviews—all of which capture only fragments of what truly transpired. These sources are vulnerable to bias, misinterpretation, selective memory, and outright deception. Franklin Delano Roosevelt famously joked about this when he opened his presidential library, quipping that historians would arrive expecting definitive answers but find only partial truths.

This fundamental limitation in our understanding creates a blindness not only to the past but also to the present. Imagine the worldview of a three-year-old child, basking in the sunlight with toys scattered around, blissfully unaware of the complexities of geopolitics, economic systems, or health risks. Their universe is small, bounded by what they can immediately perceive and comprehend.

Adults, despite their knowledge and experience, often operate with a similar, albeit more sophisticated, blindness at scale. Daniel Kahneman, a pioneer in behavioral economics, captures this succinctly: “The idea that what you don’t see might refute everything you believe just doesn’t occur to us.” This psychological barrier prevents us from considering that our worldview might be fundamentally flawed or incomplete.

A haunting real-world illustration comes from a New York City newscast on the morning of September 11, 2001. Just minutes before the terrorist attacks, the anchors cheerfully reported on the pleasant weather and the usual news—unaware that the world was about to change irrevocably. Risk was invisible in that moment because it hadn’t yet manifested.

This limited perception shapes how societies and individuals respond to risk. It explains why crises can escalate rapidly, why early warnings are often ignored or misunderstood, and why even the most prepared can be caught off guard. Our awareness is bounded not only by information but by the frameworks we use to interpret it.

Morgan Housel reflects on this as a core reason why uncertainty is so persistent. No matter how much data we gather or how sophisticated our models become, there will always be a horizon beyond which we cannot see—a shadowland where the greatest risks hide.

To acknowledge this is not to surrender to fatalism but to cultivate humility and adaptability. Understanding the blinding nature of limited perception encourages us to question our assumptions, seek diverse perspectives, and build resilience not just against known threats but against the unpredictable unknown.

Preparing for the Unimaginable

If risk lives in the realm of the unseen, the question naturally arises: how do we prepare for dangers that we can’t name, much less predict? The answer lies less in attempting exact forecasts and more in cultivating a mindset of preparedness and resilience—a philosophy exemplified by the way California confronts earthquakes.

No seismologist can predict precisely when or where the next major earthquake will strike, nor its magnitude. The timing and intensity remain shrouded in uncertainty. Yet, rather than paralyzing the state, this unknown has driven a culture of readiness. Building codes are stringent, requiring structures to withstand forces that may not materialize for decades or even centuries. Emergency services conduct regular drills, stockpile resources, and educate the public—all without knowing the specifics of the next quake.

This approach acknowledges a fundamental truth: preparing for risk does not require predicting it, only accepting that it is inevitable. Nassim Taleb, the author of The Black Swan, encapsulates this wisdom succinctly: "Invest in preparedness, not in prediction." This flips conventional wisdom on its head. Instead of fixating on forecasting specific risks, the focus shifts to building the capacity to absorb shocks of any kind.

Forecasts, by their nature, are limited. They rely on past data, observable trends, and known variables, tools that capture only part of the complex systems shaping our world. When the truly unexpected strikes (a black swan), forecasts fall silent or actively mislead.

For individuals and institutions alike, this means adopting expectations rather than precise forecasts. Expect that something disruptive will happen, though you can’t say when, where, or how. This expectation fuels prudent actions and buffers against complacency.

In personal finance, for example, this mindset translates into saving more than what feels comfortable. Morgan Housel shares how he came to view “just enough” savings as dangerously insufficient. The buffer should be uncomfortable—it should feel like an excess. Because when the unexpected hits, what once seemed excessive is what keeps you afloat.

The same principle applies to managing debt. Most people overestimate how much debt they can safely carry; the right level is usually less than your intuition suggests. Preparation that feels excessive is the price of survival in a world where risk is shaped by unseen forces.

This philosophy goes beyond finance. Businesses must build operational resilience; governments must maintain emergency readiness; communities must foster social cohesion. The key is embracing uncertainty as an immutable fact, then structuring your systems and behavior accordingly.

Even the most meticulous planners—those who map every conceivable scenario—will confront unknowns. They might anticipate a dozen or twenty risks but will fail to imagine the one that truly disrupts everything. That is not incompetence; it is the nature of complexity.

Morgan Housel often reminds us that in investing, life, and risk, humility is paramount. The arrogance of certainty invites disaster. The wisdom of preparedness, on the other hand, acknowledges that the greatest risks are invisible until they strike. Your job is not to predict them but to build your capacity to endure their arrival.

In this light, preparedness is an ongoing practice, a state of mind that values buffers, redundancy, and flexibility over precision. It’s a commitment to survive not just the expected but the unimaginable.

Houdini’s Last Lesson: Vulnerability in the Unseen

Harry Houdini’s name is synonymous with mastery over danger. He was the quintessential escape artist, famous for liberating himself from impossible restraints: locked handcuffs, submerged water tanks, buried coffins, and chains thrown into turbulent rivers. His feats weren’t mere illusions—they were the product of exhaustive preparation, intense physical conditioning, and razor-sharp mental focus. Houdini’s ability to foresee and neutralize risk was unparalleled; he thrived on confronting the known dangers head-on.

Yet, even Houdini’s extraordinary skill had a blind spot. In 1926, after one of his performances, a young student named Gordon Whitehead approached him backstage. Inspired by Houdini’s legendary strength and endurance, Whitehead began to punch Houdini repeatedly in the abdomen—without warning, and without Houdini bracing for the impact as he would on stage.

This sudden assault was not malicious; Whitehead was attempting to replicate Houdini’s famous trick, unaware of the damage he was inflicting. Houdini was caught completely off guard. He wasn’t flexing his solar plexus, steadying his stance, or holding his breath—techniques he relied on to withstand blows during performances.

The consequence was catastrophic: the repeated punches ruptured his appendix. The injury quickly escalated into peritonitis, a life-threatening infection that claimed Houdini's life just days later.

This episode is one of the most poignant examples of the lethal power of unseen risk. Here was a man who had mastered the art of surviving dramatic, anticipated dangers—yet was undone by a small, unpredictable, and unanticipated event. It was a risk he neither saw coming nor prepared for.

Houdini’s fate underscores a critical lesson about the nature of risk: no matter how skilled, prepared, or experienced you are, the greatest threats often arise from blind spots. These risks lurk in the corners of our awareness, waiting to strike when our defenses are down.

Morgan Housel frequently reflects on this story to highlight a universal truth in investing and life: success is less about avoiding all danger and more about recognizing that the biggest risks come disguised as the mundane, the overlooked, and the unexpected. Houdini’s death wasn’t the result of grand failure or recklessness—it was a subtle jab from the unknown.

This teaches us humility. The illusion of control is fragile. Our plans, expertise, and experience can protect us from many perils, but not from all. There will always be vulnerabilities we haven’t identified, surprises that unravel even the most carefully crafted defenses.

Ultimately, Houdini’s last lesson is this: the most dangerous risk is the one you never anticipate. The unseen jab carries a weight far heavier than the threats you prepare for. Recognizing this is the cornerstone of wise risk management—accepting uncertainty, preparing for resilience, and never becoming complacent, no matter how invincible you feel.

Conclusion

The landscape of risk is littered with invisible threats. The surprises—those wild, uncharted variables—dictate the course of history, economics, and personal fortunes. The only certainty is uncertainty itself. Embrace it with humility, over-prepare with intention, and remain ever aware that the most profound risks are those that lurk beyond sight.