Since the dawn of recorded history, the consequences of disobeying authority have been ingrained in our collective consciousness. From the tale of Adam and Eve’s expulsion from paradise to the narratives propagated by earthly figures of authority, the notion of deferring to power has been deeply entrenched. This article delves into authority bias, which clouds our ability to think critically and independently. We examine the track records of authorities, the shocking results of psychological experiments, and the impact of this bias on decision-making across various realms of society.
What is Authority Bias?
Authority bias is a pervasive psychological phenomenon that influences nearly every decision we make. It refers to our tendency to automatically trust, believe, or follow the opinions and directives of those in positions of power, expertise, or influence. From religious leaders to corporate CEOs, doctors, and scientists, authority figures wield significant influence over our thoughts and actions.
Our trust in authority figures is deeply ingrained in human nature and social structure. From a young age, we are taught to respect figures of authority—parents, teachers, and public figures are trusted because they are believed to hold superior knowledge or power. This helps establish order in society and ensures we operate within shared norms and guidelines. But this automatic deference can be dangerous. When authority figures are wrong, misguided, or manipulative, we can be led into decisions that are not in our best interest. Authority bias clouds our ability to think critically, making us more likely to follow instructions or accept ideas without question, even when those in charge are not a reliable source of information.
Track Records of Authorities: A Sobering Reality
When we think about authority, we tend to believe those who hold positions of power have earned their status through competence, experience, or knowledge. However, examining the track record of various authorities across different fields reveals a much less impressive reality. Experts, despite their titles and credentials, often get things wrong. Their predictions, decisions, and advice can be proven flawed or misguided, sometimes with catastrophic results.
Take the global financial crisis of 2008 as an example. Economists, financial experts, and banking professionals—many of whom held prestigious titles and years of experience—failed to predict the impending collapse of the world economy. Despite approximately one million trained economists worldwide, only a handful, such as Nouriel Roubini and Nassim Taleb, foresaw the magnitude of the crisis. Most misjudged how events would unfold, from the bursting of the housing bubble to the cascade of credit-default-swap losses and the economic crash that followed. That so many highly educated, well-regarded authorities could not foresee such an event underlines how vulnerable we are when we over-rely on those with official titles.
This pattern of failure extends far beyond economics. In medicine, doctors before the 20th century were frequently wrong, relying on methods like bloodletting and unsterile procedures. Medical authorities made decisions based on outdated theories, and patients often paid the price. While we have made enormous strides in medicine since then, the history of poor medical advice reminds us that even highly regarded professionals can err. Relying too heavily on authority figures without questioning their methods or track records can lead us to make dangerous or misguided choices.
The Experiment That Exposed Authority Bias
Stanley Milgram’s 1961 experiment is one of the most chilling demonstrations of the power of authority bias. In this study, participants were told they would be assisting in a research project on learning and memory, where they were instructed to administer electrical shocks to a person on the other side of a partition. The “learner,” actually an actor, was to answer questions, and each incorrect answer would result in a progressively stronger shock. Starting at 15 volts, the shock levels would increase in 15-volt increments, eventually reaching 450 volts—a potentially lethal level.
The experiment’s most disturbing aspect was how the authority figure, the experimenter in a lab coat, urged the participants to continue. Despite hearing screams and cries for mercy from the actor, and even when the person on the other side of the wall stopped responding (indicating unconsciousness), most participants followed the instructions to increase the voltage, all because the experimenter told them to keep going. Even though the experiment was staged and no shocks were delivered, the participants believed they were inflicting real pain and suffering.
Milgram’s experiment shocked the world, revealing how powerfully authority figures can influence people’s actions. More than half of the participants continued to the maximum shock level, following the experimenter’s commands without question. This phenomenon highlights the danger of obeying authority without stopping to question the morality or consequences of our actions. The experiment is an eye-opener into the psychological mechanisms at work when individuals defer to those in power.
The Authority Bias in the Airline Industry
The airline industry provides another example of the dangers of authority bias, specifically in its historical treatment of captaincy. In the past, captains were viewed as infallible figures, with their commands respected and feared. While fostering respect for the captain’s experience, this hierarchical structure also created a situation where co-pilots and other crew members were often unwilling to challenge the captain, even when they believed something was wrong. This reluctance to speak up, especially in the face of danger, has been directly linked to several tragic accidents in the aviation industry.
One of the most infamous examples was the crash of Avianca Flight 52 in 1990. The first officer noticed the captain’s questionable decision-making and attempted to voice his concerns, but the hierarchical cockpit culture made it difficult for him to speak out forcefully. The tragedy underscored the dangerous consequences of an unquestioned command structure. As a result, airlines began to recognize the importance of breaking down the traditional power dynamics in the cockpit.
In response to these concerns, the aviation industry introduced Crew Resource Management (CRM), a set of training programs designed to promote open communication among all crew members, regardless of rank. CRM encourages pilots, co-pilots, and other crew members to speak up if they believe something is amiss, ensuring that critical issues are addressed promptly and effectively. This shift has played a significant role in improving flight safety over the past several decades, proving that de-emphasizing rigid authority in favor of teamwork and open communication can save lives.
CRM offers a valuable lesson for other industries as well: fostering a culture of collaboration, in which authority is questioned rather than blindly followed, leads to better decision-making and greater safety.
The Corporate Hierarchy and the Authority Bias
The corporate world, particularly in companies with domineering CEOs, is another environment where authority bias can have detrimental effects. In organizations where power is concentrated at the top, employees often fear speaking out against decisions or offering their ideas, even if they believe a course of action could be detrimental. This fear of challenging the “boss” leads to an environment where ideas are stifled, and problems may go unaddressed simply because no one feels empowered to raise concerns.
In such organizations, the hierarchical structure discourages collaboration and the free exchange of ideas. This leads to a lack of innovation and creative thinking. Employees may withhold valuable feedback, not out of malice, but because they are conditioned to respect the CEO’s or senior leaders’ authority. The danger in this scenario is that critical issues may not be raised until they escalate into crises. The authority bias stifles the flow of information, and as a result, businesses miss out on opportunities for improvement, growth, and problem-solving.
In contrast, companies that create environments where questioning authority is encouraged tend to see higher levels of engagement, creativity, and productivity. Organizations like this foster a culture of transparency and open dialogue, where employees feel comfortable challenging ideas, proposing solutions, and raising concerns. These companies recognize that good ideas can come from anywhere, regardless of rank, and that empowering all levels of the organization to speak freely leads to better decision-making and innovation.
Symbols of Authority and the Perception of Expertise
Symbols of authority—uniforms, titles, and other visual cues—powerfully shape how we perceive expertise. A person in a white coat, a CEO in a tailored suit, or a military officer in full regalia projects an image of competence and credibility, and these symbols play a significant role in how we judge an individual’s expertise.
In today’s media-saturated world, these symbols have only expanded. Doctors, for instance, may appear on television programs, write books, or contribute to high-profile magazines. These activities reinforce their image as experts, even though their qualifications or experience may not be as extensive as we believe. Similarly, corporate leaders use their appearances on talk shows, in press releases, or in business magazines to perpetuate their image as capable decision-makers, which can shape how their advice and decisions are received.
However, we must remember that these symbols are not always indicators of actual expertise. They are often part of a carefully constructed image designed to convince the public of an individual’s authority and competence. While these symbols can help establish credibility, they can also mislead us into overestimating someone’s abilities. Just because someone wears a white coat or sits in a boardroom doesn’t mean their judgment is infallible. Critical thinking requires looking beyond the symbols and assessing the actual competence of the individual.
Challenging Authority for Clearer Thinking
The key to protecting ourselves from the dangers of authority bias is to consistently question authority figures, regardless of their position, title, or expertise. When faced with a decision, it is essential to consider whose influence is shaping our thinking. Are we trusting an expert’s opinion because of their credentials, or because we have critically evaluated the evidence?
When encountering authority figures, don’t simply accept their advice or directives at face value. Engage with them by asking questions, seeking clarification, and exploring alternative perspectives. Challenge assumptions and demand evidence. By questioning authority, we can avoid the pitfalls of blindly following the crowd and make more thoughtful, informed decisions.
This doesn’t mean rejecting authority outright. It simply means recognizing that authority, like any other source of information, can be flawed. True wisdom lies in the ability to question, evaluate, and make decisions based on our judgment and reasoning rather than blindly deferring to those in power. This way, we regain control over our thinking and make more informed decisions.
Conclusion
The authority bias presents a formidable obstacle to clear thinking and independent decision-making. By acknowledging the fallibility of authorities, questioning their track records, and recognizing the dangers of blind obedience, we can break free from the grip of this cognitive bias. It is incumbent upon us, as individuals and as a society, to foster a culture that values critical thinking and encourages constructive skepticism. So, the next time you encounter an authority figure, remember the importance of challenging them—only through that scrutiny can we navigate a path toward informed and autonomous decision-making.
This article belongs to The Art of Thinking Clearly Series, based on Rolf Dobelli’s book.