For as long as stories have been told, the consequences of disobeying authority have been ingrained in our collective consciousness. From the tale of Adam and Eve’s expulsion from paradise to the narratives propagated by earthly figures of power, the notion that authority should not be questioned runs deep. In this article, we delve into the concept of authority bias, which clouds our ability to think critically and independently. We examine the track records of authorities, the shocking results of psychological experiments, and the impact of this bias on decision-making across many realms of society.

The Fallibility of Authority

Our trust in authority figures often hinges on the assumption that their expertise guarantees accuracy. This belief is deeply ingrained, influencing how we perceive the reliability of experts in various fields. However, historical and contemporary evidence reveals that such trust can be misplaced.

In economics, the 2008 financial crisis is a stark example of this fallibility. Economists, supposedly the custodians of financial wisdom, largely failed to foresee the crisis.

Despite the presence of approximately one million trained economists worldwide, few anticipated the timing or the scale of the collapse. The failure to predict the real-estate bubble’s burst, the subsequent collapse of credit default swaps, and the ensuing global financial meltdown highlighted a significant shortcoming in the discipline.

The exceptions, such as Nouriel Roubini and Nassim Taleb, who had warned about the potential for such a crisis, were anomalies rather than the norm. Their predictions were based on unconventional thinking and a critical analysis of the risks, contrasting sharply with the general consensus that failed to foresee the crisis. This instance underscores the limitations of relying on expert opinions, especially when they reflect a narrow range of perspectives or a flawed understanding of complex systems.

Similarly, the medical field’s history provides another example of the dangers associated with authority bias. Prior to the 20th century, medical practices were often rudimentary and sometimes harmful. Treatments such as bloodletting, which was based on outdated theories about bodily humors, frequently worsened patients’ conditions.

The lack of basic hygiene and antiseptic techniques led to high rates of infection and mortality following surgeries. For example, many patients contracted sepsis from surgical wounds due to unsterilized instruments and unsanitary conditions.

This historical context illustrates that even authoritative figures in medicine, who were viewed as the ultimate experts of their time, could cause significant harm due to their lack of scientific knowledge and outdated practices. The medical field’s evolution demonstrates the necessity of questioning and re-evaluating accepted practices as new evidence and methodologies emerge.

Milgram’s Experiment: A Case Study in Obedience

Stanley Milgram’s 1961 experiment on obedience to authority provides a striking demonstration of authority bias in action. The experiment was designed to explore how far individuals would go in obeying instructions from an authority figure, even when those instructions conflicted with their personal moral beliefs.

Participants were told they were part of a study on learning and memory. They were instructed to administer increasingly severe electrical shocks to a person in another room whenever that person gave a wrong answer to a question. The shocks ranged from 15 volts to a maximum of 450 volts, with the higher levels labeled “Danger: Severe Shock.” In reality, no actual shocks were administered; the person receiving the shocks was an actor who feigned pain and distress.

As the intensity of the shocks increased, the actor’s responses became more agonizing. Despite this, many participants continued administering shocks when prompted by the experimenter, who insisted that the study required their compliance. In Milgram’s original study, about 65% of participants (26 of 40) went all the way to the maximum 450-volt level. This result underscored the powerful influence of authority on individual behavior, revealing how ordinary people could commit acts that contradicted their personal ethics simply because an authoritative figure directed them to do so.

Milgram’s findings have profound implications for understanding obedience in various contexts, from workplace dynamics to societal behaviors, illustrating how authority can override personal moral judgments and lead individuals to act against their better instincts.

The Aviation Industry’s Response: Learning from Mistakes

The aviation industry has confronted the dangers of authority bias by implementing Crew Resource Management (CRM) programs, a proactive approach designed to enhance communication and decision-making within flight crews. Historically, cockpit hierarchies reinforced a culture where captains’ decisions were rarely questioned. Co-pilots and other crew members were often reluctant to challenge the captain’s judgment, fearing repercussions or appearing insubordinate.

This hierarchical structure posed significant risks, particularly in situations where the captain’s decisions were flawed or where critical safety issues needed to be addressed. Without open dialogue and feedback, dangerous situations could go unnoticed or uncorrected: a co-pilot who spotted a potential problem but hesitated to speak up left a safety hazard unaddressed.

In response to these risks, CRM programs were developed to foster a more collaborative and communicative environment. CRM training emphasizes the importance of teamwork, open communication, and mutual respect among all members of the flight crew. It encourages crew members to voice concerns, provide feedback, and collaboratively address potential issues, regardless of rank. This shift in culture has led to significant improvements in flight safety, as evidenced by reduced incident rates and enhanced operational efficiency. By mitigating the impact of authority bias, CRM has made a tangible difference in aviation safety, demonstrating how changes in organizational culture can lead to better outcomes.

Corporate Culture and the Risk of Authority Bias

Many organizations continue to grapple with the challenges posed by authority bias, particularly those with dominant or autocratic leaders. In such environments, employees may feel compelled to suppress their opinions and concerns, fearing negative consequences or undermining their superiors’ authority. This dynamic can stifle innovation, hinder problem-solving, and perpetuate inefficiencies within the organization.

For instance, in a company where the CEO exhibits a domineering leadership style, employees might be hesitant to challenge the CEO’s decisions or offer alternative perspectives. This reluctance can lead to a lack of critical feedback, which is essential for identifying and addressing potential issues. The result can be a culture where only the leader’s views are considered, potentially leading to poor decision-making and missed opportunities for improvement.

To combat authority bias and foster a more inclusive and dynamic workplace, companies need to create an environment where diverse opinions are valued and encouraged. This involves establishing channels for open communication, promoting a culture of mutual respect, and empowering employees at all levels to contribute their insights and ideas. By embracing a more collaborative approach, organizations can harness the collective intelligence of their teams, avoid the pitfalls of authority bias, and drive greater success.

Symbols of Authority and Their Influence

Symbols of authority play a significant role in shaping our perceptions and reinforcing the status of authority figures. These symbols serve as visual cues that signal expertise, power, and legitimacy. For example, in the medical field, white coats are emblematic of professionalism and trustworthiness, signaling that the wearer is a trained and qualified practitioner. Similarly, business executives often don suits and ties to project an image of competence and authority.

Military personnel use rank badges to denote their level of authority and experience, while royalty and heads of state may wear crowns or other regalia to signify their position. In contemporary society, authority is also signaled through media appearances, book publications, and online presence. For instance, appearing on talk shows, being featured on magazine covers, or maintaining a well-curated Wikipedia entry can enhance an individual’s perceived authority and influence.

These symbols of authority can create a powerful impression, influencing how we perceive and respond to authority figures. Just as fashion trends evolve, so do the symbols of authority, with new symbols emerging and shifting societal perceptions. Awareness of these symbols and their impact on our judgments can help us critically evaluate the credibility and expertise of those who wield authority, ensuring that our decisions are based on merit and evidence rather than mere appearances.

Conclusion

Authority bias presents a formidable obstacle to clear thinking and independent decision-making. By acknowledging the fallibility of authorities, questioning their track records, and recognizing the dangers of blind obedience, we can loosen the grip of this cognitive bias. It is incumbent upon us, as individuals and as a society, to foster a culture that values critical thinking and constructive skepticism. So, the next time you encounter an authority figure, remember to evaluate their claims on the evidence rather than on their status, for it is through this habit that we can navigate a path toward informed and autonomous decision-making.

This article belongs to The Art of Thinking Clearly Series, based on Rolf Dobelli’s book.