Back in 2019, my boss was reading this book and recommended that I read it too. The book contains no fewer than 99 cognitive biases and logical fallacies, each explained in a short chapter. You can think of it as a compendium of errors your brain can fall prey to when making decisions. Let’s start by understanding what a cognitive bias actually is.
What is a Cognitive Bias?
A cognitive bias is a systematic pattern of deviation from rational judgment. Cognitive biases are typically unconscious and automatic, influencing our thinking without our awareness or deliberate control. They arise from factors such as limited information-processing capacity, heuristics (mental shortcuts), social influences, emotional factors, and personal beliefs or experiences.
There are numerous types of cognitive biases that have been identified and studied by researchers in psychology and behavioral economics. Some common examples include confirmation bias (the tendency to search for or interpret information in a way that confirms preexisting beliefs or expectations), availability heuristic (the tendency to rely on easily accessible or vivid examples when making judgments), anchoring bias (the tendency to rely heavily on the initial piece of information when making subsequent judgments), and framing bias (the influence of how information is presented or framed on decision-making).
Understanding cognitive biases is important because they can lead to errors, flawed judgments, and irrational behavior. Recognizing and mitigating these biases can help improve decision-making, critical thinking, and problem-solving skills.
In this article, I will try to summarize all the cognitive biases described in “The Art of Thinking Clearly” by Rolf Dobelli. This is going to be fun. Let’s get started.
Survivorship Bias
Survivorship bias is a cognitive bias that occurs when we focus on the individuals or things that have “survived” or succeeded in a particular process or situation while ignoring those that did not. It can lead to overestimating the likelihood of success and misunderstanding the factors that contributed to it.
For example, if we study the success stories of entrepreneurs who have built successful businesses, we may be tempted to draw conclusions about the characteristics or behaviors that lead to success. However, by focusing only on successful entrepreneurs, we are ignoring the many individuals who tried but failed to build successful businesses. We may be missing important factors contributing to failures, such as external circumstances, luck, or factors outside of an individual’s control. Survivorship bias can be a particular problem in fields such as investing, where successful investors may be held up as models to follow, even if their success is largely due to luck rather than skill.
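The investing example can be made concrete with a small simulation. The sketch below (illustrative only; the investor counts and horizon are assumptions, not figures from the book) gives 10,000 “investors” a 50% chance of beating the market each year — pure luck, zero skill. A handful will still beat it ten years running, and studying only those survivors would make coin-flipping look like a strategy.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def simulate_survivors(n_investors=10_000, n_years=10):
    """Each 'investor' beats the market in a given year with 50% probability
    (pure chance, no skill). Count how many beat it every single year."""
    survivors = 0
    for _ in range(n_investors):
        if all(random.random() < 0.5 for _ in range(n_years)):
            survivors += 1
    return survivors

# With 10,000 coin-flipping investors, we expect roughly
# 10_000 * 0.5**10 ~= 10 "perfect" ten-year track records by luck alone.
print("perfect track records:", simulate_survivors())
```

The survivors’ track records are real, but drawing lessons from them alone tells us nothing about skill — the other 9,990 coin-flippers are invisible.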
Swimmer’s Body Illusion
The Swimmer’s Body Illusion is a cognitive bias that occurs when we mistake correlation for causation. Specifically, the Swimmer’s Body Illusion refers to the tendency to believe that a certain activity (such as swimming) is responsible for a particular body type, when in fact the relationship between the two is more complex.
The name of this bias comes from the fact that many people believe that swimming is responsible for the lean, muscular physique often seen in competitive swimmers. However, this belief ignores the fact that successful competitive swimmers are typically already lean and muscular due to genetics and other factors, and that swimming itself may not necessarily be responsible for their body type.
The Swimmer’s Body Illusion can lead to a number of errors in thinking and decision-making, such as assuming that engaging in a particular activity will necessarily lead to a particular outcome (e.g. assuming that swimming will automatically lead to a lean and muscular physique), or failing to consider other factors that may be responsible for a particular outcome (e.g. genetics, diet, exercise habits).
Clustering Illusion
The Clustering Illusion is a cognitive bias that occurs when we perceive patterns or clusters in random data where none actually exist. This bias arises from our natural tendency to look for patterns and meaning in information, even when there is none.
For example, imagine looking at a cloud and seeing the shape of a familiar object, such as a dog or a person’s face. In reality, the cloud is just a random collection of water vapor, but our brains are wired to look for patterns and meaning in the world around us, even when none exists.
In a similar way, the Clustering Illusion can lead us to see patterns or clusters in random data, such as in stock market fluctuations or sports statistics. We may perceive trends or patterns that are not actually there, leading to inaccurate predictions or decisions.
The Clustering Illusion can be particularly problematic in fields such as investing or finance, where false patterns or trends can lead to poor investment decisions or financial losses.
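One way to feel how readily randomness produces apparent patterns is to look at streaks in coin flips. The short sketch below (an illustration, not anything from the book) flips a fair coin 100 times and measures the longest run of identical outcomes — streaks of six or more are routine in truly random data, yet they look like a meaningful “trend”.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def longest_run(flips):
    """Length of the longest streak of identical consecutive outcomes."""
    if not flips:
        return 0
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

# In 100 fair coin flips, a streak of 6+ identical outcomes is the norm,
# not the exception -- yet it is easy to read it as a "hot streak".
flips = [random.choice("HT") for _ in range(100)]
print("longest streak:", longest_run(flips))
```

If a stock’s daily ups and downs were a fair coin, we would still see long winning streaks — seeing one proves nothing by itself.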
Social Proof
Social proof is a psychological phenomenon that occurs when people rely on the actions or opinions of others to determine their own behavior or beliefs. Essentially, it is the idea that we are influenced by what others around us are doing or saying.
For example, imagine that you are walking down a street and see a large crowd gathered around a restaurant. Even if you have never heard of the restaurant before, the fact that so many people are there may lead you to assume that it is a popular and worthwhile place to eat.
Social proof can manifest in a variety of ways, such as through online reviews, celebrity endorsements, or the behavior of people in our social networks. It is often used in marketing and advertising, where companies may highlight the popularity of their products or services in order to attract more customers.
While social proof can be a useful heuristic in some cases, it can also lead to conformity and groupthink, where people may be reluctant to express dissenting opinions or behaviors. It is important to be aware of the influence of social proof and to make decisions based on our own beliefs and values, rather than simply following the crowd.
Sunk Cost Fallacy
The Sunk Cost Fallacy is a cognitive bias that occurs when we continue investing resources (such as time, money, or effort) in a project or endeavor, simply because of the resources we have already invested, rather than making a rational decision based on future outcomes. Essentially, we are reluctant to abandon something that we have already invested in, even if it no longer makes sense to continue.
For example, imagine that you have spent a large amount of money renovating a house, but then discover that the foundation is irreparably damaged and the house is not worth saving. Even though it may be more rational to cut your losses and abandon the project, the Sunk Cost Fallacy may lead you to continue investing money in the house, in order to justify the resources you have already spent.
The Sunk Cost Fallacy can lead to a variety of negative outcomes, such as wasting resources, perpetuating unwise decisions, and failing to adapt to changing circumstances. It is important to be aware of this bias and to make decisions based on future outcomes, rather than past investments.
Reciprocity
Reciprocity is a social principle that describes the tendency of people to respond in kind to the actions or behaviors of others. Essentially, it is the idea that if someone does something nice for us, we are more likely to do something nice in return.
For example, imagine that your friend helps you move into a new apartment. Even if you didn’t ask for their help, you may feel inclined to reciprocate by helping them in the future, or by buying them a thank-you gift.
Reciprocity is a powerful social principle that can be used in a variety of ways, such as in marketing, negotiations, and social interactions. By doing something nice for someone else, we can create a sense of indebtedness or obligation, which can be used to our advantage in future interactions.
However, reciprocity can also be problematic if it is used in a manipulative or insincere way. For example, if someone gives us a gift or does something nice for us with the expectation of receiving something in return, this can be seen as a form of manipulation rather than genuine kindness. It is important to use reciprocity in an ethical and sincere way, rather than as a tool for manipulation or exploitation.
Confirmation Bias
Confirmation bias is a cognitive bias that occurs when we tend to seek out, interpret, and remember information in a way that confirms our preexisting beliefs or hypotheses while ignoring or dismissing information that contradicts them. In other words, we look for evidence that supports our beliefs and ignore evidence that challenges them.
Confirmation bias can affect our perception and decision-making in a variety of contexts, such as in politics, religion, and personal relationships. For example, we may only read news sources that align with our political beliefs and dismiss information from sources that have a different perspective.
Confirmation bias can also lead to the formation of stereotypes, where we hold preconceived notions about certain groups of people that are not based on evidence. It can also lead to the perpetuation of myths and misconceptions, as we tend to accept information that confirms what we already believe, even if it is not accurate.
It is important to be aware of confirmation bias and to actively seek out information that challenges our beliefs, rather than simply seeking out information that confirms them. By doing so, we can make more informed decisions and have a more accurate perception of the world around us.
Authority Bias
Authority bias is a cognitive bias that occurs when we attribute greater accuracy and credibility to the opinion or actions of an authority figure, regardless of their actual expertise or knowledge in the relevant area. We tend to believe and follow the opinions and advice of people who are perceived to have authority, such as doctors, professors, or government officials, even if their opinions are not based on sound evidence.
This bias can lead to individuals blindly accepting the opinions of authority figures without critically evaluating the evidence or considering alternative viewpoints. In some cases, authority bias can also result in individuals following orders or instructions that may be unethical or harmful.
It is important to recognize the influence of authority bias and to evaluate information based on its merit and supporting evidence rather than solely relying on the opinion of an authority figure. It is also important for authority figures to recognize the power they hold and to use it responsibly by providing accurate and evidence-based information.
Contrast Effect
The contrast effect is a cognitive bias that distorts our perception of an object or person based on how it compares to something similar that we have seen or experienced recently. This bias can cause us to overemphasize differences and overlook similarities.
For example, if you are shopping for a car and you see a very expensive luxury car first, you may perceive the other cars you see later as less desirable or of lower quality, even if they are objectively good cars. On the other hand, if you see a low-end car first, you may perceive the other cars as much more expensive and luxurious than they actually are.
The contrast effect can also apply to people. For instance, if you meet someone who is very physically attractive, you may perceive someone who is average-looking as less attractive than they actually are.
This bias can lead to poor decision making, as our perceptions and evaluations are skewed by recent experiences. To overcome the contrast effect, it is important to evaluate objects or people on their own merits rather than comparing them to others. It is also important to be aware of how recent experiences can influence our perceptions and to take steps to minimize the impact of this bias.
Availability Bias
The availability bias is a cognitive bias that occurs when we rely on easily accessible and vivid information to make judgments or decisions, rather than considering all relevant information. This bias is based on the idea that people tend to judge the frequency or probability of an event based on how easily examples come to mind.
For example, if you hear news reports about plane crashes, you may start to believe that flying is unsafe, even though statistically, flying is a very safe mode of transportation. Similarly, if you hear about several cases of food poisoning caused by a particular restaurant chain, you may avoid eating there, even though most of their meals are safe and delicious.
The availability bias can lead to inaccurate assessments and decisions, as well as increased anxiety and fear. To counteract the availability bias, it is important to seek out diverse sources of information, rather than relying solely on what is readily available. It is also important to critically evaluate the information that is available and to consider the frequency and probability of an event based on reliable statistics and evidence, rather than personal anecdotes or news reports.
It’ll Get Worse Before It Gets Better Fallacy
The “It’ll-Get-Worse-Before-It-Gets-Better” fallacy occurs when people believe that a situation will deteriorate further before it starts to improve, without any supporting evidence for such a belief.
For example, a person might believe that their financial situation will continue to worsen, even if they have taken concrete steps to improve it, such as reducing expenses or increasing their income. This pessimistic outlook can lead to unnecessary stress and anxiety, as well as inaction in the face of a solvable problem.
It is important to avoid this fallacy by examining the evidence and evaluating the situation realistically. By taking a more balanced and objective approach, we can better understand the causes of the problem and develop effective strategies to address it.
Story Bias
The story bias is a cognitive bias that occurs when we rely too heavily on stories, anecdotes, and personal experiences to form our beliefs and opinions, rather than considering more reliable statistical evidence. This bias can lead us to overemphasize individual experiences and ignore larger trends and patterns.
For example, if a friend tells you a story about a bad experience they had with a certain brand of product, you may be inclined to avoid that brand, even if statistically it has a good reputation. Similarly, if you read a news story about a rare and dramatic event, such as a plane crash or a shark attack, you may overestimate the likelihood of such events occurring, leading to excessive fear or avoidance of related activities.
The story bias can also lead to confirmation bias, as we may seek out stories and experiences that confirm our existing beliefs and opinions, rather than seeking out information that challenges them.
To counteract the story bias, it is important to seek out and consider reliable statistical evidence, as well as to critically evaluate personal anecdotes and stories to determine their relevance and reliability. We should also be aware of our tendency to seek out information that confirms our existing beliefs and make a conscious effort to seek out diverse perspectives and evidence.
Hindsight Bias
Hindsight bias is a cognitive bias that occurs when people believe, after an event has occurred, that they could have predicted or foreseen the event’s outcome, even when they had no basis for such a prediction. Hindsight bias is often referred to as the “I-knew-it-all-along” phenomenon.
For example, after a stock market crash, people might say “I knew it was going to happen” even though they didn’t take any action to protect themselves from the crash. Similarly, after a major political event, people might say “I saw it coming” even though they didn’t actually make any predictions or take any action based on their supposed foresight.
Hindsight bias can lead to overconfidence and an underestimation of the complexity of predicting events. It can also lead to a lack of accountability for poor decisions made in the past, as people tend to overestimate their own abilities to predict events.
To counteract hindsight bias, it is important to recognize that we cannot accurately predict every outcome and that we should evaluate decisions and actions based on the information available at the time they were made, rather than based on what we know now. It is also important to evaluate decisions based on the quality of the decision-making process rather than the outcome, as the outcome of a decision is not always within our control.
Overconfidence Effect
The overconfidence effect is a cognitive bias in which people overestimate their own abilities or knowledge. This bias can lead to overestimating the accuracy of one’s predictions or the likelihood of success in a particular task or situation.
For example, a person might be overconfident in their ability to complete a project on time, despite having little experience with the required tasks or underestimating the complexity of the project. Or, a person might be overconfident in their ability to predict the outcome of a sporting event or stock market performance, leading them to make poor decisions.
The overconfidence effect can also lead to a lack of consideration of alternative perspectives or information, as the person believes that their own opinion or knowledge is superior.
To counteract the overconfidence effect, it is important to seek out and consider alternative perspectives and information, as well as to regularly evaluate one’s own abilities and performance. Seeking feedback and engaging in self-reflection can also help to mitigate the overconfidence effect.
Chauffeur Knowledge
Chauffeur knowledge is a term used to describe knowing how to perform a task or recite information without understanding the underlying principles or concepts behind it. The term was popularized by investor Charlie Munger, who told the story of Max Planck’s chauffeur: after hearing Planck deliver the same lecture many times, the chauffeur could recite it word for word, but could not answer a single question about it.
For example, a chauffeur might know how to drive a car, operate the radio, and adjust the air conditioning, but they may not understand the mechanical workings of the engine or the physics of driving. Similarly, a computer user might know how to use certain software programs, but they may not understand the coding or programming that goes into creating those programs.
Chauffeur knowledge can lead to a false sense of expertise or mastery, as the person may be able to perform the task or operate the system but lack a deeper understanding of how it works. This can also lead to a lack of innovation or problem-solving skills, as the person may not be able to adapt or troubleshoot in situations where their existing knowledge is insufficient.
To avoid falling into the trap of chauffeur knowledge, it is important to seek out a deeper understanding of the underlying principles and concepts behind a task or system, as well as to continually learn and adapt to changing circumstances. This can involve seeking out additional training or education, engaging in critical thinking and problem-solving, and remaining open to new perspectives and information.
Illusion of Control
The illusion of control is a cognitive bias that leads people to believe that they have more control over outcomes than they actually do. It occurs when people believe that their actions or decisions can influence events or outcomes that are actually beyond their control.
For example, a gambler might believe that they can control the outcome of a dice roll by blowing on the dice or using a certain technique to throw them. In reality, the outcome of the roll is determined by chance and is beyond the gambler’s control. Similarly, a person might believe that they can control the outcome of a job interview by wearing a lucky item of clothing, despite the fact that the outcome is influenced by many other factors such as the interviewer’s opinions and the competition for the job.
The illusion of control can lead to overconfidence in one’s abilities and decision-making, as well as a failure to recognize the role of chance and external factors in determining outcomes. It can also lead to a failure to take appropriate action when outcomes are not going as planned, as the person may believe that they can simply exert more control to change the outcome.
To counteract the illusion of control, it is important to recognize the role of chance and external factors in determining outcomes and to focus on factors that are within our control. It is also important to seek out and consider alternative perspectives and information, as well as to regularly evaluate one’s own abilities and performance. Seeking feedback and engaging in self-reflection can also help to mitigate the illusion of control.
Incentive Super Response Tendency
The incentive super-response tendency is a cognitive bias that describes the human tendency to over-respond to incentives or rewards. It occurs when people become highly motivated by the prospect of a reward, to the point where their behavior becomes irrational or counterproductive.
For example, a salesperson who is offered a large commission for every sale they make might become so focused on making sales that they neglect to provide good customer service or prioritize the needs of the customer. Or, a student who is offered a prize for the highest grade in a class might become so focused on achieving the highest grade that they neglect to learn or understand the material.
The incentive super-response tendency can lead to short-term gains but can also have negative long-term consequences, such as decreased motivation, burnout, or a decrease in the quality of work. Additionally, it can lead to a narrow focus on the reward at the expense of other important factors, such as ethical considerations or the well-being of others.
To counteract the incentive super-response tendency, it is important to consider the potential consequences of the reward or incentive and to prioritize the long-term impact of one’s actions. It is also important to maintain a broader perspective and to consider factors beyond the immediate reward or incentive. Finally, it can be helpful to consider alternative forms of motivation, such as intrinsic motivation or a sense of purpose and fulfillment in one’s work or actions.
Regression to the Mean
Regression to the mean is a statistical phenomenon: extreme measurements tend to be followed by more moderate ones, simply because extreme results usually involve an element of luck. The cognitive bias arises when people attribute this natural drift back toward the average to their own actions or interventions, leading to erroneous conclusions or decisions.
For example, if a student receives a very high grade on a test, they may attribute this success to their own abilities or hard work, rather than recognizing that it may be due in part to chance or to the fact that they were particularly well-prepared for that particular test. Conversely, if a student receives a very low grade on a test, they may attribute this failure to their own lack of ability or effort, rather than recognizing that it may be due in part to chance or to external factors such as a particularly difficult test or personal issues that were affecting their performance.
The regression to the mean cognitive bias can lead to overconfidence or underestimation of one’s abilities, as well as a failure to recognize the role of chance or external factors in determining outcomes. It can also lead to a failure to take appropriate action or make necessary changes, as the person may believe that their actions are responsible for the outcomes, rather than recognizing that chance or external factors played a role.
To counteract the regression to the mean cognitive bias, it is important to recognize the role of chance and external factors in determining outcomes and to focus on factors that are within our control. It is also important to seek out and consider alternative perspectives and information, as well as to regularly evaluate one’s own abilities and performance. Seeking feedback and engaging in self-reflection can also help to mitigate the regression to the mean cognitive bias.
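The test-score example above can be simulated directly. In the sketch below (an illustration with made-up numbers, not data from the book), each student’s observed score is a fixed underlying ability plus random luck. If we select the top 10% on the first test and retest them, their average drops — not because they got worse, but because part of their first score was luck that does not repeat.

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def simulate_retest(n=10_000):
    """Observed score = fixed ability + random luck on each test.
    Select the top scorers on test 1 and see how they do on test 2."""
    abilities = [random.gauss(70, 10) for _ in range(n)]
    test1 = [a + random.gauss(0, 10) for a in abilities]
    test2 = [a + random.gauss(0, 10) for a in abilities]

    # Indices of the top 10% on the first test.
    top = sorted(range(n), key=lambda i: test1[i], reverse=True)[: n // 10]
    mean1 = sum(test1[i] for i in top) / len(top)
    mean2 = sum(test2[i] for i in top) / len(top)
    return mean1, mean2

m1, m2 = simulate_retest()
# The same students score noticeably lower on the retest, even though
# their abilities are unchanged -- pure regression to the mean.
print(f"top group, test 1: {m1:.1f}  test 2: {m2:.1f}")
```

This is why a “slump” after an exceptional performance usually needs no explanation at all — and why interventions applied after an extreme result often get undeserved credit.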
Outcome Bias
The outcome bias is a cognitive bias that involves evaluating a decision or action based solely on its outcome, rather than considering the quality of the decision or action itself. This bias occurs when people judge the effectiveness of a decision or action based on whether it resulted in a positive or negative outcome, rather than on the soundness of the decision-making process or the quality of the action itself.
For example, if a stockbroker recommends a risky investment that ends up performing well, the client may overlook the risky nature of the investment and instead attribute the broker’s decision as a good one. Conversely, if the investment performs poorly, the client may view the decision as a bad one, even if the broker made a sound and reasonable recommendation based on the information available at the time.
The outcome bias can lead to overconfidence or complacency when a positive outcome is achieved, as well as harsh self-criticism or inaction when a negative outcome occurs. It can also lead to a failure to learn from past experiences or to identify areas for improvement, as the focus is on the outcome rather than the process.
To counteract the outcome bias, it is important to evaluate decisions and actions based on the quality of the decision-making process or the action itself, rather than solely on the outcome. This involves considering the information available at the time, the potential risks and benefits, and the soundness of the decision-making process or action. It is also important to focus on learning and improvement rather than solely on achieving positive outcomes. By evaluating decisions and actions in this way, it becomes possible to learn from mistakes and make better decisions in the future.
Paradox of Choice
The paradox of choice is a phenomenon where having too many options or choices can lead to anxiety, decision paralysis, and dissatisfaction with the chosen option. This can happen when people are presented with a wide range of options and feel pressure to make the “right” choice, which can be overwhelming and lead to feelings of regret or missed opportunities.
The paradox of choice can be seen in many different situations, such as shopping for consumer products, selecting a restaurant or meal, or choosing a career path. When faced with too many options, people may spend an excessive amount of time comparing and evaluating choices, which can lead to decision fatigue and a decreased ability to make quality decisions.
To counteract the paradox of choice, it can be helpful to set clear goals and priorities, limit the number of options being considered, and focus on the most important factors in making a decision. This can help to reduce anxiety and increase the likelihood of making a satisfactory decision. Additionally, taking breaks and allowing for reflection can help to alleviate decision fatigue and increase the ability to make quality decisions over time.
Liking Bias
Liking bias is a cognitive bias where a person’s perception of another individual, object, or idea is influenced by their feelings of affection, admiration, or likeability towards it. This bias can affect many different aspects of life, such as social relationships, job interviews, and consumer decisions.
For example, a person may be more likely to trust and agree with someone they like, even if that person is presenting information that is not entirely accurate or beneficial. Similarly, a person may be more likely to choose a product or service that is endorsed by someone they like, even if there are other options that are more objectively suited to their needs.
To counteract the liking bias, it is important to be aware of the influence that feelings of liking can have on perception and decision-making. It is important to evaluate individuals, objects, or ideas based on their actual qualities and merits, rather than on the basis of personal feelings or social relationships. This may involve seeking out alternative viewpoints, actively questioning assumptions, and considering multiple perspectives. By doing so, it becomes possible to make more objective and informed decisions, free from the influence of liking bias.
Endowment Effect
The endowment effect is a cognitive bias where people tend to place a higher value on objects or items they own or possess, simply because they possess them. This can lead to a reluctance to give up the item, even if it is not particularly useful or valuable.
For example, imagine a person is given a mug as a gift. Even if the person has no particular attachment to the mug, they may be unwilling to sell it for less than a certain price, simply because it is now “theirs”. This is because the person places a higher value on the mug than they would if they did not own it.
The endowment effect can have significant implications in areas such as negotiations, marketing, and economics. For example, a seller may set a higher price for an item they own than they would if they did not own it, simply due to the endowment effect. Similarly, a buyer may be willing to pay a higher price for an item they perceive as having value, even if that value is primarily based on ownership rather than objective worth.
To counteract the endowment effect, it is important to be aware of the potential bias and to evaluate items based on their objective value, rather than their ownership status. This may involve seeking out alternative options, negotiating prices, and considering the actual usefulness or value of an item, rather than simply focusing on the fact that it is “yours”.
Coincidence Bias
The coincidence bias is a cognitive bias that refers to the tendency to perceive a causal relationship between two events when no such connection exists — in other words, to mistake mere coincidence for cause and effect.
For example, if a person buys a new car and then experiences a minor accident a few days later, they may attribute the accident to the new car, even if the accident had nothing to do with the car itself. This bias can also lead people to believe in superstitions or conspiracy theories, as they may see connections between events that are not actually related.
To counteract the coincidence bias, it is important to evaluate events based on objective evidence, rather than on perceived connections or assumptions. This may involve seeking out alternative explanations for events, considering alternative perspectives, and questioning assumptions. By doing so, it becomes possible to make more objective and informed decisions, free from the influence of the coincidence bias.
Groupthink
Groupthink is a cognitive bias that occurs when a group of people make decisions or form opinions without critically evaluating the information or alternatives. Instead, groupthink often involves conforming to the views and opinions of the group, in order to maintain harmony or avoid conflict.
Groupthink can lead to flawed decision-making, as the group may fail to consider important information, overlook potential risks or negative consequences, and ignore dissenting opinions. This can be particularly problematic in situations where the group has significant power or influence, such as in political or organizational decision-making.
Some common symptoms of groupthink include a strong sense of unanimity or pressure to conform, the suppression of dissenting opinions or alternative viewpoints, and a lack of critical evaluation or consideration of alternatives.
To counteract groupthink, it is important to encourage open and honest communication within groups, to promote diversity and inclusion, and to actively seek out dissenting opinions or alternative perspectives. This can help to ensure that decisions are based on a comprehensive evaluation of all available information, rather than on the biases or preferences of a particular group.
Neglect of Probability
Neglect of probability is a cognitive bias that refers to the tendency to ignore or underestimate the importance of probabilities when making decisions or judgments. This bias can lead people to make poor decisions based on incomplete or inaccurate information, particularly when faced with uncertain or ambiguous situations.
For example, people may be more likely to overestimate the likelihood of rare or dramatic events, such as winning the lottery or being struck by lightning, while underestimating the probability of more common events, such as being involved in a car accident or developing a health problem.
Neglect of probability can also be exacerbated by the availability heuristic, which is the tendency to rely on information that is easily available or salient when making judgments. For example, people may be more likely to overestimate the likelihood of a shark attack after reading news stories about such incidents, even though the probability of being attacked by a shark is actually quite low.
To counteract neglect of probability, it is important to consider all available information, including probabilities and statistical data, when making decisions or forming opinions. This may involve seeking out objective sources of information, evaluating the reliability and validity of available data, and being aware of potential biases or distortions in perception. By doing so, it becomes possible to make more informed and accurate judgments, even in the face of uncertainty or ambiguity.
Scarcity Error
Scarcity error is a cognitive bias that occurs when people place a higher value on items or resources that are perceived as scarce or in limited supply. This bias is driven by the idea that things that are difficult to obtain must be more valuable or desirable, leading people to place a greater importance on them than they might otherwise.
For example, scarcity error can be seen in consumer behavior, where products that are marketed as “limited edition” or “one-of-a-kind” are often perceived as more valuable and desirable, even if there is no objective reason for this perception. Similarly, in social situations, people may place a higher value on friendships or relationships that are perceived as rare or difficult to obtain, even if these relationships are not necessarily the most fulfilling or meaningful.
Scarcity error can also contribute to irrational decision-making, as people may be more likely to take risks or make impulsive choices when they perceive that a limited opportunity is at stake. This can lead to poor financial decisions, such as overspending or investing in high-risk ventures, as well as social or personal decisions that may not be in one’s best interest.
To counteract scarcity error, it is important to consider the actual value and utility of the items or resources in question, rather than simply being swayed by their perceived scarcity or rarity. This may involve taking a more objective approach to decision-making, seeking out additional information and perspectives, and being mindful of the potential biases and pitfalls of scarcity thinking.
Base Rate Neglect
Base rate neglect is a cognitive bias in which people tend to focus too much on specific, individual information while ignoring broader statistical data or general probabilities. This bias can lead people to make inaccurate judgments and decisions, particularly when it comes to evaluating risk or predicting outcomes.
For example, imagine a study has found that people who eat a lot of vegetables have a lower risk of heart disease. However, if an individual knows someone who eats a lot of vegetables but still has heart disease, they may ignore the statistical likelihood that eating vegetables reduces the risk of heart disease and instead focus on the specific case that they know about.
Base rate neglect can also contribute to stereotypes and prejudice, as people may overemphasize individual cases that confirm their preconceived notions and ignore broader statistical trends that contradict them.
To counteract base rate neglect, it is important to weigh individual cases against broader statistical data when making judgments and decisions: ask what the base rate is before asking how well a specific case fits, and treat vivid anecdotes as single data points rather than evidence that overrides the statistics.
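A classic way to see why base rates matter is Bayes’ rule. The sketch below uses a medical-test scenario with made-up numbers (a 1% base rate and a 95%-sensitive, 90%-specific test); these figures are illustrative assumptions, not from the book:

```python
# How much does a positive test really tell us? Bayes' rule combines
# the test's accuracy with the base rate of the condition.
# All numbers here are illustrative assumptions.

def posterior(base_rate, sensitivity, specificity):
    """P(condition | positive test) via Bayes' rule."""
    true_pos = base_rate * sensitivity
    false_pos = (1 - base_rate) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# A 1% base rate with a 95%-sensitive, 90%-specific test:
p = posterior(0.01, 0.95, 0.90)
print(round(p, 3))  # ~0.088 -- far lower than the 95% many people guess
```

Neglecting the 1% base rate is exactly what makes people guess that a positive result means a ~95% chance of having the condition, when it is actually under 9%.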
Gambler’s Fallacy
The gambler’s fallacy is a cognitive bias in which people believe that future probabilities are affected by past events, when in fact the two are independent. This bias can lead people to make irrational decisions, particularly in situations involving chance or randomness.
For example, imagine a gambler at a roulette table who has seen the ball land on black for the last ten spins. The gambler may begin to believe that a red outcome is “due” and bet accordingly, even though the probability of the ball landing on red is the same as it was on the previous ten spins.
The gambler’s fallacy can also occur in non-gambling situations, such as in investing, where people may believe that a stock is more likely to rise or fall based on past performance, rather than on objective market data and trends.
To counteract the gambler’s fallacy, it is important to recognize that past events do not influence future probabilities, and that chance outcomes are independent from one another. This means that each individual event should be evaluated based on its own probability and potential outcomes, rather than on any perceived “streaks” or patterns. By doing so, it becomes possible to make more rational decisions and avoid the pitfalls of the gambler’s fallacy.
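The independence of spins is easy to check empirically. Here is a minimal simulation using a fair two-color wheel for simplicity (the green zero and the payout odds are deliberately left out):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# One million spins of a simplified fair wheel: red or black only.
spins = [random.choice("RB") for _ in range(1_000_000)]

overall_red = spins.count("R") / len(spins)

# Look only at spins that immediately follow a run of ten blacks.
after_streak = [spins[i] for i in range(10, len(spins))
                if spins[i - 10:i] == list("BBBBBBBBBB")]
streak_red = after_streak.count("R") / len(after_streak)

print(f"red overall:         {overall_red:.3f}")
print(f"red after 10 blacks: {streak_red:.3f}")
```

Both frequencies come out close to 0.5: a streak of ten blacks tells the wheel nothing, so red is no more “due” after the streak than at any other time.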
The Anchor
Anchor bias is a cognitive bias in which people rely too heavily on the first piece of information they receive when making subsequent judgments or decisions. This initial information, or “anchor,” can influence people’s perceptions and lead them to make inaccurate or irrational choices.
For example, imagine a person is shopping for a new car and sees a car advertised for $20,000. This price becomes the anchor for their perception of what is reasonable to spend, even if other cars with similar features are priced lower or higher. The person may feel like they are getting a good deal if they can negotiate the price of the car down to $19,000, even if this is still more expensive than other comparable options.
Anchor bias can also occur in negotiations, where the first offer made by one party can influence the perceived value of subsequent offers. This bias can be particularly influential when the initial anchor is presented with confidence or authority, or when there is a lack of alternative information available.
To counteract anchor bias, it is important to seek out and consider multiple sources of information before making a decision, and to be aware of the potential influence of initial anchors. This may involve questioning initial assumptions, seeking out alternative data or viewpoints, and remaining open to new information. By doing so, it becomes possible to make more rational and informed judgments, rather than being overly influenced by initial anchors.
Induction
Induction bias is a cognitive bias that occurs when people draw conclusions based on limited or incomplete information. It involves making generalizations or assumptions based on a limited set of observations or experiences, without considering the possibility of other explanations or factors that may be relevant.
For example, imagine a doctor who observes several patients with similar symptoms and concludes that they all have the same illness, without considering other potential causes of their symptoms. This doctor may be subject to induction bias, as they have made a generalization based on a limited set of observations.
Induction bias can also occur in areas such as research or data analysis, where a researcher may draw conclusions based on a small or biased sample of data, without considering the broader context or potential alternative explanations.
To avoid induction bias, it is important to gather as much relevant data as possible before drawing conclusions, and to consider alternative explanations or factors that may be relevant. This may involve seeking out different sources of data, testing alternative hypotheses, or remaining open to the possibility that there may be multiple explanations for a given phenomenon. By doing so, it becomes possible to make more accurate and informed judgments, rather than being overly influenced by limited or incomplete information.
Loss Aversion
Loss aversion is a cognitive bias in which people strongly prefer avoiding losses to acquiring gains of equivalent value. It is often described as the psychological phenomenon where the pain of losing is greater than the pleasure of gaining.
For example, imagine a person is offered a choice between receiving a guaranteed $100 or taking a 50/50 chance to win $200 (or nothing). Most people choose the guaranteed $100, even though the expected value of the gamble is also $100. Giving up a sure $100 feels like a loss, and that potential loss weighs more heavily than the equal chance of doubling the money.
Loss aversion can impact a wide range of decisions, from financial investments to personal relationships. It can cause people to hold on to losing investments longer than they should, avoid taking risks even when the potential rewards outweigh the risks, or make choices that prioritize avoiding losses over maximizing gains.
Understanding and being aware of loss aversion can help individuals make more informed and rational decisions. For example, by considering the potential gains and losses of a decision objectively and weighing them equally, rather than being disproportionately influenced by the fear of loss, individuals can make more rational and effective choices.
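Kahneman and Tversky’s prospect theory captures this asymmetry with a loss-aversion coefficient, often estimated at around 2. The bet and the coefficient in the sketch below are illustrative assumptions, not figures from the book:

```python
# Loss-averse valuation: losses are weighted by a factor lam
# (empirically around 2 in prospect theory; the exact numbers
# here are illustrative assumptions).

def loss_averse_value(outcomes, lam=2.25):
    """Subjective expected value of (probability, amount) pairs,
    with losses multiplied by the loss-aversion factor lam."""
    return sum(p * (x if x >= 0 else lam * x) for p, x in outcomes)

# A 50/50 bet: win $150 or lose $100.
gamble = [(0.5, 150), (0.5, -100)]

ev = sum(p * x for p, x in gamble)      # expected monetary value: +25.0
subjective = loss_averse_value(gamble)  # 75 - 112.5 = -37.5
```

Even though the bet has a positive expected value (+$25), its loss-weighted subjective value is negative, which is why many people decline it.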
Social Loafing
Social loafing is a phenomenon in which individuals exert less effort when working on a group task than they would if they were working alone. This can occur because individuals feel less accountable for their performance when working in a group, or because they believe that their contributions are less necessary or valuable in the context of a larger group.
For example, imagine a group project in school where each member is responsible for contributing to a presentation. If one member of the group believes that their contribution is not essential, they may exert less effort in preparing their portion of the presentation, assuming that other members will make up for any shortcomings.
Social loafing can have negative effects on group performance, as it can lead to reduced motivation and productivity, as well as lower quality outcomes. To counteract social loafing, it is important to establish clear expectations and individual responsibilities within the group, as well as to encourage and recognize individual contributions to the group’s overall success. This can help to increase accountability and motivation, and lead to more effective and successful group outcomes.
Exponential Growth
Exponential growth bias is a cognitive bias in which people tend to underestimate the exponential growth of a phenomenon, leading to inaccurate predictions or expectations about its future trajectory.
For example, consider a hypothetical scenario where a bacteria colony doubles in size every hour. If the colony started with just one bacterium, after one hour it would have grown to two, after two hours to four, after three hours to eight, and so on. While the growth may seem slow at first, it quickly accelerates and can lead to very large numbers in a short amount of time.
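A few lines make the gap between the doubling colony and a straight-line guess concrete (the “linear intuition” baseline here is an illustrative assumption):

```python
# One bacterium doubling every hour, versus a naive linear guess
# that the colony grows by a fixed couple of cells per hour.

for hour in (1, 3, 10, 24):
    exponential = 2 ** hour       # actual colony size after `hour` doublings
    linear_guess = 1 + 2 * hour   # naive straight-line extrapolation
    print(f"hour {hour:>2}: exponential {exponential:>10,} vs linear {linear_guess}")
```

After 24 hours the colony holds over 16 million cells, while the linear extrapolation predicts 49.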
Despite this, many people may struggle to accurately predict the future size of the colony, underestimating just how quickly it will grow due to their bias towards linear thinking. This is because the human brain is not naturally equipped to process exponential growth, which can lead to inaccurate predictions or underestimations of its impact.
Exponential growth bias can have significant consequences in a variety of contexts, from personal finance to public health. For example, it can lead to underestimating the impact of compounding interest on investments, or underestimating the spread of a contagious disease. By recognizing this bias, individuals can work to adjust their predictions and expectations in situations where exponential growth may be a factor, leading to more accurate and informed decisions.
Winner’s Curse
Winner’s curse is a cognitive bias that occurs in auctions or negotiations where the winning bidder or negotiator overpays for an item or asset because they have overestimated its value relative to other bidders or negotiators.
For example, imagine that several bidders are competing to purchase a rare artwork. Each bidder has their own estimate of the artwork’s value, but these estimates may be inaccurate due to various factors such as incomplete information, emotional attachment to the artwork, or social pressure to win the auction. If one bidder overestimates the artwork’s value and wins the auction with a bid that is much higher than the other bidders, they may end up experiencing the “winner’s curse” because they have paid more for the artwork than it is actually worth.
The winner’s curse can occur in other contexts as well, such as in business negotiations, where a company may overpay for an acquisition or partnership due to overestimating the value of the other party’s assets or capabilities. It can also occur in stock market trading, where investors may overvalue a particular stock and bid up its price, only to later realize that they overpaid.
To avoid the winner’s curse, it is important to do thorough research and analysis before making a bid or offer, and to have a clear understanding of the value of the item or asset being purchased or negotiated. It can also be helpful to set limits or thresholds for how much you are willing to pay, and to avoid getting caught up in the emotions of the moment.
Fundamental Attribution Error
The fundamental attribution error is a cognitive bias that involves attributing someone’s behavior to their personality or disposition rather than to external factors. In other words, people tend to overemphasize dispositional or personality factors when explaining the behavior of others, while underemphasizing situational factors that may have influenced the behavior.
For example, if a person sees someone yelling at a cashier in a store, they may assume that the person is naturally aggressive or rude, without considering that the person may be experiencing some external stressors or frustrations that are causing them to act out of character. This bias can lead to misunderstandings and negative judgments of others, and can also make it difficult to empathize with others and understand their perspectives.
The fundamental attribution error has been studied extensively in social psychology, and it is believed to be influenced by a variety of factors, such as cultural norms, individual differences in personality and cognitive processing, and situational factors such as time pressure or cognitive load. To overcome this bias, it is important to be aware of its influence and to make a conscious effort to consider situational factors when evaluating the behavior of others.
False Causality
False causality is a cognitive bias that involves assuming a cause-and-effect relationship between two events, even though there may not be any direct causal link between them. This bias can lead people to make incorrect assumptions about the causes of events, and can result in incorrect decisions and actions.
For example, someone may assume that because they drank a cup of coffee before experiencing a headache, the coffee caused the headache. However, there may be other factors, such as stress or lack of sleep, that could be the actual cause of the headache. In this case, the assumption of causality is false.
False causality can also arise from a number of other factors, such as the desire to find explanations for events, the tendency to seek out patterns and connections between events, and the influence of prior beliefs and expectations.
To avoid false causality, it is important to consider all possible factors that could be influencing an event or outcome, and to be skeptical of assumptions of causality unless there is strong evidence to support them. It is also helpful to approach situations with an open mind and a willingness to consider multiple explanations for events, rather than jumping to conclusions based on limited information or preconceived notions.
Halo Effect
The halo effect is a cognitive bias in which our overall impression of a person, company, or other entity influences our judgments about that entity’s specific traits or characteristics. In other words, if we have a positive impression of someone or something, we are more likely to assume that they have other positive qualities, even if we do not have direct evidence of those qualities.
For example, if we meet someone who is physically attractive and friendly, we may assume that they are also intelligent, successful, and kind, even if we have no direct evidence of those traits. Similarly, if we have a positive impression of a company based on its branding or reputation, we may assume that its products or services are of high quality, even if we have not tried them ourselves.
The halo effect can lead to biases in hiring, performance evaluations, and other areas of decision-making. For example, a hiring manager may be more likely to hire a candidate based on their physical appearance or confident demeanor, rather than their actual qualifications for the job.
To avoid the halo effect, it is important to be aware of our own biases and to gather objective information about a person or entity before making judgments or decisions. It is also helpful to evaluate each trait or characteristic separately, rather than allowing our overall impression to influence our judgments about specific qualities.
Alternative Paths
“Alternative Paths Bias” is sometimes used to refer to the tendency to overestimate the number of possible alternatives or options available in a decision-making situation. This bias can lead to decision paralysis or poor decision-making, as people become overwhelmed by the perceived complexity of the decision and struggle to identify the best course of action.
To avoid the alternative paths bias, it can be helpful to focus on a limited number of high-quality options and to prioritize the most important criteria for the decision. It can also be useful to seek out feedback and advice from trusted sources, but to be mindful of the potential for decision-making biases in the advice-giving process.
Forecast Illusion
The “Forecast Illusion” is a cognitive bias in which people tend to overestimate their ability to predict future events or outcomes. This bias can be particularly strong in situations where people have limited or incomplete information, and may be driven by a variety of factors including overconfidence, the availability bias (relying too heavily on easily accessible information), and the illusion of control (believing that one has more control over future events than is actually the case).
The forecast illusion can lead people to make poor decisions or to take unnecessary risks based on overly optimistic predictions about the future. To avoid this bias, it can be helpful to seek out diverse perspectives and to consider a range of potential outcomes, including both positive and negative scenarios. It can also be useful to gather as much relevant information as possible before making predictions or decisions, and to be open to adjusting one’s expectations based on new information as it becomes available.
Conjunction Fallacy
The conjunction fallacy is a cognitive bias in which people judge two events occurring together as more probable than one of those events occurring alone. This is mathematically impossible: a combination of conditions can never be more likely than either condition by itself, yet a detailed, specific scenario often feels more plausible than a general one.
For example, consider the statement: “Linda is a bank teller and is active in the feminist movement.” Given a description of Linda that fits the feminist stereotype, many people judge this conjunction to be more probable than the plain statement “Linda is a bank teller.” Statistically, however, every feminist bank teller is also a bank teller, so the conjunction cannot be the more likely of the two.
The conjunction fallacy can lead people to make incorrect judgments and decisions because they overestimate the likelihood of specific events occurring together, and do not properly weigh the probabilities of each individual event. To avoid the conjunction fallacy, it is important to carefully consider the probabilities of each event separately, and to avoid making assumptions about the likelihood of multiple events occurring together without proper evidence.
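The impossibility is easy to verify with any numbers at all. The probabilities below are made-up placeholders, not estimates from the actual Linda experiment:

```python
# P(A and B) = P(A) * P(B | A) can never exceed P(A),
# because P(B | A) is at most 1. Illustrative, made-up numbers:

p_teller = 0.05                  # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.30   # assumed P(feminist | bank teller)

p_both = p_teller * p_feminist_given_teller
print(p_both, "<=", p_teller)    # the conjunction is never more probable
assert p_both <= p_teller
```

Whatever values you plug in, the conjunction comes out no larger than the single condition, which is exactly what intuition gets wrong in the Linda problem.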
Framing
Framing refers to the way in which information is presented or “framed” in order to influence people’s attitudes, perceptions, and decisions. The way in which information is presented can have a powerful impact on the way people think and act.
For example, consider a study where participants were asked to choose between two treatments for a disease. Treatment A was described as having a 70% success rate, while Treatment B was described as having a 30% failure rate. Although both descriptions convey the same information, people were more likely to choose Treatment A when it was framed as having a success rate, compared to when it was framed as having a failure rate.
Framing can be used in various settings, such as in advertising, politics, and even personal relationships. By presenting information in a certain way, it is possible to manipulate people’s attitudes and behaviors. Therefore, it is important to be aware of framing and to consider the context in which information is presented in order to make more informed decisions.
Action Bias
Action bias is the tendency to believe that taking action, any action, is better than doing nothing in the face of a problem or difficult situation, even if the action taken may not be effective or appropriate. This bias is driven by the belief that it is better to do something rather than nothing, as well as the fear of regret if no action is taken.
For example, in sports, coaches may be more likely to make substitutions or changes to their team’s strategy during a game, even if those changes may not be necessary or effective, simply because they feel the need to do something to show that they are actively trying to win.
In investing, action bias may lead people to buy or sell stocks based on short-term fluctuations in the market, rather than sticking to a long-term investment strategy. Similarly, in politics, action bias may lead leaders to take military action or make policy changes without carefully considering the potential consequences.
Action bias can be detrimental in situations where careful thought and analysis are necessary to make the best decision. It is important to consider the potential risks and benefits of taking action versus doing nothing before making a decision.
Omission Bias
Omission bias refers to the tendency to view harmful actions (i.e., actions that lead to harm) as worse than harmful inactions (i.e., failing to take action to prevent harm), even when the consequences of inaction are more severe than those of action.
For example, people may refuse a vaccine that carries a small risk of side effects, even when the disease it prevents is far more dangerous, because harm caused by acting feels worse than harm caused by doing nothing. Similarly, a policy maker may let a damaging status quo persist rather than enact a reform with visible downsides, even though the consequences of inaction are worse.
Omission bias is driven by several psychological factors, including the sense that we bear less responsibility for harms we merely fail to prevent, and the perception that actions are more visible, and therefore more readily blamed, than inactions.
However, inaction is not morally or practically neutral: doing nothing can cause more harm than acting. It is important to carefully weigh the potential consequences of both action and inaction in any given situation before making a decision.
Self-serving Bias
Self-serving bias is a cognitive bias that refers to the tendency to attribute successes to one’s own abilities and efforts, while attributing failures to external factors, such as bad luck or other people’s actions. In other words, people tend to take credit for positive outcomes, but blame external factors for negative outcomes.
For example, a student who receives a good grade on an exam may attribute it to their intelligence and hard work, while a student who receives a bad grade may blame the teacher’s poor teaching or unfair exam questions. Similarly, a business executive may attribute the company’s success to their own leadership skills, but blame external factors such as the economy or market competition for any failures.
Self-serving bias is driven by several psychological factors, including the desire to maintain a positive self-image and the need to protect one’s self-esteem. By attributing successes to one’s own abilities and efforts, people feel good about themselves and their achievements, which in turn boosts their self-esteem. Conversely, by attributing failures to external factors, people can avoid feelings of guilt or shame.
However, self-serving bias can also lead to overconfidence and a lack of self-awareness. By attributing all successes to one’s own abilities, people may fail to recognize the role of luck or other external factors, which can lead to unrealistic expectations and poor decision-making. Therefore, it is important to recognize the influence of self-serving bias and to strive for a more balanced and objective view of one’s own abilities and achievements.
Hedonic Treadmill
The hedonic treadmill is a concept in psychology that refers to the human tendency to return to a relatively stable level of happiness after experiencing positive or negative events. The term “hedonic” refers to pleasure, while “treadmill” suggests a constant pursuit of happiness without ever achieving a lasting sense of contentment.
According to the hedonic treadmill theory, people have a baseline level of happiness that they tend to return to over time. This baseline level is influenced by genetic factors, personality traits, and life circumstances. Positive events, such as winning the lottery or getting a promotion, can temporarily boost happiness levels, but over time, people tend to adapt to these changes and return to their baseline level of happiness. Similarly, negative events, such as the loss of a job or the end of a relationship, can initially decrease happiness levels, but people tend to adapt and return to their baseline level over time.
The hedonic treadmill is driven by two main factors: adaptation and social comparison. Adaptation refers to the process of becoming accustomed to positive or negative changes in our lives. For example, if we get a new car, we may initially feel very happy and excited, but over time, the novelty wears off and we adapt to having the car. Social comparison refers to the tendency to compare ourselves to others, which can lead us to feel dissatisfied with what we have even when we objectively have a lot.
While the hedonic treadmill can make it difficult to achieve lasting happiness, there are ways to counteract its effects. One approach is to focus on gratitude and cultivate a sense of appreciation for the good things in our lives. Another approach is to practice mindfulness and be more present in the moment, rather than constantly chasing future happiness. By understanding the hedonic treadmill and its effects, we can make more intentional choices about how we pursue happiness and live our lives.
Self-selection Bias
Self-selection bias is a type of bias that occurs when participants in a study or survey are allowed to choose whether or not to participate. This can skew the results of the study, as those who choose to participate may be different in some important way from those who choose not to participate.
For example, if participation in a study of a new drug is voluntary, the people who sign up may differ systematically from those who decline: they may have more severe symptoms, or may have already exhausted other treatments, so the results may not be representative of the general population.
Similarly, if only customers who feel strongly about a product bother to answer a survey, the results will be skewed toward extreme experiences, positive or negative, and will not reflect the views of the typical customer or of those who never tried the product at all.
Association Bias
Association bias is the cognitive bias that leads us to link things simply because we have experienced them together. It is closely related to the availability heuristic: we judge the likelihood of an event by the ease with which examples come to mind, and so we tend to overestimate the likelihood of events that are more memorable or that we have experienced personally.
This bias can be particularly problematic in situations where we rely on anecdotal evidence or personal experience to make important decisions, as it can lead us to ignore more relevant statistical data.
For example, a person may be more likely to believe that airline travel is dangerous because they remember vividly the one news story they heard about a plane crash, even though statistically speaking, airline travel is much safer than driving a car.
Beginner’s Luck
Beginner’s luck refers to the phenomenon of a novice experiencing unexpected success or good fortune in a particular activity, despite having little or no previous experience or training in that activity. This term is commonly used in gambling, where a beginner may win a large amount of money on their first attempt, but it can also be applied to other areas of life, such as sports or business.
The concept of beginner’s luck is often attributed to psychological factors such as reduced anxiety and a lack of fear of failure, which can lead to a more relaxed and open mindset that allows for greater creativity and risk-taking. Additionally, beginners may have fewer preconceived notions about the activity, leading them to approach it in a more open-minded and innovative way.
However, it’s important to note that beginner’s luck is a fallacy in the sense that it doesn’t actually increase one’s chances of long-term success in the activity. In reality, the initial success is often just a result of chance, and as the beginner gains more experience and competes against more experienced individuals, the odds of continued success may decrease. Therefore, relying on beginner’s luck as a strategy for long-term success is not a sound approach.
Cognitive Dissonance
Cognitive dissonance is a psychological concept that describes the mental discomfort that occurs when a person holds two or more conflicting beliefs, ideas, or values. This state of tension can arise when a person’s beliefs or actions are inconsistent with one another, or when they are confronted with new information that contradicts their existing beliefs or values.
For example, a person who smokes cigarettes but also knows that smoking is harmful to their health may experience cognitive dissonance. They may attempt to resolve this discomfort by rationalizing their behavior (e.g., “I don’t smoke that much” or “I’ll quit soon”) or by changing their beliefs (e.g., “Smoking isn’t that bad for you”).
Cognitive dissonance can have both positive and negative effects. On one hand, it can motivate people to change their behavior or beliefs in order to resolve the tension. On the other hand, it can lead people to ignore or reject information that conflicts with their existing beliefs, which can prevent them from making more informed decisions or taking action to address problems.
Overall, understanding cognitive dissonance can help individuals recognize and address the inconsistencies in their own beliefs and behaviors, as well as understand why others may resist changing their beliefs or behaviors even in the face of new evidence.
Hyperbolic Discounting
Hyperbolic discounting is a cognitive bias in which people show a preference for smaller, immediate rewards over larger, delayed rewards, even when the latter offers a greater overall benefit. This bias results from the way our brains process information about the value of future outcomes: we place far more weight on immediate rewards while downplaying the value of future ones. This can lead to impulsive decision-making, as we prioritize immediate gratification over long-term goals. The bias is called “hyperbolic” because the discounting curve falls steeply for short delays and flattens out for long ones, unlike the constant-rate exponential curve — which is why our preferences can reverse as a reward draws near.
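The difference between the two curves can be sketched with a toy calculation (the discount rate k = 0.2 per day and the dollar amounts are invented purely for illustration):

```python
import math

# Hyperbolic discounting: value / (1 + k * delay)
def hyp(value, delay, k=0.2):
    return value / (1 + k * delay)

# Exponential discounting: value * e^(-k * delay)
def exp_(value, delay, k=0.2):
    return value * math.exp(-k * delay)

# $100 now vs. $110 tomorrow: the hyperbolic discounter grabs the $100.
print(hyp(100, 0) > hyp(110, 1))      # True

# The same pair pushed 30 days into the future: the preference reverses.
print(hyp(100, 30) > hyp(110, 31))    # False -> now waiting wins

# An exponential discounter is consistent: the choice never flips.
print(exp_(100, 0) > exp_(110, 1))    # True
print(exp_(100, 30) > exp_(110, 31))  # True
```

The reversal in the middle pair is the signature of hyperbolic discounting: adding the same 30-day delay to both options flips the choice, something a constant-rate exponential discounter would never do.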
Because Justification
Because Justification refers to the cognitive bias where people are more likely to comply with a request or agree with an argument when given a reason, even if the reason is not logically relevant to the request or argument. This bias was demonstrated in a classic study where participants were more likely to let someone cut in line to use a copy machine when they were given a reason (“Can I cut in line because I need to make copies?”) than when no reason was given (“Can I cut in line?”). The study showed that the mere presence of a reason, even if it was a weak or irrelevant one, was enough to increase compliance.
Decision Fatigue
Decision fatigue is a psychological phenomenon that describes the deterioration of one’s ability to make sound decisions after a prolonged period of decision-making. It occurs when a person is forced to make a large number of decisions over an extended period, leading to mental exhaustion and a decreased ability to focus and make rational choices.
Decision fatigue can affect various areas of life, including work, relationships, and personal life. For example, a judge who has been making multiple decisions throughout the day may experience decision fatigue, which can lead to errors in judgment in the later cases of the day. Similarly, individuals who are constantly making decisions about what to eat, wear, or do may also experience decision fatigue, leading them to make impulsive choices or avoid decision-making altogether.
Studies have shown that decision fatigue can be mitigated by taking regular breaks, delegating decision-making responsibilities, simplifying choices, and prioritizing decisions. It is also important to recognize when decision fatigue is setting in and to take steps to combat it to ensure that important decisions are made thoughtfully and with a clear mind.
Contagion Bias
Contagion bias is a cognitive bias that occurs when we believe that the characteristics or qualities of one thing are automatically transferred to another just because they are physically connected, have recently been in contact, or share some other superficial similarity. Essentially, it’s the idea that things or people are inherently “tainted” by their association with other things or people that we perceive negatively.
For example, if someone is wearing a shirt with a company logo that you dislike, you may assume that the person is not trustworthy or is incompetent, even though you have no evidence to support these assumptions.
To counter the contagion bias, it’s important to focus on the specific characteristics or qualities of the thing or person in question, rather than assuming that they are tainted by their association with other things or people. We should try to evaluate each thing or person on their own merits, rather than making judgments based on superficial associations.
Another way to counter the contagion bias is to increase our exposure to things or people that we perceive negatively, in order to break down our associations between those things or people and negative qualities. This can be done through exposure therapy or deliberate attempts to challenge our assumptions and biases.
The Problem with Averages
The Problem with Averages is a cognitive bias that occurs when individuals rely solely on average values to understand a distribution of data, without considering the variance or the outliers. This can lead to incorrect assumptions and decisions. For example, a company may advertise that the average salary of their employees is $70,000 per year, but this does not account for the fact that the salaries of some employees may be much higher or lower than the average.
To counter the problem with averages, it is important to consider the entire distribution of data, including the variance and outliers. This can be done by looking at the range of data, the standard deviation, and the shape of the distribution. Additionally, it is helpful to ask questions about the data, such as “what is the median salary?” or “what is the range of salaries?” rather than relying solely on the average.
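A quick sketch shows why the average alone misleads (the salary figures below are invented): nine employees earning $40,000 and one executive earning $370,000 produce exactly the kind of headline average described above.

```python
import statistics

# Hypothetical payroll: nine employees at $40k, one executive at $370k.
salaries = [40_000] * 9 + [370_000]

print(statistics.mean(salaries))     # 73000 -> the advertised "average"
print(statistics.median(salaries))   # 40000.0 -> what a typical employee earns
print(max(salaries) - min(salaries)) # 330000 -> the range reveals the skew
```

One outlier drags the mean $33,000 above what nine of the ten people actually earn; the median and the range tell the real story.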
Motivation Crowding
Motivation crowding refers to the phenomenon where external incentives (such as monetary rewards or punishments) can undermine people’s intrinsic motivation to engage in a particular activity. This can occur when people perceive that their behavior is being controlled by external factors rather than their own interests or values. The result is that people may become less motivated to perform the activity in question, or may shift their focus away from the intrinsic rewards that initially motivated them.
For example, imagine that you are an artist who loves to paint. You derive a great deal of enjoyment and satisfaction from your art, and you spend many hours each week painting. One day, someone offers to pay you a large sum of money for one of your paintings. While you are initially excited about the prospect of making money from your art, you may find that the pressure to create a piece that meets the buyer’s specifications undermines your intrinsic motivation to paint. You may become more focused on creating something that will sell rather than on the process of creating art that you find personally fulfilling.
To counter the negative effects of motivation crowding, it can be helpful to offer incentives that are aligned with people’s intrinsic motivation. For example, rather than offering monetary rewards for a particular behavior, it may be more effective to highlight the intrinsic rewards associated with that behavior (such as the sense of accomplishment or enjoyment that comes from doing it well). In addition, it can be helpful to provide opportunities for people to have autonomy and control over their actions, which can help them to feel more invested in the activity and more likely to engage in it voluntarily.
Twaddle Tendency
Twaddle tendency is the tendency to use technical or complicated language to impress others or to obscure the fact that one doesn’t actually understand the subject matter. This cognitive bias is closely related to the Dunning-Kruger effect, where people overestimate their competence in a particular area.
For example, a politician might use technical jargon or complex sentence structures to make their speech sound more intelligent and knowledgeable than it actually is. Similarly, a business executive might use buzzwords and acronyms to impress colleagues or clients, even if they don’t fully understand the concepts behind them.
To counter the twaddle tendency, it’s important to be aware of your own level of understanding and to communicate in a clear and straightforward manner. If you’re unsure about a technical term or concept, don’t be afraid to ask for clarification. Additionally, try to avoid using jargon or buzzwords unless they are necessary and clearly understood by your audience.
Will Rogers Phenomenon
The Will Rogers Phenomenon is a statistical effect in which moving an element from one group to another raises the average of both groups. This happens whenever the moved element is below the average of the group it leaves but above the average of the group it joins. The phenomenon is named after the American humorist and actor Will Rogers, to whom the following quip is attributed: “When the Okies left Oklahoma and moved to California, they raised the average intelligence level in both states.”
A classic example is stage migration in medicine: when improved diagnostics reclassify borderline cancer patients from the “early stage” group to the “late stage” group, the average survival statistics of both groups improve, even though no individual patient lives a day longer.
To counter the Will Rogers Phenomenon, it is important to look beyond group averages and examine outcomes at the individual level. Be especially suspicious when the averages of two groups improve simultaneously after a reclassification or transfer: check whether any underlying values actually changed, or whether members were merely shuffled between groups.
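Toy numbers (invented for illustration) make the effect concrete: moving an element that is below the average of its old group but above the average of its new group raises both averages, even though no individual value changed.

```python
def mean(xs):
    return sum(xs) / len(xs)

group_a = [90, 80, 70]  # mean 80.0
group_b = [40, 30, 20]  # mean 30.0

# Move the 70: below A's average, but above B's average.
group_a.remove(70)
group_b.append(70)

print(mean(group_a))  # 85.0 -> A's average rose
print(mean(group_b))  # 40.0 -> B's average rose too
```

Both averages went up, yet every individual score is exactly what it was before — the improvement is purely an artifact of the regrouping.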
Information Bias
Information bias refers to the tendency to seek out and value information, even if that information is irrelevant, useless or misleading. People often assume that more information will lead to better decisions, but this is not always the case. Seeking too much information can lead to analysis paralysis, where one becomes overwhelmed with information and is unable to make a decision.
For example, imagine you are trying to choose a restaurant for dinner. You spend hours researching online reviews and menus, but when you finally arrive at the restaurant, you find that the food is terrible. In this case, your information bias led you to focus too much on the reviews and not enough on the actual quality of the food.
To counter information bias, it is important to focus on the quality and relevance of the information you are seeking. Ask yourself if the information you are looking for is truly necessary and if it will actually help you make a better decision. Limit the amount of time you spend researching, and prioritize personal experience and intuition. Finally, be open to new information and perspectives, and don’t let your preconceptions bias your interpretation of the information you receive.
Effort Justification
Effort justification refers to the tendency to ascribe greater value to outcomes we have put effort into obtaining. It is closely related to the sunk cost fallacy: having already sunk time, effort, or resources into something, we keep investing in it even when the investment no longer makes sense.
For example, suppose a person invests a significant amount of money in a stock that has been performing poorly. Rather than cut their losses and sell the stock, they continue to hold on to it, hoping that it will eventually turn around. This behavior can be attributed to the effort justification bias because they have already invested so much money in the stock that they are unwilling to accept the loss and move on.
To counter the effort justification bias, it is important to evaluate investments and decisions based on their current merits, rather than on past investments. Rather than asking, “How much have I invested in this already?” one should ask, “Is it still a good investment now?” One can also try to focus on the future potential of an investment, rather than the past investment made into it.
The Law of Small Numbers
The Law of Small Numbers is a cognitive bias that occurs when people make assumptions or conclusions based on a small sample size of data, rather than a larger and more representative sample. This can lead to incorrect judgments or decisions because the small sample size may not accurately reflect the larger population.
For example, a small business owner may assume that their sales will continue to increase at a steady rate after experiencing a successful month or two. However, this assumption may not hold true if the sample size is small, and the larger market conditions are unpredictable.
To counter the Law of Small Numbers bias, it is essential to collect and analyze data from a representative sample size that accurately reflects the larger population. Additionally, it is necessary to take into account other factors that may affect the data, such as market conditions, external events, and other variables that can affect the outcome. By analyzing a larger sample size and taking these factors into account, we can make more accurate and informed decisions.
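The instability of small samples is easy to demonstrate with a short simulation (the 50% success rate and the sample sizes are arbitrary choices for illustration):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Estimate a fair 50% success rate from samples of different sizes.
def success_rate(n):
    return sum(random.random() < 0.5 for _ in range(n)) / n

small = [success_rate(5) for _ in range(10)]     # ten tiny samples
large = [success_rate(5000) for _ in range(10)]  # ten large samples

# Tiny samples swing wildly around 0.5; large samples barely move.
print(min(small), max(small))
print(min(large), max(large))
```

A streak of successes in a five-observation sample says almost nothing about the underlying rate — which is precisely why a business owner's "two great months" is weak evidence of a trend.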
Expectations
Expectations refer to the beliefs, predictions, or assumptions that people have about future events or outcomes. It is a cognitive process that can be influenced by past experiences, social norms, culture, and personal biases. Expectations can be either realistic or unrealistic, and they can have a significant impact on how people perceive, interpret, and respond to situations.
For example, if someone has a high expectation of receiving a job promotion, they may feel disappointed, frustrated, or resentful if they are passed over for the promotion. On the other hand, if someone has low expectations of succeeding in a particular task, they may not put in enough effort to achieve their goals.
Expectations can also be influenced by external factors such as media reports, social media, and other people’s opinions. For instance, a person may have a lower expectation of enjoying a new restaurant based on negative online reviews, even if they have not tried it themselves.
To counter the impact of unrealistic expectations, it can be helpful to practice mindfulness and maintain a more realistic perspective. It can also be helpful to consider multiple perspectives and sources of information before forming expectations, as well as being open to adjusting them based on new information or experiences. Additionally, focusing on the process rather than the outcome can help reduce the pressure of expectations and allow individuals to enjoy the journey towards their goals.
Simple Logic
Simple logic refers to the tendency to oversimplify complex problems or situations, leading to flawed conclusions or decisions. This bias occurs when individuals make decisions based on simple and easily understandable information, rather than considering all available information and potential outcomes.
Example: A person may make a decision to invest in a particular stock based solely on the company’s brand name or popularity, without analyzing the company’s financial health, competitors, or other factors that could impact the stock’s performance.
To counter simple logic bias, it is important to gather and analyze all relevant information and to consider multiple perspectives and potential outcomes. Additionally, seeking input and advice from others with expertise in the area can help to avoid oversimplification and increase the accuracy of decisions.
Forer Effect
The Forer effect, also known as the Barnum effect, is a cognitive bias in which individuals tend to believe that vague and general personality descriptions are highly accurate and specific to them, despite the descriptions being applicable to a wide range of people.
Example: A horoscope that reads “You are a generous and caring person, but can also be reserved and introspective” can be interpreted by many individuals as a personal and accurate description of themselves. However, this statement is so general that it can apply to almost anyone.
Countermeasure: To counter the Forer effect, individuals can develop critical thinking skills to evaluate the accuracy and validity of personality descriptions and assessments. They can also strive to obtain more specific and concrete information about themselves rather than relying on general statements.
Volunteer’s Folly
Volunteer’s Folly is a cognitive bias that can occur when people volunteer for a task or project without fully considering the amount of work, time, or resources required to complete it. Volunteers may underestimate the difficulty or complexity of the task and overestimate their own ability to complete it, leading to unrealistic expectations and potential failure. Dobelli also gives the folly an economic twist: a highly paid professional who spends a day doing unskilled volunteer work often contributes less than if they had worked their own job and donated the wages, unless the volunteering draws on their specific skills.
Volunteer’s Folly can also occur when people are asked to volunteer for a task and feel pressured to say yes, even if they are not truly interested or capable of completing it. This can lead to resentment, frustration, and a negative experience for both the volunteer and the organization or project they are supporting.
To counter Volunteer’s Folly, it is important to take the time to fully understand the scope and requirements of a task before volunteering to ensure it is a good fit for your skills, interests, and available resources. Organizations can also help prevent Volunteer’s Folly by clearly communicating expectations and providing adequate support and resources to volunteers.
Affect Heuristic
The affect heuristic is a cognitive bias where people rely on their emotions, feelings, or “affect” when making decisions or judgments, rather than logical or objective reasoning. This bias occurs when people make judgments or decisions based on their emotional reactions to a situation, rather than on a careful analysis of the facts.
For example, imagine that you are trying to choose between two different job offers. One job offers a higher salary, but the other job is more in line with your personal values and passions. If you rely on the affect heuristic, you may choose the job that aligns with your passions, even if it pays less. This is because you may be more emotionally invested in that job, and your emotions are guiding your decision.
To counter the affect heuristic, it is important to be aware of your emotions and try to separate them from the decision-making process. Take the time to consider the facts and logic behind a decision, rather than just relying on your initial emotional reaction. It may also be helpful to seek the opinions of others who can offer a more objective perspective on the situation.
Introspection Illusion
The introspection illusion is a cognitive bias that occurs when individuals believe that they have access to their own internal mental states, thoughts, and emotions, and that these are transparent and easily understood. In other words, people tend to overestimate their own ability to accurately and objectively understand their own mental processes.
For example, a person may believe that they are very good at reading other people’s emotions and intentions based on subtle cues, but may fail to recognize their own biases and misinterpretations of those cues.
To counter the introspection illusion, it is important to engage in self-reflection and recognize that our own subjective experiences and interpretations may not always be accurate or reliable. Seeking feedback from others and considering alternative perspectives can also help to overcome this bias. Additionally, using external sources of information and data to supplement our own introspection can help to provide a more objective and accurate understanding of our own mental processes.
Inability to Close Doors
The Inability to Close Doors is a cognitive bias that refers to our reluctance to let options go: we try to keep every door open, even when maintaining those options costs time, energy, and focus. This can lead to a lack of productivity and an accumulation of unfinished projects and postponed decisions, as individuals struggle to abandon anything they have invested time and energy in.
Examples of this bias can be seen in individuals who struggle to complete tasks that they have started, even if it is clear that the task is no longer a priority or if there are more important tasks that require attention. For instance, an individual may continue to work on a project that is no longer relevant or beneficial, simply because they have already invested a significant amount of time and effort into it.
To counter the Inability to Close Doors bias, it is important to regularly reassess one’s priorities and goals. This can involve setting clear deadlines for projects and tasks, as well as regularly reviewing progress to determine whether or not a particular task or project is still worth pursuing. Additionally, it can be helpful to establish a system for prioritizing tasks and projects, which can help to prevent individuals from becoming overwhelmed by the number of tasks that need to be completed.
Neomania
Neomania is a cognitive bias in which an individual is excessively attracted to novelty and new things. People affected by neomania tend to prefer and be drawn to new products, ideas, or situations over more familiar or established ones, regardless of their actual value or utility. This can lead to impulsivity in decision-making, risk-taking, and an over-reliance on the latest trends and fads.
Examples of neomania can be seen in the technology industry, where consumers often clamor for the latest gadgets and software releases, even if they do not necessarily provide any significant improvement over previous versions. Similarly, in the fashion industry, trends come and go rapidly, and many people feel the need to stay on top of the latest styles and fashions, regardless of whether they are practical or comfortable.
To counteract neomania, individuals should try to evaluate new products or ideas based on their actual value, rather than simply because they are new or trendy. It can also be helpful to take time to reflect on one’s own values and needs before making impulsive decisions, and to seek out the opinions of others who may have more experience or knowledge in the relevant area. Additionally, setting clear goals and priorities can help individuals stay focused on what is truly important, rather than being distracted by every new thing that comes along.
Sleeper Effect
The sleeper effect is a phenomenon in which a message or information that is initially perceived as less credible or persuasive becomes more persuasive over time. It occurs when people forget the source of the information but retain the message itself, and it is most likely to occur when the message is memorable or interesting.
For example, in classic persuasion studies, participants who heard a message from a low-credibility source were initially less persuaded than those who heard the same message from a high-credibility source. Weeks later, however, the gap had narrowed or even reversed: people remembered the argument but had forgotten its dubious source, so the discounted message regained its persuasive force over time.
To counter the sleeper effect, it is important to remind people of the source of the information, and to emphasize the importance of considering the credibility of the source when evaluating the information. It is also important to present information in a clear and concise manner, and to avoid using tactics that may distract from the message itself.
Alternative Blindness
Alternative blindness is a cognitive bias where people fail to see alternatives to a preferred option. This bias can arise when individuals become fixated on a particular course of action or idea and are unable to consider other options that may be just as good or better. People who suffer from alternative blindness may overlook opportunities, fail to explore new possibilities, and may end up making suboptimal decisions.
Examples of alternative blindness can be found in a range of different contexts. For instance, a job applicant might become fixated on a particular position and be blind to other career opportunities. Similarly, investors may become too attached to a particular stock or investment, leading them to overlook other investment opportunities.
To counter alternative blindness, individuals should take the time to consider multiple options and actively seek out new ideas and alternatives. It can also be helpful to get feedback from others and to consider different perspectives to avoid getting too fixated on a particular option. Additionally, individuals should be open-minded and willing to re-evaluate their choices as new information becomes available.
Social Comparison Bias
Social comparison bias refers to the tendency of individuals to evaluate their own abilities and opinions by comparing themselves with others. People often engage in social comparison to gain information about their own abilities and to evaluate their own opinions and beliefs. However, this bias can also lead to negative consequences, such as feelings of inferiority or superiority, as well as distorted self-perceptions.
Examples of social comparison bias can be seen in various contexts. For instance, a person may compare their academic performance with that of their peers to determine how they measure up. Alternatively, someone may compare their physical appearance with that of others to assess their own attractiveness. Social comparison can also occur in the workplace, where individuals may compare their job performance or salary with those of their colleagues.
To counteract social comparison bias, individuals can focus on their own strengths and accomplishments, rather than constantly comparing themselves to others. Additionally, seeking out objective feedback and focusing on personal growth and improvement can help to reduce the negative impact of social comparison bias. It’s also important to recognize that everyone has unique strengths and weaknesses, and that it’s not always helpful or productive to compare oneself to others.
Primacy and Recency Bias
Primacy and recency bias refer to the tendency of people to better remember the first and last items in a series or list, and to be more influenced by them when making judgments or decisions. These biases are instances of the serial position effect: the order in which information is presented affects how well people perceive and remember it.
Examples of primacy and recency bias can be found in various domains. For instance, in a job interview, the interviewer may be more likely to remember the first and last candidate interviewed, and be influenced by them in their final hiring decision. Similarly, in a political campaign, voters may be more likely to remember the first and last candidates they heard about, and be more inclined to vote for them. In advertising, companies may strategically place their product at the beginning or end of a commercial break to increase its impact and memorability.
To counteract primacy and recency bias, it can be helpful to increase the salience of middle items in a series or list, such as by highlighting them or repeating them. In a job interview or election, it may be useful to take notes or rate each candidate immediately after they have been interviewed or presented, in order to avoid the influence of primacy and recency. Additionally, taking breaks or switching tasks between presentations of information can help to reduce the effects of these biases.
Not Invented Here Syndrome
Not Invented Here Syndrome (NIHS) refers to the tendency of individuals or organizations to reject ideas, products, or processes developed externally, even if they are superior, and instead opt to develop their own solutions from scratch. The syndrome is prevalent in many industries, such as technology, where companies are often resistant to adopting innovative ideas from competitors or startups.
NIHS can be detrimental to organizations in various ways. Firstly, it can lead to a waste of resources, as organizations spend time and money developing solutions that already exist in the market. Secondly, it can result in missed opportunities for growth and development, as organizations miss out on valuable ideas and technologies that could enhance their products and services. Finally, it can lead to a culture of isolation and insularity, where employees are resistant to change and unwilling to collaborate with external partners.
Countermeasures to NIHS include:
- Acknowledge the syndrome: Managers need to recognize that NIHS exists in their organization and actively work to counter it. This includes creating a culture of openness and collaboration, where employees are encouraged to share and adopt external ideas.
- Promote diversity: Hiring employees from diverse backgrounds, with different perspectives and experiences, can help to challenge existing assumptions and biases.
- Encourage experimentation: Providing employees with the freedom and resources to experiment with new ideas can help to overcome resistance to change and encourage innovation.
- Foster external partnerships: Developing partnerships with external organizations, such as startups, can help to bring in new ideas and technologies that may not have been considered otherwise. This can also help to build a culture of openness and collaboration.
- Monitor decision-making: Managers should monitor decision-making processes to ensure that biases, such as NIHS, are not influencing outcomes. This can involve seeking external feedback or engaging in critical self-reflection.
The Black Swan
The Black Swan is a term coined by Nassim Nicholas Taleb in his book “The Black Swan: The Impact of the Highly Improbable.” The term refers to an event that is rare, unpredictable, and has a significant impact. The idea behind the black swan is that it is impossible to predict these types of events with traditional forecasting methods, and they can have a massive impact on individuals, organizations, and society as a whole.
Examples of black swan events include the 9/11 terrorist attacks, the 2008 financial crisis, and the COVID-19 pandemic. These events were unexpected, had a significant impact on society, and were difficult to predict with traditional forecasting methods.
Countermeasures to black swan events include scenario planning and stress testing. Scenario planning involves imagining multiple possible outcomes and planning for each scenario. Stress testing involves testing the resilience of systems and processes to determine their ability to withstand unexpected events.
It’s important to note that while black swan events are difficult to predict, they are not impossible to prepare for. By being aware of the potential for unexpected events and taking steps to prepare for them, individuals and organizations can better mitigate the impact of these events when they do occur.
Domain Dependence
Domain dependence is a cognitive bias that occurs when a person’s judgment or decision-making is influenced by the specific context or domain in which they are operating. It can cause individuals to make decisions that are inconsistent with their overall goals and values.
For example, a person may be an excellent decision-maker in their professional life but make poor decisions in their personal life due to domain dependence. This can manifest in various ways, such as a person being a skilled investor but being unable to manage their personal finances effectively.
To counteract the effects of domain dependence, it can be helpful to take a step back and evaluate decisions from a broader perspective. This may involve seeking input from others who are not as closely tied to the particular domain, or consciously considering how the decision fits into one’s overall values and goals. Additionally, seeking out diverse experiences and perspectives can help to break down domain dependence and promote more well-rounded decision-making.
False Consensus Effect
The False Consensus Effect is a cognitive bias that occurs when people overestimate the extent to which their opinions, attitudes, beliefs, or behaviors are shared by others. This bias is often due to the fact that people tend to surround themselves with like-minded individuals, leading them to assume that their views are more widely held than they actually are.
For example, someone who strongly supports a particular political candidate may assume that most other people they encounter also support that candidate, even if this is not necessarily true. This bias can also be seen in group settings, where individuals may assume that their group’s beliefs and opinions are more widely shared than they actually are.
To counter the false consensus effect, it can be helpful to seek out diverse perspectives and engage in dialogue with people who hold different opinions and beliefs. It is also important to consider empirical evidence and objective data rather than relying solely on personal experiences and anecdotal evidence.
Falsification of History
Falsification of history, also known as historical negationism, is the deliberate distortion or denial of historical events or facts. This is often done for political or ideological purposes, and it can have serious consequences for how people understand the past and present.
Examples of falsification of history include Holocaust denial, the denial of the Armenian Genocide, and the denial of the Nanjing Massacre. In each of these cases, there are individuals or groups who reject the overwhelming historical evidence that these events occurred and seek to spread alternative narratives that downplay or deny their significance.
Falsification of history can also take more subtle forms, such as selective editing of historical records, downplaying certain events or figures, or emphasizing others to fit a particular agenda. This can be particularly effective when combined with propaganda, as it can lead people to accept a distorted version of history without questioning its accuracy.
To counter the falsification of history, it is important to promote accurate and objective historical research and education. This includes teaching critical thinking skills to help individuals evaluate historical sources and recognize biased or distorted narratives. It also involves preserving historical records and artifacts and promoting open access to them so that they can be studied and interpreted by a wide range of people.
In-Group/Out-Group Bias
In-group out-group bias is a psychological tendency to favor individuals belonging to one’s own group (the in-group) over individuals belonging to other groups (the out-group). This bias can occur in various settings, such as sports teams, political groups, and nationalities.
Examples of in-group/out-group bias include favoring one’s own political party and demonizing the opposition, viewing people from one’s own country as superior to those from other countries, and supporting one’s own sports team while denigrating its rivals.
The in-group/out-group bias can lead to harmful behaviors, such as discrimination, prejudice, and even violence against members of the out-group. To counter this bias, individuals can try to expand their social circles and interact with people from different backgrounds, challenge their own assumptions and stereotypes, and actively seek out diverse perspectives.
Additionally, organizations can implement policies that promote diversity and inclusion, such as hiring practices that prioritize candidates from underrepresented groups, creating safe spaces for open and honest conversations about bias and discrimination, and offering training and education on cultural competence.
Ambiguity Aversion
Ambiguity aversion refers to the tendency of individuals to avoid options or situations with uncertain outcomes or where the probability distribution is not known. This bias can manifest in several ways, including:
- Preference for known risks over unknown risks: People may prefer to take a known risk with a well-understood probability distribution over an unknown risk, even if the unknown risk has a higher expected value.
- Preference for clear information over ambiguous information: People may prefer clear, unambiguous information even when richer but more ambiguous information would actually be more useful.
- Avoidance of complex decisions: People may avoid complex decisions that involve uncertainty, preferring simpler, more straightforward options.
Examples of Ambiguity Aversion:
- Investment decisions: An investor may choose to invest in a bond with a fixed rate of return, even if the returns on a variable investment are likely to be higher, simply because the variable investment has greater ambiguity.
- Medical decisions: A patient may choose a treatment with known risks and benefits over a treatment with uncertain outcomes, even if the latter is potentially more effective.
Countermeasures for Ambiguity Aversion:
- Encourage information gathering: Providing more information about the probabilities of uncertain outcomes can help individuals make more informed decisions.
- Break down complex decisions: Breaking a complex decision down into smaller, more manageable components can reduce the level of ambiguity and make it easier to arrive at a decision.
- Encourage experimentation: Encouraging individuals to experiment with new options can help reduce the fear of ambiguity and increase comfort with uncertain outcomes.
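The preference for known risks over unknown ones is classically illustrated by an Ellsberg-style two-urn gamble. The sketch below (Python, with invented parameters, not an example from the book) simulates drawing from a known 50/50 urn and from an urn with an unknown mix; on average the ambiguous bet is no worse, yet most people avoid it:

```python
import random

# Hypothetical two-urn gamble: a sketch of ambiguity aversion.
# Known-risk urn: exactly 50 red / 50 black balls, so P(red) = 0.5.
# Ambiguous urn: 100 balls, unknown red/black mix. If every mix is
# equally likely, the long-run chance of drawing red is still 0.5.

def draw_known():
    """Draw from the urn with a known 50/50 split (True = red)."""
    return random.random() < 0.5

def draw_ambiguous():
    """Draw from an urn whose red-ball count is unknown (uniform 0..100)."""
    reds = random.randint(0, 100)
    return random.random() < reds / 100

random.seed(42)
trials = 100_000
known_wins = sum(draw_known() for _ in range(trials)) / trials
ambiguous_wins = sum(draw_ambiguous() for _ in range(trials)) / trials

# Both win rates hover around 0.5: the ambiguous bet is not actually
# worse in expectation, yet people strongly prefer the known urn.
print(f"known urn win rate:     {known_wins:.3f}")
print(f"ambiguous urn win rate: {ambiguous_wins:.3f}")
```

The point of the simulation is that the aversion is to the *unknown probability distribution* itself, not to any difference in expected value.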
Default Effect
The Default Effect is a cognitive bias where people are more likely to choose an option that is pre-selected or presented as the default option. This bias occurs because people often take the default option as a signal that it is the most reasonable, convenient, or popular choice. The default option can be an actual default setting or the most commonly chosen option.
Examples of the Default Effect can be found in various aspects of everyday life. For instance, people are more likely to donate to charity if they are automatically enrolled in a program that takes a portion of their paycheck as a donation. In contrast, people are less likely to donate if they have to take the initiative to make a donation themselves. Similarly, people are more likely to choose healthier food options if they are presented as the default option in a cafeteria.
To counter the Default Effect, it is essential to be aware of the influence that pre-selected options can have on our decision-making. One way to do this is to consider alternatives to the default option and evaluate them based on their merits. Another way to counter the Default Effect is to actively choose to opt-out of the default option and explore other alternatives. It is also essential to recognize that the default option is not necessarily the best option, and taking the time to evaluate alternative options can lead to better decision-making.
Fear of Regret
Fear of regret is a cognitive bias that refers to the tendency to choose a less risky option because of the fear of regretting a decision if it turns out badly. It is a type of loss aversion, where the fear of losing something is stronger than the potential gain.
Example: An individual may choose a less rewarding job over a more challenging one because they fear regretting the decision if they fail or are not able to perform well in the more challenging job.
How to counter: One way to counter this bias is to focus on the potential positive outcomes of a decision rather than the fear of negative outcomes. Another approach is to consider the potential regret of not taking a risk or making a decision that could lead to greater rewards. It is important to weigh the potential risks and rewards of a decision and make an informed choice rather than being driven solely by fear of regret.
Salience Effect
The salience effect is a cognitive bias that refers to the tendency of people to focus on the most noticeable or memorable aspects of a situation or event, rather than on those that are less noticeable or memorable. This can lead people to overestimate the importance or likelihood of certain outcomes, or to ignore or downplay other factors that may be just as important.
For example, if a person hears a news story about a rare and deadly disease, they may become very fearful of contracting that disease, even if the actual risk of infection is very low. This is because the vividness and emotional impact of the news story makes the threat seem more salient and immediate than other, more likely risks that are less visible or emotionally compelling.
To counteract the salience effect, it is important to consider all available information and to try to balance the emotional impact of vivid or memorable events with more objective or statistical information about risk and probability. This may involve seeking out multiple sources of information, weighing the relative importance of different factors, and avoiding knee-jerk reactions based solely on the most salient or emotionally charged aspects of a situation.
House Money Effect
The house money effect is a cognitive bias that suggests people are more likely to take greater risks when they perceive the money being used as “not their own.” This bias is often observed in casinos, where players are more likely to bet with their winnings from earlier rounds (i.e., house money) rather than the money they brought with them.
For example, imagine a person goes to a casino with $100 and wins $500. They might be more willing to take risks with the $500 because they see it as “house money” and not their own. This can lead to riskier behavior that they wouldn’t have taken if they had started with $500 in the first place.
To counter the house money effect, it is important to remember that all money, regardless of where it came from, is equal in value. One way to avoid this bias is to set aside a certain amount of money for gambling and not use any winnings to continue gambling beyond that amount. It can also be helpful to remember that past results do not necessarily predict future outcomes, and each round of gambling is independent of the previous one.
Procrastination
Procrastination is the tendency to delay or avoid completing tasks or making decisions, often until the last possible moment. This can lead to decreased productivity, increased stress and anxiety, and missed opportunities.
Example: An individual may delay completing an important project until the day before it is due, causing them to rush and produce lower quality work.
Counter: Some strategies to overcome procrastination include breaking tasks into smaller, manageable pieces, setting clear deadlines and timelines, and minimizing distractions such as social media and email notifications. Additionally, practicing good time management habits, such as prioritizing tasks and setting aside dedicated time for work, can help reduce the tendency to procrastinate. It may also be helpful to identify the root causes of procrastination, such as fear of failure or lack of motivation, and address them directly.
Envy
Envy is an emotion that occurs when one person desires something that another person has, whether it be material possessions, status, or achievements. Envy often arises when the person who is envious perceives a sense of injustice or unfairness in the situation, and feels that they themselves deserve what the other person has.
Envy can have negative effects on individuals and relationships, leading to feelings of resentment, bitterness, and hostility. It can also lead to social comparison, where an individual evaluates their own worth and success based on how they measure up to others.
One way to counter the negative effects of envy is through gratitude. Focusing on the good things in one’s life and being thankful for what one has can help shift the focus away from what is lacking and towards what is present. Another strategy is to engage in positive self-talk, focusing on one’s own strengths and accomplishments instead of comparing oneself to others. Finally, practicing empathy and compassion towards others can help to reduce feelings of envy by fostering a sense of connection and shared humanity.
Personification
Personification is a cognitive bias in which we attribute human-like qualities, emotions, or intentions to non-human entities such as animals, objects, or even natural phenomena. This bias can be seen in literature, art, and everyday language, where animals are often depicted with human emotions and characteristics.
For example, a common personification is “The wind whispered through the trees.” In reality, the wind cannot whisper because it does not have a mouth, but we give it human-like qualities to make it more relatable and understandable.
Personification can be used for artistic purposes, but it can also lead to misunderstandings and incorrect assumptions. For example, a person may assume that a pet cat is being disobedient or vindictive when it scratches furniture, when in reality it is simply following its natural instincts.
To counteract the personification bias, it is important to recognize when we are attributing human-like qualities to non-human entities and to question whether these attributions are accurate or helpful. It can also be helpful to seek out factual information about non-human entities to avoid making assumptions based on personal biases.
Illusion of Attention
The illusion of attention is a cognitive bias that occurs when people believe they are paying attention to more information than they actually are. It is closely related to inattentional blindness, the phenomenon where individuals fail to notice an unexpected stimulus in their field of vision because they are focused on something else.
For example, a person may be driving on the road and focusing their attention on the car in front of them. In doing so, they may not notice a pedestrian crossing the street, even though they are right in front of them. This is because their attention is focused on one thing, and they are not consciously aware of the other stimuli around them.
To counteract the illusion of attention, individuals can practice mindfulness and being present in the moment. This involves paying attention to one’s surroundings and consciously trying to notice new stimuli. Additionally, taking breaks and switching focus can help combat inattentional blindness. By intentionally changing the focus of attention, individuals can help broaden their awareness and avoid missing important information.
Strategic Misrepresentation
Strategic misrepresentation is a cognitive bias where individuals selectively represent information in a manner that will produce the desired outcome. This can occur in many contexts, including politics, marketing, and negotiation.
For example, a politician may strategically misrepresent information about their voting record to appeal to a certain constituency. They may highlight their support for a particular issue while downplaying or omitting information about their stance on other issues.
In marketing, companies may use strategic misrepresentation to manipulate consumers into making purchases. They may highlight the benefits of a product while downplaying its drawbacks or negative side effects.
To counter the effects of strategic misrepresentation, it is important to seek out diverse sources of information and critically evaluate the information being presented. It can also be helpful to research both sides of an issue and seek out opinions from individuals with different perspectives. Additionally, being aware of the potential biases of the person or organization presenting the information can help to mitigate the effects of strategic misrepresentation.
Overthinking
Overthinking is a cognitive bias in which an individual spends too much time analyzing, worrying, and ruminating about a situation, which can lead to decision paralysis and unnecessary stress. Overthinking can be a result of anxiety, perfectionism, and low self-esteem.
For example, if someone is offered a job opportunity, they may spend an excessive amount of time overanalyzing the offer, the company, the salary, and the responsibilities, which can lead to indecisiveness and missed opportunities.
To counter overthinking, it is essential to practice mindfulness and present moment awareness. One can also benefit from setting boundaries, creating a to-do list, and seeking support from others. Learning to let go of perfectionism and embracing imperfection can also help reduce overthinking. Additionally, learning relaxation techniques such as deep breathing, meditation, and yoga can help reduce anxiety and improve decision-making.
Planning Fallacy
The planning fallacy is a cognitive bias in which individuals tend to underestimate the time and resources required to complete a future task, even when they have experience with similar tasks. People often have an optimistic outlook and assume that everything will go according to plan, without considering unexpected delays, complications, or obstacles.
For example, a student may believe that they can finish a research paper in one week, even though similar papers have taken them much longer in the past. Similarly, a construction manager may underestimate the time and cost required to complete a project, leading to delays and budget overruns.
One way to counter the planning fallacy is to use reference class forecasting, which involves looking at the actual outcomes of similar projects or tasks to predict the likely outcome of the current project or task. Another strategy is to break the task down into smaller, more manageable pieces and estimate the time and resources required for each piece individually, rather than trying to estimate the overall time and resources required for the entire project at once. Finally, it can be helpful to involve others in the planning process, as they may offer a more realistic perspective and help identify potential challenges and obstacles.
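Reference class forecasting can be sketched in a few lines. The numbers below are invented for illustration: past durations of similar tasks stand in for the "outside view", and the optimistic one-week guess stands in for the "inside view":

```python
import statistics

# Hypothetical reference class forecasting: instead of trusting the
# optimistic "inside view" estimate, look at how long similar past
# tasks actually took (the "outside view"). All numbers are made up.

past_durations_days = [9, 12, 7, 15, 11, 10, 14]  # similar past papers
inside_view_estimate = 7  # "I can finish it in a week"

# The median of the reference class is a realistic central estimate;
# a high percentile gives a safer deadline with buffer for surprises.
median = statistics.median(past_durations_days)
p80 = sorted(past_durations_days)[int(0.8 * len(past_durations_days))]

print(f"inside-view estimate:             {inside_view_estimate} days")
print(f"reference-class median:           {median} days")
print(f"80th-percentile (safer deadline): {p80} days")
```

Even this toy version shows the typical pattern: the outside view (11 days at the median, 14 at the 80th percentile) is substantially less optimistic than the gut estimate.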
Déformation Professionnelle
Déformation professionnelle is a cognitive bias in which an individual’s professional training and specialized knowledge create a tendency to view all problems and situations through the lens of their specific profession, even when this narrow perspective is not appropriate or relevant. Essentially, this bias occurs when someone sees the world only through the narrow lens of their profession and is unable to see beyond it.
For example, a doctor may be inclined to diagnose someone with a medical condition, even if the symptoms are not necessarily indicative of a medical issue. Or an engineer may be prone to seeing a problem as a technical one, when it could also have organizational or social causes.
This bias can be countered by seeking out perspectives from individuals outside of one’s profession or discipline, and by deliberately trying to take a broader perspective when analyzing problems or situations. It is also helpful to recognize that different situations may require different approaches and that no one profession or discipline has all the answers.
Zeigarnik Effect
The Zeigarnik effect refers to the phenomenon where people tend to remember and recall uncompleted or interrupted tasks better than completed tasks. The effect is named after Bluma Zeigarnik, a Russian psychologist who first observed it in the 1920s.
For example, if someone is interrupted while reading an article, they are more likely to remember the details of what they read compared to if they had read the entire article uninterrupted. Similarly, if someone is given a list of tasks to complete, they may remember the incomplete tasks better than the ones they have already finished.
The Zeigarnik effect is often used as a strategy to improve memory retention, by intentionally interrupting or leaving tasks incomplete to increase recall. However, it can also contribute to feelings of stress and anxiety when we are unable to let go of incomplete tasks and they continue to occupy our thoughts.
One way to counteract the negative effects of the Zeigarnik effect is to create a plan to complete tasks and make progress towards achieving goals, as this can reduce the mental burden of unfinished tasks. Additionally, practicing mindfulness and being present in the moment can help to reduce the impact of incomplete tasks on our mental state.
Illusion of Skill
The illusion of skill is a cognitive bias in which people overestimate their abilities or expertise in a particular area. It often occurs when people have some experience in a particular skill or activity, and they begin to believe that they are better at it than they actually are. This bias can be seen in a wide range of domains, including sports, art, and business.
Examples of the illusion of skill can be seen in many areas of life. For instance, a person who has played golf a few times might start to believe that they are a much better golfer than they actually are. They may become overconfident in their abilities and begin to take risks that they would not have taken if they were more aware of their true level of skill.
Another example might be seen in a person who has had some success in a particular business venture. They may begin to believe that they are experts in the field, even though their success may have been due to a number of factors outside of their control.
To counter the illusion of skill, it is important to remain humble and open to learning. Seeking feedback and actively working to improve one’s skills can also help to counter this bias. Additionally, recognizing that success may be due to a combination of factors, including luck and external circumstances, can help to keep one’s perspective in check.
Feature Positive Effect
The Feature Positive Effect is a cognitive bias where people tend to focus on the positive features or attributes of an object or a person, and give less weight or ignore the negative ones. This bias can lead to overvaluing or overestimating the qualities of an object or a person, while ignoring the potential downsides or limitations.
For example, when considering purchasing a new car, a person might focus on the positive features such as good gas mileage, a sleek design, and advanced safety features, while overlooking negative features such as a high price, a small trunk space, and low reliability ratings.
To counter the Feature Positive Effect, it is important to consider both the positive and negative features of an object or a person, and weigh them equally in making a decision. It can also be helpful to seek out information from unbiased sources, such as expert reviews or ratings, to get a more balanced view.
Cherry Picking
Cherry picking is a cognitive bias where a person only chooses or focuses on information that supports their beliefs or point of view while ignoring or dismissing information that contradicts it. It is a form of confirmation bias where someone seeks out evidence that confirms their pre-existing beliefs and ignores evidence that challenges those beliefs.
For example, a person who strongly believes in the health benefits of a particular diet may selectively search for and share articles or studies that support the benefits of the diet, while ignoring or dismissing articles or studies that raise questions or concerns about it.
To counter cherry picking, it is important to approach information objectively and consider multiple perspectives and sources of information. It is also important to be aware of your own biases and actively seek out information that challenges your beliefs in order to gain a more well-rounded understanding of a topic. Critical thinking skills, fact-checking, and seeking out diverse perspectives can also help counteract the effects of cherry picking.
Fallacy of the Single Cause
The fallacy of the single cause (also known as causal reductionism or causal oversimplification) is a logical fallacy that occurs when someone oversimplifies a complex event or phenomenon by assigning it a single, simple cause when multiple causes were actually at work.
Examples of the fallacy of the single cause include:
- A person attributes their success entirely to their hard work and talent, ignoring external factors such as luck, privilege, and support from others.
- A person blames all of their problems on a single factor, such as their upbringing or a particular person, ignoring the complexity of the situation and the influence of other factors.
- A news article claims that a particular event was caused by a single factor, such as a particular policy or individual, when in reality there were likely multiple factors at play.
To counter the fallacy of the single cause, it is important to acknowledge the complexity of situations and events, and consider multiple factors and perspectives when analyzing them. It can be helpful to gather and evaluate evidence from multiple sources, and to be open to the possibility that there may be multiple causes for a particular outcome.
Intention to Treat Error
Intention to treat (ITT) analysis is a statistical method used in clinical trials to evaluate the effectiveness of medical treatments. It involves analyzing the data according to the initial treatment assignment of the study participants, regardless of whether they completed the treatment or not. The purpose of ITT analysis is to prevent the bias that can occur when participants drop out of a study or switch treatments, which can affect the outcome.
However, the intention to treat error can occur when the researchers do not follow the protocol of the ITT analysis properly. This can happen in several ways, including:
- Including patients who were not supposed to be in the study: If patients who were not eligible to participate in the study are included in the analysis, it can affect the results.
- Excluding patients who were supposed to be in the study: If patients who were eligible to participate in the study are excluded from the analysis, it can also affect the results.
- Protocol violations: If patients do not follow the protocol of the study, it can affect the results. For example, if a patient does not take their medication as prescribed, it can affect the outcome of the study.
To avoid the intention to treat error, researchers should ensure that they follow the protocol of the ITT analysis properly. They should also analyze the data according to the initial treatment assignment, regardless of whether the patients completed the treatment or not. By doing this, they can reduce the bias that can occur in clinical trials and get more accurate results.
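The difference between ITT and a "completers-only" (per-protocol) analysis can be shown with a toy dataset. Everything below is invented for illustration; the point is only that dropping non-completers changes the estimate:

```python
# Hypothetical trial data illustrating why ITT analysis matters.
# Patients are analyzed by the arm they were ASSIGNED to, even if
# they dropped out or never completed treatment.

patients = [
    # (assigned_arm, completed_treatment, recovered)
    ("drug", True, True), ("drug", True, True), ("drug", True, False),
    ("drug", False, False), ("drug", False, False),   # dropouts fared worse
    ("placebo", True, False), ("placebo", True, True),
    ("placebo", True, False), ("placebo", False, False),
    ("placebo", True, False),
]

def recovery_rate(rows):
    return sum(recovered for _, _, recovered in rows) / len(rows)

# Intention-to-treat: everyone assigned to the arm counts, dropouts included.
itt_drug = recovery_rate([p for p in patients if p[0] == "drug"])

# Per-protocol: only completers count. This silently drops the patients
# who struggled with the treatment and inflates the apparent benefit.
pp_drug = recovery_rate([p for p in patients if p[0] == "drug" and p[1]])

print(f"ITT drug recovery rate:          {itt_drug:.2f}")  # 2/5 = 0.40
print(f"Per-protocol drug recovery rate: {pp_drug:.2f}")   # 2/3 = 0.67
```

In this toy example the per-protocol estimate looks far more favorable than the ITT estimate, which is exactly the bias that analyzing by initial assignment is designed to prevent.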
News Illusion
The News Illusion is a cognitive bias that describes the tendency to believe that one is informed about the world and current events based on consuming news media, despite often only being exposed to a narrow and biased selection of information. People who experience the News Illusion may feel well-informed and knowledgeable, but in reality, they may only have a limited understanding of the complex issues and events they are following.
The News Illusion can be especially dangerous when it comes to politics and social issues, as people may feel that they have formed informed opinions and beliefs based on the news they consume, when in reality they may be missing important context or alternative perspectives. This can lead to polarization, where people become more entrenched in their views and less open to considering other perspectives.
To counter the News Illusion, it is important to seek out diverse sources of information, including those that present alternative viewpoints and are critical of one’s own beliefs. It can also be helpful to seek out in-depth and investigative journalism that goes beyond superficial coverage of events. Developing media literacy skills, such as evaluating sources for bias and accuracy, can also be helpful in avoiding the News Illusion.
Parting Thoughts
Phew! There you have it: all 99 cognitive biases listed and summarized. This was an interesting and informative reading experience. I understand there are more cognitive biases out there; as the author states, this book is not exhaustive.
It is also impossible to guard against all cognitive biases at all times, but knowing about them and being aware of them is the first step to making better decisions. There is a lot of applicable advice in the book if you read between the lines.
I practice observing some of these biases and fallacies as they occur in my own life. Most notably, whenever I am reading about a subject or having a discussion, I make it a point to analyze it from different perspectives in an attempt to avoid confirmation bias.
I highly recommend this book. Having read it cover to cover a couple of times, I am sure I will revisit parts of it again occasionally.