This post explores human bias. Not the kind we accuse each other of when we disagree with someone's opinion, but the kind built into the machine. Our behaviours developed over the majority of our existence as Homo sapiens in a nomadic, tribal context. We are worlds removed from that setting, yet we still unwittingly apply tribal behaviours to our current lives in complex societies. The consequences of running a reliably misfiring (emphasis on reliably) and out-of-date operating system are manifold. We will take a journey into evolutionary psychology to dig into why we are biased, what types of bias exist, when bias arises, how it affects us and how we can avoid falling victim to it.
There are many reasons to care about this topic. Not falling prey to our suboptimal biology seems like an important way to become a better human. In a world where we can't agree on anything, maybe we can unite around how we are all deficient. That may be a naive dream, but at least gaining awareness of how we fail reliably may help create empathy for the other (even if it doesn't lead to accepting one's own fallibility). On a group level, the pervasive nature of bias in human cognition makes it essential for researchers, policymakers, and professionals across fields to be aware of its potential impact. By recognising and accounting for bias, more accurate, objective, and equitable outcomes can be achieved in areas ranging from scientific research to social policy.
WE'RE OLD SCHOOL
Let's be real: a lot of humans' present-day suboptimal behaviour is related to the fact that we only recently (in cosmic terms) left behind our old ways. Homo sapiens has been around for roughly 150,000-300,000 years. Some sort of settled civilisation has existed for 10,000-15,000 years. This means that 90-97% of our existence has been as tribal hunter-gatherers. That life was nomadic, marked by scarcity and focused on survival. Our physical and mental operating system was trained in that environment for hundreds of thousands of years. Physiological, safety, belongingness and esteem needs were all geared towards the main goal: in a world of brutal adversity, stick around for as long as you can to create healthy and fit offspring and guarantee the survival of the species. This is still, by and large, the foundation of human behaviour today. Our societies are very different now, and our nurture in this new reality adapts some of those embedded behaviours to be more in line with our current circumstances. But our origins cannot be ignored.
Let's flesh out some specific reasons why we may be biased by our tenure in the ancestral environment:
Efficient information processing: With predators all around, the ability to quickly process information and make rapid decisions could be the difference between life and death. Shortcuts that enabled quicker, energetically cheaper, albeit not always accurate, decision-making were useful.
Learning from experience: Confidence in one's decision-making abilities was important for reacting quickly in life-and-death situations. Over time, this could promote more assertive decision-making in future uncertain situations.
Pattern recognition: The human tendency to see patterns, even where none exist, may be adaptive. Recognising patterns quickly, like identifying a camouflaged predator, would have been crucial for our ancestors' survival.
Enhanced social cohesion: Favouring one's group and seeking confirmatory information to strengthen group bonds would have been advantageous in a cooperative hunting and gathering environment.
Self-preservation and reproductive success: Maintaining self-esteem and confidence can influence an individual's status within a group and attractiveness to potential mates, both of which have direct implications for reproductive success.
Risk aversion: Being highly attuned to potential threats and negative events (like predators or rival groups) would have been vital for survival.
The analysis of potential biases that grow out of neural wiring optimised as described above falls in the domain of evolutionary psychology. From this perspective, many biases we show today can be understood as adaptive mechanisms that provided our ancestors with survival and reproductive advantages in the ancestral environment. Applied to our present complex world, the wiring which led to those advantages is outdated and can cause some reliable problems, which we call bias.
NEW SCHOOL HANGUPS
Broadly speaking, biases fall into a few categories: decision-making biases, social biases, memory errors, belief biases and perceptual biases. The classification is not exact, and many of these biases interplay across categories. There is also far too much bias for one post, so I will focus on the ones that are both interesting and common.
Decision-Making Bias
Decision-making bias refers to systematic patterns of deviation from norm or rationality in making judgments. These biases can result from our cognitive processes, personal beliefs, emotions, and social influences, leading us to perceive reality through a skewed lens or make decisions that aren't in our best interest. As mentioned, brains want to be efficient at predicting the world. Integrating information that creates cognitive dissonance and requires rewiring is energetically more expensive than using existing wiring, so we'd rather not do it.
Having to predict a world in which we lose a thing that's already wired into the neural pattern is problematic. Enter Loss Aversion. In the ancestral environment, the cost of missing out on a potential gain (not finding food one day) was often less consequential than the cost of a comparable loss (losing all one's food supplies). The latter could lead to immediate threats to survival. Therefore, being more sensitive to potential losses would have increased our chances of survival and subsequent reproductive success. Numerous experiments, most notably those conducted by Kahneman and Tversky (in their development of prospect theory), have shown that we often make decisions that are inconsistent with expected utility theory. These decisions can be better explained when considering the asymmetrical value individuals place on gains versus losses. In other words, winning 100 bucks is not as great as losing 100 bucks is painful. This leads to decision-making that reduces innovation (the cost is up front), neglects long-term benefits (we'd rather have less money now than more later), leads to bad health (paying for the scan costs money), stalls negotiations (both parties focus on what they give up rather than on the upside) and encourages generally downside-focused thinking.
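The gain/loss asymmetry can be made concrete with a toy version of the prospect-theory value function. This is a sketch, not Kahneman and Tversky's exact formulation; the parameter values (a diminishing-sensitivity exponent of 0.88 and a loss-aversion coefficient of 2.25) are the ones commonly cited from their estimates, and the function name is mine:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Subjective value of gaining/losing x dollars.

    Gains are discounted by diminishing sensitivity (x ** alpha);
    losses are additionally amplified by the loss-aversion
    coefficient lam, so losses loom larger than equal gains.
    """
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

gain = prospect_value(100)   # feels like roughly +57 "utils"
loss = prospect_value(-100)  # feels like roughly -129 "utils"
print(f"win $100: {gain:.1f}, lose $100: {loss:.1f}")
```

Losing $100 weighs about 2.25 times as much as winning $100 feels good, which is exactly the asymmetry the experiments reveal.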
A resistance to change results in another well-known bias called the Sunk Cost Fallacy. This is the escalation of commitment to a sub-optimal or losing strategy. Essentially, decisions are overly influenced by past investments (time, money, or effort) rather than being based solely on expected future outcomes. A rational actor would assess a strategy by its incremental costs and benefits, not by past, irrecoverable costs. The reasons for this are similar to the above. Loss aversion doesn't allow us to accept our losses and move on. A resistance to change makes us prefer the status quo strategy. A lazy brain doesn't want to deal with cognitive dissonance.
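The rational rule is easy to state in code. In this illustrative sketch (the numbers and option names are invented), the amount already spent simply never enters the comparison:

```python
def best_strategy(options):
    """Choose the option with the highest expected future net value.

    Note what is absent: money already spent (the sunk cost) is not
    an input to this decision at all.
    """
    return max(options, key=lambda o: o["future_benefit"] - o["future_cost"])

options = [
    # We may have sunk a fortune into "legacy" already -- irrelevant here.
    {"name": "finish legacy project",  "future_benefit": 200, "future_cost": 150},
    {"name": "switch to new approach", "future_benefit": 300, "future_cost": 100},
]
print(best_strategy(options)["name"])  # -> switch to new approach
```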
The Anchoring Effect is a cognitive bias where individuals overly rely on an initial piece of information, known as the "anchor," when making subsequent judgments or decisions. Also introduced by Amos Tversky and Daniel Kahneman in their seminal research on heuristics and biases, anchoring occurs even when the anchor is irrelevant to the decision at hand or is blatantly arbitrary. Let me give you an actual experiment they ran, as it is eye-opening. Participants were asked to spin a wheel of fortune that was rigged to stop on either 10 or 65. After spinning the wheel, they were asked to estimate the percentage of African nations in the UN. Those who spun a 10 estimated, on average, that the percentage was 25%. In contrast, those who spun a 65 estimated it to be 45%. Even though the wheel's number was entirely random and unrelated to the question at hand, their brains went: "I just saw a high/low number, so let me guess high/low." Looking at this through an evolutionary lens, we understand that the behaviour here is rooted in the need to take quick survival-optimising actions, reduce cognitive dissonance, fit into the group, follow the leader, and so on. When cost-benefit analysis is limited to simple trade-offs this makes sense, but it is clearly not ideal when we make complex decisions.
The same thing happens with the so-called Framing Effect. The framing effect refers to the phenomenon wherein the way information is presented or "framed" can influence decision-making outcomes, even if the underlying substantive details remain unchanged. A choice framed in terms of potential gains is often perceived differently than the same choice framed in terms of potential losses, even if the two presentations are mathematically equivalent ("you will get 85 cents on the dollar" versus "it will cost you 15%").
Social Biases
Social bias refers to the systematic and non-random patterns of deviation in judgment or behaviour towards members of particular social groups. These biases play a significant role in shaping interpersonal interactions, group dynamics, and broader societal structures.
One of the most obvious candidates for this is the In-group Bias. It's at the heart of many social phenomena, from sports team rivalries to nationalistic fervour. Our ancestors lived in a dangerous world and had to band together. Forming tight-knit groups and coalitions provided essential benefits, such as shared resources, collective defence, and increased reproductive success. In this context, favouring one's in-group makes a lot of sense. Of course, it is natural to make group delineations given our meatspace reality (I need to take care of my own first), but the reliable misfiring of in-group thinking leads to stereotyping (or the Out-group Homogeneity Bias), prejudice and discrimination. Even worse, politicians of all colours use this cheap trick to galvanise their base and draw simplistic borders between people, who, thanks to this ancient circuitry, become unnecessarily suspicious of one another (rather than of said politicians).
The related Affinity Bias (sometimes referred to as "similarity bias") is the unconscious tendency for individuals to gravitate towards those who share similar characteristics, backgrounds, or experiences. Whether it's based on shared hobbies, educational backgrounds, cultural references, or even physical appearance, this bias drives individuals to feel a deeper connection or trust with those perceived as like them. On the surface this doesn't seem too problematic, until you think about workplace dynamics: hiring similar people, only listening to the extroverts, the "old boys' club", etc. Taken to an extreme, racism becomes an outgrowth of this affinity. Another extreme outcome is social bubbles, which reduce exposure to diverse experiences, cultures, and ideas.
The Fundamental Attribution Error (FAE) refers to the tendency for individuals to overemphasise dispositional or personality-based explanations for the observed behaviours of others, while underemphasising situational explanations. In other words, we have a propensity to believe that a person's actions are indicative of their innate character traits, rather than considering external situational factors that might have influenced their behaviour. Say we see a coworker snap at someone; our immediate reaction might be to label them as short-tempered. However, we might overlook situational factors, like the tyrannical six-month-old baby at home that has kept them up all night. Simply put, we are quick to judge someone wholesale based on one action. Again, the lazy brain is looking for simplification and reduced cognitive load. When observing others, the individual is often the primary focus of attention, because understanding the complex surrounding context is work. This uneven allocation of our attention can make dispositional factors seem more causally potent.
The flip side of the FAE is the Halo Effect. The term was originally coined by psychologist Edward Thorndike and refers to the cognitive bias where the perception of a single positive trait in a person influences our judgment of their other, unrelated characteristics. In essence, when someone excels in one area, there's a tendency to assume they are competent in other areas as well. So if an individual is perceived as physically attractive, they might also be assumed to be more intelligent, kind, or trustworthy, even in the absence of evidence supporting these assumptions. This is why people who talk well often get the job. It is also why brands employ celebrities to sell you things (Tom Brady telling us to use FTX as a crypto exchange).
Researchers Claude Steele and Joshua Aronson studied a cognitive phenomenon called Stereotype Threat. This refers to a situational predicament in which individuals feel at risk of confirming negative stereotypes about their social group. This pressure, whether perceived or real, can impede performance and alter behaviour, often in alignment with the very stereotype the individual seeks to disprove. For example, a woman aware of the stereotype suggesting women are weaker at mathematics might underperform on a maths test due to the anxiety and pressure stemming from this stereotype. This anxiety is due to the cognitive load generated by a mind that wants to fit in, even with an unhelpful group stereotype. Overcoming the stereotype, which means abandoning the group, consumes cognitive resources, reducing the capacity available for the task at hand.
Memory Biases
While memory often feels like a faithful recorder of past events, in reality it is more akin to a reconstructive process susceptible to various distortions. Memory biases are systematic patterns of deviation from norm or rationality in remembering. These biases can significantly shape our perceptions, beliefs, and decisions, often without our conscious awareness. Bias can creep in while encoding the memory, during "storage", or while retrieving it (Retroactive Interference). Emotional states, prior or newly acquired knowledge and beliefs, and external cues can all lead to selective or distorted encoding, storage or retrieval.
A prominent example is the Hindsight Bias. In essence, it's a cognitive distortion that leads us to rewrite our own memories, making us believe that we were more accurate in our predictions than we actually were. This is the typical "I-knew-it-all-along" trap: we go long the market, and after it crashes we say "I knew all along that stocks were overvalued"; we are convinced Biden will win and don't bother to vote, and after Trump's victory we say "I knew all along Trump would win, which is why I didn't bother voting"; we cheer feverishly for our football team until it loses, then say "I knew all along they would lose". The reason our brain does this is to promote a sense of predictability and control in an unpredictable world, fostering a perception of one's environment as more manageable. We also don't like dealing with the discomfort, or cognitive dissonance, of being wrong. Believing we had "known it all along" can alleviate this discomfort.
Another curious misfiring is called the Misinformation Effect, whereby a person's account of an event can be altered if they are exposed to an incorrect recounting of that event by someone else. This is related to the Suggestibility Effect, the inclination to incorporate misleading information from any external source into personal recollections. More distortions are caused by the Recency and Primacy Effects, the tendencies to best remember the most recent information (Recency Effect) and the first pieces of information presented (Primacy Effect) in a series. All of these can have implications in various settings, but the most severe, as you can imagine, are in legal proceedings. As per usual, these effects come down to a lazy brain that wants to minimise cognitive load and fit in with the group.
This leads us to the Self-Serving Bias: a pervasive cognitive tendency to attribute positive events and outcomes to our own intrinsic qualities, such as skill or effort, while attributing negative events and outcomes to external factors, such as luck or bias from others. This phenomenon can be seen as a protective mechanism that upholds our self-esteem and positive self-view, which in turn increased (and still increases) our perceived desirability as mates. Feelings of inadequacy or guilt require energy to overcome, so best not to waste any calories on them. The outcome of this bias is also a misattribution of success to ourselves and failure to others, which serves to blame out-groups and create prejudice.
Belief Biases
Belief bias refers to the tendency of individuals to judge the validity of a given argument based on the plausibility of its conclusion rather than how logically sound the argument is. In essence, when we face logical reasoning tasks, we tend to be influenced more by our pre-existing beliefs about the world than by the logical structure of the arguments presented (the Sun God is angry, which is why it has been raining all summer).
Let's start with one of my favourites: Confirmation Bias. This is the tendency to search for, interpret and favour information that confirms one's preexisting beliefs. It is mentally more efficient to seek out, remember and interpret things so that they fit into our current frameworks ("of course this was the Illuminati at work again"). Without self-doubt we can make faster and more confident decisions. This helped with survival ("I heard a tiger") but also with attractiveness to the opposite sex ("I know my stuff and am therefore competent"). This is also why there is something called the Overconfidence Bias.
Another version of this is the Status Quo Bias. We exhibit this bias because we prefer to maintain our current choices and are resistant to change, even when objectively superior alternatives might be available. This preference for the current situation can manifest even when there are clear advantages to changing or when the costs of change are negligible. This is very much tied up with the Endowment Effect in which we ascribe more value to things simply because we possess them. This can lead to an overvaluation of the current state (or current possessions) and a consequent resistance to change. Change brings unknown risk, which is worse than predictable subsistence.
The Gambler's Fallacy, also known as the Monte Carlo Fallacy, refers to the mistaken belief that if a particular event occurs more frequently than normal during a given period, it will happen less frequently in the future (or vice versa). Essentially, it's the belief that past independent events can influence the outcome of a future independent event. Heads is no less likely to come up just because it has come up five times in a row. Our brain, which likes to find patterns in this world, doesn't do so well with probabilities. It expects that a sequence of independent random events will "even out" in the short run, not understanding that the "law of averages" applies to large numbers, not short sequences. For instance, if a series of rulings has been in favour of the defendant, a judge might subconsciously feel the next should be for the plaintiff. Or investors might believe that after a series of financial downturns, the market is "due" for an upturn (or vice versa).
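A quick simulation serves as a sanity check on the independence point (the variable names here are mine): count what happens on the flip immediately after a run of five heads.

```python
import random

random.seed(42)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

streaks = 0      # times we saw five heads in a row
heads_after = 0  # ...and the *next* flip was heads as well
for i in range(5, len(flips)):
    if all(flips[i - 5:i]):
        streaks += 1
        heads_after += flips[i]

print(f"{streaks} streaks, P(heads | 5 heads in a row) ~ {heads_after / streaks:.3f}")
```

The conditional frequency hovers around 0.5 no matter how long the preceding streak; the "law of averages" only shows up across large samples, never as a correction applied to the next flip.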
The Availability Heuristic is a mental shortcut that we use when estimating the probability or frequency of an event based on the ease with which examples come to mind. In simpler terms, if something can be recalled quickly or is immediately familiar, we are more likely to believe it's common or likely. Pair this with our loss aversion, and you'll remember that shark attack you read about two months ago as you step into the Atlantic Ocean (and misestimate risk more generally). So whatever you consume daily in the news may unknowingly be a dominant factor in your beliefs (which is why you should really take care of your Digital Hygiene). This misfiring is also a strong ally of stereotyping, as it allows us to form judgments about entire groups based on a few salient or memorable examples.
Perceptual Biases
Perceptual bias refers to the systematic ways in which the context and nature of sensory inputs, as well as individual cognitive and psychological predispositions, influence one's perceptions. This phenomenon shapes and distorts our interpretation of sensory information, often deviating from objective reality. Most of us have come across visual and auditory illusions before; there are countless examples.
However, there are more complex perceptual phenomena worth highlighting. The perceptual equivalent of the confirmation bias is called Selective Perception. We naturally filter out information to avoid being overwhelmed by the vast amount of data our senses receive. This can lead to a bias where we only perceive things that align with our existing beliefs or expectations. When we are upset with our partner for instance we might ignore the nice things they do for us and just focus on the negative. When spiritual people talk about "putting the energy out there that you want to receive" this effect is really at work. We will see the things that we value and that are easy to integrate with our current prevailing story of the world.
The Contrast Effect refers to the fact that we evaluate stimuli by contrast with another stimulus rather than against an objective standard. For instance, after lifting a heavy box, a lighter one feels even lighter than it objectively is. Put your hand in ice-cold water and then into merely cold water: the cold water will seem warmer than it usually does. This bias is similar to the anchoring effect but is focused on physical experience.
Everyone has fallen victim to the Expectancy Bias. Our perceptions can be altered by our expectations (see below: Green Needle versus Brainstorm). For example, we may be too optimistic about how entertaining a movie will be, then rate it much worse than it actually is because it falls below our expectations (this can also swing in the other direction). This can result in far more serious issues: reinforcing stereotypes ("I was expecting a Hollywood-style date, men are all lazy"); medical misdiagnoses (we had too high expectations for a drug, so its decent results are ignored); bad business decisions (we wanted many more signups on day one and don't realise we attracted our most loyal customers); reduced personal growth ("man, I thought I could play like Mozart after one lesson"); etc.
Just found out that you (or your wife) are pregnant, and now you see pregnant women everywhere? Heard a song for the first time, and now it's everywhere? Bought yourself a new thing, and now everyone has it? This is the Frequency Illusion (or Baader-Meinhof Phenomenon). Once you notice something for the first time, you tend to notice it more often afterward, leading to the belief that its frequency of occurrence has suddenly increased. Perceiving that an idea is everywhere can reinforce conspiracy theories (and trap you in an Overton Mirror). It can also lead to misreading trends and therefore to bad business decisions or policy.
Our takeaway here is that our senses and cognition are not as reliable as we make them out to be. Our brain floats in a dark cranium. It only has access to imperfect signals from perceptual organs, which it uses to predict and navigate the world. Colour and sound don't exist out in the universe; they are simply the only way our eyes and ears can interpret reality given their anatomy. So our brain takes these signals and tries to make sense of them, layering on top of the interpretation a bundle of evolutionary psychology that evolved over hundreds of thousands of years. The outcome is a reliable misfiring of our narratives of why and how things are. Knowing all this is the first step towards limiting the collateral damage it causes.
OVERCOMING OR MITIGATING BIAS
In understanding our inherent biases, we take a significant step towards making wiser decisions. Recognising and understanding the biases above is the first step, but actively employing strategies to overcome or mitigate them is vital to ensuring more accurate, inclusive, and objective decisions in both personal and professional realms.
As the adage goes, "To know thyself is the beginning of wisdom." The first step in addressing bias is to recognise and accept that everyone has biases. Engage in self-reflection, and consider taking implicit bias tests (more below) to uncover subconscious beliefs. Next is education. Well done, you are reading this and are ahead! Education, both formal and informal, is a cornerstone in the fight against bias. Through structured learning and personal exploration (I have a book list in the addendum), we can uncover the roots of biases, recognise their manifestations, and learn strategies to mitigate their effects. Moreover, educational environments can be designed to promote critical thinking, cultural competence, and empathy, all of which are essential in the battle against bias. While people who study psychology might learn about the different types of bias, all high school students (and adults, for that matter) should be taught about the shortcomings of their neural wiring. The only reason not to teach them is that one wants to take advantage of said wiring.
The main defence we have against biased decisions, beliefs, perceptions and memories is fostering critical thinking. This demands that we actively challenge our preconceived notions and the information presented to us. Rather than accepting information at face value or allowing underlying biases to influence our judgment, critical thinkers employ logic and reasoning to evaluate situations or propositions. We love to elevate intuition to a magical level of reverence, which in some cases might be justified, but very often a lot can be gained from a deep rational analysis of a situation or problem. By questioning assumptions, seeking out diverse sources of information, and analysing the roots of our beliefs, we can begin to separate objective facts from biased interpretations. I know this is time-consuming and a lot to ask, especially in a fast-moving, ever-distracting world. Nobody expects perfection from a monkey. Stop yourself once a day when you are buying something, hiring someone, judging something, remembering something, or making an important decision, and ask yourself, "How may I be biased here?" You may not be biased at all, but by stopping yourself you are training a "mental muscle". After enough reps it becomes second nature.
Another tool that helps is active listening: engaging in conversations without making immediate judgments, listening to understand rather than to reply. This practice can help us counteract biases in real time. It goes hand in hand with not jumping to conclusions. While it is not easy, deliberately delaying our decision-making to ensure that we've had ample time to consider all perspectives can be a game changer. Taking the time to think things through can prevent immediate biases from determining outcomes. We know that initial reactions to situations or people are made by the lazy brain. Recognising that these can be rooted in bias, we should try to reassess first impressions after gaining more information. Finally, there is nothing wrong with asking our peers, colleagues, or friends to point out when we might be exhibiting biased behaviour. This external input can provide a different viewpoint on our actions and thoughts (I have an anonymous form at the bottom of my personal website that I make people aware of, so that they can give me feedback on my everyday behaviour).
The key to overcoming social bias is empathy. By genuinely trying to understand and feel the emotions, perspectives, and experiences of others, we can break down barriers erected by biases. An empathetic approach fosters a sense of shared humanity, making it less likely for biases to cloud our judgment or incite divisive actions. An important technique here is to counteract stereotypes. This means actively challenging stereotypical statements or beliefs, whether they come from others or ourselves. For example, if a stereotype pops into my mind, I deliberately try to think of an example that contradicts it. What helps here, of course, is exposing oneself to diverse cultures, opinions, and experiences. This is one of the most effective ways to challenge and combat biases. Homogeneous environments tend to reinforce existing beliefs and biases, creating echo chambers where similar views are amplified. In contrast, diverse environments present an array of perspectives, requiring individuals to confront and re-evaluate their biases. Whether in workplaces, academic institutions, or social settings, fostering diversity ensures a richer tapestry of viewpoints, promoting understanding and reducing prejudiced thinking.
While all the approaches above apply to the individual, the understanding of bias has also spawned so-called "nudge units" ("behavioural insights teams") in governments. Coined by behavioural economist Richard Thaler and legal scholar Cass Sunstein in their book "Nudge", the concept of nudging involves subtly guiding individuals towards better decisions without restricting their freedom of choice. By understanding how biases influence decisions, it's possible to design systems or present options in a way that "nudges" individuals towards more objective or beneficial choices. For instance, placing healthier food options at eye level in a store can "nudge" consumers towards better dietary choices. When implemented ethically, nudges can serve as a potent tool against biases, harnessing our understanding of human behaviour to design more effective and beneficial systems. Some argue that nudges are paternalistic, infringing on individual autonomy. There is a fine line between guiding choices and manipulating them, and concerns arise about who gets to decide what's best for the public. That said, I think most product, marketing and PR departments in corporate and political organisations are already using these techniques to appeal to our ancestral wiring and get our time, money or vote. A bit of well-meant nudging in the other direction is probably not as big a deal as its opponents make it out to be.
CONCLUSION
Overcoming the misfiring of our decision making and belief systems is important on many levels. On the individual level it will allow us to think more critically to make smarter decisions. This can at the very least lead to objectively better economic outcomes for us and our families. It can also make us better voters and consumers, who are not easily manipulated by crafty corporations and politicians. On the group level, it can help us make better decisions at work and among friends, creating more value with less unnecessary friction. On an institutional level, we can ensure better resource allocation and a fairer world with a strengthened democratic process.
Understanding social bias is crucial, not just from a theoretical standpoint but also for addressing pressing societal issues. Whether we're examining ethnic conflicts, political polarisation, or even biases in workplaces, the shadow of bias looms large. Knowing of its existence increases the importance of creating environments and narratives that emphasise common humanity, shared goals, and mutual understanding. After all, while our evolutionary history might have predisposed us towards favouring our own group, our shared future depends on transcending these divisions and working together for the collective good. Embracing diversity, after all, is not just a moral imperative but a cornerstone for innovation and progress. In a world where differences abound, let us challenge ourselves to see beyond the familiar and embrace the richness of the full spectrum of perspectives.
ADDENDUM
There are numerous insightful books that delve into the topic of human bias, examining the ways it manifests in our everyday decisions and behaviour, and even suggesting how we can mitigate its impact. I've read and can highly recommend books 1-5. The summaries for 6-8 look great and I will get to them eventually. In other words, I recommend all these books if you want to dig deeper:
1. "Thinking, Fast and Slow" by Daniel Kahneman: Nobel laureate Daniel Kahneman presents his groundbreaking research on the two systems of thought that drive our judgments—System 1, which is fast and intuitive, and System 2, which is slow and deliberate. Kahneman demonstrates how various cognitive biases arise from the interplay between these two systems.
2. "Predictably Irrational: The Hidden Forces That Shape Our Decisions" by Dan Ariely: Behavioural economist Dan Ariely explores why humans often make irrational decisions in both personal and professional contexts, challenging traditional economics' assumption of the rational actor.
3. "Blindspot: Hidden Biases of Good People" by Mahzarin R. Banaji and Anthony G. Greenwald: The authors use their experience in the Implicit Association Test (IAT) to discuss unconscious bias and its effects on behaviour.
4. "Nudge: Improving Decisions About Health, Wealth, and Happiness" by Richard H. Thaler and Cass R. Sunstein: Thaler and Sunstein propose a new perspective on preventing the countless mistakes we make—including ill-advised personal investments and consumption habits, neglect of our natural resources, and health choices—through nudging.
5. "The Person and the Situation: Perspectives of Social Psychology" by Lee Ross and Richard E. Nisbett: A foundational work in social psychology that explores how our situations influence our behaviour often more than our personalities or internal dispositions. It provides a comprehensive look at the fundamental attribution error, a pervasive bias in how we perceive others.
6. "Invisible Women: Data Bias in a World Designed for Men" by Caroline Criado Perez: This book discusses gender bias in design and decision-making processes, revealing how this impacts women's lives, and offering ways forward.
7. "Bias: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do" by Jennifer L. Eberhardt: A social psychologist offers a powerful exploration of racial bias, how it infects our society, our institutions, and our minds, and what we can do to combat it.
8. "The Righteous Mind: Why Good People Are Divided by Politics and Religion" by Jonathan Haidt: A moral psychologist uses his research to explain why we disagree so much—especially about religion and politics—and he offers some tips for how we can better understand one another.
Remember, the understanding of bias is a vast field and the list here represents only a small fraction of the excellent work out there.