A bias is a systematic error in reasoning at the level of perception, judgement and memory. Some biases stem from the way our senses work (perceptual biases); others arise from the way we judge, believe and expect (cognitive biases).

Repetition effect

People tend to judge claims they hear more often as likelier to be true. In other words, we assign more credibility to a statement we hear repeated many times. One example of the repetition effect is belief in rumours.

Face-detection bias

People tend to see faces in visual information that is ambiguous or just plain random. Seeing faces in the moon is an example of this perceptual bias.

Believing is seeing

Our beliefs, desires or expectations can partly determine our perceptions. For example, if we expect to hear a particular phrase in a song played backwards, that expectation can shape the message we hear in the sounds. Expecting to see a resemblance between an infant and a parent may likewise lead us to see that resemblance.

Inattentional blindness

Inattentional blindness is a bias of attention. When people are concentrating on one task, they tend not to notice or hear unexpected events happening right in front of them. An example is walking past a friend who waves at us and failing to notice them.

Confirmation bias

Confirmation bias is a tendency to gather, notice, interpret or remember evidence in a way that overstates the support for a favoured hypothesis. For example, a confirmation bias occurs when we seek out information that inflates the evidence for one sort of conclusion. When we believe that P, we may ignore evidence suggesting that not-P; and once we have data supporting our prior belief, we stop looking for further evidence.

A confirmation bias also affects us when we notice evidence as a confirming instance that reminds us of the hypothesis in question. For instance, when something reminds us of a particular stereotype, that confirming instance strikes us as particularly significant. If we believe that all Scots have red hair, the stereotype will be called to mind every time we meet a red-haired Scot. Encounters with non-red-haired Scots, however, will fail to bring the stereotype to mind at all.

Another form of confirmation bias is selective memory of evidence, which creates an unwarranted perception of a trend. An event of a certain kind may remind us of other events of that kind we have experienced. For example, we may have the sense of having dreamed about an event that is now occurring, a kind of déjà vu or seemingly prophetic dream. At the same time, we never recall the unfulfilled dreams or the events that would undermine this perception of regularity.

Self-fulfilling prophecies are another sort of confirmation bias. These are predictions that come true not simply because the predictor foresees how events will unfold, but because the prediction itself affects how things unfold. The prediction gives rise to an expectation that the event will occur, and this expectation leads to actions that bring about the prophesied event. The placebo effect can be explained by the same form of confirmation bias: taking a pill we believe to be effective, we feel better or recover as a result. The improvement in our health arises merely from the expectation that we will recover more quickly. That expectation might somehow boost the body's recovery systems, or remind us to rest and live more healthily. Moreover, we might misinterpret random fluctuations as signs of recovery while underestimating the symptoms of the illness.

Self-serving attribution

Self-serving attribution is a bias of ascribing special significance to events that involve us and to our roles in those events. For instance, people tend to take credit for success and disown failure; it takes considerable effort to identify internal causes for failure and external causes for success.

Optimistic self-assessment

People tend to rank themselves as above average with respect to vaguely defined virtues and abilities such as leadership, friendship, kindness and wit.

Hindsight bias

Hindsight bias, also called the historian's fallacy, is the false belief that past events were foreseeable at the time. After an event has occurred, it is easy to say, "You should have known that this was a bad idea!" An event that was once unpredictable now seems to have been predictable, so we often overestimate the quality of our past assessments of the evidence.

Continued influence effect

The continued influence effect is the persistence of a media report's influence even after the report has been retracted: we continue to believe the initial story even after learning it was misinformation. This can be explained by the difficulty people often have in dealing with neutral states of information. To abandon belief in the initial story, we need more than just the removal of the evidence for it; it is easier for us to accept the replacement of one story by another.

Framing effect

Different word choices in an argument or assertion can have strong effects on the judgements we make. These influences are called framing effects. For example, our reactions to claims about people may differ greatly depending on whether they are described as soldiers, freedom fighters, guerrillas, terrorists, irregulars or insurgents. Armed groups once known as insurgents may quickly become freedom fighters if they switch sides. It is therefore important to think past labels, which have an enormous effect on how something is judged. We need to monitor claims with strong connotations and consider how they would look with more neutral terms substituted for them.

Biases of memory

Biases of memory can arise when we talk about our childhood or other past events. We tend to rely on our memory as if it were fact, since it seems exquisitely clear. However, memory can be misleading. Particular claims can have strong effects on our judgements and, as a result, on the contents of our memories (framing effects). Aspects of a memory can be added or changed, and numerous important details can disappear from it. Even entire vivid memories can be implanted by circumstance or therapy. It is not that we cannot trust our memory at all, but rather that we should treat it as one piece of evidence among others.

Social stereotype

People tend to hold stereotyped attitudes. A social stereotype is an automatic attribution of characteristics to a person based on gender, race, height, age, profession, habits and the like. Examples of social stereotypes are:

  • Blonde – stupid
  • Skinny body – anorexic
  • Wearing black – goth
  • Wearing glasses – geek
  • Teenager – troublemaker

When we make judgements, we should be aware of prejudices that may be affecting our reasoning and be prepared to counteract such biases self-consciously.

Fundamental attribution error

The fundamental attribution error is the tendency to explain someone's situation or behaviour in terms of their personality, character or dispositions, forgetting that there might be other explanations in terms of context, accident or environment. Some examples of this bias:

  • John: Sorry, I’m late.
  • Kate: You are irresponsible as always. (Kate overlooks the possibility that John was delayed by something like a car accident.)
  • John: I did absolutely nothing yesterday.
  • Kate: You are so lazy. (Kate ignores the possibility that John might be sick.)

Because of the fundamental attribution error, we are apt to assume that if someone is deprived, poor, unemployed or homeless, it must be solely the result of their own choices. In fact, they may owe their situation partly or wholly to external factors.

False polarization effect

In discussions of controversial topics, we tend to polarize positions. This bias has two faces. The first is a tendency to regard the views of others as the most extreme and stereotypical positions of their kind. The second is the self-perception of reasonable neutrality: thinking that our position is neutral and reasonable simply because arguers on both sides disagree with us is a red flag. We may count them as extremists while our own position merely disagrees with one side or the other. Yet on the spectrum of views running from the extreme pro-side to the extreme con-side, our position might actually lie far toward the pro-side or close to the con-side.

Bandwagon effect

The bandwagon effect is the tendency to shift our beliefs toward popular opinion. It can be explained by the fact that expressing a position that runs against the consensus is socially and cognitively costly: someone who voices the consensus view is challenged far less often than someone who holds a minority view. That is why it is important to give careful consideration to unpopular views and to self-monitor for signs of bandwagoning.

False consensus effect

The false consensus effect is the tendency to believe that those around us share our beliefs and attitudes. This bias can be explained by the fact that most of our beliefs are trivial, such as "the sun is shining" or "snow is white", so our beliefs really do overlap with other people's on many topics. In some social contexts, however, we overestimate the extent to which beliefs about politics, religion, ethics and the like are also shared. The danger of this bias is that it gives a misleading sense of how reasonable our beliefs and attitudes are. We take other people's silence as evidence that they share a belief we have expressed, and then interpret this imagined consensus as validation of the belief. As a result, it can lead to unpleasant surprises in decision-making situations, when we expect others to vote for or support us and they do not.

Levelling and sharpening

Several biases affect the transmission of information. Levelling and sharpening are a group of biases that occur when information is passed along through personal interactions. When a story is perceived as minor or less central, people tend to minimize or omit details (levelling); when people want a story to seem significant, they exaggerate specific elements of it (sharpening). These systematic biases occur regularly in the media, which relies on oversimplification and exaggeration in reporting events. Oversimplification in the media emerges in the form of platitudes, clichés and slogans; exaggeration appears most often in headlines designed to thrill the reader and appeal to the mysterious.

Pseudo-original source

It can happen that the original source of a claim is replaced by a pseudo-original source. For example, some apparently independent assertion of the same claim may end up being the one from which most subsequent reports trace. The original source remains the original only until seemingly independent retellers confirm the claim by offering what appears to be their own testimonial evidence.

Hence, it is important to evaluate the information we receive from every sort of source appropriately, since the original story can mutate in retellings. Tracing a claim back to its original source is crucial in critical thinking, even if it requires some careful research.

Selection bias

Selection bias can occur in a survey when the researcher picks an unrepresentative sample of the population at the outset. For example, John conducts a poll to find out how his clients rate his company's service quality. To draw his sample, John mails 300 clients who received a discount membership card. This is a selection bias: clients who took up a discount card are likely to appreciate the service more than those who did not, so the list of membership holders will seriously underrepresent the clients who are dissatisfied with the service quality.
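As an illustration, here is a minimal simulation sketch of John's survey (not from the text). The population size, the share of cardholders and the satisfaction rates are all hypothetical assumptions; the point is only that sampling from cardholders alone tends to overestimate overall satisfaction compared with a random sample of all clients.

import random

random.seed(0)

POPULATION_SIZE = 10_000
CARDHOLDER_SHARE = 0.30          # assumed share of clients with a discount card
P_SATISFIED_CARDHOLDER = 0.80    # assumed satisfaction rate among cardholders
P_SATISFIED_OTHER = 0.50         # assumed satisfaction rate among other clients

# Build a hypothetical client population.
clients = []
for _ in range(POPULATION_SIZE):
    has_card = random.random() < CARDHOLDER_SHARE
    p = P_SATISFIED_CARDHOLDER if has_card else P_SATISFIED_OTHER
    clients.append({"has_card": has_card, "satisfied": random.random() < p})

def satisfaction_rate(sample):
    return sum(c["satisfied"] for c in sample) / len(sample)

# Biased sample: 300 clients drawn only from cardholders, as in John's mailing.
cardholders = [c for c in clients if c["has_card"]]
biased_sample = random.sample(cardholders, 300)

# Representative sample: 300 clients drawn from the whole client population.
fair_sample = random.sample(clients, 300)

print(f"True satisfaction rate:        {satisfaction_rate(clients):.2f}")
print(f"Cardholder-only sample (John): {satisfaction_rate(biased_sample):.2f}")
print(f"Random sample of all clients:  {satisfaction_rate(fair_sample):.2f}")

Under these assumed numbers, the cardholder-only estimate comes out well above the true rate, while the random sample tracks it closely.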

Bibliography:

Kenyon, Tim, Clear Thinking in a Blurry World, Nelson Education Ltd, 2008.