It is often said that, “If you have a brain, you have bias.” Along these lines, while exploring the topic of research bias this week, I came to the realization that nearly any aspect of research you can name has a type (or many types) of bias to go along with it. It is easy to find long lists and explanations of different types of bias, from the common confirmation bias to the less intuitively named streetlight bias (which we will define later in this post, and which has nothing to do with your feelings about streetlights). Understanding the many types of bias is important for identifying where bias may play a role in our research and recognizing when it does, but there is far less instruction on how to actively prevent our biases from entering our research in the first place. That is what we will focus on today.

Biases are Elusive

As discussed in our last article on interview bias, one of the first things that you can do to mitigate biases in research, and in everyday life, is understand what they are and acknowledge that you have them. Broadly, there are two types of bias – bias that you are aware of (explicit bias) and bias that you are not aware of (unconscious or implicit bias). For example, you may consciously know that black cats scare you (explicit bias), or you may think that you love black cats but then still feel uneasy when one runs in front of you on the street (implicit bias). It may feel like a contradiction that we are asked to acknowledge our biases and, at the same time, told that biases can be unconscious and therefore will elude us. This is why it takes more than just awareness to actively fight bias.

Moving forward with the understanding that implicit bias exists, it can then feel like a daunting task to control it. How do we proceed with our research while managing something that is unconsciously haunting our decisions? Well, research isn’t perfect and there’s no miracle cure for implicit bias, but hopefully the following suggestions and exercises will give you some tangible tools to keep bias at bay.   

1. Work as a Team

We can team up against bias by gathering perspectives and input from those we work with. Our coworkers may not share our implicit biases and, for this reason, may be able to catch blind spots in our data collection and analysis. Bias can create a personal lens through which we view our work. For example, confirmation bias occurs when you only seek out or absorb data that confirms what you want to believe, making it easier to ignore contradictory data. Of course, our coworkers are susceptible to this same bias, but their different experiences may provide you with a fresh perspective. They may not be trying to confirm the same conclusions as you and therefore may interpret the data differently. It takes trust and patience to let people in on your project and listen to their input, but it can save you from falling into the trap of your own implicit bias.

Studies show that, in general, viewing people as team players helps to reduce implicit biases. If you believe somebody is on the same team as you, you are less likely to hold biases against them. Thinking in collaborative terms can significantly limit bias toward those you are collecting data from. Viewing your research participants as “other” makes it much easier to hold biases against them that could impact your research and cause harm. On the other hand, collaborating with participants and thinking of them as your team members will help to mitigate any unconscious biases toward them. Lastly, we can all be team players by encouraging those around us to be more mindful of their biases.

2. Be On Alert

We can turn a general awareness of bias into active mitigation through self-reflection exercises. It is easy to think we have a complete grasp of our own thoughts, but again, implicit bias operates beneath our surface thinking, and writing can help bring it to light. Here are a few things you may want to write down before diving into your research:

  • What you expect to find 
  • What you don’t expect to find
  • What you don’t want to find  
  • Where you think bias is most likely to play a role in your research 
  • Preexisting assumptions
  • Examples of counter-stereotypes and challenges to your preexisting assumptions

This exercise may feel personal, so you don’t necessarily need to share your writing. Just pulling these thoughts out of your head will help you be more mindful of bias as you begin your research.

There are additional actions we can take throughout an evaluation to guard against potential biases. First, do your research before beginning your study. Will you be working with any cultural or societal considerations that you don’t completely understand? If so, take it upon yourself to become fully informed before moving forward; don’t allow ignorance to cloud your research from the start. Next, lay out a thorough plan and timeline for you and your team. This can prevent biases from leading your work astray later on. As you move into data collection, constantly reevaluate any impressions you have of participants. Are these impressions affecting how you read the data? When it comes to presenting your findings, choose your vocabulary carefully. For example, are you generalizing with the word “teens” when the data actually only includes answers from young women between the ages of 16 and 18? Be specific whenever possible to avoid extending stereotypes to larger groups.

Finally, in both research and everyday life, be mindful of the media you consume. If the last few years have taught us anything, it is that our news and entertainment sources have a huge effect on what we believe and how we act. Be mindful of the news sources you listen to, and seek out fact-based, neutral information so as not to inadvertently feed yourself more biases.

3. Be Open-minded

It may seem obvious that research should be conducted with an open mind, but in practice it is no small task. Too often in research we connect a cause and an effect without considering what other factors may be at play. This brings us back to the streetlight bias. To understand the streetlight effect, imagine you have a black cat that escapes from the house on Halloween night. As you search, you walk down the street so you can look under the streetlights instead of in the dark alleys. Your cat is probably more likely to be hiding in a dark alley, but it is so much easier to see under the streetlights that your brain ignores this fact, and you keep searching where there is light. In research, the streetlight effect means that there is a spotlight on certain data, often the data that is easiest to measure, so that is where we look for answers. Data that is more easily collected and measured takes precedence over data that may be just as relevant but is more challenging to capture.

In addition to looking at our data with an open mind, we must learn to put ourselves in somebody else’s shoes. Ask yourself: if you had lived through different experiences, would you still be drawing the same conclusions? Try to view other cultures through their own cultural lens and not just through your own. If you are presenting qualitative data, use participants’ words and not your own. If you try to put somebody else’s thoughts into your own words, you may end up changing their meaning to fit your own understanding and losing important pieces of their story.

Conditions and Conclusions

While thinking about bias this week I came across the quote, “Bias is a human condition, it is not a personal flaw.” I first heard this quote in the CSL in Session webinar Using Courage to Confront Bias and came across it again in the article “How to confront bias without alienating people”. It caught my attention because, like many of us, I can easily feel defensive when thinking about biases. Although I am well aware that we all have biases, hearing this quote allowed me to let that acknowledgement sink in a little deeper and find renewed energy to tackle my own implicit biases. Lastly, you can test your own implicit bias with this common Implicit Association Test by Harvard University. As you consider your results, remember that these tests are designed to measure bias across an entire population, not necessarily at the individual level; however, they can still be a useful tool if they inspire you to actively challenge your own biases.

LRS’s Between a Graph and a Hard Place blog series provides instruction on how to evaluate in a library context. Each post covers an aspect of evaluation. To receive posts via email, please complete this form.