PRISM Project Toolkit
Are PRISon libraries Motivators of pro-social behavior and successful re-entry?
Welcome!
This project was funded by the Institute of Museum and Library Services (IMLS), with one goal being to develop and share the research design so that other libraries can adapt and use it. This toolkit includes background information along with information about research methods and logistics. Please help yourself to this material, including our survey and focus group questions, which is available under CC BY-NC-SA 4.0 (Attribution-NonCommercial-ShareAlike 4.0 International).
Project Overview
What is PRISM?
PRISM project justification:
There are 1,155 state prisons in the United States as of 2019 [1], with 1,230,100 people incarcerated in them as of 2022 [2]. While many of these facilities have libraries used by institution residents, there has never been a large-scale, outcomes-focused study of prison libraries. This leaves a gap in the library community’s knowledge about the impact of these libraries. This study will fill that gap and will create a survey instrument and a scalable research plan that can be replicated in other states, allowing them to learn more about their own prison libraries and contribute to this first-of-its-kind body of knowledge. The information gained in this study will support prison library services and demonstrate their value to incarcerated and formerly incarcerated people and to the communities they belong to and return to after incarceration.
Right now, no outcomes-based assessments are available for prison libraries or the population they serve, which makes it impossible to determine how to improve library services or to know whether those libraries are contributing to pro-social behaviors for incarcerated people or supporting them as they rejoin our communities. By building this knowledge, we will have the opportunity to make positive impacts on communities, inside and outside of prisons, throughout Colorado and the rest of the United States. This study will generate data and stories about the impact of libraries, establishing a unique dataset that prison librarians can use to improve their services. States and territories that choose to use the study design and survey instrument established in the PRISM Project will be able to add to this dataset. It is crucial to include the voices of incarcerated and formerly incarcerated people in research about library services, and those voices are uniquely foregrounded in this study. This study will give prison libraries the same outcomes-focused evaluation tools and understandings that academic and public libraries have, which directly strengthens the community and advances access to relevant collections for incarcerated and formerly incarcerated members of the community.
Colorado Department of Education PRISM proposal and grant documents
General project timelines
[From IMLS proposal]:
This project will use the survey instrument developed in the planning grant, with slight modifications for use with incarcerated people. The research team will engage an IRB to review the plan prior to the start of data collection. Data collection will take approximately eight months, ending in April 2023. The analysis and reporting phase will take 13 months, ending in May 2024. Data collection will begin with 8 informational focus groups with formerly incarcerated people, 2 for each region in Colorado. The survey portion will involve substantial outreach; as of April 2024, there are 12,290 people incarcerated in prisons in the state of Colorado and 9,401 people currently on parole [3]. The LRS research team, in collaboration with Institutional Library Development at the CO State Library, will administer surveys in person at Colorado prisons, collecting a sample size of 267. The Remerg team will conduct the focus groups and will administer the surveys to formerly incarcerated people across Colorado, collecting a survey sample size of 264. The LRS research team will work with the research contractor to refine the survey instrument and develop the questions for focus groups. The Department of Corrections will allow the research team access to internal prisoner data, including disciplinary and behavioral records, which will be anonymized and analyzed to help determine the impact of library services on the behavior of incarcerated people. The information gathered will be analyzed using quantitative and qualitative methods by an outside consultant to determine the outcomes of library services in prisons. This project will be considered a success if we are able to conduct the focus groups and surveys as described, gain access to institutional data from the Colorado Department of Corrections, and analyze and report on that data to the broader library community.
History
Project history:
2023-2024 IMLS proposal
The language our team used to describe our research participants changed over time: from our initial application and the inception of this project, to working with DOC and in facilities, to sharing our research findings in the library community and more broadly. We found that we needed to mirror the language preferred by our stakeholders in each setting, even though these terms differed from the recommendations put forth by the Marshall Project and in the 2020 journal article “The Language of Incarceration” by Alexandra Cox. Our grant proposal to the IMLS uses the language “incarcerated people,” per feedback we received in response to our application. The Colorado Department of Corrections favors the term “resident.” Anyone conducting research of a similar nature should consider what terms are appropriate in their respective environments, as well as the overall impact of word choices and the language used to talk about incarceration. Be ready to change what you say and to find a way to say it that makes sense to your stakeholders and research participants.
Logistics
Security clearance: Researchers conducting focus groups and entering the facility will need to obtain clearance. This will entail a background check, and probably some training. Each facility can provide you with information about their security procedures.
Dress codes: Be sure to learn the dress code for each facility that you visit; as with security clearance and rules, these guidelines may vary by site. Expect to dress conservatively and to wear closed-toed shoes.
Rules at each facility: Codes of conduct will vary by facility and organization. Be sure to learn the rules for each facility as you plan your visit.
Food: Each Department of Correction will have specific rules about what food and what type of container is allowed.
Self-care: Consider how you will take care of yourself and your team throughout the course of the project. We offer a video specific to self-care and focus groups, but note that this work can be challenging at any point, from designing your study to analyzing and sharing your research findings.
Hear from the team
Ethics
Doing research with people who are incarcerated requires focused and sustained attention to research ethics, following the strictest guidelines for obtaining and respecting informed consent. Any research done with sensitive populations requires the approval of an Institutional Review Board (IRB). Researchers affiliated with a university will have an IRB in place that they can work with directly. If you are not affiliated with a university or other research institution, commercial IRBs are available that can ethically evaluate your work and allow the project to proceed.
According to the National Alliance on Mental Illness (NAMI), approximately 37% of people who are incarcerated experience some form of mental illness, whereas in the general US population that number is 22.8%.
While researchers should always be considering the mental health of their research subjects, this is another reason that researchers in the carceral setting should be constantly thinking about informed consent.
For this project, we retained a commercial IRB that specializes in social sciences research. They required the names and affiliations of the research team, the complete research plan, all documents that would be used in the research, and any additional information they requested. As part of the process, the PI was required to complete the CITI training for social sciences research. Completing the CITI training is a best practice for anyone interested in doing social sciences research, especially with sensitive populations.
Departments of Correction have their own processes for approving research in their facilities, often an internal committee tasked with the same kind of ethical analysis that an IRB performs.
The U.S. Department of Health and Human Services also offers an FAQ section on human subjects research protections.
People and Organizations Involved
Include voices of incarcerated/formerly incarcerated people
Internal work group
Agile principles of project management
[Coming soon]
Team roles
The internal working group was composed of members of the Library Research Service (LRS, a unit of the Colorado State Library), members of Institutional Library Development (ILD, also a unit of the Colorado State Library), and a contracted employee, Chelsea Jordan-Makely. The members of LRS who were part of this project were Charissa Brammer (Director of LRS), Sara Wicen (Research Assistant at LRS), and Amy Bahlenhorst (Research Analyst at LRS). Charissa was the project lead; she was in charge of submitting the grant to IMLS and managing the grant, and she directed data collection and analysis. She also attended and conducted all of the focus groups. Sara played a vital role in data collection and analysis; she attended and conducted about half of the focus groups, took part in survey creation and data collection, coded and interpreted data, and was integral in the creation of this toolkit, the presentations, and the white paper. Amy was the project manager, in charge of keeping the project on track and organized as well as communicating with partners. She also conducted about half of the focus groups, assisted in coding and analyzing data, and helped to write and present the toolkit and white paper. Chelsea was hired as a contractor and gave the team a vital outside perspective while coding and analyzing data. She also provided a guiding light while writing the toolkit and white paper by drawing on her deep knowledge of the prison library environment.
This project wouldn’t have been possible without the inter-unit collaboration at the Colorado State Library between LRS and ILD. ILD provides a variety of services to Colorado’s institutional libraries. This team was composed of Renee Barnes (Supervisor), Teresa Allen (Youth Institutions and Acquisitions Senior Consultant), Molly Bassford (Institutional Libraries Senior Consultant), and Erin Boyington (Adult Institutions Senior Consultant). This group bridged the gap between the State Library and the Colorado Department of Corrections (CDOC). Their existing relationships and trust with CDOC made it possible for us to gain entry to the prisons. They often acted as liaisons between LRS and CDOC and were hugely important in helping the working group gain a better understanding of the CDOC landscape. They also scheduled all focus groups and coordinated with prison library staff to make the visits run smoothly. Additionally, they acted as security for LRS while conducting focus groups and as sounding boards when questions or concerns arose.
Partner Organizations
- How to locate partners
[Coming soon]
- How to get buy-in from partners
[Coming soon]
Data Collection
Focus Groups and Surveys
Hear from the team
Focus Group Tips
Selection
Participants for our focus groups were randomly selected from a roster of the current facility population divided by custody levels. The Institutional Library Development (ILD) team assisted us in obtaining the rosters and completing the random selection. The number of people selected from each facility was based on the number of people in each custody level at the facility. We invited people from every custody level in each facility and normally held a separate focus group for each custody level because they often could not be grouped together. Once participants were randomly selected, we mail merged their name, cell location, and DOC number with our invitation letter and sent these letters to the facility librarians to distribute. Some librarians chose to send the letters through the facility mail, and some librarians chose to deliver them to the cells themselves.
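For teams that want to script the proportional random selection described above, here is a minimal sketch in Python using pandas. The roster file name and column names are hypothetical placeholders, not the actual CDOC roster format, and the invitation count is illustrative.

```python
import pandas as pd

# Hypothetical roster export: one row per resident, with placeholder columns
# "name", "doc_number", "cell_location", and "custody_level".
roster = pd.read_csv("facility_roster.csv")

TOTAL_INVITES = 30  # illustrative number of invitations for this facility

# Allocate invitations to each custody level in proportion to its share of the
# facility population (rounding may make the total off by one or two), then
# sample randomly within each level.
counts = roster["custody_level"].value_counts()
allocations = (counts / counts.sum() * TOTAL_INVITES).round().astype(int)

invitees = pd.concat(
    roster[roster["custody_level"] == level].sample(
        n=min(n, counts[level]), random_state=42
    )
    for level, n in allocations.items()
)

# Export only the fields needed for the mail merge with the invitation letter.
invitees[["name", "cell_location", "doc_number"]].to_csv(
    "invitation_merge.csv", index=False
)
```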
After a couple of focus groups, we updated our invitation letter because we heard from a focus group participant that it could easily be mistaken for a letter from the Colorado Department of Corrections (CDOC). This could have deterred some of the people randomly selected: there may be reluctance to voluntarily participate in something run by CDOC, due in part to fear of authority and the power structures incarcerated people endure daily. We changed the text of our invitation to make clear that we are not affiliated with CDOC.
Once people received the invitation, they could indicate whether they would like to participate in the study and return the form to the facility librarian. We did hear from a couple of focus group participants that this part of the process was occasionally difficult to navigate. We also heard secondhand about people who wanted to participate but were unable to return the form to the librarian, sometimes due to lack of cooperation from correctional officers.
Once the facility librarians had collected responses from the people who were randomly selected, they communicated with the liaison from Institutional Library Development (ILD) assigned to that facility to start planning the focus group schedule. How the schedules were set up varied from facility to facility, but we always completed the focus groups for every custody level at a facility in one to two days, sometimes conducting as many as six focus groups in a day.
Conducting a focus group
Most of the time, focus groups were conducted in the facility library. The library served as a comfortable meeting place because it often felt like one of the most normalized places in the facility and interruptions were limited. There were a handful of times when the focus groups for certain custody levels could not be conducted in the library due to security concerns or lack of access to the library. In these cases, alternative meeting spaces were used for the group, or private visitor booths were used when the interviews needed to be conducted individually for the highest custody levels.
To protect participants’ privacy, when focus groups were held at the library, the facility librarian stayed in their office during the focus group. This was also important to ensure participants’ responses were not influenced by the librarian’s presence. The only people present during the focus groups were the participants, members of the research team (usually one taking notes and one asking questions) and a member of the ILD team. After the focus group was over and the recording stopped, the ILD staff member helped answer any questions about facility libraries that arose throughout the conversation. This also provided an opportunity for ILD to spread the word about new or existing services at that facility. We made sure that the ILD staff member present had not previously worked at that facility library and did not know any of the participants present.
An ILD staff member was also present during focus groups to follow safety protocols. The ILD team had previous experience working in prison libraries, DOC badges, and the necessary training completed to act as our chaperones within facilities. Members of the research team did not walk around the facility without a member of ILD or other facility staff person escorting them.
To begin the focus groups, a member of the research team read the entire consent to participate form out loud to the group. There was then time for participants to ask any questions they might have, and if they agreed to participate they signed and turned in the consent form. We were careful throughout this process to make clear that participating would not have any impact, positive or negative, on participants’ terms of incarceration and that participants could refrain from answering questions or leave the focus group at any time. We also made sure that participants had enough information and a clear understanding of the study for them to be capable of giving true informed consent.
The research team provided pens to participants for the consent form but were required to collect them all back from participants. Food and water bottles were set out before the start of the focus groups along with paper plates, napkins and plastic utensils, so participants could help themselves throughout the focus group. We also needed to collect the leftover food and anything we distributed with the food at the end of the focus groups so participants could not bring any of it back to their cells.
Once participants had signed the consent form, we started the recording and began asking our questions. Our outline was set up with seven main questions and multiple follow-up questions under each. Not every follow-up question was asked in every focus group, depending on whether participants readily volunteered the information. Focus groups were semi-structured: the main questions and question order stayed the same from focus group to focus group, but unscripted follow-up questions were often asked for clarification or to pursue an interesting topic raised by participants.
Surveys
Survey Design
The basic design of the survey was created during the planning stages of the project. Before administering the surveys, however, we made updates to improve the questions, organization/formatting, and language used. The whole team discussed these changes during multiple meetings, and we also had a sensitivity reader review the survey. The sensitivity reader helped ensure we were approaching topics through a trauma-informed lens for people who are or were incarcerated.
Before administering surveys inside facilities, we administered surveys to formerly incarcerated people outside of facilities. These surveys were distributed by our partner Carol Peeples, founder of the reentry organization Remerg. Carol brought printed surveys to parole offices around the state and returned the completed surveys to the Library Research Service (LRS). LRS then manually entered the 271 completed surveys into an online version of the survey created in the survey platform Alchemer, which allowed us to easily export the data to spreadsheets or build reports within Alchemer to filter and analyze the data. We did not collect any personally identifiable information (PII), so the identities of the survey respondents were protected.
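For teams that export their survey data rather than building reports inside the platform, here is a minimal sketch of what filtering and cross-tabulating an exported spreadsheet could look like in Python. The file name and column labels are placeholders, not the actual PRISM questions.

```python
import pandas as pd

# Hypothetical CSV export of survey responses from Alchemer; the file name and
# column labels are placeholders.
responses = pd.read_csv("prism_outside_survey_export.csv")

# Example: cross-tabulate how often respondents reported using the prison
# library against whether they attended library programs.
summary = pd.crosstab(
    responses["library_use_frequency"],
    responses["attended_library_programs"],
)
print(summary)
```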
Although we exceeded our goal of 264 completed surveys from formerly incarcerated people, Carol communicated that there was sometimes reluctance to fill out the survey. We attributed this in part to the length of the survey. Large font size and spacing for the open-ended questions (both of which are crucial to distributing an accessible survey) also lengthened the survey, and the resulting thick, stapled packet may have struck potential survey takers as intimidating and time consuming. There was room for formatting improvements, so before distributing surveys inside facilities, we spent time redesigning the second survey for currently incarcerated people, focusing on condensing the design while keeping it accessible. We removed questions such as “Have you ever been incarcerated by the Colorado Department of Corrections?”, “Year of most recent release:”, “Year of most recent entry:”, and “Please describe your current living situation” because these questions are not relevant for surveys distributed within prisons. We also improved the wording of several questions. These changes were all guided by our experience with the first survey distributed outside of facilities: we identified questions that confused respondents and places where the wording needed to change for currently incarcerated people. These differences between the surveys distributed outside and inside of facilities make it challenging to directly compare answers to several questions; however, we determined that the changes were necessary to ensure that we were distributing a clear and accessible survey and collecting quality data from within prisons.
Logistics of Administering Surveys in Prison
When possible, both printed and online surveys were administered within the facilities. Printed surveys were sent to a randomly selected sample of people at each facility. Our goal was to receive enough surveys for a representative sample, which was calculated at 267 surveys from currently incarcerated people. This target sample size was used to calculate how many people to randomly select and send surveys to within facilities. Those randomly selected received both an English and Spanish version of the survey, a cover letter that also provided a form for requesting the surveys in additional languages, and a pre-addressed and stamped envelope for returning the surveys to LRS. We manually entered the printed surveys we received back into another online version of the survey in Alchemer for analysis.
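If you need to calculate a target sample size for your own population, the sketch below shows the standard Cochran formula with a finite population correction. The confidence level, margin of error, and expected proportion shown are illustrative assumptions; this toolkit does not restate the exact parameters behind our target of 267 completed surveys.

```python
import math

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Cochran's sample-size formula with a finite population correction.

    The z value (confidence level), margin of error, and expected proportion p
    used here are placeholders, not the parameters behind PRISM's 267 target.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Illustrative call using the Colorado prison population cited earlier.
print(sample_size(12290))
```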
In facilities that allowed it, we also had the survey available on the library computer as an option for people to fill out during their time in the library. This online version was open to any currently incarcerated person who wanted to fill it out while in the library, regardless of whether they were randomly selected. Online responses were recorded in the same survey where we manually entered the printed surveys received from currently incarcerated individuals, and the text was identical between formats. We also created an online survey in Spanish for people to complete.
Hear from the team
Downloadable Forms
Survey for Formerly Incarcerated Individuals (Outside) (Word)
Survey for Currently Incarcerated Individuals (Inside) (Word)
Cover Letter for Inside Survey (Word)
Data Analysis
Focus Groups
Codebook
Coding is an approach to qualitative analysis in which words or short phrases are applied to pieces of the data. Before we could start coding the transcripts, we needed to begin building our codebook: the list of codes that we would apply and their definitions. This process evolved through grounded theory. Conducting the focus groups left us with a preliminary idea of the themes participants raised most often, such as barriers to library use and the library services available. This experience, alongside a careful review of our first focus group transcripts, laid the groundwork for the codebook we started with. Inevitably, as we read through more transcripts, we identified additional recurring topics relevant to our analysis and added them to our codebook. We structured our codebook with parent codes and child codes: a parent code is an overarching topic and child codes are components that fall under it, such as “information access” as a child code of “library services.”
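As an illustration of that parent/child structure, here is a minimal sketch of how a codebook could be represented in code. The code names come from this toolkit, but the definitions are illustrative placeholders rather than our actual wording; see the downloadable PRISM Codebook below for the real definitions.

```python
# Sketch of a parent/child codebook structure with placeholder definitions.
codebook = {
    "library services": {
        "definition": "References to services the facility library provides.",
        "children": {
            "information access": "Using the library to find or request information.",
        },
    },
    "barriers to library use": {
        "definition": "Anything that limits or prevents use of the library.",
        "children": {},
    },
}
```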
Even as we finished coding all the transcripts, we recognized in hindsight that there were elements of our codebook that could be improved. For example, we would suggest the addition of the code “books” with child codes “fiction” and “nonfiction”, and we would suggest incorporating the code “self-help books” as a specific example of the code “self-led learning”. We would also suggest the addition of the code “suggestions for improvement” so long as this works with the goals of your project. We would also like to emphasize the importance of clearly defining codes in the codebook. It is sometimes necessary for a code’s definition to evolve throughout analysis, but making sure the entire team has a similar understanding of how each code should be applied from the start can save a lot of time later on.
Download the PRISM Codebook (Word)
Qualitative Analysis
For our analysis of this qualitative study, we used the software Dedoose. Dedoose is a relatively low-cost analysis tool that is robust enough for our scope of work. It allows users to code transcripts and other data as well as add descriptors. Descriptors are characteristics assigned to each transcript and can be helpful for identifying differences between groups. Our descriptors were gender, security level of facility, facility name, custody level of each focus group, facility size, collection size, and the number of full-time and part-time staff. These categories added an extra layer of analysis beyond code count and co-occurrence. Codes were applied to excerpts from transcripts, and we tried to make sure excerpts were at least a few sentences long so as not to lose too much context when reviewing code application. Dedoose also enables teams to check inter-rater reliability through its “training” function, which allowed our team to make sure we were on the same page about how and when we were applying codes. We did encounter a few drawbacks of using Dedoose: the software can be slow and clunky at times, requiring lots of updates and patience. It is not the most intuitive software, but a robust tutorial library is available, as are trainings on platforms such as Udemy and LinkedIn Learning.
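For teams that want to look at code counts and co-occurrence outside of their analysis software, here is a minimal sketch of how that could be done from a simple excerpt-by-code spreadsheet. The matrix shape and values are hypothetical, not a real Dedoose export.

```python
import pandas as pd

# Hypothetical excerpt-by-code matrix: one row per excerpt and one 0/1 column
# per code, a shape you could assemble from your own excerpt export.
excerpts = pd.DataFrame({
    "library services":        [1, 1, 0, 1, 0],
    "information access":      [1, 0, 0, 1, 0],
    "barriers to library use": [0, 1, 1, 0, 1],
})

print(excerpts.sum())         # how many excerpts each code was applied to
print(excerpts.T @ excerpts)  # co-occurrence counts (diagonal = code counts)
```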
Teamwork and collaboration were vital to our success in Dedoose. We had 56 transcripts (called “media” in Dedoose) to code, which would be a very heavy lift for one person, so we divided the transcripts between three team members. We originally divided them evenly by count, but learned that some people were getting very short transcripts while others had lengthy ones. We reassessed and redistributed the transcripts, still roughly by count but paying more attention to length, which made the share of work much more equal.
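If you want to automate that kind of load balancing, here is a minimal sketch of a greedy assignment by transcript length. The transcript lengths and coder names are made up for illustration.

```python
# Hypothetical transcript lengths in words; in practice these could come from
# counting the words in each transcript file.
transcripts = {
    "focus_group_01": 5200,
    "focus_group_02": 900,
    "focus_group_03": 3100,
    "focus_group_04": 4700,
    "focus_group_05": 1800,
    "focus_group_06": 2600,
}
coders = ["coder_a", "coder_b", "coder_c"]

# Greedy balancing: hand each transcript, longest first, to whichever coder
# currently has the smallest total word count.
assignments = {c: [] for c in coders}
totals = {c: 0 for c in coders}
for name, words in sorted(transcripts.items(), key=lambda kv: kv[1], reverse=True):
    lightest = min(coders, key=totals.get)
    assignments[lightest].append(name)
    totals[lightest] += words

print(assignments)
print(totals)
```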
To check each other’s work and make sure we agreed on code application, we used Dedoose’s training (also called testing) function. We began by having one person test the work of another, but realized that discussing each excerpt would take far too long and was not a good use of time, since we had 2,409 excerpts to go through. Instead, we identified the six codes with the most disagreement (that is, the codes most often applied differently during testing) and tested on just those. The best way we found to do this was to have one person run a test that included every excerpt to which any of those six codes had been applied. This came to 873 excerpts, which was still about a third of all our excerpts! We found that testing 100 excerpts took about an hour; it’s tedious work. Doing all 873 excerpts at once was not possible, so the tester did about 100 excerpts in a sitting. Ideally, testing would yield a kappa score, a statistic that indicates agreement between raters. Since we had to divide the 873 excerpts into chunks, we were not able to use kappa (another drawback of Dedoose). To find agreement, we went back to our original method of going through each of the 873 excerpts as a team and discussing any disagreement between the original coder and the tester. We would apply new codes to excerpts if we all agreed, but we did not remove any original codes unless the group felt very strongly that a code did not fit. This again was a long process: at our fastest we could discuss about 75 excerpts per hour. Despite this being a tedious task, the end result was a high rate of agreement on coding as well as rich discussions on etymology and library theory.
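If your software cannot produce a kappa score across chunked tests, one workaround (not part of our Dedoose workflow) is to export each rater’s code applications and compute Cohen’s kappa per code yourself, for example with scikit-learn. The judgments below are hypothetical.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-excerpt judgments for a single code: 1 means the rater
# applied the code to that excerpt, 0 means they did not.
original_coder = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
tester = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]

print(f"Cohen's kappa for this code: {cohen_kappa_score(original_coder, tester):.2f}")
```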
Research Findings
[Coming soon]
More Information
Notes/slides from ALA
[Coming soon]
Listserv for community discussion
[Coming soon]
Public libraries serving incarcerated patrons at the Franklin County Jail in western Massachusetts have adapted a version of the survey instrument for people who are currently incarcerated, and they used it to collect data for reporting on the outcomes of a separate grant funded through the Library Services and Technology Act (LSTA) and administered by the Massachusetts Board of Library Commissioners (MBLC).
Citations
1. Maruschak, Laura, and Emily Buehler. 2021. “Census of State and Federal Adult Correctional Facilities, 2019 – Statistical Tables.” Bureau of Justice Statistics. https://bjs.ojp.gov/content/pub/pdf/csfacf19st.pdf.
2. Carson, E. Ann, and Rich Kluckow. 2023. “Prisoners in 2022 – Statistical Tables.” Bureau of Justice Statistics, November 2023. https://bjs.ojp.gov/library/publications/prisoners-2022-statistical-tables.
3. “Monthly Population and Capacity Report.” Colorado Department of Corrections, last modified April 30, 2024. https://drive.google.com/file/d/1nv7vMGbRwqO56x4mZyynIN21wTBgbKvR/view.