Pilot to Measure Social and Emotional Learning at Denver Public Library

Three girls playing with blocks.
By Hillary Estner, Katie Fox, and Erin McLean

Why evaluate?

How can you measure relationship-building abilities? How can you understand which of your library’s programs best support users’ development of skills like problem-solving? How can you determine whether the youth who come to your library need help learning how to ask a question?

At Denver Public Library (DPL), we wanted to answer these questions, which address a vital set of skills called social and emotional learning, or SEL. A key goal of our public library, like many libraries, is to provide experiences that positively impact participant learning and growth. Particularly with our youth participants, we hoped that library programs fostered SEL, but we had not yet found a way to measure it.

In summer 2017, at the urging of our library's executive leadership, we launched a pilot project to explore methods of evaluating youth outcomes from library summer programming, with a focus on SEL. We partnered with the Colorado State Library's Library Research Service, and the three of us (a reference librarian, a branch librarian, and a research analyst) set out to measure SEL.

Who participated?

While we assessed several components of the library's summer programming, here we will focus on a collaboration with the Denver Public Schools (DPS) program Summer Academy (https://www.dpsk12.org/summer-academy-provides-free-learning-activities/). DPS offers Summer Academy to students whose reading scores are below grade level and to students in the English Language Acquisition program. Youth who were invited to Summer Academy were also invited to participate in the library programming. Library programming participants attended literacy instruction in the mornings and two hours of library enrichment in the afternoons for four weeks.

Library programming participants were split into two groups based on age: one group of youth entering first, second, and third grades in the fall, and the other entering fourth, fifth, and sixth grades. In both classrooms, the youth typically had some unstructured time at the beginning of the library-led programming, often playing outside or free time with LEGO® bricks. After that unstructured time, participants in the younger classroom chose between two structured activities, each with a clearly defined end product. Participants in the older classroom could choose from several self-directed activities and often ended up designing their own projects without a defined end result.

How did the evaluation work?

We knew SEL would be challenging to measure, so we tried several strategies: library instructors facilitated individual smiley face surveys about specific activities, youth created end-of-summer reflective projects to share their experiences, and our team observed four days of the program, focusing on SEL behaviors. Unfortunately, the smiley face surveys did not work: they were difficult to administer consistently, and participants reported that every activity was fun and easy. Our observations indicated that these reports were not always accurate; we saw youth struggle and disengage at times. The youths' responses to the reflection prompts were largely positive and vague.

It is quite possible that the youth we were working with were too young to voice an opinion that was not positive. For example, in response to reflection questions about what they liked and disliked about the program, one youth wrote "I liked everything" and drew hearts. Another limitation of this assessment was that the participants, particularly the younger age group, were still developing their reading and writing abilities. While we tried to minimize this issue by using smiley faces for response categories, it remained a problem.

Observational rubric

The observational behavior rubric was the most challenging, and most fruitful, component of our project. After reviewing the literature, we could not find a freely available observational behavior rubric focused on SEL, so we developed our own. We initially observed youth with some key social and emotional behaviors in mind, and through the process of coding these observations, we developed a coding scheme and observational rubric.

To create our coding scheme and rubric, we first identified three key areas of SEL as defined by the Collaborative for Academic, Social, and Emotional Learning (CASEL): self-management, relationship skills, and responsible decision making. We used a behavior rubric that the Logan School for Creative Learning generously shared with us as a model to get started.

After our initial design, we tested and refined the rubric repeatedly so that we could code consistently. For example, under the category of self-management, the rubric included both "disengagement" and "engagement." Engagement included behaviors like listening, being on task, task completion, observing peers or teachers, and being responsive to directions. The relationship skills category included behaviors like "kind comment," "unkind comment," and "friendly chatting with peers or instructor." The responsible decision making category included behaviors like "letting someone else do it for you," "pride in work," and "helping peers."
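For teams that record coded observations electronically, a rubric like this maps naturally onto a simple lookup table. The sketch below (in Python) shows one way to tally coded behaviors per CASEL category; the category and behavior names are drawn from the examples above, not the full rubric, and the data structure is our illustration rather than the project's actual tooling.

```python
from collections import Counter

# Illustrative coding scheme modeled on the rubric described above.
# Behavior lists are examples only, not the complete rubric.
RUBRIC = {
    "self-management": {"engagement", "disengagement"},
    "relationship skills": {
        "kind comment",
        "unkind comment",
        "friendly chatting with peers or instructor",
    },
    "responsible decision making": {
        "letting someone else do it for you",
        "pride in work",
        "helping peers",
    },
}

def tally_by_category(observations):
    """Count coded behaviors per SEL category.

    `observations` is a list of behavior codes recorded during a session.
    Codes that do not appear in the rubric are counted under "uncoded"
    so they can be reviewed and possibly added to the scheme later.
    """
    counts = Counter()
    for code in observations:
        category = next(
            (cat for cat, behaviors in RUBRIC.items() if code in behaviors),
            "uncoded",
        )
        counts[category] += 1
    return counts

# Example session: three engagement codes, one kind comment, one unknown code.
session = ["engagement", "engagement", "kind comment", "engagement", "humming"]
print(tally_by_category(session))
```

Keeping an "uncoded" bucket mirrors the iterative refinement described above: behaviors that do not fit the current scheme surface in the counts instead of silently disappearing.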

Ultimately, coding our observations yielded preliminary but valuable results, which are being used to inform youth programming and staff training.

Results

Of the thirty-three enrolled students, twenty-six were in the younger group and seven were in the older group. We received nineteen consent forms for the younger group and six for the older group. Attendance was inconsistent, so the amount of time we were able to observe each participant varied: we observed seventeen participants in the younger group and five in the older group. Because of the older group's small sample size, as well as the open-ended design of their program, we decided to analyze only the data for the younger group.

Analyzing our observational data, we found that certain activities prompted more youth to show specific social and emotional skills and behaviors. For example, during the complex activity of making a solar-powered toy bug, youth participants engaged in positive problem-solving and decision-making more frequently than during the simpler activity of painting a tree and attaching buttons to make a "button tree."

Youth also displayed the highest rates of positive relationship skills, such as friendly chatting and sharing, during the slime and leaf imprint activities, both open-ended, exploratory activities (projects with multiple ways to successfully complete the task). Participants also had the highest rate of positive self-management during these two activities. We saw an even higher percentage of positive relationship skills during unstructured activity time, often LEGO® time.

Our sample per activity was quite small (sometimes we observed as few as three students completing an activity), so we are cautious about drawing overarching conclusions. Nonetheless, these results offered helpful information about which types of activities can provide environments that foster SEL, which will inform our design of programs tailored to SEL skills.

Resources for libraries

We want the library community to benefit from our experience measuring SEL, and in particular we want to share our observational behavior rubric as a free tool that organizations can use to conduct their own evaluations. For a copy of our rubric, click here. You may use and modify the rubric as long as you cite us. For more information about this project, please contact Katie Fox at the Library Research Service.