Survey Results: Executive Summary

As seen in the survey, libraries are always looking for potential new partners and collaborators: 49 respondents (89%) identified working with a new partner over the past three years. Campus orientation departments and development were the most frequent collaborators. Surprisingly, 27 respondents (48%) had never worked with the study abroad office and 24 (44%) had never worked with the veterans’ center. Write-in comments most frequently described additional collaborations with athletics, student affairs, alumni offices, and residence life.

Assessment and Reporting

The survey also explored how libraries assessed and reported their outreach activities. Libraries employed a variety of assessment methods, but the most common were headcounts (55, or 98%), observations (53, or 95%), feedback from outreach volunteers or partners (49, or 88%), and collecting comments (46, or 82%). Other assessment methods were uncommon, including minute papers (11, or 20%), interviews (16, or 29%), and focus groups (21, or 38%). These responses suggest that libraries relied on quick, unobtrusive, and less resource-intensive techniques for most of their assessment.

As seen in the goals section, the majority of respondents’ goals were internal or library-focused and/or intent on measuring factors such as participation, rather than seeking to assess patrons’ experience with the library or with library services and resources. It therefore makes sense that assessment methods such as headcounts and observations were the most common. If libraries were to develop more specific outcomes related to outreach participants’ learning or experience with an activity, there would be an opportunity to use a wider variety of assessment methods, including more time- and resource-intensive methods such as focus groups and interviews.

Very few libraries appeared to have designated staff to design and test assessment tools for individual outreach activities. Rather, a variety of individuals or groups were involved in assessing activities. Those most often involved in assessment were the individuals or groups planning the events and/or communications and marketing staff. Very few libraries hired consultants or external staff for outreach assessment. Similarly, there did not appear to be one person at any given library responsible for assessing the library’s overall outreach program. Fifteen respondents (27%) said no one was responsible for overall program assessment, and several of the “Other” responses echoed this same issue, describing an ad hoc approach to assessing outreach programs. Compared to other key functional initiatives in libraries, such as reference, collections, and instruction, outreach does not appear to be assessed programmatically.

The survey asked about professional development opportunities and/or training that the library provided for those who assess outreach activities. Respondents described libraries providing funding for general professional development, such as conference attendance, but not necessarily opportunities related specifically to outreach assessment. Six responses mentioned having employees attend the Library Assessment Conference. This biennial conference, sponsored by ARL and the University of Washington, focuses on assessment but does not explicitly address assessing outreach.
Based upon survey responses, it appears that individuals who would like training on outreach assessment would have to initiate it themselves or draw on general professional development funding. This is an area where professional library associations and organizations could take the lead in providing opportunities (through workshops, webinars, and conference programming) to help members and participants learn about assessment strategies and analysis related to outreach events and activities.

Although libraries have not taken a programmatic approach to assessing their outreach events, they did report using assessment data to make decisions about outreach programming. The majority of respondents (38, or 68%) reported that their library has canceled or discontinued events based on assessment data. The kind of assessment data most often cited in connection with cancellation was poor attendance or low usage statistics. To a lesser extent, observations, feedback and comments, and surveys were mentioned as data types that were useful in making the decision to cancel. Most respondents (49, or