88%) also stated that they have made either small or substantive changes based on assessment data. The types of data respondents described as useful in making these decisions were more varied than in the previous question: various kinds of formal and informal feedback gathering, observations, surveys, and focus groups, along with headcounts and usage statistics. One respondent described how their library changes its events based on assessment: “[w]e collect relevant information to determine if the program met the intended goals—that can be head counts, staff feedback, participant feedback/information, etc. When goals aren’t being met we look for ways to adjust or improve the program, or determine if there are other valuable goals that are being met and adjust accordingly.”

Most respondents reported that outreach activities were included in library employees’ performance evaluations (44, or 79%), that outreach assessment data was compiled in response to administration requests (32, or 57%), and that departments provided reports to library administrators on an annual or other regular basis (28, or 50%). This is in line with expectations: 53 respondents (95%) reported that library personnel had outreach responsibilities specifically written into their position descriptions, meaning those personnel would need to be evaluated on outreach. Nine respondents (16%) stated that no reporting on outreach was required in their library.

Finally, the survey asked respondents how much time their libraries provided for establishing the impact and demonstrating the success of outreach activities. The majority (34, or 61%) said there was no defined time frame, with fewer saying that success and impact could be demonstrated incrementally (14, or 25%) or after two or three iterations (4, or 7%). Only one respondent said that success and impact had to be demonstrated immediately. One comment described how reporting expectations differed by the type of activity: “[o]ngoing assessment and reporting of impact is expected for well-established outreach programs. Those reports come out soon after the events. Other activities may not require formal assessment, but are reported to show impact. Success of newly developed events can be shown incrementally.”

Case Studies

In the Case Study section of the survey, respondents shared an outreach event or activity their library had conducted in the last two years. This section revealed the wide variety of events, activities, and programs that ARL institutions offered within their outreach programs. Common activities included resource fairs, open house events, and finals de-stressing activities, but respondents also mentioned offerings such as high school internship programs, seed lending libraries, and “human library” events, among many others. Libraries frequently partnered on outreach activities with other campus units, including both academic departments and student support units such as wellness centers or writing centers.

The budgets for these programs varied widely. Some programs, such as a social media-focused Archives Hashtag Party, had a $0 budget, while others, such as a common reader program, had a budget of up to $50,000 for an annual event.

One consistent theme throughout the free-text responses was the need for administrative approval, which was also discussed earlier in the survey.
Some respondents did not indicate an approval authority or indicated that individual librarians, such as an outreach librarian, had the authority to approve the event. However, many respondents indicated that a library director/dean or other administrator had to approve the event, even for low-cost and no-cost events. Respondents frequently connected administrative approval to budgets.

Finally, case study respondents revealed that outcomes and assessment plans varied widely. Some institutions had defined, measurable outcomes for their case study activity. For example, one library listed specific goals, objectives, and outcomes for its human library event and administered a participant survey to measure achievement of the outreach outcomes. Several other institutions listed more general outcomes for their activities but lacked any assessment plan. This suggests that, while