Beyond this basic dilemma, practical challenges also abound. First, libraries need a clear goal toward which the impact assessment contributes. Second, we need standard definitions of measures so that the profession has a shared vocabulary for discussing these concepts: what, for instance, constitutes student success? Third, impact assessment requires sufficient resources and skilled professionals so that the effort does not end once the data have been collected; the value of impact assessment lies in using the results to improve decision making. Fourth, to make inroads in this challenging field, we need to become more comfortable as a profession with gathering and analyzing data that are confidential but not anonymous. Librarianship’s proud tradition of protecting confidentiality too often leads to knee-jerk rejection of data analysis when careful adherence to standard data protection methods would be sufficient to protect users. Last but not least, we need success stories in which impact measures led to positive outcomes for the library, and we need to know how to share those findings effectively. One respondent to our survey said that the study in question “prevented more significant cuts to our budget than we might have suffered without this information.” Can we do better?