Library Assessment · 13
to coordinate the collection, reporting, or archiving of data, to fill requests for library data, or to submit external surveys.
The majority of assessment staff have collaborated on assessment activities with other non-library departments, agencies, or units within the institution, though standing committees are less likely to do so. These non-library collaborations are most often with institutional offices of research and learning, information technology, and assessment and planning.
Assessment Results Distribution and Outcomes
Methods of distributing assessment results vary depending on the audience, although overall, the most frequently used method is a Web site. In addition, the methods most widely used to inform the parent institution are print reports and library newsletter articles, while presentations and e-mail announcements are used more frequently for library staff. Staff appear to be the primary audience for the distribution of library assessment results; all methods except a campus newsletter are heavily used to reach them. Results are overwhelmingly distributed to the general public through a Web site or library newsletter articles.
The top two types of assessment information listed on a library’s assessment Web site (whether publicly accessible or staff-only) are general library statistics and analyses of assessment activity results. Assessment publications are found more frequently on a public Web site than on a staff-only Web site, while presentations and assessment data are provided more often on staff-only Web sites than on public ones. Other types of information mentioned by more than one respondent include meeting notes and agendas on staff-only Web sites.
There is little point in having an assessment program unless the results are used to make improvements in services. Respondents were asked to list three outcomes that were attributable to their assessment activities. Twenty areas were reported, but changes to Web sites and facilities were the most frequently mentioned. Collections, hours, and staff formed the next highest groups. Other areas that were changed include customer service, journals, access services, the online catalog interface, instruction and outreach, and reference services. Only one respondent reported no changes attributable to assessment.
Professional Development
When asked if their library provides assessment training to library staff, all but 20 of 68 respondents (71%) indicated they received some sort of support for training, whether provided by the library (28%), their institution (32%), or an outside source (62%). When the library provides training, the topics focus primarily on assessment methods, basic statistics, survey construction, the value of assessment, and data analysis.
When evaluating assessment-related professional development venues (such as conferences) outside the institution, the most highly recommended and most attended events were ARL assessment-related meetings and the 2006 Library Assessment Conference. When asked to identify the professional development needs not being met by these conferences, respondents focused on training, indicating a lack of available instruction on basic statistical analysis, methodologies, and tools.
Culture of Assessment
The survey included a series of statements on the culture of assessment. Respondents were asked to rate on a scale of 1 to 5 how well the statements described their respective libraries. Between 68% and 79% of the respondents agreed or strongly agreed with statements related to the commitment of their library administrations to assessment. The remaining statements were related to staff and their support for, or ability to carry out, assessment activities. Only 50% or fewer of the respondents rated