plan, six have an overarching plan that covers digital collections, and six have a plan specifically for locally curated digital collections. Examples of assessment activities include keeping web usage statistics, collecting feedback from collaborators, and tracking the use of collections for research and teaching. One respondent indicated that assessment was covered by a digital preservation plan. Another noted that the existing assessment plans were specific to individual collections, and, thus, did not support ongoing programmatic assessment needs.
Having an assessment plan does not necessarily correlate with whether the library has performed assessment of the collections. While all six of the libraries that have an overarching plan reported performing an assessment of locally curated digital collections within the last three years, only half of the libraries with specific plans have done so. Twenty-four of the libraries that do not have a plan have nonetheless performed assessment of their collections, and another 20 plan to do so. In their comments, respondents described some of their recent activities, including analysis of web statistics for an annual report, informal assessments of collection scope and workflows for particular collections, usability analysis for a repository redesign, and formal and informal assessments for use in planning new support services for data management/curation and digital scholarship.
Assessment Reasons and Frequency
The majority of respondents reported multiple reasons for assessing locally curated digital collections. Most frequently, they conduct assessment to improve functionality (44 or 86%), to inform ongoing iterative development (42 or 82%), to evaluate technical enhancements (36 or 71%), as needed when new formats or functionality are added to the collections (32 or 63%), and to build stakeholder buy-in (26 or 51%). They conduct assessments less frequently to meet funding requirements (16 or 31%). Other reasons for conducting assessments include migrating to new systems, analyzing storage requirements, integrating new data support, informing digitization efforts, understanding users, tracking impact for digital research processes, evaluating general usability, and evaluating and prioritizing new content. One respondent commented that assessment included a “survey of our activities prompted by hiring a digital assets librarian who performed an environmental scan” that showed the close relationship among assessment activities, staffing, and local resource availability.
Respondents use a variety of assessment methods, most often employed on an as-needed, monthly, or quarterly basis. They tend to capitalize on existing, automatically collected data, such as user comments submitted through the web and statistics from web logs. In addition to leveraging this automatically collected data, respondents reported conducting more resource-intensive surveys, focus groups, workshops, and similar activities, again most often on a per-project or as-needed basis. In describing this combination of approaches, one respondent explained, “User comments are gathered in real time on an ongoing basis. With at least some of the projects, meetings with stakeholders occur twice a year.” Another provided similar insight into the types of assessment methods and their frequency, noting that activities are tied to specific project or development needs and that it “depends on the area in question. In general, these activities are done in parallel with development milestones.” In contrast to the many as-needed and as-possible responses, at least one respondent tied their current set of activities to larger goals: “In the future, we want to build a routine schedule of assessment in concert with another program in the library, Digital User Services.”
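
As one illustration of the automatically collected data described above, the short Python sketch below tallies page views per collection from a web server access log. It is a minimal sketch under stated assumptions, not any respondent’s actual workflow: the Apache/Nginx “combined” log format, the /collections/<name>/ URL layout, and the access.log path are all assumptions made for the example.

import re
from collections import Counter

# Matches the request line of an Apache/Nginx "combined" log entry.
# Both the log format and the /collections/<name>/ URL layout are
# assumptions; adjust the pattern to a local server's configuration.
REQUEST_RE = re.compile(r'"GET /collections/([^/]+)/[^ ]* HTTP')

def tally_collection_views(log_path):
    """Count page views per digital collection in a web server log."""
    views = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = REQUEST_RE.search(line)
            if match:
                views[match.group(1)] += 1
    return views

if __name__ == "__main__":
    # Hypothetical log path; print the ten most-viewed collections,
    # e.g., as raw input for an annual report.
    for name, count in tally_collection_views("access.log").most_common(10):
        print(f"{name}\t{count}")

In practice, a library would likely also filter out crawler traffic and aggregate counts by month before reporting figures of this kind.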
Assessment Outcomes
The survey found significant and substantive benefits from assessment. The majority of respondents reported that the results of assessment led to changes to user interfaces (39 or 87%), new search features (30 or 67%), collaboration with faculty to add new resources to collections (26 or 58%), collaboration with faculty for instruction (25 or 56%), and development of new digital collections to promote student or faculty scholarship (23 or 51%). Other positive results include high-impact benefits such as “changes in institutional subsidy for storage,” “[b]etter collection development policies,” “[c]ollaboration with administrative units to develop outreach centered on alumni and other groups,” and “[n]ew resources for curators for curation needs [...] for integration with research and teaching, and for greater