Survey Results: Survey Questions and Responses
12. Please briefly describe up to three challenges your library has encountered when assessing locally
curated digital collections. Include any methods that were successful in overcoming that challenge.
N=42
Building connections to our users, including our “internal” collection curators and community users. Asking the right
questions to return actionable data.
Collecting meaningful usage statistics. Defining the audience for digitized collections and assessing their use of
collections.
Consistency in review cycle
Delegating staff time to develop, implement, and gather data through reliable assessment methods. Extracting and
interpreting data from free services (e.g., Google Analytics) that are skewed toward online businesses rather than scholarly
inquiry. Communicating assessment results to stakeholders and community members who are not familiar with assessment
terminology.
Determining the correct level of granularity to use in applying analytics code for accurate metrics. The decentralized
nature of our organization has made this challenging. There is significant pressure to keep working on new projects with
little capacity left to assess existing ones.
Determining whether web analytics are accurate; lack of meaningful/substantial and/or demographic details in web
analytics. Solution: continue to experiment with new tools and refine methods. Some digital collections aren’t being
used in any substantial way yet (such as web archives)—how can we assess future use/forecast that?
Difficulty in collecting and comparing usage statistics across platforms
Difficulty of defining “usage” (i.e., visiting a page doesn’t mean someone actually used it for anything). Absence of
formalized assessment plan for digital collections.
Digital collections should not be approached as if they have the same kind of lifecycle as analog collections. This survey
seems to imply that. Reformatting of outdated formats (interactive Flash learning objects, Flash video, for example).
Digital Preservation—an emerging yet critical field, with significant costs to be incurred.
Discrepancies between application-gathered statistics in DSpace and Google Analytics statistics on the same content.
What to assess? Size of files? Number of files? Number of collections? Preservation files? Access files? Bibliographic
records? Number of page views?
Dispersed collections across multiple platforms controlled by various staff persons. A homegrown solution allows us
to pursue usability improvements despite lack of expertise. Burgeoning assessment program with many units across
the institution needing their support.
Educating our users about why assessment is important. Connecting with project stakeholders about best practices,
technical guidelines, and related costs before they get too far along with a proposal. Tracking citations and other uses
by the scholarly community.
Evaluating options for long-term preservation. Determining staffing needs at appropriate levels.
Expertise in assessment—sent librarian to weeklong training after the fact. Funding for assessment tools—used free
version of Loop11. Time!
Gathering content enhancements from experts led to improved and more accurate content. Working with Education
faculty led to improved educational tools for users of the digital collections. Getting adequate (or any) response rates from
users on some small, specialized collections.