Getting usable statistics from technical platforms. A variety of tools are used to provide similar (but not exactly matched) information.
Inconsistency in data. Data normalization.
It is difficult to make time for aging legacy access systems that should be migrated forward; at a minimum, they require maintenance. The best outcome is when we have been able to migrate content to newer platforms. Our content preservation requirements and validation processes have become more rigorous, making migration forward both valuable and challenging. We have put a lot of time into fixing content in order to move it forward; valuable, and worth it, but time-consuming. The logistics of moving content to a new preservation repository are especially complex if part of the goal is to limit disruption to users as much as possible. We are planning carefully.
It is early days for our data repository, so we have had to demonstrate use not only through the number of published data sets, but also by looking at other indicators of interest, such as projects with data in the pipeline and the number of proposals using our repository as their data management solution.
Lack of staff time and training. Lack of commonly used assessment models for digital collections. Platforms not
maintained by us are resistant to statistics gathering.
Lack of staff/faculty to do assessment. Developing the Curator Tools and conducting training to support all Curators and Collection Managers in assessing their digital collections along with their physical collections.
Lack of standardization of metadata. Lack of digitization standards resulting in the need to re-digitize materials. Lack of
a central repository.
Level of ongoing resources to support program. Lack of formal policy and mandate.
Multiple platforms and software versioning. Poor data collection tools for evaluation. Lack of strategic focus in this area.
No front-end infrastructure for many projects/materials that would allow tracking and assessment. Lack of dedicated staff for assessment. Lack of the ability to access and convert assessment data into information.
No systematic approach to assessment, and no one person or group charged with the responsibility. There are many of us who care about this work, however, so we do our best to keep things current and evaluate the product (as it were). Along the same lines, we tend to be overwhelmed with work and move on to the next project as quickly as possible, so we lose opportunities to really evaluate and improve our work based on previous projects.
Not enough information to provide useful metadata. Navigation problems within the DAMS. Upload/storage size limitations on files.
Older content needs significant work to be brought up to contemporary standards. Content in HTML is difficult to
migrate.
Resources available to carry out the work locally. Forming a working group has helped to prioritise digitization and focus resources. Determining the extent to which we will support digitization efforts by faculty and students versus carrying out our own projects.
Staff who oversee digital collections are scattered throughout the organization. Statistics for the repositories are
currently not kept in a central location. There is no one person responsible for coordinating assessment and outreach
activities related to digital collections.
Staff/faculty time to plan and carry out the actual assessment. Staff/faculty time to make recommended changes.
We have a wide variety of resources in the digital library and a one-size-fits-all structure (that allows more efficient
management) presents problems.