We would like to work with a computer science course on user interfaces to evaluate the usability of ScholarWorks as a case study.
Workshops with stakeholders will depend on whether we need training for internal library staff or for library users.
If you selected “Another regular interval” above, please specify the method and the interval. N=34
According to our User Experience Department (within LIT), assessment is to be done early, often, and at the end of a
project. A variety of methods are used depending on the situation.
As needed
As required for statistical reporting purposes
Assessment is conducted at intervals determined by grant funding. Generally a three-year assessment is used.
Assessment training for staff: as needed. Workshops with stakeholders: as needed, usually specific to projects or
collections.
Assessment training is part of professional development and occurs as needed. Use statistics are sent to authors
monthly. Statistics are continually tracked and reviewed.
Comments from Digital Archivist regarding the digital library: We don’t “collect” user comments, but we do allow users to contact us freely via a contact form. We receive emails on a weekly basis. Comments from Digital Scholarship Librarian regarding the IR: The number of records and download statistics are documented monthly to evaluate the growth of the IR. We receive user comments by e-mail every now and then.
For some collections, we track and report metrics on a monthly basis, for others quarterly. Metrics are used as needed,
according to the project and stakeholders.
It is dependent on the product. I cannot give a generalization.
Monthly statistics
On an as needed basis
Ongoing (2 responses)
Our use of “another regular interval” represents a range from daily through to ongoing, iterative assessment and
through to project milestones.
Quarterly (2 responses)
Quarterly page views, 2 year-long audience surveys
Regular interval for usability testing: this is an ongoing process; we perform testing as we work in an agile fashion to see how end users react to features. Comments are always available and are collected on an ad hoc basis.
Some of these answers are consistent (user comments). Some are quarterly. Others are twice/year. Still others are every
2–3 years.
Statistics gathering/log analysis: monthly statistical reports are generated. Collect user comments: user comments are
always welcome and encouraged via a notice on our website.
These methods are used on a varying basis, generally more than once a year.