SPEC Kit 350: Supporting Digital Scholarship
Interviews and ongoing conversations, developing new relationships with campus stakeholders, and
tending to existing relationships.
Interviews with faculty and students provide ongoing opportunities to evaluate the breadth and depth
of our support.
Interviews with individual researchers (2 responses)
It would be very helpful to hear what our peer institutions have done in this area.
It’s quite difficult to choose one of these as most useful, as different departments tend to use different
methods, based on the nature of their digital scholarship work. Our department of Assessment
and User Experience Services conducts annual user surveys, which include questions related to
digital scholarship services and spaces, and has followed up with focus groups to better understand
responses. This is very useful for helping to secure support for new or enhanced spaces, services,
collections, and initiatives. Individual departments, especially Data and Visualization Services
and Research and Instructional Services, have tracked the frequency and length of consultation
sessions over several years; this data has helped them to discern trends (such as increases not just in
the number of consultations over time, but also in the length of those consultations), which in turn has
informed decisions about staffing service desks and providing alternative consultation services. Digital
Scholarship Services, which primarily partners with students and faculty on digital projects, has
tracked information on the nature of these projects, to better assess the landscape of interest and need
in digital scholarship. While all the assessment methods above help us gauge researcher needs and
interests in digital scholarship and the value of existing services, focused discussions with different
user groups (administrators, faculty, students)—whether as part of focus groups or interviews—
arguably are most indispensable in helping us to better understand the context of digital scholarship
work and thus take a longer view of the ways the Libraries can effectively transform scholarship. For
instance, an increase in the frequency and length of repeat digital research consultations at a service
desk might suggest a need to provide more staff and longer hours. But an interview or focus group with
these same students might reveal that class assignments require the use of digital tools and approaches
but don’t include that training as part of the course. A more successful intervention, then, would be
to involve ourselves in curricular discussions at the university level or, minimally, to offer a series of
workshops that can help scaffold learning in these areas and share those workshop schedules with
faculty. In essence, we cannot rely on one form of assessment; rather, we benefit from regular and
thoughtful assessment in a variety of ways, to get a fuller picture and make more informed choices
about how we direct our work.
Library User Survey: includes questions on relative importance, satisfaction, usage, etc.
Number of consultations
Project data and discussions with patrons
Survey
The assessment activities focus on individual services rather than the program as a whole, so they are
all useful.
The LibQUAL+ survey and faculty interviews conducted for a “Future of the Libraries” study.
Too early to say. At this stage, we see all these methods as useful in different ways.
Usage data, interviews, and focus groups
User research
User satisfaction survey