SPEC Kit 322: Library User Experience · 39
advantage of at the Library of Congress? How do visitors perceive the Library of Congress compared to other DC cultural
The survey was used to assess frequency of use of and satisfaction with the library's resources—including computers, audio-visual equipment, databases, and printers—and its services, such as instruction, information or reference, interlibrary loan, and circulation.
The User Spaces Task Force created a survey to poll users on how they interact with the library's facilities, what improvements they would like to see, and what considerations they weigh when choosing where to study within the library.
This study employed methods of user feedback collection to learn about the information needs of sciences faculty and
students at the University of North Carolina at Chapel Hill in order to improve library services for this population. The research questions we sought to address were: What are the information needs and behaviors of faculty and
students in the sciences at UNC-Chapel Hill? How can the UNC libraries best meet those needs through the provision of
resources and services?
To learn how users navigated our Digital Collections website and how they used the search options.
Upon the launch of a re-designed website, we mounted a feedback survey and conducted usability testing.
Usability testing for redesign of library website.
User-centered website redesign.
Wayfinding Exercise: We conducted three wayfinding studies with a total of 10 participants, covering three distinct areas of a single library building. Each participant performed at least 10 tasks over the course of one hour. For each task, the participant was given a printout of a preselected OPAC item record and attempted to locate the item
on the shelf while the facilitator observed. Participants were also asked to locate amenities such as bathrooms and copy
machines, and completed a survey following the tasks.
We administer LibQUAL+® every two years to capture user perceptions of library service quality by asking questions in three “dimensions”: Affect of Service, Library as Place, and Information Control. Survey results provide a snapshot of user perceptions of service levels (minimum acceptable, desired, and perceived) at a particular point in time.
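The three service-level ratings collected for each item are typically summarized as gap scores. As a minimal sketch (the response data below is hypothetical, not from any survey cited here), the standard adequacy and superiority gaps can be computed as:

```python
# Sketch of LibQUAL+-style gap scoring for a single survey item.
# Respondents rate three service levels on a 1-9 scale:
# minimum acceptable, desired, and perceived. Two gap scores follow:
#   adequacy gap    = perceived - minimum  (positive means service is at least adequate)
#   superiority gap = perceived - desired  (usually negative; closer to zero is better)

from statistics import mean

# Hypothetical responses for one item: (minimum, desired, perceived)
responses = [
    (5, 8, 6),
    (4, 7, 5),
    (6, 9, 7),
]

adequacy_gap = mean(p - m for m, d, p in responses)
superiority_gap = mean(p - d for m, d, p in responses)

print(f"adequacy gap:    {adequacy_gap:+.2f}")   # perceived vs. minimum
print(f"superiority gap: {superiority_gap:+.2f}")  # perceived vs. desired
```

With these sample ratings the adequacy gap is +1.00 (perceived service exceeds the minimum) and the superiority gap is -2.00 (perceived service falls short of the desired level).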
We administered LibQUAL+® in fall 2010 for the first time. We also surveyed faculty for their rating of liaison services.
We routinely assess instruction. Other recent surveys include MINES® and ClimateQUAL®.
We conducted a series of observations (remotely) with students conducting research for a class assignment to see how
they used library resources (or not!) in an unmediated setting. We did not identify ourselves as the library so as not to
influence their behavior. We’ve completed a pilot phase and have plans to expand it in the fall.
We conducted an ethnographic research study using surveys and interviews to study how undergraduate and graduate
students and faculty were using the existing Rutgers University Libraries Web interface to conduct online research and
compose papers and reports.
We examined the use of our central search and discovery interface that resides on our library home page. Currently, we
use a tabbed system where the user must select which tool they want to use, such as the catalog, e-journals, databases,
or article search. The goal was to determine which tabs were seen as most useful, as well as whether the presence or
number of tabs was confusing. This spawned a second project in which we investigated the use and effectiveness of our federated article search interface, looking at use statistics as well as conducting user interviews. For both studies we used Morae software, filmed the participants, and presented results to the larger library community.