12 · Survey Results: Executive Summary
campus groups (students and faculty) to better understand user needs and information-seeking behaviors as discovery systems and collections continue to be amalgamated, redesigned, and/or acquired.
Strategic planning, website usability, and
OPAC usability testing.
We plan an observational study of our library spaces in spring 2011, and an ethnographic study of how scholarly methods are changing due to new technologies and formats, also in spring 2011.
We will be starting a summer study of how
researchers do their scholarly work, with
a special emphasis on data management
needs.
The survey asked respondents to select up to two
user experience activities the library had recently
undertaken that had the biggest impact or were most
innovative. They were then asked a set of questions
about those activities. They described 121 different
activities. Many respondents reported on activities
to solicit user input related to building renovation
and redesign. Other UX projects included assessing
the OPAC, user input regarding access to electronic
resources, and general website usability.
Respondents were asked to describe techniques
and tools they used to gather user input. The most
frequently mentioned tool was surveys. The simplest
were homegrown instruments that were printed and
distributed in libraries or that were created using web
survey sites. The most commonly mentioned survey
tool was LibQUAL+® or a variant such as LibQUAL+®
Lite. Many respondents indicated they regularly use
LibQUAL+® every two to three years, creating a set
of longitudinal data. A number of respondents also
noted that they employ LibQUAL+® to identify broad
areas of user concern and then utilize focus groups or
targeted surveys to further understand those areas
of concern.
Combined, the passive techniques of gathering anecdotal user comments or suggestions, received physically or online, were the second most frequently mentioned form of user input. Nearly two-thirds of the examples cited by respondents incorporated this type of feedback at some point in the data collection process.
Half of the UX activities used focus groups, and a third employed some form of usability testing. The latter technique was used primarily for redesigning websites. As might be expected, more labor-intensive techniques, such as individual interviews and observations, were not cited as frequently; their use was noted in ten and five percent of the responses, respectively.
For approximately half of the examples, respondents used a combination of open recruitment and direct invitations to solicit participants for feedback. A quarter used open recruitment only, and the other quarter used direct invitation only. The survey data indicate that libraries used a variety of techniques to recruit participants. The most frequently mentioned was e-mail, closely followed by an invitation on the library’s web page or personal contact from a library employee. More than half of the respondents used all three of these approaches. Around a quarter of the respondents used social media tools, and a like number used in-house media, such as a library newsletter, in their recruitment.
Libraries planning to recruit feedback participants should budget for some type of incentive, as over 70% of respondents indicated that they provided incentives. The most common incentives were food and gift cards. Nearly three-quarters of the respondents indicated that the costs associated with their feedback projects were borne by the library’s operating budget; the remainder were financed by library foundation funds or special, one-time funding such as a grant.
Funds spent on soliciting user feedback seemed to generate a high return on investment: 43% of respondents noted that the feedback led to a complete redesign of, or major modifications to, library services or spaces. Another 39% noted that the feedback led to minor modifications to existing services or spaces. For nearly 90% of the projects mentioned, libraries reported feedback results to important constituencies, such as users and library administration and staff.
Also, many respondents indicated that they share
survey results and other products of user experience
activities in written form with institutional governing
bodies. Examples include: