• Library of Congress Executive Committee and Management
• Data used in budget presentations to the President’s Executive Team
• Campus Renovation Committee
• Senior levels of the university administration via the library’s annual report
• The Learning Commons design process mentioned in the annual report and in the faculty newsletter
• Institutional Research Planning
Some respondents also indicated they share results within the library community via conference presentation and publication. For example:
• Conference presentations (IUG, ALA Annual, and possibly IFLA) as well as an intended article for Library Trends
• Publishing the results more broadly, e.g., in an academic article
• Communicating to the broader academic library community through conference presentations
A smaller number indicated they share results with the general user community via more widespread and public means such as social media, posting results on websites, and open forums.
Organizational Structure
Several questions in the survey sought information on how libraries organized activities and staffed positions related to assessment and, more specifically, the user experience. Nearly all respondents indicated that their library at least periodically conducts assessment activities, but a surprising number indicated no formal assessment structure in their organization. Most respondents indicated that assessment activities were often ad hoc and conducted by one or more library units that hoped to benefit from the particular information sought. Still, half of the respondents reported a dedicated Assessment Coordinator position, and a quarter identified a dedicated position focusing on user experience. Based on respondent comments, one might expect a future upward trend for these types of positions. Numerous comments alluded to new or recently revitalized assessment efforts and new organizational structures and personnel to support such programs. The comments also indicated a very broad and growing awareness of the need to have activities focused solely on measuring and improving user experience. Indeed, while many respondents noted that user experience efforts were but one component of a broader assessment program, the importance of the user experience component appears to be growing substantially. One particularly appropriate comment demonstrating this trend is the following:
(UX activities) are the heart of our assessment activities. Most of our other “assessment” activities are merely keeping statistics about usage and involve very little actual assessment at this point in time.
As noted above, many of the responding libraries do not currently have one person dedicated to coordinating an assessment or user experience program. An inherent danger in not having a coordinator is the potential lack of a consistent message or brand in this area. In general, though, responding libraries seem to have some awareness of this issue and have assigned fairly high-level supervision of these activities. When asked to name who in their library has primary oversight of user experience activities, libraries that do not have dedicated user experience and/or assessment coordinators routinely indicated oversight by a department head-level position or by someone at the associate dean/AUL level. When asked to whom this coordinator reports, over three-quarters of the respondents indicated the coordinator reported to someone at the dean or associate dean level.
Strategic Planning
While the survey did not include a specific question about it, a number of respondents referred to the library strategic plan or planning process. Several comments noted how user experience activities, or assessment activities in a broader context, provided input into their most recent strategic plan. Two respondents specifically mentioned the use of focus groups for user input,