Survey Results: Executive Summary
There appears to be no universal service
model for scholarly output assessment services. The
majority of respondents reported that services are
provided informally on an ad hoc basis rather than
in a coordinated fashion within the organizational
structure of the library. As one commented, “It is a ‘toe
in the water,’ not a fully developed service.” The ser-
vice model for scholarly output assessment services
appears to be in the initial phases of development
and perhaps represents a promising indicator of an
emergent model, “a rapidly growing area for librar-
ies,” as one respondent noted. Others commented
that, “Assessment will be a priority as it develops in
areas of our new organizational structure” and “We
recognize the importance of services in this area.”
Some respondents also reported plans to “develop
a more well-defined set of services in this area” and
to hire new staff devoted to scholarly output assess-
ment services.
Training
The majority of responding libraries (49 or 64%) cur-
rently provide training related to scholarly output
assessment. Three reported that training is in devel-
opment, and 18 others are considering it. Training
includes classes, workshops, informal one-on-one
training sessions, drop-in sessions, brown-bag ses-
sions, special events, and “one-on-one conversations
with faculty.” Some training is offered on a regular
basis; other sessions are held ad hoc as users request them. Only
seven respondents (9%) have no plans to offer this
type of training. One respondent noted that “a more
integrated approach is planned for development in
FY16 planning cycle.”
A wide variety of course titles was reported:
Article Level Metrics; Building Your Academic
Profile; Citation Analysis; Citation Management;
Collaboration; Communicating Research; Digital
Humanities; Data Management; Determining Your
Scholarly Impact; Scholarly Impact: Traditional
and Alternative Metrics; Basics of Citation Metrics;
Impact Measurements; MyResearch graduate series;
SCOPUS: A Tool for Authors; Enhancing the Visibility
and Impact of Your Research; Who is Citing Your
Work?; Journal Impact Factors and Citation Analysis;
Measuring Your Scholarly Impact; Library Tools for
the Publication Cycle; to name a few. (See Q11 in the
Survey Questions & Responses section for others.)
Content descriptions for training included “high-
lighting one or a mix of the following: overview of
bibliometrics/altmetrics, h-index and Eigenfactor,
Scopus and Web of Science comparison, Google
Scholar, and InCites” and the “significance of h-index
for scholarly output assessment.” One description
of a workshop included learning outcomes: “This
hands-on and practical workshop will focus on the
three areas of article, author, and journal assessments.
Participants will become familiar with different multi-
faceted citation analysis using a variety of metrics and
their implications.”
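For readers unfamiliar with the h-index mentioned in these training descriptions: a researcher's h-index is the largest number h such that h of their papers have each received at least h citations. A minimal sketch of the computation in Python (the function name and the sample citation counts are illustrative, not drawn from the survey):

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    # Sort citation counts from highest to lowest.
    sorted_counts = sorted(citations, reverse=True)
    h = 0
    # The h-index is the largest rank i at which the i-th paper
    # still has at least i citations.
    for i, count in enumerate(sorted_counts, start=1):
        if count >= i:
            h = i
        else:
            break
    return h

# Example: five papers with 10, 8, 5, 4, and 3 citations yield h = 4,
# since four papers have at least 4 citations each.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Tools such as Web of Science, Scopus, and Google Scholar report this metric directly; the sketch simply shows what the number means.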
Training is provided to faculty, students, research-
ers, and administrative staff. Some specific target au-
diences reported by respondents include media rela-
tions staff, graduate students, research coordinators,
and early-stage faculty. Some training efforts are also
tailored for specific areas of study such as science,
health science, humanities, and education.
Software and Resources
Survey respondents recommend a variety of scholarly
output assessment software and related resources
(subscription and free) to library users. The most fre-
quently recommended resources are bibliographic
citation databases, such as Web of Science, Google
Scholar, and Scopus, and resources that provide jour-
nal metrics, such as Journal Citation Reports. Some re-
spondents reported recommending or using resources
that capture non-citation data such as ImpactStory (36
respondents), Altmetric.com (30 respondents), and
Plum Analytics (7 respondents plus another 22 that
are considering it). A few respondents recommend
visualization software, such as NodeXL, Tableau, Sci2,
Gephi, and Wordle. Forty-six respondents (61%) re-
ported that they do not do cost sharing for subscrip-
tion resources. Twenty-nine (39%) reported sharing
costs with campus administration units such as the
Office of the Provost, Office of Research, or the Office
of Institutional Analysis.
Staffing
The survey asked respondents to list job titles for li-
brarians involved with scholarly output assessment