Survey Results: Executive Summary
themselves are being innovative when seeking outside
funding streams to purchase cutting-edge tools.
Funding for ongoing maintenance and replacement of equipment follows a pattern very similar to that of initial purchase funding: most respondents depend on the general library and/or IT/systems budget. Funding from student tech fees drops to 25% of respondents, and funding from the parent institution’s IT/systems budget falls to 20%. As might be expected, grant funding and public/private partnerships drop off considerably after the initial purchase of equipment, and the parent institution or library takes over maintenance and repair. Two libraries use library fines and fees for maintenance and repair. One institution generates income from a “Distance Learning Library Services program.” One library hopes that, as some collaborative tools gain popularity across campus, the university administration will acquire a site license.
Only four libraries report charging fees for the use of collaborative teaching and learning tools. One institution charges unaffiliated users a fee to use some equipment and rooms. At one library, late fees are $5 an hour for electronic equipment and $1 an hour for accessories. Another library charges a fee for late return of laptops ($20/hour, up to a maximum of $200). While these institutions charge no up-front fees to affiliated users, the fines collected when users fail to adhere to use policies and due dates for electronic equipment can be seen as an additional revenue stream for equipment purchase, maintenance, and repair.
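To make the reported laptop rule concrete, the capped fee works out to the hours late multiplied by the hourly rate, truncated at the maximum. The short Python sketch below is our own illustration of that arithmetic, not a respondent’s system; the $20/hour rate and $200 cap come from the survey response, while the function name and the decision to round partial hours up are assumptions the respondent did not specify.

    import math

    def laptop_late_fee(hours_late: float) -> int:
        """Sketch of the capped late-fee rule one library reported:
        $20 per hour for a late laptop, up to a maximum of $200.
        Rounding partial hours up is our assumption; the survey
        does not say how partial hours are billed."""
        RATE_PER_HOUR = 20   # dollars, from the survey response
        FEE_CAP = 200        # dollars, from the survey response
        billable_hours = math.ceil(max(hours_late, 0))
        return min(billable_hours * RATE_PER_HOUR, FEE_CAP)

    print(laptop_late_fee(3))   # $60
    print(laptop_late_fee(15))  # capped at $200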
Publicity and Evaluation
When offering a new service, libraries often publicize it through a variety of media, such as library websites, flyers, social networking sites, email, newsletters, and the campus newspaper. However, when asked how they promoted the availability of new collaborative teaching and learning tools in their libraries, respondents overwhelmingly relied on simple word of mouth (59 responses or 95%). Not far behind that response are announcements on the library website (56 or 90%), followed by mentions in library classes and tours (54 or 87%). Such seemingly passive promotion of a new service may be due to the technical support and steep learning curves associated with tools that may be deemed technologically advanced for library staff and users. Even a traditional method of promotion like signs and flyers (42 or 68%) ranks slightly ahead of “web 2.0” social networking methods like Facebook, Twitter, and YouTube (40 or 65%). Fewer than half of the respondents reported using email (30 or 48%), library newsletters (29 or 47%), or campus newspapers (16 or 26%), signifying much less reliance on these methods as a means of reaching more technologically advanced users. Open-ended responses indicated the use of various “digital signs,” e.g., electronic signs on campus or screen savers on workstations, to reach potential users. Three respondents relied on library outreach or liaisons to campus departments. Two libraries used institutional websites, while one had not yet started marketing initiatives.
Similar to the methods employed in publicity, assessment of the success of offering collaborative teaching and learning tools is largely informal in most of the responding libraries. Informal user feedback (57 or 93%) and tracking the number of uses of each tool (55 or 90%) are the two most common evaluation methods. Surprisingly, fewer than half indicated that they use formal surveys of users (26 or 43%), though an analysis of the “other” responses shows this number is misleading. Three libraries report using focus groups, two others use faculty surveys, one uses an “Opinions Survey,” and yet another relies on the library’s annual survey, all of which can be viewed as forms of formal user surveys. As a measure of user demand, the fourth most popular evaluation technique is tracking the number of requests for each tool (24 or 39%). Some libraries track the number of technical support requests for each tool as an evaluative measure (16 or 26%). One library has recently hired an “Assessment Librarian,” who they hope will be able to track and evaluate support for collaborative teaching and learning services. Interestingly, one library tracks “turn aways” (i.e., the number of users turned away from a service desk because all of the needed tools are checked out).
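No respondent described how these counts are actually kept. As a purely illustrative sketch, the four measures above (uses, requests, technical support requests, and turn-aways) can be tallied per tool with a few counters; everything in the Python snippet below, from the event names to the record function, is our own assumption rather than a reported practice.

    from collections import Counter

    # Hypothetical per-tool tally of the evaluation metrics respondents
    # described: uses, requests, support requests, and "turn aways."
    metrics = {
        "uses": Counter(),
        "requests": Counter(),
        "support_requests": Counter(),
        "turn_aways": Counter(),
    }

    def record(event: str, tool: str) -> None:
        """Record one occurrence of an event (e.g., a checkout) for a tool."""
        metrics[event][tool] += 1

    # Example: a camcorder is checked out twice, then a user is turned away.
    record("uses", "camcorder")
    record("uses", "camcorder")
    record("turn_aways", "camcorder")

    print(metrics["uses"]["camcorder"])        # 2
    print(metrics["turn_aways"]["camcorder"])  # 1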
Benefits and Challenges
Some of the most informative and thought-provoking comments in the survey come from the sections in which respondents were asked to list up to three benefits and three challenges associated with