Survey Results: Survey Questions and Responses
Difficult to combine data/statistics from different sources without a lot of manual labor
Difficult to describe/present the outcomes of data analysis without spreadsheets
Not having good benchmarking data for meaningful comparisons
Difficulty parsing data from MARC bibliographic records
Each assessment project differs in the type of data it requires. Therefore, although we collect a lot of
data, we don’t typically have precisely what is needed for a particular project. This means that we often
have to gather or reorganize data in a different way for each project, which is time-consuming and
labor-intensive, and cuts down on the amount of assessment we can undertake.
Although our assessment projects nearly always contribute in a general way to an enhanced
understanding of how our collection is perceived and used by our faculty and students, it can be difficult
to turn that understanding into specific actions.
Electronic resource usage data that is not COUNTER compliant
Insufficient staff to manage collections data
Insufficient time for analysis and also for training liaisons in analysis
Funding
Space
Gathering use data can be difficult if the vendor does not supply COUNTER reports. The analysis is
sometimes complicated by messy data and incomplete title lists.
In the context of a very large research collection, overlap analysis is a key aspect of collections
assessment projects, and this is a resource intensive undertaking.
Since our institution is so large, even the smallest collections receive high use.
Getting selectors actively engaged. Have not overcome that challenge.
Data: there’s both a lot and not enough. Bad records make it impossible to do good comparisons. Have
not overcome that challenge.
Staff using the data that is gathered to make decisions. Have not overcome that challenge.
Having the time to analyse the data
Being able to have meaningful reports about e-resources usage
Moving to a culture of evidence-based decisions with the data to assist/drive collection development
when we know past decisions resulted in collections where 80% weren’t used.
Historical lack of assessment
Historical difficulty acquiring raw data from vendors, ILS department
Inconsistent data
Lack of time and dedicated staff
Large quantity of data