SPEC Kit 352: Collection Assessment
Inconsistent data or bad match points. We have used OCLC and other APIs to attempt to make data
more consistent.
Getting data in the first place. Persistence on a case-by-case basis was our only solution.
Integrating data from different sources: overcome with analytical programming skills.
Presenting data from multiple sources in a manner that allows faculty input in collections review:
overcome by combining skill sets across departments.
Forecasting physical collection growth: overcome with analytical programming skills.
It is difficult to collect and analyze data produced from many different sources.
Lack of reliable unique identifiers across data sources makes it difficult to bring data together
for analysis.
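The match-point problem these respondents describe can be illustrated with a short sketch: normalizing raw ISBN strings to ISBN-13 so that records from different systems share one comparable key. The function name and the choice of ISBN-13 as the canonical form are assumptions for illustration, not a method prescribed by any respondent.

```python
def normalize_isbn(raw):
    """Strip punctuation and convert ISBN-10 to ISBN-13 to create one
    reliable match point across systems (an illustrative sketch)."""
    # Keep only digits and the ISBN-10 check character 'X'.
    s = "".join(c for c in raw.upper() if c.isdigit() or c == "X")
    if len(s) == 13:
        return s
    if len(s) == 10:
        # ISBN-13 = "978" + first nine ISBN-10 digits + new check digit,
        # computed with alternating weights of 1 and 3.
        core = "978" + s[:9]
        total = sum(int(d) * (1 if i % 2 == 0 else 3)
                    for i, d in enumerate(core))
        return core + str((10 - total % 10) % 10)
    return None  # no usable identifier; the record needs manual matching
```

Records that return `None` are exactly the ones respondents report handling case by case.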
Difficult to extract and interpret data from our largest systems.
Information in our systems (e.g., ILLiad, our ILS, and OCLC) is messy and inconsistent. Springer's
breach this spring calls into question the veracity of COUNTER stats from vendors.
Lack of time and people is a challenge, so we will shortly be recruiting for an assessment librarian who
will work on collections and other types of assessment.
We have cut collections and lost purchasing power over the last decade, so we are focusing on defining
what core resources are needed and how much money is needed for the core.
Lack of skill in using Access and other programs is a personal challenge.
Lack of time to do a proper assessment
Difficulty in obtaining data/reports needed
Staff don’t always have sufficient expertise with Excel to analyze data
Limited access to data: circulation and in-house use
Migration to a new ILS affected the comparability of our data reporting.
Comparability between print and electronic measures
Not always having a standard identifier in the record, for example an ISBN or ISSN
Negotiating the political dimension of assessment; for example, some users believe assessment
decisions reflect the extent to which the library values and supports particular fields of research. We
try to address this through clear and abundant communication and through transparency.
Communicating the complexities of purchasing models and restrictions to decision-makers and other
stakeholders. We’ve tried to address this through the creation of glossaries of terms and very careful
and thoughtful contextual details related to resources under evaluation.
Messy data. Oftentimes, the data used to inform assessment decisions is messy, inconsistent,
problematic, and full of caveats.