8 Survey Results: Survey Questions and Responses

consider your library assessment work as being situated under this inclusive term when answering the survey questions.

1. Is your institution engaged in learning analytics projects/initiatives as defined above? N=53

Yes 43 81%
No 10 19%

Comments N=19

Answered Yes N=15

Based on my initiatives only. They do not do any of this relating to the library unless we initiate the project.

But only in ad hoc projects.

Here it is referred to as Learning Outcomes, although there is no formal office on campus doing this that we are aware of.

In a very rudimentary way, the university is engaged in learning analytics. I believe this is done mostly for reporting purposes at a very high level. I am not aware of college- or department-level analytics projects, except for those happening in my own Research and Instruction Services department.

In the context of university-wide assessment and accreditation, the university conducts an evaluation of student portfolios as well as writing samples. These are qualitative assessments, not quantitative.

Isolated discussions on the learning environment but nothing formal.

Some indicators are reported for the strategic theme of outstanding academic programming.

Mostly the User Experience (UX) Department, and the READ scale through the Research Instruction and Outreach (RIO) Department.

One-off projects to study things like the impact of the learning commons and embedded librarianship.

The Assessment & User Experience (AUX) Department, in collaboration with other departments, engages in learning analytics projects and initiatives. In the Libraries, assessment and analytics-based work is often tied to either library strategic initiatives and goals or department goals. We define assessment as a set of activities that allow us to understand our impact upon institutional outcomes, identify user needs and satisfaction, and improve collections and library services.
Other departments within the library pursue their own analysis of data. These efforts are often very contextual, designed to seek feedback for service or instruction improvement. One of the most robust examples is a yearlong, multi-modal learning assessment from the doctoral program in the School of Education. Students are assessed in various ways, such as after online synchronous workshops, email research consultations, and face-to-face orientations. Individual librarians also collect learning statistics through their instruction and outreach. Our library also administratively houses a department responsible for robust professional development for teaching faculty, and there is a great deal of assessment and analysis around this work, such as exploration of the short-term, mid-term, and long-term learning outcomes associated with the activities of this particular department. Their assessment and analytics are directed to the provost of the university and are not included in library service design, assessment, or collection assessment. For the purposes of this survey, this unit has not been included, since most programs like this at other institutions are housed administratively outside of the library.