As a means of preserving life and restoring capacities, these efforts can enable patients to regain speech and motor function, for instance. Even this medical context for enhancement is generating well-warranted concerns about ableism and eugenics, particularly as the meaning of a “normal body” or “normal capacity” is reshaped by this technology. It seems unlikely on ethical grounds, however, that such valid concerns will lead us to deny everyone even the possibility of regaining the ability to walk, or of having impaired vision or hearing restored, as these technologies continue to advance. All of this means that technological innovation is on pace to reshape the future of humanity in deeply consequential ways, including at the foundational level of what it means to be (a) human.

What Should AI Ethics Look Like?

As our global society increasingly recognizes that technology is not merely technical but also societal and human-oriented, new opportunities are opening for humanists to lead the efforts most likely to shape the future of society. The University of Oxford announced with great fanfare in June 2019 that Blackstone CEO Stephen Schwarzman had given more than $188 million to fund a humanities center housing a new institute for AI ethics. The billionaire philanthropist had previously donated $350 million to MIT to create the institute-wide Schwarzman College of Computing, which will emphasize the role of the liberal arts and human sciences. In 2019, Stanford University launched a new institute harnessing university-wide efforts to support human-centered AI, placing at its helm a philosopher and a computer scientist. We are witnessing growing efforts to ensure that technology serves human interests through regulation, ethical frameworks, and more comprehensive education. “Public interest technology” is among the key growth areas devoted to ensuring that social justice and