“(1) model-agnostic, so it can be applied to any black box model
(2) logic-based, so that explanations can be made comprehensible to humans with diverse expertise, and support their reasoning
(3) both local and global, so it can explain both individual cases and the overall logic of the black-box model
(4) high-fidelity, so it provides a reliable and accurate approximation of the black box behavior.”11

A more holistic view would include explanations that consider the data used for training and decision-making, the computational environment utilized, the context of the algorithmic design and deployment, and those responsible for its operation and use (that is, a sociotechnical analysis).12 Technical explanations are required for those involved in system design and performance testing, while accessible explanations are needed for those affected by algorithms. In the latter context, a good explanation is contrastive (“why P, not Q?”), selective (only certain evidence is required, not a complete explanation), and social (a dialogue: interactive and contextual).13

As academic libraries increasingly acquire and develop algorithmic decision-making systems and services in support of scholarly communications and the operation of the library, they must do so in a manner that insists on interpretability and explanation. To do anything less is an unknowing delegation to technology and an abrogation of scholarly rigor.

XAI Strategies, Techniques, and Processes

Approaches to XAI can be broadly categorized as proofs, validations, and authorizations. Within these categories are numerous explanatory practices, which are contextual, system or model dependent, and audience specific.
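As one concrete illustration of such a practice, the short Python sketch below shows what a model-agnostic, local explanation can look like: it perturbs one feature of a single input at a time and records how much an opaque model's output changes. The black_box_predict function, the weights, and the feature values are hypothetical placeholders rather than a reference to any particular library system; this is a minimal sketch of the perturbation idea under those assumptions, not a definitive implementation.

```python
import numpy as np

def black_box_predict(x):
    """Hypothetical stand-in for an opaque model's scoring function
    (for example, a relevance or risk scorer whose internals are hidden)."""
    weights = np.array([0.6, -0.3, 0.9, 0.1])
    return 1.0 / (1.0 + np.exp(-x @ weights))

def local_feature_attribution(predict, instance, n_samples=500, scale=0.1, seed=0):
    """Model-agnostic, local explanation: perturb one feature at a time and
    measure the average change in the black-box prediction."""
    rng = np.random.default_rng(seed)
    baseline = predict(instance)
    attributions = np.zeros(len(instance))
    for j in range(len(instance)):
        perturbed = np.tile(instance.astype(float), (n_samples, 1))
        perturbed[:, j] += rng.normal(0.0, scale, size=n_samples)
        scores = np.array([predict(row) for row in perturbed])
        # Larger mean change means feature j matters more for this case.
        attributions[j] = np.mean(np.abs(scores - baseline))
    return attributions

# Explain a single (hypothetical) case: the printed values indicate which
# features most influence the model's output near this particular input.
instance = np.array([1.0, 0.5, -0.2, 2.0])
print(local_feature_attribution(black_box_predict, instance))
```

Because the procedure queries only the model's inputs and outputs, the same approach applies to any black-box system, which is what "model-agnostic" means in practice; the result is also local, in that it explains one case rather than the model's overall logic.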