Seminar
The event has passed

Case-based reasoning in AI: interpretability, explanations, and sharing best practices

Prof. Nirmalie Wiratunga is visiting Chalmers in connection with the CHAIR theme Interpretable AI, and will give a joint CHAIR and DSAI seminar.

Overview


Agenda

14:00-15:00: Seminar

15:00-15:45: Fika in the seminar room, with an opportunity to discuss the seminar topic with Nirmalie Wiratunga and other colleagues.

Abstract, Nirmalie Wiratunga:

In this talk I will present an overview of Case-Based Reasoning (CBR), with particular focus on how its reliance on similar cases makes it inherently interpretable, as evidenced by recent case studies from the AI & Reasoning Research Group at RGU.

CBR's use of case similarity makes it well suited as an explainer method for black box models. Specifically, we will look at recent work in post-hoc exemplar-based explanations that use CBR for factual, semi-factual and counterfactual explanations.
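The retrieval idea behind exemplar-based explanations can be sketched in a few lines. The case base, features, labels, and distance function below are illustrative assumptions for exposition, not the actual method or data used by the AI & Reasoning group; semi-factuals are omitted for brevity:

```python
# Illustrative sketch of exemplar-based explanation via case similarity.
# A "case base" is a set of labelled examples; explanations are retrieved
# neighbouring cases rather than feature attributions.
import math

def euclidean(a, b):
    """Plain Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def explain(query, predicted_label, case_base):
    """Return (factual, counterfactual) exemplars for a query.

    factual: the nearest case sharing the model's predicted label,
             offered as a "because it resembles this" explanation.
    counterfactual: the nearest case with a different label, suggesting
             the smallest change that would flip the outcome.
    """
    same = [c for c in case_base if c[1] == predicted_label]
    diff = [c for c in case_base if c[1] != predicted_label]
    factual = min(same, key=lambda c: euclidean(query, c[0])) if same else None
    counterfactual = min(diff, key=lambda c: euclidean(query, c[0])) if diff else None
    return factual, counterfactual

# Toy case base: (features, label) pairs, e.g. (hours studied, prior score).
cases = [((2.0, 40.0), "fail"), ((3.0, 55.0), "fail"),
         ((6.0, 70.0), "pass"), ((8.0, 85.0), "pass")]
factual, counterfactual = explain((5.0, 60.0), "pass", cases)
```

Here the factual exemplar is the nearest passing case, ((6.0, 70.0), "pass"), and the counterfactual is the nearest failing case, ((3.0, 55.0), "fail") — both are existing cases, which is what makes this style of explanation inherently interpretable.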

I will also provide an overview of the EU CHIST-ERA-funded iSee project, showcasing CBR's application as a platform for recommending explanation strategies for black box models and for sharing best practices. Specifically, I will discuss how CBR facilitates the sharing and reuse of user explanation experiences, by users for users.

Bio:

Nirmalie Wiratunga is a Professor in Intelligent Systems at RGU's School of Computing and the school's Associate Dean for Research, with over two decades of experience in computer science and AI research. She began as a post-doctoral researcher on EPSRC-funded projects and was appointed to a Readership in 2009 and a Professorship in 2016. She is also an adjunct IDUN professor at the Norwegian University of Science and Technology.

Nirmalie Wiratunga leads the Artificial Intelligence & Reasoning Research Group (AIR) in the School of Computing. She has been involved in numerous funded AIR projects, including the development of platforms for reusable explainable AI experiences and initiatives in healthcare. Additionally, she co-founded Attendr.app, a spinout for mobile attendance tracking of students at the university and of attendees at conferences.


Interpretable AI

Interpretable AI is an emerging field, focused on developing AI systems that are transparent and understandable to humans.