Project

Towards the next generation of interactive and adaptive explanation methods

Despite the long history of work on explanations in the Machine Learning, AI, and Recommender Systems literature, current efforts face unprecedented difficulties: contemporary models are more complex and less interpretable than ever. As such models are used in many day-to-day applications, justifying their decisions to non-expert users with little or no technical knowledge will only become more crucial. Although several explanation methods have been proposed, little work has been done to evaluate whether they indeed enhance human interpretability. Moreover, many existing methods are static and require significant technical expertise to use. Several researchers have voiced the need for interaction with explanations as a core requirement for supporting understanding. In this project, we will research the next generation of model-agnostic explanation methods, tailored to the needs of non-expert users. We will also investigate how these explanation methods can be combined into more powerful explanation interfaces, and how such interfaces can be adapted on the fly to different personal and situational characteristics.

Date: 1 Jan 2021 → Today
Keywords: interactive and adaptive explanation methods
Disciplines: Other computer engineering, information technology and mathematical engineering not elsewhere classified