Abstract
eXplainable Artificial Intelligence (XAI) has become an important field in Computer Science due to the need to understand increasingly complex Artificial Intelligence (AI) systems and algorithms. Consequently, a wide variety of explanation techniques (explainers) can be found in the literature, built on top of several XAI libraries. The challenge faced by XAI designers is deciding which explainers are the most suitable for each scenario, taking into account the AI model, the task to explain, and the user's preferences, needs, and knowledge, and, overall, fitting the explanation requirements. To address this problem, the iSee project was conceived to provide XAI designers with supporting tools to build their own explanation experiences. As a result, we have developed iSee, a Case-Based Reasoning-driven platform that allows users to create personalised explanation experiences. With the iSee platform, users specify their explanation experience requirements and obtain the most suitable XAI strategies for their situation, taking advantage of XAI strategies previously applied with success in similar contexts. The iSee platform comprises several tools and modules: the ontology, the cockpit, the explainer library, the Explanation Experiences Editor (iSeeE3), the chatbot, and the analytics dashboard. This paper introduces these tools as a demo and tutorial for current and future users and for the XAI community.
| Original language | English |
|---|---|
| Pages (from-to) | 313-320 |
| Number of pages | 8 |
| Journal | CEUR Workshop Proceedings |
| Volume | 3793 |
| Publication status | Published - 2024 |
| Event | Joint of the 2nd World Conference on eXplainable Artificial Intelligence Late-Breaking Work, Demos and Doctoral Consortium (xAI-2024: LB/D/DC), Valletta, Malta, 17 Jul 2024 → 19 Jul 2024 |
Keywords
- Case-Based Reasoning
- Evaluation Cockpit
- Explainer Library
- Explanation Experiences Editor
- Personalised Explanation Experiences
- XAI Chatbot
- XAI Ontology