TY - CHAP
T1 - iSee
T2 - 27th European Conference on Artificial Intelligence, ECAI 2024
AU - Wijekoon, Anjana
AU - Wiratunga, Nirmalie
AU - Corsar, David
AU - Martin, Kyle
AU - Nkisi-Orji, Ikechukwu
AU - Palihawadana, Chamath
AU - Caro-Martínez, Marta
AU - Díaz-Agudo, Belen
AU - Bridge, Derek
AU - Liret, Anne
N1 - Publisher Copyright:
© 2024 The Authors.
PY - 2024/10/16
Y1 - 2024/10/16
N2 - Explainable AI (XAI) can greatly enhance user trust and satisfaction in AI-assisted decision-making processes. Recent findings suggest that a single explainer may not meet the diverse needs of multiple users in an AI system; indeed, even individual users may require multiple explanations. This highlights the necessity for a “multi-shot” approach, employing a combination of explainers to form what we introduce as an “explanation strategy”. Tailored to a specific user or user group, an “explanation experience” describes interactions with personalised strategies designed to enhance their AI decision-making processes. The iSee platform is designed for the intelligent sharing and reuse of explanation experiences, using Case-based Reasoning to advance best practices in XAI. The platform provides tools that enable AI system designers, i.e. design users, to design and iteratively revise the most suitable explanation strategy for their AI system to satisfy end-user needs. All knowledge generated within the iSee platform is formalised by the iSee ontology for interoperability. We use a summative mixed methods study protocol to evaluate the usability and utility of the iSee platform with six design users across varying levels of AI and XAI expertise. Our findings confirm that the iSee platform effectively generalises across applications and highlight its potential to promote the adoption of XAI best practices.
AB - Explainable AI (XAI) can greatly enhance user trust and satisfaction in AI-assisted decision-making processes. Recent findings suggest that a single explainer may not meet the diverse needs of multiple users in an AI system; indeed, even individual users may require multiple explanations. This highlights the necessity for a “multi-shot” approach, employing a combination of explainers to form what we introduce as an “explanation strategy”. Tailored to a specific user or user group, an “explanation experience” describes interactions with personalised strategies designed to enhance their AI decision-making processes. The iSee platform is designed for the intelligent sharing and reuse of explanation experiences, using Case-based Reasoning to advance best practices in XAI. The platform provides tools that enable AI system designers, i.e. design users, to design and iteratively revise the most suitable explanation strategy for their AI system to satisfy end-user needs. All knowledge generated within the iSee platform is formalised by the iSee ontology for interoperability. We use a summative mixed methods study protocol to evaluate the usability and utility of the iSee platform with six design users across varying levels of AI and XAI expertise. Our findings confirm that the iSee platform effectively generalises across applications and highlight its potential to promote the adoption of XAI best practices.
UR - https://www.scopus.com/pages/publications/85216660296
U2 - 10.3233/FAIA241057
DO - 10.3233/FAIA241057
M3 - Chapter
AN - SCOPUS:85216660296
T3 - Frontiers in Artificial Intelligence and Applications
SP - 4626
EP - 4633
BT - ECAI 2024 - 27th European Conference on Artificial Intelligence, Including 13th Conference on Prestigious Applications of Intelligent Systems, PAIS 2024, Proceedings
A2 - Endriss, Ulle
A2 - Melo, Francisco S.
A2 - Bach, Kerstin
A2 - Bugarin-Diz, Alberto
A2 - Alonso-Moral, Jose M.
A2 - Barro, Senen
A2 - Heintz, Fredrik
PB - IOS Press BV
Y2 - 19 October 2024 through 24 October 2024
ER -