TY - JOUR
T1 - iSee
T2 - 30th International Conference on Case-Based Reasoning Workshop, ICCBR-WS 2022
AU - Martin, Kyle
AU - Wijekoon, Anjana
AU - Wiratunga, Nirmalie
AU - Palihawadana, Chamath
AU - Nkisi-Orji, Ikechukwu
AU - Corsar, David
AU - Díaz-Agudo, Belén
AU - Recio-García, Juan A.
AU - Caro-Martínez, Marta
AU - Bridge, Derek
AU - Pradeep, Preeja
AU - Liret, Anne
AU - Fleisch, Bruno
N1 - Publisher Copyright:
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). CEUR Workshop Proceedings (CEUR-WS.org)
PY - 2022
Y1 - 2022
N2 - The right to an explanation of the decision reached by a machine learning (ML) model is now an EU regulation. However, different system stakeholders may have different background knowledge, competencies and goals, thus requiring different kinds of explanations. There is a growing armoury of XAI methods, interpreting ML models and explaining their predictions, recommendations and diagnoses. We refer to these collectively as "explanation strategies". As these explanation strategies mature, practitioners gain experience in understanding which strategies to deploy in different circumstances. What is lacking, and what the iSee project will address, is the science and technology for capturing, sharing and re-using explanation strategies based on similar user experiences, along with a much-needed route to explainable AI (XAI) compliance. Our vision is to improve every user's experience of AI by harnessing experiences of best practice in XAI and providing an interactive environment where personalised explanation experiences are accessible to everyone.
AB - The right to an explanation of the decision reached by a machine learning (ML) model is now an EU regulation. However, different system stakeholders may have different background knowledge, competencies and goals, thus requiring different kinds of explanations. There is a growing armoury of XAI methods, interpreting ML models and explaining their predictions, recommendations and diagnoses. We refer to these collectively as "explanation strategies". As these explanation strategies mature, practitioners gain experience in understanding which strategies to deploy in different circumstances. What is lacking, and what the iSee project will address, is the science and technology for capturing, sharing and re-using explanation strategies based on similar user experiences, along with a much-needed route to explainable AI (XAI) compliance. Our vision is to improve every user's experience of AI by harnessing experiences of best practice in XAI and providing an interactive environment where personalised explanation experiences are accessible to everyone.
KW - Case-Based Reasoning
KW - Explainability
KW - Project Showcase
UR - https://www.scopus.com/pages/publications/85159775368
M3 - Article
AN - SCOPUS:85159775368
SN - 1613-0073
VL - 3389
SP - 231
EP - 232
JO - CEUR Workshop Proceedings
JF - CEUR Workshop Proceedings
Y2 - 12 September 2022 through 15 September 2022
ER -