Paper ID: 2307.02131

Beyond Known Reality: Exploiting Counterfactual Explanations for Medical Research

Toygar Tanyel, Serkan Ayvaz, Bilgin Keserci

The field of explainability in artificial intelligence (AI) has witnessed a growing number of studies and increasing scholarly interest. However, the lack of human-friendly, individualized interpretations of machine learning outcomes has significantly hindered the acceptance of these methods by clinicians in their research and clinical practice. To address this issue, our study uses counterfactual explanations to explore the applicability of "what if?" scenarios in medical research. Our aim is to expand, beyond existing boundaries, the understanding of magnetic resonance imaging (MRI) features used in diagnosing pediatric posterior fossa brain tumors. In our case study, the proposed concept provides a novel way to examine alternative decision-making scenarios, offering personalized and context-specific insights that enable the validation of predictions and the clarification of variations under diverse circumstances. Additionally, we explore the potential use of counterfactuals for data augmentation and evaluate their feasibility as an alternative approach in our medical research case. The results demonstrate the promising potential of counterfactual explanations to enhance the acceptance of AI-driven methods in clinical research.

Submitted: Jul 5, 2023
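
To make the "what if?" idea concrete, the sketch below shows a toy counterfactual search on synthetic tabular data: a classifier is trained, and a single instance is greedily perturbed, one feature at a time, until the predicted class flips. This is only a minimal illustration of the general concept; the feature names, model, and search strategy are illustrative assumptions and do not reproduce the paper's data or method.

```python
# Minimal, illustrative "what if?" counterfactual search on synthetic data.
# Feature names, model, and search strategy are hypothetical, not the paper's pipeline.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for tabular MRI-derived features (hypothetical names).
feature_names = ["adc_mean", "t2_signal", "tumor_volume", "enhancement_ratio"]
X, y = make_classification(n_samples=300, n_features=4, n_informative=3,
                           n_redundant=0, random_state=0)

clf = LogisticRegression().fit(X, y)

def find_counterfactual(x, model, step=0.05, max_iter=500):
    """Greedily nudge one feature at a time until the predicted class flips."""
    x_cf = x.copy()
    original_class = model.predict(x.reshape(1, -1))[0]
    for _ in range(max_iter):
        if model.predict(x_cf.reshape(1, -1))[0] != original_class:
            return x_cf  # prediction flipped: counterfactual found
        # Try each single-feature perturbation; keep the one that most
        # increases the probability of the opposite class.
        best_candidate, best_gain = None, -np.inf
        for j in range(len(x_cf)):
            for direction in (+step, -step):
                candidate = x_cf.copy()
                candidate[j] += direction
                gain = model.predict_proba(candidate.reshape(1, -1))[0, 1 - original_class]
                if gain > best_gain:
                    best_candidate, best_gain = candidate, gain
        x_cf = best_candidate
    return None  # no counterfactual found within the budget

x_query = X[0]
x_cf = find_counterfactual(x_query, clf)
if x_cf is not None:
    for name, orig, cf in zip(feature_names, x_query, x_cf):
        if not np.isclose(orig, cf):
            print(f"What if {name} changed from {orig:.2f} to {cf:.2f}? -> predicted class flips")
```

In practice, dedicated counterfactual-explanation libraries provide more principled generation (e.g., enforcing plausibility and sparsity constraints) than this greedy toy search.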