Human Centered Explainable AI

Human-centered explainable AI (HCXAI) focuses on designing AI systems whose explanations are understandable, useful, and trustworthy to human users, addressing the "black box" problem of many machine learning models. Current research emphasizes user-centered design methodologies, exploring diverse explanation formats (visual, textual, conversational) and investigating how explanation accuracy and presentation affect human-AI collaboration and decision-making. By bridging the gap between technical sophistication and human understanding, this field is crucial for building trust and ensuring the responsible deployment of AI in high-stakes domains such as healthcare, finance, and industrial settings.
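To make the idea of a textual explanation format concrete, here is a minimal, illustrative sketch of one common pattern: ranking a model's per-feature contributions and phrasing the largest ones in plain language. The linear weights, feature names, and top-k cutoff below are hypothetical assumptions for demonstration, not drawn from any specific HCXAI system.

```python
# Illustrative sketch: turn per-feature contributions of a linear model
# into a plain-language ("textual") explanation for an end user.
# All weights, inputs, and feature names here are hypothetical.

def explain_prediction(weights, inputs, feature_names, top_k=2):
    """Rank features by |weight * value| and phrase the top_k contributors."""
    contributions = [
        (name, w * x) for name, w, x in zip(feature_names, weights, inputs)
    ]
    # Largest absolute contribution first, so the explanation leads with
    # the features that mattered most for this particular prediction.
    contributions.sort(key=lambda item: abs(item[1]), reverse=True)
    score = sum(c for _, c in contributions)
    parts = [
        f"{name} {'raised' if c > 0 else 'lowered'} the score by {abs(c):.2f}"
        for name, c in contributions[:top_k]
    ]
    return f"Predicted score {score:.2f}: " + "; ".join(parts) + "."

print(explain_prediction(
    weights=[0.8, -0.5, 0.1],
    inputs=[1.0, 2.0, 0.5],
    feature_names=["income", "debt", "age"],
))
```

The same ranked contributions could instead feed a bar chart (visual format) or a dialogue turn (conversational format); the design question HCXAI studies is which presentation a given user actually understands and trusts.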

Papers