Paper ID: 2211.06402
Behaviour Trees for Creating Conversational Explanation Experiences
Anjana Wijekoon, David Corsar, Nirmalie Wiratunga
This paper presents an XAI system specification and an interactive dialogue model to facilitate the creation of Explanation Experiences (EE). Such a specification combines the knowledge of XAI, domain, and system experts for a given use case to formalise target user groups and their explanation needs, and to implement explanation strategies that address those needs. Formalising the XAI system promotes the reuse of existing explainers and known explanation needs, which can be refined and evolved over time using user evaluation feedback. The abstract EE dialogue model formalises the interactions between a user and an XAI system. The resulting EE conversational chatbot is personalised to an XAI system at run-time using the knowledge captured in its XAI system specification. This seamless integration is enabled by using Behaviour Trees (BTs) to conceptualise both the EE dialogue model and the explanation strategies. In the evaluation, we discuss several desirable properties of BTs over the traditionally used state transition models (STMs) and finite state machines (FSMs). BTs promote the reusability of dialogue components through the hierarchical nature of their design. Sub-trees are modular, i.e. each sub-tree is responsible for a specific behaviour and can be designed at different levels of granularity to improve human interpretability. The EE dialogue model consists of the abstract behaviours needed to capture EE; accordingly, it can be implemented as a conversational, graphical, or text-based interface that caters to different domains and users. Using BTs to model dialogue incurs a significant computational cost, which we mitigate through the use of memory. Overall, we find that the ability to create robust conversational pathways dynamically makes BTs a good candidate for designing and implementing conversations that create explanation experiences.
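To make the BT framing concrete, the following is a minimal, self-contained Python sketch, not the authors' implementation, of how an EE dialogue could be composed from modular sub-trees, with a memory-based sequence node that resumes from its last position instead of re-evaluating earlier children on every tick. All class, node, and behaviour names (e.g. MemorySequence, greet_user) are illustrative assumptions rather than identifiers from the paper.

```python
# Sketch only: modular Behaviour Tree sub-trees for an explanation dialogue,
# with a sequence composite that uses memory to reduce re-evaluation cost.
from enum import Enum


class Status(Enum):
    SUCCESS = 1
    FAILURE = 2
    RUNNING = 3


class Action:
    """Leaf node wrapping a single dialogue step (names are illustrative)."""
    def __init__(self, name, step):
        self.name, self.step = name, step

    def tick(self):
        return self.step()


class MemorySequence:
    """Sequence composite with memory: it resumes from the child that was
    last ticked rather than re-running earlier children on every tick."""
    def __init__(self, name, children):
        self.name, self.children, self._index = name, children, 0

    def tick(self):
        while self._index < len(self.children):
            status = self.children[self._index].tick()
            if status == Status.RUNNING:
                return Status.RUNNING
            if status == Status.FAILURE:
                self._index = 0
                return Status.FAILURE
            self._index += 1
        self._index = 0
        return Status.SUCCESS


# Hypothetical leaf behaviours for an Explanation Experience dialogue.
def greet_user():
    print("Bot: Hello! What would you like explained?")
    return Status.SUCCESS


def clarify_need():
    print("Bot: Do you want to know why this prediction was made?")
    return Status.SUCCESS


def present_explanation():
    print("Bot: Feature X contributed most to this outcome.")
    return Status.SUCCESS


def collect_feedback():
    print("Bot: Was this explanation helpful?")
    return Status.SUCCESS


# Modular sub-trees: each sub-tree owns one behaviour and can be reused or
# swapped when the XAI system specification changes.
explain_subtree = MemorySequence("explain", [
    Action("clarify", clarify_need),
    Action("present", present_explanation),
])
root = MemorySequence("ee_dialogue", [
    Action("greet", greet_user),
    explain_subtree,
    Action("feedback", collect_feedback),
])

if __name__ == "__main__":
    root.tick()  # one tick walks the conversational pathway
```

Because the sequence node remembers its position, completed sub-trees are not re-executed on later ticks; this is one plausible way the memory-based mitigation of computational cost described above could be realised.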
Submitted: Nov 11, 2022