Paper ID: 2408.16780
$EvoAl^{2048}$
Bernhard J. Berger (1 and 2), Christina Plump (3), Rolf Drechsler (4 and 3) ((1) University of Rostock, Software Engineering Chair Rostock, Germany, (2) Hamburg University of Technology, Institute of Embedded Systems, Germany, (3) DFKI - Cyber-Physical Systems Bremen, Germany, (4) University of Bremen, Departments of Mathematics and Computer Science)
As AI solutions enter safety-critical products, the explainability and interpretability of the solutions they generate become increasingly important. In the long term, such explanations are the key to gaining users' acceptance of AI-based systems' decisions. We report on applying model-driven optimisation to search for an interpretable and explainable policy that solves the game 2048. This paper describes a solution to the GECCO'24 Interpretable Control Competition using the open-source software EvoAl. We aimed to develop an approach for creating interpretable policies that are easy to adapt to new ideas.
Submitted: Aug 15, 2024