Paper ID: 2210.02202

A new family of Constitutive Artificial Neural Networks towards automated model discovery

Kevin Linka, Ellen Kuhl

For more than 100 years, chemical, physical, and material scientists have proposed competing constitutive models to best characterize the behavior of natural and man-made materials in response to mechanical loading. Now, computer science offers a universal solution: Neural Networks. Neural Networks are powerful function approximators that can learn constitutive relations from large amounts of data without any knowledge of the underlying physics. However, classical Neural Networks ignore a century of research in constitutive modeling, violate thermodynamic considerations, and fail to predict the behavior outside the training regime. Here we design a new family of Constitutive Artificial Neural Networks that inherently satisfy common kinematic, thermodynamic, and physical constraints and, at the same time, constrain the design space of admissible functions to create robust approximators, even in the presence of sparse data. We revisit the non-linear field theories of mechanics and reverse-engineer the network input to account for material objectivity, symmetry, and incompressibility; the network output to enforce thermodynamic consistency; the activation functions to implement physically reasonable restrictions; and the network architecture to ensure polyconvexity. We demonstrate that this new class of models is a generalization of the classical neo-Hooke, Blatz-Ko, Mooney-Rivlin, Yeoh, and Demiray models and that the network weights have a clear physical interpretation. When trained with classical benchmark data for rubber, our network autonomously selects the best constitutive model and learns its parameters. Our findings suggest that Constitutive Artificial Neural Networks have the potential to induce a paradigm shift in constitutive modeling, from user-defined model selection to automated model discovery. Our source code, data, and examples are available at https://github.com/LivingMatterLab/CANN.
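The abstract outlines the core design choices: invariant-based inputs for objectivity and symmetry, a free-energy output that is differentiated to obtain stress for thermodynamic consistency, and restricted activations with non-negative weights for polyconvexity. A minimal JAX sketch of such a network is given below; the term selection, softplus reparameterization of the weights, and uniaxial-tension setup are illustrative assumptions, not the authors' implementation (which is available at the repository linked above).

import jax
import jax.numpy as jnp

def strain_energy(params, F):
    # Invariants of the right Cauchy-Green tensor C = F^T F account for
    # material objectivity and (isotropic) material symmetry.
    C = F.T @ F
    I1 = jnp.trace(C)
    I2 = 0.5 * (jnp.trace(C) ** 2 - jnp.trace(C @ C))
    # Shifted invariants vanish in the undeformed state, so the energy is zero there.
    x = jnp.array([I1 - 3.0, (I1 - 3.0) ** 2, I2 - 3.0, (I2 - 3.0) ** 2])
    # Two activation branches per term, identity and (exp - 1); softplus keeps
    # all weights non-negative, which preserves polyconvexity of the sum.
    w = jax.nn.softplus(params["w_lin"])
    v = jax.nn.softplus(params["w_exp"])
    a = jax.nn.softplus(params["a_exp"])
    return jnp.sum(w * x) + jnp.sum(v * (jnp.exp(a * x) - 1.0))

# Thermodynamic consistency: the stress follows from the free energy by
# differentiation, P = d(psi)/dF - p F^{-T}, with p enforcing incompressibility.
dpsi_dF = jax.grad(strain_energy, argnums=1)

def uniaxial_stress(params, stretch):
    # Incompressible uniaxial tension: F = diag(lambda, 1/sqrt(lambda), 1/sqrt(lambda)).
    F = jnp.diag(jnp.array([stretch, stretch ** -0.5, stretch ** -0.5]))
    P = dpsi_dF(params, F)
    p = P[2, 2] * F[2, 2]            # pressure from the traction-free lateral face
    return P[0, 0] - p / F[0, 0]     # nominal stress in the loading direction

params = {"w_lin": jnp.zeros(4), "w_exp": jnp.zeros(4), "a_exp": jnp.zeros(4)}
print(uniaxial_stress(params, 1.5))  # stress at 50 percent stretch

Special cases of this structure recover classical models, e.g. keeping only the linear (I1 - 3) term yields a neo-Hooke-type energy, which is the sense in which the network weights retain a physical interpretation.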

Submitted: Sep 15, 2022