Paper ID: 2404.07943 • Published Mar 18, 2024

A Pretraining-Finetuning Computational Framework for Material Homogenization

Yizheng Wang, Xiang Li, Ziming Yan, Shuaifeng Ma, Jinshuai Bai, Bokai Liu, Timon Rabczuk, Yinghua Liu
Homogenization is a fundamental tool for studying multiscale physical phenomena. Traditional numerical homogenization methods, which rely heavily on finite element analysis, demand significant computational resources, especially for complex geometries, heterogeneous materials, and high-resolution problems. To address these challenges, we propose PreFine-Homo, a novel numerical homogenization framework comprising two phases: pretraining and fine-tuning. In the pretraining phase, a Fourier Neural Operator (FNO) is trained on large datasets to learn the mapping from input geometries and material properties to displacement fields. In the fine-tuning phase, the pretrained predictions serve as initial solutions for iterative algorithms, drastically reducing the number of iterations needed for convergence. The pretraining phase of PreFine-Homo delivers homogenization results up to 1000 times faster than conventional methods, while the fine-tuning phase further enhances accuracy. Moreover, because the fine-tuning phase refines any pretrained prediction with an iterative solver, PreFine-Homo generalizes beyond its training distribution and can continue to learn and improve as more data become available. We validate PreFine-Homo by predicting the effective elastic tensor of 3D periodic materials, specifically Triply Periodic Minimal Surfaces (TPMS). The results demonstrate that PreFine-Homo achieves high precision, exceptional efficiency, robust learning capability, and strong extrapolation ability, establishing it as a powerful tool for multiscale homogenization tasks.
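
The pretraining phase maps voxelized geometry and material fields to displacement fields with an FNO. Below is a minimal sketch of the standard FNO building block, a 3D spectral convolution, in PyTorch. The channel counts, mode truncation, and single-corner frequency handling are illustrative assumptions; the abstract does not specify the actual PreFine-Homo architecture or hyperparameters.

```python
import torch
import torch.nn as nn

class SpectralConv3d(nn.Module):
    """One Fourier layer: FFT -> keep low-frequency modes -> learned complex
    multiply -> inverse FFT. A simplified sketch of the FNO building block;
    not the paper's exact architecture."""
    def __init__(self, in_ch, out_ch, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / (in_ch * out_ch)
        # Complex-valued weights acting on the retained Fourier modes.
        self.weight = nn.Parameter(
            scale * torch.randn(in_ch, out_ch, modes, modes, modes,
                                dtype=torch.cfloat))

    def forward(self, x):  # x: (batch, in_ch, X, Y, Z)
        x_ft = torch.fft.rfftn(x, dim=(-3, -2, -1))
        out_ft = torch.zeros(x.size(0), self.weight.size(1), *x_ft.shape[-3:],
                             dtype=torch.cfloat, device=x.device)
        m = self.modes
        # Multiply the retained low modes by the learned weights.
        out_ft[..., :m, :m, :m] = torch.einsum(
            "bixyz,ioxyz->boxyz", x_ft[..., :m, :m, :m], self.weight)
        return torch.fft.irfftn(out_ft, s=x.shape[-3:], dim=(-3, -2, -1))

# e.g. 4 input channels (geometry/material indicators) on a 16^3 grid,
# 3 output channels (a displacement-like vector field):
layer = SpectralConv3d(in_ch=4, out_ch=3, modes=8)
y = layer(torch.randn(2, 4, 16, 16, 16))  # -> (2, 3, 16, 16, 16)
```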
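The fine-tuning phase uses the network output only as an initial guess for an iterative solver, which is what restores accuracy on inputs the network has not seen. A minimal sketch of that warm-start idea, assuming the discretized cell problem reduces to a symmetric positive-definite system K u = f solved by conjugate gradients (the paper's actual discretization and solver are not given in the abstract); the "surrogate prediction" below is a perturbed exact solution standing in for the FNO output.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# Stand-in for the assembled stiffness system K u = f of the cell problem.
n = 500
K = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
f = np.ones(n)

def solve(u0, label):
    iters = [0]
    def count(_xk):          # called once per CG iteration
        iters[0] += 1
    u, info = cg(K, f, x0=u0, callback=count)
    print(f"{label}: converged={info == 0}, iterations={iters[0]}")
    return u

u_exact = solve(np.zeros(n), "cold start")      # plain iterative solve
u_pred = u_exact + 1e-3 * np.random.default_rng(0).standard_normal(n)
solve(u_pred, "warm start (surrogate prediction)")  # far fewer iterations
```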
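For the validation task, a standard route to the effective elastic tensor is to solve the periodic cell problem under each of the six unit macroscopic strains and volume-average the resulting microscopic stresses; each averaged stress gives one column of the 6x6 Voigt stiffness. A hedged sketch of that assembly step, where `average_stress` is a hypothetical solver interface (the abstract does not fix the API):

```python
import numpy as np

def effective_elastic_tensor(average_stress):
    """Assemble the 6x6 effective stiffness (Voigt notation).

    average_stress(j) is assumed to return the 6-vector <sigma> from the
    periodic cell problem under the j-th unit macroscopic strain.
    """
    C_eff = np.zeros((6, 6))
    for j in range(6):
        C_eff[:, j] = average_stress(j)  # j-th column = averaged stress
    return C_eff

# Sanity check: for a homogeneous cell the averaged stress reproduces the
# base material, so C_eff equals its isotropic Voigt matrix.
E, nu = 1.0, 0.3
lam = E * nu / ((1 + nu) * (1 - 2 * nu))
mu = E / (2 * (1 + nu))
C_base = np.array([
    [lam + 2*mu, lam,        lam,        0,  0,  0],
    [lam,        lam + 2*mu, lam,        0,  0,  0],
    [lam,        lam,        lam + 2*mu, 0,  0,  0],
    [0, 0, 0, mu, 0,  0],
    [0, 0, 0, 0,  mu, 0],
    [0, 0, 0, 0,  0,  mu],
])
print(effective_elastic_tensor(lambda j: C_base[:, j]))  # recovers C_base
```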