Paper ID: 2502.01517 • Published Feb 3, 2025
Regularized interpolation in 4D neural fields enables optimization of 3D printed geometries
Christos Margadji, Andi Kuswoyo, Sebastian W. Pattinson
The ability to accurately produce geometries with specified properties is
perhaps the most important characteristic of a manufacturing process. 3D
printing is marked by exceptional design freedom and complexity but is also
prone to geometric and other defects that must be resolved for it to reach its
full potential. Ultimately, this will require both astute design decisions and
timely parameter adjustments to maintain stability, which is challenging even
for expert human operators. While machine learning is widely investigated in
3D printing, existing methods typically overlook spatial features that vary
across prints and thus struggle to produce the desired geometries. Here,
we encode volumetric representations of printed parts into neural fields and
apply a new regularization strategy, based on minimizing the partial derivative
of the field's output with respect to a single, non-learnable parameter. By
thus constraining small input changes to yield only small output variations, we
encourage smooth interpolation between observed volumes and hence realistic
geometry predictions. This framework therefore allows the extraction of
'imagined' 3D shapes, revealing how a part would look if manufactured under
previously unseen parameters. The resulting continuous field is used for
data-driven optimization to maximize geometric fidelity between expected and
produced geometries, reducing post-processing, material waste, and production
costs. By optimizing process parameters dynamically, our approach enables
advanced planning strategies, potentially allowing manufacturers to better
realize complex and feature-rich designs.
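
The following is a minimal PyTorch sketch, not the authors' released code, of the two ideas the abstract describes: a neural field over (x, y, z, p), where p is a single non-learnable process parameter, trained with a penalty on the partial derivative of the output with respect to p, and a subsequent gradient-based search over p against a target geometry. All names (SmoothField, regularized_loss, optimize_parameter, smoothness_weight) and the occupancy-style formulation are illustrative assumptions.

```python
import torch
import torch.nn as nn


class SmoothField(nn.Module):
    """MLP mapping (x, y, z, p) to an occupancy value in [0, 1]."""

    def __init__(self, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(4, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.net(coords))


def regularized_loss(field: SmoothField,
                     xyz: torch.Tensor,    # (N, 3) sample points
                     p: torch.Tensor,      # (N, 1) process-parameter values
                     occ: torch.Tensor,    # (N, 1) observed occupancy labels
                     smoothness_weight: float = 1e-2) -> torch.Tensor:
    """Reconstruction loss plus a penalty on d(output)/dp."""
    # Track gradients with respect to the parameter axis only.
    p = p.detach().requires_grad_(True)
    pred = field(torch.cat([xyz, p], dim=-1))
    recon = nn.functional.binary_cross_entropy(pred, occ)
    # Penalizing |df/dp| encourages smooth interpolation between the
    # volumes observed at different parameter settings.
    dpred_dp, = torch.autograd.grad(pred.sum(), p, create_graph=True)
    smooth = dpred_dp.abs().mean()
    return recon + smoothness_weight * smooth


def optimize_parameter(field: SmoothField,
                       xyz_target: torch.Tensor,  # (N, 3) query points
                       occ_target: torch.Tensor,  # (N, 1) intended occupancy
                       p_init: float,
                       steps: int = 200,
                       lr: float = 1e-2) -> torch.Tensor:
    """With the field frozen, search the parameter axis for the value whose
    'imagined' volume best matches the intended geometry."""
    for q in field.parameters():
        q.requires_grad_(False)
    p = torch.full((1, 1), p_init, requires_grad=True)
    opt = torch.optim.Adam([p], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        coords = torch.cat([xyz_target, p.expand(xyz_target.shape[0], 1)], dim=-1)
        loss = nn.functional.binary_cross_entropy(field(coords), occ_target)
        loss.backward()
        opt.step()
    return p.detach()
```

The penalty on the first derivative acts as a smoothness prior along the parameter axis only, leaving the spatial axes free to capture sharp geometric detail; the actual network architecture, sampling scheme, and loss weighting in the paper may differ.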