Automatic Differentiation

Automatic differentiation (AD) is a computational technique for efficiently and exactly evaluating derivatives of complex functions, and it underpins gradient-based optimization in machine learning and scientific computing. Current research focuses on extending AD to more demanding settings, including partial differential equations (PDEs) solved with neural networks and physics-informed models, Monte Carlo integration with neural control variates, and non-differentiable operations in spiking neural networks. AD's impact spans diverse fields: it enables faster training of machine learning models, improved data assimilation in weather forecasting, and accelerated solutions to scientific problems that were previously intractable due to computational cost.
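
To make the core mechanism concrete, the sketch below uses reverse-mode AD in JAX (`jax.grad`) to fit a toy linear model by gradient descent. The loss function, data, and hyperparameters are illustrative assumptions for this overview only; they are not drawn from any of the papers listed below.

```python
# Minimal sketch of reverse-mode automatic differentiation with JAX.
# The model, data, and learning rate are illustrative assumptions.
import jax
import jax.numpy as jnp

def loss(params, x, y):
    """Mean-squared error of a simple linear model y_hat = w * x + b."""
    w, b = params
    y_hat = w * x + b
    return jnp.mean((y_hat - y) ** 2)

# jax.grad builds a function that returns d(loss)/d(params) via
# reverse-mode AD: exact derivatives, not finite-difference estimates.
grad_loss = jax.grad(loss)

# Illustrative data generated from y = 3x + 1.
x = jnp.linspace(-1.0, 1.0, 32)
y = 3.0 * x + 1.0

params = (0.0, 0.0)  # initial (w, b)
lr = 0.5
for step in range(200):
    g = grad_loss(params, x, y)                      # tuple of gradients (dw, db)
    params = tuple(p - lr * dp for p, dp in zip(params, g))

print(params)  # should approach (3.0, 1.0)
```

The same `jax.grad` transformation composes with itself and with other program transformations, which is what lets AD scale from this toy loss to the PDE residuals, control variates, and surrogate gradients discussed above.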

Papers