Function Approximation

Function approximation aims to represent complex, potentially high-dimensional functions with simpler, computationally tractable models. Current research emphasizes new architectures, such as Kolmogorov-Arnold networks and models built on Chebyshev polynomial bases or path signatures, alongside improved algorithms, including methods based on optimism in the face of uncertainty and two-timescale updates. These advances matter for reinforcement learning, contextual bandits, and scientific machine learning, where accurate and efficient function representation underpins effective modeling and prediction.
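
As a minimal illustration of the basic idea (not drawn from any of the listed papers), the sketch below approximates a smooth one-dimensional function with a Chebyshev polynomial basis via least squares, using NumPy's polynomial module; the target function and polynomial degree are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

# Target function to approximate on [-1, 1] (illustrative choice).
def f(x):
    return np.exp(-x**2) * np.sin(5 * x)

# Sample points and a degree-12 least-squares Chebyshev fit.
x = np.linspace(-1.0, 1.0, 200)
approx = Chebyshev.fit(x, f(x), deg=12)

# Check the maximum absolute error on a denser grid.
x_dense = np.linspace(-1.0, 1.0, 2000)
max_err = np.max(np.abs(f(x_dense) - approx(x_dense)))
print(f"max |error| on [-1, 1]: {max_err:.2e}")
```

The same pattern, choosing a tractable model class and fitting its coefficients to samples of the target, underlies the neural and kernel-based approximators discussed in the papers below, though those replace the fixed polynomial basis with learned representations.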

Papers