Spectral Bias
Spectral bias in neural networks refers to the tendency of these models, during training, to prioritize learning low-frequency components of a function before higher-frequency ones. Current research focuses on understanding this bias across various architectures, including multi-layer perceptrons, convolutional networks, and transformers, and exploring methods to mitigate it, such as novel initialization schemes, normalization techniques, and modified activation functions. Addressing spectral bias is crucial for improving the accuracy and efficiency of neural networks in applications requiring high-frequency detail, such as image processing, scientific computing, and solving partial differential equations.
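Below is a minimal sketch, not taken from any of the listed papers, that illustrates the phenomenon with a small ReLU MLP in PyTorch: the network is fit to a 1-D target containing a low-frequency and a high-frequency sinusoid, and the per-frequency residual (via the FFT) typically shows the low-frequency component being learned first. The specific frequencies, network width, and optimizer settings are illustrative assumptions, not values from any particular paper.

```python
# Sketch: demonstrating spectral bias on a 1-D regression task.
# Assumptions (illustrative only): target frequencies 1 Hz and 16 Hz,
# a 2-hidden-layer ReLU MLP of width 256, Adam with lr=1e-3.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Target: sum of a low-frequency and a high-frequency sinusoid on [0, 1].
x = torch.linspace(0, 1, 512).unsqueeze(1)
y = torch.sin(2 * torch.pi * 1 * x) + torch.sin(2 * torch.pi * 16 * x)

model = nn.Sequential(
    nn.Linear(1, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def band_residual(pred):
    """Magnitude of the fit residual at the two target frequency bins."""
    resid = (y - pred).squeeze(1)
    spec = torch.fft.rfft(resid).abs() / len(resid)
    return spec[1].item(), spec[16].item()  # ~1 Hz bin and ~16 Hz bin

for step in range(5001):
    pred = model(x)
    loss = ((pred - y) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 1000 == 0:
        lo, hi = band_residual(pred.detach())
        # Typically the low-frequency residual collapses early in training
        # while the high-frequency residual stays large for many more steps.
        print(f"step {step:5d}  loss {loss.item():.4f}  "
              f"|resid@1Hz| {lo:.3f}  |resid@16Hz| {hi:.3f}")
```

Mitigation techniques such as Fourier feature input mappings or sinusoidal activations are commonly reported to change this behavior by making high-frequency components easier to represent and learn.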