Better Initialization
Initialization of neural-network weights is an active research area aimed at improving training efficiency and final model performance. Current efforts focus on three directions: leveraging pre-trained models (such as foundation models and LLMs) to generate informative initializations, designing initialization schemes tailored to specific data types (e.g., tabular data or point clouds), and using meta-learning or Jacobian tuning to automate the search for good initial weights. These advances yield faster convergence, higher accuracy, and lower computational cost across diverse applications, including active learning, neural architecture search, and classification tasks.
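As a concrete illustration of the first direction, the sketch below shows the common warm-starting pattern in PyTorch: parameters whose names and shapes match a pre-trained checkpoint are copied over as an informative initialization, while layers without a counterpart fall back to He (Kaiming) initialization. The `warm_start` helper is hypothetical and illustrative of the general pattern, not the method of any specific paper.

```python
import torch
import torch.nn as nn

def warm_start(model: nn.Module, pretrained_state: dict) -> nn.Module:
    """Copy shape-matching parameters from a pre-trained checkpoint;
    He-initialize any linear layers that had no usable counterpart."""
    own_state = model.state_dict()
    copied = set()
    for name, param in own_state.items():
        src = pretrained_state.get(name)
        if src is not None and src.shape == param.shape:
            # Reuse the pre-trained weights as an informative initialization.
            own_state[name] = src.clone()
            copied.add(name)
    model.load_state_dict(own_state)
    # Re-initialize layers that could not be warm-started.
    for name, module in model.named_modules():
        if isinstance(module, nn.Linear) and f"{name}.weight" not in copied:
            nn.init.kaiming_normal_(module.weight, nonlinearity="relu")
            if module.bias is not None:
                nn.init.zeros_(module.bias)
    return model

# Usage: a new 5-class head on a previously trained feature extractor.
pretrained = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
new_model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 5))
new_model = warm_start(new_model, pretrained.state_dict())
```

Here the first linear layer is copied from the checkpoint, while the output head (whose shape differs) is freshly He-initialized; in practice the copy rule can be loosened (e.g., partial slicing or projection of mismatched weights), which is where much of the current research effort lies.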