Training Scheme
Training schemes for neural networks are undergoing significant refinement, with a focus on improving efficiency, generalization, and robustness. Current research emphasizes techniques such as cross-training with knowledge fusion for federated learning, optimized training strategies for Vision Transformers (ViTs) including Experts Weights Averaging, and efficient handling of variable-length sequential data. These advances aim to improve model performance across diverse tasks and datasets, with impact on areas such as image classification, natural language processing, and model compression.
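To make the weight-averaging idea concrete, here is a minimal, hedged sketch of merging several "expert" models by averaging their parameters. This is a generic parameter-averaging illustration, not the exact procedure from the Experts Weights Averaging work; the two-expert setup, the parameter names, and the uniform mixing coefficients are all illustrative assumptions.

```python
# Generic sketch of expert-weight averaging (illustrative only; the
# real Experts Weights Averaging method for ViTs differs in detail).
# Each "expert" is represented as a dict mapping parameter names to
# flat lists of floats.

def average_expert_weights(experts, coeffs=None):
    """Return a model whose parameters are a weighted average of the
    experts' parameters. `coeffs` defaults to a uniform average."""
    if coeffs is None:
        coeffs = [1.0 / len(experts)] * len(experts)
    merged = {}
    for name in experts[0]:
        merged[name] = [
            sum(c * e[name][i] for c, e in zip(coeffs, experts))
            for i in range(len(experts[0][name]))
        ]
    return merged

# Two toy experts sharing a single (hypothetical) parameter tensor.
expert_a = {"ffn.weight": [0.0, 2.0]}
expert_b = {"ffn.weight": [2.0, 4.0]}
merged = average_expert_weights([expert_a, expert_b])
print(merged["ffn.weight"])  # → [1.0, 3.0]
```

In practice this kind of averaging is applied to framework-native state dicts rather than plain lists, but the arithmetic is the same: a per-parameter convex combination of the experts.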