Single Deep Learning Models
Single deep learning models are being explored as a way to improve efficiency and generalization across diverse machine learning tasks, replacing ensembles of separate, task-specific networks. Current research focuses on architectures that can handle multiple tasks or domains simultaneously, often using transformer-based models or techniques such as knowledge distillation and heterogeneous quantization to optimize performance and resource utilization. This line of work promises to reduce computational cost and improve the adaptability of deep learning systems across applications including image processing, materials science, and real-time systems.
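As a concrete illustration of one technique mentioned above, the sketch below shows how knowledge distillation can be used to compress an ensemble of teacher networks into a single student model. It is a minimal example, not taken from any specific paper listed here; the temperature, loss weighting, and function names are illustrative assumptions.

```python
# Minimal knowledge-distillation sketch (illustrative, not from a cited paper):
# an ensemble of frozen "teacher" networks is replaced by one "student" model
# trained to match the teachers' averaged, temperature-softened predictions.
import torch
import torch.nn.functional as F

TEMPERATURE = 4.0  # softens logits so the student sees inter-class structure (assumed value)
ALPHA = 0.7        # weight of the distillation term vs. the hard-label term (assumed value)

def distillation_loss(student_logits, teacher_logits, labels):
    """Blend soft-target KL divergence with standard cross-entropy."""
    soft_targets = F.softmax(teacher_logits / TEMPERATURE, dim=-1)
    soft_student = F.log_softmax(student_logits / TEMPERATURE, dim=-1)
    # KL term is scaled by T^2 to keep gradient magnitudes comparable
    kd = F.kl_div(soft_student, soft_targets, reduction="batchmean") * TEMPERATURE ** 2
    ce = F.cross_entropy(student_logits, labels)
    return ALPHA * kd + (1.0 - ALPHA) * ce

def ensemble_logits(teachers, x):
    """Average the logits of several frozen teacher networks."""
    with torch.no_grad():
        return torch.stack([t(x) for t in teachers]).mean(dim=0)

def train_step(student, teachers, optimizer, x, labels):
    """One optimization step for the single student model replacing the ensemble."""
    optimizer.zero_grad()
    loss = distillation_loss(student(x), ensemble_logits(teachers, x), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

After training, only the student is kept, so inference requires a single forward pass instead of one per ensemble member.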
Papers
Fifteen papers, dated from August 24, 2022 to September 30, 2024.