REST, RESTAD, NAP
Research on "REST" (and related acronyms like RESTAD, ReSTR, and ResMem) spans diverse applications, primarily focusing on improving the efficiency and robustness of machine learning models. Current efforts involve developing novel architectures, such as transformer-based models and graph neural networks, alongside algorithms like reweighted sparse training and residual memorization, to enhance performance in areas including anomaly detection, text classification, and image segmentation. These advancements aim to address challenges like data scarcity, adversarial attacks, and biases in model outputs, ultimately leading to more accurate, efficient, and reliable AI systems across various domains.