Good Better
"Good Better" encompasses research aiming to improve the performance and efficiency of various machine learning models and algorithms. Current efforts focus on optimizing model architectures (e.g., Transformers, CNNs) and training methodologies (e.g., gradient clipping, multi-token prediction, evolutionary algorithms) to achieve better accuracy, faster inference, and reduced computational costs. These advancements have significant implications for diverse applications, including natural language processing, computer vision, and robotics, by enabling more efficient and effective systems while addressing issues like bias and resource constraints.
Papers
Towards Better Few-Shot and Finetuning Performance with Forgetful Causal Language Models
Hao Liu, Xinyang Geng, Lisa Lee, Igor Mordatch, Sergey Levine, Sharan Narang, Pieter Abbeel
The Better Your Syntax, the Better Your Semantics? Probing Pretrained Language Models for the English Comparative Correlative
Leonie Weissweiler, Valentin Hofmann, Abdullatif Köksal, Hinrich Schütze