Formality Transfer
Formality transfer, in its broadest sense, aims to leverage knowledge learned in one context (the "source") to improve performance in a related but different context (the "target"). Current research adapts this idea across diverse domains, using techniques such as transfer learning in neural networks (including transformers and convolutional networks), multi-armed bandit algorithms, and knowledge distillation. This work matters because it addresses data scarcity: reusing source knowledge improves efficiency and performance in areas such as financial prediction, robotic manipulation, and natural language processing.
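The core pattern described above, reusing source knowledge when target data is scarce, is often realized by freezing a pretrained feature extractor and training only a small new head on the target task. The following NumPy sketch illustrates that pattern under stated assumptions: it is not taken from any of the listed papers, and the "pretrained" weights are simulated by a fixed random projection standing in for features learned on a source task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "source" feature extractor: x -> relu(W_src x).
# In practice W_src would come from a model trained on the source task;
# here a fixed random projection is a stand-in (an assumption for illustration).
W_src = rng.normal(size=(8, 4))

def extract_features(X):
    # Frozen: these weights are never updated during target training.
    return np.maximum(X @ W_src.T, 0.0)

# Tiny target dataset, mimicking the data-scarce setting.
X = rng.normal(size=(32, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Only a new logistic-regression head is trained on top of the frozen features.
w = np.zeros(8)
b = 0.0
lr = 0.1

def loss(w, b):
    z = extract_features(X) @ w + b
    p = 1.0 / (1.0 + np.exp(-z))
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

initial = loss(w, b)
for _ in range(200):
    F = extract_features(X)
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
    # Gradient of the logistic loss with respect to the head only.
    grad_w = F.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b
final = loss(w, b)
```

After training, `final` is lower than `initial`, showing that the head alone can adapt the reused features to the target labels; the same head-only update scheme is what parameter-efficient transfer methods build on.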
Papers
Don't Waste Data: Transfer Learning to Leverage All Data for Machine-Learnt Climate Model Emulation
Raghul Parthipan, Damon J. Wischik
Data-Efficiency with a Single GPU: An Exploration of Transfer Methods for Small Language Models
Alon Albalak, Akshat Shrivastava, Chinnadhurai Sankar, Adithya Sagar, Mike Ross
On Neural Consolidation for Transfer in Reinforcement Learning
Valentin Guillet, Dennis G. Wilson, Carlos Aguilar-Melchor, Emmanuel Rachelson
Development and validation of deep learning based embryo selection across multiple days of transfer
Jacob Theilgaard Lassen, Mikkel Fly Kragh, Jens Rimestad, Martin Nygård Johansen, Jørgen Berntsen