Formality Transfer
Formality transfer, in its broadest sense, aims to leverage knowledge learned in one context (the "source") to improve performance in a related but different context (the "target"). Current research adapts this idea across diverse domains, using techniques such as transfer learning in neural networks (including transformers and convolutional networks), multi-armed bandit algorithms, and knowledge distillation. This line of work matters because it addresses data scarcity in many applications, improving efficiency and performance in areas such as financial prediction, robotic manipulation, and natural language processing.
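The source-to-target adaptation described above is commonly implemented by freezing a pretrained feature extractor and fine-tuning only a new task-specific head. The sketch below illustrates this in PyTorch with a toy, randomly initialized "backbone" standing in for a genuinely pretrained network; the names `backbone` and `head` are illustrative, not from any paper listed here.

```python
# Hedged sketch: transfer learning by freezing a (stand-in) pretrained
# backbone and fine-tuning only a freshly initialized task head.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for a pretrained feature extractor (the "source" knowledge).
backbone = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 16))

# Freeze the backbone so its transferred weights are not updated.
for p in backbone.parameters():
    p.requires_grad = False

# New head for the "target" task, trained from scratch.
head = nn.Linear(16, 2)
model = nn.Sequential(backbone, head)

# The optimizer only sees trainable (head) parameters.
opt = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.1
)

# Small synthetic target-domain batch (data scarcity is the motivation).
x, y = torch.randn(32, 8), torch.randint(0, 2, (32,))
backbone_before = backbone[0].weight.clone()
head_before = head.weight.clone()

for _ in range(5):
    opt.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    opt.step()

# After fine-tuning: backbone unchanged, head updated.
backbone_unchanged = torch.equal(backbone[0].weight, backbone_before)
head_changed = not torch.equal(head.weight, head_before)
```

In practice the frozen backbone would come from a model pretrained on a large source dataset; the same freeze-then-fine-tune pattern underlies many of the adaptation methods surveyed above.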
Papers
Explaining the physics of transfer learning a data-driven subgrid-scale closure to a different turbulent flow
Adam Subel, Yifei Guan, Ashesh Chattopadhyay, Pedram Hassanzadeh
Transfer learning to decode brain states reflecting the relationship between cognitive tasks
Youzhi Qu, Xinyao Jian, Wenxin Che, Penghui Du, Kai Fu, Quanying Liu