Paper ID: 2406.16893
A Survey on Transformers in NLP with Focus on Efficiency
Wazib Ansar, Saptarsi Goswami, Amlan Chakrabarti
The advent of transformers with attention mechanisms and associated pre-trained models has revolutionized the field of Natural Language Processing (NLP). However, such models are resource-intensive due to their highly complex architectures. This limits their applicability in resource-constrained environments. When choosing an appropriate NLP model, a major trade-off exists between accuracy and efficiency. This paper presents a commentary on the evolution of NLP and its applications, with emphasis on their accuracy as well as their efficiency. Following this, it surveys research contributions towards enhancing the efficiency of transformer-based models at various stages of model development, along with hardware considerations. The goal of this survey is to determine how current NLP techniques contribute towards a sustainable society and to establish a foundation for future research.
Submitted: May 15, 2024