Paper ID: 2409.18999
Enhancing TinyBERT for Financial Sentiment Analysis Using GPT-Augmented FinBERT Distillation
Graison Jos Thomas
In the rapidly evolving field of financial sentiment analysis, the efficiency and accuracy of predictive models are critical because of their significant impact on financial markets. Transformer-based models like BERT and large language models (LLMs) like GPT-4 have advanced NLP tasks considerably. Despite their advantages, BERT-based models remain computationally intensive for edge computing environments, and the substantial size and compute requirements of LLMs limit their practical deployment. This study proposes leveraging the generative capabilities of LLMs, such as GPT-4 Omni, to create synthetic, domain-specific training data. This approach addresses data scarcity and enhances the performance of smaller models, making them competitive with their larger counterparts. Specifically, the research aims to enhance FinBERT, a BERT model fine-tuned for financial sentiment analysis, and to develop TinyFinBERT, a compact transformer model, through a structured, two-tiered knowledge distillation strategy. Using data augmented by GPT-4 Omni, which involves generating new training examples and transforming existing data, we significantly improved the accuracy of FinBERT, preparing it to serve as a teacher model. The enhanced FinBERT then distilled knowledge to TinyFinBERT using data augmented by both GPT-4 Omni and GPT-3.5 Turbo. The distillation strategy incorporated both logit and intermediate layer distillation. TinyFinBERT was trained and evaluated on the Financial PhraseBank and FiQA 2018 Task 1 datasets, achieving performance comparable to FinBERT while being substantially smaller and more efficient. This research demonstrates how LLMs can effectively contribute to the advancement of financial sentiment analysis by enhancing the capabilities of smaller, more efficient models through innovative data augmentation and distillation techniques.
Submitted: Sep 19, 2024
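
The abstract describes a distillation objective that combines logit distillation with intermediate layer distillation. Below is a minimal PyTorch-style sketch of such a combined loss, assuming a TinyBERT-style setup in which a learned linear projection maps the student's hidden states to the teacher's hidden size; the function name, temperature, layer-matching scheme, and weighting alpha are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits,
                      student_hidden, teacher_hidden,
                      projection, temperature=2.0, alpha=0.5):
    # student_hidden / teacher_hidden: lists of matched hidden-state
    # tensors of shape (batch, seq_len, dim); `projection` is a learned
    # nn.Linear mapping the student hidden size to the teacher's
    # (a hypothetical setup, following the TinyBERT recipe).

    # Logit distillation: KL divergence between temperature-softened
    # teacher and student distributions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    # Intermediate layer distillation: MSE between projected student
    # hidden states and the corresponding teacher hidden states,
    # averaged over the matched layers.
    hidden_loss = sum(
        F.mse_loss(projection(s), t)
        for s, t in zip(student_hidden, teacher_hidden)
    ) / len(student_hidden)

    # alpha balances the two objectives; 0.5 is an arbitrary default here.
    return alpha * soft_loss + (1 - alpha) * hidden_loss

In practice the two terms are often applied in stages (intermediate layer matching first, then logit distillation), and the layer mapping between a 12-layer teacher and a shallower student is a design choice; this sketch simply assumes the matched layers are supplied by the caller.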