Paper ID: 2409.18984

Harnessing Large Language Models: Fine-tuned BERT for Detecting Charismatic Leadership Tactics in Natural Language

Yasser Saeid, Felix Neubürger, Stefanie Krügl, Helena Hüster, Thomas Kopinski, Ralf Lanwehr

This work investigates the identification of Charismatic Leadership Tactics (CLTs) in natural language using a fine-tuned Bidirectional Encoder Representations from Transformers (BERT) model. Based on an extensive corpus of CLTs generated and curated for this task, our methodology entails training a machine learning model capable of accurately identifying the presence of these tactics in natural language. A performance evaluation is conducted to assess the effectiveness of the model in detecting CLTs. We find that the total accuracy over the detection of all CLTs is 98.96\%. The results of this study have significant implications for research in psychology and management, offering potential methods to simplify the currently elaborate assessment of charisma in texts.
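To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch of fine-tuning BERT as a sequence classifier for detecting the presence of a CLT in text, assuming a Hugging Face Transformers setup. The dataset, label scheme, and hyperparameters shown are illustrative placeholders, not the authors' actual corpus or configuration.

```python
# Minimal sketch: fine-tune BERT to detect whether a sentence contains a
# charismatic leadership tactic (CLT). Illustrative only; the corpus, labels,
# and hyperparameters below are placeholders, not the paper's configuration.
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TrainingArguments,
    Trainer,
)
from datasets import Dataset

# Hypothetical labeled examples: 1 = sentence contains the CLT, 0 = it does not.
examples = {
    "text": [
        "Imagine a future where every one of us leads this change.",
        "The quarterly report is due on Friday.",
    ],
    "label": [1, 0],
}

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

def tokenize(batch):
    # Tokenize sentences to fixed-length input IDs for BERT.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = Dataset.from_dict(examples).map(tokenize, batched=True)

training_args = TrainingArguments(
    output_dir="clt-bert",           # placeholder output directory
    num_train_epochs=3,              # illustrative hyperparameters
    per_device_train_batch_size=8,
    learning_rate=2e-5,
)

trainer = Trainer(model=model, args=training_args, train_dataset=dataset)
trainer.train()
```

In practice, detecting multiple CLTs as in the paper could be handled either with one binary classifier per tactic or a single multi-label head; the sketch above shows only the single-tactic binary case.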

Submitted: Sep 16, 2024