Paper ID: 2408.08261

mhGPT: A Lightweight Generative Pre-Trained Transformer for Mental Health Text Analysis

Dae-young Kim, Rebecca Hwa, Muhammad Mahbubur Rahman

This paper introduces mhGPT, a lightweight generative pre-trained transformer trained on mental health-related social media posts and PubMed articles. Fine-tuned for specific mental health tasks, mhGPT was evaluated under limited hardware constraints and compared with state-of-the-art models such as MentaLLaMA and Gemma. Despite having only 1.98 billion parameters and using just 5% of the dataset, mhGPT outperformed larger models and matched the performance of models trained on significantly more data. The key contributions include integrating diverse mental health data, creating a custom tokenizer, and optimizing a smaller architecture for low-resource settings. This research could advance AI-driven mental health care, especially in settings with limited computing power.
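The abstract does not include code. As a rough, hypothetical sketch of the kind of low-resource fine-tuning setup it describes (not the authors' actual method or model), the snippet below applies LoRA adapters to a small causal language model so that only a tiny fraction of weights are trained; the base model name, target modules, and hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch, NOT mhGPT's released code: parameter-efficient
# fine-tuning of a small causal LM under limited hardware, the constraint
# the abstract emphasizes. Model name and hyperparameters are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "gpt2"  # placeholder; mhGPT itself is not assumed to be public
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)

# LoRA adapters: train a small set of low-rank matrices instead of all
# weights, so fine-tuning fits in modest GPU memory.
lora = LoraConfig(r=8, lora_alpha=16, target_modules=["c_attn"], lora_dropout=0.05)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # typically well under 1% of total parameters
```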

Submitted: Aug 15, 2024