Music Emotion
Music emotion research aims to understand and model the relationship between musical features and the emotional responses they evoke, with the goal of accurately predicting perceived emotion and of generating music with a target emotional character. Current research employs diverse deep learning architectures, including convolutional and recurrent neural networks, often incorporating multimodal data (audio and symbolic representations) and leveraging techniques such as multi-task learning and latent space disentanglement to improve model performance and interpretability. This field is significant for advancing music information retrieval, personalized music recommendation systems, and applications in therapeutic contexts, such as music therapy and emotional regulation.
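As a concrete illustration of the kind of architecture described above, the following is a minimal sketch (not taken from any specific paper) of a multi-task CNN-RNN that maps a mel-spectrogram to continuous valence and arousal scores; the layer sizes, class name, and the valence/arousal regression targets are illustrative assumptions.

```python
# Minimal sketch: a multi-task CNN-RNN for music emotion regression.
# Layer sizes, names, and the valence/arousal targets are assumptions
# for illustration, not a reference implementation from the literature.
import torch
import torch.nn as nn


class MusicEmotionNet(nn.Module):
    def __init__(self, n_mels: int = 64, hidden: int = 128):
        super().__init__()
        # Convolutional front end: local spectro-temporal features.
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d((2, 2)),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d((2, 2)),
        )
        # Recurrent back end: temporal context across spectrogram frames.
        self.gru = nn.GRU(input_size=32 * (n_mels // 4),
                          hidden_size=hidden, batch_first=True)
        # Two task-specific heads sharing one representation
        # (multi-task learning over valence and arousal).
        self.valence_head = nn.Linear(hidden, 1)
        self.arousal_head = nn.Linear(hidden, 1)

    def forward(self, spec: torch.Tensor):
        # spec: (batch, 1, n_mels, time)
        x = self.conv(spec)                    # (B, 32, n_mels/4, T/4)
        x = x.permute(0, 3, 1, 2).flatten(2)   # (B, T/4, 32 * n_mels/4)
        _, h = self.gru(x)                     # h: (1, B, hidden)
        h = h.squeeze(0)
        return self.valence_head(h), self.arousal_head(h)


if __name__ == "__main__":
    model = MusicEmotionNet()
    dummy = torch.randn(8, 1, 64, 256)         # batch of 8 spectrogram clips
    valence, arousal = model(dummy)
    # Joint multi-task loss: sum of the per-task regression losses.
    target_v, target_a = torch.rand(8, 1), torch.rand(8, 1)
    loss = (nn.functional.mse_loss(valence, target_v)
            + nn.functional.mse_loss(arousal, target_a))
    loss.backward()
    print(valence.shape, arousal.shape, float(loss))
```

Sharing the convolutional and recurrent layers across both heads is what makes this multi-task: the two emotion dimensions are trained jointly from a single learned representation, which is the general pattern referenced above rather than any particular published model.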