Music Emotion
Music emotion research aims to understand and model the relationship between musical features and the emotional responses they evoke, with the goals of accurately predicting perceived emotion and generating music that conveys a target emotion. Current work employs diverse deep learning architectures, including convolutional and recurrent neural networks, often incorporating multimodal data (audio and symbolic representations) and leveraging techniques such as multi-task learning and latent space disentanglement to improve model performance and interpretability. The field is significant for advancing music information retrieval, personalized music recommendation, and applications in therapeutic contexts such as music therapy and emotional regulation.
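As a concrete illustration of the multi-task setup described above, the following PyTorch sketch maps a mel-spectrogram through a shared convolutional encoder to separate valence and arousal regression heads, the dimensional emotion targets commonly used in this area. The layer sizes, head structure, and input shape are illustrative assumptions rather than the architecture of any specific paper listed here.

```python
import torch
import torch.nn as nn

class MultiTaskEmotionCNN(nn.Module):
    """Hypothetical multi-task model: shared encoder, one head per emotion dimension."""

    def __init__(self, n_channels: int = 32):
        super().__init__()
        # Shared convolutional encoder over a (batch, 1, n_mels, time) spectrogram.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, n_channels, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> fixed-size embedding
        )
        # Task-specific heads: continuous valence and arousal predictions.
        self.valence_head = nn.Linear(n_channels, 1)
        self.arousal_head = nn.Linear(n_channels, 1)

    def forward(self, spec: torch.Tensor):
        z = self.encoder(spec).flatten(1)  # (batch, n_channels)
        return self.valence_head(z), self.arousal_head(z)

# Usage sketch with random stand-in data (no real dataset is assumed).
model = MultiTaskEmotionCNN()
spec = torch.randn(4, 1, 128, 256)           # batch of 4 mel-spectrograms
valence, arousal = model(spec)
target_v, target_a = torch.zeros(4, 1), torch.zeros(4, 1)
# Multi-task objective: sum of per-dimension regression losses.
loss = nn.functional.mse_loss(valence, target_v) + nn.functional.mse_loss(arousal, target_a)
```

In practice the two heads could be weighted differently in the loss, or extended with a classification head over categorical emotion labels, which is one common way multi-task learning is applied in this literature.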