User Emotion
Understanding user emotion is crucial for building more human-like and effective human-computer interaction systems. Current research focuses on integrating emotion recognition into task-oriented dialogue, leveraging multimodal data (text, voice, facial expressions, physiological signals) and employing machine learning models, including large language models and neural networks, to analyze these signals and predict user emotional states. This work aims to improve user experience and task completion rates in applications ranging from chatbots and virtual assistants to robots and virtual environments, ultimately leading to more intuitive and empathetic AI systems.
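As a minimal illustration of the text modality only, the sketch below shows a toy lexicon-based emotion detector for chat utterances. The lexicon, emotion labels, and scoring rule are hypothetical simplifications for illustration; the systems surveyed here use trained models over multimodal signals rather than keyword matching.

```python
# Toy lexicon-based emotion detection for dialogue text.
# The labels and word sets are illustrative assumptions, not a real resource.
EMOTION_LEXICON = {
    "frustrated": {"stuck", "again", "broken", "annoying", "useless"},
    "satisfied": {"thanks", "great", "perfect", "works", "solved"},
    "confused": {"unsure", "unclear", "lost", "how", "what"},
}

def detect_emotion(utterance: str) -> str:
    """Return the emotion label whose lexicon overlaps the utterance most."""
    tokens = set(utterance.lower().split())
    scores = {label: len(tokens & words) for label, words in EMOTION_LEXICON.items()}
    best = max(scores, key=scores.get)
    # Fall back to "neutral" when no lexicon word appears at all.
    return best if scores[best] > 0 else "neutral"

print(detect_emotion("this is broken again"))   # matches the "frustrated" set
print(detect_emotion("thanks, that works"))
```

A production pipeline would replace the lexicon lookup with a trained classifier (e.g., a fine-tuned language model) and fuse its output with acoustic and visual features, but the interface stays the same: utterance in, predicted emotional state out.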