User Emotion

Understanding user emotion is crucial for building more human-like and effective human-computer interaction systems. Current research focuses on integrating emotion recognition into task-oriented dialogue, drawing on multimodal data (text, voice, facial expressions, and physiological signals) and applying machine learning models, from large language models to multimodal neural networks, to infer a user's emotional state. This work aims to improve user experience and task completion rates in applications ranging from chatbots and virtual assistants to robots and virtual environments, ultimately leading to more intuitive and empathetic AI systems.
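As a concrete illustration of the multimodal approach described above, the sketch below shows a minimal late-fusion classifier in PyTorch that combines pre-extracted text and audio embeddings to predict an emotion label. It is not tied to any specific paper: the emotion label set, embedding dimensions, and `LateFusionEmotionClassifier` class are illustrative assumptions, and the upstream encoders (e.g., a language model for the transcript, an acoustic model for the voice signal) are assumed to run elsewhere.

```python
import torch
import torch.nn as nn

# Hypothetical emotion label set; real systems differ (e.g., Ekman's six basic emotions).
EMOTIONS = ["neutral", "joy", "sadness", "anger", "fear", "surprise"]


class LateFusionEmotionClassifier(nn.Module):
    """Fuses pre-extracted text and audio embeddings to predict an emotion label.

    Only the fusion head is sketched here; the modality-specific encoders that
    produce the embeddings are assumed to run upstream.
    """

    def __init__(self, text_dim=768, audio_dim=128, hidden_dim=256,
                 num_emotions=len(EMOTIONS)):
        super().__init__()
        self.text_proj = nn.Linear(text_dim, hidden_dim)
        self.audio_proj = nn.Linear(audio_dim, hidden_dim)
        self.classifier = nn.Sequential(
            nn.ReLU(),
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_emotions),
        )

    def forward(self, text_emb, audio_emb):
        # Project each modality into a shared space, concatenate, then classify.
        fused = torch.cat(
            [self.text_proj(text_emb), self.audio_proj(audio_emb)], dim=-1
        )
        return self.classifier(fused)


if __name__ == "__main__":
    model = LateFusionEmotionClassifier()
    text_emb = torch.randn(4, 768)   # stand-in for utterance embeddings
    audio_emb = torch.randn(4, 128)  # stand-in for acoustic features
    logits = model(text_emb, audio_emb)
    predictions = [EMOTIONS[i] for i in logits.argmax(dim=-1)]
    print(predictions)
```

Late fusion is only one design choice; other work fuses modalities earlier (e.g., cross-attention between token and frame representations) or prompts a large language model directly with transcribed dialogue context.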

Papers