Emotional Expression
Emotional expression research aims to understand how humans convey and perceive emotions through various modalities, including facial expressions, speech, body language, and even contextual cues. Current research focuses on developing robust multimodal models, often employing deep learning architectures like transformers, convolutional neural networks, and variational autoencoders, to classify and generate emotional expressions from diverse data sources. This work is significant for advancing human-computer interaction, improving mental health diagnostics, and mitigating biases in AI systems that process and generate emotional content.
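The multimodal classification idea mentioned above can be sketched with a minimal late-fusion example: features from two modalities (face and speech) are concatenated and passed through a single linear layer with a softmax over emotion labels. Everything here is a toy, hypothetical illustration — the feature values, weight matrix, and emotion set are made up, and real systems learn these weights with deep networks such as CNNs or transformers rather than setting them by hand.

```python
import math

# Hypothetical emotion label set for illustration only.
EMOTIONS = ["happy", "sad", "angry", "neutral"]

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_and_classify(face_feats, speech_feats, weights):
    # Late fusion: concatenate per-modality features, then apply one
    # linear layer (one weight row per emotion class) plus softmax.
    fused = face_feats + speech_feats
    scores = [sum(w * x for w, x in zip(row, fused)) for row in weights]
    return softmax(scores)

# Toy 2-d features per modality (made-up values).
face = [0.9, 0.1]      # e.g. smile intensity, brow furrow
speech = [0.8, 0.2]    # e.g. pitch variability, speaking rate

# Hand-set 4x4 weight matrix standing in for learned parameters.
W = [
    [ 1.0, -1.0,  1.0, -1.0],   # happy
    [-1.0,  1.0, -1.0,  1.0],   # sad
    [-0.5,  1.0, -0.5,  1.0],   # angry
    [ 0.0,  0.0,  0.0,  0.0],   # neutral
]

probs = fuse_and_classify(face, speech, W)
label = EMOTIONS[max(range(len(probs)), key=probs.__getitem__)]
```

In practice each modality would first be encoded by its own network (a CNN for frames, a transformer for audio or text) before fusion; the sketch keeps only the fusion-and-classify step to show the structure.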