Emotional Expression
Emotional expression research aims to understand how humans convey and perceive emotions across modalities, including facial expressions, speech, body language, and contextual cues. Current work focuses on building robust multimodal models, typically based on deep learning architectures such as transformers, convolutional neural networks, and variational autoencoders, that classify and generate emotional expressions from diverse data sources. This research is significant for advancing human-computer interaction, improving mental health diagnostics, and mitigating biases in AI systems that process and generate emotional content.
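As a concrete illustration of the multimodal classification setup described above, the sketch below shows a minimal late-fusion emotion classifier in PyTorch: one small encoder per modality (facial-expression features and speech features here), with the embeddings concatenated and mapped to discrete emotion categories. The feature dimensions, layer sizes, and emotion label set are illustrative assumptions rather than details taken from any of the papers listed on this page.

```python
# Minimal sketch of a multimodal (late-fusion) emotion classifier.
# All dimensions, layer sizes, and emotion labels are illustrative assumptions.

import torch
import torch.nn as nn

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]


class ModalityEncoder(nn.Module):
    """Projects one modality's feature vector into a shared embedding space."""

    def __init__(self, input_dim: int, embed_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, embed_dim),
            nn.ReLU(),
            nn.LayerNorm(embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class MultimodalEmotionClassifier(nn.Module):
    """Late fusion: encode each modality, concatenate embeddings, classify."""

    def __init__(self, face_dim: int, speech_dim: int, embed_dim: int = 128):
        super().__init__()
        self.face_encoder = ModalityEncoder(face_dim, embed_dim)
        self.speech_encoder = ModalityEncoder(speech_dim, embed_dim)
        self.classifier = nn.Sequential(
            nn.Linear(2 * embed_dim, embed_dim),
            nn.ReLU(),
            nn.Linear(embed_dim, len(EMOTIONS)),
        )

    def forward(self, face_feats: torch.Tensor, speech_feats: torch.Tensor) -> torch.Tensor:
        fused = torch.cat(
            [self.face_encoder(face_feats), self.speech_encoder(speech_feats)], dim=-1
        )
        return self.classifier(fused)  # unnormalized logits over emotion classes


if __name__ == "__main__":
    # Toy batch: 4 samples with 512-d facial features and 256-d speech features.
    model = MultimodalEmotionClassifier(face_dim=512, speech_dim=256)
    logits = model(torch.randn(4, 512), torch.randn(4, 256))
    print(logits.shape)                         # torch.Size([4, 7])
    print(EMOTIONS[logits[0].argmax().item()])  # predicted label for the first sample
```

Concatenation-based late fusion is only one design choice; much recent work instead fuses modalities with cross-attention or a shared transformer so that the model can weigh cues from each modality per sample.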
Papers
19 papers on this topic, published between May 7, 2023 and May 28, 2024.