Paper ID: 2209.05745

A virtual reality-based method for examining audiovisual prosody perception

Hartmut Meister, Isa Samira Winter, Moritz Wächtler, Pascale Sandmann, Khaled Abdellatif

Prosody plays a vital role in verbal communication. Acoustic cues of prosody have been examined extensively. However, prosodic characteristics are perceived not only auditorily but also visually, based on head and facial movements. The purpose of this report is to present a method for examining audiovisual prosody using virtual reality. We show that animations based on a virtual human provide motion cues similar to those obtained from video recordings of a real talker. The use of virtual reality opens up new avenues for examining multimodal effects of verbal communication. We discuss the method in the framework of examining prosody perception in cochlear implant listeners.

Submitted: Sep 13, 2022