Paper ID: 2203.16393

Online Motion Style Transfer for Interactive Character Control

Yingtian Tang, Jiangtao Liu, Cheng Zhou, Tingguang Li

Motion style transfer is highly desired in motion generation systems for gaming. Compared with its offline counterpart, research on online motion style transfer under interactive control remains limited. In this work, we propose an end-to-end neural network that generates motions in different styles and transfers motion styles in real time under user control. Our approach eliminates the need for handcrafted phase features, and can be easily trained and directly deployed in game systems. In our experiments, we evaluate the approach on three aspects essential to industrial game design: accuracy, flexibility, and variety, and our model achieves satisfying results on all three.

Submitted: Mar 30, 2022