Motion Synthesis
Motion synthesis aims to generate realistic human and animal movements from varied inputs such as text, audio, or sparse sensor data, primarily to create lifelike animations and interactive experiences. Current research relies heavily on diffusion models and transformers, often incorporating techniques such as autoregressive generation, attention mechanisms, and multi-modal conditioning to improve motion coherence, detail, and controllability. The field is significant for its applications in animation, gaming, virtual reality, and robotics, as well as for its potential to advance our understanding of human and animal movement through the creation of large-scale synthetic datasets.
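To make the diffusion-plus-conditioning idea concrete, the following is a minimal, illustrative sketch of a text-conditioned motion diffusion model: a transformer denoiser predicts the noise added to a fixed-length sequence of per-frame pose vectors, conditioned on a diffusion timestep and a pre-computed text embedding. All module names, dimensions, and hyperparameters here are hypothetical placeholders, not taken from any specific paper in this area.

```python
# Minimal sketch of text-conditioned motion diffusion (illustrative only).
# Assumptions: motions are fixed-length sequences of per-frame pose vectors,
# and the text condition is a single pre-computed embedding.
import torch
import torch.nn as nn

FRAMES, POSE_DIM, TEXT_DIM, STEPS = 60, 66, 512, 100


class MotionDenoiser(nn.Module):
    """Transformer that predicts the noise added to a motion sequence,
    conditioned on a diffusion timestep and a text embedding."""

    def __init__(self, pose_dim=POSE_DIM, text_dim=TEXT_DIM, width=256):
        super().__init__()
        self.pose_in = nn.Linear(pose_dim, width)
        self.t_embed = nn.Embedding(STEPS, width)
        self.text_proj = nn.Linear(text_dim, width)
        layer = nn.TransformerEncoderLayer(
            d_model=width, nhead=4, dim_feedforward=512, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.pose_out = nn.Linear(width, pose_dim)

    def forward(self, x_t, t, text_emb):
        # Prepend timestep and text tokens so attention can condition on them.
        cond = torch.stack([self.t_embed(t), self.text_proj(text_emb)], dim=1)
        h = torch.cat([cond, self.pose_in(x_t)], dim=1)
        h = self.encoder(h)
        return self.pose_out(h[:, 2:])  # drop the two conditioning tokens


# Standard DDPM noise schedule (linear betas).
betas = torch.linspace(1e-4, 0.02, STEPS)
alphas = 1.0 - betas
alpha_bar = torch.cumprod(alphas, dim=0)


def training_step(model, motion, text_emb):
    """One denoising step: add noise at a random timestep, predict that noise."""
    b = motion.shape[0]
    t = torch.randint(0, STEPS, (b,))
    noise = torch.randn_like(motion)
    ab = alpha_bar[t].view(b, 1, 1)
    x_t = ab.sqrt() * motion + (1 - ab).sqrt() * noise
    pred = model(x_t, t, text_emb)
    return nn.functional.mse_loss(pred, noise)


@torch.no_grad()
def sample(model, text_emb):
    """Ancestral sampling: start from Gaussian noise, denoise step by step."""
    b = text_emb.shape[0]
    x = torch.randn(b, FRAMES, POSE_DIM)
    for t in reversed(range(STEPS)):
        t_batch = torch.full((b,), t, dtype=torch.long)
        eps = model(x, t_batch, text_emb)
        a, ab = alphas[t], alpha_bar[t]
        mean = (x - (1 - a) / (1 - ab).sqrt() * eps) / a.sqrt()
        x = (mean + betas[t].sqrt() * torch.randn_like(x)) if t > 0 else mean
    return x  # generated motion: (batch, FRAMES, POSE_DIM)


if __name__ == "__main__":
    model = MotionDenoiser()
    motion = torch.randn(2, FRAMES, POSE_DIM)   # stand-in for real mocap data
    text_emb = torch.randn(2, TEXT_DIM)         # stand-in for a text-encoder output
    print("loss:", training_step(model, motion, text_emb).item())
    print("sampled:", sample(model, text_emb).shape)
```

In practice, the text embedding would come from a pretrained language or vision-language encoder, and autoregressive or attention-based variants change how the denoiser consumes the sequence, but the conditioning-by-prepended-tokens pattern shown here is one common way to inject multi-modal context.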