Accompaniment Generation
Accompaniment generation focuses on automatically creating instrumental music to complement a given audio or symbolic musical input, such as vocals or a melody. Current research emphasizes real-time performance and high-quality output. Common architectures include transformers, diffusion models, and variational autoencoders, often combined with techniques such as contrastive learning and beat tracking to improve coherence and stylistic consistency. The field is significant for its potential to enhance music production workflows, enabling musicians to create richer compositions more efficiently and fostering new forms of human-computer creative collaboration.
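To make the task concrete, here is a minimal, purely illustrative sketch of symbolic accompaniment generation: it harmonizes a melody by picking, for each bar, the diatonic triad that covers the most melody pitch classes. Everything here (the `TRIADS` table, the `harmonize` function) is a hypothetical toy, not any system from the papers above; real systems use learned models such as transformers or diffusion models rather than this rule.

```python
# Toy accompaniment sketch: choose one chord per bar by maximum
# pitch-class overlap with the melody. Hypothetical example only.

# Diatonic triads in C major, as pitch-class sets (C=0, ..., B=11).
TRIADS = {
    "C":  {0, 4, 7},
    "Dm": {2, 5, 9},
    "Em": {4, 7, 11},
    "F":  {5, 9, 0},
    "G":  {7, 11, 2},
    "Am": {9, 0, 4},
}

def harmonize(melody, beats_per_bar=4):
    """Pick one chord per bar: the triad sharing the most pitch
    classes with that bar's melody notes (MIDI note numbers)."""
    chords = []
    for i in range(0, len(melody), beats_per_bar):
        bar = {note % 12 for note in melody[i:i + beats_per_bar]}
        best = max(TRIADS, key=lambda name: len(TRIADS[name] & bar))
        chords.append(best)
    return chords

# Two bars of melody: C C G G | A C F F
print(harmonize([60, 60, 67, 67, 69, 60, 65, 65]))  # → ['C', 'F']
```

A learned model replaces the hand-written overlap rule with a conditional distribution over accompaniment tokens given the melody, but the interface — melody in, time-aligned accompaniment out — is the same.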