Interactive Sonification
Interactive sonification translates data into sound so that users can understand and explore it through auditory perception. Current research emphasizes developing effective sonification methods for diverse data types, including data from quantum computing, robotics, astronomy, and healthcare, often employing machine learning models such as recurrent neural networks and synthesis techniques such as frequency modulation. This multidisciplinary work is making complex datasets more accessible and interpretable, improving data analysis in scientific research and offering novel interfaces for human-computer interaction.
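To make the data-to-sound mapping concrete, the sketch below illustrates one common approach, parameter-mapping sonification with frequency-modulation synthesis: each value in a numeric series sets the carrier frequency of a short FM tone, and the tones are concatenated into a WAV file. This is a minimal, generic Python illustration of the technique only; the function names (sonify_fm, write_wav), the parameter defaults, and the linear pitch mapping are assumptions made for the example, not taken from any of the works surveyed here.

# Minimal sketch of parameter-mapping sonification with FM synthesis.
# All names and defaults are illustrative assumptions, not from any specific paper.
import wave
import numpy as np

def sonify_fm(data, sample_rate=44100, tone_dur=0.25,
              f_lo=220.0, f_hi=880.0, mod_ratio=2.0, mod_index=3.0):
    """Map each value in `data` to an FM tone whose carrier frequency
    rises linearly from f_lo (data minimum) to f_hi (data maximum)."""
    data = np.asarray(data, dtype=float)
    # Normalize the series to [0, 1]; guard against a constant series.
    span = data.max() - data.min()
    norm = (data - data.min()) / span if span > 0 else np.zeros_like(data)
    t = np.linspace(0.0, tone_dur, int(sample_rate * tone_dur), endpoint=False)
    tones = []
    for x in norm:
        fc = f_lo + x * (f_hi - f_lo)   # carrier frequency set by the data value
        fm = mod_ratio * fc             # modulator frequency as a fixed ratio of the carrier
        # Classic FM: sin(2*pi*fc*t + I*sin(2*pi*fm*t))
        tone = np.sin(2 * np.pi * fc * t + mod_index * np.sin(2 * np.pi * fm * t))
        # Short linear fade in/out (10 ms) to avoid clicks between tones.
        env = np.minimum(1.0, np.minimum(t, tone_dur - t) / 0.01)
        tones.append(tone * env)
    return np.concatenate(tones)

def write_wav(path, signal, sample_rate=44100):
    """Write a mono 16-bit WAV file using only the standard library."""
    pcm = (np.clip(signal, -1.0, 1.0) * 32767).astype(np.int16)
    with wave.open(path, "wb") as wf:
        wf.setnchannels(1)
        wf.setsampwidth(2)
        wf.setframerate(sample_rate)
        wf.writeframes(pcm.tobytes())

# Example: sonify a noisy sine wave so rising values are heard as rising pitch.
values = np.sin(np.linspace(0, 4 * np.pi, 40)) + 0.1 * np.random.randn(40)
write_wav("sonification.wav", sonify_fm(values))

Running the example writes sonification.wav, in which rising data values are heard as rising pitch; an interactive system would recompute such a mapping in real time as the user steers through the data.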