Perceptual Fidelity
Perceptual fidelity research aims to understand and replicate how humans perceive sensory information, particularly images and sounds, for applications such as immersive displays and robotic interaction. Current efforts focus on computationally efficient models, including multi-task neural networks and diffusion models, that improve perceptual quality across tasks such as image compression, super-resolution, and sound synthesis, often incorporating text-guided approaches for semantic control. This work is crucial for improving the realism and user experience of technologies ranging from virtual and augmented reality to human-robot collaboration, by aligning digital representations with human sensory perception.
Papers