Hand Pose Estimation
Hand pose estimation aims to accurately determine the 3D position and orientation of a hand's joints from images or video, enabling applications in human-computer interaction, robotics, and healthcare. Current research emphasizes improving robustness to challenging conditions like occlusion and varying lighting, often employing transformer networks, diffusion models, and graph convolutional networks to capture spatial and temporal relationships between hand joints. These advancements are driving progress in areas such as sign language recognition, gesture-based control of devices, and automated analysis of human activities, particularly in egocentric settings.
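Below is a minimal sketch, assuming PyTorch, of how a graph convolutional network can capture the spatial relationships between hand joints mentioned above: each joint's features are averaged with its skeletal neighbours before a learned projection, and a small stack of such layers lifts 2D keypoints to 3D positions. The 21-joint layout, layer sizes, and the lifting task itself are illustrative assumptions, not the method of any particular paper.

```python
import torch
import torch.nn as nn

# Parent of each joint in a common 21-keypoint hand layout (wrist = 0);
# this particular ordering is an assumption for illustration.
HAND_PARENTS = [0, 0, 1, 2, 3, 0, 5, 6, 7, 0, 9, 10, 11, 0, 13, 14, 15, 0, 17, 18, 19]

def hand_adjacency(num_joints: int = 21) -> torch.Tensor:
    """Self-looped, row-normalised adjacency matrix of the hand skeleton."""
    adj = torch.eye(num_joints)
    for child, parent in enumerate(HAND_PARENTS):
        adj[child, parent] = 1.0
        adj[parent, child] = 1.0
    deg = adj.sum(dim=1, keepdim=True)
    return adj / deg  # each joint averages over itself and its neighbours

class GraphConv(nn.Module):
    """One graph convolution: aggregate neighbouring joints, then project."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.register_buffer("adj", hand_adjacency())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 21, in_dim) per-joint features, e.g. 2D keypoint coordinates
        return torch.relu(self.linear(self.adj @ x))

class GCNPoseLifter(nn.Module):
    """Toy 2D-to-3D lifter: stacked graph convolutions regress 3D joint positions."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.layers = nn.Sequential(
            GraphConv(2, hidden),
            GraphConv(hidden, hidden),
        )
        self.head = nn.Linear(hidden, 3)

    def forward(self, joints_2d: torch.Tensor) -> torch.Tensor:
        return self.head(self.layers(joints_2d))  # (batch, 21, 3)

if __name__ == "__main__":
    model = GCNPoseLifter()
    pred = model(torch.randn(4, 21, 2))  # 4 hands, 21 joints, (u, v) pixel coords
    print(pred.shape)  # torch.Size([4, 21, 3])
```

Temporal relationships can be handled analogously, for example by applying the same graph convolution per frame and adding a transformer or recurrent layer over the time axis.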