Human Hand
Research on the human hand focuses on understanding its complex mechanics and capabilities, primarily through advanced computer vision and machine learning techniques. Current efforts concentrate on accurate 3D hand pose estimation and tracking, often employing transformer-based networks, convolutional neural networks, and diffusion models to analyze images and videos, including footage captured from egocentric (first-person) perspectives. These advances are driving progress in robotic hand design, prosthetic control, human-computer interaction, and assistive technologies for individuals with disabilities, particularly stroke survivors. The ultimate goal is to create more realistic and functional robotic hands and to improve the diagnosis and treatment of hand-related conditions.
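To make the pose-estimation task concrete, the sketch below shows one common formulation: an image is reduced to a grid of features by a small convolutional backbone, a transformer encoder refines those features, and a regression head outputs 3D coordinates for 21 hand keypoints. This is a minimal, hypothetical illustration written in PyTorch; the module names, layer sizes, and the 21-keypoint convention are assumptions for the example and are not taken from any of the papers listed here.

# Hypothetical transformer-based 3D hand pose estimator (illustrative only).
import torch
import torch.nn as nn

class HandPoseEstimator(nn.Module):
    def __init__(self, num_keypoints: int = 21, embed_dim: int = 256):
        super().__init__()
        # Toy convolutional backbone: 3x224x224 image -> 14x14 grid of features.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=7, stride=4, padding=3), nn.ReLU(),
            nn.Conv2d(64, embed_dim, kernel_size=3, stride=4, padding=1), nn.ReLU(),
        )
        # Learned positional embeddings for the 14*14 = 196 patch tokens.
        self.pos_embed = nn.Parameter(torch.zeros(1, 196, embed_dim))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=8, dim_feedforward=512, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=4)
        # Regression head: pooled token -> (x, y, z) for each keypoint.
        self.head = nn.Linear(embed_dim, num_keypoints * 3)
        self.num_keypoints = num_keypoints

    def forward(self, images: torch.Tensor) -> torch.Tensor:
        feats = self.backbone(images)                   # (B, C, 14, 14)
        tokens = feats.flatten(2).transpose(1, 2)       # (B, 196, C)
        tokens = self.encoder(tokens + self.pos_embed)  # (B, 196, C)
        pooled = tokens.mean(dim=1)                     # (B, C)
        return self.head(pooled).view(-1, self.num_keypoints, 3)

if __name__ == "__main__":
    model = HandPoseEstimator()
    joints = model(torch.randn(2, 3, 224, 224))
    print(joints.shape)  # torch.Size([2, 21, 3])

In practice, published systems differ in backbone choice, token layout, and output parameterization (e.g., regressing MANO model parameters rather than raw keypoints), but the image-to-keypoints structure above captures the general shape of the problem.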
Papers
EyeSight Hand: Design of a Fully-Actuated Dexterous Robot Hand with Integrated Vision-Based Tactile Sensors and Compliant Actuation
Branden Romero, Hao-Shu Fang, Pulkit Agrawal, Edward Adelson
Towards Unconstrained Collision Injury Protection Data Sets: Initial Surrogate Experiments for the Human Hand
Robin Jeanne Kirschner, Jinyu Yang, Edonis Elshani, Carina M. Micheler, Tobias Leibbrand, Dirk Müller, Claudio Glowalla, Nader Rajaei, Rainer Burgkart, Sami Haddadin