Paper ID: 2410.15414

An Agile Large-Workspace Teleoperation Interface Based on Human Arm Motion and Force Estimation

Jianhang Jia (1), Hao Zhou (2), Xin Zhang (1, 2) ((1) Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China, (2) School of Computing, University of Portsmouth, Portsmouth, United Kingdom.)

Teleoperation can transfer human perception and cognition to a slave robot to handle complex tasks, and the agility and flexibility of the interface play an important role in mapping human intention to the robot. In this paper, we developed an agile large-workspace teleoperation interface by estimating human arm behavior. Using wearable sensors, namely an inertial measurement unit (IMU) and a surface electromyography (sEMG) armband, we capture human arm motion and force information, enabling intuitive control of the robot manipulator. The control principle of our wearable interface comprises two parts: (1) arm incremental kinematics and (2) grasp recognition. Moreover, we developed a teleoperation framework with a time-synchronization mechanism for real-time application. We conducted experimental comparisons with a versatile haptic device (Omega 7) to verify the effectiveness of our interface and framework. Seven subjects were invited to complete three tasks: free motion, handover, and pick-and-place, each task ten times with each of the two interfaces, for a total of 420 trials. Objectively, we used task completion time and success rate to compare the performance of the two interfaces quantitatively. In addition, to quantify operator experience, we used the NASA Task Load Index to assess subjective workload. The results showed that the proposed interface achieved competitive performance with a better operating experience.
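To make the two-part control principle concrete, the following is a minimal sketch, not the paper's actual implementation, of how an incremental-kinematics mapping and an sEMG-based grasp trigger could be combined in one control step. All function names, the motion-scaling factor, the sEMG threshold, and the numerical values are illustrative assumptions introduced here for clarity.

```python
import numpy as np

def incremental_command(p_prev_human, p_curr_human, scale=1.0):
    """Map the change in the estimated human wrist position to a robot
    end-effector position increment (incremental kinematic mapping).

    p_prev_human, p_curr_human: 3-vectors of the wrist position, assumed to
    be estimated from IMU orientation and an arm-link model (hypothetical).
    scale: motion-scaling factor between human and robot workspaces.
    """
    return scale * (np.asarray(p_curr_human) - np.asarray(p_prev_human))

def grasp_from_semg(emg_rms, threshold=0.3):
    """Return a binary grasp command from the sEMG envelope amplitude.
    A simple threshold stands in here for the paper's grasp-recognition step.
    """
    return emg_rms > threshold

# Example control step (illustrative values only).
delta_p = incremental_command([0.30, 0.10, 0.25], [0.31, 0.10, 0.26], scale=1.5)
robot_target = np.array([0.50, 0.00, 0.40]) + delta_p  # add increment to the current robot pose
gripper_close = grasp_from_semg(emg_rms=0.42)           # True -> close gripper
```

In an incremental scheme of this kind, only relative wrist displacements are sent to the robot, which is what allows a large robot workspace to be covered from a comparatively small human arm motion range.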

Submitted: Oct 20, 2024