Paper ID: 2303.08359

Haptics-Enabled Forceps with Multi-Modal Force Sensing: Towards Task-Autonomous Surgery

Tangyou Liu, Tinghua Zhang, Jay Katupitiya, Jiaole Wang, Liao Wu

Many robotic surgical systems have been developed with micro-sized forceps for tissue manipulation. However, these systems often lack force sensing at the tool side, so manipulation forces can only be roughly estimated and controlled based on the surgeon's visual perception. To address this challenge, we present a vision-based module that enables multi-modal force sensing for micro-sized forceps. A miniature sensing module adaptable to common micro-sized forceps is proposed, consisting of a flexure, a camera, and a customised target. The deformation of the flexure is obtained by using the camera to estimate the pose variation of the top-mounted target. The external force applied to the sensing module is then calculated from the flexure's displacement and its stiffness matrix. By integrating the sensing module into the forceps, in conjunction with a single-axis force sensor at the proximal end, we equip the forceps with haptic sensing capabilities. Mathematical equations are derived to estimate the multi-modal forces of the haptics-enabled forceps, including pushing/pulling forces (Mode-I) and grasping forces (Mode-II). A series of experiments on phantoms and ex vivo tissues is conducted to verify the feasibility of the proposed design and method. Results indicate that the haptics-enabled forceps can effectively achieve multi-modal force estimation and can potentially enable autonomous robotic tissue grasping with controlled forces. A video demonstrating the experiments can be found at https://youtu.be/pi9bqSkwCFQ.
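As a minimal sketch of the force-from-displacement step described in the abstract, the snippet below assumes a linear flexure model (wrench = stiffness matrix times displacement). The 6x6 stiffness matrix, variable names, and numerical values are illustrative placeholders, not taken from the paper.

```python
import numpy as np

# Hypothetical calibrated 6x6 stiffness matrix of the flexure
# (translational terms in N/mm, rotational terms in N*mm/rad).
# Diagonal values are placeholders for illustration only.
K = np.diag([2.5, 2.5, 8.0, 40.0, 40.0, 60.0])

def estimate_wrench(pose_rest, pose_deformed):
    """Estimate the external wrench on the sensing module from the
    camera-observed pose change of the top-mounted target,
    using a linear model: wrench = K @ displacement.

    pose_rest, pose_deformed: 6-vectors [x, y, z, rx, ry, rz]
    (translations in mm, small rotations in rad) estimated by the camera.
    """
    dx = np.asarray(pose_deformed) - np.asarray(pose_rest)  # flexure displacement
    return K @ dx  # [Fx, Fy, Fz, Mx, My, Mz]

# Example: 0.2 mm lateral deflection and 0.1 mm axial compression of the target
wrench = estimate_wrench([0, 0, 0, 0, 0, 0], [0.2, 0.0, -0.1, 0.0, 0.0, 0.0])
print(wrench)
```

In this sketch, the stiffness matrix would be identified offline (e.g. by applying known loads and recording the target's pose change), and the estimated distal wrench would then be combined with the proximal single-axis sensor reading to separate the Mode-I (pushing/pulling) and Mode-II (grasping) components, as derived in the paper.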

Submitted: Mar 15, 2023