Paper ID: 2407.10606
Visual-tactile manipulation to collect household waste in outdoor
Julio Castaño-Amorós, Ignacio de Loyola Páez-Ubieta, Pablo Gil, Santiago Timoteo Puente
This work presents a perception system applied to robotic manipulation that is able to assist in navigation, household waste classification and collection in outdoor environments. The system is made up of optical tactile sensors, RGBD cameras and a LiDAR, integrated on a mobile platform with a robot manipulator and a robotic gripper. Our system is divided into three software modules: two are vision-based and the third is tactile-based. The vision-based modules use CNNs to localize and recognize solid household waste and to estimate grasping points. The tactile-based module, which also uses CNNs and image processing, adjusts the gripper opening to control the grasp from touch data. Our proposal achieves localization errors of around 6%, a recognition accuracy of 98%, and ensures grasping stability in 91% of the attempts. The combined runtime of the three modules is less than 750 ms.
Submitted: Jul 15, 2024
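
The abstract describes a pipeline of three modules (two vision-based, one tactile-based). Below is a minimal, hypothetical Python sketch of that modular structure only; all class names, function names, stubbed CNN calls and parameter values are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the three-module pipeline outlined in the abstract:
# (1) CNN-based waste localization/recognition, (2) grasp-point estimation
# from RGBD data, (3) tactile-based adjustment of the gripper opening.
# All names and values are placeholders; the CNN inference is stubbed.

from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class WasteDetection:
    label: str                                # predicted waste category
    bbox: Tuple[int, int, int, int]           # (x, y, w, h) in the RGB image
    grasp_point: Tuple[float, float, float]   # 3D point derived from depth


def detect_waste(rgb, depth) -> List[WasteDetection]:
    """Vision modules (stub): localize/recognize waste and estimate grasp points."""
    # A real system would run detection/recognition networks on the RGBD frame.
    return [WasteDetection("plastic_bottle", (120, 80, 60, 140), (0.45, -0.10, 0.30))]


def adjust_gripper_opening(tactile_image: Optional[object],
                           current_opening_mm: float) -> float:
    """Tactile module (stub): refine the gripper opening from touch data."""
    contact_detected = tactile_image is not None  # placeholder contact test
    return current_opening_mm - 2.0 if contact_detected else current_opening_mm


def pick_cycle(rgb, depth, tactile_image) -> float:
    """One illustrative perception cycle: detect waste, then refine the grasp."""
    detections = detect_waste(rgb, depth)
    opening_mm = 80.0  # hypothetical initial gripper opening
    if detections:
        opening_mm = adjust_gripper_opening(tactile_image, opening_mm)
    return opening_mm


if __name__ == "__main__":
    print(pick_cycle(rgb=None, depth=None, tactile_image=object()))
```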