Paper ID: 2111.07524

PatchGraph: In-hand tactile tracking with learned surface normals

Paloma Sodhi, Michael Kaess, Mustafa Mukadam, Stuart Anderson

We address the problem of tracking 3D object poses from touch during in-hand manipulations. Specifically, we look at tracking small objects using vision-based tactile sensors that provide high-dimensional tactile image measurements at the point of contact. While prior work has relied on a priori information about the object being localized, we remove this requirement. Our key insight is that an object is composed of several local surface patches, each informative enough to achieve reliable object tracking. Moreover, we can recover the geometry of this local patch online by extracting local surface normal information embedded in each tactile image. We propose a novel two-stage approach. First, we learn a mapping from tactile images to surface normals using an image translation network. Second, we use these surface normals within a factor graph to both reconstruct a local patch map and use it to infer 3D object poses. We demonstrate reliable object tracking for over $100$ contact sequences across uniquely shaped objects: four in simulation and two in the real world. Supplementary video: https://youtu.be/FHks--haOGY
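To make the two-stage pipeline in the abstract concrete, below is a minimal sketch, not the authors' implementation: stage 1 is a small convolutional network standing in for the image translation network that maps a tactile image to a per-pixel surface-normal map, and stage 2 is a toy rigid alignment (Kabsch/Procrustes) standing in for the factor-graph optimizer that fuses contact measurements into a patch map and a pose estimate. The network architecture, tensor shapes, and the closed-form pose solver are illustrative assumptions.

```python
# Sketch of the two-stage idea: tactile image -> surface normals -> pose.
# All sizes and the SVD-based pose step are assumptions for illustration.
import math
import torch
import torch.nn as nn


class TactileToNormals(nn.Module):
    """Stage 1 (sketch): predict a unit surface-normal map from a tactile image."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),  # 3 output channels = (nx, ny, nz)
        )

    def forward(self, img):
        normals = self.net(img)
        # Normalize so every pixel is a valid unit surface normal.
        return normals / (normals.norm(dim=1, keepdim=True) + 1e-8)


def refine_pose(points_obj, points_meas):
    """Stage 2 (toy stand-in for the factor graph): rigid alignment of measured
    contact points to the patch map via the Kabsch algorithm. Returns (R, t)
    such that R @ p_meas + t approximates p_obj."""
    mu_o, mu_m = points_obj.mean(0), points_meas.mean(0)
    H = (points_meas - mu_m).T @ (points_obj - mu_o)
    U, _, Vt = torch.linalg.svd(H)
    d = torch.det(Vt.T @ U.T).sign().item()   # guard against reflections
    D = torch.diag(torch.tensor([1.0, 1.0, d]))
    R = Vt.T @ D @ U.T
    t = mu_o - R @ mu_m
    return R, t


if __name__ == "__main__":
    model = TactileToNormals()
    tactile_img = torch.rand(1, 3, 64, 64)          # placeholder tactile image
    normal_map = model(tactile_img)                 # (1, 3, 64, 64) unit normals
    print("predicted normal map:", normal_map.shape)

    # Toy pose refinement: measured contact points are a rotated/shifted copy
    # of points on the (assumed known) local patch map.
    pts_obj = torch.rand(20, 3)
    c, s = math.cos(0.3), math.sin(0.3)
    R_true = torch.tensor([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    pts_meas = (pts_obj - 0.1) @ R_true.T
    R, t = refine_pose(pts_obj, pts_meas)
    print("recovered rotation:\n", R)
    print("recovered translation:", t)
```

In the paper the second stage is a factor graph that jointly estimates the local patch map and the object pose from many contact measurements; the single-shot SVD alignment above is only a stand-in to show where the learned normals feed into pose estimation.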

Submitted: Nov 15, 2021