Paper ID: 2211.04047
DNN Filter for Bias Reduction in Distribution-to-Distribution Scan Matching
Matthew McDermott, Jason Rife
Distribution-to-distribution (D2D) point cloud registration techniques such as the Normal Distributions Transform (NDT) can align point clouds sampled from unstructured scenes and provide accurate bounds on their own solution error covariance, an important feature for safety-of-life navigation tasks. D2D methods rely on the assumption of a static scene and are therefore susceptible to bias from range shadowing, self-occlusion, moving objects, and distortion artifacts introduced as the recording device moves between frames. Deep learning-based approaches can achieve higher accuracy in dynamic scenes by relaxing these constraints; however, DNNs produce uninterpretable solutions, which can be problematic from a safety perspective. In this paper, we propose a method of down-sampling LIDAR point clouds to exclude voxels that violate the assumption of a static scene and introduce error into the D2D scan matching process. Our approach uses a solution consistency filter, identifying and suppressing voxels whose D2D contributions disagree with local estimates from a PointNet-based registration network. Our results show that this technique significantly improves registration accuracy and is particularly useful in scenes containing dense foliage.
Submitted: Nov 8, 2022
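
The solution consistency filter described in the abstract amounts to a per-voxel gating step. The Python sketch below illustrates one plausible reading of that idea: compare the local motion implied by each voxel's D2D contribution against the local estimate from the PointNet-based network, and suppress voxels where the two disagree. The function name, array shapes, and threshold rule are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a per-voxel solution-consistency filter. Assumes each
# voxel already carries (a) a local translation implied by its D2D (e.g.
# NDT-style) correspondence and (b) a local translation predicted by a
# PointNet-style registration network. Hypothetical names and threshold.
import numpy as np

def consistency_filter(d2d_local_est, dnn_local_est, threshold=0.5):
    """Return a boolean mask of voxels to keep for D2D scan matching.

    d2d_local_est : (N, 3) per-voxel translation implied by D2D matching [m]
    dnn_local_est : (N, 3) per-voxel translation predicted by the DNN [m]
    threshold     : maximum allowed disagreement (Euclidean norm) [m]
    """
    disagreement = np.linalg.norm(d2d_local_est - dnn_local_est, axis=1)
    return disagreement < threshold

# Toy usage: the third voxel disagrees strongly (e.g. a moving object or a
# shadowed region) and is suppressed before the final D2D alignment.
d2d = np.array([[0.10,  0.00, 0.0],
                [0.12, -0.02, 0.0],
                [1.50,  0.40, 0.0]])
dnn = np.array([[0.11,  0.01, 0.0],
                [0.10,  0.00, 0.0],
                [0.09, -0.01, 0.0]])
keep = consistency_filter(d2d, dnn, threshold=0.3)
print(keep)  # [ True  True False ]
```

In this reading, the DNN is used only to flag voxels that violate the static-scene assumption, while the interpretable D2D solver still produces the final registration and its error covariance.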