Paper ID: 2112.00432
A benchmark with decomposed distribution shifts for 360 monocular depth estimation
Georgios Albanis, Nikolaos Zioulis, Petros Drakoulis, Federico Alvarez, Dimitrios Zarpalas, Petros Daras
In this work we contribute a distribution shift benchmark for a computer vision task: monocular depth estimation. Our differentiation is the decomposition of the wider distribution shift of uncontrolled testing on in-the-wild data into three distinct distribution shifts. Specifically, we generate data via synthesis and analyze them to produce covariate (color input), prior (depth output), and concept (their relationship) distribution shifts. We also synthesize combinations and show that each one is indeed a different challenge to address, as stacking them produces increased performance drops that cannot be addressed horizontally using standard approaches.
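As a point of reference for the three shifts named in the abstract, they are conventionally formalized in terms of a source (train) distribution $p_s$ and a target (test) distribution $p_t$ over color inputs $x$ and depth outputs $y$; the notation below is a standard illustrative sketch, not taken from the paper itself.

```latex
% Conventional decomposition of distribution shift between a source (train)
% distribution p_s and a target (test) distribution p_t over inputs x (color)
% and outputs y (depth). Notation is illustrative, not the paper's own.
\begin{align*}
  \text{covariate shift:} \quad & p_s(x) \neq p_t(x),
    & p_s(y \mid x) &= p_t(y \mid x) \\
  \text{prior shift:}     \quad & p_s(y) \neq p_t(y),
    & p_s(x \mid y) &= p_t(x \mid y) \\
  \text{concept shift:}   \quad & p_s(y \mid x) \neq p_t(y \mid x),
    & p_s(x) &= p_t(x)
\end{align*}
```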
Submitted: Dec 1, 2021