Paper ID: 2406.14634
Adaptive Manipulation using Behavior Trees
Jacques Cloete, Wolfgang Merkt, Ioannis Havoutis
Many manipulation tasks pose a challenge because they depend on non-visual environmental information that can only be determined after sustained physical interaction has begun. This is particularly relevant for effort-sensitive, dynamics-dependent tasks such as tightening a valve. To perform these tasks safely and reliably, robots must be able to quickly adapt in response to unexpected changes during task execution. Humans can intuitively respond and adapt their manipulation strategy to such problems, but representing and implementing such behaviors for robots remains an open question. We present the adaptive behavior tree, which enables a robot to quickly adapt to both visual and non-visual observations during task execution, preempting task failure or switching to a different strategy based on data from previous attempts. We test our approach on a number of tasks commonly found in industrial settings. Our results demonstrate safety, robustness (a 100% success rate for all but one experiment), and efficiency in task completion (e.g., an overall task speedup of 46% on average for valve tightening), while reducing dependency on human supervision and intervention.
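To make the core idea concrete, the sketch below shows a minimal, self-contained behavior tree in which a fallback (selector) node preempts a running strategy when a non-visual condition (here, a wrist-torque check) fails, and switches to an alternative strategy. This is only an illustration of the preemption-and-fallback pattern described in the abstract, not the authors' implementation; all node names, thresholds, and sensor signals are assumptions.

```python
"""Minimal sketch of an adaptive behavior tree for a contact-rich task
(e.g., valve tightening). Illustrative only: the torque threshold, sensor
stub, and strategy actions are placeholder assumptions."""

from enum import Enum


class Status(Enum):
    SUCCESS = 0
    FAILURE = 1
    RUNNING = 2


class Condition:
    """Leaf node wrapping a boolean check (visual or non-visual observation)."""
    def __init__(self, check):
        self.check = check

    def tick(self) -> Status:
        return Status.SUCCESS if self.check() else Status.FAILURE


class Action:
    """Leaf node wrapping a (possibly long-running) manipulation action."""
    def __init__(self, step):
        self.step = step  # callable returning a Status each tick

    def tick(self) -> Status:
        return self.step()


class Selector:
    """Fallback composite: tries children in order until one does not fail.
    Because the tree is re-ticked continuously, a condition turning false
    can preempt a running lower-priority branch on the next tick."""
    def __init__(self, children):
        self.children = children

    def tick(self) -> Status:
        for child in self.children:
            status = child.tick()
            if status != Status.FAILURE:
                return status
        return Status.FAILURE


class Sequence:
    """Sequence composite: runs children in order; stops at the first
    child that is not SUCCESS."""
    def __init__(self, children):
        self.children = children

    def tick(self) -> Status:
        for child in self.children:
            status = child.tick()
            if status != Status.SUCCESS:
                return status
        return Status.SUCCESS


# --- Illustrative wiring (all signals and strategies are placeholders) ---
def read_wrist_torque() -> float:
    return 2.5  # stand-in for a force/torque sensor reading [Nm]


def torque_within_limits() -> bool:
    return read_wrist_torque() < 4.0  # hypothetical effort threshold


def tighten_with_current_grasp() -> Status:
    return Status.RUNNING  # primary strategy keeps turning the valve


def regrasp_and_retry() -> Status:
    return Status.RUNNING  # fallback strategy, e.g. informed by prior attempts


root = Selector([
    # Primary branch: preempted if the effort check fails mid-execution.
    Sequence([Condition(torque_within_limits),
              Action(tighten_with_current_grasp)]),
    # Fallback branch: tried when the primary branch reports failure.
    Action(regrasp_and_retry),
])

if __name__ == "__main__":
    print(root.tick())  # in practice, tick the tree at a fixed control rate
```

In this pattern, adaptation comes from re-evaluating the condition nodes on every tick rather than only at task start, so non-visual feedback gathered during sustained contact can redirect execution without restarting the whole task.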
Submitted: Jun 20, 2024