Paper ID: 2211.03367
Semantic-Aware Environment Perception for Mobile Human-Robot Interaction
Thorsten Hempel, Marc-André Fiedler, Aly Khalifa, Ayoub Al-Hamadi, Laslo Dinges
Current technological advances open up new opportunities for bringing human-machine interaction to a new level of human-centered cooperation. In this context, a key issue is the semantic understanding of the environment, which enables mobile robots to perform more complex interactions and facilitates communication with humans. Prerequisites are the vision-based registration of semantic objects and of humans, where the latter are further analyzed as potential interaction partners. Despite significant research achievements, the reliable and fast registration of semantic information remains a challenging task for mobile robots in real-world scenarios. In this paper, we present a vision-based system for mobile assistive robots that enables semantic-aware environment perception without additional a priori knowledge. We deploy our system on a mobile humanoid robot, which allows us to test our methods in real-world applications.
Submitted: Nov 7, 2022