Paper ID: 2411.03581
Can Robotic Cues Manipulate Human Decisions? Exploring Consensus Building via Bias-Controlled Non-linear Opinion Dynamics and Robotic Eye Gaze Mediated Interaction in Human-Robot Teaming
Rajul Kumar, Adam Bhatti, Ningshi Yao
Although robots are becoming more advanced, with anthropomorphic features and decision-making abilities intended to improve collaboration, the active integration of humans into this process remains under-explored. This article presents the first experimental study of decision-making interactions between humans and robots in which visual cues from robotic eyes dynamically influence human opinion formation. The cues generated by the robotic eyes gradually guide human decisions toward alignment with the robot's choices. Both the human and the robot decision-making processes are modeled as non-linear opinion dynamics with evolving biases. To examine these opinion dynamics under varying biases, we conduct numerical parametric and equilibrium continuation analyses using parameters tuned explicitly for the presented human-robot interaction experiment. Furthermore, to facilitate the transition from disagreement to agreement, we introduce a human opinion observation algorithm integrated with the formation of the robot's opinion, where the robot's behavior is controlled based on its formed opinion. The developed algorithms aim to enhance human involvement in consensus building, fostering effective collaboration between humans and robots. Experiments with 51 participants (N = 51) show that human-robot teamwork can be improved by guiding human decisions through robotic cues. Finally, we provide detailed insights into the effects of trust, cognitive load, and participant demographics on decision-making, based on user feedback and post-experiment interviews.
Submitted: Nov 6, 2024
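
The abstract models both agents as non-linear opinion dynamics with evolving biases. As a minimal sketch only, the snippet below simulates a generic two-agent saturated-coupling opinion model in which a bias term on the human side is slowly driven toward the robot's opinion (standing in for the eye-gaze cues). The equations, parameter names (d, u, alpha, gamma, bias_gain), and bias update rule are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

# Illustrative two-agent nonlinear opinion dynamics with an evolving bias.
# All parameters and the bias law are assumptions for this sketch.

def simulate_nod(T=20.0, dt=0.01,
                 d=1.0, u=2.0, alpha=0.2, gamma=0.8,
                 bias_gain=0.1, z_robot_target=1.0):
    steps = int(T / dt)
    z_h, z_r = -0.5, 0.5      # initial human / robot opinions (disagreement)
    b_h = 0.0                 # human bias, slowly nudged by robotic cues
    history = np.zeros((steps, 2))
    for k in range(steps):
        # Robot opinion: saturated self/neighbor coupling plus a drive
        # toward its target choice.
        dz_r = (-d * z_r + u * np.tanh(alpha * z_r + gamma * z_h)
                + bias_gain * (z_robot_target - z_r))
        # Human bias evolves toward the robot's current opinion
        # (stand-in for the gaze-cue influence).
        db_h = bias_gain * (z_r - z_h)
        # Human opinion: same saturated coupling plus the evolving bias.
        dz_h = -d * z_h + u * np.tanh(alpha * z_h + gamma * z_r) + b_h
        z_r += dt * dz_r
        b_h += dt * db_h
        z_h += dt * dz_h
        history[k] = (z_h, z_r)
    return history

if __name__ == "__main__":
    traj = simulate_nod()
    print("final human opinion: %+.3f, final robot opinion: %+.3f"
          % (traj[-1, 0], traj[-1, 1]))
```

With these (assumed) parameters the human opinion drifts from initial disagreement toward the robot's choice, illustrating the disagreement-to-agreement transition the abstract studies under bias control.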