Paper ID: 2203.00488

Emergence of human oculomotor behavior from optimal control of a cable-driven biomimetic robotic eye

Reza Javanmard Alitappeh, Akhil John, Bernardo Dias, A. John van Opstal, Alexandre Bernardino

In human-robot interactions, eye movements play an important role in non-verbal communication. However, controlling a robotic eye so that it displays performance similar to the human oculomotor system remains a major challenge. In this paper, we study how to control a realistic model of the human eye with a cable-driven actuation system that mimics the six degrees of freedom of the extra-ocular muscles. The biomimetic design introduces novel challenges, most notably the need to control the pretension on each individual muscle to prevent loss of tension during motion, which would lead to cable slack and loss of control. We built a robotic prototype and developed a nonlinear simulator and two controllers. In the first approach, we linearized the nonlinear model using a local derivative technique and designed linear-quadratic optimal controllers to minimize a cost function that accounts for accuracy, energy expenditure, and movement duration. The second approach uses a recurrent neural network that learns the nonlinear system dynamics from sample trajectories of the system, together with a nonlinear trajectory optimization solver that minimizes a similar cost function. We focused on generating rapid saccadic eye movements with fully unconstrained kinematics, and on generating control signals for the six cables that simultaneously satisfy several dynamic optimization criteria. The model faithfully mimics the three-dimensional rotational kinematics and dynamics observed in human saccades. Our experimental results indicate that, while both methods yield similar results, the nonlinear method is more flexible for future improvements to the model, for which the calculation of the linearized model's position-dependent pretensions and local derivatives becomes particularly tedious.
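The linear-quadratic approach described above can be illustrated with a minimal sketch on a hypothetical scalar plant. This is not the paper's six-cable eye model: the dynamics x_{t+1} = a·x + b·u and the values of a, b, q, r are illustrative assumptions chosen only to show how a quadratic cost on state error (accuracy) and control effort (energy) yields a stabilizing feedback gain.

```python
# Illustrative scalar LQ regulator (hypothetical plant, not the paper's model).
# Cost: sum_t q*x_t^2 + r*u_t^2, dynamics: x_{t+1} = a*x_t + b*u_t.

def lqr_gain(a, b, q, r, iters=500):
    """Solve the scalar discrete-time Riccati equation by fixed-point iteration."""
    p = q
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    # Optimal state-feedback gain k, with control law u = -k * x.
    return a * b * p / (r + b * b * p)

def simulate(a, b, k, x0, steps=50):
    """Roll out the closed loop x_{t+1} = a*x + b*u under u = -k*x."""
    x = x0
    traj = [x]
    for _ in range(steps):
        u = -k * x
        x = a * x + b * u
        traj.append(x)
    return traj

# An unstable open-loop plant (a > 1) driven to the origin by LQ feedback.
k = lqr_gain(a=1.1, b=0.5, q=1.0, r=0.1)
traj = simulate(1.1, 0.5, k, x0=1.0)
```

Tuning q against r trades accuracy for energy expenditure, the same trade-off the abstract's cost function encodes; the paper's controllers additionally handle movement duration and the pretension constraints of the cable-driven design.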

Submitted: Mar 1, 2022