Paper ID: 2112.09298

Human-vehicle Cooperative Visual Perception for Autonomous Driving under Complex Road and Traffic Scenarios

Yiyue Zhao, Cailin Lei, Yu Shen, Yuchuan Du, Qijun Chen

Human-vehicle cooperative driving has become a critical technology for autonomous driving because it reduces the workload of human drivers. However, complex and uncertain road environments pose great challenges to the visual perception of cooperative systems, and the perception characteristics of autonomous driving differ considerably from those of manual driving. To enhance the visual perception capability of human-vehicle cooperative driving, this paper proposes a cooperative visual perception model. A dataset of 506 images of complex road and traffic scenarios was collected as the data source, and the object detection algorithm of autonomous vehicles was improved, reaching a mean perception accuracy of 75.52% for traffic elements. Through an image fusion method, the gaze points of human drivers were fused with the vehicles' monitoring screens. Results reveal that cooperative visual perception can highlight the riskiest zone and predict the trajectories of conflict objects more precisely. The findings can be applied to improve visual perception algorithms and to provide accurate data for planning and control.
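To make the gaze-fusion idea concrete, the following is a minimal sketch, not the authors' implementation: it assumes a driver's gaze point is reported in normalized eye-tracker screen coordinates, maps it onto the vehicle's monitoring frame through a hypothetical calibration homography, and overlays it as a Gaussian attention heat map. The calibration matrix, heat-map spread, and blending weight are illustrative assumptions.

```python
# Illustrative sketch (assumed pipeline, not the paper's code): fuse a driver's
# gaze point with the vehicle's monitoring image as an attention overlay.
import cv2
import numpy as np

FRAME_W, FRAME_H = 1280, 720

# Hypothetical screen-to-image homography from a prior calibration step;
# here it simply scales normalized screen coordinates to pixel coordinates.
H_SCREEN_TO_IMG = np.array([[FRAME_W, 0.0, 0.0],
                            [0.0, FRAME_H, 0.0],
                            [0.0, 0.0, 1.0]], dtype=np.float64)


def gaze_to_pixel(gaze_xy):
    """Map a normalized gaze point (x, y) in [0, 1] to pixel coordinates."""
    pt = np.array([gaze_xy[0], gaze_xy[1], 1.0])
    u, v, w = H_SCREEN_TO_IMG @ pt
    return int(round(u / w)), int(round(v / w))


def overlay_gaze(frame, gaze_xy, sigma=40.0, alpha=0.45):
    """Blend a Gaussian heat map centered on the gaze point into the frame."""
    h, w = frame.shape[:2]
    cx, cy = gaze_to_pixel(gaze_xy)
    ys, xs = np.mgrid[0:h, 0:w]
    heat = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))
    colored = cv2.applyColorMap((255 * heat).astype(np.uint8), cv2.COLORMAP_JET)
    return cv2.addWeighted(frame, 1.0 - alpha, colored, alpha, 0.0)


if __name__ == "__main__":
    # Placeholder gray frame standing in for the vehicle's monitoring image.
    frame = np.full((FRAME_H, FRAME_W, 3), 90, dtype=np.uint8)
    fused = overlay_gaze(frame, gaze_xy=(0.62, 0.48))
    cv2.imwrite("gaze_fusion_demo.png", fused)
```

In practice the overlay could equally be combined with the detector's output (e.g. bounding boxes of conflict objects) so that detections falling inside the high-attention region are flagged as the riskiest zone; that combination is an assumption here, as the abstract does not specify the fusion details.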

Submitted: Dec 17, 2021