Paper ID: 2307.03854
inTformer: A Time-Embedded Attention-Based Transformer for Crash Likelihood Prediction at Intersections Using Connected Vehicle Data
B M Tazbiul Hassan Anik, Zubayer Islam, Mohamed Abdel-Aty
Real-time crash likelihood prediction models are an essential component of proactive traffic safety management systems. Over the years, numerous studies have attempted to construct crash likelihood prediction models to enhance traffic safety, but mostly on freeways. In the majority of existing studies, researchers have primarily employed deep learning-based frameworks to identify crash potential. Lately, the Transformer has emerged as a promising deep neural network architecture that fundamentally operates through attention mechanisms. The Transformer has several functional benefits over existing deep learning models such as LSTM and CNN. First, it can readily handle long-term dependencies in a data sequence. Second, it can process all elements of a data sequence in parallel during training. Finally, it does not suffer from the vanishing gradient problem. Recognizing the potential of Transformers, this paper proposes inTersection-Transformer (inTformer), a time-embedded attention-based Transformer model that can effectively predict intersection crash likelihood in real time. The proposed model was evaluated using connected vehicle data extracted from the Signal Analytics Platform. Acknowledging the complex traffic operation mechanisms at intersections, this study developed zone-specific models by dividing the intersection region into two distinct zones: the within-intersection zone and the approach zone. The best inTformer models for the 'within-intersection' and 'approach' zones achieved sensitivities of 73% and 70%, respectively. The zone-level models were also compared with earlier studies on crash likelihood prediction at intersections and with several established deep learning models trained on the same connected vehicle dataset.
Submitted: Jul 7, 2023
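The abstract describes a time-embedded attention-based Transformer for binary crash likelihood prediction. Below is a minimal illustrative sketch (not the authors' implementation) of how such an architecture could be assembled in PyTorch, assuming each input sample is a short sequence of per-time-slice traffic features paired with a categorical time-of-day index; all layer names, feature counts, and sizes here are hypothetical choices for illustration.

```python
# Minimal sketch of a time-embedded Transformer encoder for binary crash
# likelihood prediction. Assumed inputs: a sequence of traffic features per
# time slice plus a time-of-day bin index; all dimensions are illustrative.
import torch
import torch.nn as nn


class TimeEmbeddedTransformer(nn.Module):
    def __init__(self, n_features=8, d_model=64, n_heads=4,
                 n_layers=2, n_time_bins=96):
        super().__init__()
        # Project raw traffic features into the model dimension.
        self.feature_proj = nn.Linear(n_features, d_model)
        # Learned embedding for the time-of-day bin of each time slice
        # (the "time-embedded" component, as assumed here).
        self.time_embed = nn.Embedding(n_time_bins, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Binary head: crash vs. non-crash likelihood.
        self.classifier = nn.Linear(d_model, 1)

    def forward(self, x, time_idx):
        # x: (batch, seq_len, n_features); time_idx: (batch, seq_len) long tensor
        h = self.feature_proj(x) + self.time_embed(time_idx)
        h = self.encoder(h)        # self-attention over the time slices
        h = h.mean(dim=1)          # pool the sequence representation
        return torch.sigmoid(self.classifier(h)).squeeze(-1)


if __name__ == "__main__":
    model = TimeEmbeddedTransformer()
    x = torch.randn(4, 6, 8)            # 4 samples, 6 time slices, 8 features
    t = torch.randint(0, 96, (4, 6))    # 15-minute time-of-day bins
    print(model(x, t))                  # crash likelihoods in (0, 1)
```

In this sketch, zone-specific models (within-intersection vs. approach) would simply be two separate instances of the same architecture trained on data from their respective zones.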