Paper ID: 2111.02632
A Fast Parallel Tensor Decomposition with Optimal Stochastic Gradient Descent: an Application in Structural Damage Identification
Ali Anaissi, Basem Suleiman, Seid Miad Zandavi
Structural Health Monitoring (SHM) provides an economical approach that aims to enhance understanding of the behavior of structures by continuously collecting data through multiple networked sensors attached to the structure. This data is then utilized to gain insight into the health of a structure and to make timely and cost-effective decisions about its maintenance. The generated SHM sensing data is non-stationary and exists in a correlated multi-way form, which makes batch/offline learning and standard two-way matrix analysis unable to capture all of these correlations and relationships. In this sense, online tensor data analysis has become an essential tool for capturing underlying structures in higher-order datasets stored in a tensor $\mathcal{X} \in \mathbb{R} ^{I_1 \times \dots \times I_N} $. The CANDECOMP/PARAFAC (CP) decomposition has been extensively studied and applied to approximate $\mathcal{X}$ by $N$ loading matrices $A^{(1)}, \dots, A^{(N)}$, where $N$ represents the order of the tensor. We propose a novel algorithm, FP-CPD, to parallelize the CANDECOMP/PARAFAC (CP) decomposition of a tensor $\mathcal{X} \in \mathbb{R} ^{I_1 \times \dots \times I_N} $. Our approach is based on the stochastic gradient descent (SGD) algorithm, which allows us to parallelize the learning process; it is also very useful in online settings since it updates $\mathcal{X}^{t+1}$ in a single step. Our SGD algorithm is augmented with Nesterov's Accelerated Gradient (NAG) and perturbation methods to accelerate and guarantee convergence. The experimental results using laboratory-based and real-life structural datasets indicate fast convergence and good scalability.
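To illustrate the general idea described in the abstract, the following is a minimal sketch (not the authors' FP-CPD implementation) of a rank-R CP decomposition of a 3-way tensor fitted entry-wise by SGD with Nesterov-style momentum; the function name cp_sgd_nag and all hyperparameters are illustrative assumptions.

    # Minimal sketch, assuming a dense 3-way numpy tensor X; not the FP-CPD algorithm itself.
    import numpy as np

    def cp_sgd_nag(X, rank, lr=0.01, momentum=0.9, epochs=50, batch=256, seed=0):
        rng = np.random.default_rng(seed)
        dims = X.shape
        # Loading matrices A^(1), A^(2), A^(3) and their velocity terms for NAG.
        A = [rng.standard_normal((d, rank)) * 0.1 for d in dims]
        V = [np.zeros_like(a) for a in A]
        idx = np.argwhere(np.ones(dims, dtype=bool))  # all entry indices (i, j, k)
        for _ in range(epochs):
            sample = idx[rng.choice(len(idx), size=batch, replace=False)]
            for i, j, k in sample:
                # Nesterov look-ahead: evaluate the gradient at the momentum-shifted factors.
                Ai = A[0][i] + momentum * V[0][i]
                Aj = A[1][j] + momentum * V[1][j]
                Ak = A[2][k] + momentum * V[2][k]
                err = X[i, j, k] - np.sum(Ai * Aj * Ak)
                # Gradients of the squared reconstruction error w.r.t. each factor row.
                grads = [-err * Aj * Ak, -err * Ai * Ak, -err * Ai * Aj]
                for m, (r, g) in enumerate(zip((i, j, k), grads)):
                    V[m][r] = momentum * V[m][r] - lr * g
                    A[m][r] += V[m][r]
        return A

Because each sampled entry touches only one row of each loading matrix, the per-entry updates can be distributed across workers, which is the property the paper exploits for parallelization.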
Submitted: Nov 4, 2021