Paper ID: 2208.02423
Adaptive Latent Factor Analysis via Generalized Momentum-Incorporated Particle Swarm Optimization
Jiufang Chen, Ye Yuan
The stochastic gradient descent (SGD) algorithm is an effective learning strategy for building a latent factor analysis (LFA) model on a high-dimensional and incomplete (HDI) matrix. A particle swarm optimization (PSO) algorithm is commonly adopted to make an SGD-based LFA model's hyper-parameters, i.e., the learning rate and regularization coefficient, self-adaptive. However, a standard PSO algorithm may suffer from accuracy loss caused by premature convergence. To address this issue, this paper incorporates more historical information into each particle's evolutionary process to avoid premature convergence, following the principle of a generalized-momentum (GM) method, thereby achieving a novel GM-incorporated PSO (GM-PSO). Based on it, a GM-PSO-based LFA (GMPL) model is further established to implement efficient self-adaptation of hyper-parameters. The experimental results on three HDI matrices demonstrate that the GMPL model achieves higher prediction accuracy for missing data estimation in industrial applications.
Submitted: Aug 4, 2022
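
The abstract describes folding historical information into each particle's update in the spirit of a generalized-momentum method. Below is a minimal, hypothetical sketch of one way such a GM-PSO update could look, not the paper's exact formulation: each particle keeps a decaying accumulation of its past position increments and adds it to the standard PSO velocity step. The objective `sphere`, the momentum weight `beta`, and all constants are illustrative stand-ins (in the paper's setting, the position would encode the learning rate and regularization coefficient, and the objective would be the LFA model's validation error).

```python
# Hypothetical GM-PSO sketch: standard PSO plus a decaying memory of past
# position increments ("historical information") to damp premature convergence.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Toy objective standing in for an SGD-based LFA model's validation error."""
    return float(np.sum(x ** 2))

def gm_pso(f, dim=2, n_particles=20, iters=100,
           w=0.7, c1=1.5, c2=1.5, beta=0.4):
    # beta weights the generalized-momentum term; an illustrative value,
    # not taken from the paper.
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))  # positions (hyper-parameters)
    v = np.zeros_like(x)                            # velocities
    m = np.zeros_like(x)                            # accumulated past increments
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()          # global best position

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Standard PSO velocity: inertia + cognitive + social terms.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        step = v + beta * m      # inject historical information into the move
        x = x + step
        m = beta * m + step      # decaying memory of past position increments
        # Update personal and global bests.
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, float(pbest_val.min())

best, best_val = gm_pso(sphere)
print(best, best_val)
```

With `beta = 0` this reduces to standard PSO; a positive `beta` lets past increments keep pushing a particle along its recent trajectory, which is one plausible reading of how extra historical information could counteract premature convergence.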