Paper ID: 2205.00481
A recipe for training neural network-based LDPC decoders
Guangwen Li, Xiao Yu
It is known that belief propagation decoding variants of LDPC codes can be easily unrolled as neural networks by assigning distinct weights to the message-passing edges. In this paper we focus on how to determine these weights, in the form of trainable parameters, within a deep learning framework. First, a new method is proposed to generate high-quality training data by exploiting an approximation to the targeted mixture density. Second, tracing the training evolution curves reveals a strong positive correlation between the training loss and the decoding metrics. Finally, to facilitate training convergence and reduce decoding complexity, we highlight the importance of pruning the number of trainable parameters while carefully choosing the locations of the surviving ones, a strategy justified by extensive simulations.
Submitted: May 1, 2022
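
The unrolling the abstract describes, one network layer per belief propagation iteration with a trainable weight attached to each check-to-variable edge, can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the min-sum update rule, the PyTorch framing, the toy parity-check matrix H, and all names (NeuralMinSumDecoder, edge_w, n_iters) are invented for the example.

    # Minimal sketch of an unrolled weighted min-sum LDPC decoder (assumed
    # design; not the paper's code). Per-edge, per-iteration weights are the
    # trainable parameters the abstract refers to.
    import torch
    import torch.nn as nn

    class NeuralMinSumDecoder(nn.Module):
        def __init__(self, H: torch.Tensor, n_iters: int = 5):
            super().__init__()
            self.register_buffer("mask", H.bool())  # (m, n) check-variable adjacency
            self.n_iters = n_iters
            # One trainable weight per edge per iteration; the pruning the
            # paper advocates would tie or remove most of these.
            self.edge_w = nn.Parameter(torch.ones(n_iters, *H.shape))

        def forward(self, llr: torch.Tensor) -> torch.Tensor:
            # llr: (batch, n) channel log-likelihood ratios
            mask = self.mask
            n = mask.shape[1]
            c2v = torch.zeros(llr.shape[0], *mask.shape, device=llr.device)
            for t in range(self.n_iters):
                # Variable-to-check messages (extrinsic: subtract the edge's own input).
                total = llr.unsqueeze(1) + c2v.sum(dim=1, keepdim=True) - c2v
                v2c = torch.where(mask, total, torch.zeros_like(total))
                # Min-sum check update: sign product and minimum magnitude,
                # both taken over the *other* edges of each check.
                sign = torch.where(mask & (v2c < 0), -torch.ones_like(v2c),
                                   torch.ones_like(v2c))
                sgn_excl = sign.prod(dim=2, keepdim=True) * sign  # sign^2 = 1 removes self
                mag = torch.where(mask, v2c.abs(), torch.full_like(v2c, float("inf")))
                m1, idx = mag.min(dim=2, keepdim=True)
                m2 = mag.scatter(2, idx, float("inf")).min(dim=2, keepdim=True)[0]
                is_min = torch.arange(n, device=llr.device).view(1, 1, n) == idx
                min_excl = torch.where(is_min, m2, m1)
                # Weighted check-to-variable messages: the trainable
                # per-edge weights enter here.
                c2v = torch.where(mask, self.edge_w[t] * sgn_excl * min_excl,
                                  torch.zeros_like(c2v))
            # Soft output: channel LLR plus all incoming check messages.
            return llr + c2v.sum(dim=1)

    # Usage with a toy parity-check matrix (illustrative only):
    H = torch.tensor([[1., 1., 0., 1., 0., 0.],
                      [0., 1., 1., 0., 1., 0.],
                      [1., 0., 0., 0., 1., 1.]])
    decoder = NeuralMinSumDecoder(H, n_iters=3)
    out = decoder(torch.randn(4, 6))  # (4, 6) soft outputs; hard decision: out < 0

In training, the soft output would typically be fed to a sigmoid cross-entropy loss against the transmitted bits, which is consistent with the abstract's observation that the training loss correlates strongly with the decoding metrics.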