Learned Decimation for Neural Belief Propagation Decoders

11/04/2020
by Andreas Buchberger et al.

We introduce a two-stage decimation process to improve the performance of neural belief propagation (NBP) decoding, recently introduced by Nachmani et al., for short low-density parity-check (LDPC) codes. In the first stage, we build a list by iterating between a conventional NBP decoder and guessing the least reliable bit. The second stage iterates between a conventional NBP decoder and learned decimation, where a neural network decides the decimation value for each bit. For a (128,64) LDPC code, the proposed NBP decoder with decimation outperforms NBP decoding by 0.75 dB and performs within 1 dB of maximum-likelihood decoding at a block error rate of 10^-4.
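The following is a minimal Python sketch of the two-stage procedure described above, not the paper's implementation: the NBP decoder is stood in for by a plain (unweighted) min-sum belief-propagation routine, the learned-decimation network is replaced by a simple confidence threshold, and all names (nbp_decode, stage1_list, stage2_decimation), the toy (7,4) parity-check matrix, and the clamp/threshold values are illustrative assumptions.

```python
import numpy as np

# Toy (7,4) Hamming parity-check matrix, used only to keep the sketch self-contained.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])


def nbp_decode(llr_ch, num_iters=10):
    """Stand-in for the NBP decoder: plain min-sum BP returning posterior LLRs."""
    m, n = H.shape
    msg_cv = np.zeros((m, n))  # check-to-variable messages
    for _ in range(num_iters):
        # variable-to-check: channel LLR plus all incoming check messages except the own edge
        msg_vc = llr_ch + msg_cv.sum(axis=0) - msg_cv
        for i in range(m):
            idx = np.nonzero(H[i])[0]
            v = msg_vc[i, idx]
            signs, mags = np.sign(v), np.abs(v)
            for k, j in enumerate(idx):
                # min-sum check update over all other edges of check i
                msg_cv[i, j] = np.prod(np.delete(signs, k)) * np.delete(mags, k).min()
    return llr_ch + msg_cv.sum(axis=0)  # posterior LLRs


def is_codeword(bits):
    return not np.any(H @ bits % 2)


def stage1_list(llr_ch, depth=2, clamp=20.0):
    """Stage 1: alternate NBP decoding with guessing the least reliable bit,
    collecting a list of candidate hard decisions (binary tree of depth `depth`)."""
    candidates = []
    frontier = [llr_ch.copy()]
    for _ in range(depth):
        next_frontier = []
        for llr in frontier:
            post = nbp_decode(llr)
            candidates.append((post < 0).astype(int))
            j = np.argmin(np.abs(post))       # least reliable bit after decoding
            for val in (+clamp, -clamp):      # guess it both ways and branch
                branch = llr.copy()
                branch[j] = val               # decimation = clamping the input LLR
                next_frontier.append(branch)
        frontier = next_frontier
    return candidates


def stage2_decimation(llr_ch, rounds=3, clamp=20.0, thresh=4.0):
    """Stage 2 sketch: alternate NBP decoding with decimation. The paper's neural
    network that picks the decimation value is replaced here by a plain
    confidence threshold, purely as a placeholder."""
    llr = llr_ch.copy()
    for _ in range(rounds):
        post = nbp_decode(llr)
        confident = np.abs(post) > thresh
        llr[confident] = clamp * np.sign(post[confident])  # clamp confident bits
    return (nbp_decode(llr) < 0).astype(int)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sigma = 0.8
    y = 1.0 + sigma * rng.standard_normal(H.shape[1])  # all-zero codeword, BPSK over AWGN
    llr_ch = 2.0 * y / sigma**2
    cands = stage1_list(llr_ch)
    print("stage 1 found a valid codeword:", any(is_codeword(c) for c in cands))
    print("stage 2 hard decision:", stage2_decimation(llr_ch))
```

In this sketch, "decimation" is realized by clamping a bit's input LLR to a large magnitude so that subsequent BP iterations treat it as known; stage 1 branches on both clamp signs for the least reliable bit to build the candidate list, while stage 2 would, in the paper's setting, let a trained network choose the decimation values instead of the threshold rule used here.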
