Technical Program

Paper Detail

Paper ID C.1.4
Paper Title Multi-Label and Concatenated Neural Block Decoders
Authors Cheuk Ting Leung, NUS, Singapore; Rajshekhar Bhat, IIT, India; Mehul Motani, NUS, Singapore
Session C.1: Coding for Communications I
Presentation Lecture
Track Coding for Communications
Abstract There has been growing interest in designing neural-network-based decoders (neural decoders, for short) for communication systems. In prior work, we cast the problem of decoding an $(n,k)$ block code as a single-label classification problem and showed that the performance of such \emph{single-label neural decoders} closely approaches that of the corresponding maximum-likelihood soft-decision (ML-SD) decoders. The main issue is that the number of output nodes of a single-label neural decoder grows exponentially with $k$, making it prohibitive to decode codes of medium or large dimension. To address this issue, we first explore a neural decoder for block codes based on multi-label classification, in which the number of output nodes grows only linearly with $k$. The complexity of the multi-label neural decoder is lower, yet its performance remains close to that of the ML-SD decoder. We also consider concatenating a high-rate, short-length outer code with the original code serving as the inner code. The proposed concatenated decoding architecture consists of a multi-label neural decoder for the inner code and a single-label neural decoder for the outer code. The results demonstrate that the concatenated decoding approach achieves better bit and block error performance than a benchmark soft-decision decoder. We note that the overall size of the concatenated neural decoder is close to that of the single-label neural decoder.
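
To illustrate the output-layer scaling discussed in the abstract, below is a minimal sketch (not the authors' implementation; PyTorch, the hidden-layer size, and the $(15,7)$ code parameters are illustrative assumptions) contrasting a single-label decoder with $2^k$ output nodes against a multi-label decoder with $k$ output nodes.

```python
# Sketch only: contrasts the output layers of a single-label and a multi-label
# neural decoder for an (n, k) block code. Hidden sizes, code parameters, and
# the choice of PyTorch are assumptions, not taken from the paper.
import torch
import torch.nn as nn

n, k = 15, 7  # hypothetical (n, k) code parameters

# Single-label decoder: one output node per message, i.e. 2^k classes.
single_label_decoder = nn.Sequential(
    nn.Linear(n, 128), nn.ReLU(),
    nn.Linear(128, 2 ** k),          # output size grows exponentially in k
)

# Multi-label decoder: one output node per message bit, i.e. k logits.
multi_label_decoder = nn.Sequential(
    nn.Linear(n, 128), nn.ReLU(),
    nn.Linear(128, k),               # output size grows linearly in k
)

# The training targets differ accordingly:
#   single-label: a class index in {0, ..., 2^k - 1}, with cross-entropy loss
#   multi-label : a k-bit vector in {0, 1}^k, with per-bit binary cross-entropy
ce_loss = nn.CrossEntropyLoss()
bce_loss = nn.BCEWithLogitsLoss()

received = torch.randn(32, n)                    # batch of noisy channel outputs
msg_bits = torch.randint(0, 2, (32, k)).float()  # true message bits
msg_index = (msg_bits @ (2 ** torch.arange(k, dtype=torch.float))).long()

loss_single = ce_loss(single_label_decoder(received), msg_index)
loss_multi = bce_loss(multi_label_decoder(received), msg_bits)
```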
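
The concatenated architecture can be sketched in the same spirit. The following is an assumed, simplified pipeline in which a multi-label inner decoder produces $k$ soft bit estimates that feed a single-label outer decoder over $2^m$ outer messages; the class name, layer sizes, and the outer-code parameter $m$ are hypothetical placeholders rather than the paper's exact architecture.

```python
# Rough sketch of a concatenated neural decoder: a multi-label stage for the
# inner (n, k) code followed by a single-label stage for a high-rate,
# short-length outer code with 2^m messages. All sizes are placeholders.
import torch
import torch.nn as nn


class ConcatenatedNeuralDecoder(nn.Module):
    def __init__(self, n: int, k: int, m: int, hidden: int = 128):
        super().__init__()
        # Inner stage: maps n channel observations to k soft bit estimates.
        self.inner = nn.Sequential(
            nn.Linear(n, hidden), nn.ReLU(),
            nn.Linear(hidden, k),
        )
        # Outer stage: maps k soft bits to one of 2^m outer messages.
        self.outer = nn.Sequential(
            nn.Linear(k, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 ** m),
        )

    def forward(self, received: torch.Tensor) -> torch.Tensor:
        soft_bits = torch.sigmoid(self.inner(received))  # multi-label inner decoding
        return self.outer(soft_bits)                      # single-label outer decoding


decoder = ConcatenatedNeuralDecoder(n=15, k=7, m=4)
logits = decoder(torch.randn(32, 15))    # logits over the 2^m outer messages
decoded = logits.argmax(dim=-1)          # index of the decoded outer message
```

A forward pass returns logits over the $2^m$ outer messages, and an argmax recovers the decoded message index; the outer output layer stays small because the outer code is short, which is consistent with the abstract's remark that the overall decoder size is close to that of a single-label neural decoder.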
