Paper Title

Generalized Spatially-Coupled Parallel Concatenated Codes With Partial Repetition

Authors

Min Qiu, Xiaowei Wu, Jinhong Yuan, Alexandre Graell i Amat

Abstract

A new class of spatially-coupled turbo-like codes (SC-TCs), dubbed generalized spatially-coupled parallel concatenated codes (GSC-PCCs), is introduced. These codes are constructed by applying spatial coupling on parallel concatenated codes (PCCs) with a fraction of information bits repeated $q$ times. GSC-PCCs can be seen as a generalization of the original spatially-coupled parallel concatenated codes proposed by Moloudi et al. [2]. To characterize the asymptotic performance of GSC-PCCs, we derive the corresponding density evolution equations and compute their decoding thresholds. The threshold saturation effect is observed and proven. Most importantly, we rigorously prove that any rate-$R$ GSC-PCC ensemble with 2-state convolutional component codes achieves at least a fraction $1-\frac{R}{R+q}$ of the capacity of the binary erasure channel (BEC) for repetition factor $q\geq2$, and this multiplicative gap vanishes as $q$ tends to infinity. To the best of our knowledge, this is the first class of SC-TCs proven to be capacity-achieving. Further, the connection between the strength of the component codes, the decoding thresholds of GSC-PCCs, and the repetition factor is established. The superiority of the proposed codes with finite blocklength is exemplified by comparing their error performance with that of existing SC-TCs via computer simulations.
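The abstract's main result can be illustrated numerically. The sketch below (not part of the paper; the function name is ours) evaluates the claimed lower bound $1-\frac{R}{R+q}$ on the achievable fraction of BEC capacity for a few repetition factors $q$, showing that the multiplicative gap $\frac{R}{R+q}$ shrinks toward zero as $q$ grows:

```python
def capacity_fraction(R: float, q: int) -> float:
    """Lower bound from the abstract: a rate-R GSC-PCC ensemble with
    2-state component codes achieves at least this fraction of BEC
    capacity for repetition factor q >= 2."""
    return 1.0 - R / (R + q)

# For a rate-1/2 ensemble, the guaranteed fraction of capacity grows with q:
for q in (2, 4, 8, 100):
    print(f"q = {q:3d}: fraction >= {capacity_fraction(0.5, q):.4f}")
```

For example, at $R=1/2$ and $q=2$ the bound guarantees at least 80% of BEC capacity, and the gap $\frac{R}{R+q}$ vanishes as $q \to \infty$, consistent with the capacity-achieving claim.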
