Paper Title

Compute-and-Forward in Large Relaying Systems: Limitations and Asymptotically Optimal Scheduling

Paper Authors

Ori Shmuel, Asaf Cohen, Omer Gurewitz

Paper Abstract

Compute-and-Forward (CF) is a coding scheme which enables receivers to decode linear combinations of simultaneously transmitted messages by exploiting the linear properties of lattice codes and the additive nature of the shared medium. The scheme was originally designed for relay networks, yet it has also proven useful in other communication problems, such as MIMO communication. Existing works in the literature assume a fixed number of transmitters and receivers in the system. However, with the growing density of communication networks, it is interesting to investigate the performance of CF when the number of transmitters is large. In this work, we show that as the number of transmitters grows, CF becomes degenerate, in the sense that a relay prefers to decode only the single strongest user, treating the other users as noise, rather than any other linear combination of the transmitted codewords. Moreover, the system sum-rate tends to zero as well. This makes scheduling necessary in order to maintain the superior capabilities CF provides. We therefore examine the problem of scheduling for CF. We begin with insights into why good scheduling opportunities can be found. We then provide an asymptotically optimal, polynomial-time scheduling algorithm and analyze its performance. We conclude that with proper scheduling, CF is not merely non-degenerate, but in fact provides a gain in system sum-rate, up to the optimal scaling law of $O(\log{\log{L}})$.
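
As a concrete illustration of the degeneration claim, the following minimal sketch (not from the paper) evaluates the standard Nazer-Gastpar computation rate $R(\mathbf{h},\mathbf{a}) = \frac{1}{2}\log_2^{+}\Big(\big(\|\mathbf{a}\|^2 - \tfrac{P(\mathbf{h}^T\mathbf{a})^2}{1+P\|\mathbf{h}\|^2}\big)^{-1}\Big)$ under assumed parameters (equal power $P$, i.i.d. real Gaussian fading) and a heuristic brute-force search over integer coefficients restricted to $\{-1,0,1\}$; it then checks whether the maximizing coefficient vector is a unit vector, i.e., whether the relay would rather decode a single user while treating the rest as noise. The coefficient restriction and the specific $P$ are purely illustrative; the true coefficient optimization is a shortest-lattice-vector problem over all nonzero integer vectors.

```python
# Minimal illustrative sketch (assumptions: real-valued i.i.d. N(0,1) fading,
# equal transmit power P, and a heuristic search over coefficients in {-1,0,1};
# the paper's setting and algorithms are not reproduced here).
import itertools
import numpy as np

def computation_rate(h, a, P):
    """Nazer-Gastpar computation rate: 0.5*log2^+ of 1/(||a||^2 - P(h^T a)^2/(1+P||h||^2)).
    The underlying quadratic form is positive definite, so the argument is positive for a != 0."""
    f = np.dot(a, a) - P * np.dot(h, a) ** 2 / (1.0 + P * np.dot(h, h))
    return max(0.5 * np.log2(1.0 / f), 0.0)

def best_coefficients(h, P):
    """Brute-force the best coefficient vector with entries in {-1, 0, 1} (zero vector excluded)."""
    best_a, best_r = None, -1.0
    for a in itertools.product((-1, 0, 1), repeat=len(h)):
        a = np.array(a)
        if not a.any():          # skip the all-zero vector
            continue
        r = computation_rate(h, a, P)
        if r > best_r:
            best_a, best_r = a, r
    return best_a, best_r

rng = np.random.default_rng(0)
P = 10.0
for L in (2, 4, 6, 8, 10):
    h = rng.standard_normal(L)        # one fading realization per L
    a, r = best_coefficients(h, P)
    unit = np.sum(np.abs(a)) == 1     # unit-vector coefficients = decode one user, treat rest as noise
    print(f"L={L:2d}  best computation rate={r:.3f}  unit-vector a: {bool(unit)}")
```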
