Paper Title

Distortion-Aware Linear Precoding for Massive MIMO Downlink Systems with Nonlinear Power Amplifiers

Paper Authors

Sina Rezaei Aghdam, Sven Jacobsson, Ulf Gustavsson, Giuseppe Durisi, Christoph Studer, Thomas Eriksson

Paper Abstract

We introduce a framework for linear precoder design over a massive multiple-input multiple-output downlink system in the presence of nonlinear power amplifiers (PAs). By studying the spatial characteristics of the distortion, we demonstrate that conventional linear precoding techniques steer nonlinear distortions towards the users. We show that, by taking into account PA nonlinearity, one can design linear precoders that reduce, and in single-user scenarios, even completely remove the distortion transmitted in the direction of the users. This, however, is achieved at the price of a reduced array gain. To address this issue, we present precoder optimization algorithms that simultaneously take into account the effects of array gain, distortion, multiuser interference, and receiver noise. Specifically, we derive an expression for the achievable sum rate and propose an iterative algorithm that attempts to find the precoding matrix which maximizes this expression. Moreover, using a model for PA power consumption, we propose an algorithm that attempts to find the precoding matrix that minimizes the consumed power for a given minimum achievable sum rate. Our numerical results demonstrate that the proposed distortion-aware precoding techniques provide significant improvements in spectral and energy efficiency compared to conventional linear precoders.
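The abstract's first claim is that conventional linear precoding steers the nonlinear PA distortion toward the users. The minimal NumPy sketch below (not the paper's algorithm) illustrates this effect for a single-user line-of-sight channel: a maximum-ratio-transmission (MRT) precoded signal is passed through a memoryless third-order polynomial PA model, and the angular power pattern of the resulting distortion is shown to peak in the user's direction. The array size, PA coefficient, and user angle are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 64            # number of base-station antennas (assumption)
n_sym = 10_000    # number of transmitted symbols
a3 = -0.1         # third-order PA coefficient (illustrative)

def steering(theta, M):
    # Half-wavelength-spaced uniform linear array response
    return np.exp(1j * np.pi * np.arange(M) * np.sin(theta))

theta_user = np.deg2rad(20.0)   # user direction (assumption)
h = steering(theta_user, M)     # LoS channel to the user

# Conventional MRT precoder and the linearly precoded transmit signal
w = h.conj() / np.linalg.norm(h)
s = (rng.standard_normal(n_sym) + 1j * rng.standard_normal(n_sym)) / np.sqrt(2)
x = np.outer(w, s)              # M x n_sym per-antenna signal

# Memoryless third-order PA model: y = x + a3 * x * |x|^2
y = x + a3 * x * np.abs(x) ** 2
d = y - x                       # nonlinear distortion component

# Radiated distortion power versus direction
angles = np.deg2rad(np.linspace(-90.0, 90.0, 361))
A = np.stack([steering(t, M) for t in angles])   # 361 x M
pattern = np.mean(np.abs(A @ d) ** 2, axis=1)

peak_deg = np.rad2deg(angles[np.argmax(pattern)])
print(f"distortion beam peaks at {peak_deg:.1f} deg (user at 20.0 deg)")
```

In this single-user memoryless case the per-antenna distortion is proportional to the precoding vector itself, so it inherits the same spatial signature as the intended signal and is beamformed straight at the user. This is the phenomenon the paper's distortion-aware designs exploit, trading array gain against how much distortion is steered toward the users.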
