Paper Title
Asymptotic Expansions of Smooth Rényi Entropies and Their Applications
Paper Authors
Paper Abstract
This study considers the unconditional smooth Rényi entropy, the smooth conditional Rényi entropy proposed by Kuzuoka [\emph{IEEE Trans.\ Inf.\ Theory}, vol.~66, no.~3, pp.~1674--1690, 2020], and a new quantity which we term the conditional smooth Rényi entropy. In particular, we examine asymptotic expansions of these entropies when the underlying source, together with its side information, is stationary and memoryless. Using these smooth Rényi entropies, we establish one-shot coding theorems for several information-theoretic problems: Campbell's source coding, guessing problems, and task encoding problems, all allowing errors. In each problem, we consider two error formalisms: the average and maximum error criteria, where the averaging and maximization are taken with respect to the side information of the source. Applying our asymptotic expansions to these one-shot coding theorems, we obtain various asymptotic fundamental limits for these problems when their error probabilities are allowed to be non-vanishing. We show that, in non-degenerate settings, the first-order fundamental limits differ under the average and maximum error criteria. This contrasts with a different but related setting considered by the present authors, namely variable-length conditional source coding allowing errors, in which the first-order terms are identical but the second-order terms differ under these criteria.
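For orientation, the following is a minimal sketch of the kind of quantity and expansion the abstract refers to, assuming the common Renner--Wolf formulation of the (unconditional) smooth Rényi entropy; the paper's exact definitions, in particular the choice of smoothing ball and the conditional variants, may differ. For a distribution $P_X$ on a finite alphabet $\mathcal{X}$, an order $\alpha \in (0,1) \cup (1,\infty)$, and a smoothing parameter $\epsilon \in [0,1)$,
\[
  H_\alpha^{\epsilon}(X) \;=\; \frac{1}{1-\alpha}\,\log \inf_{Q \in \mathcal{B}^{\epsilon}(P_X)} \sum_{x \in \mathcal{X}} Q(x)^{\alpha},
\]
where $\mathcal{B}^{\epsilon}(P_X)$ denotes the set of (sub-)distributions within statistical distance $\epsilon$ of $P_X$. For a stationary memoryless source $X^n \sim P_X^n$ with fixed $\epsilon$, asymptotic expansions of this type typically take the second-order form
\[
  H_\alpha^{\epsilon}(X^n) \;=\; n H(X) \;\pm\; \sqrt{n\,V(X)}\,\Phi^{-1}(\epsilon) \;+\; O(\log n),
\]
where $H(X)$ is the Shannon entropy, $V(X) = \mathrm{Var}[-\log P_X(X)]$ is the varentropy, $\Phi^{-1}$ is the inverse of the standard Gaussian CDF, and the sign of the second-order term depends on the order $\alpha$ and on the smoothing convention.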