Paper Title

Fundamental Limits of Prediction, Generalization, and Recursion: An Entropic-Innovations Perspective

Paper Authors

Song Fang, Quanyan Zhu

Paper Abstract

In this paper, we examine the fundamental performance limits of prediction, with or without side information. More specifically, we derive generic lower bounds on the $\mathcal{L}_p$ norms of the prediction errors that are valid for any prediction algorithms and for any data distributions. Meanwhile, we combine the entropic analysis from information theory and the innovations approach from prediction/estimation theory to characterize the conditions (in terms of, e.g., directed information or mutual information) to achieve the bounds. We also investigate the implications of the results in analyzing the fundamental limits of generalization in fitting (learning) problems from the perspective of prediction with side information, as well as the fundamental limits of recursive algorithms by viewing them as generalized prediction problems.
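As a rough illustration of how an entropy estimate translates into an $\mathcal{L}_p$ bound (a minimal sketch for a scalar error $e$ with differential entropy $h(e)$, using only the standard fact that the generalized Gaussian maximizes differential entropy under a fixed $p$-th absolute moment constraint; the paper's actual theorems, conditions, and notation are not reproduced here):

$$ h(e) \le \frac{1}{p} + \ln\!\left( 2\,\Gamma\!\left(1 + \frac{1}{p}\right) \left( p\,\mathbb{E}\lvert e \rvert^{p} \right)^{1/p} \right), \qquad \left( \mathbb{E}\lvert e \rvert^{p} \right)^{1/p} \ge \frac{ e^{\,h(e) - 1/p} }{ 2\,\Gamma\!\left(1 + \frac{1}{p}\right) p^{1/p} }. $$

At $p = 2$ the second inequality reduces to the familiar entropy-power-type bound $\mathbb{E}\,e^{2} \ge e^{2 h(e)} / (2\pi e)$. Conditions under which the entropy step holds with equality (e.g., the error having a generalized-Gaussian shape and being independent of the available side information) are the flavor of achievability conditions that the abstract describes in terms of mutual or directed information.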
