Paper Title
Tighter Expected Generalization Error Bounds via Convexity of Information Measures
Paper Authors
Paper Abstract
Generalization error bounds are essential to understanding machine learning algorithms. This paper presents novel expected generalization error upper bounds based on the average joint distribution between the output hypothesis and each input training sample. Multiple generalization error upper bounds based on different information measures are provided, including Wasserstein distance, total variation distance, KL divergence, and Jensen-Shannon divergence. Due to the convexity of the information measures, the proposed bounds in terms of Wasserstein distance and total variation distance are shown to be tighter than their counterparts based on individual samples in the literature. An example is provided to demonstrate the tightness of the proposed generalization error bounds.
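A rough sketch of the convexity step underlying the tightness claim, with notation assumed for illustration rather than taken verbatim from the paper: $W$ denotes the output hypothesis, $Z_i$ the $i$-th of $n$ training samples, $P_{W,Z_i}$ their joint distribution, $P_W \otimes P_Z$ the product of the marginals, and $\mathbb{D}(\cdot,\cdot)$ any information measure convex in its first argument (e.g., total variation or Wasserstein distance). Jensen's inequality applied to the average joint distribution then gives
\[
\mathbb{D}\!\left(\frac{1}{n}\sum_{i=1}^{n} P_{W,Z_i},\; P_W \otimes P_Z\right)
\;\le\; \frac{1}{n}\sum_{i=1}^{n} \mathbb{D}\!\left(P_{W,Z_i},\; P_W \otimes P_Z\right),
\]
so a bound expressed through the average joint distribution (left-hand side) can be no looser than the corresponding average of individual-sample bounds (right-hand side).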