Paper Title

Efficient Data-Dependent Learnability

Paper Authors

Yaniv Fogel, Tal Shapira, Meir Feder

Paper Abstract

The predictive normalized maximum likelihood (pNML) approach has recently been proposed as the min-max optimal solution to the batch learning problem where both the training set and the test data feature are individuals, i.e., known sequences. This approach yields a learnability measure that can also be interpreted as a stability measure. This measure has shown some potential in detecting out-of-distribution examples, yet it has considerable computational costs. In this project, we propose and analyze an approximation of the pNML, which is based on influence functions. Combining theoretical analysis and experiments, we show that when applied to neural networks, this approximation can detect out-of-distribution examples effectively. We also compare its performance to that achieved by conducting a single gradient step for each possible label.
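
The single-gradient-step variant mentioned at the end of the abstract can be illustrated with a short sketch. The following PyTorch snippet is a hypothetical illustration, not the authors' implementation: for each candidate label, a copy of the trained model takes one gradient step on the test example with that label, and the log of the sum of the re-trained probabilities gives the pNML regret used as an out-of-distribution score. The function name, model interface, and learning rate are assumptions made for the example.

```python
# Hypothetical sketch of the single-gradient-step pNML approximation described
# in the abstract (not the authors' code); names and hyperparameters are
# illustrative assumptions.
import copy
import math
import torch
import torch.nn.functional as F

def pnml_regret_single_step(model, x, num_classes, lr=1e-3):
    """Approximate the pNML regret of one test input x (batch of size 1).

    For each candidate label y, a copy of the trained model takes a single
    gradient step on (x, y); the probability it then assigns to y plays the
    role of the "genie" probability. The log of the sum of these probabilities
    is the regret, used as the learnability / out-of-distribution score.
    """
    genie_probs = []
    for y in range(num_classes):
        m = copy.deepcopy(model)
        opt = torch.optim.SGD(m.parameters(), lr=lr)
        loss = F.cross_entropy(m(x), torch.tensor([y]))
        opt.zero_grad()
        loss.backward()
        opt.step()                                   # single gradient step on (x, y)
        with torch.no_grad():
            genie_probs.append(F.softmax(m(x), dim=-1)[0, y].item())
    # Normalization factor of the pNML distribution; its log is the regret.
    return math.log(sum(genie_probs))
```

A larger regret indicates the test example is harder to "learn" given the training set, and is therefore more likely to be out-of-distribution.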
