Paper Title

CRUDE: Calibrating Regression Uncertainty Distributions Empirically

Authors

Eric Zelikman, Christopher Healy, Sharon Zhou, Anand Avati

Abstract

Calibrated uncertainty estimates in machine learning are crucial to many fields such as autonomous vehicles, medicine, and weather and climate forecasting. While there is extensive literature on uncertainty calibration for classification, the classification findings do not always translate to regression. As a result, modern models for predicting uncertainty in regression settings typically produce uncalibrated and overconfident estimates. To address these gaps, we present a calibration method for regression settings that does not assume a particular uncertainty distribution over the error: Calibrating Regression Uncertainty Distributions Empirically (CRUDE). CRUDE makes the weaker assumption that error distributions have a constant arbitrary shape across the output space, shifted by predicted mean and scaled by predicted standard deviation. We detail a theoretical connection between CRUDE and conformal inference. Across an extensive set of regression tasks, CRUDE demonstrates consistently sharper, better calibrated, and more accurate uncertainty estimates than state-of-the-art techniques.
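
The snippet below is a minimal Python sketch of the calibration idea stated in the abstract, not the authors' reference implementation. Under CRUDE's assumption that errors share a constant shape, shifted by the predicted mean and scaled by the predicted standard deviation, one can standardize held-out calibration errors and reuse their empirical quantiles at test time. The function name `crude_intervals` and its arguments (`mu_cal`, `sigma_cal`, `y_cal`, `mu_test`, `sigma_test`, `alpha`) are illustrative placeholders.

```python
import numpy as np

def crude_intervals(mu_cal, sigma_cal, y_cal, mu_test, sigma_test, alpha=0.1):
    """Empirical-quantile prediction intervals in the spirit of CRUDE (a sketch).

    mu_cal, sigma_cal : model's predicted means/stds on a held-out calibration set
    y_cal             : true targets on the calibration set
    mu_test, sigma_test : predicted means/stds on the test points
    alpha             : miscoverage level (0.1 -> 90% intervals)
    """
    # Standardized residuals on the calibration set: the "constant-shape" error distribution.
    z = (y_cal - mu_cal) / sigma_cal
    # Empirical lower/upper quantiles of the standardized errors.
    lo, hi = np.quantile(z, [alpha / 2, 1 - alpha / 2])
    # Shift by each test point's predicted mean and scale by its predicted std.
    lower = mu_test + sigma_test * lo
    upper = mu_test + sigma_test * hi
    return lower, upper
```

Because only empirical quantiles of standardized residuals are used, no parametric form (e.g., Gaussian) is imposed on the error distribution, which is the sense in which the abstract calls the assumption "weaker."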
