Paper Title


Packed-Ensembles for Efficient Uncertainty Estimation

Paper Authors

Olivier Laurent, Adrien Lafage, Enzo Tartaglione, Geoffrey Daniel, Jean-Marc Martinez, Andrei Bursuc, Gianni Franchi

Abstract


Deep Ensembles (DE) are a prominent approach for achieving excellent performance on key metrics such as accuracy, calibration, uncertainty estimation, and out-of-distribution detection. However, hardware limitations of real-world systems constrain them to smaller ensembles and lower-capacity networks, significantly deteriorating their performance and properties. We introduce Packed-Ensembles (PE), a strategy to design and train lightweight structured ensembles by carefully modulating the dimension of their encoding space. We leverage grouped convolutions to parallelize the ensemble into a single shared backbone and forward pass to improve training and inference speeds. PE is designed to operate within the memory limits of a standard neural network. Our extensive research indicates that PE accurately preserves the properties of DE, such as diversity, and performs equally well in terms of accuracy, calibration, out-of-distribution detection, and robustness to distribution shift. We make our code available at https://github.com/ENSTA-U2IS/torch-uncertainty.
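The central trick described in the abstract — fusing M independent ensemble members into one shared forward pass via grouped operations — can be sketched in plain NumPy. The snippet below is an illustrative 1D analogue of a grouped convolution (a grouped linear layer), not the authors' implementation; all names (`grouped_linear`, `packed`, `M`, `d_in`, `d_out`) are hypothetical. It shows that applying each member's weights to its own feature slice is exactly one block-diagonal ("packed") matrix multiply, which is why the ensemble fits in a single backbone:

```python
import numpy as np

rng = np.random.default_rng(0)
M = 4                # number of ensemble members (hypothetical value)
d_in, d_out = 8, 6   # per-member input/output feature sizes

# One independent weight matrix per ensemble member.
weights = [rng.standard_normal((d_in, d_out)) for _ in range(M)]

# A batch of 2 inputs, with the M members' features concatenated.
x = rng.standard_normal((2, M * d_in))

def grouped_linear(x, weights):
    """Apply each member's weights only to its own slice of the features.

    This is the 1D analogue of a grouped convolution with M groups:
    no cross-talk between members, one fused operation overall.
    """
    d_in = weights[0].shape[0]
    outs = [x[:, i * d_in:(i + 1) * d_in] @ w for i, w in enumerate(weights)]
    return np.concatenate(outs, axis=1)

# The same computation as one block-diagonal ("packed") weight matrix.
packed = np.zeros((M * d_in, M * d_out))
for i, w in enumerate(weights):
    packed[i * d_in:(i + 1) * d_in, i * d_out:(i + 1) * d_out] = w

y_grouped = grouped_linear(x, weights)
y_packed = x @ packed
assert np.allclose(y_grouped, y_packed)  # grouped == block-diagonal matmul
```

In the paper's setting the same idea is realized with `groups=M` in standard grouped convolutions, so all M subnetworks train and infer in one forward pass while remaining statistically independent.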
