Paper Title

Certified Monotonic Neural Networks

Paper Authors

Xingchao Liu, Xing Han, Na Zhang, Qiang Liu

Paper Abstract

Learning monotonic models with respect to a subset of the inputs is a desirable feature to effectively address the fairness, interpretability, and generalization issues in practice. Existing methods for learning monotonic neural networks either require specifically designed model structures to ensure monotonicity, which can be too restrictive/complicated, or enforce monotonicity by adjusting the learning process, which cannot provably guarantee the learned model is monotonic on selected features. In this work, we propose to certify the monotonicity of the general piece-wise linear neural networks by solving a mixed integer linear programming problem. This provides a new general approach for learning monotonic neural networks with arbitrary model structures. Our method allows us to train neural networks with heuristic monotonicity regularizations, and we can gradually increase the regularization magnitude until the learned network is certified monotonic. Compared to prior works, our approach does not require human-designed constraints on the weight space and also yields more accurate approximation. Empirical studies on various datasets demonstrate the efficiency of our approach over the state-of-the-art methods, such as Deep Lattice Networks.
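
The training recipe described in the abstract, namely adding a heuristic monotonicity regularizer and growing its weight until the network certifies, can be illustrated with a short sketch. Below is a minimal PyTorch sketch (not the authors' released code) of one such regularizer: it penalizes negative partial derivatives of the output with respect to the chosen monotonic features at randomly sampled inputs. The names `model` and `monotonic_idx` and the `[0, 1]^d` sampling domain are illustrative assumptions.

```python
# Minimal sketch (PyTorch assumed; not the authors' implementation) of a
# heuristic monotonicity regularizer: penalize negative partial derivatives
# of the network output w.r.t. the selected monotonic features, evaluated
# at inputs sampled uniformly from an assumed normalized domain [0, 1]^d.
import torch

def monotonicity_penalty(model, monotonic_idx, n_samples=1024, dim=10):
    x = torch.rand(n_samples, dim, requires_grad=True)  # uniform samples in [0, 1]^dim
    y = model(x).sum()                                  # summing yields per-sample input gradients
    (grad,) = torch.autograd.grad(y, x, create_graph=True)
    neg_slope = torch.relu(-grad[:, monotonic_idx])     # only negative slopes are penalized
    return (neg_slope ** 2).mean()

# Training-loop sketch: increase `lam` and re-train until the learned
# network passes the MILP certification check described in the abstract.
# loss = task_loss + lam * monotonicity_penalty(model, monotonic_idx, dim=d)
```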
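
The certification step itself reduces to a mixed integer linear program. The following is a hedged sketch, using the PuLP library, of how such a check could look for a one-hidden-layer ReLU network on the unit box: binary variables encode the activation pattern via big-M constraints, and the partial derivative along the chosen feature is minimized over feasible patterns; a nonnegative minimum certifies monotonicity in that feature. The function name `certify_monotone_1hidden`, the big-M constant, and the input domain are illustrative assumptions, not the paper's exact formulation.

```python
# Hedged sketch of the MILP certification check for a one-hidden-layer
# ReLU network f(x) = w2 . relu(W1 @ x + b1) on the unit box [0, 1]^d.
# On each linear region, the partial derivative w.r.t. feature `feat` is
# sum_j w2[j] * W1[j][feat] * a_j, where a_j = 1 iff unit j is active;
# we minimize it over feasible activation patterns via big-M constraints.
import pulp

def certify_monotone_1hidden(W1, b1, w2, feat, big_m=100.0):
    h, d = len(W1), len(W1[0])
    prob = pulp.LpProblem("certify_monotonicity", pulp.LpMinimize)
    x = [pulp.LpVariable(f"x{i}", lowBound=0, upBound=1) for i in range(d)]
    a = [pulp.LpVariable(f"a{j}", cat="Binary") for j in range(h)]
    for j in range(h):
        pre = pulp.lpSum(W1[j][i] * x[i] for i in range(d)) + b1[j]
        prob += pre <= big_m * a[j]          # a_j = 0 forces pre-activation <= 0
        prob += pre >= -big_m * (1 - a[j])   # a_j = 1 forces pre-activation >= 0
    # Objective: minimum partial derivative over all feasible activation patterns.
    prob += pulp.lpSum(w2[j] * W1[j][feat] * a[j] for j in range(h))
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return pulp.value(prob.objective) >= 0   # nonnegative minimum => certified monotone

```

In the setting the abstract describes, a check of this kind would be run per monotonic feature, with training repeated under a larger regularization weight whenever certification fails.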
