Paper Title

Explanations of Black-Box Model Predictions by Contextual Importance and Utility

Authors

Sule Anjomshoae, Kary Främling, Amro Najjar

Abstract

The significant advances in autonomous systems, together with an immensely wider application domain, have increased the need for trustworthy intelligent systems. Explainable artificial intelligence is gaining considerable attention among researchers and developers to address this requirement. Although there is an increasing number of works on interpretable and transparent machine learning algorithms, they are mostly intended for technical users. Explanations for the end user have been neglected in many usable and practical applications. In this work, we present the Contextual Importance (CI) and Contextual Utility (CU) concepts to extract explanations that are easily understandable by experts as well as novice users. This method explains the prediction results without transforming the model into an interpretable one. We present an example of providing explanations for linear and non-linear models to demonstrate the generalizability of the method. CI and CU are numerical values that can be presented to the user in visual and natural language form to justify actions and explain reasoning for individual instances, situations, and contexts. We show the utility of explanations in a car selection example and in Iris flower classification by presenting complete explanations (i.e. the causes of an individual prediction) and contrastive explanations (i.e. contrasting an instance against the instance of interest). The experimental results show the feasibility and validity of the provided explanation methods.
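To make the idea more concrete, below is a minimal Python sketch of how CI and CU can be computed for one feature of a black-box classifier by varying that feature over its observed range while keeping the other features fixed. The formulas and all names here (ci_cu, the MLP model, the use of scikit-learn and the Iris data) are illustrative assumptions based on the CI/CU definitions in the authors' earlier work, not code taken from this paper.

    # Minimal sketch of Contextual Importance (CI) and Contextual Utility (CU)
    # for one feature of one instance. Assumption: the model outputs a class
    # probability in [0, 1], so the absolute output range is taken as [0, 1].
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.neural_network import MLPClassifier

    X, y = load_iris(return_X_y=True)
    model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                          random_state=0).fit(X, y)

    def ci_cu(instance, feature, target_class, n_samples=100):
        """Vary one feature over its observed range, keep the rest fixed,
        and measure how much the predicted probability of target_class can
        change (CI) and how favourable the current value is (CU)."""
        lo, hi = X[:, feature].min(), X[:, feature].max()
        grid = np.linspace(lo, hi, n_samples)
        variants = np.tile(instance, (n_samples, 1))
        variants[:, feature] = grid
        probs = model.predict_proba(variants)[:, target_class]
        cmin, cmax = probs.min(), probs.max()
        current = model.predict_proba(instance.reshape(1, -1))[0, target_class]
        ci = (cmax - cmin) / (1.0 - 0.0)   # output range assumed to be [0, 1]
        cu = (current - cmin) / (cmax - cmin) if cmax > cmin else 0.0
        return ci, cu

    x0 = X[100]                            # an Iris-virginica instance
    for f, name in enumerate(load_iris().feature_names):
        ci, cu = ci_cu(x0, f, target_class=2)
        print(f"{name}: CI={ci:.2f}, CU={cu:.2f}")

In this sketch, CI reflects how much the output can change when the feature varies in the current context, while CU reflects how favourable the instance's actual feature value is within that range; both lie in [0, 1] and can then be mapped to the visual and natural language explanations described in the abstract.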
