Paper Title

Do We Need Explainable AI in Companies? Investigation of Challenges, Expectations, and Chances from Employees' Perspective

Paper Authors

Katharina Weitz, Chi Tai Dang, Elisabeth André

Paper Abstract

Companies' adoption of artificial intelligence (AI) is increasingly becoming an essential element of business success. However, using AI poses new requirements for companies and their employees, including transparency and comprehensibility of AI systems. The field of Explainable AI (XAI) aims to address these issues. Yet, the current research primarily consists of laboratory studies, and there is a need to improve the applicability of the findings to real-world situations. Therefore, this project report paper provides insights into employees' needs and attitudes towards (X)AI. For this, we investigate employees' perspectives on (X)AI. Our findings suggest that AI and XAI are well-known terms perceived as important for employees. This recognition is a critical first step for XAI to potentially drive successful usage of AI by providing comprehensible insights into AI technologies. In a lessons-learned section, we discuss the open questions identified and suggest future research directions to develop human-centered XAI designs for companies. By providing insights into employees' needs and attitudes towards (X)AI, our project report contributes to the development of XAI solutions that meet the requirements of companies and their employees, ultimately driving the successful adoption of AI technologies in the business context.
