Paper Title


Evaluating Data-Driven Co-Speech Gestures of Embodied Conversational Agents through Real-Time Interaction

Authors

Yuan He, André Pereira, Taras Kucherenko

Abstract


Embodied Conversational Agents (ECAs) that make use of co-speech gestures can enhance human-machine interactions in many ways. In recent years, data-driven gesture generation approaches for ECAs have attracted considerable research attention, and related methods have continuously improved. Real-time interaction is typically used when researchers evaluate ECA systems that generate rule-based gestures. However, when evaluating the performance of ECAs based on data-driven methods, participants are often required only to watch pre-recorded videos, which cannot provide adequate information about what a person perceives during the interaction. To address this limitation, we explored the use of real-time interaction to assess data-driven gesturing ECAs. We provided a testbed framework and investigated whether gestures could affect human perception of ECAs in the dimensions of human-likeness, animacy, perceived intelligence, and focused attention. Our user study required participants to interact with two ECAs - one with and one without hand gestures. We collected subjective data from the participants' self-report questionnaires and objective data from a gaze tracker. To our knowledge, the current study represents the first attempt to evaluate data-driven gesturing ECAs through real-time interaction and the first experiment using gaze tracking to examine the effect of ECAs' gestures.
