Paper Title
TEAM: a parameter-free algorithm to teach collaborative robots motions from user demonstrations
Paper Authors
Paper Abstract
Learning from demonstrations (LfD) enables humans to easily teach collaborative robots (cobots) new motions that can be generalized to new task configurations without retraining. However, state-of-the-art LfD methods require manually tuning intrinsic parameters and have rarely been used in industrial contexts without experts. We propose a parameter-free LfD method based on probabilistic movement primitives, where parameters are determined using Jensen-Shannon divergence and Bayesian optimization, and users do not have to perform manual parameter tuning. The cobot's precision in reproducing learned motions, and its ease of teaching and use by non-expert users are evaluated in two field tests. In the first field test, the cobot works on elevator door maintenance. In the second test, three factory workers teach the cobot tasks useful for their daily workflow. Errors between the cobot and target joint angles are insignificant -- at worst 0.28 deg -- and the motion is accurately reproduced -- GMCC score of 1. Questionnaires completed by the workers highlighted the method's ease of use and the accuracy of the reproduced motion. Public implementation of our method and datasets are made available online.
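To make the abstract's pipeline concrete, below is a minimal sketch (not the authors' released implementation) of the idea it describes: fitting movement-primitive weights to demonstrations and selecting hyper-parameters by minimizing a Jensen-Shannon objective with Bayesian optimization. The 1-D toy demonstrations, the helper names (`rbf_features`, `reproduce`, `js_objective`), the choice of hyper-parameters (number of basis functions and ridge strength), and the `scikit-optimize` dependency are all assumptions for illustration only.

```python
"""Sketch: parameter selection for a movement primitive via JS divergence
and Bayesian optimization. Illustrative only; not the TEAM reference code."""
import numpy as np
from scipy.spatial.distance import jensenshannon  # JS distance (sqrt of JS divergence)
from skopt import gp_minimize                     # Bayesian optimization (assumed dependency)

def rbf_features(t, n_basis, width=0.05):
    """Radial-basis features over normalized time t in [0, 1]."""
    centers = np.linspace(0.0, 1.0, n_basis)
    return np.exp(-0.5 * (t[:, None] - centers[None, :]) ** 2 / width)

def reproduce(demos, n_basis, ridge):
    """Fit one weight vector per demo by ridge regression, return the mean reproduction."""
    t = np.linspace(0.0, 1.0, demos.shape[1])
    phi = rbf_features(t, n_basis)
    w = np.stack([np.linalg.solve(phi.T @ phi + ridge * np.eye(n_basis), phi.T @ d)
                  for d in demos])
    return phi @ w.mean(axis=0)

def js_objective(params, demos):
    """JS divergence between histograms of demonstrated and reproduced positions."""
    n_basis, log_ridge = int(params[0]), params[1]
    repro = reproduce(demos, n_basis, 10.0 ** log_ridge)
    bins = np.histogram_bin_edges(demos.ravel(), bins=30)
    p, _ = np.histogram(demos.ravel(), bins=bins, density=True)
    q, _ = np.histogram(repro, bins=bins, density=True)
    return float(jensenshannon(p + 1e-12, q + 1e-12) ** 2)

# Toy demonstrations: five noisy 1-D joint trajectories (stand-ins for kinesthetic teaching).
t = np.linspace(0.0, 1.0, 200)
demos = np.stack([np.sin(np.pi * t) + 0.02 * np.random.randn(t.size) for _ in range(5)])

# Bayesian optimization over (n_basis, log10 ridge), so the user never tunes these by hand.
result = gp_minimize(lambda p: js_objective(p, demos),
                     dimensions=[(5, 50), (-6.0, 0.0)],
                     n_calls=25, random_state=0)
print("selected n_basis:", result.x[0], "ridge:", 10.0 ** result.x[1])
```

The design point mirrors the abstract's claim: the only inputs are the demonstrations themselves, and the intrinsic parameters are picked automatically by optimizing a divergence between what was demonstrated and what the fitted primitive reproduces.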