Paper Title

Generalized Zero-Shot Learning Via Over-Complete Distribution

Authors

Rohit Keshari, Richa Singh, Mayank Vatsa

Abstract

A well-trained and generalized deep neural network (DNN) should be robust to both seen and unseen classes. However, the performance of most existing supervised DNN algorithms degrades for classes that are unseen in the training set. To learn a discriminative classifier that yields good performance in Zero-Shot Learning (ZSL) settings, we propose to generate an Over-Complete Distribution (OCD) using a Conditional Variational Autoencoder (CVAE) for both seen and unseen classes. In order to enforce separability between classes and reduce the class scatter, we propose the use of Online Batch Triplet Loss (OBTL) and Center Loss (CL) on the generated OCD. The effectiveness of the framework is evaluated using both Zero-Shot Learning and Generalized Zero-Shot Learning protocols on three publicly available benchmark databases, SUN, CUB, and AWA2. The results show that generating over-complete distributions and forcing the classifier to learn a transform function from overlapping to non-overlapping distributions can improve the performance on both seen and unseen classes.
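The abstract names two losses applied to the generated over-complete distribution: an Online Batch Triplet Loss for inter-class separability and a Center Loss to reduce intra-class scatter. As a rough illustration of how these two objectives behave, the following is a minimal NumPy sketch (not the authors' implementation; function names, the hardest-positive/hardest-negative mining strategy, and the margin value are illustrative assumptions):

```python
import numpy as np

def center_loss(feats, labels, centers):
    """Mean squared distance of each feature to its class center
    (pulls samples of a class together, reducing class scatter)."""
    return float(np.mean(np.sum((feats - centers[labels]) ** 2, axis=1)))

def batch_triplet_loss(feats, labels, margin=1.0):
    """Online triplet loss over a batch: for each anchor, take the
    hardest positive (farthest same-class sample) and the hardest
    negative (closest other-class sample), hinged at `margin`
    (pushes different classes apart)."""
    dists = np.linalg.norm(feats[:, None] - feats[None, :], axis=2)
    n = len(labels)
    losses = []
    for a in range(n):
        pos = (labels == labels[a]) & (np.arange(n) != a)
        neg = labels != labels[a]
        if not pos.any() or not neg.any():
            continue  # anchor has no valid triplet in this batch
        losses.append(max(0.0, dists[a][pos].max() - dists[a][neg].min() + margin))
    return float(np.mean(losses)) if losses else 0.0

# Two tight, well-separated clusters: triplet loss is zero and
# center loss is small, the regime both objectives encourage.
feats = np.array([[0.0, 0.0], [0.1, 0.0], [10.0, 0.0], [10.1, 0.0]])
labels = np.array([0, 0, 1, 1])
centers = np.array([[0.05, 0.0], [10.05, 0.0]])  # per-class means
print(batch_triplet_loss(feats, labels))  # 0.0 (classes already separated)
print(center_loss(feats, labels, centers))
```

In the paper's framework these losses act on CVAE-generated features of both seen and unseen classes rather than raw inputs; the sketch only shows the geometric effect each term rewards.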
