Paper Title

Prior-Guided One-shot Neural Architecture Search

Authors

Peijie Dong, Xin Niu, Lujun Li, Linzhen Xie, Wenbin Zou, Tian Ye, Zimian Wei, Hengyue Pan

Abstract

Neural architecture search methods seek optimal candidates with efficient weight-sharing supernet training. However, recent studies indicate poor ranking consistency between the performance of stand-alone architectures and that of shared-weight networks. In this paper, we present Prior-Guided One-shot NAS (PGONAS) to strengthen the ranking correlation of supernets. Specifically, we first explore the effect of activation functions and propose a balanced sampling strategy based on the Sandwich Rule to alleviate weight coupling in the supernet. Then, FLOPs and Zen-Score are adopted to guide the training of the supernet with a ranking correlation loss. Our PGONAS ranks 3rd in the supernet track of the CVPR 2022 Second Lightweight NAS Challenge. Code is available at https://github.com/pprp/CVPR2022-NAS?competition-Track1-3th-solution.
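The abstract's "ranking correlation loss" guided by cheap priors (FLOPs, Zen-Score) can be illustrated with a pairwise hinge loss: for every pair of sampled subnets, the supernet's predicted scores are penalized when their ordering disagrees with the prior's ordering. This is only a minimal sketch of that general idea, not the paper's actual implementation; the function name, margin value, and the use of plain Python floats instead of framework tensors are all illustrative assumptions.

```python
def pairwise_ranking_loss(pred, prior, margin=0.1):
    """Hedged sketch of a pairwise ranking-correlation loss.

    pred:  scores produced via the supernet for a batch of subnets
           (e.g. validation accuracy proxies).
    prior: cheap zero-cost scores for the same subnets
           (e.g. FLOPs or Zen-Score), used only for their ordering.

    For each pair (i, j), if the prior ranks i above j, we add a hinge
    penalty unless pred[i] exceeds pred[j] by at least `margin`.
    """
    total, count = 0.0, 0
    n = len(pred)
    for i in range(n):
        for j in range(i + 1, n):
            if prior[i] == prior[j]:
                continue  # the prior gives no ordering signal for this pair
            sign = 1.0 if prior[i] > prior[j] else -1.0
            # hinge: zero when the predicted ordering agrees with the prior
            # by at least `margin`, positive otherwise
            total += max(0.0, margin - sign * (pred[i] - pred[j]))
            count += 1
    return total / max(count, 1)
```

When predictions and prior agree on every pair with a comfortable gap, the loss is zero; inversions contribute a positive penalty, so minimizing it pushes the supernet's ranking toward the prior's.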
