Paper Title
Speeding up NAS with Adaptive Subset Selection
Paper Authors
Paper Abstract
A majority of recent developments in neural architecture search (NAS) have been aimed at decreasing the computational cost of various techniques without affecting their final performance. Towards this goal, several low-fidelity and performance prediction methods have been considered, including those that train only on subsets of the training data. In this work, we present an adaptive subset selection approach to NAS and position it as complementary to state-of-the-art NAS approaches. We uncover a natural connection between one-shot NAS algorithms and adaptive subset selection and devise an algorithm that makes use of state-of-the-art techniques from both areas. We use these techniques to substantially reduce the runtime of DARTS-PT (a leading one-shot NAS algorithm), as well as BOHB and DEHB (leading multi-fidelity optimization algorithms), without sacrificing accuracy. Our results are consistent across multiple datasets, and towards full reproducibility, we release our code at https://anonymous.4open.science/r/SubsetSelection_NAS-B132.
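To make the idea of adaptive subset selection concrete, the following is a minimal, generic sketch (not the authors' implementation, which builds on DARTS-PT): every few epochs the training subset is re-chosen using per-example loss under the current model, and only that subset is used for weight updates. The model, synthetic data, selection rule, and hyperparameters below are placeholder assumptions; published adaptive selectors (e.g., GLISTER- or facility-location-based methods) use more sophisticated scoring.

```python
# Illustrative sketch of adaptive subset selection; all names and values are hypothetical.
import torch
from torch import nn

torch.manual_seed(0)
X = torch.randn(1000, 16)            # synthetic training inputs
y = torch.randint(0, 4, (1000,))     # synthetic labels
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss(reduction="none")

def select_subset(fraction=0.1):
    # Adaptive step: score every example with the current model and keep
    # the highest-loss fraction (a simple stand-in for stronger selectors).
    with torch.no_grad():
        scores = loss_fn(model(X), y)
    k = int(fraction * len(X))
    return scores.topk(k).indices

subset = select_subset()
for epoch in range(30):
    if epoch % 5 == 0:                # periodically refresh the subset
        subset = select_subset()
    opt.zero_grad()
    loss = loss_fn(model(X[subset]), y[subset]).mean()
    loss.backward()
    opt.step()
```

In a one-shot NAS setting, the same pattern would wrap the supernet: weight updates run on the selected subset while architecture parameters are updated as usual, which is the source of the runtime savings the abstract describes.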