Paper Title

GOLD-NAS: Gradual, One-Level, Differentiable

Paper Authors

Kaifeng Bi, Lingxi Xie, Xin Chen, Longhui Wei, Qi Tian

Paper Abstract

There is a large body of literature on neural architecture search (NAS), but most existing work makes use of heuristic rules that largely constrain search flexibility. In this paper, we first relax these manually designed constraints and enlarge the search space to contain more than $10^{160}$ candidates. In the new space, most existing differentiable search methods can fail dramatically. We then propose a novel algorithm named Gradual One-Level Differentiable Neural Architecture Search (GOLD-NAS), which introduces a variable resource constraint into one-level optimization so that weak operators are gradually pruned out of the super-network. On standard image classification benchmarks, GOLD-NAS finds a series of Pareto-optimal architectures within a single search procedure. Most of the discovered architectures were never studied before, yet they achieve a good tradeoff between recognition accuracy and model complexity. We believe the new space and search algorithm can advance research on differentiable NAS.
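
To make the algorithmic idea in the abstract concrete, below is a minimal PyTorch sketch of the search loop it describes: network weights and architecture parameters are optimized jointly at one level on the training loss, a resource penalty with a gradually growing coefficient pushes weak operators toward zero, and operators whose weight collapses are pruned from the super-network. All names here (`TinySuperNet`, `MixedEdge`, `CANDIDATE_OPS`, `FLOPS_COST`, the thresholds and schedules) are hypothetical illustrations under assumed settings, not the authors' implementation.

```python
# Illustrative sketch only: one-level differentiable search with a gradually
# tightened resource penalty and pruning of weak operators, following the
# high-level description in the abstract. Not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical candidate operator set and rough per-operator resource costs
# (stand-ins for measured FLOPs).
CANDIDATE_OPS = {
    "conv3x3": lambda c: nn.Conv2d(c, c, 3, padding=1, bias=False),
    "conv1x1": lambda c: nn.Conv2d(c, c, 1, bias=False),
    "skip":    lambda c: nn.Identity(),
}
FLOPS_COST = {"conv3x3": 9.0, "conv1x1": 1.0, "skip": 0.0}

class MixedEdge(nn.Module):
    """A super-network edge: weighted sum of all surviving candidate ops."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleDict({k: f(channels) for k, f in CANDIDATE_OPS.items()})
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))  # arch parameters
        self.alive = {k: True for k in self.ops}                # pruning mask

    def forward(self, x):
        w = torch.sigmoid(self.alpha)  # per-operator weight in (0, 1)
        return sum(w[i] * op(x)
                   for i, (k, op) in enumerate(self.ops.items()) if self.alive[k])

    def resource(self):
        # Differentiable proxy for this edge's resource consumption.
        w = torch.sigmoid(self.alpha)
        return sum(w[i] * FLOPS_COST[k]
                   for i, k in enumerate(self.ops) if self.alive[k])

    def prune(self, threshold):
        # Permanently drop operators whose weight collapsed (keep at least one).
        w = torch.sigmoid(self.alpha.detach())
        for i, k in enumerate(self.ops):
            if self.alive[k] and w[i] < threshold and sum(self.alive.values()) > 1:
                self.alive[k] = False

class TinySuperNet(nn.Module):
    def __init__(self, channels=16, num_edges=4, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1, bias=False)
        self.edges = nn.ModuleList(MixedEdge(channels) for _ in range(num_edges))
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x):
        x = self.stem(x)
        for edge in self.edges:
            x = F.relu(edge(x))
        return self.head(x.mean(dim=(2, 3)))

def search(model, loader, epochs=10, lam=1e-4, growth=1.5, threshold=0.05):
    # One-level optimization: weights and alphas share one loss and optimizer,
    # rather than alternating on separate training/validation splits.
    opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    for _ in range(epochs):
        for x, y in loader:
            loss = F.cross_entropy(model(x), y)
            loss = loss + lam * sum(e.resource() for e in model.edges)
            opt.zero_grad()
            loss.backward()
            opt.step()
        lam *= growth                # gradually tighten the resource constraint
        for e in model.edges:
            e.prune(threshold)       # remove weak operators from the super-network
```

Because each pruning round leaves a smaller surviving architecture, snapshotting the `alive` sets over the run traces out a family of accuracy/complexity operating points, which is consistent with the abstract's claim that a single search procedure yields a series of Pareto-optimal architectures.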
