Paper Title

Neural Architecture Search as Sparse Supernet

Authors

Wu, Yan; Liu, Aoming; Huang, Zhiwu; Zhang, Siwei; Van Gool, Luc

Abstract

This paper aims to broaden the problem of Neural Architecture Search (NAS) from Single-Path and Multi-Path Search to automated Mixed-Path Search. In particular, we model the NAS problem as a sparse supernet, using a new continuous architecture representation with a mixture of sparsity constraints. The sparse supernet enables us to automatically achieve sparsely mixed paths over a compact set of nodes. To optimize the proposed sparse supernet, we exploit a hierarchical accelerated proximal gradient algorithm within a bi-level optimization framework. Extensive experiments on Convolutional Neural Network and Recurrent Neural Network search demonstrate that the proposed method can find compact, general, and powerful neural architectures.
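
The abstract names the optimizer but not its exact form. As a rough, self-contained illustration of the accelerated proximal gradient idea it refers to, the sketch below runs a FISTA-type update with an L1 proximal operator (soft-thresholding) on a toy vector of architecture weights. The quadratic loss, the `fista` and `soft_threshold` helpers, and all hyperparameters are illustrative assumptions, not the paper's hierarchical algorithm or objective.

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of lam * ||x||_1: shrinks entries toward zero and
    # sets small ones exactly to zero, pruning the corresponding paths.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def fista(grad, alpha0, lr, lam, steps=200):
    # FISTA-style accelerated proximal gradient descent on the continuous
    # architecture weights `alpha`; `grad` is the gradient of the smooth
    # part of the objective. (Illustrative helper, not the paper's code.)
    alpha, alpha_prev, t = alpha0.copy(), alpha0.copy(), 1.0
    for _ in range(steps):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = alpha + ((t - 1.0) / t_next) * (alpha - alpha_prev)
        alpha_prev = alpha
        alpha = soft_threshold(y - lr * grad(y), lr * lam)
        t = t_next
    return alpha

# Toy smooth objective 0.5 * ||A @ alpha - b||^2 standing in for the
# upper-level (validation) loss of the bi-level problem; purely illustrative.
rng = np.random.default_rng(0)
A = rng.normal(size=(30, 10))
b = rng.normal(size=30)
grad = lambda a: A.T @ (A @ a - b)

lr = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L for L = ||A||_2^2
alpha = fista(grad, alpha0=np.zeros(10), lr=lr, lam=2.0)
print("surviving (nonzero) architecture weights:", np.flatnonzero(alpha))
```

The point this illustrates is that the proximal step produces exact zeros, so inactive paths are pruned rather than merely down-weighted, which is what allows a supernet to collapse to a compact, sparsely mixed-path architecture.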
