Paper Title


Hierarchical Neural Architecture Search for Single Image Super-Resolution

Paper Authors

Yong Guo, Yongsheng Luo, Zhenhao He, Jin Huang, Jian Chen

Abstract


Deep neural networks have exhibited promising performance in image super-resolution (SR). Most SR models follow a hierarchical architecture that contains both the cell-level design of computational blocks and the network-level design of the positions of upsampling blocks. However, designing SR models heavily relies on human expertise and is very labor-intensive. More critically, these SR models often contain a huge number of parameters and may not meet the requirements of computation resources in real-world applications. To address the above issues, we propose a Hierarchical Neural Architecture Search (HNAS) method to automatically design promising architectures with different requirements of computation cost. To this end, we design a hierarchical SR search space and propose a hierarchical controller for architecture search. Such a hierarchical controller is able to simultaneously find promising cell-level blocks and network-level positions of upsampling layers. Moreover, to design compact architectures with promising performance, we build a joint reward by considering both the performance and computation cost to guide the search process. Extensive experiments on five benchmark datasets demonstrate the superiority of our method over existing methods.
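The abstract describes a joint reward that balances model performance against computation cost during architecture search. A minimal sketch of that idea is below; the specific functional form, the budget-based penalty, and the names `psnr`, `flops`, `target_flops`, and `beta` are illustrative assumptions, not the paper's actual formula.

```python
def joint_reward(psnr: float, flops: float, target_flops: float, beta: float = 0.5) -> float:
    """Hypothetical joint reward: grows with PSNR, shrinks when compute
    cost exceeds a budget. This is a sketch of the trade-off described
    in the abstract, not the authors' exact reward."""
    # Architectures under the FLOPs budget pay no penalty;
    # those over budget are penalized proportionally to the overshoot.
    cost_penalty = max(flops / target_flops - 1.0, 0.0)
    return psnr - beta * cost_penalty

# With such a reward, a compact model with slightly lower PSNR can
# out-score a much larger one that blows the compute budget.
small = joint_reward(psnr=31.8, flops=80e9, target_flops=100e9)   # 31.8 (no penalty)
large = joint_reward(psnr=32.0, flops=200e9, target_flops=100e9)  # 32.0 - 0.5 = 31.5
```

This kind of reward lets a single search procedure produce different architectures simply by tightening or loosening the cost budget, which matches the abstract's goal of designing architectures "with different requirements of computation cost".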
