Paper Title

Dynamic ReLU

Paper Authors

Yinpeng Chen, Xiyang Dai, Mengchen Liu, Dongdong Chen, Lu Yuan, Zicheng Liu

Paper Abstract

Rectified linear units (ReLU) are commonly used in deep neural networks. So far ReLU and its generalizations (non-parametric or parametric) are static, performing identically for all input samples. In this paper, we propose dynamic ReLU (DY-ReLU), a dynamic rectifier of which parameters are generated by a hyper function over all input elements. The key insight is that DY-ReLU encodes the global context into the hyper function, and adapts the piecewise linear activation function accordingly. Compared to its static counterpart, DY-ReLU has negligible extra computational cost, but significantly more representation capability, especially for light-weight neural networks. By simply using DY-ReLU for MobileNetV2, the top-1 accuracy on ImageNet classification is boosted from 72.0% to 76.2% with only 5% additional FLOPs.
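
To make the mechanism concrete, below is a minimal PyTorch sketch of the idea described in the abstract: a hyper function pools the input into a global context vector and emits per-channel slopes and intercepts for K linear pieces, and the activation takes their maximum. This is an illustrative sketch, not the authors' released code; the class name DyReLU and the reduction/k arguments are assumptions, while the coefficient ranges and initialization follow the defaults stated in the paper (K = 2, starting at plain ReLU).

```python
import torch
import torch.nn as nn


class DyReLU(nn.Module):
    """Sketch of a channel-wise dynamic ReLU (in the spirit of DY-ReLU-B).

    A hyper function maps the globally pooled input to per-channel slopes
    and intercepts of K linear pieces; the activation is their maximum.
    """

    def __init__(self, channels: int, reduction: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        # Hyper function: global context -> 2*K coefficients per channel.
        self.hyper = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, 2 * k * channels),
            nn.Sigmoid(),  # squash residuals to [0, 1] before rescaling
        )
        # Ranges and initial values chosen so training starts as plain ReLU:
        # a1 = 1, a2 = 0, b1 = b2 = 0 (the paper's default initialization).
        self.register_buffer("lambdas", torch.tensor([1.0] * k + [0.5] * k))
        self.register_buffer("init", torch.tensor([1.0] + [0.0] * (2 * k - 1)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, _, _ = x.shape
        theta = self.hyper(x).view(n, c, 2 * self.k)  # (N, C, 2K)
        theta = 2.0 * theta - 1.0                     # map sigmoid output to [-1, 1]
        coeffs = theta * self.lambdas + self.init     # [a_1..a_K, b_1..b_K]
        a = coeffs[..., : self.k, None, None]         # slopes,     (N, C, K, 1, 1)
        b = coeffs[..., self.k :, None, None]         # intercepts, (N, C, K, 1, 1)
        # Piecewise linear activation: elementwise max over the K pieces.
        return torch.max(a * x.unsqueeze(2) + b, dim=2).values


# Usage: drop-in replacement for ReLU on a feature map.
act = DyReLU(channels=64)
y = act(torch.randn(4, 64, 32, 32))  # output shape: (4, 64, 32, 32)
```

With the initialization above, the module behaves like max(x, 0) at the start of training and learns input-dependent deviations from there. The hyper function adds only a pooled vector and two small linear layers per activation, which is consistent with the abstract's claim of negligible extra computational cost.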
