Paper Title

Evolutionary Optimization of Deep Learning Activation Functions

Paper Authors

Garrett Bingham, William Macke, Risto Miikkulainen

Paper Abstract

The choice of activation function can have a large effect on the performance of a neural network. While there have been some attempts to hand-engineer novel activation functions, the Rectified Linear Unit (ReLU) remains the most commonly-used in practice. This paper shows that evolutionary algorithms can discover novel activation functions that outperform ReLU. A tree-based search space of candidate activation functions is defined and explored with mutation, crossover, and exhaustive search. Experiments on training wide residual networks on the CIFAR-10 and CIFAR-100 image datasets show that this approach is effective. Replacing ReLU with evolved activation functions results in statistically significant increases in network accuracy. Optimal performance is achieved when evolution is allowed to customize activation functions to a particular task; however, these novel activation functions are shown to generalize, achieving high performance across tasks. Evolutionary optimization of activation functions is therefore a promising new dimension of metalearning in neural networks.
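To make the abstract's tree-based search space concrete, below is a minimal Python sketch of activation functions encoded as expression trees, with random sampling, subtree mutation, and one-point crossover. The operator sets, depth limit, and sampling probabilities here are illustrative assumptions, not the search space or variation operators actually used in the paper.

```python
import math
import random

# Hypothetical operator sets -- illustrative only, not the paper's exact node set.
UNARY = {
    "relu": lambda x: max(0.0, x),
    "tanh": math.tanh,
    "neg": lambda x: -x,
}
BINARY = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
    "max": max,
}
OPS = {**UNARY, **BINARY}


def random_tree(depth=2):
    """Sample a random expression tree over the scalar input x."""
    if depth == 0 or random.random() < 0.3:
        return "x"  # leaf: the pre-activation input
    if random.random() < 0.5:
        return (random.choice(list(UNARY)), random_tree(depth - 1))
    return (random.choice(list(BINARY)),
            random_tree(depth - 1), random_tree(depth - 1))


def evaluate(tree, x):
    """Apply the activation function encoded by `tree` to a scalar x."""
    if tree == "x":
        return x
    op, *children = tree
    return OPS[op](*(evaluate(c, x) for c in children))


def subtrees(tree):
    """Yield every subtree; used as donor material for crossover."""
    yield tree
    if tree != "x":
        for child in tree[1:]:
            yield from subtrees(child)


def mutate(tree, depth=2):
    """Replace a randomly chosen subtree with a freshly sampled one."""
    if tree == "x" or random.random() < 0.3:
        return random_tree(depth)
    op, *children = tree
    i = random.randrange(len(children))
    children[i] = mutate(children[i], depth)
    return (op, *children)


def crossover(recipient, donor):
    """Graft a random subtree of `donor` into `recipient` (one-point)."""
    if recipient == "x":
        return random.choice(list(subtrees(donor)))
    op, *children = recipient
    i = random.randrange(len(children))
    children[i] = random.choice(list(subtrees(donor)))
    return (op, *children)


if __name__ == "__main__":
    f = random_tree()
    g = random_tree()
    print("candidate:", f, "-> f(1.5) =", evaluate(f, 1.5))
    print("mutated:  ", mutate(f))
    print("crossed:  ", crossover(f, g))
```

The sketch covers only the representation and the variation operators. In the paper's setup, each candidate would be scored by substituting it for ReLU in a wide residual network and measuring accuracy on CIFAR-10 or CIFAR-100, which is what drives selection in the evolutionary loop.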
