Title

Deep Maxout Network Gaussian Process

Authors

Libin Liang, Ye Tian, Ge Cheng

Abstract

The study of neural networks with infinite width is important for a better understanding of neural networks in practical applications. In this work, we derive the equivalence between the deep, infinite-width maxout network and the Gaussian process (GP) and characterize the maxout kernel with a compositional structure. Moreover, we establish the connection between our deep maxout network kernel and deep neural network kernels. We also give an efficient numerical implementation of our kernel that can be adapted to any maxout rank. Numerical results show that Bayesian inference based on the deep maxout network kernel can lead to competitive results compared with its finite-width counterparts and deep neural network kernels. This suggests that the maxout activation may also be incorporated into other infinite-width neural network structures such as the convolutional neural network (CNN).
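The compositional structure mentioned in the abstract can be illustrated with a generic NNGP-style kernel recursion: starting from the input-layer covariance, each layer propagates the 2x2 covariance of a pair of inputs through the expectation of products of maxout outputs. The paper provides an efficient implementation for arbitrary maxout rank; the sketch below is NOT that implementation but a simple Monte Carlo approximation of the same recursion. All names, hyperparameters (`sigma_w`, `sigma_b`, `depth`, `rank`), and the sampling approach are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def maxout_gp_kernel(x1, x2, depth=3, rank=2, sigma_w=1.0, sigma_b=0.1,
                     n_samples=200_000, seed=0):
    """Monte Carlo sketch (hypothetical, not the paper's algorithm) of a
    deep maxout NNGP kernel.

    Each layer maps the current 2x2 covariance K of the pair (x1, x2) to
    sigma_w^2 * E[max_i(u_i) * max_i(v_i)] + sigma_b^2, where the `rank`
    pre-activation pairs (u_i, v_i) are i.i.d. bivariate Gaussians with
    covariance K, and the max is the maxout activation.
    """
    rng = np.random.default_rng(seed)
    # Base case: covariance after the linear input layer.
    K = sigma_w**2 * np.array([[x1 @ x1, x1 @ x2],
                               [x2 @ x1, x2 @ x2]]) / len(x1) + sigma_b**2
    for _ in range(depth):
        # Sample jointly Gaussian pre-activations with covariance K.
        L = np.linalg.cholesky(K + 1e-12 * np.eye(2))
        z = rng.standard_normal((n_samples, rank, 2)) @ L.T
        u = z[..., 0].max(axis=1)   # maxout over `rank` units, input x1
        v = z[..., 1].max(axis=1)   # maxout over `rank` units, input x2
        K = sigma_w**2 * np.array([[np.mean(u * u), np.mean(u * v)],
                                   [np.mean(v * u), np.mean(v * v)]]) + sigma_b**2
    return K[0, 1]
```

In a Bayesian-inference setting, such a kernel function would be evaluated on all pairs of training and test points to build the GP prior covariance; the paper's numerical implementation avoids the Monte Carlo sampling used here.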
