Paper Title

Rethinking the role of normalization and residual blocks for spiking neural networks

Authors

Ikegawa, Shin-ichi, Saiin, Ryuji, Sawada, Yoshihide, Natori, Naotake

Abstract

Biologically inspired spiking neural networks (SNNs) are widely used to realize ultralow-power energy consumption. However, deep SNNs are not easy to train due to the excessive firing of spiking neurons in the hidden layers. To tackle this problem, we propose a novel but simple normalization technique called postsynaptic potential normalization. This normalization removes the subtraction term from the standard normalization and uses the second raw moment instead of the variance as the division term. By applying this simple normalization to the postsynaptic potential, the spike firing can be controlled and training can proceed appropriately. The experimental results show that SNNs with our normalization outperformed models using other normalizations. Furthermore, with pre-activation residual blocks, the proposed model can be trained with more than 100 layers without other special techniques dedicated to SNNs.
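The abstract's description pins down the normalization's two changes to standard (batch-style) normalization: drop the mean-subtraction term, and divide by the second raw moment E[x²] rather than the variance. A minimal NumPy sketch of that operation follows; the per-channel reduction axes and the epsilon value are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def postsynaptic_potential_normalization(x, eps=1e-5):
    """Normalize postsynaptic potentials x of shape (batch, channels, ...).

    Unlike standard normalization, (1) there is no mean-subtraction term,
    and (2) the division term is the second raw moment E[x^2] rather than
    the variance E[(x - E[x])^2]. Axes and eps are illustrative assumptions.
    """
    # Second raw moment over batch and spatial axes, computed per channel.
    axes = (0,) + tuple(range(2, x.ndim))
    second_moment = np.mean(x ** 2, axis=axes, keepdims=True)
    return x / np.sqrt(second_moment + eps)
```

Because no mean is subtracted, the sign of each postsynaptic potential is preserved, and the output's per-channel second raw moment is rescaled to roughly 1, which is the mechanism the abstract credits with keeping spike firing under control.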
