Paper Title

S$^3$NN: Time Step Reduction of Spiking Surrogate Gradients for Training Energy Efficient Single-Step Spiking Neural Networks

Paper Authors

Kazuma Suetake, Shin-ichi Ikegawa, Ryuji Saiin, Yoshihide Sawada

Paper Abstract

As the scales of neural networks increase, techniques that enable them to run with low computational cost and high energy efficiency are required. To meet such demands, various efficient neural network paradigms, such as spiking neural networks (SNNs) and binary neural networks (BNNs), have been proposed. However, they have stubborn drawbacks, such as degraded inference accuracy and latency. To solve these problems, we propose the single-step spiking neural network (S$^3$NN), an energy-efficient neural network with low computational cost and high precision. The proposed S$^3$NN processes the information between hidden layers by spikes, as SNNs do. Nevertheless, it has no temporal dimension, so there is no latency in the training or inference phase, as with BNNs. Thus, the proposed S$^3$NN has a lower computational cost than SNNs, which require time-series processing. However, S$^3$NN cannot adopt naïve backpropagation algorithms due to the non-differentiable nature of spikes. We deduce a suitable neuron model by reducing the surrogate gradient for multi-time-step SNNs to a single time step. We experimentally demonstrate that the obtained surrogate gradient allows S$^3$NN to be trained appropriately. We also show that the proposed S$^3$NN can achieve accuracy comparable to full-precision networks while being highly energy-efficient.
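The abstract's central mechanism is a spiking activation with no temporal dimension, trained by substituting a smooth derivative for the non-differentiable spike. Below is a minimal sketch, assuming PyTorch, of what such a single-step spiking activation can look like: the forward pass fires binary spikes through a Heaviside step, while the backward pass routes gradients through a surrogate. The sigmoid-shaped surrogate and its sharpness parameter `alpha` are generic illustrative choices, not the specific gradient derived in the paper.

```python
# A minimal sketch of a single-step spiking activation with a surrogate
# gradient (illustrative only; the paper derives its own surrogate by
# reducing multi-time-step SNN gradients to a single time step).
import torch


class SingleStepSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, membrane_potential, alpha=4.0):
        # Fire a binary spike wherever the membrane potential crosses the threshold (0 here).
        ctx.save_for_backward(membrane_potential)
        ctx.alpha = alpha
        return (membrane_potential > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # Replace the ill-defined derivative of the Heaviside step with the
        # derivative of a sigmoid of sharpness alpha (an assumed, generic surrogate).
        sig = torch.sigmoid(ctx.alpha * membrane_potential)
        surrogate = ctx.alpha * sig * (1.0 - sig)
        return grad_output * surrogate, None  # no gradient w.r.t. alpha


# Usage: spikes are binary, so downstream layers can trade multiplications
# for additions; gradients still flow through the surrogate.
x = torch.randn(8, 16, requires_grad=True)
spikes = SingleStepSpike.apply(x)
spikes.sum().backward()
print(x.grad.shape)  # torch.Size([8, 16])
```

Because there is only a single time step, this activation behaves like a BNN-style binarization at inference while keeping the spiking interpretation between hidden layers that the abstract describes.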
