Paper Title
Explicitly Trained Spiking Sparsity in Spiking Neural Networks with Backpropagation
Paper Authors
Paper Abstract
Spiking Neural Networks (SNNs) are being explored for their potential energy efficiency resulting from sparse, event-driven computation. Many recent works have demonstrated effective backpropagation for deep SNNs by approximating gradients over discontinuous neuron spikes or firing events. A beneficial side effect of these surrogate-gradient spiking backpropagation algorithms is that the spikes, which trigger additional computation, may now themselves be directly considered in the gradient calculations. We propose explicitly including spike counts in the loss function alongside a traditional error loss, causing the backpropagation learning algorithm to optimize weight parameters for both accuracy and spiking sparsity. As supported by existing theory of over-parameterized neural networks, there are many solution states with effectively equivalent accuracy. Appropriate weighting of the two loss terms during this multi-objective optimization can therefore improve spiking sparsity without a significant loss of accuracy. We additionally explore a simulated-annealing-inspired loss weighting technique that increases the weighting for sparsity as training progresses. Our preliminary results on the CIFAR-10 dataset show up to a 70.1% reduction in spiking activity at iso-accuracy compared with an equivalent SNN trained only for accuracy, and up to a 73.3% reduction in spiking activity if a trade-off of 1% in classification accuracy is allowed.
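The abstract does not specify the exact form of the combined objective or the annealing schedule. The sketch below is a minimal PyTorch-style illustration of the general idea only: the names `combined_loss` and `annealed_sparsity_weight`, the mean-spike-count penalty, and the linear ramp schedule are assumptions for exposition, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def combined_loss(logits, targets, spike_counts, sparsity_weight):
    # Traditional error (accuracy) loss term.
    error_loss = F.cross_entropy(logits, targets)
    # Spike-count penalty: mean number of spikes across the network.
    # With surrogate gradients, this term contributes to the weight updates.
    sparsity_loss = spike_counts.float().mean()
    return error_loss + sparsity_weight * sparsity_loss

def annealed_sparsity_weight(epoch, total_epochs, max_weight):
    # Annealing-inspired schedule (assumed linear ramp): start near zero so
    # training first pursues accuracy, then increasingly rewards sparsity.
    return max_weight * epoch / total_epochs
```

In this reading, `sparsity_weight` controls the trade-off between the two loss goals, and ramping it up over epochs lets the network settle into an accurate solution state before sparsity pressure is applied.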