Paper Title

Examining the Robustness of Spiking Neural Networks on Non-ideal Memristive Crossbars

Authors

Abhiroop Bhattacharjee, Youngeun Kim, Abhishek Moitra, Priyadarshini Panda

Abstract

Spiking Neural Networks (SNNs) have recently emerged as a low-power alternative to Artificial Neural Networks (ANNs) owing to their asynchronous, sparse, and binary information processing. To improve energy-efficiency and throughput, SNNs can be implemented on memristive crossbars, where Multiply-and-Accumulate (MAC) operations are realized in the analog domain using emerging Non-Volatile-Memory (NVM) devices. Despite the compatibility of SNNs with memristive crossbars, little attention has been paid to studying the effect of intrinsic crossbar non-idealities and stochasticity on the performance of SNNs. In this paper, we conduct a comprehensive analysis of the robustness of SNNs on non-ideal crossbars. We examine SNNs trained via learning algorithms such as surrogate gradient and ANN-SNN conversion. Our results show that repetitive crossbar computations across multiple time-steps induce error accumulation, resulting in a large performance drop during SNN inference. We further show that SNNs trained with a smaller number of time-steps achieve better accuracy when deployed on memristive crossbars.
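The abstract's core observation — that repeating noisy crossbar MACs over many time-steps accumulates error — can be illustrated with a toy simulation. This is a minimal sketch, not the paper's experimental setup: the weight matrix, noise model (multiplicative conductance variation with standard deviation `sigma`), LIF parameters, and the `crossbar_mac`/`run_snn` helpers are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy layer: an 8x4 crossbar feeding 4 LIF neurons.
W = rng.uniform(0.1, 1.0, size=(8, 4))               # target conductances
x = rng.integers(0, 2, size=(50, 8)).astype(float)   # binary input spike trains

def crossbar_mac(spikes, weights, sigma):
    """Analog MAC with multiplicative conductance variation (toy non-ideality)."""
    noisy_w = weights * (1.0 + sigma * rng.standard_normal(weights.shape))
    return spikes @ noisy_w

def run_snn(T, sigma, v_th=2.0, leak=0.9):
    """Run T time-steps of leaky integrate-and-fire; return accumulated MAC error."""
    v = np.zeros(4)
    accumulated_error = 0.0
    for t in range(T):
        ideal = x[t] @ W                       # what an exact digital MAC would give
        noisy = crossbar_mac(x[t], W, sigma)   # what the non-ideal crossbar gives
        accumulated_error += np.abs(noisy - ideal).sum()
        v = leak * v + noisy                   # integrate
        fired = v >= v_th                      # fire
        v[fired] = 0.0                         # reset
    return accumulated_error

# Error grows with the number of time-steps, echoing the abstract's claim
# that SNNs using fewer time-steps fare better on non-ideal crossbars.
err_short = run_snn(T=5, sigma=0.1)
err_long = run_snn(T=40, sigma=0.1)
```

Since every time-step adds a non-negative error term, `err_long` exceeds `err_short`; in a real crossbar the per-step deviations also depend on wire resistance and device-to-device variation, which this sketch does not model.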
