Paper Title

Improving Spiking Sparse Recovery via Non-Convex Penalties

Paper Authors

Xiang Zhang, Lei Yu, Gang Zheng

Paper Abstract

Compared with digital methods, sparse recovery based on spiking neural networks offers great advantages such as high computational efficiency and low power consumption. However, current spiking algorithms cannot guarantee more accurate estimates, since they are usually designed to solve the classical optimization with convex penalties, especially the $\ell_{1}$-norm. In practice, convex penalties are observed to underestimate the true solution, while non-convex ones can avoid this underestimation. Inspired by this, we propose an adaptive version of the spiking sparse recovery algorithm to solve non-convex regularized optimization, and provide an analysis of its global asymptotic convergence. Experiments show that the accuracy is greatly improved under different adaptive schemes.
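The underestimation effect the abstract refers to can be illustrated with a small, purely digital (non-spiking) proximal-gradient sketch contrasting the convex $\ell_{1}$ penalty (soft thresholding) with a non-convex MCP-style penalty (firm thresholding). This is not the authors' spiking algorithm; the helper names (`soft_threshold`, `firm_threshold`, `ista`) and the choice of MCP as the non-convex penalty are illustrative assumptions.

```python
# Illustrative sketch (assumed setup, not the paper's spiking method):
# ISTA-style proximal gradient with two different proximal operators.
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of the l1 penalty: every surviving coefficient is
    # shrunk by lam, which is the source of amplitude underestimation.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def firm_threshold(x, lam, gamma=3.0):
    # Proximal operator of the MCP (minimax concave penalty): acts like soft
    # thresholding near zero but leaves large coefficients (nearly) unbiased.
    return np.where(np.abs(x) <= lam, 0.0,
                    np.where(np.abs(x) <= gamma * lam,
                             gamma * soft_threshold(x, lam) / (gamma - 1.0),
                             x))

def ista(A, y, prox, lam, n_iter=500):
    # Plain proximal-gradient iteration: x <- prox(x - t * A^T (A x - y), t * lam).
    t = 1.0 / np.linalg.norm(A, 2) ** 2   # step size from the Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = prox(x - t * A.T @ (A @ x - y), t * lam)
    return x

# Random compressed-sensing problem: y = A x_true + noise, x_true is k-sparse.
rng = np.random.default_rng(0)
n, m, k = 64, 256, 8
A = rng.standard_normal((n, m)) / np.sqrt(n)
x_true = np.zeros(m)
x_true[rng.choice(m, k, replace=False)] = 3.0 * rng.standard_normal(k)
y = A @ x_true + 0.01 * rng.standard_normal(n)

lam = 0.05
x_l1 = ista(A, y, soft_threshold, lam)
x_mcp = ista(A, y, firm_threshold, lam)
print("l1  recovery error:", np.linalg.norm(x_l1 - x_true))
print("MCP recovery error:", np.linalg.norm(x_mcp - x_true))
```

Under this setup, the $\ell_{1}$ solution typically shows a bias on the large nonzero entries, while the MCP solution does not, which is the motivation the abstract gives for moving to non-convex penalties.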
