Paper Title
Iteratively Reweighted Least Squares for Basis Pursuit with Global Linear Convergence Rate
Paper Authors
Paper Abstract
The recovery of sparse data is at the core of many applications in machine learning and signal processing. While such problems can be tackled using $\ell_1$-regularization as in the LASSO estimator and in the Basis Pursuit approach, specialized algorithms are typically required to solve the corresponding high-dimensional, non-smooth optimization problems for large instances. Iteratively Reweighted Least Squares (IRLS) is a widely used algorithm for this purpose due to its excellent numerical performance. However, while existing theory guarantees convergence of this algorithm to the minimizer, it does not provide a global convergence rate. In this paper, we prove that a variant of IRLS converges with a global linear rate to a sparse solution, i.e., with a linear error decrease occurring immediately from any initialization, provided the measurements fulfill the usual null space property assumption. We support our theory with numerical experiments showing that our linear rate captures the correct dimension dependence. We anticipate that our theoretical findings will lead to new insights for many other use cases of the IRLS algorithm, such as in low-rank matrix recovery.
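For readers unfamiliar with IRLS, the sketch below illustrates the classical IRLS scheme for the basis pursuit problem $\min \|x\|_1$ subject to $Ax = y$: each iteration solves a weighted least-squares problem in closed form and shrinks a smoothing parameter $\varepsilon$. This is a minimal, generic sketch under illustrative assumptions (the function name `irls_basis_pursuit`, the geometric $\varepsilon$ schedule, and all parameter values are choices made here), not the specific variant analyzed in the paper.

```python
import numpy as np

def irls_basis_pursuit(A, y, num_iters=200, eps=1.0, eps_min=1e-8):
    """Generic IRLS sketch for basis pursuit: min ||x||_1 s.t. Ax = y.

    Each iteration solves a weighted least-squares problem in closed form
    and shrinks the smoothing parameter eps. The geometric eps schedule
    here is a simplification; the paper's variant uses its own update rule.
    """
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(num_iters):
        # Inverse weights d_i = sqrt(x_i^2 + eps^2), i.e. D = diag(d).
        d = np.sqrt(x**2 + eps**2)
        # Closed-form weighted least-squares step:
        #   x = D A^T (A D A^T)^{-1} y minimizes sum_i x_i^2 / d_i s.t. Ax = y.
        ADAt = (A * d) @ A.T
        x = d * (A.T @ np.linalg.solve(ADAt, y))
        eps = max(0.9 * eps, eps_min)
    return x

# Usage: recover a 5-sparse vector from 40 Gaussian measurements in dimension 100,
# a regime in which the null space property holds with high probability.
rng = np.random.default_rng(0)
m, n, s = 40, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
y = A @ x_true
x_hat = irls_basis_pursuit(A, y)
print(np.linalg.norm(x_hat - x_true))  # should be close to zero
```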