Paper Title

On the Iteration Complexity of Smoothed Proximal ALM for Nonconvex Optimization Problem with Convex Constraints

Paper Authors

Jiawei Zhang, Wenqiang Pu, Zhi-Quan Luo

Paper Abstract

It is well known that the lower bound on the iteration complexity of solving nonconvex unconstrained optimization problems is $\Omega(1/\epsilon^2)$, which is achieved by the standard gradient descent algorithm when the objective function is smooth. This lower bound still holds for nonconvex constrained problems, but it remains unknown whether a first-order method can achieve it. In this paper, we show that a simple single-loop first-order algorithm, the smoothed proximal augmented Lagrangian method (ALM), achieves this iteration complexity lower bound. The key technical contribution is a strong local error bound for general convex constrained problems, which is of independent interest.
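To illustrate the single-loop structure mentioned in the abstract, the sketch below runs a generic smoothed proximal ALM on a toy linearly constrained nonconvex problem. This is not the paper's exact scheme: the objective, constraint, step sizes (`alpha`, `s`, `beta`), and penalty/proximal weights (`rho`, `p`) are all illustrative assumptions, and the paper's analysis prescribes its own parameter choices.

```python
import numpy as np

# Toy nonconvex problem:  min f(x)  s.t.  Ax = b
# f(x) = (x0^2 - 1)^2 + x1^2   (nonconvex in x0); minimizer on Ax=b is (1, 0).
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

def grad_f(x):
    return np.array([4.0 * x[0] * (x[0] ** 2 - 1.0), 2.0 * x[1]])

# Illustrative (not the paper's) parameters: penalty rho, proximal weight p,
# primal step alpha, dual step s, smoothing step beta.
rho, p, alpha, s, beta = 10.0, 1.0, 0.02, 0.5, 0.5

x = np.zeros(2)      # primal iterate
z = np.zeros(2)      # proximal center (the "smoothing" sequence)
lam = np.zeros(1)    # dual multiplier

for _ in range(5000):
    # One gradient step on K(x, z, lam) =
    #   f(x) + lam^T (Ax - b) + (rho/2)||Ax - b||^2 + (p/2)||x - z||^2
    g = grad_f(x) + A.T @ (lam + rho * (A @ x - b)) + p * (x - z)
    x = x - alpha * g
    # Dual ascent on the constraint residual
    lam = lam + s * (A @ x - b)
    # Smoothing update: move the proximal center toward the iterate
    z = z + beta * (x - z)

print(np.round(x, 3), abs((A @ x - b)[0]))
```

Each iteration does one primal gradient step, one dual update, and one smoothing update, so the method stays single-loop; the extra proximal term $(p/2)\|x - z\|^2$ with the slowly moving center $z$ is what distinguishes the smoothed variant from a plain augmented Lagrangian method.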
