Paper Title
Beyond the Golden Ratio for Variational Inequality Algorithms
Paper Authors
Paper Abstract
We improve the understanding of the $\textit{golden ratio algorithm}$, which solves monotone variational inequalities (VI) and convex-concave min-max problems via the distinctive feature of adapting the step sizes to the local Lipschitz constants. Adaptive step sizes not only eliminate the need to pick hyperparameters, but they also remove the necessity of global Lipschitz continuity and can increase from one iteration to the next. We first establish the equivalence of this algorithm with popular VI methods such as reflected gradient, Popov, or optimistic gradient descent-ascent in the unconstrained case with constant step sizes. We then move on to the constrained setting and introduce a new analysis that allows the use of larger step sizes, completing the bridge between the golden ratio algorithm and the existing algorithms in the literature. In doing so, we in fact eliminate the link between the golden ratio $\frac{1+\sqrt{5}}{2}$ and the algorithm. Moreover, we improve the adaptive version of the algorithm, first by removing the maximum step size hyperparameter (an artifact of the analysis) to improve the complexity bound, and second by adapting it to nonmonotone problems with weak Minty solutions, with superior empirical performance.
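For orientation, the following is a minimal sketch of the updates being related, written in generic notation; the operator $F$, the closed convex constraint set $C$ with projection $P_C$, and the iterates $x_k$ are notation introduced here for illustration, and the precise step-size conditions and equivalence statements are those established in the paper. The constant-step golden ratio update keeps an averaged auxiliary point and takes a projected step from it, with $\varphi = \frac{1+\sqrt{5}}{2}$:

$$\bar{x}_k = \frac{(\varphi - 1)\,x_k + \bar{x}_{k-1}}{\varphi}, \qquad x_{k+1} = P_C\big(\bar{x}_k - \lambda\, F(x_k)\big).$$

In the unconstrained case this is compared against the optimistic gradient descent-ascent update

$$x_{k+1} = x_k - \lambda\big(2F(x_k) - F(x_{k-1})\big)$$

and the reflected gradient update $x_{k+1} = x_k - \lambda F(2x_k - x_{k-1})$. The adaptive variants replace the constant $\lambda$ by a step size $\lambda_k$ built from a local Lipschitz estimate of the form $\|x_k - x_{k-1}\| / \|F(x_k) - F(x_{k-1})\|$ (up to constants and safeguards specified in the paper), which is what removes the need for a global Lipschitz constant.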