Paper Title

UFO-BLO: Unbiased First-Order Bilevel Optimization

Paper Authors

Valerii Likhosherstov, Xingyou Song, Krzysztof Choromanski, Jared Davis, Adrian Weller

Abstract

Bilevel optimization (BLO) is a popular approach with many applications including hyperparameter optimization, neural architecture search, adversarial robustness and model-agnostic meta-learning. However, the approach suffers from time and memory complexity proportional to the length $r$ of its inner optimization loop, which has led to several modifications being proposed. One such modification is \textit{first-order} BLO (FO-BLO) which approximates outer-level gradients by zeroing out second derivative terms, yielding significant speed gains and requiring only constant memory as $r$ varies. Despite FO-BLO's popularity, there is a lack of theoretical understanding of its convergence properties. We make progress by demonstrating a rich family of examples where FO-BLO-based stochastic optimization does not converge to a stationary point of the BLO objective. We address this concern by proposing a new FO-BLO-based unbiased estimate of outer-level gradients, enabling us to theoretically guarantee this convergence, with no harm to memory and expected time complexity. Our findings are supported by experimental results on Omniglot and Mini-ImageNet, popular few-shot meta-learning benchmarks.
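The following is a minimal sketch (not the authors' code) contrasting the exact BLO outer gradient with the FO-BLO approximation on a toy quadratic problem. The objectives `inner_loss` and `outer_loss`, the inner-loop length `r`, and the step size `lr` are illustrative assumptions chosen only to make the two gradient computations concrete.

```python
# Minimal sketch, assuming toy quadratic inner/outer objectives.
import jax
import jax.numpy as jnp

def inner_loss(theta, w):
    # Inner-level objective; its minimizer depends on the outer variable w.
    return jnp.sum((theta - w) ** 2) + 0.1 * jnp.sum(theta ** 2)

def outer_loss(theta):
    # Outer-level objective, evaluated at the adapted inner parameters.
    return jnp.sum((theta - 1.0) ** 2)

def adapt(w, r=5, lr=0.1):
    # Inner optimization loop: r gradient steps on the inner loss, started at w.
    theta = w
    for _ in range(r):
        theta = theta - lr * jax.grad(inner_loss)(theta, w)
    return theta

def blo_objective(w):
    # BLO objective: outer loss after the unrolled inner loop.
    return outer_loss(adapt(w))

# Exact BLO gradient: backpropagates through the unrolled inner loop,
# which involves second-derivative terms and O(r) memory in plain reverse mode.
exact_grad = jax.grad(blo_objective)

def fo_blo_grad(w):
    # FO-BLO: zero out the second-derivative terms by treating the adapted
    # parameters as constant w.r.t. w, i.e. just the outer-loss gradient at theta_r.
    theta_r = adapt(w)
    return jax.grad(outer_loss)(theta_r)

w0 = jnp.zeros(3)
print(exact_grad(w0))   # exact outer-level gradient
print(fo_blo_grad(w0))  # first-order (generally biased) approximation
```

The gap between the two printed gradients is the bias the paper analyzes; the proposed estimator keeps the constant-memory, first-order style of computation while remaining unbiased in expectation.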
