Paper Title
Projected Gradient Descent Algorithms for Solving Nonlinear Inverse Problems with Generative Priors
Paper Authors
Paper Abstract
In this paper, we propose projected gradient descent (PGD) algorithms for signal estimation from noisy nonlinear measurements. We assume that the unknown $p$-dimensional signal lies near the range of an $L$-Lipschitz continuous generative model with bounded $k$-dimensional inputs. In particular, we consider two cases when the nonlinear link function is either unknown or known. For unknown nonlinearity, similarly to \cite{liu2020generalized}, we make the assumption of sub-Gaussian observations and propose a linear least-squares estimator. We show that when there is no representation error and the sensing vectors are Gaussian, roughly $O(k \log L)$ samples suffice to ensure that a PGD algorithm converges linearly to a point achieving the optimal statistical rate using arbitrary initialization. For known nonlinearity, we assume monotonicity as in \cite{yang2016sparse}, and make much weaker assumptions on the sensing vectors and allow for representation error. We propose a nonlinear least-squares estimator that is guaranteed to enjoy an optimal statistical rate. A corresponding PGD algorithm is provided and is shown to also converge linearly to the estimator using arbitrary initialization. In addition, we present experimental results on image datasets to demonstrate the performance of our PGD algorithms.
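To make the algorithmic idea concrete, the following display gives a schematic form of the PGD iteration for the linear least-squares estimator in the unknown-nonlinearity case. The notation here (sensing matrix $A \in \mathbb{R}^{m \times p}$, observations $y \in \mathbb{R}^m$, step size $\eta$, and an approximate projection $\mathcal{P}_G$ onto the range of the generative model $G$) is our own illustrative convention and is not taken verbatim from the paper:

$$
x^{(t+1)} \;=\; \mathcal{P}_G\!\left(x^{(t)} - \frac{\eta}{m}\, A^\top\!\big(A x^{(t)} - y\big)\right), \qquad \mathcal{P}_G(u) \;\in\; \arg\min_{x \in \mathrm{Range}(G)} \|x - u\|_2,
$$

where the projection is typically computed only approximately, e.g., by gradient descent over the $k$-dimensional latent input of $G$. In the known-nonlinearity case, an analogous iteration applies, with the gradient of the nonlinear least-squares loss taking the place of the linear one.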