Paper Title


SingCubic: Cyclic Incremental Newton-type Gradient Descent with Cubic Regularization for Non-Convex Optimization

Authors

Shi, Ziqiang

Abstract


In this work, we generalize and unify two recent, completely different works, ~\cite{shi2015large} and ~\cite{cartis2012adaptive}, by proposing the cyclic incremental Newton-type gradient descent with cubic regularization (SingCubic) method for optimizing non-convex functions. At each iteration of SingCubic, a cubic-regularized global quadratic approximation built from Hessian information is maintained and solved. Preliminary numerical experiments show encouraging performance of the SingCubic algorithm compared to basic incremental or stochastic Newton-type implementations. The results and techniques can serve as a starting point for research on incremental Newton-type gradient descent methods that employ cubic regularization. The methods and principles proposed in this paper can be applied to logistic regression, autoencoder training, independent component analysis, Ising model/Hopfield network training, multilayer perceptron training, deep convolutional network training, and so on. We will open-source parts of our implementations soon.
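The abstract does not spell out the update rule, so the following is only a minimal one-dimensional sketch of the idea it describes: cycle over component functions, keep per-component gradient/Hessian memory, and at each step minimize a cubic-regularized quadratic model. The function names (`cubic_step`, `sing_cubic`), the toy quadratic objective, and the fixed regularization weight `sigma` are all illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def cubic_step(g, h, sigma):
    """Minimize the 1-D cubic model m(s) = g*s + 0.5*h*s**2 + (sigma/3)*|s|**3.
    The minimizer has sign -sign(g); its magnitude u solves
    sigma*u**2 + h*u - |g| = 0 (take the positive root)."""
    if g == 0.0:
        return 0.0
    u = (-h + np.sqrt(h * h + 4.0 * sigma * abs(g))) / (2.0 * sigma)
    return -np.sign(g) * u

# Toy objective: f(x) = sum_i 0.5 * a_i * (x - b_i)^2 (purely illustrative).
a = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 3.0])

def sing_cubic(x0, sigma=1.0, epochs=30):
    n = len(a)
    grads = a * (x0 - b)  # per-component gradient memory
    hesss = a.copy()      # per-component Hessian memory
    x = x0
    for it in range(epochs * n):
        i = it % n                      # cyclic component selection
        grads[i] = a[i] * (x - b[i])    # refresh component i at the current x
        hesss[i] = a[i]
        g, h = grads.sum(), hesss.sum() # aggregated global model
        x = x + cubic_step(g, h, sigma) # cubic-regularized Newton-type step
    return x

x_star = (a * b).sum() / a.sum()  # exact minimizer of the toy objective
print(sing_cubic(0.0), x_star)
```

The cubic term damps the Newton step far from the optimum (large `|g|` shrinks the effective step relative to `-g/h`) while recovering a near-Newton step close to it, which is the usual motivation for cubic regularization in non-convex settings.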
