Paper Title

Efficient Recovery of Low Rank Tensor via Triple Nonconvex Nonsmooth Rank Minimization

Paper Author

Yu, Quan

Paper Abstract

A tensor nuclear norm (TNN) based method for solving the tensor recovery problem was recently proposed and has achieved state-of-the-art performance. However, it may fail to produce a highly accurate solution, since it tends to treat each frontal slice, and each rank component of each frontal slice, equally. To obtain a recovery with high accuracy, we propose a general and flexible rank relaxation function, named the double weighted nonconvex nonsmooth rank (DWNNR) relaxation function, for efficiently solving the third-order tensor recovery problem. The DWNNR relaxation function can be derived from the triple nonconvex nonsmooth rank (TNNR) relaxation function by setting the weight vector to be the hypergradient value of some concave function, thereby adaptively selecting the weight vector. To accelerate the proposed model, we develop a general inertial smoothing proximal gradient method. Furthermore, we prove that any limit point of the generated subsequence is a critical point. Combining the Kurdyka-Lojasiewicz (KL) property with some milder assumptions, we further give a global convergence guarantee. Experimental results on a practical tensor completion problem, with both synthetic and real data, demonstrate the efficiency and superior performance of the proposed algorithm.
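The abstract does not spell out the DWNNR formula or the paper's exact update rules, so the following is only a minimal matrix-case sketch, in NumPy, of the two ideas it does describe: choosing weights as the (hyper)gradient of a concave function of the singular values, and an inertial proximal gradient loop. The log surrogate, the function names, and all parameter values below are illustrative assumptions, not the paper's method.

```python
# Minimal sketch (not the paper's DWNNR algorithm): inertial proximal gradient
# for matrix completion with adaptively reweighted singular value thresholding.
# Weights are the derivative of an assumed concave surrogate phi(s) = log(s + eps),
# mimicking "weight vector = hypergradient of some concave function".
import numpy as np

def concave_weights(sigma, eps=1e-2):
    # w_i = phi'(sigma_i) for phi(s) = log(s + eps): larger singular values
    # get smaller weights and are therefore penalized less.
    return 1.0 / (sigma + eps)

def weighted_svt(Y, tau, eps=1e-2):
    # Reweighted singular value thresholding used as the proximal step.
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_new = np.maximum(s - tau * concave_weights(s, eps), 0.0)
    return (U * s_new) @ Vt

def inertial_prox_grad_completion(M, mask, tau=1.0, beta=0.5, step=1.0, iters=200):
    # min_X 0.5 * ||mask * (X - M)||_F^2 + tau * R(X), with R handled by weighted_svt.
    X_prev = np.zeros_like(M)
    X = np.zeros_like(M)
    for _ in range(iters):
        Y = X + beta * (X - X_prev)      # inertial (momentum) extrapolation
        grad = mask * (Y - M)            # gradient of the smooth data-fit term
        X_prev, X = X, weighted_svt(Y - step * grad, step * tau)
    return X
```

The inertial extrapolation term beta * (X - X_prev) is the standard acceleration device for proximal gradient schemes, and global convergence of such nonconvex iterations is typically analyzed, as in the abstract, via the Kurdyka-Lojasiewicz property.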
