Paper Title

Semi-parametric TEnsor Factor Analysis by Iteratively Projected Singular Value Decomposition

Authors

Elynn Y. Chen, Dong Xia, Chencheng Cai, Jianqing Fan

Abstract

This paper introduces a general framework of Semi-parametric TEnsor Factor Analysis (STEFA) that focuses on the methodology and theory of low-rank tensor decomposition with auxiliary covariates. Semi-parametric TEnsor Factor Analysis models extend tensor factor models by incorporating auxiliary covariates in the loading matrices. We propose an algorithm of iteratively projected singular value decomposition (IP-SVD) for the semi-parametric estimation. It iteratively projects tensor data onto the linear space spanned by the basis functions of covariates and applies singular value decomposition on matricized tensors over each mode. We establish the convergence rates of the loading matrices and the core tensor factor. The theoretical results only require a sub-exponential noise distribution, which is weaker than the assumption of sub-Gaussian tail of noise in the literature. Compared with the Tucker decomposition, IP-SVD yields more accurate estimators with a faster convergence rate. Besides estimation, we propose several prediction methods with new covariates based on the STEFA model. On both synthetic and real tensor data, we demonstrate the efficacy of the STEFA model and the IP-SVD algorithm on both the estimation and prediction tasks.
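To make the projected-SVD idea in the abstract concrete, the sketch below alternates between contracting the data tensor with the current loadings of the other modes, projecting the mode-k matricization onto the column space spanned by the covariate basis functions, and refreshing the mode-k loading by a thin SVD. This is a minimal illustration under assumed interfaces, not the authors' reference implementation: the names `ip_svd`, `mode_unfold`, `multi_mode_product`, and the `basis`/`ranks` arguments are hypothetical, and the sketch covers only the covariate-projected part of the loadings (the full STEFA/IP-SVD procedure in the paper also handles the idiosyncratic loading components).

```python
import numpy as np

def mode_unfold(T, k):
    # Mode-k matricization: rows indexed by mode k, columns by the remaining modes.
    return np.moveaxis(T, k, 0).reshape(T.shape[k], -1)

def multi_mode_product(T, mats, skip):
    # Contract T along every mode j != skip with mats[j] (shape p_j x r_j),
    # i.e. compute T x_j mats[j]^T for all modes except `skip`.
    Z = T
    for j, M in enumerate(mats):
        if j == skip:
            continue
        Z = np.moveaxis(np.tensordot(Z, M, axes=(j, 0)), -1, j)
    return Z

def ip_svd(Y, basis, ranks, n_iter=20):
    # Y     : data tensor, e.g. of shape (p1, p2, p3)
    # basis : list of covariate basis matrices Phi_k, each of shape (p_k, J_k)
    # ranks : target multilinear ranks (r1, ..., rK)
    K = Y.ndim
    # Orthogonal projection onto the column space of each covariate basis.
    P = [Phi @ np.linalg.pinv(Phi.T @ Phi) @ Phi.T for Phi in basis]
    # Spectral initialization: thin SVD of each projected mode-k unfolding.
    U = [np.linalg.svd(P[k] @ mode_unfold(Y, k), full_matrices=False)[0][:, :ranks[k]]
         for k in range(K)]
    # Iterative refinement: shrink the other modes with the current loadings,
    # re-project mode k onto its covariate space, and refresh by a thin SVD.
    for _ in range(n_iter):
        for k in range(K):
            Z = multi_mode_product(Y, U, skip=k)
            U[k] = np.linalg.svd(P[k] @ mode_unfold(Z, k),
                                 full_matrices=False)[0][:, :ranks[k]]
    # Core tensor factor: Y contracted with all estimated loadings.
    F = multi_mode_product(Y, U, skip=-1)
    return U, F
```

As a usage illustration, for a 3-way tensor `Y` of shape (50, 50, 50), covariate bases `basis = [Phi1, Phi2, Phi3]` built from, say, spline or polynomial transformations of the mode-wise covariates, and `ranks = (3, 3, 3)`, the call `U, F = ip_svd(Y, basis, ranks)` would return estimated loading matrices and a core tensor factor in the spirit of the procedure described above.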
