Title

Statistical Optimality of Divide and Conquer Kernel-based Functional Linear Regression

Authors

Jiading Liu, Lei Shi

Abstract

Previous analysis of regularized functional linear regression in a reproducing kernel Hilbert space (RKHS) typically requires the target function to be contained in this kernel space. This paper studies the convergence performance of divide-and-conquer estimators in the scenario that the target function does not necessarily reside in the underlying RKHS. As a decomposition-based scalable approach, the divide-and-conquer estimators of functional linear regression can substantially reduce the algorithmic complexities in time and memory. We develop an integral operator approach to establish sharp finite sample upper bounds for prediction with divide-and-conquer estimators under various regularity conditions of explanatory variables and target function. We also prove the asymptotic optimality of the derived rates by building the minimax lower bounds. Finally, we consider the convergence of noiseless estimators and show that the rates can be arbitrarily fast under mild conditions.
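The divide-and-conquer scheme the abstract refers to can be illustrated with a minimal sketch: partition the sample into m disjoint subsamples, train a regularized kernel estimator on each, and average the resulting predictors. The sketch below uses plain kernel ridge regression with a Gaussian kernel as a stand-in local estimator; the kernel choice, the regularization parameter `lam`, and the helper names (`krr_fit`, `divide_and_conquer`) are illustrative assumptions, not the paper's actual construction for functional covariates.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian kernel matrix between the rows of A and B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def krr_fit(X, y, lam=0.01):
    # Local kernel ridge estimator: alpha = (K + n*lam*I)^{-1} y.
    n = len(y)
    K = rbf_kernel(X, X)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return X, alpha

def krr_predict(model, X_new):
    X_train, alpha = model
    return rbf_kernel(X_new, X_train) @ alpha

def divide_and_conquer(X, y, m, lam=0.01, seed=0):
    # Split the sample into m subsamples, fit one local estimator
    # per subsample, and average their predictions (the distributed
    # estimator); each solve costs O((n/m)^3) instead of O(n^3).
    rng = np.random.default_rng(seed)
    parts = np.array_split(rng.permutation(len(y)), m)
    models = [krr_fit(X[idx], y[idx], lam) for idx in parts]
    return lambda X_new: np.mean(
        [krr_predict(mdl, X_new) for mdl in models], axis=0
    )

# Toy usage: smooth target with small additive noise.
rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 5))
y = np.sin(X.sum(axis=1)) + 0.05 * rng.standard_normal(200)
f_bar = divide_and_conquer(X, y, m=4)
pred = f_bar(X)
```

Averaging over subsamples is what makes the approach scalable: memory and time are dominated by the largest local kernel matrix, not the full one, while the paper's analysis shows the averaged estimator can still attain the minimax-optimal prediction rate.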
