Paper Title

Adaptive Noisy Data Augmentation for Regularized Estimation and Inference in Generalized Linear Models

Authors

Yinan Li, Fang Liu

Abstract

We propose the AdaPtive Noise Augmentation (PANDA) procedure to regularize the estimation and inference of generalized linear models (GLMs). PANDA iteratively optimizes the objective function on noise-augmented data until convergence to obtain the regularized model estimates. The augmented noise terms are designed to achieve various regularization effects, including $l_0$, bridge (lasso and ridge included), elastic net, adaptive lasso, and SCAD, as well as group lasso and fused ridge. We examine the tail bound of the noise-augmented loss function and establish the almost sure convergence of the noise-augmented loss function and its minimizer to the expected penalized loss function and its minimizer, respectively. We derive the asymptotic distributions of the regularized parameters, from which inference can be obtained simultaneously with variable selection. PANDA exhibits ensemble-learning behavior that helps further decrease the generalization error. Computationally, PANDA is easy to code, leveraging existing software for fitting GLMs without resorting to complicated optimization techniques. We demonstrate that PANDA performs as well as or better than existing approaches with the same type of regularizer on simulated and real-life data. We show that inference through PANDA achieves nominal or near-nominal coverage and is far more efficient than a popular existing post-selection procedure.
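The iterative scheme the abstract describes can be made concrete with a short sketch for the simplest GLM, linear regression with squared-error loss: noise-only rows with zero responses are appended to the data, with per-coefficient noise variances chosen so that the expected extra loss behaves like a bridge penalty $\lambda \sum_j |\beta_j|^\gamma$ ($\gamma=1$ resembles lasso, $\gamma=2$ ridge). The function name `panda_bridge`, the hyperparameter defaults, the variance floor, and the iterate averaging below are all illustrative assumptions, not the paper's exact algorithm, which covers other GLMs and penalties and refits with standard GLM software at each step.

```python
import numpy as np

def panda_bridge(X, y, lam=1.0, gamma=1.0, n_e=200, n_iter=300, seed=0):
    """Minimal PANDA-style sketch for linear regression (Gaussian GLM).

    Each iteration appends n_e noise-only rows with response 0. Column j of
    the noise block is drawn N(0, sigma_j^2) with
        sigma_j^2 = (lam / n_e) * |beta_j|^(gamma - 2),
    so the expected extra squared-error loss is roughly
    lam * sum_j |beta_j|^gamma, i.e. a bridge-type penalty. All scalings
    here are illustrative choices, not the paper's exact specification.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]     # OLS warm start
    y_aug = np.concatenate([y, np.zeros(n_e)])      # zero responses for noise rows
    trace = []
    for _ in range(n_iter):
        # Adaptive per-coefficient noise scale; floor |beta_j| to avoid blow-up.
        sigma2 = (lam / n_e) * np.maximum(np.abs(beta), 1e-4) ** (gamma - 2.0)
        E = rng.normal(size=(n_e, p)) * np.sqrt(sigma2)
        beta = np.linalg.lstsq(np.vstack([X, E]), y_aug, rcond=None)[0]
        trace.append(beta)
    # The trajectory is stochastic; averaging the later iterates mimics the
    # ensemble-style smoothing the abstract alludes to.
    return np.mean(trace[n_iter // 2:], axis=0)

# Toy check: sparse truth; gamma=1 targets a lasso-type penalty.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))
beta_true = np.zeros(10)
beta_true[:2] = [2.0, -1.5]
y = X @ beta_true + rng.normal(size=100)
print(np.round(panda_bridge(X, y, lam=2.0, gamma=1.0), 2))
```

At convergence $\hat\beta_j \approx \beta_j$, so the expected augmented loss $\sum_j \sigma_j^2 \beta_j^2 \cdot n_e \approx \lambda \sum_j |\beta_j|^\gamma$, which is why refitting an ordinary (unpenalized) regression on the augmented data acts as a regularizer; this is the sense in which PANDA can reuse existing GLM software.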
