Paper Title

Kernel-convoluted Deep Neural Networks with Data Augmentation

Paper Authors

Minjin Kim, Young-geun Kim, Dongha Kim, Yongdai Kim, Myunghee Cho Paik

Abstract

The Mixup method (Zhang et al., 2018), which trains on linearly interpolated data, has emerged as an effective data augmentation tool for improving generalization performance and robustness to adversarial examples. The motivation is that its implicit model constraint, which encourages the model to behave linearly between observed data points, curtails undesirable oscillations and promotes smoothness. In this work, we formally investigate this premise, propose a way to impose smoothness constraints explicitly, and extend it to incorporate implicit model constraints. First, we derive a new function class composed of kernel-convoluted models (KCM), in which the smoothness constraint is imposed directly by locally averaging the original functions with a kernel function. Second, we propose incorporating the Mixup method into KCM to expand the domain of smoothness. For both KCM and KCM combined with Mixup, we provide risk analyses under certain conditions on the kernels. We show that the upper bound on the excess risk converges no more slowly than that of the original function class. The upper bound for KCM with Mixup remains dominated by that of KCM if the Mixup perturbation vanishes faster than \(O(n^{-1/2})\), where \(n\) is the sample size. Using the CIFAR-10 and CIFAR-100 datasets, our experiments demonstrate that KCM with Mixup outperforms the Mixup method in terms of generalization and robustness to adversarial examples.
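To make the two ingredients of the abstract concrete, the following is a minimal NumPy sketch, not the paper's implementation: `mixup` interpolates a pair of examples as in Zhang et al. (2018), and `kernel_convolve` approximates a kernel-convoluted model \((f * k)(x) = \mathbb{E}_{u \sim k}[f(x - u)]\) by Monte Carlo averaging, here with a Gaussian kernel. Function names, the Gaussian kernel choice, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mixup(x1, y1, x2, y2, alpha=1.0):
    """Mixup (Zhang et al., 2018): linearly interpolate a pair of examples.

    The mixing weight lam is drawn from Beta(alpha, alpha), so the
    synthetic example lies on the segment between the two inputs.
    """
    lam = rng.beta(alpha, alpha)
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2

def kernel_convolve(f, x, sigma=0.1, n_samples=200):
    """Monte Carlo estimate of the kernel-convoluted model value
    (f * k)(x) = E_{u ~ k}[f(x - u)], using a Gaussian kernel k
    with scale sigma. Larger sigma averages f over a wider
    neighborhood of x, i.e., enforces more smoothing.
    """
    noise = rng.normal(scale=sigma, size=(n_samples,) + np.shape(x))
    return np.mean([f(x - u) for u in noise], axis=0)

# Example: smoothing an oscillatory function damps its local wiggles.
f = lambda t: np.sin(20 * t)
print(kernel_convolve(f, 0.5, sigma=0.2))
```

In the paper's setting `f` would be a neural network and the averaging is imposed on the function class itself; the sketch only illustrates the local-averaging idea behind the smoothness constraint.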
