Paper Title

Conditional Gradients for the Approximate Vanishing Ideal

Authors

Elias Wirth, Sebastian Pokutta

Abstract

The vanishing ideal of a set of points $X\subseteq \mathbb{R}^n$ is the set of polynomials that evaluate to $0$ over all points $\mathbf{x} \in X$ and admits an efficient representation by a finite set of polynomials called generators. To accommodate noise in the data set, we introduce the pairwise conditional gradients approximate vanishing ideal algorithm (PCGAVI), which constructs a set of generators of the approximate vanishing ideal. The constructed generators capture polynomial structure in the data and give rise to a feature map that can, for example, be used in combination with a linear classifier for supervised learning. In PCGAVI, we construct the set of generators by solving constrained convex optimization problems with the pairwise conditional gradients algorithm. Thus, PCGAVI constructs generators that are not only few in number but also sparse, making the corresponding feature transformation robust and compact. Furthermore, we derive several learning guarantees for PCGAVI that make the algorithm theoretically better motivated than related generator-constructing methods.
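
To illustrate the constrained convex optimization viewpoint described in the abstract, the following is a minimal, hypothetical sketch rather than the authors' PCGAVI implementation: it constructs a single approximately vanishing polynomial on noisy unit-circle data by fixing the leading term x1^2 and fitting the coefficients of the remaining monomials with plain Frank-Wolfe (conditional gradients) over an l1-ball. The degree-2 term set, the l1-ball radius tau, and the use of the vanilla rather than the pairwise variant are all assumptions made for this toy example.

```python
# Illustrative sketch only (not PCGAVI): fit one approximately vanishing
# polynomial g(x) = x1^2 + c . (1, x1, x2, x1*x2, x2^2) on noisy circle data
# by minimizing the mean squared evaluation of g over an l1-ball constraint,
# using plain Frank-Wolfe (conditional gradients). tau and the term set are
# assumed values chosen for this example.
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of the unit circle, on which x1^2 + x2^2 - 1 vanishes exactly.
theta = rng.uniform(0.0, 2.0 * np.pi, size=200)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.normal(size=(200, 2))
x1, x2 = X[:, 0], X[:, 1]

b = x1 ** 2                                            # evaluations of the fixed leading term
A = np.c_[np.ones_like(x1), x1, x2, x1 * x2, x2 ** 2]  # evaluations of 1, x1, x2, x1*x2, x2^2
m, d = A.shape

tau = 2.0        # l1-ball radius (assumed); the target coefficients (-1, 0, 0, 0, 1) are feasible
c = np.zeros(d)

for k in range(2000):
    grad = 2.0 * A.T @ (b + A @ c) / m
    # Linear minimization oracle of the l1-ball: a signed, scaled coordinate vector.
    i = int(np.argmax(np.abs(grad)))
    s = np.zeros(d)
    s[i] = -tau * np.sign(grad[i])
    gamma = 2.0 / (k + 2.0)                            # standard Frank-Wolfe step size
    c = (1.0 - gamma) * c + gamma * s

print("coefficients of (1, x1, x2, x1*x2, x2^2):", np.round(c, 3))
print("mean squared evaluation of the generator:", np.mean((b + A @ c) ** 2))
```

Because the feasible region is an l1-ball and each Frank-Wolfe update moves toward a single vertex, the coefficient vector stays sparse, which mirrors the sparsity property the abstract attributes to the generators constructed by PCGAVI.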
