Title

Lecture notes: Efficient approximation of kernel functions

Author

Bagchi, Amitabha

Abstract

These lecture notes endeavour to collect in one place the mathematical background required to understand the properties of kernels in general and the Random Fourier Features approximation of Rahimi and Recht (NIPS 2007) in particular. We briefly motivate the use of kernels in Machine Learning with the example of the support vector machine. We discuss positive definite and conditionally negative definite kernels in some detail. After a brief discussion of Hilbert spaces, including the Reproducing Kernel Hilbert Space construction, we present Mercer's theorem. We discuss the Random Fourier Features technique and then present, with proofs, scalar and matrix concentration results that help us estimate the error incurred by the technique. These notes are the transcription of 10 lectures given at IIT Delhi between January and April 2020.
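
The abstract refers to the Random Fourier Features approximation of Rahimi and Recht (NIPS 2007). As a brief illustration (not taken from the notes themselves), the following is a minimal NumPy sketch of the standard construction for the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)); the function name random_fourier_features and the parameters D and sigma are illustrative choices, and the inner products of the resulting features approximate the kernel with error that shrinks as D grows.

import numpy as np

def random_fourier_features(X, D=2000, sigma=1.0, seed=None):
    # Map each row of X to a D-dimensional random Fourier feature vector
    # whose pairwise inner products approximate the Gaussian kernel
    # k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the (Gaussian) Fourier transform of the kernel,
    # plus a uniform random phase, as in the Rahimi-Recht construction.
    W = rng.normal(0.0, 1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Compare the exact Gaussian kernel matrix with its RFF approximation.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
Z = random_fourier_features(X, D=5000, sigma=1.0, seed=1)
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K_exact = np.exp(-sq_dists / 2.0)
K_approx = Z @ Z.T
print(np.max(np.abs(K_exact - K_approx)))  # small; decreases as D grows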
