Paper Title
Active Learning with Multiple Kernels
Paper Authors
Paper Abstract
Online multiple kernel learning (OMKL) has provided attractive performance in nonlinear function learning tasks. Its major drawback, known as the curse of dimensionality, has recently been alleviated by leveraging a random feature approximation. In this paper, we introduce a new research problem, termed (stream-based) active multiple kernel learning (AMKL), in which a learner is allowed to request the label of selected data from an oracle according to a selection criterion. This is necessary in many real-world applications where acquiring true labels is costly or time-consuming. We prove that AMKL achieves an optimal sublinear regret, implying that the proposed selection criterion indeed avoids useless label requests. Furthermore, we propose AMKL with adaptive kernel selection (AMKL-AKS), in which irrelevant kernels can be excluded from a kernel dictionary 'on the fly'. This approach improves the efficiency of active learning as well as the accuracy of the function approximation. Numerical tests on various real datasets demonstrate that AMKL-AKS yields performance similar to or better than the best-known OMKL while using fewer labeled data.
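The abstract gives no pseudocode, so the following is only a minimal sketch of the kind of pipeline it describes, assuming random Fourier features for the kernel approximation, a multiplicative-weights combination across kernels, and a simple disagreement-based query rule standing in for the paper's actual selection criterion. All names here (`RFKernelLearner`, `amkl_sketch`, `tau`, `eta`) and the query rule itself are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)


class RFKernelLearner:
    """Online linear learner on random Fourier features, which approximate
    an RBF kernel of bandwidth `sigma` (the random feature approximation
    mentioned in the abstract)."""

    def __init__(self, dim, sigma, n_features=100, lr=0.01):
        # Spectral samples for k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
        self.W = rng.normal(0.0, 1.0 / sigma, size=(n_features, dim))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
        self.theta = np.zeros(n_features)
        self.lr = lr

    def features(self, x):
        return np.sqrt(2.0 / len(self.b)) * np.cos(self.W @ x + self.b)

    def predict(self, x):
        return float(self.theta @ self.features(x))

    def update(self, x, y):
        # One SGD step on the squared loss (theta^T z - y)^2.
        z = self.features(x)
        self.theta -= self.lr * 2.0 * (self.theta @ z - y) * z


def amkl_sketch(stream, sigmas, tau=0.5, eta=1.0):
    """Hypothetical stream-based AMKL loop: per-kernel predictions are
    combined with multiplicative weights, and a label is requested only
    when the kernels disagree by more than `tau` (an illustrative
    stand-in for the paper's selection criterion)."""
    dim = len(stream[0][0])
    learners = [RFKernelLearner(dim, s) for s in sigmas]
    weights = np.ones(len(sigmas)) / len(sigmas)
    predictions, n_requests = [], 0
    for x, oracle_y in stream:
        preds = np.array([learner.predict(x) for learner in learners])
        predictions.append(float(weights @ preds))
        if preds.max() - preds.min() > tau:  # uncertain point: query the oracle
            n_requests += 1
            weights *= np.exp(-eta * (preds - oracle_y) ** 2)
            weights /= weights.sum()         # renormalize the kernel weights
            for learner in learners:
                learner.update(x, oracle_y)
    return predictions, n_requests


# Toy usage: learn y = sin(3x) from a stream, with three candidate bandwidths.
xs = rng.uniform(-1.0, 1.0, size=(200, 1))
stream = [(x, np.sin(3.0 * x[0])) for x in xs]
preds, requests = amkl_sketch(stream, sigmas=[0.1, 0.5, 2.0])
print(f"labels requested: {requests} / {len(stream)}")
```

In this sketch, skipped rounds leave both the kernel weights and the per-kernel models untouched, which mirrors the abstract's point that a good selection criterion avoids useless label requests; the paper's adaptive kernel selection (pruning irrelevant kernels from the dictionary) is not modeled here.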