Title


Bilevel Online Deep Learning in Non-stationary Environment

Authors

Ya-nan Han, Jian-wei Liu, Bing-biao Xiao, Xin-Tan Wang, Xiong-lin Luo

Abstract


Recent years have witnessed enormous progress in online learning. However, a major challenge on the road to artificial agents is concept drift: when data instances arrive sequentially in a stream, the underlying data probability distribution can change, leading to catastrophic forgetting and degraded model performance. In this paper, we propose a new Bilevel Online Deep Learning (BODL) framework, which combines a bilevel optimization strategy with an online ensemble classifier. In the BODL algorithm, we use an ensemble classifier that builds multiple base classifiers from the outputs of different hidden layers of a deep neural network; the importance weights of the base classifiers are updated in an online manner via the exponential gradient descent method. Besides, we apply a similarity constraint to overcome the convergence problem of the online ensemble framework. Then an effective concept drift detection mechanism, which monitors the error rate of the classifier, is designed to detect changes in the data probability distribution. When concept drift is detected, our BODL algorithm adaptively updates the model parameters via bilevel optimization, thereby circumventing large drift and encouraging positive transfer. Finally, extensive experiments and ablation studies are conducted on various datasets, and the competitive numerical results illustrate that our BODL algorithm is a promising approach.
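The two online components described in the abstract can be sketched concretely: a multiplicative (exponential-gradient / Hedge-style) update of the base classifiers' importance weights, and a DDM-style drift detector that monitors the classifier's running error rate. The sketch below is illustrative only; class names, the learning rate `lr`, and the `drift_level` threshold are assumptions, not the paper's actual implementation.

```python
import numpy as np

class HedgeEnsemble:
    """Importance weights over base classifiers, updated multiplicatively
    (exponential gradient descent style), as in the online ensemble
    described in the abstract. Parameter names are illustrative."""

    def __init__(self, n_classifiers, lr=0.1):
        # Start from uniform importance weights.
        self.w = np.full(n_classifiers, 1.0 / n_classifiers)
        self.lr = lr

    def predict(self, base_probs):
        # base_probs: (n_classifiers, n_classes) class probabilities
        # from each base classifier; return the weighted-vote class.
        return int(np.argmax(self.w @ base_probs))

    def update(self, losses):
        # losses: per-classifier loss on the current instance, in [0, 1].
        # Exponentially downweight classifiers with high loss, renormalize.
        self.w *= np.exp(-self.lr * np.asarray(losses, dtype=float))
        self.w /= self.w.sum()


class ErrorRateDriftDetector:
    """DDM-style detector: track the online error rate p and its std s;
    flag drift when p + s exceeds the best (p_min + drift_level * s_min)
    seen so far. The drift_level of 3.0 is a conventional choice."""

    def __init__(self, drift_level=3.0):
        self.n = 0
        self.p = 1.0            # running error rate
        self.p_min = float("inf")
        self.s_min = float("inf")
        self.drift_level = drift_level

    def add(self, error):
        # error: 1 if the classifier misclassified this instance, else 0.
        self.n += 1
        self.p += (error - self.p) / self.n          # incremental mean
        s = np.sqrt(self.p * (1.0 - self.p) / self.n)
        if self.p + s < self.p_min + self.s_min:     # record best level
            self.p_min, self.s_min = self.p, s
        return self.p + s > self.p_min + self.drift_level * self.s_min
```

In the full BODL framework, a positive drift signal would then trigger the bilevel-optimization update of the model parameters; here the detector simply returns a boolean per instance.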
