Paper Title
OpenCon: Open-world Contrastive Learning
Paper Authors
Abstract
Machine learning models deployed in the wild naturally encounter unlabeled samples from both known and novel classes. Challenges arise in learning from both labeled and unlabeled data in an open-world semi-supervised manner. In this paper, we introduce a new learning framework, open-world contrastive learning (OpenCon). OpenCon tackles the challenges of learning compact representations for both known and novel classes, and facilitates novelty discovery along the way. We demonstrate the effectiveness of OpenCon on challenging benchmark datasets and establish competitive performance. On the ImageNet dataset, OpenCon significantly outperforms the current best method by 11.9% and 7.4% in novel and overall classification accuracy, respectively. Theoretically, OpenCon can be rigorously interpreted from an EM-algorithm perspective: minimizing our contrastive loss partially maximizes the likelihood by clustering similar samples in the embedding space. The code is available at https://github.com/deeplearning-wisc/opencon.
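To make the core idea concrete, below is a minimal NumPy sketch of a supervised-contrastive-style loss of the kind the abstract describes: samples sharing a label (ground-truth for known classes, or pseudo-labels from clustering for novel classes) are pulled together in the embedding space. This is an illustrative assumption, not the exact OpenCon objective; the function name and temperature value are hypothetical.

```python
import numpy as np

def contrastive_loss(z, labels, temperature=0.1):
    """Supervised-contrastive-style loss over embeddings z of shape (n, d).

    `labels` may hold ground-truth classes (known) or cluster-derived
    pseudo-labels (novel). Hypothetical sketch, not the paper's exact loss.
    """
    # L2-normalize each embedding so dot products become cosine similarities
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / temperature                 # (n, n) similarity matrix
    n = len(labels)
    not_self = ~np.eye(n, dtype=bool)           # exclude i==j pairs

    # Row-wise log-softmax over non-self similarities (numerically stable)
    sim_max = sim.max(axis=1, keepdims=True)
    exp_sim = np.exp(sim - sim_max) * not_self
    log_prob = sim - sim_max - np.log(exp_sim.sum(axis=1, keepdims=True))

    # Average negative log-probability over each anchor's positive set
    losses = []
    for i in range(n):
        positives = (labels == labels[i]) & not_self[i]
        if positives.any():
            losses.append(-log_prob[i, positives].mean())
    return float(np.mean(losses))
```

Minimizing this loss clusters same-label samples, which is the sense in which the abstract's EM view holds: assigning pseudo-labels plays the role of the E-step, and pulling positives together in the embedding space partially performs the M-step.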