Title
Rethinking Few-Shot Image Classification: a Good Embedding Is All You Need?
Authors
Abstract
The focus of recent meta-learning research has been on the development of learning algorithms that can quickly adapt to test-time tasks with limited data and low computational cost. Few-shot learning is widely used as one of the standard benchmarks in meta-learning. In this work, we show that a simple baseline, which learns a supervised or self-supervised representation on the meta-training set and then trains a linear classifier on top of this representation, outperforms state-of-the-art few-shot learning methods. An additional boost can be achieved through the use of self-distillation. This demonstrates that a good learned embedding model can be more effective than sophisticated meta-learning algorithms. We believe that our findings motivate a rethinking of few-shot image classification benchmarks and the associated role of meta-learning algorithms. Code is available at: http://github.com/WangYueFt/rfs/.
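The baseline described in the abstract can be sketched in a few lines: freeze an embedding model, embed the support and query images of a few-shot episode, and fit a linear classifier on the support embeddings. The sketch below is a minimal, self-contained illustration under simplifying assumptions — the "embeddings" are simulated with class-dependent Gaussians rather than produced by a real pre-trained network, and the linear classifier is plain softmax regression trained by gradient descent; the episode sizes (5-way, 5-shot, 15 queries) follow common few-shot conventions, not the paper's exact protocol.

```python
import numpy as np

# Hypothetical sketch of the paper's baseline: frozen embeddings plus a
# linear classifier fit on a single few-shot episode's support set.
rng = np.random.default_rng(0)
n_way, n_shot, n_query, dim = 5, 5, 15, 64

# Class centers stand in for the output of a pre-trained embedding network;
# in the real method these vectors would come from the meta-trained backbone.
centers = rng.normal(size=(n_way, dim))
support_x = np.concatenate([c + 0.3 * rng.normal(size=(n_shot, dim)) for c in centers])
support_y = np.repeat(np.arange(n_way), n_shot)
query_x = np.concatenate([c + 0.3 * rng.normal(size=(n_query, dim)) for c in centers])
query_y = np.repeat(np.arange(n_way), n_query)

# Multinomial logistic regression (the "linear classifier on top of the
# representation") trained by full-batch gradient descent.
W = np.zeros((dim, n_way))
for _ in range(200):
    logits = support_x @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    onehot = np.eye(n_way)[support_y]
    W -= 0.1 * support_x.T @ (p - onehot) / len(support_y)

acc = ((query_x @ W).argmax(axis=1) == query_y).mean()
print(f"query accuracy: {acc:.2f}")
```

Because the simulated class clusters are well separated relative to the noise, the linear classifier recovers them easily; the point of the sketch is the pipeline shape (embed, then fit a linear head), not the numbers.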