Paper Title
BaseTransformers: Attention over base data-points for One Shot Learning
Paper Authors
Paper Abstract
Few-shot classification aims to learn to recognize novel categories using only limited samples per category. Most current few-shot methods use a base dataset rich in labeled examples to train an encoder that is used for obtaining representations of support instances for novel classes. Since the test instances come from a distribution different from the base distribution, their feature representations are of poor quality, degrading performance. In this paper we propose to make use of the well-trained feature representations of the base dataset that are closest to each support instance to improve its representation during meta-test time. To this end, we propose BaseTransformers, which attends to the most relevant regions of the base dataset feature space and improves support instance representations. Experiments on three benchmark datasets show that our method works well for several backbones and achieves state-of-the-art results in the inductive one-shot setting. Code is available at github.com/mayug/BaseTransformers
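The core idea of attending over base data-points to refine a support representation can be sketched as a simple cross-attention step: the support feature acts as the query, and nearby base-dataset features act as keys and values. The sketch below is a minimal illustration under assumed details (scaled dot-product attention with a residual update; the function name and interface are hypothetical, not the paper's exact formulation).

```python
import numpy as np

def enrich_support_feature(support, base_feats):
    """Refine a support feature vector by attending over base-dataset features.

    support:    (d,)   query vector from the novel-class support instance.
    base_feats: (n, d) feature vectors from the base dataset (keys = values).

    Hypothetical sketch of cross-attention with a residual connection;
    the actual BaseTransformers architecture may differ.
    """
    d = support.shape[-1]
    # Scaled dot-product scores between the support query and each base point.
    scores = base_feats @ support / np.sqrt(d)        # (n,)
    # Numerically stable softmax over the base data-points.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Weighted combination of base features, then a residual update.
    attended = weights @ base_feats                   # (d,)
    return support + attended

# Toy usage: one 8-dim support feature, 16 base features.
rng = np.random.default_rng(0)
support = rng.normal(size=8)
base_feats = rng.normal(size=(16, 8))
refined = enrich_support_feature(support, base_feats)
```

At meta-test time, the refined vector would replace the raw support embedding when computing class prototypes or distances, letting well-trained base representations compensate for the distribution shift of novel classes.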