Paper Title


High-order structure preserving graph neural network for few-shot learning

Authors

Guangfeng Lin, Ying Yang, Yindi Fan, Xiaobing Kang, Kaiyang Liao, Fan Zhao

Abstract


Few-shot learning can find the latent structure information between prior knowledge and queried data through the similarity metric of meta-learning, in order to construct a discriminative model that recognizes new categories from rare labeled samples. Most existing methods try to model the similarity relationships among samples within each task and then generalize the model to identify new categories. However, the relationships among samples across separate tasks are difficult to consider because the metric criteria differ between tasks. In contrast, the proposed high-order structure preserving graph neural network (HOSP-GNN) further explores the rich structure of the samples to predict the labels of queried data on a graph: the structure evolution explicitly discriminates categories by iteratively updating the high-order structure relationships (a relative metric over multiple samples, instead of a pairwise sample metric) under manifold structure constraints. HOSP-GNN can not only mine high-order structure to complement the relevance between samples that may be divided into different tasks in meta-learning, but can also derive the structure-updating rule from the manifold constraint. Furthermore, HOSP-GNN does not need to retrain the learning model to recognize new classes, and its well-generalizable high-order structure supports model adaptability. Experiments show that HOSP-GNN outperforms state-of-the-art methods for supervised and semi-supervised few-shot learning on three benchmark datasets: miniImageNet, tieredImageNet and FC100.
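The abstract does not give HOSP-GNN's exact update rule, but the distinction it draws between a pairwise sample metric and a relative (high-order) metric over multiple samples can be illustrated with a minimal sketch. The `high_order_similarity` function below is a hypothetical construction for illustration only: it scores each sample pair relative to every other sample used as an anchor, rather than in isolation.

```python
import numpy as np

def pairwise_similarity(x):
    # Standard pairwise metric: cosine similarity between every sample pair.
    x = x / np.linalg.norm(x, axis=1, keepdims=True)
    return x @ x.T

def high_order_similarity(x, eps=1e-8):
    # Hypothetical "relative" metric over sample triples: how aligned the
    # difference vectors (x_i - x_k) and (x_j - x_k) are, averaged over all
    # anchors k. Each pair is judged *relative to* the rest of the set,
    # which is the contrast the abstract draws with pairwise metrics.
    n = x.shape[0]
    s = np.zeros((n, n))
    for k in range(n):
        d = x - x[k]                                   # differences w.r.t. anchor k
        d = d / (np.linalg.norm(d, axis=1, keepdims=True) + eps)
        s += d @ d.T                                   # cosine of difference vectors
    return s / n

# Toy example: 4 samples in 2-D, two loose clusters.
x = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
P = pairwise_similarity(x)
H = high_order_similarity(x)
```

Both matrices are symmetric and of shape `(n, n)`; in HOSP-GNN such a structure matrix would be refined iteratively under manifold constraints rather than computed once.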
