Open Access
ARTICLE
Multi-Head Attention Graph Network for Few-Shot Learning
1 School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan, 430074, China
2 School of Electronics, Electrical Engineering and Computer Science, Queen's University Belfast, Belfast, BT7 1NN, UK
* Corresponding Author: Hefei Ling. Email:
Computers, Materials & Continua 2021, 68(2), 1505-1517. https://doi.org/10.32604/cmc.2021.016851
Received 12 January 2021; Accepted 13 February 2021; Issue published 13 April 2021
Abstract
The majority of existing graph-network-based few-shot models rely on a node-similarity update mode. This lack of richer relational information intensifies the risk of overfitting. In this paper, we propose a novel Multi-Head Attention Graph Network to mine discriminative relations and achieve effective information propagation. For the edge update, node-level attention evaluates the similarity between two nodes, while distribution-level attention extracts deeper global relations; the cooperation between these two parts yields a discriminative and comprehensive edge representation. For the node update, we employ label-level attention to soften the noise of irrelevant nodes and optimize the update direction. The proposed model is verified through extensive experiments on two few-shot benchmarks, MiniImageNet and CIFAR-FS. The results show that our method has strong noise immunity and fast convergence, and its classification accuracy outperforms most state-of-the-art approaches.
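To make the abstract's two update stages concrete, the following is a minimal NumPy sketch of one message-passing round. It assumes dot-product similarity, a softmax over each node's similarity profile as the "distribution", and a simple average of the two attention heads; the function names (`edge_update`, `node_update`) and these parameterization choices are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def edge_update(nodes):
    """Combine node-level and distribution-level attention into edge features.

    nodes: (N, D) array of node features.
    Returns an (N, N) edge-feature (attention) matrix whose rows sum to 1.
    """
    sim = nodes @ nodes.T                       # pairwise similarity (N, N)
    node_att = softmax(sim, axis=-1)            # node-level attention head
    dist = softmax(sim, axis=-1)                # each node's "distribution" over all nodes
    dist_att = softmax(dist @ dist.T, axis=-1)  # distribution-level attention head
    return 0.5 * (node_att + dist_att)          # assumed equal-weight head fusion

def node_update(nodes, edges, labels):
    """Label-level attention: down-weight neighbors with mismatched labels,
    softening the noise of irrelevant nodes, then aggregate features."""
    same = (labels[:, None] == labels[None, :]).astype(float)
    att = edges * same
    att = att / att.sum(axis=-1, keepdims=True)  # renormalize per row
    return att @ nodes
```

In an actual episodic few-shot setting, `labels` would come from the support set (with query labels handled by a learned edge prediction); the hard same-label mask here stands in for that learned label-level weighting.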
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.