Open Access
ARTICLE
Learning Noise-Assisted Robust Image Features for Fine-Grained Image Retrieval
1 Department of Computer Science and Engineering, Graphic Era Deemed to be University, Dehradun, 248002, India
2 School of Computer Science Engineering and Technology, Bennett University, Greater Noida, 201310, Uttar Pradesh,
India
* Corresponding Author: Vidit Kumar. Email:
Computer Systems Science and Engineering 2023, 46(3), 2711-2724. https://doi.org/10.32604/csse.2023.032047
Received 05 May 2022; Accepted 08 December 2022; Issue published 03 April 2023
Abstract
Fine-grained image retrieval is one of the most challenging tasks in computer vision; it aims to retrieve images similar to a given query image at the fine-grained level. The key objective is to learn discriminative fine-grained features by training deep models so that similar images are clustered and dissimilar images are separated in a low-dimensional embedding space. Previous works primarily focused on defining local-structure loss functions such as triplet loss and pairwise loss. However, training with these approaches requires long training times and yields poor accuracy. Additionally, the representations learned through them tend to tighten up in the embedding space and lose generalizability to unseen classes. This paper proposes a noise-assisted representation learning method for fine-grained image retrieval to mitigate these issues. In the proposed method, class-manifold learning is performed in which positive pairs are created by a noise-insertion operation instead of tightening class clusters, while the other instances within the same cluster are treated as negatives. A loss function is then defined that penalizes cases where the distance between instances of the same class becomes too small relative to the noise pair of that class in the embedding space. The proposed approach is validated on the CARS-196 and CUB-200 datasets and achieves better retrieval results (85.38% recall@1 on CARS-196 and 70.13% recall@1 on CUB-200) than existing methods.
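The abstract's core idea can be sketched in code: a synthetic positive is created by injecting noise into an anchor embedding, and a hinge-style penalty fires whenever another instance of the same class sits closer to the anchor than that noise pair. This is a minimal illustrative sketch, not the paper's exact formulation; the function names, the Gaussian noise model, and the `margin` parameter are assumptions for illustration.

```python
import math
import random

def add_noise(embedding, sigma=0.1, rng=random):
    # Assumed noise-insertion operation: perturb the anchor embedding
    # with Gaussian noise to create a synthetic positive pair.
    return [x + rng.gauss(0.0, sigma) for x in embedding]

def euclidean(a, b):
    # Plain Euclidean distance between two embedding vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def noise_assisted_loss(anchor, noisy_positive, same_class, margin=0.2):
    """Hinge penalty (illustrative): a same-class instance is penalized
    only when its distance to the anchor falls below the distance of the
    anchor's noise-generated positive plus a margin, discouraging the
    class cluster from collapsing too tightly."""
    d_pos = euclidean(anchor, noisy_positive)
    loss = 0.0
    for inst in same_class:
        d_inst = euclidean(anchor, inst)
        # Fires only when the instance crowds the anchor relative
        # to its noise pair; zero otherwise.
        loss += max(0.0, d_pos + margin - d_inst)
    return loss / max(len(same_class), 1)
```

With anchor `[0, 0]`, noisy positive `[0.1, 0]`, and margin `0.2`, a same-class instance at distance 1.0 incurs no loss, while one at distance 0.05 is penalized, which matches the intended behavior of keeping same-class instances from tightening beyond the noise pair.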
This work is licensed under a Creative Commons Attribution 4.0 International License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.