Title: A light-weight model with granularity feature representation for fine-grained visual classification

Authors: Qiumei Zheng; Tianqi Peng; Ding Huang; Fenghua Wang; Nengxiang Xu

Addresses: College of Computer Science and Technology, China University of Petroleum Huadong, Qingdao, Shandong, China; College of Computer Science and Technology, China University of Petroleum Huadong, Qingdao, Shandong, China; School of Marxism, Hunan Non-ferrous Metals Vocational and Technical College, Zhuzhou, Hunan, China; College of Computer Science and Technology, China University of Petroleum Huadong, Qingdao, Shandong, China; College of Computer Science and Technology, Hunan University of Science and Technology, Changsha, Hunan, China

Abstract: Fine-grained image recognition can provide a more precise recognition technique for industrial production and applications. However, because convolutional neural networks (CNNs) struggle to capture comprehensive features and discriminative regions, this ability is largely limited. With a lightweight orientation, we exploit the Transformer's advantage in capturing global features, combine it with the technically mature CNN, and propose MV-GFR, a lightweight model based on MobileViT. We further propose three lightweight modules that help the network capture more subtle differences. First, a training module provides the network with richer granularity information while preserving its global integrity. Second, a feature part mask module combines the diversity of the CNN with the saliency of the Transformer. Finally, a feature fusion module integrates features at different levels so that global and local features complement each other. We demonstrate the effectiveness of this scheme through experiments on three commonly used datasets.
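The abstract describes the modules only at a high level. As one illustration of the idea behind the feature part mask and feature fusion steps, the following minimal sketch keeps the most salient local (part-level) features and concatenates their pooled result with a global feature. All names, shapes, and the saliency criterion here are hypothetical assumptions for illustration, not the authors' MV-GFR implementation.

```python
# Hypothetical sketch: mask out less salient part features, then fuse the
# surviving local features with a global feature. NOT the authors' code.
import numpy as np

def part_mask(part_feats: np.ndarray, keep: int) -> np.ndarray:
    """Keep the `keep` most salient part features (ranked by L2 norm,
    an assumed stand-in for a learned saliency score), zeroing the rest."""
    norms = np.linalg.norm(part_feats, axis=1)
    top = np.argsort(norms)[::-1][:keep]
    mask = np.zeros(len(part_feats), dtype=bool)
    mask[top] = True
    return part_feats * mask[:, None]

def fuse(global_feat: np.ndarray, part_feats: np.ndarray) -> np.ndarray:
    """Concatenate the global feature with average-pooled part features
    so global and local cues complement each other."""
    pooled = part_feats.mean(axis=0)
    return np.concatenate([global_feat, pooled])

rng = np.random.default_rng(0)
g = rng.normal(size=64)           # global (Transformer-branch) feature
p = rng.normal(size=(8, 64))      # 8 local (CNN-branch) part features
fused = fuse(g, part_mask(p, keep=4))
print(fused.shape)                # (128,)
```

In a real model the mask would be driven by learned attention rather than a norm heuristic, and the fusion would typically feed a classifier head; this sketch only shows the data flow the abstract outlines.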

Keywords: fine-grained visual classification; FGVC; light-weight; granularity feature; convolutional neural network; CNN.

DOI: 10.1504/IJCSE.2024.138426

International Journal of Computational Science and Engineering, 2024, Vol. 27, No. 3, pp. 341–351

Received: 08 Feb 2023
Accepted: 28 Mar 2023

Published online: 03 May 2024
