Title: LightMOT: a lightweight convolution neural network for real-time multi-object tracking

Authors: Lie Guo; Pingshu Ge; Yibing Zhao; Dongxing Wang; Liang Huang

Addresses: School of Automotive Engineering, Dalian University of Technology, Dalian 116024, China and Ningbo Institute of Dalian University of Technology, Ningbo 315016, China; College of Mechanical and Electronic Engineering, Dalian Minzu University, Dalian 116600, China; School of Automotive Engineering, Dalian University of Technology, Dalian 116024, China; School of Automotive Engineering, Dalian University of Technology, Dalian 116024, China; School of Automotive Engineering, Dalian University of Technology, Dalian 116024, China

Abstract: Multi-object tracking (MOT) is an important problem in computer vision with a wide range of applications. However, MOT networks are computationally expensive and cannot run in real time on mobile platforms. A lightweight multi-object tracking algorithm, named LightMOT, is proposed based on the fairness multi-object tracking network (FairMOT). ShuffleNet V2 is adopted as the backbone network for feature extraction, and two efficient neural network modules are designed to improve tracking performance: an information fusion module (IFM) fuses feature map information across scales at a small computational cost, while an information enhancement module (IEM) extracts feature information from the final feature map. Tests were carried out in three typical scenarios from the D2-City dataset. In urban road scenes, adding IFM and IEM together increases MOTA from 51.9% to 63.2% and also reduces the number of ID switches. After conversion to the ncnn format, the model reaches 30 fps on an embedded platform.
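The abstract names the components but not their internals. Below is a minimal PyTorch sketch of how such a pipeline could be wired, assuming IFM is a cheap pointwise-convolution fusion of two backbone stages and IEM is a squeeze-and-excitation style channel re-weighting of the final map. Only ShuffleNet V2, the IFM/IEM names, and the ncnn target come from the abstract; the class definitions, channel widths, and fusion scheme are illustrative assumptions, not the paper's design.

```python
import torch
import torch.nn as nn
from torchvision.models import shufflenet_v2_x1_0


class IFM(nn.Module):
    """Sketch of an information fusion module: fuse a deep, low-resolution
    feature map with a shallower, higher-resolution one using only cheap
    pointwise convolutions (the fusion scheme here is an assumption)."""
    def __init__(self, deep_ch, shallow_ch, out_ch):
        super().__init__()
        self.reduce_deep = nn.Conv2d(deep_ch, out_ch, kernel_size=1)
        self.reduce_shallow = nn.Conv2d(shallow_ch, out_ch, kernel_size=1)
        self.up = nn.Upsample(scale_factor=2, mode="nearest")

    def forward(self, deep, shallow):
        # Upsample the deep map to the shallow map's resolution, then add.
        return self.up(self.reduce_deep(deep)) + self.reduce_shallow(shallow)


class IEM(nn.Module):
    """Sketch of an information enhancement module: squeeze-and-excitation
    style channel re-weighting of the final feature map (an assumption)."""
    def __init__(self, ch, reduction=4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.gate = nn.Sequential(
            nn.Conv2d(ch, ch // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(ch // reduction, ch, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(self.pool(x))


class LightMOTBackbone(nn.Module):
    """ShuffleNet V2 (1.0x) feature extractor with IFM/IEM attached."""
    def __init__(self):
        super().__init__()
        net = shufflenet_v2_x1_0(weights=None)
        self.stem = nn.Sequential(net.conv1, net.maxpool)
        self.stage2, self.stage3, self.stage4 = net.stage2, net.stage3, net.stage4
        # 232/464 are the stage3/stage4 channel counts of the 1.0x variant.
        self.ifm = IFM(deep_ch=464, shallow_ch=232, out_ch=128)
        self.iem = IEM(128)

    def forward(self, x):
        x = self.stage2(self.stem(x))
        s3 = self.stage3(x)    # stride 16
        s4 = self.stage4(s3)   # stride 32
        return self.iem(self.ifm(s4, s3))


if __name__ == "__main__":
    # FairMOT-style input resolution; output is a stride-16 feature map.
    feat = LightMOTBackbone()(torch.randn(1, 3, 608, 1088))
    print(feat.shape)  # torch.Size([1, 128, 38, 68])
```

For deployment, one common route to ncnn is exporting the model to ONNX with `torch.onnx.export(...)` and then running the stock converter, e.g. `onnx2ncnn lightmot.onnx lightmot.param lightmot.bin`; the exact export path used by the authors is not stated in the abstract.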

Keywords: advanced driver assistance systems; deep learning; multi-object tracking; MOT; lightweight; FairMOT.

DOI: 10.1504/IJBIC.2023.135467

International Journal of Bio-Inspired Computation, 2023 Vol.22 No.3, pp.152 - 161

Received: 11 Aug 2022
Accepted: 02 Aug 2023

Published online: 14 Dec 2023
