VGG16 and Bi-LSTM fused with an attention mechanism for human action recognition in infrared images
by Gao Cheng; Tang Chao; Tong Anyang; Wang Wenjian
International Journal of Computing Science and Mathematics (IJCSM), Vol. 20, No. 1, 2024

Abstract: Action recognition has long been a popular subject of research in computer vision because of its broad application prospects. Infrared videos are suitable for monitoring in all weather conditions and help preserve the privacy of the data. We propose a method of human action recognition in infrared videos that fuses the visual geometry group 16 (VGG16) network and a bi-directional long short-term memory (Bi-LSTM) network with an attention mechanism. First, we extract infrared images from an infrared video and pre-process them. Second, we use the VGG16 model to extract the spatial features of the images through convolution and pooling, and we apply the Bi-LSTM fused with the attention mechanism to extract their temporal features. Finally, the two networks' outputs are combined into a classification result through a score-fusion strategy at the decision level. The method is tested on several infrared datasets, and the results show that it is effective.
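The two pipeline stages named in the abstract, attention over Bi-LSTM time steps and decision-level score fusion, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the scoring vector `w`, the fusion weight `alpha`, and the function names are hypothetical assumptions, and the actual paper learns these components end-to-end inside the networks.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(h, w):
    # h: (T, D) Bi-LSTM hidden states over T time steps
    # w: (D,) scoring vector (hypothetical stand-in for the learned attention parameters)
    weights = softmax(h @ w)       # (T,) attention weights over time steps
    return weights @ h             # (D,) attention-weighted temporal summary

def decision_fusion(spatial_logits, temporal_logits, alpha=0.5):
    # decision-level score fusion: weighted sum of the two streams'
    # class probabilities (alpha is an assumed fusion weight)
    return alpha * softmax(spatial_logits) + (1 - alpha) * softmax(temporal_logits)
```

The fused output is still a valid probability distribution over action classes, so the predicted action is simply its argmax.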

Online publication date: Thu, 11-Jul-2024
