Title: STAGNN: a spatial-temporal attention graph neural network for network traffic prediction

Authors: Yonghua Luo; Qian Ning; Bingcai Chen; Xinzhi Zhou; Linyu Huang

Addresses: College of Electronics and Information Engineering, Sichuan University, Chengdu, 610065, China; College of Electronics and Information Engineering, Sichuan University, Chengdu, 610065, China; College of Computer Science and Technology, Dalian University of Technology, Dalian, 116000, China; College of Electronics and Information Engineering, Sichuan University, Chengdu, 610065, China; College of Electronics and Information Engineering, Sichuan University, Chengdu, 610065, China

Abstract: Accurate, real-time traffic prediction enables reasonable allocation of communication-network resources and effectively improves communication quality. However, the complex topology and highly dynamic nature of communication networks pose new challenges for traffic prediction. To effectively capture the temporal correlation and spatial dependency of network traffic while masking redundant traffic features, we propose a spatial-temporal attention graph neural network (STAGNN). STAGNN combines the graph attention network (GAT) with the time-series model Informer: GAT learns the complex spatial dependencies of the network topology, and Informer learns the dynamic temporal correlation of network traffic. During learning, we also introduce a multi-head attention mechanism that enables STAGNN to quickly select high-value traffic information using limited attention resources. Experimental results demonstrate that STAGNN achieves better prediction performance than existing methods.
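The multi-head attention mechanism highlighted in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the random projection matrices stand in for the learned weights, and the function shapes (a `(seq_len, d_model)` traffic-feature sequence) are assumptions made for illustration only.

```python
import numpy as np

def multi_head_attention(x, num_heads=4, seed=0):
    """Minimal scaled dot-product multi-head attention (illustrative only).

    x: (seq_len, d_model) sequence of traffic features.
    Weights are random placeholders, not trained parameters from the paper.
    """
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0, "d_model must divide evenly across heads"
    d_k = d_model // num_heads
    rng = np.random.default_rng(seed)
    # Random projections stand in for the learned W_Q, W_K, W_V, W_O matrices.
    w_q, w_k, w_v, w_o = (
        rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
        for _ in range(4)
    )
    # Project and split into heads: (num_heads, seq_len, d_k).
    q = (x @ w_q).reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)
    # Scaled dot-product scores per head: (num_heads, seq_len, seq_len).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)
    # Softmax over the key axis gives each query a weighting of the sequence.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Recombine heads and apply the output projection.
    out = (weights @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o

# Usage: attend over a hypothetical 10-step window of 8-dimensional features.
x = np.random.default_rng(1).standard_normal((10, 8))
y = multi_head_attention(x, num_heads=4)
```

Each head attends to the sequence independently, letting the model focus its limited attention budget on the most informative traffic observations, which is the role the abstract ascribes to the mechanism.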

Keywords: network traffic prediction; attention mechanism; temporal correlation; spatial dependency.

DOI: 10.1504/IJCNDS.2024.139320

International Journal of Communication Networks and Distributed Systems, 2024 Vol.30 No.4, pp.413 - 432

Received: 26 May 2022
Accepted: 16 Apr 2023

Published online: 01 Jul 2024
