Attention-based 3DTCN-LSTM short-term network traffic prediction model considering multi-base station spatiotemporal coupling
by Yuliang Zhan; Ji Zhou; Jiayi Zhang
International Journal of Web Engineering and Technology (IJWET), Vol. 17, No. 4, 2022

Abstract: An accurate traffic prediction method helps telecom operators manage and optimise the network in advance, and also makes it easier to adjust base station power consumption. To predict network traffic accurately, the correlation between mobile devices and local base stations cannot be ignored. To exploit the spatiotemporal characteristics of traffic for more accurate prediction, this paper proposes a 3D temporal convolutional network-long short-term memory (3DTCN-LSTM) model optimised with an attention mechanism. The attention mechanism reduces interference from redundant information and enables the extraction of long-range spatial correlations; the long-term dependency characteristics of the traffic are then captured by the LSTM network. Experiments on the dataset demonstrate that the prediction performance of the 3DTCN-LSTM model is significantly better than that of LSTM, BiLSTM, TCN, TCN-LSTM, 3DCNN and other models.
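As a rough illustration of the kind of architecture the abstract describes, the PyTorch sketch below combines dilated causal 3D convolutions over a base-station traffic grid, a simple temporal attention weighting, and an LSTM head. The grid size, layer widths, dilation schedule and attention form are assumptions for illustration only, not details taken from the paper.

```python
# Illustrative sketch only: layer sizes, grid shape and the attention form are
# assumptions, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Causal3DTCNBlock(nn.Module):
    """One dilated, causal 3D convolution over (time, lat, lon) traffic grids."""
    def __init__(self, in_ch, out_ch, k_t=2, dilation=1):
        super().__init__()
        self.pad_t = (k_t - 1) * dilation          # left-pad the time axis only
        self.conv = nn.Conv3d(in_ch, out_ch,
                              kernel_size=(k_t, 3, 3),
                              dilation=(dilation, 1, 1),
                              padding=(0, 1, 1))
        self.res = nn.Conv3d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()

    def forward(self, x):                           # x: (B, C, T, H, W)
        y = F.pad(x, (0, 0, 0, 0, self.pad_t, 0))   # causal padding on T
        y = F.relu(self.conv(y))
        return y + self.res(x)                      # residual connection

class Attn3DTCNLSTM(nn.Module):
    """3D TCN front end -> temporal attention -> LSTM, one-step-ahead forecast."""
    def __init__(self, channels=16, hidden=64, grid=(10, 10)):
        super().__init__()
        self.tcn = nn.Sequential(
            Causal3DTCNBlock(1, channels, dilation=1),
            Causal3DTCNBlock(channels, channels, dilation=2),
        )
        feat = channels * grid[0] * grid[1]
        self.attn = nn.Linear(feat, 1)              # scores each time step
        self.lstm = nn.LSTM(feat, hidden, batch_first=True)
        self.head = nn.Linear(hidden, grid[0] * grid[1])

    def forward(self, x):                           # x: (B, 1, T, H, W)
        f = self.tcn(x)                             # (B, C, T, H, W)
        B, C, T, H, W = f.shape
        seq = f.permute(0, 2, 1, 3, 4).reshape(B, T, C * H * W)
        w = torch.softmax(self.attn(seq), dim=1)    # (B, T, 1) attention weights
        seq = seq * w                               # down-weight redundant steps
        out, _ = self.lstm(seq)
        return self.head(out[:, -1])                # next-step traffic per grid cell

# Example: batch of 8 sequences, 12 past steps on a hypothetical 10x10 station grid
model = Attn3DTCNLSTM()
pred = model(torch.randn(8, 1, 12, 10, 10))
print(pred.shape)                                   # torch.Size([8, 100])
```

The causal left-padding keeps each convolution from seeing future time steps, while the 3x3 spatial kernel captures correlations between neighbouring base stations; the attention weights then decide which past steps the LSTM should emphasise.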

Online publication date: Wed, 01-Mar-2023
