Research on remote sensing image classification method using two-stream convolutional neural network
by Kai Peng; Juan Hu; Siyu Liu; Fang Qu; Houqun Yang; Jing Chen
International Journal of Wireless and Mobile Computing (IJWMC), Vol. 23, No. 3/4, 2022

Abstract: Owing to the lack of remote sensing image datasets with regionally specific characteristics for classification, we have published remote sensing imagery of parts of Haikou City, Hainan Province, and built the HN-7 dataset, which captures characteristics specific to Hainan Province. The HN-7 dataset consists of seven classes, among which the construction site and dirt road categories appear in a public remote sensing dataset for the first time. Because HN-7 is limited in size, we trained a small convolutional neural network from scratch for the classification task, using a three-layer two-stream network to improve the accuracy of the model. Our model achieved 98.57% accuracy on the test set. We also compared the accuracy of four common networks trained on HN-7, and the results showed that our model achieves the best performance.
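
The abstract describes a small three-layer, two-stream convolutional network trained from scratch for 7-class classification. The sketch below (PyTorch) illustrates what such an architecture could look like; the layer widths, kernel sizes, fusion-by-concatenation scheme, input resolution, and what the second stream consumes are all assumptions, since the abstract does not specify the authors' exact design.

```python
# A minimal sketch of a three-layer two-stream CNN for 7-class classification.
# All hyperparameters below are assumptions, not the paper's reported settings.
import torch
import torch.nn as nn


class StreamBranch(nn.Module):
    """One three-layer convolutional stream (assumed structure)."""

    def __init__(self, in_channels: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),  # global average pooling per stream
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.features(x).flatten(1)  # (N, 128)


class TwoStreamCNN(nn.Module):
    """Two parallel streams whose features are concatenated before the classifier."""

    def __init__(self, num_classes: int = 7):
        super().__init__()
        self.stream_a = StreamBranch()
        self.stream_b = StreamBranch()
        self.classifier = nn.Linear(128 * 2, num_classes)

    def forward(self, x_a: torch.Tensor, x_b: torch.Tensor) -> torch.Tensor:
        # x_a and x_b are the two stream inputs (e.g. the raw image and some
        # transformed view of it); the paper's actual second input is not
        # stated in the abstract.
        feats = torch.cat([self.stream_a(x_a), self.stream_b(x_b)], dim=1)
        return self.classifier(feats)


if __name__ == "__main__":
    model = TwoStreamCNN(num_classes=7)
    dummy = torch.randn(2, 3, 224, 224)
    # Feed the same tensor to both streams purely as a shape check.
    logits = model(dummy, dummy)
    print(logits.shape)  # torch.Size([2, 7])
```

Training such a compact model from scratch, rather than fine-tuning a large pretrained backbone, is consistent with the abstract's motivation of working with a limited-size dataset.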

Online publication date: Mon, 12-Dec-2022

