A block-encoding method for evolving neural network architecture
Online publication date: Mon, 06-Sep-2021
by Xiaohu Shi; Hongyan Guo; Chunguo Wu; Yanchun Liang; Zhiyong Chang
International Journal of Bio-Inspired Computation (IJBIC), Vol. 18, No. 1, 2021
Abstract: The architecture and parameters of a convolutional neural network have a major impact on its performance. To overcome the difficulties of most existing neural architecture search (NAS) methods, including fixed network architectures and huge computing costs, this paper proposes a block-encoding-based neural architecture evolution method. A new block-encoding scheme is designed that divides the convolutional neural network architecture into blocks, each consisting of multiple functional layers. An efficient mutation operation is designed to speed up the evolutionary search and expand the evolution space of network architectures. Finally, the optimal evolved network is converted into an all-convolutional neural network with fewer parameters and a more concise architecture. Experiments on image datasets indicate that the proposed method can greatly reduce network parameters and search time, achieve competitive classification accuracy, and directly yield the corresponding all-convolutional network architecture.
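To make the block-level idea concrete, the sketch below shows one plausible way such an encoding and mutation could look. The paper's actual encoding, layer choices, and mutation operators are not given in the abstract, so every name and operation here (`make_block`, `mutate`, the channel choices) is an illustrative assumption, not the authors' method.

```python
import random

# Hypothetical block-based genome: each block groups several functional
# layers (e.g. convolution + activation, optionally followed by pooling).
# All specifics below are assumptions for illustration only.
def make_block(out_channels, kernel_size=3, pooling=False):
    layers = [
        {"type": "conv", "out_channels": out_channels, "kernel": kernel_size},
        {"type": "relu"},
    ]
    if pooling:
        layers.append({"type": "pool", "kernel": 2})
    return {"layers": layers}

def mutate(genome, rng):
    """Mutate at the block level: add, remove, or alter one whole block.

    Operating on blocks rather than individual layers shrinks the search
    space, which is the kind of speed-up the abstract describes.
    """
    # Copy so the parent genome is left untouched.
    genome = [{"layers": [dict(l) for l in b["layers"]]} for b in genome]
    op = rng.choice(["add", "remove", "alter"])
    if op == "add" or len(genome) <= 1:
        genome.insert(rng.randrange(len(genome) + 1),
                      make_block(rng.choice([32, 64, 128])))
    elif op == "remove":
        genome.pop(rng.randrange(len(genome)))
    else:  # alter: change the conv width of a random block
        block = rng.choice(genome)
        block["layers"][0]["out_channels"] = rng.choice([32, 64, 128])
    return genome

rng = random.Random(0)
parent = [make_block(32, pooling=True), make_block(64)]
child = mutate(parent, rng)
print(len(parent), len(child))
```

A genome like this evolves for a fixed budget, and the best individual would then be translated into the final all-convolutional network (for instance, replacing pooling layers with strided convolutions, a standard all-convolutional construction).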