An improved three-term optical backpropagation algorithm
Online publication date: Tue, 31-Mar-2015
by M. Sornam; P. Thangavel
International Journal of Artificial Intelligence and Soft Computing (IJAISC), Vol. 2, No. 4, 2011
Abstract: An improved Optical Backpropagation (OBP) algorithm with a third term is proposed for training single-hidden-layer feedforward neural networks. The major limitations of the backpropagation algorithm are the local minima problem and a slow rate of convergence. To address these problems, we propose an algorithm that introduces a third term into optical backpropagation (OBPWT). The method is applied to a multilayer neural network to improve convergence speed. In the proposed algorithm, a non-linear function is applied to the error term before the backpropagation phase, and this transformed error is used together with a third term in the weight update rule. We show that the proposed algorithm substantially accelerates training convergence while maintaining the neural network's performance. Its effectiveness is demonstrated on five benchmark problems; the simulation results show that the proposed algorithm speeds up learning and hence the rate of convergence.
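The abstract does not give the paper's exact formulas, but the idea it describes can be sketched as follows: a standard single-hidden-layer backpropagation pass in which (a) the output error is amplified by a non-linear (exponential) transform before the backward pass, and (b) the weight update carries a third term in addition to the usual gradient and momentum terms. The particular transform `err * (1 + exp(err^2))`, the proportional form of the third term, and all hyperparameter values below are illustrative assumptions drawn from the general OBP and three-term-backpropagation literature, not from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR, a classic small benchmark for backpropagation variants
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# single hidden layer: 2 inputs -> 4 hidden units -> 1 output
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)

# learning rate, momentum, and third-term gain (illustrative values)
eta, alpha, gamma = 0.3, 0.8, 0.01
vW1, vb1 = np.zeros_like(W1), np.zeros_like(b1)
vW2, vb2 = np.zeros_like(W2), np.zeros_like(b2)

for _ in range(10000):
    # forward pass
    H = sigmoid(X @ W1 + b1)
    O = sigmoid(H @ W2 + b2)
    err = T - O

    # "optical" error: non-linear transform applied to the raw error
    # before backpropagation (assumed OBP-style amplification)
    opt_err = err * (1.0 + np.exp(err ** 2))

    # backward pass using the transformed error
    dO = opt_err * O * (1.0 - O)
    dH = (dO @ W2.T) * H * (1.0 - H)

    # raw-error deltas, used below as a proportional "third term"
    rO = err * O * (1.0 - O)
    rH = (rO @ W2.T) * H * (1.0 - H)

    n = len(X)
    # three-term update: gradient step + momentum + proportional term
    vW2 = eta * (H.T @ dO) / n + alpha * vW2 + gamma * (H.T @ rO) / n
    vb2 = eta * dO.mean(0) + alpha * vb2 + gamma * rO.mean(0)
    vW1 = eta * (X.T @ dH) / n + alpha * vW1 + gamma * (X.T @ rH) / n
    vb1 = eta * dH.mean(0) + alpha * vb1 + gamma * rH.mean(0)
    W2 += vW2; b2 += vb2; W1 += vW1; b1 += vb1

mse = float(np.mean((T - sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)) ** 2))
print(f"final MSE on XOR: {mse:.4f}")
```

The amplification factor `1 + exp(err^2)` is at least 2, so large errors produce proportionally larger corrections early in training, while the raw-error factor keeps updates shrinking as the network converges; the small gain `gamma` keeps the proportional third term from dominating the gradient step.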