Novel Technique for Optimizing the Hidden Layer Architecture in Artificial Neural Networks
Date
2014-08-14
Abstract
The architecture of an artificial neural network has a great impact on its generalization power. 
More precisely, changing the number of hidden layers and the number of neurons in each hidden layer 
can significantly alter the generalization ability. The architecture is therefore crucial in artificial 
neural networks, and determining the hidden layer architecture has become a research challenge. In this 
paper, a pruning technique is presented that obtains an appropriate architecture based on the 
backpropagation training algorithm. Pruning is performed using the delta values of the hidden layers. 
The proposed method has been tested on several benchmark problems in artificial neural networks and 
machine learning. The experimental results show that the modified algorithm reduces the size of the 
network without degrading performance, and it converges to the desired error faster than the standard 
backpropagation algorithm.
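
Below is a minimal sketch of the idea the abstract describes: during backpropagation training, hidden 
neurons whose delta values stay small receive almost no error signal, so they contribute little to 
learning and can be removed. The network size, data set, pruning threshold, and pruning schedule here 
are illustrative assumptions, not the paper's exact criterion.

# Hypothetical sketch: delta-based pruning of an oversized hidden layer
# in a one-hidden-layer network trained with plain backpropagation.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy XOR data set, a common benchmark problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden = 8                          # deliberately oversized hidden layer
W1 = rng.normal(0, 1, (2, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1, (n_hidden, 1))
b2 = np.zeros(1)
lr = 0.5

for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: standard backpropagation deltas.
    delta_out = (out - y) * out * (1 - out)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)

    # Gradient step.
    W2 -= lr * h.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_hid
    b1 -= lr * delta_hid.sum(axis=0)

    # Periodically prune hidden units whose mean |delta| is tiny relative
    # to the largest (assumed pruning criterion and schedule).
    if epoch > 0 and epoch % 1000 == 0:
        importance = np.abs(delta_hid).mean(axis=0)
        keep = importance > 0.01 * importance.max()
        if keep.sum() >= 2:           # never prune below two hidden units
            W1, b1, W2 = W1[:, keep], b1[keep], W2[keep, :]

print("remaining hidden units:", W1.shape[1])
print("predictions:", sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel())

Because the pruning mask is applied to both the incoming weights (W1, b1) and the outgoing weights (W2), 
the remaining network stays consistent and training simply continues on the smaller architecture.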
