Novel Technique for Optimizing the Hidden Layer Architecture in Artificial Neural Networks

dc.contributor.author: Wagarachchi, NM
dc.contributor.author: Karunananda, AS
dc.date.accessioned: 2014-08-14T13:56:06Z
dc.date.available: 2014-08-14T13:56:06Z
dc.date.issued: 2014-08-14
dc.description.abstract: The architecture of an artificial neural network has a great impact on its generalization power. More precisely, changing the number of hidden layers and the number of neurons in each hidden layer can significantly change the generalization ability. The architecture is therefore crucial in an artificial neural network, and determining the hidden layer architecture has become a research challenge. In this paper, a pruning technique is presented that obtains an appropriate architecture based on the backpropagation training algorithm. Pruning is done using the delta values of the hidden layers. The proposed method has been tested on several benchmark problems in artificial neural networks and machine learning. The experimental results show that the modified algorithm reduces the size of the network without degrading performance, and that it converges to the desired error faster than the standard backpropagation algorithm.
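
The pruning idea described in the abstract can be illustrated in code. The following is a minimal sketch, not the authors' implementation: it assumes a single-hidden-layer network trained with plain backpropagation, accumulates the backpropagated error terms (deltas) of the hidden units, and then removes units whose accumulated |delta| is small. The network shape, the XOR data, the learning rate, and the 10% pruning threshold are all illustrative assumptions, not values from the paper.

# Hypothetical sketch of delta-value pruning (not the authors' code).
# A single-hidden-layer MLP is trained with plain backpropagation; the
# backpropagated error term (delta) of each hidden unit is accumulated,
# and units with the smallest mean |delta| are removed afterwards.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: XOR, a standard benchmark for small feed-forward networks.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden, lr, epochs = 8, 0.5, 5000
W1 = rng.normal(0, 1, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1, (n_hidden, 1)); b2 = np.zeros(1)

delta_sum = np.zeros(n_hidden)          # accumulated |delta| per hidden unit
for _ in range(epochs):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: standard backpropagation error terms (deltas).
    d_out = (out - y) * out * (1 - out)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    delta_sum += np.abs(d_hid).sum(axis=0)
    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid; b1 -= lr * d_hid.sum(axis=0)

# Prune: keep hidden units whose accumulated |delta| exceeds a fraction
# of the largest (the 10% threshold is an assumption, not from the paper).
keep = delta_sum >= 0.1 * delta_sum.max()
W1, b1, W2 = W1[:, keep], b1[keep], W2[keep, :]
print(f"hidden units kept: {keep.sum()} of {n_hidden}")

In this sketch, hidden units whose deltas stay near zero contribute little to weight updates, so removing them shrinks the network while leaving the trained mapping largely intact; the paper's actual pruning criterion may differ.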
dc.description.sponsorship: Keywords: backpropagation, delta values, feed-forward artificial neural networks, generalization, hidden
dc.identifier.email: asokakaru@uom.lk
dc.identifier.issn: 2328-3491
dc.identifier.issue: 1
dc.identifier.journal: American International Journal of Research in Science, Technology, Engineering and Mathematics
dc.identifier.pgnos: pp. 1-6
dc.identifier.uri: http://dl.lib.mrt.ac.lk/handle/123/10513
dc.identifier.volume: 4
dc.identifier.year: 2013
dc.language.iso: en
dc.source.uri: http://iasir.net/AIJRSTEMpapers/AIJRSTEM13-303.pdf
dc.subject: backpropagation
dc.subject: delta values
dc.subject: feed-forward artificial neural networks
dc.subject: generalization
dc.title: Novel Technique for Optimizing the Hidden Layer Architecture in Artificial Neural Networks
dc.type: Article-Abstract
