Optimization of multi-layer artificial neural networks using delta values of hidden layers

dc.contributor.author: Wagarachchi, NM
dc.contributor.author: Karunananda, AS
dc.date.accessioned: 2014-06-26T12:25:57Z
dc.date.available: 2014-06-26T12:25:57Z
dc.date.issued: 2014-06-26
dc.description.abstract: The number of hidden layers is crucial in multilayer artificial neural networks. In general, the generalization power of the solution can be improved by increasing the number of layers. This paper presents a new method to determine the optimal architecture by using a pruning technique. Unimportant neurons are identified using the delta values of the hidden layers. The modified network contains fewer neurons and shows better generalization. Moreover, it improves training speed relative to standard back-propagation training. Experiments have been carried out on a number of test problems to verify the effectiveness of the new approach. [en_US]
dc.identifier.conference: IEEE Symposium on Computational Intelligence, Cognitive Algorithms, Mind, and Brain, CCMB 2013 - 2013 IEEE Symposium Series on Computational Intelligence, SSCI 2013 [en_US]
dc.identifier.department: Department of Computational Mathematics [en_US]
dc.identifier.faculty: IT [en_US]
dc.identifier.pgnos: pp. 80-86 [en_US]
dc.identifier.uri: http://dl.lib.mrt.ac.lk/handle/123/10103
dc.identifier.year: 2013 [en_US]
dc.language.iso: en [en_US]
dc.subject: Artificial Neural networks
dc.subject: Delta values
dc.subject: Hidden layers
dc.subject: Hidden neurons
dc.subject: Multilayer
dc.title: Optimization of multi-layer artificial neural networks using delta values of hidden layers [en_US]
dc.type: Conference-Full-text [en_US]

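The sketch below illustrates the general pruning idea described in the abstract, assuming a single hidden layer trained with standard back-propagation: each hidden neuron's accumulated |delta| serves as an importance score, and neurons with small scores are removed. The network size, toy data, and pruning threshold are illustrative assumptions, not the authors' exact procedure.

# Hedged sketch (not the authors' exact algorithm): prune hidden neurons whose
# accumulated |delta| during back-propagation stays small.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 4))              # toy inputs (assumption)
y = (X.sum(axis=1, keepdims=True) > 0) * 1.0   # toy binary target (assumption)

n_hidden = 20
W1 = rng.standard_normal((4, n_hidden)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, 1)) * 0.1
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
delta_accum = np.zeros(n_hidden)               # running sum of |delta| per hidden neuron

for epoch in range(200):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: standard back-propagation deltas.
    delta_out = (out - y) * out * (1 - out)
    delta_hidden = (delta_out @ W2.T) * h * (1 - h)
    delta_accum += np.abs(delta_hidden).mean(axis=0)

    # Gradient step.
    W2 -= lr * h.T @ delta_out / len(X)
    b2 -= lr * delta_out.mean(axis=0)
    W1 -= lr * X.T @ delta_hidden / len(X)
    b1 -= lr * delta_hidden.mean(axis=0)

# Prune hidden neurons whose importance score falls below an illustrative threshold.
importance = delta_accum / delta_accum.sum()
keep = importance > 0.5 * importance.mean()    # threshold is an assumption
W1, b1, W2 = W1[:, keep], b1[keep], W2[keep, :]
print(f"kept {keep.sum()} of {n_hidden} hidden neurons")

After pruning, the smaller network would typically be fine-tuned for a few more epochs; the paper reports that the reduced architecture generalizes better and trains faster than back-propagation on the original network.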