dc.contributor.author | Wagarachchi, NM |
dc.contributor.author | Karunananda, AS |
dc.date.accessioned | 2014-06-26T12:25:57Z |
dc.date.available | 2014-06-26T12:25:57Z |
dc.date.issued | 2014-06-26 |
dc.identifier.uri | http://dl.lib.mrt.ac.lk/handle/123/10103 |
dc.description.abstract | The number of hidden layers is crucial in multilayer artificial neural networks. In general, the generalization power of the solution can be improved by increasing the number of layers. This paper presents a new method to determine the optimal architecture by using a pruning technique. Unimportant neurons are identified by using the delta values of the hidden layers. The modified network contains fewer neurons and shows better generalization. Moreover, it improves training speed relative to standard backpropagation. Experiments have been carried out on a number of test problems to verify the effectiveness of the new approach. | en_US
dc.language.iso | en | en_US
dc.subject | Artificial Neural networks |
dc.subject | Delta values |
dc.subject | Hidden layers |
dc.subject | Hidden neurons |
dc.subject | Multilayer |
dc.title | Optimization of multi-layer artificial neural networks using delta values of hidden layers | en_US
dc.type | Conference-Full-text | en_US
dc.identifier.faculty | IT | en_US
dc.identifier.department | Department of Computational Mathematics | en_US
dc.identifier.year | 2013 | en_US
dc.identifier.conference | IEEE Symposium on Computational Intelligence, Cognitive Algorithms, Mind, and Brain, CCMB 2013 - 2013 IEEE Symposium Series on Computational Intelligence, SSCI 2013 | en_US
dc.identifier.pgnos | pp. 80-86 | en_US
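
The abstract describes pruning hidden neurons whose backpropagation delta values are small. The sketch below illustrates that general idea on a toy two-layer network; it is based only on the abstract, not the paper's implementation, and the layer sizes, threshold rule, and data are illustrative assumptions.

# Minimal sketch (not the authors' implementation) of pruning hidden neurons
# by the magnitude of their backpropagation delta values. Network sizes, the
# threshold rule, and the data below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy network: 4 inputs -> 10 hidden (sigmoid) -> 1 output (sigmoid)
W1 = rng.normal(scale=0.5, size=(4, 10))
W2 = rng.normal(scale=0.5, size=(10, 1))

X = rng.normal(size=(200, 4))                  # illustrative random inputs
y = (X[:, :1] + X[:, 1:2] > 0).astype(float)   # illustrative targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One forward/backward pass to collect the hidden-layer deltas
H = sigmoid(X @ W1)                            # hidden activations
O = sigmoid(H @ W2)                            # output activations
delta_out = (O - y) * O * (1 - O)              # output-layer delta
delta_hid = (delta_out @ W2.T) * H * (1 - H)   # hidden-layer deltas

# Score each hidden neuron by the mean magnitude of its delta; neurons with
# consistently small deltas carry little error signal (assumed pruning rule).
importance = np.mean(np.abs(delta_hid), axis=0)
keep = importance > 0.1 * importance.max()     # assumed relative threshold

# Prune by dropping the corresponding columns of W1 and rows of W2.
W1_pruned, W2_pruned = W1[:, keep], W2[keep, :]
print(f"kept {keep.sum()} of {keep.size} hidden neurons")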