dc.contributor.author | Tzafestas, SG | en |
dc.contributor.author | Dalianis, PJ | en |
dc.contributor.author | Anthopoulos, G | en |
dc.date.accessioned | 2014-03-01T01:12:09Z | |
dc.date.available | 2014-03-01T01:12:09Z | |
dc.date.issued | 1996 | en |
dc.identifier.issn | 0378-4754 | en |
dc.identifier.uri | https://dspace.lib.ntua.gr/xmlui/handle/123456789/11978 | |
dc.subject | backpropagation | en |
dc.subject | backpropagation neural network | en |
dc.subject | Character Recognition | en |
dc.subject | Dynamic Properties | en |
dc.subject | Energy Function | en |
dc.subject | Function Approximation | en |
dc.subject | Generalization Capability | en |
dc.subject | Neural Network | en |
dc.subject.classification | Computer Science, Interdisciplinary Applications | en |
dc.subject.classification | Computer Science, Software Engineering | en |
dc.subject.classification | Mathematics, Applied | en |
dc.subject.other | Algorithms | en |
dc.subject.other | Approximation theory | en |
dc.subject.other | Backpropagation | en |
dc.subject.other | Character recognition | en |
dc.subject.other | Computer simulation | en |
dc.subject.other | Electric network topology | en |
dc.subject.other | Learning systems | en |
dc.subject.other | Performance | en |
dc.subject.other | Backpropagation neural networks | en |
dc.subject.other | Dynamic properties | en |
dc.subject.other | Generalization capabilities | en |
dc.subject.other | Network size | en |
dc.subject.other | Overtraining phenomenon | en |
dc.subject.other | Training set size | en |
dc.subject.other | Neural networks | en |
dc.title | On the overtraining phenomenon of backpropagation neural networks | en |
heal.type | journalArticle | en |
heal.identifier.primary | 10.1016/0378-4754(95)00003-8 | en |
heal.identifier.secondary | http://dx.doi.org/10.1016/0378-4754(95)00003-8 | en |
heal.language | English | en |
heal.publicationDate | 1996 | en |
heal.abstract | A very important subject for the consolidation of neural networks is the study of their capabilities. In this paper, the relationships between network size, training set size and generalization capabilities are examined. The phenomenon of overtraining in backpropagation networks is discussed and an extension to an existing algorithm is described. The extended algorithm provides a new energy function and its advantages, such as improved plasticity and performance along with its dynamic properties, are explained. The algorithm is applied to some common problems (XOR, numeric character recognition and function approximation) and simulation results are presented and discussed. | en |
heal.publisher | ELSEVIER SCIENCE BV | en |
heal.journalName | Mathematics and Computers in Simulation | en |
dc.identifier.doi | 10.1016/0378-4754(95)00003-8 | en |
dc.identifier.isi | ISI:A1996UR67100002 | en |
dc.identifier.volume | 40 | en |
dc.identifier.issue | 5-6 SPEC. ISS. | en |
dc.identifier.spage | 507 | en |
dc.identifier.epage | 521 | en |