Sign-methods for training with imprecise error function and gradient values
Magoulas, George and Plagianakos, V.P. and Vrahatis, M.N. (1999) Sign-methods for training with imprecise error function and gradient values. In: International Joint Conference on Neural Networks (IJCNN 1999). IEEE Computer Society, pp. 1768-1773. ISBN 0780355296.
Abstract
We present nonmonotone methods for feedforward neural network training, i.e., training methods in which error function values are allowed to increase at some iterations. More specifically, at each epoch the current error function value is required to satisfy an Armijo-type criterion with respect to the maximum error function value over the M previous epochs. A strategy to dynamically adapt M is suggested, and two training algorithms with adaptive learning rates that successfully employ this acceptability criterion are proposed. Experimental results show that the nonmonotone learning strategy improves the convergence speed and the success rate of the methods considered.
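To illustrate the kind of acceptance test the abstract describes, the following is a minimal sketch of a sign-based update combined with a nonmonotone Armijo-type criterion over a window of M previous error values. The toy quadratic error function, the backtracking scheme for the learning rate, and all function names are illustrative assumptions, not the paper's exact algorithm or notation.

```python
import numpy as np

def nonmonotone_sign_training(error_fn, grad_fn, w, lr=0.1, M=10,
                              sigma=1e-4, max_epochs=200, tol=1e-6):
    """Sketch: sign-based weight updates accepted via a nonmonotone
    (Armijo-type) test against the maximum of the last M error values.
    The backtracking scheme and constants are assumptions for illustration."""
    history = [error_fn(w)]              # recent error values (window of size M+1)
    for _ in range(max_epochs):
        g = grad_fn(w)
        if np.linalg.norm(g) < tol:
            break
        d = -np.sign(g)                  # search direction uses only gradient signs
        f_ref = max(history)             # reference: max error over the M previous epochs
        eta = lr
        # Nonmonotone Armijo-type test: accept the step if it gives
        # sufficient decrease relative to f_ref (not the current error).
        while error_fn(w + eta * d) > f_ref + sigma * eta * np.dot(g, d) and eta > 1e-12:
            eta *= 0.5                   # backtrack the learning rate (assumed scheme)
        w = w + eta * d
        history.append(error_fn(w))
        if len(history) > M + 1:
            history.pop(0)               # keep only the last M+1 error values
    return w

# Usage on a toy quadratic "error function" (illustrative only)
if __name__ == "__main__":
    A = np.diag([1.0, 10.0])
    error_fn = lambda w: 0.5 * w @ A @ w
    grad_fn = lambda w: A @ w
    w_final = nonmonotone_sign_training(error_fn, grad_fn, np.array([3.0, -2.0]))
    print("final error:", error_fn(w_final))
```

Because the acceptance threshold is the maximum error over the window rather than the current error, occasional increases in the error function are tolerated, which is the nonmonotone behaviour the abstract refers to.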
Metadata
| Item Type: | Book Section |
|---|---|
| School: | School of Business, Economics & Informatics > Computer Science and Information Systems |
| Depositing User: | Sarah Hall |
| Date Deposited: | 06 Jul 2021 13:31 |
| Last Modified: | 06 Jul 2021 13:31 |
| URI: | https://eprints.bbk.ac.uk/id/eprint/45009 |