Magoulas, G., Plagianakos, V.P. and Vrahatis, M.N. (2002) Globally convergent algorithms with local learning rates. IEEE Transactions on Neural Networks 13 (3), pp. 774-779. ISSN 1045-9227.
Abstract
A novel generalized theoretical result is presented that underpins the development of globally convergent first-order batch training algorithms which employ local learning rates. This result allows us to equip algorithms of this class with a strategy for adapting the overall direction of search to a descent one. In this way, a decrease of the batch-error measure at each training iteration is ensured, and convergence of the sequence of weight iterates to a local minimizer of the batch error function is obtained from remote initial weights. The effectiveness of the theoretical result is illustrated in three application examples by comparing two well-known training algorithms with local learning rates to their globally convergent modifications.
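To make the idea concrete, the sketch below is a minimal Python illustration of the class of methods the abstract describes: each weight has its own local learning rate, the resulting search direction is adapted to a descent direction when necessary, and a sufficient-decrease test guarantees the batch error falls at every iteration. All names (`globally_convergent_step`, `error_fn`, `grad_fn`), the Armijo-style test, and the steepest-descent fallback are illustrative assumptions, not the paper's exact algorithm or conditions.

```python
import numpy as np

def globally_convergent_step(w, etas, error_fn, grad_fn,
                             armijo_c=1e-4, shrink=0.5, max_backtracks=30):
    """One batch-training step with local (per-weight) learning rates.

    Sketch only: the local rates define a search direction, which is
    adapted to a descent direction and then scaled until the batch
    error decreases (an Armijo-style test, assumed here).
    """
    g = grad_fn(w)
    # Local learning rates define a (generally non-gradient) direction.
    d = -etas * g
    # Adapt the overall direction to a descent one: require d^T g < 0.
    if np.dot(d, g) >= 0.0:
        d = -np.mean(etas) * g  # fallback: scaled steepest descent (assumption)
    # Enforce a decrease of the batch error via backtracking.
    e0 = error_fn(w)
    t = 1.0
    for _ in range(max_backtracks):
        w_new = w + t * d
        if error_fn(w_new) <= e0 + armijo_c * t * np.dot(g, d):
            return w_new
        t *= shrink
    return w  # no acceptable step found; keep current weights

# Toy usage on a quadratic batch error E(w) = 0.5 * ||w||^2 (illustrative).
w = np.array([3.0, -2.0])
etas = np.array([0.5, 0.1])  # local learning rates (assumed values)
error = lambda w: 0.5 * np.dot(w, w)
grad = lambda w: w
for _ in range(20):
    w = globally_convergent_step(w, etas, error, grad)
```

Because every accepted step satisfies the sufficient-decrease test, the sequence of batch-error values is monotonically decreasing, which is the mechanism behind convergence from remote initial weights.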
Metadata
| Item Type | Article |
|---|---|
| School | Birkbeck Faculties and Schools > Faculty of Science > School of Computing and Mathematical Sciences |
| Depositing User | Sarah Hall |
| Date Deposited | 29 Jun 2021 13:52 |
| Last Modified | 09 Aug 2023 12:51 |
| URI | https://eprints.bbk.ac.uk/id/eprint/44919 |