Magoulas, George and Vrahatis, M.N. (2006) Adaptive algorithms for neural network supervised learning: a deterministic optimization approach. International Journal of Bifurcation and Chaos 16 (7), pp. 1929-1950. ISSN 0218-1274.
Abstract
Networks of neurons can perform computations that even modern computers find very difficult to simulate. Most existing artificial neurons and artificial neural networks are considered biologically unrealistic; nevertheless, the practical success of the backpropagation algorithm and the powerful capabilities of feedforward neural networks have made neural computing very popular in several application areas. A challenging issue in this context is learning internal representations by adjusting the weights of the network connections. To this end, several first-order and second-order algorithms have been proposed in the literature. This paper provides an overview of approaches to backpropagation training, emphasizing first-order adaptive learning algorithms that build on the theory of nonlinear optimization, and proposes a framework for their analysis in the context of deterministic optimization.
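The paper surveys specific adaptive schemes in detail; as a loose illustration of the general idea only, the sketch below shows batch backpropagation with a simple adaptive learning rate (grow the step when the error decreases, shrink it and reject the step when it increases). This is a common heuristic in the first-order adaptive-learning literature, not necessarily the authors' algorithm, and the toy XOR data and all variable names are assumptions made for the example.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): batch backpropagation on XOR
# with a simple adaptive learning rate that accepts or rejects each step.

rng = np.random.default_rng(0)

# XOR data: 2 inputs -> 1 output
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W1, b1, W2, b2):
    h = sigmoid(X @ W1 + b1)             # hidden-layer activations
    out = sigmoid(h @ W2 + b2)           # network output
    err = 0.5 * np.sum((out - y) ** 2)   # sum-of-squares error
    return h, out, err

def gradients(W1, b1, W2, b2):
    h, out, _ = forward(W1, b1, W2, b2)
    d_out = (out - y) * out * (1 - out)  # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)   # hidden-layer delta (backpropagated)
    return X.T @ d_h, d_h.sum(0), h.T @ d_out, d_out.sum(0)

# Small random initial weights for a 2-4-1 feedforward network
W1, b1 = rng.normal(0, 0.5, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 0.5, (4, 1)), np.zeros(1)

lr = 0.5
_, _, err = forward(W1, b1, W2, b2)

for epoch in range(5000):
    gW1, gb1, gW2, gb2 = gradients(W1, b1, W2, b2)
    # Tentative step along the negative gradient
    cand = (W1 - lr * gW1, b1 - lr * gb1, W2 - lr * gW2, b2 - lr * gb2)
    _, _, new_err = forward(*cand)
    if new_err <= err:
        W1, b1, W2, b2 = cand   # accept the step
        err = new_err
        lr *= 1.05              # cautiously increase the step size
    else:
        lr *= 0.5               # reject the step and shrink the step size

print("final error:", err, "learning rate:", lr)
```

The accept/reject rule makes the error monotonically non-increasing, which is the kind of property the paper's deterministic-optimization framework is used to analyze; the specific growth and shrink factors here are arbitrary choices for the sketch.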
Metadata
| Field | Value |
| --- | --- |
| Item Type | Article |
| School | Birkbeck Faculties and Schools > Faculty of Science > School of Computing and Mathematical Sciences |
| Depositing User | Sarah Hall |
| Date Deposited | 22 Jun 2021 12:48 |
| Last Modified | 09 Aug 2023 12:51 |
| URI | https://eprints.bbk.ac.uk/id/eprint/44832 |