Peng, C.C. and Magoulas, George D. (2011) Nonmonotone Levenberg–Marquardt training of recurrent neural architectures for processing symbolic sequences. Neural Computing and Applications 20 (6), pp. 897–908. ISSN 0941-0643.
Abstract
In this paper, we present nonmonotone variants of the Levenberg–Marquardt (LM) method for training recurrent neural networks (RNNs). These methods inherit the benefits of previously developed LM with momentum algorithms and are equipped with nonmonotone criteria, which allow temporary increases in the training error, and an adaptive scheme for tuning the size of the nonmonotone sliding window. The proposed algorithms are applied to training RNNs of various sizes and architectures on symbolic sequence-processing problems. Experiments show that the proposed nonmonotone learning algorithms train RNNs for sequence processing more effectively than the original monotone methods.
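The core idea the abstract describes, accepting an LM step whenever the new error beats the worst error over a recent sliding window rather than the most recent error, can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the authors' algorithm: the residual function, the fixed `window_size`, and the simple damping update are all illustrative, and the paper's momentum term, adaptive window sizing, and exact acceptance criteria are omitted.

```python
# Minimal sketch of a nonmonotone acceptance rule inside a
# Levenberg-Marquardt loop on a toy least-squares problem.
# All names and the test problem are illustrative assumptions.
import numpy as np

def residuals(x):
    # Toy residual vector r(x); the error is 0.5 * ||r(x)||^2.
    return np.array([x[0] - 1.0, 10.0 * (x[1] - x[0] ** 2)])

def jacobian(x):
    # Analytic Jacobian of the toy residuals.
    return np.array([[1.0, 0.0], [-20.0 * x[0], 10.0]])

def nonmonotone_lm(x, window_size=5, mu=1e-2, max_iter=200, tol=1e-12):
    errors = [0.5 * residuals(x) @ residuals(x)]  # recent-error history
    for _ in range(max_iter):
        r, J = residuals(x), jacobian(x)
        # LM step: solve (J^T J + mu I) s = -J^T r
        s = np.linalg.solve(J.T @ J + mu * np.eye(len(x)), -J.T @ r)
        x_trial = x + s
        f_trial = 0.5 * residuals(x_trial) @ residuals(x_trial)
        # Nonmonotone test: compare against the MAXIMUM error over the
        # last `window_size` accepted iterations, so the error may rise
        # temporarily as long as it stays below the window's worst value.
        if f_trial <= max(errors[-window_size:]):
            x, mu = x_trial, mu * 0.5   # accept step, relax damping
            errors.append(f_trial)
            if f_trial < tol:
                break
        else:
            mu *= 2.0                   # reject step, increase damping
    return x

print(nonmonotone_lm(np.array([-1.2, 1.0])))  # converges near (1, 1)
```

With `window_size=1` the test reduces to the usual monotone LM acceptance rule; larger windows let the iterate escape regions where strict descent would force very small, heavily damped steps.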
Metadata
| Item Type | Article |
|---|---|
| Keyword(s) / Subject(s) | Levenberg–Marquardt methods, nonmonotone learning, recurrent neural networks |
| School | Birkbeck Faculties and Schools > Faculty of Science > School of Computing and Mathematical Sciences |
| Research Centres and Institutes | Birkbeck Knowledge Lab |
| Depositing User | Sarah Hall |
| Date Deposited | 18 Jul 2013 12:50 |
| Last Modified | 09 Aug 2023 12:33 |
| URI | https://eprints.bbk.ac.uk/id/eprint/7721 |