Peng, C.-C. and Magoulas, G.D. (2008) Advanced adaptive nonmonotone conjugate gradient training algorithm for recurrent neural networks. International Journal of Artificial Intelligence Tools 17(5), pp. 963-984. ISSN 0218-2130.
Recurrent networks constitute an elegant way of increasing the capacity of feedforward networks to deal with complex data in the form of sequences of vectors. They are well known for their power to model temporal dependencies and process sequences for classification, recognition, and transduction. In this paper we propose an advanced nonmonotone Conjugate Gradient training algorithm for recurrent neural networks, which is equipped with an adaptive tuning strategy for both the nonmonotone learning horizon and the stepsize. Simulation results in sequence processing using three different recurrent architectures demonstrate that this modification of the Conjugate Gradient method is more effective than previous attempts.
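The nonmonotone line search underlying such methods accepts a trial step whenever the objective does not exceed the maximum over a window of recent function values, rather than the most recent value alone. The sketch below illustrates this idea for a generic conjugate gradient iteration: it uses the standard Grippo-Lampariello-Lucidi acceptance rule with a fixed horizon M and a Polak-Ribiere+ coefficient. It does not reproduce the paper's adaptive tuning of the learning horizon or stepsize, and all function and parameter names are hypothetical.

```python
import numpy as np

# Illustrative sketch only: a generic nonmonotone conjugate gradient loop.
# The adaptive horizon/stepsize strategy proposed in the paper is not
# reproduced here; M is fixed and the backtracking is a plain halving rule.
def nonmonotone_cg(f, grad, x0, M=10, delta=1e-4, max_iter=200, tol=1e-6):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # initial direction: steepest descent
    f_hist = [f(x)]              # recent objective values (nonmonotone reference)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        f_ref = max(f_hist[-M:])  # reference value over the last M iterations
        alpha = 1.0
        # backtrack until the nonmonotone Armijo condition holds
        while f(x + alpha * d) > f_ref + delta * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ conjugacy coefficient (one common choice)
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x
```

Applied to recurrent network training, f and grad would be the sequence loss and its gradient with respect to the unrolled network weights; the paper's contribution lies in adapting M and the stepsize during training rather than fixing them as above.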
Keyword(s) / Subject(s): Conjugate gradient methods, global convergence, nonmonotone line search, adaptive learning, training algorithms, recurrent neural networks, sequence processing
School or Research Centre: Birkbeck Schools and Research Centres > School of Business, Economics & Informatics > Computer Science and Informatics