BIROn - Birkbeck Institutional Research Online

    Nonmonotone BFGS-trained recurrent neural networks for temporal sequence processing

    Peng, C.C. and Magoulas, George D. (2011) Nonmonotone BFGS-trained recurrent neural networks for temporal sequence processing. Applied Mathematics and Computation 217 (12), pp. 5421-5441. ISSN 0096-3003.

    Full text not available from this repository.


    In this paper we propose a nonmonotone approach to recurrent neural network training for temporal sequence processing applications. This approach allows the learning performance to deteriorate in some iterations; nevertheless, the network's performance improves over time. A self-scaling BFGS method is equipped with an adaptive nonmonotone technique that employs approximations of the Lipschitz constant, and is tested on a set of sequence processing problems. Simulation results show that the proposed algorithm outperforms BFGS as well as other methods previously applied to these sequences, providing an effective modification that is capable of training recurrent networks of various architectures. (C) 2010 Elsevier Inc. All rights reserved.
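    The abstract describes combining BFGS quasi-Newton directions with a nonmonotone acceptance rule that lets the objective rise on some iterations. The paper's specific adaptive technique (based on Lipschitz constant approximations) and its self-scaling variant are not reproduced here; the following is only a minimal sketch of the general idea, using a generic Grippo-style nonmonotone Armijo condition with a standard inverse-Hessian BFGS update. All function and parameter names are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def nonmonotone_bfgs(f, grad, x0, M=10, c1=1e-4, tol=1e-6, max_iter=200):
        """Minimise f with BFGS directions and a nonmonotone Armijo line search.

        A step is accepted when
            f(x + a*d) <= max(f over the last M iterates) + c1 * a * g.T @ d,
        so f may increase on individual iterations while decreasing over time.
        This is a generic sketch, not the paper's adaptive scheme.
        """
        x = np.asarray(x0, dtype=float)
        n = x.size
        H = np.eye(n)                # inverse-Hessian approximation
        g = grad(x)
        history = [f(x)]             # recent objective values for the reference max
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            d = -H @ g
            f_ref = max(history[-M:])        # nonmonotone reference value
            a = 1.0
            while f(x + a * d) > f_ref + c1 * a * (g @ d):
                a *= 0.5                     # backtrack until acceptance
                if a < 1e-12:
                    break
            s = a * d
            x_new = x + s
            g_new = grad(x_new)
            y = g_new - g
            sy = s @ y
            if sy > 1e-12:                   # curvature check keeps H positive definite
                rho = 1.0 / sy
                V = np.eye(n) - rho * np.outer(s, y)
                H = V @ H @ V.T + rho * np.outer(s, s)
            x, g = x_new, g_new
            history.append(f(x))
        return x

    # Usage on the Rosenbrock function (a standard optimization test problem):
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    x_star = nonmonotone_bfgs(f, grad, np.array([-1.2, 1.0]))
    ```

    In a recurrent-network setting, f would be the training loss over the temporal sequences and grad the gradient obtained by backpropagation through time; the nonmonotone reference max is what allows the loss to deteriorate temporarily without rejecting the step.
    
    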


    Item Type: Article
    Keyword(s) / Subject(s): Recurrent neural networks, quasi-Newton methods, BFGS updates, nonmonotone methods, second-order training algorithms, temporal sequence
    School: School of Business, Economics & Informatics > Computer Science and Information Systems
    Research Centres and Institutes: Birkbeck Knowledge Lab
    Depositing User: Administrator
    Date Deposited: 20 Jun 2011 14:11
    Last Modified: 02 Dec 2016 13:23

