BIROn - Birkbeck Institutional Research Online

    Nonmonotone Levenberg–Marquardt training of recurrent neural architectures for processing symbolic sequences

    Peng, C.C. and Magoulas, George D. (2011) Nonmonotone Levenberg–Marquardt training of recurrent neural architectures for processing symbolic sequences. Neural Computing and Applications 20 (6), pp. 897-908. ISSN 0941-0643.

    Full text not available from this repository.

    Abstract

    In this paper, we present nonmonotone variants of the Levenberg–Marquardt (LM) method for training recurrent neural networks (RNNs). These methods inherit the benefits of previously developed LM with momentum algorithms and are equipped with nonmonotone criteria, allowing temporary increases in training errors, and an adaptive scheme for tuning the size of the nonmonotone sliding window. The proposed algorithms are applied to training RNNs of various sizes and architectures on symbolic sequence-processing problems. Experiments show that the proposed nonmonotone learning algorithms train RNNs for sequence processing more effectively than the original monotone methods.
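    To illustrate the general idea behind nonmonotone acceptance (not the paper's exact algorithm), the sketch below applies a Grippo-style nonmonotone criterion inside a damped Gauss–Newton (Levenberg–Marquardt) loop on a one-parameter least-squares problem: a trial step is accepted if its error does not exceed the worst error over the last few accepted iterations, rather than requiring a strict decrease. All names and parameter values (`window`, `mu`, the test function) are illustrative assumptions.

    ```python
    # Illustrative sketch of nonmonotone step acceptance in an LM loop.
    # Not the authors' algorithm: a one-parameter toy problem, fitting
    # y = exp(a * x) by least squares.
    import math

    def lm_nonmonotone(xs, ys, a0=0.0, mu=1e-2, window=5, iters=50):
        """Levenberg-Marquardt with a nonmonotone acceptance rule:
        accept a trial step if the new error is no worse than the
        maximum error over the last `window` accepted iterations."""
        a = a0

        def sse(a):
            return sum((math.exp(a * x) - y) ** 2 for x, y in zip(xs, ys))

        history = [sse(a)]  # errors of accepted iterates (the sliding window)
        for _ in range(iters):
            # residuals r_i = exp(a*x_i) - y_i; Jacobian J_i = x_i * exp(a*x_i)
            r = [math.exp(a * x) - y for x, y in zip(xs, ys)]
            J = [x * math.exp(a * x) for x in xs]
            JtJ = sum(j * j for j in J)
            Jtr = sum(j * ri for j, ri in zip(J, r))
            step = -Jtr / (JtJ + mu)       # damped Gauss-Newton step
            a_new = a + step
            err_new = sse(a_new)
            # Nonmonotone criterion: compare against the worst recent error,
            # so a temporary increase over the last accepted error is allowed.
            if err_new <= max(history[-window:]):
                a = a_new
                history.append(err_new)
                mu *= 0.5                  # relax damping after acceptance
            else:
                mu *= 2.0                  # strengthen damping and retry
        return a, history

    # Usage: noise-free data generated from y = exp(0.7 * x).
    xs = [0.0, 0.5, 1.0, 1.5, 2.0]
    ys = [math.exp(0.7 * x) for x in xs]
    a_hat, hist = lm_nonmonotone(xs, ys)
    ```

    A monotone LM loop would replace `max(history[-window:])` with `history[-1]`; the window is what permits the temporary error increases described in the abstract, and the paper's adaptive scheme would additionally adjust `window` during training.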

    Metadata

    Item Type: Article
    Keyword(s) / Subject(s): Levenberg–Marquardt methods, nonmonotone learning, recurrent neural networks
    School: Birkbeck Schools and Departments > School of Business, Economics & Informatics > Computer Science and Information Systems
    Research Centre: Birkbeck Knowledge Lab
    Depositing User: Sarah Hall
    Date Deposited: 18 Jul 2013 12:50
    Last Modified: 02 Dec 2016 13:23
    URI: http://eprints.bbk.ac.uk/id/eprint/7721
