BIROn - Birkbeck Institutional Research Online

    Effective backpropagation training with variable stepsize

    Magoulas, George and Vrahatis, M.N. and Androulakis, G. (1997) Effective backpropagation training with variable stepsize. Neural Networks 10 (1), pp. 69-82. ISSN 0893-6080.

    Full text not available from this repository.

    Abstract

    The issue of variable stepsize in the backpropagation training algorithm has been widely investigated, and several techniques employing heuristic factors have been suggested to improve training time and reduce convergence to local minima. In this contribution, backpropagation training is based on a modified steepest descent method which allows variable stepsize. It is computationally efficient and possesses interesting convergence properties, utilizing estimates of the Lipschitz constant without any additional computational cost. The algorithm has been implemented and tested on several problems and the results have been very satisfactory. Numerical evidence shows that the method is robust with good average performance on many classes of problems.
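
    The sketch below is a minimal, hypothetical Python illustration of the general idea described in the abstract, not the authors' exact algorithm: steepest descent in which the stepsize is adapted from a local estimate of the Lipschitz constant of the gradient, computed from successive weight and gradient vectors so that no additional gradient evaluations are required. The function names, the initial stepsize, and the 1/(2L) rule are assumptions made for illustration.

    # Illustrative sketch only: variable-stepsize steepest descent using a
    # local Lipschitz-constant estimate (not the paper's exact algorithm).
    import numpy as np

    def train_variable_stepsize(grad, w0, eta0=0.1, max_iters=1000, tol=1e-6):
        """Gradient descent with a stepsize from a local Lipschitz estimate.

        grad : callable returning the error gradient at a weight vector
        w0   : initial weight vector
        eta0 : stepsize for the very first iteration (assumed default)
        """
        w_prev = np.asarray(w0, dtype=float)
        g_prev = grad(w_prev)
        w = w_prev - eta0 * g_prev          # first step with a fixed stepsize

        for _ in range(max_iters):
            g = grad(w)
            if np.linalg.norm(g) < tol:     # stop when the gradient is small
                break
            # Local Lipschitz estimate L_k ~ ||g_k - g_{k-1}|| / ||w_k - w_{k-1}||;
            # the stepsize 1/(2 L_k) is one common choice (an assumption here).
            dw, dg = w - w_prev, g - g_prev
            L = np.linalg.norm(dg) / max(np.linalg.norm(dw), 1e-12)
            eta = 1.0 / (2.0 * max(L, 1e-12))
            w_prev, g_prev = w, g
            w = w - eta * g
        return w

    # Example usage on a simple quadratic error surface:
    # w_min = train_variable_stepsize(lambda w: 2.0 * w, w0=np.array([3.0, -4.0]))

    Because the Lipschitz estimate reuses gradients that the iteration computes anyway, the stepsize adaptation adds essentially no computational cost per epoch, which is the property the abstract emphasizes.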

    Metadata

    Item Type: Article
    School: Birkbeck Faculties and Schools > Faculty of Science > School of Computing and Mathematical Sciences
    Depositing User: Sarah Hall
    Date Deposited: 06 Jul 2021 13:51
    Last Modified: 09 Aug 2023 12:51
    URI: https://eprints.bbk.ac.uk/id/eprint/45012
