Fairbank, Michael and Alonso, Eduardo (2012) Efficient calculation of the Gauss-Newton approximation of the Hessian matrix in neural networks. Neural Computation, 24 (3). pp. 607-610. DOI https://doi.org/10.1162/neco_a_00248
Abstract
The Levenberg-Marquardt (LM) learning algorithm is a popular algorithm for training neural networks; however, for large neural networks, it becomes prohibitively expensive in terms of running time and memory requirements. The most time-critical step of the algorithm is the calculation of the Gauss-Newton matrix, which is formed by multiplying two large Jacobian matrices together. We propose a method that uses backpropagation to reduce the time of this matrix-matrix multiplication. This reduces the overall asymptotic running time of the LM algorithm by a factor of the order of the number of output nodes in the neural network.
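For context, the sketch below shows the conventional Levenberg-Marquardt step that the abstract refers to: the Gauss-Newton matrix is formed as the product of Jacobian matrices, and that matrix-matrix product is the time-critical step the paper speeds up. This is only background under assumed notation, not the paper's backpropagation-based method; the names `lm_step`, `J`, `e`, and `lam` are illustrative.

```python
import numpy as np

def lm_step(J, e, lam):
    """One conventional Levenberg-Marquardt weight update.

    J   : Jacobian of the network outputs w.r.t. the weights,
          shape (num_patterns * num_outputs, num_weights)
    e   : residual vector (network outputs minus targets), length num_patterns * num_outputs
    lam : damping parameter lambda

    Forming G = J^T J is the expensive step: it costs on the order of
    (num_patterns * num_outputs * num_weights^2) operations, which is the
    factor-of-num_outputs overhead the paper's method removes.
    """
    G = J.T @ J                      # Gauss-Newton approximation of the Hessian
    g = J.T @ e                      # gradient of the sum-of-squares error
    # Damped normal equations: (G + lam * I) delta_w = -g
    delta_w = np.linalg.solve(G + lam * np.eye(G.shape[0]), -g)
    return delta_w

# Toy usage with random data (illustrative only):
J = np.random.randn(300, 50)         # e.g. 100 patterns x 3 outputs, 50 weights
e = np.random.randn(300)
dw = lm_step(J, e, lam=1e-2)
```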
| Item Type: | Article |
| --- | --- |
| Uncontrolled Keywords: | Algorithms; Neural Networks (Computer) |
| Divisions: | Faculty of Science and Health; Faculty of Science and Health > Computer Science and Electronic Engineering, School of |
| SWORD Depositor: | Unnamed user with email elements@essex.ac.uk |
| Depositing User: | Unnamed user with email elements@essex.ac.uk |
| Date Deposited: | 14 Apr 2021 13:46 |
| Last Modified: | 28 May 2024 06:44 |
| URI: | http://repository.essex.ac.uk/id/eprint/21301 |
Available files
Filename: Efficient calculation.pdf