Comparison of gradient based training algorithms for multilayer perceptrons

Research output: Contribution to journal › Article › peer-review

Abstract

The training speeds of batch backpropagation using steepest descent, conjugate gradient, and quasi-Newton algorithms for a feedforward neural network are compared. Results illustrating the advantages of the Hessian-based techniques are given, and issues affecting training speed are discussed.
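The three optimizer families the abstract compares can be sketched on a toy problem. The snippet below is a minimal illustration, not the paper's code: it trains a small tanh network on XOR, running plain fixed-step steepest descent by hand and using SciPy's `CG` (conjugate gradient) and `BFGS` (quasi-Newton) methods as stand-ins for the other two families. The network size, data, learning rate, and iteration counts are illustrative assumptions.

```python
# Illustrative sketch (not the paper's setup): compare steepest descent,
# conjugate gradient, and quasi-Newton (BFGS) training of a tiny MLP on XOR.
import numpy as np
from scipy.optimize import approx_fprime, minimize

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def unpack(w):
    # 2 inputs -> 3 hidden units -> 1 output: 6 + 3 + 3 + 1 = 13 weights
    W1 = w[:6].reshape(2, 3)
    b1 = w[6:9]
    W2 = w[9:12]
    b2 = w[12]
    return W1, b1, W2, b2

def loss(w):
    # Mean-squared error of a one-hidden-layer tanh network on the batch.
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    return 0.5 * np.mean((out - y) ** 2)

rng = np.random.default_rng(0)
w0 = rng.normal(scale=0.5, size=13)
initial = loss(w0)

# Batch steepest descent: fixed learning rate, finite-difference gradient.
w = w0.copy()
for _ in range(300):
    w -= 0.1 * approx_fprime(w, loss, 1e-6)
sd_loss = loss(w)

# Conjugate gradient and quasi-Newton via SciPy on the same initial point.
res_cg = minimize(loss, w0, method="CG")
res_bfgs = minimize(loss, w0, method="BFGS")

print(f"initial loss          : {initial:.4f}")
print(f"steepest descent (300): {sd_loss:.4f}")
print(f"conjugate gradient    : {res_cg.fun:.4f}  ({res_cg.nit} iters)")
print(f"BFGS (quasi-Newton)   : {res_bfgs.fun:.4f}  ({res_bfgs.nit} iters)")
```

On runs like this, the curvature-aware methods typically reach a lower loss in far fewer iterations than fixed-step steepest descent, which is the qualitative point the abstract makes about Hessian-based techniques.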

Original language: English
Pages (from-to): 11/1-11/6
Journal: IEE Colloquium (Digest)
Issue number: 136
Publication status: Published - 1 Jan 1994
Externally published: Yes
Event: Proceedings of the IEE Colloquium on Advances in Neural Networks for Control and Systems - London, UK
Duration: 25 May 1994 - 27 May 1994
