Learning algorithms for neural networks with the Kalman filters

Keigo Watanabe, Spyros G. Tzafestas

Research output: Contribution to journal › Article › peer-review

33 Citations (Scopus)


Based on various approaches, several different learning algorithms have been given in the literature for neural networks. Almost all algorithms have constant learning rates or constant accelerative parameters, though they have been shown to be effective for some practical applications. The learning procedure of neural networks can be regarded as a problem of estimating (or identifying) constant parameters (i.e. the connection weights of the network) with a nonlinear or linear observation equation. Making use of Kalman filtering, we derive a new back-propagation algorithm whose learning rate is computed by a time-varying Riccati difference equation. Perceptron-like and correlational learning algorithms are also obtained as special cases. Furthermore, a self-organising algorithm of feature maps is constructed within a similar framework.
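The linear special case described in the abstract can be sketched as follows: the weights of a single linear neuron are treated as constant parameters with a linear observation equation, so the Kalman gain (propagated by the Riccati difference equation for the error covariance) plays the role of a time-varying learning rate. This is a minimal illustrative sketch, not the paper's algorithm; all variable names and noise settings are assumptions.

```python
import numpy as np

# Sketch: estimate the weights w of a linear neuron y = x^T w + v with a
# Kalman filter. The gain k acts as a per-step, per-weight learning rate,
# and P follows a Riccati difference equation (assumed setup, for illustration).
rng = np.random.default_rng(0)
n = 3
w_true = np.array([0.5, -1.0, 2.0])   # weights to be identified

w_hat = np.zeros(n)                   # weight estimate
P = 10.0 * np.eye(n)                  # error covariance (Riccati state)
r = 0.01                              # observation-noise variance

for t in range(200):
    x = rng.normal(size=n)                            # input vector (H = x^T)
    y = x @ w_true + rng.normal(scale=np.sqrt(r))     # noisy scalar target
    k = P @ x / (x @ P @ x + r)                       # Kalman gain ("learning rate")
    w_hat = w_hat + k * (y - x @ w_hat)               # innovation-driven update
    P = P - np.outer(k, x) @ P                        # Riccati difference equation

print(np.round(w_hat, 2))
```

As the covariance P shrinks, the gain decreases automatically, which is the key contrast with a constant-learning-rate rule.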

Original language: English
Pages (from-to): 305-319
Number of pages: 15
Journal: Journal of Intelligent and Robotic Systems
Issue number: 4
Publication status: Published - Dec 1990
Externally published: Yes


Keywords

  • Kalman filters
  • Neural nets
  • learning systems
  • nonlinear filtering
  • parameter estimation
  • pattern recognition

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Mechanical Engineering
  • Industrial and Manufacturing Engineering
  • Electrical and Electronic Engineering
  • Artificial Intelligence
