Abstract
Based on various approaches, several different learning algorithms have been proposed in the literature for neural networks. Although they have proved effective in some practical applications, almost all of these algorithms use constant learning rates or constant accelerative parameters. The learning procedure of a neural network can be regarded as a problem of estimating (or identifying) constant parameters (i.e. the connection weights of the network) under a nonlinear or linear observation equation. Making use of Kalman filtering, we derive a new back-propagation algorithm whose learning rate is computed by a time-varying Riccati difference equation. Perceptron-like and correlational learning algorithms are also obtained as special cases. Furthermore, a self-organising algorithm of feature maps is constructed within a similar framework.
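For the linear-observation special case sketched in the abstract, the Kalman-filter view of learning can be illustrated in a few lines of NumPy. This is a hedged sketch, not the paper's algorithm: the function name, noise variance `r`, and initial covariance `p0` are illustrative assumptions. The Kalman gain `K` acts as the time-varying learning rate, and the covariance update is the Riccati difference equation mentioned above.

```python
import numpy as np

def kalman_train(X, y, r=1.0, p0=100.0):
    """Estimate the constant weights w of a linear observation model
        y_k = x_k^T w + v_k,   v_k ~ noise with variance r,
    by Kalman filtering. The gain K plays the role of a time-varying
    learning rate; P evolves by a Riccati difference equation.
    (Illustrative sketch; parameter names are assumptions, not from the paper.)"""
    n = X.shape[1]
    w = np.zeros(n)
    P = p0 * np.eye(n)              # error covariance (Riccati state)
    for x, t in zip(X, y):
        s = x @ P @ x + r           # innovation variance
        K = P @ x / s               # Kalman gain = time-varying learning rate
        w = w + K * (t - x @ w)     # weight update driven by prediction error
        P = P - np.outer(K, x @ P)  # Riccati difference equation
    return w

# Usage: recover known weights from noisy linear observations.
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0, 0.5])
X = rng.normal(size=(200, 3))
y = X @ w_true + 0.01 * rng.normal(size=200)
w_est = kalman_train(X, y)
```

For constant parameters this recursion coincides with recursive least squares; the learning rate shrinks automatically as the covariance `P` contracts, in contrast to the fixed-rate algorithms the abstract criticises.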
Original language | English |
---|---|
Pages (from-to) | 305-319 |
Number of pages | 15 |
Journal | Journal of Intelligent and Robotic Systems |
Volume | 3 |
Issue number | 4 |
DOIs | |
Publication status | Published - Dec 1990 |
Externally published | Yes |
Keywords
- Kalman filters
- neural nets
- learning systems
- nonlinear filtering
- parameter estimation
- pattern recognition
ASJC Scopus subject areas
- Software
- Control and Systems Engineering
- Mechanical Engineering
- Industrial and Manufacturing Engineering
- Electrical and Electronic Engineering
- Artificial Intelligence