Abstract
In this letter, we propose a new weight learning algorithm, called the H∞ learning law (HLL), for recurrent neural networks with time delay. Based on Lyapunov-Krasovskii stability theory, the HLL is designed not only to guarantee asymptotic stability but also to reduce the effect of external disturbances to within a prescribed H∞ norm bound. An existence condition for the HLL is expressed in terms of a linear matrix inequality (LMI). An illustrative example is given to demonstrate the effectiveness of the proposed HLL.
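The existence condition mentioned in the abstract is posed as an LMI feasibility problem. As a rough illustration of how a Lyapunov-Krasovskii-type LMI condition can be checked numerically, the Python sketch below tests a standard delay-independent stability LMI for a linear system with a single delay using cvxpy; the matrices A and Ad and the LMI shown are illustrative placeholders and are not the HLL condition derived in the paper.

```python
# Minimal sketch (assumed, not the paper's condition): checking a generic
# delay-independent Lyapunov-Krasovskii LMI for a linear delayed system
#   dx/dt = A x(t) + Ad x(t - tau),
# which is asymptotically stable if there exist P > 0, Q > 0 such that
#   [[A'P + PA + Q,  P Ad],
#    [Ad' P,        -Q   ]] < 0.
import numpy as np
import cvxpy as cp

n = 2
A = np.array([[-2.0, 0.5], [0.3, -1.5]])   # hypothetical system matrix
Ad = np.array([[0.2, 0.1], [-0.1, 0.2]])   # hypothetical delayed-state matrix

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6  # small margin to enforce strict inequalities

# Assemble the block LMI; it is symmetric by construction.
lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
               [Ad.T @ P, -Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]

# Pure feasibility problem: any solution certifies stability of the delayed system.
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("LMI feasible:", prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE))
```

A feasible pair (P, Q) corresponds to a quadratic Lyapunov-Krasovskii functional of the form V(x_t) = x(t)ᵀP x(t) + ∫_{t-τ}^{t} x(s)ᵀQ x(s) ds; the LMI in the paper plays an analogous certifying role for the HLL, additionally encoding the H∞ disturbance-attenuation constraint.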
| Original language | English |
| --- | --- |
| Pages (from-to) | 1217-1227 |
| Number of pages | 11 |
| Journal | Modern Physics Letters B |
| Volume | 24 |
| Issue number | 12 |
| DOIs | |
| Publication status | Published - 2010 May 20 |
| Externally published | Yes |
Keywords
- H∞ learning law (HLL)
- Linear matrix inequality (LMI)
- Lyapunov-Krasovskii stability theory
- Recurrent neural networks
- Time-delay
ASJC Scopus subject areas
- Statistical and Nonlinear Physics
- Condensed Matter Physics