H∞ weight learning algorithm of recurrent neural networks with time-delay

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)


In this letter, we propose a new weight learning algorithm, called an H∞ learning law (HLL), for recurrent neural networks with time-delay. Based on Lyapunov-Krasovskii stability theory, the HLL is designed not only to guarantee asymptotic stability but also to attenuate the effect of external disturbances to within a prescribed H∞ norm bound. An existence condition for the HLL is expressed in terms of a linear matrix inequality (LMI). An illustrative example is given to demonstrate the effectiveness of the proposed HLL.
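To illustrate the kind of LMI existence condition the abstract refers to, the sketch below checks the classical delay-independent Lyapunov-Krasovskii stability LMI for a linear time-delay system ẋ(t) = A x(t) + A_d x(t − τ). This is a generic textbook condition, not the paper's exact HLL condition; the matrices A, A_d, P, Q are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's exact HLL condition): for the linear
# time-delay system  x'(t) = A x(t) + Ad x(t - tau), the Lyapunov-Krasovskii
# functional  V = x'Px + integral_{t-tau}^{t} x'Qx ds  yields the
# delay-independent stability LMI
#     [[A'P + PA + Q,  P Ad],
#      [Ad' P,         -Q ]]  < 0   (negative definite),
# with P > 0 and Q > 0. LMI existence conditions such as the one in the
# abstract are feasibility tests of the same form.

def lk_lmi_holds(A, Ad, P, Q, tol=1e-9):
    """Check the block Lyapunov-Krasovskii LMI for candidate matrices P, Q."""
    top = np.hstack([A.T @ P + P @ A + Q, P @ Ad])
    bot = np.hstack([Ad.T @ P, -Q])
    M = np.vstack([top, bot])
    M = (M + M.T) / 2  # symmetrize against round-off
    pos = np.all(np.linalg.eigvalsh(P) > tol) and np.all(np.linalg.eigvalsh(Q) > tol)
    return bool(pos and np.all(np.linalg.eigvalsh(M) < -tol))

# Example: strongly contracting A with weak delayed coupling Ad is feasible.
A = np.array([[-3.0, 0.0], [0.0, -4.0]])
Ad = np.array([[0.5, 0.0], [0.0, 0.5]])
P = np.eye(2)
Q = np.eye(2)
print(lk_lmi_holds(A, Ad, P, Q))  # → True
```

In practice such LMIs are solved for P and Q with a semidefinite-programming solver rather than verified for hand-picked candidates; this sketch only shows the feasibility test itself.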

Original language: English
Pages (from-to): 1217-1227
Number of pages: 11
Journal: Modern Physics Letters B
Issue number: 12
Publication status: Published - 2010 May 20
Externally published: Yes


Keywords

  • H∞ learning law (HLL)
  • Linear matrix inequality (LMI)
  • Lyapunov-Krasovskii stability theory
  • Recurrent neural networks
  • Time-delay

ASJC Scopus subject areas

  • Statistical and Nonlinear Physics
  • Condensed Matter Physics


