Exponential H∞ stable learning method for Takagi-Sugeno fuzzy delayed neural networks: A convex optimization approach

Research output: Contribution to journal › Article › peer-review

22 Citations (Scopus)

Abstract

In this paper, we present new stability results for Takagi-Sugeno fuzzy delayed neural networks equipped with a stable learning method. Based on the Lyapunov-Krasovskii approach, a new learning method is presented for the first time that not only guarantees the exponential stability of Takagi-Sugeno fuzzy neural networks with time delay, but also reduces the effect of external disturbance to a prescribed attenuation level. The proposed learning method is obtained by solving a convex optimization problem expressed as a set of linear matrix inequalities (LMIs). An illustrative example demonstrates the effectiveness of the proposed learning method.
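The learning method itself and its LMIs are derived in the full paper; as a rough illustration of how an LMI-based condition of this kind can be checked numerically, the sketch below poses a generic delay-independent Lyapunov-Krasovskii stability test for a linear delayed system dx/dt = A x(t) + Ad x(t - tau) as a convex feasibility problem in CVXPY. The matrices A and Ad, the margin eps, and the functional V = x'Px + ∫ x'Qx ds are illustrative assumptions, not quantities taken from the paper.

    # Minimal sketch (assumed example, not the paper's LMIs): delay-independent
    # stability test for  dx/dt = A x(t) + Ad x(t - tau)  using the
    # Lyapunov-Krasovskii functional  V = x'P x + \int_{t-tau}^{t} x'Q x ds,
    # expressed as an LMI feasibility problem and solved with CVXPY.
    import cvxpy as cp
    import numpy as np

    # Illustrative system matrices (assumed, not from the paper)
    A  = np.array([[-2.0,  0.5],
                   [ 0.3, -1.5]])
    Ad = np.array([[ 0.2,  0.1],
                   [ 0.0,  0.3]])
    n = A.shape[0]

    P = cp.Variable((n, n), symmetric=True)
    Q = cp.Variable((n, n), symmetric=True)
    eps = 1e-6  # small margin to enforce strict inequalities numerically

    # Block LMI:  [[A'P + P A + Q,  P Ad],
    #              [Ad' P,          -Q  ]]  must be negative definite
    lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
                   [Ad.T @ P,            -Q]])

    constraints = [P >> eps * np.eye(n),
                   Q >> eps * np.eye(n),
                   lmi << -eps * np.eye(2 * n)]

    # Pure feasibility problem: any feasible (P, Q) certifies stability
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve(solver=cp.SCS)
    print("LMI feasible:", prob.status == cp.OPTIMAL)

In the paper, conditions of this general form are extended to the Takagi-Sugeno fuzzy setting, so that solving the resulting set of LMIs yields the weight learning method while guaranteeing exponential stability and a prescribed H∞ disturbance-attenuation level.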

Original language: English
Pages (from-to): 887-895
Number of pages: 9
Journal: Computers and Mathematics with Applications
Volume: 63
Issue number: 5
DOIs
Publication status: Published - 2012 Mar
Externally published: Yes

Keywords

  • Exponential H∞ stability
  • Linear matrix inequality (LMI)
  • Lyapunov-Krasovskii approach
  • Takagi-Sugeno fuzzy delayed neural networks
  • Weight learning method

ASJC Scopus subject areas

  • Modelling and Simulation
  • Computational Theory and Mathematics
  • Computational Mathematics

