Global Exponential Convergence of Neutral Type Competitive Neural Networks with D Operator and Mixed Delay

Authors
  • Aouiti, Chaouki1
  • Assali, El Abed1
  • Ben Gharbia, Imen1
  • 1 University of Carthage, Zarzouna, Bizerta 7021, Tunisia
Type
Published Article
Journal
Journal of Systems Science and Complexity
Publisher
Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Publication Date
Dec 01, 2020
Volume
33
Issue
6
Pages
1785–1803
Identifiers
DOI: 10.1007/s11424-020-8225-x
Source
Springer Nature
License
Yellow

Abstract

The competitive neural network (CNN) model was proposed in the recent past to describe the dynamics of cortical cognitive maps with unsupervised synaptic modifications. It involves two types of memory: long-term memory (LTM), which represents slow, unsupervised synaptic modifications, and short-term memory (STM), which characterizes the fast neural activity. This paper is concerned with a class of neutral-type CNNs with mixed delays and a D operator. By employing appropriate differential inequality theory, sufficient conditions are given to ensure that all solutions of the model converge exponentially to the zero vector. An illustrative example is given to show the effectiveness of the proposed results.
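For orientation only, below is a minimal LaTeX sketch of the classical STM/LTM competitive neural network equations (Meyer-Bäse type) on which this class of models builds; the notation and the neutral-type D-operator form are assumed standard conventions and are not taken from the paper itself.

\begin{aligned}
\text{STM:}\quad \varepsilon\,\dot{x}_i(t) &= -a_i x_i(t) + \sum_{j=1}^{n} D_{ij}\, f_j\bigl(x_j(t)\bigr) + B_i \sum_{k=1}^{p} m_{ik}(t)\, y_k,\\
\text{LTM:}\quad \dot{m}_{ik}(t) &= -m_{ik}(t) + y_k\, f_i\bigl(x_i(t)\bigr),
\end{aligned}

where x_i(t) is the current neural activity (STM state), m_{ik}(t) the synaptic efficiency (LTM state), y_k a constant external stimulus, and \varepsilon > 0 the fast time scale. In the neutral-type setting, a D operator typically acts on the STM state by replacing \dot{x}_i(t) with the derivative of x_i(t) - p_i(t)\, x_i(t - r_i(t)), while mixed (discrete and distributed) delays enter the interaction terms.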
