Recurrent networks for compressive sampling

Journal
Neurocomputing
0925-2312
Publisher
Elsevier
Volume
129
Identifiers
DOI: 10.1016/j.neucom.2013.09.028
Keywords
  • Neural Circuit
  • Stability
Disciplines
  • Computer Science

Abstract

This paper develops two neural network models, based on Lagrange programming neural networks (LPNNs), for recovering sparse signals in compressive sampling. The first model handles the standard (noiseless) recovery of sparse signals; the second recovers sparse signals from noisy observations. Their properties, including the optimality of the solutions and the convergence behavior of the networks, are analyzed. We show that in the first case the network converges to the global minimum of the objective function, while in the second case the convergence is locally stable.
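The constrained problem behind the first model admits a Lagrange-type dynamics: gradient descent in the variables and gradient ascent in the multipliers. The sketch below is an illustration only, not the paper's network: it recovers a sparse signal from `min ||x||_1 s.t. Ax = b` using a smoothed absolute value `sqrt(x^2 + eps)` (so the gradient exists), an augmented-Lagrangian term for damping, and an explicit-Euler discretization of the flow. All problem sizes and constants (`n`, `m`, `k`, `eps`, `c`, `dt`) are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy compressive-sampling instance (sizes are illustrative, not from the
# paper): a k-sparse x0 in R^n observed through m < n random measurements.
n, m, k = 40, 20, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x0[support] = rng.choice([-1.0, 1.0], size=k) * (1.0 + rng.random(k))
b = A @ x0

# Lagrange-style dynamics for  min ||x||_1  s.t.  Ax = b:
#   dx/dt   = -grad_x L(x, lam)   (descent in the variables)
#   dlam/dt = +grad_lam L(x, lam) (ascent in the multipliers)
# with L an augmented Lagrangian (the c-term adds damping) and |x|
# replaced by the smooth surrogate sqrt(x^2 + eps).
eps, c, dt, steps = 1e-3, 2.0, 0.01, 100_000
x = np.zeros(n)
lam = np.zeros(m)
for _ in range(steps):
    grad_f = x / np.sqrt(x * x + eps)      # d/dx sqrt(x^2 + eps)
    r = A @ x - b                          # constraint residual
    x = x - dt * (grad_f + A.T @ (lam + c * r))
    lam = lam + dt * r

print(np.linalg.norm(A @ x - b))  # constraint nearly satisfied
print(np.max(np.abs(x - x0)))     # close to the true sparse signal
```

The smoothing parameter trades accuracy for stiffness: a smaller `eps` tracks the L1 norm more closely but forces a smaller step `dt` for the explicit-Euler iteration to remain stable.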
