IEEE Trans Neural Netw Learn Syst. 2012 Sep;23(9):1377-89.
doi: 10.1109/TNNLS.2012.2202400. Epub 2012 Jun 28.

Convergence and rate analysis of neural networks for sparse approximation

Aurèle Balavoine et al. IEEE Trans Neural Netw Learn Syst. 2012 Sep.

Erratum in

  • IEEE Trans Neural Netw Learn Syst. 2014 Aug;25(8):1595-6

Abstract

We present an analysis of the Locally Competitive Algorithm (LCA), a Hopfield-style neural network that efficiently solves sparse approximation problems (e.g., approximating a vector from a dictionary using just a few nonzero coefficients). This class of problems plays a significant role both in theories of neural coding and in signal processing applications. However, the convergence properties of the LCA have not previously been analyzed, and existing results on neural networks for nonsmooth optimization do not apply to the specifics of the LCA architecture. We show that the LCA has desirable convergence properties, such as stability and global convergence to the optimum of the objective function when it is unique. Under some mild conditions, the support of the solution is also proven to be reached in finite time. Furthermore, under additional restrictions on the problem, we characterize the convergence rate of the system by showing that the LCA converges exponentially fast with an analytically bounded rate. We support our analysis with several illustrative simulations.
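
As a concrete illustration of the system under study, the following is a minimal simulation sketch of the LCA using forward-Euler integration and the soft-thresholding activation (see Fig. 2). The dynamics follow the standard LCA formulation from the literature [14]; the dictionary size, lambda, tau, step size, and iteration count are illustrative assumptions, not values taken from the paper.

    import numpy as np

    def soft_threshold(u, lam):
        # T_lambda(u): zero for |u| <= lambda, shrunk toward zero otherwise
        return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

    def lca(Phi, y, lam=0.1, tau=0.01, dt=1e-3, n_steps=2000):
        # Forward-Euler simulation of the LCA dynamics:
        #   tau * du/dt = -u + Phi^T y - (Phi^T Phi - I) a,   a = T_lambda(u)
        n = Phi.shape[1]
        u = np.zeros(n)                  # internal node states u(t)
        b = Phi.T @ y                    # constant input drive Phi^T y
        G = Phi.T @ Phi - np.eye(n)      # lateral inhibition weights
        for _ in range(n_steps):
            a = soft_threshold(u, lam)   # node outputs a(t)
            u += (dt / tau) * (b - u - G @ a)
        return soft_threshold(u, lam)

    # Toy usage: recover a 5-sparse vector from noisy random measurements.
    rng = np.random.default_rng(0)
    Phi = rng.standard_normal((64, 256)) / np.sqrt(64)
    a_true = np.zeros(256)
    a_true[rng.choice(256, size=5, replace=False)] = 1.0
    y = Phi @ a_true + 0.01 * rng.standard_normal(64)
    a_hat = lca(Phi, y)                  # sparse output a*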


Figures

Fig. 1
Block diagram of the LCA [14], which is a Hopfield-style network architecture for solving sparse approximation problems.
Fig. 2
Soft-thresholding activation function. When this is used for the thresholding function Tλ(·), the LCA solves the popular BPDN optimization problem used in many sparse approximation applications.
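
For reference, the soft-thresholding function and the BPDN objective it corresponds to can be written as follows (standard definitions from the sparse approximation literature; the paper's exact notation may differ):

    T_λ(u) = 0                  if |u| ≤ λ,
    T_λ(u) = u − λ·sign(u)      if |u| > λ,

    BPDN:   min_a  (1/2)·‖y − Φa‖₂² + λ·‖a‖₁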
Fig. 3
Generic activation function satisfying conditions (6). The area in gray represents where the activation function must lie in order to satisfy conditions (6b) and (6c).
Fig. 4
Evolution over time of several LCA nodes u_k(t). Solid lines correspond to nodes that are active in the final solution; dashed lines correspond to nodes that are inactive in the final solution.
Fig. 5
Output a* of the LCA after convergence. Only nonzero elements are plotted. The fixed point reached by the system is very close to the initial sparse vector used to create the measurement vector (it cannot match exactly due to noise). The solution is also very close to that of a standard digital solver [8] run on the same inputs. Note that the LCA produces many coefficients that are exactly zero (and are therefore not plotted).
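
To reproduce such a comparison, the sketch below implements iterative soft-thresholding (ISTA), a standard digital solver for BPDN; it is not necessarily the solver of [8], and the parameters are illustrative. It reuses numpy and soft_threshold from the LCA sketch above.

    def ista(Phi, y, lam=0.1, n_iters=500):
        # Iterative soft-thresholding for BPDN:
        #   a_{k+1} = T_{lam*eta}(a_k + eta * Phi^T (y - Phi a_k))
        eta = 1.0 / np.linalg.norm(Phi, 2) ** 2   # step 1/L, L = ||Phi||_2^2
        a = np.zeros(Phi.shape[1])
        for _ in range(n_iters):
            a = soft_threshold(a + eta * (Phi.T @ (y - Phi @ a)), lam * eta)
        return a

    a_digital = ista(Phi, y)   # compare against a_hat from the LCA sketch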
Fig. 6
Trajectories of u(t) in the plane defined by two nodes chosen randomly from the active set. Trajectories are shown for 30 random initial states.
Fig. 7
Histogram (in percent) of the number of switches the LCA requires before convergence, over 1000 trials.
Fig. 8
Convergence behavior of ‖u(t) − u*‖₂² from (12). The dashed line shows the theoretical decay in (20) with δ* computed using the final solution support. The dash-dot line shows the theoretical decay with δmax computed on the largest support visited. While both estimates show the theoretically correct behavior, the rate estimate based on δ* is empirically more accurate than the conservative estimate based on δmax.


References

    1. Balavoine A, Rozell CJ, Romberg J. Global convergence of the locally competitive algorithm. Proc. IEEE Digital Signal Process. Workshop. 2011 Jan:431–436.
    2. Elad M, Figueiredo MAT, Ma Y. On the role of sparse and redundant representations in image processing. Proc. IEEE. 2010 Jun;98(6):972–982.
    3. Olshausen BA, Field DJ. Sparse coding of sensory inputs. Current Opinion Neurobiol. 2004 Aug;14(4):481–487. - PubMed
    4. Olshausen BA, Field DJ. Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature. 1996 Jun;381(6583):607–609. - PubMed
    5. Olshausen BA, Field DJ. Sparse coding with an overcomplete basis set: A strategy employed by V1? Vis. Res. 1997 Dec;37(23):3311–3325. - PubMed
