Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions

Hongcheng Liu et al. Math Program. 2017 Nov;166(1-2):207-240. doi: 10.1007/s10107-017-1114-y. Epub 2017 Feb 10.

Abstract

This paper concerns folded concave penalized sparse linear regression (FCPSLR), a class of popular sparse recovery methods. Although FCPSLR yields desirable recovery performance when solved globally, computing a global solution is NP-complete. Despite existing statistical performance analyses of local minimizers or of specific FCPSLR-based learning algorithms, it remains an open question whether local solutions that are known to admit fully polynomial-time approximation schemes (FPTAS) are already sufficient to ensure the statistical performance, and whether that statistical performance can be made independent of the specific design of the computing procedure. To address these questions, this paper presents the following threefold results: (i) Any local solution (stationary point) is a sparse estimator, under some conditions on the parameters of the folded concave penalties. (ii) Perhaps more importantly, any local solution satisfying a significant subspace second-order necessary condition (S3ONC), which is weaker than the second-order KKT condition, yields a bounded error in approximating the true parameter with high probability. In addition, if the minimal signal strength is sufficient, the S3ONC solution likely recovers the oracle solution. This result also shows that the goal of improving the statistical performance is consistent with the optimization criterion of minimizing the suboptimality gap in solving the non-convex programming formulation of FCPSLR. (iii) We apply (ii) to the special case of FCPSLR with the minimax concave penalty (MCP) and show that, under the restricted eigenvalue condition, any S3ONC solution with a better objective value than the Lasso solution enjoys the strong oracle property. In addition, such a solution generates a model error (ME) comparable to that of the optimal but exponential-time sparse estimator given a sufficient sample size, while the worst-case ME is comparable to that of the Lasso in general. Furthermore, the computation of an S3ONC solution is guaranteed to admit an FPTAS.
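
For reference, a minimal sketch of the FCPSLR formulation discussed above, assuming the usual least-squares loss and the MCP as commonly defined in the literature (the notation X, y, lambda, and a is introduced here for illustration and is not drawn from the paper itself):

\[
\min_{\beta \in \mathbb{R}^p} \; \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2 \;+\; \sum_{j=1}^{p} P_\lambda\bigl(\lvert \beta_j \rvert\bigr),
\qquad
P_\lambda(t) \;=\;
\begin{cases}
\lambda t - \dfrac{t^2}{2a}, & 0 \le t \le a\lambda,\\[6pt]
\dfrac{a\lambda^2}{2}, & t > a\lambda,
\end{cases}
\]

where X is the n-by-p design matrix, y is the response vector, lambda > 0 is the regularization parameter, and a > 1 is the concavity parameter of the MCP. The penalty grows concavely on [0, a*lambda] and is constant beyond that point; this folded concave shape is what makes the objective non-convex while reducing the estimation bias that a convex penalty such as the Lasso would incur on large coefficients.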

Keywords: Folded concave penalty; Lasso; NP-completeness; Non-convex programming; Sparse recovery.

