Sparse quadratic classification rules via linear dimension reduction
- PMID: 31105355
- PMCID: PMC6516858
- DOI: 10.1016/j.jmva.2018.09.011
Abstract
We consider the problem of high-dimensional classification between two groups with unequal covariance matrices. Rather than estimating the full quadratic discriminant rule, we propose to perform simultaneous variable selection and linear dimension reduction on the original data, with the subsequent application of quadratic discriminant analysis on the reduced space. In contrast to quadratic discriminant analysis, the proposed framework does not require the estimation of precision matrices; it scales linearly with the number of measurements, making it especially attractive for use on high-dimensional data sets. We support the methodology with theoretical guarantees on variable-selection consistency and with empirical comparisons to competing approaches. We apply the method to gene expression data of breast cancer patients, and confirm the crucial importance of the ESR1 gene in differentiating estrogen receptor status.
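To illustrate the pipeline described in the abstract (sparse linear dimension reduction followed by quadratic discriminant analysis on the reduced space), the Python sketch below uses simple standardized mean-difference thresholding as a stand-in for the paper's convex, penalized estimation of the projection; the simulated data, the sparsity level, and the thresholding rule are assumptions for illustration only, not the authors' method.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)

# Simulated two-group data: p = 500 features, only the first 5 informative,
# with both a mean shift and unequal covariance between the groups.
n, p = 200, 500
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, p))
X[y == 1, :5] *= 1.5   # unequal variance in the informative block
X[y == 1, :5] += 1.0   # mean shift in the informative block

# Illustrative sparse linear reduction: keep the few features with the largest
# standardized mean differences and project onto that subspace. (The paper
# instead estimates the reduction via a convex, group-penalized criterion;
# thresholding here is only a hypothetical stand-in.)
diff = np.abs(X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0)) / X.std(axis=0)
selected = np.argsort(diff)[-5:]   # assumed sparsity level of 5
X_reduced = X[:, selected]

# Quadratic discriminant analysis on the reduced space: covariance matrices are
# estimated only for the retained variables, never the full p x p matrices.
qda = QuadraticDiscriminantAnalysis()
qda.fit(X_reduced, y)
print("selected features:", np.sort(selected))
print("training accuracy:", qda.score(X_reduced, y))
```

Because the quadratic rule is fit only on the reduced variables, the cost of the discriminant step no longer depends on inverting a p-by-p covariance matrix, which is the practical point of the reduction-then-QDA approach.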
Keywords: Convex optimization; Discriminant analysis; High-dimensional statistics; Variable selection.