Kernel optimization in discriminant analysis
- PMID: 20820072
- PMCID: PMC3149884
- DOI: 10.1109/TPAMI.2010.173
Abstract
Kernel mapping is one of the most widely used approaches to deriving intrinsically nonlinear classifiers. The idea is to use a kernel function that maps the original, nonlinearly separable problem to a space of intrinsically higher dimensionality where the classes are linearly separable. A major problem in the design of kernel methods is finding the kernel parameters that make the problem linear in the mapped representation. This paper derives the first criterion that specifically aims to find a kernel representation in which the Bayes classifier becomes linear. We illustrate how this result can be successfully applied in several kernel discriminant analysis algorithms. Experimental results, using a large number of databases and classifiers, demonstrate the utility of the proposed approach. The paper also shows (theoretically and experimentally) that a kernel version of Subclass Discriminant Analysis yields the highest recognition rates.
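The abstract's core idea, mapping the data with a kernel and then tuning the kernel parameters so that a linear discriminant separates the classes, can be illustrated with a small sketch. The snippet below is a minimal two-class kernel Fisher discriminant in NumPy; the bandwidth search uses a plain Fisher-ratio heuristic as a stand-in for the paper's Bayes-linearity criterion. The function names, the RBF `sigma` grid, and the toy data are illustrative assumptions, not the authors' method.

```python
import numpy as np

def rbf_kernel(X, Y, sigma):
    # Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def kfda_direction(K, y, reg=1e-6):
    # Dual coefficients alpha of a two-class kernel Fisher discriminant:
    # solve (N + reg * I) alpha = m1 - m0, where m_c is the kernelized mean
    # of class c and N is the within-class scatter in dual form.
    n = len(y)
    N = np.zeros((n, n))
    means = {}
    for c in (0, 1):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]
        means[c] = Kc.mean(axis=1)
        center = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ center @ Kc.T
    return np.linalg.solve(N + reg * np.eye(n), means[1] - means[0])

def fisher_ratio(K, y, alpha):
    # Between- over within-class variance of the 1-D projections K @ alpha;
    # a simple proxy for how linearly separable the mapped problem is.
    z = K @ alpha
    z0, z1 = z[y == 0], z[y == 1]
    return (z0.mean() - z1.mean()) ** 2 / (z0.var() + z1.var() + 1e-12)

# Toy usage: pick the RBF bandwidth that maximizes the Fisher ratio.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(2.5, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

def score(sigma):
    K = rbf_kernel(X, X, sigma)
    return fisher_ratio(K, y, kfda_direction(K, y))

best_sigma = max((0.1, 0.5, 1.0, 2.0, 5.0), key=score)
print("selected sigma:", best_sigma)
```

The grid search here is only a convenient substitute; the paper derives an explicit optimization criterion for the kernel parameters rather than scanning a fixed list of candidates.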
