Feature selection based on kernel discriminant analysis

Abstract

For two-class problems we propose two feature selection criteria based on kernel discriminant analysis (KDA). The first is the objective function of KDA and the second is the KDA-based exception ratio. We show that the objective function of KDA is monotonic with respect to the deletion of features, which ensures stable feature selection. The KDA-based exception ratio measures the overlap between classes in the one-dimensional space obtained by KDA. Computer experiments show that both criteria select features well, but the former is more stable. © Springer-Verlag Berlin Heidelberg 2006.
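The first criterion can be illustrated with a small sketch: compute the two-class kernel Fisher discriminant objective and greedily delete the feature whose removal decreases it least. This is only an illustration of the general technique, not the paper's implementation; the RBF kernel, the regularizer `mu`, the parameter values, and all function names here are assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kda_objective(X, y, gamma=1.0, mu=1e-3):
    """Two-class kernel Fisher discriminant objective:
    J = (M1 - M0)^T N^{-1} (M1 - M0), with N regularized by mu*I."""
    K = rbf_kernel(X, gamma)
    n = len(y)
    idx0 = np.where(y == 0)[0]
    idx1 = np.where(y == 1)[0]
    M0 = K[:, idx0].mean(axis=1)          # class-0 kernel mean vector
    M1 = K[:, idx1].mean(axis=1)          # class-1 kernel mean vector
    N = mu * np.eye(n)                    # within-class scatter (regularized)
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        ni = len(idx)
        N += Kc @ (np.eye(ni) - np.ones((ni, ni)) / ni) @ Kc.T
    d = M1 - M0
    return float(d @ np.linalg.solve(N, d))

def backward_select(X, y, n_keep, gamma=1.0):
    """Greedy backward elimination: repeatedly drop the feature whose
    deletion leaves the KDA objective largest."""
    feats = list(range(X.shape[1]))
    while len(feats) > n_keep:
        scores = [kda_objective(X[:, [f for f in feats if f != j]], y, gamma)
                  for j in feats]
        feats.pop(int(np.argmax(scores)))
    return feats
```

Because the abstract notes that the objective is monotonic under feature deletion, ranking candidate deletions by the remaining objective value gives a stable greedy ordering: an informative feature yields a large drop in `J` when removed, while a noise feature barely changes it.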

Citation

Ashihara, M., & Abe, S. (2006). Feature selection based on kernel discriminant analysis. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4132 LNCS-II, pp. 282–291). Springer Verlag. https://doi.org/10.1007/11840930_29
