For two-class problems we propose two feature selection criteria based on kernel discriminant analysis. The first is the objective function of kernel discriminant analysis (KDA) and the second is the KDA-based exception ratio. We show that the objective function of KDA is monotonic with respect to feature deletion, which ensures stable feature selection. The KDA-based exception ratio measures the overlap between classes in the one-dimensional space obtained by KDA. Computer experiments show that both criteria work well for selecting features, but the former is more stable. © Springer-Verlag Berlin Heidelberg 2006.
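The abstract's first criterion can be sketched as backward feature elimination guided by the kernel Fisher discriminant objective: at each step, delete the feature whose removal decreases the objective least. The code below is a minimal illustration of that idea, not the authors' implementation; the RBF kernel, the regularization constant `mu`, and all function names are assumptions introduced here for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfd_objective(X1, X2, gamma=1.0, mu=1e-3):
    """Kernel Fisher discriminant objective for two classes X1, X2.

    Returns J = (alpha^T d)^2 / (alpha^T (N + mu I) alpha), where d is the
    difference of class kernel means, N the within-class scatter in kernel
    space, and alpha the (regularized) optimal projection coefficients.
    """
    X = np.vstack([X1, X2])
    n, n1 = len(X), len(X1)
    K = rbf_kernel(X, X, gamma)            # n x n kernel matrix
    m1 = K[:, :n1].mean(axis=1)            # kernel mean, class 1
    m2 = K[:, n1:].mean(axis=1)            # kernel mean, class 2
    d = m1 - m2
    N = np.zeros((n, n))
    for Ki, ni in ((K[:, :n1], n1), (K[:, n1:], len(X2))):
        C = np.eye(ni) - np.full((ni, ni), 1.0 / ni)   # centering matrix
        N += Ki @ C @ Ki.T
    Nreg = N + mu * np.eye(n)              # mu regularizes N (assumed value)
    alpha = np.linalg.solve(Nreg, d)       # optimal projection direction
    return float((alpha @ d) ** 2 / (alpha @ Nreg @ alpha))

def backward_select(X1, X2, n_keep, gamma=1.0):
    """Greedy backward elimination: repeatedly drop the feature whose
    deletion leaves the KFD objective highest (i.e. hurts it least)."""
    feats = list(range(X1.shape[1]))
    while len(feats) > n_keep:
        scores = []
        for f in feats:
            rest = [g for g in feats if g != f]
            scores.append(kfd_objective(X1[:, rest], X2[:, rest], gamma))
        feats.pop(int(np.argmax(scores)))
    return feats

# Toy usage: feature 0 separates the classes, feature 1 is pure noise,
# so backward elimination should keep feature 0.
rng = np.random.default_rng(0)
X1 = np.column_stack([rng.normal(2.0, 0.3, 20), rng.normal(0.0, 1.0, 20)])
X2 = np.column_stack([rng.normal(-2.0, 0.3, 20), rng.normal(0.0, 1.0, 20)])
print(backward_select(X1, X2, n_keep=1))
```

On this toy data the noise feature is deleted first, matching the intuition that the objective stays high only when discriminative features remain.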
CITATION STYLE
Ashihara, M., & Abe, S. (2006). Feature selection based on kernel discriminant analysis. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4132 LNCS-II, pp. 282–291). Springer Verlag. https://doi.org/10.1007/11840930_29