It has recently been pointed out that the Regularized Least Squares Classifier (RLSC) remains a viable option for binary classification problems. We apply RLSC with Gaussian kernels to the datasets of the NIPS 2003 Feature Selection Challenge. Because RLSC is sensitive to noise variables, ensemble-based variable filtering is applied first, and RLSC uses only the best-ranked variables. We compare the performance of a stochastic ensemble of RLSCs to that of a single best RLSC. Our results indicate that the two achieve similar classification error rates on the challenge data; however, especially with large data sets, ensembles offer other advantages, which we list. © Springer-Verlag Berlin Heidelberg 2006.
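For readers unfamiliar with RLSC, it amounts to kernel ridge regression on ±1 labels: train by solving a regularized linear system in the Gaussian kernel matrix, predict by the sign of the kernel expansion. The sketch below is a minimal illustration of that idea only; the regularization parameter `lam` and kernel width `sigma` are placeholder values, not the settings used in the paper, and the ensemble-based variable filtering step is omitted.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def rlsc_fit(X, y, lam=1e-2, sigma=1.0):
    # RLSC training: solve (K + lam * n * I) c = y for labels y in {-1, +1}.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def rlsc_predict(X_train, c, X_test, sigma=1.0):
    # Classify by the sign of the kernel expansion over the training points.
    return np.sign(gaussian_kernel(X_test, X_train, sigma) @ c)
```

A stochastic ensemble in the spirit of the paper would train several such classifiers on random subsamples of the data (and of the top-ranked variables) and average or vote their outputs.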
CITATION STYLE
Torkkola, K., & Tuv, E. (2006). Ensembles of Regularized Least Squares Classifiers for high-dimensional problems. Studies in Fuzziness and Soft Computing, 207, 297–313. https://doi.org/10.1007/978-3-540-35488-8_12