Promoting diversity in Gaussian mixture ensembles: An application to signature verification


Abstract

Classifiers based on Gaussian mixture models are good performers in many pattern recognition tasks. Unlike decision trees, they can be described as stable classifiers: a small change in the sampling of the training set will not produce a large change in the parameters of the trained classifier. Given that ensembling techniques often rely on instability of the base classifiers to produce diverse ensembles, thereby reaching better performance than individual classifiers, how can we form ensembles of Gaussian mixture models? This paper proposes methods to optimise coverage in ensembles of Gaussian mixture classifiers by promoting diversity amongst these stable base classifiers. We show that changes in the signal processing chain and modelling parameters can lead to significant complementarity between classifiers, even if they are trained on the same source signal. We illustrate the approach by applying it to a signature verification problem, and show that very good results are obtained, as verified in the large-scale international evaluation campaign BMEC 2007. © 2008 Springer Berlin Heidelberg.
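To make the idea concrete, the sketch below shows one generic way an ensemble of Gaussian mixture verifiers can be diversified through modelling parameters: each member uses a different number of mixture components, and member scores (client vs. world log-likelihood ratios) are fused by simple averaging. This is an illustrative assumption, not the authors' pipeline; the paper also varies the signal processing chain, and the names train_gmm_ensemble, verification_score, and component_counts are hypothetical. The toy data merely stands in for signature feature vectors.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train_gmm_ensemble(client_feats, world_feats, component_counts=(2, 4, 8)):
    """Train one (client, world) GMM pair per configuration.

    Diversity here comes only from varying the number of mixture
    components; the paper additionally varies the feature extraction
    (signal processing) chain to obtain complementary members.
    """
    ensemble = []
    for n in component_counts:
        client = GaussianMixture(n_components=n, covariance_type="diag",
                                 reg_covar=1e-3, random_state=0).fit(client_feats)
        world = GaussianMixture(n_components=n, covariance_type="diag",
                                reg_covar=1e-3, random_state=0).fit(world_feats)
        ensemble.append((client, world))
    return ensemble

def verification_score(ensemble, test_feats):
    """Average the per-member log-likelihood-ratio scores (simple fusion)."""
    scores = []
    for client, world in ensemble:
        llr = client.score_samples(test_feats) - world.score_samples(test_feats)
        scores.append(llr.mean())
    return float(np.mean(scores))

# Toy usage: random vectors stand in for signature features.
rng = np.random.default_rng(0)
client_feats = rng.normal(0.0, 1.0, size=(200, 12))   # enrolment data
world_feats = rng.normal(0.5, 1.5, size=(1000, 12))   # background model data
genuine_trial = rng.normal(0.0, 1.0, size=(50, 12))
forgery_trial = rng.normal(0.5, 1.5, size=(50, 12))

ens = train_gmm_ensemble(client_feats, world_feats)
print("genuine score:", verification_score(ens, genuine_trial))
print("forgery score:", verification_score(ens, forgery_trial))
```

A higher average log-likelihood ratio indicates the claimed identity is more likely genuine; in practice a decision threshold would be set on development data, and score fusion could be weighted rather than a plain mean.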

Citation (APA)

Richiardi, J., Drygajlo, A., & Todesco, L. (2008). Promoting diversity in Gaussian mixture ensembles: An application to signature verification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5372 LNCS, pp. 140–149). https://doi.org/10.1007/978-3-540-89991-4_15
