Sparse multimodal Gaussian processes

Abstract

Gaussian processes (GPs) are effective tools in machine learning, but their unfavorable scaling with the number of training points has impeded more widespread use. By leveraging sparse approximation methods, sparse Gaussian processes extend the applicability of GPs to richer datasets. Multimodal data are common in machine learning applications, yet few sparse approximation methods for GPs are applicable to multimodal data. In this paper, we present two sparse multimodal approaches for multi-view GPs: the maximum informative vector machine (mIVM) and the alternative manifold-preserving (aMP) method, inspired by information theory and the manifold-preserving principle, respectively. The aMP uses an alternative selection strategy to preserve high space connectivity. In the experiments, we apply the proposed sparse multimodal methods to a recent framework of multi-view GPs, and the results verify the effectiveness of the proposed methods.
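The paper's mIVM and aMP criteria are not reproduced here, but the general idea behind informative-vector-machine-style sparsification — greedily keeping only the points that most reduce posterior uncertainty — can be sketched as follows. This is a minimal single-view illustration assuming an RBF kernel and a maximum-posterior-variance selection rule as a stand-in for the information-theoretic criterion; the function names and parameters are illustrative, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Squared-exponential (RBF) kernel matrix between two point sets.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def greedy_select(X, m, noise=1e-2, lengthscale=1.0):
    """Greedily pick m active points by maximum posterior variance,
    a common proxy for the entropy-reduction score used in IVM-style
    sparse GP approximations (illustrative, not the paper's mIVM)."""
    n = X.shape[0]
    K = rbf_kernel(X, X, lengthscale)
    var = np.diag(K).copy()            # prior marginal variances
    V = np.zeros((m, n))               # incremental low-rank factors
    active = []
    for j in range(m):
        i = int(np.argmax(var))        # most uncertain remaining point
        active.append(i)
        # Rank-one posterior update after conditioning on point i.
        v = (K[i] - V[:j].T @ V[:j, i]) / np.sqrt(var[i] + noise)
        V[j] = v
        var = var - v ** 2             # posterior variances shrink
    return active
```

After selection, the GP is trained on (or conditioned through) the active set only, reducing the usual cubic cost in n to a cost governed by the much smaller m.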

Citation (APA)

Liu, Q., & Sun, S. (2017). Sparse multimodal Gaussian processes. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10559 LNCS, pp. 28–40). Springer Verlag. https://doi.org/10.1007/978-3-319-67777-4_3
