Parametric manifold learning of Gaussian mixture models

Citations: 2 · Mendeley readers: 15

Abstract

The Gaussian Mixture Model (GMM) is among the most widely used parametric probability distributions for representing data. However, analyzing relationships among GMMs is difficult because they lie on a high-dimensional manifold. Previous works either perform clustering of GMMs, which learns only a limited discrete latent representation, or kernel-based embedding of GMMs, which is not interpretable because the inverse mapping is difficult to compute. In this paper, we propose Parametric Manifold Learning of GMMs (PML-GMM), which learns a parametric mapping from a low-dimensional latent space to the high-dimensional GMM manifold. Similar to PCA, the proposed mapping is parameterized by principal axes for the component weights, means, and covariances, which are optimized to minimize the reconstruction loss measured by Kullback-Leibler divergence (KLD). As the KLD between two GMMs is intractable, we approximate the objective function by a variational upper bound, which is optimized by an EM-style algorithm. Moreover, we derive an efficient solver by alternating optimization of subproblems and exploit Monte Carlo sampling to escape from local minima. We demonstrate the effectiveness of PML-GMM through experiments on synthetic, eye-fixation, flow cytometry, and social check-in data.
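The KLD between two GMMs mentioned in the abstract has no closed form, which is why the paper resorts to a variational bound. A standard variational approximation of this kind (due to Hershey and Olsen) combines closed-form KLDs between the individual Gaussian components; the sketch below illustrates that idea. This is an illustrative sketch only, not the paper's exact objective or solver, and the function names (`kl_gauss`, `kl_gmm_variational`) are my own.

```python
import numpy as np

def kl_gauss(m0, S0, m1, S1):
    """Closed-form KL divergence KL(N(m0,S0) || N(m1,S1)) between Gaussians."""
    d = len(m0)
    S1_inv = np.linalg.inv(S1)
    diff = m1 - m0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def kl_gmm_variational(w_f, mu_f, S_f, w_g, mu_g, S_g):
    """Hershey-Olsen-style variational approximation of KL(f || g) for two GMMs
    f = sum_a w_f[a] N(mu_f[a], S_f[a]) and g = sum_b w_g[b] N(mu_g[b], S_g[b])."""
    total = 0.0
    for a in range(len(w_f)):
        # Weighted similarity of component a to all components of f ...
        num = sum(w_f[ap] * np.exp(-kl_gauss(mu_f[a], S_f[a], mu_f[ap], S_f[ap]))
                  for ap in range(len(w_f)))
        # ... versus all components of g.
        den = sum(w_g[b] * np.exp(-kl_gauss(mu_f[a], S_f[a], mu_g[b], S_g[b]))
                  for b in range(len(w_g)))
        total += w_f[a] * np.log(num / den)
    return total
```

Note that the approximation is exactly zero when the two GMMs coincide, and grows as their components drift apart, which is the behavior a reconstruction loss needs.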



Citation (APA)

Liu, Z., Yu, L., Hsiao, J. H., & Chan, A. B. (2019). Parametric manifold learning of Gaussian mixture models. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2019-August, pp. 3073–3079). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2019/426


Readers' Seniority

PhD / Postgrad / Masters / Doc: 7 (78%)
Researcher: 2 (22%)

Readers' Discipline

Computer Science: 5 (63%)
Social Sciences: 2 (25%)
Engineering: 1 (13%)
