Variational inference for infinite mixtures of sparse Gaussian processes through KL-correction

Conference Paper


Abstract


  • We propose a new approximation method for Gaussian process (GP) regression based on the mixture-of-experts structure and variational inference. Our model is essentially an infinite mixture model in which each component is composed of a Gaussian distribution over the input space and a Gaussian process expert over the output space. Each expert is a sparse GP model augmented with its own set of inducing points. Variational inference is made feasible by assuming that the training outputs are independent given the inducing points. In previous work on variational mixtures of GP experts, the inducing points are selected through a greedy selection algorithm, which is computationally expensive. In our method, both the inducing points and the hyperparameters of the experts are learned by maximizing an improved lower bound of the marginal likelihood. Experiments on benchmark datasets show the advantages of the proposed method.
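
The abstract describes learning the inducing points and hyperparameters jointly by maximizing a lower bound on the marginal likelihood. The paper's KL-corrected bound for the full infinite mixture is not reproduced on this page, so the sketch below shows only the single-expert building block, assuming the standard collapsed variational bound of Titsias (2009) for sparse GP regression with an RBF kernel. The function names, the toy dataset, and the optimizer choice are illustrative assumptions, not the authors' code.

    # Minimal sketch (an assumption, not the paper's method): one sparse GP
    # expert trained by maximizing the Titsias-style collapsed variational
    # bound, with inducing inputs and hyperparameters learned jointly.
    import numpy as np
    from scipy.optimize import minimize

    def rbf(X1, X2, log_ls, log_sf):
        # Squared-exponential kernel: signal variance exp(2*log_sf),
        # lengthscale exp(log_ls).
        d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
        return np.exp(2 * log_sf) * np.exp(-0.5 * d2 / np.exp(2 * log_ls))

    def neg_bound(params, X, y, M):
        # Unpack inducing inputs Z and log hyperparameters from a flat vector.
        N, D = X.shape
        Z = params[: M * D].reshape(M, D)
        log_ls, log_sf, log_sn = params[M * D :]
        sn2 = np.exp(2 * log_sn)
        Kmm = rbf(Z, Z, log_ls, log_sf) + 1e-6 * np.eye(M)  # jitter for stability
        Kmn = rbf(Z, X, log_ls, log_sf)
        L = np.linalg.cholesky(Kmm)
        A = np.linalg.solve(L, Kmn) / np.sqrt(sn2)          # M x N
        B = np.eye(M) + A @ A.T
        LB = np.linalg.cholesky(B)
        c = np.linalg.solve(LB, A @ y) / np.sqrt(sn2)
        # log N(y | 0, Qnn + sn2*I) via the matrix inversion and determinant
        # lemmas, where Qnn = Kmn^T Kmm^{-1} Kmn is the Nystrom approximation.
        log_det = N * np.log(2 * np.pi * sn2) + 2 * np.sum(np.log(np.diag(LB)))
        quad = y @ y / sn2 - c @ c
        # Trace penalty (1/(2*sn2)) * tr(Knn - Qnn); for the RBF kernel,
        # diag(Knn) is constant at the signal variance.
        trace = 0.5 / sn2 * (N * np.exp(2 * log_sf) - sn2 * np.sum(A * A))
        return 0.5 * (log_det + quad) + trace               # negated bound

    # Toy 1-D regression problem; M = 10 inducing inputs initialized at random
    # training points and optimized together with the hyperparameters.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
    M = 10
    params0 = np.concatenate([X[rng.choice(200, M, replace=False)].ravel(),
                              [0.0, 0.0, -1.0]])
    res = minimize(neg_bound, params0, args=(X, y, M), method="L-BFGS-B")
    print("optimized negative bound:", res.fun)

In the paper's model, each expert would carry its own such bound, with the gating Gaussians over the input space and the KL-correction combining the experts into a single objective; that machinery is beyond this single-expert sketch.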

Publication Date


  • 2016

Citation


  • T. Nguyen, A. Bouzerdoum, and S. L. Phung, "Variational inference for infinite mixtures of sparse Gaussian processes through KL-correction," in 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016, pp. 2579-2583.

Scopus EID


  • 2-s2.0-84973344412

RO Full-text URL


  • http://ro.uow.edu.au/cgi/viewcontent.cgi?article=6460&context=eispapers

RO Metadata URL


  • http://ro.uow.edu.au/eispapers/5432

Start Page


  • 2579

End Page


  • 2583
