Preconditioned Krylov Subspace Methods for Sampling Multivariate Gaussian Distributions

Edmond Chow, Yousef Saad

Research output: Contribution to journal › Article › peer-review

49 Scopus citations

Abstract

A common problem in statistics is to compute sample vectors from a multivariate Gaussian distribution with zero mean and a given covariance matrix A. A canonical approach to the problem is to compute vectors of the form y = Sz, where S is the Cholesky factor or square root of A, and z is a standard normal vector. When A is large, such an approach becomes computationally expensive. This paper considers preconditioned Krylov subspace methods to perform this task. The Lanczos process provides a means to approximate A^{1/2}z for any vector z from an m-dimensional Krylov subspace. The main contribution of this paper is to show how to enhance the convergence of the process via preconditioning. Both incomplete Cholesky preconditioners and approximate inverse preconditioners are discussed. It is argued that the latter class of preconditioners has an advantage in the context of sampling. Numerical tests, performed with stationary covariance matrices used to model Gaussian processes, illustrate the dramatic improvement in computation time that can result from preconditioning.
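To illustrate the unpreconditioned building block the abstract describes, the sketch below approximates A^{1/2}z from an m-dimensional Krylov subspace via the standard Lanczos relation y_m = ||z|| V_m T_m^{1/2} e_1. It is a minimal NumPy sketch under stated assumptions: the function name lanczos_sqrt_sample, the subspace size m, and the random test covariance are illustrative choices, not code or data from the paper, and the paper's main contribution (preconditioning the process) is not shown.

    import numpy as np

    def lanczos_sqrt_sample(A, z, m=30):
        """Approximate A^{1/2} z from an m-dimensional Krylov subspace
        using the (unpreconditioned) Lanczos process. Illustrative sketch."""
        n = z.shape[0]
        V = np.zeros((n, m))            # orthonormal Lanczos basis V_m
        alpha = np.zeros(m)             # diagonal of the tridiagonal T_m
        beta = np.zeros(m)              # off-diagonal of T_m
        znorm = np.linalg.norm(z)
        V[:, 0] = z / znorm
        for j in range(m):
            w = A @ V[:, j]
            if j > 0:
                w -= beta[j - 1] * V[:, j - 1]
            alpha[j] = V[:, j] @ w
            w -= alpha[j] * V[:, j]
            if j < m - 1:
                beta[j] = np.linalg.norm(w)
                if beta[j] < 1e-12:     # happy breakdown: invariant subspace found
                    m = j + 1
                    break
                V[:, j + 1] = w / beta[j]
        T = (np.diag(alpha[:m]) +
             np.diag(beta[:m - 1], 1) + np.diag(beta[:m - 1], -1))
        # T_m^{1/2} e_1 via the eigendecomposition of the small tridiagonal T_m
        evals, Q = np.linalg.eigh(T)
        sqrtT_e1 = Q @ (np.sqrt(np.maximum(evals, 0.0)) * Q[0, :])
        return znorm * V[:, :m] @ sqrtT_e1   # y_m = ||z|| V_m T_m^{1/2} e_1

    # Usage: draw one sample y ~ N(0, A) for a small SPD test covariance
    rng = np.random.default_rng(0)
    n = 200
    X = rng.standard_normal((n, n))
    A = X @ X.T / n + np.eye(n)          # SPD matrix standing in for a covariance
    z = rng.standard_normal(n)           # standard normal vector
    y = lanczos_sqrt_sample(A, z, m=30)

Note that working with the symmetric square root A^{1/2}, rather than the Cholesky factor, is what makes this matrix-function-times-vector machinery applicable; both choices of S yield samples with covariance A.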

Original language: English (US)
Pages (from-to): A588–A608
Journal: SIAM Journal on Scientific Computing
Volume: 36
Issue number: 2
DOIs
State: Published - 2014

Keywords

  • Covariance matrix
  • Gaussian processes
  • Krylov subspace methods
  • Lanczos process
  • Matrix square root
  • Preconditioning
  • Sampling
  • Sparse approximate inverse
