Monte Carlo likelihood inference for missing data models

Yun Ju Sung, Charles J. Geyer

Research output: Contribution to journal › Article › peer-review


Abstract

We describe a Monte Carlo method to approximate the maximum likelihood estimate (MLE) when there are missing data and the observed data likelihood is not available in closed form. This method uses simulated missing data that are independent and identically distributed and independent of the observed data. Our Monte Carlo approximation to the MLE is a consistent and asymptotically normal estimate of the minimizer θ* of the Kullback-Leibler information, as both Monte Carlo and observed data sample sizes go to infinity simultaneously. Plug-in estimates of the asymptotic variance are provided for constructing confidence regions for θ*. We give Logit-Normal generalized linear mixed model examples, calculated using an R package.
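The core idea in the abstract — approximating an intractable observed-data likelihood by averaging over simulated missing data drawn i.i.d. from an importance density, independent of the observed data — can be sketched on a toy model. The model below (Gaussian missing data, not the paper's Logit-Normal GLMM) and the choice of importance density are illustrative assumptions; they are chosen so the observed-data likelihood is also available in closed form as a check.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy missing-data model (an illustrative assumption, not the paper's
# Logit-Normal GLMM): missing Y ~ N(theta, 1), observed X | Y = y ~ N(y, 1).
# Integrating out Y gives the observed-data density N(theta, 2) in closed
# form, so the exact MLE (the sample mean of x) is available as a check.
theta_true = 1.5
n, m = 500, 5000               # observed-data and Monte Carlo sample sizes
x = rng.normal(theta_true, np.sqrt(2.0), size=n)

# Simulated missing data: i.i.d. draws from an importance density h,
# independent of the observed data.  h = N(0, 2) is our choice here.
y = rng.normal(0.0, np.sqrt(2.0), size=m)
log_h = -0.5 * y**2 / 2.0 - 0.5 * np.log(2.0 * np.pi * 2.0)

def mc_loglik(theta):
    """Monte Carlo log-likelihood: for each observation x_i,
    f_theta(x_i) is approximated by (1/m) sum_j f_theta(x_i, y_j) / h(y_j)."""
    log_fy = -0.5 * (y - theta) ** 2 - 0.5 * np.log(2.0 * np.pi)
    log_fx_given_y = (-0.5 * (x[:, None] - y[None, :]) ** 2
                      - 0.5 * np.log(2.0 * np.pi))
    log_w = log_fx_given_y + (log_fy - log_h)[None, :]   # n x m log weights
    a = log_w.max(axis=1, keepdims=True)                 # log-sum-exp trick
    return float(np.sum(a[:, 0] + np.log(np.mean(np.exp(log_w - a), axis=1))))

# Maximize the Monte Carlo log-likelihood by grid search (a simple stand-in
# for a proper optimizer); the same simulated y's are reused at every theta.
grid = np.linspace(0.0, 3.0, 301)
theta_mc = grid[np.argmax([mc_loglik(t) for t in grid])]
print("Monte Carlo MLE:", theta_mc, " exact MLE:", x.mean())
```

Because the same simulated missing data are reused for every value of θ, the Monte Carlo log-likelihood is a smooth function of θ, and its maximizer converges to the MLE (and to θ*) as n and m grow, which is the regime the paper's asymptotic theory covers.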

Original language: English (US)
Pages (from-to): 990-1011
Number of pages: 22
Journal: Annals of Statistics
Volume: 35
Issue number: 3
State: Published - Jul 2007

Keywords

  • Asymptotic theory
  • Empirical process
  • Generalized linear mixed model
  • Maximum likelihood
  • Model misspecification
  • Monte Carlo
