Multilevel models incorporating random effects at various levels are enjoying increased popularity. An implicit problem with such models is identifiability. From a Bayesian perspective, formal identifiability is not an issue; rather, when implementing iterative simulation-based model fitting, a poorly behaved Gibbs sampler frequently arises. The objective of this paper is to shed light on two computational issues in this regard. The first concerns autocorrelation in the sequence of iterates of the Markov chain. For estimable functions we clarify when, after convergence, autocorrelation will drop off to zero rapidly, enabling high effective sample size. The second concerns immediate convergence, i.e., when, at an arbitrary iteration, the simulated value of a variable is in fact an observation from the posterior distribution of that variable. Again, for estimable functions, we clarify when the chain will produce at each iteration a sample drawn essentially from the true posterior of the function. We provide both analytical and computational support for our conclusions, including exemplification for three multilevel models having normal, Poisson, and binary responses, respectively.
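The phenomenon can be illustrated with a minimal sketch (not taken from the paper): a deliberately over-parameterized normal model y_i ~ N(mu + alpha, 1), where mu and alpha are not separately identified but their sum eta = mu + alpha is estimable. A two-step Gibbs sampler for this model, with vague N(0, tau^2) priors on both parameters (tau^2 = 100 is an illustrative choice), shows the unidentified coordinate mixing very slowly while the estimable function mixes essentially instantly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y_i ~ N(mu + alpha, 1). Only eta = mu + alpha is estimable.
n = 50
true_eta = 3.0
y = rng.normal(true_eta, 1.0, size=n)

tau2 = 100.0           # vague prior variance for both mu and alpha (assumption)
prec = n + 1.0 / tau2  # full-conditional precision (error variance fixed at 1)

iters = 20_000
mu, alpha = 0.0, 0.0
mu_draws = np.empty(iters)
eta_draws = np.empty(iters)

for t in range(iters):
    # Gibbs step for mu | alpha, y: normal with the conjugate mean/variance
    mu = rng.normal((y - alpha).sum() / prec, np.sqrt(1.0 / prec))
    # Gibbs step for alpha | mu, y
    alpha = rng.normal((y - mu).sum() / prec, np.sqrt(1.0 / prec))
    mu_draws[t] = mu
    eta_draws[t] = mu + alpha

def acf1(x):
    """Empirical lag-1 autocorrelation of a chain."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

print(f"lag-1 autocorrelation of mu         : {acf1(mu_draws):.3f}")
print(f"lag-1 autocorrelation of mu + alpha : {acf1(eta_draws):.3f}")
```

Because the data constrain only the sum, mu alone executes a near random walk (lag-1 autocorrelation close to 1), whereas the draws of eta are nearly independent across iterations and, after the first step, come essentially from the true posterior of eta — a toy version of the "immediate convergence" behavior described above.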
Original language: English (US)
Number of pages: 23
State: Published - Oct 1 2001
Keywords:
- Estimable function
- Exact sampling