By extending the maximum-likelihood method of Miller and Greene for one-dimensional nuclear magnetic resonance spectroscopy, maximum-likelihood (ML) methods for two-dimensional spectroscopy are developed. The time-domain free-induction decay (FID) is modeled as a 2D signal f(t1, t2): a superposition of exponentially decaying complex exponentials in both dimensions, with unknown amplitudes, frequencies, and decay rates. The measurement is modeled as a 2D Gaussian process with mean f(t1, t2) and fixed variance, so that ML estimation reduces to choosing the set of amplitudes, frequencies, and decay rates that minimizes the squared error between the measured FID and the parameterized signal model. An iterative expectation-maximization (EM) algorithm is proposed for solving this nonlinear least-squares problem, providing a method for estimating the signal parameters directly from the 2D FID. The ML method and implementation algorithm are tested on both simulated data and experimental N-butanol data collected via the hypercomplex data-collection protocol.
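As a minimal sketch of the signal model described above, the code below synthesizes a noiseless 2D FID as a sum of separable damped complex exponentials. The function name, parameter names, and sampling intervals are illustrative assumptions, not the paper's implementation; the paper's actual hypercomplex acquisition and EM estimation steps are not reproduced here.

```python
import numpy as np

def fid_2d(amps, freqs1, freqs2, decays1, decays2,
           n1=64, n2=64, dt1=1e-3, dt2=1e-3):
    """Synthesize a 2D FID: a sum of K peaks, each a product of
    damped complex exponentials in t1 and t2.

    amps     : complex amplitudes (length K)
    freqs1/2 : frequencies in Hz along each dimension (length K)
    decays1/2: decay rates in 1/s along each dimension (length K)
    """
    # Sampling grids for the two time dimensions (column / row vectors
    # so broadcasting forms the full n1 x n2 surface).
    t1 = np.arange(n1)[:, None] * dt1
    t2 = np.arange(n2)[None, :] * dt2

    f = np.zeros((n1, n2), dtype=complex)
    for a, w1, w2, d1, d2 in zip(amps, freqs1, freqs2, decays1, decays2):
        # Each peak is separable: exp((i*2*pi*w - d) * t) in each dimension.
        f += (a
              * np.exp((1j * 2 * np.pi * w1 - d1) * t1)
              * np.exp((1j * 2 * np.pi * w2 - d2) * t2))
    return f

# A measured FID under the Gaussian model would be this surface plus
# complex white Gaussian noise of fixed variance, e.g.:
#   y = fid_2d(...) + sigma * (rng.standard_normal((n1, n2))
#                              + 1j * rng.standard_normal((n1, n2)))
```

At t1 = t2 = 0 every damped exponential equals 1, so the first sample of the synthesized FID is simply the sum of the amplitudes, which gives a quick sanity check on the model.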