
Approximating the marginal likelihood estimate for models with random parameters

Abstract

Often a model for the mean and variance of a measurement set is naturally expressed in terms of both deterministic and random parameters. Each deterministic parameter has one fixed value, while the random parameters come from a distribution of values. We restrict our attention to the case where the random parameters and the measurement error have a Gaussian distribution. In this case, the joint likelihood of the data and random parameters is an extended least squares function. The likelihood of the data alone is the integral of this extended least squares function with respect to the random parameters. This is the likelihood that we would like to optimize, but we do not have a closed-form expression for the integral. We use Laplace’s method to obtain an approximation for the likelihood of the data alone. Maximizing this approximation is less computationally demanding than maximizing the integral expression, but it yields a different estimator. In addition, evaluating the approximation requires second derivatives of the original model functions, so if we were to use this approximation as our objective function, evaluating the derivative of the objective would require third derivatives of the original model functions.
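
To make the construction concrete, the following display sketches the quantities involved. The notation is ours, introduced only for illustration: theta denotes the deterministic parameters, b the random parameters, F the extended least squares function, and n the number of random parameters.

\[
  L(\theta) \;=\; \int \exp\{-F(\theta, b)\}\, db ,
  \qquad
  \hat b(\theta) \;=\; \arg\min_{b} F(\theta, b) ,
\]
\[
  L(\theta) \;\approx\;
  (2\pi)^{n/2}\,
  \det\!\big[ F_{bb}(\theta, \hat b(\theta)) \big]^{-1/2}
  \exp\{-F(\theta, \hat b(\theta))\} ,
\]

where F_{bb} denotes the Hessian of F with respect to b. It is this Hessian that introduces second derivatives of the original model functions into the approximation, and differentiating it with respect to theta is what would introduce third derivatives.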

We present modified approximations that are expressed using only values of the original model functions; evaluating the derivative of a modified approximation requires only first derivatives of the original model functions. We use Monte Carlo techniques to approximate the difference between an arbitrary estimator and the estimator that maximizes the likelihood of the data alone. In addition, we approximate the information matrix corresponding to the estimator that maximizes the likelihood of the data alone.
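
As a minimal sketch of the Monte Carlo idea, the integral over the random parameters can be estimated by averaging the conditional data likelihood over draws of b from its Gaussian distribution. The function names and signatures below are assumptions made for illustration, not an interface from the paper.

import numpy as np

def monte_carlo_log_marginal(log_data_given_b, sample_b, theta,
                             n_draws=10_000, rng=None):
    # Monte Carlo estimate of log L(theta) = log E_b[ p(y | theta, b) ].
    # Illustrative assumptions:
    #   log_data_given_b(theta, b) -- log p(y | theta, b) for one draw of b
    #   sample_b(rng)              -- one draw of b from its Gaussian distribution
    rng = np.random.default_rng() if rng is None else rng
    logs = np.array([log_data_given_b(theta, sample_b(rng))
                     for _ in range(n_draws)])
    # log-mean-exp, shifted by the maximum for numerical stability
    m = logs.max()
    return m + np.log(np.exp(logs - m).mean())

Because the draws of b come from their own distribution, the sample average of p(y | theta, b) is an unbiased estimate of the integral defining L(theta); the maximizer of such an estimate over theta can then serve as a reference point for measuring how far another estimator is from the one that maximizes the likelihood of the data alone.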
