4. Robustness
Jae Kwang Kim and Shu Yang
We now discuss the robustness of the proposed method against a small departure from the assumed parametric model. Specifically, we consider imputation model misspecification in which the true model is a small exponential tilting of the assumed model. For simplicity of presentation, assume that the sampling design is simple random sampling, so that the realized sample is a random sample from the superpopulation model.
We assume that the true density $f_0(y)$ does not belong to the working parametric family $\{ f(y; \theta) : \theta \in \Theta \}$. However, we can still specify a working model $f(y; \theta)$ and compute the MLE $\hat{\theta}$ of $\theta$. It is well known (White 1982) that the MLE $\hat{\theta}$ converges to the minimizer of the Kullback-Leibler information
$\mathrm{KL}(f_0 \,\|\, f_\theta) = \int f_0(y) \log\{ f_0(y) / f(y; \theta) \} \, dy$
for $\theta \in \Theta$. Sung and Geyer (2007) discussed the asymptotic properties of the Monte Carlo MLE of $\theta$ under missing data.
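White's (1982) result can be illustrated numerically. In the sketch below (an illustration, not taken from the paper), the true distribution is Exp(1) and the working model is Gaussian; for the Gaussian family the Kullback-Leibler minimizer is $(\mu, \sigma^2) = (E(Y), V(Y)) = (1, 1)$, and the MLE under the misspecified model converges to exactly that pseudo-true value.

```python
import numpy as np

rng = np.random.default_rng(0)

# True model: Exp(1).  Working model: N(mu, sigma^2) -- misspecified.
# The Gaussian MLE is (sample mean, sample variance), which converges to
# the KL minimizer (E(Y), V(Y)) = (1, 1) for Exp(1), even though no
# "true" Gaussian parameter exists.
y = rng.exponential(scale=1.0, size=200_000)

mu_hat = y.mean()       # MLE of mu under the working model
sigma2_hat = y.var()    # MLE of sigma^2 under the working model

print(mu_hat, sigma2_hat)   # both approach 1 as n grows
```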
To formally discuss robustness, suppose that the true distribution belongs to the neighborhood
(4.1)   $\{\, g : \mathrm{KL}(g \,\|\, f_\theta) \le \epsilon^2 / 2 \,\}$
for some radius $\epsilon > 0$, where $\mathrm{KL}(\cdot \,\|\, \cdot)$ is the Kullback-Leibler distance measure. The neighborhood (4.1) can be characterized in the following way. Let $u(y)$ be a function of $y$ and $\theta$, standardized to satisfy $E_\theta\{u(Y)\} = 0$ and $V_\theta\{u(Y)\} = 1$, and define
(4.2)   $f_\epsilon(y; \theta, u) = f(y; \theta) \exp\{\epsilon u(y)\} / M_u(\epsilon),$
where $M_u(\epsilon) = \int f(y; \theta) \exp\{\epsilon u(y)\} \, dy$. For small $\epsilon$, it can be shown that the neighborhood (4.1) is approximated by
(4.3)   $\{\, f_\epsilon(y; \theta, u) : \theta \in \Theta, \; E_\theta\{u(Y)\} = 0, \; V_\theta\{u(Y)\} = 1 \,\}.$
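As a concrete check (an illustration, not from the paper): take $f = N(0,1)$ and the direction $u(y) = y$, which is already standardized under $f$. The tilted density $f(y)\exp(\epsilon y)/M(\epsilon)$ is exactly $N(\epsilon, 1)$, and $\mathrm{KL}(f_\epsilon \,\|\, f) = \epsilon^2/2$ holds exactly in this case, which the sketch verifies by numerical integration.

```python
import numpy as np

# Working density f = N(0,1); tilting direction u(y) = y, which is
# standardized under f: E{u(Y)} = 0 and V{u(Y)} = 1.
y = np.linspace(-10.0, 10.0, 200_001)
dy = y[1] - y[0]
f = np.exp(-0.5 * y**2) / np.sqrt(2.0 * np.pi)

eps = 0.1
tilt = f * np.exp(eps * y)
f_eps = tilt / (tilt.sum() * dy)   # normalizing divides out M(eps)

# KL(f_eps || f) by Riemann sum; theory gives eps^2 / 2 = 0.005 here.
kl = (f_eps * np.log(f_eps / f)).sum() * dy
print(kl)
```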
Equation (4.3) represents an extensive set of distributions close to $f(y; \theta)$, created by varying $u$ over different standardized functions; $u$ and $\epsilon$ carry a geometric interpretation, representing the direction and the magnitude of the misspecification, respectively. For a $p$-dimensional parameter $\theta$, we can specify the directions of the misspecification as
(4.4)   $u(y) = \delta^{\top} \mathcal{I}(\theta)^{-1/2} S(y; \theta),$
where $\delta^{\top} \delta = 1$, $S(y; \theta) = \partial \log f(y; \theta) / \partial \theta$ is the score function, and $\mathcal{I}(\theta)$ is the information matrix for $\theta$.
Since the score function satisfies $E_\theta\{S(Y; \theta)\} = 0$ and $V_\theta\{S(Y; \theta)\} = \mathcal{I}(\theta)$, the function $u$ in (4.4) satisfies the standardization criterion $E_\theta\{u(Y)\} = 0$ and $V_\theta\{u(Y)\} = \delta^{\top} \delta = 1$. See Copas and Eguchi (2001) for further discussion of this representation.
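To make (4.4) concrete (an illustration, not from the paper), take $f(y; \theta) = N(\mu, \sigma^2)$ with $\theta = (\mu, \sigma^2)$, so $p = 2$; the score is $S(y; \theta) = \bigl((y-\mu)/\sigma^2, \; \{(y-\mu)^2 - \sigma^2\}/(2\sigma^4)\bigr)$ and $\mathcal{I}(\theta) = \mathrm{diag}(1/\sigma^2, \, 1/(2\sigma^4))$. The Monte Carlo sketch checks that $u = \delta^{\top} \mathcal{I}(\theta)^{-1/2} S$ has mean 0 and variance 1 for a unit vector $\delta$:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma2 = 2.0, 3.0
y = rng.normal(mu, np.sqrt(sigma2), size=1_000_000)

# Score components of N(mu, sigma^2): E{S} = 0 and V{S} = I(theta).
S = np.stack([(y - mu) / sigma2,
              ((y - mu) ** 2 - sigma2) / (2.0 * sigma2 ** 2)])

# The information matrix is diagonal here, so I^{-1/2} is easy to form.
I_inv_half = np.diag([np.sqrt(sigma2), np.sqrt(2.0) * sigma2])

delta = np.array([0.6, 0.8])    # any direction with delta' delta = 1
u = delta @ I_inv_half @ S      # u(y) = delta' I(theta)^{-1/2} S(y; theta)

print(u.mean(), u.var())        # approximately 0 and 1
```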
Let $w_{ij}^*(f_\epsilon)$ be the fractional weight of the form (3.3) computed using the true density $f_\epsilon$ in (4.2), and let $w_{ij}^*(f_\theta)$ be the corresponding fractional weight computed using the "working density" $f(y; \theta)$. By the special construction of the weights, we can establish
(4.5)   $w_{ij}^*(f_\epsilon) = w_{ij}^*(f_\theta) \{ 1 + O_p(\epsilon) \}.$
The proof of (4.5) is given in Appendix A.2. Thus the fractional weights under the tilted model agree with those under the working model up to terms of order $\epsilon$. For small $\epsilon$, we have $w_{ij}^*(f_\epsilon) \approx w_{ij}^*(f_\theta)$, and so the resulting estimator of $\theta$ from the fractional imputation will be close to the true value $\theta_0$.
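The order-$\epsilon$ agreement in (4.5) can be illustrated numerically. The exact weight formula (3.3) is not reproduced in this excerpt, so the sketch below uses a hypothetical stand-in: normalized working-density weights $w_j^* \propto f(y_j; \theta)$ over a fixed donor set. Because the normalizing constant $M_u(\epsilon)$ cancels in the normalization, tilting perturbs each weight only through $\exp\{\epsilon u(y_j)\}$, giving a relative change of order $\epsilon$:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical donor set and working density f = N(0, 1).
donors = rng.normal(size=50)
f = np.exp(-0.5 * donors**2) / np.sqrt(2.0 * np.pi)

def frac_weights(eps):
    # Normalized weights under the tilted density f(y) exp{eps * u(y)} / M(eps)
    # with direction u(y) = y; M(eps) cancels in the normalization.
    w = f * np.exp(eps * donors)
    return w / w.sum()

w0 = frac_weights(0.0)
for eps in (0.1, 0.01, 0.001):
    rel = np.abs(frac_weights(eps) / w0 - 1.0).max()
    print(eps, rel)   # the maximum relative change shrinks linearly in eps
```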