I wanted to show my class how Mplus does FIML by default, but it appears that this isn't the case for a linear regression with a single, observed, continuous outcome. I have a demo data set with 20 cases. The data are complete for the predictors, and half of the cases are missing on Y. If I simply regress Y on X1 and X2, Mplus drops the cases with missing data on Y, even though I have not specified "LISTWISE=ON" in the DATA command. The estimates are the same as what I get in SPSS using listwise deletion. Am I correct, then, that FIML is not the default in regression with a single DV? That is question #1.
Now for question #2. I know I can force Mplus to use all the cases by naming the variances of X1 and X2. But when I do so, I get the exact same unstandardized estimates and SEs as in the listwise deleted model. Why would that be? I thought that naming the variances would trigger FIML and that FIML would produce less biased (i.e., different) unstandardized estimates.
I have replicated this phenomenon in a larger data set, where I know for sure that the probability of missing data on Y depends on one of the X variables.
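The identical estimates are not a bug: when the x's are completely observed and only Y is missing, the observed-data likelihood factorizes into an x-part and a Y-given-x part, and the regression parameters are estimated only from the cases with Y observed. The sketch below (variable names and the MAR mechanism are my own choices, not from the thread) illustrates this numerically by maximizing the joint-normal observed-data likelihood and comparing it to listwise-deletion OLS:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 500
x = rng.normal(0.0, 1.0, n)
y = 0.5 + 0.8 * x + rng.normal(0.0, 1.0, n)

# Make Y missing at random (MAR): missingness probability depends on x.
missing = rng.random(n) < 1.0 / (1.0 + np.exp(-x))
y_obs = np.where(missing, np.nan, y)
cc = ~np.isnan(y_obs)  # complete-case mask

# Listwise-deletion OLS on the complete cases.
X = np.column_stack([np.ones(cc.sum()), x[cc]])
b_ols, *_ = np.linalg.lstsq(X, y_obs[cc], rcond=None)

# "FIML": maximize the observed-data likelihood of the joint normal model
# for (x, y); cases with missing y contribute only through the x-part.
def neg_loglik(theta):
    mu_x, log_var_x, b0, b1, log_var_e = theta
    var_x, var_e = np.exp(log_var_x), np.exp(log_var_e)
    ll = -0.5 * np.sum(np.log(2 * np.pi * var_x) + (x - mu_x) ** 2 / var_x)
    resid = y_obs[cc] - b0 - b1 * x[cc]
    ll -= 0.5 * (cc.sum() * np.log(2 * np.pi * var_e) + np.sum(resid ** 2 / var_e))
    return -ll

res = minimize(neg_loglik, np.zeros(5), method="BFGS")
b_fiml = res.x[2:4]

print(b_ols)   # listwise-deletion slope estimates
print(b_fiml)  # FIML estimates: identical up to optimizer tolerance
```

Because the two likelihood pieces share no parameters, the maximizer of the Y-given-x piece is exactly complete-case OLS, which is why naming the x variances reproduces the listwise-deletion estimates in this situation.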
I am considering how to address missing data for plain multiple linear regression (i.e., 1 DV measured as an observed variable). I have missing data on both the DV and predictors.
1. Does including auxiliary variables help in this case, and does it change the model estimation? The saturated-correlates examples I have seen focus on latent variable models.
2. For this model, what would be the FIML equivalent in Mplus of performing multiple imputation on the IVs and DV while including extra auxiliary covariates in the MI process? It appears from the answer to the prior question that naming the variances of the Xs does not affect the slopes when only Y is missing.
Dayna Walker posted on Wednesday, October 31, 2018 - 9:54 am
Dear Drs. Muthen,
I am encountering a similar problem to the first post in this thread.
Am I correct in interpreting your response to Elizabeth to mean that Mplus will not apply FIML if there is a single dependent variable in a linear regression model? That is, must there be more than one dependent variable for Mplus to apply FIML by default?
This is not specific to Mplus. A linear regression has only one DV, and there is no FIML theory that can do anything there. You can bring the x's into the model so that you actually have two or more DVs, and then FIML can make a difference. This is all described in our Topic 11 short course video and handout, and in Chapter 10 of our RMA book.
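For readers who want to try this, a minimal Mplus input sketch along the lines described above (file name, variable names, and the missing-data flag are placeholders, not from the thread):

```
DATA:     FILE = demo.dat;
VARIABLE: NAMES = y x1 x2;
          MISSING = ALL (-999);
MODEL:    y ON x1 x2;
          x1 x2;   ! mentioning the variances brings x1 and x2 into the
                   ! model, so cases with missing x values are retained.
                   ! As noted above, this does not change the slope
                   ! estimates when only y is missing.
```

Mentioning the x variances makes the x's model variables, which matters when the predictors themselves have missing data, as in the second poster's situation.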