Sample size for LGM with missing data...
Message/Author
 Daniel posted on Monday, August 01, 2005 - 8:19 am
I ran an LGM with the missing option (e.g., analysis: type=meanstructure missing h1;). For some reason I get a sample size estimate equal to the whole sample (N=1143), including the independent variables. However, when I ran the same model as a mixture model, I got a sample size of N=143, based on the dependent variables. Is there some reason for this odd discrepancy?
 bmuthen posted on Monday, August 01, 2005 - 10:53 am
The short answer is that the mixture-model analysis deletes individuals with missing data on the IVs, whereas the non-mixture analysis doesn't. The mixture analysis can, however, be made to do the same as the non-mixture analysis: simply mention parameters related to the IVs, such as

x1-x5;

The medium-length answer is that the non-mixture analysis adds a normality assumption for the IVs, and this is critical for the missing-data handling. The mixture analysis takes the approach that we want to condition on the IVs to avoid assuming anything about their distribution, since the model is not about [x] but about [y | x] (here x = IV, and y | x means conditioning on x). But if you want to, you can do [y, x] (= [y | x]*[x]) modeling as shown above.
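To make the "mention the IVs" advice concrete, here is a minimal Mplus input sketch of a two-class growth mixture model that retains cases with missing x's. The outcome names y1-y4, the class variable c, the number of classes, and the growth factors i and s are all hypothetical; only the x1-x5 list comes from the example above:

VARIABLE: NAMES = y1-y4 x1-x5;
          CLASSES = c(2);             ! hypothetical 2-class model
ANALYSIS: TYPE = MIXTURE MISSING;
MODEL:    %OVERALL%
          i s | y1@0 y2@1 y3@2 y4@3;  ! linear growth model (sketch)
          i s ON x1-x5;
          x1-x5;                      ! mentioning the IV variances brings
                                      ! the x's into the likelihood, so cases
                                      ! with missing x's are retained
                                      ! (normality is then assumed for the x's)

Without the x1-x5; line, the mixture analysis conditions on the x's and lists individuals with any missing IV values as deleted.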

The long answer is that this difference arises because the two analysis types come from different traditions. The non-mixture analysis comes from SEM, where it has been common to analyze [y, x] rather than [y | x]. As Joreskog and Goldberger showed in JASA in 1975, these two approaches give the same result - but only when there is no missing data on x. Statistics in general works with [y | x], just as in regression analysis, so no distributional assumption is made for x. If you want to handle missing data on the x's, you have to add a distributional assumption for the x's, and then you are into [y, x] modeling. So you gain something at a cost. - Hope at least one of those three answers is helpful.
 Daniel posted on Monday, August 01, 2005 - 11:15 am
Thanks for taking the time to write this detailed response. It is very helpful.