Philip Jones posted on Wednesday, January 03, 2007 - 1:53 pm
I'm trying to fit a GMM to a right-skewed outcome measured at 4 time points, treating the outcome as censored at the floor. I am fitting cubic growth curves, allowing the intercept variance to be free within class but fixing the variances of the other three growth factors. With the default settings, a 3-class model gives an ABIC of 39968 and an entropy of 0.841. If I free the intercept variance and the residual variances across classes, the ABIC drops to 39621, suggesting a better fit, and the differences I observe in the variance estimates across the 3 classes seem to corroborate this. However, the entropy of the latter model drops substantially, to 0.408. How should I reconcile this with the presumably better fit? Such a low entropy suggests to me that the model is not as useful for accurately classifying subjects. Varying the number of classes doesn't change things much; the entropies of the more general model remain much lower than those of the model with variances held equal across classes.
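For readers comparing ABIC values across models: the sample-size-adjusted BIC that Mplus reports replaces the ln(n) penalty of the ordinary BIC with ln((n+2)/24). A minimal Python sketch of that formula (the function name and inputs are my own, not Mplus output):

```python
import math

def sabic(loglik, n_params, n):
    """Sample-size-adjusted BIC (Sclove, 1987 adjustment).

    loglik   : maximized log-likelihood of the fitted model
    n_params : number of free parameters
    n        : sample size
    Lower values indicate better relative fit, as in the ABIC
    comparison above.
    """
    return -2.0 * loglik + n_params * math.log((n + 2.0) / 24.0)
```

Freeing variances adds parameters, so the penalty term grows; an ABIC drop of ~350 despite the extra parameters is why the more general model looks better on this criterion.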
And a slightly different question: computation flies (<5 min) if I assume normality but is excruciatingly slow when I code the variables as censored or two-part (can be 45 minutes to 2 hours on my brand new 2.33GHz Core 2 Duo with 2GB memory and PROCESSORS=2), making model tweaking very arduous. I often try INTEGRATION=5 to speed things up but it either doesn't help much or results in non-convergence. Any suggestions?
Entropy is not a measure of model fit. It may simply be that the better-fitting model does not have as clear-cut classes.
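To see why entropy measures class separation rather than fit: it is computed from the individual posterior class probabilities, not from the likelihood. A small Python illustration of the normalized entropy statistic (a sketch of the standard formula, not Mplus code; the function name is mine):

```python
import numpy as np

def relative_entropy(post, eps=1e-12):
    """Normalized entropy from an n x K matrix of posterior
    class probabilities.

    Returns a value in [0, 1]: near 1 when each subject is
    assigned to one class with near certainty, near 0 when the
    posteriors are close to uniform. eps guards against log(0).
    """
    n, K = post.shape
    ent = -np.sum(post * np.log(post + eps))
    return 1.0 - ent / (n * np.log(K))
```

Two models can have nearly identical log-likelihoods yet very different entropies, because entropy depends only on how sharply the posteriors separate subjects into classes.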
When numerical integration is needed, computational time increases. One to two hours is not particularly long in this case. INTEGRATION=5 may not be enough for adequate precision. I would use 7. If convergence is slower than expected with numerical integration, it may be that the model is too complex.
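On why more integration points cost time but buy precision: with censored or two-part outcomes the likelihood involves an integral over the latent growth factors, typically approximated by Gauss-Hermite quadrature. A hedged Python sketch of the idea (illustrative only; the function name is mine and this is not what Mplus runs internally):

```python
import numpy as np

def gh_expectation(f, n_points):
    """Approximate E[f(Z)] for Z ~ N(0,1) using probabilists'
    Gauss-Hermite quadrature with n_points nodes.

    More nodes give a more precise approximation of the integral
    but cost proportionally more per likelihood evaluation.
    """
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_points)
    # weights sum to sqrt(2*pi), the normalizing constant of N(0,1)
    return np.sum(weights * f(nodes)) / np.sqrt(2.0 * np.pi)
```

With too few points the approximation of a non-polynomial integrand can be rough, which is one way a low INTEGRATION setting can distort the likelihood surface and hinder convergence; raising the number of points (e.g., from 5 to 7) trades speed for precision.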