Anonymous posted on Friday, February 24, 2006 - 12:42 pm
I am interested in accounting for variance in the growth of two variables using Mplus. These variables are modeled using a simultaneous growth model. The model fits poorly using the raw data (after properly investigating different types of growth). The data are positively skewed, and the natural-log-transformed data fit a "linear" growth model (I recognize that this is no longer linear after the transformation).
Could someone share their thoughts on the appropriateness of using natural-log-transformed data, as in Willett and Bub (2004), to account for variance in the latent variables? Thanks in advance!
bmuthen posted on Saturday, February 25, 2006 - 9:03 am
I would recommend against transforming the data if the only reason is to make the variables more normally distributed. Instead, all you have to do is use a non-normality-robust estimator such as the Mplus estimators MLM or MLR.
I would only transform if that made the modeling more transparent, e.g., making the variable scores easier to understand and making the relationships more linear.
Anonymous posted on Tuesday, February 28, 2006 - 8:14 am
Thank you very much for your help Dr. Muthen. I was able to complete the analyses. I've found the resources that your team provides for Mplus very helpful while learning to use the program.
The output is very interpretable, but I have one final question that I hope that you can help me with.
I recognize that the means found in the "Intercepts" section of the output differ from the means found in the TECH4 output. How do I test whether the latent variable means are significant in conditional models?
The reason I ask is that the means are significant in the "Means" section of the unconditional analyses. These means are almost identical to the means presented in the TECH4 output for the unconditional model. However, the slope for one of the growth factors is not significant in the "Intercepts" section for the conditional parallel model. I'm pretty sure that doesn't mean the slope is not significantly different from 0, but I'd like to test whether the slope is significantly different from 0 in the unconditional simultaneous model. Sorry for the basic questions, but I greatly appreciate your help. Please feel free to refer me to the user's guide if necessary (I searched but couldn't find the information).
Anonymous posted on Tuesday, February 28, 2006 - 8:47 am
I made a mistake in the previous message. The third-to-last sentence should read "I'm pretty sure that (the "Intercepts" information) doesn't mean that the slope is not significantly different from 0, but I'd like to test whether the slope is significantly different from 0 in the *conditional* simultaneous model."
bmuthen posted on Tuesday, February 28, 2006 - 4:19 pm
Consider for example the slope growth factor s regressed on a covariate cov,
(1) s = a + b*cov,
where a is the intercept and b is the slope in this regression. The mean of s is
(2) mean(s) = a + b*mean(cov).
The Mplus intercept parameter a is therefore not the mean of s unless you have centered cov so that it has zero sample mean, but you can get the mean of s from TECH4 as you said. You can test whether the mean of s is significantly different from zero very conveniently in the new Mplus Version 4 by defining a new parameter for mean(s), expressed as in (2), using MODEL CONSTRAINT.
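A minimal sketch of what this could look like in Mplus Version 4 input (the factor names i and s, the covariate name cov, and the sample mean of 0.5 are placeholders; substitute your own variable names and your covariate's actual sample mean):

```
MODEL:
  i s | y1@0 y2@1 y3@2 y4@3;   ! linear growth model
  s ON cov (b);                ! label the regression slope b
  [s] (a);                     ! label the intercept of s

MODEL CONSTRAINT:
  NEW(means);
  means = a + b*0.5;           ! 0.5 = sample mean of cov, as in equation (2)
```

Mplus then reports an estimate, standard error, and test for the NEW parameter means, which tests whether the mean of s differs from zero.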
Kurt Beron posted on Saturday, March 10, 2007 - 4:01 pm
I'm trying to implement your Feb 28 test suggestion of the mean (rather than intercept) of a LCM. I've tried several approaches with no success. A snippet: model: tsocint tsocslp | lntsoc3@0lntsoc4@1lntsoc5@2lntsoc6@3; psocint psocslp | lnpsocagg3@0lnpsocagg4@1lnpsocagg5@2lnpsocagg6@3; tsocint on aframer (p1) higrtin (p2) married (p3); [tsocint] (p4); !Variations: model constraint: new(meantbint); ! The following takes the means after the fact ! meantbint=p4+(p1*.212)+(p2*.355)+(p3*.657); ! Here I refer to the variables directly after defining on Variable Constraints line, with no success ! meantbint=p4+(p1*aframer)+(p2*higrtin)+(p3*married) ; model test: ! And here are various failed tests ! meantbint=p4+(p1*.212)+(p2*.355)+(p3*.657); ! 0=p4+(p1*aframer)+(p2*higrtin)+(p3*married) ; The best so far is one extra parameter that puts an actual constraint in rather than testing the mean. Thanks.
It's difficult to say where you are going wrong without more information. Another way to test the mean is to run the model without covariates. If you just want the value of the model-estimated mean, you can ask for TECH4 in the OUTPUT command. Otherwise, please send your input, data, output, and license number to firstname.lastname@example.org.
I don't think the goal of transforming the covariates should be to make them normal - the model makes no such assumption. Instead, you may transform them if that makes the assumption of linearity in the regression more realistic. With a log transform you would be saying that increasing x at a high level of x doesn't produce as much change in y as at a low level of x.
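A quick numeric illustration of that diminishing-returns shape (values rounded):

```latex
\ln(2) - \ln(1) \approx 0.693, \qquad \ln(11) - \ln(10) \approx 0.095
```

The same one-unit increase in x moves ln(x), and hence the predicted y, far less at high levels of x than at low levels.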