Justin Jager posted on Thursday, December 18, 2008 - 9:13 am
I have a parallel process, piece-wise growth model of self-esteem and risky behavior, which includes an intercept and two growth pieces.
I wish to examine differences in concurrent correlations across growth piece/time. The concurrent covariances between the 3 growth factors are -.027, -.016, and -.015, while the concurrent correlations are -.048, -.236, and -.252. When I constrain the covariances to be equal across all three growth pieces, the combined covariance is .018, but (due to dramatic std dev differences across the growth factors) the stdyx estimates/correlations still vary dramatically: -.028, -.232, -.256.
Because of these std dev differences, comparing covariances is like comparing apples to oranges. I would therefore like to use model comparison on the standardized estimates, since they account for the differences in std dev. Is there a way to do this in Mplus?
Yes, by using Model Constraint. In the Model command you give labels to the parameters involving the covariances and variances of the growth factors. In Model Constraint you define the "new" parameters (see UG):
corr1 = cov/(sqrt(v11)*sqrt(v12));
and then say
corr2 = corr1; corr3 = corr1;
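Putting those pieces together, a minimal sketch of the relevant commands (the factor names f1 and f2 and the labels cov1, v11, v12 are placeholders, not the posted model):

```
MODEL:
  f1 WITH f2 (cov1);    ! label the covariance between two growth factors
  f1 (v11);             ! label the variances
  f2 (v12);

MODEL CONSTRAINT:
  NEW (corr1);
  corr1 = cov1 / (sqrt(v11)*sqrt(v12));
```

With three growth pieces you would define corr2 and corr3 analogously from their own labeled covariances and variances, then add the equality statements.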
Justin Jager posted on Thursday, December 18, 2008 - 3:34 pm
Fantastic! Thank you. I have one follow-up question though. Using the following syntax:
MODEL:
  Irisk WITH Islfest (p1);
  L1risk WITH L1slfest (p2);
  L2risk WITH L2slfest (p3);

MODEL CONSTRAINT:
  NEW (corr1*-.048);
  NEW (corr2*-.240);
  corr1 = p1 / (sqrt(p4)*sqrt(p5));
  corr2 = p2 / (sqrt(p6)*sqrt(p7));
  corr2 = corr1;
I got the following output:

New/Additional Parameters
              Estimate     S.E.   Est./S.E.   P-Value
    CORR1       -0.058    0.021      -2.794     0.005
    CORR2       -0.058    0.021      -2.794     0.005
So, I have gotten to the point of calculating the corrs and constraining them to be the same, but how do I determine whether constraining them leads to a poorer fit? I tried to run the model with the corrs unconstrained, but I got fatal errors (I assume by design or necessity). I then tried some other things, but nothing worked. Thanks!
What you did looks right. The unconstrained model is your original one - no need to use Model Constraint. But leaving out corr2=corr1 should also work. Perhaps your variances weren't positive. If you are interested in knowing why your attempt failed, please send your input, output, data and license number to email@example.com.
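For an ML-estimated model, once you have the loglikelihoods of the unconstrained (original) and constrained runs, the chi-square difference test can be computed by hand. A minimal sketch (the loglikelihood values below are made up for illustration; with a robust estimator such as MLR a scaling correction would be needed):

```python
from scipy.stats import chi2

def lr_test(ll_unconstrained, ll_constrained, df_diff):
    """Likelihood-ratio test of nested models: twice the loglikelihood
    difference is asymptotically chi-square with df_diff degrees of freedom."""
    stat = 2.0 * (ll_unconstrained - ll_constrained)
    p = chi2.sf(stat, df_diff)
    return stat, p

# Hypothetical loglikelihoods from the two Mplus runs; constraining the
# three correlations to be equal removes 2 free parameters.
stat, p = lr_test(ll_unconstrained=-1250.3, ll_constrained=-1252.9, df_diff=2)
print(f"chi2(2) = {stat:.2f}, p = {p:.3f}")
```

A non-significant p-value would indicate that the equality constraint does not significantly worsen fit.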
I have a parallel growth process model and I want to compare covariances between growth factors across two groups via a multiple-group model. However, the growth factor variances differ across the groups, and I am therefore concerned about the validity of the multiple-group tests of covariance equality. Should I use the approach described here instead?
We are attempting to use the input quoted above (Michael Spaeth posted on Monday, August 03, 2009 - 11:01 am) in an analysis using WLSMV. Although we have successfully constrained the correlations, when we attempt to test the effect of the constraints we are met with:
*** ERROR in ANALYSIS command DIFFTEST is not available in conjunction with nonlinear constraints through the use of MODEL CONSTRAINT. Request for DIFFTEST is ignored.
So, my question is: is it not possible to test the constrained correlations in this manner using DIFFTEST? If that is correct, is there another way (other than, for example, saving factor scores and handling the tests in ML instead)?