Interpretation of results with covari...
Message/Author
 Yvonne Miller posted on Friday, December 04, 2009 - 9:33 am
Hi,

I'm investigating test scores of children in arithmetic at 3 time points.

My model is
i s| arith1@0 arith2@1 arith3@2
is ON gender ses mig

I get significant effects of the covariates on the intercept and the slope.
How are the effects of covariates on the intercept interpreted?

Can I say for example that
1. girls have better scores at the beginning or
does it mean that
2. girls have better test scores over all 3 measurement points?

Thanks!
 Linda K. Muthen posted on Friday, December 04, 2009 - 10:18 am
You should watch the video for Topic 3 to get a full description of growth modeling. The intercept growth factor in your model is defined as initial status because the time score of zero is at the first time point. If gender is scored as girls being one, a significant effect of the regression of i on gender says that girls started higher. If the regression of s on gender is significant and positive, it says girls have a higher growth rate.
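The interpretation above can be sketched numerically. This is an illustrative sketch, not Mplus output: all coefficient values below are invented for illustration, assuming gender is coded 1 = girl, 0 = boy as in the reply.

```python
# Illustrative sketch: how growth-factor regressions translate into
# predicted trajectories. All coefficient values are made up.
def predicted_score(t, gender, a_i=50.0, b_i=2.0, a_s=5.0, b_s=1.0):
    """Predicted arithmetic score at time t (0, 1, 2).

    i = a_i + b_i * gender   (intercept growth factor regression)
    s = a_s + b_s * gender   (slope growth factor regression)
    gender: 1 = girl, 0 = boy (coding assumed above).
    """
    i = a_i + b_i * gender
    s = a_s + b_s * gender
    return i + s * t

# With b_i > 0, girls start higher at time 0 (initial status);
# with b_s > 0 they also grow faster, so the gap widens over time.
print(predicted_score(0, 1) - predicted_score(0, 0))  # 2.0 at initial status
print(predicted_score(2, 1) - predicted_score(2, 0))  # 4.0 at the last time point
```

So a significant effect on i speaks to the difference at the time scored zero (answer 1 in the question above), not to an overall difference across all time points.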
 Yvonne Miller posted on Friday, December 04, 2009 - 10:46 am
 Yvonne Miller posted on Monday, December 07, 2009 - 2:13 am
Hi,

I have one further question: What is the difference between unstandardized and standardized model results?
I have cases where the influence of predictors is significant in the unstandardized results but not in the STDYX standardization?
Yvonne
 Linda K. Muthen posted on Monday, December 07, 2009 - 11:06 am
See the STANDARDIZATION option in the user's guide for a description of the various standardizations available in Mplus.

It can happen that unstandardized and standardized coefficients are not both significant or both not significant. They should be close. In these cases, I would be conservative as far as significance goes, given that you are likely looking at many parameters.
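The rescaling behind STDYX can be sketched as follows. This is only the point-estimate part; the values are invented, and Mplus computes the standardized estimates and their standard errors (via the delta method) itself, which is why the two z-tests can land on different sides of the significance threshold.

```python
# Sketch of the STDYX idea: the raw slope is rescaled by SD(x)/SD(y).
# All numbers are made up for illustration.
def stdyx(b_raw, sd_x, sd_y):
    """Standardized coefficient: change in SD units of y per SD of x."""
    return b_raw * sd_x / sd_y

b = 0.8  # unstandardized regression coefficient (assumed)
print(stdyx(b, sd_x=0.5, sd_y=4.0))  # 0.1
```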
 Benedikt Neumann posted on Tuesday, December 08, 2009 - 9:31 am
Hi,

I am testing the response status of participants in 7 waves with an LGC model. I included 2 independent variables (incentives for participation) as time-invariant covariates.

My model is:
VARIABLE: CATEGORICAL ARE RT1-RT7;
DEFINE: INSU = INC*SUM;
ANALYSIS: ESTIMATOR = ML;
TYPE = MISSING;
MODEL: i s | RT1@0 RT2@1 RT3@2 RT4@3 RT5@4 RT6@5 RT7@6;
i ON INC SUM INSU;
s ON INC SUM INSU;

As far as I have understood, I can use AIC/BIC to compare the fit of more or less restricted models.

(1) But is there any way to say how well the model fits the data at all (using the logit link)? If not, what alternatives could you suggest in order to make sure the model is fine?

(2) Sometimes removing significant effects of covariates on the random intercept/slope improves the model fit. Should this happen?

I would need this for my thesis and would be very thankful for your advice!

Thank you!

Benedikt
 Bengt O. Muthen posted on Tuesday, December 08, 2009 - 6:14 pm
1. You can consider the frequency table chi-square when you don't have covariates in the model. With sparse cell counts, you can consider bivariate fit using TECH10.

2. That would say the covariates not only have indirect effects on the repeated measures via the growth factors, but also direct effects. But if you are using BIC to reach this conclusion, you want to compare models with the same variables.
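The BIC comparison mentioned above can be sketched numerically. The log-likelihoods, parameter counts, and sample size below are invented for illustration; the point is only the mechanics of the comparison (lower BIC is preferred), and, as noted, both models must involve the same set of variables.

```python
import math

# Sketch of comparing two models by BIC (lower is better).
# All numbers are made up for illustration.
def bic(loglik, n_params, n_obs):
    return -2.0 * loglik + n_params * math.log(n_obs)

# Model A: covariates affect the repeated measures only via i and s.
# Model B: additionally allows two direct covariate effects (assumed).
bic_a = bic(loglik=-1520.3, n_params=12, n_obs=400)
bic_b = bic(loglik=-1516.9, n_params=14, n_obs=400)
print("prefer A" if bic_a < bic_b else "prefer B")
```

Here the two extra direct effects improve the log-likelihood, but not by enough to offset the BIC penalty for the added parameters.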
 Benedikt Neumann posted on Friday, December 11, 2009 - 10:35 am
Dear Mr. Muthen,

I have tried chi-square and TECH10. For most of my data the model is ok, for some it is not (too much variability, it seems).

But, I still feel unsure about the conclusion for the model with covariates.

(1) Can I conclude that a good model fit without covariates will entail a good model fit with covariates, and a bad fit a bad one? (Because the measurement model is independent of the structural model?)

(2) A significant time-invariant covariate improves the structural model by explaining variance of the latent factors, but has no effect on the measurement model?

Am I thinking this right?

Thank you again for your efforts!

Benedikt
 Bengt O. Muthen posted on Friday, December 11, 2009 - 4:10 pm
1) No. But it is a good start to first get good fit without covariates. To see if the model with covariates fits well, you need to work with neighboring models, asking whether some covariates influence some outcomes directly (which would violate the usual model assumptions).

2) Not true.
 Benedikt Neumann posted on Monday, February 22, 2010 - 8:02 am
Dear Mr. Muthen,

I am using the WLSMV estimator (binary outcome variables) for my latent growth analysis with an intercept and a linear slope and have another problem:

Without any covariates, the mean slope is negative and significant. This is in accordance with the descriptive data, which show an average decline rate of 2.5%.

When introducing (significant) time-invariant covariates of the latent factors, the slope intercept in some cases is not significant or even changes to a positive estimate. This doesn't seem to match the analysis without covariates and is contrary to the descriptive data.

Could you give me any suggestions how to understand and how to "fix" this problem?

Thank you very much!
 Linda K. Muthen posted on Monday, February 22, 2010 - 9:27 am
I think the issue here is that in a conditional model, the estimate you obtain is an intercept of the slope growth factor and not the mean.
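The distinction can be sketched numerically. All values below are invented for illustration: in a conditional model, the printed value for the slope growth factor is its intercept a10, i.e. the expected slope when all covariates equal zero, while the overall mean also folds in the covariate effects and covariate means.

```python
# Sketch: slope intercept vs. overall slope mean in a conditional model.
# All numbers are made up for illustration.
a10 = 0.10                      # slope intercept (covariates at zero)
betas = [-0.30, -0.25, -0.15]   # assumed effects of x1, x2, x1*x2 on s
x_means = [0.5, 0.6, 0.3]       # assumed covariate means

mean_s = a10 + sum(b * m for b, m in zip(betas, x_means))
print(mean_s)  # -0.245: negative overall mean despite a positive intercept
```

So a positive slope intercept and a descriptive average decline are not necessarily in conflict once the covariates are in the model.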
 Benedikt Neumann posted on Tuesday, February 23, 2010 - 11:47 am
Dear Ms. Muthen,

Thank you for your quick reply! Unfortunately, I still don't see my mistake.

I try to compare my descriptive data to the predictions of the LGC model by using the following equations:

Level 1: P(Y = 1) = N(-t + v*i + p*s), where N is the standard normal CDF, t the threshold, v the intercept loading, and p the time score;
Level 2:
i = 0 + b01*x1 + b02*x2 + b03*x1x2;
s = a10 + b11*x1 + b12*x2 + b13*x1x2;
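The prediction equations above can be sketched as follows, reading N as the standard normal CDF (probit link, as under WLSMV). All parameter values are invented for illustration; only the structure of the two levels is taken from the equations.

```python
import math

def probit_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Sketch of the Level 1 / Level 2 equations with assumed values:
# t = threshold, a10 = slope intercept, b0k/b1k = covariate effects.
def p_response(time, x1, x2, t=0.5, a10=0.1,
               b01=0.2, b02=0.1, b03=0.05,
               b11=-0.3, b12=-0.25, b13=-0.15):
    i = 0.0 + b01 * x1 + b02 * x2 + b03 * x1 * x2   # Level 2: intercept
    s = a10 + b11 * x1 + b12 * x2 + b13 * x1 * x2   # Level 2: slope
    return probit_cdf(-t + i + time * s)            # Level 1

# With x1 = x2 = 0, s = a10 > 0, so the predicted probability rises
# over time; the increase is linear on the probit scale, not in P itself.
print([round(p_response(time, 0, 0), 3) for time in range(3)])
```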

When x1=0 and x2=0, s = a10. As far as I understand, when a10 is replaced by the positive slope intercept estimate of the model results section, the slope still "produces" a linear increase of Y, given all covariates are 0 and p increases linearly.

As a result, the predicted Y goes up, while the outcome data of the "control group" obviously go down. Additionally, the slope intercept also becomes negative when I use WLS instead of WLSMV.

I still don't see where this difference might come from. Could you please help me further with this?

Thank you!
 Linda K. Muthen posted on Tuesday, February 23, 2010 - 4:23 pm