Anonymous posted on Thursday, January 12, 2006 - 3:44 pm
I am trying to interpret a series of growth curve models, and I'm in need of some advice. In my first model, I run an unconditional model (i.e., no covariates). This model converges correctly but tells me that my slope is not significant. However, when I add time-invariant covariates to the model, the slope coefficient (i.e., the coefficient in the Intercepts column of the output) becomes significant. Then when I run a model with time-varying covariates, my slope is again non-significant. I guess my question is: why would the slope be significant in the time-invariant covariate model but not in the unconditional model?
I'm a little confused about your question, but let me just say this: when you have covariates in the model, you are estimating an intercept for the slope growth factor, not a mean as in the unconditional analysis. The intercept is the expected slope for individuals with values of zero on all covariates, whereas the unconditional analysis tests the average slope across everyone. Those are different quantities, so it is not unusual for one to be significant and the other not. Perhaps this is what you are seeing.
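To make the distinction concrete, here is a minimal numpy sketch. It is not Mplus output; the data are simulated, and all variable names are made up for illustration. Individual growth slopes are generated to depend on an uncentered covariate x, so the average slope is near zero (non-significant in an unconditional test) while the slope-factor intercept, i.e., the expected slope at x = 0, is clearly nonzero:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical uncentered covariate with mean -1
x = rng.normal(-1.0, 1.0, n)

# Simulated individual growth slopes: intercept 1.0, covariate effect 1.0.
# Because E[x] = -1, the overall mean slope is about 1.0 + 1.0 * (-1) = 0.
slope = 1.0 + 1.0 * x + rng.normal(0.0, 1.0, n)

# "Unconditional" analysis: t-test of the mean slope against zero
mean_slope = slope.mean()
t_mean = mean_slope / (slope.std(ddof=1) / np.sqrt(n))

# "Conditional" analysis: regress slopes on x; the intercept is the
# expected slope when x = 0, not the overall mean slope
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, slope, rcond=None)
resid = slope - X @ beta
s2 = resid @ resid / (n - 2)
se = np.sqrt(s2 * np.linalg.inv(X.T @ X).diagonal())
t_intercept = beta[0] / se[0]

print(f"mean slope = {mean_slope:.3f}, t = {t_mean:.2f}")       # near zero
print(f"slope intercept = {beta[0]:.3f}, t = {t_intercept:.2f}")  # clearly nonzero
```

The mean slope hovers near zero while the intercept estimate sits near 1.0 with a large t-statistic, which mirrors the pattern described above: adding covariates changes what the reported parameter means, so its significance can flip without anything being wrong. Centering the covariates would make the intercept equal the mean slope again.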