Slope variance
Message/Author
 empisoz posted on Thursday, June 21, 2018 - 11:41 am
Hello

I estimated a growth curve model with 4 occasions. The variance of the slope was estimated as negative. I constrained this variance to zero and the model converged. But how can it be that, despite the slope variance being constrained to zero, once I add a covariate and regress the slope on that covariate, the covariate still explains variance in the slope? I thought that once a variance is set to zero, there is nothing left to explain?
 Bengt O. Muthen posted on Thursday, June 21, 2018 - 3:49 pm
When you add a covariate, the variance refers to the residual variance of the DV, not its total variance. And even if the residual variance is zero, the slope still varies as a function of its predictors. It is frequently observed that adding predictors makes it easier to detect slope variation.
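A small simulation may make the decomposition concrete (this is an illustrative sketch in Python, not Mplus syntax; the names `gamma`, `x`, and `zeta` are hypothetical). With the slope modeled as s_i = gamma * x_i + zeta_i, fixing the residual variance Var(zeta) to zero still leaves Var(s) = gamma^2 * Var(x), so the slope varies whenever the covariate does:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical slope model: s_i = gamma * x_i + zeta_i,
# where zeta is the slope residual that s@0 fixes to zero.
n = 100_000
gamma = 0.5
x = rng.normal(0.0, 1.0, n)   # covariate values across persons
zeta = np.zeros(n)            # residual variance fixed at zero (s@0)
s = gamma * x + zeta          # individual growth slopes

# The residual variance is zero, yet the slopes still vary,
# entirely through the covariate: Var(s) = gamma**2 * Var(x).
print(round(zeta.var(), 3))   # 0.0
print(round(s.var(), 2))      # close to gamma**2 * 1 = 0.25
```

The point is that s@0 constrains only the part of the slope variance that the covariates leave unexplained.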
 TJ posted on Tuesday, September 10, 2019 - 2:35 pm
Hi Bengt, This is a follow-up to your response above. Could you please elaborate a little more on why adding predictors makes it easier to detect slope variation? Once S@0 is set, even if you add a covariate, isn't the covariate supposed to explain the variance of the intercept, not the slope?
 Bengt O. Muthen posted on Wednesday, September 11, 2019 - 11:59 am
Q1: With more observed variables in your model, you can gain power to reject the hypothesis that the slope variance is zero.

Q2: Even if you have s@0, your covariates can still change the slope. You are fixing only the residual variance of s regressed on the covariates - the covariates still affect s in the sense that different covariate values give different s values.
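The point in Q2 can be sketched in a couple of lines (again an illustrative Python sketch, not Mplus; `gamma` and the covariate values are made-up numbers). With the residual fixed at zero, each person's slope is completely determined by their covariate value, so different x give different s:

```python
# Hypothetical model with s@0: s_i = gamma * x_i (no residual term left).
gamma = 0.5
x_low, x_high = -1.0, 2.0     # two persons with different covariate values

s_low = gamma * x_low         # slope for the first person
s_high = gamma * x_high       # slope for the second person

print(s_low, s_high)          # -0.5 1.0 -> the slopes differ
```

So the covariate "explains" slope variation not by absorbing a residual, but by generating the between-person differences in s itself.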