ResNL posted on Wednesday, June 03, 2009 - 2:13 pm
Dear Linda and Bengt, I would like to check the local independence assumption for a hierarchical model with continuous latent variables (one second-order factor and three first-order factors) and categorical indicators. In this case, is it feasible to check local independence by freeing some residual covariances among the indicators (preferably those with high modification indices in the default model) and seeing whether that model fits better than the default model with zero residual covariances? Thanks for your help!
You can include residual covariances among the first-order factor indicators to test local independence for those indicators. The second-order factor is just-identified, so you cannot add residual covariances at that level.
ResNL posted on Friday, June 05, 2009 - 7:34 am
Thanks, Linda, for your quick reply. I made a small error, though: I actually have four first-order factors. In that case it should be possible to test the local independence assumption with residual covariances of the indicators (for instance, with an item1 WITH item2 statement), or am I mistaken? Thanks again for your help!
That sounds correct. |
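A minimal Mplus input sketch of the approach discussed in this thread. All names here are illustrative assumptions, not from the thread: twelve categorical items u1-u12, three items per first-order factor f1-f4, a second-order factor g, and one residual covariance flagged by a modification index.

```
TITLE:     Local independence check via a freed residual covariance
           (hypothetical variable and factor names);
DATA:      FILE = data.dat;            ! placeholder file name
VARIABLE:  NAMES = u1-u12;
           CATEGORICAL = u1-u12;       ! categorical indicators
ANALYSIS:  ESTIMATOR = WLSMV;
MODEL:     f1 BY u1-u3;                ! four first-order factors
           f2 BY u4-u6;
           f3 BY u7-u9;
           f4 BY u10-u12;
           g  BY f1-f4;                ! second-order factor
           u1 WITH u2;                 ! residual covariance suggested
                                       ! by a high modification index
OUTPUT:    MODINDICES;
```

With the WLSMV estimator, the fit of this model against the default model (the same input without the `u1 WITH u2;` line) should be compared with the DIFFTEST procedure rather than an ordinary chi-square difference: save derivatives from the less restrictive model via `SAVEDATA: DIFFTEST = deriv.dat;`, then rerun the restrictive model with `ANALYSIS: DIFFTEST = deriv.dat;`.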