Computational Demands of multigroup b...
 Garett Howardson posted on Sunday, September 14, 2014 - 3:57 pm
I'm currently planning a five-wave study in which I have a set of approximately 30 items with 4 specific factors and 1 general factor. The general factor will not load on all of the items.

I'm trying to decompose the variance for the general and specific factors at each point in time. To examine this, I was planning to run a five-group bifactor model, treating each point in time as a group and examining invariance across time along with changes in the variance decomposition. This brings up two questions:

1) It seems that if I'm interested in variance decomposition, it would be a bad idea to constrain the item intercepts to be equal (at zero), so I would want to include a mean structure, correct?

2) My reading of the IRT literature is that, with categorical measured variables, the computational demands of this would be quite large due to the integration required. I have continuous indicators, but I can't seem to find any information on the computational demands in that case. I believe this could be estimated with full-information maximum likelihood, but I have no idea how much computing power would be required for a model with 4 specific factors, 1 general factor, a mean structure, and 5 groups (i.e., time). I guess I'm wondering if I should pursue other options if the computational demands are going to require something like 1-2 days per model.

Thank you,
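
For concreteness, a minimal Mplus input sketch of the multigroup bifactor setup described above. The data file, item names (y1-y30), grouping variable (wave), and item-to-factor assignments are hypothetical placeholders, not taken from the post; by default Mplus holds loadings and intercepts equal across groups, which can be relaxed in group-specific MODEL commands.

TITLE:    Sketch: five-group bifactor CFA with a mean structure
          (item names and factor assignments are hypothetical).
DATA:     FILE = study.dat;
VARIABLE: NAMES = wave y1-y30;
          USEVARIABLES = y1-y30;
          GROUPING = wave (1 = t1  2 = t2  3 = t3  4 = t4  5 = t5);
ANALYSIS: ESTIMATOR = MLR;       ! continuous items, full-information ML
MODEL:    g  BY y1-y20;          ! general factor on a subset of items
          s1 BY y1-y8;           ! specific factors (placeholder assignments)
          s2 BY y9-y15;
          s3 BY y16-y22;
          s4 BY y23-y30;
          ! orthogonal factors, as in a standard bifactor model
          g  WITH s1@0 s2@0 s3@0 s4@0;
          s1 WITH s2@0 s3@0 s4@0;
          s2 WITH s3@0 s4@0;
          s3 WITH s4@0;
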
 Bengt O. Muthen posted on Sunday, September 14, 2014 - 4:54 pm
You say you have a five-wave study and you want to treat time as a group, so I assume you have different subjects at the different time points, because a multiple-group analysis assumes independent samples at each time point.

1) If you are interested only in variance decomposition, you don't in principle need scalar invariance (intercepts also invariant).

2) Five dimensions are heavy with categorical items and ML estimation due to the numerical integration, although integration = montecarlo(5000) is a possibility if you have a fast computer with several processors. WLSMV makes the computations easier.
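
A rough sketch of the two estimator options mentioned here, for the categorical-item case. The variable names and processor count are placeholders; with the continuous indicators described in the original post, ML requires no numerical integration at all.

! ML with Monte Carlo integration keeps five dimensions feasible
! on a fast machine with several processors.
VARIABLE:  CATEGORICAL = y1-y30;
ANALYSIS:  ESTIMATOR = MLR;
           INTEGRATION = MONTECARLO(5000);
           PROCESSORS = 8;        ! hypothetical, machine-dependent

! Alternative that avoids numerical integration altogether:
! ANALYSIS: ESTIMATOR = WLSMV;
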
 Garett Howardson posted on Monday, September 15, 2014 - 4:12 am
Thanks for the response Bengt.

Sorry, I forgot to mention the repeated-measures part, so it is the same sample across all five waves. I was under the impression that treating time as a group was the best way to test for metric non-equivalence and changes in the variance decomposition across time. I'm hypothesizing that, as time progresses, the general factor will begin explaining more variance in the items and the specific factors will lose meaningfulness.
 Bengt O. Muthen posted on Monday, September 15, 2014 - 4:43 pm
The most natural way is to analyze this in wide format, so 5 x 30 = 150 items. But this is computationally challenging: for WLSMV it is tough due to the many variables, and for ML it is tough due to the many factors.

A computationally easier way is to analyze it as 2-level with level-1 being time and level-2 being subject, where you can impose a growth model. This, however, assumes measurement invariance across time without testing it.
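
A minimal sketch of this two-level, long-format alternative (time points nested within subjects). The variable names, item-to-factor assignments, the simple fixed time trend, and the simplified between-level structure are placeholders; a random-slope growth model would instead use TYPE = TWOLEVEL RANDOM.

VARIABLE:  NAMES = id time y1-y30;
           USEVARIABLES = time y1-y30;
           CLUSTER = id;           ! level 2 = subject
           WITHIN = time;          ! level 1 = time point
ANALYSIS:  TYPE = TWOLEVEL;
           ESTIMATOR = MLR;
MODEL:     %WITHIN%
           gw  BY y1-y20;          ! within-person general factor
           s1w BY y1-y8;           ! within-person specific factors
           s2w BY y9-y15;          ! (placeholder assignments)
           s3w BY y16-y22;
           s4w BY y23-y30;
           gw  WITH s1w@0 s2w@0 s3w@0 s4w@0;
           s1w WITH s2w@0 s3w@0 s4w@0;
           s2w WITH s3w@0 s4w@0;
           s3w WITH s4w@0;
           gw ON time;             ! fixed linear trend in the general factor
           %BETWEEN%
           gb BY y1-y20;           ! between-person general factor on the
                                   ! random item intercepts (a simplification)
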
 Garett Howardson posted on Tuesday, September 16, 2014 - 9:37 am
Thanks much! Sounds like the 2-level is the way to go. Thanks again.