

Hi, I'm trying to fit a second-order and a bifactor model on a 14-item scale with 2 underlying (first-order) factors. A unidimensional and a correlated two-factor model run fine, but the second-order and bifactor models both produce an error. For instance, the second-order model:

TITLE: Second-order factor analysis TVS
DATA: FILE IS TVS.dat;
VARIABLE: NAMES ARE y1-y16;
CATEGORICAL ARE y1-y16;
ANALYSIS: ESTIMATOR = WLSMV;
MODEL: f1 BY y1-y10;
f2 BY y11-y16;
f3 BY f1-f2;

gives the following error:

THE STANDARD ERRORS OF THE MODEL PARAMETER ESTIMATES COULD NOT BE COMPUTED. THE MODEL MAY NOT BE IDENTIFIED. CHECK YOUR MODEL. PROBLEM INVOLVING PARAMETER 80.

The bifactor model produces a similar error. Do you know what the problem is with these models and how I could run them anyway?


A second-order factor is not identified with only two first-order factors: the higher-order part of the model has only the single f1-f2 covariance to work with, but needs more parameters than that (a free second-order loading plus the f3 variance) to be estimated.
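If a higher-order structure is still wanted with only two first-order factors, one common workaround (a sketch, not the only option) is to constrain the two second-order loadings to be equal, so the higher-order part is just-identified:

```
MODEL: f1 BY y1-y10;
       f2 BY y11-y16;
       f3 BY f1* f2 (1);   ! (1) labels both loadings, holding them equal
       f3@1;               ! fix the second-order factor variance
```

Whether such an equality constraint is substantively defensible depends on the application.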


Thanks! But how about the bifactor model? 


Show me the MODEL command for the bifactor model. 


I was trying this command (to examine whether an IRT analysis on all items would be justified):

TITLE: Bifactor analysis TVS
DATA: FILE IS TVS.dat;
VARIABLE: NAMES ARE y1-y16;
CATEGORICAL ARE y1-y16;
ANALYSIS: ESTIMATOR = WLSMV;
MODEL: f1 BY y1-y10;
f2 BY y11-y16;
f3 BY y1-y16;

Thanks, Peter


The general factor must be uncorrelated with the specific factors, and the specific factors must be uncorrelated with each other:

MODEL: f1 BY y1-y10;
f2 BY y11-y16;
f3 BY y1-y16;
f3 WITH f1-f2@0;
f1 WITH f2@0;


Thanks Linda! Works perfectly now. 


Drs. Muthen, We are testing a bifactor model as an alternative to an empirically supported, traditional three-factor model, because of high correlations between the factors in the three-factor CFA. Generally, the correlations between the three factors range from .6 to .8. We hypothesized, based on theory, that there would be one general factor (with 10 indicators) and three specific factors (two factors with 3 indicators and one factor with 4 indicators); all indicators are continuous, and the factor correlations are set to 0. However, our model with the three specific factors and one general factor would not converge. I have tried different starting values, as well as setting the covariance to 0 for one of the specific factors, since some of my other exploratory analyses suggested that that factor had a negative covariance. Due to the non-convergence, I tested a model combining two of the specific factors. Collapsing these factors was also based on theory, as well as on a .76 correlation between the two factors in the three-factor CFA. Is this an accepted solution (to combine the two factors)? Doing this yielded better fit statistics for the bifactor model compared to the three-factor CFA. Thanks.


I assume you are doing CFA bifactor modeling - have you tried EFA bifactor modeling?


Yes, I am using CFA bifactor modeling. I have not tried EFA bifactor modeling; unfortunately I don't have access to Mplus 7. Do you have any suggestions for version 6? Thanks


I assume you have zero correlations between all 4 factors in your first run. 


Yes, and it will not converge. 


For us to diagnose the problem, you would have to send the input, output, data, and your license number to support@statmodel.com. The EFA version of bifactor analysis can be very helpful in these situations, so getting Version 7 might be worth your while if you do a lot of bifactor modeling.

JOEL WONG posted on Thursday, August 01, 2013  1:40 am



I've been reading the work of Steven Reise on bifactor models, and I have three questions about testing bifactor models in Mplus:

1. In a bifactor CFA, why is it important to specify that the specific factors are uncorrelated with each other, i.e., f1 WITH f2@0? Would it be a problem if we know from a regular CFA that the specific factors are in fact strongly correlated with each other?

2. Based on a bifactor model, Reise computes coefficient omega hierarchical (omegaH), which is how much variance in summed scores can be attributed to a single general factor. Can Mplus compute omegaH, or is information available in the output to compute it?

3. Reise also computes an explained common variance (ECV) index in bifactor models: the common variance explained by the general factor divided by the total common variance (that explained by the general factor plus that explained by the specific factors). In the Mplus output under "Model Results," there is a section on variances for the general factor and each of the specific factors. Are these the same as the common variance Reise refers to? If so, could I use them to compute the ECV? Thanks a lot.

Reise, S. P., Moore, T. M., & Haviland, M. G. (2010). Bifactor models and rotations: Exploring the extent to which multidimensional data yield univocal scale scores. Journal of Personality Assessment, 92(6), 544-559.


1. Correlating the specific factors is not a problem. This can also be done in bifactor EFA in Mplus.

2. Mplus does not compute this automatically, but the analyst should be able to do it using a NEW parameter in MODEL CONSTRAINT.

3. I would think that the ECV is different from the plain factor variances. So again, the analyst would compute it via MODEL CONSTRAINT.
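As a sketch of points 2 and 3, assuming a small hypothetical bifactor model with continuous items y1-y6, unit-variance factors, and labeled parameters, omegaH and ECV could be set up in MODEL CONSTRAINT like this (the item names and labels are illustrative, not from the thread):

```
MODEL: g  BY y1-y6* (g1-g6);
       s1 BY y1-y3* (a1-a3);
       s2 BY y4-y6* (b1-b3);
       g@1; s1@1; s2@1;
       g WITH s1-s2@0; s1 WITH s2@0;
       y1-y6 (e1-e6);            ! label the residual variances
MODEL CONSTRAINT:
NEW(omegaH ecv);
omegaH = (g1+g2+g3+g4+g5+g6)**2 /
         ((g1+g2+g3+g4+g5+g6)**2 + (a1+a2+a3)**2 + (b1+b2+b3)**2
          + e1+e2+e3+e4+e5+e6);
ecv = (g1**2+g2**2+g3**2+g4**2+g5**2+g6**2) /
      (g1**2+g2**2+g3**2+g4**2+g5**2+g6**2
       + a1**2+a2**2+a3**2 + b1**2+b2**2+b3**2);
```

The omegaH expression follows Reise's formula for continuous items: squared sum of general-factor loadings over total summed-score variance. With categorical items the formulas differ, so this should not be applied as-is to the WLSMV models earlier in the thread.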

Sarah Hafidz posted on Thursday, February 13, 2014  10:18 pm



Hi Linda, I tried running a bifactor CFA as follows:

MODEL: CWB BY B1-D13;
Factor1 BY J1 J2 J3 J4 J5 J6 J7 K1 K2 K3 K4 K5 K6 L5 L7;
Factor2 BY A3 A4 A5 A6 A7 A8 A9 A10 A11 A13 A14 D4 D8 D10 D11 D12 D13;
Factor3 BY D1 D2 D3 D6 D7 D9 F1 F2 F3 F4 G4 G5;
Factor4 BY B5 G1 G2 G3;
Factor5 BY B1 B2 B3 B4 C1;
CWB WITH Factor1-Factor5@0;
Factor1-Factor5 WITH Factor1-Factor5@0;

Unfortunately it says NO CONVERGENCE. NUMBER OF ITERATIONS EXCEEDED. Please advise what I need to do, or whether there is something I need to look at. TQ


Try freeing the first factor loading of each factor and fixing the factor variance to one to see if perhaps the first factor loading is not estimated close to one. If that is the problem, you can choose another factor loading that is estimated close to one to set the metric. If that is not the problem, try running the factors separately. 
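For the CWB factor in the model above, that suggestion would look something like this (a sketch; the * frees the loadings, and the range B1-D13 assumes the items appear in that order on the NAMES statement):

```
MODEL: CWB BY B1-D13*;   ! free all loadings, including the first
       CWB@1;            ! fix the factor variance to one instead
```

The same pattern can be applied to each of Factor1-Factor5.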


I am trying to run a bifactor model across groups to evaluate invariance (configural, metric, scalar), but no estimates for the configural model are available. Is there a way to obtain this information?


Please send your output and license number to Support. 


We are comparing second-order and bifactor models (different scales: 13 and 15). Bifactor input:

MODEL: emo BY sdq3* sdq8 sdq13 sdq16 sdq24;
con BY sdq5* sdq12 sdq18 sdq22 revsdq7;
wb BY swem1* swem2-swem7;
g BY sdq3* sdq8 sdq13 sdq16 sdq24 sdq5 sdq12 sdq18 sdq22 revsdq7 swem1-swem7;
emo WITH con@0; emo WITH wb@0; emo WITH g@0;
con WITH wb@0; con WITH g@0; wb WITH g@0;
emo@1; con@1; g@1; wb@1;

With the default standardisation the model doesn't converge, even with increased iterations or different first indicators. The model does converge with the default freed variances when the correlations between the specific factors are released, which reveals a variance of g = 0. Does this mean the model doesn't work (reject?), and that the syntax above produces a result only because we force g to have variance? We tried reversing the swem items (these are inversely related) and the model converges. We are aware of floor effects in the sdq items. 1. Why doesn't the default standardisation work? 2. Why does reversing the swem items cause the model to converge? Is it "different-method variance" as in MTMM? 3. Does the above point to problems with forcing a general or higher-order factor?


Try a bifactor EFA to see if you have approximately the right model. 


Many thanks for this. Bifactor EFA suggested the presence of only 2 specific factors (and this fitted with theory). However, a model where one of the subscales loads only onto the general factor and has no specific factor has the same problems as before: it will not converge unless we reverse the swem items or standardise by freeing the first loading and setting the factor variances @1. The model that does not converge yields problematic factor loadings and variances for the swem items, as below, and that is why we think it has something to do with the swem items being reversed (opposite) to the sdq ones:

G BY
  SDQ3      1.000
  SDQ8      1.192
  SDQ13     1.592
  SDQ16     1.019
  SDQ24     0.884
  SDQ5      1.075
  SDQ12     0.653
  SDQ18     0.830
  SDQ22     0.467
  REVSDQ7   0.974
  SWEM1   278.699
  SWEM2   351.899
  SWEM3   437.628
  SWEM4   419.112
  SWEM5   468.005
  SWEM6   261.051
  SWEM7   384.547

Variances
  CON       0.458
  WB       45.566
  G         0.001

We therefore remain unsure about how to interpret these results, and any advice would be much appreciated.


Perhaps by "reverse" you mean that their loadings would change sign from negative to positive. If so, you can give negative starting values for those loadings; that sometimes helps when the loading estimate is negative. If this doesn't help, send the output and data to Support along with your license number.
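Assuming it is the swem loadings on g that should come out negative, starting values are given with * in the BY statement, e.g. (a sketch based on the input posted above):

```
g BY sdq3* sdq8 sdq13 sdq16 sdq24
     sdq5 sdq12 sdq18 sdq22 revsdq7
     swem1*-1 swem2*-1 swem3*-1 swem4*-1
     swem5*-1 swem6*-1 swem7*-1;
g@1;
```

Here each swem loading starts at -1 rather than the Mplus default, which can steer the iterations toward the intended solution.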

Louise Black posted on Wednesday, February 21, 2018  2:27 am



Thanks this worked! 
